Columns: id (int64), title (string), description (string), collection_id (int64), published_timestamp (timestamp[s]), canonical_url (string), tag_list (string), body_markdown (string), user_username (string)
929,929
Clean Code in C# Part 3 Comments
Introduction According to uncle bob comments should be avoided at all costs. Well written...
16,214
2021-12-18T15:14:54
https://dev.to/caiocesar/clean-code-in-c-part-3-comments-17p
csharp, cleancode, programming, comment
---
series: Clean Code in C#
---

# Introduction

According to Uncle Bob, comments should be avoided at all costs. Well-written code should be written in a way that is easy for other developers to understand. If developers follow the rules for writing clean methods as described in [Part 2 Methods](https://dev.to/caiosousa/clean-code-in-c-part-2-methods-58mb), avoiding comments makes even more sense.

# Explaining Code

Developers sometimes write logic and then try to explain the code through comments. In most cases these comments are not necessary, as shown in the code below.

```csharp
// Verify if user has access to every module.
if ((user.Type == ADMINISTRATOR || user.Type == MANAGER) && user.IsActive)
{
}
```

```csharp
if (user.HasAccessToWholeModule)
{
}
```

Both versions above have the same responsibility. The second is a much cleaner approach that doesn't need a comment to explain its purpose. Comments can also become obsolete if programmers don't update them as the code evolves.

# Useful Comments

In some cases it can be useful to write comments, but it is important to keep in mind that no comment is usually better than having one. Analyze the following code:

```csharp
services.AddQuartz(q =>
{
    q.UseMicrosoftDependencyInjectionScopedJobFactory();

    var jobKey = new JobKey(JOB_NAME);
    q.AddJob<HelloWorldJob>(opts => opts.WithIdentity(jobKey));

    q.AddTrigger(opts => opts
        .ForJob(jobKey)
        .WithIdentity(JOB_TRIGGER_NAME)
        .WithCronSchedule("0/5 * * * * ?")); // run every 5 seconds
});
```

The code above is an example of middleware configuration for the open-source job scheduling library [Quartz](https://www.quartz-scheduler.net). It uses [cron](https://en.wikipedia.org/wiki/Cron) expressions to define schedules, and in this case it makes sense to add a comment interpreting the schedule. Other comments might also be appropriate, such as:

- Important functionality
- TODOs explaining obsolete or pending items in a method
- Alerts about long-running processes

# Excessive Comments

I often run into code with excessive comments. Programmers might think the code looks elegant with these comments; however, they are redundant and unnecessary. The code below is an example of excessive comments:

```csharp
/// <summary>
/// User Class
/// </summary>
public class User
{
    /// <summary>
    /// User Type
    /// </summary>
    public string Type { get; set; }

    /// <summary>
    /// Verifies if user is active
    /// </summary>
    public bool IsActive { get; set; }
}
```

# Regions

Regions in C# are sometimes used inside a method to "improve code readability". Regions inside methods actually increase the size of the method and should be avoided at all costs. Regions could be used outside of methods to organize code, but that should be avoided too. In some cases, where classes contain hundreds or thousands of lines of code, it can make sense to have regions, as shown in the following figure:

![#Region in C# Description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/maem7tf7i6naeaz2vpfh.png)

# Conclusion

There are many other examples of comments to avoid that are not explored here. The main idea to take away, if you got to this point, is to avoid commenting code whenever possible. I find that having concise external documentation about the system, its requirements, and its architecture can also avoid many unnecessary comments in the code.

# References

1. Clean Code: A Handbook of Agile Software Craftsmanship by Robert C. Martin.
2. [Quartz](https://www.quartz-scheduler.net)
caiocesar
929,947
What should I test?
You might often hear developers say that you should write tests. Great, let’s go! But what should you...
0
2021-12-18T15:28:11
https://www.csrhymes.com/2021/12/18/what-hould-i-test.html
testing, php, javascript, webdev
---
title: What should I test?
published: true
date: 2021-12-18 09:00:07 UTC
tags: Testing,PHP,JavaScript,webdev
canonical_url: https://www.csrhymes.com/2021/12/18/what-hould-i-test.html
cover_image: https://www.csrhymes.com/img/what-should-i-test.jpg
---

You might often hear developers say that you should write tests. Great, let’s go! But what should you test and where do you start? I don’t know all the answers, but here are some tips for getting started.

I recently started working on an existing application which was very different from what I had been working on previously. When I write a brand new application, one of the first things I do is get the test tool up and running and start writing tests alongside the application code. I often start by writing some smaller unit tests and then feature or end-to-end tests to ensure the process works as a whole.

When you start working on an existing project that has little test coverage, it’s not so easy to figure out where to start with your tests. It can seem a very daunting task, as there is so much existing code and so many features to test. So what should you do? Just ignore tests and carry on coding? The truth is you probably won’t get the opportunity to spend weeks solely writing tests, as you still need to work on new features and produce visible output for your customer.

## Start small

Some tests are better than none at all. Sometimes the smallest, simplest test is the most useful test. This tweet by Nuno Maduro says it all. Write a simple test that “ensures your application boots” and “ensures your homepage can be loaded”.

> Does your [@laravelphp](https://twitter.com/laravelphp?ref_src=twsrc%5Etfw) application have 0 tests? Here is one test you can easily add to get started. It's probably the most important test — in web projects — and it has an enormous value.
>
> ✓ Ensures your application boots. ✅
> ✓ Ensures the home page can be loaded. 💨 [pic.twitter.com/BclTaFcig8](https://t.co/BclTaFcig8)
>
> — Nuno Maduro (@enunomaduro) [December 9, 2021](https://twitter.com/enunomaduro/status/1468901807585955848?ref_src=twsrc%5Etfw)

## Feature tests

So we have proven that the application boots, but should we write feature or unit tests next? In my opinion, it is better to start writing feature (or end-to-end) tests rather than unit tests. If you inherit an application that has little or no test coverage, then by writing feature tests you can quickly cover more of your code with fewer tests.

Even the best documentation can’t provide you with the level of detail that the code is written at. It may describe a feature at a top level, such as “Users can log in with an email address and a password”, so you should be able to write a feature test that calls the login endpoint, passes in the email address and password, then asserts that the user is logged in successfully.

## Unit tests

Hopefully all the tests will pass, but if they don’t, this will prompt you to dig into the code a bit deeper. This will allow you to learn more about specific functions, which you can then write unit tests for to prove they are doing what they are supposed to be doing. It’s actually very difficult to write a unit test without understanding the code in detail.

When you start working on an application you won’t have that in-depth knowledge of what each method does; in fact, you won’t even know what the method names are without spending time digging through the code. You will gain this knowledge over time, but you may not remember it all, so this is where writing unit tests will help you out, acting as a kind of documentation for your code. It will allow you to construct scenarios that your code should handle.

Building on the example of users logging in, you could write a specific unit test that asserts the user has entered a valid email address in the login form, otherwise a validation error should be thrown.

## Happy path

Start by writing tests for the happy path. The happy path assumes that everything has gone as you expect: the user has entered the correct information and the process completes from start to finish. For example, the user entered their email address in the email field, instead of entering it in the password field, and they successfully logged in.

You may ask, what value is there in testing this? We know it works, as our application is up and running and people are quite happily using it. This is true, but code won’t stay the same forever, and at some point you may add a new feature, such as allowing logins with social media accounts, and you want this happy path test to ensure that existing users will still be able to log in as they did before. Sometimes everyone is so excited about testing the new feature that the existing functionality can be overlooked. Testing existing functionality is also known as regression testing.

> Regression testing (rarely, non-regression testing) is re-running functional and non-functional tests to ensure that previously developed and tested software still performs after a change. [https://en.wikipedia.org/wiki/Regression\_testing](https://en.wikipedia.org/wiki/Regression_testing)

## Write tests for bugfixes

It’s tempting to jump straight into a bug, fix the code and then move on. Instead, take some time to write a test that replicates the bug. This will allow you to prove that the bug does in fact exist and that you know what triggers it. Once you have established this, you can then work on the fix and use your test to prove whether the bug has been resolved or not.

Having a test for a bug also saves a lot of effort, as you can run the automated test with the specific scenario over and over again, without having to manually set up the database or visit a specific screen and perform a specific action to replicate it.

## Tests for new features

If you have to develop a new feature, then this is a great time to write tests for it. One way of ensuring that tests will definitely be written for a new feature is to use Test Driven Development (TDD). TDD encourages you to write the tests first and then write the code that makes the tests pass. TDD may not be everyone’s cup of tea, but I recommend trying it out and seeing how you get on. I often find that it makes you think about what you are trying to accomplish, and you can end up with a different solution than if you were to just build it as you go.

## Write tests when you update packages

Quite often a new major version of the framework you are using will be released. Rather than jumping straight in and updating the framework, ask for a bit more time to write some more tests to cover the areas that will be specifically affected by the upgrade. This will give you confidence that everything worked before the upgrade, and that any issues are caused specifically by the upgrade and not an existing problem. I have spent many hours debugging an issue that I was sure was caused by updating a package, only to eventually realise it had nothing to do with the upgrade. If the tests fail after the upgrade, they will provide you with a checklist of what you need to work on to make the upgrade work.

## Start small, build test coverage over time

These are some ideas for how you can build up test coverage over time. Eventually you will realise that tests are there to help you, and you will have more confidence in your application. The next developer that inherits your application will thank you for the tests too!
[Photo](https://stocksnap.io/photo/goldengatebridge-sanfrancisco-II0IJP2AC7) by [Burst](https://stocksnap.io/author/burstshopify) on [StockSnap](https://stocksnap.io)
chrisrhymes
930,027
Create simple calculator in javascript
A simple calculator in javascript and html. This is the basic simple calculator in javascript which...
0
2021-12-18T16:58:02
https://dev.to/shine18/create-simple-calculator-in-javascript-1j5i
javascript, beginners, programming, tutorial
A simple calculator in JavaScript and HTML. This is a basic calculator in JavaScript, built using a simple class. It also demonstrates how you can create a class for simple tasks and instantiate it.

**HTML**

```html
<div id="calc">
  <input type="number" id="num1" placeholder="Enter a number"/>
  <select id="operator">
    <option value="+">+</option>
    <option value="-">-</option>
    <option value="*">*</option>
    <option value="/">/</option>
  </select>
  <input type="number" id="num2" placeholder="Enter a number" />
  <button id="calculate">=</button>
  <input type="number" id="result" placeholder="Result" />
</div>
```

**Javascript**

```javascript
class Calculator {
  // Properties
  num1
  num2
  result
  num1Input
  num2Input
  resultInput
  operatorSelect

  // Member functions
  constructor() {
    // initialization of the calculator
  }
  setEvents() {
    // set events on the button
  }
  add() {
    // add numbers
  }
  subtract() {
    // subtract numbers
  }
  multiply() {
    // multiply numbers
  }
  divide() {
    // divide numbers
  }
  output() {
    // output the results
  }
}
```

For a detailed tutorial on this simple calculator in JavaScript, please follow this link: https://10code.dev/javascript/a-simple-calculator-in-vanilla-javascript/
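As a quick, DOM-free sanity check of the arithmetic the class skeleton above describes, here is a runnable sketch. The `compute` dispatcher is my own assumption about how the operator string from the `<select>` would be routed to the methods; the original tutorial wires this up through `setEvents()` instead.

```javascript
// A minimal, DOM-free sketch of the calculator's arithmetic.
class Calculator {
  add(a, b)      { return a + b; }
  subtract(a, b) { return a - b; }
  multiply(a, b) { return a * b; }
  divide(a, b)   { return b === 0 ? NaN : a / b; } // guard against division by zero

  // Dispatch on the operator string coming from the <select> element.
  compute(a, operator, b) {
    switch (operator) {
      case '+': return this.add(a, b);
      case '-': return this.subtract(a, b);
      case '*': return this.multiply(a, b);
      case '/': return this.divide(a, b);
      default:  throw new Error(`Unknown operator: ${operator}`);
    }
  }
}

const calc = new Calculator();
console.log(calc.compute(6, '*', 7)); // 42
```

Keeping the arithmetic separate from the DOM wiring like this makes the class easy to test without a browser.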
shine18
930,039
How to Create a 3D Surface Chart in JavaScript
🌈 Creating an embeddable interactive 3D Surface Plot is not as tricky as it may seem. A new tutorial...
0
2021-12-18T17:19:57
https://www.anychart.com/blog/2021/12/15/surface-chart-javascript/
javascript, webdev, datascience, programming
_🌈 Creating an embeddable interactive 3D Surface Plot is not as tricky as it may seem. A new tutorial on Hongkiat demonstrates a way that should be easy even for beginners! 🔥_

_🙋 Learn how to build a compelling surface chart using JavaScript in four quick steps and then customize it to your liking in just a few more lines of code. For an illustrative example, 15 years of GDP data for 195 countries are visualized throughout the article — have fun exploring the final diagram, too!_

[Data visualization](https://www.anychart.com/blog/2018/11/20/data-visualization-definition-history-examples/) is a must-have skill today, with ever-growing data and the need to analyze as well as present that data. You will come across data charts whether you are in the technology industry or not, and therefore it is a good idea to learn how to build visualizations. I will show you here that building charts is not very tough, and with the right tools you can start creating interactive, interesting visualizations with little time and effort!

In this step-by-step tutorial, I will demonstrate how to represent the GDP values of various countries for the past 15 years on a beautiful interactive 3D surface chart using a JavaScript library for data visualization. The surface plot looks quite complex, but I will show you how straightforward it is to make a compelling and fully functional one.

## What’s a 3D Surface Chart?

A 3D surface chart plots three dimensions of data on the x, y, and z axes, with two of the variables being independent (displayed along the horizontal axes) and one being dependent on the other two (shown on the vertical axis). In this tutorial, I will be plotting countries and years as the independent variables and GDP values as the dependent variable.

## JavaScript Charting Libraries

Currently, there are a lot of good [JS charting libraries](https://en.wikipedia.org/wiki/Comparison_of_JavaScript_charting_libraries), all of them having some pros and cons.
You can choose which one to use based on your specific requirements, and the best part is that the process of building visualizations is very similar across all of the libraries. So you can start learning with any of them and extend your knowledge to another library as well.

For this tutorial, I am going to use the [AnyChart](https://www.anychart.com/) JavaScript charting library, which is likely a good choice for beginners. It has tons of [examples](https://www.anychart.com/products/anychart/gallery/) along with extensive [documentation](https://docs.anychart.com/) that is really useful when starting out. Also, AnyChart is quite easy to use and flexible, with loads of customization options. And what’s especially important to many — it is free for personal, educational, and other non-commercial use.

## Building a Basic 3D Surface Plot Using a JS Library

It is an advantage, of course, if you have background knowledge of HTML, CSS, and JavaScript. But don’t get overwhelmed even if you are a complete beginner. I will walk you through each line of the code, and once you understand what is happening, it should get easier to grasp.

There are four general steps to create a 3D surface plot, or basically any chart, with a JS library, and as mentioned earlier, these steps remain alike irrespective of the library you use.

- Create an HTML page to display your chart.
- Include the required JavaScript files.
- Prepare and connect your data.
- Write the JS code for the chart.

### Step 1 — Create a basic HTML page

The initial step is to have a blank HTML page that will hold the chart. I will add a block element with a unique id to the page and use that id to reference the `<div>` later. I will also specify the height and width of the `<div>` as 100% in the `<style>` block of the page, which will render the chart on the full page. You can specify the height and width according to your preference.

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <title>JavaScript Surface Chart</title>
    <style type="text/css">
      html, body, #container {
        width: 100%;
        height: 100%;
        margin: 0;
        padding: 0;
      }
    </style>
  </head>
  <body>
    <div id="container"></div>
  </body>
</html>
```

### Step 2 — Add the necessary scripts

When you are using a JavaScript library, you need to add the scripts specific to the chart that you are building and the library that you are using. Here, I am using AnyChart, so I need to add the corresponding scripts for the surface plot from [its CDN](https://cdn.anychart.com/) (Content Delivery Network), which is basically where all the scripts can be found. So, I will include AnyChart’s [Core](https://docs.anychart.com/Quick_Start/Modules#core) and [Surface](https://docs.anychart.com/Quick_Start/Modules#surface) modules for this chart. Just to remind you, include all these script files in the `<head>` section of your HTML page.

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <title>JavaScript Surface Chart</title>
    <script src="https://cdn.anychart.com/releases/8.10.0/js/anychart-core.min.js"></script>
    <script src="https://cdn.anychart.com/releases/8.10.0/js/anychart-surface.min.js"></script>
    <script src="https://cdn.anychart.com/releases/8.10.0/js/anychart-data-adapter.min.js"></script>
    <style type="text/css">
      html, body, #container {
        width: 100%;
        height: 100%;
        margin: 0;
        padding: 0;
      }
    </style>
  </head>
  <body>
    <div id="container"></div>
    <script>
      // All the code for the JS Surface Chart will come here
    </script>
  </body>
</html>
```

### Step 3 — Include the data

The data I decided to visualize in a 3D surface plot comes from the World Bank Open Data website, which provides the GDP (PPP based) data for all the countries in a CSV file. It is easier to create a chart if the data is in the format that the chart expects and matches how you want to show the data, so I pre-processed the data accordingly.
You can go through my JSON data file [here](https://gist.githubusercontent.com/shacheeswadia/b0d6b34a1910359e0e1a8fc0c84019a6/raw/4ab92ca6361f1bc9875d2854e2e1271bc991f86b/surfaceAreaData.json). To load the data from the JSON file, I will add the [Data Adapter](https://docs.anychart.com/Quick_Start/Modules#data_adapter) module of AnyChart and use the `loadJsonFile` method inside the `<script>` tag in the body of the HTML page.

The three preparatory steps are done, so get ready to write some code!

### Step 4 — Write the code to draw the chart

The first thing I will do is ensure that all the code is executed only after the page is fully loaded. To do that, I will enclose the entire code within the `anychart.onDocumentReady()` function.

Then, I will work with the data that is loaded from the JSON file. Even though I have already pre-processed the data, I will need to process it further for plotting the 3D surface chart. Basically, I will create an array that holds the y and z axis data according to the sequence of the x axis data.

```javascript
anychart.onDocumentReady(function () {
  anychart.data.loadJsonFile(
    'https://gist.githubusercontent.com/shacheeswadia/b0d6b34a1910359e0e1a8fc0c84019a6/raw/4ab92ca6361f1bc9875d2854e2e1271bc991f86b/surfaceAreaData.json',
    function (data) {
      // processing of the data
      var result = [];
      for (var x = 0; x < data.x.length; x++) {
        for (var y = 0; y < data.y.length; y++) {
          result.push([x, data.y.sort()[y], data.z[x][y]]);
        }
      }
    }
  );
});
```

Now, I will create the surface chart and set the markers based on the data array just created. Next, I will need to set the x axis labels from the loaded data, because the array that I created contains only a sequence and not the country names. I will also specify the maximum for the x scale.
```javascript
// create surface chart
var chart = anychart.surface();

// enable markers and set data for them
chart.markers().enabled(true).data(result);

// set x axis labels format
chart
  .xAxis()
  .labels()
  .format(function () {
    return data.x[Math.round(this.value)];
  });

// set x axis scale maximum
chart.xScale().maximum(data.x.length - 1);
```

I will now give my chart a title and a bit of padding on all sides. Lastly, I will reference the `<div>` created in step one and draw the chart.

```javascript
// set chart paddings
chart.padding(25, 50, 75, 50);

// set chart title
chart.title('GDP per capita (PPP) in 2006-2020, in USD');

// set container id for the chart
chart.container('container');

// initiate chart drawing
chart.draw();
```

![Initial 3D Surface Chart To Be Customized Next](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/73o283bouqta4u4a6l4w.png)

Voila! A basic functional 3D surface plot is ready! You can have a look at this basic version of a JavaScript 3D surface plot on [AnyChart Playground](https://playground.anychart.com/R1Mq06kP/) or check out the code right here.
```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <title>JavaScript Surface Chart</title>
    <script src="https://cdn.anychart.com/releases/8.10.0/js/anychart-core.min.js"></script>
    <script src="https://cdn.anychart.com/releases/8.10.0/js/anychart-surface.min.js"></script>
    <script src="https://cdn.anychart.com/releases/8.10.0/js/anychart-data-adapter.min.js"></script>
    <style type="text/css">
      html, body, #container {
        width: 100%;
        height: 100%;
        margin: 0;
        padding: 0;
      }
    </style>
  </head>
  <body>
    <div id="container"></div>
    <script>
      anychart.onDocumentReady(function () {
        anychart.data.loadJsonFile(
          'https://gist.githubusercontent.com/shacheeswadia/b0d6b34a1910359e0e1a8fc0c84019a6/raw/4ab92ca6361f1bc9875d2854e2e1271bc991f86b/surfaceAreaData.json',
          function (data) {
            // processing of the data
            var result = [];
            for (var x = 0; x < data.x.length; x++) {
              for (var y = 0; y < data.y.length; y++) {
                result.push([x, data.y.sort()[y], data.z[x][y]]);
              }
            }

            // create surface chart
            var chart = anychart.surface();

            // enable markers and set data for them
            chart.markers().enabled(true).data(result);

            // set x axis labels format
            chart
              .xAxis()
              .labels()
              .format(function () {
                return data.x[Math.round(this.value)];
              });

            // set x axis scale maximum
            chart.xScale().maximum(data.x.length - 1);

            // set chart paddings
            chart.padding(25, 50, 75, 50);

            // set chart title
            chart.title('GDP per capita (PPP) in 2006-2020, in USD');

            // set container id for the chart
            chart.container('container');

            // initiate chart drawing
            chart.draw();
          }
        );
      });
    </script>
  </body>
</html>
```

## Customizing the JS Surface Chart

One of the best parts of using any JS charting library is that you need to write a very minimal amount of code to get a working version of the chart implemented. Moreover, most of the libraries provide options to customize the chart to make it more personalized and informative.
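Before customizing, the data-processing loop from Step 4 can be sanity-checked in isolation. Here is a sketch with tiny mock data; the field names mirror the JSON file's structure (`x` countries, `y` years, `z` a matrix of values), but the values themselves are made up for illustration.

```javascript
// Mock data mirroring the structure of the real JSON file (values are invented).
const data = {
  x: ['CountryA', 'CountryB'],   // independent axis 1: country names
  y: [2019, 2020],               // independent axis 2: years
  z: [[100, 110],                // z[x][y]: dependent values per country/year
      [200, 220]]
};

// Same transformation as in the tutorial: flatten into [xIndex, year, value] rows.
// (Sorting y once up front avoids re-sorting it on every loop iteration.)
const sortedY = [...data.y].sort();
const result = [];
for (let x = 0; x < data.x.length; x++) {
  for (let y = 0; y < data.y.length; y++) {
    result.push([x, sortedY[y], data.z[x][y]]);
  }
}

console.log(result);
// → [[0, 2019, 100], [0, 2020, 110], [1, 2019, 200], [1, 2020, 220]]
```

Each row pairs a country's position on the x axis with a year and its value, which is exactly the shape the surface chart's markers expect.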
Let me show you how to enhance the JS 3D surface chart to make it more intuitive and aesthetically better:

- Improve the look and feel of all the axes:
  - Modify the basic settings of the axes
  - Modify the labels of the axes
  - Modify the stroke of the axes
- Add a color palette
- Enhance the tooltip

**FOR A WALKTHROUGH OF THESE JS SURFACE CHART CUSTOMIZATIONS, [CONTINUE READING HERE](https://www.anychart.com/blog/2021/12/15/surface-chart-javascript/)**.
andreykh
930,090
Day 2 of #100DaysOfCode, google-clone flex layout
Introduction it's me again with another report from the second day of 100 days of code...
0
2021-12-18T20:05:30
https://dev.to/th3realad/day-2-of-100daysofcode-google-clone-flex-layout-od1
webdev, beginners, css, codenewbie
## Introduction

It's me again with another report from the second day of the 100 Days of Code challenge. I didn't post yesterday. I know, I know, what's the point of this challenge if you can skip days? Well, to clarify, within the rules you can skip a day as long as it's just one day and no longer. I had a pretty busy schedule yesterday, which I'll sum up in the next section: I was busy with school and couldn't squeeze in any time for the challenge, mainly because I had French classes, had to prepare for them, and needed to sleep earlier than I'm used to in order to attend them.

## What I did yesterday

###### Preparing for the Node.js event

In the morning I had to read up a bit in the Node.js docs. As I have mentioned in earlier posts, we're studying JavaScript, starting with the server side before moving back to the front end to study a bit of React and get hands-on practice with both.

###### The Node.js event

In the evening the Node.js event started, hosted by one of our classmates, a 19-year-old from a city not far from mine. He happens to have been doing JavaScript for the last six years, mostly self-taught, and he gave us newbies an introduction to Node.js: how it works behind the scenes and how it came into existence. I was fascinated by the event loop that Node uses and the libuv library that ships alongside most of the V8 engine utilities. I had read up about Node beforehand, but having someone with hands-on experience in a subject teaches you things that would take forever to grasp on the self-taught path. This only motivated me more; as they say, the more you know, the more you realize you don't know.

## Today (Day 2 of the challenge)

Today, like day 1, was all about flexbox. I felt proud of myself going through the CSS exercises. It's still pretty frustrating when you feel like you've hit a dead end after an hour or two and still can't complete the first few assignments. I might take a break or go for a walk, clear my mind, and come back with a fresh perspective to try again. A couple of breaks later, I hit the jackpot: I was able to replicate a Google-like homepage. To be specific, it's the flex layout exercise from the CSS exercises on The Odin Project; you can find the repo [here.](https://github.com/TheOdinProject/css-exercises)

###### My progress

I can't really measure or limit myself just yet. I do feel like I'm progressing, but pretty slowly, and the fact that I have school and a busy schedule makes it a tiny bit harder. But it gets easier; you just have to do it every day, and that's the hard part. I will keep posting daily updates and try to stay on a streak.

###### What else I am doing right now

I'm also taking the CS50 course to understand technology a little better. Even though I have been around computers for most of my life, I still feel there are many levels of abstraction left to uncover, and I'm planning on doing that slowly but surely. I don't do much during my days, simply because school takes most of my time, and the rest goes to gathering more information or following The Odin Project curriculum. I'm satisfied with the results of the effort and time I've put in lately.

## Conclusion

I do apologize for the delay, but I will try to be on time on most days of this challenge, and hopefully look back and remember this journey. I wish you good luck whether you are at the beginning of your path, the end, or somewhere in the middle. I've only been here a few days and have already noticed the great support from the community. I'm excited to grow and give back in my own way. Until the next read, thank you for taking the time to read this article. You can find me or contact me directly [here.](https://twitter.com/AdnaneBouthir) Happy Coding!
th3realad
930,136
Generators and yield in python 🤓
Generators Generators are functions that can be stopped while running and then resumed....
0
2021-12-18T20:48:38
https://dev.to/00000/generators-and-yield-in-python-29do
## Generators

Generators are functions that can be paused while running and then resumed. These functions return an object that can be iterated over. Unlike lists, generators are lazy: the evaluation of an expression is delayed as much as possible and only happens when the value is actually needed. This makes memory usage more efficient when working with large data sets.

## Differences between generators and normal Python functions:

- To get the next value from a generator, we use the `next()` function.
- In normal functions we use `return` to return values, but in a generator we use `yield` (the `yield` statement pauses the function and saves its state so it can continue in the future from where it left off).
- Generators are also iterators, meaning the objects they return define the `__iter__()` method.

## Example of a generator

In the following code we have a generator function that takes a list as an argument and yields the running sum of its elements:

```python
def myGenerator(l):
    total = 0
    for n in l:
        total += n
        yield total

mygenerator = myGenerator([15, 10, 7])
print(mygenerator)
print(next(mygenerator))
print(next(mygenerator))
print(next(mygenerator))
```

Output:

```
<generator object myGenerator at 0x0000026B84B6C150>
15
25
32
```

We see that a new value is produced on each call to `next()`. If you call `next()` again after the generator is exhausted, you will get a `StopIteration` error:

```
Traceback (most recent call last):
  File "generators.py", line 13, in <module>
    print(next(mygenerator))
StopIteration
```

Do not forget to like! Bye until the next post👋😘
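P.S. To see the memory-efficiency claim in action, compare a list comprehension with the equivalent generator expression. This is a small illustrative sketch; the exact byte counts vary by Python version, but the list is always dramatically larger because it materializes every element up front.

```python
import sys

# A list materializes all one million squares immediately...
squares_list = [n * n for n in range(1_000_000)]

# ...while a generator expression computes each square only when asked.
squares_gen = (n * n for n in range(1_000_000))

# The generator object stays tiny no matter how many values it can produce.
print(sys.getsizeof(squares_list) > sys.getsizeof(squares_gen))  # True

print(next(squares_gen))  # 0
print(next(squares_gen))  # 1
print(sum(squares_gen))   # consumes the remaining values lazily, one at a time
```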
00000
930,144
What are the first 10 apps you install on a new computer?
I recently bought a new MacBook and don't know where to start. I learned how to code on a used...
0
2021-12-18T21:33:53
https://dev.to/jasterix/what-are-the-first-10-apps-you-install-on-a-new-computer-2gik
help, productivity, watercooler, webdev
I recently bought a new MacBook and don't know where to start. I learned how to code on a used MacBook Air and installed everything from any list of apps I found online. But after random keys stopped working, it felt like a good opportunity to put that old thing away and get more comfortable working in a Windows environment. Instead, I just found myself writing less code, because logging onto my Surface felt like work.

![Thats The Worst Thing Happen This Is The Worst GIF](https://c.tenor.com/8ikWPrZSPHcAAAAC/thats-the-worst-thing-happen-the-worst.gif)

But now that I've bought a new MacBook, I don't really know how to kick things off again. I installed the basics:

1. VS Code
2. Git
3. Bear and RemNote
4. Top Notch

But after that, I'm drawing a blank. My old laptop was decked out so much it was running out of space. But this new one is precious. It deserves only the finest apps. This is why I'm seeking some inspiration from the Dev community.

### What are your top 10 apps for a fresh MacBook?

If you had to choose only 10 apps, what are your fave apps for coding, productivity, or anything else that brings you joy? Nothing questionable or impossible to remove down the line.

![wish list gif](https://media0.giphy.com/media/1TSHxNUZh1FdzUpFWK/giphy.gif?cid=ecf05e47dikullq5nwyt3mxmiulj3iiapackcwitmumid9t3&rid=giphy.gif&ct=g)

Updating the list as I rediscover some old faves:

1. VS Code
2. Git
3. Bear / RemNote / Tab Notes
4. Top Notch
5. Node
6. Postman
7. Homebrew
8. Chrome
9. Discord / Slack

Photo by Ron Lach from Pexels
jasterix
930,251
How I Created a Crowdfunding Platform with Web3 & Micro-Frontends
Over the past few months, I was learning about Micro-Frontend &amp; Web3 Technologies. As the best...
0
2021-12-19T06:32:06
https://dev.to/ruppysuppy/how-i-created-a-crowdfunding-platform-with-web3-micro-frontends-3pb2
webdev, typescript, webpack, web3
Over the past few months, I was learning about **Micro-Frontend** & **Web3 Technologies**. As the best way to learn is to _try things hands-on_, I built a small side-project to test out my understanding of the topics. The aim of the project was to create a **Crowdfunding** Platform based on the **Ethereum Blockchain** utilizing **Micro-Frontend Architecture**. You can dive into the source code here {% github https://github.com/ruppysuppy/Crypto-Crowdfund no-readme %} And the website here: https://crypto-crowdfund.web.app/ **NOTE:** You need a **[MetaMask](https://metamask.io/) Wallet** to interact with the blockchain # Smart Contract You can interact with the **Ethereum Blockchain** using **Smart Contracts**, so let's create one as per our requirement. The **Contracts** used in the project are available [here](https://github.com/ruppysuppy/Crypto-Crowdfund/blob/main/packages/smart-contract/contracts/Campaign.sol) Making changes to a deployed **Smart Contract** is not possible, and you have to _re-deploy_ the contract, which costs **Ethereum coins** as a **Gas Fee**. So it is essential that you [extensively test](https://github.com/ruppysuppy/Crypto-Crowdfund/tree/main/packages/smart-contract/test) the contract before deploying. Since storing data in **Smart Contracts** incurs a **Gas Fee** too, only the essential data, such as the **Campaign Manager**, the **Votes for a Transaction Request**, etc., are stored on the **Blockchain**. ### Potential Improvements 1. Split the **Campaign Factory** and the **Campaign** into separate files containing only the given **Smart Contract** 2. 
Add the **Manager functionality** as a separate **Smart Contract** and add it to the **Campaign** using inheritance # Micro-Frontend With the **Smart Contract** out of the way, let's focus on the **Micro-Frontend** ![Let's Focus](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8ja1v7dql1s8ilsps3w9.gif) **Micro-Frontend architecture** is a design approach in which a Frontend app is decomposed into **individual, independent “micro-apps”** working _loosely together_. > Was **Micro-Frontend Architecture** essential for this project? The answer to that is a definite **NO**. Then why did I use it? _Just to put what I learned into practice_. **Micro-Frontend Architecture** is useful only when working with _large teams_, _where the role of each team is to work on only a small sub-section of the project_ The **Micro-Frontend** was implemented using **Webpack's Module Federation Plugin**. If you want to learn how to implement **Micro-Frontends** from scratch, you are highly encouraged to check out [this article](https://dev.to/ruppysuppy/micro-frontends-the-next-gen-way-to-build-web-apps-16da) The **Webpack** config for each of the sub-apps and the container follows the same method. 1. Make a `common config` file with the shared **config** for **dev** & **prod** build, for example: ```js // imports ... module.exports = { // all shared config ... }; ``` 2. Make a `dev config` and merge it with the `common config` ```js const { merge } = require('webpack-merge'); // other imports ... const devConfig = { // all development config ... }; module.exports = merge(commonConfig, devConfig); ``` 3. Finally make a `prod config` and merge it with the `common config` ```js const { merge } = require('webpack-merge'); // other imports ... const prodConfig = { // all production config ... 
}; module.exports = merge(commonConfig, prodConfig); ``` You can check out all the configuration files [here](https://github.com/ruppysuppy/Crypto-Crowdfund/search?l=JavaScript&q=webpack) **NOTE:** In the project all the **sub-apps** use **React**, but you can very well use any other _library_ or _framework_ in any of them ## Marketing Let's start with the simplest **sub-app**: the [Marketing sub-app](https://github.com/ruppysuppy/Crypto-Crowdfund/tree/main/packages/marketing) is only responsible for rendering the _home_, _about_, _faq_, _terms-and-conditions_, _privacy-policy_, and _disclaimer_ pages, and doesn't have much functionality apart from this. Just as the **Micro-Frontend sub-apps** should only expose generic functions to avoid _library_ or _framework_ dependency between **sub-apps** and the **container**, the **Marketing sub-app** exposes a `mount` function which takes simple objects as _params_. It has the following signature: ```ts type Mount = ( mountPoint: HTMLElement, { defaultHistory?: History | MemoryHistory; initialPath?: string; isAuthenticated?: boolean; routes: { HOME: string; ABOUT: string; FAQ: string; TERMS_AND_CONDITIONS: string; PRIVACY_POLICY: string; DISCLAIMER: string; CAMPAIGNS: string; SIGN_IN: string; }; onNavigate?: ({ pathname: string }) => void; }, ) => { onParentNavigate: ({ pathname: string }) => void } ``` Using the `mount` function, the container can mount the **sub-app** as per requirement. `defaultHistory`, `onNavigate` and `onParentNavigate` are used to keep both the container and the **sub-app** in sync and avoid some nasty bugs. ## Auth Next up is the [Auth sub-app](https://github.com/ruppysuppy/Crypto-Crowdfund/tree/main/packages/auth). It uses **Firebase** authentication to **sign-in** and **sign-up** users and grants them the required permissions. 
It works similarly to the **Marketing sub-app**, by exposing the `mount` function with a similar set of _params_ and handling the _sign-in_ and _sign-up_ pages ### Potential Improvements - Use a method to **sign-in** only using the [MetaMask](https://metamask.io/) account, removing the need to **sign-in** in two places to get full access to the application ## Blockchain Finally, we are at the most difficult to understand **sub-app** of all: the [Blockchain sub-app](https://github.com/ruppysuppy/Crypto-Crowdfund/tree/main/packages/blockchain) is the meat of the project, enabling users to interact with the **backend** & the **blockchain** (to interact with the **blockchain**, you need a [MetaMask](https://metamask.io/) extension on your browser). It handles the _account_, _campaign_, _campaigns_ and _create-campaign_ pages. As mentioned previously, _only the essential data is stored on the contracts_, saving the rest on **Cloud Firestore**. The data from the campaigns is fetched from the addresses residing on the **Blockchain**, and then data from **Firestore** is merged to generate the complete data for a given **Campaign**. Only the creator of the **Campaign** can modify the data on **Firestore** or create a **Transaction Request** to spend the available funds, which the Contributors can approve. ### Potential Improvements - Using _image uploads_ in place of _add url to image_ for both the cover image and the user profile picture (skipped it as the main focus was on integrating **web3** & **micro-frontend**) ## Container The [Container](https://github.com/ruppysuppy/Crypto-Crowdfund/tree/main/packages/container) is responsible for _condensing all the **sub-apps** into a single application_ and _controlling what is displayed on the screen_. ### Potential Improvements - Improved **UI/UX** # Wrapping Up It's finally over... 
![Relief](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/5qmw8ndm93q9ytu13kfh.gif) This article presented a _brief overview_ of the project; you can always dive into the **source code** and _check out the project line by line_ {% github https://github.com/ruppysuppy/Crypto-Crowdfund no-readme %} **Happy Developing!** Finding **personal finance** too intimidating? Check out my **Instagram** to become a [**Dollar Ninja**](https://www.instagram.com/the.dollar.ninja/) # Thanks for reading Need a **Top Rated Front-End Development Freelancer** to chop away your development woes? Contact me on [Upwork](https://www.upwork.com/o/profiles/users/~01c12e516ee1d35044/) Want to see what I am working on? Check out my [Personal Website](https://tapajyoti-bose.vercel.app) and [GitHub](https://github.com/ruppysuppy) Want to connect? Reach out to me on [LinkedIn](https://www.linkedin.com/in/tapajyoti-bose/) I am a freelancer who will start off as a **Digital Nomad** in mid-2022. Want to catch the journey? Follow me on [Instagram](https://www.instagram.com/tapajyotib/) Follow my blogs for **Weekly new Tidbits** on [Dev](https://dev.to/ruppysuppy) **FAQ** These are a few commonly asked questions I get. So, I hope this **FAQ** section solves your issues. 1. **I am a beginner, how should I learn Front-End Web Dev?** Look into the following articles: 1. [Front End Development Roadmap](https://dev.to/ruppysuppy/front-end-developer-roadmap-zero-to-hero-4pkf) 2. [Front End Project Ideas](https://dev.to/ruppysuppy/5-projects-to-master-front-end-development-57p) 2. **Would you mentor me?** Sorry, I am already under a heavy workload and would not have the time to mentor anyone. 3. **Would you like to collaborate on our site?** As mentioned in the _previous question_, I am in a time crunch, so I would have to pass on such opportunities.
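Circling back to the **Webpack** section above: `merge` from `webpack-merge` essentially deep-merges the common config with the environment-specific one, concatenating arrays such as `plugins`. As a rough illustration of that idea — a toy sketch, not the real `webpack-merge` implementation — it behaves roughly like this:

```javascript
// Toy deep merge illustrating what webpack-merge does conceptually:
// plain objects are merged recursively, arrays are concatenated,
// everything else in the override wins. (Simplified sketch only.)
function merge(base, override) {
  const result = { ...base };
  for (const [key, value] of Object.entries(override)) {
    const current = result[key];
    if (Array.isArray(value) && Array.isArray(current)) {
      result[key] = [...current, ...value]; // concatenate arrays
    } else if (
      value && typeof value === "object" && !Array.isArray(value) &&
      current && typeof current === "object" && !Array.isArray(current)
    ) {
      result[key] = merge(current, value); // recurse into nested objects
    } else {
      result[key] = value; // override scalars
    }
  }
  return result;
}

const commonConfig = { entry: "./src/index.js", plugins: ["common-plugin"] };
const devConfig = { mode: "development", plugins: ["dev-plugin"] };

const merged = merge(commonConfig, devConfig);
console.log(merged.mode);    // 'development'
console.log(merged.plugins); // [ 'common-plugin', 'dev-plugin' ]
```

This is why the `common config` / `dev config` / `prod config` split works: each environment file only declares what differs, and the shared settings come along for free.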
ruppysuppy
932,500
MailerLite Review - Email Marketing and Automation
If you want to launch an email marketing campaign as a small business, you need to be well informed about the...
0
2021-12-21T19:18:43
https://bloggerpilot.com/mailerlite/
onlinemarketing, newsletter
--- title: MailerLite Review - Email Marketing and Automation published: true date: 2021-12-21 17:50:49 UTC tags: OnlineMarketing,newsletter canonical_url: https://bloggerpilot.com/mailerlite/ --- ![MailerLite Review](https://bloggerpilot.com/wp-content/uploads/2021/11/mailerlite-review.png) If you want to launch an email marketing campaign as a small business, you need to be well informed about the available options. The number of email service providers is large, and although many of them are good, there are also many that are less than ideal. Then there is MailerLite, a company that offers a range of email services and products, including a free email marketing tool called MailerLite. The original article appeared at [MailerLite Review - Email Marketing and Automation](https://bloggerpilot.com/mailerlite/). MailerLite is a production tool for sending newsletters, reports, and other business documents by email. It lets you send the documents you create using different templates, email addresses, and subject lines. In this post, we present some of the different pricing options. MailerLite is a free, lightweight, self-hosted email program written in Java. It is designed so that you do not have to install or use a mail server. It can be used on the web or on the desktop. It has a simple user interface with no extra features. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/di7f2j617wvltfpzk4e3.png) MailerLite is a new way of sending email that we have been working on for the last 7 months. The goal is to make sending an email as easy as sending a picture. Currently, MailerLite uses the same email account you already use. In the future, we plan to support sending from Google Mail, Outlook.com, iCloud, and more. 
The following text explains the process and the settings needed to get MailerLite up and running.
j0e
932,977
React core concepts
JSX JSX is a JavaScript syntax extension that provides a way to structure component rendering using...
0
2021-12-22T07:03:39
https://dev.to/saddaul_siam/react-core-concepts-5b8d
javascript, react, reactcoreconcepts
**JSX** JSX is a JavaScript syntax extension that provides a way to structure component rendering using syntax familiar to many developers. It is similar in appearance to HTML. **Prop Types** PropTypes is React's internal mechanism for adding type checking to components. React components use a special property named propTypes to set up type checking. When props are passed to a React component, they are checked against the type definitions configured in the propTypes property. **State and Props** In a React component, props are variables passed to it by its parent component. State, on the other hand, is also data, but it is directly initialized and managed by the component. The state can be initialized from props, and any method in the class can reference the props using this.props. **Component Lifecycle** React components have a lifecycle with three phases: Mounting (componentDidMount) -> Updating (componentDidUpdate) -> Unmounting (componentWillUnmount). **How React Hooks work, sending state via props, props vs state** useState and useEffect are called Hooks in React. A component that uses state is called a stateful component; one that doesn't contain state is called a stateless/presentational component. Difference between state and props (important for interviews): the main difference is that props are read-only, while state can be changed. **Custom hooks** Custom Hooks let you extract component logic into reusable functions. **Context API** Normally we pass data from one component to another through props. But when we have to send data down through 4 or 5 levels of components (prop drilling), we can use the Context API instead. With the Context API we don't have to pass props through every level; we can import and use the data wherever we need it. **Virtual DOM and diffing algorithm** When we build a React app, React internally creates a virtual DOM by combining all the components and sends the result to the browser, which renders and displays it. If the user does something in the UI that requires updating the browser DOM, React compares the new virtual DOM with its old virtual DOM using a diffing algorithm, which is significantly faster, and sends the browser only the changed part for the update. The browser immediately updates the browser DOM and displays it.
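The diffing idea above can be sketched in plain JavaScript. This is a deliberately tiny, hypothetical illustration of "compare and keep only what changed" — not React's actual reconciliation algorithm, which also handles element types, children, and keys:

```javascript
// Toy diff over flat "virtual DOM" props objects: returns only the
// keys whose values changed between the old and new virtual nodes.
function diffProps(oldProps, newProps) {
  const patches = {};
  const keys = new Set([...Object.keys(oldProps), ...Object.keys(newProps)]);
  for (const key of keys) {
    if (oldProps[key] !== newProps[key]) {
      patches[key] = newProps[key]; // record only the changed part
    }
  }
  return patches;
}

const oldVNode = { tag: "button", className: "btn", label: "Count: 0" };
const newVNode = { tag: "button", className: "btn", label: "Count: 1" };

// Only `label` changed, so only `label` would be written to the real DOM
console.log(diffProps(oldVNode, newVNode)); // { label: 'Count: 1' }
```

A real renderer would then apply just that patch to the browser DOM — the "sends the browser only the changed part" step described above.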
saddaul_siam
934,938
Build an Offline-First React Native Mobile App with Expo and Realm
In this post we'll build, step by step, a simple React Native Mobile App for iOS and Android using Expo and Realm. The App will use Realm Sync to store data in a MongoDB Atlas Database, will Sync automatically between devices and will work offline.
0
2021-12-23T16:00:21
https://www.mongodb.com/developer/how-to/build-offline-first-react-native-mobile-app-with-expo-and-realm/
expo, reactnative, realm, mobile
--- title: Build an Offline-First React Native Mobile App with Expo and Realm published: true description: In this post we'll build, step by step, a simple React Native Mobile App for iOS and Android using Expo and Realm. The App will use Realm Sync to store data in a MongoDB Atlas Database, will Sync automatically between devices and will work offline. tags: expo, reactnative, realm, mobile canonical_url: https://www.mongodb.com/developer/how-to/build-offline-first-react-native-mobile-app-with-expo-and-realm/ cover_image: https://mongodb-devhub-cms.s3.us-west-1.amazonaws.com/Twitter_Realm_expo_aef7185261.png --- ## Introduction Building Mobile Apps that work offline and sync between different devices is not an easy task. You have to write code to detect when you’re offline, save data locally, detect when you’re back online, compare your local copy of data with that in the server, send and receive data, parse JSON, etc. It’s a time consuming process that’s needed, but that appears over and over in every single mobile app. You end up solving the same problem for each new project you write. And it’s worse if you want to run your app in iOS and Android. This means redoing everything twice, with two completely different code bases, different threading libraries, frameworks, databases, etc. To help with offline data management and syncing between different devices, running different OSes, we can use MongoDB’s [Realm](https://realm.io/). To create a single code base that works well in both platforms we can use React Native. And the simplest way to create React Native Apps is using [Expo](https://expo.io/). ### React Native Apps The [React Native Project](https://reactnative.dev/), allows you to create iOS and Android apps using [React](https://reactnative.dev/docs/intro-react) _“a best-in-class JavaScript library for building user interfaces_”. 
So if you’re an experienced Web developer who already knows React, using React Native will be the natural next step to create native Mobile Apps. But even if you’re a native mobile developer with some experience using SwiftUI in iOS or Compose in Android, you’ll find lots of similarities here. ### Expo and React Native Expo is a set of tools built around React Native. Using Expo you can create React Native Apps quickly and easily. For that, we need to install Expo using Node.js package manager [`npm`](https://docs.npmjs.com/cli/v8/configuring-npm/install): ``` npm install --global expo-cli ``` This will install `expo-cli` globally so we can call it from anywhere in our system. In case we need to update Expo we’ll use that very same command. __For this tutorial we’ll need the latest version of Expo, that’s been updated to support Realm__. You can find all the new features and changes in the [Expo SDK 44 announcement blog post](https://blog.expo.dev/expo-sdk-44-beta-is-now-available-75d9751b0a18). To ensure you have the latest Expo version run: ``` expo --version ``` Should return at least `5.0.1`. If not, run again `npm install --global expo-cli` ![Realm & Expo Logos](https://mongodb-devhub-cms.s3.us-west-1.amazonaws.com/Twitter_Realm_expo_aef7185261.png) ## Prerequisites Now that we have the latest Expo installed, let’s check out that we have everything we need to develop our application: * [Xcode 13](https://apps.apple.com/us/app/xcode/id497799835), including Command Line Tools, if we want to develop an iOS version. We’ll also need a macOS computer running at least macOS 11/Big Sur in order to run Xcode. * [Android Studio](https://developer.android.com/studio), to develop for Android and at least one Android Emulator ready to test our apps. * Any code editor. I’ll be using [Visual Studio Code](https://code.visualstudio.com/) as it has plugins to help with React Native Development, but you can use any other editor. 
* Check that you have the latest version of yarn running `npm install -g yarn` * Make sure you are NOT on the latest version of node, however, or you will see errors about unsupported digital envelope routines. You need the LTS version instead. Get the latest LTS version number from https://nodejs.org/ and then run: ``` nvm install 16.13.1 # swap for latest LTS version ``` If you don’t have Xcode or Android Studio, and need to build without installing anything locally you can also try [Expo Application Services](https://expo.dev/eas), a cloud-based building service that allows you to build your Expo Apps remotely. ### MongoDB Atlas and Realm App Our App will store data in a cloud-backed MongoDB Atlas cluster. So we need to [create a free MongoDB account](https://cloud.mongodb.com/) and [set up a cluster](https://docs.atlas.mongodb.com/tutorial/create-new-cluster/). For this tutorial, a Free-forever, M0 cluster will be enough. Once we have our cluster created we can go ahead and create a Realm App. The Realm App will sync our data from mobile into a MongoDB Atlas database, although it has many other uses: manages authentication, can run serverless functions, host static sites, etc. [Just follow this quick tutorial](https://docs.mongodb.com/realm/manage-apps/create/create-with-realm-ui/#std-label-create-a-realm-app) (select the React Native template) but don’t download any code, as we’re going to use Expo to create our app from scratch. That will configure our Realm App correctly to use Sync and set it into Development Mode. ## Read It Later - Maybe Now we can go ahead and create our app, a small “read it later” kind of app to store web links we save for later reading. As sometimes we never get back to those links I’ll call it Read It Later - _Maybe_. You can always [clone the repo](https://github.com/mongodb-developer/read-it-later-maybe) and follow along. 
| Login | Adding a Link | | :-------------: | :----------: | | ![Login/Signup screen with email and password fields](https://mongodb-devhub-cms.s3.us-west-1.amazonaws.com/expo_realm_1_5bcda15f0a.png) | ![Adding a Link, with both Name and URL filled up, waiting to tap on “Add Link!” button](https://mongodb-devhub-cms.s3.us-west-1.amazonaws.com/expo_realm_2_9cd565c46f.png)| | All Links | Deleting a Link | | :-------------: | :----------: | | ![The App showing a list of two links.](https://mongodb-devhub-cms.s3.us-west-1.amazonaws.com/expo_realm_3_5cceea231a.png) | ![Swiping Right to Left we can show a button to delete a Link](https://mongodb-devhub-cms.s3.us-west-1.amazonaws.com/expo_realm_4_229ab5b063.png)| ### Install Expo and create the App We’ll use Expo to create our app using `expo init read-later-maybe`. This will ask us which template we want to use for our app. Using up and down cursors we can select the desired template, in this case, from the Managed Workflows we will choose the `blank` one, that uses JavaScript. This will create a `read-later-maybe` directory for us containing all the files we need to get started. ![Terminal window showing how after launching expo init we choose a template and the messages Expo show until our project is ready.](https://mongodb-devhub-cms.s3.us-west-1.amazonaws.com/01_expo_realm_create_project_728228e13d.png) To start our app, just enter that directory and start the React Native Metro Server using ` yarn start`. This will tell Expo to install any dependencies and start the Metro Server. ```bash cd read-later-maybe yarn start ``` This will open our default browser, with the Expo Developer Tools at [http://localhost:19002/](http://localhost:19002/). If your browser doesn't automatically open, press `d` to open Developer Tools in the browser. 
From this web page we can: * Start our app in the iOS Simulator * Start our app in the Android Emulator * Run it in a Web browser (if our app is designed to do that) * Change the connection method to the Developer Tools Server * Get a link to our app. (More on this later when we talk about Expo Go) We can also do the same using the developer menu that’s opened in the console, so it’s up to you to use the browser and your mouse or your Terminal and the keyboard. ![Running in a Terminal we can see all the options Expo Developer Tools are showing us.](https://mongodb-devhub-cms.s3.us-west-1.amazonaws.com/02_expo_realm_development_server_2d8107947e.png) ## Running our iOS App To start the iOS App in the Simulator, we can either click “Start our app in the iOS Simulator” on Expo Developer Tools or type `i` in the console, as starting expo leaves us with the same interface we have in the browser, replicated in the console. We can also directly run the iOS app in Simulator by typing `yarn ios` if we don’t want to open the development server. ### Expo Go The first time we run our app Expo will install Expo Go. This is a native application (both for iOS and Android) that will take our JavaScript and other resources bundled by Metro and run it in our devices (real or simulated/emulated). Once run in Expo Go, we can make changes to our JavaScript code and Expo will take care of updating our app on the fly, no reload needed. 
| Open Expo Go | 1st time Expo Go greeting | Debug menu | | :-------------: | :----------: | :----------: | | ![before running inside the iOS Simulator, we get a confirmation “Open in Expo Go?](https://mongodb-devhub-cms.s3.us-west-1.amazonaws.com/02_expo_realm_expo_go_1_0cdc295a7b.png) | ![1st time we open the app in Expo, we get a welcome message “Hello there, friend”](https://mongodb-devhub-cms.s3.us-west-1.amazonaws.com/02_expo_realm_expo_go_2_b37611e586.png) | ![debug Expo Go menu inside our app has many useful options and can be opened later](https://mongodb-devhub-cms.s3.us-west-1.amazonaws.com/02_expo_realm_expo_go_3_b97e8fcf23.png) | Expo Go apps have a nice debugging menu that can be opened pressing “m” in the Expo Developer console. ### Structure of our App Now our app is working, but it only shows a simple message: “Open up App.js to start working on your app!”. So we’ll open the app using our code editor. These are the main files and folders we have so far: ``` . ├── .expo-shared │ └── assets.json ├── assets │ ├── adaptive-icon.png │ ├── favicon.png │ ├── icon.png │ └── splash.png ├── .gitignore ├── App.js ├── app.json ├── babel.config.js ├── package.json └── yarn.lock ``` The main three files here are: * `package.json`, where we can check / add / delete our app’s dependencies * `app.json`: configuration file for our app * `App.js`: the starting point for our JavaScript code These changes can be found in tag `step-0` of the [repo](https://github.com/mongodb-developer/read-it-later-maybe). ## Let’s add some navigation Our App will have a Login / Register Screen and then will show the list of Links for that particular User. We’ll navigate from the Login Screen to the list of Links and when we decide to Log Out our app we’ll navigate back to the Login / Register Screen. So first we need to add the React Native Navigation Libraries, and the gesture handler (for swipe & touch detection, etc). 
Enter the following commands in the Terminal: ```bash expo install @react-navigation/native expo install @react-navigation/stack expo install react-native-gesture-handler expo install react-native-safe-area-context expo install react-native-elements ``` These changes can be found in tag `step-1` of the [repo](https://github.com/mongodb-developer/read-it-later-maybe). Now, we’ll create a mostly empty LoginView in `views/LoginView.js` (the `views` directory does not exist yet, we need to create it first) containing: ```javascript import React from "react"; import { View, Text, TextInput, Button, Alert } from "react-native"; export function LoginView({ navigation }) { return ( <View> <Text>Sign Up or Sign In:</Text> <View> <TextInput placeholder="email" autoCapitalize="none" /> </View> <View> <TextInput placeholder="password" secureTextEntry /> </View> <Button title="Sign In" /> <Button title="Sign Up" /> </View> ); } ``` This is just the placeholder for our Login screen. We open it from App.js. Change the `App` function to: ```javascript export default function App() { return ( <NavigationContainer> <Stack.Navigator> <Stack.Screen name="Login View" component={LoginView} options={{ title: "Read it Later - Maybe" }} /> </Stack.Navigator> </NavigationContainer> ); } ``` And add required `imports` to the top of the file, below the existing `import` lines. ```javascript import { NavigationContainer } from "@react-navigation/native"; import { createStackNavigator } from "@react-navigation/stack"; import { LoginView } from './views/LoginView'; const Stack = createStackNavigator(); ``` All these changes can be found in tag `step-2` of the [repo](https://github.com/mongodb-developer/read-it-later-maybe/compare/step-1...step-2). 
## Adding the Realm Library ### Installing Realm To add our [Realm library](https://docs.mongodb.com/realm/sdk/react-native/) to the project we’ll type in the Terminal: ```bash expo install realm ``` This will add Realm as a dependency in our React Native Project. Now we can also create a file that will hold the Realm initialization code, we’ll call it `RealmApp.js` and place it in the root of the directory, alongside `App.js`. ```javascript import Realm from "realm"; const app = new Realm.App({id: "your-realm-app-id-here"}); export default app; ``` We need to add a Realm App ID to our code. [Here](https://docs.mongodb.com/realm/get-started/find-your-project-or-app-id/#find-a-realm-application-id) are instructions on how to do so. In short, a Mobile Realm-powered App will use a local database to save changes and will connect to a MongoDB Atlas Database using a Realm App that we create in the cloud. We have Realm as a library in our Mobile App, doing all the heavy lifting (sync, offline, etc.) for our React Native app, and a _Realm App in the cloud_ that connects to MongoDB Atlas, acting as our backend. This way, if we go offline we’ll be using our local database on device and when online, all changes will propagate in both directions. All these changes can be found in tag `step-3` of the [repo](https://github.com/mongodb-developer/read-it-later-maybe/compare/step-2...step-3). > > __Update 24 January 2022__ > > A simpler way to create a React Native App that uses Expo & Realm is just to create it using a template. > For JavaScript based apps: > `npx expo-cli init ReactRealmJsTemplateApp -t @realm/expo-template-js` > > For TypeScript based apps: > `npx create-react-native-app ReactRealmTsTemplateApp -t with-realm` > ## Auth Provider All Realm related code to register a new user, log in and log out is inside a Provider. This way we can provide all descendants of this Provider with a context that will hold a logged in user. 
All this code is in `providers/AuthProvider.js`. You’ll need to create the `providers` folder and then add `AuthProvider.js` to it. Realm not only stores data offline, syncs across multiple devices and stores all your data in a MongoDB Atlas Database, but can also run Serverless Functions, host static html sites or [authenticate using multiple providers](https://docs.mongodb.com/realm/authentication/providers/). In this case we’ll use the simpler email/password authentication. We create the context with: ```javascript const AuthContext = React.createContext(null); ``` The SignIn code is asynchronous: ```javascript const signIn = async (email, password) => { const creds = Realm.Credentials.emailPassword(email, password); const newUser = await app.logIn(creds); setUser(newUser); }; ``` As is the code to register a new user: ```javascript const signUp = async (email, password) => { await app.emailPasswordAuth.registerUser({ email, password }); }; ``` To log out we simply check if we’re already logged in, in that case call `logOut` ```javascript const signOut = () => { if (user == null) { console.warn("Not logged in, can't log out!"); return; } user.logOut(); setUser(null); }; ``` All these changes can be found in tag `step-4` of the [repo](https://github.com/mongodb-developer/read-it-later-maybe/compare/step-3...step-4). ### Login / Register code Take a moment to have a look at the styles we have for the app in the `stylesheet.js` file, then modify the styles to your heart’s content. Now, for Login and Logout we’ll add a couple `states` to our `LoginView` in `views/LoginView.js`. We’ll use these to read both email and password from our interface. Place the following code inside `export function LoginView({ navigation }) {`: ```javascript const [email, setEmail] = useState(""); const [password, setPassword] = useState(""); ``` Then, we’ll add the UI code for Login and Sign up. Here we use `signIn` and `signUp` from our `AuthProvider`. 
```javascript const onPressSignIn = async () => { console.log("Trying sign in with user: " + email); try { await signIn(email, password); } catch (error) { const errorMessage = `Failed to sign in: ${error.message}`; console.error(errorMessage); Alert.alert(errorMessage); } }; const onPressSignUp = async () => { console.log("Trying signup with user: " + email); try { await signUp(email, password); signIn(email, password); } catch (error) { const errorMessage = `Failed to sign up: ${error.message}`; console.error(errorMessage); Alert.alert(errorMessage); } }; ``` All changes can be found in [`step-5`](https://github.com/mongodb-developer/read-it-later-maybe/compare/step-4...step-5). ## Prebuilding our Expo App On save we’ll find this error: ``` Error: Missing Realm constructor. Did you run "pod install"? Please see https://realm.io/docs/react-native/latest/#missing-realm-constructor for troubleshooting ``` Right now, Realm is not compatible with [Expo Managed Workflows](https://docs.expo.dev/introduction/managed-vs-bare/#managed-workflow). In a managed Workflow Expo hides all iOS and Android native details from the JavaScript/React developer so they can concentrate on writing React code. Here, we need to [prebuild](https://github.com/expo/fyi/blob/main/prebuilding.md) our App, which will mean that we lose the nice Expo Go App that allows us to load our app using a QR code. The Expo Team is working hard on improving the compatibility with Realm, as is our React Native SDK team, who are currently working on improving the compatibility with Expo, supporting the Hermes JavaScript Engine and expo-dev-client. Watch this space for all these exciting announcements! So to run our app in iOS we’ll do: ``` expo run:ios ``` We need to provide a Bundle Identifier to our iOS app. 
In this case we’ll use `com.realm.read-later-maybe`. This will install all needed JavaScript libraries using `yarn`, then install all native libraries using CocoaPods, and finally compile and run our app.

To run on Android we’ll do:

```
expo run:android
```

## Navigation completed

Now we can register and log in in our app. Our `App.js` file now looks like:

```javascript
export default function App() {
  return (
    <AuthProvider>
      <NavigationContainer>
        <Stack.Navigator>
          <Stack.Screen
            name="Welcome View"
            component={LoginView}
            options={{ title: "Read it Later - Maybe" }}
          />
        </Stack.Navigator>
      </NavigationContainer>
    </AuthProvider>
  );
}
```

We have an `AuthProvider` that provides the logged-in user to all descendants. Inside is a Navigation Container with one screen: the Login View. But we need two screens: our “Login View” with the UI to log in/register, and a “Links Screen” that will show all our links.

So let’s create our LinksView screen:

```javascript
import React, { useState, useEffect } from "react";
import { Text } from "react-native";

export function LinksView() {
  return (
    <Text>Links go here</Text>
  );
}
```

Right now it only shows the simple message “Links go here”, as you can check in [`step-6`](https://github.com/mongodb-developer/read-it-later-maybe/compare/step-5...step-6).

## Log out

We can register and log in, but we also need to log out of our app. To do so, we’ll add a nav bar item to our Links Screen: instead of “Back”, we’ll have a logout button that closes our Realm, calls logout, and pops our screen off the navigation stack, taking us back to the Welcome Screen.

In our LinksView screen we’ll add:

```javascript
React.useLayoutEffect(() => {
  navigation.setOptions({
    headerBackTitle: "Log out",
    headerLeft: () => <Logout closeRealm={closeRealm} />
  });
}, [navigation]);
```

Here we use a `components/Logout` component that has a button. This button will call `signOut` from our `AuthProvider`. You’ll need to add the `components` folder.
```javascript
return (
  <Button
    title="Log Out"
    onPress={() => {
      Alert.alert("Log Out", null, [
        {
          text: "Yes, Log Out",
          style: "destructive",
          onPress: () => {
            navigation.popToTop();
            closeRealm();
            signOut();
          },
        },
        { text: "Cancel", style: "cancel" },
      ]);
    }}
  />
);
```

Nice! Now we have login, logout, and register! You can follow along in [`step-7`](https://github.com/mongodb-developer/read-it-later-maybe/compare/step-6..step-7).

## Links

### CRUD

We want to store links to read later, so we’ll start by defining what our `Link` class will look like. We’ll store a name and a URL for each link. We also need an `id` and a `partition` field to avoid pulling all links for all users; instead we’ll sync only the links for the logged-in user. These changes are in `schemas.js`:

```javascript
class Link {
  constructor({
    name,
    url,
    partition,
    id = new ObjectId(),
  }) {
    this._partition = partition;
    this._id = id;
    this.name = name;
    this.url = url;
  }

  static schema = {
    name: 'Link',
    properties: {
      _id: 'objectId',
      _partition: 'string',
      name: 'string',
      url: 'string',
    },
    primaryKey: '_id',
  };
}
```

You can get these changes in `step-8` of the [repo](https://github.com/mongodb-developer/read-it-later-maybe).

And now we need to code all the CRUD methods. For that, we’ll go ahead and create a `LinksProvider` that will fetch links and delete them. But first, we need to open a Realm to read the links for this particular user:

```javascript
realm.open(config).then((realm) => {
  realmRef.current = realm;
  const syncLinks = realm.objects("Link");
  let sortedLinks = syncLinks.sorted("name");
  setLinks([...sortedLinks]);
  // we observe changes on the Links, in case Sync informs us of changes
  // started in other devices (or the cloud)
  sortedLinks.addListener(() => {
    console.log("Got new data!");
    setLinks([...sortedLinks]);
  });
});
```

To add a new link, we’ll have this function that uses [`realm.write`](https://docs.mongodb.com/realm-sdks/js/latest/Realm.html#write) to add a new `Link`.
This will also be [observed by the above listener](https://docs.mongodb.com/realm/sdk/react-native/examples/use-change-listeners-in-components/), triggering a UI refresh.

```javascript
const createLink = (newLinkName, newLinkURL) => {
  const realm = realmRef.current;
  realm.write(() => {
    // Create a new link in the same partition -- that is, using the same user id.
    realm.create(
      "Link",
      new Link({
        name: newLinkName || "New Link",
        url: newLinkURL || "http://",
        partition: user.id,
      })
    );
  });
};
```

Finally, to delete links we’ll use [`realm.delete`](https://docs.mongodb.com/realm-sdks/js/latest/Realm.html#delete).

```javascript
const deleteLink = (link) => {
  const realm = realmRef.current;
  realm.write(() => {
    realm.delete(link);
    // after deleting, we get the Links again and update them
    setLinks([...realm.objects("Link").sorted("name")]);
  });
};
```

### Showing Links

Our `LinksView` will `map` over the `links` array of `Link` objects we get from `LinksProvider` and show a simple list of views displaying the name and URL of each link. We do that using:

```javascript
{links.map((link, index) => (
  <ScrollView>
    <ListItem.Content>
      <ListItem.Title>
        {link.name}
      </ListItem.Title>
      <ListItem.Subtitle>
        {link.url}
      </ListItem.Subtitle>
    </ListItem.Content>
    <ListItem.Chevron />
  </ScrollView>
))}
```

### UI for deleting Links

As we want to delete links, we’ll use a right-to-left swipe gesture to show a button that deletes that link:

```javascript
<ListItem.Swipeable
  onPress={() => onClickLink(link)}
  bottomDivider
  key={index}
  rightContent={
    <Button
      title="Delete"
      onPress={() => deleteLink(link)}
    />
  }
>
```

We get `deleteLink` from the `useLinks` hook in `LinksProvider`:

```javascript
const { links, createLink, deleteLink } = useLinks();
```

### UI for adding Links

We’ll have a [TextInput](https://reactnative.dev/docs/textinput) for entering the name and URL, and a button to add a new link, directly at the top of the list of links.
We’ll use an accordion to show/hide this part of the UI:

```javascript
<ListItem.Accordion
  content={
    <ListItem.Content>
      <ListItem.Title>Create new Link</ListItem.Title>
    </ListItem.Content>
  }
  isExpanded={expanded}
  onPress={() => {
    setExpanded(!expanded);
  }}
>
  {
    <>
      <TextInput
        style={styles.input}
        onChangeText={setLinkDescription}
        placeholder="Description"
        value={linkDescription}
      />
      <TextInput
        style={styles.input}
        onChangeText={setlinkURL}
        placeholder="URL"
        value={linkURL}
      />
      <Button
        title='Click!'
        color='red'
        onPress={() => {
          createLink(linkDescription, linkURL);
        }}
      />
    </>
  }
</ListItem.Accordion>
```

## Adding Links in the main App

Finally, we’ll integrate the new `LinksView` inside our `LinksProvider` in `App.js`:

```javascript
<Stack.Screen name="Links">
  {() => {
    return (
      <LinksProvider>
        <LinksView />
      </LinksProvider>
    );
  }}
</Stack.Screen>
```

## The final App

Wow! That was a lot, but now we have a React Native app that works with the same code base on both iOS and Android, storing data in a MongoDB Atlas database in the cloud thanks to Realm Sync. What’s more, any change on one device syncs to all other devices with the same user logged in. But the best part is that Realm Sync works even when offline!

| Syncing iOS and Android | Offline Syncing! |
| :-------------: | :----------: |
| ![Animation showing how adding a Link in an iOS Simulator appears in an Android Emulator. After that, deleting on Android makes data disappear also in iOS.](https://mongodb-devhub-cms.s3.us-west-1.amazonaws.com/expo_realm_sync_eec72ca0bd.gif) | ![Setting Airplane mode in Android and then adding a new Link adds only in Android. When the Android emulator is back online it syncs with iOS.](https://mongodb-devhub-cms.s3.us-west-1.amazonaws.com/expo_realm_offline_882a595fa9.gif) |

## Recap

In this tutorial we’ve seen how to build a simple React Native application using Expo that takes advantage of Realm Sync for its offline and syncing capabilities.
This app is prebuilt, as right now Expo managed workflows won’t work with Realm (yet — read more below). But you still get all the simplicity of use that Expo gives you, all the Expo libraries, and EAS: build your app in the cloud without having to install Xcode or Android Studio.

The Realm SDK team is working hard to make Realm fully compatible with Hermes. Once we release an update to the Realm React Native SDK compatible with Hermes, we’ll publish a new post updating this app. Also, we’re working to finish an [Expo Custom Development Client](https://blog.expo.dev/introducing-custom-development-clients-5a2c79a9ddf8). This will be our own Realm Expo development client that will replace Expo Go while developing with Realm. Expect news when that is ready, too!

All the code for this tutorial can be found [in this repo](https://github.com/mongodb-developer/read-it-later-maybe).
dfreniche
935,660
Difference between export as class and object in javascript ?
Hello Devs, Here I am going to share what I learn from my fellow colleague while working on react...
0
2021-12-24T11:37:33
https://www.internetkatta.com/difference-between-export-as-class-and-object-in-javascript
Hello Devs,

Here I am going to share something I learned from a colleague while working on a React application. I tend to start using whatever concept I learned most recently (I keep concepts like arrows in a quiver and draw on whichever one the situation calls for), then implement whatever requirement comes along. But sometimes that concept is not the most useful or the best fit. This is what I learned.

Long ago I learned to use a singleton class for third-party initialization code, so I reached for a class many times. But a colleague pointed out a problem: everything in JavaScript is ultimately object-based — even a class ends up as objects under the hood — so when a plain object will do, an object-based approach reduces lines of code and keeps things simpler.

In other words, we should export an object, which is a simpler approach with fewer lines of code than the class-based one. Let me explain how to export as a class and as an object.

First, see how to export a class — here, a singleton class. The sample code below is a singleton class, exported as the default export.

```
// ExampleClass.js
export default class ExampleClass {
  constructor() {
    if (this.constructor.instance) {
      return this.constructor.instance;
    }
    this.constructor.instance = this;
  }

  /**
   * @name ExampleClass#setData
   * Role of this function is to set data
   * @param {Object} data
   */
  setData(data) {
    this.data = data;
  }

  /**
   * @name ExampleClass#getData
   * Role of this function is to get data
   * @returns {Object} data
   */
  getData() {
    return this.data;
  }
}
```

We then import this class in other files:

```
// Samplefile.js
import ExampleClass from 'ExampleClass';

const obj = new ExampleClass();
const data = {};
obj.setData(data);
```

```
// Samplefile1.js
import ExampleClass from 'ExampleClass';

const obj = new ExampleClass();
const data = obj.getData();
```

So, each file uses an import statement and creates a new object, whether the class is a singleton or a normal class.
When the whole codebase is bundled — whether with Angular, React, or another framework — the instance-creation code for `ExampleClass.js` is repeated in every file where an instance gets created.

## How to avoid this?

![hmm-thinking.gif](https://cdn.hashnode.com/res/hashnode/image/upload/v1639750816229/pEMEIq78m.gif)

The answer is that we can export an object instead. Because we import an object, we don't need to create an instance of it, and the exported object's code appears only once in the bundle.

```
// Example.js
const example = {
  data: {},
};

export default {
  setData(data) {
    example.data = data;
  },
  getData() {
    return example.data;
  },
};
```

As I said earlier, everything in JavaScript is ultimately an object; the other approaches — class, singleton, or plain object — evolved to suit different project conditions and demands.

**Note**: This is what I learned recently. We have to choose an approach based on the situation — no single one is always best.

Hope this blog helps you. If you like it, please don't forget to like the article; it will encourage me to write more such learning-related blogs. You can reach out to me on my Twitter handle [@aviboy2006](https://twitter.com/Aviboy2006). Feel free to comment if anything is wrong in this blog. I am happy to learn and correct.
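To see the singleton behaviour of the class-based version in action, here's a small self-contained sketch (the `ExampleClass` shape follows the article; running everything in one file stands in for the two separate imports, which is an assumption for illustration only):

```javascript
// Minimal sketch: both "importing files" end up with the very same instance.
class ExampleClass {
  constructor() {
    if (ExampleClass.instance) {
      return ExampleClass.instance; // reuse the one shared instance
    }
    ExampleClass.instance = this;
  }
  setData(data) { this.data = data; }
  getData() { return this.data; }
}

// what Samplefile.js would do:
const obj1 = new ExampleClass();
obj1.setData({ key: "value" });

// what Samplefile1.js would do:
const obj2 = new ExampleClass();

console.log(obj1 === obj2);      // true — same instance
console.log(obj2.getData().key); // "value" — state is shared
```

This works because a constructor that returns an object makes `new` yield that object instead of the freshly created one — the same state-sharing you get for free from a plain exported object, since ES modules are evaluated once and cached.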
avinashdalvi_
935,666
OOP in Python
Requirement: List, Dictionary , Function etc Twiter post by Dev.to So, OOP is basically...
0
2021-12-29T06:39:03
https://dev.to/mitul3737/oop-in-python-562k
oop, python
Requirements: lists, dictionaries, functions, etc.

![OOP post shared by Dev Community](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/pa4e6hoqu61k82qrfoxe.png)

[Twitter post by Dev.to](https://twitter.com/ThePracticalDev/status/1476840641338527753)

So, OOP is basically programming with objects. What is an object? To understand that, you first have to know what a class is.

Okay, do you know about strings? Let's see what is in the directory of `str`:

```
print(dir(str))
```

Output:

```
['__add__', '__class__', '__contains__', '__delattr__', '__dir__', '__doc__', '__eq__', '__format__', '__ge__', '__getattribute__', '__getitem__', '__getnewargs__', '__gt__', '__hash__', '__init__', '__init_subclass__', '__iter__', '__le__', '__len__', '__lt__', '__mod__', '__mul__', '__ne__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__rmod__', '__rmul__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', 'capitalize', 'casefold', 'center', 'count', 'encode', 'endswith', 'expandtabs', 'find', 'format', 'format_map', 'index', 'isalnum', 'isalpha', 'isascii', 'isdecimal', 'isdigit', 'isidentifier', 'islower', 'isnumeric', 'isprintable', 'isspace', 'istitle', 'isupper', 'join', 'ljust', 'lower', 'lstrip', 'maketrans', 'partition', 'removeprefix', 'removesuffix', 'replace', 'rfind', 'rindex', 'rjust', 'rpartition', 'rsplit', 'rstrip', 'split', 'splitlines', 'startswith', 'strip', 'swapcase', 'title', 'translate', 'upper', 'zfill']
```

So, what are these? They are the methods and attributes of the `str` class. For example, `upper` is a method, while dunder names like `__doc__` are special attributes (`__doc__` holds the docstring). And a method is simply a function defined inside a class — nothing else!

So now you have a little knowledge of classes. Let's create a class named `example`:

```
class example:
    print("Congratulations! you have created your first class")
```

Output:

```
Congratulations!
you have created your first class
```

Now we will create a method within the class. Remember that when you create a method, you have to give it a `self` parameter:

```
class example:
    def hello_world(self):
        print("Created 1st method")
```

Now the question is: what is `self`? `self` refers to the instance the method is called on — it is how a method knows which object it belongs to. To understand this properly, you need to know what an object is. You may think of an object (or instance) as a concrete "child" created from a class.

Let's create a class:

```
class example1:
    def hello_world(self):
        print("Created 1st method")

object = example1()  # created an object
```

To use the class, you just need to create an object from the class name; you can then use all the methods and variables within the class. Let's create an object and use it:

```
class example1:
    def hello_world(self):
        print("Hello world")

    def details(self):
        print("We are learning OOP")

object1 = example1()
object1.details()  # using the details method
```

Output:

```
We are learning OOP
```

Here we created an object and called the `details` method of the class through the object.

Now, suppose you want to pass some input and work with it — you can use the `__init__` method for that. The `__init__` method runs automatically when an object is created, so it is the place for a class's setup work:

```
class example1:
    def __init__(self):
        print("Hey! used the __init__ method")

    def hello_world(self):
        print("Hello world")

    def details(self):
        print("We are learning OOP")

object1 = example1()
```

Output:

```
Hey! used the __init__ method
```

Notice that we created an object but did not call any method, yet this was printed. The reason is that we put the `print` inside `__init__`; these kinds of setup tasks are done with `__init__`.

Now, let's take some input while creating an object and use it:

```
class example1:
    def __init__(self, name, country):
        self.variable_1 = name
        self.variable_2 = country
        print(self.variable_1)
        print(self.variable_2)

object1 = example1("Mitul", 'Bangladesh')
```

Here, while creating the object, we pass two values:
a name and a country name. Now look at `def __init__(self, name, country)`: we have set three parameters — `self`, `name`, and `country`. `self` is mandatory; the other two receive the two inputs we pass while creating an object.

```
class example1:
    def __init__(self, name, country):
        self.variable_1 = name
        self.variable_2 = country
        print(self.variable_1)
        print(self.variable_2)
        print("-----------")

object1 = example1("Mitul", 'Bangladesh')
object2 = example1('Karim', "India")
```

Now we have two objects, and we can create as many as we want, providing a name and a country each time.

Output:

```
Mitul
Bangladesh
-----------
Karim
India
-----------
```

You can also provide default values for parameters in `__init__`:

```
class example1:
    def __init__(self, name, country="default"):
        self.variable_1 = name
        self.variable_2 = country
        print(self.variable_1)
        print(self.variable_2)
        print("-----------")

object1 = example1("Mitul")
```

Output:

```
Mitul
default
-----------
```

Now, let's learn more about `self` using two classes:

```
# class 1
class Hospital:
    def __init__(self, name):
        self.name = name
        self.d_dict = {}
        self.p_dict = {}
        self.d_count = 0
        self.p_count = 0

    def addDoctor(self, var):
        self.d_dict[var.d_id] = [var.d_name, var.d_spe]
        self.d_count += 1

    def getDoctorByID(self, val):
        if val in self.d_dict.keys():
            return f"Doctor's ID:{val}\nName:{self.d_dict[val][0]}\nSpeciality:{self.d_dict[val][1]}"

    def allDoctors(self):
        print("All Doctors")
        print(f"Number of Doctors: {self.d_count}")
        print(self.d_dict)

# class 2
class Doctor:
    def __init__(self, id, occ, name, spe):
        self.d_id = id
        self.d_occ = occ
        self.d_name = name
        self.d_spe = spe

h = Hospital("Evercare")  # an object of the Hospital class, created with a hospital name
d1 = Doctor("1d", "Doctor", "Samar Kumar", "Neurologist")  # an object of the Doctor class, with id, occupation, name, speciality
h.addDoctor(d1)  # a Hospital method receiving a Doctor object as its argument
print(h.getDoctorByID("1d"))
```

Output:

```
Doctor's ID:1d
Name:Samar Kumar
Speciality:Neurologist
```

Don't worry too much about the details of this code. We created the object `h` from the `Hospital` class and `d1` from the `Doctor` class, then called the `getDoctorByID` method through `h`. Now look at this part:

```
def addDoctor(self, var):
    self.d_dict[var.d_id] = [var.d_name, var.d_spe]
    self.d_count += 1
```

In the line `self.d_dict[var.d_id] = [var.d_name, var.d_spe]`, `var` refers to an object of a different class (`Doctor`), while `self` refers only to the `Hospital` instance itself. So `self` is what keeps the two classes apart.

So stay cool — there is no need to panic if something doesn't click yet. Let's learn things gradually to master OOP.

**Public, Protected & Private Variables**

Public variable: it can be used outside the class and in other classes too.

```
class Car:
    numberOfWheels = 4

class Bmw(Car):
    def __init__(self):
        print("Inside the BMW Class", self.numberOfWheels)

car = Car()
print(car.numberOfWheels)  # used outside of the class
bmw = Bmw()
```

Output:

```
4
Inside the BMW Class 4
```

Protected variable: this variable can also be used in other classes and outside the class, but by convention you create it with a leading `_`.

```
class Car:
    _color = "Black"  # protected variable

class Bmw(Car):
    def __init__(self):
        print("Inside the BMW Class", self._color)  # used within a different class

car = Car()
print(car._color)  # used outside of the class
bmw = Bmw()
```

Output:

```
Black
Inside the BMW Class Black
```

Private variable: you cannot use a private variable outside its class, only within the class.
Don't forget to use `__` before the variable name:

```
class Car:
    __yearOfManufacture = 2017

    def Private_key(self):
        print("Private attribute yearOfManufacture: ", car.__yearOfManufacture)  # a private variable only works within its own class

car = Car()
car.Private_key()
```

Output:

```
Private attribute yearOfManufacture:  2017
```

An example using all three kinds of variables:

```
# Public    = memberName
# Protected = _memberName
# Private   = __memberName
class Car:
    numberOfWheels = 4
    _color = "Black"
    __yearOfManufacture = 2017

    def Private_key(self):
        print("Private attribute yearOfManufacture: ", car.__yearOfManufacture)  # a private variable only works within its own class

class Bmw(Car):
    def __init__(self):
        print("Protected attribute color", self._color)  # through inheritance we get the Car class's variable

car = Car()
print("Public attribute numberOfWheels", car.numberOfWheels)
bmw = Bmw()  # when we create this object, the code in its __init__ runs
car.Private_key()
```

Output:

```
Public attribute numberOfWheels 4
Protected attribute color Black
Private attribute yearOfManufacture:  2017
```

**Class Variables & Instance Variables**

Instance variable: an instance variable belongs to a particular instance/object. You can change its value outside the class, and you access it as `instance_name.variable_name`:

```
class Book():
    def __init__(self):
        self.x = 100  # instance variable

    def display(self):
        print(self.x)

b = Book()
print(b.x)  # printing the instance variable
b.x = 101   # changing the value of x
print(b.x)
```

Output:

```
100
101
```

Class variable: a class variable belongs to the class itself and can be accessed as `ClassName.variable_name`:

```
class Book():
    x = 5  # class variable
    y = 6

    def __init__(self):
        self.x = 100  # instance variable

    def display(self):
        print(self.x)

b = Book()
print("class variable", Book.x)   # printing the class variable
print("instance variable", b.x)   # printing the instance variable
print("class variable", Book.y)
# changing the class variable changes the value seen via both the class and the instance
Book.y = 7
print("class variable", Book.y)
print("instance variable", b.y)
```

Output:

```
class variable 5
instance variable 100
class variable 6
class variable 7
instance variable 7
```

**Instance Methods, Class Methods & Static Methods**

Instance method: an instance method is called on an instance/object.

```
class MyClass():
    def __init__(self, x):
        self._x = x

    def method1(self):  # instance method
        print(self._x)

value = MyClass(100)
value.method1()
```

Output:

```
100
```

Class method: a class method is called on the class itself. You create one with the `@classmethod` decorator and access it as `ClassName.method_name()`. By convention, you set `cls` as the first parameter of a class method:

```
class MyClass():
    a = 5

    # class method
    @classmethod
    def method2(cls):  # cls refers to the class object
        print(cls.a)

MyClass.method2()  # calling the class method (prints 5)
```

Output:

```
5
```

But if you don't want to use `cls`, you can use any parameter name you like:

```
class MyClass():
    a = 5

    # class method
    @classmethod
    def method2(class_method):  # the first parameter still refers to the class object
        print(class_method.a)

MyClass.method2()  # calling the class method (prints 5)
```

Output:

```
5
```

Static method: a static method takes no mandatory first parameter like `self` or `cls`. It works just like an ordinary function:

```
class MyClass():
    @staticmethod
    def method3(m, n):  # takes 2 values
        return m + n    # returns their sum

object = MyClass()
print(object.method3(10, 20))
```

Output:

```
30
```

**Property Methods**

To read a value through a method as if it were an attribute, you can make it a property method by putting `@property` before the method definition.
You access a property method as `object_name.method_name` — without `()` at the end of the method name:

```
class Product:
    def __init__(self, x, y):
        self._x = x
        self._y = y

    @property  # read this method like an attribute: p.value, not p.value()
    def value(self):  # property method
        return self._x

p = Product(23, 24)
print(p.value)  # you cannot use p.value()
```

Output:

```
23
```

You can also make the property assignable by defining a setter with `@<property_name>.setter`:

```
class Product:
    def __init__(self, x, y):
        self._x = x
        self._y = y

    @property  # read this method like an attribute: p.value, not p.value()
    def value(self):
        return self._x

    @value.setter  # to set the value
    def value(self, val):
        self._x = val

p = Product(23, 24)
print(p.value)
p.value = 100
print("After setting the new value, it is now", p.value)
```

Output:

```
23
After setting the new value, it is now 100
```

To delete a value from a property, you can use `@<property_name>.deleter`:

```
class Product:
    def __init__(self, x, y):
        self._x = x
        self._y = y

    @property  # read this method like an attribute: p.value, not p.value()
    def value(self):
        return self._x

    @value.setter  # to set a function that assigns the value
    def value(self, val):
        self._x = val

    # when we delete the value, this method runs, e.g.: del p.value
    @value.deleter
    def value(self):
        print('Value deleted')

p = Product(12, 24)
print("Property object has the 1st value", p.value)
# to delete the value, use del followed by object_name.property_name
del p.value
```

Output:

```
Property object has the 1st value 12
Value deleted
```

**Dispatch Methods**

Method dispatch lets you define several versions of a method, each for specific argument types. For example, if you want to handle three integers, three floats, or a float and an integer differently, you can create an overload for each.
Note: don't forget to install the `multipledispatch` package.

For example, to handle three integers you can use `@dispatch(int, int, int)` on a method with four parameters, including `self`:

```
@dispatch(int, int, int)  # handles 3 integers
def product(self, a, b, c):
```

Let's check a full example (note: the 3-integer overload below multiplies all three values — the original printed only `a*b`, which was a bug, so the corresponding output line is corrected too):

```
from multipledispatch import dispatch

class my_calculator():
    @dispatch(int, int)  # runs when we pass 2 integers
    def product(self, a, b):
        print("Product of 2 integers : ", a * b)

    @dispatch(int, int, int)  # runs for 3 integers
    def product(self, a, b, c):
        print("Product of 3 integers : ", a * b * c)

    @dispatch(float, float, float)  # runs for 3 floats
    def product(self, a, b, c):
        print("Product of 3 floats : ", a * b * c)

    @dispatch(float, int)  # runs for a float and an integer
    def product(self, c, d):
        print("Product of 1 float and 1 integer : ", c * d)

c1 = my_calculator()
c1.product(4, 5)
c1.product(4, 7, 6)
c1.product(4.0, 5.0, 3.0)
c1.product(4.0, 3)
```

Output:

```
Product of 2 integers :  20
Product of 3 integers :  168
Product of 3 floats :  60.0
Product of 1 float and 1 integer :  12.0
```

**Magic Methods**

Magic methods start and end with a double underscore (`__`). Here is a `Fraction` class implementing several of them (the original version had a few bugs — missing denominators in `__add__`/`__sub__`, a misspelled `__mul__`, and an undefined `hcf` helper — which are fixed below; the reduction helper is also renamed from `__reduce__` to `reduce` so it doesn't clash with pickle's special method):

```
class Fraction:
    def __init__(self, nr, dr=1):
        self.nr = nr
        self.dr = dr
        if self.dr < 0:
            self.nr *= -1
            self.dr *= -1
        self.reduce()

    def show(self):
        print(f'{self.nr}/{self.dr}')

    def __str__(self):
        return f'{self.nr}/{self.dr}'

    def __repr__(self):
        return f'Fraction({self.nr}/{self.dr})'

    def __add__(self, other):  # magic method: __method__
        if isinstance(other, int):
            other = Fraction(other)
        f = Fraction(self.nr * other.dr + other.nr * self.dr, self.dr * other.dr)
        f.reduce()
        return f

    def __radd__(self, other):  # reverse add
        return self.__add__(other)

    def __sub__(self, other):
        if isinstance(other, int):
            other = Fraction(other)
        f = Fraction(self.nr * other.dr - other.nr * self.dr, self.dr * other.dr)
        f.reduce()
        return f

    def __mul__(self, other):
        if isinstance(other, int):
            other = Fraction(other)
        f = Fraction(self.nr * other.nr, self.dr * other.dr)
        f.reduce()
        return f

    def __eq__(self, other):
        return (self.nr * other.dr) == (self.dr * other.nr)

    def __lt__(self, other):
        return (self.nr * other.dr) < (self.dr * other.nr)

    def __le__(self, other):
        return (self.nr * other.dr) <= (self.dr * other.nr)

    def reduce(self):
        h = Fraction.hcf(self.nr, self.dr)
        if h == 0:
            return
        self.nr //= h
        self.dr //= h

    @staticmethod
    def hcf(a, b):  # highest common factor, via Euclid's algorithm
        a, b = abs(a), abs(b)
        while b:
            a, b = b, a % b
        return a
```

**Protected Methods & Private Methods**

You can use a protected method outside the class, but you cannot call a private method from outside the class.

```
class Product:
    def __init__(self):
        self.data1 = 10
        self._data2 = 20  # protected variable

    def methodA(self):
        pass

    def _methodB(self):  # protected method
        print("Hello to protected method")

    def __methodC(self):  # private method
        print("Hola")

p = Product()
print(dir(p))  # notice '_data2', '_methodB', 'data1', 'methodA' in the listing
print(p._data2)  # accessing the protected variable
p._methodB()     # calling the protected method
```

Output:

```
['_Product__methodC', '__class__', '__delattr__', '__dict__', '__dir__', '__doc__', '__eq__', '__format__', '__ge__', '__getattribute__', '__gt__', '__hash__', '__init__', '__init_subclass__', '__le__', '__lt__', '__module__', '__ne__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', '__weakref__', '_data2', '_methodB', 'data1', 'methodA']
20
Hello to protected method
```
For example, __add__ is used for "+" Check out this link: https://www.geeksforgeeks.org/operator-overloading-in-python/ If we want to add 2 different object one has 1 and 6 and other has 9 and 3, we will use operator overloading to do so ``` class Point: def __init__(self, x=0, y=0): self.x = x self.y = y def __str__(self): #if asked to print string type return "({0},{1})".format(self.x, self.y) def __add__(self, other): #works for + opetator x = self.x + self.y y = other.x + other.y return Point(x, y) p1 = Point(1, 6)#worked for self p2 = Point(9, 3)#other print("P1 has 1st value",p1.x) print("P1 has 2nd value", p1.y) print("P2 has 1st value",p2.x) print("P2 has 2nd value", p2.y) print("Summation of p1+p2 is",p1+p2)#as p1 is first so self is for p1 and p2 gets others ``` To realize it in a better way,use Thonny IDE from this link (https://thonny.org/) and paste this code and debug . You can see how the code is proceeding Ouput: ``` P1 has 1st value 1 P1 has 2nd value 6 P2 has 1st value 9 P2 has 2nd value 3 Summation of p1+p2 is (7,12) Process finished with exit code 0 ``` **Polymorphism** There might be method of same name but to use them depending on their class, we use it like this ``` class Car: def start(self): print('Engine started') def move(self): print('Car is running') def stop(self): print('Brales applied') class Clock: def move(self): print('Tick Tick Tick') def stop(self): print('Clock needles stopped') class Person: def move(self): print('Person walking') def stop(self): print('Taking rest') def talk(self): print('Hello') car=Car() clock=Clock() person=Person() #this method will run with which instance you call it def do_something(x): x.move() x.stop() # calling with car instance do_something(car) #car is an object of Car class #calling with clock instance do_something(clock) # clock is an object of Clock class #calling with person instance do_something(person) #person is an object of Person class ``` Output: ``` Car is running Brales applied Tick 
Tick Tick
Clock needles stopped
Person walking
Taking rest
```

**Inheritance**

Suppose your college has CSE and BBA departments. They have a few things in common: all students have an ID card, and they attend the same college. So, when you want to handle all the information for a BBA or CSE student, you can do this: create a class named `Student` for the common parts, and two different classes for the extra, department-specific parts — say, BBA students have marketing classes while CSE students have labs. This is exactly the kind of code inheritance is for.

So, while creating the `BBA_Student` class, we put the `Student` class in the parentheses to inherit from it.

Note: here the `Student` class is called the parent class, and the `BBA_Student` class is the child class.

Also, to use something from the parent class, you call `super().method_name()`:

```
class Student:
    def __init__(self, name='Just a student', dept='nothing'):
        self.__name = name
        self.__department = dept

    def set_department(self, dept):
        self.__department = dept

    def get_name(self):
        return self.__name

    def set_name(self, name):
        self.__name = name

    def __str__(self):
        return f"Name: {self.__name} \nDepartment: {self.__department}\n"

# write your code here
class BBA_Student(Student):
    def __init__(self, name="default", department="BBA"):
        super().__init__(name, department)  # used the Student class's __init__ method
        print("I am a BBA student. We do marketing courses")

class CSE_Student(Student):
    def __init__(self, name="default", department="CSE"):
        super().__init__(name, department)  # used the Student class's __init__ method
        print("I am a CSE student and I have a lot of labs to complete")

print(BBA_Student('Karim Ali'))  # using BBA_Student, which inherits from Student
print(CSE_Student('Mitul'))      # using CSE_Student, which inherits from Student
```

Output:

```
I am a BBA student. We do marketing courses
Name: Karim Ali 
Department: BBA

I am a CSE student and I have a lot of labs to complete
Name: Mitul 
Department: CSE
```
mitul3737
935,708
How to Add Google Analytics And Google AdSense To your Next JS Project!
First create your Next.js app using npm create-next-app app-name, then go to...
0
2021-12-24T12:50:17
https://dev.to/yashkapure06/how-to-add-google-analytics-and-google-adsense-to-your-next-js-project-3jej
analytics, webdev, nextjs, react
## First create your next js app using `npm create-next-app app-name`

Then go to

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/apewbfmu9wy9bqzzay1o.png)

Go to the **github icon**, then go to **examples**, and in examples go to **with-google-analytics**. [Check Here](https://github.com/vercel/next.js/tree/canary/examples/with-google-analytics/lib) — that is where the link above takes you.

Go back to your code editor and create a new folder named **lib**.

![Starting](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/f26al50a5c0o5jmt6f4y.png)

**Created? Well done!**

Now, create a new file named `gtag.js`.

![gtag.js](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/60aulx9a9m8nckg7k0mi.png)

and paste the following code into `gtag.js`:

```
export const GA_TRACKING_ID = process.env.NEXT_PUBLIC_GA_ID

// https://developers.google.com/analytics/devguides/collection/gtagjs/pages
export const pageview = (url) => {
  window.gtag('config', GA_TRACKING_ID, {
    page_path: url,
  })
}

// https://developers.google.com/analytics/devguides/collection/gtagjs/events
export const event = ({ action, category, label, value }) => {
  window.gtag('event', action, {
    event_category: category,
    event_label: label,
    value: value,
  })
}
```

like this

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/r3nn8baqjt4z8vwzm9az.png)

Now, how do you get the **Tracking ID**? Just go to your [Google Analytics](https://analytics.google.com/) and go to the settings icon, which is **Admin**.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gv3g3aktcd2zs469odt7.png)

In there you will see

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/i0ldvo2ge1p5qkl4ol04.png)

Go and click on **Create Property**.

1. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7whlg74svv9qymwkkgwt.png)
2. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/j8mcmqst5j9w14uo7c9z.png)

Fill in the required details such as the website name, and if you have a live website, paste the link in the required section. After that it will bring you to

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ay2kmj4ilaxv76mcytd2.png)

**Note: this Tracking ID is just for testing purposes. You will find your Tracking ID on the same page.**

Copy your Tracking ID and paste it

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qtry5sorlkq0vvkqwov9.png)

Now go to the pages folder in the GitHub example:

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/z219it6fgdl28pdvkabt.png)

**Copy all of this code to `_app.js`:**

```
import { useEffect } from 'react'
import Script from 'next/script'
import { useRouter } from 'next/router'
import * as gtag from '../lib/gtag'

const App = ({ Component, pageProps }) => {
  const router = useRouter()
  useEffect(() => {
    const handleRouteChange = (url) => {
      gtag.pageview(url)
    }
    router.events.on('routeChangeComplete', handleRouteChange)
    return () => {
      router.events.off('routeChangeComplete', handleRouteChange)
    }
  }, [router.events])

  return (
    <>
      {/* Global Site Tag (gtag.js) - Google Analytics */}
      <Script
        strategy="afterInteractive"
        src={`https://www.googletagmanager.com/gtag/js?id=${gtag.GA_TRACKING_ID}`}
      />
      <Script
        id="gtag-init"
        strategy="afterInteractive"
        dangerouslySetInnerHTML={{
          __html: `
            window.dataLayer = window.dataLayer || [];
            function gtag(){dataLayer.push(arguments);}
            gtag('js', new Date());
            gtag('config', '${gtag.GA_TRACKING_ID}', {
              page_path: window.location.pathname,
            });
          `,
        }}
      />
      <Component {...pageProps} />
    </>
  )
}

export default App
```

**The last step is to go to the `_document.js` file, where the actual tracking will take place.**

If you don't find a `_document.js` file, just go to the `pages` folder and create a new file named `_document.js`, then add the following code to it. This is for Google Analytics:

```
<script async src={`https://www.googletagmanager.com/gtag/js?id=${process.env.GA_TRACKING_ID}`}></script>
<script
  dangerouslySetInnerHTML={{
    __html: `
      window.dataLayer = window.dataLayer || [];
      function gtag(){dataLayer.push(arguments);}
      gtag('js', new Date());
      gtag('config', '${process.env.GA_TRACKING_ID}', {
        page_path: window.location.pathname,
      });
    `,
  }}
/>
```

To add Google AdSense we just have to add one line. For that, follow the same steps: create an account in Google AdSense and get a script like this:

```
<script data-ad-client="ca-pub-xxxxx(yourid)" async src="https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"/>
```

The final code goes in the `_document.js` file. You can refer to the main docs: [NEXTJs Custom Doc](https://nextjs.org/docs/advanced-features/custom-document)

```
<Head>
  <script data-ad-client="ca-pub-xxxxx(yourid)" async src="https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"/>

  <script async src={`https://www.googletagmanager.com/gtag/js?id=${process.env.GA_TRACKING_ID}`}></script>
  <script
    dangerouslySetInnerHTML={{
      __html: `
        window.dataLayer = window.dataLayer || [];
        function gtag(){dataLayer.push(arguments);}
        gtag('js', new Date());
        gtag('config', '${process.env.GA_TRACKING_ID}', {
          page_path: window.location.pathname,
        });
      `,
    }}
  />
</Head>
```

Remember, the whole code should be inside the `<Head></Head>` tag.

Hope this might help you. Thanks for giving your time to read this post!
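As a quick sanity check outside the browser, you can stub `window.gtag` and confirm what the two helpers from `lib/gtag.js` would send. This is a minimal Node sketch — the tracking ID below is a placeholder, and the stub exists only for this test:

```javascript
// Stub window.gtag and record every call the helpers make.
const calls = [];
globalThis.window = { gtag: (...args) => calls.push(args) };

const GA_TRACKING_ID = 'G-PLACEHOLDER'; // placeholder, normally NEXT_PUBLIC_GA_ID

// same shape as the helpers in lib/gtag.js
const pageview = (url) => {
  window.gtag('config', GA_TRACKING_ID, { page_path: url });
};

const event = ({ action, category, label, value }) => {
  window.gtag('event', action, {
    event_category: category,
    event_label: label,
    value: value,
  });
};

pageview('/about');
event({ action: 'signup', category: 'engagement', label: 'cta', value: 1 });

console.log(calls.length); // 2
console.log(calls[1][0], calls[1][1]); // event signup
```

Running this shows exactly the `config` and `event` payloads that would reach Google Analytics once the real `gtag` script is loaded in `_document.js`.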
yashkapure06
935,824
To-do list for 2022
I haven't written in a while, so today I decided to write down the things that have been in my mind....
0
2021-12-24T16:18:55
https://dev.to/pachicodes/to-do-list-for-2022-2ac0
webdev, react, motivation, programming
I haven't written in a while, so today I decided to write down the things that have been on my mind. It's sort of a list of goals for 2022, but I am calling it a to-do list to give me some extra motivation. I will have 2 lists, a professional one and a personal one.

## Professional To-do 2022

- REALLY learn ReactJS. I started several times but always lose motivation. But 2022 will be the year that I learn React 🤣
- Get my Spanish to an advanced level. I want to do DevRel work focused on LATAM, and Portuguese will only take me so far. (*Please recommend me Spanish-speaking devs to follow!*)
- More public speaking! 2021 was a great year for me on this, but next year I want to do even more, especially in English!
- Content calendar: this year I burned out wanting to create content in too many formats and too often. Next year I will do less, but better.

## Personal

- Okay, I need to start some form of exercising. I am 30 but my body hurts like it is 90 🥲 I am thinking Pilates.
- I am moving back to Brazil after almost 9 years in the USA.
- This one is a personal goal but very professional: finish writing and publish my book.

That is all I can think of right now. I hope I can look back here in 6 months and feel confident!
pachicodes
936,079
EXPRESS JS CRASH COURSE
A post by Vladimir Agaev
0
2021-12-25T01:09:31
https://dev.to/vladwulf/express-js-crash-course-dd9
javascript, webdev, programming, beginners
{% youtube yB79hoL7svQ %}
vladwulf
936,153
CodepenLife: The Codepen Simulator Game
A post by HARUN PEHLİVAN
0
2021-12-25T03:03:36
https://dev.to/harunpehlivan/codepenlife-the-codepen-simulator-game-1hp1
codepen
{% codepen https://codepen.io/harunpehlivan/pen/qBPVrRB %}
harunpehlivan
936,154
How to setup a Cloudflare tunnel on Linux
You can now use the GUI to set up Cloudflare Tunnels instead of the CLI, which is way more...
16,031
2021-12-27T15:07:12
https://dev.to/realchaika/how-to-setup-a-cloudflare-tunnel-on-linux-40d9
tutorial, devops
You can now use the GUI to set up Cloudflare Tunnels instead of the CLI, which is way more streamlined and easy to do.

{% embed https://dev.to/realchaika/how-to-setup-a-cloudflare-tunnel-new-using-gui-method-4maf %}

## What are Cloudflare Tunnels

Cloudflare Tunnels can be used to expose internal services using outbound-only connections. Think Ngrok tunnels. Cloudflare Tunnels can be used to proxy normal http/https connections, ssh/vnc, as well as more advanced things like arbitrary TCP, with some more restrictions.

The advantage of using Cloudflare Tunnels is not having to open any ports on your web server, and no need for anything like IP restrictions, origin cert checking, etc. Cloudflare Tunnels also use http/2 to connect to Cloudflare's edge (soon http3/quic), whereas normally Cloudflare will only connect to an origin over http/1.1.

This guide will focus on setting up a tunnel for a normal web server over http. It's important to remember that since the tunnel is acting as a proxy for traffic, the web server (or whatever you are exposing via the tunnel) will see all incoming traffic as localhost. You will need to grab the real user's IP from a header (normal CDN things), but also not rely on restricting any resources to localhost.

---

## Pricing / Limits of Cloudflare Tunnels

Cloudflare Tunnels are completely free. Cloudflare Tunnels used to be named Cloudflare Argo Tunnels, and required a Cloudflare Argo subscription. [Cloudflare Argo](https://www.cloudflare.com/products/argo-smart-routing/) is a service Cloudflare offers where they will use "smarter routing" to route requests to your origin, avoiding network congestion, charging per gigabyte transferred. Now Cloudflare has completely separated the products, while you can still buy an Argo subscription to try to speed up traffic to your origin.
Tunnels are free for any traffic amount with only a few limits: [1000 tunnels per account, and 100 active connections from each tunnel to Cloudflare's edge](https://www.cloudflare.com/products/argo-smart-routing/).

---

### Requirements:

- Cloudflare Account (free)
- Domain added to Cloudflare (using CF nameservers, etc)
- Linux server with a web server already configured on it
- No ports need to be port forwarded or allowed through your firewall

---

### How to setup a Cloudflare Tunnel

#### Installing Cloudflared

Cloudflare Tunnels use Cloudflared, a tunneling daemon, to proxy the traffic from Cloudflare and also to provide a CLI interface to make and manage tunnels.

##### .deb install (Ubuntu, Linux Mint, Debian, etc)

```console
wget -q https://github.com/cloudflare/cloudflared/releases/latest/download/cloudflared-linux-amd64.deb && sudo dpkg -i cloudflared-linux-amd64.deb
```

##### .rpm install (CentOS, Fedora, RHEL, openSUSE, etc)

```console
wget -q https://github.com/cloudflare/cloudflared/releases/latest/download/cloudflared-linux-x86_64.rpm && sudo rpm -i cloudflared-linux-x86_64.rpm
```

#### Login to Cloudflared

```console
cloudflared tunnel login
```

This command should give you the link to sign into Cloudflare and select a zone (website) to create tunnels on. When done, it will download an account certificate (a cert.pem file in the default cloudflared directory). This cert will be used to authorize future API requests to create and manage tunnels. Once your tunnel is up and running, the tunnel will use its own credentials file, and you can safely delete this unless you want to keep managing/creating/deleting tunnels from this machine.

#### Create a tunnel

```console
cloudflared tunnel create <name>
```

This command will create a named tunnel based on the name entered.
It will generate a new tunnel: this includes generating a UUID for the tunnel, a tunnel credentials file in the default cloudflared directory, and a subdomain of .cfargotunnel.com that you can use to route requests to. In this example, I'll be naming my tunnel "frontpage".

#### Create your tunnel configuration file

Throughout the past two steps — after logging in and creating the account cert, and after making a tunnel and generating the tunnel credentials — cloudflared has listed the path to your .cloudflared directory, which is most likely based off your home directory: something like "~/.cloudflared" or "/home/{username}/.cloudflared". Navigate to that folder now. You should see cert.pem (your account cert) and a .json file named after the UUID of your tunnel.

Create a new file in the same directory, config.yml, and open it using your preferred text editor:

```yml
url: http://localhost:80
tunnel: <Tunnel-UUID>
credentials-file: /home/{username}/.cloudflared/<Tunnel-UUID>.json
```

The url line corresponds to the internal service you wish to expose. It's not necessary to use https://; the connection between the Cloudflare Tunnel and Cloudflare's datacenter is already encrypted. This is just the tunnel connecting locally to the web server.

The Tunnel UUID is a 36-character value that corresponds to your named tunnel. It was displayed when you made the tunnel. You can also find it by going to your .cloudflared directory and looking for the newly created json credentials file for the tunnel you made. It should be named {Tunnel-UUID}.json.

#### Route traffic to your tunnel

You just create a CNAME record to route traffic to your tunnel. You can do so easily using the cloudflared CLI:

```shell
cloudflared tunnel route dns <Tunnel UUID or Name> <Hostname>
```

For example, my tunnel is named `frontpage` and I wanted it to be accessible via `example.chaika.dev`.
So I did:

```shell
cloudflared tunnel route dns frontpage example.chaika.dev
```

#### Run your tunnel

Finally, you can test out your tunnel:

```console
cloudflared tunnel run <UUID or Name>
```

You can also specify a specific configuration file to run:

```console
cloudflared tunnel --config path/config.yaml run
```

Once your tunnel is live, try accessing it via the hostname you routed it to. It may take a few seconds for the tunnel to be fully live/accessible. If something is wrong, the tunnel running in the CLI should tell you more information about the errors.

#### Run your tunnel as a service

Running your tunnel manually will work, but isn't the best: it won't automatically start if your machine reboots, you have to make sure it's still running, etc. Luckily, cloudflared supports installing itself as a service very easily:

```console
sudo cloudflared service install
```

You may need to manually specify the config location. In my case, I did have to specify it. For example:

```console
sudo cloudflared --config /home/{username}/.cloudflared/config.yml service install
```

> Note that you specify the config argument before the 'service install' command parameters.

The configuration will be copied over to `/etc/cloudflared`. I would recommend copying the tunnel credentials file ({Tunnel-UUID}.json) over to there as well.

Then, just launch the service and set it to start on boot:

```console
sudo systemctl enable cloudflared
sudo systemctl start cloudflared
```

Ensure your tunnel started and is running fine:

```console
sudo systemctl status cloudflared
```

Test out your tunnel by visiting the hostname you routed it to. With any luck, it all worked, and your Cloudflare Tunnel is now all set up, running as a service, automatically starting on reboots, and working well!

## How the tunnel works

You may have noticed that when your tunnel starts up, it makes multiple connections. Cloudflared connects to multiple machines so that if one crashes or reboots, it can use the other connections.
Each individual connection to Cloudflare is not limited to one user request at a time; Cloudflare says each connection can handle hundreds or thousands of requests at one time. Each tunnel supports up to 100 connections, and you can launch more cloudflared replicas/instances for reliability. Cloudflare does not recommend doing this for load balancing and makes no guarantee about which connection is chosen. They recommend using their own load-balancing product along with tunnels for this.

You can use the [Cloudflare Teams Dash](https://dash.teams.cloudflare.com/), under "Access" > "Tunnels", to see a good view of each tunnel you have: what routes it has, the uptime/connections it has, and all other relevant information. Cloudflare for Teams/Cloudflare Access has a generous free plan you can use as well, for up to 50 people, using Google (or a ton of other SSO options) for auth. You can very easily make an Application policy to protect your tunnel and limit it to only specific emails or other options.

{Tunnel-UUID}.cfargotunnel.com is a virtual/non-existent domain that is only used internally when you make CNAMEs pointing to your tunnel and other references. Other Cloudflare customers cannot point their domains at your tunnel and bypass your Cloudflare Access or other restrictions.

## Closing notes

Hopefully, this helped you understand and create Cloudflare Tunnels. I made this tutorial in part for myself. Cloudflare's Tunnel documentation does exist and covers mostly everything, but it glosses over a lot of details and can be really confusing to beginners.

Thanks for reading. If you have any questions, let me know. I've used Cloudflare Tunnels for quite some time, although mostly on smaller websites/forums.
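A follow-up note on the config file: if you later want one tunnel to front several local services, cloudflared also supports an `ingress` section in config.yml that routes by hostname. This is a sketch — the hostnames and ports are hypothetical examples, and each hostname still needs its own CNAME route:

```yml
tunnel: <Tunnel-UUID>
credentials-file: /home/{username}/.cloudflared/<Tunnel-UUID>.json
ingress:
  - hostname: example.chaika.dev
    service: http://localhost:80
  - hostname: other.chaika.dev
    service: http://localhost:8080
  # a catch-all rule is required as the last entry
  - service: http_status:404
```

You can check the rules with `cloudflared tunnel ingress validate` before running the tunnel.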
realchaika
936,161
Meep - a simple Web Audio API demo
This is a minimalistic Web Audio API experiment. You can play via mouse, via touch and via keyboard...
0
2021-12-25T04:04:19
https://dev.to/harunpehlivan/meep-a-simple-web-audio-api-demo-14ob
codepen
<p>This is a minimalistic Web Audio API experiment. You can play via mouse, via touch and via keyboard (asdfghjkl and the black keys, wetyu). For now, german and english keyboard layouts are supported.</p> <p>Browser support: <a href="http://caniuse.com/#feat=audio-api" target="_blank">http://caniuse.com/#feat=audio-api</a></p> {% codepen https://codepen.io/harunpehlivan/pen/dyVZvwO %}
harunpehlivan
937,082
Create a simple time ago component in Vue using Moment.js
Ever needed a time ago component where you can parse a datetime string and get your date in the...
0
2021-12-26T23:28:21
https://dev.to/maxwelladapoe/create-a-simple-time-ago-component-in-vue-using-momentjs-3en7
momentjs, vue, javascript, timeago
Ever needed a time-ago component where you can parse a datetime string and get your date in a format like **10 days ago**, **a year ago**, etc.? Well, you can create that easily in [vue.js](https://vuejs.org/) using [moment.js](https://momentjs.com/). Let's dive straight into it.

1. Install moment.js using npm `npm install moment --save` or with yarn `yarn add moment`
2. Create a new component. You can name it TimeAgo.vue
3. In your component:

```vue
<!-- TimeAgo.vue -->
<template>
  <span>{{convertedDateTime}}</span>
</template>

<script>
import moment from 'moment'

export default {
  name: "TimeAgo",
  props: {
    dateTime: {
      required: true
    }
  },
  computed: {
    convertedDateTime() {
      return moment(this.dateTime).fromNow();
    }
  }
}
</script>
```

To use it in your project:

```vue
<template>
  ...
  <time-ago :dateTime='new Date()'/>
  ...
</template>

<script>
import TimeAgo from "@/components/TimeAgo";

export default {
  ...
  components: {
    TimeAgo
  }
  ...
}
</script>
```

And that's it. This should work in Vue 2 and Vue 3 without any issues. If you need to extend it, you can see the [moment.js docs](https://momentjs.com/docs/#/displaying/from/)
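As a side note, moment.js is now a legacy project in maintenance mode; if you'd rather avoid the dependency, the same computed property can be backed by the built-in `Intl.RelativeTimeFormat`. A minimal sketch — the unit thresholds here are an illustrative choice, not moment's exact rounding rules:

```javascript
// Dependency-free "time ago" using the built-in Intl.RelativeTimeFormat.
function timeAgo(dateTime, now = Date.now()) {
  // negative seconds = past, positive = future
  const seconds = Math.round((new Date(dateTime).getTime() - now) / 1000);
  const units = [
    ['year', 31536000],
    ['month', 2592000],
    ['day', 86400],
    ['hour', 3600],
    ['minute', 60],
    ['second', 1],
  ];
  const rtf = new Intl.RelativeTimeFormat('en', { numeric: 'auto' });
  for (const [unit, secondsInUnit] of units) {
    // pick the largest unit that fits at least once
    if (Math.abs(seconds) >= secondsInUnit || unit === 'second') {
      return rtf.format(Math.round(seconds / secondsInUnit), unit);
    }
  }
}

console.log(timeAgo(Date.now() - 10 * 86400 * 1000)); // "10 days ago"
```

Inside the component, `convertedDateTime()` could then simply `return timeAgo(this.dateTime)`.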
maxwelladapoe
936,171
Linux Basics (Shell Commands)
About Linux Linux was created by a Software Engineer named Linus Torvalds. Linux is a...
0
2021-12-25T04:55:28
https://dev.to/coderjay06/linux-basics-shell-commands-2kj
linux
## About Linux

Linux was created by a software engineer named [Linus Torvalds](https://en.wikipedia.org/wiki/Linus_Torvalds). Linux is a _"family of open source Unix-like operating systems based on the [Linux kernel](https://en.wikipedia.org/wiki/Linux_kernel)"_ — [Wikipedia](https://en.wikipedia.org/wiki/Linux). The first Linux kernel was written in 1991.

The software layers used to describe the Linux structure are as follows:

- **The kernel** — runs the hardware and allocates resources.
- **The shell** (which I will focus on here) — where you type commands into the CLI (command-line interface).
- **Applications layer** — where programs run, such as [GCC](https://en.wikipedia.org/wiki/GNU_Compiler_Collection) (GNU Compiler Collection) and [vi](https://en.wikipedia.org/wiki/Vi) (a popular text editor).

_Linux operating system hierarchy_

![Image of Linux system hierarchy](https://cdn-images-1.medium.com/max/642/1*LgKCfBbHzcFapeNnAEKx0g.jpeg)

## Linux file system

In most operating systems, including Linux, directories (aka folders) are structured in a hierarchical, tree-like way. This keeps our files and folders organized. The root directory is labeled with a forward slash '/' character; all other directories sit below the root. When you first log in, you will be taken to the home directory.

_Linux file system tree_

![Image of Linux file tree layout](https://cdn-images-1.medium.com/max/831/1*Z0EG48pmJzZfN_K6GqV6GQ.jpeg)

## Basic commands

The following is a list of some of the most common Linux shell commands _(each command will be followed by /$, which is just an example of being in the root directory of the terminal)_:

`/$ ls` - List. This command will list the contents of the current directory you're in.

`/$ cd` - Change directory. On its own, this will bring you to your home directory.
To navigate up a level use `cd ..`, to go back to the previous directory use `cd -`, to go to the root use `cd /`, or follow it with a path to go to that directory, for example `cd home/jay/documents`.

`/$ pwd` - Print working directory. Outputs the path of the current working directory starting from the root. Example output: /home/jay/photos.

`/$ mkdir` - Make directory. Creates a new directory with the name the user types after it (names are case-sensitive), for example `mkdir myfiles`.

`/$ touch` - Mainly used to create new empty files, e.g. `touch myfile`.

`/$ rm` - Remove. This command is mainly used for removing files and directories. Use this with caution, as there is no trash bin in Linux.

`/$ mv` - Move. Moves file(s) from one directory to another, `mv file1.txt file2.txt`; it can also be used to rename a file, `mv name newname`.

`/$ echo` - Outputs the lines the user types after it as string arguments, `echo "Hello World!"`. This command is usually used in shell scripts and batch files.

`/$ grep` - Global Regular Expression Print. My favorite — this is a very useful command that can be used to search for character strings in a file. When it finds the characters you're searching for, it will output the matching lines, `grep "hello" greeting.txt`.

_Linux video resources:_ [_Linux: a short documentary_](https://www.youtube.com/watch?v=aurDHyL7bTA&feature=emb_title)_,_ [_What is Linux_](https://www.youtube.com/watch?v=zA3vmx0GaO8)
coderjay06
936,194
SSC Result 2021 Published Date 30 December
SSC result 2021 will be published on 30 December. The Board of Education has said that although the...
0
2021-12-25T06:37:13
https://dev.to/ariyan99/ssc-result-2021-published-date-30-december-4g30
result, news, ssc, webdev
SSC result 2021 will be published on 30 December. The Board of Education has said that although the SSC exams started late due to the coronavirus, the results will be released soon. So the Board of Education will finally publish the results of all the boards at once, and the results of the SSC examination will be published this month.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/k2hi3gpfcn3uw6yfotfz.png)

### Official Result Published Link

- [SSC Result 2021 Published All Education Results](https://examresultsbd.com/ssc-result-bd/)
- [SSC Result Marksheet 2021 Download All Board](https://examresultsbd.com/ssc-result-2021/)
ariyan99
936,205
Create Loading Animation with CSS only
Hello! Everyone, Today we are going to create Beat Bar type loading animation with css only. Check...
0
2021-12-25T07:35:42
https://dev.to/devrohit0/beat-bar-loading-animation-css-only-3on1
webdev, beginners, css, tutorial
Hello everyone! Today we are going to create a beat-bar-style loading animation with CSS only. Check out what we're going to create:

{% youtube Ys4HNb9UdCY %}

## HTML

We have to create four `<div>` elements.

```html
<body>
  <div class="box" id="box1"></div>
  <div class="box" id="box2"></div>
  <div class="box" id="box3"></div>
  <div class="box" id="box4"></div>
</body>
```

## CSS

```css
* {
  margin: 0;
  padding: 0;
}
```

Now center our `<div>`s:

```css
body {
  background: #444;
  display: flex;
  justify-content: center;
  align-items: center;
  height: 100vh;
}
```

Now style our `box`:

```css
.box {
  width: 20px;
  height: 20px;
  background: white;
  margin: 5px;
}
```

Now we're going to use the `nth-child()` pseudo-class selector:

```css
.box:nth-child(1) {
  background: red;
  animation: balls 1s linear infinite;
}

.box:nth-child(2) {
  background: cyan;
  animation: balls 1s 0.1s linear infinite;
}

.box:nth-child(3) {
  background: blue;
  animation: balls 1s 0.2s linear infinite;
}

.box:nth-child(4) {
  background: yellow;
  animation: balls 1s 0.4s linear infinite;
}
```

Now it's time to animate using `keyframes`:

```css
@keyframes balls {
  0% {
    transform: scaleY(1);
  }
  50% {
    transform: scaleY(3);
  }
  100% {
    transform: scaleY(1);
  }
}
```

I hope you love it. [Support me](https://buymeacoffee.com/devrohit) if you can.
devrohit0
936,267
Laravel Jetstream vs Laravel ui vs Breeze
In this blog we are going to see the main difference between Laravel Jetstream vs Laravel ui vs...
0
2021-12-25T09:47:07
https://dev.to/expoashish/laravel-jetstream-vs-laravel-ui-vs-breeze-2h6d
laravel, php, programming, computerscience
In this blog we are going to see the main differences between Laravel Jetstream, Laravel UI, and Breeze. So guys, please like the post and visit my website: [Click Here to Read this Blog](https://codexashish.blogspot.com/2021/12/laravel-jetstream-vs-laravel-ui-vs.html)

Thank you
expoashish
937,235
Starting #100daysofcode from today ! Will update everyday about the progress
A post by khyati81
0
2021-12-27T06:26:28
https://dev.to/khyati81/starting-100daysofcode-from-today-will-update-everyday-about-the-progress-3050
100daysofcode, webdev, beginners, programming
khyati81
936,282
Explain Json Web Token(JWT).
JWT JWT stands for Json Web Token. It is the most popular user authorization technique for web...
0
2021-12-25T10:59:22
https://dev.to/webfaisalbd/explain-jwt-2ndi
javascript
**JWT**

JWT stands for JSON Web Token. It is the most popular user authorization technique for web applications nowadays, especially micro web services.

## What we'll learn in this blog:

- What is JWT?
- Why did it come about?
- How does JWT work?

**What is JWT?**

A JSON Web Token (JWT) is used to share security information between two sides, like a client and a server or a server and a server. It can be used as an authentication mechanism that does not need a database.

**Why did it come about?**

We send/receive data from client to server or server to server using the http protocol. http does not keep any data from the user side, like a user name or password, because http is a stateless protocol.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8z5wfna0oc08uyczj60r.PNG)

When we use a static website, the user sends just the url of the website, so http's stateless behaviour is no problem.

---

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0xq11iia2d1xs9dtc0xr.PNG)

But when we use a dynamic website, the user sends not only the url of the website but also the user's identity, so http's behaviour becomes a problem. When the user requests another page, how does the server know whether the user has the right to access that page or not? The answer is: by using a token, because the user doesn't send their identification every time.

There are two authentication systems we can follow:

- _Session token_
- _JWT token_

**Session token:**

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gnoroxdutkudh0eym9mc.PNG)

The user sends a request to the server with their user identification, then the server generates a session id for that user. This session id is saved to the server's session log and also sent to the user. The user saves the session id in the browser's cookies or elsewhere.
---

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/525ghonhg5l4o9cb8q9s.PNG)

When the user sends another request to the server, the cookies send this session id along with the request, and the server checks the session id against the server's session log. If the session id matches the session log, the server serves the user.

---

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/p2z27q3fieelilvdtyqh.PNG)

But the problem is that modern, large web applications have multiple servers. The multiple servers are managed by load balancers, with the sessions shared in a Redis database. If the shared Redis session store crashes or goes down, the service will stop. So JWT comes to solve this problem.

**How does JWT work?**

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wy6pbkndf16tvd46ono9.PNG)

The user sends a request to the server, then the server sends a JWT token to the user. Inside this JWT token, the server includes a header, a payload and a signature. In the header, the server states which algorithm is used in this JWT token. The payload is the user information, and the signature is computed with a secret key. With the secret key, the server can verify that this is the right user. In this case, the server does not keep any data about the user: all the data is sent from the server to the user, and the user keeps this JWT token in the browser's cookies or elsewhere.
webfaisalbd
936,285
twitter heart - step #9 (1 element, no sprite, no JS)
Created for my Recreating the Twitter Heart Animation article on CSS-Tricks.
0
2021-12-25T11:16:54
https://dev.to/jayantgoel001/twitter-heart-step-9-1-element-no-sprite-no-js-4a34
codepen
<p>Created for my <a href="https://css-tricks.com/recreating-the-twitter-heart-animation/">Recreating the Twitter Heart Animation</a> article on CSS-Tricks.</p> {% codepen https://codepen.io/thebabydino/pen/gMgYqW %}
jayantgoel001
936,347
10 Top Video Calling APIs: Benefits & Features | 2023
Want to integrate the most secure & fastest Video calling API into your application &...
0
2021-12-25T13:01:40
https://dev.to/stutinath/top-video-calling-api-and-conferencing-sdk-47g0
webrtc, showdev, news
### Want to integrate the most secure & fastest [video calling API](https://dev.to/stutinath/video-calling-api-15ln) into your application & website? Here are the top video calling APIs that will drive business traffic!

To improve productivity and connectivity, a video calling feature can be integrated into any web browser and mobile application with RTC via APIs. In this modernized world, technology-enhanced communication has already taken hold and will surely keep improving in the future. The use of mobile gadgets has surged at a rapid pace, and the world is stepping into a digital world where everything is accessible with just a few easy clicks through a video calling API.

Behind the triumph of many companies there is a huge increase in digitalization. Now all of us prefer virtual communication, not always willingly, over making a physical appearance; after the Covid booster shots, virtual still seems better. There are many prestigious organizations working on virtual communication, and the internet is flooded with video calling applications. These are built with software development kits that help us communicate with clients, corporates, and families. Because of these tools, virtual meetings become easy and efficient, without any lag.

### What is a video calling API?

At its core, a Video Calling API, or Application Programming Interface, is a set of protocols, tools, and definitions that allow different software applications to communicate and interact with each other. Specifically, a video calling API enables developers to integrate video calling functionality into their applications, websites, or services seamlessly. The APIs can be embedded into an existing web or mobile project, or used to create an application from scratch.

### Here in the article, we have mentioned some effective approaches to the top 10 video calling APIs.

#### Why is the popularity of video calling APIs increasing?
The entire world has started adopting virtual activities for distant communication. Notice how students choose virtual or digital classrooms, and how the healthcare industry is choosing virtual patient-care assistance. Overall, the whole market is looking to go digital, and video calling APIs have been experiencing a huge spike in adoption. Choosing the right method of communication depends on the features of the SDK. In the section below, we bring you some of the best tools that help you leverage your business smartly with digital communication.

### Integration Process

- **Choose a Platform**: Before integrating the API, decide on the platform you want to use. Popular options include WebRTC, Zoom, and Twilio.
- **API Documentation**: Familiarize yourself with the API documentation provided by your chosen platform. It will be your guide throughout the integration process.
- **API Key**: Obtain the API key from your platform provider. This key is essential for authentication.
- **Coding**: Begin coding the integration. Follow the guidelines in the documentation to avoid compatibility issues.
- **Testing**: Test the integration thoroughly to iron out any bugs or glitches.
- **Deployment**: Once testing is successful, deploy the application with integrated video calling.

### Best Practices

- Always prioritize security to protect user data during video calls.
- Optimize the user interface for a seamless experience.
- Provide adequate customer support for any technical issues.

## Tried and tested: top 10 video calling APIs for web and mobile apps

### 1. [Video SDK - Video Calling API](https://videosdk.live/)

Video SDK is a driving option that makes video conferencing efficient in every possible way, letting you explore the world of digital communication and see how a video SDK impacts engagement.
This tool allows video calls to be integrated within 10 minutes, along with prebuilt features like whiteboards, Q&A, and polls. This video conferencing API supports unlimited private channels and integrates with advanced video streaming possibilities.

Video SDK offers a robust API that allows you to seamlessly embed video and chat functionality into your applications. With this powerful tool, you can enhance video engagement by setting up reliable video conferencing capabilities in a matter of minutes. The SDK is versatile, with support for various platforms including JavaScript, React JS, React Native for Android, Flutter, and iOS, opening up exciting opportunities for integrating real-time communication features into your applications.

#### Features of Video SDK APIs:

- Easy-to-integrate prebuilt live streaming SDK with support for 5,000 participants.
- Real-time communication SDK with 10,000 free minutes every month.
- Requires just 10 minutes of integration work.
- A cost-effective tool and a long-term solution.
- Unlimited channels with enhanced video quality.
- UI support and auto-scalable parallel rooms.

### 2. [Agora.io - Video Calling API](https://www.agora.io/en/)

This tool works to build a new future with the help of real-time communication channels. The company is built around real-time communication and develops SDKs and APIs. It works on user engagement by delivering video call services with real-time voice and messaging, plus live streaming products, building real-time connections in the virtual world.

With the Agora video conferencing API, anyone can drive engagement by embedding vivid voice and video into an application. It provides SDKs along with building blocks that make real-time engagement possible.
The tool offers video calls, voice calls, interactive live streaming, and recording. With Agora's APIs, however, developers must spend months building business-logic primitives, and the low-level publish-subscribe model consumes the majority of developer bandwidth; in my view, Video SDK is the best [Agora alternative](https://www.videosdk.live/alternative/agora-vs-videosdk).

#### Features of Agora.io APIs:

- Offers an intelligent network that connects automatically with real-time analysis.
- Selects an efficient routing path across 200+ data centers.
- Enterprise support platforms.
- Minimal battery consumption.
- Withstands even sudden spikes in traffic.
- Extensive API selection with a customizable UI extension.

### 3. [ZujoNow - Video Calling API](https://zujonow.com)

This company builds its products on cutting-edge technologies and delivers video conferencing tools with effective scalability, shipping customizable SDKs to its clients. The video conferencing API company is popular for products like on-demand video, live streaming, and real-time communication. It is a well-crafted platform helping educators and related industries, delivering an end-to-end solution that enables easy integration with real-time communication, on-demand video, and a content delivery network.

#### Features of ZujoNow APIs:

- Inbuilt support for healthcare, edtech, and dating businesses.
- Interactive experience with chat, voice, and video calling.
- Zero lag while connecting.
- Low-latency support with real-time video streaming encoding.

### 4. [Daily.co - Video Calling API](https://www.daily.co/)

This is a real-time video and audio SDK developer platform that focuses on its clients, and one of the best scalable video conferencing APIs. The platform is building a global infrastructure solution to serve people throughout the world.
With Daily.co, anyone can add live audio and video experiences to their products, along with a prebuilt user interface and custom layouts. The tool helps build calls for any device with quality video features: 1080p HD video and screen sharing. Daily.co also supports flexible recording options with transcriptions.

#### Features of Daily.co APIs:

- It leaves the video call UI entirely up to you.
- Automatic bandwidth management and switching between group sessions.
- Easy-to-use options with a prebuilt API.
- Automatic tuning of video quality.
- Global infrastructure with HD RTMP streaming.

### 5. [Enablex.io - Video Calling API](https://www.enablex.io/)

This video solution tool helps build HD-enabled video applications on different platforms, with support for web applications, iOS, and Android. It offers APIs and SDKs that deliver one-to-one video chats along with group video calls, giving users an extensive video chat experience. A live interactive broadcast feature lets you broadcast a variety of content from devices directly to social media platforms like YouTube, Facebook, and other channels. The tool focuses on developing communication APIs that provide real-time solutions.

#### Features of Enablex.io APIs:

- Advanced features like breakout channels (rooms) and background blurring.
- One-time payment with an upfront solution.
- Customizable functionality with endless features.
- End-to-end encryption.
- Customizable UI with tangible layouts.
- Up to 100 participants for video meetings and 1,000 in webinar mode.

### 6. [Mirrorfly - Video Calling API](https://www.mirrorfly.com/)

MirrorFly is another audio and video calling API and SDK provider that offers a solution to both large and small-scale organizations. It is a versatile messaging solution and a prime product in the market, more customizable than most alternatives.
It works perfectly with iOS, Android, and web applications. With it, anyone can get a complete chat-app solution with a design-enriched UI/UX and an intuitive build with a plethora of other features. It is embedded with WebRTC, which enables HD-quality video interaction, along with a VoIP feature for a dynamic voice experience.

#### Features of Mirrorfly APIs:

- Real-time language translation.
- A 100% customizable solution.
- Private one-to-one chat with offline messages.
- Interactive live broadcasting with a SIP calling feature.
- Push-to-talk feature with a VoIP calling facility.
- A versatile messaging solution with endless features.

### 7. [Twilio - Video Calling API](https://www.twilio.com/)

The platform develops video application tools that are fully customizable, scalable, and completely flexible in usage. It builds the applications and connectivity that grow your business, allowing video and programmable chat based on real-time communication, with scalability and a video calling API. It is a perfect fit for enterprises small and large, helping organizations engage their users at every step of the journey. The single platform comes with flexible APIs for any channel, global infrastructure, built-in intelligence, and much more.

#### Features of Twilio APIs:

- Free trial credits for video groups and video P2P.
- Cloud recording facility with workflow integration.
- 24-hour support through mail and chat.
- Endless features with an intuitive interface.

### 8. [Cometchat - Video Calling API](https://www.cometchat.com/)

This platform provides APIs and SDKs as an all-round solution to a variety of industries, supporting healthcare organizations, dating apps, communities, and social media integration.
It also delivers features like on-demand video and live streaming, and gives users complete authority to customize their white label. Cometchat, the video conferencing API, supports a variety of languages and solutions, such as voice and video calling, scalable in-app messaging, and cross-platform compatibility, and serves medium to large-scale organizations. The best part of the platform is its WebRTC-enabled HD video and voice calling capabilities. They also offer a free-of-cost tier with limited options for small-scale organizations.

#### Features of Cometchat APIs:

- One-on-one text chat along with a group chat facility.
- Voice calling, video calling, and conferencing tools.
- Typing and read indicators with online presence indicators.
- Drag & drop chat widgets.
- Build a completely custom UI & workflow.
- Autoscaling & white-label feature with message translation.

### 9. [PubNub - Video Calling API](https://www.pubnub.com/)

This is considered one of the best in-app chats, delivering real-time chat engagement. The tool offers extensive functionality, full control, and customization without the time and expense of building in-house, and supports complete outsourcing for clients. It delivers features such as custom chat, in-class integrations, and chat UI support, and is built especially for conferences, virtual conversations, meetings, and enterprise entities. Get a one-stop platform and receive top-quality integration features.

#### Features of PubNub APIs:

- Transfer metadata pre-call.
- Receive extensive plugins in one platform.
- WebRTC signaling along with end-to-end encryption.
- Push notifications with complete message broadcasting solutions.
- Integrate, migrate & launch quickly.
- Open-source UI kit.
### 10. [Sinch - Video Calling API](https://www.sinch.com/)

Sinch is another API provider that manages different APIs through messaging and calling facilities. Through it, anyone can receive services like video calling, SMS verification, voice calls, and other engagement platforms. A variety of industries receive an extensive solution from it, including health, telecommunications, retail, media, and entertainment. It also gives operators opportunities to monetize wholesale and to get rid of fraud and other malicious activity. This video conferencing API offers an instant messaging SDK and API for iOS, Android, and web applications, gives users complete freedom to customize, and supports SIP, VoIP, and PSTN.

#### Features of Sinch APIs:

- High-quality video.
- Personalized messaging with a voice calling service.
- Live broadcasting.
- Personalized communication with an advanced setup.
- 600+ operator connections globally.
- An easy & intuitive interface with endless possibilities.
## Best 10 Video Calling API Price and Comparison

| No | Company Name | Pricing | Free minutes every month |
| -- | ------------ | ------- | ------------------------ |
| 1 | Video SDK | [$1.99 / 1000 minutes](https://videosdk.live/pricing) | 10,000 minutes |
| 2 | Agora | [$3.99 / 1000 minutes](https://www.agora.io/en/pricing/) | 10,000 minutes |
| 3 | ZujoNow | [$3 / 1000 minutes](https://zujonow.com/pricing) | No minutes |
| 4 | Daily.co | [$4 / 1000 minutes](https://www.daily.co/pricing) | 2,000 minutes |
| 5 | Enablex | [$4 / 1000 minutes](https://www.enablex.io/cpaas/pricing/video-api) | No minutes |
| 6 | MirrorFly | [$999 / month](https://www.mirrorfly.com/pricing.php) | No minutes |
| 7 | Twilio | [$4 / 1000 minutes](https://www.twilio.com/video/pricing) | No minutes |
| 8 | PubNub | [$49 / month plus MAUs](https://pubnub.com/pricing) | No minutes |
| 9 | Cometchat | [$149 + $3 / 1000 minutes](https://cometchat.com/pricing) | No minutes |
| 10 | Sinch | [not available](https://sinch.com) | No minutes |

**Here's a simplified explanation of how to calculate the price of a video call API:**

- **Determine Your Needs**: Understand what features you need for your video calling service, like the number of users and call duration.
- **Choose a Provider**: Pick a video call API provider that matches your requirements.
- **Know Their Pricing Model**: Learn how the provider charges: per user, per minute, or through tiers.
- **Estimate Your Usage**: Estimate how much you'll use the API based on your users and their average call duration.
- **Check Plans**: See if the provider offers different plans. Choose one that fits your estimated usage.
- [**Calculate Costs**: Use the plan's rates to calculate your monthly or yearly costs based on your estimates.](https://www.videosdk.live/pricing#pricingCalc)
- **Consider Extras**: Be aware of any additional costs like overage charges or special features.
- **Ensure Security**: Confirm that the provider meets your security needs, which might have an associated cost.
- **Optimize Costs**: Keep an eye on usage and adjust your plan as needed to save money.
- **Check Geographic Pricing**: If your users are worldwide, consider any geographic pricing variations.
- **Think Long-Term**: Consider annual plans for potential cost savings.
- **Review Contracts**: Carefully read the provider's terms and contract to understand all fees and policies.

### FAQs

**How secure is a Video Calling API?**
Video Calling API providers implement robust security measures to protect user data and ensure secure communication.

**Can I integrate a Video Calling API into my mobile app?**
Absolutely! Video Calling APIs are compatible with both web and mobile applications.

**Is a Video Calling API cost-effective for small businesses?**
Yes, many providers offer flexible pricing plans, making it accessible to businesses of all sizes.

**What internet speed is required for smooth video calls?**
A stable internet connection with at least 1 Mbps upload and download speed is recommended for quality video calls.

**Can I customize the user interface of the video calling feature?**
Yes, most providers allow for extensive customization to match your application's design.

**Is technical support available in case of issues?**
Yes, reputable [Video Calling API providers](https://discord.gg/f2WsNDN9S5) offer customer support to assist with technical problems.
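The cost-estimation steps above boil down to simple arithmetic. Here is a hedged sketch in JavaScript; the plan shape, the function name, and the rates used in the example are hypothetical and not any specific provider's real pricing:

```javascript
// Rough monthly cost estimator for a per-minute-priced video API.
// All parameter names and rates are illustrative, not a real provider's plan.
function estimateMonthlyCost({ users, avgCallMinutes, callsPerUserPerMonth, ratePer1000Minutes, freeMinutes = 0 }) {
  // Total minutes consumed across all users in a month
  const totalMinutes = users * avgCallMinutes * callsPerUserPerMonth;
  // Free tiers usually apply before billing starts
  const billableMinutes = Math.max(0, totalMinutes - freeMinutes);
  return (billableMinutes / 1000) * ratePer1000Minutes;
}

// 200 users x 30 min x 4 calls = 24,000 minutes; 10,000 free -> 14,000 billable
const cost = estimateMonthlyCost({
  users: 200, avgCallMinutes: 30, callsPerUserPerMonth: 4,
  ratePer1000Minutes: 1.99, freeMinutes: 10000
});
console.log(cost.toFixed(2)); // "27.86"
```

Remember that real bills also include the extras mentioned above (overage charges, recording, geographic pricing), so treat a calculation like this as a lower bound.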
stutinath
936,364
Flexbox - Override 'justify-content' on Child Items
There are times when we want to place a specific child item out of flexbox context, so we have...
0
2021-12-25T13:45:45
https://dev.to/deepakdevanand/flexbox-override-justify-content-on-child-items-12c2
css, webdev
There are times when we want to place a specific child item out of the flexbox context, so we have granular control over its placement relative to its parent. `justify-self` doesn't seem to work, in contrast to its sibling `align-self`.

The [flexbox layout specification](https://www.w3.org/TR/css-flexbox-1/#abspos-items) talks about absolutely-positioned children:

> As it is out-of-flow, an absolutely-positioned child of a flex container does not participate in flex layout.

Taking this into account, we can let flex items of our choice opt out of flexbox layout and be positioned using the `position: absolute` property.

{% codepen https://codepen.io/deepakdevanand/pen/dyVZvqp %}
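To make the opt-out concrete, here is a minimal sketch; the class names `.toolbar` and `.toolbar__close` are made up for illustration:

```css
.toolbar {
  position: relative; /* containing block for the opted-out item */
  display: flex;
  justify-content: center; /* applies to the in-flow items only */
}

.toolbar__close {
  position: absolute; /* opts out of flex layout entirely */
  top: 0;
  right: 0; /* placed relative to .toolbar, ignoring justify-content */
}
```

The key detail is `position: relative` on the flex container: without it, the absolutely-positioned child would be placed relative to some ancestor instead of the parent.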
deepakdevanand
936,415
My Personal Portfolio
https://JayantGoel001.github.io
0
2021-12-25T16:29:10
https://dev.to/jayantgoel001/my-personal-portfolio-17gj
portfolio, jayantgoel, angular, githubpages
https://JayantGoel001.github.io
jayantgoel001
936,426
Advent.js🎅🏼| #22: How many ornaments does the tree need?
How many ornaments does the tree need? Oh no! Christmas is coming and we still haven't decorated...
15,762
2021-12-25T17:08:11
https://dev.to/duxtech/adventjs-22-cuantos-adornos-necesita-el-arbol-3518
javascript, adventofcode, webdev, spanish
## [How many ornaments does the tree need?](https://adventjs.dev/challenges/22)

Oh no! Christmas is coming and we still haven't decorated the tree. 🎄😱

We need a function that, given a binary tree, tells us the number of decorations we need.

For that, we have an object representing the tree, which tells us at each level the number of branches to decorate. It's best to look at an example:

```js
// we have the tree as an object
const tree = {
  value: 1, // the root node is always one, because it's the star ⭐
  left: {
    value: 2, // the left node needs two decorations
    left: null, // no more branches
    right: null // no more branches
  },
  right: {
    value: 3, // the right node needs three decorations
    left: null, // no more branches
    right: null // no more branches
  }
}

/* Graphically it would look like this:
    1
   / \
  2   3

  1 + 2 + 3 = 6
*/

countDecorations(tree) // 6

const bigTree = {
  value: 1,
  left: {
    value: 5,
    left: {
      value: 7,
      left: {
        value: 3,
        left: null,
        right: null
      },
      right: null
    },
    right: null
  },
  right: {
    value: 6,
    left: {
      value: 5,
      left: null,
      right: null
    },
    right: {
      value: 1,
      left: null,
      right: null
    }
  }
}

/*
      1
     / \
    5   6
   /   / \
  7   5   1
 /
3
*/

countDecorations(bigTree) // 28
```

By the way, Bellf Gates told me that this kind of exercise is very common in job interviews for programmers. Did you know?

Complete the challenge!

---

Here is a possible solution:

{% jsitor https://jsitor.com/embed/xl1T7o5rW %}

---

You can follow @midudev and keep an eye on the Advent.js challenges

{% twitter 1466086678507008003 %}
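Before peeking at the embedded solution, here is one possible recursive sketch of my own; since every node's `value` is the number of decorations for that branch, the challenge reduces to summing all node values:

```javascript
// Sum the values of every node in the binary tree.
// A null branch contributes nothing to the total.
function countDecorations(tree) {
  if (tree === null) return 0;
  return tree.value + countDecorations(tree.left) + countDecorations(tree.right);
}

const tree = {
  value: 1,
  left: { value: 2, left: null, right: null },
  right: { value: 3, left: null, right: null }
};

console.log(countDecorations(tree)); // 6
```

The recursion bottoms out on `null`, so it handles trees of any depth without special cases.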
duxtech
936,428
Advent.js🎅🏼| #24: Comparing Christmas trees
Comparing Christmas trees. Grandpa 👴 says he sees all Christmas trees...
15,762
2021-12-25T17:22:10
https://dev.to/duxtech/adventjs-24-comparando-arboles-de-navidad-dng
javascript, webdev, adventofcode, spanish
## [Comparing Christmas trees](https://adventjs.dev/challenges/24)

Grandpa 👴 says all Christmas trees look the same to him... Grandma 👵, on the other hand, thinks they don't, and that every Christmas tree is different...

Let's write a function that tells us whether two Christmas trees are the same. For that, we will compare [the trees we already created in challenge 22.](https://dev.to/duxtech/adventjs-22-cuantos-adornos-necesita-el-arbol-3518)

We have to check whether both trees have the same structure and the same values in every branch. Here are some examples:

```js
const tree = {
  value: 1,
  left: {
    value: 2,
    left: null,
    right: null
  },
  right: {
    value: 3,
    left: null,
    right: null
  }
}

checkIsSameTree(tree, tree) // true

const tree2 = {
  value: 1,
  left: {
    value: 3,
    left: {
      value: 2,
      left: null,
      right: null
    },
    right: null
  },
  right: {
    value: 5,
    left: null,
    right: {
      value: 4,
      left: null,
      right: null
    }
  }
}

checkIsSameTree(tree, tree2) // false
checkIsSameTree(tree2, tree2) // true
```

The brother-in-law 🦹‍♂️, who knows it all, told me to be careful because **the JSON.stringify trick may not work...** since two trees can be the same even though the left and right branches are represented in inverse order...

Complete the challenge!

---

Here is a possible solution:

{% jsitor https://jsitor.com/embed/ySuYhR3dm %}

---

You can follow @midudev and keep an eye on the Advent.js challenges

{% twitter 1466086678507008003 %}
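One recursive approach, sketched by me rather than taken from the embedded solution, which also covers the caveat about the branches being represented in inverse order:

```javascript
// Two trees count as "the same" when their values match and their
// subtrees match, allowing left/right to be stored in inverse order.
function checkIsSameTree(a, b) {
  if (a === null && b === null) return true;
  if (a === null || b === null || a.value !== b.value) return false;
  return (
    // branches stored in the same order...
    (checkIsSameTree(a.left, b.left) && checkIsSameTree(a.right, b.right)) ||
    // ...or in inverse (mirrored) order
    (checkIsSameTree(a.left, b.right) && checkIsSameTree(a.right, b.left))
  );
}

const tree = {
  value: 1,
  left: { value: 2, left: null, right: null },
  right: { value: 3, left: null, right: null }
};
const mirrored = {
  value: 1,
  left: { value: 3, left: null, right: null },
  right: { value: 2, left: null, right: null }
};

console.log(checkIsSameTree(tree, tree)); // true
console.log(checkIsSameTree(tree, mirrored)); // true
```

Note why `JSON.stringify(a) === JSON.stringify(b)` fails here: `tree` and `mirrored` serialize to different strings even though they represent the same tree.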
duxtech
936,486
Simple CSS Text Animations
Hi, I made some simple CSS text effects that can be used in your projects, these are beginner level...
0
2021-12-25T18:20:00
https://dev.to/kiranrajvjd/simple-css-text-animations-1nee
css, webdev, codenewbie, beginners
Hi, I made some simple CSS text effects that you can use in your projects; these are beginner-level animations. The code is not optimized and accessibility is not taken into account. I just want to show that animations like these can be made with CSS. Combine these simple animations with your own creativity to make awesome text animations. I hope you like them. Happy coding!!

{% codepen https://codepen.io/kiran-r-raj/pen/dyVVPPM %}

{% codepen https://codepen.io/kiran-r-raj/pen/MWEvRPW %}

{% codepen https://codepen.io/kiran-r-raj/pen/XWeezGo %}

{% codepen https://codepen.io/kiran-r-raj/pen/bGoWgaj %}

My [collection](https://codepen.io/collection/RzReVd) of text effects on [codepen](https://codepen.io/collection/RzReVd).
kiranrajvjd
936,491
How do you know if someone is a good developer?
A post by pandaquests
0
2021-12-25T18:24:54
https://dev.to/pandaquests/how-do-you-know-if-someone-is-a-good-developer-41n
productivity, work, programming, javascript
pandaquests
936,496
BEM CSS Architecture
Hey Guys,Today i am going to talk about css BEM architecture.BEM stands for...
0
2021-12-25T19:09:07
https://dev.to/suhakim/bem-css-architecture-5ai4
bem, css, architecture, html
Hey guys, today I am going to talk about the CSS `BEM` architecture. BEM stands for `Block, Element, Modifier`, three separate concepts. BEM is nothing but a class naming convention.

```html
<div class="card">
  <div class="card__header"></div>
  <div class="card__body"></div>
  <button class="card__button--red"></button>
</div>
```

In the code above you can see we have a `card` div. Here `card` is the `B => block`, and `card__header` and `card__body` are each an `E => element`: you just write the block name, add `__`, and write the element name. `card__button--red` is our `M => modifier`: you add `--` to the element and write the modifier name.

BEM has good use cases. It helps a lot while writing CSS code, especially when writing `sass/scss` pre-processor code. Take a look:

```css
.card {
  &__header {}
  &__body {}
  &__button--red {}
}
```

Nice combination, right? If you are not using this, try it today. It will make your life easier.

Thanks ❤
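If you're wondering what that nested `&` syntax actually produces, here is a small sketch (the property values are made up for illustration); Sass replaces `&` with the parent selector at compile time, giving you flat BEM class names:

```scss
/* SCSS in */
.card {
  &__header { font-weight: bold; }
  &__button--red { background: red; }
}

/* CSS out: `&` is replaced by the parent selector `.card` */
.card__header { font-weight: bold; }
.card__button--red { background: red; }
```

Note that the output selectors are flat (`.card__header`, not `.card .card__header`), which keeps specificity low, one of the practical benefits of pairing BEM with Sass nesting.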
suhakim
936,506
How to get Experience in Tech as a Student?
If you have ever tried to apply to either Full time or even Intern positions in Tech as a Student....
0
2021-12-25T19:38:53
https://saumya.hashnode.dev/how-to-get-experience-in-tech-as-a-student
programming, beginners, career, codenewbie
---
canonical_url: https://saumya.hashnode.dev/how-to-get-experience-in-tech-as-a-student
---

If you have ever tried to apply to either full-time or even intern positions in tech as a student, you must have come across these crazy experience requirements.

_You need to have 1 year, 2 years, 3 years of experience!_

![giphy.gif](https://cdn.hashnode.com/res/hashnode/image/upload/v1638685419208/Bs9rfOy_d.gif)

I mean, it can be really overwhelming for students or beginners out there who really want to work, learn, and get that first experience, which is super important.

For example, I have been rejected many times after a recruiter reached out to me based on my resume. I was either denied or ghosted when I said that I'm an undergrad and not a graduate. And obviously, these were big companies, or on their way to becoming one. But if you reach out based on the work, shouldn't you at least interview and reject on the basis of skills, not so-called experience or conventional degrees? In hindsight, though, it is understandable from the POV of the company.

So, from my small experience being a student, working at startups, and learning from various people, **I'm going to share with you some points or steps you can take to get this work experience as a student**.

This is super cool because even before you graduate you will know how software engineering works and how to build a product that people use; you can hone your skills and be great at them, and if you build something cool or work somewhere, you can also earn some money, which might help you and your family out!

> First of all, I would like to just remind you that you don't need any conventional path to become an engineer, especially a software engineer. **They say the best things in the world are free.
Code is permissionless: if you want to learn it, you can learn it for free. Nobody and no degree makes you an engineer; you become one when you build things.**

And all these things can be done by anyone, even if you don't have a degree or a super good college, because none of that matters if you have **skills**.

Wait, and if you are a video person, here you go 🙌:

https://www.youtube.com/watch?v=QGm8iOqE4bk

So with that in mind, let's get started 👇:

## 1. Opensource

The point with open source is that if you are contributing to a good project, you are already a software engineer. That is the reason companies reach out directly to folks with great GitHub stats who contribute to great projects consistently. You will notice that many of the good programmers are super active in contributing to open source projects selflessly. That's the living proof of your experience and skills.

![daftpunktocat-guy.gif](https://cdn.hashnode.com/res/hashnode/image/upload/v1638687148141/AkULGIo8G.gif)

> The code does all the talking, you don't have to!

And I understand that these big open source projects can be pretty intimidating. But you can start small: pick some tech stack you know, go to GitHub, search for it, find beginner-friendly issues, clone the repo, and try doing things locally.

_And make that first PR of many to come._

## 2. Build Projects

Now I am a big believer in building projects, because almost every big thing that you see or use now started as a small project or side project at some point by some "one".

I myself got a lot of confidence by just building things.

![cliq-projects-integration-blog.gif](https://cdn.hashnode.com/res/hashnode/image/upload/v1638687214393/orrsmMJhN.gif)

> Projects are the bridge that takes you from beginner to intermediate, and eventually even to expert.

So I would suggest not just learning; instead, invest more time in building stuff. This will automatically make you learn far better than if you had just studied.
I built [HackrNews](https://hackrnews.vercel.app/) when I wanted to learn NextJS.

**Now once you build things, the next step is publishing them to the world:**

- If you work on the web, deploy the project for free on sites like Heroku, Netlify, or Vercel.
- If you work on mobile, try to publish your apps to the Play Store.
- If you work on ML, try to make a simple UI so a normal user can interact with the model.

> The result of doing this is that you are not just building projects but products that people can use and interact with, and who knows what that might lead to!

I know people who have published very simple apps to the Play Store that now have 100 thousand downloads. This will definitely land you an internship, not to mention job offers!

## 3. Applying to Startups

I will tell you this from my experience, and it is also pretty well known: it's super hard to work at well-established companies as an undergrad, because of their requirements, one of them being that you should not be an undergrad. 😄

So, what's the solution? What will you do if you want to work, learn, and get that experience? Well, the answer is startups!

![gif-startup-rocket.gif](https://cdn.hashnode.com/res/hashnode/image/upload/v1638687282811/Dsvri1kG5.gif)

> India is currently having a startup boom like the one Silicon Valley had some years ago; lots of funding and investment are being poured into great ideas. People desperately want to hire people with skills to turn those ideas into real products, and nobody cares about experience, which is exactly how it should be!

**I work at a startup now as a student, and I get to learn how a product is made from nothing, which is more valuable than 4 years of college.**

So trust me, you will learn a lot and get that experience! Go apply on sites like [AngelList](https://angel.co/jobs?) and [YCombinator](https://www.workatastartup.com/), and you can watch this video for many more such listings!

https://www.youtube.com/watch?v=uTmWNumMSnM

## 4. Share your Learning/Work in Public

What does this mean? Well, it means sharing whatever you learned throughout the day, your wins, your failures, your learnings, with people on the internet. Don't just share the certificates you get, because those don't mean anything; instead, share what you are building now, what new topics you learned, and what problems you are facing while learning a certain thing. Twitter is the best platform to do this because of its great tech community.

![tenor_1.gif](https://cdn.hashnode.com/res/hashnode/image/upload/v1638687357938/ik0Cnv5aU.gif)

Make a page or portfolio online to show all your projects, where anyone can check out the work you have been doing by just clicking a link, as paper resumes will be obsolete soon.

If you do it, you will find that learning in tech is a journey, and sharing it will make it super enjoyable, keep you from burning out, and also draw in many opportunities. Trust me, recruiters start reaching out to you if you are sharing great stuff on LinkedIn or Twitter.

So, these were some points I wanted to share, because I have got some great opportunities thanks to some of these, and I have seen many amazing people doing great because of such things. And the best part is that they have no prerequisites; anyone with just the will to learn can do them.

There are many who may not be at a good college or can't afford a degree and might think that working in tech is not possible. But it's never that! So go and get these awesome experiences!

Do let me know your thoughts below, it means a lot! 🎉🙋‍♂️
saumyanayak
936,638
Zsh Plugins Commit TOP
Plugins evaluated at December 26th, 2021 You can try them with ❮ ZI ❯ There are so...
0
2021-12-26T05:52:08
https://dev.to/sso/zsh-plugins-commit-top-4mbl
github, news, productivity, opensource
<h2 align="center">
  <a href="https://github.com/z-shell/zi">
    <img src="https://raw.githubusercontent.com/z-shell/zi/main/docs/images/logo.png" alt="Logo" width="60" height="60"></a>
</h2>
<div align="center"><h3>
Plugins evaluated at December 26th, 2021
You can try them with ❮ [ZI](https://github.com/z-shell/zi) ❯
</h3></div>

There are so many plugins on the great [awesome-zsh-plugins](https://github.com/unixorn/awesome-zsh-plugins) page that it makes sense to filter them out and also to **score** them. The plugins below contain 50 or more commits and are [evaluated](https://github.com/z-shell/docs/runs/4634771313?check_suite_focus=true) every 3 months by an [automatic script](https://github.com/z-shell/docs/tree/main/docs/zsh-plugins-commit-top) in order to detect:

- ones with 100 or more commits
- active ones
- very active ones
- maintained ones
- ones having an ongoing research & development

Just take a look at the iconography. It's intuitive and easy to memorize. In minutes you'll be able to grep interesting plugins with your eyes!
## Iconography

### Devoted time and work detection

🥇 - has 100 or more commits

### Progress detection

🚶 - slow/moderate progress: 1 commit in each of the last 3 months or 5 commits in total during the last 3 months (90 days)

🏃 - fast progress: 2 commits in each of the last 3 months or 10 commits in total during the last 3 months

### Maintenance detection

⌛️ - long time no update (not updated in the last 3 months, but updated in the last 6 months)

⏳ - updated in the last 3 months, active

⏰ - updated in the last month and the month before it (a single new commit will not yield as strong a symbol as the alarm clock; the additional criterion ("month before it") takes into account only more consistently attended projects)

### Research & development detection

💼 - has branches with at least 50 commits in total

📈 - the branches were active 3 times in the last 4 months

## Plugins (#127)

* [abbr (olets)](https://github.com/olets/zsh-abbr) :1st_place_medal: :hourglass_flowing_sand: - Manages auto-expanding abbreviations that expand inline when you hit space, inspired by fish shell.
* [abbrev-alias](https://github.com/momo-lab/zsh-abbrev-alias) : :hourglass: - Provides functionality similar to `vim`'s abbreviation expansion.
* [alias-tips](https://github.com/djui/alias-tips) :1st_place_medal: - An oh-my-zsh plugin to help remembering those aliases you defined once.
* [allergen](https://github.com/stanislas/allergen) : :hourglass_flowing_sand: - A collection of custom ZSH plugins to use with Antigen.
* [ansiweather](https://github.com/fcambus/ansiweather) :1st_place_medal: :hourglass: - Weather in your terminal, with ANSI colors and Unicode symbols.
* [anyframe](https://github.com/mollifier/anyframe) : - A peco/percol/fzf wrapper plugin for ZSH.
* [atuin](https://github.com/ellie/atuin) :1st_place_medal: :alarm_clock: :running_man: - Replaces your existing shell history with a SQLite database, and records additional context for your commands. 
Additionally, it provides optional and fully encrypted synchronisation of your history between machines, via an Atuin server.
* [auto-fu.zsh](https://github.com/hchbaw/auto-fu.zsh) :1st_place_medal: :briefcase: - Automatic complete-word and list-choices. Originally incr-0.2.zsh by y.fujii <y-fujii at mimosa-pudica.net>.
* [auto-notify](https://github.com/MichaelAquilina/zsh-auto-notify) : - Automatically sends out a notification when a long running task has completed.
* [autocomplete](https://github.com/marlonrichert/zsh-autocomplete) :1st_place_medal: :alarm_clock: :running_man: - Automatically lists completions as you type and provides intuitive keybindings for selecting and inserting them.
* [autoenv-extended](https://github.com/zpm-zsh/autoenv) : :hourglass_flowing_sand: - Extended version of the [zsh-autoenv](https://github.com/Tarrasch/zsh-autoenv) plugin.
* [autoenv](https://github.com/Tarrasch/zsh-autoenv) :1st_place_medal: - If a directory contains a `.env` file, it will automatically be executed when you `cd` into it.
* [autojump](https://github.com/wting/autojump) :1st_place_medal: - A `cd` command that learns - easily navigate directories from the command line. Install autojump-zsh for best results.
* [autopair](https://github.com/hlissner/zsh-autopair) : - A ZSH plugin for auto-closing, deleting and skipping over matching delimiters. Only tested on ZSH 5.0.2 or later.
* [autosuggestions](https://github.com/zsh-users/zsh-autosuggestions) :1st_place_medal: - [Fish](https://fishshell.com/)-like fast/unobtrusive autosuggestions for ZSH.
* [autoswitch-virtualenv](https://github.com/MichaelAquilina/zsh-autoswitch-virtualenv) :1st_place_medal: :running_man: :hourglass_flowing_sand: - ZSH plugin to automatically switch python virtualenvs and pipenvs when traversing directories. Automatically detects pipenv and poetry projects. 
* [base16](https://github.com/chriskempson/base16-shell) :1st_place_medal: - Adds script to allow you to change your shell's default ANSI colors but most importantly, colors 17 to 21 of your shell's 256 colorspace (if supported by your terminal). This script makes it possible to honor the original bright colors of your shell (e.g. bright green is still green and so on) while providing additional base16 colors to applications such as [Vim](https://www.vim.org).
* [bitwarden](https://github.com/Game4Move78/zsh-bitwarden) : :running_man: :hourglass_flowing_sand: - Adds functions to manage [bitwarden](https://bitwarden.com/) sessions.
* [blackbox](https://github.com/StackExchange/blackbox) :1st_place_medal: :briefcase: - [Stack Exchange](https://stackexchange.com)'s toolkit for storing keys/credentials securely in a `git` repository.
* [cdc](https://github.com/evanthegrayt/cdc) : - Makes it easier to change directories to directories that are subdirs of a user-defined list of directories. Includes tab-completion, session history and `pushd`, `popd` and `dirs` equivalents.
* [czhttpd](https://github.com/jsks/czhttpd) :1st_place_medal: :hourglass: - A simple http server written in 99.9% pure ZSH.
* [deer](https://github.com/Vifon/deer) :1st_place_medal: - A file navigator for ZSH heavily inspired by [ranger](https://ranger.github.io/).
* [depot-tools](https://github.com/kuoe0/zsh-depot-tools) :1st_place_medal: - Simple oh-my-zsh plugin for installing the chromium depot_tools. Installing this plugin will put all of the chromium depot_tools in your path automatically.
* [diractions](https://github.com/AdrieanKhisbe/diractions) :1st_place_medal: - Allow you to map a short logical/mnemonic name to directories to quickly access them, or perform actions in them.
* [directory-history](https://github.com/tymm/zsh-directory-history) : - A per directory history for ZSH which implements forward/backward navigation as well as substring search in a directory sensitive manner. 
* [dirrc](https://github.com/gmatheu/shell-plugins) : :hourglass: - Executes `.dirc` when present in a directory you `cd` into.
* [docker-aliases](https://github.com/webyneter/docker-aliases) : - Docker aliases for everyday use.
* [docker-helpers](https://github.com/unixorn/docker-helpers.zshplugin) : :hourglass_flowing_sand: - A collection of docker helper scripts.
* [dotbare](https://github.com/kazhala/dotbare) :1st_place_medal: :hourglass: - Interactive dotfile management with the help of `fzf`.
* [dwim](https://github.com/oknowton/zsh-dwim) : - Attempts to predict what you will want to do next. It provides a key binding (control-u) that will replace the current (or previous) command line with the command you will want to run next.
* [easy-motion](https://github.com/IngoHeimbach/zsh-easy-motion) : - A port of [vim-easymotion](https://github.com/easymotion/vim-easymotion) for ZSH.
* [editing-workbench](https://github.com/commiyou/zsh-editing-workbench) : - Adds sane, complex command line editing (e.g. incremental history word completion).
* [elixir](https://github.com/gusaiani/elixir-oh-my-zsh) : - Adds shortcuts for Elixir, IEX, Mix, Kiex and Phoenix.
* [enhancd](https://github.com/b4b4r07/enhancd) :1st_place_medal: - A simple tool that provides an enhanced `cd` command by memorizing all directories visited by a user and using them for pathname resolution.
* [evil-registers](https://github.com/zsh-vi-more/evil-registers) :1st_place_medal: :hourglass: - Extends ZLE vi commands to remotely access named registers of the vim and nvim editors, and system selection and clipboard.
* [expand](https://github.com/MenkeTechnologies/zsh-expand) :1st_place_medal: :alarm_clock: :running_man: - Expands regular aliases, global aliases, incorrect spellings and phrases, globs, history expansion and $parameters with the spacebar key.
* [explain-shell](https://github.com/gmatheu/shell-plugins) : :hourglass: - Opens commands on [explainshell.com](https://explainshell.com). 
* [F-Sy-H](https://github.com/z-shell/F-Sy-H) :1st_place_medal: :running_man: :hourglass_flowing_sand: - Optimized and improved `zsh-users/zsh-syntax-highlighting` – better response times, switchable highlight themes.
* [forgit](https://github.com/wfxr/forgit) :1st_place_medal: :running_man: :hourglass_flowing_sand: - Utility tool for `git` which takes advantage of fuzzy finder [fzf](https://github.com/junegunn/fzf).
* [functional](https://github.com/Tarrasch/zsh-functional) : - ZSH higher order functions.
* [fz](https://github.com/changyuheng/fz) : - Seamlessly adds fuzzy search to [z](https://github.com/rupa/z)'s tab completion and lets you easily jump around among directories in your history.
* [fzf-marks](https://github.com/urbainvaes/fzf-marks) :1st_place_medal: :hourglass_flowing_sand: - Little script to create, navigate and delete bookmarks in `bash` and `zsh`, using the fuzzy finder [fzf](https://github.com/junegunn/fzf).
* [fzf-tab](https://github.com/Aloxaf/fzf-tab) :1st_place_medal: :alarm_clock: :walking_man: - Replace ZSH's default completion selection menu with [fzf](https://github.com/junegunn/fzf).
* [fzf-widgets](https://github.com/ytet5uy4/fzf-widgets) :1st_place_medal: - Adds some ZLE widgets for [fzf](https://github.com/junegunn/fzf).
* [fzf-z](https://github.com/andrewferrier/fzf-z) : - Brings together the *z* plugin and *fzf* to allow you to easily browse recently used directories at any point on the command line.
* [gdbm](https://github.com/z-shell/zgdbm) : :hourglass_flowing_sand: - Adds GDBM as a plugin.
* [git-acp](https://github.com/MenkeTechnologies/zsh-git-acp) : :hourglass: - Take the current command line as the commit message and then run git pull, add, commit and push with one keystroke.
* [git-aliases (mdumitru)](https://github.com/mdumitru/git-aliases) : - Broken out version of the version in [oh-my-zsh](http://ohmyz.sh/) so users of other frameworks don't have to import all of oh-my-zsh. 
* [git-aliases.zsh](https://github.com/peterhurford/git-aliases.zsh) : - Creates a lot of useful aliases for combinations of commonly used `git` commands.
* [git-extra-commands](https://github.com/unixorn/git-extra-commands) :1st_place_medal: :running_man: :hourglass_flowing_sand: - Extra `git` helper scripts packaged as a plugin.
* [git-fuzzy](https://github.com/bigH/git-fuzzy) : :hourglass: - A CLI interface to `git` that relies heavily on [`fzf`](https://github.com/junegunn/fzf).
* [git-it-on](https://github.com/peterhurford/git-it-on.zsh) : - Adds ability to open a folder in your current branch on GitHub.
* [git-secret](https://github.com/sobolevn/git-secret) :1st_place_medal: :alarm_clock: :running_man: :briefcase: :chart_with_upwards_trend: - A bash-tool to store your private data inside a `git` repository.
* [gitignore](https://github.com/voronkovich/gitignore.plugin.zsh) :1st_place_medal: :hourglass_flowing_sand: - Plugin for creating `.gitignore` files.
* [gitsync](https://github.com/washtubs/gitsync) : - ZSH plugin to improve workflows for one person developing on the same repository on multiple machines.
* [grep2awk](https://github.com/joepvd/grep2awk) : - ZLE widget to transform `grep` command into `awk` command.
* [gunstage](https://github.com/LucasLarson/gunstage) :1st_place_medal: :alarm_clock: :running_man: - There are at least eight ways to unstage files in a `git` repository. This is a command-line shell plugin for undoing `git add`.
* [hist](https://github.com/marlonrichert/zsh-hist) : :hourglass_flowing_sand: - Edit your history in ZSH, without ever leaving the command line.
* [histdb](https://github.com/larkery/zsh-histdb) :1st_place_medal: :walking_man: :hourglass_flowing_sand: - Stores your history in an SQLite database. Can be integrated with [zsh-autosuggestions](https://github.com/zsh-users/zsh-autosuggestions). 
* [history-enquirer](https://github.com/zthxxx/zsh-history-enquirer) : :hourglass_flowing_sand: - Enhances history search with more interaction and a multiline selection menu. Requires nodejs.
* [H-S-MW](https://github.com/z-shell/H-S-MW) :1st_place_medal: :walking_man: :hourglass_flowing_sand: - A syntax highlighted, multi-word history searcher for ZSH, bound to Ctrl-R, with advanced functions (e.g. bump of history entry to top of history).
* [history-substring-search](https://github.com/zsh-users/zsh-history-substring-search) :1st_place_medal: :hourglass: - Needs to be loaded after `zsh-syntax-highlighting`, or they'll both break. You'll also need to bind keys to its functions, details are in the README.md.
* [history-sync](https://github.com/wulfgarpro/history-sync) : - An Oh My Zsh plugin for GPG encrypted, Internet synchronized ZSH history using `git`.
* [instant-repl](https://github.com/jandamm/instant-repl.zsh) : - Activate a REPL for any command in your current ZSH session.
* [iterm-touchbar](https://github.com/iam4x/zsh-iterm-touchbar) : - Display iTerm2 feedback in the MacbookPro TouchBar (Current directory, git branch & status).
* [jhipster](https://github.com/jhipster/jhipster-oh-my-zsh-plugin) : - Adds commands for [jHipster](https://www.jhipster.tech/).
* [k](https://github.com/supercrabtree/k) :1st_place_medal: :briefcase: - Directory listings for ZSH with `git` status decorations.
* [kube-aliases](https://github.com/Dbz/kube-aliases) : :hourglass: :briefcase: :chart_with_upwards_trend: - Adds functions and aliases to make working with `kubectl` more pleasant.
* [kube-ps1](https://github.com/jonmosco/kube-ps1) :1st_place_medal: :hourglass: - ZSH plugin for `kubectl` that adds current context and namespace.
* [kubernetes](https://github.com/Dbz/zsh-kubernetes) : :hourglass: :briefcase: :chart_with_upwards_trend: - Add [kubernetes](https://kubernetes.io) helper functions and aliases. 
* [learn](https://github.com/MenkeTechnologies/zsh-learn) : :hourglass: - Learning collection in MySQL/MariaDB to save, query and quiz everything you learn.
* [liferay](https://github.com/david-gutierrez-mesa/liferay-zsh) :1st_place_medal: :running_man: :hourglass_flowing_sand: - Adds scripts for [liferay](https://github.com/liferay/liferay-portal) development.
* [morpho](https://github.com/Jacke/zsh-morpho) : - Terminal screen savers written in pure ZSH, and also a screen saver framework.
* [navigation-tools](https://github.com/z-shell/zsh-navigation-tools) :1st_place_medal: :walking_man: :hourglass_flowing_sand: - Adds `htop`-like kill, directory bookmarks browser, a multi-word incremental history searcher and more.
* [new-file-from-template](https://github.com/zpm-zsh/new-file-from-template) : - Generates file from template.
* [nix-shell](https://github.com/chisui/zsh-nix-shell) : :hourglass: - Plugin that lets you use ZSH as the default shell in a `nix-shell` environment.
* [notify (luismayta)](https://github.com/luismayta/zsh-notify) :1st_place_medal: :alarm_clock: :running_man: - Notifications for ZSH with auto installation of dependencies and r2d2 sounds.
* [notify (marzocchi)](https://github.com/marzocchi/zsh-notify) : - A plugin for ZSH (on macOS and Linux) that posts desktop notifications when a command terminates with a non-zero exit status or when it took more than 30 seconds to complete, if the terminal application is in the background (or the command's terminal tab is inactive).
* [nvm](https://github.com/lukechilds/zsh-nvm) :1st_place_medal: :briefcase: - ZSH plugin for installing, updating and loading `nvm`.
* [open-pr](https://github.com/caarlos0/zsh-open-pr) : - A ZSH plugin to open pull requests from command line.
* [opp](https://github.com/hchbaw/opp.zsh) : - Vim's text-objects-ish for ZSH.
* [path-ethic](https://github.com/sha1n/path-ethic) : :hourglass_flowing_sand: - Helps manage your `$PATH` quickly and easily. 
Doesn't touch your existing `.zshrc`, `.zprofile`, but adds on top of your existing environment instead.
* [pentest](https://github.com/jhwohlgemuth/oh-my-zsh-pentest-plugin) : - Aliases and functions for the lazy penetration tester.
* [ph-marks](https://github.com/lainiwa/ph-marks) : - Bookmark pornhub videos from your terminal.
* [posh-git-bash](https://github.com/lyze/posh-git-sh) : :hourglass: - Adds `git` status in your prompt.
* [pr-cwd](https://github.com/zpm-zsh/pr-cwd) : - Creates a global variable with current working directory. Plugin has integration with [jocelynmallon/zshmarks](https://github.com/jocelynmallon/zshmarks).
* [pr-git](https://github.com/zpm-zsh/pr-git) : - Creates a global variable with `git` status information that can be displayed in prompts.
* [profile-secrets](https://github.com/gmatheu/shell-plugins) : :hourglass: - Securely keep sensitive variables (api tokens, passwords, etc) as part of your terminal init files. Uses gpg to encrypt/decrypt the file with your secrets.
* [project (gko)](https://github.com/gko/project) : - Create node/python/ruby project both locally and on github (private or public repository).
* [sealion](https://github.com/xyproto/sealion) : :alarm_clock: :running_man: - Allows you to set reminders that will appear in your terminal when your prompt is refreshed.
* [syntax-highlighting](https://github.com/zsh-users/zsh-syntax-highlighting) :1st_place_medal: :hourglass_flowing_sand: - Add syntax highlighting to your ZSH. Make sure you load this _before_ zsh-users/zsh-history-substring-search or they will both break.
* [sysadmin-util](https://github.com/skx/sysadmin-util) :1st_place_medal: - Steve Kemp's collection of tool scripts for sysadmins.
* [system-clipboard](https://github.com/kutsan/zsh-system-clipboard) :1st_place_medal: :hourglass_flowing_sand: - Adds key bindings support for ZLE (Zsh Line Editor) clipboard operations for vi emulation keymaps. It works under Linux, macOS and Android (via Termux). 
* [tig](https://github.com/MenkeTechnologies/zsh-tig-plugin) : :hourglass_flowing_sand: - Adds a few advanced bindings for [tig](https://github.com/jonas/tig) and also provides a `tig-pick` script.
* [tmux-zsh-vim-titles](https://github.com/MikeDacre/tmux-zsh-vim-titles) : :hourglass_flowing_sand: - Create unified terminal titles for `tmux`, ZSH, and Vim/NVIM, modular.
* [tmux](https://github.com/zpm-zsh/tmux) : :hourglass_flowing_sand: - Plugin for [tmux](https://tmux.github.io).
* [tsm](https://github.com/RobertAudi/tsm) : - Adds a [tmux](https://tmux.github.io) Session Manager.
* [tumult](https://github.com/unixorn/tumult.plugin.zsh) :1st_place_medal: :running_man: :hourglass_flowing_sand: - Adds tools for macOS.
* [ugit](https://github.com/Bhupesh-V/ugit) : :hourglass_flowing_sand: - Lets you undo your last `git` operation.
* [vi-increment](https://github.com/zsh-vi-more/vi-increment) : :hourglass: - Add `vim`-like increment/decrement operations.
* [vi-mode (jeffreytse)](https://github.com/jeffreytse/zsh-vi-mode) :1st_place_medal: :alarm_clock: :running_man: - 💻 A better and friendly vi(vim) mode plugin for ZSH.
* [vi-motions](https://github.com/zsh-vi-more/vi-motions) : - Add new motions and text objects including quoted/bracketed text and commands.
* [vim-mode](https://github.com/softmoth/zsh-vim-mode) : - Friendly `vi`-mode bindings, adding basic Emacs keys, incremental search, mode indicators and more.
* [wakatime (wbingli)](https://github.com/wbingli/zsh-wakatime) : - Automatic time tracking for commands in ZSH using [wakatime](https://wakatime.com/).
* [wd](https://github.com/mfaerevaag/wd) :1st_place_medal: - Warp directory lets you jump to custom directories in ZSH, without using `cd`. Why? Because `cd` seems inefficient when the folder is frequently visited or has a long path. 
* [yeoman](https://github.com/edouard-lopez/yeoman-zsh-plugin) : - Edouard Lopez's Yeoman plugin for oh-my-zsh, compatible with yeoman version ≥1.0 (includes options and command auto-completion).
* [you-should-use](https://github.com/MichaelAquilina/zsh-you-should-use) :1st_place_medal: :hourglass_flowing_sand: - ZSH plugin that reminds you to use those aliases you defined.
* [z.lua](https://github.com/skywind3000/z.lua) :1st_place_medal: :hourglass_flowing_sand: - A command line tool which helps you navigate faster by learning your habits. An alternative to [z.sh](https://github.com/rupa/z) with Windows and posix shells support and various improvements. 10x faster than fasd and autojump, 3x faster than [z.sh](https://github.com/rupa/z).
* [zaw](https://github.com/zsh-users/zaw) :1st_place_medal: - ZSH anything.el-like widget.
* [zbrowse](https://github.com/z-shell/zbrowse) : :walking_man: :hourglass_flowing_sand: - When doing shell work, it is often the case that echo $variable is invoked multiple times, to check the result of a loop, etc. With ZBrowse, you just need to press `Ctrl-B`, which invokes the ZBrowse – Zshell variable browser.
* [zcolors](https://github.com/marlonrichert/zcolors) : :alarm_clock: :running_man: - Uses your `$LS_COLORS` to generate a coherent theme for Git and your Zsh prompt, completions and [ZSH syntax highlighting](https://github.com/zsh-users/zsh-syntax-highlighting).
* [zconvey](https://github.com/z-shell/zconvey) :1st_place_medal: :hourglass_flowing_sand: - Adds ability to send commands to other ZSH sessions, you can use this to `cd $PWD` on all active ZSH sessions, for example.
* [zeno](https://github.com/yuki-yano/zeno.zsh) :1st_place_medal: :alarm_clock: :running_man: - Fuzzy completion and utility plugin powered by [Deno](https://deno.land/).
* [zero](https://github.com/arlimus/zero.zsh) :1st_place_medal: - Zero's theme & plugin. Has variants for both light and dark terminal backgrounds. 
* [zflai](https://github.com/z-shell/zflai) : :hourglass_flowing_sand: - A fast logging framework for ZSH.
* [z-a-bin-gem-node](https://github.com/z-shell/z-a-bin-gem-node) :1st_place_medal: :walking_man: :hourglass_flowing_sand: - [ZI](https://github.com/z-shell/zi) extension that exposes binaries without altering `$PATH`, installs Ruby gems and Node modules and easily exposes their binaries, and updates the gems and modules when the associated plugin or snippet is updated.
* [zsh-in-docker](https://github.com/deluan/zsh-in-docker) : :walking_man: :hourglass_flowing_sand: - Automates ZSH + Oh-My-ZSH installation into development containers. Works with Alpine, Ubuntu, Debian, CentOS or Amazon Linux.
* [zsh-z (agkozak)](https://github.com/agkozak/zsh-z) :1st_place_medal: :alarm_clock: :running_man: - Jump quickly to directories that you have visited "frecently." A native ZSH port of `z.sh` - without `awk`, `sed`, `sort`, or `date`.
* [zshmarks](https://github.com/jocelynmallon/zshmarks) : - A port of Bashmarks (by Todd Werth), a simple command line bookmarking plugin, for oh-my-zsh.

## Themes (#87)

* [agkozak](https://github.com/agkozak/agkozak-zsh-prompt) :1st_place_medal: :running_man: :hourglass_flowing_sand: :briefcase: :chart_with_upwards_trend: - Uses three asynchronous methods to keep the ZSH prompt responsive while displaying the `git` status and indicators of SSH connection, exit codes, and `vi` mode, along with an abbreviated, `PROMPT_DIRTRIM`-style path. Very customizable. Asynchronous even on Cygwin and MSYS2.
* [agnoster-j](https://github.com/apjanke/agnosterj-zsh-theme) :1st_place_medal: - Optimized for [solarized](https://ethanschoonover.com/solarized/) color scheme, `git` or other VCS tools, and unicode-compatible fonts. Includes status of last command run, user@hostname, `git` status decorations, working directory, whether running as root, whether background jobs are running, and other information. 
* [alien-minimal](https://github.com/eendroroy/alien-minimal) :1st_place_medal: - Minimalist ZSH theme with `git` status displayed.
* [alien](https://github.com/eendroroy/alien) :1st_place_medal: - Powerline-esque ZSH theme that shows `git` decorations and the exit code of the last command. Faster than many other prompts because it determines the `git` decorations asynchronously in a background process.
* [almel](https://github.com/Ryooooooga/almel) :1st_place_medal: :running_man: :hourglass_flowing_sand: - Inspired by [agnoster](https://github.com/agnoster/agnoster-zsh-theme), written in Rust. Includes `git` status, user@host, last command exit status and working directory decorations.
* [apollo](https://github.com/mjrafferty/apollo-zsh-theme) :1st_place_medal: :hourglass_flowing_sand: - A heavily customizable, compatible and performant ZSH theme that uses modules to enable features.
* [astral](https://github.com/xwmx/astral) :1st_place_medal: - Theme for dark backgrounds with zen mode. Works well with the zsh-users [zsh-syntax-highlighting](https://github.com/zsh-users/zsh-syntax-highlighting) plugin.
* [aterminal](https://github.com/guiferpa/aterminal) : - Displays Nodejs, NPM, Docker, Go, Python, Elixir and Ruby information in the prompt.
* [bar (xp-bar)](https://github.com/xp-bar/zsh-bar-theme) : - Includes username, host, pwd, `git` status decorations and 3x hour reminders to drink water.
* [bklyn](https://github.com/gporrata/bklyn-zsh) :1st_place_medal: - Variant of [Powerlevel9k](https://github.com/bhilburn/powerlevel9k) with customizations applied.
* [black-Void](https://github.com/black7375/BlaCk-Void-Zsh) :1st_place_medal: :alarm_clock: :walking_man: :briefcase: :chart_with_upwards_trend: - Includes account info, root user, using ssh, directory location, write permission, vcs info decorations.
* [blox](https://github.com/yardnsm/blox-zsh-theme) : :hourglass_flowing_sand: - A minimal and fast ZSH theme that shows you what you need. 
It consists of blocks: each block is shown inside a pair of \[square brackets\], and you can add blocks by simply creating a function.
* [bronze](https://github.com/reujab/bronze) :1st_place_medal: - A cross-shell customizable powerline-like prompt with icons written in go. Requires [nerd-fonts](https://github.com/ryanoasis/nerd-fonts).
* [bullet-train](https://github.com/caiogondim/bullet-train.zsh) :1st_place_medal: - Inspired by the Powerline Vim plugin. It aims for simplicity, showing information only when it's relevant.
* [chaffee](https://github.com/jasonchaffee/chaffee.zsh-theme) : - Based on sorin. Shows the current active versions of Java, Scala, Go, Node, Python and Ruby.
* [clean (brandonRoehl)](https://github.com/BrandonRoehl/zsh-clean) :1st_place_medal: - A minimalist variant of [pure](https://github.com/sindresorhus/pure). Pure is not clean, clean is not pure.
* [czsh](https://github.com/Cellophan/czsh) :1st_place_medal: :alarm_clock: :running_man: - [ZSH](https://en.wikipedia.org/wiki/Z_shell) with [oh-my-zsh](https://github.com/ohmyzsh/ohmyzsh) and the [agnoster](https://github.com/agnoster/agnoster-zsh-theme) theme in a container.
* [dracula](https://github.com/dracula/zsh) :1st_place_medal: :hourglass: - A dark theme for Atom, Alfred, Chrome DevTools, iTerm 2, Sublime Text, Textmate, Terminal.app, Vim, Xcode, and ZSH.
* [filthy](https://github.com/molovo/filthy) : - A disgustingly clean ZSH prompt.
* [fishy-lite](https://github.com/sudorook/fishy-lite) : :hourglass_flowing_sand: - Fork of the original [fishy](https://github.com/ohmyzsh/ohmyzsh/wiki/themes#fishy) theme in oh-my-zsh with much of the extraneous stuff cut out to improve load speeds. Includes a battery gauge and `git` status display that can be enabled on the right-hand side of the prompt.
* [garrett](https://github.com/chauncey-garrett/zsh-prompt-garrett) : - Prezto prompt with the information you need the moment you need it. 
* [gbt](https://github.com/jtyr/gbt) :1st_place_medal: - Go Bullet Train is a very customizable prompt builder inspired by Bullet Train that runs much faster. Includes many different status cars.
* [geometry](https://github.com/geometry-zsh/geometry) :1st_place_medal: :hourglass: - A minimal ZSH theme where any function can be added to the left prompt or (async) right prompt on the fly.
* [git-prompt (awgn)](https://github.com/awgn/git-prompt) :1st_place_medal: :hourglass_flowing_sand: - A fast `git` prompt for `bash`, `zsh` and `fish`.
* [git-prompt (olivierverdier)](https://github.com/olivierverdier/zsh-git-prompt) :1st_place_medal: - Displays information about the current `git` repository. In particular the branch name, difference with remote branch, number of files staged or changed, etc.
* [git-prompt (woefe)](https://github.com/woefe/git-prompt.zsh) : :hourglass_flowing_sand: - A fast, customizable, pure-shell, asynchronous Git prompt for ZSH heavily inspired by Olivier Verdier's [zsh-git-prompt](https://github.com/olivierverdier/zsh-git-prompt) and very similar to the "Informative VCS" prompt of fish shell.
* [gops](https://github.com/noxer/gops) : :walking_man: :hourglass_flowing_sand: - Fast powerline-like prompt. Includes `git` status, current directory, root status decorations.
* [guezwhoz](https://github.com/guesswhozzz/guezwhoz-zshell) : - Minimalist, includes `git` status decorations.
* [hyperzsh](https://github.com/tylerreckart/hyperzsh) : - Gives you a comprehensive overview of the branch you're working on and the status of your repository without cluttering your terminal.
* [infoline](https://github.com/hevi9/infoline-zsh-theme) : - Clean theme that shows `git` status, background jobs, remote host, and other information.
* [itg](https://github.com/itsthatguy/itg.zsh-theme) : - itsthatguy's theme. 
* [jovial](https://github.com/zthxxx/jovial) :1st_place_medal: :running_man: :hourglass_flowing_sand: - Shows host, user, path, development environment, `git` branch, which python venv is active.
* [jwalter](https://github.com/jeffwalter/zsh-jwalter) : - Powerline-style theme with `git`, `svn`, `npm`, `rvm` and network awareness. Requires Powerline-compatible terminal font.
* [kali](https://github.com/h4ck3r0/kali-theme) : - Includes `git` decorations.
* [kiss](https://github.com/rileytwo/kiss) :1st_place_medal: - Simple theme for oh-my-zsh, VSCode, iTerm2, Neovim, and RStudio. Includes `git` status decorations.
* [lambda-pure](https://github.com/marszall87/lambda-pure) :1st_place_medal: - A minimal ZSH theme, based on Pure, with added NodeJS version.
* [lean](https://github.com/miekg/lean) : :hourglass_flowing_sand: - Inspired by [pure](https://github.com/sindresorhus/pure). Includes `git` status and background job decorations.
* [lemon](https://github.com/carlosvitr/lemon_zsh) : :hourglass: - Many beautiful colors for you to enjoy, done with care and patience. Includes `git` status and ruby version decorations.
* [liquidprompt](https://github.com/nojhan/liquidprompt) :1st_place_medal: :alarm_clock: :running_man: :briefcase: - A full-featured & carefully designed adaptive prompt with useful information when you need it. It shows you what you need when you need it. You will notice what changes when it changes, saving time and frustration.
* [materialshell](https://github.com/carloscuesta/materialshell) : - A [material design](https://material.io/guidelines/style/color.html) theme for your shell with a good contrast and color pops at the important parts. Designed to be easy on the eyes.
* [minimal (glsorre)](https://github.com/glsorre/minimal/) : - Minimal asynchronous ZSH theme optimized for use with the [Fira Code](https://github.com/tonsky/FiraCode) font and the [Solarized Light](https://ethanschoonover.com/solarized) terminal theme. 
* [minimal (subnixr)](https://github.com/subnixr/minimal) : :hourglass_flowing_sand: - Minimal yet feature-rich theme.
* [minimal2](https://github.com/PatTheMav/minimal2) : - A minimal and extensible ZSH theme. Forked from [subnixr's original](https://github.com/subnixr/minimal) and adapted for [Zimfw](https://github.com/zimfw/zimfw).
* [newt](https://github.com/softmoth/zsh-prompt-newt) : - Fat & fast theme – beautiful inside and out, styled segments done right. Extremely customizable, includes `git`, username, execution time, directory, background jobs and edit mode decorations.
* [nox](https://github.com/kbrsh/nox) :1st_place_medal: :hourglass: - Dark theme, displays the current working directory and git status.
* [odin](https://github.com/tylerreckart/odin) : - Odin is a `git`-flavored ZSH theme.
* [oh-my-git](https://github.com/arialdomartini/oh-my-git) :1st_place_medal: - An opinionated prompt for bash and ZSH.
* [oh-my-via](https://github.com/badouralix/oh-my-via) : - Theme for ZSH which mainly forks the historical theme used on VIA servers.
* [persi](https://github.com/persiliao/persi-zsh-theme) :1st_place_medal: :hourglass_flowing_sand: - Includes `git` decorations. Works with both light and dark backgrounds.
* [polyglot](https://github.com/agkozak/polyglot) :1st_place_medal: :hourglass_flowing_sand: - A dynamic prompt for `zsh`, `bash`, `ksh93`, `mksh`, `pdksh`, `dash`, and busybox `ash` that uses basic ASCII symbols (and color, when possible) to show username, whether it is a local or remote `ssh` session, abbreviated path, `git` branch and status, exit status of last command if non-zero, any virtual environment created with `virtualenv`, `venv`, `pipenv`, `poetry`, or `conda`.
* [poncho](https://github.com/RainyDayMedia/oh-my-zsh-poncho) : - RDM's basic oh-my-zsh custom theme.
* [powerless](https://github.com/martinrotter/powerless) :1st_place_medal: - Tiny & simple pure ZSH prompt inspired by powerline. 
* [powerlevel10k](https://github.com/romkatv/powerlevel10k) :1st_place_medal: :alarm_clock: :running_man: - A fast reimplementation of [powerlevel9k](https://github.com/bhilburn/powerlevel9k) ZSH theme. Can be used as a drop-in replacement for powerlevel9k, when given the same configuration options it will generate the same prompt, only faster. * [powerlevel9k](https://github.com/bhilburn/powerlevel9k) :1st_place_medal: :briefcase: - Powerlevel9k is a theme for ZSH which uses [Powerline Fonts](https://github.com/powerline/fonts). It can be used with vanilla ZSH or ZSH frameworks such as [Oh-My-Zsh](https://github.com/ohmyzsh/ohmyzsh), [Prezto](https://github.com/sorin-ionescu/prezto), [Antigen](https://github.com/zsh-users/antigen), and [many others](https://github.com/bhilburn/powerlevel9k/wiki/Install-Instructions). * [powerline (jeremy)](https://github.com/jeremyFreeAgent/oh-my-zsh-powerline-theme) : - Another take on a powerline theme. Nicely configurable, but requires at least a 256 color-capable terminal with a powerline-compatible terminal font. * [powerline-go](https://github.com/justjanne/powerline-go) :1st_place_medal: :hourglass_flowing_sand: - A beautiful and useful low-latency prompt, written in golang. Includes `git` and `hg` status decorations, exit status of the last command run, current Python virtualenv, whether you're in a [nix](https://nixos.org/) shell, and is easy to extend. * [powerline-hs](https://github.com/rdnetto/powerline-hs) :1st_place_medal: - A [Powerline](https://github.com/powerline/powerline) clone written in Haskell. It is significantly faster than the original implementation, and makes the shell noticeably more responsive. * [powerline-pills](https://github.com/lucasqueiroz/powerline-pills-zsh) : - Created in Ruby, uses powerline characters to simulate pills with useful information. 
* [powerline-shell (b-ryan)](https://github.com/b-ryan/powerline-shell) :1st_place_medal: - Beautiful and useful prompt generator for Bash, ZSH, Fish, and tcsh. Includes `git`, `svn`, `fossil` and `hg` decorations, Python virtualenv information, and last command exit status. * [powerline-shell (banga)](https://github.com/b-ryan/powerline-shell) :1st_place_medal: - A [powerline](https://github.com/Lokaltog/vim-powerline)-like prompt for Bash, ZSH and Fish. Shows important details about git/svn/hg/fossil branch and is easy to customize/extend. * [powerline-train](https://github.com/sherubthakur/powerline-train) : - A powerline variant. * [prompt_j2](https://github.com/malinoskj2/prompt_j2) : :hourglass_flowing_sand: - Has a dynamic exit status indicator, can change to two lines dynamically to display context. * [pure-agnoster](https://github.com/yourfin/pure-agnoster) :1st_place_medal: - Mashup of pure and agnoster. Has `git` decorations and works well with both dark and light terminal backgrounds. * [pure](https://github.com/sindresorhus/pure) :1st_place_medal: :hourglass_flowing_sand: - A pretty, minimal and fast ZSH prompt. Includes `git` status decorations, prompt turns red if last command failed, username and host decorations when in a remote session or container, and current folder and command when a process is running. * [purify (banminkyoz)](https://github.com/banminkyoz/purify) :1st_place_medal: :hourglass_flowing_sand: - A simple, fast & cool prompt. * [purify (kyoz)](https://github.com/kyoz/purify) :1st_place_medal: :hourglass_flowing_sand: - A clean and vibrant theme, best on dark backgrounds. Includes `git` status decorations. * [qoomon](https://github.com/qoomon/zsh-theme-qoomon) : :hourglass: - Optimized for dark backgrounds, includes `git` information. Theme repo includes iTerm 2 and Terminal color settings. 
* [shelby](https://github.com/athul/shelby) :1st_place_medal: :alarm_clock: :walking_man: - Fast, lightweight and minimal prompt written in pure `golang`. Includes decorations for last command exit status, `git` status and the current working directory. * [shellder](https://github.com/simnalamburt/shellder) :1st_place_medal: - Minimal theme with git branch display. Requires a Powerline-compatible font. * [silver](https://github.com/reujab/silver) :1st_place_medal: - A cross-shell customizable powerline-like prompt heavily inspired by [Agnoster](https://github.com/agnoster/agnoster-zsh-theme). A faster rust port of [bronze](https://github.com/reujab/bronze). Requires [Nerd Fonts](https://github.com/ryanoasis/nerd-fonts). Very configurable, includes `git` status decorations. * [skeletor-syntax](https://github.com/ramonmcros/skeletor-syntax) :1st_place_medal: - Theme collection for Atom, Prism and ZSH inspired by Skeletor from He-Man and the Masters of the Universe. * [slick](https://github.com/nbari/slick) : - Inspired by the [pure](https://github.com/sindresorhus/pure), [purs](https://github.com/xcambar/purs) and [zsh-efgit-prompt](https://github.com/ericfreese/zsh-efgit-prompt). Requires `cargo` for installation. * [slimline](https://github.com/mengelbrecht/slimline) :1st_place_medal: - Minimal, fast and elegant ZSH prompt. Displays the right information at the right time. * [sm](https://github.com/blyndusk/sm-theme) :1st_place_medal: - A **Simplist** & **Minimalist** theme for your **favorite** terminal. Includes `git` status decorations. * [solarized-powerline (KuoE0)](https://github.com/KuoE0/oh-my-zsh-solarized-powerline-theme) : - Solarized powerline variant. * [spaceship](https://github.com/denysdovhan/spaceship-prompt) :1st_place_medal: :alarm_clock: :running_man: :briefcase: :chart_with_upwards_trend: - Theme with `git`, `nvm`, rvm/rbenv/chruby, python, `ssh` and other useful status indicators. 
* [starship](https://github.com/starship/starship) :1st_place_medal: :alarm_clock: :running_man: :briefcase: :chart_with_upwards_trend: - Minimal, fast, extremely customizable. * [statusline](https://github.com/el1t/statusline) : - A responsive ZSH theme that provides informational segments when you need them. * [tvline](https://github.com/thvitt/tvline) :1st_place_medal: - Derived from the [agnoster](https://gist.github.com/agnoster/3712874) theme, adds powerline font enhancements. * [typewritten](https://github.com/reobin/typewritten) :1st_place_medal: :walking_man: :hourglass_flowing_sand: - Minimal and informative theme that leaves room for what's important. Does asynchronous `git` decoration updates for speed. * [wild-cherry](https://github.com/mashaal/wild-cherry) :1st_place_medal: :hourglass_flowing_sand: - A fairy-tale inspired theme for ZSH, iTerm 2, Sublime, Atom, & Mou. * [wkentaro](https://github.com/wkentaro/wkentaro.zsh-theme) : :hourglass: - A simple theme for Python users. Includes virtualenv and `git` status decorators. * [yazpt](https://github.com/jakshin/yazpt) :1st_place_medal: :alarm_clock: :running_man: - A clean, fast, good-looking ZSH prompt theme that thoughtfully incorporates Git/Subversion/TFVC status info, integrates with popular plugin managers like Oh My Zsh, and is straightforward to customize and extend. * [zero](https://github.com/arlimus/zero.zsh) :1st_place_medal: - Zero's theme & plugin. Has variants for both light and dark terminal backgrounds. * [zinc](https://gitlab.com/robobenklein/zinc) :1st_place_medal: - A blazing-fast, pure ZSH, mixed asynchronous powerline prompt that's easily extensible and extremely configurable. * [zshpower](https://github.com/snakypy/zshpower) :1st_place_medal: :alarm_clock: :running_man: - Optimized for python developers. Includes `git` and `pyenv` status decorations, username and host. Tries to install other plugins and fonts, so read its instructions before installing. 
* [zwsh](https://github.com/naens/zwsh) :1st_place_medal: - A Zpm3/Wordstar mode/theme for ZSH.
sso
936,656
Laravel 8 Razorpay Payment Gateway Integration Example
Hello Dev, This tutorial is focused on razorpay payment gateway integration in laravel 8. i would...
0
2021-12-26T07:10:34
https://dev.to/techdurjoy/laravel-8-razorpay-payment-gateway-integration-example-3mae
laravel
Hello Dev, this tutorial is focused on Razorpay payment gateway integration in Laravel 8. I would like to share a Laravel 8 Razorpay payment example with you, explained simply, step by step. Here you will learn Razorpay integration in Laravel 8. So, let's follow a few steps to create an example of Razorpay API integration in Laravel 8. The Razorpay payment gateway is for India. It provides lots of options for users to make payments, such as credit card, debit card, UPI, PhonePe, Google Pay and Paytm. So if you want to implement Razorpay integration in your Laravel app, you can do it by following the few steps below. You can see the preview below too: [Laravel 8 Razorpay Payment Gateway Integration Example ](https://www.codecheef.org/article/laravel-8-razorpay-payment-gateway-integration-example)
techdurjoy
936,666
Is It Time for the JavaScript Temporal API?
by author Craig Buckler Date handling in JavaScript is ugly. The Date() object has not changed since...
0
2021-12-26T07:40:03
https://blog.openreplay.com/is-it-time-for-the-javascript-temporal-api
javascript, programming, webdev, beginners
_by author [Craig Buckler](https://blog.openreplay.com/authors/craig-buckler)_ Date handling in JavaScript is *ugly*. The [`Date()` object](https://developer.mozilla.org/Web/JavaScript/Reference/Global_Objects/Date) has not changed since the first Java-inspired implementation in 1995. Java scrapped it but `Date()` remained in JavaScript for backward browser compatibility. Issues with the `Date()` API include: * it's inelegant * it only supports UTC and the user's PC time * it doesn't support calendars other than Gregorian * string to date parsing is error-prone * `Date` objects are mutable -- for example: ```js const today = new Date(); const tomorrow = new Date( today.setDate( today.getDate() + 1 ) ); console.log( tomorrow ); // is tomorrow's date console.log( today ); // is also tomorrow's date! ``` Developers often turn to date libraries such as [moment.js](https://momentjs.com/) but it's a 74Kb payload and dates remain mutable. Modern alternatives such as [Day.js](https://day.js.org/) and [date-fns](https://date-fns.org/) may be better, but should a library be necessary when your app has minimal date-handling requirements? Browsers must continue to support `Date()` but a new `Temporal` static global date object is at the [Stage 3 Candidate Proposal in the TC39 standards approval process](https://tc39.es/proposal-temporal/) (the final stage before implementation). The API addresses all the issues above and it's [coming to the Chrome browser soon](https://chromestatus.com/feature/5668291307634688). It's unlikely to have widespread implementation until late 2022 so be wary that changes could occur. ## Current Date and Time [`Temporal.Now`](https://tc39.es/proposal-temporal/docs/#Temporal-Now) returns an object representing the current date and time. 
Further methods provide information such as: ```javascript // time since the Unix epoch on 1 January, 1970 UTC Temporal.Now.instant().epochSeconds; Temporal.Now.instant().epochMilliseconds; // time in current location Temporal.Now.zonedDateTimeISO(); // current time zone Temporal.Now.timeZone(); // current time in another time zone Temporal.Now.zonedDateTimeISO('Europe/London'); ``` ## Instant Dates and Times [`Temporal.Instant`](https://tc39.es/proposal-temporal/docs/#Temporal-Instant) returns an object representing a date and time to the nearest nanosecond according to an ISO 8601 formatted string: ![Temporal date time string](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/e6y8wfeslooq7fcl88vw.png) ```javascript Temporal.Instant.from('2022-03-04T05:56:07.999999999+02:00[Europe/Berlin]'); Temporal.Instant.from('2022-03-04T05:06+07:00'); ``` You can also use an epoch value: ```javascript Temporal.Instant.fromEpochSeconds(1.0e8); ``` ## Zoned Dates and Times [`Temporal.ZonedDateTime`](https://tc39.es/proposal-temporal/docs/#Temporal-ZonedDateTime) returns an object representing a timezone and calendar-aware date/time at the instant an event occurred (or will occur) in a particular global location, e.g. ```javascript new Temporal.ZonedDateTime( 1234567890000n, // epoch nanoseconds (a BigInt) Temporal.TimeZone.from('Europe/London'), // timezone Temporal.Calendar.from('iso8601') // default calendar ); Temporal.ZonedDateTime.from('2025-09-05T02:55:00+02:00[Africa/Cairo]'); Temporal.Instant.from('2022-08-05T20:06:13+05:45').toZonedDateTimeISO('+05:45'); Temporal.ZonedDateTime.from({ timeZone: 'America/New_York', year: 2025, month: 2, day: 28, hour: 10, minute: 15, second: 0, millisecond: 0, microsecond: 0, nanosecond: 0 }); ``` ## Plain Dates and Times *Plain* dates and times reference simpler calendar events which are not associated with a specific time zone. 
The options include: * [`Temporal.PlainTime`](https://tc39.es/proposal-temporal/docs/#Temporal-PlainTime) refers to a specific time, e.g. "the meeting occurs at 3pm every weekday": ```javascript // both are 15:00:00 new Temporal.PlainTime(15, 0, 0); Temporal.PlainTime.from('15:00:00'); ``` * [`Temporal.PlainDate`](https://tc39.es/proposal-temporal/docs/#Temporal-PlainDate) refers to a specific date, e.g. "your tax return is due by January 31, 2022": ```javascript // both are January 31, 2022 new Temporal.PlainDate(2022, 1, 31); Temporal.PlainDate.from('2022-01-31'); ``` * [`Temporal.PlainDateTime`](https://tc39.es/proposal-temporal/docs/#Temporal-PlainDateTime) refers to a date and time without a time zone: ```javascript // both are 4 May 2022 at 10:11am and 12 seconds new Temporal.PlainDateTime(2022, 5, 4, 10, 11, 12); Temporal.PlainDateTime.from('2022-05-04T10:11:12'); ``` * [`Temporal.PlainYearMonth`](https://tc39.es/proposal-temporal/docs/#Temporal-PlainYearMonth) refers to a date without a day, e.g. "the June 2022 schedule is ready": ```javascript // both are June 2022 new Temporal.PlainYearMonth(2022, 6); Temporal.PlainYearMonth.from('2022-06'); ``` * [`Temporal.PlainMonthDay`](https://tc39.es/proposal-temporal/docs/#Temporal-PlainMonthDay) refers to a date without a year, e.g. "Star Wars day is on May 4": ```javascript // both are May 4 new Temporal.PlainMonthDay(5, 4); Temporal.PlainMonthDay.from('05-04'); ``` ## Open Source Session Replay Debugging a web application in production may be challenging and time-consuming. [OpenReplay](https://github.com/openreplay/openreplay) is an Open-source alternative to FullStory, LogRocket and Hotjar. It allows you to monitor and replay everything your users do and shows how your app behaves for every issue. It’s like having your browser’s inspector open while looking over your user’s shoulder. OpenReplay is the only open-source alternative currently available. 
![OpenReplay](https://raw.githubusercontent.com/openreplay/openreplay/main/static/replayer.png) Happy debugging, for modern frontend teams - [Start monitoring your web app for free](https://github.com/openreplay/openreplay). ## Date and Time Values You can extract specific date and time values from a `Temporal` object. Assuming the following date and time: ```javascript const t1 = Temporal.ZonedDateTime.from('2022-12-07T03:24:30+02:00[Africa/Cairo]'); ``` you can extract: ```javascript t1.year; // returns 2022 t1.month; // 12 t1.day; // 7 t1.hour; // 3 t1.minute; // 24 t1.second; // 30 t1.millisecond; // 0 t1.microsecond; // 0 t1.nanosecond; // 0 ``` Other useful properties include: * `dayOfWeek` -- returns `1` for Monday to `7` for Sunday * `dayOfYear` -- returns `1` to `365` or `366` on leap years * `weekOfYear` -- returns `1` to `52` or `53` * `daysInMonth` -- returns `28`, `29`, `30`, or `31` * `daysInYear` -- returns `365` or `366` * `inLeapYear` -- returns `true` for a leap year or `false` when not ## Comparing and Sorting Dates and Times All `Temporal` types have a static `compare(date1, date2)` method which returns: * `0` when `date1` and `date2` are the same * `1` when `date1` occurs after `date2`, or * `-1` when `date1` occurs before `date2` For example: ```javascript const date1 = Temporal.Now.plainDateTimeISO(), date2 = Temporal.PlainDateTime.from('2022-05-04'); Temporal.PlainDateTime.compare(date1, date2); // returns 1 once May 4, 2022 has passed ``` You can pass the `compare()` method as an Array `sort()` function to arrange dates into ascending chronological order (earliest to latest): ```javascript const t = [ '2022-01-01T00:00:00+00:00[Europe/London]', '2022-01-01T00:00:00+02:00[Africa/Cairo]', '2022-01-01T00:00:00-05:00[America/New_York]' ].map( d => Temporal.ZonedDateTime.from(d) ) .sort( Temporal.ZonedDateTime.compare ); ``` ## Date and Time Calculations All `Temporal` objects offer math methods to [add()](https://tc39.es/proposal-temporal/docs/duration.html#add), 
[subtract()](https://tc39.es/proposal-temporal/docs/duration.html#subtract), or [round()](https://tc39.es/proposal-temporal/docs/duration.html#round) to a duration. You can define a duration as a [`Temporal.Duration` object](https://tc39.es/proposal-temporal/docs/duration.html) which sets a period in `years`, `months`, `weeks`, `days`, `hours`, `minutes`, `seconds`, `milliseconds`, `microseconds`, and `nanoseconds` (each of which may be negative; a derived `sign` property reports `-1` for negative or `1` for positive durations). However, all these methods accept a duration-like value without the need to create a specific object. Examples: ```javascript const t1 = Temporal.ZonedDateTime.from('2022-05-04T00:00:00+01:00[Europe/London]'); // add 8 hours 59 minutes t1.add({ hours: 8, minutes: 59 }); // or t1.add(Temporal.Duration.from({ hours: 8, minutes: 59 })); // subtract 2 weeks t1.subtract({ weeks: 2 }); // or t1.add({ weeks: -2 }); // round to the nearest day t1.round({ smallestUnit: 'day' }); ``` *Plain* dates and times can wrap, so adding 24 hours to a `PlainTime` returns a new `Temporal` object with an identical value. The `until()` and `since()` methods return a `Temporal.Duration` object describing the elapsed time between the receiver and another date/time, e.g. ```javascript // months from t1 until t2 t1.until(t2).months; // days from t2 until t3 t2.until(t3).days; // weeks elapsed since t3, measured at t4 t4.since(t3).weeks; ``` The `equals()` method also determines whether two date/time values are identical: ```javascript const d1 = Temporal.PlainDate.from('2022-01-31'), d2 = Temporal.PlainDate.from('2023-01-31'); d1.equals(d2); // false ``` ## Formatting Date and Time Strings All `Temporal` objects have a string representation returned when using the `.toString()` method, e.g. 
`Temporal.Now.zonedDateTimeISO().toString()`: ```javascript 2022-09-05T02:55:00+01:00[Europe/London] ``` This is not user friendly but the [Internationalization API](https://blog.openreplay.com/the-complete-guide-to-localizing-your-app-with-javascript-s-internationalization-api) offers a better alternative with localisation options. For example: ```javascript // define a date const d = new Temporal.PlainDate(2022, 3, 14); // US date format: 3/14/2022 new Intl.DateTimeFormat('en-US').format(d); // UK date format: 14/03/2022 new Intl.DateTimeFormat('en-GB').format(d); // Spanish long date format: lunes, 14 de marzo de 2022 new Intl.DateTimeFormat('es-ES', { dateStyle: 'full' }).format(d); ``` This is not part of the `Temporal` API and there's no guarantee the [`Intl` (Internationalization) API](https://developer.mozilla.org/Web/JavaScript/Reference/Global_Objects/Intl) will support `Temporal` as well as `Date` objects -- although there would be a developer outcry if it didn't! ## Temporal Time We've accepted the dodgy `Date()` since day one but `Temporal` gives JavaScript developers something to look forward to. The days of resorting to a date library are nearly over. For further information, refer to: 1. [The `Temporal` proposal](https://tc39.es/proposal-temporal/) 1. [The `Temporal` documentation](https://tc39.es/proposal-temporal/docs/) 1. [The `Temporal` cookbook examples](https://tc39.es/proposal-temporal/docs/cookbook.html)
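As a practical footnote to the mutability complaint at the start of the article: until `Temporal` ships in your runtime, the `Date` pitfall can at least be contained with defensive copies. A small sketch in today's JavaScript/TypeScript (`addDays` is an illustrative helper, not a standard API):

```typescript
// The pitfall: Date#setDate mutates the receiver in place,
// so make a copy before mutating.
const today = new Date(2022, 0, 31);        // 31 January 2022 (months are 0-based)
const tomorrow = new Date(today.getTime()); // defensive copy
tomorrow.setDate(tomorrow.getDate() + 1);   // mutates only the copy

console.log(tomorrow.getMonth(), tomorrow.getDate()); // 1 1  (1 February)
console.log(today.getMonth(), today.getDate());       // 0 31 (still 31 January)

// A tiny immutable helper in the spirit of Temporal's add():
function addDays(date: Date, days: number): Date {
  const copy = new Date(date.getTime());
  copy.setDate(copy.getDate() + days);
  return copy;
}

const nextWeek = addDays(today, 7); // today is left untouched
console.log(nextWeek.getMonth(), nextWeek.getDate()); // 1 7 (7 February)
```

Copy-then-mutate is exactly what `Temporal` makes unnecessary, since its objects never change in place.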
asayerio_techblog
936,710
Common misconception of dependency inversion
Dependency inversion is widely misunderstood. I will explain why. According to the definition: the...
0
2021-12-26T08:37:22
https://chrisza.me/dependency-inversion-injection/
Dependency inversion is widely misunderstood. I will explain why. According to the definition, the principle itself consists of two rules: 1. High-level modules should not depend on low-level modules. Both should depend on abstractions. 2. Abstractions should not depend on details. Details should depend on abstractions. In this context, high-level modules mean domain logic and low-level modules mean the technology stack. For example: if you are working on an accounting system, all tax formulas are high-level modules, and the database from which you fetch invoices and receipts to calculate tax refunds is a low-level module. You can infer that the grand idea is to decouple business or domain logic from the technology used. If you look back into the works of Robert Martin (the originator of the principle) on software architecture, you will find that decoupling business logic from the technology stack is a constant theme of his. If you want a quick look, here is his idea of [Clean Architecture](https://blog.cleancoder.com/uncle-bob/2012/08/13/the-clean-architecture.html). # Common understanding Back from the world of ideas to the actual world. Technically speaking, every system heavily depends on a low-level technology stack to perform. For example: if a user wants to see the tax amount for invoice number INV001, here is what you normally need to do 1. Fetch invoice INV001 from the database 2. Apply the tax calculation 3. Send the calculation result to the client, probably via the HTTPS protocol and the internet You can see that we depend on the database implementation, HTTPS and the whole internet working exactly and consistently as we expect in order to implement this feature. Let's say you have to implement a method `CalculateTax(string invoiceNumber)`: you need to at least be able to fetch invoice data from the database. How can we make this "decoupled" from the database implementation? At first glance, this seems to be impossible. 
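To make the coupling concrete, here is a minimal TypeScript sketch of a calculator hard-wired to a concrete database client (the article's examples are C#; all names, the stub data and the flat 7% rate are illustrative):

```typescript
// Hypothetical invoice shape and a concrete, low-level database client.
interface Invoice { number: string; amount: number; }

class SqlDatabase {
  // Stand-in for a real database; the data is illustrative.
  private rows = new Map<string, Invoice>([
    ["INV001", { number: "INV001", amount: 1000 }],
  ]);
  query(invoiceNumber: string): Invoice {
    const row = this.rows.get(invoiceNumber);
    if (!row) throw new Error(`no invoice ${invoiceNumber}`);
    return row;
  }
}

// The high-level module reaches straight into the low-level one:
// it constructs SqlDatabase itself and calls its concrete API.
class TaxCalculator {
  private db = new SqlDatabase();
  calculateTax(invoiceNumber: string): number {
    const invoice = this.db.query(invoiceNumber);
    return (invoice.amount * 7) / 100; // flat 7% rate, purely illustrative
  }
}

console.log(new TaxCalculator().calculateTax("INV001")); // 70
```

Any change to `SqlDatabase`'s API ripples straight into `TaxCalculator`, which is exactly the coupling that the rest of this article tries to break.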
The common understanding is that you can achieve this by using a technique called dependency injection: instead of depending on a concrete implementation, you depend on an abstraction. There are many frameworks out there, such as Spring or C# MVC, which implement something called a dependency injection framework and an IoC container, so we can globally register services like this: ```CSharp using ConfigSample.Options; using Microsoft.Extensions.DependencyInjection.ConfigSample.Options; var builder = WebApplication.CreateBuilder(args); builder.Services.AddScoped<IInvoiceRepository, InvoiceRepository>(); builder.Services.AddScoped<IMyDependency2, MyDependency2>(); var app = builder.Build(); ``` and this is where I think the idea becomes misunderstood. Let's get back to our tax calculation on invoice example. Let's say you have ```Csharp public class TaxCalculator { private IInvoiceRepository invoiceRepository; public TaxCalculator(IInvoiceRepository invoiceRepository) { this.invoiceRepository = invoiceRepository; } public double CalculateTax(String invoiceNumber) { Invoice thisInvoice = this.invoiceRepository.getById(invoiceNumber); // Calculate tax } } ``` At first glance, this seems to be a normal implementation of the dependency inversion principle using dependency injection and an IoC container. We seem to be able to decouple the concrete database implementation from the tax calculation by making `TaxCalculator` depend on just an interface. I want to step back from the code and get back to the main idea. The main idea of all this dependency injection and framework machinery is to make high-level modules independent of low-level modules, right? My question is: what does it mean to be independent? # What does it mean to be independent? Well, it depends. To answer this question in a precise manner, I need to bring up an extreme situation that will make the dependency graph clear: a high-bureaucracy, low-trust environment. Let's say there are two vendors bound by just a contract, and neither of them is willing to expose their code. 
One is working on the `TaxCalculator` class, which is a high-level module. The other is working on the `InvoiceRepository` class, which is a low-level module. Let's call the first team the Tax team and the second team the Repo team. In the spirit of the dependency inversion principle, the Tax team should be independent of the Repo team, while the Repo team can still depend on the Tax team, right? That means any change from the Repo team should not affect the Tax team, but some changes from the Tax team might affect the Repo team. That's one way to make the dependency graph clear. In a real Java or C# codebase, we usually see a repository interface that looks like this ```Csharp interface IInvoiceRepository { Invoice getById(String invoiceId); List<Invoice> findAllByUserId(String userId); List<Invoice> findAllByEmail(String email); void AddInvoiceItem(InvoiceItem item); // And more methods } ``` This is a clear signal that the interface is not owned by the Tax team. First of all, the interface has many methods unrelated to what the Tax team needs. Second, in my experience, changes to the interface are likely to be dictated by either the database implementation or global system requirements. Is the Tax team independent of and decoupled from the database implementation, in this case the Repo team? I highly doubt it. Assuming that dependency inversion is about making high-level modules independent of low-level modules, it is safe to say that merely using the dependency injection and IoC container that the most famous frameworks provide does not automatically make the codebase achieve the dependency inversion principle. This is where I believe most of the misconception happens. Many people see that dependency inversion must be done using dependency injection (which makes sense). The common implementation of dependency injection is a global IoC container. Hence, most people believe that by just using the Spring or C# MVC IoC container, they achieve dependency inversion. 
Understandable, but not true. # Real world is messy, but we can make it better At this point, you might argue that it is impossible to make the Tax team truly independent of the technical implementation of the database. And I agree. Let's say the Tax team wants a GUID-based invoice id but the database only supports an integer-based id: what, realistically, can we do? Many hard technical limitations make it impractical to be idealistic and dogmatic about the dependency inversion principle. Some tax calculations might be impractical to do without the help of database-level aggregation. However, we can still do better by simply asking ourselves: how about we let the Tax team decide what interface they want? ```CSharp namespace TaxModule { interface IGetInvoiceById { Invoice getById(String invoiceId); } public class TaxCalculator { private IGetInvoiceById getInvoiceById; public TaxCalculator(IGetInvoiceById getInvoiceById) { this.getInvoiceById = getInvoiceById; } public double CalculateTax(String invoiceNumber) { Invoice thisInvoice = this.getInvoiceById.getById(invoiceNumber); // Calculate tax } } } ``` Here, the Tax team and `TaxModule` define what interface they want instead of relying on an interface defined elsewhere. It's a matter of who owns the interface. If high-level modules own all the interfaces, high-level modules dictate changes, and low-level modules need to adhere to what the high-level modules define. This makes high-level modules truly independent of changes made by low-level modules or global requirements, which is better aligned with the vision of dependency inversion. And in case the Repo team needs to change the interface because of a technical limitation, they need to inform the Tax team and ask them either to change their interface or to implement some kind of anti-corruption layer. Collaboration still needs to happen. But at least all the Tax team's objects depend on code in `TaxModule`, including all required interfaces. The team becomes more independent. 
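As an aside, this consumer-owned-interface style is frictionless in a structurally typed language. A minimal TypeScript sketch (the article's examples are C#; all names and the flat 7% rate are illustrative):

```typescript
// --- Owned by the Tax (high-level) module ---
interface Invoice { number: string; amount: number; }
interface GetInvoiceById {
  getById(invoiceNumber: string): Invoice;
}

class TaxCalculator {
  constructor(private source: GetInvoiceById) {}
  calculateTax(invoiceNumber: string): number {
    return (this.source.getById(invoiceNumber).amount * 7) / 100; // flat 7% rate, illustrative
  }
}

// --- Owned by the Repo (low-level) module; it never mentions GetInvoiceById ---
class InvoiceRepository {
  getById(invoiceNumber: string): Invoice {
    return { number: invoiceNumber, amount: 1000 }; // stand-in for a real query
  }
  findAllByUserId(_userId: string): Invoice[] { return []; } // extra methods do no harm
}

// Structural typing: InvoiceRepository satisfies GetInvoiceById implicitly,
// with no "implements" clause and no interface explosion.
const calc = new TaxCalculator(new InvoiceRepository());
console.log(calc.calculateTax("INV001")); // 70
```

The Repo team can extend or rewrite `InvoiceRepository` freely; as long as something with a matching `getById` exists, the Tax team's code never changes.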
Simply put: **I suggest that to truly invert dependency, we should let high-level modules define the interfaces they require, instead of using a global interface or, worse, an interface dictated by low-level code.** # Language limitation The problem with this approach is that, sadly, the global-interface approach is endorsed by the IoC container design of the most common frameworks in practice. If you are using Java, Kotlin or C#, interfaces are nominal: we don't have structural typing, so a type does not satisfy an interface just because its method signatures match. ```Csharp interface IGetInvoiceById { Invoice getById(String invoiceId); } interface IInvoiceRepository { Invoice getById(String invoiceId); } ``` In this case, the interface `IInvoiceRepository` cannot be used as `IGetInvoiceById`. All we can do is write ```CSharp interface IInvoiceRepository: IGetInvoiceById { } ``` And in a large system, you can imagine an explosion of small interfaces ```CSharp interface IInvoiceRepository: IGetInvoiceById, ISetInvoiceId, IAddInvoiceItem, IGetInvoiceByUsername // ... and 8 more { } ``` This is where I think GoLang did a good job with its implicitly satisfied, structural interfaces and its [philosophy for defining Go interfaces](https://github.com/golang/go/wiki/CodeReviewComments#interfaces). > Go interfaces generally belong in the package that uses values of the interface type, not the package that implements those values. The implementing package should return concrete (usually pointer or struct) types: that way, new methods can be added to implementations without requiring extensive refactoring. Anyway, we need to be aware of language limitations and of why we are not doing this yet. As a polyglot developer, I believe we need to be aware of both sides of the coin 1. What does the ideal principle look like, and how is it helpful? 2. What are the language and framework limitations? Is it practical to adhere to the principle to the letter, given the limitations we have? There are two extreme stances. One is to dismiss the purity just because of current limitations, i.e. "Stupid. We don't do that here.". 
Another one is to be a purist and make the codebase an unmaintainable mess. Both stances are unproductive. In Golang, I would obviously adhere to the principle of letting high-level modules define the interfaces they require instead of using global interfaces. In the current versions of C# or Java, I might not. In the future, who knows? # Last note I think the wide misconception occurred because the famous concrete implementation of DI is to use an IoC container. But that's just a form, born out of many limitations and legacies. The principle itself should be language-independent; the implementation still depends on the language and framework you use. You cannot understand a principle through the eyes of just a "C# developer", "Java developer" or "React developer". You need to understand what's out there in the programming world. And you might disagree with the principle. But at least, to be a better programmer, you need to understand 1. The ideal situation of programming, such as principles. 2. The current state of programming, such as technical limitations. Adhere only to the ideal and you become a purist; adhere only to the current state and you become stagnant. What I am trying to say in this article is: 1. The current implementation of the dependency inversion principle in some famous frameworks does not really invert dependency. 2. The ideal is for high-level module code not to be affected by low-level module changes at all. 3. We might get closer to the ideal once we have better languages or frameworks. For example, structural typing is one killer feature that I hope to see in enterprise-focused programming languages. That's one way to reconcile the ideal and the practical sides of programming. That's all for today. Thanks for reading this far!
chrisza4
936,764
Weekly Digest 51/2021
Welcome to my Weekly Digest #51, which is almost the last one for this year. This weekly digest...
10,701
2021-12-26T19:51:26
https://dev.to/marcobiedermann/weekly-digest-512021-7k3
css, javascript, react, webdev
Welcome to my Weekly Digest #51, which is almost the last one for this year. This weekly digest contains a lot of interesting and inspiring articles, videos, tweets, podcasts, and designs I consumed during this week. --- ## Interesting articles to read ### How to use Storybook with ESLint Automatically validate stories in your code editor [How to use Storybook with ESLint](https://medium.com/storybookjs/how-to-use-storybook-with-eslint-e7f620a4d2c1) ### The many methods for using SVG icons Recently at work, Chen ran into a situation where we had to revisit how SVG icons were being implemented on our pages. [The many methods for using SVG icons](https://chenhuijing.com/blog/the-many-methods-for-using-svg-icons) --- ## Some great videos I watched this week ### Scroll Down to Start Playing Video It's gotta be fairly easy to play an HTML video when you scroll down to it, right? We kinda knew IntersectionObserver is the right API here, for performance reasons, but this use case is super well served by it and it's quite easy to use. We also handle the prefers-reduced-motion case here. {% youtube mV4tnQkqhmI %} by [Chris Coyier](https://twitter.com/chriscoyier) ### Solve subarray problems faster There are many algorithms that come up frequently in coding interviews, with Sliding Windows being one of the most popular. {% youtube GcW4mgmgSbw %} by [Byte by Byte](https://twitter.com/ByteByByteBlog) ### Responsive images & art direction Welcome back to Designing in the Browser with Developer Relations Engineer, Una Kravets. In this episode, we will learn all about responsive images and art direction, including how to optimize your art for size and layout. {% youtube KBQz1OSpRv8 %} by [Google Chrome Developers](https://twitter.com/ChromiumDev) ### Speed up inputs with useDeferredValue React 18 RC introduces a number of concurrent features thru new hooks: `useDeferredValue`, `startTransition`, and `useTransition`. 
In this video, Chantastic explores a simple `useDeferredValue` example and shares some transition gotchas. {% youtube Piq_MYrodt0 %} by [chantastic](https://twitter.com/chantastic) --- ## Useful GitHub repositories ### ML YouTube Courses A repository to index and organize the latest machine learning courses found on YouTube. {% github dair-ai/ML-YouTube-Courses %} ### Bulletproof React A simple, scalable, and powerful architecture for building production-ready React applications. {% github alan2207/bulletproof-react %} ### **UnoCSS** The instant on-demand atomic CSS engine. {% github antfu/unocss %} --- ## dribbble shots ### Orizon Bank App ![by [Rakib Kowshar](https://dribbble.com/shots/17138964-Orizon-Bank-App)](https://cdn.dribbble.com/users/3416941/screenshots/17138964/media/b3f2b8960657fa788a13e255a4f89644.jpg) by [Rakib Kowshar](https://dribbble.com/shots/17138964-Orizon-Bank-App) ### Plant Shop Mobile App ![by [Fauzan Ardhiansyah](https://dribbble.com/shots/17136610-Ijo-ijo-Plant-Shop-Mobile-App)](https://cdn.dribbble.com/users/3036385/screenshots/17136610/media/2bf60a118bdbdb59c184616c56e28c2f.png) by [Fauzan Ardhiansyah](https://dribbble.com/shots/17136610-Ijo-ijo-Plant-Shop-Mobile-App) ### Weather & Forecast Landing ![by [Yasir Ahmad Noori](https://dribbble.com/shots/17137313-Weather-Forecast-Landing-Web-Design)](https://cdn.dribbble.com/users/3894633/screenshots/17137313/media/43ed59c182722b28bcb398b9ac6e671c.png) by [Yasir Ahmad Noori](https://dribbble.com/shots/17137313-Weather-Forecast-Landing-Web-Design) ### Paygo ![by [Shafiqul Islam](https://dribbble.com/shots/17139367-Paygo-Website-Design)](https://cdn.dribbble.com/users/737304/screenshots/17139367/media/2b52e3ecaf1cca2d87d754637abcd319.png) by [Shafiqul Islam](https://dribbble.com/shots/17139367-Paygo-Website-Design) --- ## Tweets {% twitter 1472983863999799299 %} {% twitter 1473080837893722114 %} {% twitter 1473751452225003521 %} {% twitter 1474100352253321216 %} {% twitter 1474473165480886273 
%} --- ## Picked Pens ### Wavy Snowman {% codepen https://codepen.io/pokecoder/pen/abLLwQM %} by [Ale](https://codepen.io/pokecoder) ### GSAP Christmas Loop {% codepen https://codepen.io/Nekto/pen/vYeJQMX %} by [Alexander](https://codepen.io/Nekto) ### Time Travelling with GSAP {% codepen https://codepen.io/jh3y/pen/yLgOgpq %} by [Jhey](https://twitter.com/jh3yy) --- ## Podcasts worth listening ### Syntax – Gitpod, iPad Coding, Web3, WTF NFT In this episode of Syntax, Scott and Wes talk with Geoff and Pauline from Gitpod about developing on Gitpod, Web3, and The NFT Bay. {% spotify spotify:episode:4cxTZcsIYkZ9t172YPF2cH %} --- Thank you for reading, talk to you next week, and stay safe! 👋
marcobiedermann
936,781
React - Component lifecycle
A post by Ikram Akbar
0
2021-12-26T11:38:42
https://dev.to/ikramakbar/react-component-lifecycle-4k0
ikramakbar
936,804
Advent of Code 2021 - Day 25
In this video series, I try to challenge myself with the Advent of Code trials. Each solution will be published to Github, and I hope you will learn something from my coding mistakes and perhaps send some code my way on how you have done these challenges. I know by reading code, so this is such an exciting thing for me.
0
2021-12-27T07:49:36
https://dev.to/kalaspuffar/advent-of-code-2021-day-25-p6l
---
title: Advent of Code 2021 - Day 25
published: true
description: In this video series, I try to challenge myself with the Advent of Code trials. Each solution will be published to Github, and I hope you will learn something from my coding mistakes and perhaps send some code my way on how you have done these challenges. I know by reading code, so this is such an exciting thing for me.
tags:
cover_image: https://i.ytimg.com/vi/KyIQBhcLjgQ/maxresdefault.jpg
---

{% youtube KyIQBhcLjgQ %}

In this video series, I try to challenge myself with the Advent of Code trials. Each solution will be published to Github, and I hope you will learn something from my coding mistakes and perhaps send some code my way on how you have done these challenges. I know by reading code, so this is such an exciting thing for me.
kalaspuffar
936,896
Chronoshift - (Rebuilt) Binary Clock
Preface After a few months away from coding I have finally recovered the energy and...
0
2021-12-26T18:38:48
https://dev.to/mozetsu/chronoshift-rebuilt-binary-clock-3a6g
webdev, writing, motivation, beginners
#Preface

After a few months away from coding I have finally recovered the energy and motivation to code after what seemed like the roughest months of my life 💀. This past year has been a mix of really unlucky events for me and I just had to take some time for myself. As of now I can say I'm feeling pretty good and ready to get back to work 🔥.

Unfortunately, the lack of practice made me forget quite a lot. Looking through my repos I wondered how I could get to remember the basics and get back to the pace I was at when I stopped. I concluded I should rebuild an old app, since I'm already familiar with it and would also get to improve some things I think could've been done differently 👌.

That being said, I chose to rebuild a binary clock I made following [this](https://www.youtube.com/watch?v=VkTj1U_exwA) tutorial back in the day. After completing the tutorial I took my time designing my own version of the clock and building it on my own so I could put the knowledge into practice 🛠️.

#Bitclock - Old project

<p align="center"> <img src="https://i.imgur.com/O75jcHb.png" width="100%"> </p>

This was the old version and although I was happy with the overall design I still had the feeling it could be better 🤔. For a basic clock that converts each number of the time into a four-digit binary sequence I assumed there were not many features I could implement other than a dark theme and some links to Twitter and GitHub. This app was just a small project I did to get my hands around CSS grid and responsive design.

#Design

If I was going to rebuild this app I wanted a design that was not only clean but also visually appealing ✨. As a developer, designing was never my strongest skill, so I had to do a decent amount of research and gather some concepts that could hopefully point me in the right direction 🥲. After many shots saved on Dribbble I hopped on Figma and put everything together.
#HTML

Used semantic HTML elements whenever possible, as opposed to making everything out of divs as I had previously done. I still have to dive deeper into this area, but it was quite easy to accomplish since the given layout was quite simple ✏️.

#CSS

Declared all the colors in the :root pseudo-class as variables so I could define light and dark themes with them.

<p align="center"> <img src="https://i.imgur.com/wXgz92B.png" width="100%"> </p>

Went with the CSS BEM (Block, Element, Modifier) architecture when applying classes in the HTML, as this is a best practice I want to implement in my future projects. It's similar to object-oriented programming but directed at CSS. The hardest part was making the clock component responsive between different window sizes, but I managed to get it done with the use of media queries.

#JavaScript

<p align="center"> <img src="https://media.giphy.com/media/zOvBKUUEERdNm/giphy.gif" width="100%"> </p>

Almost the same as the old version but with a few tweaks here and there. The core idea was:

* Get the current time (17:20:32).
* Split each section into an array ([1,7], [2,0], [3,2]).
* Iterate through each item of each array and get the corresponding four-digit binary sequence returned as an array (1 = [0, 0, 0, 1]).
* Iterate through each binary sequence and whenever a "1" is found, add the class ".on" to the corresponding bit in the clock HTML markup, turning it on. If it is a "0", remove the class, turning the bit off.
* Repeat the process every 200ms.

#Reborn as Chronoshift

<p align="center"> <img src="https://i.imgur.com/XOvkNHb.png" width="100%"> </p>

Kept the same features as before but arranged them differently this time, with all the navigation placed down at the footer. Again, I still think a designer would do a better job with the UI, but this time I was extremely satisfied with the result 😊.
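The digit-to-binary step described in the JavaScript section can be sketched like this (a Python illustration of the same logic, not the project's actual JavaScript):

```python
def digit_to_bits(digit):
    """Map a decimal digit (0-9) to its four-bit sequence, most significant bit first."""
    return [(digit >> shift) & 1 for shift in (3, 2, 1, 0)]

def time_to_bit_columns(hh, mm, ss):
    """Split HH:MM:SS into six digits and convert each one to a bit column."""
    digits = [hh // 10, hh % 10, mm // 10, mm % 10, ss // 10, ss % 10]
    return [digit_to_bits(d) for d in digits]

print(digit_to_bits(1))                    # [0, 0, 0, 1]
print(time_to_bit_columns(17, 20, 32)[1])  # second digit of 17 -> [0, 1, 1, 1]
```

Each resulting column maps onto one vertical group of bits in the clock markup; toggling the ".on" class per bit reproduces the behavior described above.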
You can check the live version here: [Chronoshift](https://mozetsu.github.io/chronoshift/) #Conclusion There's still a lot to cover but I think I will progress much faster now. Have lots of new ideas in the design phase and cannot wait to bring them to life. Hope you enjoyed the redesign, until next time 👋.
mozetsu
936,920
Anyways, What are Layout components in React?
React is by a long shot the most popular front-end library on the market. It's modularity and...
0
2021-12-26T16:02:36
https://dev.to/mocktarissa/anyways-what-are-layout-components-in-react-fn
react, bestpractices
React is by a long shot the most popular front-end library on the market. Its modularity and component-based architecture made it a front-end developer's favorite. By introducing the concept of the virtual DOM as a way of manipulating the DOM, React created an abstraction concept that has been copied by a lot of mainstream front-end libraries and frameworks.

In this series we will discuss a few design patterns in React and how they can be implemented, either in your side projects or customer-ready apps. The result will be the same: you will start writing better React code.

The first design pattern we will discuss is the **Layout Design Pattern**. The main idea behind building components using the layout component pattern is that

> Components should not know where they are being displayed, and layout components should only be concerned with displaying the component.

Let's use this pattern in an example to get a better grasp of what it serves to do. Let's say we are trying to build a split screen component to use in our project. This is a SplitScreen component that displays two panels side by side.

```
import React from 'react';
import styled from 'styled-components';

const Container = styled.div`
  display: flex;
`;

const Panel = styled.div`
  flex: ${props => props.width};
`;

export default function SplitScreen({ left: Left, right: Right, leftWidth = 1, rightWidth = 1 }) {
  return (
    <Container>
      <Panel width={leftWidth}>
        <Left />
      </Panel>
      <Panel width={rightWidth}>
        <Right />
      </Panel>
    </Container>
  );
}
```

Now in our App component we can call `<SplitScreen/>`:

```
import React from 'react';
import SplitScreen from './SplitScreen';

const LeftComponent = () => (
  <p style={{ backgroundColor: 'yellow' }}>Left</p>
);

const RightComponent = () => (
  <p style={{ backgroundColor: 'red' }}>Right</p>
);

function App() {
  return (
    <SplitScreen left={LeftComponent} right={RightComponent} leftWidth={1} rightWidth={2} />
  );
}

export default App;
```

Let's say we need to pass a **title prop** to both the Left and the Right components.
With our current implementation we would need to make a few changes.

```
import React from 'react';
import SplitScreen from './SplitScreen';

const LeftComponent = ({ title }) => <p style={{ backgroundColor: 'yellow' }}>{title}</p>;

const RightComponent = ({ title }) => <p style={{ backgroundColor: 'red' }}>{title}</p>;

function App() {
  return (
    <SplitScreen left={LeftComponent} right={RightComponent} leftWidth={1} rightWidth={2} />
  );
}

export default App;
```

In the SplitScreen.js file:

```
import React from 'react';
import styled from 'styled-components';

const Container = styled.div`
  display: flex;
`;

const Panel = styled.div`
  flex: ${props => props.width};
`;

export default function SplitScreen({
  left: Left,
  right: Right,
  leftWidth = 1,
  rightWidth = 1,
  leftTitle = 'Left',
  rightTitle = 'Right',
}) {
  return (
    <Container>
      <Panel width={leftWidth}>
        <Left title={leftTitle} />
      </Panel>
      <Panel width={rightWidth}>
        <Right title={rightTitle} />
      </Panel>
    </Container>
  );
}
```

This approach **might work** if we know for sure that our changes are limited to these specific props and that our component will not be used in a different context inside another component. To add another prop to the Left or Right component we would need to make even more changes. This can lead to **passing down multiple props to the component**, which is an anti-pattern in React.

Since the Left and Right components currently cannot accept props on their own, we need to rewrite the code so that the SplitScreen component **does not know about the props Left and Right need**. So instead of passing Left and Right as props to SplitScreen, we can put them as React children of SplitScreen.
```
import React from 'react';
import SplitScreen from './SplitScreen';

const LeftComponent = ({ title = 'Left' }) => (
  <p style={{ backgroundColor: 'yellow' }}>{title}</p>
);

const RightComponent = ({ title = 'Right' }) => (
  <p style={{ backgroundColor: 'red' }}>{title}</p>
);

function App() {
  return (
    <SplitScreen leftWidth={1} rightWidth={2}>
      <LeftComponent title={'Left Pane'} />
      <RightComponent title={'Right Panel'} />
    </SplitScreen>
  );
}

export default App;
```

And in the SplitScreen.js file:

```
import React from 'react';
import styled from 'styled-components';

const Container = styled.div`
  display: flex;
`;

const Panel = styled.div`
  flex: ${props => props.width};
`;

export default function SplitScreen({ leftWidth = 1, rightWidth = 1, children }) {
  const [left, right] = React.Children.toArray(children);
  return (
    <Container>
      <Panel width={leftWidth}>
        {left}
      </Panel>
      <Panel width={rightWidth}>
        {right}
      </Panel>
    </Container>
  );
}
```

With this implementation we can now pass props to the Left and Right components directly, without passing them through the SplitScreen component. SplitScreen's only concern is to render its children in a specific layout, without knowing ahead of time what components it should render. This also makes our code much more readable.

Thank you for reading. Let's connect: [Twitter](https://twitter.com/mocktarissa). [LinkedIn](https://twitter.com/mocktarissa). [Github](https://github.com/mocktarissa).
mocktarissa
205,213
How I installed GuixSD on DigitalOcean
One of the ways to install GuixSD on DigitalOcean
0
2019-11-14T05:32:50
https://dev.to/akoppela/how-i-installed-guixsd-on-digitalocean-26b0
guix, digitalocean, linux, server
---
title: How I installed GuixSD on DigitalOcean
published: true
description: One of the ways to install GuixSD on DigitalOcean
tags: Guix, DigitalOcean, Linux, Server
---

Hi.

[Guix](http://guix.gnu.org/) is a functional package management tool. It can be used either on top of an existing Linux distribution or as part of the Guix System Distribution (GuixSD), which is quite an interesting project for many reasons. [DigitalOcean](http://digitalocean.com/) is a development cloud (DO for short).

In this post I'm going to share what I did to install GuixSD on DO. I'm very new to Linux, so I ran into some issues on the way and I'd like to get some help in order to solve them.

DO does not offer GuixSD as an available distribution, so there are two options for installing GuixSD.

---

### [Convert existing Linux distribution to GuixSD](https://lists.gnu.org/archive/html/guix-devel/2017-04/msg00139.html)

However, I got some issues on the last steps, maybe because the post is from 2017 and things have changed, maybe for some other reason. Another thing is that it requires more knowledge of the existing distribution to clean up after the conversion. Being a noob in this area, I decided to install the GuixSD system from scratch.

---

### Install GuixSD on DO from scratch

#### Reformat the system installer

Guix provides a compressed ISO with a system installer. DO has an option to upload a custom image. It supports various formats, but not ISO. Thus the first thing to do is to convert the installer image to a format which DO supports. I had VirtualBox installed already, so I used it to convert the image.
```
# Download GuixSD installer
wget https://ftp.gnu.org/gnu/guix/guix-system-install-1.0.1.x86_64-linux.iso.xz

# Uncompress ISO
xz -d guix-system-install-1.0.1.x86_64-linux.iso.xz

# Convert ISO to VDI
vboxmanage convertfromraw guix-system-install-1.0.1.x86_64-linux.iso guix-system-install-1.0.1.x86_64-linux.vdi
```

#### Create DO droplet and install Guix system

The next step is to upload the image to DO and create a droplet from it. When creating a droplet, DO provides `/dev/vda` as the main disk. DO boots the machine with the installer stored on the `/dev/vda` disk.

When I followed the [installation instructions](https://guix.gnu.org/manual/en/html_node/System-Installation.html) I was unable to reformat `/dev/vda` due to it being used by the system. I tried to check what part of the system exactly uses it, but could not find out. There were no partitions/disks mounted, etc. So this is one part I'd like to understand: is it possible to reformat the disk which stores the operating system, and if yes, how can I find out how it's used by the system?

The next thing I tried was to add an additional volume and install the Guix system on it. When adding additional volumes, DO provides `/dev/sdX` disks. Following the [installation instructions](https://guix.gnu.org/manual/en/html_node/System-Installation.html) I was able to easily install GuixSD on the `/dev/sda` disk. I created 3 partitions: one for BIOS boot (1M), one for Linux swap (8G) and one for the Linux filesystem (the rest). I used a [bare bones system configuration](https://github.com/akoppela/dotfiles/blob/f48aa9353fbd803a80a7bf1c07f614e3c69246d5/os-config.scm) to initialize the GuixSD system.

#### Booting GuixSD

The next thing was to boot the installed GuixSD. As I understood, DO uses `/dev/vda` as the disk to boot the system from. So after the previous step I had 2 disks: one with the installer and one with GuixSD installed. It would probably be possible to instruct DO to boot from the additional volume with GuixSD, but that would mean I had to carry the installer along the way.
What I did was clone the `/dev/sda` disk to the `/dev/vda` disk with the following command (keep in mind that the size of the volume should match the size of the droplet disk).

```
dd if=/dev/sda of=/dev/vda bs=64M conv=sync,noerror status=progress
```

After a reboot I was able to boot GuixSD from the `/dev/vda` disk.

---

### Conclusion

In the end it was a successful experiment, though it felt like a workaround. I would like to know what I missed and if there is an easier way to install a clean GuixSD on DO.

Thank you.
akoppela
937,242
A Career in WordPress Development
A career in WordPress development is very rewarding and can provide you with a wide range of skills,...
0
2021-12-27T06:51:08
https://dev.to/kumarravi577/a-career-in-wordpress-development-47co
wordpress, webdev, wordpressdevelopment, programming
A career in WordPress development is very rewarding and can provide you with a wide range of skills, such as creating and managing websites. If you're interested in developing websites, there are many ways you can get involved. If you're not sure where to start, here are some tips to get you started. If you're looking for a career in WordPress development, you'll find some of the basics below. Learning new skills is an ongoing process and you can expect to spend some time learning how to use WordPress and PHP. If you're looking to be a **[WordPress developer](https://pixxelznet.com/wordpress-development/)**, you'll have to learn a few different web technologies. Besides WordPress itself, you'll need to master a few other web technologies in order to build a strong foundation for your career. For example, you can use GitHub to manage your projects. This will allow you to track changes and save your files in a safe location. The more knowledge you have about these topics, the more successful you'll be. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/y13nfedz6gu7el3ntmdg.jpg) The WordPress development platform requires you to know a variety of different programming languages and frameworks. You must learn how to write code in PHP to get your site running, as well as other important tools that make it run smoothly. You also have to learn some non-tech skills, such as project management. A good WordPress developer should be able to define requirements, forecast, and schedule work. In addition, you'll need to practice your communication skills, as the lack of communication can end up ruining the success of a project. The WordPress platform is easy to learn. But, there are many other skills that you need to have in order to become a good WordPress developer. Among these skills, learning to manage projects is essential. Project management is essential for the successful completion of a project. 
It's the ability to effectively define requirements, forecast, and schedule work in the best possible way. You should also practice your communication skills, as poor communication leads to more projects failing. As a WordPress developer, you need to have a good understanding of various web technologies. The WordPress core is the software available to the public for download. The WordPress core is maintained by a team led by co-founder Matt Mullenweg. Anyone can contribute to this project by reviewing the code and reporting any security vulnerabilities. The handbook for core contributors is very detailed. If you're interested in creating custom themes, you'll need to have the knowledge of these markup languages. If you want to contribute to the WordPress community, you should learn the fundamentals of HTML and PHP. The CMS is an essential part of the internet, so you need to be familiar with the language. You need to know how to use PHP, because it is the backend of WordPress. You need to understand the HTML code to create a WordPress website. For example, a web page will look like an outline if it contains a section of content.
kumarravi577
937,257
How to program a chatbot that reads all your website and answers questions based on its content
I'm sure you often get questions from your visitors and think "but this is already on the website!"....
0
2021-12-27T07:24:49
https://xatkit.com/chatbot-open-question-qa-haystack/
haystack, chatbot, qa, machinelearning
I'm sure you often get questions from your visitors and think "but this is already on the website!". You then added a chatbot to your site to filter out the <a href="https://xatkit.com/pareto-principle-chatbot-intent-design/" target="_blank" rel="noopener">most common questions</a>. But what about all the rest? Adding more and more questions to the bot takes time. So, is there an easy way to <strong>let the chatbot find the answer by itself</strong> for those questions that are already answered somewhere in the (hundreds? thousands? of) pages, posts or other online documents you've already published?

YES! The key is to plug <a href="https://haystack.deepset.ai/" target="_blank" rel="noopener" data-schema-attribute="">Haystack</a> into your chatbot.

We have several <a href="https://huggingface.co/models?pipeline_tag=question-answering" target="_blank" rel="noopener">pretrained language models fine-tuned for Question Answering</a> that can be used to find an answer in a text, but they only work when the text length is really small. This is where Haystack comes into play. The Haystack architecture (see the featured image above) proposes a two-phase process:

<ul>
<li>A <em>Retriever</em> selects a set of candidate documents from all the available information. Among other options, we can rely on <a href="https://www.elastic.co/">ElasticSearch</a> to index the documents and return those that most likely contain the answer to the question</li>
<li>A <em>Reader</em> applies state-of-the-art QA models to try to infer an answer from each candidate</li>
</ul>

Now we just need to return these inferred answers together with additional information (context, confidence level,...) to help our users understand why we think this is the answer they were looking for.
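To make the two-phase idea concrete, here is a deliberately naive, self-contained Python sketch of the Retriever/Reader split (this is not the Haystack API: the real Retriever uses BM25 over ElasticSearch and the real Reader is a neural QA model):

```python
def retrieve(question, documents, top_k=2):
    """Phase 1: cheaply rank all documents by term overlap with the question."""
    q_terms = set(question.lower().split())
    ranked = sorted(documents,
                    key=lambda d: len(q_terms & set(d.lower().split())),
                    reverse=True)
    return ranked[:top_k]

def read(question, candidates):
    """Phase 2: inspect each candidate closely and return the best 'answer span'.
    Here we just pick the sentence with the highest term overlap."""
    q_terms = set(question.lower().split())
    best = None, -1
    for doc in candidates:
        for sentence in doc.split(". "):
            score = len(q_terms & set(sentence.lower().split()))
            if score > best[1]:
                best = sentence, score
    return {"answer": best[0], "score": best[1]}

docs = [
    "OCL lets you write constraints on top of your models",
    "WordPress exposes a REST API. You can read posts with it",
    "ElasticSearch indexes documents for fast retrieval",
]
question = "how can I write constraints on models"
print(read(question, retrieve(question, docs))["answer"])
```

The point of the split is cost: the Retriever touches every document with a cheap scoring function, while the expensive Reader only ever sees the handful of candidates that survive phase 1.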
All thanks to a completely <a href="https://github.com/deepset-ai/haystack" target="_blank" rel="noopener">free and open source NLP framework!</a> Let's see how we can benefit from Haystack by creating a chatbot able to answer software design questions for the <a href="https://wordpress.org/" target="_blank" rel="noopener">WordPress</a> website <a href="https://modeling-languages.com/" target="_blank" rel="noopener" data-schema-attribute="">modeling-languages.com</a>. Haystack website is full of useful examples, so we'll adapt them in our scenario. <h2>Creating the chatbot, the "front-end"</h2> The easiest part is to create the chatbot. We'll obviously use <a href="https://github.com/xatkit-bot-platform/xatkit" target="_blank" rel="noopener" data-schema-attribute="">Xatkit</a> for this. The bot can have as many intents as you wish. The only part that we care about here is the <a href="https://xatkit.com/define-meaningful-fallbacks-for-your-chatbot/" target="_blank" rel="noopener" data-schema-attribute=""><em>default fallback </em></a>state. Here, instead of saying something useless, e.g. "sorry I didn't get your question, can you rephrase it and try again?", we will ask Haystack to find us a solution. {% gist https://gist.github.com/jcabot/5d3ef3282e6430acf5a709755b188f49.js %} Here we have all the pieces locally deployed but obviously each of them could be in a different server. <h2>Loading the information to ElasticSearch</h2> Before we can find an answer, we need to first power up ElasticSearch with the documents we want to use as information source for the bot. In this example, the documents will be all posts published in modeling-languages. We assume we have direct access to the WordPress database but otherwise we could write something similar using the <a href="https://developer.wordpress.org/rest-api/" target="_blank" rel="noopener" data-schema-attribute="">WordPress REST API </a>instead. 
Note that we split each post into different paragraphs to avoid chunks of text that could be too much for the QA models.

{% gist https://gist.github.com/jcabot/7ca0c3d437659eb27ca1b6e8fa047bbe.js %}

I deployed a <a href="https://flask.palletsprojects.com/en/2.0.x/" target="_blank" rel="noopener" data-schema-attribute="">Flask</a> server to facilitate calling all the endpoints on demand, especially those that the bot needs to interact with.

<h2>Finding the answer</h2>

The <em>Retriever</em> component will look for the most promising documents in ElasticSearch (by default using the <a href="https://www.elastic.co/blog/practical-bm25-part-2-the-bm25-algorithm-and-its-variables" target="_blank" rel="noopener" data-schema-attribute="">BM25 algorithm</a>, but there are other options). The <em>Reader</em> will look into each candidate and try to find the right answer in it. Thanks to the predefined <a href="https://haystack.deepset.ai/reference/pipelines" target="_blank" rel="noopener" data-schema-attribute="">Pipelines</a> provided by Haystack, putting everything together is really easy:

{% gist https://gist.github.com/jcabot/382c617c2cb753e47f23803d05c278e3.js %}

Once we have the answer, we create the response object that will be sent back to the chatbot. As a final step the chatbot will print this response to the user together with the URL of the post the answer comes from. This way, even if the answer is not perfect, the user will have the option to go to the suggested URL.

<h2>But, does it work?</h2>

We've seen it's feasible to add the Haystack infrastructure to a chatbot. But what about the quality of the answers? Are they good enough? The answer is that it works reasonably well. The modeling-languages website was not the easiest one to try with. It's rather large (over 1000 posts that translate into around 8000 documents) with significant overlap. And there are still some "legacy" posts in Spanish that add to the confusion.
Let's see a couple of examples. In the first one I ask how I can add a business rule (i.e. a constraint) to a software design model. The first answer is technically correct (indeed, constraints are written on top of models) but rather useless. The next two are exactly the answers I was hoping to see, as they suggest using the Object Constraint Language to specify my constraints.

<img class="size-full wp-image-120111 aligncenter" src="https://xatkit.com/wp-content/uploads/2021/12/OCLExample.png" alt="Question Answering example on a WordPress site" width="375" height="632" />

The second question is more concrete but has a more open answer. Note that all answers are taken from the same document (the only one that really talks about this Temporal EMF tool). All answers are reasonable but the third one really nails it. And keep in mind that we're using an extractive QA model, meaning that the model aims to return a subset of the text containing the answer. Instead, a generative QA model (also available in Haystack) would be able to "build" the answer from partial answers, potentially spread out over more than one document.

<img class="size-full wp-image-120112 aligncenter" src="https://xatkit.com/wp-content/uploads/2021/12/temporalExample.png" alt="QA haystack example with a more concrete question" width="374" height="625" />

In terms of <strong>performance, the results were very satisfactory</strong>. The whole process took just a few seconds (after the initial model loading), and that was on my poor laptop. With proper configuration and tuning, the user should not notice a major delay. <a href="https://haystack.deepset.ai/guides/rest-api" target="_blank" rel="noopener">Haystack itself can also be deployed as a REST API</a>, which should optimize the whole process even more. And of course, you could always let the bot designer configure whether to use Haystack in the default fallback or not, depending on a number of factors.
jcabot
959,258
Deploy Angular Application With NGINX and Docker
Hello folks, from past few months i was reading about the docker and deployment stuffs, so i thought...
0
2022-01-21T06:04:27
https://dev.to/ritesh4u/deploy-angular-application-with-nginx-and-docker-3jf6
angular, docker, nginx, webdev
Hello folks, from past few months i was reading about the docker and deployment stuffs, so i thought it will be useful to share the steps which i usually follow. > **Note:** I presumed that u already know about the docker and how angular build takes place If you didn't know much about docker you can go through link below > https://docs.docker.com/get-started/overview/ If you want to know more about angular you can go through link below > https://angular.io/cli/build If you want to know more about nginx you can go through link below > https://nginx.org/en/docs/ Before start we need few things to be setup correctly 1) Nodejs > https://nodejs.org/en/download/ 2) Angular CLI > https://angular.io/cli 3)Docker > https://docs.docker.com/get-docker/ So, Lets create simple angular application for this blog You can skip this step 1 if you have app with you ## 1) On Terminal run below command to create angular application `ng new angular-docker-blog` ## 2) Create 2 file with name Dockerfile , .dockerignore and nginx.conf in project root folder ![file location in root folder](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8qhsibdo4x2bqyrxhtba.png) Dockerfile will consist of commands which needs to execute when we are building docker image .dockerignore contains which file/folder we need to ignore while docker build takes place ### Dockerfile ``` # STEP-1 BUILD # Defining node image and giving alias as node-helper # It's better to define version otherwise me might face issue in future build FROM node:14-alpine3.15 as node-helper #Accepting build-arg to create environment specific build #it is useful when we have multiple environment (e.g: dev, tst, staging, prod) #default value is development ARG build_env=development #Creating virtual directory inside docker image WORKDIR /app RUN npm cache clean --force #Copying file from local machine to virtual docker image directory COPY . . 
# Installing deps for the project
RUN npm install

# Creating the angular build
RUN ./node_modules/@angular/cli/bin/ng build --configuration=$build_env

# STEP-2 RUN
# Defining the nginx image
FROM nginx:1.20 as ngx

# Copying compiled code from dist to the nginx folder for serving
COPY --from=node-helper /app/dist/angular-docker-blog /usr/share/nginx/html

# Copying the nginx config from local to the image
COPY /nginx.conf /etc/nginx/conf.d/default.conf

# Exposing the internal port
EXPOSE 80
```

### .dockerignore

```
.git
.gitignore
/node_modules
```

### nginx.conf

```
server {
    listen 80;
    sendfile on;
    default_type application/octet-stream;

    gzip on;
    gzip_http_version 1.1;
    gzip_disable "MSIE [1-6]\.";
    gzip_min_length 256;
    gzip_vary on;
    gzip_proxied expired no-cache no-store private auth;
    gzip_types text/plain text/css application/json application/javascript application/x-javascript text/xml application/xml application/xml+rss text/javascript;
    gzip_comp_level 9;

    root /usr/share/nginx/html;

    location / {
        try_files $uri $uri/ /index.html =404;
    }
}
```

## 3) Docker build command for creating the docker image

Open a terminal and run one of the commands below.

##### For creating a development build

```
docker build -t ad-blog:development .
```

##### For creating a tst build

```
docker build -t ad-blog:tst --build-arg build_env=tst .
```

##### For creating a production build

```
docker build -t ad-blog:production --build-arg build_env=production .
```

-t: Tag (if not specified, docker will use "latest" by default)

--build-arg: for passing a build argument; in our case we pass 'build_env' to tell Angular which environment to pick while creating the build.
## 4) Create the docker container

```
docker run -p 8080:80 -d ad-blog:tst
```

-p for defining the port mapping. Syntax: `[host-port]:[docker-port]`. Port 80 is exposed from the container and we are mapping it to 8080.

-d for running the container in detached mode, so Docker frees the console while the container keeps running in the background.

### Finally

If you followed the steps correctly you will have a docker container running on port 8080 and you will be able to access your application at [http://localhost:8080/](http://localhost:8080/)

### Extras

If you want to see running docker containers you can run this command:

```
docker ps
```

For stopping a docker container:

```
docker stop CONTAINER_ID
```

You will get the CONTAINER_ID from the `docker ps` command.

### GitHub Repo

If you want to see how I configured the different environments, consider checking angular.json and the environment folder here: [Github](https://github.com/ritesh4u/angular-docker-blog)
ritesh4u
959,556
Generate DinoFracture Engine pre-fracture meshes without scene GUI operation
DinoFracture Engine, an asset for adding to Unity fracture effect easier to any meshes is very nice...
0
2022-01-18T16:02:33
https://dev.to/headhigh/generate-dino-fracture-engine-pre-fracture-meshes-without-scene-gui-operation-245j
unity3d
---
title: "Generate DinoFracture Engine pre-fracture meshes without scene GUI operation"
published: true
description:
tags: Unity, Unity3d
---

DinoFracture Engine, an asset that makes it easier to add fracture effects to any mesh in Unity, is a very nice product.

https://assetstore.unity.com/packages/tools/physics/dinofracture-a-dynamic-fracture-library-26599

It can generate fractures at runtime and in the editor. But if you try to make fractures in the editor (not play mode), you have to make a specific scene, attach a PreFracturedGeometry component, and press the "Create Fractures" button.

In my case there are lots of meshes that need fractures, so I made a script for DinoFracture to generate fractures without the clicking work in the scene.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using UnityEngine;
using UnityEditor;
using DinoFracture;
using DinoFracture.Editor;

public class PreFractureInAssetFolder : PreFracturedGeometryEditor
{
    private const string PrefabFolderPath = "Assets/{your prefab folder path}";
    private const string MeshFolderPath = "Assets/{your mesh folder path}";
    private const string ExportFolderName = "PreFractured";
    private const string InsideMaterialPath = "Assets/{your inside material path}.mat";
    private const string StandardFractureTemplatePath = "Assets/DinoFracture/Plugin/Prefabs/StandardFracturePiece.prefab";

    [MenuItem("Tools/DinoFracture/PreFractureInAssetFolder")]
    public static void PreFracture()
    {
        // Filter the paths you want to generate fractures for
        string[] allPath = AssetDatabase.GetAllAssetPaths();
        var prefabPathList = allPath.Where(p => p.Contains(PrefabFolderPath))
            .Where(p2 => !p2.Contains(ExportFolderName)).ToList();
        var objectCount = prefabPathList.Count;
        for (var index = 0; index < objectCount; index++)
        {
            var prefabPath = prefabPathList[index];
            var contentsRoot = PrefabUtility.LoadPrefabContents(prefabPath);
            if (contentsRoot.GetComponent<MeshFilter>() == false)
            {
                Debug.Log("contentsRoot " + contentsRoot.name
                          + " doesn't have a MeshFilter");
                objectCount--;
                continue;
            }
            MakeFractureMeshes(contentsRoot, prefabPath, objectCount);
        }
    }

    private static void MakeFractureMeshes(GameObject contentsRoot, string prefabPath, int objectCount)
    {
        List<Tuple<Mesh, string>> meshAssetList = new List<Tuple<Mesh, string>>();
        List<Tuple<GameObject, string, string>> prefabAssetList = new List<Tuple<GameObject, string, string>>();
        var standardFractureTemplate = AssetDatabase.LoadAssetAtPath<GameObject>(StandardFractureTemplatePath);
        var insideMaterial = AssetDatabase.LoadAssetAtPath<Material>(InsideMaterialPath);
        var geom = contentsRoot.AddComponent<PreFracturedGeometry>();
        geom.FractureTemplate = standardFractureTemplate;
        geom.InsideMaterial = insideMaterial;
        // Set the parameters as you want
        geom.NumFracturePieces = 4;
        geom.NumIterations = 2;
        geom.NumGenerations = 2;
        geom.UVScale = FractureUVScale.EntireMesh;
        geom.GenerateFractureMeshes(
            (g) =>
            {
                g.GeneratedPieces.SetActive(true);

                // Generate meshes
                string meshSaveFolderPath = Path.Combine(MeshFolderPath, ExportFolderName, geom.gameObject.name);
                for (int i = 0; i < geom.GeneratedPieces.transform.childCount; i++)
                {
                    var pieceTransform = geom.GeneratedPieces.transform.GetChild(i);
                    // In my case MeshCollider causes too much CPU usage, so replace it with a BoxCollider
                    var meshCollider = pieceTransform.GetComponent<MeshCollider>();
                    GameObject.DestroyImmediate(meshCollider);
                    var boxCollider = pieceTransform.gameObject.AddComponent<BoxCollider>();
                    boxCollider.enabled = false;
                    var rigidbody = pieceTransform.GetComponent<Rigidbody>();
                    rigidbody.isKinematic = true;
                    MeshFilter mf = geom.GeneratedPieces.transform.GetChild(i).GetComponent<MeshFilter>();
                    if (mf != null && mf.sharedMesh != null)
                    {
                        meshAssetList.Add(new Tuple<Mesh, string>(mf.sharedMesh, meshSaveFolderPath));
                    }
                }

                // Save prefabs
                var prefabSaveFolderPath = Path.Combine(Path.GetDirectoryName(prefabPath), ExportFolderName);
                prefabAssetList.Add(new Tuple<GameObject, string, string>(g.GeneratedPieces, prefabSaveFolderPath,
                    contentsRoot.name + ".prefab"));
                if (prefabAssetList.Count >= objectCount)
                {
                    SaveProcess(meshAssetList, prefabAssetList);
                }
            });
    }

    private static void SaveProcess(List<Tuple<Mesh, string>> meshAssetList,
        List<Tuple<GameObject, string, string>> prefabAssetList)
    {
        Debug.Log("save process");
        // Stop asset import post-processing
        AssetDatabase.StartAssetEditing();

        // Save meshes
        foreach (var meshAsset in meshAssetList)
        {
            DirectoryInfo dir = new DirectoryInfo(meshAsset.Item2);
            if (!dir.Exists)
            {
                dir.Create();
            }
            else
            {
                // Delete old meshes
                foreach (FileInfo file in dir.GetFiles())
                {
                    file.Delete();
                }
            }
            string assetPath = Path.Combine(meshAsset.Item2, String.Format("{0}.asset", Guid.NewGuid().ToString("B")));
            AssetDatabase.CreateAsset(meshAsset.Item1, assetPath);
        }

        // Save prefabs
        foreach (var prefabAsset in prefabAssetList)
        {
            DirectoryInfo dir = new DirectoryInfo(prefabAsset.Item2);
            if (!dir.Exists)
            {
                dir.Create();
            }
            PrefabUtility.SaveAsPrefabAsset(prefabAsset.Item1, Path.Combine(prefabAsset.Item2, prefabAsset.Item3));
            GameObject.DestroyImmediate(prefabAsset.Item1);
        }

        meshAssetList.Clear();
        prefabAssetList.Clear();
        AssetDatabase.StopAssetEditing();
        AssetDatabase.SaveAssets();
        AssetDatabase.Refresh();
    }
}
```
takaakiichijo
959,764
Predictable UX
Good UX is predictable. I can operate it with my eyes closed because I know what is going to happen....
0
2022-01-18T19:48:34
https://kevincox.ca/2022/01/18/predictable-ux/
--- title: Predictable UX published: true date: 2022-01-18 16:20:00 UTC tags: canonical_url: https://kevincox.ca/2022/01/18/predictable-ux/ --- <main><p>Good UX is predictable. I can operate it with my eyes closed because I know what is going to happen. A predictable UX means that I don’t need to react to what my tool is doing.</p> <p>Imagine you were walking on a tile floor. The tiles look identical but may be grippy rubber or slick ice. If the floor is all rubber you can run across. If the floor is all ice you can slide almost as quickly. However if the tiles are mixed you can’t move quickly. You need to make every step carefully to decide if it will slip or stick. This is what using unpredictable UX feels like.</p> <h2 id="example"><a href="https://kevincox.ca/2022/01/18/predictable-ux/#example">Example</a></h2> <p>The <a href="https://support.google.com/android/answer/9079644">Android gesture navigation</a> is a good example of unpredictable navigation. It sounds like a good experience; you simply swipe up to access a list of recent windows, and it takes less space on the screen than a button. However, the left and right swipes are unpredictable, which makes it annoying to use.</p> <aside><p><strong>Note</strong>: I’m talking about swiping on the bottom edge. Not swiping in from either side which is the “back” gesture that goes back either within or across windows.</p></aside><p>At first, swiping right switches to the previous window. This is similar to double tapping the app switcher (square) button on the old three-button navigation. The flaw is that swiping right again can either go back another window or go to the new previous window (the one you were just looking at). I still don’t fully understand how long it takes for this heuristic to reset. 
Take an example user journey of copy+paste of a couple bits of text between two apps:</p> <ol> <li>Copy text in source app.</li> <li>Swipe right to go to previous window (sink app).</li> <li>Paste.</li> <li>How do I get back to the source app???</li> </ol> <p>Step 4 is unpredictable. If I was slow I need to swipe right, because the “previous” app is the source app. But if I am fast enough I need to swipe left because I am in the same navigation session!</p> <p>The three-button navigation got this right. Repeatedly double-tapping the app switcher button just cycled between the most recent two apps forever. It was perfectly predictable. I could switch between windows with my eyes closed, subconsciously and did not need to react to the device. It does mean that getting to the third or fourth most-recent window is slightly less convenient, but it is predictable. Given that I rarely remember what the third most recent window is anyways this is a better trade-off.</p> <p>If I were in charge of Android I would change gesture navigation as follows:</p> <ol> <li>Swipe right always goes to the previous window.</li> <li>Swipe left goes “back” so that they can give edge swipes back to apps. I really don’t need swiping from the entire either edge of my screen as a back gesture.</li> </ol></main>
kevincox
960,104
Introduction to Temporal Workflows
For the past 45 years, the database community has enjoyed an unparalleled developer experience:...
16,505
2022-01-24T20:20:43
https://docs.temporal.io/blog/dominik-workflow-part-1
temporal, workflows, distributedsystems
For the past 45 years, the database community has enjoyed an unparalleled developer experience: transactions mitigate failure in totality on a platform level, guaranteeing correctness on an application level. Despite many advancements in the past 20 years, the distributed systems community has not enjoyed an equivalent developer experience: there is no abstraction that mitigates failure in totality on a platform level, guaranteeing correctness on an application level. However, [Temporal](https://temporal.io) changes that equation!

## Introduction

Temporal’s core abstraction, its unit of execution, reliability, and scalability, is the Workflow. Therefore understanding the Workflow is key to understanding Temporal in general. In this blog post series we deep dive into the world of Temporal Workflows.

A Temporal Workflow is Temporal’s core abstraction. You may think of a Temporal Workflow Definition as a regular Function Definition—in fact, that is the developer experience that Temporal provides to its users—but a Workflow Execution provides stunning improvements over a regular Function Execution.

Let’s get to know Workflow Executions and contrast them to regular Function Executions with a straightforward example: sending reminder emails. Our use case requires that our application sends a reminder email once a month to any user who signed up for a trial period to upgrade their plan. In pseudo code our use case can be expressed as:

<pre>
<code>
<b>function</b> send monthly reminder (user) <b>do</b>
  <b>while</b> user has not signed up <b>do</b>
    send reminder to user
    sleep for 1 month
  <b>end</b>
<b>end</b>
</code>
</pre>

## Regular Functions

The pseudo code looks fairly straightforward; in fact, the pseudo code looks less like an implementation and more like a specification. Could we use the pseudo code as a blueprint for a regular Function Definition?
No, not at all:

> In a typical environment we cannot just invoke a function and expect the resulting function execution to reliably execute to completion — or, like in this case, execute indefinitely until cancelation.

As a result, we have to “break up” the process of Send Reminder Email into many different pieces, scattered across the tech stack: a cron job here, a message in a queue there, maybe a row in some database table, you know, for good measure. On top of that, now we need to worry about failures, retries, duplication, and idempotence.

An implementation on top of services like AWS Lambda and AWS Simple Queue Service might look like:

```typescript
// Lambda function is bound to
//   a. input queue "Reminder"
//   b. output queue "Reminder"
// We assume that
//   - messages are never lost
//   - messages may be duplicated
//   - messages are retried on failure
function SendReminderEmail(event, context): Message {
  // UserSignedUp, SendEmail, Get, Set will throw an
  // Exception on failure
  // event.user  Current user
  // event.iter  Current iteration to limit retries (here 2)
  if (!UserSignedUp(event.user)) {
    // Retrieve the k/v pair for this user and iteration
    let kv = Get(`${event.user}-${event.iter}`, 0);
    // Try at most twice
    if (kv.val < 2) {
      // Conditionally set the key. If the tag does not
      // match we are racing with another instance of
      // SendReminderEmail
      if (Set(kv.key, kv.val + 1, kv.tag)) {
        // This does not prevent us from calling SendEmail
        // twice. Do you see why?
        SendEmail(event.user);
      }
    } else {
      throw new Error("retries exhausted");
    }
  }
  return { message: { user: event.user, iter: event.iter + 1 }, after: "1month" };
}

// Start by queueing the message {user: "<User>", iter: 0} on
// the message queue "Reminder"
```

Listing 2 looks nothing like the pseudo code in Listing 1. Listing 2 does not tell the story of our use case — while not overly long or verbose, it is obscure and hard to reason about.
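The “try at most twice” bookkeeping that the Lambda listing spreads across queue messages and key/value state is, at heart, just a bounded retry loop. Here is a minimal plain-JavaScript sketch of that idea (the names `withRetry` and `flakySendEmail` are made up for illustration; note that an in-process loop like this is not resilient across process failures, which is exactly the gap Temporal fills):

```javascript
// Hypothetical bounded-retry helper: call fn() up to maximumAttempts times,
// rethrowing the last error if every attempt fails.
async function withRetry(fn, maximumAttempts) {
  let lastError;
  for (let attempt = 1; attempt <= maximumAttempts; attempt++) {
    try {
      return await fn(); // success: stop retrying
    } catch (e) {
      lastError = e;     // remember the failure and try again
    }
  }
  throw lastError;       // all attempts failed
}

// Example: an "activity" that fails once, then succeeds.
let calls = 0;
async function flakySendEmail() {
  calls++;
  if (calls < 2) throw new Error("SMTP timeout");
  return "sent";
}
```

With `withRetry(flakySendEmail, 2)`, the second attempt succeeds; a production retry policy would additionally handle backoff and, in Temporal's case, persist progress so retries survive process crashes.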
## Temporal Workflows

Obviously we cannot use the pseudo code as a blueprint for a regular Function Definition. However, could we use the pseudo code as a blueprint for a Temporal Workflow Definition? Well, yes, yes we can!

```typescript
import { proxyActivities, sleep } from '@temporalio/workflow';

const { sendReminderEmail, hasSignedUp } = proxyActivities({
  scheduleToCloseTimeout: '10 seconds',
  retry: {
    maximumAttempts: 2
  }
});

async function SendReminderEmail(user: string) {
  while (!await hasSignedUp(user)) {
    try {
      await sendReminderEmail(user);
    } catch (e) {
      // Thanks to Temporal's retry policy, we already
      // tried twice, better luck next month 🍀
    }
    await sleep("1 month");
  }
}
```

> In Temporal we can just invoke a Workflow Definition, and the resulting Workflow Execution reliably executes to completion — or, like in this case, executes indefinitely until cancelation.

Temporal Workflow Executions are to distributed systems what transactions are to databases: a great developer experience and (or maybe because of) peace of mind.

Doubts? Disbelief? Check out Part II and Part III to explore how Temporal implements this game-changing execution model.

Photo by <a href="https://unsplash.com/@8moments?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText">Simon Berger</a> on <a href="https://unsplash.com/s/photos/zen?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText">Unsplash</a>
dtornow
960,428
Deploy your React projects to AWS Elastic Beanstalk using CI/CD AWS CodePipeline (Part 2)
Introduction In Part 1, we created a React application and uploaded it to a GitHub Repo....
0
2022-01-19T17:46:15
https://dev.to/ogambakerubo/deploy-your-react-projects-to-aws-elastic-beanstalk-using-cicd-aws-codepipeline-part-2-3mch
devops, aws, react, tutorial
## Introduction

In [Part 1](https://dev.to/ogambakerubo/deploy-your-react-projects-to-aws-elastic-beanstalk-using-cicd-aws-codepipeline-part-1-1nne), we created a React application and uploaded it to a GitHub repo. We also created an Elastic Beanstalk application. Now, we will pick up where we left off and create a continuous integration/continuous deployment pipeline using CodePipeline.

### Create a pipeline

Type 'codepipeline' into the search bar. Select CodePipeline:

![Search for CodePipeline](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/40xy1x7evqfanp2cx9mr.png)

Then, click the `Create pipeline` button:

![Click Create pipeline](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nbbmr08lc4s60co9oalb.png)

Type in a name for your pipeline. Leave everything else as it is, then click next:

![Type in Pipeline Name](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/j77kkb01zuob3c9v9zms.png)

Next, we will select the code source. Choose 'GitHub (Version 1)' for this tutorial. Click the `Connect to GitHub` button:

![Connect to GitHub](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/5rtn0lmkf77s1d9ht4tw.png)

You will be prompted to authorize an AWS CodePipeline connection:

![Authorize AWS CodePipeline](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6f91gavzurcor1d7i3x5.png)

Afterwards, confirm the new configurations made:

![Confirm Changes](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qt7v7mw1jfg8encednxn.png)

Choose the `react-demo` repo and the branch `main` from the drop-down menus.
Then click 'Next':

![Select Repo and Branch](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/w0ij91wlzmn3het3lmso.png)

Skip the build stage:

![Skip the Build Stage](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/s812m72kqpp8sxqtklan.png)

![Skip the Build Stage](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gush9stpwvsjkvhk0qoa.png)

In the deployment stage, select the deploy provider as Elastic Beanstalk. Select the region where you launched the Elastic Beanstalk application. Choose the appropriate application name and environment:

![Deployment Stage](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0b3y4e57m7akus1mo2de.png)

Review the configurations, then click `Create pipeline`:

![Review and Create pipeline](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/azi2ukkw59nfnk9w0fbr.png)

It will take a couple of minutes for your pipeline to finish setting up and deploy your application. You should see a success message once it's complete:

![Pipeline Successfully Created](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nfrpzzckkq207v8w81rf.png)

Navigate back to the Elastic Beanstalk application:

![Elastic Beanstalk Application](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1g1skvh410t14g6gbskl.png)

![Elastic Beanstalk Application](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dg81l95mf6695mwxpqjz.png)

Click this link and it will redirect you to the deployed React application:

![Deployed React Application](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ya6yug0v960l68ndliy6.png)

![Deployed React Application](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fp708vedafwvqylw1n7g.png)

Now, we'll make a small change to the application and we'll see the changes reflected on the website. Make a change to your local repo and push it to the GitHub repo:

```bash
git add .
git commit -m "Update React application"
git push -u origin main
```

In a couple of minutes, the website successfully updates:

![React Application Updated](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ovzchhcua95vc0fdvc26.png)

Congrats, you have successfully set up an automated continuous integration and continuous deployment pipeline. You can continue to make changes to your application and watch them get rolled out in near real-time.

Happy Coding!
ogambakerubo
960,689
Learn how to use group in mongoDB aggregation pipeline (with Exercise)
The group stage in the MongoDB aggregation pipeline helps us group data using any field in the...
0
2022-01-19T15:58:16
https://rrawat.com/blog/group-in-mongodb
database, mongodb, programming, webdev
The group stage in the MongoDB aggregation pipeline helps us group data using any field in the MongoDB document. It is one of the most important and commonly used stages in the MongoDB aggregation pipeline.

In this article, we'll look at the $group stage and use various features that it provides. We'll be working with a sample collection of movies. There's going to be a playground link for each query so you can practice and learn by doing.

There's also an exercise at the end of this article for you to try out. It will help you solidify your understanding once you've finished this article.

Here are the things we will cover in this article:

- Find distinct using group by ♟
- Group by using multiple fields 🎳
- Using accumulator functions 🧩
- Using with $project ⚙️
- Sorting the results 📈
- $group vs $project stage 🪞
- Thing to note 📝
- Conclusion 🎬
- Exercise 🎐

## Establishing the data

Before we jump into the aggregation pipeline and the group stage, we need some data to work with. I'm taking an example `Movies` collection for understanding the concept here. Again, there'll be links to the playground for each query throughout the article.
Here's the `Movies` collection with only 5 documents containing random data: ```javascript { "name": "Spidey One way home", "release_year": "2021", "rating": 9, "starring": [ "Tom Hanks", "Tom Holland", "Mark Zucks", "Samy" ], "runtime": 120, "totalReviews": 2000, "director": "Jon What" }, { "name": "The Arrival of a Train", "release_year": "1896", "rating": 6, "starring": [ "Shawn Ching", "Looker Blindspot", "Tom Hanks" ], "runtime": 115, "totalReviews": 720, "director": "Ricky" }, { "name": "Lost persuit of adventure", "release_year": "2005", "rating": 7.1, "starring": [ "Jimmy simmon", "Catarina" ], "runtime": 150, "totalReviews": 823, "director": "Ricky" }, { "name": "Jungle Warrior", "release_year": "2016", "rating": 5.9, "starring": [ "Stormer", "Carmony", "Tom Hanks" ], "runtime": 150, "totalReviews": 1368, "director": "Whim Wailer" }, { "name": "The Last of the us all", "release_year": "2005", "rating": 8.5, "starring": [ "Samy", "George wise", "Pennywise" ], "runtime": 120, "totalReviews": 1800, "director": "Jon What" } ``` Now that we have our sample collection, it's time to explore the $group stage ⚡ ## Find distinct using group by To find the distinct items in a collection we can use the group stage on any field that we want to group by. This field will be unique in the output. Let's group the movies by their release year: ```javascript { $group: { _id: "$release_year" } } ``` ([playground link](https://mongoplayground.net/p/YZXjN1X0u_p)) Here's the output of the above query. Note that we only got unique release year values in the output. ``` [ { "_id": "1896" }, { "_id": "2016" }, { "_id": "2021" }, { "_id": "2005" } ] ``` ## Group by using multiple fields Similar to grouping by a single field, we might want to group the data with more than one field as per our use case. MongoDB aggregation pipeline allows us to group by as many fields as we want. 
Whatever we put inside the `_id` field is used to group the documents i.e., it returns all the fields present inside the `_id` field and groups by all of them. Let's group the movies by their release year and their runtime: ```javascript { $group: { _id: { "release_year": "$release_year", "runtime": "$runtime" } } } ``` ([playground link](https://mongoplayground.net/p/ff_dPScQfCI)) Grouping by release year and their runtime gives us this output: ``` [ { "_id": { "release_year": "2005", "runtime": 150 } }, { "_id": { "release_year": "2021", "runtime": 120 } }, { "_id": { "release_year": "2016", "runtime": 150 } }, { "_id": { "release_year": "2005", "runtime": 120 } }, { "_id": { "release_year": "1896", "runtime": 115 } } ] ``` Instead of using a single field to group by, we are using multiple fields in above scenario. The combination of release year and runtime acts as the unique identifier for each document. ## Using accumulator functions There are a lot of accumulator functions available in the group stage which can be used to aggregate the data. They help us carry out some of most common operations on the grouped data. Let's take a look at some of them: ### $count accumulator $count accumulator is used to count the number of documents in the group. This can be combined with our group by query to get the total number of documents in the group. Let's apply this to our movies collection: ```javascript { $group: { _id: "$release_year", totalMovies: { $count: {} } } } ``` ([playground link](https://mongoplayground.net/p/xq9UwbUNVSp)) We'll get the total movies released in each year: ``` [ { "_id": "2016", "totalMovies": 1 }, { "_id": "1896", "totalMovies": 1 }, { "_id": "2021", "totalMovies": 1 }, { "_id": "2005", "totalMovies": 2 } ] ``` ### $sum accumulator We can use the $sum accumulator to add up all the values in a field. 
Let's group the movies by their rating and sum up the reviews to understand if there's a correlation between movie rating and the number of reviews. ```javascript { $group: { _id: "$rating", totalMovies: { $sum: "$totalReviews" } } } ``` ([playground link](https://mongoplayground.net/p/DapK5ScHWI7)) And here we can see that there is a slight correlation between the number of reviews and the movie rating: ``` [ { "_id": 9, "totalMovies": 2000 }, { "_id": 6, "totalMovies": 720 }, { "_id": 5.9, "totalMovies": 1368 }, { "_id": 8.5, "totalMovies": 1800 }, { "_id": 7.1, "totalMovies": 823 } ] ``` ### $avg accumulator We might want to examine which year has the highest average movies rating for analytical purposes. Let's see how we can get those stats from our data: ```javascript { $group: { _id: { year: "$release_year", }, avgRating: { $avg: "$rating" } } } ``` ([playground link](https://mongoplayground.net/p/CK0kacF5bK4)) We are first grouping the movies by the release year and then calculating the average rating for each release year. Here's the output of the above query: ``` [ { "_id": { "year": "2016" }, "avgRating": 5.9 }, { "_id": { "year": "1896" }, "avgRating": 6 }, { "_id": { "year": "2021" }, "avgRating": 9 }, { "_id": { "year": "2005" }, "avgRating": 7.8 } ] ``` ### $push accumulator We want to look at all the ratings movies received for every release year. Let's use the $push accumulator to get all the movie names for each year: ```javascript { $group: { _id: { year: "$release_year", }, ratings: { $push: "$rating" } } } ``` ([playground link](https://mongoplayground.net/p/XQgtC8oT1_E)) All the movie ratings for each release year are pushed into an array: ``` [ { "_id": { "year": "1896" }, "ratings": [ 6 ] }, { "_id": { "year": "2016" }, "ratings": [ 5.9 ] }, { "_id": { "year": "2021" }, "ratings": [ 9 ] }, { "_id": { "year": "2005" }, "ratings": [ 7.1, 8.5 ] } ] ``` ### $addToSet accumulator You can consider this to be like the $push accumulator. 
$addToSet only adds the value to the array if it doesn't exist already. This is the only difference between $addToSet and $push. Let's group by rating and see which (unique) release years produced each: ```javascript { $group: { _id: { rating: "$rating" }, releasedIn: { "$addToSet": "$release_year" } } } ``` ([playground link](https://mongoplayground.net/p/1BL_71aFt4s)) We get ratings along with their unique release years: ``` [ { "_id": { "rating": 8.5 }, "releasedIn": [ "2005" ] }, { "_id": { "rating": 7.1 }, "releasedIn": [ "2005" ] }, { "_id": { "rating": 9 }, "releasedIn": [ "2021" ] }, { "_id": { "rating": 6 }, "releasedIn": [ "1896" ] }, { "_id": { "rating": 5.9 }, "releasedIn": [ "2016" ] } ] ``` ### $min accumulator Let's say we want to find out successful release years for the movies. A year is considered successful if the all the movies released during that year have rating greater than 7. Let's use the $min accumulator to get the successful years: ```javascript { $group: { _id: { year: "$release_year" }, minRating: { $min: "$rating" } } }, { "$match": { minRating: { $gt: 7 } } } ``` ([playground link](https://mongoplayground.net/p/S9CWxKB3GAc)) - We have grouped the movies collection using the `release_year` field. - In addition to that, we have added `minRating` field which maintains the minimum rating for each release year. - We have also applied a `$match` stage to filter out the years which don't have a minimum rating greater than 7. ``` [ { "_id": { "year": "2021" }, "minRating": 9 }, { "_id": { "year": "2005" }, "minRating": 7.1 } ] ``` ### $first accumulator This accumulator is different from the $first array operator which gives first element in an array. For each grouped documents, $first accumulator gives us the first one. Let's fetch the highest rated movie for every release year. Since we want to get the highest rated document from the each group, **we need to sort the documents before passing them to the group stage**. 
```javascript { "$sort": { "release_year": 1, "rating": -1 } }, { $group: { _id: "$release_year", highestRating: { $first: "$rating" } } } ``` ([playground link](https://mongoplayground.net/p/vSb0y0kiAjv)) We are sorting using two fields here, `release_year` and `rating`. Let's understand the output of sort stage first: ```javascript [ { "rating": 6, "release_year": "1896" }, { "rating": 8.5, "release_year": "2005" }, { "rating": 7.1, "release_year": "2005" }, { "rating": 5.9, "release_year": "2016" }, { "rating": 9, "release_year": "2021" } ] ``` ([playground link](https://mongoplayground.net/p/6aQNPVcPFXJ)) The output is first sorted on the basis of ascending release year and then for each year, the movies are sorted in descending order of rating. This sorted output is then passed to the group stage which groups the documents by their release year. For example, group stage is working with two documents for release year 2005: ``` { "rating": 8.5, "release_year": "2005" }, { "rating": 7.1, "release_year": "2005" } ``` Let's call these "shortlisted documents" for release year 2005. This happens for all (unique) release years. Group stage picks the first element from these shortlisted documents (which has the highest rating because ratings are sorted in descending order). Combining the sort and group stages, here's the final output of the query: ``` [ { "_id": "2016", "highestRating": 5.9 }, { "_id": "1896", "highestRating": 6 }, { "_id": "2021", "highestRating": 9 }, { "_id": "2005", "highestRating": 8.5 } ] ``` > **NOTE**: Passing sorted documents to $group stage does not guarantee that the order will be preserved. ## Using with $project The movie rating is a floating point number. We'll round that off to the nearest integer to get the movie rating as a whole number. 
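The sort-then-`$first` pattern above can be mimicked in plain JavaScript to see why the pre-sort matters (this is an analogy only, not MongoDB code; the variable names are made up):

```javascript
// Plain-JS analogy of: $sort { release_year: 1, rating: -1 } followed by
// $group { _id: "$release_year", highestRating: { $first: "$rating" } }
const movies = [
  { release_year: "2005", rating: 7.1 },
  { release_year: "2005", rating: 8.5 },
  { release_year: "2021", rating: 9 },
];

// Sort ascending by year, then descending by rating within each year.
movies.sort(
  (a, b) => a.release_year.localeCompare(b.release_year) || b.rating - a.rating
);

// "$first": keep only the first document seen for each group key.
const highestRating = {};
for (const m of movies) {
  if (!(m.release_year in highestRating)) {
    highestRating[m.release_year] = m.rating;
  }
}
// highestRating is { "2005": 8.5, "2021": 9 }
```

Without the sort, "first" would simply mean "whichever document happened to arrive first", which is why $first is only meaningful after an explicit ordering.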
Let's also group movies by their modified ratings:

```javascript
{
  "$project": {
    rating: { "$round": "$rating" }
  }
},
{
  $group: {
    _id: "$rating",
    movies: { $sum: 1 }
  }
}
```

([playground link](https://mongoplayground.net/p/OxX7-dQE22L))

- We used the $project stage to round off the rating to the nearest integer.
- We used the $group stage to group the movies by their modified rating.

Here's the output of the above query:

```
[
  { "_id": 7, "movies": 1 },
  { "_id": 8, "movies": 1 },
  { "_id": 9, "movies": 1 },
  { "_id": 6, "movies": 2 }
]
```

The possibilities are endless. You can combine many other stages, apply filters, add conditions, or even `$$REMOVE` the documents.

## Sorting the results

The year with the highest total movie minutes might give us some insight into movie production and its correlation with audience attention spans over the years. So let's understand how to achieve that:

```javascript
{
  $group: {
    _id: "$release_year",
    totalRuntime: { "$sum": "$runtime" }
  }
},
{ "$sort": { "totalRuntime": -1 } }
```

([playground link](https://mongoplayground.net/p/eoHglgc0A6G))

We are fetching the total runtime of all the movies released in a particular year and then sorting them in descending order with the help of the $sort stage:

```
[
  { "_id": "2005", "totalRuntime": 270 },
  { "_id": "2016", "totalRuntime": 150 },
  { "_id": "2021", "totalRuntime": 120 },
  { "_id": "1896", "totalRuntime": 115 }
]
```

It is evident from this query that the attention spans of the target audience have been decreasing in a non-uniform way over the years.

## $group vs $project stage

We have an n:1 relationship between input and output documents in the group stage. But we have a 1:1 relationship in the $project stage. In the group stage we usually get a count, sum, or average of documents based on the grouping key (or _id), or even build an array. All of these operations take n number of documents, and the output of the group stage is a single document with the aggregated values.
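To make the n:1 relationship concrete, here is a small, runnable plain-JavaScript sketch of what a $group with a $sum accumulator does (the sample documents are invented for illustration):

```javascript
// In-memory sketch of $group's n:1 behavior: many input documents collapse
// into one output document per grouping key. Sample data is made up.
const movies = [
  { release_year: "2005", runtime: 160 },
  { release_year: "2005", runtime: 110 },
  { release_year: "2021", runtime: 120 },
];

// Mimics: { $group: { _id: "$release_year", totalRuntime: { $sum: "$runtime" } } }
function groupAndSum(docs) {
  const totals = new Map();
  for (const doc of docs) {
    totals.set(doc.release_year, (totals.get(doc.release_year) || 0) + doc.runtime);
  }
  return [...totals.entries()].map(([id, sum]) => ({ _id: id, totalRuntime: sum }));
}

console.log(groupAndSum(movies));
// 3 input documents collapse into 2 output documents, one per unique _id
```

A $project stage, in contrast, would map each of the 3 input documents to exactly one output document.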
![Group Stage Example](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/l29uh0hpwdmtqmlvw9td.png)

On the other hand, we include/exclude fields and perform field transformations within a single document in the case of the [project stage in aggregation pipeline](https://rrawat.com/blog/mongo-aggregation-project-stage).

![Project Stage Example](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jjwk6d89rc8ziw4oyp50.png)

## Thing to note

**$group stage has a limit of 100 megabytes of RAM**

If you're working with a massive dataset and you receive an error during group stage execution, you might be hitting the memory limit. If you want to increase it, use the `allowDiskUse` option to enable the $group stage to write to temporary files on disk.

The reason for this issue is very well stated in the [mongoDB docs](https://docs.mongodb.com/manual/core/aggregation-pipeline-limits/#memory-restrictions):

> **NOTE**: Pipeline stages operate on streams of documents with each pipeline stage taking in documents, processing them, and then outputting the resulting documents. Some stages can't output any documents until they have processed all incoming documents. These pipeline stages must keep their stage output in RAM until all incoming documents are processed. As a result, these pipeline stages may require more space than the 100 MB limit.

## Conclusion

And that was how the group stage works in the mongoDB aggregation pipeline. We looked at how we can group data using single fields (distinct count) and multiple fields, how to sort the results, how we can carry out complex computations by adding conditions to the group stage, and the subtle difference between the group and [project stage](https://rrawat.com/blog/mongo-aggregation-project-stage).

I hope you find this useful and interesting. Let me know your thoughts and feedback on [Twitter](https://twitter.com/Rishabh570).

## Exercise

To make sure you understand the concepts, I have curated a couple of questions related to what we've learned in this article.
You can download the exercise PDF below. It also contains working mongoDB playground links containing the answers for all the questions. Be honest, don't cheat 🙂.

<a href='https://rrawat.com/static/images/group-in-mongodb/5-quick-questions-on-group-stage-with-answers.pdf' target="_blank" download>
5 quick questions on group stage with answers
</a>
rishabh570
960,728
How to get a raise in IT
We all know that tech is a pretty good career when it comes to money. Even so, you can...
0
2022-01-19T16:54:01
https://conoce.dev/como-obtener-un-aumento-de-sueldo-en-ti
spanish, career
We all know that tech is a pretty good career when it comes to money. Even so, you can hurt yourself by sticking to a lower salary than a company could offer. Even a small difference in pay turns into an impressive amount when you multiply it by 12 months in a year and all the years you'll be working. Let's make sure you don't miss out on any financial opportunity.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wq6uslh6t63byrhs1kgz.jpg)

## Talk about pay with your peers

How talking about money is perceived depends on the culture. I grew up in Poland, and for a long time talking about my income felt like bragging. I can see that being the case if a well-paid specialist talked about their income to people stuck in minimum-wage jobs. On the other hand, for people working in similar jobs, the possible outcomes are:

* There is a minimal difference in compensation
* Someone (maybe you) could find out they are underpaid.

There are a few reasons why this could happen:

* You lack some technical skill that has a significant impact on pay
* You lack negotiation skills
* The company you work for doesn't have the budget to pay better

And the people who learn they are paid better? They lose nothing. Beyond the cultural taboo, the only reason not to talk about money would be if you are more interested in the company's results than in your own well-being and that of your peers. For example, this could be the case if your ownership stake in the company is more significant than your share of the total payroll, or if you fully believe in your employer's mission.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/z9kuvuteyk0mok2cqokx.jpg)

## Evaluate the broader market regularly

If you are like me, your focus on your career fluctuates a lot over the years.
Sometimes I spend a lot of time learning new things and engaging with tech communities, and sometimes I have months where I do nothing IT-related besides my day job. There are always changes in the job market: new technologies gaining popularity, new companies in the local market offering better conditions, or faraway companies opening up to remote candidates. It's a good idea to make an appointment with yourself to check on those changes periodically. At a minimum, setting aside one afternoon a year to browse job offers should make you aware of how things are going outside your current company.

## Don't count on the company's initiative

The company you work for is interested in keeping its operations running without interruption. Maybe your manager's evaluation depends on how many people quit their team, but this gives them only a small incentive to keep your salary in line with the market. This incentive will be even smaller if you convince them that you like your job and are not looking for anything else right now. If the company has a policy of regular salary increases, it will most likely only match inflation, not the changes in the IT job market. If you are promoted to a different salary band that comes with a raise, it looks like an improvement, but you may be placed at the bottom of that new band.

## Prepare for the negotiation

In 2019 I bought myself [Fearless Salary Negotiation](https://bit.ly/FearlessSalaryNegotiation) as a birthday present. It was the best gift I've ever received, and it paid for itself in no time. The book gives plenty of practical advice: step-by-step guides on what to do before negotiating a salary. Dealing with a new company is a big part of it, but there are many things that will be useful in a more general context, and one chapter is dedicated to promotions and raises.
If you're on a tight budget, the book's author generously offers it [for free as well](https://bit.ly/FearlessSalaryNegotiationFree). You can get his advice on his website, in the short article: [how to write an email to your manager to start the conversation](https://bit.ly/FearlessSalaryNegotiationRaise).

## Ask for it!

Bringing up the topic of compensation with your manager can be stressful. Proper preparation should help you achieve your goal and help you with the stress. You can look at it from another perspective: a reasonable manager should appreciate that you talk to them first about your issue with your salary, instead of going straight to the competition. And if they turn out to be unreasonable? That means talking to other companies is a good idea.

## Don't be shy; let your colleagues know

Once you get your raise, you can let your colleagues know! The industry-standard practice of hopping from job to job for better compensation means your experienced colleague could one day be replaced by someone who knows nothing about your project, forcing you to carry a more significant workload. Sharing your good news can inspire other team members to try to do the same, improving morale and making sure they stay on the project longer. In the end, this will make your work easier and more pleasant.

## What's next?

It's great that you're reflecting on your career and trying to get the best deal for yourself. Since you made it this far, how about finding out whether your current workplace is right for you? Here's a [guide to evaluating your current company](https://conoce.dev/como-evaluar-su-lugar-de-trabajo-actual). If you have questions, I'm still accepting people for [free mentoring in JS and programming](https://how-to.dev/free-mentoring), so don't hesitate to reach out!
marcinwosinek
961,028
Communication Culture at the Gnar
Thanks to technology, we have more ways to communicate than at any other time in history. But I'm...
0
2022-01-19T22:38:06
https://www.thegnar.com/communication-culture-at-the-gnar
![Larkin-Birdsong](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/pi2q85x3sh2cts6tpqqd.jpeg) Thanks to technology, we have more ways to communicate than at any other time in history. But I'm sure I don't have to tell you that the average exchange has often become more difficult, burdensome, and ineffective. A person need only look at the ubiquitous comment section of nearly every website to see this. As a society, we seem to have been focusing on increasing the overall volume of communications while precious little resources are channeled into increasing the overall quality of the communications we participate in on a given day. This widespread shift in communication style is challenging for anyone to manage but at The Gnar, we occupy an interesting space in our day-to-day work. The nature of simultaneously being a consultant and a contributing software developer is a trick that we are required to perform successfully all the time and I'd say we're pretty good at it! However, one thing that can make or break our ability to do this consistently is our ability to communicate effectively. As journeymen/women who are required to constantly adapt to changing situations these skills need to remain sharp for us to remain effective and deliver value to our clients and team mates. However, unlike the technical training each of us has received in our respective areas of expertise, people rarely receive any training in how to be a highly effective listener and communicator. The best case scenario is that we stumble upon these skills by trial and error. Since we value these skills so highly as an organization, The Gnar has begun to take steps to make this kind of training more deliberate and effective. It all started one day when I was in a 1-on-1 with [Taylor Kearns, our Head of Consulting](https://www.github.com/taylorkearns) here at The Gnar. 
We had begun talking along these lines, and Taylor had mentioned to me the desire to bring more practical opportunities to Gnarnians for learning and practicing these skills. I remembered a workshop concerning [Imposter Syndrome](https://www.verywellmind.com/imposter-syndrome-and-social-anxiety-disorder-4156469#:~:text=Coping-,What%20Is%20Imposter%20Syndrome%3F,perfectionism%20and%20the%20social%20context.) that I had been a part of several years ago at another consultancy, led by a talented therapist, [Sarah Larkin-Birdsong](https://www.larkinbirdsong.com/), which I had found to be highly effective. I agreed that I would reach out to Sarah and see if we could set something up at The Gnar.

And so... The Gnar's first-ever Communication Workshop was born!

## <u>Key Takeaways</u>

###_"Listening and Empathy are skills to be practiced and built, not inherent qualities of certain people."_

Sarah started us off with a few icebreakers and exercises designed to illustrate that different aspects of communication may be hard for us merely due to a lack of practice, not a flaw in our personalities.

###_"There is no emotion that you have ever felt that hasn't been felt by millions of other people... billions."_

She then provided us with a practical guide to naming our emotions more specifically in order to help us help others to understand where we are coming from at a given moment.

![Emotion Words](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/32e7zufi3obib5pvame0.png)

<em>A larger emotional vocabulary helps us to be more effective communicators.</em>

###_"Reflecting back what you hear [people] say is the only way either of you know you heard them."_

Then we paired off and did an active listening exercise where one partner would speak for two minutes while the other partner simply listened. Then the listener would reflect back what they had heard for one minute. The result of this tended to illustrate:

1. How easy it is to really listen to someone and understand them.
2. How easy and effective it is to make someone feel heard.
3. How feeling truly heard can help strengthen relationships between people.

If you've never tried doing something like this, I'd highly recommend it. I think most of us found it to be a pretty revelatory experience.

###_"Guidelines for Difficult Conversations"_

After our active listening exercise was complete, Sarah led us through the guidelines for practicing non-violent communication:

####1. Speak from your own experience

A conversation takes place only between its participants, so avoid bringing others' opinions or perspectives into your conversations. It can muddy the waters of your communication here and now. Also, avoid assuming you understand the other party's perspective already and allow them to speak from their own experience.

####2. Listen and Validate (AKA "Yes, and...")

Show someone that you have heard what they said by reflecting/repeating it back to them and validating it. Understanding someone's perspective correctly is essential to effective communication. Validating that perspective provides common ground and promotes good will to further ease an interaction. Validation does not mean you forfeit your right to disagree or talk further about the matter, only that you acknowledge, understand, and respect their point of view.

####3. Focus on the behavior, not the person (who is performing the behavior), and on the impact, not the intention

Focusing on behaviors and impacts allows the conversation to feel less personal, which in turn can help de-escalate conversations before they become conflicts. It is also an opportunity for you to care for the other party by addressing the real consequences and feelings they are dealing with as a result of the present situation, as opposed to making them feel defensive as if their ego or character is under attack.

####4. Be kind: Truthful is not necessarily nice -- Take a risk into vulnerability

Most people are more comfortable receiving feedback than giving it. It tends to be a lot easier to gloss over issues than to do the uncomfortable work of confronting them. To do so, however, can be an incredibly generous thing to do. Addressing challenges when they are small and uncomfortable saves you and the other party from having to deal with them (and their outcomes) when they have snowballed and become overwhelming.

_"If there is an elephant in the room, and you are able to say something, there are probably other people there who are aware of it, who aren't able to say something. So you're likely saying it for a lot of people."_

###_"The Compass of Shame"_

Sarah brought us this image called the "Compass of Shame", which maps different stress responses, one or more of which are employed by most people. Recognizing your own behaviors when you yourself have been stressed/challenged/triggered can help you to approach conversations in a more balanced and centered way, as well as allowing you to recognize and release stress within yourself.

![Compass of Shame](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6cbkvn766m3s5b3rbamg.png)

_The Compass of Shame shows early warning signs that you are experiencing an over-activation of your stress response._

###_"Non-Violent Communication: A Crash Course"_

####1. "Urgency is a myth, there is always time"

Rushing to address issues and forcing a sense of urgency can lead to sloppy communication with unintended consequences. Take your time, approach the situation only when you are ready.

####2. Stay Curious

Approach interactions from a "Help me understand..." perspective. This is essentially the active listening/mirroring exercise we began the workshop with.

####3. Stay Collaborative

Communication is inherently a cooperative endeavor. Be sure you are sharing the work and the rewards of it with the other party equitably.
Say your piece and allow them to say theirs.

####4. Stay Optimistic

Most people have a tendency to approach difficult conversations with the worst-case scenario outcome in mind. This can inadvertently prime the conversation to go poorly. Allow yourself, the other party, and the conversation the benefit of the doubt.

_"If you can't imagine the conversation going well... how is it [ever] going to?"_

####5. First, address the facts...

Try to keep the discussion to things that are objective and indisputable as the conversation begins. Sticking to these types of topics helps to keep things focused on behaviors and impacts and not people and personalities.

_[NB](https://en.wikipedia.org/wiki/Nota_bene): "How something impacted you is a fact."_

_ex. "You've missed our weekly retro this week for the second time."_

####6. Then, address the feelings (impacts)...

Try to address the impacts that the other person's actions have had on you. It doesn't matter if two people would be impacted the same way or not by a certain action. Instead, it's more important to have a shared vocabulary and a mutual understanding of what the impact feels like or how it is experienced. Remember to try to be as specific as possible to avoid ambiguity and home in on that mutual understanding.

![Non-violent Communication Feeling Words](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/y9rcab4oymoautgopt78.png)

_Specificity in terms of how you've been impacted helps to clarify difficult conversations_

_ex. "When you miss retro I become very **frustrated**."_

####7. Then, address the needs...

In each workshop I've done with Sarah, I've found that there's usually one specific moment where she says something very simple that to me seems so earth-shatteringly profound I end up taking it with me forever after.
During _our_ workshop it happened when Sarah was addressing the subject of needs, and it's so good I am just going to include it in its entirety:

![Non-violent Communication Need Words](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/cg8p3rvfssjknszsgovz.png)

In reference to the above table of needs:

_"If you question your right to [any of] these, that is a really good thing to explore... because you are entitled to all of this... as a human being on this earth, you are entitled to all of these things, and if anyone ever suggests that you are not [entitled to all of these things]... that is... 🚩information🚩. Also, if you think you're entitled to more than this... that is [also] something to explore."_

_ex. "I think I became so **frustrated** because retro is really important to me and the team, and it doesn't feel like you're giving it the **respect** it deserves..."_

####8. Finally, make a request...

This should be something actionable that the other person who impacted you can do to avoid impacting you similarly in the future.

_ex. "Hey, if you're going to miss retro can you please check in with the rest of the team about what was discussed? Also, can you make an extra effort to be at the next one?"_

###_Bonus: How to give a good apology_

####1. Acknowledge what went wrong

Again, active listening/mirroring.

####2. Apologize for what went wrong

An expression of regret, an acknowledgement of your impact.

####3. Make a commitment for future behavior

Apologies without a commitment to do better are empty.

####4. Say "Thank you" for the feedback

As above, giving feedback is a courageous and generous act for which you should express gratitude.

##<u>Final thoughts...</u>

Active listening and non-violent communication aren't easy. That's probably why you don't experience them more often in your everyday life. The good news is... with practice, you can become the source!

On the whole, I couldn't have been more pleased.
Not only with the workshop and the valuable skills presented within, but with my fellow Gnarnians, who approached the whole exercise with their typical brand of honesty, openness, and humility. The Gnar is definitely a place where ideas and skills like these are highly valued, not simply for the practical reasons I stated in the opening, but because of the type of person who tends to work here. It's been great to be a part of a work environment where that is true, and I am very excited to see so many colleagues (and management too!) take the time to look at ways they can become even better communicators.

I am excited to see all this in action in my day-to-day here at The Gnar as folks practice and deepen these learnings. I also look forward to the next workshop... whatever that might be. See you there 😉

You can learn more about Sarah and her work at [https://www.larkinbirdsong.com/](https://www.larkinbirdsong.com/).

_Learn more about [The Gnar and our team](https://www.thegnar.com/about)._
llanddewilovesyou
963,811
Why I don't like Jira
Yesterday I hinted that I’m not a big fan of Jira. In the interest of keeping this short, I won’t go...
0
2022-06-18T19:48:31
https://jhall.io/archive/2022/01/22/why-i-dont-like-jira/
jira, agile, tools, backlog
--- title: Why I don't like Jira published: true date: 2022-01-22 00:00:00 UTC tags: jira,agile,tools,backlog canonical_url: https://jhall.io/archive/2022/01/22/why-i-dont-like-jira/ --- [Yesterday](https://jhall.io/archive/2022/01/21/what-makes-an-agile-tool-good/) I hinted that I’m not a big fan of Jira. In the interest of keeping this short, I won’t go into a long list of Jira failings. Instead I’ll just address the two prongs of the framework I established yesterday: > Does the tool make useful things easier, and unuseful things harder? Jira is an incredibly “powerful” tool. It can be configured six ways from Sunday. This makes it a popular choice for teams that have complex project management needs. But agile teams, [by definition](https://agilemanifesto.org/), should favor “individuals and interactions over processes and tools”, and [strive for](https://agilemanifesto.org/principles.html) simplicity, or “the art of maximizing the amount of work not done.” Work about work (i.e. issue management) is part of the work that we ought to maximize _not doing_. So based on the premise that _agile teams_ need lightweight processes, and want to maximize the amount of work not done, is Jira a good tool for this purpose? Let’s look. ## Does Jira make it easy to do useful things? Not really. Jira definitely makes it _possible_ to do these things. Jira can do almost anything, if properly configured. But it’s not _easy_ to configure Jira to be lightweight. It probably requires several weeks (or longer) of trial and error, or a highly experienced expert to configure things to be “lightweight”. But on to the second of my criteria: ## Does Jira make it difficult to do unuseful things? Again, not really. The fact that Jira is so configurable means that it really doesn’t offer any enabling constraints by default. In fact, Jira often encourages a lot of anti-patterns. I recently saw a post boasting that Jira now supports “larger backlogs”! 
🤦 Large backlogs are _not_ lightweight, but I’ll save that rant for another day. In my honest opinion, Jira is responsible for more lost human productivity than any other software ever created. And yes, that’s even when considering Windows Me! * * * _If you enjoyed this message, [subscribe](https://jhall.io/daily) to <u>The Daily Commit</u> to get future messages to your inbox._
jhall
961,205
Tutorial: How to Deploy Multi-Region YugabyteDB on GKE Using Multi-Cluster Services
The evolution of “build once, run anywhere” containers and Kubernetes—a cloud-agnostic,...
0
2022-01-20T08:58:44
https://blog.yugabyte.com/multi-region-yugabytedb-on-gke/
database, distributedsql, kubernetes, googlecloudplatform
---
title: "Tutorial: How to Deploy Multi-Region YugabyteDB on GKE Using Multi-Cluster Services"
published: true
date: 2022-01-18 15:32:16 UTC
tags: Databases,DistributedSQL,Kubernetes,GoogleCloudPlatform
canonical_url: https://blog.yugabyte.com/multi-region-yugabytedb-on-gke/
---

The evolution of “build once, run anywhere” [containers](https://www.docker.com/resources/what-container) and [Kubernetes](https://kubernetes.io/)—a cloud-agnostic, declarative-driven orchestration API—has made a scalable, self-service platform layer a reality. Even though it is not a one-size-fits-all solution, a majority of business and technical challenges are being addressed. Kubernetes as the common denominator gives scalability, resiliency, and agility to internet-scale applications on various clouds in a predictable, consistent manner.

But what good is application-layer scalability if the data is still confined to a single vertically scalable server that can't exceed a predefined limit?

[YugabyteDB](https://www.yugabyte.com/yugabytedb/) addresses these challenges. It is an open source, distributed SQL database built for cloud native architecture. YugabyteDB can handle global, internet-scale applications with [low query latency](https://blog.yugabyte.com/how-to-achieve-high-availability-low-latency-gdpr-compliance-in-a-distributed-sql-database/) and [extreme resilience against failures](https://docs.yugabyte.com/latest/architecture/core-functions/high-availability/). It also offers the same level of internet scale as Kubernetes, but for data on bare metal, virtual machines, and containers deployed on various clouds.

In this blog post, we'll explore a multi-region deployment of YugabyteDB on [Google Kubernetes Engine (GKE)](https://cloud.google.com/kubernetes-engine) using [Google Cloud Platform's](https://cloud.google.com/) (GCP) native multi-cluster discovery service (MCS).
In a Kubernetes cluster, the “Service” object manifest facilitates service discovery and consumption only within the cluster. We need to rely on an off-platform, bespoke solution with Istio-like capabilities to discover services across clusters. But we can build and discover services that span across clusters natively with MCS. Below is an illustration of our multi-region deployment in action. ![Single YugabyteDB cluster stretched across 3 GCP regions.](https://blog.yugabyte.com/wp-content/uploads/2022/01/0114-Multi-Region-YB-on-GKE-Diagram-01.png) ## Assumptions We will find commands with appropriate placeholders and substitutions throughout this blog based on the following assignments. ![Multi-region deployment assumptions.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-05-at-4.22.34-PM.png) ## Step 1: Ensure Cloud Readiness We need to enable a couple of GCP service APIs to use this feature. Let’s allow the following APIs. > Enable Kubernetes Engine API. ![Ensure cloud readiness by enabling the Kubernetes Engine API](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-05-at-4.26.44-PM.png) > Enable GKE hub API. ![Enable cloud readiness by enabling the GKE hub API](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-05-at-4.30.08-PM.png) > Enable Cloud DNS API. ![Ensure cloud readiness by enabling the Cloud DNS API](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-05-at-4.43.33-PM.png) > Enable Traffic Director API. ![Ensure cloud readiness by enabling the Traffic Director API.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-05-at-4.46.57-PM.png) > Enable Cloud Resource Manager API. ![Ensure cloud readiness by enabling the Cloud Resource Manager API.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-05-at-4.51.50-PM.png) In addition to these APIs, enable standard services such as Compute Engine and IAM. 
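The console screenshots above can also be done from the CLI. A sketch with `gcloud services enable` (the exact service-name list is an assumption matching the consoles shown; run it against your own project):

```shell
# Sketch: enable the GCP service APIs from Step 1 via the CLI.
# Service names are assumptions based on the console screenshots above.
gcloud services enable \
  container.googleapis.com \
  gkehub.googleapis.com \
  dns.googleapis.com \
  trafficdirector.googleapis.com \
  cloudresourcemanager.googleapis.com
```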
## Step 2: Create three GKE Clusters (VPC native) > Create the first cluster in the **US region** with workload identity enabled. ![Create the first GKE cluster.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-05-at-4.55.18-PM.png) > Create the second cluster in the **Europe region** with workload identity enabled. ![Create the second GKE cluster in the Europe region.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-05-at-4.57.53-PM.png) > Create the third cluster in the **Asia region** with workload identity enabled. ![Create the third GKE cluster in the Asia region.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-05-at-5.00.40-PM.png) > Validate the output of the multi-region cluster creation. ![Validate the output of the multi-region GKE cluster creation.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-05-at-5.02.37-PM.png) ## Step 3: Enable MCS Discovery Enable the MCS discovery API and Services: ![Enable the MCS discovery API and Services](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.10.35-PM.png) As workload identity has already been enabled for the clusters, we need to map the Kubernetes service account to impersonate GCP’s service account. This will allow applications running in the cluster to consume GCP services. IAM binding requires the following mapping. ![Mapping the Kubernetes service account to impersonate GCP’s service account.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.12.00-PM.png) ## Step 4: Establish Hub Membership Upon successful registration, the Hub membership service will provision `gke-connect` and `gke-mcs` services to the cluster. These are CRDs and controllers to talk to GCP APIs in order to provision the appropriate cloud resources such as network endpoints, mapping rules, and others (as necessary). 
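If you prefer the CLI to the console, the cluster-creation and MCS steps above can be sketched roughly as follows. Everything here is an assumption for illustration: the cluster names, the project ID, the exact regions, and the IAM binding (which follows the shape documented for GKE multi-cluster Services):

```shell
# Illustrative sketch only: names, regions, and project ID are assumptions.
PROJECT_ID=my-gcp-project

# One VPC-native, Workload Identity-enabled cluster per region
for region in us-central1 europe-west1 asia-south1; do
  gcloud container clusters create "yb-${region}" \
    --region "${region}" \
    --enable-ip-alias \
    --workload-pool="${PROJECT_ID}.svc.id.goog"
done

# Turn on multi-cluster Services for the fleet
gcloud container hub multi-cluster-services enable --project "${PROJECT_ID}"

# Let the MCS importer read network endpoints (binding per GKE MCS docs)
gcloud projects add-iam-policy-binding "${PROJECT_ID}" \
  --member "serviceAccount:${PROJECT_ID}.svc.id.goog[gke-mcs/gke-mcs-importer]" \
  --role "roles/compute.networkViewer"
```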
This service creates a private managed hosted zone and a traffic director mapping rule on successful membership enrollment. The managed zone “clusterset.local” is similar to “cluster.local” but advertises the services across clusters to auto-discover and consume. > To get started, register all three clusters: ![Establish Hub membership by registering all three clusters.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.15.16-PM.png) > After successful membership enrollment, the following objects will be created. Managed private zone => “clusterset.local” ![Enable Hub membership by creating the following objects.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.16.48-PM.png) > From there, the traffic director initializes with the following mapping rule. ![Enable Hub membership by initializing the traffic director.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.18.04-PM.png) > Once the traffic director initializes, verify the membership status. ![Verify the member status of the traffic director.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.20.47-PM.png) This will also set up the appropriate firewall rules. When the service gets provisioned, the right network endpoints with the mapping rules would be automatically created, as illustrated below. ![Three GKE clusters connect by MCS.](https://blog.yugabyte.com/wp-content/uploads/2022/01/0114-Multi-Region-YB-on-GKE-Diagram-02.png) ## Step 5: Initialize YugabyteDB As we have been exploring MCS, it’s clear our upstream [Helm package](https://github.com/yugabyte/charts) won’t work out of the box. Let’s use the upstream chart to generate the template files locally and then make the relevant changes in the local copy. The template variable file for all three regions is available in the [gist](https://gist.github.com/srinivasa-vasu/407019af2090c8b1bd60bd3ed93426d1) repo. 
Download all three region-specific variable files and an additional service-export.yaml from the remote repo to the local machine and name them “ap-south1.yaml”, “eu-west1.yaml”, “us-central1.yaml”, and “service-export.yaml”.

As the upstream chart is not updated for cross-cluster service discovery, the broadcast address of the master and tserver service instances would refer to the cluster-local “svc.cluster.local” DNS entry. This needs to be updated explicitly with the new managed zone private domain that the hub created during cluster membership enrollment to let the instances communicate between clusters.

To get started, generate Helm templates for all three regions:

![Generate Helm templates for all three regions.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.23.15-PM.png)

Once we generate the template files, we have to update the broadcast address. More specifically, search for the text **“--server\_broadcast\_addresses”** in both the master and tserver StatefulSet manifest definitions and update both entries.

![Update both entries in the master and tserver StatefulSet manifest definition.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.24.33-PM.png)

The pattern is **[INSTANCE\_NAME].[MEMBERSHIP].[SERVICE\_NAME].[NAMESPACE].svc.clusterset.local.** This change is explained in the next section.

Finally, get the container credentials of all three clusters. Once we get the kubeconfig, the local context would be similar to:

![Obtain the container credentials of all three clusters.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.28.37-PM.png)

## Step 6: Deploy YugabyteDB

Connect to all three cluster contexts one by one and execute the generated template file using the kubectl CLI.
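Since these commands appear only as screenshots, here is a rough plain-shell sketch of the two steps. The release name, chart reference, and kubeconfig context names are assumptions, and the rendered manifests still need the broadcast address edit before they are applied:

```shell
# Step 5 (sketch): render one manifest per region from the upstream chart,
# using the region-specific variable files downloaded earlier.
helm template yb-demo yugabytedb/yugabyte -f us-central1.yaml > us-central1-manifest.yaml
helm template yb-demo yugabytedb/yugabyte -f eu-west1.yaml > eu-west1-manifest.yaml
helm template yb-demo yugabytedb/yugabyte -f ap-south1.yaml > ap-south1-manifest.yaml

# Step 6 (sketch): after editing the broadcast addresses, apply each
# manifest against its own cluster context.
kubectl --context us-central1-ctx apply -f us-central1-manifest.yaml
kubectl --context eu-west1-ctx apply -f eu-west1-manifest.yaml
kubectl --context ap-south1-ctx apply -f ap-south1-manifest.yaml
```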
![Connect to all three cluster contexts one by one and execute the generated template file using kubectl CLI.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.29.46-PM.png) Now, let’s explore the **service-export.yaml** file: ![Explore the service-export.yaml file.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.31.36-PM.png) **“ServiceExport”** CRD is added by the MCS service. This will export services outside of the cluster to be discovered and consumed by other services. The controller programmed to react to events from this resource would interact with GCP services to create ‘A’ name records in the internal private managed zone for both yb-tservers and yb-masters headless services. If we verify the “clusterset.local” domain in the console, we would see the following records: ![Verify the “clusterset.local” domain in the console.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.32.35-PM.png) For the exported headless services, there will be two “A” name records: ![There will be two “A” name records for the exported headless services.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.33.44-PM.png) This is similar to how the in-cluster “svc.cluster.local” DNS works. The DNS creation and propagation would take some time (around 4-5 mins) for the first time. YugabyteDB wouldn’t be able to establish the quorum until the DNS records are made available. Once those entries get propagated, the cluster would be up and running. ![An up-and-running YugabyteDB cluster.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.35.20-PM.png) When we verify the Kubernetes cluster state for the master and tserver objects, we will find all of them in a healthy state. This is represented below using the [Octant](https://github.com/vmware-tanzu/octant) dashboard. 
![Verify the Kubernetes cluster state for the master and tserver objects.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.36.48-PM.png) ![Verify the Kubernetes cluster state for the master and tserver objects.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.39.07-PM.png) We can simulate the region failure by bringing down the StatefulSet replica to zero. Upon failure, the RAFT consensus group reacts and adjusts accordingly to bring the state to normalcy as we still have two surviving instances. ![Simulate the region failure by bringing down the StatefulSet replica to zero.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.39.46-PM.png) Once we scale up the replica count, the node joins back with the cluster group, and all three regions are again back to a healthy state. ![The node joins back with the cluster group, and all three regions are again back to a healthy state.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.40.57-PM.png) ### Good usage pattern Because of the distributed nature of data and the associated secondary indexes in a multi-region deployment, it is beneficial to pin a region as the preferred region to host the tablet leaders. This keeps the network latencies to a minimum and is confined to a region for cross-node RPC calls such as multi-row transactions, secondary index lookups, and other similar operations. As you can see, this is one of the best usage patterns to improve network latencies in a multi-region deployment. 
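Leader pinning itself is done with yb-admin's `set_preferred_zones` command. As a hedged sketch (the pod name, master addresses, and placement identifiers below are assumptions; use the values from your own deployment):

```shell
# Mark us-central1 as the preferred location for tablet leaders.
# MASTER_ADDRESSES and the placement info "gke.us-central1.us-central1-a"
# are illustrative placeholders, not values from this tutorial.
kubectl --context us-central1-ctx exec yb-master-0 -- \
  /home/yugabyte/bin/yb-admin \
    -master_addresses "$MASTER_ADDRESSES" \
    set_preferred_zones gke.us-central1.us-central1-a
```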
## Step 7: Cleanup Delete YugabyteDB: ![Delete YugabyteDB.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.42.41-PM.png) Unregister the Hub Membership: ![Unregister the Hub Membership.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.43.33-PM.png) Disable the MCS APIs: ![Disable the MCS APIs.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-10-at-3.44.34-PM.png) And finally, delete the GKE clusters. ## Conclusion This blog post used GCP’s multi-cluster service discovery to deploy a highly available, fault-tolerant, and geo-distributed YugabyteDB cluster. As shown in the illustration below, a single YugabyteDB cluster distributed across three different regions addresses many use cases, such as region local delivery, geo-partitioning, higher availability, and resiliency. ![A single YugabyteDB cluster distributed across three different regions addresses many use cases.](https://blog.yugabyte.com/wp-content/uploads/2022/01/Screen-Shot-2022-01-18-at-10.06.21-AM.png) Give this tutorial a try—and don’t hesitate to let us know what you think in the [YugabyteDB Community Slack channel](https://www.yugabyte.com/community/). The post [Tutorial: How to Deploy Multi-Region YugabyteDB on GKE Using Multi-Cluster Services](https://blog.yugabyte.com/multi-region-yugabytedb-on-gke/) appeared first on [The Distributed SQL Blog](https://blog.yugabyte.com).
humourmind
961,329
Tailwind CSS Pseudo-elements
I only learned that Tailwind recently added the option to style pseudo-elements. Ever since the...
0
2022-01-20T05:51:59
https://daily-dev-tips.com/posts/tailwind-css-pseudo-elements/
tailwindcss, css, beginners
I only learned that Tailwind recently added the option to style pseudo-elements. Ever since the introduction of [Tailwind JIT](https://daily-dev-tips.com/posts/why-tailwind-jit-compiler-is-amazing/) it turns out we can now also leverage pseudo-elements in Tailwind! Let's look at how it works and what we can do with them.

## What are pseudo-elements

If you are [not aware of pseudo-elements](https://daily-dev-tips.com/posts/css-pseudo-elements/), they are similar to pseudo-classes like `:hover`, `:first`, etc. The difference is that `pseudo-classes` are existing elements that get styled differently, whereas `pseudo-elements` are new elements. They can give us the superpower to add new styled elements to the DOM. Another way to identify `pseudo-elements` is that they always start with two `::` where the classes only use one `:`. Let's look at each of the pseudo-elements and how we can use them in Tailwind CSS.

## Tailwind CSS first-line pseudo-element

This pseudo-element can manipulate the first line of a specific paragraph. Let's say we want to make the first line of an article blue, so it pops a bit more. While we are at it, we could also transform the first line to uppercase.

```html
<p class="first-line:uppercase first-line:tracking-widest first-line:text-blue-500">
  Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.
</p>
```

This will result in the following:

{% codepen https://codepen.io/rebelchris/pen/dyVqwqz %}

## Tailwind CSS first-letter pseudo-element

Like the `first-line` selector, we can also target the first letter.
You often see this in those old-school books giving a nice effect. I personally really love this effect, and this is how you use it in Tailwind CSS.

```html
<p
  class="first-letter:text-7xl first-letter:font-bold first-letter:mr-3 first-letter:float-left first-letter:text-teal-500"
>
  Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.
</p>
```

The result of the first-letter will look like this:

{% codepen https://codepen.io/rebelchris/pen/JjrawqN %}

## Tailwind CSS before pseudo-element

The before pseudo-element is perfect for adding that extra new element to the DOM, which you can use to add nice effects to certain elements. Let's try and create a fun background for an image. We want the image to show, but there should be a different colored div with an angle on the background.

```html
<div
  class="relative before:block before:absolute before:-inset-1 before:-rotate-6 before:bg-teal-500"
>
  <img class="relative border-4 border-white" src="img.jpg" />
</div>
```

Which will result in the following:

{% codepen https://codepen.io/rebelchris/pen/NWaLeQV %}

## Tailwind CSS after pseudo-element

The after element can be used the same way as the before element. Let's try something else for this one. We often have forms with required fields. Let's add a red `*` for the required fields.
```html <label class="block"> <span class="after:content-['*'] after:ml-0.5 after:text-red-500 block text-sm font-medium text-gray-700" > Email </span> <input type="email" name="email" class="mt-1 px-3 py-2 bg-white border shadow-sm border-gray-300 placeholder-gray-400 focus:outline-none focus:border-sky-500 focus:ring-sky-500 block w-full rounded-md sm:text-sm focus:ring-1" placeholder="you@example.com" /> </label> ``` Resulting in this amazing piece: {% codepen https://codepen.io/rebelchris/pen/mdBGvVz %} ## Tailwind CSS selection pseudo-element I'm sure you have seen this before, you select a piece of text, and the color is different. That is done by using the `selection` pseudo-element. It looks like this: ```html <p class="selection:bg-teal-500 selection:text-white"> Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum. </p> ``` Try it out by selecting some text: {% codepen https://codepen.io/rebelchris/pen/gOGdqrZ %} ## Conclusion Now that we can use these selectors in Tailwind, there is almost no need for any custom CSS while using Tailwind. I'm thrilled these are now so well supported, and I'm sure it will be a game-changer. If you want to read up more, the [official docs of Tailwind](https://tailwindcss.com/docs/hover-focus-and-other-states#pseudo-elements) are always a gem of information. ### Thank you for reading, and let's connect! Thank you for reading my blog. Feel free to subscribe to my email newsletter and connect on [Facebook](https://www.facebook.com/DailyDevTipsBlog) or [Twitter](https://twitter.com/DailyDevTips1)
dailydevtips1
961,672
Responsive background images with image-set, the srcset for background-image
Source sets can help us to make websites load faster. We can use them in different ways to offer...
16,136
2022-01-24T09:56:45
https://dev.to/ingosteinke/responsive-background-images-with-image-set-the-srcset-for-background-image-259a
css, todayilearned, webdev, tutorial
Source sets can help us to make websites load faster. We can use them in different ways to offer browsers alternative versions of the same image to match screen size, pixel density, or network speed.

## A Source Set for Background Images

The [image-set](https://developer.mozilla.org/en-US/docs/Web/CSS/image/image-set()) property allows us to do the same for background images in CSS. This feature has been [requested for years](https://stackoverflow.com/questions/26801745/is-there-a-srcset-equivalent-for-css-background-image), but it did not get the same hype as other, newer CSS features like [parent selectors](https://dev.to/ingosteinke/css-hasparent-selectors-287c) or [container queries](https://dev.to/ingosteinke/a-css-container-queries-example-1le0).

### Understanding Image Sets step by step

First, let's make sure we understand source sets.

## What are source sets and how to use them?

In a typical use case, we provide different image versions and add our recommendation for appropriate screen sizes, but it's up to the browser to decide which image to load:

> The `image-set()` function allows an author to ignore most of these issues, simply providing multiple resolutions of an image and letting the user agent decide which is most appropriate in a given situation.

Source: [csswg.org](https://drafts.csswg.org/css-images-4/#image-set-notation)

## Providing Image Files

Let's start and put one image in an image element, for example [this photograph of a landscape](https://www.flickr.com/photos/fraktalisman/51224349248/), 2048 pixels wide and 1536 pixels high. As a high-resolution photograph with a lot of detail, the file size is 557.7 kB, which is roughly half a megabyte. We will use an image element to show this photograph on our website. We must specify the image source (the URL to the image file) and the original image dimensions (width and height).
```html
<img
  src="large-landscape-2048x1536.jpg"
  width="2048"
  height="1536"
  alt="landscape"
>
```

Adding the following style sheet will make browsers resize our image (and every other image on that page) proportionally when the horizontal viewport width is smaller than the original image width.

```css
img {
  max-width: 100%;
  height: auto;
}
```

We can test that it works as intended.

{% codepen https://codepen.io/openmindculture/pen/zYEVBjO %}

### But what a waste of bandwidth!

This is responsive in a visual way, but even on a small old mobile phone, browsers will still load the same large image file, half a megabyte of data, only to display a shrunk version of the same image on a tiny screen.

![Same large image on a small mobile phone screen](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/piq1saqjg7k9fh23bly4.jpg)

### A much smaller Image File that still looks good

If our mobile screen is 375 CSS pixels wide, at a resolution of 2 device pixels per CSS pixel, we need an image of 750 x 536 pixels to fill our screen. Scaling our image down to that size and saving it as a high quality JPEG file (with JPEG quality set to 80), the new image file only takes up 90 kilobytes, and the image still looks good. And if you're not too ambitious, so does the 70 kB file after further image processing on codepen's asset server:

![large-landscape-750x536.jpg](https://assets.codepen.io/2332100/large-landscape-750x536.jpg)

### Adding Source Sets and Sizes to our Image Elements

Now let's tell our browser to use the smaller version if the screen size is not larger than 750 (CSS) pixels. We can add a `srcset` attribute to our existing image.

```html
<img
  src="large-landscape-2048x1536.jpg"
  srcset="small-landscape-750x536.jpg 750w,
          large-landscape-2048x1536.jpg 2048w"
  width="2048"
  height="1536"
  alt="landscape"
>
```

For more complex definitions, we could wrap a picture element around our image and add multiple source elements each with its own srcset attribute.
That can be handy if we need to combine different aspects of [responsive images](https://developer.mozilla.org/en-US/docs/Learn/HTML/Multimedia_and_embedding/Responsive_images) for the same image element, like screen width and pixel density.

## Responsive Background Images

Why use background images at all? Well, in the old days, before the [object-fit property](https://developer.mozilla.org/en-US/docs/Web/CSS/object-fit) and before layering content using `positioning` and `z-index` worked the way it does today, [background images](https://developer.mozilla.org/en-US/docs/Web/CSS/background-image) were a very useful technique to code hero banners, and they still provide an easy way to add optional decoration to pages and elements. Despite their smooth and flexible visual styling, it used to be impossible to optimize background images to save mobile bandwidth, and the warning about ["very limited support" of the image-set property](https://caniuse.com/?search=image-set) probably did not help to make it popular among web developers either.

### Using Image Sets for Background Images

Defining an [image-set](https://developer.mozilla.org/en-US/docs/Web/CSS/image/image-set()) for a background-image url is easy if we know how to use `srcset` attributes for `img` and `source` elements. A drawback of limited `image-set` support in current browsers is that we can't use pixel width resolutions, so we have to set [pixel density](https://elad.medium.com/understanding-the-difference-between-css-resolution-and-device-resolution-28acae23da0b) (`1x`, `2x`) etc. as a selector instead. We can use image-set as a replacement for a single url, so that

```css
.landscape-background {
  background-image: url(large-landscape-2048x1536.jpg);
}
```

...becomes...
```css
.box {
  background-image: image-set(
    url("small-landscape-750x536.jpg") 1x,
    url("large-landscape-2048x1536.jpg") 2x);
}
```

For the sake of maximum browser compatibility, we should add a webkit-prefixed version as well as a single image url. Currently, Chrome still doesn't support the unprefixed version.

```css
.box {
  background-image: url("small-landscape-750x536.jpg");
  background-image: -webkit-image-set(
    url("small-landscape-750x536.jpg") 1x,
    url("large-landscape-2048x1536.jpg") 2x);
  background-image: image-set(
    url("small-landscape-750x536.jpg") 1x,
    url("large-landscape-2048x1536.jpg") 2x);
}
```

### Progressive Enhancement with w-Units

The [CSS 4 Images draft](https://drafts.csswg.org/css-images-4/#image-set-notation) already proposed to introduce width and height units in the future:

> We should add "w" and "h" dimensions as a possibility to match the functionality of HTML’s picture.

While the quoted "we should add" was meant to say that browser vendors should add the functionality to their CSS engines, it could also mean that we, as web developers, should add the dimensions to our code even before any browser actually supports them. Using [progressive enhancement](https://developer.mozilla.org/en-US/docs/Glossary/Progressive_Enhancement), which means to use new features in an optional way, we could simply add another line with a width-based image-set. It will be ignored for containing (currently) invalid values, but it will start to work once browsers start to implement the new syntax. Last but not least, we can add a static background color which will display before the image has loaded, or in case the image fails to load for some reason.
```css
.box {
  background: skyblue;
  background-image: url("small-landscape-750x536.jpg");
  background-image: -webkit-image-set(
    url("small-landscape-750x536.jpg") 1x,
    url("large-landscape-2048x1536.jpg") 2x);
  background-image: image-set(
    url("small-landscape-750x536.jpg") 1x,
    url("large-landscape-2048x1536.jpg") 2x);
  background-image: image-set(
    url("small-landscape-750x536.jpg") 750w,
    url("large-landscape-2048x1536.jpg") 2048w);
}
```

Firefox 96 supports `image-set` without prefix, but still sees `750w` and `2048w` as invalid values, falling back to the `image-set` with density values.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7qzqto8icd2a1eioyr0d.png)

Internet Explorer would still recognize our first line, the background image without an `image-set`, so it looks like we're all set to have a nice display in every browser, and a performance optimization for the progressive ones. This is our complete demo code in action:

{% codepen https://codepen.io/openmindculture/pen/KKXjMRr %}

## Conclusion and Alternatives

Due to the limited browser support, I prefer using regular `<img>` elements. Images inside of `<picture>` elements support complex adaptive source sets combining rules for width and pixel density for each image at the same time, also known as the ["art direction" use case](https://developers.google.com/web/fundamentals/design-and-ux/responsive/images#art_direction_in_responsive_images_with_picture).
```html <picture> <source media="(min-width: 800px)" srcset="head.jpg, head-2x.jpg 2x"> <source media="(min-width: 450px)" srcset="head-small.jpg, head-small-2x.jpg 2x"> <img src="head-fb.jpg" srcset="head-fb-2x.jpg 2x" alt="a head carved out of wood"> </picture> ``` Quoting [my own StackOverflow answer](https://stackoverflow.com/questions/49174465/understanding-srcset-and-sizes-in-combination-with-hidpi-monitors/71242072#71242072) here: [![Understanding srcset and sizes in combination with HiDPI monitors](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/y9sr671bukm0zicuen16.png)](https://stackoverflow.com/questions/49174465/understanding-srcset-and-sizes-in-combination-with-hidpi-monitors/71242072#71242072) {% stackoverflow https://stackoverflow.com/questions/49174465/understanding-srcset-and-sizes-in-combination-with-hidpi-monitors/71242072#71242072 %} ## What's next in CSS? Thanks for reading, and watch out, there is more to come!
ingosteinke
961,674
Elastic Tabs
A post by Jayant Goel
0
2022-01-20T13:50:48
https://dev.to/jayantgoel001/elastic-tabs-2dn6
codepen
{% codepen https://codepen.io/nenadkaevik/pen/odyrGm %}
jayantgoel001
961,710
Editorial-Chart experiment
This is only an experiment Source code: https://github.com/schmidtsonian/calendar-stats Original...
0
2022-01-20T14:37:00
https://dev.to/elynaur/editorial-chart-experiment-3do2
codepen
<p>This is only an experiment Source code: <a href="https://github.com/schmidtsonian/calendar-stats" target="_blank">https://github.com/schmidtsonian/calendar-stats</a> Original design: <a href="https://dribbble.com/shots/2531003-Editorial-Chart" target="_blank">https://dribbble.com/shots/2531003-Editorial-Chart</a></p> {% codepen https://codepen.io/elynaur/pen/VwMJZRw %}
elynaur
962,039
React.js Core Concepts
React.js and its pros &amp; cons React is a declarative, component-based JavaScript...
0
2022-01-20T19:16:28
https://dev.to/pranta07/reactjs-core-concepts-51ha
blogging
## **React.js and its pros & cons**

React is a declarative, component-based JavaScript library for building user interfaces.

Pros:

- Uses a virtual DOM to render user interfaces efficiently.
- Supports component-based design, which is reusable and easy to manage.
- Single-page applications render different components on the same page efficiently without any reload, which makes for a better user experience.
- The HTML-like syntax is easy to learn and use.

Cons:

- Features are updated continuously, which makes it a bit challenging for developers to keep up with new documentation.
- The documentation still contains class-based component examples instead of functional components.

## **JSX**

JSX stands for JavaScript XML, which allows HTML-like syntax with JavaScript in React. Using JSX we can create dynamic content in our React application easily. Without JSX the code is a bit lengthy and complex, while with JSX the code is simple and easy to understand. Babel is used by React to convert the JSX to React elements.

```
import React from "react";

const Blogs = () => {
  return (
    <div>
      <img
        src="https://i.ibb.co/2SfqRWr/maxresdefault.jpg"
        alt=""
        width="100%"
      />
    </div>
  );
};

export default Blogs;
```

In the above code snippet, what looks like HTML syntax inside the return statement is actually JSX notation, which makes it easier to understand what our code structure will look like.

## **Virtual DOM**

It is a virtual representation or copy of the real DOM object. Whenever anything changes in our document, React will create a new virtual DOM, compare it with the previous virtual DOM, find the differences between the two using a diff algorithm, and finally apply only the specific changes to the real DOM. This makes DOM manipulation more effective, because the entire DOM is not updated whenever a small part of it changes.
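The diffing idea described above can be sketched in plain JavaScript. This is an illustrative toy, not React's actual reconciliation algorithm, and the node shape (`tag`, `text`, `children`) is made up for the example:

```javascript
// Compare two plain-object "virtual DOM" trees and collect patches for
// only the nodes that changed, instead of rebuilding the whole tree.
function diff(oldNode, newNode, path = "root") {
  const patches = [];
  if (oldNode.tag !== newNode.tag) {
    // Different element type: the whole subtree must be replaced.
    patches.push({ path, type: "REPLACE" });
    return patches;
  }
  if (oldNode.text !== newNode.text) {
    patches.push({ path, type: "TEXT", value: newNode.text });
  }
  const oldKids = oldNode.children || [];
  const newKids = newNode.children || [];
  for (let i = 0; i < Math.max(oldKids.length, newKids.length); i++) {
    if (!oldKids[i]) patches.push({ path: `${path}/${i}`, type: "ADD" });
    else if (!newKids[i]) patches.push({ path: `${path}/${i}`, type: "REMOVE" });
    else patches.push(...diff(oldKids[i], newKids[i], `${path}/${i}`));
  }
  return patches;
}

const prev = { tag: "ul", children: [{ tag: "li", text: "blog1" }, { tag: "li", text: "blog2" }] };
const next = { tag: "ul", children: [{ tag: "li", text: "blog1" }, { tag: "li", text: "edited" }] };
console.log(diff(prev, next)); // only the second <li> needs a real DOM update
```

Only the changed `<li>` produces a patch, which is the whole point: the expensive real-DOM work is limited to the nodes that actually differ.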
## **Props and State**

Props are mainly used for passing dynamic data from parent components to child components, while state is a variable that is used to store information about a specific component and can be changed dynamically. Props are read-only and immutable, which means we can not change props. When state changes, a re-render happens automatically. Note that each element in a mapped list needs a unique `key` prop so React can track it between renders.

```
import React from "react";

const Blogs = () => {
  const blogs = ["blog1", "blog2", "blog3", "blog4", "blog5"];
  return (
    blogs.map(blogTitle => <Blog key={blogTitle} title={blogTitle}></Blog>)
  );
};

export default Blogs;
```

```
import React from "react";

const Blog = (props) => {
  const { title } = props;
  return (
    <h1>{title}</h1>
  );
};

export default Blog;
```

## **Lifting State Up**

We cannot pass state from a child to a parent component. React follows a top-down approach for passing state between components at many levels. When a parent component needs the state, we need to lift the state up to the parent component and pass it as a prop to the child components. This is called `lifting state up`.

## **Props Drilling and Context API**

When data is passed from component to component by props through many levels, this is called props drilling. The best way to pass data 4-5 layers down is using the context API. The context API is used to pass multiple data all over the document, and it is simple and easy to use. Whenever we need to share multiple data between components at many levels, we can use the context API to provide those data to the whole document. First, we have to create a context, then wrap our component tree in the context's Provider, where we specify the value we want to pass. We can get the provided value from anywhere with the help of the `useContext` hook.

## **Custom hook**

Hooks are essentially JavaScript functions that perform some task and return the results we need. Building our own custom hook will help us to simplify our code and make it easy to understand.
For creating a custom hook we have to create a JavaScript function and export it so that it can be imported from any component we want and used there.

```
import { useEffect, useState } from "react";

const useProducts = () => {
  const [products, setProducts] = useState([]);

  // Fetch once on mount; fetching directly in the hook body
  // would re-run the request on every render.
  useEffect(() => {
    fetch(url) // url: your API endpoint
      .then(res => res.json())
      .then(data => setProducts(data));
  }, []);

  return { products };
};

export default useProducts;
```

```
const { products } = useProducts();
```

## **Performance Optimization**

To optimize a React.js app we should try to divide different parts of our application into smaller chunks, use React fragments to avoid unnecessary wrapper elements, avoid using an index as a key when mapping, and make use of `useMemo`, pure components, the spread operator, passing objects as props, etc., all of which help prevent unnecessary component re-renders in React.js.
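Among the tips above, `useMemo` is about memoization: caching a computed value and recomputing only when its inputs change. The idea can be sketched in plain JavaScript; this toy caches only the last call and compares arguments by reference, roughly like a dependency array, and it is not React's implementation:

```javascript
// Cache the result of an expensive function; recompute only when the
// arguments change (compared by reference, like a dependency array).
function memoizeOne(fn) {
  let lastArgs = null;
  let lastResult;
  return (...args) => {
    const unchanged =
      lastArgs !== null &&
      args.length === lastArgs.length &&
      args.every((arg, i) => arg === lastArgs[i]);
    if (!unchanged) {
      lastResult = fn(...args);
      lastArgs = args;
    }
    return lastResult;
  };
}

let computations = 0;
const slowSum = (nums) => {
  computations++;
  return nums.reduce((a, b) => a + b, 0);
};
const memoSum = memoizeOne(slowSum);

const nums = [1, 2, 3];
memoSum(nums); // computes
memoSum(nums); // same reference: served from the cache
memoSum([1, 2, 3]); // new array reference: recomputes, like a changed dependency
```

This is also why the advice about passing objects as props matters: a new object or array literal created on every render defeats reference-equality checks like this one.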
pranta07
962,070
Layers Exposed
Data oriented programming is here to stay and along with it comes the deconstruction of popular...
0
2022-01-20T19:59:26
https://dev.to/wolfspidercode/layers-exposed-5d0c
devjournal, programming
Data oriented programming is here to stay and along with it comes the deconstruction of popular programming idioms. When coming from a productivity oriented stack the typical 3-layer idiom is DAL, BLL, and OBJ. These would be "Data Access Layer" and "Business Logic Layer" and "Object Layer". "Object Layer" is interchangeable with "Presentation Layer" depending on the architecture of the application. Whether it's native apps or web apps there is probably at least one application with this architecture buried somewhere in the repo at nearly every company I've ever worked at. **What Happened to N-Tier?** After years of maintaining applications like this I can say wholeheartedly that each one becomes much like the subject of Chemistry where typically there is no good place to start. Using N-Tier monstrous applications begin to unfold and over years of maintenance a lot of code gets developed just interacting with the "Layers". ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/z1oici99quiaaq02uyd8.png) Rather than some silver bullet solution getting developed the term just became more popular to describe interfaces to data. That is an idea I can actually get onboard with because that is exactly how it feels. The term "Interface" in Object Oriented Programming changes every year because (surprise!) it just takes whatever it takes to communicate with "Layers". The new definition in C# 8.0 for interface is IMHO everything but the kitchen sink except for state to be used in tandem with classes. ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vbs3q9ge4okth9spdmwd.png) Okay...well what about C++ I mean an interface in C++ has to be very specific right? Actually, there aren't any concepts in C++ which use the term "Interface" but a similar concept is the use of an "Abstract Class" which cannot be instantiated but can be used as a base class. 
A class is made abstract by declaring one of its functions as a pure virtual function. The newer updates in C++20 allow for "constexpr virtual" (you can read about it here [constexpr-virtual](https://www.cppstories.com/2021/constexpr-virtual/)). In a convoluted sense, the "Interface" got another update.

**This is getting complicated**

What I think we should really be after is creating interfaces where layers of data are not so coupled. Depending on the interface it may need to be tightly coupled or it may actually be running in a container or VM somewhere or on another port. The interface should be standardized within an organization if needed. This might be why it's common to see the terms "Interface" and "Layer" being used almost interchangeably in recent times. I would recommend creating an "Interface" that can talk to multiple "Layers" or creating a "Layer" that can have multiple "Interfaces", but going 1-for-1 is where the code bloat comes from. It's actually very easy to create an "Interface" for each "Layer" or vice versa. It's very challenging to create reusable code on either side, but that happens to be the cleanest solution.

**NoSQL Layers and APIs**

Now that we've delved a little into "Interfaces" and "Layers", the programming model becomes even more reduced in the world of NoSQL. Respectively, when it comes to Cosmos DB or FoundationDB, the "Layer" is something which resembles more of an API which translates the resulting output to NoSQL using an "Interface". This is a shift away from N-Tier because no longer can N-Tier fit all of this into its definition. Solutions are still being developed under FoundationDB and the solutions for Cosmos DB are varied but hosted only on Azure.
![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/i2e7i48uad479p8f9y60.png) ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7kl6sob0s2951bd8x40s.png) Both systems are in heavy use worldwide, and the concepts of NoSQL take some getting used to. Mainly, the convergence of several principles into a single one: the "Interface" is the "API", which also becomes the "Data Access Layer (DAL)" as it translates usable data into NoSQL. Altogether, they form a "Layer". Can you use the same code you've relied on for years to leverage these technologies? That is up to you, the developer, and how much performance you wish to squeeze out of the whole thing.
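The "one interface, many layers" idea above can be sketched with an abstract base class standing in for the "Interface" and interchangeable storage backends standing in for the "Layers". This is a hedged illustration only; the class and method names are invented for the example and don't come from any of the products mentioned.

```python
from abc import ABC, abstractmethod


class DataAccessInterface(ABC):
    """One "Interface" that any number of "Layers" can implement."""

    @abstractmethod
    def save(self, key: str, record: dict) -> None:
        ...

    @abstractmethod
    def load(self, key: str) -> dict:
        ...


class InMemoryLayer(DataAccessInterface):
    """A trivial "Layer": a dict standing in for a document store."""

    def __init__(self) -> None:
        self._store = {}

    def save(self, key: str, record: dict) -> None:
        self._store[key] = record

    def load(self, key: str) -> dict:
        return self._store[key]


def business_logic(dal: DataAccessInterface) -> dict:
    # The business layer sees only the interface, never the concrete layer.
    dal.save("user:1", {"name": "Ada"})
    return dal.load("user:1")
```

Swapping `InMemoryLayer` for a Cosmos DB- or FoundationDB-backed class would leave `business_logic` untouched, which is the point: one interface, many layers, and no 1-for-1 bloat.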
wolfspidercode
962,192
Just to let you all know.....
@ben @nickytonline and others: Three hours ago I forked Forem, asked GH support to detach fork,...
0
2022-01-20T20:53:44
https://dev.to/labspl/just-to-let-you-all-know-31l1
@Ben @nickytonline and others: ![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/drt0dj056xmblh83rg70.png) Three hours ago I forked Forem, asked GH support to detach the fork, they happily did it, and now I'm cloning it to my local dev machine. Starting work on my version of it. Many things on my mind. Will update this post....... ======== I decided to fork & detach simply because it turned out that the Forem team **lies** constantly. About being open-source, about being inclusive, about .... almost everything. ======== You can track progress [here](https://github.com/labspl/forem) ======= Our goals: * change the name, * change the license, * make Forem ( or whatever name we select ) fully **open-source**, * create **working** mobile apps ( Android / iOS ), * make the code portable, * improve code quality, * introduce new features, * rework/redesign many existing features, * follow [our standards](https://github.com/labspl/coding-standards), and much more.
labspl
962,272
How to edit components in Figma
In this quick post, we’ll see how to edit your components in Figma. Watch Video...
0
2022-01-21T00:35:34
https://dev.to/raoufbelakhdar/how-to-edit-components-in-figma-2h3a
figmatips, figma, uxdesign, uidesign
--- title: How to edit components in Figma published: true date: 2022-01-04 19:55:00 UTC tags: figmatips,figma,uxdesign,uidesign canonical_url: --- ![](https://cdn-images-1.medium.com/max/1024/0*E8sW31i2mp_qYOA3.jpg) In this quick post, we’ll see how to edit your components in Figma. ### Watch Video Tutorial {% youtube JpizbChgUrc %} ### Edit Components If you want to edit a certain component, just go to the root component frame and edit it. Any change you make, Figma will automatically apply to all of the component’s instances. ### Edit Instance If you want to edit a component’s instance, you have to go back to the main component and edit it. - Right-click on your instance and select **Go to the main component** ![](https://cdn-images-1.medium.com/max/962/0*GRHn-FY2JyXA8NMN.png) - Now, add your changes to the component. Next, you have to publish your component changes to the library so that all of the component’s instances are updated. ### Update component changes - Go to the component file’s **Assets** panel and hit the **Team Library** icon. ![](https://cdn-images-1.medium.com/max/241/0*B89GqCeDkO-kl2fO.png) - Hit the **Publish changes** button in the **Libraries modal** to update the changes to the library. ![](https://cdn-images-1.medium.com/max/984/0*ceT3SCXvFV4WZyaz.png) ![](https://cdn-images-1.medium.com/max/900/0*DPdfJj1LrIMtM1NG.png) That’s everything about editing components in Figma. In the next posts, we’ll cover more about Figma components. ### Before you go Feel free to visit our website [**captain-design.com**](https://www.captain-design.com/) where we generously share [**Figma and HTML templates**](https://www.captain-design.com/) that are ready for commercial use. 
You’ll find three things to help you kickstart your next project’s design : - [**200+ free Figma templates**](https://www.captain-design.com/templates/) **.** - [**Free Html + Bootstrap 5 templates**](https://www.captain-design.com/templates/tag/html) **.** - [**Amazing Figma Plugins and UI kits**](https://www.captain-design.com/uikit/) **.** _Originally published at_ [_https://www.captain-design.com_](https://www.captain-design.com/blog/how-to-edit-components-in-figma/) _on January 4, 2022._
raoufbelakhdar
962,895
Async HTTP Requests with Aiohttp & Aiofiles
When building applications within the confines of a single-threaded, synchronous language, the...
17,978
2022-02-02T09:10:44
https://hackersandslackers.com/async-requests-aiohttp-aiofiles/
concurrency, python, scraping, automation
--- title: Async HTTP Requests with Aiohttp & Aiofiles published: true date: 2022-02-02 09:10:00 UTC tags: Concurrency,Python,Scraping,Automation series: Asyncio canonical_url: https://hackersandslackers.com/async-requests-aiohttp-aiofiles/ --- ![Async HTTP Requests with Aiohttp & Aiofiles](https://cdn.hackersandslackers.com/2021/09/aiohttp2-3.jpg) When building applications within the confines of a single-threaded, synchronous language, the limitations become very obvious very quickly. The first thing that comes to mind is **writes**: the very definition of an I/O-bound task. When writing data to files (or databases), each "write" action intentionally occupies a thread until the write is complete. This makes a lot of sense for ensuring data integrity in most systems. For example, if two operations simultaneously attempt to update a database record, which one is correct? Alternatively, if a script requires an HTTP request to succeed before continuing, how could we move on until we know the request succeeded? HTTP requests are among the most common thread-blocking operations. When we write scripts that expect data from an external third party, we introduce a myriad of uncertainties that can only be answered by the request itself, such as response time latency, the nature of data we expect to receive, or if the request will succeed. Even when working with APIs we're confident in, no operation is sure to succeed until it's complete. Hence, we're "blocked." As applications grow in complexity to support more simultaneous user interactions, software is moving away from the paradigm of being executed linearly. So while we might not be sure that a specific request succeeds or a database write is completed, this can be acceptable as long as we have ways to handle and mitigate these issues gracefully. 
## A Problem Worthy of Asynchronous Execution How long do you suppose it would take a Python script to execute a few hundred HTTP requests, parse each response, and write the output to a single file? If you were to use requests in a simple for loop, you'd need to wait a fair amount of time for Python to execute each request, open a file, write to it, close it, and move on to the next. Let's put **asyncio's** ability to improve script efficiency to an actual test. We'll execute two I/O-blocking actions per task for a few hundred URLs: executing and parsing an HTTP request and writing the desired result to a single file. The _input_ for our experiment will be a ton of URLs, with the expected _output_ to be metadata parsed from those URLs. Let's see how long it takes to do this for hundreds of URLs. This site has [roughly two hundred](https://hackersandslackers.com/sitemap-posts.xml) published posts of its own, which makes it a great guinea pig for this little experiment. I've created a CSV that contains the URLs to these posts, which will be our input. 
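To make the baseline concrete, the naive synchronous loop described above might look roughly like the sketch below. This is a hedged illustration: `fetch_all_sync` and its parameters are invented for this example and are not part of the tutorial's repo.

```python
import time


def fetch_all_sync(urls, fetch):
    """Fetch `urls` one at a time; each call blocks for its full round trip."""
    rows = []
    start = time.perf_counter()
    for url in urls:
        html = fetch(url)               # nothing else happens while we wait
        rows.append((url, len(html)))
    elapsed = time.perf_counter() - start
    # Total wall time is roughly the *sum* of the individual request times.
    return rows, elapsed


# With the real `requests` library this would be driven as:
#   import requests
#   rows, elapsed = fetch_all_sync(urls, lambda u: requests.get(u, timeout=10).text)
```

At one or two seconds per request, a couple hundred URLs pushes this loop into multi-minute territory; collapsing that sum into roughly the time of the slowest single request is what the rest of this article is about.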
Here's a sneak peek below: ### Sample Input <!--kg-card-begin: markdown--> | url | | --- | | https://hackersandslackers.com/intro-to-asyncio-concurrency/ | | https://hackersandslackers.com/multiple-python-versions-ubuntu-20-04/ | | https://hackersandslackers.com/google-bigquery-python/ | | https://hackersandslackers.com/plotly-chart-studio/ | | https://hackersandslackers.com/deploy-serverless-golang-functions-with-netlify/ | | https://hackersandslackers.com/scrape-metadata-json-ld/ | | https://hackersandslackers.com/terraform-with-google-cloud/ | | https://hackersandslackers.com/deploy-golang-app-nginx/ | | https://hackersandslackers.com/4-ways-to-improve-your-plotly-graphs/ | | https://hackersandslackers.com/create-your-first-golang-app/ | | ...etc | <!--kg-card-end: markdown--> <figcaption>Input CSV</figcaption> ### Sample Output For each URL found in our input CSV, our script will fetch the URL, parse the page, and write some choice data to a single CSV. The result will resemble the below example: <!--kg-card-begin: markdown--> | title | description | primary\_tag | url | published\_at | | --- | --- | --- | --- | --- | | Intro to Asynchronous Python with Asyncio | Execute multiple tasks concurrently in Python with Asyncio: Python`s built-in async library. | Python | https://hackersandslackers.com/intro-to-asyncio-concurrency/ | 2022-01-04 | | Deploy Serverless Golang Functions with Netlify | Write and deploy Golang Lambda Functions to your GatsbyJS site on Netlify. | JAMStack | https://hackersandslackers.com/deploy-serverless-golang-functions-with-netlify/ | 2020-08-02 | | SSH & SCP in Python with Paramiko | Automate remote server tasks by using the Paramiko & SCP Python libraries. Use Python to SSH into hosts; execute tasks; transfer files; etc. 
| Python | https://hackersandslackers.com/automate-ssh-scp-python-paramiko/ | 2020-01-03 | | Create Cloud-hosted Charts with Plotly Chart Studio | Use Pandas and Plotly to create cloud-hosted data visualizations on-demand in Python. | Plotly | https://hackersandslackers.com/plotly-chart-studio/ | 2020-09-03 | | Create Your First Golang App | Set up a local Golang environment and learn the basics to create and publish your first `Hello world` app. | Golang | https://hackersandslackers.com/create-your-first-golang-app/ | 2020-05-25 | | Creating Interactive Views in Django | Create interactive user experiences by writing Django views to handle dynamic content; submitting forms; and interacting with data. | Django | https://hackersandslackers.com/creating-django-views/ | 2020-04-23 | | Define Relationships Between SQLAlchemy Data Models | SQLAlchemy`s ORM easily defines data models with relationships such as one-to-one; one-to-many; and many-to-many relationships. | SQLAlchemy | https://hackersandslackers.com/sqlalchemy-data-models/ | 2019-07-11 | | ...etc | <!--kg-card-end: markdown--> <figcaption>Example of what our script will output</figcaption> ## Tools For The Job We're going to need three core Python libraries to pull this off: - [**Asyncio**](https://docs.python.org/3/library/asyncio.html): Python's bread-and-butter library for running asynchronous IO-bound tasks. The library has somewhat built itself into the Python core language, introducing **async/await** keywords that denote when a function is run asynchronously and when to wait on such a function (respectively). - [**Aiohttp**](https://docs.aiohttp.org/en/stable/): When used on the client side, it's similar to Python's **requests** library, but for making asynchronous requests. Alternatively, **aiohttp** can be used inversely: as an application webserver to _handle_ incoming requests & serve responses, but that's a tale for another time. 
- [**Aiofiles**](https://github.com/Tinche/aiofiles): Makes writing to disk (such as creating and writing bytes to files) a non-blocking task, such that multiple writes can happen on the same thread without blocking one another - even when multiple tasks are bound to the same file. ```shell $ pip install aiohttp aiofiles ``` <figcaption>Install the necessary libraries (asyncio ships with Python's standard library)</figcaption> ### BONUS: Dependencies to Optimize Speed **aiohttp** can execute requests _even faster_ by simply installing a few supplemental libraries. These libraries are [cchardet](https://pypi.org/project/cchardet/) (character encoding detection), [aiodns](https://pypi.org/project/aiodns/) (asynchronous DNS resolution), and [brotlipy](https://pypi.org/project/brotlipy/) (lossless compression). I'd highly recommend installing these using the conveniently provided shortcut below (take it from me, I'm a stranger on the internet): ```shell $ pip install aiohttp[speedups] ``` <figcaption>Install supplemental dependencies to speed up requests</figcaption> ## Preparing an Asynchronous Script/Application We're going to structure this script like any other Python script. Our main module, **aiohttp\_aiofiles\_tutorial**, will handle all of our logic. 
**config.py** and **main.py** both live outside the main module, and offer our script some [basic configuration](https://github.com/hackersandslackers/aiohttp-aiofiles-tutorial/blob/master/config.py) and an [entry point](https://github.com/hackersandslackers/aiohttp-aiofiles-tutorial/blob/master/main.py) respectively: ```shell /aiohttp-aiofiles-tutorial ├── /aiohttp_aiofiles_tutorial │ ├── __init__.py │ ├── fetcher.py │ ├── loops.py │ ├── tasks.py │ ├── parser.py │ └── /data # Source data │ ├── __init__.py │ ├── parser.py │ ├── tests │ └── urls.csv ├── /export # Destination for exported data ├── config.py ├── logger.py ├── main.py ├── pyproject.toml ├── Makefile ├── README.md └── requirements.txt ``` <figcaption>Project structure of our async fetcher/writer</figcaption> **/export** is simply an empty directory where we'll write our output file. The **/data** submodule contains the input CSV mentioned above, and some basic logic to parse it. Not much to phone home about, but if you're curious the source is available on [the Github repo](https://github.com/hackersandslackers/aiohttp-aiofiles-tutorial/tree/master/aiohttp_aiofiles_tutorial/data). ### Kicking Things Off With sleeves rolled high, we start with the obligatory script "entry point," **main.py**. This initiates the core function in **/aiohttp\_aiofiles\_tutorial** called `init_script()`: ```python """Script entry point.""" import asyncio from aiohttp_aiofiles_tutorial import init_script if __name__ == "__main__": asyncio.run(init_script()) ``` <figcaption>main.py</figcaption> Here we're running a single function/coroutine, `init_script()`, via `asyncio.run()`, which seems counter-intuitive at first glance. Isn't the point of asyncio to run _multiple_ coroutines concurrently, you ask? Indeed it is! `init_script()` is a coroutine that calls other coroutines. Some of these coroutines create tasks out of other coroutines, others execute them, etc. 
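As a minimal, standalone illustration of this nesting (the coroutine names here are invented and have nothing to do with the tutorial's repo):

```python
import asyncio


async def child(n: int) -> int:
    await asyncio.sleep(0)  # yield control back to the event loop
    return n * 2


async def parent() -> list:
    # The parent coroutine awaits its children; asyncio.run() below keeps
    # the event loop alive until this entire chain has finished.
    return [await child(n) for n in range(3)]


if __name__ == "__main__":
    print(asyncio.run(parent()))  # [0, 2, 4]
```

A single `asyncio.run()` call at the entry point is enough to drive the whole chain.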
`asyncio.run()` creates an event loop that _doesn't stop running_ until the target coroutine is done, including all the coroutines that the parent coroutines calls. So, if we keep things clean, `asyncio.run()` is a one-time call to initialize a script. ### Initializing Our Script Here's where the fun begins. We've established that the purpose of our script is to output a single CSV file, and that's where we'll start: by creating and opening an output file within the context of which our entire script will operate: ```python """Make hundreds of requests concurrently and save responses to disk.""" import aiofiles from config import EXPORT_FILEPATH async def init_script(): """Prepare output file & kickoff task creation/execution.""" async with aiofiles.open(EXPORT_FILEPATH, mode="w+") as outfile: await outfile.write( "title,description,primary_tag,url,published_at\n" ) # (The rest of our script logic will be executed here). # ... ``` <figcaption>aiohttp_aiofiles_tutorial/__init__.py</figcaption> Our script begins by opening a file context with `aiofiles`. As long as our script operates inside the context of an open async file via `async with aiofiles.open() as outfile:`, we can write to this file constantly without worrying about opening and closing the file. Compare this to the _synchronous_ (default) implementation of handling file I/O in Python, `with open() as outfile:`. By using `aiofiles`, we can write data to the same file from multiple sources at _virtually the same time._ `EXPORT_FILEPATH` happens to target a CSV ( **/export/hackers\_pages\_metadata.csv** ). Every CSV needs a row of headers; hence our one-off usage of `await outfile.write()` to write headers immediately after opening our CSV: ```python ... 
await outfile.write( "title,description,primary_tag,url,published_at\n" ) ``` <figcaption>Writing a single row to our CSV consisting of column headers</figcaption> ### Moving Along Below is the fully fleshed-out version of **\_\_init\_\_.py** that will ultimately put our script into action. The most notable addition is the introduction of the `execute_fetcher_tasks()` coroutine; we'll dissect this one piece at a time: ```python """Make hundreds of requests concurrently and save responses to disk.""" import asyncio import time import aiofiles from aiofiles.threadpool.text import AsyncTextIOWrapper as AsyncIOFile from aiohttp import ClientSession from config import EXPORT_FILEPATH, HTTP_HEADERS from .data import urls_to_fetch # URLs parsed from a CSV from .tasks import create_tasks # Creates one task per URL async def init_script(): """Prepare output file; begin task creation/execution.""" async with aiofiles.open(EXPORT_FILEPATH, mode="w+") as outfile: await outfile.write( "title,description,primary_tag,url,published_at\n" ) await execute_fetcher_tasks(outfile) await outfile.close() async def execute_fetcher_tasks(outfile: AsyncIOFile): """ Open async HTTP session & execute created tasks. :param AsyncIOFile outfile: Path of file to write to. """ async with ClientSession(headers=HTTP_HEADERS) as session: task_list = await create_tasks( session, urls_to_fetch, outfile ) await asyncio.gather(*task_list) ``` <figcaption>aiohttp_aiofiles_tutorial/__init__.py</figcaption> `execute_fetcher_tasks()` is broken out mainly to organize our code. This coroutine accepts `outfile` as a parameter, which will serve as the destination for data we end up parsing. 
Taking this line-by-line: - `async with ClientSession(headers=HTTP_HEADERS) as session`: Unlike the Python **requests** library, **aiohttp** enables us to open a client-side session that creates a connection pool that allows for up to _100 active connections at a time._ Because we're going to make under 200 requests, the amount of time it will take to fetch _all_ these URLs will be comparable to the time it takes Python to fetch two under normal circumstances. - `create_tasks()`: This is a function we're about to define; it accepts three parameters. The first is the async `ClientSession` we just opened a line earlier. Next, we have the `urls_to_fetch` variable (imported earlier in our script). This is a simple Python list of strings, where each string is a URL parsed from our earlier "input" CSV. That logic is handled elsewhere via a [simple function](https://github.com/hackersandslackers/aiohttp-aiofiles-tutorial/blob/master/aiohttp_aiofiles_tutorial/data/urls.py) (and not important for the purpose of this tutorial). Lastly, our `outfile` is passed along, as we'll be writing to this file later. With these parameters, `create_tasks()` will create a task for each of our 174 URLs, each of which will fetch the contents of the given URL. This function returns the tasks but will not execute them until we give the word, which happens via... - `asyncio.gather(*task_list)`: Asyncio's `gather()` method performs a collection of tasks inside the currently running event loop. Once this kicks off, the speed benefits of asynchronous I/O will become immediately apparent. ## Creating Asyncio Tasks If you recall, a Python `Task` wraps a coroutine which we'll execute in the future. In addition, each task can be temporarily put on hold for other tasks. A predefined coroutine must be passed along with the proper parameters before execution to create a task. 
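Here is the task-then-gather pattern in isolation, as a hedged, self-contained sketch (the names are invented; `asyncio.sleep` stands in for an HTTP request). The only point is that the two sleeps overlap instead of adding up:

```python
import asyncio
import time


async def pretend_fetch(delay: float) -> float:
    # Stand-in for an HTTP request: blocks this task, not the event loop.
    await asyncio.sleep(delay)
    return delay


async def main() -> float:
    start = time.perf_counter()
    tasks = [asyncio.create_task(pretend_fetch(d)) for d in (0.2, 0.2)]
    results = await asyncio.gather(*tasks)  # run both tasks concurrently
    elapsed = time.perf_counter() - start
    assert results == [0.2, 0.2]
    return elapsed


if __name__ == "__main__":
    # Two 0.2s "requests" finish in roughly 0.2s total, not 0.4s.
    print(f"{asyncio.run(main()):0.2f}s")
```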
I separated `create_tasks()` to return a list of Python Tasks, where each "task" will execute the fetching of one of our URLs: ```python """Prepare tasks to be executed.""" import asyncio from asyncio import Task from typing import List from aiofiles.threadpool.text import AsyncTextIOWrapper as AsyncIOFile from aiohttp import ClientSession from .fetcher import fetch_url_and_save_data async def create_tasks( session: ClientSession, urls: List[str], outfile: AsyncIOFile ) -> List[Task]: """ Create asyncio tasks to parse HTTP request responses. :param ClientSession session: Async HTTP requests session. :param List[str] urls: Resource URLs to fetch. :param AsyncIOFile outfile: Path of file to write to. :returns: List[Task] """ task_list = [] for i, url in enumerate(urls): task = asyncio.create_task( fetch_url_and_save_data( session, url, outfile, len(urls), i ) ) task_list.append(task) return task_list ``` <figcaption>aiohttp_aiofiles_tutorial/tasks.py</figcaption> A few notable things about Asyncio Tasks: - We're defining _"work to be done"_ upfront. The creation of a `Task` doesn't execute code. Our script will essentially run the same function 174 times concurrently, with different parameters. It makes sense that we'd want to define these tasks upfront. - Defining tasks is quick and straightforward. In an instant, each URL from our CSV will have a corresponding Task created and added to `task_list`. - With our tasks prepared, there's only one thing left to do to kick them all off and get the party started. That's where the `asyncio.gather(*task_list)` line from `__init__.py` comes into play. 
<!--kg-card-begin: html--> Asyncio's Task object is a class in itself with its attributes and methods, essentially providing a wrapper with ways to [check task status](https://docs.python.org/3/library/asyncio-task.html#asyncio.Task.done), [cancel tasks](https://docs.python.org/3/library/asyncio-task.html#asyncio.Task.cancel), and [so forth.](https://docs.python.org/3/library/asyncio-task.html#task-object) <!--kg-card-end: html--> ### Executing our Tasks Back in `create_tasks()`, we created tasks that each execute a coroutine called `fetch_url_and_save_data()`. This function does three things: - Make an async request to the given task's URL via **aiohttp**'s session context (handled by `async with session.get(url) as resp:`) - Read the body of the response as a string. - Write the contents of the response body to a file by passing `html` to our last function, `parse_html_page_metadata()`: ```python """Fetch URLs, extract their contents, and write parsed data to file.""" from aiofiles.threadpool.text import AsyncTextIOWrapper as AsyncIOFile from aiohttp import ClientError, ClientSession, InvalidURL from logger import LOGGER from .parser import parse_html_page_metadata async def fetch_url_and_save_data( session: ClientSession, url: str, outfile: AsyncIOFile, total_count: int, i: int, ): """ Fetch raw HTML from a URL prior to parsing. :param ClientSession session: Async HTTP requests session. :param str url: Target URL to be fetched. :param AsyncIOFile outfile: Path of file to write to. :param int total_count: Total number of URLs to fetch. :param int i: Current iteration of URL out of total URLs. 
""" try: async with session.get(url) as resp: if resp.status != 200: pass html = await resp.text() page_metadata = await parse_html_page_metadata( html, url ) await outfile.write(f"{page_metadata}\n") LOGGER.info( f"Fetched URL {i} of {total_count}: {page_metadata}" ) except InvalidURL as e: LOGGER.error( f"Unable to fetch invalid URL `{url}`: {e}" ) except ClientError as e: LOGGER.error( f"ClientError while fetching URL `{url}`: {e}" ) except Exception as e: LOGGER.error( f"Unexpected error while fetching URL `{url}`: {e}" ) ``` <figcaption>aiohttp_aiofiles_tutorial/fetcher.py</figcaption> When fetching a URL via an **aiohttp** `ClientSession`, calling the `.text()` method on the response (`await resp.text()`) will return the response of a request as a _string_. This is not to be confused with `.body()`, which returns a _bytes_ object (useful for pulling media files or anything besides a string). If you're keeping track, we're now three "contexts" deep: 1. We started our script by opening an `aiofiles.open()` context, which will remain open until our script is complete. This allows us to write to our `outfile` from any task for the duration of our script. 2. After writing headers to our CSV file, we opened a persistent client request session with `async with ClientSession() as session`, which allows us to make requests en masse as long as the session is open. 3. In the snippet above, we've entered a third and final context: the response context for a single URL (via `async with session.get(url) as resp`). Unlike the other two contexts, we'll be entering and leaving this context 174 times (once per URL). Inside each URL response context is where we finally start producing some output. This leaves us with our final bit of logic (await parse\_html\_page\_metadata(html, url)) which parses each URL response and returns some scraped metadata from the page before writing said metadata to our `outfile` on the next line, `await outfile.write(f"{page_metadata}\n")`. 
## Write Parsed Metadata to CSV How are we planning to rip metadata out of HTML pages, you ask? With [BeautifulSoup](https://hackersandslackers.com/scraping-urls-with-beautifulsoup/), of course! With the HTML of an HTTP response in hand, we use `bs4` to parse each URL response and return values for each of the columns in our `outfile`: **title** , **description** , **primary\_tag** , **published at** , and **url**. These five values are returned as a comma-separated string, then written to our `outfile` CSV as a single row. ```python """Parse metadata from raw HTML.""" from bs4 import BeautifulSoup from bs4.builder import ParserRejectedMarkup from logger import LOGGER async def parse_html_page_metadata(html: str, url: str) -> str: """ Extract page metadata from raw HTML into a CSV row. :param str html: Raw HTML source of a given fetched URL. :param str url: URL associated with the extracted HTML. :returns: str """ try: soup = BeautifulSoup(html, "html.parser") title = soup.title.string.replace(",", ";") description = ( soup.head.select_one("meta[name=description]") .get("content") .replace(",", ";") .replace('"', "`") .replace("'", "`") ) primary_tag = ( soup.head .select_one("meta[property='article:tag']") .get("content") ) published_at = ( soup.head .select_one( "meta[property='article:published_time']" ) .get("content") .split("T")[0] ) if primary_tag is None: primary_tag = "" return f"{title}, {description}, {primary_tag}, {url}, {published_at}" except ParserRejectedMarkup as e: LOGGER.error( f"Failed to parse invalid html for {url}: {e}" ) except ValueError as e: LOGGER.error( f"ValueError occurred when parsing html for {url}: {e}" ) except Exception as e: LOGGER.error( f"Parsing failed when parsing html for {url}: {e}" ) ``` <figcaption>aiohttp_aiofiles_tutorial/parser.py</figcaption> ## Run the Jewels, Run the Script Let's take this bad boy for a spin. 
I threw a timer into **\_\_init\_\_.py** to log the number of seconds that elapse for the duration of the script: ```python """Make hundreds of requests concurrently and save responses to disk.""" import time from time import perf_counter as timer ... async def init_script(): """Prepare output file & task creation/execution.""" start_time = timer() # Add timer to function async with aiofiles.open(EXPORT_FILEPATH, mode="w+") as outfile: await outfile.write( "title,description,primary_tag,url,published_at\n" ) await execute_fetcher_tasks(outfile) await outfile.close() LOGGER.success( f"Executed { __name__ } in {time.perf_counter() - start_time:0.2f} seconds." ) # Log time of execution ... ``` <figcaption>aiohttp_aiofiles_tutorial/__init__.py</figcaption> Mash that mfing `make run` command if you're following along in the repo (or just punch in `python3 main.py`). Strap yourself in: ```shell ... 16:12:34 PM | INFO: Fetched URL 165 of 173: Setting up a MySQL Database on Ubuntu, Setting up MySQL the old-fashioned way: on a linux server, DevOps, https://hackersandslackers.com/set-up-mysql-database/, 2018-04-17 16:12:34 PM | INFO: Fetched URL 164 of 173: Dropping Rows of Data Using Pandas, Square one of cleaning your Pandas Dataframes: dropping empty or problematic data., Data Analysis, https://hackersandslackers.com/pandas-dataframe-drop/, 2018-04-18 16:12:34 PM | INFO: Fetched URL 167 of 173: Installing Django CMS on Ubuntu, Get the play-by-play on how to install DjangoCMS: the largest of three major CMS products for Python`s Django framework., Software, https://hackersandslackers.com/installing-django-cms/, 2017-11-19 16:12:34 PM | INFO: Fetched URL 166 of 173: Starting a Python Web App with Flask & Heroku, Pairing Flask with zero-effort container deployments is a deadly path to addiction., Architecture, https://hackersandslackers.com/flask-app-heroku/, 2018-02-13 16:12:34 PM | INFO: Fetched URL 171 of 173: Another 'Intro to Data Analysis in Python Using Pandas' Post, 
An introduction to Python`s quintessential data analysis library., Data Analysis, https://hackersandslackers.com/intro-python-pandas/, 2017-11-16 16:12:34 PM | INFO: Fetched URL 172 of 173: Managing Python Environments With Virtualenv, Embrace core best-practices in Python by managing your Python packages using virtualenv and virtualenvwrapper., Software, https://hackersandslackers.com/python-virtualenv-virtualenvwrapper/, 2017-11-15 16:12:34 PM | INFO: Fetched URL 170 of 173: Visualize Folder Structures with Python’s Treelib, Using Python`s treelib library to output the contents of local directories as visual tree representations., Data Engineering, https://hackersandslackers.com/python-tree-hierachies-treelib/, 2017-11-17 16:12:34 PM | INFO: Fetched URL 169 of 173: Merge Sets of Data in Python Using Pandas, Perform SQL-like merges of data using Python`s Pandas., Data Analysis, https://hackersandslackers.com/merge-dataframes-with-pandas/, 2017-11-17 16:12:34 PM | INFO: Fetched URL 168 of 173: Starting an ExpressJS App, Installation guide for ExpressJS with popular customization options., JavaScript, https://hackersandslackers.com/create-an-expressjs-app/, 2017-11-18 16:12:34 PM | SUCCESS: Executed aiohttp_aiofiles_tutorial in 2.96 seconds. ``` <figcaption>The tail end of our log after fetching 174 pages in ~3 seconds</figcaption> The higher end of our script's execution time is 3 seconds. A typical Python request takes 1-2 seconds to complete, so our speed optimization is in the range of _roughly a hundred times faster_ for a sample size of data like this. Writing async scripts in Python surely takes more effort, but not _hundreds_ of times more effort. Even if it isn't speed you're after, handling the _volume_ of larger-scale applications renders Asyncio absolutely critical. For example, if your chatbot or webserver is in the middle of handling a user's request, what happens when a second user attempts to interact with your app in the meantime? 
Oftentimes the answer is _nothing:_ **User 1** gets what they want, and **User 2** is stuck talking to a blocked thread. Anyway, seeing is believing. Here's the source code for this tutorial: {% github hackersandslackers/aiohttp-aiofiles-tutorial %} <figcaption>Source code for this tutorial</figcaption>
toddbirchard
963,186
Ethernaut Hacks Level 7: Force
This is the level 7 of Ethernaut game. Pre-requisites selfdestruct function in...
16,194
2022-01-21T17:33:17
https://dev.to/nvnx/ethernaut-hacks-level-7-force-4g2o
solidity, ethereum, openzeppelin, smartcontract
This is level 7 of the [Ethernaut](https://ethernaut.openzeppelin.com/) game. ## Pre-requisites - [selfdestruct](https://docs.soliditylang.org/en/v0.6.0/units-and-global-variables.html#contract-related) function in Solidity ## Hack Given contract: ```solidity // SPDX-License-Identifier: MIT pragma solidity ^0.6.0; contract Force {/* MEOW ? /\_/\ / ____/ o o \ /~____ =ø= / (______)__m_m) */} ``` `player` has to somehow make this empty contract's balance greater than 0. A simple `transfer` or `send` won't work because `Force` implements neither `receive` nor `fallback` functions. Calls with any value will revert. However, this can be bypassed by using `selfdestruct` on an intermediate contract, `Payer`, which would specify `Force`'s address as the beneficiary of its funds after its self-destruction. First off, make a soon-to-be-destroyed contract in Remix: ```solidity // SPDX-License-Identifier: MIT pragma solidity ^0.6.0; contract Payer { uint public balance = 0; function destruct(address payable _to) external payable { selfdestruct(_to); } function deposit() external payable { balance += msg.value; } } ``` Send a value of, say, `10000000000000 wei` (0.00001 eth) by calling `deposit`, so that `Payer`'s balance increases to the same amount. Get the instance address of `Force` in the console: ```javascript contract.address // Output: <your-instance-address> ``` Call `destruct` of `Payer` with `<your-instance-address>` as the parameter. That will destroy `Payer` and send all of its funds to `Force`. Verify by: ```javascript await getBalance(contract.address) // Output: '0.00001' ``` Level cracked! _Learned something awesome? Consider starring the [github repo](https://github.com/theNvN/ethernaut-openzeppelin-hacks)_ 😄 _and following me on twitter [here](https://twitter.com/heyNvN)_ 🙏
nvnx
963,214
How I want to become a better programmer in 2022
At the beginning of my coding career, it was just about knowing a language perfectly and learning...
16,461
2022-01-24T06:58:42
https://dev.to/yuridevat/how-i-want-to-become-a-better-programmer-in-2022-5aon
webdev, productivity, career, motivation
At the beginning of my coding career, it was just about knowing a language perfectly and learning some soft skills and business skills to get a job. So in 2021 I gained knowledge in React, CSS, TailwindCSS, Scrum, GitHub and Git as well as mentoring and content creation. I achieved my goal of getting a job as a programmer in October 2021.

In my current job I am working with additional programming languages and tools such as Gitlab, Java Spring Boot, Angular, Angular Material, Bootstrap, SCSS and a CMS. Now that I've spent some time with these different languages and tools, I've gotten a feel for what I enjoy and want to focus more on in my personal life, and what I don't enjoy as much (but still need to work with on a project-by-project basis at work 😅).

## What I want to focus on

1. React, TypeScript, Redux
2. SCSS, A11Y
3. Java (Spring Boot)
4. deepen my knowledge about Scrum and project management
5. contribute to Open Source

After I feel confident in the above, I'd like to do a bit of work with AWS Lambda and API Gateway as well.

## The plan to do so

### Phase 1

I want to watch and read a lot of tutorials (besides the official documentation, of course), mainly on **FreeCodeCamp** and **FrontendMasters**, to learn what's out there and what's even possible with **React**, and then update my projects.

Meanwhile, I want to learn **TypeScript** so I can use it in my projects. TypeScript is becoming more and more required, and moving from JavaScript to TypeScript in React doesn't seem that easy to me. But I will manage it.

Also, I want to create another branch in all my projects where I will work with **Redux** (which I will also learn) and **SCSS**. This will give me a good understanding of the differences between working with the Context API or Redux, and CSS or SCSS, and I will be able to understand their advantages and disadvantages.

Finally, to take my projects to the next level, I would like to add **Java Spring Boot** to the backend.
Deepening my knowledge of **Scrum** and **project management** goes hand in hand with working on my projects, as I work with GitHub projects, create flagged issues, track time, etc.

### Phase 2

The side projects I have right now aren't really that big. So I guess I will end up having to create another project to show what I have learned. I hope that this will be the case in 6 months. And I already have lots of ideas 🤓

When I've deepened my knowledge enough in the above areas, I'll be ready to venture into **Open Source**.

### Phase 3

If I feel comfortable with all the steps mentioned, it will be time to learn something new. I think I will start learning **AWS Lambda and API Gateway**, but let's see 😅

## How I want to achieve this

I'm in the process of creating a plan on **Notion** using a **Kanban board**, creating issues, and sorting through all the bookmarks I've saved over the past year (and there were many) and adding them to the issues I've created, to stop being distracted by too much content. Each issue is assigned a property like React, WebDev, Java, etc. and a priority.

Right now the issues are still too big, like _diving deeper into React_, and within them I created to-dos. This is not a good practice ☝️.

![Image description](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/zqz7zf4yq074l3svaq2v.png)

It takes time to find a perfect solution: make issues as small as possible, keep a good overview of them, sort them, and decide how best to start and how to learn to make continuous progress. I think I will have it all figured out in a couple of weeks and will share my initial progress in my next article about my journey, which I will write quarterly.

Planning is also something that should be learned. And this process is perfect for getting better at project management and its tools 😎

## Why the other languages and tools didn't make the cut

After all this time, I quickly realized what I definitely don't enjoy.
Even if you don't (yet) know what exactly interests you, it's at least a good start to know what doesn't interest you. That also gets you going in the right direction.

I'm not interested in working with frameworks like Angular, Bootstrap, or a CMS. Those have their advantages, of course, but I just don't enjoy working with them. I want to do everything myself, I want to think for myself and fully understand what is going on. I can't achieve this feeling when I work with the above-mentioned tools. And the most important thing is that I am happy at work, I feel challenged and I like what I do 🥰.

I really like working with **Gitlab**. It's huge and I feel like you can do a lot more than with GitHub, or it's more manageable, I don't quite know yet. That said, I'll stick with GitHub in private for now.

As for **TailwindCSS**, I like it a lot. But I rarely see it in job descriptions; SCSS, on the other hand, is seen a lot and is more complicated. So I want to focus on SCSS, and I can still work with TailwindCSS if needed, since I know some of it already.

## To sum it all up

From January I will

- decide which programming languages, tools and skills I want to improve
- create a learning plan with a Kanban board on Notion
- create issues for all topics, as small and clear as possible, to get the most learning effect
- share my progress every 3 months to see how much progress I have made, what went really well, and what I need to change or adjust

My future is still unclear, but I have an idea of where I want to be at the end of 2022. I want to deepen my knowledge, improve it and take my journey as a developer in a certain direction.

And now, let's get started and make progress. See you in 3 months 👋

---

![Thank you](https://docs.google.com/uc?export=download&id=166Ecq6uBl61U14OUlkHOHIBv2ArKoumJ)

_Thanks for reading and for your time. I really appreciate it!_
yuridevat
963,472
Hosting an Angular application on Amazon S3 using GitHub Actions
Introduction Angular is a development platform for building WEB, mobile and desktop...
0
2022-01-23T02:26:48
https://dev.to/rodrigokamada/hosting-an-angular-application-on-amazon-s3-using-github-actions-3h6g
angular, s3, aws, github
# Introduction

[Angular](https://angular.io/) is a development platform for building WEB, mobile and desktop applications using HTML, CSS and TypeScript (JavaScript). Currently, Angular is at version 14 and Google is the main maintainer of the project.

[GitHub](https://github.com/) is a source code and file storage service with version control using the git tool. [GitHub Actions](https://github.com/actions) is a service to automate the software workflow.

[Amazon S3 (Simple Storage Service)](https://aws.amazon.com/s3/) is an object storage service offering scalability, data availability, security and performance.

# Prerequisites

Before you start, you need to install and configure the tools:

* [git](https://git-scm.com/)
* [Node.js and npm](https://nodejs.org/)
* [Angular CLI](https://angular.io/cli)
* IDE (e.g. [Visual Studio Code](https://code.visualstudio.com/))

# Getting started

## Create and configure the account on the Amazon S3

**1.** Let's create and configure the account. Access the site [https://aws.amazon.com/s3/](https://aws.amazon.com/s3/) and click on the button *Get Started with Amazon S3*.

![Amazon S3 - Home page](https://res.cloudinary.com/rodrigokamada/image/upload/v1642851691/Blog/angular-github-actions-amazon-s3/amazon-s3-step1.png)

**2.** Click on the option *Root user*, fill in the field *Root user email address* and click on the button *Next*.

![Amazon S3 - Sign in](https://res.cloudinary.com/rodrigokamada/image/upload/v1642852101/Blog/angular-github-actions-amazon-s3/amazon-s3-step2.png)

**Note:**

* If you don't have an Amazon account, do steps 1 to 9 of the post *[Authentication using the Amazon Cognito to an Angular application](https://dev.to/rodrigokamada/authentication-using-the-amazon-cognito-to-an-angular-application-ilh)* in the section *Create and configure the account on the Amazon Cognito*.

**3.** Fill in the field *Security check* and click on the button *Submit*.
![Amazon S3 - Security check](https://res.cloudinary.com/rodrigokamada/image/upload/v1642852700/Blog/angular-github-actions-amazon-s3/amazon-s3-step3.png) **4.** Fill in the field *Password* and click on the button *Sign in*. ![Amazon S3 - Root user sign in](https://res.cloudinary.com/rodrigokamada/image/upload/v1642852700/Blog/angular-github-actions-amazon-s3/amazon-s3-step4.png) **5.** Click on the button *Create bucket*. ![Amazon S3 - Buckets](https://res.cloudinary.com/rodrigokamada/image/upload/v1642854836/Blog/angular-github-actions-amazon-s3/amazon-s3-step5.png) **6.** Fill in the field *Bucket name*, click on the option *Block all public access* to uncheck this option, *I acknowledge that the current settings might result in this bucket and the objects within becoming public.* and click on the button *Create bucket*. ![Amazon S3 - Create bucket](https://res.cloudinary.com/rodrigokamada/image/upload/v1642869510/Blog/angular-github-actions-amazon-s3/amazon-s3-step6.png) **7.** Click on the link *angular-github-actions-amazon-s3* with the bucket name. ![Amazon S3 - Buckets](https://res.cloudinary.com/rodrigokamada/image/upload/v1642866388/Blog/angular-github-actions-amazon-s3/amazon-s3-step7.png) **8.** Click on the link *Properties*. ![Amazon S3 - Bucket objects](https://res.cloudinary.com/rodrigokamada/image/upload/v1642866647/Blog/angular-github-actions-amazon-s3/amazon-s3-step8.png) **9.** Click on the button *Edit*. ![Amazon S3 - Bucket properties](https://res.cloudinary.com/rodrigokamada/image/upload/v1642867044/Blog/angular-github-actions-amazon-s3/amazon-s3-step9.png) **10.** Click on the options *Enable*, *Host a static website*, fill in the fields *Index document*, *Error document - optional* and click on the button *Save changes*. 
![Amazon S3 - Edit static website hosting](https://res.cloudinary.com/rodrigokamada/image/upload/v1642867461/Blog/angular-github-actions-amazon-s3/amazon-s3-step10.png)

**11.** Copy the URL and the region displayed (in my case, the URL `http://angular-github-actions-amazon-s3.s3-website-sa-east-1.amazonaws.com` and the region `sa-east-1`) because the URL will be used to access the Angular application and the region will be used in the GitHub Actions file settings, and click on the link *Permissions*.

![Amazon S3 - Bucket properties](https://res.cloudinary.com/rodrigokamada/image/upload/v1642901966/Blog/angular-github-actions-amazon-s3/amazon-s3-step11.png)

**12.** Click on the button *Edit*.

![Amazon S3 - Bucket permissions](https://res.cloudinary.com/rodrigokamada/image/upload/v1642873124/Blog/angular-github-actions-amazon-s3/amazon-s3-step12.png)

**13.** Fill in the field *Policy* with the content below and click on the button *Save changes*.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicRead",
      "Principal": "*",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject"
      ],
      "Resource": [
        "arn:aws:s3:::angular-github-actions-amazon-s3/*"
      ]
    }
  ]
}
```

![Amazon S3 - Bucket policy](https://res.cloudinary.com/rodrigokamada/image/upload/v1642872376/Blog/angular-github-actions-amazon-s3/amazon-s3-step13.png)

**14.** Ready! The account and the bucket are created and configured.

![Amazon S3 - Bucket permissions](https://res.cloudinary.com/rodrigokamada/image/upload/v1642873644/Blog/angular-github-actions-amazon-s3/amazon-s3-step14.png)

**15.** Click on the menu *Services*, *Security, Identity & Compliance* and *IAM*.

![Amazon S3 - Menu IAM](https://res.cloudinary.com/rodrigokamada/image/upload/v1642874455/Blog/angular-github-actions-amazon-s3/amazon-s3-step15.png)

**16.** Click on the link *Users*.
![Amazon S3 - IAM dashboard](https://res.cloudinary.com/rodrigokamada/image/upload/v1642875468/Blog/angular-github-actions-amazon-s3/amazon-s3-step16.png) **17.** Click on the button *Add users*. ![Amazon S3 - Users](https://res.cloudinary.com/rodrigokamada/image/upload/v1642875822/Blog/angular-github-actions-amazon-s3/amazon-s3-step17.png) **18.** Fill in the field *User name*, click on the option *Access key - Programmatic access* and click on the button *Next: Permissions*. ![Amazon S3 - Set user details](https://res.cloudinary.com/rodrigokamada/image/upload/v1642876438/Blog/angular-github-actions-amazon-s3/amazon-s3-step18.png) **19.** Click on the options *Attach existing policies directly*, *AmazonS3FullAccess* and click on the button *Next: Tags*. ![Amazon S3 - Set premissions](https://res.cloudinary.com/rodrigokamada/image/upload/v1642890482/Blog/angular-github-actions-amazon-s3/amazon-s3-step19.png) **20.** Click on the button *Next: Review*. ![Amazon S3 - Set tags](https://res.cloudinary.com/rodrigokamada/image/upload/v1642890751/Blog/angular-github-actions-amazon-s3/amazon-s3-step20.png) **21.** Click on the button *Create user*. ![Amazon S3 - Review](https://res.cloudinary.com/rodrigokamada/image/upload/v1642890930/Blog/angular-github-actions-amazon-s3/amazon-s3-step21.png) **22.** Copy the *Access key ID* displayed, in my case, the *Access key ID* `AKIAUAM34QRRRQ5AZD2A` was displayed, click on the link *Show*, copy the *Secret Access key* displayed because the *Access key ID* and *Secret Access key* will be configured in the GitHub repository and click on the button *Close*. ![Amazon S3 - Add user success](https://res.cloudinary.com/rodrigokamada/image/upload/v1642892908/Blog/angular-github-actions-amazon-s3/amazon-s3-step22.png) **23.** Ready! Access keys created. ## Create the account and the repository on the GitHub **1.** Let's create the account and the repository. 
Do steps 1 to 6 of the post *[Hosting an Angular application on GitHub Pages using GitHub Actions](https://github.com/rodrigokamada/angular-github-actions)* in the section *Create and configure the account on the GitHub*.

**2.** Click on the menu *Settings*.

![GitHub - Code](https://res.cloudinary.com/rodrigokamada/image/upload/v1642897425/Blog/angular-github-actions-amazon-s3/github-step2.png)

**3.** Click on the side menu *Secrets*.

![GitHub - Settings](https://res.cloudinary.com/rodrigokamada/image/upload/v1642897653/Blog/angular-github-actions-amazon-s3/github-step3.png)

**4.** Click on the button *New repository secret*.

![GitHub - Secrets](https://res.cloudinary.com/rodrigokamada/image/upload/v1642897851/Blog/angular-github-actions-amazon-s3/github-step4.png)

**5.** Fill in the fields *Name* and *Value* and click on the button *Add secret* to configure the key with the *Access key ID*.

![GitHub - Add access key ID](https://res.cloudinary.com/rodrigokamada/image/upload/v1642898255/Blog/angular-github-actions-amazon-s3/github-step5.png)

**6.** Click on the button *New repository secret*.

![GitHub - Secrets](https://res.cloudinary.com/rodrigokamada/image/upload/v1642899086/Blog/angular-github-actions-amazon-s3/github-step6.png)

**7.** Fill in the fields *Name* and *Value* and click on the button *Add secret* to configure the key with the *Secret Access key*.

![GitHub - Add secret access key](https://res.cloudinary.com/rodrigokamada/image/upload/v1642899576/Blog/angular-github-actions-amazon-s3/github-step7.png)

**8.** Ready! Access keys configured.

![GitHub - Secrets configured](https://res.cloudinary.com/rodrigokamada/image/upload/v1642899804/Blog/angular-github-actions-amazon-s3/github-step8.png)

## Create the Angular application

**1.** Let's create the application with the Angular base structure using the `@angular/cli` with the route file and the SCSS style format.
```powershell ng new angular-github-actions-amazon-s3 --routing true --style scss CREATE angular-github-actions-amazon-s3/README.md (1074 bytes) CREATE angular-github-actions-amazon-s3/.editorconfig (274 bytes) CREATE angular-github-actions-amazon-s3/.gitignore (548 bytes) CREATE angular-github-actions-amazon-s3/angular.json (3363 bytes) CREATE angular-github-actions-amazon-s3/package.json (1096 bytes) CREATE angular-github-actions-amazon-s3/tsconfig.json (863 bytes) CREATE angular-github-actions-amazon-s3/.browserslistrc (600 bytes) CREATE angular-github-actions-amazon-s3/karma.conf.js (1449 bytes) CREATE angular-github-actions-amazon-s3/tsconfig.app.json (287 bytes) CREATE angular-github-actions-amazon-s3/tsconfig.spec.json (333 bytes) CREATE angular-github-actions-amazon-s3/.vscode/extensions.json (130 bytes) CREATE angular-github-actions-amazon-s3/.vscode/launch.json (474 bytes) CREATE angular-github-actions-amazon-s3/.vscode/tasks.json (938 bytes) CREATE angular-github-actions-amazon-s3/src/favicon.ico (948 bytes) CREATE angular-github-actions-amazon-s3/src/index.html (314 bytes) CREATE angular-github-actions-amazon-s3/src/main.ts (372 bytes) CREATE angular-github-actions-amazon-s3/src/polyfills.ts (2338 bytes) CREATE angular-github-actions-amazon-s3/src/styles.scss (80 bytes) CREATE angular-github-actions-amazon-s3/src/test.ts (745 bytes) CREATE angular-github-actions-amazon-s3/src/assets/.gitkeep (0 bytes) CREATE angular-github-actions-amazon-s3/src/environments/environment.prod.ts (51 bytes) CREATE angular-github-actions-amazon-s3/src/environments/environment.ts (658 bytes) CREATE angular-github-actions-amazon-s3/src/app/app-routing.module.ts (245 bytes) CREATE angular-github-actions-amazon-s3/src/app/app.module.ts (393 bytes) CREATE angular-github-actions-amazon-s3/src/app/app.component.scss (0 bytes) CREATE angular-github-actions-amazon-s3/src/app/app.component.html (23364 bytes) CREATE angular-github-actions-amazon-s3/src/app/app.component.spec.ts (1151 
bytes) CREATE angular-github-actions-amazon-s3/src/app/app.component.ts (237 bytes) ✔ Packages installed successfully. Successfully initialized git. ``` **2.** Change the `package.json` file and add the scripts below. ```json "build:prod": "ng build --configuration production", "test:headless": "ng test --watch=false --browsers=ChromeHeadless" ``` **3.** Run the test with the command below. ```shell npm run test:headless > angular-github-actions-amazon-s3@1.0.0 test:headless > ng test --watch=false --browsers=ChromeHeadless ⠙ Generating browser application bundles (phase: setup)...22 01 2022 22:11:29.773:INFO [karma-server]: Karma v6.3.11 server started at http://localhost:9876/ 22 01 2022 22:11:29.774:INFO [launcher]: Launching browsers ChromeHeadless with concurrency unlimited 22 01 2022 22:11:29.781:INFO [launcher]: Starting browser ChromeHeadless ✔ Browser application bundle generation complete. 22 01 2022 22:11:37.472:INFO [Chrome Headless 97.0.4692.71 (Linux x86_64)]: Connected on socket 2NkcZwzLS5MNuDnMAAAB with id 98670702 Chrome Headless 97.0.4692.71 (Linux x86_64): Executed 3 of 3 SUCCESS (0.117 secs / 0.1 secs) TOTAL: 3 SUCCESS ``` **4.** Run the application with the command below. Access the URL `http://localhost:4200/` and check if the application is working. ```shell npm start > angular-github-actions-amazon-s3@1.0.0 start > ng serve ✔ Browser application bundle generation complete. Initial Chunk Files | Names | Raw Size vendor.js | vendor | 2.00 MB | polyfills.js | polyfills | 339.24 kB | styles.css, styles.js | styles | 213.01 kB | main.js | main | 53.27 kB | runtime.js | runtime | 6.90 kB | | Initial Total | 2.60 MB Build at: 2022-01-23T01:13:06.355Z - Hash: e7a502736b12e783 - Time: 6256ms ** Angular Live Development Server is listening on localhost:4200, open your browser on http://localhost:4200/ ** ✔ Compiled successfully. ``` **5.** Build the application with the command below. 
```shell
npm run build:prod

> angular-github-actions-amazon-s3@1.0.0 build:prod
> ng build --configuration production

✔ Browser application bundle generation complete.
✔ Copying assets complete.
✔ Index html generation complete.
Initial Chunk Files           | Names         |  Raw Size | Estimated Transfer Size
main.464c3a39ed8b2eb2.js      | main          | 206.10 kB |                56.11 kB
polyfills.f007c874370f7293.js | polyfills     |  36.27 kB |                11.56 kB
runtime.8db60f242f6b0a2b.js   | runtime       |   1.09 kB |               603 bytes
styles.ef46db3751d8e999.css   | styles        |   0 bytes |                       -
                              | Initial Total | 243.46 kB |                68.25 kB

Build at: 2022-01-23T01:14:01.354Z - Hash: cab5f3f1681e58fd - Time: 11304ms
```

**6.** Let's create and configure the file with the GitHub Actions flow. Create the `.github/workflows/gh-pages.yml` file.

```shell
mkdir -p .github/workflows
touch .github/workflows/gh-pages.yml
```

**7.** Configure the `.github/workflows/gh-pages.yml` file with the content below.

```yaml
name: GitHub Pages

on:
  push:
    branches:
      - main

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: sa-east-1
      - name: Checkout
        uses: actions/checkout@v2
      - name: Setup Node.js
        uses: actions/setup-node@v2
        with:
          node-version: 14
      - name: Install dependencies
        run: npm install
      - name: Run tests
        run: npm run test:headless
      - name: Build
        run: npm run build:prod
      - name: Deploy
        if: success()
        run: aws s3 sync ./dist/angular-github-actions-amazon-s3 s3://angular-github-actions-amazon-s3
```

Notes:

* The `aws-access-key-id` and `aws-secret-access-key` settings were done in the GitHub repository.
* The `aws-region` setting is the bucket region.
* The `./dist/angular-github-actions-amazon-s3` setting is the application build folder.
* The `s3://angular-github-actions-amazon-s3` setting is the bucket name.
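If you want to sanity-check the deploy step locally before pushing, the command the workflow runs can be assembled and printed as a dry run. This is only an illustrative sketch (actually running the command requires the aws CLI installed and the credentials configured above):

```shell
# Dry run: assemble and print the sync command from the workflow's Deploy step
# without executing it (the real run needs the aws CLI and credentials).
BUILD_DIR="./dist/angular-github-actions-amazon-s3"
BUCKET="s3://angular-github-actions-amazon-s3"
CMD="aws s3 sync ${BUILD_DIR} ${BUCKET}"
echo "${CMD}"
```

Replace the folder and bucket names with your own values, then drop the `echo` wrapper to actually run the sync.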
**8.** Synchronize the application on the GitHub repository that was created.

![GitHub - Repository](https://res.cloudinary.com/rodrigokamada/image/upload/v1642902812/Blog/angular-github-actions-amazon-s3/angular-github-actions-amazon-s3-step8.png)

**9.** Ready! After synchronizing the application on the GitHub repository, GitHub Actions builds the application and synchronizes it with the Amazon S3 bucket. Access the URL [http://angular-github-actions-amazon-s3.s3-website-sa-east-1.amazonaws.com/](http://angular-github-actions-amazon-s3.s3-website-sa-east-1.amazonaws.com/) and check if the application is working. Replace the URL values with your bucket name and region.

![Angular GitHub Actions Amazon S3](https://res.cloudinary.com/rodrigokamada/image/upload/v1642902979/Blog/angular-github-actions-amazon-s3/angular-github-actions-amazon-s3-step9.png)

## Validate the run of the GitHub Actions flow

**1.** Let's validate the run of the GitHub Actions flow. Access the repository [https://github.com/rodrigokamada/angular-github-actions-amazon-s3](https://github.com/rodrigokamada/angular-github-actions-amazon-s3) created and click on the link *Actions*.

![GitHub Actions - Repository](https://res.cloudinary.com/rodrigokamada/image/upload/v1642903246/Blog/angular-github-actions-amazon-s3/github-actions-step1.png)

**2.** Click on the flow that ran.

![GitHub Actions - Workflows](https://res.cloudinary.com/rodrigokamada/image/upload/v1642903427/Blog/angular-github-actions-amazon-s3/github-actions-step2.png)

**3.** Click on the job *deploy*.

![GitHub Actions - Jobs](https://res.cloudinary.com/rodrigokamada/image/upload/v1642903880/Blog/angular-github-actions-amazon-s3/github-actions-step3.png)

**4.** Click on each step to validate the run.

![GitHub Actions - Steps](https://res.cloudinary.com/rodrigokamada/image/upload/v1642903976/Blog/angular-github-actions-amazon-s3/github-actions-step4.png)

**5.** Ready! We validated the run of the GitHub Actions flow.
The application repository is available at [https://github.com/rodrigokamada/angular-github-actions-amazon-s3](https://github.com/rodrigokamada/angular-github-actions-amazon-s3).

This tutorial was posted on my [blog](https://rodrigo.kamada.com.br/blog/hospedando-uma-aplicacao-angular-no-amazon-s3-usando-o-github-actions) in Portuguese.
rodrigokamada
963,747
Livewire Button Component with Loading Indicator
In this Article, we will see how to customize our Button to have following Animation using Livewire...
0
2022-01-22T09:39:09
https://dev.to/100r0bh/livewire-button-component-with-loading-indicator-5h77
laravel, livewire
In this article, we will see how to customize our button to have the following animation using Livewire, and extract everything into a Blade component so it can be reused anywhere in your application.

![Preview](https://media.giphy.com/media/nyAmCKhog1tUpKmw8G/giphy.gif)

We are going to use the Breeze button component and extend it according to our needs. Its definition looks like below:

```
<button {{ $attributes->merge(['type' => 'submit']) }}>
    {{ $slot }}
</button>
```

I have removed all the CSS classes for clarity. We can use this button component inside our Livewire component like below:

```
<x-button wire:click="save" wire:loading.attr="disabled">
    Save
</x-button>
```

What we want is that whenever this button is clicked, the text changes to "Saving..". In order to do that we will use the `wire:loading` property.

```
<x-button wire:click="save" wire:loading.attr="disabled">
    Save
    <span wire:loading>Saving..</span>
</x-button>
```

So now "Saving.." will be displayed from the moment the submit button is clicked, and it will be hidden when the AJAX call has finished. However, during this AJAX call both "Save" and "Saving.." are shown, so we also need to hide "Save" during the call. We can do so using `wire:loading.remove`.

```
<x-button wire:click="save" wire:loading.attr="disabled">
    <span wire:loading.remove>Save</span>
    <span wire:loading>Saving..</span>
</x-button>
```

Even though this is working, it can lead to unexpected issues when you have more than one button on the page. So it is always a good practice to specify that we only want to change the display of these elements while the AJAX call corresponding to the `save` method is running. We can do so using `wire:target`.
```
<x-button wire:click="save" wire:loading.attr="disabled">
    <span wire:loading.remove wire:target="save">Save</span>
    <span wire:loading wire:target="save">Saving..</span>
</x-button>
```

At this stage, you should see your buttons having the same behaviour as shared at the start of the article. However, we can further improve the readability of our code by extracting this markup into our button component. We want the usage to be as simple as the following:

```
<x-button wire:click="save" loading="Saving..">
    Save
</x-button>
```

First of all, we will create a `loading` property in our button component and assign it a default value of `false`.

```
@props(['loading' => false])
```

We can read the Livewire attributes inside our Blade component using the code below:

```
$attributes->wire('click')->value()
```

You can read more about them in the [Livewire Docs](https://laravel-livewire.com/docs/2.x/alpine-js#livewire-directives-from-blade-components).

When both the `loading` property and the `wire:click` attribute are present, we want to insert our span tags; otherwise we will just display the slot. So our Blade component becomes:

```
@props(['loading' => false])

<button {{ $attributes->merge(['type' => 'submit']) }}>
    @if ($loading && $target = $attributes->wire('click')->value())
        <span wire:loading.remove wire:target="{{ $target }}">{{ $slot }}</span>
        <span wire:loading wire:target="{{ $target }}">{{ $loading }}</span>
    @else
        {{ $slot }}
    @endif
</button>
```

So now we can use our button component with the `loading` attribute as well as without it.

Hope you have enjoyed this tutorial. For similar articles, you can follow me on [Twitter](https://twitter.com/TheLaravelDev)
100r0bh
963,759
Hide or show Elements/div in HTML Using JavaScript and Css
Video Documentation :- https://youtu.be/Ms7EgAJDVx0 Source :-...
16,475
2022-01-22T10:11:41
https://dev.to/sh20raj/hide-or-show-elementsdiv-in-html-using-javascript-and-css-3jb7
javascript, css, sh20raj, tricks
- Video Documentation :- https://youtu.be/Ms7EgAJDVx0
- Source :- https://codexdindia.blogspot.com/2022/01/hide-or-show-elements-in-html-using-javascript-and-css.html

---

Steps :-

- First, create a `.hidden` class where the CSS `display` property is set to `none`.

```html
<style>
  .hidden {
    display: none;
  }
</style>
```

- Then, using JavaScript, we will `toggle` (add/remove) the `hidden` class on the element, which results in hiding and showing the div (or any other element).
- Create a JavaScript function to hide/show elements:

```html
<script>
  function hideunhide(a) {
    document.querySelector(a).classList.toggle('hidden');
  }
</script>
```

- This function takes a query selector as a parameter, selects the element, and adds or removes `hidden` from the element's class list. If `hidden` is present it is removed, and vice versa; that's what `toggle` does.
- See the demo below or on CXDI - Tutorials :- https://tutorials.sh20raj.repl.co/hide-or-show-elements-in-html-using-javascript-and-css/.
- Demo Codes :- https://replit.com/@SH20RAJ/Tutorials#hide-or-show-elements-in-html-using-javascript-and-css/index.html or on Codepen :- https://codepen.io/SH20RAJ/pen/vYeVdGj?editors=1010
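Since `classList.toggle` does all the work here, it helps to see its add/remove semantics spelled out. Below is a DOM-free sketch (the helper name `toggleClass` is mine, purely for illustration) that mimics what `toggle('hidden')` does to an element's class list:

```javascript
// DOM-free illustration of classList.toggle semantics: if the class is
// absent it is added (element gets hidden), if present it is removed.
function toggleClass(classList, name) {
  const i = classList.indexOf(name);
  if (i === -1) {
    classList.push(name); // not present -> add it
  } else {
    classList.splice(i, 1); // present -> remove it
  }
  return classList;
}

console.log(toggleClass(['box'], 'hidden')); // [ 'box', 'hidden' ]
console.log(toggleClass(['box', 'hidden'], 'hidden')); // [ 'box' ]
```

Calling it twice in a row returns the list to its original state, which is exactly why a single button can both hide and show the element.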
sh20raj
964,195
Create a Unity to itch.io Deployment Pipeline Using Butler & Bash Scripting
I recently uploaded my first game to itch.io, and discovered that there is a nifty tool called Butler...
0
2022-01-22T21:53:53
https://dev.to/townofdon/create-a-unity-to-itchio-deployment-pipeline-using-butler-bash-scripting-3hbe
unity3d, bash, automation, gamedev
I recently uploaded my [first game to itch.io](https://donjuanjavier.itch.io/rexel-2d), and discovered that there is a nifty tool called [Butler](https://itch.io/docs/butler/) that allows uploading a build from the command line. However, I wanted to streamline the build -> zip -> upload steps, so here's what I did:

## 1: Create WebGL build in Unity

First, I followed the [official Unity docs for publishing WebGL games](https://learn.unity.com/tutorial/creating-and-publishing-webgl-builds).

In the Unity Build Settings, I selected a WebGL build and made sure to turn compression off in **Player Settings** (`Project Settings -> Player`), after [running into an issue with the build not working in the browser](https://forum.unity.com/threads/webgl-build-doesnt-load-in-browser.948400/#post-6395847).

![Disable compression in Unity Player Settings](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7ld4qe4fkcxn3ahx14oa.png)

Finally, I made sure all of the scenes I wanted to include were checked, and selected **Build** (I also always use the Clean Build option). I used the default build path of `./Build`.

![Unity Build Settings](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qtslynt17pj1xoloch2p.png)

## 2: (Optional) add docker-compose to test WebGL build

After the build completed, I wanted to test things out before uploading to itch.io. I followed [this guide](https://dev.to/tomowatt/running-an-unity-webgl-game-within-docker-5039) to set up my own docker-compose file to run things locally.

BTW, getting Docker to work on a Windows machine took some effort... I had to enable CPU virtualization in my BIOS settings. I'm a BIOS noob, so that was a little scary, but I finally got things working, and `docker-compose up` worked like a charm.

## 3: Adding `deploy.sh` script

I wanted my script to automate two things:

1. Packaging my build into a `.zip` file
2.
Uploading to itch.io via Butler

My Windows machine uses 7-Zip for archive creation/extraction, so [I had to add 7-Zip to my Windows path](https://stackoverflow.com/questions/14122732/unzip-files-7-zip-via-cmd-command). (Alternatively, I could have used the `zip` command, but it wasn't working for me - I didn't feel like going through the hassle of [installing GoW](https://stackoverflow.com/questions/38782928/how-to-add-man-and-zip-to-git-bash-installation-on-windows) just to use `zip`.)

Next, I needed to install Butler and add it to my Windows path. I just added a directory called `C:\Bin`, placed the downloaded Butler folder there, and set the Windows path to `C:\Bin\Butler`.

After all was said and done, I could run the following commands:

```bash
# zip file to archive
ZIPFILE="/path/to/zipfile-to-create"
7z a $ZIPFILE "/path/to/Build" > NUL
```

_(Note, the `> NUL` above simply silences the output from the zip command; also `NUL` would be `/dev/null` on Mac/Linux)._

```bash
# upload to itch.io
USERNAME="my-itch-io-username"
GAME="my-awesome-game"
CHANNEL="html"
VERSION=1.0.0

butler push $ZIPFILE "${USERNAME}/${GAME}:${CHANNEL}" --userversion $VERSION
```

Voila! Everything worked like a charm, and my newly-uploaded `.zip` file now showed up on my edit game page.

## Conclusion

Now that this script is in place, deploying new releases for my game is ridiculously easy. Hopefully this guide can help you streamline your game release cycle as well.
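The fiddliest part of the full script below is pulling the version string out of `version.json` with plain Unix tools. Here is that pipeline in isolation, using a throwaway file under `/tmp` purely for demonstration (the real script reads the `version.json` next to it and uses bash's `${VAR//./-}` substitution; `tr` is used here as a portable equivalent):

```shell
# Standalone demo of the version-parsing pipeline from deploy.sh:
# grab the "version" line, split on ':', strip quotes/commas/whitespace,
# then build a filename-safe variant with dots swapped for dashes.
cat > /tmp/version.json <<'EOF'
{
  "version": "1.0.0"
}
EOF

VERSION=$(cat /tmp/version.json \
  | grep version \
  | head -1 \
  | awk -F: '{ print $2 }' \
  | sed 's/[",]//g' \
  | tr -d '[[:space:]]')
SAFE_VERSION=$(echo "$VERSION" | tr '.' '-')

echo "VERSION=${VERSION} SAFE_VERSION=${SAFE_VERSION}"
# prints: VERSION=1.0.0 SAFE_VERSION=1-0-0
```

This keeps the deploy script dependency-free; a `jq`-based one-liner would be sturdier against JSON formatting changes, but requires installing `jq`.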
![Deploy script in action](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fdhye09rvh3m08d0d75w.png)

Here's my full `deploy.sh` script:

`Config/deploy.sh`

```bash
#!/bin/bash

USERNAME="my-itch-io-username"
GAME="my-awesome-game"
CHANNEL="html"

#
# COLORS
# see: https://stackoverflow.com/questions/5947742/how-to-change-the-output-color-of-echo-in-linux
#
NC='\033[0m'        # No Color
BLACK='\033[0;30m'  # Black
RED='\033[0;31m'    # Red
GREEN='\033[0;32m'  # Green
YELLOW='\033[0;33m' # Yellow
BLUE='\033[0;34m'   # Blue
PURPLE='\033[0;35m' # Purple
CYAN='\033[0;36m'   # Cyan
WHITE='\033[0;37m'  # White
GREY='\033[1;30m'   # Grey

#
# UTILS
#
log() {
  echo -e "${GREY}${1}${NC}"
}
info() {
  echo -e "${CYAN}${1}${NC}"
}
success() {
  echo -e "${GREEN}✓ ${1}${NC}"
}
warn() {
  echo -e "${YELLOW}${1}${NC}"
}
error() {
  echo -e "${RED}${1}${NC}"
}
prompt() {
  read -p "$1 " -n 1 -r
  echo # (optional) move to a new line
  if [[ ! $REPLY =~ ^[Yy]$ ]]
  then
    warn "user cancelled."
    exit 1
  fi
}
assertFileExists() {
  if [ ! -f "$1" ]; then
    error "$1 does not exist."
    exit 1
  fi
}
assertDirExists() {
  if [ ! -d "$1" ]; then
    error "$1 does not exist."
    exit 1
  fi
}

#
# GET VERSION
#
VERSION=$(cat version.json \
  | grep version \
  | head -1 \
  | awk -F: '{ print $2 }' \
  | sed 's/[",]//g' \
  | tr -d '[[:space:]]')
SAFE_VERSION="${VERSION//./$'-'}"

#
# SCRIPT
#
info "WELCOME TO THE UNITY DEPLOYMENT SCRIPT!"
info "About to push version ${RED}${VERSION}${CYAN} - proceed?"
prompt "(y/n)"

assertFileExists "../Build/index.html"

mkdir -p "../Archives"

ZIPFILE="../Archives/build-${SAFE_VERSION}.zip"

log "creating zip archive for ${ZIPFILE}..."

# zip
7z a $ZIPFILE "../Build" > NUL

log "deploying to itch.io..."

# push to itch.io
butler push $ZIPFILE "${USERNAME}/${GAME}:${CHANNEL}" --userversion $VERSION

success "All done!"
```

I also added a simple version file containing version info for my game, to make it easy to update without needing to alter my script:

`Config/version.json`

```json
{
  "version": "1.0.0"
}
```

Cover Photo by <a href="https://unsplash.com/@hellolightbulb?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText">Hello Lightbulb</a> on <a href="https://unsplash.com/s/photos/gamedev?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText">Unsplash</a>
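As an aside (my own sketch, not part of the author's script), the version extraction that `deploy.sh` does with a grep/awk/sed/tr pipeline can also be done with Node's JSON parser, since `version.json` is plain JSON. The string below stands in for the file contents shown above:

```javascript
// Sketch: parse the version field the way deploy.sh does, but via JSON.parse.
// versionJson stands in for the contents of Config/version.json.
const versionJson = '{ "version": "1.0.0" }';

const { version } = JSON.parse(versionJson);

// Mirrors SAFE_VERSION="${VERSION//./$'-'}" from the bash script:
// replace every dot with a dash for use in file names.
const safeVersion = version.replace(/\./g, "-");

console.log(version, safeVersion); // 1.0.0 1-0-0
```

In a real script you would read the file with `fs.readFileSync("version.json", "utf8")` instead of a hard-coded string.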
townofdon
964,367
return keyword JavaScript - A hidden hero we do not care about
Many of us, including me, do not care about some of the tiny pieces in a programming language that...
0
2022-01-24T01:00:11
https://dev.to/mkday/return-keyword-javascript-a-hidden-hero-we-do-not-care-about-48lo
javascript, beginners, webdev, programming
Many of us, including me, do not pay attention to some of the tiny pieces of a programming language that we use often without even thinking about them. In JavaScript, the **_return_** keyword is one of the crucial ones, because every built-in or custom function has a return value, and that value is defined using the *return* keyword. So, let me unwrap the things that make the *return* keyword so important.

### 1. Always resides inside a function

First of all, it can only be used inside a function. For instance, if we use it in the global scope or outside of any function, we will get an error message like *Uncaught SyntaxError: Illegal return statement*.

Let us try to return something in the global scope.

```javascript
/*
 * try to return a value in the global scope,
 * and it only gives an error
 */
console.log("before the return statement");
return 0;
console.log("after the return statement");

/* output:
Uncaught SyntaxError: Illegal return statement
*/
```

Instead, we can put that block of code inside a function.

```javascript
function returnSomething() {
  console.log("before the return statement");
  return 0;
  console.log("after the return statement");
}

const returnValue = returnSomething();
console.log(returnValue);

/* output:
"before the return statement"
0
*/
```

And, let us try to use the *return* keyword inside a for-loop that is not wrapped in a function.

```javascript
/*
 * try to return a value inside a for-loop,
 * and it is also never going to run
 */
for (let i = 1; i <= 5; i++) {
  if (i === 4) {
    console.log("reach to the value: " + i);
    return i;
  }
  console.log("current value: " + i);
}
console.log("exit from the loop");

/* output:
Uncaught SyntaxError: Illegal return statement
*/
```

To solve the problem, wrap it up using a function.
```javascript
function returnSomething() {
  for (let i = 1; i <= 5; i++) {
    if (i === 4) {
      console.log("reach to the value: " + i);
      return i;
    }
    console.log("current value: " + i);
  }
  console.log("exit from the loop");
}

const returnValue = returnSomething();
console.log(returnValue);

/* output:
"current value: 1"
"current value: 2"
"current value: 3"
"reach to the value: 4"
4
*/
```

The function above will never print the last console message, `console.log("exit from the loop")`. When the *if* condition `(i === 4)` becomes true, the function returns immediately without executing any further, so that console message becomes **_unreachable code_**.

In addition, here is how to use the *return* keyword with the *switch-case* statement.

```javascript
function returnSomething() {
  let fruit = "apple";
  switch (fruit) {
    case "banana":
      console.log("banana");
      return;
    case "orange":
      console.log("orange");
      return;
    case "apple":
      console.log("apple");
      return;
    default:
      console.log("no fruits here");
  }
  console.log("outside the switch-case");
}

const returnValue = returnSomething();
console.log(returnValue);

/* output:
"apple"
undefined
*/
```

### 2. Use return with or without a value

It is not compulsory to use the *return* statement inside every function. However, every function still has a return value, which defaults to `undefined`.

```javascript
function withoutReturn() {
  console.log("no return value for this function");
}

const returnValue = withoutReturn();
console.log(returnValue);

/* output:
"no return value for this function"
undefined
*/
```

### 3. Automatic Semicolon Insertion (ASI)

JavaScript does not allow a line terminator between the `return` keyword and the return expression. See the example below: because of the line break after `return`, a semicolon is inserted automatically, so the function returns `undefined` instead of the `a + b` value, and `a + b` becomes *unreachable code*.
```javascript
function returnSomething(a, b) {
  return
    a + b;
}

const returnValue = returnSomething(4, 5);
console.log(returnValue); // undefined
```

Instead, we can use parentheses to wrap the return expression.

```javascript
function returnSomething(a, b) {
  return (
    a + b
  );
}

const returnValue = returnSomething(4, 5);
console.log(returnValue); // 9
```

### 4. Types of return values

We can return a value of any type using the `return` keyword. For instance, we can return values which are,

* in the primitive data types such as string, number, boolean, etc.
* in the object types such as arrays, objects, and functions.

Plus, we can return console messages as well.

#### (1) Primitive types

* **return undefined:**

```javascript
function returnType() {
  console.log("return value: undefined");
  return undefined;
}

const returnValue = returnType();
console.log(returnValue);

/* output:
"return value: undefined"
undefined
*/
```

* **return null:**

```javascript
function returnType() {
  console.log("return value: null");
  return null;
}

const returnValue = returnType();
console.log(returnValue);

/* output:
"return value: null"
null
*/
```

* **return string:**

```javascript
function returnType() {
  console.log("return value: string");
  return "Hello World";
}

const returnValue = returnType();
console.log(returnValue);

/* output:
"return value: string"
"Hello World"
*/
```

* **return number:**

```javascript
function returnType() {
  console.log("return value: number");
  return 5;
}

const returnValue = returnType();
console.log(returnValue);

/* output:
"return value: number"
5
*/
```

* **return boolean:**

```javascript
function returnType() {
  console.log("return value: boolean");
  return false;
}

const returnValue = returnType();
console.log(returnValue);

/* output:
"return value: boolean"
false
*/
```

#### (2) Object types

* **return an array:**

```javascript
function returnType() {
  let name = "Bob";
  let age = 14;
  let id = 123;
  return [name, age, id];
}

const [name, age, id] =
returnType();
console.log(name, age, id);

/* output:
"Bob" 14 123
*/
```

* **return an object:**

```javascript
function returnType() {
  let name = "Bob";
  let age = 14;
  let id = 123;
  return { name, age, id };
}

const { name, age, id } = returnType();
console.log(name, age, id);

/* output:
"Bob" 14 123
*/
```

* **return a function:**

```javascript
function returnType() {
  return function returnFunction(name, age) {
    console.log(name + " is " + age + " years old.");
  };
}

const returnValue = returnType();
returnValue("Bob", 14);

/* output:
"Bob is 14 years old."
*/
```

#### (3) return a console message

```javascript
function returnType() {
  return console.log("Hello World");
}

const returnValue = returnType();

/* output:
"Hello World"
*/
```

### 5. Difference between *break* & *return*

The `return` keyword inside a loop stops the function execution and immediately exits from the function, so the code below the loop becomes *unreachable*. However, if we use the `break` statement instead, the code below the loop still runs.

```javascript
function useBreak() {
  for (let i = 1; i <= 5; i++) {
    if (i === 4) {
      console.log("reach to the value: " + i);
      break;
    }
  }
  console.log("outside the loop");
}

useBreak();

/* output:
"reach to the value: 4"
"outside the loop"
*/
```

```javascript
function useReturn() {
  for (let i = 1; i <= 5; i++) {
    if (i === 4) {
      console.log("reach to the value: " + i);
      return;
    }
  }
  console.log("outside the loop");
}

useReturn();

/* output:
"reach to the value: 4"
*/
```

### Conclusion

Okay, now we have finished talking about a tiny but important piece of JavaScript: the *return* keyword. To wrap this up, let me summarize the things we have discussed above.

* It can only be used inside a function.
* It stops the function execution at that point and returns the specified value to the function caller.
* When the return statement executes, the code that comes after it becomes *unreachable code*.
* JavaScript does not allow line terminators between the *return* keyword and the expression.

I hope you enjoyed this article. You can support me at [ko-fi](https://ko-fi.com/mkdaycode); I always appreciate your support. It really encourages me to keep going.

**_Happy Coding!_**

[![image_description](https://cdn.ko-fi.com/cdn/kofi2.png?v=3)](https://ko-fi.com/mkdaycode)
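One closing illustration (my own example, not from the article): the "stops execution immediately" behaviour summarized above is what makes the common guard clause idiom work — return early for the edge case so the main logic stays flat:

```javascript
// Guard clause: handle the invalid input with an early return,
// so the code below it never runs for that case.
function divide(a, b) {
  if (b === 0) return null; // early exit for the edge case
  return a / b;
}

console.log(divide(10, 2)); // 5
console.log(divide(1, 0));  // null
```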
mkday
964,840
10+ Best Websites To Download Free Website Templates For Developers.
Thousands of different website designs are available on the internet. Many of them are free to...
0
2022-01-23T13:34:35
https://cesscode.hashnode.dev/10-best-websites-to-download-free-website-templates-for-developers
webdev, javascript, html, css
Thousands of different website designs are available on the internet, and many of them are free to download. Whether you are a beginner or an expert, you always need the best templates for your website. In this article, you will find a list of websites to download free website templates.

Let's get started 💃

<iframe src="https://giphy.com/embed/l0G17c5peP4uHYsco" width="300" height="368" frameBorder="0" class="giphy-embed" allowFullScreen></iframe><p><a href="https://giphy.com/gifs/thesimpsons-l0G17c5peP4uHYsco"></a></p>

## What Is a Website Template?

In a word, a **template** is a predesigned page that you can use as a foundation for a new page. A **website template** is also known as a **web-page template** or **page template**. You can use a website template to create beautiful websites with little or no coding skills. Check out this article to know more about [website templates](https://www.techopedia.com/definition/4899/website-template).

## What Are the Advantages of Using a Website Template?

There are many benefits to using a website template. We will focus on two of the most important, which are:

- **Time-saving:** It takes a long time to build a website from the ground up. A developer can save time by using a website template: all that's left to do is add content. Adding materials such as text and images takes far less time than building the site from scratch.
- **Money-saving:** Another benefit of using a website template is that it helps save money. The cost of purchasing ready-made web design templates is low, so a designer can save money by choosing finished themes for their customers.

## Where Can I Get Free Website Templates From?

Here's a list of websites where you will get free responsive website templates:

1. **Html5Up**

**Html5Up** contains free website templates built with HTML5 and CSS3. The website templates are responsive and customizable.
There are many portfolios with good transitions to choose from on Html5Up. You can use these templates for personal or business purposes, and customize them in whatever way you like. All the templates on Html5Up are free to use; all you have to do is give them credit for the design.

![frame_chrome_mac_dark (1).png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642715682472/ZuKm7rIZl.png)

Link to the Html5Up website: [https://html5up.net](https://html5up.net)

2. **Colorlib**

**Colorlib** provides top-of-the-line website templates that are ready to use on any site. This website contains simple, responsive, clean, and fast-loading free WordPress themes for everyone. The different theme options are customizable, and you have total control over how you customize them. **Colorlib** makes creating professional websites accessible with its different customizable templates.

![frame_chrome_mac_dark (2).png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642715794498/t-CL7PBzu.png)

Link to the Colorlib website: [https://colorlib.com/wp/templates](https://colorlib.com/wp/templates)

3. **Templatemo**

This website contains free HTML5, CSS, and Bootstrap templates. The templates are responsive and customizable; you can customize them in whatever way you like and use them for personal or business purposes.

![frame_chrome_mac_dark (3).png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642715950029/jIO4NcgF8.png)

Link to the Templatemo website: [https://templatemo.com](https://templatemo.com)

4. **Onepage**

This website contains both free and paid website templates. The templates are single-page websites, each with a review, a long screenshot, a live demo, and download links. A one-page site has no extra pages like an about or services page; all the content sits within the same webpage in a long scrolling layout. The website runs on WordPress and features customized WordPress themes.
![frame_chrome_mac_dark (4).png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642716115968/Oymm7qQTg.png)

Link to the Onepage website: [https://onepagelove.com/templates](https://onepagelove.com/templates)

5. **Cruip**

This website contains free HTML landing page templates for download. The templates make it simple to create your landing pages. You can also sign up with your email address to receive updated templates in your inbox.

![frame_chrome_mac_dark (5).png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642716339599/8xOvtHshV.png)

Link to the Cruip website: [https://cruip.com/free-templates](https://cruip.com/free-templates)

6. **StyleShout**

This website contains beautiful free website templates. The websites are simple, clean, and handcrafted. It also has a blog section where they publish articles about their daily updates, web design, and web development.

![frame_chrome_mac_dark (6).png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642716521821/8hlhZKOuP.png)

Link to the StyleShout website: [https://www.styleshout.com/free-templates](https://www.styleshout.com/free-templates)

7. **Startbootstrap**

This website contains Bootstrap themes, templates, and UI tools. You have complete freedom to customize them in whatever way you like. You can also sign up with your email address to receive updated templates in your inbox.

![frame_chrome_mac_dark (11).png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642717619130/XVc_9kQbO.png)

Link to the Startbootstrap website: [https://startbootstrap.com](https://startbootstrap.com)

8. **ZeroTheme**

This website contains free HTML5, CSS3, and Bootstrap website templates. You can use these templates for personal or business purposes. The different template options are customizable in any form you like.

![frame_chrome_mac_dark (13).png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642717845499/o04A2Q5Gw.png)

Link to the ZeroTheme website: [https://www.zerotheme.com](https://www.zerotheme.com)

9. **Html5xcss3**

This website contains free responsive CSS3, Bootstrap, and HTML5 templates.
The templates are editable, and you can use them for business and personal reasons. You can also sign up with your email address to have new templates delivered to your inbox.

![frame_chrome_mac_dark (12).png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642717735825/1TMjDAb1X.png)

Link to the Html5xcss3 website: [https://www.html5xcss3.com](https://www.html5xcss3.com)

10. **Graphberry**

This website contains free HTML, CSS, Bootstrap, and React templates. It also includes UI kits, mockups, and designs for your use. Select the mockups, icons, or template links from the navigation menu. Graphberry provides assets that help speed up the design process for various needs.

![frame_chrome_mac_dark.png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642715455508/5ou8ZxC7L.png)

Link to the Graphberry website: [https://www.graphberry.com](https://www.graphberry.com)

11. **Tooplate**

This website contains responsive HTML CSS templates built on the responsive Bootstrap framework. The templates are customizable and easy to edit for your websites. You can download any template for free; there's no need to log in or register to use them.

![frame_chrome_mac_dark (10).png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642717361107/GqFjAfH81.png)

Link to the Tooplate website: [https://www.tooplate.com](https://www.tooplate.com)

12. **UIdeck**

This website contains free and paid HTML, Bootstrap, React, and Tailwind templates. These templates help you create websites without coding from scratch. You can use them for personal or business purposes.

![frame_chrome_mac_dark (9).png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642717231930/kjS2sX1mO.png)

Link to the UIdeck website: [https://uideck.com](https://uideck.com)

13. **ThemeForest**

This website contains professional WordPress themes and website templates for any project. The templates are customizable in whatever way you like. There are both free and paid templates and designs in ThemeForest.
![frame_chrome_mac_dark (8).png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642716834306/otT_peFNk.png)

Link to the ThemeForest website: [https://themeforest.net](https://themeforest.net)

## Conclusion

I hope that this article has provided you with a few websites to get free templates. With these sites, you will find the perfect website templates to download.

I hope you enjoyed this article; be sure to check back for more content soon! 💙

<iframe src="https://giphy.com/embed/DdDeUk0sGATFS" width="300" height="270" frameBorder="0" class="giphy-embed" allowFullScreen></iframe><p><a href="https://giphy.com/gifs/just-glass-ill-DdDeUk0sGATFS"></a></p>
cesscode
964,915
Share Text Across Near 💻Devices📱 using this website 🔥
Sharing Text data across near devices has always been a headache. Some conventional methods to share...
0
2022-01-23T15:25:22
https://dev.to/rajeshj3/share-text-across-near-devices-using-this-website-23hh
javascript, react, webdev, programming
Sharing text data across near devices has always been a headache. Some conventional methods to share text data are:

- Using native cross-platform applications (e.g. WhatsApp, WeChat, Telegram, etc.), or
- Using email services (Gmail, Yahoo Mail, etc.)

All such conventional methods need either the installation of native applications or bulky sites.

## Solution?

[**TEMP-SHARE**](https://temp-share.ml), a fast 🚀, reliable 💪 and secure 🛡️ web application.

**TEMP-SHARE** meets all your requirements for sharing text data across near devices:

> Unlimited text length 🗒️
> Secure 4-character alpha-numeric random password 🔑
> Your custom password 😎
> Max data persistence life of 10 minutes 🤩
> One-time read only 🤫
> Most important, dark/light themes 🌒

## How to use?

> Step 1. Just visit [temp-share.ml](https://temp-share.ml)

![temp-share home page](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/s1ekn2a19q1eb8xj4f4r.png)

> Step 2. Enter the text you want to share, and hit **SUBMIT**

> Step 3. You'll get a 4-character alpha-numeric random password 🔑

![Submitted Text](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jzzbe2la56zgqo01vhtb.png)

> Step 4. Open the same website ([temp-share.ml](https://temp-share.ml)) on another device and paste the **passcode**.

![entered passcode](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/978jhckd7g8zp6sb9od3.png)

> Step 5. Just hit **GET TEXT**.

![GET TEXT](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/e84se614rm07hc50oia6.png)

Here you go, you have the text on your other device.

---

**Note:** As we checked **One Time View**, fetching the text again will return an error.

![Error](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ak08bdyqxkyx8hqjkfwe.png)

---

Other than this, you can also set your own custom password 🔑

---

## Live ✨

TEMP-SHARE is currently live at [temp-share.ml](https://temp-share.ml)

---

## Contribution ✨

TEMP-SHARE is **not open-source yet**.
But, if you want us to open the source, please drop a comment explaining why you are interested.

---

I hope you guys liked this quick introduction to TEMP-SHARE. If so, please don't forget to drop a like ❤️

And also, help me reach **1k subscribers** 🤩 on my [YouTube channel](https://www.youtube.com/channel/UCCO4jIqmQVFDmVeeaAO5obA).

Happy Coding! 😃💻
rajeshj3
964,963
Show loading indicator for Lazy Modules in Angular
In Angular, By default, all modules are loaded as soon as the applications load, irrespective of...
0
2022-01-23T17:59:37
https://dev.to/ahmedgmurtaza/show-loading-indicator-for-lazy-modules-in-angular-4knf
angular, javascript, async, webdev
In Angular, by default, all modules are loaded as soon as the application loads, irrespective of which modules are immediately necessary and which are not.

## Why Lazy-loaded Modules

In applications with many routes, eagerly loaded modules increase the initial load time and consequently cause a bad user experience. To prevent a large load time we prefer lazy-loaded modules, which minimize initial load time as well as bundle size.

Every module has a different size, and network conditions vary, so each module will take a different time to load. _For a better user experience, showing a loader would definitely be a good idea!_

## Loader code

**app.component.html**

```html
<router-outlet>
  <span class="loader" *ngIf="isLoading"></span>
</router-outlet>
```

**app.component.css**

```css
.loader {
  display: inline-block;
  width: 40px;
  height: 40px;
  position: absolute;
  left: 0;
  right: 0;
  margin-left: auto;
  margin-right: auto;
  top: 50%;
  transform: translateY(-50%);
}

.loader:after {
  content: " ";
  display: block;
  width: 100px;
  height: 100px;
  border-radius: 50%;
  border: 5px solid #000;
  border-color: #000 transparent #000 transparent;
  animation: loader 1.2s linear infinite;
}

@keyframes loader {
  0% {
    transform: rotate(0deg);
  }
  100% {
    transform: rotate(360deg);
  }
}
```

**app.component.ts**

```javascript
import { Component } from '@angular/core';
import {
  Router,
  RouteConfigLoadStart,
  RouteConfigLoadEnd,
  RouterEvent
} from '@angular/router';

@Component({
  selector: 'app-root',
  templateUrl: './app.component.html',
  styleUrls: ['./app.component.css']
})
export class AppComponent {
  isLoading: boolean = false;

  constructor(router: Router) {
    router.events.subscribe((event: RouterEvent): void => {
      if (event instanceof RouteConfigLoadStart) {
        this.isLoading = true;
      } else if (event instanceof RouteConfigLoadEnd) {
        this.isLoading = false;
      }
    });
  }
}
```

The actual source code is [here](https://codesandbox.io/s/show-loader-in-lazy-module-in-angular-tb25o).
The loader part is `<span class="loader" *ngIf="isLoading"></span>`, which shows and hides based on the **isLoading** boolean.

The last part is app.component.ts, where we have added the following code block:

```javascript
router.events.subscribe((event: RouterEvent): void => {
  if (event instanceof RouteConfigLoadStart) {
    this.isLoading = true;
  } else if (event instanceof RouteConfigLoadEnd) {
    this.isLoading = false;
  }
});
```

Here we subscribe to router events and switch **isLoading** based on **RouteConfigLoadStart** and **RouteConfigLoadEnd**.

Hope this would be useful, see you guys soon 👋.
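As a side note (my own sketch, not part of the original component), the event-to-flag decision can be pulled into a small pure function so it is easy to unit-test and to extend — for example, also clearing the loader on the router's `NavigationError` or `NavigationCancel` events so the spinner never hangs if a lazy chunk fails to load. The helper name is hypothetical:

```javascript
// Hypothetical helper: map a router event name to the next isLoading value.
// Returns null when the event should not change the flag.
function nextLoadingState(eventName) {
  if (eventName === "RouteConfigLoadStart") return true;
  // Clear the loader when loading ends, fails, or navigation is cancelled.
  const clearing = ["RouteConfigLoadEnd", "NavigationError", "NavigationCancel"];
  if (clearing.includes(eventName)) return false;
  return null; // any other event: leave isLoading unchanged
}
```

Inside the subscription, the component could then do something like: `const next = nextLoadingState(event.constructor.name); if (next !== null) this.isLoading = next;` (checking the event class with `instanceof`, as in the article, is the more robust option in minified builds).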
ahmedgmurtaza
964,997
Why you should write clean code as a JavaScript Developer?
Hello Folks 👋 What's up friends, this is SnowBit here. I am a young passionate and...
0
2022-01-23T16:52:38
https://codewithsnowbit.hashnode.dev/why-you-should-write-clean-code-as-a-javascript-developer
javascript, node, webdev
## Hello Folks 👋

What's up friends, this is **SnowBit** here. I am a young, passionate and self-taught developer with the intention of becoming a successful developer.

Today, I am here with something important for you as a JavaScript developer.

## Why you should write clean code as a JavaScript Developer

Writing clean code improves the maintainability of the application and makes the developer productive. Unfortunately, some developers are unaware of the language features below.

### 🌟 Make Use of Arrow Functions

Arrow functions provide an abridged way of writing JavaScript functions. The main benefit of using arrow functions is that the curly braces, parentheses, and the `function` and `return` keywords become optional, which makes your code easier to understand.

The example below shows a comparison between a single-line arrow function and a regular function.

```js
// single line arrow function
const sum = (a, b) => a + b

// Regular Function
function sum(a, b) {
  return a + b;
}
```

### 🌟 Use Template Literals for String Concatenation

Template literals are delimited with backticks. A template literal can contain a placeholder, indicated by a dollar sign and curly braces:

```js
${expression}
```

We can define a placeholder in a string to remove all concatenations.

```js
// before
const hello = "Hello"
console.log(hello + " World")

// after
const hello = "Hello"
console.log(`${hello} World`)
```

### 🌟 Spread Syntax

Spread syntax (`...`) is another helpful addition in ES6. It can expand literals like arrays into individual elements with a single line of magic code. 🔮

```js
const sum = (a, b, c) => a + b + c
const num = [4, 5, 6]
console.log(`Sum: ${sum(...num)}`)
```

### 🌟 Object Destructuring

Object destructuring is a useful JS feature to extract properties from objects and bind them to variables.

For example, here we create an object with curly braces and a list of properties.
```js
const me = {
  name: "SnowBit",
  age: 15,
  language: "JavaScript"
}
```

Now let's extract the `name` and `age` property values and assign them to variables.

```js
const name = me.name
const age = me.age
```

Here, we have to explicitly mention the `name` and `age` properties on the `me` object using the dot (`.`), and then declare the variables and assign them. We can simplify this process by using the object destructuring syntax.

```js
const { name, age } = me
console.log(name, age)
```

---

Thank you for reading, have a nice day!

**Your appreciation is my motivation 😊**

- Follow me on Twitter - [@codewithsnowbit](https://twitter.com/codewithsnowbit)
- Subscribe to me on YouTube - [Code With SnowBit](https://www.youtube.com/channel/UCNTKqF1vhFYX_v0ERnUa1RQ?view_as=subscriber&sub_confirmation=1)
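One small extension of the destructuring example above (the object shape is from the article; the renamed variable is my own illustration): destructuring also supports renaming a property and supplying a default value when a property is missing.

```js
const me = { name: "SnowBit", age: 15 };

// Rename `name` to `userName`, and give `language` a default
// because it is missing from the object above.
const { name: userName, language = "JavaScript" } = me;

console.log(userName, language); // SnowBit JavaScript
```

Defaults only kick in when the property is `undefined`, so existing values are never overwritten.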
dhairyashah
965,010
Build Privex: A Cross-Platform ReactNative Chat App
What you’ll be building. See live demo and Git Repo Here. Introduction Let’s be real,...
0
2022-01-24T02:18:39
https://dev.to/daltonic/build-privex-a-cross-platform-reactnative-chat-app-4jb4
reactnative, cometchat, firebase, chatapp
What you’ll be building. See the live [demo](https://privex-d1c15.web.app/) and the Git repo [here](https://github.com/Daltonic/privex).

![Privex Chat Interface](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642938146196_privex-chat.gif)

## Introduction

Let’s be real, if you haven’t built a chat app yet, you are still a little bit behind the cutting edge of software development. You need to upskill your app development to the next level. Luckily, cross-platform frameworks such as [ReactNative](https://reactnative.dev/) can get you building a modern chat app in no time, like the one seen above.

In this tutorial, you will learn how to use ReactNative, [CometChat](http://), and Firebase to build a **one-on-one** chat app with a stunning UI. If you are ready, let’s get started…

## Prerequisite

To understand this tutorial, you should already be familiar with ReactNative; the rest of the stack is simple to grasp. The packages used to create this application are listed below.

- [ReactNative](https://reactnative.dev/)
- [Firebase](http://)
- [CometChat](https://app.cometchat.com/)
- [Expo](http://)
- [NodeJs](http://)

## Installing The Project Dependencies

First, download and install **NodeJs** on your machine. If you haven't already, go to their [website](https://nodejs.org/) and finish the installation.

The **Expo-CLI** must then be installed on your computer using the command below. You can get to their doc page by clicking on this [LINK](http://).

    # Install Expo-CLI
    npm install --global expo-cli

After that, open the terminal and create a new expo project called **privex**, selecting the blank template when prompted. Use the example below to demonstrate this.

    # Create a new expo project and navigate to the directory
    expo init privex
    cd privex

    # Start the newly created expo project
    expo start

Running the above commands on the terminal will create a new react-native project and start it up on the browser.
Now you will have the option of launching the iOS, Android, or web interface by simply selecting the one that you want. To spin up the development server on iOS or Android you will need a simulator; follow the instructions to set up an [IOS](https://docs.expo.dev/workflow/ios-simulator/) or [Android](https://docs.expo.dev/workflow/android-studio-emulator/) simulator. Otherwise, use the web interface and follow along with the tutorial.

Great, now follow the instructions below to install these critical dependencies for our project. **Yarn** is the expo's default package manager; see the codes below.

    # Install the native react navigation libraries
    yarn add @react-navigation/native
    yarn add @react-navigation/native-stack

    # Installing dependencies into an Expo managed project
    expo install react-native-screens react-native-safe-area-context react-native-gesture-handler

    # Install an Icon pack and a state manager
    yarn add react-native-vector-icons react-hooks-global-state

Nice, now let’s set up [Firebase](https://console.firebase.google.com/) for this project.

## Setting Up Firebase

Run the command below to properly install firebase in the project.

    # Install firebase with the command
    expo install firebase

Let's get started by configuring the Firebase console for this project, including the services we'll be using. If you do not already have a Firebase account, create one for yourself. After that, go to Firebase and create a new project called **privex**, then activate the Google authentication service, as detailed below.
![Step 1](https://paper-attachments.dropbox.com/s_0CD8CF1F8D288F8B67E35429489539F2DEF742656388B21AFE423365DCE87673_1639828562449_screenshoteasy+6.png)

![Step 2](https://paper-attachments.dropbox.com/s_0CD8CF1F8D288F8B67E35429489539F2DEF742656388B21AFE423365DCE87673_1639828654829_screenshoteasy+7.png)

![Project Creation Process](https://paper-attachments.dropbox.com/s_0CD8CF1F8D288F8B67E35429489539F2DEF742656388B21AFE423365DCE87673_1639828654775_screenshoteasy+9.png)

![Step 3](https://paper-attachments.dropbox.com/s_0CD8CF1F8D288F8B67E35429489539F2DEF742656388B21AFE423365DCE87673_1639828654724_screenshoteasy+10.png)

Firebase supports authentication via a variety of providers, for example social authentication, phone numbers, and the traditional email and password method. Because we'll be using Google authentication in this tutorial, we'll need to enable it for the project we created in Firebase, as it's disabled by default. Click the sign-in method under the authentication tab for your project, and you should see a list of providers currently supported by Firebase.

![Firebase Authentication Service](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642939379597_screenshoteasy+8.png)

![Step 1](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642939407189_screenshoteasy+9.png)

![Step 2](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642939422868_screenshoteasy+10.png)

![Step 3](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642939448868_screenshoteasy+11.png)

Super, that will be all for the Firebase authentication; let's generate the Firebase SDK configuration keys. You need to go and register your application under your Firebase project.
![Project Overview Page](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642939587877_screenshoteasy+4.png) On the project’s overview page, select the add app option and pick **web** as the platform. ![Registering a Firebase SDK Step 1](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642939610677_screenshoteasy+5.png) ![Registering a Firebase SDK Step 2](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642939643415_screenshoteasy+6.png) Return to the project overview page after completing the SDK config registration, as shown in the image below. ![Project overview page](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642939734348_screenshoteasy+7.png) Now you click on the project settings to copy your SDK configuration setups. ![Project Setups](https://paper-attachments.dropbox.com/s_0CD8CF1F8D288F8B67E35429489539F2DEF742656388B21AFE423365DCE87673_1639909446593_screenshoteasy+12.png) The configuration keys shown in the image above must be copied to a separate file that will be used later in this project. Create a file called **firebase.js** in the root of this project and paste the following codes into it before saving. {% gist https://gist.github.com/Daltonic/a44a6277a91984c549ff37084c537999 %} You are awesome if you followed everything correctly. We'll do something similar for **CometChat** next. ## Setting CometChat Head to [CometChat](https://app.cometchat.com/app/) and signup if you don’t have an account with them. Next, log in and you will be presented with the screen below. ![CometChat Dashboard](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642939920698_screenshoteasy+26.png) To create a new app, click the **Add New App button**. 
You will be presented with a modal where you can enter the app details. The image below shows an example. ![Add New App Modal](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642939907089_screenshoteasy+27.png) Following the creation of your app, you will be directed to your dashboard, which should look something like this. ![API Key Here](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642940299680_screenshoteasy+28.png) ![Rest API Key Here](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642940299625_screenshoteasy+29.png) You must also copy those keys to a separate file in the manner described below. Simply create a file called **CONSTANTS.js** in the project's root directory and paste the code below into it. Now include this file in the **.gitignore** file, which is also located at the root of this project; this will ensure that it is not published online.

```javascript
export const CONSTANTS = {
  APP_ID: 'xxx-xxx-xxx',
  REGION: 'us',
  Auth_Key: 'xxx-xxx-xxx-xxx-xxx-xxx-xxx-xxx',
}
```

Finally, delete the preloaded users and groups as shown in the images below. ![Users List](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642940871741_screenshoteasy+30.png) ![Group List](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642940871690_screenshoteasy+31.png) Awesome, that will be enough for the setups; let's start integrating them all into our application, starting with the components. ## The Components Directory This project contains several directories; let's begin with the components folder. Within the root of this project, create a folder called **components**. Let's begin with the **Header** component. 
**The Header Component** ![The Header Component](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642952011423_screenshoteasy+34.png) This is a styled component supporting the beauty of our app. Within it is the avatar element which displays the current user's profile picture. Also, with this avatar, you can log out from the application. Create this file by going into the components directory and creating a file named **HomeHeader.js**; afterward, paste the code below into it. {% gist https://gist.github.com/Daltonic/240747c706cf917532dcc99ddee76002 %} **The ChatContainer Component** ![The Chat Container](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642953342704_screenshoteasy+37.png) This component is responsible for showcasing the recent conversations of a user. It does more than that: it can also display users' stories and a floating button that launches a modal showing all the users registered in our app. We will look at each of them separately; shown below is the code for it. {% gist https://gist.github.com/Daltonic/601dae9b154f1b5b061f1f0b786a52e6 %} **The FloatingButton Component** ![FloatingButton ](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642953157814_screenshoteasy+36.png) This component is responsible for launching the **UserList** modal, which allows us to chat with a new user on our platform. Here is the code snippet for it. {% gist https://gist.github.com/Daltonic/e41fa762b1b1ad93794b16598943881a %} **The UserList Component** ![The user List Component](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642953580526_screenshoteasy+38.png) This component is launched by the **FloatingButton**. It displays all the users that are on our platform and enables you to have a first-time conversation with them. 
After that, they can appear in your conversation list. Below is the code responsible for its implementation. {% gist https://gist.github.com/Daltonic/d2c49606a79efcc6ad0858ba89d31572 %} Fantastic, we have just finished building the dedicated components; let's proceed to craft the screens now. ## The Screens Directory The screens are similar to website pages; each screen represents a page, and you can navigate from one to the next using the **ReactNative** navigator package. Let's proceed with the **LoginScreen**. **The LoginScreen** ![The Login Screen](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642954019473_screenshoteasy+32.png) This well-crafted screen does a lot of things behind the scenes. It uses Firebase Google authentication to sign you into the system. Once you are signed in, Firebase's **authStateChange** function will take note of you as a logged-in user. This is carried out in the **AuthNavigation** file. But before you are let into the system, your authenticated details will be retrieved and sent to CometChat, either for signing up or signing in. Once **CometChat** is done, you are then let into the system. See the code below for a full breakdown. {% gist https://gist.github.com/Daltonic/0dfa4574ae9e77909893e8b3dfe0e1de %} **The HomeScreen** ![The Home Screen](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642954467775_screenshoteasy+39.png) This well-crafted screen brings together all the dedicated components in the components directory into one space. Each of the components knows how to perform its duties. Not many words here; let the code below do all the explaining. 
```javascript
import { SafeAreaView, StyleSheet } from 'react-native'
import ChatContainer from '../components/ChatContainer'
import FloatingButton from '../components/FloatingButton'
import HomeHeader from '../components/HomeHeader'
import UserList from '../components/UserList'

const HomeScreen = ({ navigation }) => {
  return (
    <SafeAreaView style={styles.container}>
      <HomeHeader />
      <ChatContainer navigation={navigation} />
      <FloatingButton />
      <UserList navigation={navigation} />
    </SafeAreaView>
  )
}

export default HomeScreen

const styles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: '#122643',
  },
})
```

**The ChatScreen** ![The Chat Screen](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642954848645_screenshoteasy+40.png) Lastly for the screens, we have the chat screen that lets us perform one-on-one conversations with another user. It heavily uses the **CometChat SDK** for its operations; let's take a look at how the code works. {% gist https://gist.github.com/Daltonic/870333db4540072750af19a664150c2b %} There are three functions you should take note of: they are the meat of this screen. The **getMessages**, **sendMessage**, and **listenForMessage** functions. They utilize the **CometChat SDK**, and each one of them performs its operations according to its name. That's the last screen for the application; let's seal it up with the App component and the router setup… ## Setting Up The Router Now that we've finished coding the project, let's set up the navigation routers and guards. To do so, create and paste the following codes as directed below. **The Navigation file** This categorizes the screens into two groups: those that require authentication and those that do not. Make a new file called **"navigation.js"** in the project's root and paste the code below into it. 
{% gist https://gist.github.com/Daltonic/394a01466750d92dd8937b447a95b2d8 %} **The AuthNavigation file** This file displays screens logically to you based on the **authState** of the firebase authentication service. It is also responsible for signing a user in to **CometChat**, depending on whether they are registering or logging into the system. To proceed, create a new file in the project's root called **AuthNavigation.js** and paste the code below into it. {% gist https://gist.github.com/Daltonic/0c33d6acebc7847a8db7f793f87ea627 %} Finally, the App component… **The App Component** This component puts together every part of this project. Please replace the content of this file with the code below.

```javascript
import { CometChat } from '@cometchat-pro/react-native-chat'
import { useEffect } from 'react'
import AuthNavigation from './AuthNavigation'
import { CONSTANTS } from './CONSTANTS'

export default function App() {
  const initCometChat = () => {
    let appID = CONSTANTS.APP_ID
    let region = CONSTANTS.REGION
    let appSetting = new CometChat.AppSettingsBuilder()
      .subscribePresenceForAllUsers()
      .setRegion(region)
      .build()

    CometChat.init(appID, appSetting)
      .then(() => console.log('Initialization completed successfully'))
      .catch((error) => console.log('Initialization failed with error:', error))
  }

  useEffect(() => initCometChat(), [])

  return <AuthNavigation />
}
```

Cheers, you just finished building this app; now it's time to do even more with this new trick you've learned. You can spin up your server using the code below in your terminal if you have not done that already.

```shell
# Start your ReactNative local server on the web view
yarn web
```

The App should function like the one in the image below. ![The App Overview](https://paper-attachments.dropbox.com/s_52E0591A3F34FCACA6806EE6ACA69150A91780C910B4BF05FF192DA50DA23706_1642955276948_privex-overview.gif) ## Conclusion We've reached the end of this tutorial; hopefully, you learned something new. 
Becoming a modern-day developer can be difficult, but it is not impossible; you can accomplish more than you think; all you need is a little guidance. And now you know how to use ReactNative, Firebase, and [CometChat](https://app.cometchat.com/) to create a fantastic chat app with a beautiful interface. I have more of these tutorials that will show you how to make a private or public group chat. I'm excited to see your magnificent creations. All the best! ## About the Author Gospel Darlington kick-started his journey as a software engineer in 2016. Over the years, he has grown full-blown skills in JavaScript stacks such as React, ReactNative, VueJs, and more. He is currently freelancing, building apps for clients, and writing technical tutorials teaching others how to do what he does. Gospel Darlington is open and available to hear from you. You can reach him on [LinkedIn](https://www.linkedin.com/in/darlington-gospel-aa626b125/), [Facebook](https://www.facebook.com/darlington.gospel01), [Github](https://github.com/Daltonic), or on his [website](https://daltonic.github.io/).
daltonic
965,485
PS1toEXE
A tool to convert powershell script to exe and make the same execute in other windows machines...
16,553
2022-01-24T04:54:35
https://devpost.hashnode.dev/ps1toexe
powershelll, automation, windows, programming
A tool to convert a PowerShell script to an exe and make it execute on other Windows machines without opening any script runner. ### :boom: Reposting my old repo to the community This was built by me several years ago; I thought it would be nice to repost it to this new blog. [:link: Github Repo link](https://github.com/aravindvcyber/PS1toEXE) # :dart: PS1toEXE This PowerShell script lets you "convert" PowerShell scripts into EXE files. It was modified and updated from ps2exe v0.5.0.0 to include support for PowerShell 5 with full features, and it works in Windows 10. # :runner: How to execute this script Open Windows PowerShell:

```ps
c:\> .\ps1toexe.ps1 -inputfile inputfilepath.ps1 outputfilepath.exe
```

and add the other necessary switches. Besides this, some of our contributors have built an export module as well, `ConvertTo-Exe`; go and explore it, which makes this even better now. ### For more stuff like this follow our blog [:postbox: Dev Post](https://devpost.hashnode.dev). This repo is free to use and we are ready to refine it with your contributions as well, so don't hesitate to raise a :white_check_mark: pull request or comment on our blog post below [:link: @PS1toEXE](https://devpost.hashnode.dev/PS1toEXE). You can also find our community's reactions to PS1toEXE there. Thanks for supporting! 🙏 It would be really great if you'd like to [:coffee: Buy Me a Coffee](https://www.buymeacoffee.com/AravindVCyber), to help boost my efforts. <a href="https://www.buymeacoffee.com/AravindVCyber"><img src="https://img.buymeacoffee.com/button-api/?text=Buy me a coffee&emoji=&slug=AravindVCyber&button_colour=BD5FFF&font_colour=ffffff&font_family=Cookie&outline_colour=000000&coffee_colour=FFDD00" /></a> :repeat: Original post at :link: [Dev Post](https://devpost.hashnode.dev/ps1toexe)
aravindvcyber
965,609
How to kill a process running on particular port in Linux?
...
0
2022-01-24T09:29:16
https://dev.to/gilly7/how-to-kill-a-process-running-on-particular-port-in-linux-37mc
{% stackoverflow 11583562 %}
gilly7
965,800
What Are the Advantages and Disadvantages of a Compiler?
If you're a software engineer, I can imagine that you've heard the term "compiler" a lot whenever...
0
2022-01-24T13:13:25
https://techwithmaddy.com/what-are-the-advantages-and-disadvantages-of-a-compiler
programming, computerscience, codequality
If you're a software engineer, I can imagine that you've heard the term "compiler" a lot whenever you're in the process of coding something. Especially when you're getting an error. 😁 The compiler can be a friend or an enemy, depending on what you're doing. This article will show you the advantages and disadvantages of a compiler. The book "[Five Lines of Code](https://www.manning.com/books/five-lines-of-code?utm_source=maddy&utm_medium=affiliate&utm_campaign=book_clausen_five_5_8_20&a_aid=maddy&a_bid=4b64bea7)" explains it brilliantly, so I'm going to refer to that in this article. The code snippets are in Typescript. > DISCLOSURE: This article contains affiliate links to external products. This means that I will earn a commission at no additional cost to you if you decide to make a purchase. Thank you! Now let's dive in. #### What is a compiler? #### A compiler is a program that turns a high-level language into low-level code that the processor can digest. This is the primary goal of a compiler. But nowadays, compilers do more than that. They check if some errors can happen during runtime, and in this article, we're going to concentrate on this aspect. #### What are the advantages of a compiler? #### *1. Advantage: Reachability ensures that methods return* The compiler ensures that a method has a return statement. The compiler will let us know when a method should return something, and if we miss the return statement. ![image.png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642683107299/jYMp0YZC5.png) *2. Advantage: Definite assignment prevents accessing uninitialized variables* Compilers are helpful with checking whether a variable has something assigned to it before we use them. ![image.png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642683139942/ZBCh0JhX1.png) The compiler won't let the programmer get away with not declaring a variable. *3. 
Advantage: Access control helps encapsulate data* The compiler does an excellent job of managing access control. If we make a member private, we know that it will not slip away easily. ![image.png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642685112779/V7is2vTux.png) *4. Advantage: Type checking proves properties* The compiler helps us check whether members and variables exist and are associated with compatible types. ![image.png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642863548812/SA6eyXCwg.png) We've seen the advantages of the compiler. Now let's examine the disadvantages of a compiler. #### What are the disadvantages of a compiler? #### *1. Disadvantage: The halting problem limits compile-time knowledge* What is the "halting problem"? [Wikipedia](https://en.wikipedia.org/wiki/Halting_problem) defines the halting problem like this: > "In computability theory, the halting problem is the problem of determining, from a description of an arbitrary computer program and an input, whether the program will finish running, or continue to run forever." In simpler words, the definition says that we don't know what's going to happen until we run a program. ![image.png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642938438030/8W6AWCkQ2.png) In the screenshot above, what's inside the IF statement won't run, and regardless, the program will fail because there is no definition for the method foo(). This is a weakness because the compiler will allow a program to behave unexpectedly in some cases. In other cases, the compiler will perform *conservative analyses*, which means it will refuse to run the program if it is unsafe. *2. Disadvantage: Dereferencing null crashes our application* Have you ever got a `NullPointerException` unexpectedly? Unfortunately, the compiler doesn't help with this. The compiler will allow a program to run but then throw a NullPointerException at runtime. 
In some cases, the compiler will tell you something such as "XYZ may not have been initialized", but the compiler won't always help you prevent a NullPointerException. An excellent way to prevent a NullPointerException is to ensure that all objects have been initialized before we use them. ![image.png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642938495529/ZIPOV31Fz.png) *3. Disadvantage: Arithmetic errors cause overflows or crashes* Remember the `StackOverflowException`? The compiler doesn't help with this either. It doesn't even tell us whether we are dividing a number by zero (in that case, we'll have to trust our maths skills). ![image.png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642938343442/LFVx7O8Jd.png) *4. Disadvantage: Out-of-bounds errors crash our application* Have you ever seen an `ArrayOutOfBoundsException`? I certainly have. It happens when you try to access an element outside the data structure. ![image.png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642938571222/xuRKe5P_M.png) In this case, the compiler will not help us. A great way to overcome this exception is to go through the data structure using a loop. Alternatively, we can check that an element exists before we access it. *5. Disadvantage: Infinite loops stall our application* The compiler doesn't help us when we stare at the screen while there is a never-ending loop running in the background. I encountered this problem in the past when using a while loop. Indeed, it's better to use a for loop instead of a while loop to prevent this problem. ![image.png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642938615735/aG4XWF-L3.png) *6. Disadvantage: Deadlocks and race conditions cause unintended behavior* The compiler doesn't help us with multithreading. And multithreading is an advanced topic. Many problems (such as race conditions, deadlocks, and starvation) can occur with multithreading. When does a race condition occur? 
"A race condition occurs when a software program depends on the timing of one or more processes to function correctly."[[TechTerms.com](https://techterms.com/definition/race_condition)] ![image.png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642867225164/T6bH9csuI.png) When does a deadlock occur? "Both threads are locked, waiting for each other to unlock before continuing." ![image.png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642867364335/hPaJS8FUD.png) A deadlock can be solved using a *lock*, which checks whether a thread is free before moving on. When does starvation occur? The author of the book uses an excellent metaphor: "The metaphor for it is a one-lane bridge where one side has to wait, but the stream of cars from the other side never stops". ![image.png](https://cdn.hashnode.com/res/hashnode/image/upload/v1642867470220/pNphyYlp0.png) **A bit of curiosity: what's the difference between a compile-time and a runtime error?** Compile-time errors happen because of mistakes in the syntax or semantics. The programmer can quickly fix the error during compile-time. Runtime errors happen during program execution. Fixing runtime errors is challenging because the compiler doesn't highlight runtime errors. **CONCLUSION** In this article, we went through the advantages and disadvantages of the compiler, and the difference between compile-time error and runtime error. If you enjoy my content, consider subscribing to my [newsletter](https://techwithmaddy.com/). Thank you for reading this article. Until next time! 👋🏾 ***ADDITIONAL RESOURCES:*** - [Christian Clausen: Five Lines of Code - "How and When to Refactor"](https://www.manning.com/books/five-lines-of-code?utm_source=maddy&utm_medium=affiliate&utm_campaign=book_clausen_five_5_8_20&a_aid=maddy&a_bid=4b64bea7) - [Baeldung: Runtime VS Compile-time](https://www.baeldung.com/cs/runtime-vs-compile-time)
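To make the compile-time vs. runtime distinction above concrete, here is a small TypeScript sketch (not from the book, names are made up): an out-of-bounds read that the compiler happily accepts but that only surprises us when the program runs.

```typescript
// Without --noUncheckedIndexedAccess, indexing a string[] is typed as
// `string` even past the end of the array, so this compiles cleanly.
const names: string[] = ["Ada", "Grace", "Alan"];

// Type-checks as a string, but there is no element at index 3.
const fourth = names[3];

console.log(fourth); // undefined at runtime, despite the `string` type
```

Enabling `noUncheckedIndexedAccess` makes the compiler type indexed reads as `string | undefined`, which forces the check the runtime would otherwise skip.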
maddy
965,839
Arrow Function in Javascript
Hello guys today i am discussing about Arrow functions in javascript or we can say in ES6. ...
0
2022-01-24T14:25:17
https://dev.to/shubhamtiwari909/arrow-function-in-javascript-46gd
javascript, es6, productivity, webdev
Hello guys, today I am discussing Arrow functions in JavaScript, or we can say in ES6.

## Arrow functions

* Arrow functions were introduced in ES6, providing a more concise way to write functions in JavaScript. They allow us to write smaller function syntax. Arrow functions make your code more readable and structured.
* Arrow functions are anonymous functions (functions without a name, not bound to an identifier).
* They can be declared without the `function` keyword.
* Arrow functions cannot be used as constructors.
* They are also called Lambda Functions in other languages.

### Syntax

```javascript
const functionName = (arg1, arg2, ...) => {
  //body of the function
}
```

### Example 1 - Normal function vs arrow function

```javascript
//Normal function
function display(){
  console.log("This is a normal function");
}
display();

//Arrow function (it needs a different name here,
//since `display` is already declared in this scope)
const displayArrow = () => {
  console.log("This is an Arrow function");
}
displayArrow();
```

### Output

```
This is a normal function
This is an Arrow function
```

* `=>` is called fat-arrow notation or lambda notation.
* `()` — inside these braces we can pass parameters.
* Then we have the body of the function inside curly braces `{}`.
* Then we called it like a normal function using its variable name.

### Example 2 - Parameterised function

```javascript
//With parameters
const Factorial = (num) => {
  let fact = 1;
  for(let i = 1; i <= num; i++){
    fact *= i;
  }
  console.log(fact);
}
Factorial(5);
```

### Output

```
120
```

* We have created an arrow function named Factorial, and inside the braces `(num)` we have passed the `num` parameter, which is the number used to calculate the factorial. Inside the function body we wrote the factorial logic, and in the end we called the Factorial function, which produces the output. 
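One more behavior worth knowing, as a hedged aside that the examples above don't cover: arrow functions do not get their own `this` — they capture it from the enclosing scope, which is why they are so popular for callbacks:

```javascript
// An arrow function keeps the `this` of the scope it was created in,
// so the callback below still refers to the Timer instance.
function Timer() {
  this.ticks = 0;
  this.tick = () => {
    this.ticks += 1;
  };
}

const t = new Timer();
t.tick();
t.tick();
console.log(t.ticks); // 2
```

Had `tick` been written with the `function` keyword and then passed around as a bare callback, `this` would no longer point at the Timer instance.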
### Example 3 - Single line functions

```javascript
//single-line arrow functions don't need curly braces or an explicit return;
//with curly braces you would have to write `{ return a + b; }`
const single = (a, b) => a + b;
console.log(single(10, 15));
```

### Output

```
25
```

### Example 4 - Rest parameters

```javascript
//With rest parameters
const restParameter = (...args) => {
  let sum = 0;
  for(let i of args){
    sum += i;
  }
  console.log(sum);
}
restParameter(1,2,3,4,5,6,7,8,9,10);
```

### Output

```
55
```

* Rest parameters, denoted by three dots `...`, do not restrict the number of values you can pass to a function; all the passed values are collected into a single array.
* Here we have passed 10 numbers in the arguments while calling our function, and then summed up all the numbers and output the result.
* You can see we have used the rest operator in the function parameter and didn't create 10 different parameters for the 10 numbers passed.

### As we have seen, the arrow function makes writing functions less complicated, and it also reduces the number of lines.

### That's it for this post.

### THANK YOU FOR READING THIS POST AND IF YOU FIND ANY MISTAKE OR WANT TO GIVE ANY SUGGESTION, PLEASE MENTION IT IN THE COMMENT SECTION.

## ^^You can help me by some donation at the link below Thank you👇👇 ^^

☕ --> https://www.buymeacoffee.com/waaduheck <--

### Also check these posts as well

1. https://dev.to/shubhamtiwari909/higher-order-function-in-javascript-1i5h
2. https://dev.to/shubhamtiwari909/best-vs-code-extensions-for-web-development-2lk3
3. https://dev.to/shubhamtiwari909/introduction-to-tailwind-best-css-framework-1gdj
shubhamtiwari909
966,030
Tmux hotkey for copier templates
I have added a hotkey to my copier template setup to quickly access all my templates at any time from...
16,020
2022-01-24T15:14:59
https://waylonwalker.com/til/tmux-copier-templates/
python, linux, tmux
--- canonical_url: https://waylonwalker.com/til/tmux-copier-templates/ published: true tags: - python - linux - tmux series: til title: Tmux hotkey for copier templates --- I have added a hotkey to my copier template setup to quickly access all my templates at any time from tmux. At any point I can hit `<c-b><c-b>`, thats holding control and hitting `bb`, and I will get a popup list of all of my templates directory names. Its an fzf list, which means that I can fuzzy search through it for the template I want, or arrow key to the one I want if I am feeling insane. I even setup it up so that the preview is a list of the files that come with the template in tree view. ``` bash bind-key c-b popup -E -w 80% -d '#{pane_current_path}' "\ pipx run copier copy ~/.copier-templates/`ls ~/.copier-templates |\ fzf --header $(pwd) --preview='tree ~/.copier-templates/{} |\ lolcat'` . \ " ``` I've had this on my systems for a few weeks now and I am constantly using it for my [tils](https://waylonwalker.com/til/), [blogs](https://waylonwalker.com/archive), and my .envrc file that goes into all of my projects to make sure that I have a virtual environment installed and running any time I open it. ![this is what it looks like when I open my copier templates popup](https://images.waylonwalker.com/copier-templates-tmux-popup.png)
waylonwalker
966,384
SwiftUI Custom Style
In this post I would like to share my approach to style different views. Some time ago, I wanted to...
0
2022-01-24T22:22:58
https://dev.to/drucelweisse/swiftui-custom-style-468f
swiftui, swift
In this post I would like to share my approach to style different views. Some time ago, I wanted to customize SwiftUI Textfield, but TextFieldStyle API is not exposed to public and looks like this: ```swift struct MyTextFieldStyle: TextFieldStyle { func _body(configuration: TextField<Self._Label>) -> some View { VStack(spacing: 0) { configuration .font(font) Rectangle().frame(maxWidth: .infinity, maxHeight: 2) } } let font = Font.system(size: 37, weight: .bold) } ``` But I think, Apple won't allow to use that, and I came up with new approach. Firstly, let's create typealias to ViewModifier ```swift typealias CustomTextFieldStyle = ViewModifier extension View { func customTextFieldStyle<S>(_ style: S) -> some View where S: CustomTextFieldStyle { modifier(style) } } ``` Now, all I need is just create struct, that will implement CustomTextFieldStyle. ```swift struct LargeTextFieldStyle: CustomTextFieldStyle { func body(content: Content) -> some View { VStack(spacing: 2) { content .font(font) .disableAutocorrection(true) .autocapitalization(.words) Rectangle().foregroundColor(Color.blue).frame(maxWidth: .infinity, maxHeight: 2) } } let font: Font = Font.system(size: 26, weight: .bold) } ``` Also with Swift 5.5 new [feature](https://github.com/apple/swift-evolution/blob/main/proposals/0299-extend-generic-static-member-lookup.md), that allow to extend generics, thus we can extend our CustomTextFieldStyle type and use it even more simpler ```swift extension CustomTextFieldStyle where Self == LargeTextFieldStyle { static var large: LargeTextFieldStyle { LargeTextFieldStyle() } } ``` You can checkout full code in this [gist](https://gist.github.com/drucelweisse/b76c5000dbf17bd5a0e9c24551f697b2)
drucelweisse
966,640
How to run rails console on elastic beanstalk rails app
Put all you environment variable into .env file root of your project directory. sudo su - -c "cd...
0
2022-01-25T05:45:13
https://dev.to/pritambios/how-to-run-rails-console-on-newly-setup-elastic-beanstalk-environment-4ieg
ebs, elasticbeanstalk, rails, ruby
1. Put all your environment variables into a `.env` file at the root of your project directory.

```shell
sudo su - -c "cd /var/app/current; /opt/elasticbeanstalk/bin/get-config --output YAML environment | sed 's/: /=/g' > .env"
```

2. Set the environment variables from the `.env` file.

```shell
export $(grep -v '^#' .env | xargs -d '\n')
```

3. Run your rails console with the appropriate rails environment (production/staging/test/development, etc.).

```shell
RAILS_ENV=production bundle exec rails c
```
pritambios
966,781
A Practical Introduction to TypeScript Class Decorators
TypeScript is hot and happening at the moment! I love it! But a lot of times there come features you...
0
2022-08-08T07:33:51
https://hasnode.byrayray.dev/a-practical-introduction-to-typescript-class-decorators-afb996af0763
--- title: A Practical Introduction to TypeScript Class Decorators published: true date: 2020-02-03 23:00:37 UTC tags: canonical_url: https://hasnode.byrayray.dev/a-practical-introduction-to-typescript-class-decorators-afb996af0763 --- TypeScript is hot and happening at the moment! *I love it*! But a lot of times there come features you don't know, so maybe in your case "Decorator" is one of them. This is the first post of a series about TypeScript decorators where I want to give you a practical introduction to them. What is a decorator, which types are there, why would you want to use a decorator and in what kind of situations would you use a decorator. > *Currently decorator is a stage 2 proposal for JavaScript and is available as an experimental feature of TypeScript.* *If you want to read more about the specification, which is pretty interesting and well written, check it on the Github repo: [JavaScript Decorators](https://github.com/tc39/proposal-decorators).* --- ![https://images.unsplash.com/photo-1482245294234-b3f2f8d5f1a4?ixlib=rb-1.2.1&q=85&fm=jpg&crop=entropy&cs=srgb](https://images.unsplash.com/photo-1482245294234-b3f2f8d5f1a4?ixlib=rb-1.2.1&q=85&fm=jpg&crop=entropy&cs=srgb) ## What is a Decorator? What are decorators? Well, think about the meaning of the word "decoration" and you are pretty close. The TypeScript website they describe it as: > *A Decorator is a special kind of declaration that can be attached to a class declaration, method, accessor, property, or parameter.* I would describe it as *A special declaration to add extra features on an existing class declaration, method, accessor, property or parameter*. ### JavaScript Classes To use a decorator you need to be familiar with JavaScript Classes. Classes are super useful to know! If you don't know what a JavaScript Class is and how they work, I highly recommend checking this video from **Daniel Shiffman** from **The Coding Train**, to have a practical introduction to them. 
After that, you can follow this post about decorators better. [https://www.youtube.com/watch?v=T-HGdc8L-7w](https://www.youtube.com/watch?v=T-HGdc8L-7w) ### Angular If you are using Angular, chances are you are using decorators like `@Component`, `@NgModule`, `@Input`, `@Output` or `@Inject`. This will help a bit with getting a better understanding of decorators. ## Which decorators are there? To describe what decorators do in TypeScript we need to dive deeper into each type. There are a few different types of decorators. 1. Class Decorator 2. ~~Property Decorator~~ (will come later) 3. ~~Method Decorator~~ (will come later) 4. ~~Accessor Decorator~~ (will come later) 5. ~~Parameter Decorator~~ (will come later) ![https://images.unsplash.com/photo-1552664730-d307ca884978?ixlib=rb-1.2.1&q=85&fm=jpg&crop=entropy&cs=srgb](https://images.unsplash.com/photo-1552664730-d307ca884978?ixlib=rb-1.2.1&q=85&fm=jpg&crop=entropy&cs=srgb) ### Class decorator Let's start with the class decorator. To visualize it, we are gonna build a (*fictional*) piece of the **Lord of the Rings** 🧙‍♂️ game. To play the game we need a lot of characters. Some can be the character you play with, some are for the scene around it. This is the class for a `Character`. It has `armour`, `height`, `width`, `weight` and `type`. The type is bound to the enum `CharacterType`. {% gist https://gist.github.com/devbyray/d55e08798106f0ae10ea7bf05af9d60b %} In **LOTR** you have a wide variety of characters. We are gonna use the class decorator to make 2 character types, the Hobbit and the Elf. Both classes extend the `Character` class we defined on top because the Elf and the Hobbit both have their own strengths, weaknesses, and capabilities. 
**Hobbit class** {% gist https://gist.github.com/devbyray/1bd3750c5c2e6f629aa4d6383a655594 %} **Elf class** {% gist https://gist.github.com/devbyray/4f6ffdfed3c99a3ea143471e692344e0 %} **Hobbit Decorator** In the Hobbit decorator, we override some properties that are specific to a Hobbit player. The same goes for the Elf. A decorator is a function that returns a new constructor which extends the `constructor` of the class. In this constructor we add or overwrite properties/methods on the class we apply the decorator to. > *Sadly we can't read the information from the original constructor and use it. I learned that after trying for a day! If you know another method, please write it in a comment so I can include it.* {% gist https://gist.github.com/devbyray/299739f1453e4ed4199ec8e9c341b69c %} When you want to add the decorator to a class, you write it as `@hobbitDec` above the class you want it to be applied to. {% gist https://gist.github.com/devbyray/9bbde3b2e140e5c674e2eeded3e97d95 %} When you console log that in the browser, you should see the information combined with the information from the `PlayerCharacter`. {% gist https://gist.github.com/devbyray/d7ca3b87d0740972935bd733d34a2283 %} **Elf decorator** If we want our player to be an Elf, we do the same thing as for the Hobbit. {% gist https://gist.github.com/devbyray/e028bad9cf02a3bba3f5bc590165a428 %} We apply the `@elfDec` on the `PlayerCharacter` class to add the player type information. {% gist https://gist.github.com/devbyray/3cf5b48b047beb013ebefda24cabff6e %} Log the information in the console and you see that the correct information is applied. {% gist https://gist.github.com/devbyray/aea3061a4fc75dc36b94c90cc8a19da0 %} Now we know how to use a class decorator in TypeScript. Do you want to check my whole code of the class decorator? 
Visit the [TypeScript playground](https://www.typescriptlang.org/play/index.html?experimentalDecorators=true&emitDecoratorMetadata=true#code/GYVwdgxgLglg9mABACzgIzTKARAphAHgBVFcAPKXMAEwGdEBvRMXAdwAoA6bgQwCcA5rQBciHmACeAbQC6ASlEMAvoiUA+dhAS0ofENDh9RROYwBQiS4i1gdKdJiiIAvMzaIAEg6zs5FqwH+ljZ2TKww1FDIADQouDACyFCxrPGJyWJ8ALZwIHyxUBIADrgqrqgYWEEBiNV8uFB5SBAANjy09OSUNPQhuvpQhuY1VuGRyC6IY1HVAchpSZPzCUmzowtOrqkrUGuW-Dl5kwe5fHuIhSWTl7jVSmb3ZqCQsAikLcB4hCRdVHSMbg43E4-CEonE0nkihU6k02n6BiMiBMwysfXewEmLFYiAAoh9fOdquiwhEorFlukUhtYic8gViqVJrgPkSAvVGnxmm0OqQKH9evC9IjUSNphMtmTkOdKYtyhtztt0pMlasRvtsqdjpq8ucbtdGXcHmYzFQQFlEABhZD8HjQXB8IiM0WIAByAHlXbjJgByMAIXA+6LVDzugBCYYAkiRXD6Ko4g9VcQAZABivpZwETj1a7Xo1tt9r4LqKIDQLRgEEyhyRYHNaAdkwAjAAGFsAbmqpfLlbiO1EdayDeLrhbXbLFar4oH9cbo-HPanGxnQ7niDHAW7k4ujNEBb4dsojudrn3h4dTpKnA9XuNZlzvK8lScvx6VptB6LJYnvdlUBXw7Ngu27Tsws4jogABMwG9qq-5gauEEACwblYW69jce4fuex5XKe2FFpeuCcKGEbRiaAR9MKgx8L4LoBLQIAlLRfgBPcOY8vQ+KYq+-xnl+DAwVWdK1uBzZth2Ql9ukAFrq2qGWOhU5SrJEGQQpiBKVMy4IYBrgAJwaVpmHvoWR5EZM-HmYynApqmFFokKAyGHRgnqogjHMYSbF3gA9L5iAAAptBIjYQARR5mAAAvGWBfPenFBSFDpWY2bloT+VZgDwWS4KIOh8DAYACL6wU8KFxbZblPqdjm8KaclEHYkl5UpRFDq+J297aHALTES0cACOwPpFI1ohBg1rV8HIQA) where you can also see how the compiled JavaScript looks. ### **When would you use the class decorator?** Well, I think it's pretty simple. When you want to overwrite or add a bunch of properties/methods to a class, it can be a good idea to use the class decorator. **What is the difference between extending a class?** Maybe your thinking 🤔, you could also extend the `Hobbit` or `Elf` class at the `PlayerCharacter` class. Yes, that also possible! But the difference, when using a class decorator instead of extending a class, is that when you extend a class, you most of the time will and can manipulate the data from the extended class. In contrast to extending a class, the class decorator will always be on top of the class. So you can't manipulate the data from the decorator inside your class. 
So if you use a class decorator, know that you can cause some major side effects you didn't think of. 😉 ## Thanks I hope that you're now up to date with the TypeScript class decorator and that you learned how to use it, but also what is not possible with it. In the next post, I'm gonna show you how to use the TypeScript Property Decorator. If you have any questions, please post them in the comments 👍.
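To wrap up with one self-contained snippet: the names `withType` and `PlayerCharacter` below are my own simplification, not the gists above, and the decorator is applied by a plain function call so it also runs without the `experimentalDecorators` flag.

```typescript
// A class decorator is just a function: it receives a constructor and
// returns a new constructor that extends it.
function withType(type: string, height: number) {
  return function <T extends new (...args: any[]) => {}>(Base: T) {
    return class extends Base {
      characterType = type;
      height = height;
    };
  };
}

class PlayerCharacter {
  armour = 10;
}

// Equivalent to writing @withType("hobbit", 1.1) above the class:
const HobbitPlayer = withType("hobbit", 1.1)(PlayerCharacter);

const player = new HobbitPlayer();
console.log(player.armour, player.characterType); // 10 hobbit
```

Note that the decorator runs once, when the class is defined, not when instances are created — which is exactly why it can't see data from the original constructor at runtime.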
devbyrayray
1,216,510
Answer: How to revert to origin's master branch's version of file
answer re: How to revert to origin's...
0
2022-10-11T09:55:55
https://dev.to/eslee99/answer-how-to-revert-to-origins-master-branchs-version-of-file-12dl
{% stackoverflow 1817774 %}
eslee99
966,942
Get individual SVG's back from IconFonts
Originally posted on my blog In this tutorial I'm going to show you how to decompile a big bloated...
0
2022-01-25T09:58:09
https://dev.to/andyfitz/get-individual-svgs-back-from-iconfonts-2j8b
iconfonts, svg, conversion
Originally posted [on my blog](https://andyfitzsimon.com/posts/decompile-iconfonts/) In this tutorial I'm going to show you how to decompile a big bloated icon font into a series of small, neat, and portable SVG files. It's dead easy. First install [Fontforge](https://fontforge.org/) on your system Fedora Linux ``` sudo dnf install fontforge ``` MacOS ``` brew install fontforge ``` Just change the package manager for your distribution: e.g. Debian/Ubuntu with `sudo apt install fontforge` Now for the magic one-liner, just copy and paste this in your terminal and replace `youriconfont.woff` with the path to your icon font file. ``` fontforge -lang=ff -c \ 'Open($1); SelectWorthOutputting(); \ foreach Export("svg"); endloop;' \ youriconfont.woff ``` You'll now have a standalone SVG file for every glyph in your icon font. If you don't care about naming feel free to head to the [bonus round](#bonus-round) ## But those filenames, they suck right? Because the font file knows nothing about what each shape is called, the filenames are taken from the unicode value of each character, so you'd think you're stuck with filenames like: **` uniE9AF_youriconfont.svg `** Good news, your existing CSS should have something like this (but probably bigger): ``` .icon-pacman:before { content: "\e916"; } .icon-leaf:before { content: "\e9a4"; } .icon-airplane:before { content: "\e9af"; } ``` > Yarr, thar be useful names What I do is a bulk find and delete on the common CSS markup so that we're left with something like this: ``` icon-pacman,e916 icon-leaf,e9a4 icon-airplane,e9af ``` As my SED and AWK skills are terrible, I just opened that output as a CSV in a spreadsheet, converted the unicode column to uppercase (if you need to), flipped the column order, and was left with new CSV output like this: ``` E916,icon-pacman E9A4,icon-leaf E9AF,icon-airplane ``` I dutifully do another 3 find and replace commands: prefix each line with `mv uni` replace all commas with 
`_youriconfont.svg ` (trailing space) replace each linebreak with `.svg` followed by a line break The output now looks like this: ``` mv uniE916_youriconfont.svg pacman.svg mv uniE9A4_youriconfont.svg leaf.svg mv uniE9AF_youriconfont.svg airplane.svg ``` I run that and rejoice. This finally replaces all filenames with the sane names given to them by the CSS. I'm sure I could script this reconciliation part but I do like doing it manually as all CSS is crafted slightly differently, so it's good to be eyes-on in case there is some rogue CSS to clean up. You probably only need to do this once. If you're doing this every day well... then you might be a competitor of my employer, so shoo shoo 👀 # Bonus round ### install SVGO on your system (I prefer to use NPM as it works on linux and mac hosts alike) ``` npm install -g svgo ``` Now you can simply enter the directory and run ``` svgo *.svg ``` That's it. Your files will have `fill="currentcolor"` applied to them so when you inline the SVG inside markup they will adhere to your CSS `color:` value just like the good-ol days. # A little history I'll cover why you would ever need or want to do this. Icon fonts were once an answer to easy web design. They exploded on the scene once [`@fontface`](https://caniuse.com/fontface) was supported by major browsers. Iconfonts gave us all the icons you could need within a single HTTP request - which was great if you had a lightning-fast CDN and aggressive caching. You used them usually with CSS pseudo elements and custom character ranges. That's where things get hairy for our a11y friends. Remember image sprites? Iconfonts as a technique came after image sprites for single-payload single-color icon sets. Before frameworks like React or powerful site generators it made sense to align your icons and marks with your typographic layout - browsers weren't always so nice so this kept things on 'some' guardrails. 
Plus services like [IcoMoon](https://icomoon.io) emerged which made creating custom icon sets easy. But of course, like I mentioned, the accessibility was terrible. Worse, some text layout engines mutilated icon fonts. Think of the carnage the CSS `text-rendering: optimizeSpeed;` would do to your illustrations. ### We now live in the future With the advent of modern frameworks and the popularity of best practices like [ARIA attributes](https://css-tricks.com/accessible-svg-icons/), icon fonts just aren't the most suitable technique anymore; not by a long shot. But much of the internet is still littered with websites and apps that depend on iconfonts. And for those poor buggers, it's not without effort to climb into modern times. We see this often at [outfit.io](http://outfit.io) whenever we have a new client with only a website to ingest their existing brand assets from. Once we have the clean illustrations (that were often lost) back in a malleable and minimal format, our web developer counterparts can shed hundreds of kilobytes of frontend bottleneck from their sites and apps. I hope this helps you.
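P.S. the spreadsheet step from the renaming section can be scripted after all. Here's a rough sketch with sed, assuming your CSS matches the exact `.icon-name:before { content: "\e…"; }` shape shown earlier (the `icons.css` file written below is a stand-in for your real stylesheet — tweak the pattern for yours):

```shell
# Write a sample of the icon CSS (stand-in for your real stylesheet).
cat > icons.css <<'EOF'
.icon-pacman:before { content: "\e916"; }
.icon-leaf:before { content: "\e9a4"; }
.icon-airplane:before { content: "\e9af"; }
EOF

# Extract "code name" pairs, uppercase the hex code, and emit mv commands.
sed -nE 's/^\.icon-([a-z-]+):before [{] content: "\\([0-9a-f]+)"; [}]$/\2 \1/p' icons.css |
while read -r code name; do
  printf 'mv uni%s_youriconfont.svg %s.svg\n' "$(printf '%s' "$code" | tr 'a-f' 'A-F')" "$name"
done
# Prints:
#   mv uniE916_youriconfont.svg pacman.svg
#   mv uniE9A4_youriconfont.svg leaf.svg
#   mv uniE9AF_youriconfont.svg airplane.svg
```

Eyeball the printed commands first, then pipe them through `sh` when you're happy.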
andyfitz
967,192
20 best flutter libraries for professional developers
A lot of developers use libraries in their work. In the last article, the Code.Market team shared the...
0
2022-01-25T14:16:49
https://dev.to/pablonax/20-best-libraries-for-professional-developers-2b49
<span style="font-weight: 400;">A lot of developers use libraries in their work. In the last article, the Code.Market team shared the best libraries for beginners. In this article, we've listed the most important libraries that professional developers use in their toolkit.</span> &nbsp; <h2><a href="https://code.market/libs/flutter/characters-1-2-0/?utm_source=dev.to&utm_medium=article&utm_campaign=flutter_libraries/"><span style="font-weight: 400;">characters 1.2.0 </span></a></h2> <img class="alignright size-medium wp-image-77569" src="https://code.market/wp-content/uploads/2022/01/characters-1.2.0--900x508.jpg" alt="characters 1.2.0 " width="900" height="508" /> <span style="font-weight: 400;">This library is needed in every application. You can use it to interact with the individual characters of a string. Save it so you don't lose it! For example, to get the first tag character (`<`):</span> ``` // Using CharacterRange operations. Characters firstTagCharacters(Characters source) { var range = source.findFirst("<".characters); if (range != null && range.moveUntil(">".characters)) { return range.currentCharacters; } return null; } ``` &nbsp; <h2><a href="https://code.market/libs/flutter/pretty_dio_logger-1-1-1/?utm_source=dev.to&utm_medium=article&utm_campaign=flutter_libraries/"><span style="font-weight: 400;">pretty_dio_logger 1.1.1</span></a></h2> <img class="alignright size-medium wp-image-77563" src="https://code.market/wp-content/uploads/2022/01/pretty_dio_logger-1.1.1-900x508.jpg" alt="pretty_dio_logger 1.1.1" width="900" height="508" /> <span style="font-weight: 400;">This library is called pretty for a reason. This logger is a Dio interceptor that logs network calls in a pretty, easy-to-read format. Pretty, huh? 
Nothing superfluous: a handy, quick way to spot errors in seconds.</span> Example: ``` Dio dio = Dio(); dio.interceptors.add(PrettyDioLogger()); // customization dio.interceptors.add(PrettyDioLogger( requestHeader: true, requestBody: true, responseBody: true, responseHeader: false, error: true, compact: true, maxWidth: 90)); ``` <img src="https://github.com/Milad-Akarie/pretty_dio_logger/raw/master/images/response_log_android_studio.png?raw=true"> <h2><a href="https://code.market/libs/flutter/extension-0-2-0/?utm_source=dev.to&utm_medium=article&utm_campaign=flutter_libraries/"><span style="font-weight: 400;">extension 0.2.0</span></a></h2> <img class="alignright size-medium wp-image-77579" src="https://code.market/wp-content/uploads/2022/01/TVskkks-glitch-900x508.jpg" alt="extension 0.2.0" width="900" height="508" /> <span style="font-weight: 400;">Another useful library for Flutter that will extend your Dart code with handy methods. For example, use plural forms for Russian words:</span> ``` plural(1, 'дом', 'дома', 'домов'); // returns дом plural(2, 'дом', 'дома', 'домов'); // returns дома plural(5, 'дом', 'дома', 'домов'); // returns домов ``` Or some fancy methods to work with dates: ``` // Is today DateTime.now().isToday; // returns bool ``` There are also a couple of useful links to other libraries inside. <h2><a href="https://code.market/libs/flutter/undo-1-4-0/?utm_source=code+market&amp;utm_medium=article"><span style="font-weight: 400;">undo 1.4.0 </span></a></h2> <img class="alignright size-medium wp-image-77581" src="https://code.market/wp-content/uploads/2022/01/TV-glitch-900x508.jpg" alt="undo 1.4.0 " width="900" height="508" /> <span style="font-weight: 400;">The library to undo and redo your changes in a Flutter project. Undo a change with undo(). 
``` print(person.firstName); // Jane changes.undo(); print(person.firstName); // John ``` </span> <h2><a href="https://code.market/libs/flutter/linter-1-18-0/?utm_source=dev.to&utm_medium=article&utm_campaign=flutter_libraries/"><span style="font-weight: 400;">linter 1.18.0</span></a></h2> <img class="alignright size-medium wp-image-77580" src="https://code.market/wp-content/uploads/2022/01/TV-glihgfdvhstch-900x508.jpg" alt="linter 1.18.0" width="900" height="508" /> <span style="font-weight: 400;">The Dart Linter library defines lint rules that identify and report on "lints" found in Dart code.</span> <h2><a href="https://code.market/libs/flutter/isolated_worker-0-1-0/?utm_source=dev.to&utm_medium=article&utm_campaign=flutter_libraries/"><span style="font-weight: 400;">isolated_worker 0.1.0</span></a></h2> <img class="alignright size-medium wp-image-77578" src="https://code.market/wp-content/uploads/2022/01/TV-xjzjglitch-900x508.jpg" alt="isolated_worker 0.1.0" width="900" height="508" /> <span style="font-weight: 400;">The singleton isolated worker for all platforms. This is the way to isolate processes. On most platforms it uses Flutter's Isolate, except on the web: since Isolate is not available there, it uses Worker instead. Basic example:</span> ``` // if using compute function: // compute(doSomeHeavyCalculation, 1000); IsolatedWorker().run(doSomeHeavyCalculation, 1000); ``` &nbsp; <h2><a href="https://code.market/libs/flutter/timeago-3-1-0/?utm_source=dev.to&utm_medium=article&utm_campaign=flutter_libraries/"><span style="font-weight: 400;">timeago 3.1.0 </span></a></h2> <img class="alignright size-medium wp-image-77577" src="https://code.market/wp-content/uploads/2022/01/timeago-3.1.0--900x508.jpg" alt="timeago 3.1.0 " width="900" height="508" /> <span style="font-weight: 400;">If you need fuzzy timestamps, you need to save this library. It takes you only five minutes and will save you a lot of effort. 
All you have to do is load the locales you want; by default, only English and Spanish messages are available. The easiest way to use this library is via the top-level function `format(date)`: ``` final fifteenAgo = new DateTime.now().subtract(new Duration(minutes: 15)); print(timeago.format(fifteenAgo)); // 15 minutes ago print(timeago.format(fifteenAgo, locale: 'en_short')); // 15m print(timeago.format(fifteenAgo, locale: 'es')); // hace 15 minutos ``` </span> &nbsp; <h2><a href="https://code.market/libs/flutter/web3dart-2-3-3/?utm_source=dev.to&utm_medium=article&utm_campaign=flutter_libraries/"><span style="font-weight: 400;">web3dart 2.3.3 </span></a></h2> <img class="alignright size-medium wp-image-77576" src="https://code.market/wp-content/uploads/2022/01/web3dart-2.3.3--900x508.jpg" alt="web3dart 2.3.3 " width="900" height="508" /> <span style="font-weight: 400;">Do you need a Dart library that interacts with the Ethereum blockchain? This library includes many features: connect to an Ethereum node with the RPC API, call common methods, send signed Ethereum transactions, generate private keys, set up new Ethereum addresses, and much more. 
``` import 'dart:math'; // used for the random number generator import 'package:web3dart/web3dart.dart'; // You can create Credentials from private keys Credentials fromHex = EthPrivateKey.fromHex("c87509a[...]dc0d3"); // Or generate a new key randomly var rng = new Random.secure(); Credentials random = EthPrivateKey.createRandom(rng); // Either way, the library can derive the public key and the address // from a private key: var address = await random.extractAddress(); print(address.hex); ``` </span> &nbsp; <h2><a href="https://code.market/libs/flutter/translator-0-1-7/?utm_source=dev.to&utm_medium=article&utm_campaign=flutter_libraries/%20%E2%80%8E"><span style="font-weight: 400;">translator 0.1.7 </span></a></h2> <img class="alignright size-medium wp-image-77575" src="https://code.market/wp-content/uploads/2022/01/translator-0.1.7-900x508.jpg" alt="translator 0.1.7" width="900" height="508" /> <span style="font-weight: 400;">Free Google Translate API for Dart. The translator brought the world closer. Bring your app closer to your users by adding a feature like the translator. ``` translator.translate("I love Brazil!", from: 'en', to: 'pt').then((s) { print(s); }); // prints Eu amo o Brasil! ``` </span> <h2><a href="https://code.market/libs/flutter/faker-2-0-0/?utm_source=dev.to&utm_medium=article&utm_campaign=flutter_libraries/"><span style="font-weight: 400;">faker 2.0.0</span></a></h2> <img class="alignright size-medium wp-image-77574" src="https://code.market/wp-content/uploads/2022/01/faker-2.0.0-900x508.jpg" alt="faker 2.0.0" width="900" height="508" /> <span style="font-weight: 400;">Library inspired by the Python package faker and the Ruby package faker. It allows you to create a fake email, name, or any random sentence quickly and easily, in case you need them. 
``` var faker = new Faker(); faker.internet.email(); // francisco_lebsack@buckridge.com faker.internet.ipv6Address(); // 2450:a5bf:7855:8ce9:3693:58db:50bf:a105 faker.internet.userName(); // fiona-ward faker.person.name(); // Fiona Ward faker.person.prefix(); // Mrs. faker.person.suffix(); // Sr. faker.lorem.sentence(); // Nec nam aliquam sem et ``` </span> &nbsp; <h2><a href="https://code.market/libs/flutter/english_words-4-0-0/?utm_source=dev.to&utm_medium=article&utm_campaign=flutter_libraries/"><span style="font-weight: 400;">english_words 4.0.0 </span></a></h2> <img class="alignright size-medium wp-image-77573" src="https://code.market/wp-content/uploads/2022/01/english_words-4.0.0--900x508.jpg" alt="english_words 4.0.0 " width="900" height="508" /> <span style="font-weight: 400;">This package contains the most used English words and some utility functions. There are more than 5,000 words in the set. British or American English, who cares now? All at your fingertips. ``` nouns.take(50).forEach(print); ``` </span> &nbsp; <h2><a href="https://code.market/libs/flutter/mockito-5-0-17/?utm_source=dev.to&utm_medium=article&utm_campaign=flutter_libraries/"><span style="font-weight: 400;">mockito 5.0.17</span></a></h2> <img class="alignright size-medium wp-image-77572" src="https://code.market/wp-content/uploads/2022/01/mockito-5.0.17-900x508.jpg" alt="mockito 5.0.17" width="900" height="508" /> <span style="font-weight: 400;">You need this library to generate mock classes. It's very easy to use. It will be easy and simple for you to create stubs and checks for each class. ``` // Annotation which generates the cat.mocks.dart library and the MockCat class. @GenerateMocks([Cat]) void main() { // Create mock object. 
var cat = MockCat(); } ``` </span> &nbsp; <h2><a href="https://code.market/libs/flutter/pdf-3-6-5/?utm_source=dev.to&utm_medium=article&utm_campaign=flutter_libraries/"><span style="font-weight: 400;">pdf 3.6.5</span></a></h2> <img class="alignright size-medium wp-image-77571" src="https://code.market/wp-content/uploads/2022/01/pdf-3.6.5-900x508.jpg" alt="pdf 3.6.5" width="900" height="508" /> <span style="font-weight: 400;">This library can create a full multi-page PDF document with graphics, images, and text using TrueType fonts. Everything brilliant is simple! ``` final pdf = pw.Document(); pdf.addPage(pw.Page( pageFormat: PdfPageFormat.a4, build: (pw.Context context) { return pw.Center( child: pw.Text("Hello World"), ); // Center })); // Page ``` </span> &nbsp; <h2><a href="https://code.market/libs/flutter/sensors_plus-1-2-1/?utm_source=dev.to&utm_medium=article&utm_campaign=flutter_libraries/"><span style="font-weight: 400;">sensors_plus 1.2.1 </span></a></h2> <img class="alignright size-medium wp-image-77570" src="https://code.market/wp-content/uploads/2022/01/sensors_plus-1.2.1--900x508.jpg" alt="sensors_plus 1.2.1 " width="900" height="508" /> <span style="font-weight: 400;">Add a compass to your app. Using the accelerometer, your app can detect whether the smartphone is moving or not. Great for working with maps or a pedometer. 
``` accelerometerEvents.listen((AccelerometerEvent event) { print(event); }); // [AccelerometerEvent (x: 0.0, y: 9.8, z: 0.0)] ``` </span> &nbsp; <h2><a href="https://code.market/libs/flutter/infinite_scroll_pagination-3-1-0/?utm_source=dev.to&utm_medium=article&utm_campaign=flutter_libraries/https://pub.dev/packages/infinite_scroll_pagination"><span style="font-weight: 400;">infinite_scroll_pagination 3.1.0</span></a></h2> <img class="alignright size-medium wp-image-77568" src="https://code.market/wp-content/uploads/2022/01/infinite_scroll_pagination-3.1.0-900x508.jpg" alt="infinite_scroll_pagination 3.1.0" width="900" height="508" /> <span style="font-weight: 400;">Scrolling pagination, endless scrolling pagination, auto-pagination, lazy loading pagination, progressive loading pagination, etc. - there are a lot of words, but there is only one library. Use this library to let your application scroll indefinitely.</span> <img src="https://raw.githubusercontent.com/EdsonBueno/infinite_scroll_pagination/master/docs/assets/demo.gif"> &nbsp; <h2><a href="https://code.market/libs/flutter/sign_in_with_apple-3-3-0/?utm_source=dev.to&utm_medium=article&utm_campaign=flutter_libraries/"><span style="font-weight: 400;">sign_in_with_apple 3.3.0 </span></a></h2> <img class="alignright size-medium wp-image-77567" src="https://code.market/wp-content/uploads/2022/01/sign_in_with_apple-3.3.0--900x508.jpg" alt="sign_in_with_apple 3.3.0 " width="900" height="508" /> <span style="font-weight: 400;">Everyone likes quick and simple solutions. That's why Apple ID sign-up is exactly what your app needs! </span> <span style="font-weight: 400;">It will make your users very happy because it's handy, fast, and, with this library, also of high quality. 
``` SignInWithAppleButton( onPressed: () async { final credential = await SignInWithApple.getAppleIDCredential( scopes: [ AppleIDAuthorizationScopes.email, AppleIDAuthorizationScopes.fullName, ], ); print(credential); // Now send the credential (especially `credential.authorizationCode`) to your server to create a session // after they have been validated with Apple (see `Integration` section for more information on how to do this) }, ); ``` </span> &nbsp; <h2><a href="https://code.market/libs/flutter/connectivity_plus-2-2-0/?utm_source=dev.to&utm_medium=article&utm_campaign=flutter_libraries/"><span style="font-weight: 400;">connectivity_plus 2.2.0</span></a></h2> <img class="alignright size-medium wp-image-77566" src="https://code.market/wp-content/uploads/2022/01/connectivity_plus-2.2.0-900x508.jpg" alt="connectivity_plus 2.2.0" width="900" height="508" /> <span style="font-weight: 400;">This plugin allows your app to recognize cellular and wi-fi connections. A useful check that makes your app more user-friendly. ``` import 'package:connectivity_plus/connectivity_plus.dart'; var connectivityResult = await (Connectivity().checkConnectivity()); if (connectivityResult == ConnectivityResult.mobile) { // I am connected to a mobile network. } else if (connectivityResult == ConnectivityResult.wifi) { // I am connected to a wifi network. } ``` </span> <h2><a href="https://code.market/libs/flutter/just_audio-0-9-18/?utm_source=dev.to&utm_medium=article&utm_campaign=flutter_libraries/"><span style="font-weight: 400;">just_audio 0.9.18</span></a></h2> <img class="alignright size-medium wp-image-77565" src="https://code.market/wp-content/uploads/2022/01/just_audio-0.9.18-900x508.jpg" alt="just_audio 0.9.18" width="900" height="508" /> <span style="font-weight: 400;">The Flutter plugin ecosystem contains a rich variety of useful audio plugins. To make them work together in one application, `just_audio` is all you need. 
By focusing on a single duty, different audio plug-ins can safely work together without overlapping responsibilities causing runtime conflicts. Inside the library are links to other useful features that allow you to play audio in the background, etc. Just use it! <img src="https://user-images.githubusercontent.com/19899190/125459608-e89cd6d4-9f09-426c-abcc-ed7513d9acfc.png"> </span> &nbsp; <h2><a href="https://code.market/libs/flutter/graphql-5-0-0/?utm_source=dev.to&utm_medium=article&utm_campaign=flutter_libraries/"><span style="font-weight: 400;">graphql 5.0.0</span></a></h2> <img class="alignright size-medium wp-image-77564" src="https://code.market/wp-content/uploads/2022/01/graphql-5.0.0-900x508.jpg" alt="graphql 5.0.0" width="900" height="508" /> <span style="font-weight: 400;">GraphQL has many advantages, both for the client and for the programmer: devices need fewer requests and therefore consume less data. Queries are also predictable: a response has the same structure as its query. That's why it's so easy to work with. You should try the most popular GraphQL client for Dart.</span> &nbsp; <h2><a href="https://code.market/libs/flutter/translator-0-1-7-2/?utm_source=dev.to&utm_medium=article&utm_campaign=flutter_libraries/"><span style="font-weight: 400;">translator 0.1.7</span></a></h2> <img class="alignright size-medium wp-image-77562" src="https://code.market/wp-content/uploads/2022/01/translatorbbb-0.1.7-900x508.jpg" alt="translator 0.1.7" width="900" height="508" /> <span style="font-weight: 400;">The translator is a useful feature for your app, and its essence is simple: call the translate method with the from and to arguments to set the source and target languages, or omit from and it will auto-detect the language of the source text; await the call to get the translated value. </span> <span style="font-weight: 400;">What libraries do you use in your own work? Share in the comments. </span> &nbsp;
pablonax
967,513
Using Twitter's API to Gather Tweet Stats (and Follower Data) in Python
In this little tutorial, we will learn how to use Twitter's API to download statistics about your...
16,566
2022-01-26T11:36:40
https://bas.codes/posts/python-twitterapi-intro
python, twitter, beginners
In this little tutorial, we will learn how to use Twitter's API to download statistics about your tweets, like number of impressions, profile clicks, etc. Also, we download a list of your followers, which would also allow you to track your unfollowers. ## Getting Started Before we begin, we need an API key for the Twitter API. You can get one by signing up [here](https://developer.twitter.com/en/portal/petition/essential/basic-info). Once you have your Twitter Developer Account, head over to the [Dashboard](https://developer.twitter.com/en/portal/dashboard) to create a new App. Click on your newly created app and head to the "Keys and tokens" tab. ![Twitter Dashboard - Keys and Tokens](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/py7le2gdahvq0ipon3t2.png) Ensure that you generate Authentication tokens with "Read, Write and Direct Messages Permissions" for your Twitter user. In general, we need two pairs of key and secret to make use of our app: 1. API Key and Secret: This tells Twitter that it's your App which makes a request. 2. Access Token and Secret: Since apps can be used by multiple users, the access token and secret authenticate your user to Twitter. Please store these secrets somewhere safe, as you cannot display them again on the Twitter Dashboard. If you lose them, you have to regenerate them. ## Installing the TwitterAPI PyPI package Since the Twitter API is a [well-documented](https://developer.twitter.com/en/docs) [REST API](https://en.wikipedia.org/wiki/Representational_state_transfer), we could build all our queries to it ourselves. However, there are Python packages that allow us to access the Twitter API in a much more comfortable way. The most prominent packages for that purpose are [`tweepy`](https://www.tweepy.org/) and [`TwitterAPI`](https://github.com/geduldig/TwitterAPI). For this guide, we use the latter. 
We install it as always by using `pip`: ```bash pip install TwitterAPI ``` ## Accessing the API ### Getting Tweet Stats Now, we have everything to get started. Let's create a file `stats.py` and first import the `TwitterAPI` classes. We also create a client object by providing our credentials. ```python from TwitterAPI import TwitterAPI, TwitterPager consumer_key = "<YOUR API KEY>" consumer_secret = "<YOUR API KEY SECRET>" access_token = "<YOUR ACCESS TOKEN>" access_token_secret = "<YOUR ACCESS TOKEN SECRET>" client = TwitterAPI(consumer_key, consumer_secret, access_token, access_token_secret, api_version="2") ``` The first request to the API is for retrieving our own `user_id`, which is a numeric value rather than your self-chosen username. We can get that by calling the `/users/me` endpoint like so: ```python my_user = client.request(f'users/:me') USER_ID = my_user.json()['data']['id'] ``` Now, we can get stats for our tweets: ```python params = { "max_results": 100, "tweet.fields": "created_at,public_metrics,non_public_metrics,in_reply_to_user_id", } r = client.request(f"users/:{USER_ID}/tweets", params) ``` There are two things to note here: 1. the `params` dictionary contains a value for `max_results` set to `100`. This is the maximum Twitter allows for one API request. 2. To access the statistic fields, we have to instruct Twitter to attach these values to the API response. We do that by requesting a comma-separated list of fields for the tweet (`tweet.fields`). If you have paid ads running on Twitter's platform, you can also include the values `organic_metrics,promoted_metrics` to the `tweet.fields` params. However, these fields are available on promoted tweets only so that non-promoted tweets won't appear in your result when you request `organic_metrics,promoted_metrics`. So, you should make two API requests, one for promoted and one for non-promoted tweets. 
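The metrics then live under each tweet's `public_metrics` key in the response body. As a small illustration of how you might aggregate them, here's a sketch over made-up data in the documented v2 response shape — the helper `total_metric` is my own, not part of `TwitterAPI`:

```python
# Sum a public_metrics field over one page of a user-tweets response.
# sample_response is fabricated sample data, not output from a real request.
sample_response = {
    "data": [
        {"id": "1", "text": "first tweet",
         "public_metrics": {"retweet_count": 3, "reply_count": 1,
                            "like_count": 10, "quote_count": 0}},
        {"id": "2", "text": "second tweet",
         "public_metrics": {"retweet_count": 0, "reply_count": 4,
                            "like_count": 7, "quote_count": 1}},
    ]
}

def total_metric(response: dict, metric: str) -> int:
    """Sum one public_metrics field across all tweets in a response page."""
    return sum(tweet["public_metrics"][metric] for tweet in response["data"])

print(total_metric(sample_response, "like_count"))     # 17
print(total_metric(sample_response, "retweet_count"))  # 3
```

With a live response you would pass `r.json()` instead of the sample dict.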
### Getting Follower Data

Just like other social networks, Twitter doesn't really seem to want you to get information about your followers other than their sheer number. In particular, Twitter is reluctant to give out information about who has unfollowed you. The following API accesses are therefore more strictly limited, and you should take care not to make too many requests.

That being said, let's get our list of followers:

```python
params = {
    "max_results": 1000,
    "user.fields": "id,name,username,created_at,description,profile_image_url,public_metrics,url,verified",
}
followers = client.request(f"users/:{USER_ID}/followers", params)
```

Even though Twitter is a bit reluctant about access to the follower endpoint, we can still increase the number of users returned to 1000. That is good news.

#### Paging results

However, what do we do if you're lucky enough to have more than 1000 followers on Twitter? The `TwitterAPI` package has us covered and provides a `TwitterPager` object for that purpose.

If a request produces more results than fit into one response, the further results are hidden in subsequent responses that can be accessed with a `next_token`. This means that the response is "accompanied" by a link to be followed for the next results. Basically, it works like pages in your favourite online shop: "Page 1 of 453", you get the idea. That's why the tool we need is called a `Pager`. Here is how to use it:

```python
params = {
    "max_results": 200,
    "user.fields": "id,name,username,created_at,description,profile_image_url,public_metrics,url,verified",
}
pager = TwitterPager(client, f"users/:{USER_ID}/followers", params)
followers = list(pager.get_iterator())
```

#### Getting Unfollowers

How do you get the users who unfollowed you on Twitter now? There is no particular API endpoint for getting unfollowers. To circumvent this limitation, we store the result of our follower query on disk. When we make a subsequent request, we can compare the old and new results.
User IDs not present in the old result but present in the new result are *new followers*, and user IDs present in the old result but not present in the new result anymore are *unfollowers*. If you query the followers, let's say, every two hours, you will know who unfollowed you in the meantime. The logic looks like this:

```python
import json

follower_ids = [follower['id'] for follower in followers]

old_followers = json.load(open('last_run.json'))
old_follower_ids = [follower['id'] for follower in old_followers]

new_followers = [follower_id for follower_id in follower_ids
                 if follower_id not in old_follower_ids]
unfollowers = [follower_id for follower_id in old_follower_ids
               if follower_id not in follower_ids]
```

## The Whole Package

The complete source code for our [`stats.py` file is on GitHub](https://gist.github.com/codewithbas/73ee2b8536ff4479e11e464d90763cdb).

## The `TwitterStats` PyPI package

For added convenience, I created a [package on PyPI](https://pypi.org/project/TwitterStats/), which combines all the ideas of this article in one class for fetching all the stats. You can install it as always with `pip`:

```bash
pip install TwitterStats
```

and use it in this simple way:

```python
from twitterstats import Fetcher

consumer_key = "<YOUR API KEY>"
consumer_secret = "<YOUR API KEY SECRET>"
access_token = "<YOUR ACCESS TOKEN>"
access_token_secret = "<YOUR ACCESS TOKEN SECRET>"

fetcher = Fetcher(consumer_key, consumer_secret,
                  access_token, access_token_secret)

promoted_tweets = fetcher.get_tweets(promoted=True)
unpromoted_tweets = fetcher.get_tweets(promoted=False)
followers = fetcher.get_followers()
```

## tl;dr

The `TwitterAPI` package makes it easy to download your tweet and follower/unfollower stats from Twitter. The `TwitterStats` package makes it even more convenient to access these data.

If you enjoyed this article, consider [following me on Twitter](https://twitter.com/bascodes) where I regularly share tips on Python, SQL and Cloud Computing.
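The unfollower-diff logic above can be exercised without any API access at all. Here is a small self-contained sketch with made-up follower snapshots; it uses sets instead of list comprehensions, which makes the membership checks O(1) and the intent arguably clearer:

```python
# Hypothetical follower snapshots - in the real script these come from the
# Twitter API (current run) and from last_run.json (previous run).
old_followers = [{"id": "1"}, {"id": "2"}, {"id": "3"}]
new_snapshot = [{"id": "2"}, {"id": "3"}, {"id": "4"}]

old_ids = {f["id"] for f in old_followers}
new_ids = {f["id"] for f in new_snapshot}

# Set difference expresses the diff directly.
new_followers = new_ids - old_ids   # followed you since the last run
unfollowers = old_ids - new_ids     # unfollowed you since the last run

print(sorted(new_followers))  # ['4']
print(sorted(unfollowers))    # ['1']
```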
bascodes
967,944
Optimize and resize image urls effortlessly - load website faster
The fast and easy way to make your websites load faster Optimize and resize...
0
2022-01-26T03:36:54
https://dev.to/sh20raj/optimize-and-resize-image-urls-effortlessly-load-website-faster-3c8d
javascript, tricks, sh20raj, website
## The fast and easy way to make your websites load faster

### Optimize and resize images effortlessly

---

Statically.io :- [https://statically.io/](https://statically.io/)

YouTube Doc :- [https://youtu.be/uhK1Lh8HMv8](https://youtu.be/uhK1Lh8HMv8)

{% youtube uhK1Lh8HMv8 %}

---

**Quick start**

`GET` https://cdn.statically.io/img/:domain/:image

**Resize image by width (w=:pixel)**

`GET` https://cdn.statically.io/img/:domain/w=:pixel/:image

**Resize image by height (h=:pixel)**

`GET` https://cdn.statically.io/img/:domain/h=:pixel/:image

**Enable auto-WebP (f=auto)**

`GET` https://cdn.statically.io/img/:domain/f=auto/:image

**Adjust quality (q=:percentage)**

`GET` https://cdn.statically.io/img/:domain/q=:percentage/:image

**Combine params**

_Params can be combined with commas after the /img/:domain/ path._

`GET` https://cdn.statically.io/img/:domain/f=auto,w=:pixel/:image
`GET` https://cdn.statically.io/img/:domain/h=:pixel,q=:percentage/:image

**Live demo**

`GET` https://cdn.statically.io/img/staticsite.fun/w=300,h=500/cat.jpg
`GET` https://cdn.statically.io/img/staticsite.fun/f=auto,w=600,h=400/dog.jpg

GurImg - Free Image Hosting :- https://gurimg.sh20raj.repl.co/
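Since these URLs are just string templates, generating them programmatically is straightforward. Below is a hypothetical helper — the `statically_img_url` name and keyword handling are my own for illustration, not part of any official SDK:

```python
def statically_img_url(domain, image, **params):
    """Build a cdn.statically.io image URL.

    Keyword args such as w=300, h=500, q=80, f="auto" are joined with
    commas into a single options segment, as described above.
    """
    base = "https://cdn.statically.io/img"
    if not params:
        return f"{base}/{domain}/{image}"
    # Sort for a deterministic order; the CDN accepts params in any order.
    opts = ",".join(f"{key}={value}" for key, value in sorted(params.items()))
    return f"{base}/{domain}/{opts}/{image}"


print(statically_img_url("staticsite.fun", "cat.jpg", w=300, h=500))
# https://cdn.statically.io/img/staticsite.fun/h=500,w=300/cat.jpg
```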
sh20raj
967,975
How to disable web fonts
This is the quick tip to be able to disable all CSS fonts We had a website which contains a lot of...
0
2022-01-26T06:03:42
https://dev.to/katzueno/how-to-disable-web-fonts-2njj
javascript, chrome, font
This is a quick tip for disabling all CSS fonts.

We had a website which contained a lot of line separator (U+2028) characters.

## How to install

1. Make a bookmark on the bookmark bar (just add anything as a dummy) - the "bookmark bar" is the bar where you see the list of icons right below the address bar
1. Edit the newly created bookmark, replace the URL with the following javascript code & rename it to whatever you like
1. Done

## Javascript

```javascript
javascript:Array.prototype.forEach.call(document.getElementsByTagName("*"), function(e){e.style.fontFamily ="Source Sans Pro"})
```

## How to use

- Visit a page
- Click the bookmark icon that you just made

## Why you need this

If you are using a modern web font, it treats the line separator as a space. However, we had a client whose customers may be using older computers. They were viewing the page without the web font, so line separators showed up as corrupted square characters. We needed to remove those characters, but they are hard to locate unless we disable the web font.

Thank you, user10089632 of StackExchange!

https://superuser.com/questions/1209191/force-chrome-to-use-my-preferred-font-over-the-authors?newreg=20d74beb36514f629acc2f7222727626
katzueno
968,119
Introduction To Appwrite: The Open-Source Firebase Alternative That Is Easy to Self-Host 🚀
Appwrite is a new open-source, end-to-end service that enables developers of front-end and mobile...
0
2022-01-26T08:03:37
https://muthuannamalai.tech/introduction-to-appwrite-the-open-source-firebase-alternative-that-is-easy-to-self-host
opensource, webdev, javascript, beginners
[Appwrite](https://appwrite.io/) is a new [open-source](https://github.com/appwrite/appwrite), end-to-end service that enables developers of front-end and mobile applications to build apps more quickly. Developers can build advanced apps faster with REST APIs and tools that abstract and simplify common development tasks.

In this article, I will walk you through Appwrite: how to install it, how to create a project, its advantages, and more. Without further preamble, let us get into the article.

## What is Appwrite

![appwritelogo.jpg](https://cdn.hashnode.com/res/hashnode/image/upload/v1633280921979/fSh7lzk6z.jpeg)

[Appwrite](https://appwrite.io/) is an end-to-end backend server that simplifies and abstracts the complicated and repetitive process of creating modern apps. With Appwrite, you can build apps a lot faster and much more safely by means of a set of APIs, tools, and a UI for the management console.

Within Appwrite, you will find a wide variety of services, from user authentication and account management to user preferences, database and storage persistence, localization, image manipulation, and scheduled background tasks.

In addition to being cross-platform, Appwrite is technology agnostic, which means it runs on any operating system, coding language, framework, or platform. Despite being a serverless technology, Appwrite is designed to work well in multiple configurations. Appwrite can be integrated directly into your client app, used behind your custom backend, or used alongside your custom backend server.

## How To Install Appwrite

![WhatsApp Image 2021-10-03 at 22.51.23.jpeg](https://cdn.hashnode.com/res/hashnode/image/upload/v1633281761491/xlFQyY_4f.jpeg)

An Appwrite instance can be installed on your local computer or on any cloud provider of your choice. First, you need to install [Docker Desktop](https://docs.docker.com/get-docker/) on your operating system to run an Appwrite instance.
Depending on the operating system you are using, run one of the following commands in your terminal after installing Docker.

For Mac and Linux:

```
docker run -it --rm \
    --volume /var/run/docker.sock:/var/run/docker.sock \
    --volume "$(pwd)"/appwrite:/usr/src/code/appwrite:rw \
    --entrypoint="install" \
    appwrite/appwrite:0.10.2
```

For Windows:

```
docker run -it --rm ^
    --volume //var/run/docker.sock:/var/run/docker.sock ^
    --volume "%cd%"/appwrite:/usr/src/code/appwrite:rw ^
    --entrypoint="install" ^
    appwrite/appwrite:0.10.2
```

After executing the above command, you will be prompted for the port number, as well as other configuration questions. You can accept the default options or modify them to suit your needs.

Once the Docker installation is complete, you can open the Appwrite console by entering the machine's IP address or hostname in your browser. If it does not start right away, try waiting a minute or two. Once it is successfully installed you should see a screen like the one below.

![Appwrite_Login-1024x577.png](https://cdn.hashnode.com/res/hashnode/image/upload/v1633269107911/UZn18zTc3.png)

## How To Create A New Project in Appwrite

Go to your new Appwrite console, and once inside, click the 'Create Project' button on your console homepage. Choose a name for your project and click create to get started.

![appwrite.jpeg](https://cdn.hashnode.com/res/hashnode/image/upload/v1633325569805/-UlA_mYrW.jpeg)

## Walkthrough Of Appwrite Project Dashboard

Once you have created a project as described above, you will land on the project dashboard.

![hero-light.png](https://cdn.hashnode.com/res/hashnode/image/upload/v1633280676051/jm1sMxZvD.png)

### Home:

The Home is the central part. This gives you an overall picture of the project and its activity.

### Database:

Using the Database service, you can create structured collections of documents, query and filter lists of documents, and manage an advanced set of permissions to read and write documents.
JSON documents are used to store all data in the Database service. In addition to nesting child documents in parent documents, Appwrite allows you to search and query data using deep filters.

The Appwrite collection rules define each database document structure in your project. Collection rules help you ensure that all user-submitted data is validated and stored in accordance with the collection structure.

### Storage:

Your project files can be managed through the Storage service. With the Storage service, you can upload, view, download, and query your entire project archive. To manage who has access to view or edit a file within the service, each file is granted read and write permissions.

### Users:

Managing your project users is possible with the Users service. Find your users' information, view current sessions, and view their activity logs with this service. The Users service also allows you to edit your users' preferences and personal information.

### Functions:

By using the Functions service, you can define custom behaviors that are triggered either by Appwrite system events or by a schedule you define. With Appwrite Cloud Functions, backend code can automatically run when Appwrite triggers events, or it can be scheduled to execute at a predefined time. Appwrite maintains your code in a secure way and executes it in an isolated environment.

### Tasks:

Using Appwrite tasks, you can schedule any repeating tasks your app may need to run in the background. Appwrite tasks are defined by setting a CRON schedule and by submitting an HTTP endpoint.

### Webhooks:

With Webhooks, you can create events on Appwrite and set up integrations to subscribe to them. Whenever one of these events occurs, they'll send a POST payload to the URL of the webhook. In addition, webhooks are useful for clearing the cache from the CDN, calculating data, or sending notifications to Slack. Your imagination is the only limit.
### API Keys:

With your API Keys, you can access Appwrite's services via your SDK of choice. You can create an API key by going to the API Keys tab of your project settings in your Appwrite console and clicking 'Add API Key'.

When creating a new API Key, you can choose which permission scope your application should have access to. Allowing only the permissions that are necessary to accomplish your project's goals is a best practice. You can replace your API Key by creating a new one, updating your app credentials, and deleting your old key once you are finished.

## Advantages of Appwrite:

- Great UI
- 100% open source
- Easy to set up
- End-to-end solution
- Consistency across platforms
- Easy to use
- Small learning curve
- Predictable REST API
- Accelerates app development
- Simplicity-first attitude
- Lots of security features
- Built-in file encryption
- Auto SSL certificate generator
- Built-in file scanner
- Webhooks
- Abuse protection
- Built-in anti-virus scanner

## Conclusion

Appwrite has a great advantage over its competitors thanks to its open-source nature, dedicated community, and a founding team committed to timely improvements. If your business is looking to manage core backend needs, then Appwrite should be your go-to option.

What are you waiting for? Go and join their [discord community](https://discord.gg/mn4TRK8GqB) and spread the word about Appwrite.

Happy Appwriting ♥

> You can now extend your support by buying me a Coffee.😊👇

<a href="https://www.buymeacoffee.com/muthuannamalai" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" style="height: 60px !important;width: 217px !important;" ></a>

Thanks for Reading 😊
muthuannamalai12
968,223
Introducing Flexible Sync (Preview) - The Next Iteration of Realm Sync
We are excited to announce the public preview of our next version of Realm Sync: Flexible Sync. This...
0
2022-01-26T10:48:11
https://dev.to/andrewmorgan/introducing-flexible-sync-preview-the-next-iteration-of-realm-sync-ll8
swift, kotlin, mobile, database
We are excited to announce the public preview of our next version of Realm Sync: Flexible Sync. This new method of syncing puts the power into the hands of the developer. Now, developers can get more granular control over the data synced to user applications with intuitive language-native queries and hierarchical permissions.

{% youtube bGaMhy7ns3I %}

## Introduction

Prior to launching the general availability of Realm Sync in February 2021, the Realm team spent countless hours with developers learning how they build best-in-class mobile applications. A common theme emerged—building real-time, offline-first mobile apps requires an overwhelming amount of complex, non-differentiating work.

Our [first version of Realm Sync](https://www.mongodb.com/developer/how-to/realm-partitioning-strategies/) addressed this pain by abstracting away offline-first, real-time syncing functionality using declarative APIs. It expedited the time-to-market for many developers and worked well for apps where data is static and compartmentalized, or where permissions rarely need to change. But for dynamic apps and complex use cases, developers still had to spend time creating workarounds instead of developing new features. With that in mind, we built the next iteration of Realm Sync: Flexible Sync. Flexible Sync is designed to help developers:

- Get to market faster: Use intuitive, language-native queries to define the data synced to user applications instead of proprietary concepts.
- Optimize real-time collaboration between users: Utilize object-level conflict-resolution logic.
- Simplify permissions: Apply role-based logic to applications with an expressive permissions system that groups users into roles on a per-class or collection basis.

## Language-Native Querying

Flexible Sync's query-based sync logic is distinctly different from how Realm Sync operates today.
The new structure is designed to more closely mirror how developers are used to building sync today—typically using GET requests with query parameters. One of the primary benefits of Flexible Sync is that it eliminates all the time developers spend determining what query parameters to pass to an endpoint. Instead, the Realm APIs directly integrate with the native querying system on the developer's choice of platform—for example, a predicate-based query language for iOS, a Fluent query for Android, a string-based query for Javascript, and a LINQ query for .NET.

Under the hood, the Realm Sync thread sends the query to MongoDB Realm (Realm's cloud offering). MongoDB Realm translates the query to MongoDB's query language and executes the query against MongoDB Atlas. Atlas then returns the resulting documents. Those documents are then translated into Realm objects, sent down to the Realm client, and stored on disk. The Realm Sync thread keeps a queue of any changes made locally to synced objects—even when offline. As soon as connectivity is reestablished, any changes made to the server side or client side are synced down using built-in granular conflict resolution logic. All of this occurs behind the scenes while the developer is interacting with the data. This is the part we've heard our users describe as "magic."

Flexible Sync also enables much more dynamic queries, based on user inputs. Picture a home listing app that allows users to search available properties in a certain area. As users define inputs—only show houses in Dallas, TX that cost less than $300k and have at least three bedrooms—the query parameters can be combined with logical ANDs and ORs to produce increasingly complex queries, and narrow down the search result even further. All query results are combined into a single realm file on the client's device, which significantly simplifies code required on the client side and ensures changes to data are synced efficiently and in real time.
### Swift

```swift
// Set your Schema
class Listing: Object {
    @Persisted(primaryKey: true) var _id: ObjectId
    @Persisted var location: String
    @Persisted var price: Int
    @Persisted var bedrooms: Int
}

// Configure your App and login
let app = App(id: "XXXX")
let user = try! await app.login(credentials: .emailPassword(email: "email", password: "password"))

// Set the new Flexible Sync Config and open the Realm
let config = user.flexibleSyncConfiguration()
let realm = try! await Realm(configuration: config, downloadBeforeOpen: .always)

// Create a Query and Add it to your Subscriptions
let subscriptions = realm.subscriptions
try! await subscriptions.write {
    subscriptions.append(QuerySubscription<Listing>(name: "home-search") {
        $0.location == "dallas" && $0.price < 300000 && $0.bedrooms >= 3
    })
}

// Now query the local realm and get your home listings - output is 100 listings
// in the results
print(realm.objects(Listing.self).count)

// Remove the subscription - the data is removed from the local device but stays
// on the server
try! await subscriptions.write {
    subscriptions.remove(named: "home-search")
}

// Output is 0 - listings have been removed locally
print(realm.objects(Listing.self).count)
```

### Kotlin

```kotlin
// Set your Schema
open class Listing : RealmObject() {
    @PrimaryKey
    @RealmField("_id")
    var id: ObjectId = ObjectId()
    var location: String = ""
    var price: Int = 0
    var bedrooms: Int = 0
}

// Configure your App and login
val app = App("<YOUR_APP_ID_HERE>")
val user = app.login(Credentials.emailPassword("email", "password"))

// Set the new Flexible Sync Config and open the Realm
val config = SyncConfiguration.defaultConfig(user)
val realm = Realm.getInstance(config)

// Create a Query and Add it to your Subscriptions
val subscriptions = realm.subscriptions
subscriptions.update { mutableSubscriptions ->
    val sub = Subscription.create(
        "home-search",
        realm.where<Listing>()
            .equalTo("location", "dallas")
            .lessThan("price", 300_000)
            .greaterThanOrEqual("bedrooms", 3)
    )
    mutableSubscriptions.add(sub)
}

// Wait for server to accept the new subscription and download data
subscriptions.waitForSynchronization()
realm.refresh()

// Now query the local realm and get your home listings - output is 100 listings
// in the results
val homes = realm.where<Listing>().count()

// Remove the subscription - the data is removed from the local device but stays
// on the server
subscriptions.update { mutableSubscriptions ->
    mutableSubscriptions.remove("home-search")
}
subscriptions.waitForSynchronization()
realm.refresh()

// Output is 0 - listings have been removed locally
val homesAfterRemoval = realm.where<Listing>().count()
```

### .NET

```csharp
// Set your Schema
class Listing : RealmObject
{
    [PrimaryKey, MapTo("_id")]
    public ObjectId Id { get; set; }
    public string Location { get; set; }
    public int Price { get; set; }
    public int Bedrooms { get; set; }
}

// Configure your App and login
var app = App.Create(YOUR_APP_ID_HERE);
var user = await app.LogInAsync(Credentials.EmailPassword("email", "password"));

// Set the new Flexible Sync Config and open the Realm
var config = new FlexibleSyncConfiguration(user);
var realm = await Realm.GetInstanceAsync(config);

// Create a Query and Add it to your Subscriptions
var dallasQuery = realm.All<Listing>().Where(l => l.Location == "dallas" && l.Price < 300_000 && l.Bedrooms >= 3);
realm.Subscriptions.Update(() =>
{
    realm.Subscriptions.Add(dallasQuery);
});
await realm.Subscriptions.WaitForSynchronizationAsync();

// Now query the local realm and get your home listings - output is 100 listings
// in the results
var numberOfListings = realm.All<Listing>().Count();

// Remove the subscription - the data is removed from the local device but stays
// on the server
realm.Subscriptions.Update(() =>
{
    realm.Subscriptions.Remove(dallasQuery);
});
await realm.Subscriptions.WaitForSynchronizationAsync();

// Output is 0 - listings have been removed locally
numberOfListings = realm.All<Listing>().Count();
```

### JavaScript

```js
import Realm from "realm";

// Set your Schema
const ListingSchema = {
  name: "Listing",
  primaryKey: "_id",
  properties: {
    _id: "objectId",
    location: "string",
    price: "int",
    bedrooms: "int",
  },
};

// Configure your App and login
const app = new Realm.App({ id: YOUR_APP_ID_HERE });
const credentials = Realm.Credentials.emailPassword("email", "password");
const user = await app.logIn(credentials);

// Set the new Flexible Sync Config and open the Realm
const realm = await Realm.open({
  schema: [ListingSchema],
  sync: { user, flexible: true },
});

// Create a Query and Add it to your Subscriptions
await realm.subscriptions.update((mutableSubscriptions) => {
  mutableSubscriptions.add(
    realm
      .objects(ListingSchema.name)
      .filtered("location = 'dallas' && price < 300000 && bedrooms >= 3", {
        name: "home-search",
      })
  );
});

// Now query the local realm and get your home listings - output is 100 listings
// in the results
let homes = realm.objects(ListingSchema.name).length;

// Remove the subscription - the data is removed from the local device but stays
// on the server
await realm.subscriptions.update((mutableSubscriptions) => {
  mutableSubscriptions.removeByName("home-search");
});

// Output is 0 - listings have been removed locally
homes = realm.objects(ListingSchema.name).length;
```

## Optimizing for Real-Time Collaboration

Flexible Sync also enhances query performance and optimizes for real-time user collaboration by treating a single object or document as the smallest entity for synchronization. Flexible Sync allows Sync Realms to share data more efficiently and conflict resolution to incorporate changes faster and with less data transfer.

For example, say you and a fellow employee are analyzing the remaining tasks for a week. Your coworker wants to see all of the time-intensive tasks remaining (`workunits > 5`), and you want to see all the tasks you have left for the week (`owner == ianward`). Your queries will overlap where `workunits > 5` and `owner == ianward`. If your coworker notices one of your tasks is marked incorrectly as `7 workunits` and changes the value to `6`, you will see the change reflected on your device in real time. Under the hood, the merge algorithm will only sync the changed document instead of the entire set of query results, increasing query performance.

![Venn diagram showing that 2 different queries can share some of the same documents](https://mongodb-devhub-cms.s3.us-west-1.amazonaws.com/Interesecting_Tasks_380cf4962d.png)

## Permissions

Whether it's a company's internal application or an app on the App Store, permissions are required in almost every application. That's why we are excited by how seamless Flexible Sync makes applying a document-level permission model when syncing data—meaning synced documents can be limited based on a user's role.

Consider how a sales organization uses a CRM application. An individual sales representative should only be able to access her own sales pipeline, while her manager needs to be able to see the entire region's sales pipeline.
In Flexible Sync, a user's role will be combined with the client-side query to determine the appropriate result set. For example, when the sales representative above wants to view her deals, she would send a query where `opportunities.owner == "EmmaLullo"`, but when her manager wants to see all the opportunities for their entire team, they would query with `opportunities.team == "West"`. If a user sends a much more expansive query, such as querying for all opportunities, then the permissions system would only allow data to be synced for which the user has explicit access.

```json
{
  "Opportunities": {
    "roles": [
      {
        "name": "manager",
        "applyWhen": { "%%user.custom_data.isSalesManager": true },
        "read": { "team": "%%user.custom_data.teamManager" },
        "write": { "team": "%%user.custom_data.teamManager" }
      },
      {
        "name": "salesperson",
        "applyWhen": {},
        "read": { "owner": "%%user.id" },
        "write": { "owner": "%%user.id" }
      }
    ]
  },
  "Bookings": {
    "roles": [
      {
        "name": "accounting",
        "applyWhen": { "%%user.custom_data.isAccounting": true },
        "read": true,
        "write": true
      },
      {
        "name": "sales",
        "applyWhen": {},
        "read": { "%%user.custom_data.isSales": true },
        "write": false
      }
    ]
  }
}
```

## Looking Ahead

Ultimately, our goal with Flexible Sync is to deliver a sync service that can fit any use case or schema design pattern imaginable without custom code or workarounds. And while we are excited that Flexible Sync is now in preview, we're nowhere near done. The Realm Sync team is planning to bring you more query operators and permissions integrations over the course of 2022. Up next, we are looking to expose array operators and enable querying on embedded documents, but really, we look to you, our users, to help us drive the roadmap. Submit your ideas and feature requests to our [feedback portal](https://feedback.mongodb.com/forums/923521-realm) and ask questions in our [Community forum](https://www.mongodb.com/community/forums/c/realm/realm-sync/111).

Happy building!
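To build intuition for the role-based permissions described above: conceptually, a user is assigned the first role whose `applyWhen` conditions all match, with `%%user.*` placeholders expanded against the user document. The following Python sketch is my own simplification for illustration only — it is not Realm's actual server-side permission engine, and the helper names are hypothetical:

```python
def expand(value, user):
    """Expand a "%%user.<path>" placeholder against the user document;
    return plain values unchanged."""
    if isinstance(value, str) and value.startswith("%%user."):
        node = user
        for part in value[len("%%user."):].split("."):
            node = node[part]
        return node
    return value


def resolve_role(roles, user):
    """Return the first role whose applyWhen conditions all hold for the user.

    Keys in applyWhen are "%%user.*" paths; an empty applyWhen matches anyone,
    so a catch-all role goes last, as in the rules above.
    """
    for role in roles:
        if all(expand(key, user) == expected
               for key, expected in role["applyWhen"].items()):
            return role
    return None


opportunity_roles = [
    {"name": "manager",
     "applyWhen": {"%%user.custom_data.isSalesManager": True}},
    {"name": "salesperson", "applyWhen": {}},
]

manager = {"id": "u1", "custom_data": {"isSalesManager": True}}
rep = {"id": "u2", "custom_data": {"isSalesManager": False}}

print(resolve_role(opportunity_roles, manager)["name"])  # manager
print(resolve_role(opportunity_roles, rep)["name"])      # salesperson
```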
andrewmorgan
968,414
.NET MAUI Closer Than Ever (Discover +5 New Features)
Just two months ago that, after many previews, pre-releases, leaks (and a lot of waiting) Microsoft...
18,561
2022-01-26T13:20:00
https://www.bytehide.com/blog/dotnet-maui-features-performance?id=2
csharp, dotnet, programming, news
Just two months ago, after many previews, pre-releases, leaks (and a lot of waiting), Microsoft **officially released the new version of its fastest full-stack web framework: .NET 6**.

As many of us already know, Microsoft left out many of the features that came out in the Previews since, according to them, **they were not ready to be officially released**.

The response from the .NET developer community **has been super positive**, despite not all the promising features being released. There is a small chance that **Microsoft expected a better response from the entire .NET developer community** and apparently **wants to fix this situation**.

We have just started the new year and Microsoft is getting serious: **it has released new Previews** of the long-awaited **.NET MAUI**.

For those who don't know, **.NET MAUI is a cross-platform framework** for creating all kinds of native mobile applications with C# and XAML. It allows you to develop applications for **iOS** and **macOS**, as well as for **Android** and **Windows**.

And at this point you may be wondering...

> Isn't Xamarin already there for mobile application development?

**Yes**, there is already Xamarin for that, but **Microsoft calls .NET MAUI the "evolution" of Xamarin**.

> Like Pokemon? Yes

![Source: https://www.pinterest.es/pin/770748923700305371/](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/l359k2nx3q3r60c5edd5.gif)

To get into the topic: in the Previews of .NET MAUI released by Microsoft, **some very interesting features are mentioned** - which, hopefully, will be officially released.
**So let's talk about them in depth!**

---

## Windows Control Styling (Fluent Design System)

Here, [David Ortinau](https://github.com/davidortinau) - the Principal Program Manager for .NET MAUI - tells us that they have **added design changes**, now with a **more fluent interface**, **new controls** and **support for themes**.

The main updates are:

- [Button](https://github.com/dotnet/maui/pull/3363)
- [Editor](https://github.com/dotnet/maui/pull/3444)
- [Entry](https://github.com/dotnet/maui/pull/3444)

![source: https://devblogs.microsoft.com/dotnet/announcing-dotnet-maui-preview-11/](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/q9wfmq00yox928e2plf0.png)

---

## Multi-window Apps

Here is one of the **most important** features of this Preview: with .NET MAUI **you can now add multiple windows**.

`Application.Current.Windows` contains the references to the created windows, and you can **open a new window** simply with `Application.Current.OpenWindow`.

Let's look at the [Microsoft example](https://devblogs.microsoft.com/dotnet/announcing-dotnet-maui-preview-11/#multi-window-apps):

```csharp
var secondWindow = new Window
{
    Page = new MySecondPage
    {
        // ...
    }
};

Application.Current.OpenWindow(secondWindow);
```

![Source: https://devblogs.microsoft.com/dotnet/announcing-dotnet-maui-preview-11/](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/zfblxx984l191xp318lx.png)

---

## Templates and C# 10

I think the phrase with which Microsoft introduces [this feature](https://devblogs.microsoft.com/dotnet/announcing-dotnet-maui-preview-11/#templates-and-c-10) is perfect:

> "Simplification is one of the main goals of .NET MAUI"

To that end, they have **updated the templates to use C# 10 patterns**: both **file-scoped namespaces** and **implicit usings**. Besides, they have added new element templates, which will save us (_the developers_) a lot of work.
Let's see how it works (again I find the Microsoft example perfect): ```csharp namespace Preview11; public static class MauiProgram { public static MauiApp CreateMauiApp() { var builder = MauiApp.CreateBuilder(); builder .UseMauiApp<App>() .ConfigureFonts(fonts => { fonts.AddFont("OpenSans-Regular.ttf", "OpenSansRegular"); }); return builder.Build(); } } ``` At this point you will think: > Where are the using statements? Relax, don't worry - _**Microsoft says that**, not me._ From now on, with this new update, **[implicit global usings](https://docs.microsoft.com/es-mx/dotnet/csharp/fundamentals/types/namespaces)** will be used to **gather them dynamically.** --- ## Shell and Dependency Injection In this new feature Microsoft tells us a bit about ``HostBuilder``. With this, .NET MAUI will have **powerful dependency injection**. What _Shell_ mainly offers is an **incredible templating and styling capability** that lets us achieve what we need (or what we require at that moment) in a very simple way. To see it in a more practical way, let's look at [this example](https://devblogs.microsoft.com/dotnet/announcing-net-maui-preview-12/#shell-and-dependency-injection): As Microsoft tells us, we must define the dependencies in the DI container. ```csharp public static class MauiProgram { public static MauiApp CreateMauiApp() { var builder = MauiApp.CreateBuilder(); builder .UseMauiApp<App>(); builder.Services .AddSingleton<MainViewModel>(); return builder.Build(); } } ``` And after that, we simply request the previously defined dependencies in the constructor of the page where we want them injected: ```csharp public partial class MainPage { readonly MainViewModel _viewModel; public MainPage(MainViewModel viewModel) { InitializeComponent(); BindingContext = _viewModel = viewModel; } } ``` Very simple, right? Let's continue --- ## New Navigation in .NET MAUI (Shell) So what is Shell? Shell (also known as ``AppShell``) **facilitates simple application designs**, especially with **tabs and drop-down menus**. 
Simply **add the pages to your app** and then **arrange them as you like** within the navigation structure. Let's see how nice it looks: ![Source: https://devblogs.microsoft.com/dotnet/announcing-net-maui-preview-12/](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/oqqsk6ovon9eproy95ts.png) To be honest, I think **these new features are pretty cool**. Considering that we've just started the year (and we've been looking forward to this ever since the release of .NET 6), Microsoft plans to **officially release .NET MAUI in Q2 2022**. ### And you, my dear reader? - Do you think there will be more .NET MAUI Previews? - What features would you like to see implemented? - Do you think it will finally happen and it will be officially released this year? Tell me in the comments, I would like to know more about it. You are always reading what I write; now I want you to share what you think, and I will gladly read and respond in the comments🤗
bytehide
968,441
Monorepos. 2022
When was the last time when you were thinking about “What’s going on with monorepos now”? Is Lerna...
0
2022-01-26T16:00:57
https://dev.to/etc088/monorepos-2022-1408
beginners, javascript, webdev, tutorial
When was the last time you thought "What's going on with monorepos now"? Is [Lerna](http://lerna.js.org) still the best choice, or could just [Yarn Workspaces](https://classic.yarnpkg.com/lang/en/docs/workspaces/) be enough? I asked myself this a couple of days ago and I have something to share with you. --- ### So let's get started. First of all, what does `Monorepo` mean? [Wikipedia says](https://en.wikipedia.org/wiki/Monorepo) > In version control systems, a monorepo ("mono" meaning 'single' and "repo" being short for 'repository') is a software development strategy where code for many projects is stored in the same repository --- ### When might you need monorepos in your dev life? Imagine the situation: * You have several libraries, projects, etc. * All of them are stored in separate git repos. * Some of these packages should be dependencies of your other packages. * And you still continue to develop your packages (some of them may not work correctly in integration with the others, etc.) *So, the ordinary dev's reality* :) --- ### What could you do in such a case? 1. *The worst scenario* 👎 You can publish your "medium rare" packages. Include them in the package.json of your main project as dependencies. Then, when you fix something, you republish them, bump the version in package.json, run `yarn install` again… Repeat as many times as you find bugs. *If you ever thought about this workflow, keep reading, please* 👉 2. *Good scenario* 👋 You can use [yarn link](https://classic.yarnpkg.com/en/docs/cli/link). In that case, you can test your packages working in integration without publishing them every time you fix bugs. What is the main issue here? If you have a lot of packages, you can simply miss something. You also need to remember to run [yarn unlink](https://classic.yarnpkg.com/en/docs/cli/unlink) after you are done with the fixes and have published your stuff. 
Also, you need to pay attention to which version you are using right now, the published or the linked one. Yes, your editor may show you which package is linked right now in `node_modules`, but still, it's risky to miss something. It's hard to manage everything when you have many packages and cross dependencies. 3. *The best scenario (in my opinion)* 👍 You can use monorepos. All your packages will be stored in one git repository, with their own package.json's, linked with each other, etc. It's easy to develop, easy to manage. When your development is done, all your stuff can be published at once or package-by-package. No need for `yarn link/unlink`, etc. So, in short, we may want to use monorepos when: * We want somewhat simpler codebase management * We have a huge codebase with cross dependencies between packages * We have several applications and the packages used inside every application * We don't want to deal with `yarn link`, etc., if we need to fix something inside a package * We want flexible versioning, publishing, and changelog generation * Just because we can :) --- ### Let's see in short what projects we can use for monorepos in 2022 #### yarn/npm Workspaces [yarn workspaces](https://classic.yarnpkg.com/en/docs/cli/workspaces) [npm workspaces](https://docs.npmjs.com/cli/v7/using-npm/workspaces) It's a low-level primitive used by other tools. Actually, if you just need to link and install dependencies, it will be enough. 
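As a quick illustration of that minimal setup, a root `package.json` for Yarn/npm workspaces could look something like this (the project name and folder globs are just an example, not prescribed by the tools):

```json
{
  "name": "my-monorepo",
  "private": true,
  "workspaces": [
    "packages/*",
    "apps/*"
  ]
}
```

With this in place, a single `yarn install` (or `npm install`) at the root links all the local packages together in `node_modules`, so a package under `apps/` can depend on one under `packages/` without any `yarn link`.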
+ Link dependencies + + Run commands across the packages + - Automated packages publishing (we can run the publish command one-by-one in each package, or write your own automation) — - Automated version managing — #### Lerna (⭐️ 31.4k) [More info](http://lerna.js.org) + Link dependencies + + Run commands across the packages + + Automated packages publishing + + Automated version managing + + Caching — #### Turborepo (⭐️ 5.3k) [More info](http://turborepo.org) + Link dependencies + + Run commands across the packages + + Automated packages publishing (can be done via [complimentary tools](https://turborepo.org/docs/guides/complimentary-tools)) — + Automated version managing (can be done via [complimentary tools](https://turborepo.org/docs/guides/complimentary-tools)) — + Caching +++ #### NX (⭐️ 10.3k) [More info](http://nx.dev) + More than just a monorepo tool + * Automated packages publishing — * Automated version managing — * Caching +++ * Not as easy as wanted — #### BIT (⭐️ 14.6k) [More info](http://bit.dev) + More than just a monorepo tool + + Good for micro frontends + --- ### Before we go further I want you to give NX a try. * The project has a great codegen tool and provides a consistent dev experience for any framework * When you want to add a package to the monorepo, NX will ask you what it should be. For example, a [React](https://reactjs.org) application, React library, or React component. Based on your choices, NX will generate a proper config for each package inside your monorepo * It works extremely fast I have a feeling that NX is something great! Maybe I just didn't find use cases for such a multitool in my project and stopped diving deeper. Also, it was a little hard to make my project work with NX. 
But it’s only my experience ;) --- ### So, for me (and my project), the best tool is TURBOREPO Here is the list of benefits which are important for me (you may find the whole list in official [documentation](https://turborepo.org/docs)) **Incremental builds** Building once is painful enough, Turborepo will remember what you’ve built and skip the stuff that’s already been computed. **Content-aware hashing** Turborepo looks at the contents of your files, not timestamps to figure out what needs to be built. **Remote Caching** Share a remote build cache with your teammates and CI/CD for even faster builds. **Parallel execution** Execute builds using every core at maximum parallelism without wasting idle CPUs. **Task pipelines** Define the relationships between your tasks and then let Turborepo optimize what to build and when. **Profile in your browser** Generate build profiles and import them in Chrome or Edge to understand which tasks are taking the longest. --- ### Finally, lets see the digits. *Previously I used Lerna as a monorepo engine. So I will compare Lerna speed vs Turborepo* *My monorepo contains one React application and 13 packages. One of the packages is the UI Library Kit which contains 102 React components* `build-packages` — *build all packages except the main application* `build-app` — *build all packages and the main application* ![Lerna vs Turborepo speed](https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8vl5bj9ulg0xmc21nbrp.jpeg) --- Okay. It was a small description of what’s going on with monorepos in 2022 from my point of view. Hope the information was helpful for someone. Let me know what you guys think in the comments :)
etc088
968,503
#reTerminal – Installing TensorFlow, @Code and recognizing 🐿️🐺 using a Camera 🤳
Hi ! Let’s start with some posts using reTerminal with a very simple scenario: Install TensorFlow and...
0
2022-01-26T14:57:59
https://dev.to/elbruno/reterminal-installing-tensorflow-code-and-recognizing-using-a-camera-3dne
englishpost, opencv, reterminal
--- title: #reTerminal – Installing TensorFlow, @Code and recognizing 🐿️🐺 using a Camera 🤳 published: true date: 2022-01-11 20:40:04 UTC tags: EnglishPost,OpenCV,reTerminal canonical_url: --- Hi ! Let's start with some posts using reTerminal with a very simple scenario: Install TensorFlow and Code on reTerminal to run an object recognition app. The desired output is something similar to this one. Live demo in a tweet ! And let's start with the base commands to install Visual Studio Code. Official VSCode… [Continue reading #reTerminal – Installing TensorFlow, @Code and recognizing 🐿️🐺 using a Camera 🤳](https://elbruno.com/2022/01/11/reterminal-installing-tensorflow-code-and-recognizing-%f0%9f%90%bf%ef%b8%8f%f0%9f%90%ba-using-a-camera-%f0%9f%a4%b3/)
elbruno
968,506
#AzureFunctions ⚡- Pip and TLS/SSL error on Debugging with #Python 🐍
Hi! There is something wrong with my current development environment. I have an Azure Function with...
0
2022-01-28T14:42:00
https://dev.to/elbruno/azurefunctions-pip-and-tlsssl-error-on-debugging-with-python-2a9p
englishpost, azurefunctions, errors
--- title: #AzureFunctions ⚡- Pip and TLS/SSL error on Debugging with #Python 🐍 published: true date: 2022-01-18 14:00:00 UTC tags: EnglishPost,AzureFunctions,Error canonical_url: --- Hi! There is something wrong with my current development environment. I have an Azure Function with these dependencies: And I got this error when I pressed F5 / ran the project in local mode. By default, Azure Functions works with a Python virtual environment. This helps to isolate our current dev environment, and also to… [Continue reading #AzureFunctions ⚡- Pip and TLS/SSL error on Debugging with #Python 🐍](https://elbruno.com/2022/01/18/azurefunctions-%e2%9a%a1-pip-and-tls-ssl-error-on-debugging-with-python-%f0%9f%90%8d/)
elbruno
968,602
i18n style
How to organise your translation files?
0
2022-01-26T16:46:20
https://dev.to/razbakov/i18n-style-48me
i18n, localization, webdev, vue
--- title: i18n style published: true description: How to organise your translation files? tags: i18n, localization, webdev, vue cover_image: https://images.unsplash.com/photo-1539632346654-dd4c3cffad8c?ixlib=rb-1.2.1&ixid=MnwxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHx8&auto=format&fit=crop&w=1470&q=80 --- ### Style Guide - key pattern: `<page>[.section].<component>[.attribute]` - `camelCase` for multiple words ### Background What are the possible ways to map strings in JSON? You would probably start with a plain object: ```json { "Login": "Login", "Logout": "Logout", "New post": "New post" } ``` And you will create the same file for each language, just changing the values with translations. But what if you decide later to change `New post` to `Add post`? You will need to change the key in all language files and in all source code files. An alternative way is to use more abstract keys, which gives you more flexibility. For example: ```json { "login": "Login", "logout": "Logout", "new": "New post" } ``` And what if you now have another feature: `Add event`? Your alternatives are: 1) make complex keys 2) group by meaning Complex-word keys would be: ```json { "login": "Login", "logout": "Logout", "newPost": "New post", "newEvent": "New event" } ``` And what if now you have a login screen, which has a title, a subtitle, 2 fields and a submit button? You might do this: ```json { "loginTitle": "Login", "loginSubtitle": "Please login to continue", "loginSubmit": "Continue", "logout": "Logout", "newPost": "New post", "newEvent": "New event" } ``` And what if you have a registration screen which has similar elements? ```json { "loginTitle": "Login", "loginSubtitle": "Please login to continue", "loginSubmit": "Continue", "registerTitle": "Registration", "registerSubtitle": "Create new account", "registerSubmit": "Start", "logout": "Logout", "newPost": "New post", "newEvent": "New event" } ``` As you can see, the translation file grows quickly. 
You can make life easier for developers and translators by grouping keys: ```json { "login": { "title": "Login", "subtitle": "Please login to continue", "submit": "Continue" }, "register": { "title": "Registration", "subtitle": "Create new account", "submit": "Start" }, "logout": "Logout", "post": { "new": "New post" }, "event": { "new": "New event" } } ``` When grouping elements, look for similarities: what those elements have in common and how it would scale. An input element can have a label, a placeholder, an error. Those are attributes of that element, so it makes sense to group the values by element name, e.g. in our login screen: ```json { "login": { "title": "Login", "subtitle": "Please login to continue", "submit": "Continue", "username": { "label": "Enter your username", "placeholder": "JohnDoe", "error": "Username is a required field" } } } ``` But what if there are more error messages later? Say we need to add an error message for complexity validation (e.g. "Please use numbers, letters, special symbols"). Both are errors, so we would group them under errors. ### How does this look in YML? YML looks similar to JSON, just without curly brackets: ```yml login: title: Login subtitle: Please login to continue submit: Continue username: label: Enter your username placeholder: JohnDoe error: Username is a required field ``` or you can also do it per line: ```yml login.title: Login login.subtitle: Please login to continue login.submit: Continue login.username.label: Enter your username login.username.placeholder: JohnDoe login.username.error: Username is a required field ``` The last one has a few benefits: - It's easier to review PRs with the whole context, not just some part of a bigger object - It's easier to find a string from the `t()` function in the translations But you could also mix up `login` and `login.title` and destroy the object without even noticing it. 
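To make the lookup concrete, here is a minimal sketch (not a real i18n library — just an illustration of the idea) of a `t()` function that resolves dotted keys like `login.title` against a nested translations object:

```javascript
// Nested translations, as in the grouped JSON above.
const messages = {
  login: {
    title: "Login",
    subtitle: "Please login to continue",
    submit: "Continue",
  },
  logout: "Logout",
};

// Walk the object one segment at a time: "login.title" -> messages.login.title
function t(key) {
  return key
    .split(".")
    .reduce((node, part) => (node ? node[part] : undefined), messages);
}

t("login.title"); // → "Login"
t("logout"); // → "Logout"
t("login.oops"); // → undefined — a wrong key fails silently
```

This also shows the danger of the per-line form: if a file defines both a plain `login` string and `login.*` keys, parsing it into one object silently overwrites one with the other.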
### More on this topic: - https://dev.to/omaiboroda/three-ways-to-name-i18n-translation-keys-2fed - https://medium.com/frontmen/web-internationalisation-i18n-lessons-ive-learned-the-hard-way-438a3e157e0 - https://lokalise.com/blog/translation-keys-naming-and-organizing/ - https://phrase.com/blog/posts/ruby-lessons-learned-naming-and-managing-rails-i18n-keys/
razbakov