| text | language |
|---|---|
{"appid": 345710, "name": "<NAME>", "windows": true, "mac": false, "linux": false, "early_access": true, "lookup_time": 1490981186} | json |
{"2010":"","2016":"","Department":"Тернопільський міськрайонний суд Тернопільської області","Region":"Тернопільська область","Position":"Суддя Тернопільського міськрайонного суду Тернопільської області","Name":"<NAME>","Link":"http://court.gov.ua/lustration/83b8e773a5519eb94c62b7300199c7df.pdf","Note":"","AdditionalNote":"","декларації 2015":"","Youtube":"","ПІБ2":"","Кількість справ":"","Оскаржені":"","Кількість скарг":"4","Кількість дисциплінарних стягнень":"","Клейма":"","Фото":"","Як живе":"","Декларація доброчесності судді подано у 2016 році (вперше)":"","Декларація родинних зв’язків судді подано у 2016 році":"","key":"stelmashchuk_petro_yaroslavovich","field8":"","Link 2015":"","field9":"","Декларації 2013":"","Декларації 2014":"","Декларації 2015":"","Декларації 2016":"","type":"judge","analytics":[{"y":2014,"i":181142,"k":44,"ka":1,"fc":2,"fh":114,"fha":1,"fl":880,"fla":1},{"y":2015,"b":1357,"m":392658.96,"i":252760,"k":44,"ka":1,"fc":2,"ff":47,"ffa":1,"fh":1.14,"fha":1,"fl":880,"fla":1,"j":4},{"y":2017,"b":2979,"m":392658.96,"i":268210,"k":44,"ka":1,"fi":220250,"fc":1,"fh":114,"fha":1,"fl":880,"fla":1}],"declarationsLinks":[{"id":"vulyk_31_19","year":2014,"url":"http://static.declarations.com.ua/declarations/chosen_ones/mega_batch/stelmashchuk_petro_iaroslavovych.pdf","provider":"declarations.com.ua.opendata"},{"id":"nacp_5a4e7ed8-3331-4448-83af-8f06c60cc8da","year":2015,"provider":"declarations.com.ua.opendata"},{"id":"nacp_f4135ea9-52a1-4533-9b57-9ece9008cc24","year":2017,"provider":"declarations.com.ua.opendata"}]} | json |
<filename>kun-commons/kun-commons-web/src/test/java/com/miotech/kun/commons/web/serializer/JsonSerializerTest.java
package com.miotech.kun.commons.web.serializer;
import com.google.common.collect.Maps;
import com.google.inject.Guice;
import com.google.inject.util.Types;
import com.miotech.kun.commons.utils.IdGenerator;
import com.miotech.kun.commons.web.mock.MockObject;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import java.io.ByteArrayInputStream;
import java.util.Arrays;
import java.util.List;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.instanceOf;
public class JsonSerializerTest {
private JsonSerializer jsonSerializer;
private String runTaskJson;
private String runTasksJson;
@BeforeEach
public void setUp() {
jsonSerializer = Guice.createInjector().getInstance(JsonSerializer.class);
MockObject mockObject = new MockObject();
mockObject.setId(IdGenerator.getInstance().nextId());
mockObject.setConfig(Maps.newHashMap());
runTaskJson = jsonSerializer.toString(mockObject);
runTasksJson = jsonSerializer.toString(Arrays.asList(mockObject));
}
@Test
public void testToObject_list() {
ByteArrayInputStream byteArrayInputStream = new ByteArrayInputStream(runTasksJson.getBytes());
Object mockObject = jsonSerializer.toObject(byteArrayInputStream, Types.listOf(MockObject.class));
assertThat(mockObject, instanceOf(List.class));
assertThat(((List<?>) mockObject).get(0), instanceOf(MockObject.class));
}
@Test
public void testToObject_object() {
ByteArrayInputStream byteArrayInputStream = new ByteArrayInputStream(runTaskJson.getBytes());
Object runTaskVO = jsonSerializer.toObject(byteArrayInputStream, MockObject.class);
assertThat(runTaskVO, instanceOf(MockObject.class));
}
}
| java |
Uber generates lots of revenue. Everyone knows it. The company’s roughly 20 percent cut of ride spend on its platform adds up to big numbers. How big, of course, is the real question.
New data published by Business Insider helps put Uber’s revenue into perspective. However, before we get to the new sums, let’s rewind. Here’s TechCrunch in December of 2013, reporting on leaked Uber revenue figures — a story that Valleywag’s Nitasha Tiku broke — from October of that year, emphasis is mine:
The numbers span a period of between mid-October and mid-November of 2013 and allow us to form a picture, though incomplete, of Uber’s income and user statistics over the period. According to our calculations based on the information laid out in the dashboard screenshots — and assuming some similarity in numbers for the rest of the year — the car service should be pulling in over $1B a year in gross bookings. At a rough 20% cut, a figure Valleywag notes Kalanick has alluded to, that would place Uber’s slice of the revenue around $213M a year.
So, Uber at the time was doing rides at about a $1 billion run rate, and bringing in a more than $200 million yearly cut. Damn fine numbers, really.
Cut to today, when Business Insider published data from Uber’s December 2013, just one month after the period the figures above cover. It was a big month for the firm, with New Year’s Eve revenue spiking year-over-year and many markets doing high volumes of rides.
Here’s how the Business Insider article frames it:
If you assume annual run rate based on the December 2013 data, Uber’s top five markets (NYC, DC, San Francisco, Chicago and Los Angeles) would generate about $1 billion a year. Again, that’s without taking Uber’s growth and expansion into account during 2014. That’s in line with rumored revenue estimates for Uber, which suggest Uber will generate $1.5-2 billion of revenue this year.
Revenue isn’t revenue isn’t revenue, however. Just ask Tesla. Here’s the story’s first paragraph:
Uber is a four-year-old mobile ride service company that could soon generate $10 billion of revenue per year.
That is a huge number, certainly. But which revenue number is it, $1.5 billion to $2 billion, or $10 billion? Well, none of the above, I don’t think. Keep in mind the data from October that TechCrunch covered: If Uber was on track to make just over $200 million in revenue for itself on a yearly basis in October of 2013, how by December of the same year was it prepped to generate company-facing top line of $1.5 billion, let alone $2 billion, or $10 billion?
You pay Uber for your rides. It then takes a cut of that total spend, and that’s its take-home revenue. You can try and count the total spend on its platform as its revenue, but that’s as silly as Groupon trying to claim the full sale-price of coupons as its own top line. Ask Andrew Mason how well that went.
So Uber’s 20 percent is its revenue, and on a run-rate basis, that figure was nowhere near the billion dollar mark mere months before it supposedly was. I think that we are seeing a conflation of the term ‘revenue’ here.
If you rewind for a moment, and scroll back to the initially leaked Uber slides, what was labeled as “revenue” was in fact considered to be total platform spend. To refresh you:
That $20 million or so per week is around $1 billion per year, given the number of weeks per annum. That’s how we treated what Uber called “revenue” in the past. Now, we appear to be willing to take what Uber called revenue mere months after the above, and presume that it means actual income for the company, and not total platform spend.
Again, revenue isn’t revenue isn’t revenue.
Unless Uber saw a growth spurt so fucking strong between those two months that it grew several-fold, we’re misinterpreting revenue. So, that means that Uber’s prior $20 million plus weekly revenue run rate wasn’t platform spend, but was instead its cut, and thus Uber was rocking a $5 billion platform spend run rate last October.
Well, that or we’re just getting this confused today. Uber’s growth of 20 or 30 percent from October to a perhaps busier December isn’t outside the realm of the reasonable. But here’s Business Insider on the December data:
In San Francisco, Uber generated $17.7 million, a run rate of about $213 million.
This implies that the company’s San Francisco revenue that month was roughly commensurate with its total forward revenue, calculated from October of that year. Or: That Uber’s ‘revenue’ in its slides concerning October was in fact total platform spend, and thus that Uber will likely collect around $40 million in revenue from the San Francisco market this year, extrapolating on a forward twelve-month run-rate basis from the new figures.
So, Uber will see well north of $1 billion in total platform spend this year. Perhaps even $2 billion, or $3 billion if things go well. That would put its own revenue in the mid hundreds of millions.
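The run-rate math running through this piece is simple annualization plus a take-rate multiplication. As a sketch (all dollar figures are the article’s rough, illustrative numbers, not exact company data):

```python
# Back-of-envelope run-rate arithmetic from the figures above.
# Inputs are illustrative, taken from the article's rough numbers.

WEEKS_PER_YEAR = 52
MONTHS_PER_YEAR = 12

def annualize_weekly(weekly):
    """Turn a weekly figure into a yearly run rate."""
    return weekly * WEEKS_PER_YEAR

def annualize_monthly(monthly):
    """Turn a monthly figure into a yearly run rate."""
    return monthly * MONTHS_PER_YEAR

def take(gross, rate=0.20):
    """Uber's own revenue is its take rate applied to platform spend."""
    return gross * rate

# ~$20M/week in gross bookings (the October 2013 leak)
gross_yearly = annualize_weekly(20_000_000)   # ~ $1.04B platform spend per year
uber_revenue = take(gross_yearly)             # ~ $208M company-facing revenue

# San Francisco, December 2013: $17.7M for the month
sf_run_rate = annualize_monthly(17_700_000)   # ~ $212M, matching BI's ~$213M
```

The point of the sketch: a ~$213M figure only appears once per interpretation, so if October’s “revenue” was really the 20 percent cut, platform spend would have been ~$5B, which December’s city-level numbers contradict.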
But keep in mind that Twitter is going to generate around $1.4 billion in revenue this year, and is worth $25.3 billion. If Uber was set to spank Twitter’s top line, why would it be worth less?
Oh, and if you count total platform spend as Uber’s revenue, it’s going to have shit GAAP margins. Something to think about.
| english |
Moscow, April 18: External Affairs Minister Sushma Swaraj on Monday said that nations cannot afford to adopt double standards in the fight against terrorism, even as she took up with the Chinese foreign minister the issue of Beijing vetoing New Delhi's move in the UN to ban Jaish-e-Mohammed (JeM) chief Masood Azhar.
"If we continue to adopt double standards in dealing with terrorism, it will have serious consequences not just for our own countries, but the international community as a whole," Sushma Swaraj said while addressing the 14th Russia-India-China (RIC) Foreign Ministers' Meeting here.
Sushma Swaraj called on the RIC countries to lead the way in getting the international community together to counter terrorism through joint action, including at the UN.
Raising the issue of UN Security Council reform, Sushma Swaraj said it demanded greater urgency and sought the support of Russia and China for this.
"There has been some positive movement in this connection with a text being put for the first time on the table for inter-governmental negotiations. However, we need greater urgency on this issue. I seek the support of my Chinese and Russian colleagues for taking the IGN (inter-governmental negotiations) process forward," she said.
On the slowdown of the global economy, the Indian minister said that as three large, emerging economies, India, Russia and China shared similar approaches and could benefit "from coordinating our positions".
The minister also sought the active participation of all members in the BRICS (Brazil, Russia, India, China, South Africa) summit to be held in Goa in October this year.
India assumed the BRICS chairmanship this year.
"We hope to have a very successful BRICS Summit in Goa in October," Sushma Swaraj said.
In a media statement following the RIC meeting, she said that terrorism remained the foremost threat to international peace and security.
"I stressed the need to craft an effective global strategy to counter terrorism, including at the UN. I look forward to working with both my colleagues, in this regard," she said.
Answering a question in a press conference after the RIC meeting, she said that all three countries -- India, Russia and China -- have been victims of terrorism.
"It is natural for us that we unite to lead the world in our fight against terrorism," Sushma Swaraj said.
In her media statement, she also said that she and the Russian and Chinese foreign ministers had a productive exchange of views on the situation in the Middle East.
"We also discussed the situation in Afghanistan and agreed that it was important for the international community to remain engaged and support the Afghanistan government in its development and reconciliation efforts and in defeating terrorist forces," she said.
Sushma Swaraj started her day with a bilateral meeting with Chinese Foreign Minister Wang Yi, during which she raised the issue of China vetoing India's bid at the UN to ban JeM chief Masood Azhar.
India approached the UN in February to include Azhar in the UN Security Council's 1267 sanctions list, in the aftermath of the Pathankot airbase terror attack. In the strike in early January by JeM terrorists, seven Indian security personnel were killed.
However, China asked the UN sanctions committee to keep the Indian move on hold, saying that Azhar did not meet the UN criteria to be banned as a terrorist.
"I would also like to tell you that in the morning today, I met Chinese Foreign Minister Wang Yi, and in that meeting also I said to him that if we want to fulfill our commitment to fight terrorism together, then we must rethink the position they have taken on UNSC 1267 Committee," Sushma Swaraj said in the press conference.
The external affairs minister also held a bilateral meeting with Russian Foreign Minister Sergey Lavrov on the sidelines of the RIC meeting, during which the latter briefed her on the progress of the investigations into the deaths of three Indian nationals in Russia earlier this year.
While Puja Kallur, 22, from Navi Mumbai, and her room-mate Karishma Udai Bhosle, 21, from Pune, perished in the fire which broke out in their hostel on February 14 at the Smolensk State Medical Academy, Yasir Jawed, a medical student from Srinagar, died on March 8 after being attacked by unknown people in Kazan city in the Tatarstan republic.
Besides this, Sushma Swaraj and Lavrov reviewed bilateral relations and preparations for the annual bilateral summit.
The external affairs minister later also called on Russian Deputy Prime Minister Dmitry Rogozin at the White House in Moscow.
Rogozin is the co-chair of the India-Russia Inter-Governmental Commission on trade and economic cooperation. | english |
package com.victuxbb.systemdesigns.tinyurlkgs.domain;
import org.junit.Before;
import org.junit.Test;
import static org.assertj.core.api.Assertions.assertThat;
public class KeyGeneratorTest {
private KeyGenerator keyGenerator;
@Before
public void setUp() {
keyGenerator = new KeyGenerator(new Base64Dictionary(), 3);
}
@Test
public void testGenerationV0() {
// a Base64 dictionary with key length 3 yields 64^3 = 262144 distinct keys
assertThat(keyGenerator.generateKeys().collectList().block()).hasSize(262144);
}
} | java |
<filename>_test/src/main/java/webapp/demoe_websocket/WsDemoController.java
package webapp.demoe_websocket;
import org.noear.solon.Solon;
import org.noear.solon.SolonApp;
import org.noear.solon.annotation.Controller;
import org.noear.solon.annotation.Mapping;
import org.noear.solon.core.handle.ModelAndView;
import org.noear.solon.core.handle.Context;
import org.noear.solon.core.handle.MethodType;
@Controller
public class WsDemoController {
@Mapping(value = "/demoe/*",method = MethodType.WEBSOCKET)
public void test(Context ctx) throws Exception{
if(ctx == null){
return;
}
String msg = ctx.body();
if(msg.equals("close")){
ctx.output("Asked me to close: " + msg);
System.out.println("Asked me to close: " + msg + "!!!");
ctx.close(); // close the connection
}else{
ctx.output("Received: " + msg);
}
}
@Mapping(value = "/demoe/websocket")
public Object test_client(Context ctx){
ModelAndView mv = new ModelAndView("demoe/websocket.ftl");
mv.put("app_port", Solon.global().port() + 10000);
return mv;
}
}
| java |
const axios = require('axios');
// Check whether a URL is reachable by issuing a HEAD request.
// If no scheme is given, default to http://.
let findIfValid = (url) => {
    // startsWith is safer than includes: a scheme embedded later in the
    // string (e.g. inside a query parameter) should not count as a scheme.
    const target = (url.startsWith('https://') || url.startsWith('http://'))
        ? url
        : `http://${url}`;
    return axios.head(target)
        .then((res) => {
            // any response means the URL is reachable
            return true;
        })
        .catch((err) => {
            // network error or non-2xx status
            return false;
        });
};
exports.status = findIfValid;
//
// another test
//
/*
with an inline comment
*/
#define __SPIN2CPP__
#include <propeller.h>
#include "test007.h"
int32_t test007::Donext(void)
{
Count = Count + 1;
return Count;
}
| cpp |
{
"id": 22122,
"citation_title": "Neoclassical Models in Macroeconomics",
"citation_author": [
"<NAME>",
"<NAME>"
],
"citation_publication_date": "2016-03-28",
"issue_date": "2016-03-24",
"revision_date": "None",
"topics": [
"Macroeconomics",
"Macroeconomic Models",
"Consumption and Investment",
"Fiscal Policy"
],
"program": [
"Economic Fluctuations and Growth"
],
"projects": null,
"working_groups": null,
"abstract": "\n\nThis chapter develops a toolkit of neoclassical macroeconomic models, and applies these models to the U.S. economy from 1929 through 2014. We first filter macroeconomic time series into business cycle and long-run components, and show that the long-run component is typically much larger than the business cycle component. We argue that this empirical feature is naturally addressed within neoclassical models with long-run changes in technologies and government policies. We construct two classes of models that we compare to raw data, and also to the filtered data: simple neoclassical models, which feature standard preferences and technologies, rational expectations, and a unique, Pareto-optimal equilibrium, and extended neoclassical models, which build in government policies and market imperfections. We focus on models with multiple sources of technological change, and models with distortions arising from regulatory, labor, and fiscal policies. The models account for much of the relatively stable postwar U.S. economy, and also for the Great Depression and World War II. The models presented in this chapter can be extended and applied more broadly to other settings. We close by identifying several avenues for future research in neoclassical macroeconomics.\n\n",
"acknowledgement": "\nWe thank <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, John Taylor, <NAME>, seminar participants at the Handbook of Macroeconomics Conference and at the 2015 Federal Reserve Bank of Saint Louis Policy Conference for comments. <NAME>, <NAME>, <NAME>, <NAME>, and <NAME> provided excellent research assistance. The views expressed herein are those of the authors and do not necessarily reflect the views of the National Bureau of Economic Research.\n\n\n"
} | json |
Police in Uttar Pradesh's Shamli district have received an unusual request -- to find a bride for a two-foot-tall man.
Azim, 26, had on Tuesday approached the women police station in Shamli, claiming that his family members were not getting him married and requested the police to find a match for him.
However, he had no success.
Station House Officer of the police station Niraj Choudhry said police have no role to arrange marriages for people.
"We can help if there is a dispute between couples but to find a bride for a man is not our job," she said.
Azim's family, who live in Kairana, said they want him to get married but someone should be willing to marry him.
"He is physically weak and has got problems with his hands. We want him to be married to a person who can take care of him," Azim's brother Mohammad Naeem said.
"We have got many proposals including one from Moradabad and we are planning to travel there to see the woman," he added.
However, Azim claimed that his family was not sincere about his marriage. | english |
package commands
import (
"fmt"
"strconv"
jsonrpc "github.com/33cn/chain33/rpc/jsonclient"
rpctypes "github.com/33cn/chain33/rpc/types"
"github.com/33cn/chain33/types"
pkt "github.com/33cn/plugin/plugin/dapp/issuance/types"
"github.com/spf13/cobra"
)
func IssuanceCmd() *cobra.Command {
cmd := &cobra.Command{
Use: "issuance",
Short: "Issuance command",
Args: cobra.MinimumNArgs(1),
}
cmd.AddCommand(
IssuanceCreateRawTxCmd(),
IssuanceDebtRawTxCmd(),
IssuanceRepayRawTxCmd(),
IssuancePriceFeedRawTxCmd(),
IssuanceCloseRawTxCmd(),
IssuanceManageRawTxCmd(),
IssuanceQueryCmd(),
)
return cmd
}
func IssuanceCreateRawTxCmd() *cobra.Command {
cmd := &cobra.Command{
Use: "create",
Short: "Create an issuance",
Run: IssuanceCreate,
}
addIssuanceCreateFlags(cmd)
return cmd
}
func addIssuanceCreateFlags(cmd *cobra.Command) {
cmd.Flags().Float64P("balance", "b", 0, "balance")
cmd.MarkFlagRequired("balance")
cmd.Flags().Float64P("debtCeiling", "d", 0, "debtCeiling")
cmd.Flags().Float64P("liquidationRatio", "l", 0, "liquidationRatio")
cmd.Flags().Uint64P("period", "p", 0, "period")
}
//IssuanceCreate ....
func IssuanceCreate(cmd *cobra.Command, args []string) {
paraName, _ := cmd.Flags().GetString("paraName")
rpcLaddr, _ := cmd.Flags().GetString("rpc_laddr")
balance, _ := cmd.Flags().GetFloat64("balance")
debtCeiling, _ := cmd.Flags().GetFloat64("debtCeiling")
liquidationRatio, _ := cmd.Flags().GetFloat64("liquidationRatio")
period, _ := cmd.Flags().GetUint64("period")
params := &rpctypes.CreateTxIn{
Execer: types.GetExecName(pkt.IssuanceX, paraName),
ActionName: "IssuanceCreate",
Payload: []byte(fmt.Sprintf("{\"totalBalance\":%f, \"debtCeiling\":%f, \"liquidationRatio\":%f, \"period\":%d}",
balance, debtCeiling, liquidationRatio, period)),
}
var res string
ctx := jsonrpc.NewRPCCtx(rpcLaddr, "Chain33.CreateTransaction", params, &res)
ctx.RunWithoutMarshal()
}
func IssuanceDebtRawTxCmd() *cobra.Command {
cmd := &cobra.Command{
Use: "debt",
Short: "Debt an issuance",
Run: IssuanceDebt,
}
addIssuanceDebtFlags(cmd)
return cmd
}
func addIssuanceDebtFlags(cmd *cobra.Command) {
cmd.Flags().StringP("issuanceID", "g", "", "issuance ID")
cmd.MarkFlagRequired("issuanceID")
cmd.Flags().Float64P("value", "v", 0, "value")
cmd.MarkFlagRequired("value")
}
//IssuanceDebt ...
func IssuanceDebt(cmd *cobra.Command, args []string) {
paraName, _ := cmd.Flags().GetString("paraName")
rpcLaddr, _ := cmd.Flags().GetString("rpc_laddr")
issuanceID, _ := cmd.Flags().GetString("issuanceID")
value, _ := cmd.Flags().GetFloat64("value")
params := &rpctypes.CreateTxIn{
Execer: types.GetExecName(pkt.IssuanceX, paraName),
ActionName: "IssuanceDebt",
Payload: []byte(fmt.Sprintf("{\"issuanceID\":\"%s\",\"value\":%f}", issuanceID, value)),
}
var res string
ctx := jsonrpc.NewRPCCtx(rpcLaddr, "Chain33.CreateTransaction", params, &res)
ctx.RunWithoutMarshal()
}
func IssuanceRepayRawTxCmd() *cobra.Command {
cmd := &cobra.Command{
Use: "repay",
Short: "Repay an issuance",
Run: IssuanceRepay,
}
addIssuanceRepayFlags(cmd)
return cmd
}
func addIssuanceRepayFlags(cmd *cobra.Command) {
cmd.Flags().StringP("issuanceID", "g", "", "issuance ID")
cmd.MarkFlagRequired("issuanceID")
cmd.Flags().StringP("debtID", "d", "", "debt ID")
cmd.MarkFlagRequired("debtID")
}
//IssuanceRepay ...
func IssuanceRepay(cmd *cobra.Command, args []string) {
paraName, _ := cmd.Flags().GetString("paraName")
rpcLaddr, _ := cmd.Flags().GetString("rpc_laddr")
issuanceID, _ := cmd.Flags().GetString("issuanceID")
debtID, _ := cmd.Flags().GetString("debtID")
params := &rpctypes.CreateTxIn{
Execer: types.GetExecName(pkt.IssuanceX, paraName),
ActionName: "IssuanceRepay",
Payload: []byte(fmt.Sprintf("{\"issuanceID\":\"%s\", \"debtID\":\"%s\"}", issuanceID, debtID)),
}
var res string
ctx := jsonrpc.NewRPCCtx(rpcLaddr, "Chain33.CreateTransaction", params, &res)
ctx.RunWithoutMarshal()
}
func IssuancePriceFeedRawTxCmd() *cobra.Command {
cmd := &cobra.Command{
Use: "feed",
Short: "price feed",
Run: IssuancePriceFeed,
}
addIssuancePriceFeedFlags(cmd)
return cmd
}
func addIssuancePriceFeedFlags(cmd *cobra.Command) {
cmd.Flags().Float64P("price", "p", 0, "price")
cmd.MarkFlagRequired("price")
cmd.Flags().Uint64P("volume", "v", 0, "volume")
cmd.MarkFlagRequired("volume")
}
//IssuancePriceFeed ...
func IssuancePriceFeed(cmd *cobra.Command, args []string) {
paraName, _ := cmd.Flags().GetString("paraName")
rpcLaddr, _ := cmd.Flags().GetString("rpc_laddr")
price, _ := cmd.Flags().GetFloat64("price")
volume, _ := cmd.Flags().GetUint64("volume")
params := &rpctypes.CreateTxIn{
Execer: types.GetExecName(pkt.IssuanceX, paraName),
ActionName: "IssuancePriceFeed",
Payload: []byte(fmt.Sprintf("{\"price\":[ %f ], \"volume\":[ %d ]}", price, volume)),
}
var res string
ctx := jsonrpc.NewRPCCtx(rpcLaddr, "Chain33.CreateTransaction", params, &res)
ctx.RunWithoutMarshal()
}
func IssuanceCloseRawTxCmd() *cobra.Command {
cmd := &cobra.Command{
Use: "close",
Short: "Close an issuance",
Run: IssuanceClose,
}
addIssuanceCloseFlags(cmd)
return cmd
}
func addIssuanceCloseFlags(cmd *cobra.Command) {
cmd.Flags().StringP("issuanceID", "g", "", "issuance ID")
cmd.MarkFlagRequired("issuanceID")
}
//IssuanceClose ...
func IssuanceClose(cmd *cobra.Command, args []string) {
paraName, _ := cmd.Flags().GetString("paraName")
rpcLaddr, _ := cmd.Flags().GetString("rpc_laddr")
issuanceID, _ := cmd.Flags().GetString("issuanceID")
params := &rpctypes.CreateTxIn{
Execer: types.GetExecName(pkt.IssuanceX, paraName),
ActionName: "IssuanceClose",
Payload: []byte(fmt.Sprintf("{\"issuanceId\":\"%s\"}", issuanceID)),
}
var res string
ctx := jsonrpc.NewRPCCtx(rpcLaddr, "Chain33.CreateTransaction", params, &res)
ctx.RunWithoutMarshal()
}
func IssuanceManageRawTxCmd() *cobra.Command {
cmd := &cobra.Command{
Use: "manage",
Short: "Manage an issuance",
Run: IssuanceManage,
}
addIssuanceManageFlags(cmd)
return cmd
}
func addIssuanceManageFlags(cmd *cobra.Command) {
cmd.Flags().StringP("addr", "a", "", "addr")
cmd.MarkFlagRequired("addr")
}
//IssuanceManage ...
func IssuanceManage(cmd *cobra.Command, args []string) {
paraName, _ := cmd.Flags().GetString("paraName")
rpcLaddr, _ := cmd.Flags().GetString("rpc_laddr")
addr, _ := cmd.Flags().GetString("addr")
params := &rpctypes.CreateTxIn{
Execer: types.GetExecName(pkt.IssuanceX, paraName),
ActionName: "IssuanceManage",
Payload: []byte(fmt.Sprintf("{\"addr\":[\"%s\"]}", addr)),
}
var res string
ctx := jsonrpc.NewRPCCtx(rpcLaddr, "Chain33.CreateTransaction", params, &res)
ctx.RunWithoutMarshal()
}
//IssuacneQueryPriceCmd ...
func IssuacneQueryPriceCmd() *cobra.Command {
cmd := &cobra.Command{
Use: "price",
Short: "Query latest price",
Run: IssuanceQueryPrice,
}
return cmd
}
//IssuanceQueryPrice ...
func IssuanceQueryPrice(cmd *cobra.Command, args []string) {
rpcLaddr, _ := cmd.Flags().GetString("rpc_laddr")
var params rpctypes.Query4Jrpc
params.Execer = pkt.IssuanceX
params.FuncName = "IssuancePrice"
var res pkt.RepIssuancePrice
ctx := jsonrpc.NewRPCCtx(rpcLaddr, "Chain33.Query", params, &res)
ctx.Run()
}
//IssuanceQueryUserBalanceCmd ...
func IssuanceQueryUserBalanceCmd() *cobra.Command {
cmd := &cobra.Command{
Use: "balance",
Short: "Query user balance",
Run: IssuanceQueryUserBalance,
}
addIssuanceQueryBalanceFlags(cmd)
return cmd
}
func addIssuanceQueryBalanceFlags(cmd *cobra.Command) {
cmd.Flags().StringP("address", "a", "", "address")
cmd.MarkFlagRequired("address")
}
//IssuanceQueryUserBalance ...
func IssuanceQueryUserBalance(cmd *cobra.Command, args []string) {
rpcLaddr, _ := cmd.Flags().GetString("rpc_laddr")
addr, _ := cmd.Flags().GetString("address")
var params rpctypes.Query4Jrpc
params.Execer = pkt.IssuanceX
params.FuncName = "IssuanceUserBalance"
req := &pkt.ReqIssuanceRecords{
Addr: addr,
}
params.Payload = types.MustPBToJSON(req)
var res pkt.RepIssuanceUserBalance
ctx := jsonrpc.NewRPCCtx(rpcLaddr, "Chain33.Query", params, &res)
ctx.Run()
}
func IssuanceQueryCmd() *cobra.Command {
cmd := &cobra.Command{
Use: "query",
Short: "Query result",
Run: IssuanceQuery,
}
addIssuanceQueryFlags(cmd)
cmd.AddCommand(
IssuacneQueryPriceCmd(),
IssuanceQueryUserBalanceCmd(),
)
return cmd
}
func addIssuanceQueryFlags(cmd *cobra.Command) {
cmd.Flags().StringP("issuanceID", "g", "", "issuance ID")
cmd.Flags().StringP("address", "a", "", "address")
cmd.Flags().StringP("index", "i", "", "index")
cmd.Flags().StringP("status", "s", "", "status")
cmd.Flags().StringP("issuanceIDs", "e", "", "issuance IDs")
cmd.Flags().StringP("debtID", "d", "", "debt ID")
}
//IssuanceQuery ...
func IssuanceQuery(cmd *cobra.Command, args []string) {
rpcLaddr, _ := cmd.Flags().GetString("rpc_laddr")
issuanceID, _ := cmd.Flags().GetString("issuanceID")
address, _ := cmd.Flags().GetString("address")
statusStr, _ := cmd.Flags().GetString("status")
issuanceIDs, _ := cmd.Flags().GetString("issuanceIDs")
debtID, _ := cmd.Flags().GetString("debtID")
var params rpctypes.Query4Jrpc
params.Execer = pkt.IssuanceX
var status int64
var err error
if statusStr != "" {
status, err = strconv.ParseInt(statusStr, 10, 32)
if err != nil {
fmt.Println(err)
cmd.Help()
return
}
}
if issuanceID != "" {
if address != "" {
params.FuncName = "IssuanceRecordsByAddr"
req := &pkt.ReqIssuanceRecords{
IssuanceId: issuanceID,
Status: int32(status),
Addr: address,
}
params.Payload = types.MustPBToJSON(req)
var res pkt.RepIssuanceRecords
ctx := jsonrpc.NewRPCCtx(rpcLaddr, "Chain33.Query", params, &res)
ctx.Run()
} else if statusStr != "" {
params.FuncName = "IssuanceRecordsByStatus"
req := &pkt.ReqIssuanceRecords{
IssuanceId: issuanceID,
Status: int32(status),
}
params.Payload = types.MustPBToJSON(req)
var res pkt.RepIssuanceRecords
ctx := jsonrpc.NewRPCCtx(rpcLaddr, "Chain33.Query", params, &res)
ctx.Run()
} else if debtID != "" {
params.FuncName = "IssuanceRecordByID"
req := &pkt.ReqIssuanceRecords{
IssuanceId: issuanceID,
DebtId: debtID,
}
params.Payload = types.MustPBToJSON(req)
var res pkt.RepIssuanceDebtInfo
ctx := jsonrpc.NewRPCCtx(rpcLaddr, "Chain33.Query", params, &res)
ctx.Run()
} else {
params.FuncName = "IssuanceInfoByID"
req := &pkt.ReqIssuanceInfo{
IssuanceId: issuanceID,
}
params.Payload = types.MustPBToJSON(req)
var res pkt.RepIssuanceCurrentInfo
ctx := jsonrpc.NewRPCCtx(rpcLaddr, "Chain33.Query", params, &res)
ctx.Run()
}
} else if statusStr != "" {
params.FuncName = "IssuanceByStatus"
req := &pkt.ReqIssuanceByStatus{Status: int32(status)}
params.Payload = types.MustPBToJSON(req)
var res pkt.RepIssuanceIDs
ctx := jsonrpc.NewRPCCtx(rpcLaddr, "Chain33.Query", params, &res)
ctx.Run()
} else if issuanceIDs != "" {
params.FuncName = "IssuanceInfoByIDs"
// a single ID comes from the command line; wrap it in a slice
// (the original appended the same ID twice, which looks like a copy-paste bug)
req := &pkt.ReqIssuanceInfos{IssuanceIds: []string{issuanceIDs}}
params.Payload = types.MustPBToJSON(req)
var res pkt.RepIssuanceCurrentInfos
ctx := jsonrpc.NewRPCCtx(rpcLaddr, "Chain33.Query", params, &res)
ctx.Run()
} else {
cmd.Help()
}
}
| go |
<filename>Site/recipes/Pavlova med mango og pasjonsfrukt.json
{"description": "Oppskrift på hjemmelaget pavlova toppet med herlig mango og pasjonsfrukt.", "submitter": "bb", "steps": ["Sett stekeovnen på 250 °C og finn fram en stekeplate dekket med bakepapir. Tegn opp en sirkel på ca. 26 cm midt på bakepapiret.", "Når du skal lage bunnen til Pavlova, starter du med å vispe eggehviter til hardt skum i en helt ren bolle. Ha i eddik mens du visper. Bland sukker, Maizena og vaniljesukker, og tilsett det litt etter litt i eggehvitene mens du fortsatt visper.", "Når massen er blank, smidig og fast har du marengsmassen ut på bakepapiret og former den til en rund bunn innenfor sirkelen.", "Senk temperaturen i stekeovnen til 100 °C og sett inn stekeplaten nederst i ovnen. Stek bunnen i ca. 60-70 minutter.", "Marengsen er nå sprø på yttersiden og myk inni. La den avkjøles på en rist. Ta bakepapiret forsiktig av. Slik kan marengsbunnen oppbevares i lang tid bare det er tørt.", "Rett før servering vispes fløten stiv med vaniljesukker.", "Bruk en potetskreller eller ostehøvel og fjern skallet på mango. Høvle så mango i tynne, tynne skiver med potetskreller eller ostehøvel. Skrap ut pasjonsfrukten fra skallet.", "Legg bunnen på et serveringsfat og fordel kremen utover. Topp med høvlet mango. Drypp skjeer med pasjonsfrukt utover og pynt med sitronmelisse. Server med en gang."], "ingredients": [{"amount": "6 stk", "name": "eggehviter"}, {"amount": "2 ts", "name": "hvitvinseddik"}, {"amount": "2 dl", "name": "sukker"}, {"amount": "2 ts", "name": "maizena"}, {"amount": "1 ts", "name": "vaniljesukker"}, {"amount": "3 dl", "name": "TINE® Kremfløte"}, {"amount": "½ ts", "name": "vaniljesukker"}, {"amount": "2 stk", "name": "mangoer"}, {"amount": "6 stk", "name": "pasjonsfruktpulp"}, {"amount": "8 stk", "name": "sitronmelisseblader"}], "picture": "https://www.tine.no/_/recipeimage/w_1028%2Ch_720%2Cc_fill%2Cx_1500%2Cy_937%2Cg_xy_center/recipeimage/429277.jpg", "type": "recipe", "link": "Pavlova med mango og pasjonsfrukt", "name": "Pavlova med mango og pasjonsfrukt"} | json |
<filename>public/SileoDepictions/discord-video-loop.json
{
"minVersion": "0.4",
"class": "DepictionTabView",
"headerImage": "",
"tintColor": "",
"tabs": [
{
"class": "DepictionStackView",
"tabname": "Details",
"views": [
{
"class": "DepictionHeaderView",
"title": "Discord Video Loop"
},
{
"class": "DepictionMarkdownView",
"markdown": "Hi **This is a sileo depiction**"
}
]
}
]
} | json |
{
"name": "dom-lib/getHeight",
"private": true,
"main": "../cjs/getHeight.js",
"module": "../esm/getHeight.js",
"types": "../esm/getHeight.d.ts"
}
| json |
President Barack Obama is hosting the new Ukrainian prime minister at the White House on Wednesday, a high-profile gesture aimed at cementing the West’s support for Ukraine’s fledgling government.
The meeting between President Obama and Prime Minister Arseniy Yatsenyuk comes as the pro-Russian Crimean region of Ukraine readies for a referendum on Sunday to determine its future. Voters in the Crimean Peninsula will be given two options: becoming part of Russia or remaining in Ukraine with broader powers. The U.S. and Europe have declared the referendum illegitimate, saying Ukraine’s central government must be involved in decisions about its territory.
Ukraine’s parliament installed Mr. Yatsenyuk as head of the country’s interim government after pro-Kremlin President Viktor Yanukovych fled the capital of Kiev following three months of popular protests. The uprising started when Mr. Yanukovych rejected a planned partnership agreement with the European Union in favour of historical ties with Moscow.
Days after Mr. Yanukovych left Kiev, Russia moved military forces into Crimea. Russian President Vladimir Putin has so far brushed aside punishments levied by the West following the incursion, including visa bans, the threat of economic sanctions and a halt to planning for an international economic summit Russia is scheduled to host in June.
Russia recognizes neither the new government nor the elections planned in Ukraine in May.
A possible path for de-escalating the dispute emerged on Tuesday, when Crimea’s parliament said that if the public votes to become part of Russia, the peninsula will declare itself independent and propose becoming a Russian state. That could give Moscow the option of saying there is no need for Crimea to become part of Russia, while keeping it firmly within its sphere of influence.
Mr. Yatsenyuk will be greeted at the White House on Wednesday with all of the grandeur of a head-of-state visit, including an Oval Office meeting with Mr. Obama. The two leaders were expected to make brief comments to the media following their discussions.
Vice President Joe Biden, who has served as a primary administration contact with Ukraine’s old and new government, has cut short a trip to Latin America to attend the meeting. Secretary of State John Kerry, who met with Mr. Yatsenyuk in Kiev last week, is also expected to have a separate meeting with the prime minister.
Mr. Obama and other administration officials are expected to reinforce their commitment to boost Ukraine’s fragile new government. The U.S. has promised Ukraine $1 billion in loan guarantees, as well as technical support as it moves toward elections.
Mr. Obama has urged Congress to quickly approve the loan guarantee, which is supposed to supplement additional assistance from the International Monetary Fund. The Senate Foreign Relations Committee plans to vote on Wednesday on a bill that both provides aid to Ukraine and hits Russia with sanctions.
A major sticking point had been a provision in the bill to enhance the lending capacity of the IMF. The Obama administration has pushed hard for acceptance of the IMF changes as part of the legislation authorizing the assistance. Sen. Bob Menendez, the committee chairman, said on Tuesday that IMF revisions would be included in the bill.
The IMF’s 2010 revisions expand the power of emerging countries within the global lending body and make some of its funds more readily available. The United States is the only country on the IMF board that hasn’t accepted the changes yet.
Sen. Lindsey Graham said on Tuesday he supported the IMF changes and viewed them as a “national security” concern. But several House Republicans have voiced opposition.
The House last week voted overwhelmingly in favour of the loans to Ukraine. The legislation didn’t include Russia sanctions or any language on the IMF.
The European Union has pledged $15 billion in assistance to Ukraine, though even that falls well short of the $35 billion in international rescue loans Kiev says it needs over the next two years. | english |
The 2020 longlist for India’s richest literary prize, the JCB Prize for Literature, was announced on September 1. This year’s jury – chaired by Tejaswini Niranjana, and joined by Aruni Kashyap, Ramu Ramanathan, and Deepika Sorabjee – chose six women writers, four debuts, and two translations to make the cut for the ten-title list. The shortlist will be announced on September 25, and the winner, on November 7.
Shortly after the announcement, Niranjana and Kashyap spoke to Scroll.in about judging a prize during a pandemic, the Indian publishing landscape, and “separating the wheat from the chaff”. Excerpts from the conversation:
Can you comment on the challenges you faced and the rewards you gained during the process of reading and judging these books?
Aruni Kashyap: I think the biggest challenge for all of us was the pandemic. Due to the lockdown, the jury members did not get physical copies of the books. This was a considerable challenge – especially for me – to read because, anyway, due to the pandemic, we were spending a lot of time on the screen, meetings, teaching on screen, even grading (for me) on screen. Two of the jury members are working professors in two different universities. Our screen time increased exponentially, which meant that we could not be as productive as we wanted to.
I usually sit down and finish a book within a few hours because I am a very fast reader, and I can remember most of it, but we couldn’t read like that anymore. Reading PDFs, reading from the computer just made it incredibly slow for me.
Apart from that, I love reading, and I was delighted that I was getting to read such a wide range of books from all over the country in translation as well as from publishers who are not, as we often say, in the “big five.” We had entries from independent and regional presses. I think that was a big boon because we got to see what is being published in contemporary India. I think the award has its niche because it is only open to Indian citizens. In some ways, we got an idea of homegrown talent, something very valuable about the prize.
One more rewarding thing about the longlist was that reading these entries has given me so much fodder for my research as a literary scholar and an academic. I have graduate students working with me on post-colonial literature, whom I will be able to guide better because of my reading for the JCB.
I think it has helped me grow – to cut a long story short – as a scholar, as a reader. All these books will change conversations for better or worse for Indian literature, and I am going to teach some of these books sooner or later when I can. I have learned immensely by reading these books.
Tejaswini Niranjana: The main challenge of jury work in any field is to separate the wheat from the chaff – and the JCB Prize posed a similar challenge. We had to read through a great deal of indifferent writing to find those novels that made all the effort and tedium worthwhile. The main reward, then, is to be able to discover these fine books, and then to recommend them to the reading public.
What does the 2020 longlist say about the current state of Indian publishing – and contemporary Indian fiction writing?
Aruni Kashyap: I think we can make two critical observations by looking closely at the longlist. My first point is about translation and the second about how Indian writers are responding to current realities.
Translation of literature from Indian languages is here to stay, and it is slowly reclaiming and demanding its long-deserved place. But, while reading the entries, I also felt that we need a lot more good translators – in all languages.
I think translation workshops and translation studies courses should be offered in Indian universities. If public universities are not taking this up, then private universities, private philanthropic enterprises, literary bodies should start training good translators. Otherwise, the great literature, the eclectic, wonderful literary work available in Indian languages, will not be available to the rest of the world.
Of course, a lot has happened in the last ten years, and I think the JCB Prize has a major role in the future. The translated books on the longlist are here because they are great books, and they are competing with books originally written in English, but we must not forget that English books get a lot of support. The writers get a lot of feedback during the drafting process; they work closely with editors, proofreaders, and even select agents who provide intellectual and material support.
This is not the case, for example, for a book published in Assamese – as that book is coming from a very different publishing industry that is radically different from the English language publishing industry. An Assamese book, for instance, is most often only printed. If you are lucky, you will find a proofreader at most. No one edits it, no one will go through drafts with you, strengthening the manuscript before publication. After the book is out, you will have to buy the copies (unless you are a very famous author) and drop review copies in magazines and papers.
I think this is the most significant thing about the JCB Prize and the 2020 longlist: how translated literature from Indian languages is competing with Indian English writing that emerges from a far more privileged space.
My second observation is that if you look closely at the longlist, you will realise that Indian English writers are responding to current realities, which is amazing. As a writer myself, I believe that writers should write socially relevant books; I think that writers have a moral responsibility today to bear witness, speak truth to power; and, coming as I do from a highly militarised, highly fraught, highly racialised place like Assam, I understand this much more than anybody else, and I can arrive at this conclusion that writers address social responsibility through their art.
My late mother, a novelist, was part of a progressive writers’ group in Assam, inspired by leftwing ideas. She and her colleagues decided that they would write the kind of fiction that would change society for the better, that would contribute to the creation of a more equal society. My father was also part of the left movement in Assam and was part of similar writers’ groups in Guwahati. So this meant all discussions about fiction in my household were measured against these parameters. Is it asking questions, speaking truth to power? What kind of a society does it envision? Are the ideas progressive? This is the environment in which I grew up. That something is well crafted was never enough for the three of us!
Later, my mentor, Indira Goswami, who was also a professor in Delhi University, took the idea of social responsibility more seriously, and decided to become the mediator or interlocutor between ULFA and the government. So I think social responsibility is a key theme in regional languages, especially in under-represented literary traditions. And it is not just Indira Goswami who did this, so did many other writers from Assam.
Coming back to the point, the books by Samit Basu, Megha Majumdar, Annie Zaidi, and Tanuj Solanki take on totalitarianism, authoritarianism, fundamentalism, religious fanaticism, capitalism, etc. This shows that Indian-English writing has suddenly woken up to the fact that the country has changed and they need to respond to this new reality.
You will see how the writers on this list, especially the three I mentioned, are grappling with this. Megha Majumdar’s book, A Burning, literally talks about a burning issue in the same way, Samit Basu’s Chosen Spirits is a cautionary tale, and Prelude to a Riot by Annie Zaidi is a deep investigation into Indian society. Tanuj Solanki’s novel is a critique of capitalism.
The longlist also shows the different kinds of experimentation that Indian writers are engaged in order to respond to contemporary realities. Samit Basu’s book is set in the future, Annie Zaidi’s book is told in Faulknerian style from multiple perspectives, and so is Megha Majumdar’s book. There is an experimentation with craft, and definitely a rejection of the linear / traditional novel in some of these books.
All the writers are trying to ask: “How do we tell a story, in 2020, after 2014? How do you tell the stories of a world that has changed so much?” And I think these questions are at the back of their minds, and they are finding answers by, in a sense, by rejecting traditional modes and telling these stories from many perspectives, many different subject positions, and temporal frameworks.
Tejaswini Niranjana: The 2020 longlist covers an impressive range of fictional genres. These include coming-of-age novels, magical realist and dystopian novels, those dealing with urgent contemporary political and social issues, folklore, and inter-generational sagas. It’s good to know there is so much diversity in Indian fiction writing today, and that Indian publishing can support this diversity of offerings.
You mention “memorability” in the press note for the longlist. What else do you wish for readers of this longlist to take away from these books?
Aruni Kashyap: I want readers to pick up all these books and give them the love and adulation that they deserve, but, at the same time, to also think about a couple of other things as they read the books on the longlist. I want readers to pick up more books in translation, to read in their own languages if possible. It’s a matter of practice, please start reading.
I also want readers to pick up books from under-represented communities and literary traditions. I don’t mean regional, but literary traditions. I think the North-East of India has a unique literary tradition, and literature from this region needs to be read and analysed with the parameters that this tradition sets for itself.
Tejaswini Niranjana: All these books represent brilliantly imagined fictional universes, with characters and narratives that will haunt the reader for a long time.
- These, Our Bodies, Possessed by Light,
| english |
Commuting from Lucknow to Delhi will soon be faster, safer and free from traffic congestion.
The National Highways Authority of India (NHAI) is set to begin four-laning and strengthening of the Sitapur-Bareilly section of National Highway-24 in Uttar Pradesh.
The project received the Centre’s approval last week.
The 152-km stretch will be widened from two to four-lane at a cost of Rs 1,026 crore.
According to NHAI authorities, the tender process has already been initiated and work will begin from January 2010.
“The contract for the Sitapur-Bareilly section, a part of National Highways Development Project Phase-III, will be given on a design, build, finance and operate basis,” said Ashutosh Gautam, Project Director of NHAI, UP unit.
Mumbai-based STUPS Consultants has been entrusted with the task to prepare the Detailed Project Report (DPR) of the project.
“The DPR will be completed in the next two months and by then, a developer will be appointed to begin construction,” Gautam said. The developer will have three years to complete the stretch.
On the 502-km highway between Lucknow and Delhi, road widening is already being conducted on three sections: Lucknow-Sitapur, Bareilly-Moradabad and Moradabad-Delhi.
According to the NHAI authorities, work is almost complete on the Bareilly-Moradabad and Moradabad-Delhi sections. Four-laning of the Lucknow-Sitapur (76-km) section is also in progress. | english |
{
"name": "example",
"version": "1.0.0",
"private": true,
"description": "",
"license": "MIT",
"author": "",
"main": "index.js",
"scripts": {
"build": "tsc",
"cleandir": "cleandir dist",
"pretest": "npm run build",
"test": "npm run test:code && npm run test:cli",
"test:cli": "npm run cleandir",
"test:code": "node index.js"
},
"dependencies": {
"@mstssk/cleandir": "file:.."
},
"devDependencies": {
"typescript": "^4.5.4"
}
}
| json |
import UIElement, { EVENT } from "@core/UIElement";
import { LOAD } from "@core/Event";
import { CSS_TO_STRING, STRING_TO_CSS } from "@core/functions/func";
import SelectIconEditor from "./SelectIconEditor";
// NOTE: the option-list helpers used in the template below are neither defined
// nor imported in the original file; these are assumed implementations that
// return the comma-separated option strings SelectIconEditor appears to expect.
const getDirectionOptions = () => 'row,row-reverse,column,column-reverse';
const getWrapOptions = () => 'nowrap,wrap,wrap-reverse';
const getJustifyContentOptions = () => 'flex-start,flex-end,center,space-between,space-around';
const getAlignItemsOptions = () => 'flex-start,flex-end,center,baseline,stretch';
const getAlignContentOptions = () => 'flex-start,flex-end,center,space-between,space-around,stretch';

export default class FlexLayoutItemEditor extends UIElement {
components() {
return {
SelectIconEditor,
}
}
initState() {
return {
...STRING_TO_CSS(this.props.value),
}
}
setValue (value) {
this.setState(STRING_TO_CSS(value))
}
getValue () {
return CSS_TO_STRING(this.state)
}
modifyData() {
this.parent.trigger(this.props.onchange, this.props.key, this.getValue())
}
[LOAD('$body')] () {
return /*html*/`
<div class='flex-layout-item'>
<div class='label'><label>${this.$i18n('flex.layout.item.editor.direction')}</label></div>
<SelectIconEditor
key='flex-direction'
value="${this.state['flex-direction'] || 'row'}"
options="${getDirectionOptions()}"
onchange='changeKeyValue'
/>
</div>
<div class='flex-layout-item'>
<div class='label'><label>${this.$i18n('flex.layout.item.editor.wrap')}</label></div>
<SelectIconEditor
key='flex-wrap'
value="${this.state['flex-wrap'] || 'wrap'}"
options="${getWrapOptions()}"
onchange='changeKeyValue'
/>
</div>
<div class='flex-layout-item'>
<div class='label'><label>${this.$i18n('flex.layout.item.editor.justify-content')}</label></div>
<SelectIconEditor
key='justify-content'
value="${this.state['justify-content']}"
options="${getJustifyContentOptions()}"
onchange='changeKeyValue'
/>
</div>
<div class='flex-layout-item'>
<div class='label'><label>${this.$i18n('flex.layout.item.editor.align-items')}</label></div>
<SelectIconEditor
key='align-items'
value="${this.state['align-items']}"
options="${getAlignItemsOptions()}"
onchange='changeKeyValue'
/>
</div>
<div class='flex-layout-item'>
<div class='label'><label>${this.$i18n('flex.layout.item.editor.align-content')}</label></div>
<SelectIconEditor
key='align-content'
value="${this.state['align-content']}"
options="${getAlignContentOptions()}"
onchange='changeKeyValue'
/>
</div>
`
}
template () {
return /*html*/`
<div class='flex-layout-editor' ref='$body' ></div>
`
}
[EVENT('changeKeyValue')] (key, value, params) {
this.setState({
[key]: value
}, false)
this.modifyData();
}
} | javascript |
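The editor above round-trips its state through `STRING_TO_CSS` and `CSS_TO_STRING`, which are imported from `@core/functions/func` and not shown in this file. A minimal sketch of the assumed contract (parsing a `key: value; …` declaration string into an object and serializing it back), so `setValue()`/`getValue()` can be reasoned about in isolation:

```javascript
// Hypothetical stand-ins for the imported helpers; the real implementations in
// @core/functions/func are not shown here, so these only illustrate the
// assumed behavior used by setValue() and getValue().
const STRING_TO_CSS = (str = '') =>
  str.split(';')
    .map((s) => s.trim())
    .filter(Boolean)
    .reduce((obj, decl) => {
      const [key, ...rest] = decl.split(':');
      obj[key.trim()] = rest.join(':').trim();
      return obj;
    }, {});

const CSS_TO_STRING = (obj = {}) =>
  Object.entries(obj)
    .map(([key, value]) => `${key}: ${value}`)
    .join(';');

// Round trip: what setValue() would store and getValue() would emit.
const state = STRING_TO_CSS('flex-direction: column; flex-wrap: nowrap');
console.log(state['flex-direction']); // column
console.log(CSS_TO_STRING(state));    // flex-direction: column;flex-wrap: nowrap
```

This keeps the editor's internal state a plain object keyed by CSS property name, which is why the template can read defaults like `this.state['flex-direction'] || 'row'` directly.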
<filename>visao/recursos/js/ajax/Comunicado.js
function inserirComunicado() {
$.ajax({
url: "../control/InserirComunicado.php",
type: "POST",
data: $("#fcomunicado").serialize(),
dataType: 'json',
success: function (data, textStatus, jqXHR) {
if (data.situacao === true) {
swal("Comunicado cadastrado", data.mensagem, "success");
procurarComunicado(true);
} else if (data.situacao === false) {
swal("Erro ao cadastrar", data.mensagem, "error");
}
}, error: function (jqXHR, textStatus, errorThrown) {
swal("Erro ao cadastrar", "Erro causado por:" + errorThrown, "error");
}
});
}
function atualizarComunicado() {
$.ajax({
url: "../control/AtualizarComunicado.php",
type: "POST",
data: $("#fcomunicado").serialize(),
dataType: 'json',
success: function (data, textStatus, jqXHR) {
if (data.situacao === true) {
swal("Comunicado atualizado", data.mensagem, "success");
procurarComunicado(true);
} else if (data.situacao === false) {
swal("Erro ao atualizar", data.mensagem, "error");
}
}, error: function (jqXHR, textStatus, errorThrown) {
swal("Erro ao atualizar", "Erro causado por:" + errorThrown, "error");
}
});
}
function excluirComunicado() {
if (window.confirm("Deseja realmente excluir esse comunicado?")) {
if (document.getElementById("codcomunicado").value !== null && document.getElementById("codcomunicado").value !== "") {
$.ajax({
url: "../control/ExcluirComunicado.php",
type: "POST",
data: {codcomunicado: document.getElementById("codcomunicado").value},
dataType: 'json',
success: function (data, textStatus, jqXHR) {
if (data.situacao === true) {
swal("Comunicado excluido", data.mensagem, "success");
procurarComunicado(true);
} else if (data.situacao === false) {
swal("Erro ao excluir", data.mensagem, "error");
}
}, error: function (jqXHR, textStatus, errorThrown) {
swal("Erro ao excluir", "Erro causado por:" + errorThrown, "error");
}
});
} else {
swal("Campo em branco", "Por favor escolha a comunicado para excluir!", "error");
}
}
}
function excluir2Comunicado(codcomunicado) {
swal({
title: "Confirma exclusão?",
text: "Você não poderá mais visualizar as informações dessa comunicado!",
type: "warning",
showCancelButton: true,
confirmButtonColor: "#DD6B55",
confirmButtonText: "Sim, exclua ele!",
closeOnConfirm: false,
closeOnCancel: true
}, function (isConfirm) {
if (isConfirm) {
if (codcomunicado !== null && codcomunicado !== "") {
$.ajax({
url: "../control/ExcluirComunicado.php",
type: "POST",
data: {codcomunicado: codcomunicado},
dataType: 'json',
success: function (data, textStatus, jqXHR) {
if (data.situacao === true) {
swal("Comunicado excluido", data.mensagem, "success");
procurarComunicado(true);
} else if (data.situacao === false) {
swal("Erro ao excluir", data.mensagem, "error");
}
}, error: function (jqXHR, textStatus, errorThrown) {
swal("Erro ao excluir", "Erro causado por:" + errorThrown, "error");
}
});
} else {
swal("Campo em branco", "Por favor escolha a comunicado para excluir!", "error");
}
}
});
}
function setaComunicado(comunicado) {
$("#codcomunicado").val(comunicado[0]);
$("#codbanco").val(comunicado[1]);
$("#codconvenio").val(comunicado[2]);
$("#codtabela").val(comunicado[3]);
$("#prazo").val(comunicado[4]);
$("#vlsolicitado").val(comunicado[5]);
$("#codstatusComunicado").css("display", "");
$("#codstatusComunicado").val(comunicado[6]);
$("#btInserirComunicado").css("display", "none");
$("#btAtualizarComunicado").css("display", "");
$("#btExcluirComunicado").css("display", "");
$("#status_comunicado").css("display", "");
document.getElementById("fcomunicado").action = "../control/AtualizarComunicado.php";
}
function btNovoComunicado() {
$("#codbanco1").val("");
$("#codbanco").val("");
$("#codconvenio").val("");
$("#codtabela").val("");
$("#prazo").val("");
$("#vlsolicitado").val("");
$("#codstatusComunicado").css("display", "none");
$("#codstatusComunicado").val("");
$("#btInserirComunicado").css("display", "");
$("#btAtualizarComunicado").css("display", "none");
$("#btExcluirComunicado").css("display", "none");
document.getElementById("fcomunicado").action = "../control/InserirComunicado.php";
}
function procurarComunicado(acao) {
$("#carregando").show();
$.ajax({
url: "../control/ProcurarComunicado2.php",
type: "POST",
data: $("#fPcomunicado").serialize(),
dataType: 'text',
success: function (data, textStatus, jqXHR) {
if (acao === false && data === "") {
swal("Atenção", "Nada encontrado de comunicados!", "error");
}
document.getElementById("listagemComunicado").innerHTML = data;
}, error: function (jqXHR, textStatus, errorThrown) {
swal("Erro ao excluir", "Erro causado por:" + errorThrown, "error");
}
});
$("#carregando").hide();
}
function abreRelatorioComunicado() {
document.getElementById("tipo").value = "pdf";
document.getElementById("fPcomunicado").submit();
}
function abreRelatorioComunicado2() {
document.getElementById("tipo").value = "xls";
document.getElementById("fPcomunicado").submit();
}
function procurarComunicadoInicial(codcomunicado) {
TINY.box.show({url: '../control/TextoComunicado.php?codcomunicado=' + codcomunicado, width: 700, height: 300, opacity: 20, topsplit: 3});
}
/** From here down: handles the AJAX for inserting or updating a file, and the upload progress bar, without redirecting the page */
(function () {
if (document.getElementById("fcomunicado") != null) {
var bar = $('.bar');
var percent = $('.percent');
var status = $('#status');
$("#fcomunicado").submit(function () {
$(".progress").css("visibility", "visible");
});
$('#fcomunicado').ajaxForm({
beforeSend: function () {
status.empty();
var percentVal = '0%';
bar.width(percentVal);
percent.html(percentVal);
},
uploadProgress: function (event, position, total, percentComplete) {
var percentVal = percentComplete + '%';
bar.width(percentVal);
percent.html(percentVal);
},
success: function () {
var percentVal = '100%';
bar.width(percentVal);
percent.html(percentVal);
},
complete: function (xhr) {
var data = JSON.parse(xhr.responseText);
if (data.situacao === true) {
swal("Cadastro", data.mensagem, "success");
procurarComunicado(true);
} else if (data.situacao === false) {
swal("Erro", data.mensagem, "error");
}
}
});
}
})();
| javascript |
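All of the AJAX handlers above branch on the same response contract: the PHP controllers are assumed to answer with a JSON body containing a boolean `situacao` and a `mensagem` string. A small sketch of that dispatch logic, factored out so it can be exercised without jQuery, swal, or the backend (the function and callback names here are hypothetical, not part of the original file):

```javascript
// Mirrors the success-callback pattern repeated in inserirComunicado(),
// atualizarComunicado() and excluirComunicado(): branch on data.situacao
// and surface data.mensagem through the matching callback.
function handleComunicadoResponse(data, { onSuccess, onError }) {
  if (data.situacao === true) {
    onSuccess(data.mensagem);
    return 'success';
  } else if (data.situacao === false) {
    onError(data.mensagem);
    return 'error';
  }
  return 'unknown'; // malformed payload: situacao is neither true nor false
}

// Example usage with plain callbacks standing in for swal(...):
const log = [];
handleComunicadoResponse(
  { situacao: true, mensagem: 'Comunicado cadastrado' },
  { onSuccess: (m) => log.push(['ok', m]), onError: (m) => log.push(['err', m]) }
);
```

Centralizing the branch like this would also remove the copy-pasted `if/else if` blocks spread across the five handlers in the original file.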
Dhanush's Thiruchitrambalam had a dream run in the theatres, taking everyone by pleasant surprise. The film was expected to do well at the box office, but it surpassed those expectations and turned out to be a humongous blockbuster. It collected more than 100 crores at the box office and also emerged as a highly profitable venture for the people involved with the trade of the film. Thiruchitrambalam opened to favourable reviews from audiences, and the film's feel-good nature struck a chord with the masses, with youth and family audiences thronging the theatres for repeat viewings.
Following the highly successful run in the theatres, Thiruchitrambalam is now all set for its digital premiere, and details regarding the same are out. The film will stream on the Sun NXT app from September 23, this coming Friday, and people who missed watching the film in the theatres can catch it from the comfort of their homes from that date. The film is sure to have a solid viewership on Sun NXT, as several people would want to watch it again for its fun and romantic episodes. Thiruchitrambalam features Dhanush in the titular role, while Nithya Menen plays his best friend, Shobana.
Nithya Menen's character turned out to be an instant hit among the youth and it has made the people yearn for such a friend in their life. Priya Bhavani Shankar and Raashi Khanna play the other leads in the film that has Bharathiraja and Prakash Raj in pivotal roles. Anirudh Ravichander's songs were massive hits even before the release, but post release, it was the bonus song, 'Mayakkama Kalakkama' which took over the limelight. Produced by Sun Pictures and directed by Mithran R Jawahar of Yaaradi Nee Mohini fame, the film has cinematography handled by Om Prakash and editing by Prasanna GK. | english |
<filename>packages/component/config/babel.plugin.js
exports.transformRuntime = () => ['@babel/plugin-transform-runtime']
exports.ignoreStyleFiles = () => ['babel-plugin-transform-require-ignore', {
  extensions: ['.less', '.css', '.sass', '.scss'],
}]
| javascript |
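These factories return entries in Babel's plugin-config shape: either a one-element array with just the plugin name, or a `[name, options]` pair. A sketch of how they would be consumed when assembling a config; the surrounding config file is assumed, not shown:

```javascript
// Local copies of the two factories above (with the '.scss' extension dotted,
// assumed to be the intended value): each returns a Babel plugin entry,
// i.e. an array in the [name] or [name, options] form that Babel's
// `plugins` array accepts.
const transformRuntime = () => ['@babel/plugin-transform-runtime'];
const ignoreStyleFiles = () => ['babel-plugin-transform-require-ignore', {
  extensions: ['.less', '.css', '.sass', '.scss'],
}];

// Hypothetical assembly into a Babel config object:
const babelConfig = {
  presets: [],
  plugins: [transformRuntime(), ignoreStyleFiles()],
};

console.log(babelConfig.plugins[1][1].extensions.length); // 4
```

Keeping the entries behind factory functions lets each build target call them with different arguments later without mutating a shared config object.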
<reponame>jkinfeng/koa-router-helper
{
"name": "koa-router-helper",
"version": "1.0.6",
"description": "autoload koa-router",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"keywords": [
"koa",
"koa2",
"router",
"helper",
"koa-router",
"koa-router-autoload",
"autoload",
"koa-router-helper"
],
"author": "Jkin.Feng",
"license": "MIT",
"dependencies": {
"koa-compose": "^4.1.0",
"koa-router": "^9.1.0"
},
"homepage": "https://github.com/jkinfeng/koa-router-helper.git#readme",
"repository": {
"type": "git",
"url": "git+https://github.com/jkinfeng/koa-router-helper.git"
}
}
| json |
Commander of Iran’s Border Police said three Iranian border guards abducted by terrorists back in October 2018 are going to be released from Pakistan and return home.
According to Tasnim News Agency, Brigadier General Qassem Rezaei on Wednesday pointed to the “satisfactory” negotiations held with the Pakistani officials to secure the release of the three Iranian border guards who have been held in Pakistan since abduction by the terrorists.
The three kidnapped soldiers are safe and well and will return home very soon, the commander added.
On October 15, 2018, the so-called Jaish-ul-Adl terrorist group infiltrated Iran from the Pakistani side of the border and took hostage 14 border guards, local Basij forces, and the Islamic Revolution Guards Corps (IRGC) members.
The IRGC Ground Force’s Quds Base said at the time that the local Basij forces and the border regiment forces stationed at the border post in Mirjaveh region in Iran’s southeastern province of Sistan and Balouchestan had been abducted after “acts of treason and collusion” involving an element or elements of the anti-Revolution groups who had infiltrated the country.
A number of the abductees have been released since then.
Military forces along the southeastern border areas are frequently attacked by terrorist groups coming from Afghanistan and Pakistan.
Tehran has already asked the two neighbors to step up security at the common border to prevent terrorist attacks on Iranian forces. | english |
Indian boxers are assured of at least five medals at the ongoing Strandja Memorial Boxing tournament in Sofia, Bulgaria. While Amit Panghal’s promising performance has led to a confirmed medal in the 49 kg category, Lovlina Borgohain routed her opponent 5-0 to bag the first medal by a woman pugilist.
Amit Panghal survived the battle against Ukraine’s Nazar Kurotchyn in the 49kg category, defeating his opponent 3-2 to confirm a semifinal berth and a medal. Commonwealth Games medallists Gaurav Solanki and Naman Tanwar also qualified for the quarter-finals in their respective categories.
Indian women boxers too had a great day as they assured four medals for India at the Strandja Memorial Boxing on Sunday. Lovlina Borgohain ousted Brazil’s Soarez Beatriz with a scintillating display which fetched the first medal for the women boxers.
Manju Rani, too, thrashed Bonati Roberta of Italy in a one-sided encounter, defeating her counterpart 5-0, and the reigning National Champion sealed a spot in the 48 kg semi-finals. Former world junior champion Nikhat Zareen and Neeraj also emerged victorious, outclassing their opponents with similar results in their respective categories to assure five medals for India by the end of the day.
However, 2016 World Championship silver medallist Sonia Lather went down 1-4 to Ramirez Yarisel of the USA in the 57kg category. In the men’s round, Ulaan Baatar Cup gold winner Ankush Dahiya was defeated by Macedonia’s Jasin Ljama. India’s best haul at the event was 11 medals, which included two gold medals.
| english |
What Is The Difference Between Momos And Dumplings?
There are a lot of similarities between momos and dumplings, but there are also some key differences that you need to know about if you want to order the right dish at a Chinese restaurant or make them at home. Here are the differences between momos and dumplings.
Momos and dumplings are beloved dishes in many cultures around the world and are commonly served as appetizers, snacks, or even as part of a full meal. The two Asian dishes are beloved by the general populace for their savory fillings and for their convenience; the ability to grab a dumpling or a momo on the go is appealing to many. Because of their diversity and ease of eating, these dishes have found success in a variety of cultures, ranging from the Himalayas to Chinatowns worldwide.
Because of their almost identical appearances, the two are often confused for one another. Despite their commonalities, momos and dumplings differ significantly in terms of taste, texture, and fillings.
Here's how momos and dumplings are different:
"Momo" comes from the Tibetan phrase "mog mog," which means "flour-related food." The origin of momos can be traced back to Tibet in the 17th century. From Tibet, it traveled to Nepal via the Newar ethnic group. Typically stuffed with yak meat, potatoes, and cheese on occasion, momos began to include vegetables as their fillings when they reached India, which has the largest population of vegetarians in the world.
Momos are a street food, often served steamed, and a common everyday treat. These are typically bite-sized snacks made of thin dough that is wrapped around a savory filling and steamed. Momos can also be deep-fried or pan-fried and served with chilli garlic chutney. In Kathmandu, these are served in a Jhol gravy sauce and are referred to as Jhol Momo. One important thing to remember is that momos always come with fillings.
Dumplings are a staple of popular Asian cuisine and can be made in a number of different ways, such as fried, steamed, or boiled. There are nearly 200 variants of dumplings to choose from; the most prominent ones include pork and chicken dumplings. However, unlike momos, dumplings may or may not come with fillings.
Also Read | Do You Know The Difference Between Ice Cream And Frozen Desserts?
The word "dumpling" is an umbrella term that usually refers to snacks with fillings wrapped in dough. So accordingly, momos are dumplings of sorts. Italian ravioli and Indian samosas and kachoris can also be classified as dumplings.
- Dumplings originated in China, whereas momos originated in Tibet and then spread to its neighboring countries, Nepal and India.
- Momos always come with fillings, whether meat or vegetables; dumplings, on the other hand, may or may not have fillings.
Also Read | How Are Cupcakes and Muffins Different?
To sum it up, though both dishes are savory and delicious, they differ in ingredients, preparation methods, and cultural origins. The main difference between momos and dumplings is their origin, as well as the fact that momos always come with fillings, while dumplings may or may not.
You May Also Like | Cappuccino Vs. Latte: What's The Difference?
| english |
To be a public figure isn’t all fun and games. While there are obviously undeniable financial perks, many celebrities have to deal with extremes that can really affect their mental well-being. They might have a larger-than-life lifestyle, but the constant scrutiny they are under can be very exhausting. Imagine every move of yours being watched and dissected on the internet for days on end; sounds pretty horrifying, right?
Some of these popular figures have to face the ire of the public for the most bizarre reasons. Let’s take a look at 13 celebs who receive more hate than they actually deserve.
Pop sensation Britney Spears' life was made into a public debacle in the 2000s, when media and fans painted her as problematic and arrogant. Little did anyone know that she was silently battling her own demons. Details of her 13-year conservatorship and the docuseries Framing Britney shed light on why she went spiralling out of control. Many fans and celebs have since come out and apologized for passing judgement on her without knowing her real story.
Ever since her stint in the Twilight series, Stewart has been constantly trolled for her lack of expressions and flat acting skills. While Bella Swan, the character she played, was said to be dull, she has proved time and again that she possesses impeccable acting skills, which the world got to witness in her recent films. Of course, her big revelation of cheating on her then-beau Robert Pattinson aggravated the hate. Despite all of this, it's high time that everyone moves on.
Then-president of the United States of America Bill Clinton became the eye of the storm when news of his scandalous affair with intern Monica Lewinsky broke. She was called all kinds of names and was deemed a home wrecker. However, Clinton did not receive the same amount of flak as she did. What’s more, even the scandal was named after her. Over the years, Monica became the butt of jokes, but she didn’t deserve such extreme hatred because there were two people in that affair.
If you are surprised to see this name on the list, believe me you are not alone. There was a time when a large group spread all kinds of hate about the Devil Wears Prada actress. The reasons remain unknown till date. Was it because of the shoddy job she did of hosting the Oscars with James Franco or was it because her ex-boyfriend was indicted on fraud and money laundering charges? The mystery still stands unsolved.
The choti bahu as the desis call her, did the unthinkable when she and her beloved husband, Prince Harry decided to part ways and take a step back from the Royal duties. This move brought her a lot of hatred as people even went on to blame her for tearing the family apart. She then faced the ire of the public, when she did a tell-all interview with Oprah Winfrey describing her ordeal of living in the Royal palace and experiencing a toxic and hostile environment. Whatever happened to living as per her own will!
People went ahead and dissected her career by saying that she didn’t deserve fame or success as her songs are only based on her own heartbreaks. Even though many artists create songs on the basis of their own experience, when she did the same she was called all sorts of names.
Agreed that he doesn’t have the best reputation and he did some questionable deeds in his early days but the hatred that he gets is unreasonable. Firstly, he was massively trolled for his voice. Some called it girly, others wrote him off for good. He didn’t help his cause as he frequently mistreated his fans and had run-ins with the law.
However, it would have been better if people had cut him some slack. Not only did he rocket to fame at a young age, he was also made to deal with the ramifications in a very harsh manner. Since then, he has reformed tremendously and has been charitable towards many causes. Even his behaviour has improved ten-fold.
The ill-fated actress had to bear the brunt of her boyfriend’s suicide. So much so that her character was assassinated and her family was vilified without any proof of her involvement in Sushant Singh Rajput’s case. The entire country turned against the woman and convicted her of the crime without any evidence. Forget her career, even her mental health took a big fall as she had to remain in custody for almost 2 months. As the case still remains unsolved, the lady in question is trying to collect the fragments of her life to experience normalcy after such a horrifying time.
A stellar actress with a knack for good content, Anushka Sharma was everyone’s favourite until she married Indian cricket team captain Virat Kohli. To her dismay, she received hatred whenever India lost a crucial match, with people blaming her for ‘distracting’ Kohli. Some even went ahead and urged the cricket board to ban her from attending any games in the future. If this doesn’t define obnoxious behaviour, then I don’t know what will!
She made waves with her stint in Bigg Boss. Right from the beginning, she openly spoke about her career choice of doing porn films in the West. Still, people trolled her mercilessly and blamed her for spoiling the minds of the youth. She was called names and treated with so much disrespect by interviewers and the general public alike. She still braved the storm and maintained a cool approach and gradually shed the image of a pornstar to become a part of the showbiz.
To no fault of his own, Junior Bachchan has been brutally trolled for being the legendary actor Amitabh Bachchan’s son. All nepotism chatter aside, people even lashed out at him for not becoming as successful as his father! Some shameless trolls on the internet keep discrediting his work and taunt him on his unsuccessful attempts. Many others question his earnings and call him ‘lucky’ for having a super successful father and wife Aishwarya Rai by his side. Abhishek Bachchan keeps a cool head most times and gives it back to them with witty responses.
Such is our society that celebs are trolled for not having an opinion on social affairs and then again for having certain opinions on the same. Bhasker is often trolled for her political views, so much so that she is called all sorts of demeaning names. She is still trolled for her bold masturbation scene in Veere Di Wedding, simply for choosing to do something out-of-the-box.
Sonam is one actress who constantly makes headlines for all the wrong reasons. Foot-in-the-mouth situations were a regular occurrence for Sonam when she had started her career. Over the years, she has learned the art of diplomacy. From being a star’s daughter to her fashion sense, people troll her for anything and everything. Some even question her acting prowess. Interestingly, Sonam takes all of this with a pinch of salt and has often laughed it off but it’s high time people let her be. | english |
Diet Sabya and their followers believe that Malaika Arora had the best walk at the Lakme Fashion Week and even compared the star's showstopper moment with Jennifer Lopez's iconic ramp moment in the green Versace dress. They posted a video on Instagram and wrote, "Malla just owned the ramp. "
This week, many big names attended the Lakme Fashion Week and some turned showstoppers for designers. Malaika Arora is one of them as she walked the ramp for the clothing label Limerick by Abirr and Nanki. The star made heads turn at the fashion week with her glamorous moment on stage and even got a compliment from Diet Sabya - an anonymous Instagram account dedicated to exposing imitations, appropriations and blatant copies in fashion. Diet Sabya and their followers announced that the star had the 'best walk' at the fashion week. They even compared it to Jennifer Lopez's iconic walk in the green Versace dress she wore at the 42nd Grammy Awards in 2000 and then again on the runway 20 years later.
On Saturday, Diet Sabya posted a video featuring Malaika Arora and Jennifer Lopez's ramp walk with the caption, "Same energy. Best walk at fashion week! Haters can argue with the wall." Their followers and Malaika's fans agreed with the statement and rushed to the comments section to shower the Chaiyya Chaiyya Girl with compliments. While the video showed Malaika walking the ramp in a bralette, skirt and a flowy cape jacket, Jennifer Lopez's clip is from a Versace show that celebrated the 20th anniversary of her green 'Jungle' dress. Check out the video below and see the two divas' iconic and unmatchable energies on the ramp. (Also Read: Malaika Arora and Kareena Kapoor take over London in stylish fits: See pics here)
Netizens flooded the comments section of the post with compliments for Malaika. One user wrote, "The iconic green dress and Malaika. You are so on point with the observation of the energy of both these women on the ramp. " Another added, "Malla just owned the ramp. Learn from her. " Some called her 'the best', and others said they 'loved Malaika's confidence'. "She's the moment right now," one user commented.
While many fans complimented Malaika's ramp walk, some thought that her outfit restricted her from showing her true potential. "Yes, good energy but looked like Malaika's skirt wasn't letting her take the big bold steps like JLo. Malla's vibe was big bold though," one netizen wrote.
What do you think of Malaika Arora's ramp walk as the showstopper?
Malaika Arora is currently dating Arjun Kapoor. The couple made their relationship official in 2019. | english |
Someone has put a small heap of yellow rice on a raised cemented stand for an electricity transformer. A street dog, shocked and shivering, sat there but didn’t touch it. A few are clinging together on the top of a broken brick wall, trying to survive in the flooded surroundings. Unlike for humans, the boats and makeshift rafts do not stop for them. What used to be a bustling lane that meanders deep through this old neighbourhood four days ago, today looks like a river.
With a group of local volunteers, on a makeshift raft made out of foam mattresses, a team of The Indian Express entered the old city, wading through chest-deep water to witness rescue and relief efforts. At the entry of this neighbourhood, there were floating carcasses of dogs and cattle, emanating an intense foul smell. An Army boat was slowly returning with a group of men and women, and both the soldiers and the survivors looked exhausted. It was probably the first time that the Armymen were seen without arms in this neighbourhood.
A few yards ahead, a woman appeared on the window and later dropped a utensil tied to a rope. Two men sitting on a makeshift raft, made of two empty plastic tanks, stopped and filled it with milk. She and her family members were not able to leave when the water was dangerously high. Once the water started to recede, the family decided to stay put as the other option was to live on the pavement of a near by highway, where several other survivors have erected tents.
As we reached Shah Faisal Abad Lane, distributing food items among the stranded residents, someone shouted from a window that water level in the lane was high. Mohammed Shafi Shala’s family hadn’t been rescued and he was desperately looking for baby food. There were two toddlers in the house who had almost nothing to eat for last two days.
Unlike the posh localities such as Jawaharnagar and Rajbagh, this old neighbourhood has the feel of a village. Several families here rear cattle, especially cows. “There was no rescue here. But we did get occasional food packages that were hurled from the road since yesterday,’’ said Mohammad Iqbal Bhat. “We had asked Armymen and volunteers to bring some cattle feed. The cow was mooing loudly.” Bhat said his cow and a calf died of hunger. “We didn’t know what to do, so we have kept their carcasses inside only.” His neighbour Abdul Raheem Dar too has lost two of his cows.
A few hundred metres ahead, more than 50 people have taken refuge in a mosque. “We don’t have water to drink. Please give us some water,’’ a man shouts. The rescue team had carried a few cartons of drinking water along so they somehow climbed the fence wall and put it on a dry corner.
In fact, several people trapped in this congested locality said they don’t have drinking water and complained that no rescuer came this way. “No rescue boat tried to come into this narrow lane. They wouldn’t leave the road,’’ said Mohammad Sidiq Dar, who had come out on the verandah of his top floor. A few yards ahead, a man was in distress. Two children in his family had had fever for two days and he had no way to get medical help. In fact, three other families too said that their children were ill.
At the edge of this narrow lane, a frail woman was waving frantically. When the volunteers reached close to the window amid neck-deep water, she was in tears. Mubina from Bihar is married to Muneer from Lolab and they have three children.
The volunteers put the family on the two mattress rafts and start the journey back to the dry patch on the highway. | english |
<gh_stars>10-100
["localeresolver","loopback-policy","node-red-contrib-nools","owasp-threat-dragon-core","the-gathering","transparent-tank"] | json |
<filename>apps/api/src/db/models/system/user-physical-data.ts
import { BelongsTo, Column, DataType, ForeignKey, Table } from 'sequelize-typescript';
import { Sex, WeightTarget, UserPhysicalDataAttributes } from '@common/types/models';
import BaseModel from '../model';
import { User } from '.';
@Table({
modelName: 'UserPhysicalData',
tableName: 'user_physical_data',
freezeTableName: true,
timestamps: false,
underscored: true,
})
export default class UserPhysicalData
extends BaseModel<UserPhysicalDataAttributes>
implements UserPhysicalDataAttributes
{
@Column({
allowNull: false,
primaryKey: true,
type: DataType.BIGINT,
})
@ForeignKey(() => User)
public userId!: string;
@Column({
allowNull: true,
type: DataType.STRING(64),
})
public sex!: Sex | null;
@Column({
allowNull: true,
type: DataType.DOUBLE,
})
public weightKg!: number | null;
@Column({
allowNull: true,
type: DataType.DOUBLE,
})
public heightCm!: number | null;
@Column({
allowNull: true,
type: DataType.BIGINT,
})
public physicalActivityLevelId!: number | null;
@Column({
allowNull: true,
type: DataType.DATEONLY,
})
public birthdate!: Date | null;
@Column({
allowNull: true,
type: DataType.STRING(64),
})
public weightTarget!: WeightTarget | null;
@BelongsTo(() => User, 'userId')
public user?: User;
}
| typescript |
package in.rrapps.mvpdaggertesting.api;
import com.google.gson.FieldNamingPolicy;
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import java.util.concurrent.TimeUnit;
import javax.inject.Singleton;
import dagger.Module;
import dagger.Provides;
import io.reactivex.schedulers.Schedulers;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.logging.HttpLoggingInterceptor;
import retrofit2.Retrofit;
import retrofit2.adapter.rxjava2.RxJava2CallAdapterFactory;
import retrofit2.converter.gson.GsonConverterFactory;
@Module
public class ApiModule
{
@Provides
@Singleton
ApiService providesApiService()
{
OkHttpClient.Builder builder = new OkHttpClient.Builder();
builder.hostnameVerifier((str, sslSession) -> true);
builder.connectTimeout(30, TimeUnit.SECONDS);
builder.readTimeout(30, TimeUnit.SECONDS);
builder.writeTimeout(30, TimeUnit.SECONDS);
// Add headers
builder.interceptors().add(chain -> {
Request request = chain.request();
request = request.newBuilder()
.build();
return chain.proceed(request);
});
// Logging
HttpLoggingInterceptor interceptor = new HttpLoggingInterceptor();
interceptor.setLevel(HttpLoggingInterceptor.Level.BODY);
builder.interceptors().add(interceptor);
Gson gson = new GsonBuilder()
.setFieldNamingPolicy(FieldNamingPolicy.LOWER_CASE_WITH_UNDERSCORES)
.setDateFormat("yyyy-MM-dd")
.create();
RxJava2CallAdapterFactory rxAdapter = RxJava2CallAdapterFactory
.createWithScheduler(Schedulers.io());
Retrofit retrofit = new Retrofit.Builder()
// NOTE: Retrofit's build() throws unless a base URL is set here.
// .baseUrl(BuildConfig.API_URL)
.client(builder.build())
.addConverterFactory(GsonConverterFactory.create(gson))
.addCallAdapterFactory(rxAdapter)
.build();
return retrofit.create(ApiService.class);
}
} | java |
Hyderabad, Aug 15: Telangana, the youngest state of India, on Monday celebrated the 70th Independence Day with gaiety and patriotic fervour. Chief Minister K. Chandrasekhar Rao hoisted the national flag at a colourful ceremony at the historic Golconda Fort here. Before unfurling the tricolour, Rao was presented a guard of honour by a police contingent.

The number of policemen and students participating in the parade was limited due to lack of space. After paying tributes to martyrs at a memorial at Parade Ground in Secunderabad, Rao drove to Golconda Fort for the main event. This is the third year in a row that the main official function was held at the fort. As in the previous two years, there was no display of tableaux due to space constraints.

The fort wore a festive look as the celebrations highlighted the rich culture of the state. Drummers, tribal and other artists performed on the ramparts to add colour to the festivities. KCR, as Rao is popularly known, addressed the gathering, highlighting the achievements of his government during the last two years. KCR's cabinet colleagues, Chief Secretary Rajiv Sharma, Director General of Police Anurag Sharma and other top civil, police and military officials attended the ceremony.

After formation of the state in 2014, KCR had departed from the over five-decade-old tradition of Independence Day celebrations at the Parade Ground, choosing the ancient fort as the venue to showcase the rich legacy of Telangana. The Independence Day celebrations were also held at the state assembly, Hyderabad High Court and offices of all political parties. Telangana Rashtra Samithi (TRS) leader and Home Minister N. Narasimha Reddy unfurled the national flag at Telangana Bhavan, the headquarters of the ruling party. Patriotic fervour marked the celebrations at government offices and educational institutions. Independence Day was also celebrated with gaiety in the nine other districts of Telangana.
State ministers unfurled the national flag at official functions held in district headquarters. | english |
The NBA is one of the toughest leagues to get into. Thousands of kids dream of getting into the NBA every year, but it’s much easier said than done. Since making it into the NBA is not possible for all, some people devote their lives to admiring their favorite players and/or teams.
A lot of the time, one player alone can result in a sell-out crowd. Kobe Bryant was one of those players. The Black Mamba was one fierce competitor and always had the burning desire inside him to win. No matter what the situation was on the court, Kobe would always give his best.
The Mamba had millions of fans, ranging from young kids to old people. He never wanted to let any of them down, and understandably so. Once, after his retirement, Bryant was asked why he never missed games, despite being sick or having ankle sprains.
In the current NBA, we see many players being given ‘rest’ during a long road trip, or during back-to-back games. However, things weren’t the same earlier, and Kobe Bryant is a stellar example of that. We witnessed the Black Mamba playing through sickness, injuries, or even sprains.
Once, he was asked why he did so, and he had a simple answer.
The Mamba never wanted to let any of his fans down, even if it meant taking a toll on his body. Kobe had a deep love for the game, and was always looking to pass it ahead to the next generation.
| english |
{"starship_id":9,"name":"Executor","model":"Executor-class star dreadnought","manufacturer":"Kuat Drive Yards, Fondor Shipyards","cost_in_credits":1143350000,"length":19000,"max_atmosphering_speed":0,"crew":"279,144","passengers":38000,"cargo_capacity":250000000,"consumables":"6 years","hyperdrive_rating":2,"MGLT":40,"starship_class":0,"pilots":[],"films":[2,3],"created":"2014-12-15 12:31:42","edited":"2014-12-20 21:23:49"} | json |
Civil police officer (CPO) Ranjith Kumar, who ran in front of an ambulance to make way for it through the traffic on a congested Kottayam city road, has made it to the cast of Mollywood flick 'Viral 2019'.
The video, which won him praise from various walks of life, has now opened the doors of Mollywood for Ranjith. Naushad Alathur, who had produced movies like Aadupuliyatam, Thoppil Joppan, Kuttanadan Maarpapa, has offered the CPO a role in his upcoming movie titled 'Viral 2019'.
Ever since the announcement, 'Viral 2019' has been in the spotlight for its unique casting. The first one to enter the cast was Hanan Hamid, a student who became an internet sensation when pictures of her selling fish in school uniform surfaced online. Another girl who landed a role was Soubhagya, who broke her leg in an accident a day before she was to participate in the Kerala School Kalolsavam.
Another signing for the movie is Pranav, known for his singing talents and his ability to draw using legs. Television talent show-fame Hasna will also be seen in the movie. Unni R, a singer who won the heart of actor Kamal Hassan with his exceptional talent in singing, is another name in the cast.
Malayalam television personality Sethulakshmi, who struggled financially to meet her son's medical expenses, will also be seen in the movie. According to the team of Viral 2019, she was also given financial aid.
Songs of the movie will be penned by Mariamma teacher, the mother who delivered a heart-wrenching eulogy for her son. The bond between the mother and son moved netizens and the video went viral in 2018. Chandralekha, who became a social media singing sensation with song 'Rajahamsame', will croon for the movie along with singer Abhijith.
'Viral 2019' will have eight directors. But the unique feature is that all these directors were shortlisted on the basis of voting. It's the people who choose the directors and the scriptwriters of the movie.
The movie's first poster was unveiled at the Cochin International Airport and the auditions for the selections of rest of the actors will take place in three sessions.
The first audition will be on February 1, 2019 at the Angamaly De Paul Institute of Science and Technology (DiST), open to men between the ages of 16 and 50.
The second audition will be held in Indiranagar (Bengaluru) at the East Cultural Association Club (ECA Club) on February 10. This audition will be open to both men and women.
The final audition will be held on the February 17 at the IMA Hall, Kaloor in Ernakulam. This audition will be only for women and transgenders.
Performances of those attending the auditions will be uploaded on the official page of 'Viral 2019' and the cast will also be finalised via voting. This innovative movie will be based purely on popular consent and is expected to start rolling by the month of May.
<filename>i18n/tr.json<gh_stars>0
{
"@metadata": {
"authors": [
"BaRaN6161 TURK"
]
},
"article-action-sections": "Bölümler",
"article-action-quickfacts": "Kısa Bilgiler",
"article-language-available": "$1 mevcut {{PLURAL:$1|dil|dil}}",
"article-loading-message": "Sayfa yükleniyor...",
"app-title": "Vikipedi",
"app-subtitle": "Özgür ansiklopedi",
"app-description": "Ücretsiz Vikipedi uygulamasıyla, 300'den fazla dilde dünyanın her yerinden bilgi arayın ve keşfedin.",
"confirm-section": "\"$1\" Bölümüne Gidin",
"header-menu": "Makale menüsü",
"header-search": "Arama sonuçları",
"header-sections": "Bölümler",
"header-settings": "Ayarlar",
"header-textsize": "Metin boyutu",
"menu-previous": "Önceki makale",
"menu-language": "Dil",
"menu-textsize": "Metin boyutu",
"softkey-done": "Tamamlandı",
"softkey-search": "Ara",
"search-language-placeholder": "Dil ara",
"softkey-menu": "Menü",
"search-placeholder": "Ara",
"softkey-settings": "Ayarlar",
"centerkey-select": "Seç",
"softkey-close": "Kapat",
"language-change": "Uygulama dilini değiştir",
"language-setting": "Dil ayarları",
"language-setting-message": "Burada dili değiştirmek uygulama genelinde dili değiştirir. Makale menüsünü kullanarak her makalenin dilini de değiştirebilirsiniz.",
"softkey-ok": "Tamam",
"softkey-read": "Oku",
"reference-title": "Referans [$1]",
"offline-message": "İnternet bağlantısı yok",
"settings-language": "Dil",
"settings-textsize": "Metin boyutu",
"settings-about-wikipedia": "Vikipedi hakkında",
"settings-privacy": "Gizlilik politikası",
"settings-term": "Kullanım koşulları",
"settings-rate": "Uygulamayı puanla",
"settings-help-feedback": "Yardım ve geri bildirim",
"settings-about-app": "Uygulama hakkında",
"textsize-increase": "4'e basın (Boyutu artır)",
"textsize-decrease": "6'ya basın (Boyutu küçült)",
"textsize-default": "5'e basın (Varsayılan)"
}
| json |
<reponame>kevinyliu/assets<filename>blockchains/solana/assets/4Fo67MYQpVhZj9R7jQTd63FPAnWbPpaafAUxsMGX2geP/info.json
{
"name": "DAI (Portal from Polygon)",
"type": "SPL",
"symbol": "DAIpo",
"decimals": 8,
"description": "Cross Chain Portal Bridged Token",
"website": "https://makerdao.com/",
"explorer": "https://solscan.io/token/<KEY>",
"status": "active",
"id": "<KEY>",
"tags": [
"wrapped"
]
} | json |
<gh_stars>0
// https://wwwimages2.adobe.com/content/dam/acom/en/products/speedgrade/cc/pdfs/cube-lut-specification-1.0.pdf
import {
Loader,
FileLoader,
Vector3,
DataTexture,
DataTexture3D,
RGBFormat,
UnsignedByteType,
ClampToEdgeWrapping,
LinearFilter,
} from '../../three.module.js';
export class LUTCubeLoader extends Loader {
load( url, onLoad, onProgress, onError ) {
const loader = new FileLoader( this.manager );
loader.setPath( this.path );
loader.setResponseType( 'text' );
loader.load( url, text => {
try {
onLoad( this.parse( text ) );
} catch ( e ) {
if ( onError ) {
onError( e );
} else {
console.error( e );
}
this.manager.itemError( url );
}
}, onProgress, onError );
}
parse( str ) {
// Remove empty lines and comments
str = str
.replace( /^#.*?(\n|\r)/gm, '' )
.replace( /^\s*?(\n|\r)/gm, '' )
.trim();
let title = null;
let size = null;
const domainMin = new Vector3( 0, 0, 0 );
const domainMax = new Vector3( 1, 1, 1 );
const lines = str.split( /[\n\r]+/g );
let data = null;
let currIndex = 0;
for ( let i = 0, l = lines.length; i < l; i ++ ) {
const line = lines[ i ].trim();
const split = line.split( /\s/g );
switch ( split[ 0 ] ) {
case 'TITLE':
title = line.substring( 7, line.length - 1 );
break;
case 'LUT_3D_SIZE':
// TODO: A .CUBE LUT file specifies floating point values and could be represented with
// more precision than can be captured with Uint8Array.
const sizeToken = split[ 1 ];
size = parseFloat( sizeToken );
data = new Uint8Array( size * size * size * 3 );
break;
case 'DOMAIN_MIN':
domainMin.x = parseFloat( split[ 1 ] );
domainMin.y = parseFloat( split[ 2 ] );
domainMin.z = parseFloat( split[ 3 ] );
break;
case 'DOMAIN_MAX':
domainMax.x = parseFloat( split[ 1 ] );
domainMax.y = parseFloat( split[ 2 ] );
domainMax.z = parseFloat( split[ 3 ] );
break;
default:
const r = parseFloat( split[ 0 ] );
const g = parseFloat( split[ 1 ] );
const b = parseFloat( split[ 2 ] );
if (
r > 1.0 || r < 0.0 ||
g > 1.0 || g < 0.0 ||
b > 1.0 || b < 0.0
) {
throw new Error( 'LUTCubeLoader : Non normalized values not supported.' );
}
data[ currIndex + 0 ] = r * 255;
data[ currIndex + 1 ] = g * 255;
data[ currIndex + 2 ] = b * 255;
currIndex += 3;
}
}
const texture = new DataTexture();
texture.image.data = data;
texture.image.width = size;
texture.image.height = size * size;
texture.format = RGBFormat;
texture.type = UnsignedByteType;
texture.magFilter = LinearFilter;
texture.minFilter = LinearFilter;
texture.wrapS = ClampToEdgeWrapping;
texture.wrapT = ClampToEdgeWrapping;
texture.generateMipmaps = false;
const texture3D = new DataTexture3D();
texture3D.image.data = data;
texture3D.image.width = size;
texture3D.image.height = size;
texture3D.image.depth = size;
texture3D.format = RGBFormat;
texture3D.type = UnsignedByteType;
texture3D.magFilter = LinearFilter;
texture3D.minFilter = LinearFilter;
texture3D.wrapS = ClampToEdgeWrapping;
texture3D.wrapT = ClampToEdgeWrapping;
texture3D.wrapR = ClampToEdgeWrapping;
texture3D.generateMipmaps = false;
return {
title,
size,
domainMin,
domainMax,
texture,
texture3D,
};
}
}
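As a side note on the layout the parser above produces: in the .cube format the red coordinate varies fastest, then green, then blue, so the sequential writes place lattice entry (r, g, b) of an N-point LUT at flat byte offset 3 * (r + g*N + b*N*N). A small self-contained sketch of that index math (the helper name is hypothetical, not part of three.js):

```javascript
// Offset of lattice entry (r, g, b) in a flat RGB byte array,
// matching the sequential write order used by LUTCubeLoader.parse.
function lutEntryOffset(r, g, b, size) {
	return 3 * (r + g * size + b * size * size);
}

// Build a tiny 2-point lattice in the same order the parser writes it:
// red varies fastest, then green, then blue.
const size = 2;
const data = new Uint8Array(size * size * size * 3);
let i = 0;
for (let b = 0; b < size; b++) {
	for (let g = 0; g < size; g++) {
		for (let r = 0; r < size; r++) {
			data[i + 0] = r * 255;
			data[i + 1] = g * 255;
			data[i + 2] = b * 255;
			i += 3;
		}
	}
}

// Look up the corner with red = 1, green = 0, blue = 1.
const off = lutEntryOffset(1, 0, 1, size);
console.log(data[off], data[off + 1], data[off + 2]); // 255 0 255
```

The same offset formula is what a sampler would use when interpolating between lattice points, which is why the loader exposes the data both as a 2D texture (width `size`, height `size * size`) and as a true 3D texture.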
| javascript |
<filename>packages/webpack-ui/src/pages/default/chunks-tree.ts<gh_stars>0
import { ModuleItemConfig, moduleItemConfig } from './modules-tree';
import { AssetItemConfig, assetItemConfig } from './assets-tree';
import { PackageItemConfig, packageItemConfig } from './packages-tree';
export type ChunkTree = Record<string, unknown>;
export default (hash?: string): ChunkTree => {
return {
view: 'tree',
expanded: false,
limitLines: '= settingListItemsLimit()',
itemConfig: chunkItemConfig(void 0, hash),
};
};
export type ChunkItemConfig = Record<string, unknown>;
export function chunkItemConfig(getter = '$', hash = '#.params.hash'): ChunkItemConfig {
return {
limit: '= settingListItemsLimit()',
content: {
view: 'chunk-item',
data: `{
chunk: ${getter},
hash: ${hash},
match: #.filter
}`,
},
children: `
$reasonModules:origins.resolvedModule.[].[not shouldHideModule()];
$chunkModules:..modules.[not shouldHideModule()];
$chunkModulesPackages:$chunkModules.(resolvedResource.nodeModule()).[].(name.resolvePackage(${hash}));
$chunkPackages:$chunkModulesPackages.({name: name, instances: instances.[modules.[$ in $chunkModules]]});
$modules:modules.[not shouldHideModule()];
[{
title: "Reasons",
reasons: $reasonModules,
data: $reasonModules.chunks.sort(initial desc, entry desc, size desc),
visible: $reasonModules,
type: 'reasons'
}, {
title: "Modules",
// todo: wait contexts and filter modules by current chunk
data: $modules.sort(getModuleSize(${hash}).size desc),
visible: $modules,
type: 'modules'
}, {
title: "Packages",
data: $chunkPackages.sort(instances.size() desc, name asc),
visible: $chunkPackages,
type: 'packages'
}, {
title: "Assets",
data: files.[].sort(isOverSizeLimit asc, getAssetSize(${hash}).size desc),
visible: files.[],
type: 'assets'
}].[visible]`,
itemConfig: {
view: 'switch',
content: [
{
when: 'type="modules"',
content: {
view: 'tree-leaf',
content: [
'text:title',
{
when: 'data',
view: 'badge',
className: 'hack-badge-margin-left',
data: `{text: data.size()}`,
},
],
children: 'data',
limit: '= settingListItemsLimit()',
get itemConfig(): ModuleItemConfig {
return moduleItemConfig(void 0, hash);
},
},
},
{
when: 'type="reasons"',
content: {
view: 'tree-leaf',
content: 'text:title',
children: `
$reasonChunks:reasons.chunks;
[{
title: "Chunks",
reasons: reasons,
data: $reasonChunks,
visible: $reasonChunks,
type: 'chunks'
}, {
title: "Modules",
data: reasons,
visible: reasons,
type: 'modules'
}].[visible]`,
itemConfig: {
view: 'switch',
content: [
{
when: 'type="chunks"',
content: {
view: 'tree-leaf',
content: [
'text:title',
{
when: 'data',
view: 'badge',
className: 'hack-badge-margin-left',
data: `{text: data.size()}`,
},
],
children: `data`,
limit: '= settingListItemsLimit()',
get itemConfig(): ChunkItemConfig {
return chunkItemConfig();
},
},
},
{
when: 'type="modules"',
content: {
view: 'tree-leaf',
content: [
'text:title',
{
when: 'data',
view: 'badge',
className: 'hack-badge-margin-left',
data: `{text: data.size()}`,
},
],
children: 'data',
limit: '= settingListItemsLimit()',
get itemConfig(): ModuleItemConfig {
return moduleItemConfig(void 0, hash);
},
},
},
],
},
},
},
{
when: 'type="packages"',
content: {
view: 'tree-leaf',
content: [
'text:title',
{
when: 'data',
view: 'badge',
className: 'hack-badge-margin-left',
data: `{text: data.size()}`,
},
],
children: 'data',
limit: '= settingListItemsLimit()',
get itemConfig(): PackageItemConfig {
return packageItemConfig(hash);
},
},
},
{
when: 'type="assets"',
content: {
view: 'tree-leaf',
content: [
'text:title',
{
when: 'data',
view: 'badge',
className: 'hack-badge-margin-left',
data: `{
text: data.size(),
postfix: data.reduce(=> $$ + getAssetSize(${hash}).size, 0).formatSize()
}`,
},
],
children: 'data',
limit: '= settingListItemsLimit()',
get itemConfig(): AssetItemConfig {
return assetItemConfig(void 0, hash);
},
},
},
],
},
};
}
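The `get itemConfig()` accessors above are what let `chunkItemConfig` reference itself (and `moduleItemConfig`) without recursing at construction time. A minimal dependency-free sketch of that lazy-getter pattern (the names here are illustrative, not part of the real view DSL):

```typescript
type NodeConfig = { view: string; readonly itemConfig: NodeConfig };

function nodeConfig(): NodeConfig {
  return {
    view: 'tree-leaf',
    // The getter defers the recursive call until a consumer actually walks
    // down a level, so building the config does not recurse infinitely.
    get itemConfig(): NodeConfig {
      return nodeConfig();
    },
  };
}

const root = nodeConfig();
```

Each access to `itemConfig` produces a fresh level of configuration on demand, which is why the recursive tree definitions above terminate.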
| typescript |
Israeli volunteers engaged in relief efforts after the devastating earthquakes in Turkey and Syria are harnessing cloud-based technology to enhance the emergency response, the Times of Israel reported.
The first responders of United Hatzalah, a non-profit emergency medical rescue organization, are sharing real-time data on location, videos and images with the help of Carbyne, an Israeli startup. The technology enables volunteers and emergency responders like police officials, to connect with medical experts and other professional teams in Israel, who offer quick, life-saving solutions to rescuers.
Zohar Eli, United Hatzalah’s director of call centers and technology, told the Israeli newspaper that Carbyne's advanced video features ensured a clearer view of the situation on the ground, helping experts provide better remote consultations.
The technology, which combines location services, live video chat, incident mapping, and other communication systems into one call-center dashboard, offers immediate assistance to responders entering turbulent situations.
Founded in 2014 by Alichai, Alex Dizengoff, Yony Yatsun, and Lital Leshem, Carbyne has raised $128 million in financing. According to the report, Carbyne's 'life-saving' solutions are utilised by law and order departments and governments in over 120 states worldwide.
Another Israeli startup - Sight Diagnostics - has said it will deploy its blood count devices and test kits at a field hospital in southeastern Turkey.
The National Aeronautics and Space Administration (NASA) on Saturday said it is working to share its aerial views and data from space to aid relief and recovery workers, as well as improve its ability to model and predict such events.
A team of scientists at the Earth Observatory of Singapore and NASA's Jet Propulsion Laboratory collected images before and after the earthquake and created a ‘damage proxy map’ for Turkey. The agency's Earth Science Applied Sciences program and its national and international collaborators make these proxy maps available to the US State Department, the California Seismic Safety Commission, the World Bank and Miyamoto Global Disaster Relief.
California Gov. Gavin Newsom says the partnership with Verily can be a national model for coronavirus screening.
Verily, the life sciences arm of Google's parent, Alphabet, late Sunday launched a website to give people information about coronavirus screening, though it's limited for now to two testing sites in the San Francisco Bay Area. The rollout comes after a set of confusing announcements last week by Google and President Donald Trump.
The software tool is hosted through Verily's Project Baseline, an initiative to advance clinical research. It allows people in the Bay Area to take online screener surveys to see if they should go to testing sites in Santa Clara county or San Mateo county for examinations. Along with partnering with the federal government, Verily also worked with the California governor's office.
"We hope that this partnership can scale, and we believe it will be a national model," California Gov. Gavin Newsom said at a press conference Sunday. "We are very encouraged by this partnership, very enthusiastic to finally announce it. I know there's been some conversations about it in the media."
When you enter the website, the first question in the workflow is: "Are you currently experiencing severe cough, shortness of breath, fever, or other concerning symptoms?" Answering "yes" to the question appears to end the test, while saying "no" takes you to the next question. That may seem counterintuitive, but Verily reportedly says the test is working as intended.
Verily didn't immediately respond to a request for comment, but told BuzzFeed: "This screener was developed in partnership with government health officials. The initial question is meant to ensure that anyone who is seriously ill does not come to our sites because they are not prepared to provide medical attention. We are early in this pilot and are going to be learning more that will help us refine this COVID-19 risk screening and testing."
To take the coronavirus screener, you are also required to have a Google account. If you don't have one, you're prompted to create one.
The effort comes as the coronavirus pandemic has severely impacted everyday life across the globe. At Google and Alphabet, all North American employees have been asked to work from home, and the company canceled its annual Google I/O developer conference. The confab, which had been scheduled for May, is Google's biggest event of the year.
The launch follows major confusion about the project and its scope, and Google and government officials earlier Sunday sought to clarify details about the website. The questions began on Friday, when Trump announced that Google was working with the White House and private sector partners on a website to give people information about coronavirus testing. Trump unveiled the project during an address at the White House at which he declared a national state of emergency over the COVID-19 pandemic.
More than an hour later, Google tweeted that Verily, the life sciences arm of Alphabet, is "developing a tool to help triage individuals for COVID-19 testing. Verily is in the early stages of development, and planning to roll testing out in the Bay Area, with the hope of expanding more broadly over time."
There seemed to be a disconnect between the two announcements, especially over the timing and scope. Trump said 1,700 Google engineers were working on the project, while Verily only has around 1,000 employees. It turned out Trump had "oversold" and "inflated the concept," according to a report by the New York Times.
On Saturday, Google followed up with a tweet confirming that it's "partnering with the US Government in developing a nationwide website that includes information about COVID-19 symptoms, risk and testing information."
| english |
In Indian illegally occupied Jammu and Kashmir, Indian troops, in their fresh act of state terrorism, martyred two Kashmiri youth in Pulwama district.
According to Kashmir Media Service, the troops martyred the youth during a massive cordon and search operation in Arigam area of the district. The operation was going on till last reports came in.
On the other hand, Indian police detained the All Parties Hurriyat Conference leader, Abdul Samad Inqilabi, in Bandipora district during a raid carried out at his residence in Sumbal area of the district.
Talking to media men, Samad Inqilabi’s family members said that the Indian forces’ personnel raided his house in Sumbal area of the district and took him along.
Meanwhile, APHC leader, Mukhtar Ahmad Waza, speaking in a condolence meeting in Islamabad town appealed to India and Pakistan to demonstrate wisdom and hammer out a peaceful resolution of the Kashmir dispute, which poses a threat to the peace of entire world. Both India and Pakistan are nuclear powers and any flare up can prove extremely detrimental for the regional and global peace, he warned.
| english |
#!/usr/bin/env python3
import logging
import sys
from .ToolChainExplorer import ToolChainExplorer
class ToolChainExplorerDFS(ToolChainExplorer):
def __init__(
self,
simgr,
max_length,
exp_dir,
nameFileShort,
worker,
):
super(ToolChainExplorerDFS, self).__init__(
simgr,
max_length,
exp_dir,
nameFileShort,
worker
)
self.log = logging.getLogger("ToolChainExplorerDFS")
self.log.setLevel("INFO")
def __take_longuest(self, simgr, source_stash):
"""
Take the state in source_stash with the largest number of steps and append it to the active stash
@pre : source_stash exists
"""
id_to_move = 0
max_step = 0
if len(simgr.stashes[source_stash]) > 0:
id_to_move = simgr.stashes[source_stash][0].globals["id"]
max_step = simgr.stashes[source_stash][0].globals["n_steps"]
else:
return
for s in simgr.stashes[source_stash]:
if s.globals["n_steps"] > max_step:
id_to_move = s.globals["id"]
max_step = s.globals["n_steps"]
simgr.move(source_stash, "active", lambda s: s.globals["id"] == id_to_move)
def step(self, simgr, stash="active", **kwargs):
try:
simgr = simgr.step(stash=stash, **kwargs)
except Exception as inst:
self.log.warning("ERROR IN STEP() - YOU ARE NOT SUPPOSED TO BE THERE !")
# self.log.warning(type(inst)) # the exception instance
self.log.warning(inst) # __str__ allows args to be printed directly,
exc_type, exc_obj, exc_tb = sys.exc_info()
# fname = os.path.split(exc_tb.tb_frame.f_code.co_filename)[1]
self.log.warning("%s %s", exc_type, exc_obj)  # logging expects a format string first
sys.exit(-1)
super().build_snapshot(simgr)
if self.print_sm_step and (
len(self.fork_stack) > 0 or len(simgr.deadended) > self.deadended
):
self.log.info(
"A new block of execution have been executed with changes in sim_manager."
)
self.log.info("Currently, simulation manager is :\n" + str(simgr))
self.log.info("pause stash len :" + str(len(self.pause_stash)))
if self.print_sm_step and len(self.fork_stack) > 0:
self.log.info("fork_stack : " + str(len(self.fork_stack)))
# if self.print_sm_step:
# self.log.info("len(self.loopBreak_stack) : " + str(len(self.loopBreak_stack)))
# self.log.info("state.globals['n_steps'] : " + str(state.globals['n_steps']))
# self.log.warning("STEP")
# We detect fork for a state
super().manage_fork(simgr)
# Remove states which performed more jumps than the allowed limit
super().remove_exceeded_jump(simgr)
# Manage ended state
super().manage_deadended(simgr)
super().mv_bad_active(simgr)
# import pdb; pdb.set_trace()
# If limit of simultaneous state is not reached and we have some states available in pause stash
if len(simgr.stashes["pause"]) > 0 and len(simgr.active) < self.max_simul_state:
moves = min(
self.max_simul_state - len(simgr.active),
len(simgr.stashes["pause"]),
)
for m in range(moves):
self.__take_longuest(simgr, "pause")
super().manage_pause(simgr)
super().drop_excessed_loop(simgr)
# If states end with errors, it is often worth investigating. Set DEBUG_ERROR to live debug
# TODO : add a log file if debug error is not activated
super().manage_error(simgr)
super().manage_unconstrained(simgr)
for vis in simgr.active:
self.dict_addr_vis[
str(super().check_constraint(vis, vis.history.jump_target))
] = 1
super().excessed_step_to_active(simgr)
super().excessed_loop_to_active(simgr)
super().time_evaluation(simgr)
return simgr
| python |
Tawang War Memorial is dedicated to the war heroes (soldiers of the Indian Army) of the Sino-Indian War of 1962. It is a 40-foot-high multi-hued memorial known as Namgyal Chorten. Overlooking the Tawang-Chu valley, this memorial was constructed to remember the sacrifices made by the soldiers of the Indian Army. The Dalai Lama visited this memorial, set on a slope, in 1997. Visitors can spot the names of 2,420 soldiers, engraved on granite plates, who sacrificed their lives during the war in the Kameng sector. Tawang War Memorial is one of the most popular tourist attractions of Arunachal Pradesh.
The war memorial has two rooms, one on each side. While one room displays maps, photographs, artifacts, etc., of the war heroes, the other is a sound-and-light showroom telling the tale of the heroism of the Indian soldiers. There is a museum too that displays things used at the time of the war, such as pots, bullets, guns, helmets, mugs, etc. There are newspaper cuttings and maps, too, pinpointing the placements of both armies.
Apart from this, there are many tourist attractions that you can cover in this town through our Tawang tour packages. Contact our tour experts to plan a holiday in Arunachal Pradesh and get unbeatable deals on fully-customizable tour packages.
| english |
<gh_stars>10-100
{"title": "Fast decoding for LDPC based distributed video coding.", "fields": ["low density parity check code", "list decoding", "cuda", "graphics processing unit", "speedup"], "abstract": "Distributed video coding (DVC) is a new coding paradigm targeting applications with the need for low-complexity encoding at the cost of a higher decoding complexity. In the DVC architecture based on a feedback channel, the high decoding complexity is mainly due to the request-decode operation with repetitively fixed step size (induced by Slepian-Wolf decoding). In this paper, a parallel message-passing decoding algorithm for low density parity check (LDPC) syndrome is applied through Compute Unified Device Architecture (CUDA) based on General-Purpose Graphics Processing Unit (GPGPU). Furthermore, we propose an approach to reduce the number of requests dubbed as Ladder Request Step Size (LRSS) which leads to more speedup gain. Experimental results show that, through our work, significant speedup in decoding time is achieved with negligible loss in rate-distortion (RD) performance.", "citation": "Citations (4)", "departments": ["National Taiwan University", "National Taiwan University", "National Taiwan University", "National Taiwan University"], "authors": ["Yu-Shan Pai.....http://dblp.org/pers/hd/p/Pai:Yu=Shan", "Han-Ping Cheng.....http://dblp.org/pers/hd/c/Cheng:Han=Ping", "Yun-Chung Shen.....http://dblp.org/pers/hd/s/Shen:Yun=Chung", "Ja-Ling Wu.....http://dblp.org/pers/hd/w/Wu:Ja=Ling"], "conf": "mm", "year": "2010", "pages": 4} | json |
<filename>packages/react-hooks/src/stateMachine.ts
// Copyright 2017-2021 @polkadot/apps, UseTech authors & contributors
// SPDX-License-Identifier: Apache-2.0
import { Machine } from 'xstate';
const marketplaceStateMachine = Machine({
id: 'marketplace',
initial: 'idle',
states: {
askPrice: {
on: {
ASK_PRICE_FAIL: 'waitForNftDeposit',
ASK_PRICE_SUCCESS: 'registerSale'
}
},
buy: {
on: {
SIGN_FAIL: 'loadingTokenInfo',
SIGN_SUCCESS: 'waitForSignMoneyTransfer',
WAIT_FOR_DEPOSIT: 'checkDepositReady'
}
},
cancelSell: {
on: {
CANCEL_SELL_FAIL: 'loadingTokenInfo',
CANCEL_SELL_SUCCESS: 'waitForTokenRevert'
}
},
checkDepositReady: {
on: {
DEPOSIT_FAIL: 'checkDepositReady',
DEPOSIT_SUCCESS: 'sentTokenToNewOwner'
}
},
idle: {
on: {
BUY: 'buy',
CANCEL: 'cancelSell',
REVERT_UNUSED_MONEY: 'revertMoney',
SELL: 'sell',
UPDATE_TOKEN_STATE: 'loadingTokenInfo'
}
},
loadingTokenInfo: {
on: {
WAIT_FOR_DEPOSIT: 'waitForNftDeposit',
WAIT_FOR_USER_ACTION: 'idle'
}
},
registerSale: {
on: {
REGISTER_SALE_FAIL: 'askPrice',
REGISTER_SALE_SUCCESS: 'loadingTokenInfo'
}
},
revertMoney: {
on: {
WITHDRAW_FAIL: 'loadingTokenInfo',
WITHDRAW_SUCCESS: 'loadingTokenInfo'
}
},
sell: {
on: {
TRANSACTION_READY: 'waitForSignTransfer',
TRANSFER_FAIL: 'loadingTokenInfo'
}
},
sentTokenToNewOwner: {
on: {
SIGN_FAIL: 'loadingTokenInfo',
SIGN_SUCCESS: 'waitForSignTokenBuy'
}
},
waitForNftDeposit: {
on: {
NFT_DEPOSIT_FAIL: 'waitForNftDeposit',
NFT_DEPOSIT_OTHER: 'loadingTokenInfo',
NFT_DEPOSIT_READY: 'askPrice'
}
},
waitForSignMoneyTransfer: {
on: {
SEND_MONEY_FAIL: 'loadingTokenInfo',
SEND_MONEY_SUCCESS: 'checkDepositReady'
}
},
waitForSignTokenBuy: {
on: {
SEND_TOKEN_FAIL: 'loadingTokenInfo',
SEND_TOKEN_SUCCESS: 'waitForTokenOwn'
}
},
waitForSignTransfer: {
on: {
TRANSFER_FAIL: 'loadingTokenInfo',
TRANSFER_SUCCESS: 'waitForNftDeposit'
}
},
waitForTokenOwn: {
on: {
TOKEN_REVERT_FAIL: 'waitForTokenOwn',
TOKEN_REVERT_SUCCESS: 'loadingTokenInfo'
}
},
waitForTokenRevert: {
on: {
TOKEN_REVERT_FAIL: 'waitForTokenRevert',
TOKEN_REVERT_SUCCESS: 'loadingTokenInfo'
}
}
}
});
export default marketplaceStateMachine;
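At runtime this chart would normally be driven by xstate's interpreter; conceptually, though, each `on` block is just a lookup from (state, event) to the next state. A dependency-free sketch of that semantics (this is not the xstate API, and only a slice of the chart is reproduced):

```typescript
// A small slice of the marketplace chart, reduced to a plain lookup table.
type Chart = Record<string, Record<string, string>>;

const chart: Chart = {
  idle: { BUY: 'buy', CANCEL: 'cancelSell', SELL: 'sell' },
  buy: { SIGN_FAIL: 'loadingTokenInfo', SIGN_SUCCESS: 'waitForSignMoneyTransfer' },
};

function transition(state: string, event: string): string {
  // Events with no entry for the current state leave the state unchanged,
  // which matches how a statechart ignores unhandled events.
  return chart[state]?.[event] ?? state;
}
```

This is also why every failure event above targets a concrete state such as `loadingTokenInfo`: the machine can never be left in an undefined state.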
| typescript |
{
"id": "d975-191",
"text": "-98-\nCONCLUSION\nIt is clear that we are on the brink of a new\nera. Communication, as we have come to under¬\nstand and relate to that phenomenon, takes on\na new and deeper implication for the future of\nmankind. We have dramatically demonstrated\nour technical superiority, over other nations\nof the world, in this area through the success¬\nful launching and operation of an internationally\nresponsible communications satellite system.\nWe are now engaged in a series of discussions,\nat the highest levels of government and industry,\nover the feasibility of establishing a Domestic\nNonprofit Communications Satellite System, similar\nin concept and design to the already successful\nEarlybird system.\nThe next few years will be filled with even greater\nrevelation. We will demonstrate technological\ncompetence on a scale hitherto only prophesied\nby man. Yet, in the face of imminent innovation,\nit is ironic that we have to plead for the in¬\nclusion of this area of expertise in the \"massive"
} | json |
<reponame>WAFoundation/wasmer
use crate::CpuFeature;
use crate::{resolve_imports, Export, InstantiationError, RuntimeError, Tunables};
use crate::{ArtifactCreate, Upcastable};
use std::any::Any;
use wasmer_types::entity::BoxedSlice;
use wasmer_types::{DataInitializer, FunctionIndex, LocalFunctionIndex, SignatureIndex};
use wasmer_vm::{
FuncDataRegistry, FunctionBodyPtr, InstanceAllocator, InstanceHandle, TrapHandler,
VMSharedSignatureIndex, VMTrampoline,
};
/// An `Artifact` is the product that the `Engine`
/// implementation produces and uses.
///
/// The `ArtifactRun` contains the extra information needed to run the
/// module at runtime, such as [`ModuleInfo`] and [`Features`].
pub trait Artifact: Send + Sync + Upcastable + ArtifactCreate {
/// Register this `Artifact`'s stack frame information into the global scope.
///
/// This is required to ensure that any traps can be properly symbolicated.
fn register_frame_info(&self);
/// Returns the functions allocated in memory of this `Artifact`
/// ready to be run.
fn finished_functions(&self) -> &BoxedSlice<LocalFunctionIndex, FunctionBodyPtr>;
/// Returns the function call trampolines allocated in memory of this
/// `Artifact`, ready to be run.
fn finished_function_call_trampolines(&self) -> &BoxedSlice<SignatureIndex, VMTrampoline>;
/// Returns the dynamic function trampolines allocated in memory
/// of this `Artifact`, ready to be run.
fn finished_dynamic_function_trampolines(&self) -> &BoxedSlice<FunctionIndex, FunctionBodyPtr>;
/// Returns the associated VM signatures for this `Artifact`.
fn signatures(&self) -> &BoxedSlice<SignatureIndex, VMSharedSignatureIndex>;
/// Get the func data registry
fn func_data_registry(&self) -> &FuncDataRegistry;
/// Do preinstantiation logic that is executed before instantiating
fn preinstantiate(&self) -> Result<(), InstantiationError> {
Ok(())
}
/// Create an `Instance` from this `Artifact`.
///
/// # Safety
///
/// See [`InstanceHandle::new`].
unsafe fn instantiate(
&self,
tunables: &dyn Tunables,
imports: &[Export],
host_state: Box<dyn Any>,
) -> Result<InstanceHandle, InstantiationError> {
// Validate the CPU features this module was compiled with against the
// host CPU features.
let host_cpu_features = CpuFeature::for_host();
if !host_cpu_features.is_superset(self.cpu_features()) {
return Err(InstantiationError::CpuFeature(format!(
"{:?}",
self.cpu_features().difference(host_cpu_features)
)));
}
self.preinstantiate()?;
let module = self.module();
let (imports, import_function_envs) = {
let mut imports = resolve_imports(
&module,
imports,
self.finished_dynamic_function_trampolines(),
self.memory_styles(),
self.table_styles(),
)
.map_err(InstantiationError::Link)?;
// Get the `WasmerEnv::init_with_instance` function pointers and the pointers
// to the envs to call it on.
let import_function_envs = imports.get_imported_function_envs();
(imports, import_function_envs)
};
// Get pointers to where metadata about local memories should live in VM memory.
// Get pointers to where metadata about local tables should live in VM memory.
let (allocator, memory_definition_locations, table_definition_locations) =
InstanceAllocator::new(&*module);
let finished_memories = tunables
.create_memories(&module, self.memory_styles(), &memory_definition_locations)
.map_err(InstantiationError::Link)?
.into_boxed_slice();
let finished_tables = tunables
.create_tables(&module, self.table_styles(), &table_definition_locations)
.map_err(InstantiationError::Link)?
.into_boxed_slice();
let finished_globals = tunables
.create_globals(&module)
.map_err(InstantiationError::Link)?
.into_boxed_slice();
self.register_frame_info();
let handle = InstanceHandle::new(
allocator,
module,
self.finished_functions().clone(),
self.finished_function_call_trampolines().clone(),
finished_memories,
finished_tables,
finished_globals,
imports,
self.signatures().clone(),
host_state,
import_function_envs,
)
.map_err(|trap| InstantiationError::Start(RuntimeError::from_trap(trap)))?;
Ok(handle)
}
/// Finishes the instantiation of a just created `InstanceHandle`.
///
/// # Safety
///
/// See [`InstanceHandle::finish_instantiation`].
unsafe fn finish_instantiation(
&self,
trap_handler: &(dyn TrapHandler + 'static),
handle: &InstanceHandle,
) -> Result<(), InstantiationError> {
let data_initializers = self
.data_initializers()
.iter()
.map(|init| DataInitializer {
location: init.location.clone(),
data: &*init.data,
})
.collect::<Vec<_>>();
handle
.finish_instantiation(trap_handler, &data_initializers)
.map_err(|trap| InstantiationError::Start(RuntimeError::from_trap(trap)))
}
}
impl dyn Artifact + 'static {
/// Try to downcast the artifact into a given type.
#[inline]
pub fn downcast_ref<T: 'static>(&'_ self) -> Option<&'_ T> {
self.upcast_any_ref().downcast_ref::<T>()
}
/// Try to downcast the artifact into a given type mutably.
#[inline]
pub fn downcast_mut<T: 'static>(&'_ mut self) -> Option<&'_ mut T> {
self.upcast_any_mut().downcast_mut::<T>()
}
}
| rust |
Will Kattappa Apologize To Kannadigas And Make Smoother Way For Bahubali 2?
Will Bahubali 2 release in the state of Karnataka? This is the big question of the hour. Read on to know more.
| english |
{"title": "Love's Labour's Lost", "description": "Love's Labour's Lost is an early comedy by <NAME>. The King of Navarre and his three friends take a vow of study and seclusion for three years, during which they are forbidden to see or speak to women. Their vows are immediately tested by the arrival of the Pricess of France and her three ladies to the King's court. (Summary by <NAME>)<br /><br />\n<strong>Cast:</strong><br>Biron: <a href=\"http://librivox.org/reader/1492\">mb</a><br>Boyet: <a href=\"http://librivox.org/reader/2149\"><NAME></a><br>Costard: <a href=\"http://librivox.org/reader/5123\"><NAME></a><br><NAME>: <a href=\"http://librivox.org/reader/5141\"><NAME></a><br>Dull:<a href=\"http://librivox.org/reader/2911\"><NAME></a><br>Dumain: <a href=\"http://librivox.org/reader/4572\">om123</a><br>Ferdinand: <a href=\"http://librivox.org/reader/3699\">Bruce Pirie</a><br>First Lord: <a href=\"http://librivox.org/reader/4367\">Vicente Costa Filho</a><br>Forester: <a href=\"http://librivox.org/reader/2297\">Philippa</a><br>Holofernes: <a href=\"http://librivox.org/reader/26\">Denny Sayers</a><br>Jaquenetta: <a href=\"http://librivox.org/reader/4263\">Chelsea Baker</a><br>Katharine: <a href=\"http://librivox.org/reader/3698\"><NAME>-Boulet</a><br>Longaville: <a href=\"http://librivox.org/reader/4267\">Ric F</a><br>Maria: <a href=\"http://librivox.org/reader/4174\">Availle</a><br>Mercade: <a href=\"http://librivox.org/reader/4358\">Kim Stich</a><br>Moth: <a href=\"http://librivox.org/reader/103\"><NAME></a><br>Princess of France: <a href=\"http://librivox.org/reader/1259\">Elizabeth Klett</a><br>Rosaline: <a href=\"http://librivox.org/reader/3536\">Arielle Lipshaw</a><br>Sir Nathaniel: <a href=\"http://librivox.org/reader/5333\"><NAME></a><br>Narrator: <a href=\"http://librivox.org/reader/5174\"><NAME></a><br><br><strong>Audio edited by:</strong> <NAME>", "duration": 9065, "language": "English", "authors": [{"id": "37", "name": "<NAME>"}], "coverArt": 
"https://www.archive.org/download/LibrivoxCdCoverArt16/loveslabourslost_1204.jpg", "copyright_year": "1598", "genres": ["Comedy"], "supporters": [{"role": "Read by", "name": "LibriVox Volunteers"}, {"role": "Book Coordinator", "name": "<NAME>"}, {"role": "Meta Coordinator", "name": "<NAME>"}], "sections": [{"section": 1, "title": "Dramatis Personae", "readers": [], "path": "https://www.archive.org/download/loves_labours_lost_1008_librivox/loveslabourslost_0_shakespeare_64kb.mp3", "duration": 89}, {"section": 2, "title": "Act 1", "readers": [], "path": "https://www.archive.org/download/loves_labours_lost_1008_librivox/loveslabourslost_1_shakespeare_64kb.mp3", "duration": 1463}, {"section": 3, "title": "Act 2", "readers": [], "path": "https://www.archive.org/download/loves_labours_lost_1008_librivox/loveslabourslost_2_shakespeare_64kb.mp3", "duration": 782}, {"section": 4, "title": "Act 3", "readers": [], "path": "https://www.archive.org/download/loves_labours_lost_1008_librivox/loveslabourslost_3_shakespeare_64kb.mp3", "duration": 645}, {"section": 5, "title": "Act 4", "readers": [], "path": "https://www.archive.org/download/loves_labours_lost_1008_librivox/loveslabourslost_4_shakespeare_64kb.mp3", "duration": 2682}, {"section": 6, "title": "Act 5", "readers": [], "path": "https://www.archive.org/download/loves_labours_lost_1008_librivox/loveslabourslost_5_shakespeare_64kb.mp3", "duration": 3404}]} | json |
<filename>src/main/resources/static/mas_json/2014_kdd_-3613116533249758125.json
{"title": "EARS (earthquake alert and report system): a real time decision support system for earthquake crisis management.", "fields": ["emergency management", "crisis management", "exploit", "software deployment", "government"], "abstract": "Social sensing is based on the idea that communities or groups of people can provide a set of information similar to those obtainable from a sensor network. Emergency management is a candidate field of application for social sensing. In this work we describe the design, implementation and deployment of a decision support system for the detection and the damage assessment of earthquakes in Italy. Our system exploits the messages shared in real-time on Twitter, one of the most popular social networks in the world. Data mining and natural language processing techniques are employed to select meaningful and comprehensive sets of tweets. We then apply a burst detection algorithm in order to promptly identify outbreaking seismic events. Detected events are automatically broadcasted by our system via a dedicated Twitter account and by email notifications. In addition, we mine the content of the messages associated to an event to discover knowledge on its consequences. Finally we compare our results with official data provided by the National Institute of Geophysics and Volcanology (INGV), the authority responsible for monitoring seismic events in Italy. The INGV network detects shaking levels produced by the earthquake, but can only model the damage scenario by using empirical relationships. This scenario can be greatly improved with direct information site by site. Results show that the system has a great ability to detect events of a magnitude in the region of 3.5, with relatively low occurrences of false positives. Earthquake detection mostly occurs within seconds of the event and far earlier than the notifications shared by INGV or by other official channels. Thus, we are able to alert interested parties promptly. 
Information discovered by our system can be extremely useful to all the government agencies interested in mitigating the impact of earthquakes, as well as the news agencies looking for fresh information to publish.", "citation": "Citations (80)", "departments": ["University of Pisa", "National Research Council", "National Research Council", "National Institute of Geophysics and Volcanology", "National Research Council"], "authors": ["<NAME>.....http://dblp.org/pers/hd/a/Avvenuti:Marco", "<NAME>.....http://dblp.org/pers/hd/c/Cresci:Stefano", "<NAME>.....http://dblp.org/pers/hd/m/Marchetti:Andrea", "<NAME>.....http://dblp.org/pers/hd/m/Meletti:Carlo", "<NAME>.....http://dblp.org/pers/hd/t/Tesconi:Maurizio"], "conf": "kdd", "year": "2014", "pages": 10} | json |
import { Column, CreateDateColumn, Entity, JoinColumn, ManyToOne, PrimaryGeneratedColumn, UpdateDateColumn } from 'typeorm';
import { Contractor } from '../../contactor/entity/contractor.entity';
import { Project } from '../../project/entity/Project.entity';
import { User } from '../../user/entity/user.entity';
@Entity()
export class Contract {
@PrimaryGeneratedColumn({name: 'id'})
id?: number;
@ManyToOne(type => Project)
@JoinColumn({name: 'project_id'})
project: Project;
@ManyToOne(type => Contractor)
@JoinColumn({name: 'contractor_id'})
contractor: Contractor;
@Column({name: 'start_date'})
startDate: Date;
@Column({name: 'end_date'})
endDate: Date;
@CreateDateColumn({name: 'created_at'})
createdAt?: Date;
@UpdateDateColumn({name: 'updated_at'})
updatedAt?: Date;
@ManyToOne(type => User, {nullable: false})
@JoinColumn({name: 'created_by'})
createdBy: User;
@ManyToOne(type => User)
@JoinColumn({name: 'last_updated_by'})
lastUpdatedBy?: User;
} | typescript |
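Every `@Column` and `@JoinColumn` in the entity above passes an explicit `name` because the database columns are snake_case while the class properties are camelCase. The mapping is mechanical, as this small sketch shows (a hypothetical helper for illustration; TypeORM itself handles this via a naming strategy):

```typescript
// Derives the snake_case column name that the explicit `name` options
// above spell out by hand for each camelCase property.
function toSnakeCase(prop: string): string {
  return prop.replace(/[A-Z]/g, (c) => `_${c.toLowerCase()}`);
}
```

For example, `startDate` maps to `start_date` and `lastUpdatedBy` to `last_updated_by`, exactly matching the decorator arguments.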
WIRED Staff, Culture, Nov 5, 2004 2:00 AM
Slideshow: Progress in an Ancient Tongue
Research advances could bring Ethiopic characters -- hundreds of them -- to text messaging, and maybe help modernize the country. Andrew Heavens reports from Addis Ababa, Ethiopia.
A boy stands outside a mobile-phone shop in Addis Ababa, where Ethiopia's infrastructure minister, Kasu Yilala, recently called the country one of the least connected in the world. The state monopoly mobile-phone service provider, Ethiopian Telecommunications, is working to address the physical infrastructure shortfall.
See related story: Progress in an Ancient Tongue
Professors and research students in Addis Ababa developed two possible ways of mapping a limited set of 210 Ethiopic characters onto a mobile keypad, using a combination of keystrokes for each letter. They hope their work will open the door to text messaging in their country.
| english |
(One particularly skilled or competent at one's craft: "A witch of a writer")
In one way I am. The only thing is that what I have to say is not very popular. What I have to say is:
I have been proven right. How is that? I listen to CAPO - @OneBitNews - Head of Warfare, and to what he said about training soldiers: "you must have the highest production to build the greatest army". So I built the greatest production and built the greatest army to protect you all. NOW I am building buildings again and playing the waiting game (the stratagem of deferring action and allowing the passage of time to work in one's favor).
We want to be the greatest of all time, as the Vikings always have been. Have the Vikings always been the greatest? YES, over time they have always been the greatest. To make this clear: where are the Romans today? Even if they were the greatest warriors of all time, today we call them Italians. NO great warriors at all, only weak mother's boys.
YOU are in the greatest gang there is. Be proud of it and wait for the great battles to come, where you will earn money.
An intense competition: a battle of wits.
I'll see you tomorrow, I promise.
| english |
{
const obj = {
foo: 123,
hoge: 44,
name: "obj-name",
};
{
console.log(obj);
const spread = { ...obj };
console.log(spread);
console.log(`original === spread: ${obj === spread}`);
}
{
console.log("---------");
const spread = { ...obj };
spread.foo = 555;
spread.hoge = 999;
console.log(obj);
console.log(spread);
}
{
console.log("---------2");
const newObj = { ...obj, added: "HEY" };
console.log(newObj);
const updatedObj = { ...obj, foo: "Updated foo" };
console.log(updatedObj);
const opposite = { newProp: 9999, ...obj };
console.log(opposite);
const obj2 = { foo: "second foo", age: 15, hobby: "soccer" };
const merged = { ...obj, ...obj2 };
console.log(merged);
}
{
console.log("---------3");
const array = ["Hey", "Ho", "Hello"];
const array2 = ["Boo", "Yeah", "Oops"];
console.log([array, ...array2]);
console.log([...array, ...array2]);
}
{
console.log("---------4");
function doSomething(...args: string[]) {
args.forEach((value: string, index: number) => {
console.log(`${index}:${value}`);
});
}
const strs = "HELLO";
const array = ["chair", "desk", "smartphone"];
const obj = { foo: 11, hoge: 55 };
doSomething(...strs);
doSomething(...array);
// doSomething(...obj);
}
{
console.log("---------5");
const originalObj = {
value: "1",
first: {
value: "1-1",
second: {
value: "2-1",
third: {
value: "3-1"
}
}
}
};
const copiedObj = { ...originalObj };
console.log(copiedObj);
copiedObj.value = "1-updated";
copiedObj.first.value = "1-1-updated";
copiedObj.first.second.value = "2-1-updated";
copiedObj.first.second.third.value = "3-1-updated";
console.log(originalObj);
}
{
console.log("-------6");
const obj = {
first: 1,
second: 2,
third: 3,
};
{
const { first, second, third } = obj;
console.log(first);
console.log(second);
console.log(third);
}
{
const { second, ...rest } = obj
console.log(second);
console.log(rest);
}
}
}
| typescript |
“Robbie Hall, Robbie Hall, Robbie Robbie Hall!” sang the Oxford faithful.
It was a cool Autumn evening in 2011 and Plymouth were the opponents. He’d just scored his second goal to make it 2-1 and his fifth in six games.
The cross came in from Damian Batt and Hall’s quick movement off his feet allowed the West Ham loanee to guide the ball with his left, past the despairing Jake Cole in the Plymouth goal.
Oxford went on to win 5-1, but Robbie Hall was very much the man of the moment, and the man of the match.
Hall has also represented England at U16, U17, U18 and U19 levels. He was a member of the England side that won the 2010 UEFA European Under-17 Football Championship in Liechtenstein and has an excellent goalscoring record at that level, with eight goals in sixteen appearances.
So it was no surprise that in September 2012, the 18-year-old made his Premier League debut. He came on as a sub for Guy Demel in the 78th minute of the 3-0 hammering of Fulham.
Hall has been likened to a young, left footed, Jermain Defoe. Now, we all know that no player has ever become the exact same player as one they have been compared to, but Robbie does have the same attributes as Defoe did at his time at Bournemouth.
He’s a fox in the box, who’s nippy and is in possession of a powerful, yet accurate, shot.
The only real weakness of Hall’s game is his lack of strength and he tired massively at the end of games during his spell at Oxford United. Strength will come with age, and he will certainly be hitting the weights over the next few years.
Can Robbie Hall reach the heights of Jermain? Only time will tell. But the future is certainly bright for young Robbie.
He’s a Premier League player in the making.
| english |
India's wheat exports have seen a minuscule rise of 10 per cent during this calendar year over last year, after the government imposed a ban on shipments of the grain to other countries five months back to curb domestic prices.
According to Ministry of Food and Consumer Affairs data, the country exported 24.08 lakh metric tonnes (LMT) of wheat between May 15 and September 30.
This was 10.3 per cent more than the 21.83 LMT of wheat which was exported between April 1, 2021 and May 14, 2021.
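The reported percentage follows directly from the two tonnage figures quoted above — a quick sanity check on the arithmetic, not part of the ministry data:

```python
# Percent change between the two reported shipment totals, in lakh metric
# tonnes (LMT). Figures are the ones quoted in the article.
earlier = 21.83  # LMT, April 1 - May 14 window
later = 24.08    # LMT, May 15 - September 30 window

pct_rise = (later - earlier) / earlier * 100
print(f"{pct_rise:.1f} per cent")  # matches the article's 10.3 per cent
```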
In May this year, India had restricted export of wheat to enhance domestic availability. In September, it had also banned export of broken rice and imposed a 20 per cent export duty on non-Basmati rice to increase domestic supplies amid a fall in area under paddy crop in the current kharif season.
Food security concerns had led to the imposition of curbs on exports of wheat, official sources had said.
Recently, India had also justified its decision to ban export of wheat and rice at a WTO meeting, despite many countries raising concerns that New Delhi's decision could have an adverse impact on importing nations, mainly in Africa. | english |
/* IMPORTING ALL STYLES */
/* RESET */
* {
margin: 0;
padding: 0;
box-sizing: border-box; }
/* From 0-base.sass */
/* VARIABLES */
/* SCROLL */
/* width */
::-webkit-scrollbar {
width: 8px; }
/* Track */
::-webkit-scrollbar-track {
background: #f1f1f1; }
/* Handle */
::-webkit-scrollbar-thumb {
background: black; }
/* Handle on hover */
::-webkit-scrollbar-thumb:hover {
background: #FACE5E; }
/* Main tags */
body {
font-family: Montserrat;
background-color: #f7f7f7; }
section {
padding: 40px;
width: 100%; }
h1 {
font-size: 48px;
text-align: center; }
h2 {
font-size: 36px; }
a {
text-decoration: none;
color: black; }
.bold {
font-weight: bold; }
.light {
font-weight: 200; }
.thin {
font-weight: 100; }
.padding-to-bottom {
padding-bottom: 30px; }
#gmap_canvas img {
width: 100% !important;
background: none !important; }
.whiteFont {
color: white; }
.fixed {
position: fixed;
width: 100%;
z-index: 999;
box-shadow: 0px 0px 58px -25px rgba(0, 0, 0, 0.8);
transition: 0.5s; }
.flex {
display: flex;
justify-content: space-between; }
.hidden {
display: none; }
.grid {
display: grid; }
/* Nav menu */
.dropdown {
opacity: 0;
position: absolute;
overflow: hidden;
top: -21px;
padding: 30px;
transition: all 0.3s;
transform: translateY(100px);
will-change: transform;
display: none;
right: 0; }
.dropdownBackground {
width: 750px;
min-height: 150px;
position: absolute;
background: #fff;
transform-origin: 50% 0%;
justify-content: center;
opacity: 0;
flex-direction: column;
cursor: default; }
.dropdownBackground li {
padding-bottom: 20px;
list-style: none;
cursor: default;
grid-template-columns: 50% 30% 10% 10%;
grid-template-areas: "name price order--amount delete";
align-items: center;
font-size: 1.4rem; }
.dropdownBackground li .name {
grid-area: name; }
.dropdownBackground li .price {
grid-area: price;
padding: 0 5px;
display: flex;
justify-content: flex-end; }
.dropdownBackground li .order--amount {
grid-area: order--amount;
padding: 0 5px;
text-align: center;
display: flex;
justify-content: flex-end; }
.dropdownBackground li .delete {
grid-area: delete;
border: none;
background: rgba(255, 0, 0, 0.8);
opacity: 0.7;
border-radius: 50%;
font-size: 1rem;
width: 40px;
height: 40px;
cursor: pointer;
margin-left: auto;
margin-right: 0; }
.dropdownBackground li .delete:hover {
background: red; }
@media screen and (max-width: 1150px) {
.dropdown {
top: -15px; }
.dropdownBackground {
width: 100%;
padding: 2rem; }
.dropdownBackground li {
padding-bottom: 2rem;
font-size: 1.8rem; }
.dropdownBackground li .price {
display: flex;
justify-content: flex-end; }
.dropdownBackground li .order--amount {
margin-left: auto;
margin-right: 0; }
.dropdownBackground li .delete {
display: flex;
justify-content: center;
width: 60px;
height: 60px; } }
.priceContainer {
display: inline-flex;
justify-content: space-between;
align-items: center; }
.priceContainer .priceItem {
display: flex;
font-size: 1.8rem; }
.priceContainer .orderButton {
display: flex;
height: 50px;
width: 120px;
border: none;
font-size: 1.4rem;
justify-content: center;
cursor: pointer; }
@media screen and (max-width: 1150px) {
.priceItem {
font-size: 2rem; } }
.open {
opacity: 1; }
.disabled {
background: #f1f1f1;
color: white; }
.primary-background {
background: #FACE5E; }
/* Button used to open the contact form - fixed at the bottom of the page */
.open-button {
color: #000;
padding: 10px 20px;
border: none;
cursor: pointer;
opacity: 0.8;
position: fixed;
bottom: 23px;
right: 28px;
width: 280px; }
/* The popup form - hidden by default */
.form-popup {
display: none;
position: fixed;
width: 100%;
margin-top: 40px;
justify-content: center;
align-content: center;
z-index: 9999;
top: 100px;
bottom: 100px; }
/* Add styles to the form container */
.form-container {
width: 60%;
padding: 40px;
background-color: white;
overflow-y: auto; }
@media screen and (max-width: 1150px) {
.form-container {
width: 100%;
height: auto;
background-color: white;
overflow-y: auto; }
.form-popup {
display: none;
position: fixed;
width: 100%;
height: auto;
margin-top: 40px;
padding: 2rem;
justify-content: center;
align-content: center;
z-index: 9999;
top: 100px; }
form, .btn {
font-size: 2rem;
font-family: Montserrat;
padding: 0.6rem; }
input {
font-size: 1.8rem;
font-family: Montserrat; } }
/* Full-width input fields */
.form-container input[type=text] {
width: 100%;
padding: 8px;
margin: 5px 0 22px 0;
border: none;
background: #f1f1f1; }
/* When the inputs get focus, do something */
.form-container input[type=text]:focus, .form-container input[type=password]:focus {
background-color: #ddd;
outline: none; }
/* Set a style for the submit/login button */
.form-container .btn {
background-color: #FACE5E;
color: white;
padding: 1rem;
border: none;
cursor: pointer;
width: 100%;
margin-bottom: 10px;
opacity: 0.8;
font-family: Montserrat;
font-size: 2rem; }
/* Add a red background color to the cancel button */
.form-container .cancel {
background-color: red; }
/* Add some hover effects to buttons */
.form-container .btn:hover, .open-button:hover {
opacity: 1; }
.form-grid {
background: #f1f1f1;
display: grid;
margin-bottom: 2rem;
padding: 0.8rem;
list-style: none;
cursor: default;
grid-template-columns: 50% 30% 20%;
grid-template-areas: "name amount price";
align-items: center;
font-size: 1.4rem;
align-content: center; }
.form-grid .name {
grid-area: name; }
.form-grid .price {
grid-area: price;
padding: 0 5px;
display: flex;
justify-content: flex-end; }
.form-grid .orderedAmount {
grid-area: amount;
padding: 0 5px;
text-align: center;
display: flex;
justify-content: flex-end; }
/* HEADER */
nav {
width: 100%;
height: 100px;
background: white; }
nav .wrapper {
width: 100%;
height: 100px;
padding: 6px 30px;
background: white;
transition: 1s; }
nav .wrapper .logo {
display: flex;
padding-top: 16px;
width: 140px;
align-content: center;
justify-content: center;
z-index: 9999; }
nav .wrapper .menu__full {
padding-top: 16px;
display: inline-flex;
list-style: none;
text-decoration: none;
align-items: center;
font-size: 1.5rem; }
nav .wrapper .menu__full > li {
padding-right: 35px;
font-weight: 300;
color: #000; }
nav .wrapper .menu__full .delivery {
margin-left: 10px;
border-radius: 20px;
cursor: pointer;
padding: 6px; }
nav .wrapper .menu__full a:hover {
color: #ff7f2a;
transition: 0.05s ease; }
@media screen and (max-width: 1150px) {
nav .wrapper .menu__full {
font-size: 2rem; } }
/* Hero */
.hero .parallax, .contact .parallax {
min-height: 400px;
object-fit: contain;
width: auto;
padding-top: calc(300px/2); }
.hero .hero--belt, .contact .hero--belt {
min-height: 230px;
width: 100%;
margin-top: -15px;
padding: 60px;
display: inline-flex;
justify-content: space-around;
font-size: 24px;
text-transform: uppercase;
text-align: center; }
.hero .hero--belt p, .contact .hero--belt p {
margin: auto;
padding: 10px;
display: block;
justify-content: space-around;
font-size: 24px;
text-align: center; }
@media screen and (max-width: 1150px) {
.hero .hero--belt, .contact .hero--belt {
display: flex;
justify-content: center;
align-items: center;
flex-direction: column;
text-align: center; }
.hero .hero--belt .column, .contact .hero--belt .column {
margin: auto; }
.hero .hero--belt .column p, .contact .hero--belt .column p {
display: inline-flex; } }
/* Break */
.break {
padding: 120px 0;
min-height: 400px;
width: 100%;
background: white;
align-items: center; }
.break img {
width: 120px;
height: 120px;
display: flex;
justify-content: center;
align-items: center;
margin: auto; }
.break h1, .break h2 {
text-align: center;
margin: 20px; }
/* Services */
.services .row {
width: 100%;
padding: 40px;
min-height: 480px;
background: white;
display: inline-flex;
justify-content: space-around;
margin: auto; }
.services .row .stage img {
width: 380px;
height: auto; }
.services .row .stage h2 {
width: 380px;
text-align: center;
padding-top: 30px;
line-height: 36px; }
@media screen and (max-width: 1150px) {
.services .row {
width: 100%;
display: flex;
flex-direction: column;
justify-content: center;
margin: auto; }
.services .row .stage {
display: flex;
justify-content: center;
flex-direction: column;
align-items: center; }
.services .row .stage img {
width: 720px;
height: auto; }
.services .row .stage h2 {
text-align: center;
width: 80%;
padding: 30px; } }
.menu {
padding: 40px;
background: black; }
.menu .menu--title {
color: #fff;
display: inline;
text-align: center; }
.menu .menu--title h1 {
padding-bottom: 30px; }
.menu .menu--list {
background: #000;
list-style: none;
text-decoration: none; }
.menu .menu--list li {
background: #FACE5E;
width: 100%;
min-height: 6rem;
align-items: center;
margin-bottom: 10px;
padding: 20px; }
.menu .menu--list .dish {
display: grid;
grid-template-columns: 200px 1fr 0.1fr auto 0.1fr 0.1fr 0.1fr;
grid-template-rows: 2fr;
grid-template-areas: "name ingredients price buttons add amount substract";
font-size: 1.6rem; }
.menu .menu--list .dish .name {
grid-area: name;
padding-right: 1em; }
.menu .menu--list .dish .ingredients {
grid-area: ingredients;
justify-content: flex-start;
text-align: left;
font-size: 1.4rem; }
.menu .menu--list .dish .price {
grid-area: price;
padding: 0 1rem; }
.menu .menu--list .dish .button-wrapper {
grid-area: buttons; }
.menu .menu--list .dish .add {
grid-area: add;
height: 2rem;
font-size: 1.4rem; }
.menu .menu--list .dish .amount {
grid-area: amount; }
.menu .menu--list .dish .substract {
grid-area: substract;
height: 2rem;
font-size: 1.4rem; }
.menu img {
width: 60%;
display: flex;
justify-content: center;
margin: auto; }
/*Menu */
/* Contact */
.contact {
padding: 40px; }
/* Review */
footer {
height: 170px;
background: #05090A;
display: flex;
justify-content: space-around;
align-items: center;
margin: auto; }
footer ul {
text-decoration: none;
list-style: none; }
footer ul li {
color: #fff;
font-size: 20px;
padding-bottom: 4px; }
footer .inline {
display: inline-flex; }
footer .inline li {
padding: 10px;
font-size: 24px; }
footer .inline a {
color: #fff; }
footer .inline a:hover {
color: #ff7f2a;
cursor: pointer;
transition: 0.4s; }
/*# sourceMappingURL=style.css.map */
| css |
<filename>package.json<gh_stars>0
{
"name": "practice-puppeteer",
"description": "Practice Puppeteer",
"private": true,
"bin": {
"hb": "bin/hb.js"
},
"scripts": {
"start": "node index.js"
},
"author": "Neo <<EMAIL>> (https://neos21.net/)",
"license": "MIT",
"homepage": "https://github.com/Neos21/practice-puppeteer#readme",
"repository": {
"type": "git",
"url": "git+https://github.com/Neos21/practice-puppeteer.git"
},
"bugs": {
"url": "https://github.com/Neos21/practice-puppeteer/issues"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/Neos21"
},
"dependencies": {
"axios": "0.20.0",
"dotenv": "8.2.0",
"puppeteer-core": "5.2.1"
}
}
| json |
{
"name": "dead-simple-light-syntax",
"main": "./lib/dead-simple.coffee",
"theme": "syntax",
"version": "0.1.0",
"description": "A light syntax theme with minimal coloring for minimal distraction.",
"keywords": [
"syntax",
"theme",
"minimal",
"light"
],
"repository": "https://github.com/blaqbern/dead-simple-light-syntax",
"license": "MIT",
"engines": {
"atom": ">=1.0.0 <2.0.0"
}
}
| json |
An edited video was shared on Instagram where it seems like BTS is dancing to Shah Rukh Khan and Deepika Padukone’s Jhoome Jo Pathaan.
Have you seen those fan-made videos of BTS that show them dancing to hit Hindi tracks? Impeccably edited, those videos often leave people stunned. And this video shared on Instagram may have the same effect on you. In the edited video, it seems like BTS is actually dancing to Shah Rukh Khan and Deepika Padukone’s Jhoome Jo Pathaan.
An Instagram user who goes by the_nikki20 posted the video. “In BTS, 'B' is stan for Bollywood,” they added while sharing the clip. The video originally shows a dance practice of the K-pop group to the song ON. In the edited clip, the hit track from the film Pathaan is added and the audio perfectly syncs with the video.
Take a look at the video:
The video was posted back in February. Since being shared, it has accumulated close to 1.6 lakh views and the numbers are only increasing. Additionally, the share has received close to 39,000 likes. People posted various comments while reacting to the video.
Here’s how Instagram users reacted:
“The music matches the moves really good!!” commented an Instagram user. “Bts choreography suits on every Indian song!!!!” joined another. “Another example. BTS does not follow the beat, the beat follows BTS,” added a third. “Can’t stop watching this,” wrote a fourth.
SAN MATEO, California -- We're on the ground here at Maker Faire, after a two-hour long odyssey through the streets of this San Francisco suburb. Outside, it's a wild scene: traffic is backed up for miles and near the Faire, clumps of nerds are streaming past stucco condominiums like hippies through the fields of Woodstock.
Inside, the sound of gears turning wafts through the air like the smoke from the barbecue joints in the food court. Within a minute of walking through the gates, we've already seen people grinding their own flour with a modded stationary bike, muffin cars, a solar-powered chariot pulled by an Arnold Schwarzenegger robot, swarm bots, and a booth full of iPhone hacks.
Keep it locked here for a series of video, photos, and posts throughout the day.
Image: David McMillan shows off "The Solanator," a solar powered chariot created by Bob Schneeveis.
| english |
<reponame>MoeAl-Ani/cosmetics-backend
package com.infotamia.cosmetics.daos;
import com.infotamia.pojos.entities.CategoryEntity;
import com.infotamia.pojos.entities.CategoryTranslationEntity;
import javax.enterprise.context.RequestScoped;
import javax.inject.Inject;
import org.hibernate.Session;
import java.util.*;
/**
* @author <NAME>
**/
@SuppressWarnings("unchecked")
@RequestScoped
public class CategoryDao {
public CategoryDao() {
//
}
@Inject
private Session session;
public Optional<List<CategoryEntity>> fetchCategoriesByIds(Collection<Integer> ids) {
List<Object[]> resultList = session.createSQLQuery("select {c.*}, {ct.*} from category c " +
"left join category_translation ct on c.id = ct.category_id " +
"where c.id in (:ids)")
.setParameter("ids", ids)
.addEntity("c", CategoryEntity.class)
.addEntity("ct", CategoryTranslationEntity.class)
.getResultList();
if (resultList.isEmpty()) return Optional.empty();
Set<CategoryEntity> finalResult = new HashSet<>();
for (Object[] objects : resultList) {
CategoryEntity categoryEntity = null;
CategoryTranslationEntity categoryTranslationEntity = null;
for (Object o : objects) {
if (o instanceof CategoryEntity)
categoryEntity = (CategoryEntity) o;
if (o instanceof CategoryTranslationEntity)
categoryTranslationEntity = (CategoryTranslationEntity) o;
}
if (categoryEntity != null) {
finalResult.add(categoryEntity);
if (categoryTranslationEntity != null)
categoryEntity.getCategoryTranslations().add(categoryTranslationEntity);
}
}
return Optional.of(new ArrayList<>(finalResult));
}
}
| java |
{
"title": "Earth During a Total Eclipse of the Sun",
"credit": "Expedition 12 Crew, NASA",
"explanation": "What does the Earth look like during a total solar eclipse? It appears dark in the region where people see the eclipse, because that's where the shadow of the Moon falls. The shadow spot actually shoots across the Earth at nearly 2,000 kilometers per hour, darkening locations in its path for only a few minutes before moving on. The featured image shows the Earth during the total solar eclipse of 2006 March, as seen from the International Space Station. On Friday the Moon will move in front of the Sun once again, casting another distorted circular shadow that, this time, will zip over part of the north Atlantic Ocean.",
"date": "2015-03-18",
"hdurl": "https://apod.nasa.gov/apod/image/1503/EarthEclipse_ISS_1520.jpg",
"service_version": "v1",
"media_type": "image",
"url": "https://apod.nasa.gov/apod/image/1503/EarthEclipse_ISS_Annotated_960.jpg"
} | json |
<reponame>asgarovf/tripleweb
import styles from "./Navbar.module.scss";
const Navbar = () => {
return <div className={styles.wrapper}></div>;
};
export { Navbar };
| typescript |
<reponame>ppavlov/jsdelivr<gh_stars>1-10
{
"packageManager": "github",
"name": "jquery.placeholder",
"repo": "mathiasbynens/jquery-placeholder",
"files": {
"include": ["jquery.placeholder.min.js", "jquery.placeholder.js"]
}
}
| json |
from django.apps import AppConfig
from django.utils.translation import ugettext_lazy as _
class CartsAppConfig(AppConfig):
name = 'wagtailcommerce.carts'
label = 'wagtailcommerce_carts'
verbose_name = _('Wagtail Commerce Carts')
| python |
GoDaddy will be shuttering its Cloud Servers and Applications business at the end of the year, less than two years after launching the new offerings.
Since March 2016, GoDaddy has offered a suite of Amazon-style cloud computing services to help small businesses build, test and scale cloud solutions on GoDaddy's infrastructure. It was aimed at smaller businesses making at least a partial move to the cloud.
As first reported by TechCrunch, the company sent its cloud customers a memo alerting them that GoDaddy will stop supporting the Cloud Servers service on Dec. 31. On November 15, it will stop supporting apps and development environments provided by Bitnami.
Raghu Murthi, SVP of Hosting and Pro at GoDaddy, confirmed the news in a statement to ZDNet.
"After serious consideration, we have decided to end-of-life our cloud servers' product," he said. "Our goal from the beginning was to create simple and scalable services for small and medium business owners. We're proud of what we built and now we are focusing on building a robust and scalable solutions based on OpenStack infrastructure. GoDaddy Cloud Servers provided numerous learnings for us that we've already been applying to other products and services."
Meanwhile, earlier in the week GoDaddy announced it's selling off its PlusServer business to the private equity firm BC Partners for $456 million (€397 million). The deal is expected to close by the end of August. GoDaddy took over the PlusServer business when it acquired its European rival Host Europe Group (HEG) last year.
| english |
<reponame>sheepla/zenn-trend-cli<filename>version.ts
export const commandName = "zenn-trend-cli";
export const version = "0.0.1";
| typescript |
This edition of Press TV News Analysis will be looking at Israel's latest violation in the occupied territories. Located near the al-Aqsa mosque, in the center of the Sheikh Jarrah neighborhood in East Jerusalem Al-Quds, lay the Shepherd Hotel, a heritage symbol for Palestinians, now destroyed by Israel as part of its ongoing effort to Judaize Al-Quds.
The armies of the Islamic Resistance are ready and eager to confront the enemies of Imam Mahdi (A). A great battle lies ahead of us… The question is: are you ready? Shaykh Muzaffer Abbas Hyder gives a brilliant overview of the recent events in Sheikh Jarrah, Jerusalem Al-Quds and reminds us of why the noble cause of Palestine is so incredibly significant. Despite all the oppression that the world faces today, we must recognize that we are at a tremendous turning point in human history. Victory lies ahead for the believers and for all those who belong to the Camp of the Oppressed. Palestine will be free INSHALLAH sooner than you think and a global revolution is in the making. Now is the time to free yourselves completely from the shackles of the Taghut and rise up.
| english |
<reponame>ashkeel/overlays
const fs = require('fs');
const inlineAssets = require('inline-assets');
const dist = __dirname + '/dist';
const files = fs.readdirSync(dist);
// readdirSync returns an array (or throws), so check length instead of truthiness
if (files.length === 0) {
  console.info('nothing to do');
  return;
}
// `recursive: true` prevents an EEXIST error if the directory already exists
fs.mkdirSync(`${dist}/inline`, { recursive: true });
files
.filter((f) => f.endsWith('.html'))
.forEach((f) => {
const from = `${dist}/${f}`;
const to = `${dist}/inline/${f}`;
var content = fs.readFileSync(from, 'utf8');
content = inlineAssets(to, from, content, {});
fs.writeFileSync(to, content, 'utf8');
});
| javascript |
{
"id": "d573-153",
"text": "Reverend <NAME>, S.J,\nDecember 13, 1957\n~2~\nAssociate members pay $25 and Affiliate members pay $15 a year; Individual\nmembers pay $7.50. For this fee members receive our monthly Newsletter:\nthe NAEB Journal; the various reports and surveys we publish periodically;\nservices of our Placement Service, television engineer, legal counsel,\nmanagement consultant, etc.;plus the normal day to day services, informa¬\ntion and advice that any national organization gives its members. Under\nseparate cover we are sending you samples of some of the materials we\nregularly send our members, plus other items which will give you more\ninformation about the association. Of course. Associate, Affiliate and\nIndividual members do not receive as many of these materials as regularly\nas Active members do.\nActive members with radio stations may use the facilities of the NAEB\nRadio Network upon payment of an annual network fee. The Network distributes\napproximately eight hours of programming, by tape recording, weekly, to\nmembers subscribing to its service. This fee is also based on power (with\nthe same power classifications as membership) and a new station does not\nhave to pay the full fee until the third year it is in the Network. The\nfees are s\nClass A\nClass B\nClass C\n1st Year 2nd Year 3rd Year\n$400.00 ISooTocT $800.00\n250.00 375.00 500.00\n150.00 225.00 300.00\nStations must also pay postage both ways on the tapes shipped to and\nfrom Network Headquarters here in Urbana, Illinois. Still even a Class\nA station paying its full third-year rate is paying only a little over\n$2 an hour far better than 400 hours of programming from the Network in\na year. 
This is about the most inexpensive programming a station can do.\nIf, after looking over the materials which we shall send you, you have\nfurther questions, we shall be most happy to answer them.\nSincerely,\n<NAME>\nAssociate Director\nHEH/dfc /\nEnclosures 3 3\nA/MAto-ow ■ M-V,\nH - 1)3, 5 /d, //, /v<r, 7CL, P-f"
} | json |
const express = require("express");
const router = express.Router();
const config = require("config");
const auth = require("../../middleware/auth");
router.post("/report_admin", auth, (req, res) => {
const ReportAdmin = async () => {
// DB Config
const mongoose = require("mongoose");
const db = config.get("mongoEResearch");
// Connect Mongo
await mongoose
.connect(db, {
useNewUrlParser: true,
useUnifiedTopology: true,
useCreateIndex: true,
})
.then(() => {
const report = require("../../models/User");
report
.aggregate([
{
$project: {
buasri_id: 1,
dep: 1,
"research._id": 1,
"research.firstname": 1,
"research.lastname": 1,
"research.position": 1,
"research.article": 1,
"research.type_article": 1,
"research.level": 1,
"research.sub_level_1": 1,
"research.sub_level_2": 1,
"research.year": 1,
"research.research_year": 1,
"research.author": 1,
"research.name": 1,
"research.name_th": 1,
"research.conference": 1,
"research.quartile": 1,
"research.student": 1,
"research.tags": 1,
"research.status": 1,
},
},
{ $unwind: "$research" },
{ $unwind: "$research.type_article" },
{
$group: {
buasri_id: { $last: "$buasri_id" },
department: { $last: "$dep" },
_id: "$research._id",
firstname: { $last: "$research.firstname" },
lastname: { $last: "$research.lastname" },
position: { $last: "$research.position" },
article: { $last: "$research.article" },
level: { $last: "$research.level" },
sub_level_1: { $last: "$research.sub_level_1" },
sub_level_2: { $last: "$research.sub_level_2" },
year: { $last: "$research.year" },
research_year: { $last: "$research.research_year" },
author: { $last: "$research.author" },
name: { $last: "$research.name" },
name_th: { $last: "$research.name_th" },
quartile: { $last: "$research.quartile" },
student: { $last: "$research.student" },
tags: { $last: "$research.tags.text" },
status: { $last: "$research.status" },
},
},
{
$sort: { created: 1 },
},
])
.then((report) => {
res.json(report);
})
.catch((err) => console.log(err));
})
.catch((err) => console.log(err));
};
ReportAdmin();
});
router.post("/report_committee", auth, (req, res) => {
const ReportCommittee = async () => {
// DB Config
const mongoose = require("mongoose");
const db = config.get("mongoEResearch");
// Connect Mongo
await mongoose
.connect(db, {
useNewUrlParser: true,
useUnifiedTopology: true,
useCreateIndex: true,
})
.then(() => {
const report = require("../../models/User");
const { dep } = req.body;
report
.aggregate([
{ $match: { dep } },
{
$project: {
buasri_id: 1,
dep: 1,
"research._id": 1,
"research.firstname": 1,
"research.lastname": 1,
"research.position": 1,
"research.article": 1,
"research.type_article": 1,
"research.level": 1,
"research.sub_level_1": 1,
"research.sub_level_2": 1,
"research.year": 1,
"research.research_year": 1,
"research.author": 1,
"research.name": 1,
"research.name_th": 1,
"research.conference": 1,
"research.quartile": 1,
"research.student": 1,
"research.tags": 1,
"research.status": 1,
},
},
{ $unwind: "$research" },
{ $unwind: "$research.type_article" },
{
$group: {
buasri_id: { $last: "$buasri_id" },
department: { $last: "$dep" },
_id: "$research._id",
firstname: { $last: "$research.firstname" },
lastname: { $last: "$research.lastname" },
position: { $last: "$research.position" },
article: { $last: "$research.article" },
level: { $last: "$research.level" },
sub_level_1: { $last: "$research.sub_level_1" },
sub_level_2: { $last: "$research.sub_level_2" },
year: { $last: "$research.year" },
research_year: { $last: "$research.research_year" },
author: { $last: "$research.author" },
name: { $last: "$research.name" },
name_th: { $last: "$research.name_th" },
quartile: { $last: "$research.quartile" },
student: { $last: "$research.student" },
},
},
{
$sort: { created: 1 },
},
])
.then((report) => {
res.json(report);
})
.catch((err) => console.log(err));
})
.catch((err) => console.log(err));
};
ReportCommittee();
});
router.post("/report_user", auth, (req, res) => {
const ReportUser = async () => {
// DB Config
const mongoose = require("mongoose");
const db = config.get("mongoEResearch");
// Connect Mongo
await mongoose
.connect(db, {
useNewUrlParser: true,
useUnifiedTopology: true,
useCreateIndex: true,
})
.then(() => {
const report = require("../../models/User");
const { buasri_id } = req.body;
report
.aggregate([
{ $match: { buasri_id } },
{
$project: {
buasri_id: 1,
dep: 1,
"research._id": 1,
"research.firstname": 1,
"research.lastname": 1,
"research.position": 1,
"research.article": 1,
"research.type_article": 1,
"research.level": 1,
"research.sub_level_1": 1,
"research.sub_level_2": 1,
"research.year": 1,
"research.research_year": 1,
"research.author": 1,
"research.name": 1,
"research.name_th": 1,
"research.conference": 1,
"research.quartile": 1,
"research.student": 1,
"research.tags": 1,
"research.status": 1,
},
},
{ $unwind: "$research" },
{ $unwind: "$research.type_article" },
{
$group: {
buasri_id: { $last: "$buasri_id" },
department: { $last: "$dep" },
_id: "$research._id",
firstname: { $last: "$research.firstname" },
lastname: { $last: "$research.lastname" },
position: { $last: "$research.position" },
article: { $last: "$research.article" },
level: { $last: "$research.level" },
sub_level_1: { $last: "$research.sub_level_1" },
sub_level_2: { $last: "$research.sub_level_2" },
year: { $last: "$research.year" },
research_year: { $last: "$research.research_year" },
author: { $last: "$research.author" },
name: { $last: "$research.name" },
name_th: { $last: "$research.name_th" },
quartile: { $last: "$research.quartile" },
student: { $last: "$research.student" },
},
},
{
$sort: { created: 1 },
},
])
.then((report) => {
res.json(report);
})
          .catch((err) => res.status(500).json({ error: err.message }));
})
      .catch((err) => res.status(500).json({ error: err.message }));
};
ReportUser();
});
module.exports = router;
| javascript |
export interface UserJson {
accountId: string;
displayName: string;
}
export default class User {
private readonly data: UserJson;
constructor(data: UserJson) {
this.data = data;
}
public getAccountId(): string {
return this.data.accountId;
}
public getDisplayName(): string {
return this.data.displayName;
}
}
| typescript |
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import SO3
import numpy as np
def test_so3():
    _SO3 = SO3.SO3()
    _SO3.set_euler([0.1, 0.2, 0.3])
    print(_SO3.log())
    _SO3.set_euler([0., 0., 0.])
    print(_SO3.log())
    _SO3.set_euler([-0.1, -0.2, -0.3])
    print(_SO3.log())
    _SO3.set_axis_angle(np.pi / 3, [1, 0, 0])
    print(_SO3.log())
    _SO3.set_axis_angle(np.pi / 2, [1, 0, 0])
    print(_SO3.log())
    _SO3.set_axis_angle(np.pi / 2, [1, 1, 0])
    print(_SO3.log())
    _SO3.set_axis_angle(np.pi / 2, [1, 1, 1])
    print(_SO3.log())
    _SO3.set_axis_angle(np.pi / 2, [0, 0, -0.1])
    print(_SO3.log())
    _SO3.set_quaternion([-1, 0, 0, 0])
    print(_SO3.log())
    _SO3.set_quaternion([1, 0, 0, 0])
    print(_SO3.log())
    print('test right jacobian')
    w = np.array([0, 0, 0.1])
    w0 = np.array([0, 0, -0.1])
    right = SO3.exp_so3(_SO3.right_jac(w).dot(w))
    print(SO3.exp_so3(w0).dot(right))
    print(SO3.exp_so3(w0 + w))
    w = np.array([0, 1, 2])
    print(_SO3.right_jac(w))
if __name__ == '__main__':
test_so3()
| python |
function onMouseDown(event) { // the instant the mouse button is pressed
event.preventDefault();
mouseOnDown.x = -event.clientX;
mouseOnDown.y = event.clientY;
targetOnDown.x = target.x;
targetOnDown.y = target.y;
document.addEventListener('mousemove', onMouseMove, false);
document.addEventListener('mouseup', onMouseUp, false);
document.addEventListener('mouseout', onMouseOut, false);
}
function onMouseMove(event) { // the mouse moving while the button is held down, not merely moving
mouse.x = -event.clientX;
mouse.y = event.clientY;
var zoomDamp = distance / 1000;
target.x = targetOnDown.x + (mouse.x - mouseOnDown.x) * 0.005 * zoomDamp;
target.y = targetOnDown.y + (mouse.y - mouseOnDown.y) * 0.005 * zoomDamp;
target.y = target.y > PI_HALF ? PI_HALF : target.y;
target.y = target.y < -PI_HALF ? -PI_HALF : target.y;
// console.log(target.x + " " + target.y + " " + distanceTarget);
}
function onMouseUp(event) {
document.removeEventListener('mousemove', onMouseMove, false);
document.removeEventListener('mouseup', onMouseUp, false);
document.removeEventListener('mouseout', onMouseOut, false);
}
function onMouseOut(event) {
document.removeEventListener('mousemove', onMouseMove, false);
document.removeEventListener('mouseup', onMouseUp, false);
document.removeEventListener('mouseout', onMouseOut, false);
}
function onMouseWheel(event) {
event.preventDefault();
if (overRenderer) {
zoom(event.wheelDeltaY * 1.6);
}
return false;
}
function onWindowResize(event) {
    // console.log('resize');
    camera.aspect = window.innerWidth / window.innerHeight;
    camera.updateProjectionMatrix();
    renderer.setSize(window.innerWidth, window.innerHeight);
}
function zoom(delta) {
distanceTarget -= delta;
distanceTarget = distanceTarget > distanceMax ? distanceMax : distanceTarget;
distanceTarget = distanceTarget < distanceMinmin ? distanceMinmin : distanceTarget;
}
function initInteraction() {
document.addEventListener('mousedown', onMouseDown, false);
document.addEventListener('mousewheel', onMouseWheel, false);
window.addEventListener('resize', onWindowResize, false);
document.addEventListener('mouseover', function() {
overRenderer = true;
}, false);
document.addEventListener('mouseout', function() {
overRenderer = false;
}, false);
}
function loopInteraction(){
zoom(curZoomSpeed);
rotation.x += (target.x - rotation.x) * 0.1;
rotation.y += (target.y - rotation.y) * 0.1;
distance += (distanceTarget - distance) * 0.1;
dCameraX += (dCameraXTarget - dCameraX) * 0.1;
dCameraY += (dCameraYTarget - dCameraY) * 0.1;
dCameraZ += (dCameraZTarget - dCameraZ) * 0.1;
cameraPositionX = distance * Math.sin(rotation.x) * Math.cos(rotation.y) + dCameraX;
cameraPositionY = distance * Math.sin(rotation.y) + dCameraY;
cameraPositionZ = distance * Math.cos(rotation.x) * Math.cos(rotation.y) + dCameraZ;
viewX = REarth * Math.sin(rotation.x) * Math.cos(rotation.y);
viewY = REarth * Math.sin(rotation.y);
viewZ = REarth * Math.cos(rotation.x) * Math.cos(rotation.y);
camera.position.x = cameraPositionX;
camera.position.y = cameraPositionY;
camera.position.z = cameraPositionZ;
camera.lookAt(new THREE.Vector3(viewX * kREarth, viewY * kREarth, viewZ * kREarth));
}
function rotate(state){
if(state==="xy顶视"){
target.x =-Math.PI/2;
target.y =Math.PI/2;
distanceTarget =4750;
}
if(state==="鸟瞰"){
target.x =0.75 ;
target.y =0.75 ;
distanceTarget =3580;
}
if(state==="yz侧视"){
target.x =-Math.PI/2 ;
target.y =0 ;
distanceTarget =3580;
}
if(state==="xz侧视"){
target.y =-Math.PI ;
target.x =0 ;
distanceTarget =3580;
}
} | javascript |
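The `loopInteraction` pattern above - nudging each value a fixed fraction of the remaining gap toward its target on every frame - is exponential smoothing. A minimal sketch of the idea in Python (the function names here are illustrative, not from the original code):

```python
def ease(current, target, factor=0.1):
    # Move `current` a fixed fraction of the remaining gap toward `target`,
    # exactly like `rotation.x += (target.x - rotation.x) * 0.1` above.
    return current + (target - current) * factor

def converge(start, target, steps, factor=0.1):
    # Apply the easing step repeatedly; the remaining gap shrinks
    # geometrically, by a factor of (1 - factor) each frame.
    value = start
    for _ in range(steps):
        value = ease(value, target, factor)
    return value
```

After n frames the remaining gap is (1 - factor)^n of the original, which is why the camera glides smoothly toward its target and settles without overshooting.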
{
"parent": "item/generated",
"textures": {
"layer0": "decorationmod:items/block_json_obj"
}
} | json |
<filename>src/jobs_done10/_tests/test_repository.py
from jobs_done10.repository import Repository
def testNameFromURL():
tests = [
('/path/to/repo.git/', 'repo'),
('file:///path/to/repo.git/', 'repo'),
('file://~/path/to/repo.git/', 'repo'),
('git://host.xz/path/to/repo.git/', 'repo'),
('git://host.xz/~user/path/to/repo.git/', 'repo'),
('host.xz:/path/to/repo.git/', 'repo'),
('host.xz:path/to/repo.git', 'repo'),
('host.xz:~user/path/to/repo.git/', 'repo'),
('http://host.xz/path/to/repo.git/', 'repo'),
('https://host.xz/path/to/repo.git/', 'repo'),
('path/to/repo.git/', 'repo'),
('rsync://host.xz/path/to/repo.git/', 'repo'),
('ssh://host.xz/path/to/repo.git/', 'repo'),
('ssh://host.xz/path/to/repo.git/', 'repo'),
('ssh://host.xz/~/path/to/repo.git', 'repo'),
('ssh://host.xz/~user/path/to/repo.git/', 'repo'),
('ssh://host.xz:port/path/to/repo.git/', 'repo'),
('ssh://user@host.xz/path/to/repo.git/', 'repo'),
('ssh://user@host.xz/path/to/repo.git/', 'repo'),
('ssh://user@host.xz/~/path/to/repo.git', 'repo'),
('ssh://user@host.xz/~user/path/to/repo.git/', 'repo'),
('ssh://user@host.xz:port/path/to/repo.git/', 'repo'),
('user@host.xz:/path/to/repo.git/', 'repo'),
('<EMAIL>:path/to/repo.git', 'repo'),
('user@host.xz:~user/path/to/repo.git/', 'repo'),
('~/path/to/repo.git', 'repo'),
]
for url, expected_name in tests:
assert Repository(url=url).name == expected_name, 'Failed for url "%s"' % url
def testEquality():
assert Repository() == Repository()
assert Repository(url='http://example.com') == Repository(url='http://example.com')
assert Repository(url='http://example.com', branch='foo') != Repository(url='http://example.com')
assert Repository(url='http://example.com') != Repository(url='http://other.com')
assert Repository(url='http://example.com', branch='bar') == \
Repository(url='http://example.com', branch='bar')
assert Repository(url='http://example.com', branch='foo') != \
Repository(url='http://example.com', branch='bar')
| python |
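The URL-to-name mapping these tests exercise can be sketched as a small standalone helper (a hypothetical illustration, not the actual `Repository` implementation):

```python
def repo_name_from_url(url):
    # Drop trailing slashes and keep the last path component.
    tail = url.rstrip('/').split('/')[-1]
    # Handle scp-like forms such as "host.xz:path/to/repo.git".
    tail = tail.split(':')[-1]
    # Strip the conventional ".git" suffix.
    if tail.endswith('.git'):
        tail = tail[:-4]
    return tail
```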
I would appreciate the opportunity to discuss my credentials for this position with you during a personal interview. You will find that various experiences from both my academic career and my personal life align very well with your organization's mission of shaping community leaders who are working towards a more just and sustainable world. My responsibilities included the development and management of the site's editorial voice, style, and editorial calendar. I worked closely with healthcare professionals and medical editors to help them provide the best possible information to a consumer audience of patients, and I helped physicians learn to use their medical content to write user-friendly, readily comprehensible text. I have the ability to assess the possibilities of bringing about a convergence of interests and actions through processes of strategic planning, strategy formulation, and negotiation. My key skills include project planning, monitoring and evaluation, program management, formulating and managing regional plans and budgets, supervising staff and their work, and preparing project progress reports, press release statements, and urgent actions and appeals; preparing training modules on human rights, youth leadership, domestic workers, and labour rights; conducting training and workshops and providing technical support to partner organizations; and linking up and lobbying with government agencies and other organizations.
| english |
import * as React from 'react';
// @ts-ignore: no declaration file
import injectSheet from 'react-jss';
import { createStyles, ThemeTypes } from '@getstation/theme';
import { DragLayer } from 'react-dnd';
import AppDockIcon from './components/ConnectedAppDockIcon';
interface OuterProps {
onDraggingStateChange?: (isDragging: boolean) => void,
}
interface OwnProps {
item: {
applicationId: string,
manifestURL: string,
index: number,
},
itemType: string,
initialOffset: { x: number, y: number },
currentOffset: { x: number, y: number },
isDragging: boolean,
}
type Props = OuterProps & OwnProps & { classes: {
container: string,
title: string,
}};
const styles = (theme: ThemeTypes) => createStyles({
container: {
position: 'absolute',
pointerEvents: 'none',
zIndex: 100,
left: 0,
top: 0,
},
title: {
width: 220,
...theme.mixins.ellipsis(1),
},
});
function getItemStyles(props: Props) {
const { currentOffset } = props;
if (!currentOffset) {
return {
display: 'none',
};
}
const transform = `translate(${currentOffset.x > 30 ? currentOffset.x : 0}px, ${currentOffset.y}px)`;
return {
display: 'flex',
alignItems: 'center',
transform: transform,
WebkitTransform: transform,
// Uncomment for the expanded state
// color: 'rgba(60, 80, 93, 0.5)',
// backgroundColor: currentOffset.x > 40 ? '#e6e8eb' : 'initial',
// borderRadius: 100,
// width: currentOffset.x > 40 ? 280 : 49,
// height: 49,
// transition: 'width 250ms ease-in-out, background-color 250ms ease-in-out',
// boxShadow: currentOffset.x > 40 && '2px 2px 6px #BBB',
};
}
@DragLayer(monitor => ({
item: monitor.getItem(),
itemType: monitor.getItemType(),
initialOffset: monitor.getInitialSourceClientOffset(),
currentOffset: monitor.getSourceClientOffset(),
isDragging: monitor.isDragging(),
}))
class DockIconDragLayer extends React.PureComponent<Props> {
componentDidUpdate(prevProps: Props) {
const { onDraggingStateChange } = this.props;
if (!onDraggingStateChange) return;
if (prevProps.isDragging !== this.props.isDragging) {
// We are only interested when an 'APP_DOCK_APP' is dragged
// Though, when the drag ends (isDragging=false), `itemType`
// is null. To make sure `onDraggingStateChange` is called when
// drag ends, we don't check item type for `isDragging=false`.
if (this.props.isDragging && this.props.itemType === 'APP_DOCK_APP') {
onDraggingStateChange(true);
}
if (!this.props.isDragging) {
onDraggingStateChange(false);
}
}
}
render() {
const { classes, item, isDragging, itemType } = this.props;
if (itemType !== 'APP_DOCK_APP') {
return null;
}
if (!isDragging) {
return null;
}
return (
<div className={classes.container}>
<div style={getItemStyles(this.props)}>
<AppDockIcon
applicationId={item.applicationId}
active={true}
/>
{/* Uncomment for the expanded state */}
{/*<p className={classes.title}>{item.tabTitle}</p>*/}
</div>
</div>
);
}
}
export default injectSheet(styles)(DockIconDragLayer) as React.ComponentType<OwnProps>;
| typescript |
<reponame>lmcosta98/sokoban_solver
from mapa import Map
from utils import *
import asyncio
from queue import PriorityQueue
GOAL_COST = 1
FLOOR_COST = 1.5
KEEPER_MOVE_COST = 2
DIRECTIONS = ["w","a","s","d"]
# Nodes of a search tree
class SearchNode:
def __init__(self,state,parent,cost=None,heuristic=None,action=''):
self.state = state
self.parent = parent
self.cost = cost
self.heuristic = heuristic
self.action = action
    def in_parent(self, newstate):
        if self.parent is None:
            return False
        # check whether the new state is the parent of the current node
        if self.parent.state == newstate:
            return True
        # check whether the new state is already on the path travelled so far (grandparent, great-grandparent, etc.)
        return self.parent.in_parent(newstate)
def __lt__(self, other):
return self.cost + self.heuristic < other.cost + other.heuristic
def __le__(self, other):
return self.cost + self.heuristic <= other.cost + other.heuristic
def __str__(self):
return "no(" + str(self.state) + "," + str(self.parent) + ")"
def __repr__(self):
return str(self)
# Search trees
class SokobanSolver:
# construtor
def __init__(self,level_map: Map, method='manhatan'):
self.level_map = level_map
self.boxes_position = []
self.goals_position = []
self.deadlocks = []
self.method = method
self.deadlocks_pos = []
# obtain the path from the initial state to the goal state
def get_path(self,node):
if node.parent == None:
return [node.action]
path = self.get_path(node.parent)
path += [node.action]
return path
def result(self, current_state, direction):
'''
RECEIVES: the current state and a direction (action)
RETURNS: the next state
Calculates the next state given the current state and a direction (action):
-> Receives a dictionary containing the current state
-> Calculates the next state (positions of the keeper and boxes)
-> Returns the new state calculated
'''
next_state = calc_next_state(current_state, direction)
return next_state
def actions(self, current_state):
'''
RECEIVES: the current state
RETURNS: a list containing all the valid actions for the state
Calculates all the valid actions for a state:
-> Receives a dictionary containing the current state
-> For every direction in "wasd" calculates the next state (positions of the keeper and boxes)
        -> Verifies whether the boxes are in a deadlock
-> Return a list of valid actions (directions)
'''
valid_directions = []
for direction in DIRECTIONS:
next_state = calc_next_state(current_state,direction)
keeper = next_state['keeper']
boxes = next_state['boxes']
valid_directions.append(direction)
            # Check whether we are placing a box outside of the map or
# placing a box on top of another box
if Map.is_blocked(self.level_map,keeper) or len(list(set(boxes))) < len (boxes):
valid_directions.remove(direction)
continue
            for box in boxes:
                if box in self.deadlocks_pos:
                    valid_directions.remove(direction)
                    break
                elif self.isDeadlock(box):
                    self.deadlocks_pos.append(box)
                    self.deadlocks.append(next_state)
                    valid_directions.remove(direction)
                    break
return list(set(valid_directions))
def heuristic(self, current_state,cost=1,method='manhatan'):
keeper = current_state['keeper']
boxes = current_state['boxes']
goals = current_state['goals']
heuristic = calc_distance(keeper,boxes,method)
for box in boxes:
heuristic += calc_distance(box,goals,method)
return heuristic*cost
def cost(self, current_state, direction):
'''
RECEIVES: the current state and a direction (action)
RETURNS: the cost of achieving the next state
Calculates the next state given the current state and a direction (action):
-> Receives a dictionary containing the current state
-> Calculates the next state (positions of the keeper and boxes)
-> Returns the cost of achieving the new state
'''
prev_boxes = current_state['boxes']
next_state = calc_next_state(current_state, direction)
boxes = next_state['boxes']
for box in boxes:
if box in self.goals_position:
return GOAL_COST
# if we moved a box into a normal floor tile
boxes.sort()
prev_boxes.sort()
if str(boxes) != str(prev_boxes):
return FLOOR_COST
#if its neither a goal nor moved a box return the cost of a keeper move
return KEEPER_MOVE_COST
def satisfies(self, current_state):
'''
RECEIVES: current state
RETURNS: True or False
Verifies if all the boxes are placed on the goals:
-> Receives a dictionary containing the current state
-> Sorts the list containing the positions of the boxes
        -> Checks whether the list is equal to the list containing the goals' position
'''
current_state['boxes'].sort()
return current_state['boxes'] == self.goals_position
async def search(self, state):
self.goals_position = state['goals']
self.goals_position.sort()
self.open_nodes = PriorityQueue()
root = SearchNode(state,None,cost=0,heuristic=self.heuristic(current_state=state,method=self.method))
self.open_nodes.put((0,root))
open_nodes = 0
        while not self.open_nodes.empty():
await asyncio.sleep(0)
node = self.open_nodes.get()[1]
if self.satisfies(node.state):
print("OPEN NODES ", open_nodes)
return self.get_path(node)
for action in self.actions(node.state):
new_state = self.result(node.state,action)
if new_state not in self.deadlocks:
if node.in_parent(new_state):
continue
acc_cost = (node.cost + self.cost(node.state,action))
heur = self.heuristic(new_state, self.cost(node.state,action), self.method)
new_node = SearchNode(state=new_state,parent=node,cost=acc_cost,
heuristic=heur,action=action)
open_nodes += 1
self.open_nodes.put((acc_cost+heur, new_node))
return None
# auxiliary method for calculating deadlocks
def isDeadlock(self, pos):
i_x = 0
i_y = 0
list_x = [x[0] for x in self.goals_position]
list_y = [y[1] for y in self.goals_position]
if self.level_map.is_blocked(pos):
return True
if self.level_map.is_blocked((pos[0] + 1, pos[1])) or self.level_map.is_blocked((pos[0] - 1, pos[1])):
i_x += 1
if self.level_map.is_blocked((pos[0], pos[1] + 1)) or self.level_map.is_blocked((pos[0], pos[1] - 1)):
i_y += 1
# verifies if is not on a corner and if it is, make sure it's not a goal
if (i_x > 0 and i_y > 0) and (pos not in self.goals_position):
return True
if (i_x > 0 and (pos[0] not in list_x) and (self.level_map.is_blocked((pos[0]+1, pos[1]+1))) and ((self.level_map.is_blocked((pos[0]+1, pos[1]+2)))) and ((self.level_map.is_blocked((pos[0]+1, pos[1]-2)))) and (self.level_map.is_blocked((pos[0]+1,pos[1]-1)))):
return True
elif (i_x > 0 and (pos[0] not in list_x) and ((self.level_map.is_blocked((pos[0]-1, pos[1]+1))) and (self.level_map.is_blocked((pos[0]-1, pos[1]+2))) and (self.level_map.is_blocked((pos[0]-1, pos[1]-2))) and (self.level_map.is_blocked((pos[0]-1,pos[1]-1))))):
return True
elif (i_y > 0 and (pos[1] not in list_y) and (self.level_map.is_blocked((pos[0]+1, pos[1]+1))) and (self.level_map.is_blocked((pos[0]+2, pos[1]+1))) and (self.level_map.is_blocked((pos[0]-2, pos[1]+1))) and (self.level_map.is_blocked((pos[0]-1, pos[1]+1)))):
return True
elif (i_y > 0 and (pos[1] not in list_y) and (self.level_map.is_blocked((pos[0]+1, pos[1]-1))) and (self.level_map.is_blocked((pos[0]+2, pos[1]-1))) and (self.level_map.is_blocked((pos[0]-2, pos[1]-1))) and (self.level_map.is_blocked((pos[0]-1, pos[1]-1)))):
return True
return False | python |
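The heuristic used by `SokobanSolver` above sums distances from the keeper to the boxes and from each box to the goals via a `calc_distance` helper imported from `utils`. A self-contained Manhattan-distance sketch of the same idea (assuming `calc_distance` means the distance to the nearest target, which is one plausible reading of the helper):

```python
def manhattan(a, b):
    # Taxicab distance between two (x, y) grid positions.
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def sokoban_heuristic(state):
    # state: {'keeper': (x, y), 'boxes': [(x, y), ...], 'goals': [(x, y), ...]}
    # Keeper-to-nearest-box distance, plus, for each box, the distance
    # to its nearest goal.
    h = min(manhattan(state['keeper'], box) for box in state['boxes'])
    h += sum(min(manhattan(box, goal) for goal in state['goals'])
             for box in state['boxes'])
    return h
```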
/*
* Copyright 2018 <NAME>
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package pl.themolka.arcade.objective;
import pl.themolka.arcade.config.ConfigParser;
import pl.themolka.arcade.dom.Node;
import pl.themolka.arcade.parser.Context;
import pl.themolka.arcade.parser.InstallableParser;
import pl.themolka.arcade.parser.NestedParserMap;
import pl.themolka.arcade.parser.ParserLibrary;
import pl.themolka.arcade.parser.ParserException;
import pl.themolka.arcade.parser.ParserNotSupportedException;
import pl.themolka.arcade.parser.Produces;
import pl.themolka.arcade.parser.Result;
import java.util.Collections;
import java.util.Set;
@Produces(Objective.Config.class)
public class ObjectiveParser extends ConfigParser<Objective.Config<?>>
implements InstallableParser {
private NestedParserMap<ObjectiveManifest.ObjectiveParser<?>> parsers;
@Override
public void install(ParserLibrary library) throws ParserNotSupportedException {
super.install(library);
this.parsers = new NestedParserMap<>(library);
for (ObjectiveManifest manifest : ObjectiveManifest.manifests) {
ObjectiveManifest.ObjectiveParser<?> parser = manifest.defineParser(library);
if (parser != null) {
this.parsers.scanDedicatedClass(parser.getClass());
}
}
}
@Override
public Set<Object> expect() {
return Collections.singleton("objective");
}
@Override
protected Result<Objective.Config<?>> parseNode(Context context, Node node, String name, String value) throws ParserException {
ObjectiveManifest.ObjectiveParser<?> parser = this.parsers.parse(name);
if (parser == null) {
throw this.fail(node, name, value, "Unknown objective type");
}
return Result.fine(node, name, value, parser.parseWithDefinition(context, node, name, value).orFail());
}
}
| java |
This story is now available on Chillzee KiMo.
Smiling, with shyness flashing across her face, Manju asked, puzzled, "Why should I be angry about this?"
Bindu Vinod has written more than 31 Tamil series in Chillzee and many more Novels in Chillzee KiMo.
Copyright © 2009 - 2023 Chillzee.in. All Rights Reserved.
| english |
- VIRAL: Kajol Speaks For The First Time On Her Ugly Spat With Karan Johar!
- PICTURES! Deepika, Katrina, Ranveer, Sidharth, Alia, Sara & Others At Shahid Kapoor’s Birthday Bash!
- CAST ALERT: Tabu & Ayushmann Khurrana To Team Up For Sriram Raghavan's Next?
- SAY WHAT! A Biopic On Yo Yo Honey Singh On The Cards?
- Confirmed! Sara Ali Khan To Debut With Tiger Shroff In 'Student Of The Year 2'!
- Shahid Kapoor OPENS UP About Why He Wants To Do A Full On Dance Film After Padmavati!
- ROFL! Shahrukh Khan Trolls Aamir Khan, Aditya Chopra, Alia Bhatt & Kapil Sharma At Filmfare Awards!
- Controversial! Kangana Ranaut SKIPS Shahid Kapoor's B'day Bash Because Of His Rude Comments?
- Manisha Koirala: I'm Planning To Adopt A Baby Girl!
- Kangana Ranaut & Sugandha Mishra Clarify About The ‘Slap’ Comment!
- Diya Aur Baati Hum 2 Titled As 'Tu Sooraj Main Saanjh Piyaji'!
- Karan Kundra Replaced By Nikhil Chinnappa In Roadies!
- Dil Bole Oberoi SPOILER: What! Omkara & Gauri To Get Married; Saumya Returns To Take Revenge!
- Kuch Rang Pyar Ke Aise Bhi: Dev & Soha's Secret Meeting At The Bose House; Here's A Sneak Peek!
- Yeh Rishta Kya Kehlata Hai: Family Members Have A Blast During Kartik & Naira’s Haldi Ceremony!
- Kumkum Bhagya Major Twist: Purab Meets With An Accident During Abhi-Tanu’s Haldi Ceremony!
- When Reel Love Turned Real! Yeh Rishta Kya Kehlata Hai’s Mohsin Khan Confirms Dating Shivangi Joshi!
- WOW! Dileep And Shafi To Join Hands Yet Again?
- Dulquer Salmaan Lauds Vinayakan & Manikandan Achari On Winning CPC Awards!
- Vidya Balan's Withdrawal From Aami: The Actress To Face Legal Trouble?
- Attack On A Popular Actress, Ezra's Record Breaking Run & Other Mollywood News Of The Week!
- TAKE A LOOK! Durga Krishna – Prithviraj's New Heroine In Vimaanam!
- REVEALED! Honey Rose's Role In Omar Lulu's Chunkzz!
- Kalabhavan Mani's Death: Police Team To Wrap Up The Investigation!
- Did Angelina Hint At Reunion? Says 'We Are A Family And We Will Always Be A Family'
| english |
{
"Name": "Estojo para pistola",
"ShortName": "Estojo",
"Description": "O estojo para pistola e armazenamento de munição. Conta com um mecanismo de tranca confiável que requer uma chave."
} | json |
<reponame>medialab/climateDebateExplorer
{
"actors": [
"Umbrella Group",
"European Union",
"Group of 77"
],
"countries": [
"China"
],
"enb_end_date": "18-May-07",
"enb_long_title": "Twenty-sixth sessions of the Subsidiary Bodies (SB 26) of the United Nations Framework Convention on Climate Change (UNFCCC)",
"enb_short_title": "SB 26",
"enb_start_date": "07-May-07",
"enb_url": "http://www.iisd.ca/vol12/enb12333e.html",
"id": "enb12333e_12",
"section_title": "INFORMATION IN NATIONAL COMMUNICATIONS:",
"sentences": [
"The other item held in abeyance was on information contained in national communications from parties not included in Annex I to the Convention.",
"This issue had first arisen at SBI 24, when the Umbrella Group and EU had asked the SBI to consider information from non-Annex I parties in all of their national communications, including their second and, where appropriate, subsequent national communications.",
"Developed countries took the view that this request was in accordance with Convention Article 10.2 (consideration of national communications), and had expressed the hope that SBI 26 could make better use of the valuable information that these documents contain and assist non-Annex I parties to further improve these documents (FCCC/SBI/2006/MISC.12).",
"However, at SBI 26, the G-77/China questioned the inclusion of this agenda item during the opening plenary, and the issue was held in abeyance pending consultations by Chair Asadi.",
"During the closing plenary on 18 May, Chair Asadi reported that his consultations had not resulted in agreement to include this agenda item.",
"He indicated that the issue would be included on the SBI 27 agenda with a footnote noting that there had been no consensus on whether to keep this item on the agenda at SBI 26."
],
"subtype": "",
"topics": [],
"type": "SBI"
} | json |
const vscode = require('vscode');
const fs = require('fs');
const path = require('path');

const util = {
    /**
     * Get the root directory of the current project. There are 3 ways to use it:<br>
     * getProjectPath(uri) where uri is the path of a file inside the project<br>
     * getProjectPath(document) where document is the document object of the currently open file<br>
     * getProjectPath() automatically takes the document object from activeTextEditor, and reports an error if none is found
     * @param {*} document
     */
getProjectPath (document) {
if (!document) {
document = vscode.window.activeTextEditor ? vscode.window.activeTextEditor.document : null;
}
if (!document) {
      this.showError('The active editor is not a file, or no file is open!');
return '';
}
const currentFile = (document.uri ? document.uri : document).fsPath;
let projectPath = null;
let workspaceFolders = vscode.workspace.workspaceFolders.map(item => item.uri.path);
    // Because Multi-root workspaces exist and there is no particularly good way to detect them yet, use this crude check for now:
    // if there is only one root folder, read its subfolders as workspaceFolders
if (workspaceFolders.length == 1 && workspaceFolders[0] === vscode.workspace.rootPath) {
const rootPath = workspaceFolders[0];
var files = fs.readdirSync(rootPath);
workspaceFolders = files.filter(name => !/^\./g.test(name)).map(name => path.resolve(rootPath, name));
      // vscode.workspace.rootPath can be inaccurate and is deprecated
      // return vscode.workspace.rootPath + '/' + this._getProjectName(vscode, document);
}
workspaceFolders.forEach(folder => {
if (currentFile.indexOf(folder) === 0) {
projectPath = folder;
}
})
if (!projectPath) {
      this.showError('Failed to resolve the project root path!');
return '';
}
return projectPath;
},
  /**
   * Show an error message
   */
showError: function (info) {
vscode.window.showErrorMessage(info);
}
}
module.exports = util;
| javascript |
# Configuration as Code Plugin Releases
## Alpha Version
The 0.1-alpha version of the Configuration as Code Plugin has a number of issues that may break your Jenkins instance - details below. DO NOT USE it in a production environment; use it for testing only.
### Features available:
- Configure (some) plugins via yaml file
- Reload configuration from yaml file - manual step, available under `Manage Jenkins`:arrow_right: `Configuration as Code`
- Vault support - details in [README](../README.md)
- seed job creation - details in [README](../README.md)
- build agents configuration
- github OAuth support
- Matrix (and Matrix Project) Authorization Strategy support
### Limitations:
- The location of the yaml configuration file MUST be provided via the environment variable CASC_JENKINS_CONFIG - it can be a location on disk that is readable from the Jenkins process's perspective
- plugins have to be installed manually (and that must happen before you try to configure them)
### How to use it
1. Add experimental plugins repository ([link](http://updates.jenkins-ci.org/experimental/update-center.json))
2. Prepare jenkins.yaml and store the location in CASC_JENKINS_CONFIG environment variable
3. Install *Configuration as Code* plugin (& all the others plugins you want to use)
4. Once installed, the plugin will look for the configuration file and configure Jenkins accordingly
5. To change the configuration go to `Manage Jenkins`:arrow_right: `Configuration as Code` and hit "Reload" button
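For step 2, a minimal `jenkins.yaml` might look like the fragment below (a hedged illustration - the available attributes depend on which plugins you have installed, so check the plugin's own examples for your setup):

```yaml
jenkins:
  systemMessage: "Jenkins configured automatically by Configuration as Code plugin"
  numExecutors: 2
```

Point `CASC_JENKINS_CONFIG` at this file's location before Jenkins starts.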
| markdown |
<!doctype html>
<html>
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=no">
<title>Merging deep learning with physical models for the analysis of modern cosmological surveys</title>
<meta name="description" content="McWilliams/UPitt Astro Seminar, August 21st 2020">
<meta name="author" content="<NAME>">
<meta name="apple-mobile-web-app-capable" content="yes">
<meta name="apple-mobile-web-app-status-bar-style" content="black-translucent">
<link rel="stylesheet" href="reveal.js/css/reset.css">
<link rel="stylesheet" href="reveal.js/css/reveal.css">
<link rel="stylesheet" href="reveal.js/css/theme/darkenergy.css">
<!-- Theme used for syntax highlighting of code -->
<link rel="stylesheet" href="reveal.js/lib/css/monokai.css">
<!-- Printing and PDF exports -->
<script>
var link = document.createElement( 'link' );
link.rel = 'stylesheet';
link.type = 'text/css';
link.href = window.location.search.match( /print-pdf/gi ) ? 'reveal.js/css/print/pdf.css' : 'reveal.js/css/print/paper.css';
document.getElementsByTagName( 'head' )[0].appendChild( link );
</script>
</head>
<body>
<div class="reveal">
<!-- Any section element inside of this container is displayed as a slide -->
<div class="slides">
<section data-background-image="assets/lsst_stills_0009_crop.jpg" >
<div class="container">
<div class="title" style="border-radius: 20px; background-color:rgba(0, 0, 0, 0.4);">
<h1>Merging deep learning with physical models</h1>
<h2> for the analysis of modern cosmological surveys</h2>
</div>
</div>
<hr>
<div style="border-radius: 20px; background-color:rgba(0, 0, 0, 0);">
<div class="container" >
<div class="col">
<div align="left" style="margin-left: 20px;">
<h2><NAME></h2>
<br>
<img src="assets/CosmoStatDarkBK.png" class="plain"></img>
<br>
</div>
</div>
<div class="col">
<br>
<br>
<br>
<br>
<img src="assets/logo_cnrs.png" class="plain" height="150"></img>
</div>
<div class="col">
<br>
<br>
<img src="assets/desc-logo-inv.png" class="plain" height="200"></img>
</div>
</div>
<div> slides at <a href="https://eiffl.github.io/talks/CMU2020">eiffl.github.io/talks/CMU2020</a> </div>
</div>
</section>
<!-- <section data-background-image="assets/WMAP_timeline_large.jpg">
<h3 class='slide-title' style="position:absolute;top:0;"> the $\Lambda$CDM view of the Universe </h3>
<br> <br>
<div class="container">
<div class="col" style="flex: 0 0 40em;">
</div>
<div class="col">
<img class="plain" data-src="assets/Euclid.png" style="width: 300px"/>
<img class="plain" data-src="assets/wfirstlogo.png" style="width: 300px"/>
<img class="plain" data-src="assets/vrro.png" style="width: 300px"/>
</div>
</div>
<br> <br> <br> <br> <br> <br> <br> <br> <br> <br> <br> <br>
</section> -->
<section>
<section data-background-video="assets/animation-day-to-night.mov" data-background-video-muted>
<h3 class='slide-title'>the Rubin Observatory Legacy Survey of Space and Time</h3>
<div class="container">
<div class="col">
<ul>
<li class="fragment fade-up"> 1000 images each night, 15 TB/night for 10 years</li>
<br>
<li class="fragment fade-up"> 18,000 square degrees, observed once every few days</li>
<br>
<li class="fragment fade-up"> Tens of billions of objects, each one observed $\sim1000$ times</li>
</ul>
<br >
<!-- <p class="fragment fade-up"> $\Longrightarrow$ Incredible potential for discovery, along with <b>unprecedented challenges</b >.</p> -->
</div>
<div class="col">
<video class="fragment fade-up" data-fragment-index="1" data-autoplay data-src="assets/obsim.mp4" type="video/mp4" />
</div>
</div>
</section>
<section data-transition="fade-in fade-out" data-background="assets/gal_sdss.png" data-vertical-align-top>
<p>Previous generation survey: SDSS</p>
<br> <br> <br> <br> <br> <br> <br> <br> <br> <br> <br> <br>
<br> <br> <br> <br> <br> <br> <br>
<div style="float:right; font-size: 20px">Image credit: <NAME></div>
</section>
<section data-transition="fade-in fade-out" data-background="assets/gal_des.png" data-vertical-align-top>
<p>Current generation survey: DES</p>
<br> <br> <br> <br> <br> <br> <br> <br> <br> <br> <br> <br>
<br> <br> <br> <br> <br> <br> <br>
<div style="float:right; font-size: 20px">Image credit: <NAME></div>
</section>
<section data-transition="fade-in fade-out" data-background="assets/gal_hsc.png" data-vertical-align-top>
<p>LSST precursor survey: HSC</p>
<br> <br> <br> <br> <br> <br> <br> <br> <br> <br> <br> <br>
<br> <br> <br> <br> <br> <br> <br>
<div style="float:right; font-size: 20px">Image credit: <NAME></div>
</section>
</section>
<section>
<h3 class='slide-title'>The challenges of modern surveys</h3>
<div style="float:left;" >
$\Longrightarrow$ Modern surveys will provide <b>large volumes</b> of <b>high quality</b> data
</div>
<br>
<div class="container">
<div class="col">
<div class="block fragment">
<div class="block-title">
A Blessing
</div>
<div class="block-content">
<ul>
<li>Unprecedented statistical power
</li>
<br>
<li>Great potential for new discoveries
</li>
</ul>
</div>
</div>
<br>
<div class="block fragment">
<div class="block-title">
A Curse
</div>
<div class="block-content">
<ul>
<li> Existing methods are reaching their limits at every
step of the science analysis
</li>
<br>
<li>Control of systematics is paramount
</li>
</ul>
</div>
</div>
</div>
<div class="col">
LSST forecast on dark energy parameters
<img class="plain" data-src="assets/LSST_forecast_Y10.png" style="height :400px;"/>
</div>
</div>
<div class="fragment">
$\Longrightarrow$ Dire need for <b class="alert">novel analysis techniques</b> to fully realize the potential of modern surveys:
<!-- <ul>
<li class="fragment grow">Low level image processing</li>
<li>Fast emulation of complex simulations</li>
<li>Probabilistic inference</li>
</ul> -->
</div>
</section>
<section>
<h3 class='slide-title'>Outline of this talk</h3>
<br>
<br>
<h3> How can we merge <b>Deep Learning</b> and <b>Physical Models</b> to help us make sense <br> of increasingly complex data?</h3>
<br>
<br>
<br>
<ul>
<li class="fragment grow">Generative Models for Galaxy Image Simulations</li>
<br>
<br>
<br>
<li class="fragment grow">Differentiable models of the Large Scale Structure</li>
<br>
<br>
<br>
<li class="fragment grow">Differentiable cosmological computations</li>
<br>
</ul>
<br>
<br>
<br>
</section>
<section>
<h2>Deep Generative Models for Galaxy Image Simulations</h2>
<br>
<a href="https://arxiv.org/abs/2008.03833"><img src="https://img.shields.io/badge/astro--ph.IM-arXiv%3A2008.03833-B31B1B.svg" class="plain" style="height:25px;"/></a>
<a href="https://github.com/McWilliamsCenter/galsim_hub"><img src="https://badgen.net/badge/icon/github?icon=github&label" class="plain" style="height:25px;"/></a>
<a href="https://colab.research.google.com/github/McWilliamsCenter/galsim_hub/blob/master/notebooks/GalsimHubDemo.ipynb" target="_parent"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab" class="plain" style="height:25px;"/></a>
<hr>
<br>
<br>
<div align="left" style="margin-left: 20px;">
<h3>Work in collaboration with <br>
<NAME>, <NAME>, <NAME>, <NAME>, <NAME></h3>
</div>
<br>
<br>
<div style="float:center; font-size: 25px"><b>Lanusse</b> et al. (2020) <br>
<!-- Mandelbaum, <b> Lanusse</b>, et al. (2018)<br> -->
Ravanbakhsh, <b>Lanusse</b>, et al. (2017)
</div>
</section>
<section>
<section>
<h3 class="slide-title"> The weak lensing shape measurement problem</h3>
<div>
<img class="plain" data-src="assets/great.jpg" />
</div>
<div class="block fragment" >
<div class="block-title">
Shape measurement biases
</div>
<div class="block-content">
$$ \langle e \rangle = \ (1 + m) \ \gamma \ + \ c $$
<ul>
<li class="fragment fade-up"> Can be calibrated on image simulations
</li>
<li class="fragment fade-up"> How complex do the simulations need to be?
</li>
</ul>
</div>
</div>
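As a toy illustration of the calibration idea (all numbers below are hypothetical, not the GREAT3 or HSC pipeline): apply a grid of known shears to simulations, then fit $\langle e \rangle = (1+m)\gamma + c$ to recover the biases.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy "measurement": a biased shear estimator with
# multiplicative bias m_true and additive bias c_true, plus shape noise.
m_true, c_true = 0.02, 1e-3

def measure_mean_ellipticity(gamma, n_gal=200_000):
    intrinsic = rng.normal(0.0, 0.26, n_gal)   # intrinsic ellipticity scatter
    e = intrinsic + (1 + m_true) * gamma + c_true
    return e.mean()

# Calibration: apply known shears and fit <e> = (1 + m) gamma + c
gammas = np.linspace(-0.05, 0.05, 11)
responses = np.array([measure_mean_ellipticity(g) for g in gammas])
slope, intercept = np.polyfit(gammas, responses, 1)
m_hat, c_hat = slope - 1.0, intercept
```

With enough simulated galaxies the recovered `m_hat`, `c_hat` converge to the input biases; the open question on this slide is how realistic those simulations must be.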
</section>
<!-- <section>
<h3 class="slide-title"> Image simulations for shear calibration in the HSC survey</h3>
<div style="float:right; font-size: 20px">Mandelbaum,<b>Lanusse</b>, <NAME>, et al. (2018)</div>
<img class="plain" data-src="assets/fig_illustrator_red_2.png" style="height: 450px;"/>
<div class="block">
<div class="block-title">
Hyper Suprime-Cam Subaru Strategic Program
</div>
<div class="block-content">
<ul>
<li> 1400 sq. deg. in the wide survey down to 26.4 mag in i-band
</li>
<li> Essentially a smaller area precursor of LSST
</li>
</ul>
</div>
</div>
</section>
<section>
<h3 class="slide-title"> Simulation and Calibration strategy</h3>
<br>
<img class="plain" data-src="assets/simu.png"/>
<br>
<br>
<div class="block">
<div class="block-title">
The GREAT3 approach
</div>
<div class="block-content">
<ul>
<li> Input galaxies from deep HST/ACS COSMOS images (25.2 imag)
</li>
<br>
<li> Apply a range of PSFs and noise levels sampled from the survey
</li>
<br>
<li> Measure response of shape measurement to a known shear
</li>
</ul>
</div>
</div>
</section> -->
<!-- <section>
<h3 class="slide-title"> From simple to complex simulations... Illustration from HSC calibration</h3>
<div style="float:right; font-size: 20px">Mandelbaum,<b>Lanusse</b>, <NAME>, et al. (2018)</div>
<br>
<div class='container'>
<div class='col'>
<img class="plain" data-src="assets/HSC_sims.png"/>
</div>
<div class='col'>
<img class="plain fragment fade-left" data-src="assets/HSC_sims_diff.png"/>
</div>
</div>
</section> -->
<section>
<h3 class="slide-title"> Impact of galaxy morphology</h3>
<div class='container'>
<div class='col'>
<img class="plain" data-src="assets/real_gal-inv.png" style="height: 350px;"/>
<br>
<div style="float:left; font-size: 20px">Mandelbaum, et al. (2013), Mandelbaum, et al. (2014)</div>
</div>
<div class='col'>
<img class="plain fragment fade-up" data-src="assets/great3_calib2-inv.png" style="height: 425px;"/>
</div>
</div>
<br>
<div class="block fragment fade-up">
<div class="block-title">
The need for data-driven generative models
</div>
<div class="block-content">
<ul>
<li> Lack or inadequacy of physical model
</li>
<li> Extremely computationally expensive simulations
</li>
</ul>
</div>
</div>
</section>
</section>
<section class="inverted" data-background="#000">
<h2>
Can we learn a model for the signal from the data itself?</h2>
</section>
<section>
<h3 class="slide-title"> The evolution of generative models </h3>
<br> <br> <br>
<div class='container'>
<div class='col'>
<div style="position:relative; width:500px; height:500px; margin:0 auto;">
<img class="fragment current-visible plain" data-src="assets/DBN.png" style="position:absolute;top:0;left:0;width:500px;" data-fragment-index="0" />
<img class="fragment current-visible plain" data-src="assets/vae_faces.jpg" style="position:absolute;top:0;left:0;width:500px;" data-fragment-index="1" />
<img class="fragment current-visible plain" data-src="assets/gan-samples-1.png" style="position:absolute;top:0;left:0;width:500px;" data-fragment-index="2" />
<img class="fragment plain" data-src="assets/karras2017.png" style="position:absolute;top:0;left:0;width:500px;" data-fragment-index="3" />
</div>
</div>
<div class='col'>
<ul>
<li class="fragment" data-fragment-index="0"> Deep Belief Network <br> (Hinton et al. 2006) </li>
<br>
<li class="fragment" data-fragment-index="1"> Variational AutoEncoder <br> (Kingma & Welling 2014) </li>
<br>
<li class="fragment" data-fragment-index="2"> Generative Adversarial Network <br> (Goodfellow et al. 2014)</li>
<br>
<li class="fragment" data-fragment-index="3"> <NAME> <br> (Arjovsky et al. 2017) </li>
</ul>
</div>
</div>
<br> <br> <br>
</section>
<!-- <section>
<h3 class="slide-title"> A visual Turing test </h3>
<div class="container">
<div class="col">
<img data-src="assets/samples_pixel_cnn.png" class="plain" style="height: 500px;" ></img>
<br>
<div class="fragment fade-up" data-fragment-index="0"> Fake PixelCNN samples </div>
</div>
<div class="col">
<img data-src="assets/sdss5.png" class="plain" style="height: 500px;" ></img>
<br>
<div class="fragment fade-up" data-fragment-index="0"> Real SDSS </div>
</div>
</div>
</section> -->
<section>
<h3 class="slide-title"> Complications specific to astronomical images: spot the differences!</h3>
<div class="container">
<div class="col">
<img data-src="assets/celeba.png" class="plain" style="height: 450px;" ></img>
<br>
CelebA
</div>
<div class="col">
<img data-src="assets/hsc_images.png" class="plain" style="height: 450px;" ></img>
<br>
HSC PDR-2 wide
</div>
</div>
<br>
<div >
<ul>
<li class="fragment"> There is <b class="alert">noise</b></li>
<li class="fragment"> We have a <b class="alert">Point Spread Function</b></li>
</ul>
</div>
</section>
<section>
<section>
<h3 class="slide-title" style="position:absolute;top:0;">A Physicist's approach: let's build a model</h3>
<br>
<br>
<div class="container">
<div class="col">
<img class="plain fragment" data-src="assets/rand_z_square.png" style="height: 150px" data-fragment-index="4"/>
</div>
<div class="col">
<img class="plain fragment" data-src="assets/cosmos_gal.png" style="width: 200px" data-fragment-index="3"/>
</div>
<div class="col">
<img class="plain fragment" data-src="assets/cosmos_gal_psf.png" style="width: 200px" data-fragment-index="2"/>
</div>
<div class="col">
<img class="plain fragment" data-src="assets/cosmos_gal_pix.png" style="width: 200px" data-fragment-index="1"/>
</div>
<div class="col">
<img class="plain fragment" data-src="assets/cosmos_gal_ground.png" style="width: 200px" data-fragment-index="0"/>
</div>
</div>
<div class="container" style="position:relative; width:1000px; height:50px; margin:0 auto;">
<div class='col fragment' data-fragment-index='4'> <font size="10"> $\longrightarrow$ </font> <br> $g_\theta$ </div>
<div class='col fragment' data-fragment-index='3'> <font size="10"> $\longrightarrow$ </font> <br> PSF </div>
<div class='col fragment' data-fragment-index='2'> <font size="10"> $\longrightarrow$ </font> <br> Pixelation</div>
<div class='col fragment' data-fragment-index='1'> <font size="10"> $\longrightarrow$ </font> <br> Noise </div>
</div>
<div class="container">
<div class="col">
<div style="position:relative; width:400px; height:300px; margin:0 auto;">
<img data-src="assets/pgm_0.png" class="plain fragment current-visible " style="position:absolute;top:0;left:0;height:300px;" data-fragment-index="0"/>
<img data-src="assets/pgm_1.png" class="plain fragment current-visible " style="position:absolute;top:0;left:0;height:300px;" data-fragment-index="1"/>
<img data-src="assets/pgm_1.png" class="plain fragment current-visible " style="position:absolute;top:0;left:0;height:350px;" data-fragment-index="2"/>
<img data-src="assets/pgm_2.png" class="plain fragment current-visible " style="position:absolute;top:0;left:0;height:350px;" data-fragment-index="3"/>
<img data-src="assets/pgm_3.png" class="plain fragment " style="position:absolute;top:0;left:0;height:300px;" data-fragment-index="4"/>
</div>
</div>
<div class=" col">
<div class="block fragment" data-fragment-index="0">
<div class="block-title">
Probabilistic model
</div>
<div class="block-content">
<div style="position:relative; width:400px; height:100px; margin:0 auto;">
<div class="plain fragment current-visible " style="position:absolute;top:0;left:0;width:400px;" data-fragment-index="0"> $$ x \sim ? $$ </div>
<div class="plain fragment current-visible " style="position:absolute;top:0;left:0;width:400px;" data-fragment-index="1"> $$ x \sim \mathcal{N}(z, \Sigma) \quad z \sim ? $$<br>latent $z$ is a denoised galaxy image</div>
<div class="plain fragment current-visible " style="position:absolute;top:0;left:0;width:400px;" data-fragment-index="2"> $$ x \sim \mathcal{N}( \mathbf{P} z, \Sigma) \quad z \sim ?$$<br>latent $z$ is a super-resolved and denoised galaxy image</div>
<div class="plain fragment current-visible " style="position:absolute;top:0;left:0;width:400px;" data-fragment-index="3"> $$ x \sim \mathcal{N}( \mathbf{P} (\Pi \ast z), \Sigma) \quad z \sim ? $$<br>latent $z$ is a deconvolved, super-resolved, and denoised galaxy image </div>
<div class="plain fragment " style="position:absolute;top:0;left:0;width:400px;" data-fragment-index="4"> $$ x \sim \mathcal{N}( \mathbf{P} (\Pi \ast g_\theta(z)), \Sigma) \quad z \sim \mathcal{N}(0, \mathbf{I}) $$ <br>latent $z$ is a Gaussian sample<br> <b class="alert"> $\theta$ are parameters of the model</b> </div>
</div>
<br>
<br>
<br>
</div>
</div>
</div>
</div>
<div class="fragment"> $\Longrightarrow$ <b class="alert"> Decouples the morphology model from the observing conditions</b>.</div>
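A minimal numpy sketch of this forward model, $x \sim \mathcal{N}(\mathbf{P}(\Pi \ast g_\theta(z)), \Sigma)$ — the generator, PSF kernel, and noise level below are illustrative placeholders, not the trained model:

```python
import numpy as np

rng = np.random.default_rng(42)

def g_theta(z):
    # placeholder "generator": a fixed smoothing of the latent map
    k = np.outer(np.hanning(8), np.hanning(8))
    return np.abs(np.fft.ifft2(np.fft.fft2(z) * np.fft.fft2(k, z.shape)).real)

def convolve_psf(img, psf):
    # PSF convolution Pi * img, done in Fourier space
    return np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf, img.shape)).real

def pixelate(img, factor=2):
    # average-pool down-sampling, standing in for the pixelation operator P
    h, w = img.shape
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

z = rng.normal(size=(64, 64))                     # z ~ N(0, I)
psf = np.outer(np.hanning(5), np.hanning(5))
psf /= psf.sum()
clean = pixelate(convolve_psf(g_theta(z), psf))   # P(Pi * g(z))
x = clean + rng.normal(0.0, 0.01, clean.shape)    # x ~ N(P(Pi * g(z)), Sigma)
```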
</section>
<section>
<h3 class="slide-title">How to train your <s>dragon</s> model</h3>
<div class="container">
<div class="col">
<img data-src="assets/pgm.png" class="plain" style="height: 300px;" ></img>
</div>
<div class="col">
<ul>
<li> Training the generative model amounts to finding $\theta_\star$ that
<b>maximizes the marginal likelihood</b> of the model:
$$p_\theta(x | \Sigma, \Pi) = \int \mathcal{N}( \mathbf{P} ( \Pi \ast g_\theta(z) ), \Sigma) \ p(z) \ dz$$
<div> $\Longrightarrow$ This is <b class="alert">generally intractable</b></div>
</li>
<br>
<li class="fragment fade-up"> Efficient training of the parameters $\theta$ is made possible by <b class="alert">Amortized Variational Inference</b>.
</li>
</ul>
</div>
</div>
<div class="block fragment fade-up">
<div class="block-title">
Auto-Encoding Variational Bayes (Kingma & Welling, 2014)
</div>
<div class="block-content">
<ul>
<li class="fragment fade-up"> We introduce a <b>parametric distribution</b> $q_\phi(z | x, \Pi, \Sigma)$ which aims to model the
posterior $p_{\theta}(z | x, \Pi, \Sigma)$.
</li>
<br>
<li class="fragment fade-up"> Working out the KL divergence between these two distributions leads to:
$$\log p_\theta(x | \Sigma, \Pi) \quad \geq \quad - \mathbb{D}_{KL}\left( q_\phi(z | x, \Sigma, \Pi) \parallel p(z) \right) \quad + \quad \mathbb{E}_{z \sim q_{\phi}(. | x, \Sigma, \Pi)} \left[ \log p_\theta(x | z, \Sigma, \Pi) \right]$$
$\Longrightarrow$ This is the <b>Evidence Lower-Bound</b>, which is differentiable with respect to $\theta$ and $\phi$.
</li>
</ul>
</div>
</div>
</section>
<section>
<h3 class="slide-title">The famous Variational Auto-Encoder</h3>
<img data-src="assets/vae.png" class="plain" style="height: 450px;"> </img>
<br>
<br>
$$\log p_\theta(x| \Sigma, \Pi ) \geq - \underbrace{\mathbb{D}_{KL}\left( q_\phi(z | x, \Sigma, \Pi) \parallel p(z) \right)}_{\mbox{code regularization}} + \underbrace{\mathbb{E}_{z \sim q_{\phi}(. | x, \Sigma, \Pi)} \left[ \log p_\theta(x | z, \Sigma, \Pi) \right]}_{\mbox{reconstruction error}} $$
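The two terms of the bound can be sketched numerically. This is a generic Monte-Carlo ELBO estimate with toy linear encoder/decoder functions standing in for the networks, not the architecture used in this work:

```python
import numpy as np

rng = np.random.default_rng(0)

def kl_diag_gaussian(mu, logvar):
    # D_KL( N(mu, diag(exp(logvar))) || N(0, I) ): the code-regularization term
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar)

def gaussian_loglik(x, x_rec, sigma=0.1):
    # log p(x | z) for an isotropic Gaussian observation model
    return -0.5 * np.sum(((x - x_rec) / sigma) ** 2
                         + np.log(2 * np.pi * sigma**2))

def elbo(x, encode, decode, n_samples=8):
    mu, logvar = encode(x)
    total = 0.0
    for _ in range(n_samples):
        # reparameterization trick: z = mu + sigma * eps keeps the bound
        # differentiable with respect to the encoder parameters
        z = mu + np.exp(0.5 * logvar) * rng.normal(size=mu.shape)
        total += gaussian_loglik(x, decode(z))    # reconstruction term
    return total / n_samples - kl_diag_gaussian(mu, logvar)

# Toy linear encoder/decoder pair
x = rng.normal(size=4)
enc = lambda x: (0.5 * x, np.full_like(x, -2.0))
dec = lambda z: 2.0 * z
bound = elbo(x, enc, dec)
```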
</section>
<!-- <section>
<h3 class="slide-title"> Conditional sampling in VAE latent space</h3>
<div class="container">
<div class="col">
<img data-src="assets/conditional_flow.png" class="plain" ></img>
</div>
<div class="col">
<ul>
<li> We build a latent space model $p_\varphi(z)$ using a Masked Autoregressive Flow (MAF) (Papamakarios, et al. 2017)
</li>
<br>
<li class="fragment"> While we are learning to sample from the latent space, we can also <b class="alert"> learn to sample conditionaly</b>:
$$ p_\varphi(z | y) $$
</li>
<br>
<li class="fragment"> Here we learn to sample images conditioned on:
<ul>
<li> Size: half-light radius $r$
</li>
<li> Brightness: I band magnitude $mag\_auto$
</li>
<li> Redshift: COSMOS photometric redshift $zphot$
</li>
</ul>
</li>
</ul>
</div>
</div>
</section> -->
</section>
<section>
<h3 class="slide-title"> Illustration on HST/ACS COSMOS images</h3>
<div class="container">
<div class="col">
<img data-src="assets/Figure_autoencode.png" class="plain" ></img>
<br>
<b>Fitting observations</b> with the VAE and a Bulge+Disk parametric model.
</div>
<div class="col">
<ul>
<li>Training set: <b>GalSim COSMOS HST/ACS postage stamps</b>
<br>
<ul>
<li> 80,000 deblended galaxies from I < 25.2 sample
</li>
<li> Drawn on 128x128 stamps at 0.03 arcsec resolution
</li>
<li> Each stamp comes with:
<ul>
<li>PSF</li>
<li>Noise power spectrum</li>
<li>Bulge+Disk parametric fit</li>
</ul>
</li>
</ul>
</li>
<br>
<li> Auto-Encoder model:
<ul>
<li> Deep residual autoencoder:<br> 7 stages of 2 resnet blocs each
</li>
<li> Dense bottleneck of size 16.
</li>
<li> Outputs a positive, noiseless, deconvolved galaxy surface brightness.
</li>
</ul>
</li>
<br>
<li class="fragment"><b class="alert">Generative models are conditioned on:</b><br>
<b>mag_auto, flux_radius, zphot</b></li>
</ul>
</div>
</div>
</section>
<section>
<section>
<h3 class="slide-title"> Flow-VAE samples</h3>
<br>
<br>
<img class="current-visible plain" data-src="assets/lanusse2020_figure1.png"/>
</section>
<section>
<h3 class="slide-title">Second order moments</h3>
<img class="current-visible plain" data-src="assets/lanusse2020_fig6a.png" style="width:550px;"/><br>
<img class="current-visible plain" data-src="assets/lanusse2020_fig6b.png" style="width:550px;"/>
</section>
<section>
<h3 class="slide-title"> Testing conditional sampling</h3>
<img data-src="assets/lanusse2020_figure7.png" class="plain"></img>
<div class="fragment">$\Longrightarrow$ We can successfully condition galaxy generation.</div>
</section>
<section>
<h3 class="slide-title"> Testing galaxy morphologies </h3>
<br>
<br>
<img data-src="assets/gini_m20.png" class="plain"></img>
<br>
<br>
</section>
</section>
<section class="inverted" data-background="#000">
<h2>
How do we make these models useful in practice?</h2>
</section>
<section>
<section>
<h3 class="slide-title"> GalSim Hub: GalSim Interface and Online Repository of Morphology Models </h3>
<br>
<br>
To make generative models easy to share and use, we introduce <b>GalSim Hub</b>:
<ul>
<br><img data-src="assets/tfhub.png" class="plain" style="float:right;width:500px"></img><br>
<li >Defines a <b class="alert">specification for packaging</b> deep generative models as TensorFlow Modules. </li>
<br>
<br>
<li class="fragment">Provides a <b class="alert">GalSim interface</b> to seamlessly use deep generative models like any other light profile.</li>
<br>
<br>
<li class="fragment" >Provides a community maintained <b class="alert">repository of trained models</b>.</li>
</ul>
<br>
<br>
<br>
<div class="fragment">
<img data-src="assets/github.png" class="plain" style="float:center;height:70px"/><br><a href="https://github.com/McWilliamsCenter/galsim_hub">https://github.com/McWilliamsCenter/galsim_hub</a>
</div>
</section>
<section>
<h3 class="slide-title">Minimum Working Example</h3>
Assuming TensorFlow 1.15 and GalSim are already installed:
<pre class="bash"><code data-trim data-noescape>
$ pip install galsim-hub
</code></pre>
<pre class="python"><code data-trim data-noescape>
import galsim
import galsim_hub
from astropy.table import Table
# Load generative model from the online repository
model = galsim_hub.GenerativeGalaxyModel('hub:Lanusse2020')
# Defines the input conditions
cat = Table([[5., 10. ,20.],
[24., 24., 24.],
[0.5, 0.5, 0.5]],
names=['flux_radius', 'mag_auto', 'zphot'])
# Sample light profiles for these parameters
ims = model.sample(cat)
# Define a PSF
psf = galsim.Gaussian(sigma=0.06)
# Convolve by PSF
ims = [galsim.Convolve(im, psf) for im in ims]
</code></pre>
<div >
<a href="https://colab.research.google.com/github/McWilliamsCenter/galsim_hub/blob/master/notebooks/GalsimHubDemo.ipynb" target="_parent"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab" class="plain" style="height:25px;"/></a>
</div>
<!--
<div class="col fragment">
<pre class="yaml"><code data-trim data-noescape>
modules:
- galsim_hub
psf:
type: Gaussian
sigma: 0.06 # arcsec
gal:
type: GenerativeModelGalaxy
flux_radius: { type: Random , min: 5, max: 10 }
mag_auto: { type: Random , min: 24., max: 25. }
zphot: { type: Random , min: 0.5, max: 1. }
image:
type: Tiled
nx_tiles: 10
ny_tiles: 10
stamp_size: 64 # pixels
pixel_scale: 0.03 # arcsec / pixel
noise:
type: COSMOS
output:
dir: output_yaml
file_name: demo14.fits
input:
generative_model:
file_name: 'hub:Lanusse2020'
</code></pre>
</div>
</div> -->
</section>
<!--
<section>
<h3 class="slide-title">Training and adding new models</h3>
<br>
<div class="block">
<div class="block-title">
Galaxy2Galaxy: A framework for deep learning on astronomical data
</div>
<div class="block-content">
<a href="https://github.com/ml4astro/galaxy2galaxy"><img data-src="https://img.shields.io/badge/github-ml4astro%2FGalaxy2Galaxy-blue" class="plain" style="height:25px;"/></a> <a href="https://gitter.im/ml4astro/galaxy2galaxy"> <img data-src="https://badges.gitter.im/ml4astro/galaxy2galaxy.svg" class="plain" style="height:25px;"/></a>
<br>
<ul>
<li class="fragment">Building GalSim-based TensorFlow tf.data <b class="alert">Datasets</b><br>
<pre class="bash"><code data-trim data-noescape>
$ g2g-datagen --problem=attrs2img_cosmos128 \
--data_dir=datasets/attrs2img_cosmos128
</code></pre>
</li>
<li class="fragment">Train <b class="alert">various generative models</b>: VAE, GANs, PixelCNN
<pre class="bash"><code data-trim data-noescape>
$ g2g-trainer --problem=attrs2img_cosmos128 \
--data_dir=datasets/attrs2img_cosmos128 \
--output_dir=models/vae \
--model=continuous_autoencoder_residual_vae \
--hparams_set=continuous_autoencoder_residual_128
</code></pre></li>
<li class="fragment"><b class="alert">Export models as GalSim-Hub</b> compatible modules
<pre class="bash"><code data-trim data-noescape>
$ g2g-exporter --problem=attrs2img_cosmos128 \
--data_dir=datasets/attrs2img_cosmos128 \
--output_dir=models/vae \
--model=continuous_autoencoder_residual_vae \
--hparams_set=continuous_autoencoder_residual_128 \
--export_dir=exported_models/vae
</code></pre><br>
$\Longrightarrow$ Once a model is exported, tgz it, and open a PR to host it on Galsim-Hub.
</li>
</ul>
</div>
</div>
</section> -->
</section>
<section>
<h3 class="slide-title">Examples of applications of generative models</h3>
<br>
<div class="container">
<div class="col">
<ul>
<li><em>Hybrid Physical-Deep Learning Model for Astronomical Inverse Problems</em><br> <b> <NAME></b>, <NAME>, <NAME><br>
<a href="https://arxiv.org/abs/1912.03980"><img src="https://img.shields.io/badge/astro--ph.IM-arXiv%3A1912.03980-B31B1B.svg" class="plain" style="height:25px;"/></a>
<a href="https://www.youtube.com/watch?v=oWOU3qNHoL0"><img src="https://img.shields.io/badge/-youtube-red?logo=youtube&labelColor=grey" class="plain" style="height:25px;"/></a>
</li>
</ul>
$\mathcal{L} = \frac{1}{2} \parallel \mathbf{\Sigma}^{-1/2} (\ Y - P \ast A S \ ) \parallel_2^2 - \sum_{i=1}^K \log p_{\theta}(S_i)$
<img class=" plain" data-src="assets/scarlet_hsc.png" />
</div>
<div class="col fragment">
<ul>
<li><em>The relationship between fine galaxy stellar morphology and star formation activity in cosmological simulations: a deep learning view</em><br> <NAME>, <NAME>, <b><NAME></b>, et al. <br>
<a href="https://arxiv.org/abs/2007.00039"><img src="https://img.shields.io/badge/astro--ph.GA-arXiv%3A2007.00039-B31B1B.svg" class="plain" style="height:25px;"/></a></li>
</ul>
<img class="plain" data-src="assets/Zanisi2020_starforming_vs_quiescent.png"/>
</div>
</div>
</section>
<section>
<h3 class="slide-title"> Takeaway message</h3>
<br>
<ul>
<li class="fragment"> We have <b class="alert">combined physical and deep learning components</b> to model
observed noisy and PSF-convolved galaxy images. <br>
$\Longrightarrow$ This framework can handle multi-band, multi-resolution, multi-instrument data.
</li>
<br>
<!-- <li class="fragment"> We are overcoming the limitations of standard VAEs with an additional latent space model.
<br>$\Longrightarrow$ Can produce sharp and meaningful images.
</li> -->
<br>
<li class="fragment"> We demonstrate conditional sampling of galaxy light profiles<br>
$\Longrightarrow$ Image simulation can be combined with larger survey simulation efforts, e.g. LSST DESC DC2
</li>
<br>
</ul>
<br>
<br>
<div class="block fragment fade-up">
<div class="block-title">
<b>GalSim Hub</b>
</div>
<div class="block-content">
<br>
<div class="container">
<div class="col">
<img data-src="assets/TF_FullColor_Horizontal.png" class="plain"></img>
</div>
<div class="col">
<ul>
<li> Framework for sampling from deep generative models
directly from within GalSim.
</li>
<br>
<li> Go check out the beta version: <a href="https://github.com/McWilliamsCenter/galsim_hub">https://github.com/McWilliamsCenter/galsim_hub</a>
</li>
</ul>
</div>
</div>
</div>
</div>
</section>
<section>
<h2>Differentiable models of the Large-Scale Structure</h2>
<br>
<a href="https://github.com/modichirag/flowpm"><img src="https://badgen.net/badge/icon/github?icon=github&label" class="plain" style="height:25px;"/></a>
<a href="https://colab.research.google.com/github/modichirag/flowpm/blob/master/notebooks/flowpm_blog.ipynb" target="_parent"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab" class="plain" style="height:25px;"/></a>
<hr>
<br>
<div class="container">
<div class="col">
<div align="left" style="margin-left: 20px;">
<h3>Work in collaboration with <br>
<NAME>, <NAME></h3>
</div>
</div>
<div class="col">
<img src="assets/chirag_modi_square.png">
</div>
</div>
<br>
<div style="float:center; font-size: 25px">Modi, <b>Lanusse</b>, Seljak, et al., in prep <br>
Modi, et al. (2018)
</div>
</section>
<section>
<h3 class='slide-title'> traditional cosmological inference </h3>
<div class='container'>
<div class='col'>
<div style="position:relative; width:480px; height:30px; margin:0 auto;">
<div class="fragment current-visible" style="position:absolute;top:0;" data-fragment-index="1">HSC cosmic shear power spectrum</div>
<div class="fragment" style="position:absolute;top:0;" data-fragment-index="2">HSC Y1 constraints on $(S_8, \Omega_m)$</div>
</div>
<div style="position:relative; width:480px; height:300px; margin:0 auto;">
<div class="fragment current-visible" style="position:absolute;top:0;left:0;" data-fragment-index="0" >
<img class="plain" data-src="assets/alonso_g1.png" />
<img class="plain" data-src="assets/alonso_g2.png" />
</div>
<img class="fragment current-visible plain" data-src="assets/hsc_correlation_function.png" style="position:absolute;top:0;left:0;" data-fragment-index="1" />
<img class="fragment plain" data-src="assets/hsc_constraints.png" style="position:absolute;top:0;left:0;" data-fragment-index="2" />
</div>
<div class="fragment" data-fragment-index="1" style="float:right; font-size: 20px">(Hikage,..., Lanusse, et al. 2018)</div>
</div>
<div class='col'>
<ul>
<li class="fragment" data-fragment-index="0"> Measure the ellipticity $\epsilon = \epsilon_i + \gamma$ of all galaxies<br>
$\Longrightarrow$ Noisy tracer of the weak lensing shear $\gamma$ </li>
<br>
<li class="fragment" data-fragment-index="1"> Compute <b class="alert">summary statistics</b> based on 2pt functions, <br>e.g. the <b>power spectrum</b> </li>
<br>
<li class="fragment" data-fragment-index="2"> Run an MCMC to recover a posterior on model parameters, using an <b class="alert">analytic likelihood</b>
$$ p(\theta | x ) \propto \underbrace{p(x | \theta)}_{\mathrm{likelihood}} \ \underbrace{p(\theta)}_{\mathrm{prior}}$$
</li>
</ul>
</div>
</div>
<div class="block fragment">
<div class="block-title">
Main limitation: the need for an explicit likelihood
</div>
<div class="block-content">
We can only compute the likelihood for <b>simple summary statistics</b> and on <b>large scales</b>
<br>
<br>
<div class="fragment"> $\Longrightarrow$ We are dismissing most of the information! </div>
</div>
</div>
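For reference, the kind of 2pt summary statistic in question can be sketched in a few lines of numpy — a simple binned 2D power spectrum, not the pseudo-$C_\ell$ estimator used in the HSC analysis:

```python
import numpy as np

rng = np.random.default_rng(1)

def power_spectrum_2d(field, n_bins=16):
    # Bin |FFT|^2 in annuli of |k|: the standard 2pt summary statistic
    n = field.shape[0]
    fk = np.fft.fft2(field) / n**2
    p2d = np.abs(fk) ** 2
    kx = np.fft.fftfreq(n) * n
    kmag = np.sqrt(kx[:, None] ** 2 + kx[None, :] ** 2)
    bins = np.linspace(0, kmag.max(), n_bins + 1)
    idx = np.digitize(kmag.ravel(), bins) - 1
    pk = np.bincount(idx, weights=p2d.ravel(), minlength=n_bins)
    counts = np.bincount(idx, minlength=n_bins)
    return bins[:-1], pk[:n_bins] / np.maximum(counts[:n_bins], 1)

field = rng.normal(size=(64, 64))   # white-noise field: flat spectrum
k, pk = power_spectrum_2d(field)
```

Compressing the maps down to `pk` is exactly the information loss the forward-modeling approach tries to avoid.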
</section>
<section>
<h3 class='slide-title'>A different road: forward modeling</h3>
<div class='container'>
<div class='col'>
<ul>
<li> Instead of trying to analytically evaluate the likelihood,
let us build a forward model of the observables.</li>
<br>
<li class="fragment"> Each component of the model is now tractable, but at the
cost of a <b >large number of latent variables</b>.
</li>
</ul>
<br>
<div class="fragment">
$\Longrightarrow$ How do we perform efficient inference in this high-dimensional space?
</div>
<br>
<br>
<ul class="fragment"> A non-exhaustive list of methods:
<li> Hamiltonian Monte-Carlo
</li>
<li> Variational Inference
</li>
<li> MAP+Laplace
</li>
<li> Gold Mining
</li>
<li> Dimensionality reduction by Fisher-Information Maximization
</li>
</ul>
<br>
<br>
<div class="fragment">
What do they all have in common?<br>
$\Longrightarrow$ They require fast, accurate, and differentiable forward simulations.
</div>
</div>
<div class='col'>
<img data-src="assets/pgm_lensing.png" class="plain" style="height:550px;"/>
<div style="float:right; font-size: 20px">(Schneider et al. 2015)</div>
</div>
</div>
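Of the methods listed above, Hamiltonian Monte-Carlo makes the most direct use of simulator gradients. A minimal sketch on a toy Gaussian target — in a real analysis `grad_log_prob` would come from autodiff through the forward model:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_prob(q):
    return -0.5 * np.sum(q**2)          # toy standard-Gaussian target

def grad_log_prob(q):
    return -q                            # its analytic gradient

def hmc_step(q, step=0.2, n_leapfrog=10):
    p = rng.normal(size=q.shape)         # resample momenta
    q_new, p_new = q.copy(), p.copy()
    # leapfrog integration of Hamiltonian dynamics
    p_new += 0.5 * step * grad_log_prob(q_new)
    for _ in range(n_leapfrog - 1):
        q_new += step * p_new
        p_new += step * grad_log_prob(q_new)
    q_new += step * p_new
    p_new += 0.5 * step * grad_log_prob(q_new)
    # Metropolis accept/reject on the change in the Hamiltonian
    h_old = -log_prob(q) + 0.5 * np.sum(p**2)
    h_new = -log_prob(q_new) + 0.5 * np.sum(p_new**2)
    return q_new if rng.random() < np.exp(h_old - h_new) else q

q = np.zeros(2)
samples = []
for _ in range(2000):
    q = hmc_step(q)
    samples.append(q)
samples = np.array(samples)
```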
</section>
<section class="inverted" data-background="#000">
<h2>How do we simulate the Universe in a fast and differentiable way?</h2>
</section>
<section>
<h3 class='slide-title'>Forward Models in Cosmology</h3>
<div class="container">
<div class='col'>
<img data-src="assets/fieldinit.png" class="plain" style="height:300px;"/>
<b class="alert"> Linear Field </b>
</div>
<div class='col fragment' data-fragment-index='2'>
<img data-src="assets/fieldfin.png" class="plain " style="height:300px;"/>
<b class="alert"> Final Dark Matter </b>
</div>
<hr style="width: 1px; height: 400px; background: white; border: none;" />
<div class='col ' data-fragment-index='3'>
<!-- <img data-src="assets/fieldhalo.png" class="plain " style="height:300px;"/>
<b class="alert"> Dark Matter Halos </b> -->
</div>
<div class='col ' data-fragment-index='4'>
<!-- <img data-src="assets/fieldgal.png" class="plain " style="height:300px;"/>
<b class="alert"> Galaxies </b> -->
</div>
</div>
<div class="container" style="position:relative; width:1000px; height:50px; margin:0 auto;">
<div class='col fragment' data-fragment-index='2'> <font size="10"> $\longrightarrow$ </font> <br> N-body simulations </div>
<div class='col ' > </div>
<div class='col ' > </div>
<!-- <div class='fragment' data-fragment-index='2'> N-body simulations <div> -->
<!-- <div class='fragment' data-fragment-index='3'> Group Finding algorithms <div> -->
<!-- <div class='fragment' data-fragment-index='4'> Semi-analytic models <div> -->
</div>
</section>
<section>
<h3 class='slide-title'>the Fast Particle-Mesh scheme for N-body simulations</h3>
<b>The idea</b>: approximate gravitational forces by estimating densities on a grid.
<div class='container'>
<div class='col'>
<ul>
<li>The numerical scheme:
<br>
<br>
<ul>
<li class="fragment" data-fragment-index="1"> Estimate the density of particles on a mesh<br>
$\Longrightarrow$ compute gravitational forces by FFT
</li>
<br>
<li class="fragment" data-fragment-index="2"> Interpolate forces at particle positions
</li>
<br>
<li class="fragment" data-fragment-index="3"> Update particle velocity and positions, and iterate
</li>
</ul>
</li>
<br>
<li class='fragment'> Fast and simple, at the cost of approximating short range interactions.
</li>
</ul>
</div>
<div class='col'>
<div style="position:relative; width:550px; height:550px; margin:0 auto;">
<img class="fragment current-visible plain" data-src="assets/particle_positions_0.png" style="position:absolute;top:0;left:0;" data-fragment-index="0" />
<img class="fragment current-visible plain" data-src="assets/particle_density_0.png" style="position:absolute;top:0;left:0;" data-fragment-index="1" />
<img class="fragment current-visible plain" data-src="assets/particle_positions_0.png" style="position:absolute;top:0;left:0;" data-fragment-index="2" />
<img class="fragment plain" data-src="assets/particle_positions_1.png" style="position:absolute;top:0;left:0;" data-fragment-index="3" />
</div>
</div>
</div>
<div class="fragment"> $\Longrightarrow$ Only a series of FFTs and interpolations.</div>
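A minimal sketch of the scheme above (a toy 1D particle-mesh force computation in NumPy; the grid size, box size, and nearest-grid-point assignment are illustrative choices, not the FlowPM implementation):

```python
import numpy as np

def pm_forces(positions, n_grid=64, box=1.0):
    """Toy 1D PM step: NGP density -> FFT Poisson solve -> force interpolation."""
    # 1) Estimate the density of particles on a mesh (nearest grid point).
    cells = (positions / box * n_grid).astype(int) % n_grid
    density = np.bincount(cells, minlength=n_grid).astype(float)
    density -= density.mean()                      # work with the overdensity
    # 2) Solve Poisson's equation in Fourier space: phi_k ~ -rho_k / k^2.
    k = 2.0 * np.pi * np.fft.fftfreq(n_grid, d=box / n_grid)
    rho_k = np.fft.fft(density)
    phi_k = np.zeros_like(rho_k)
    nonzero = k != 0
    phi_k[nonzero] = -rho_k[nonzero] / k[nonzero] ** 2
    # 3) The force is minus the gradient of the potential: F_k = -i k phi_k.
    force_grid = np.real(np.fft.ifft(-1j * k * phi_k))
    # 4) Interpolate forces back at the particle positions (same NGP scheme);
    #    a real code would then update velocities and positions and iterate.
    return force_grid[cells]

rng = np.random.default_rng(0)
positions = rng.uniform(0.0, 1.0, size=1000)
forces = pm_forces(positions)
print(forces.shape)  # (1000,)
```

In 3D the structure is identical — only the FFTs and the interpolation kernel (CIC instead of NGP) change, which is why the scheme maps so naturally onto standard tensor ops.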
</section>
<section>
<h3 class='slide-title'>introducing FlowPM: Particle-Mesh Simulations in TensorFlow</h3>
<div class='container'>
<div class='col'>
<img data-src="assets/github.png" class="plain" style="height:70px"/>
<div> <a href="https://github.com/modichirag/flowpm">https://github.com/modichirag/flowpm</a>
</div>
<pre class="python"><code data-trim data-noescape>
import numpy as np
import tensorflow as tf
import flowpm
# Defines integration steps
stages = np.linspace(0.1, 1.0, 10, endpoint=True)
initial_conds = flowpm.linear_field(32, # size of the cube
100, # Physical size
ipklin, # Initial powerspectrum
batch_size=16)
# Sample particles and displace them by LPT
state = flowpm.lpt_init(initial_conds, a0=0.1)
# Evolve particles down to z=0
final_state = flowpm.nbody(state, stages, 32)
# Retrieve final density field
final_field = flowpm.cic_paint(tf.zeros_like(initial_conds),
final_state[0])
with tf.Session() as sess:
sim = sess.run(final_field)
</code></pre>
<ul>
<li> Seamless interfacing with deep learning components
</li>
<li> <b class="alert">Mesh TensorFlow</b> implementation for distribution on supercomputers
</li>
</ul>
</div>
<div class='col'>
<div class="fig-container" data-file="flowpm_16.html" data-style="height: 550px;"></div>
<img data-src="assets/TF_FullColor_Horizontal.png" class='plain' style="height: 70px;"/>
</div>
</div>
<!--We can have a link to the project, an showcase an iframe with the
small simulation running on the right-->
</section>
<section>
<section>
<h3 class='slide-title'>Forward Models in Cosmology</h3>
<div class="container">
<div class='col'>
<img data-src="assets/fieldinit.png" class="plain" style="height:300px;"/>
<b class="alert"> Linear Field </b>
</div>
<div class='col '>
<img data-src="assets/fieldfin.png" class="plain " style="height:300px;"/>
<b class="alert"> Final Dark Matter </b>
</div>
<hr style="width: 1px; height: 400px; background: white; border: none;" />
<div class='col fragment' data-fragment-index='3'>
<img data-src="assets/fieldhalo.png" class="plain " style="height:300px;"/>
<b class="alert"> Dark Matter Halos </b>
</div>
<div class='col fragment' data-fragment-index='4'>
<img data-src="assets/fieldgal.png" class="plain " style="height:300px;"/>
<b class="alert"> Galaxies </b>
</div>
</div>
<div class="container" style="position:relative; width:1000px; height:50px; margin:0 auto;">
<div class='col' > <font size="10"> $\longrightarrow$ </font> <br> N-body simulations <br> <b class="alert">FlowPM</b> </div>
<div class='col fragment' data-fragment-index='3'> <font size="10"> $\longrightarrow$ </font> <br> Group Finding <br> algorithms </div>
<div class='col fragment' data-fragment-index='4'> <font size="10"> $\longrightarrow$ </font> <br> Semi-analytic & <br> distribution models </div>
<!-- <div class='fragment' data-fragment-index='2'> N-body simulations <div> -->
<!-- <div class='fragment' data-fragment-index='3'> Group Finding algorithms <div> -->
<!-- <div class='fragment' data-fragment-index='4'> Semi-analytic models <div> -->
</div>
</section>
<section>
<h3 class='slide-title'> Example of Extending Dark Matter Simulations with Deep Learning</h3>
<div style="position:relative; width:1000px; height:290px; margin:0 auto;">
<img style="position:absolute;top:0;left:50px;" data-src="assets/recontruth.png" height="300"/>
</div>
<div style="position:relative; width:1000px; height:290px; margin:0 auto;">
<div class="fragment " data-fragment-index="0" style="position:absolute;top:10px;left:600px;font-size:200%"> $\longrightarrow$ </div>
<img class="fragment plain" data-fragment-index="0" style="position:absolute;top:50px;left:500px;" data-src="assets/fnn.png" height="150"/>
<img class="fragment " data-fragment-index="1" style="position:absolute;top:0;left:50px;" data-src="assets/reconinit.png" height="300"/>
<img class="fragment " data-fragment-index="2" style="position:absolute;top:0;left:50px;" data-src="assets/reconrecon.png" height="300"/>
</div>
<div class="fragment" data-fragment-index="1" style="position:relative; float:right; top: -30px; left:-50px; font-size: 20px; margin:0 auto">Modi et al. 2018</div>
</section>
</section>
<!--
<section>
<h3 class='slide-title'>The practical challenge for inference at scale</h3>
<br>
<br>
<div class="container">
<div class="col">
<ul>
<li> Simulations of scientifically interesting sizes <b class="alert">do not fit on a single GPU RAM</b> <br>
e.g. $128^3$ operational, need $1024^3$ for survey volumes <br>
$\Longrightarrow$ We need a <b>distributed Machine Learning Framework</b><br>
</li>
<br>
<li class="fragment" data-fragment-index="1">Most common form of distribution is <b>data-parallelism</b>
$\Longrightarrow$ Reached Exascale on scientific deep learning applications
</li>
<br>
<li class="fragment"> What we need is <b>model-parallelism</b> on HPC environments
</li>
</ul>
</div>
<div class="col">
<img data-src="assets/summit.png" class="fragment" data-fragment-index="1">
</div>
</div>
<br>
<br>
<div class="fragment">
$\Longrightarrow$ We have started investigating <b class="alert">Mesh TensorFlow at NERSC and Google TPUs</b>.
</div>
</section>
<section>
<h3 class='slide-title'>Mesh TensorFlow in a few words</h3>
<ul>
<li> Redefines the TensorFlow API, in terms of <b class="alert">abstract logical tensors</b>
with actual <b class="alert">memory instantiation on multiple devices</b> defined by:
<ul>
<li class="fragment" data-fragment-index="1" > The specification of the mesh of computing devices
</li>
<li class="fragment" data-fragment-index="2" > The specification of rules for which dimensions can be splitted
</li>
</ul>
</li>
</ul>
<img class="fragment" data-fragment-index="3" data-src="assets/mesh_tensorflow.png"/><br>
<div class="fragment" data-fragment-index="3" style="float:right; font-size: 20px">(Gholami et al. 2018)</div>
</section>
<section>
<h3 class="slide-title">Proof of concept with Mesh FlowPM and why should you care :-)</h3>
<div>
<img data-src="assets/mesh_flopwm.png" class="plain" style="height:350px;"/><br>
Evolution from initial conditions to z=0 distributed on 2 Nodes 16 GPUs
</div>
<div class="block fragment">
<div class="block-title">
Our assessment so far
</div>
<div class="block-content">
<ul>
<li> Provides an easy framework to write down distributed differentiable simulations
and large scale Machine Learning tasks
</li>
<li class="fragment"> The Mesh TensorFlow project is still young and limited in scope:<br>
<b class="alert"> $\Longrightarrow$ we need help from the Physics community to develop it for our needs!</b>
</li>
</ul>
</div>
</div>
</section>
-->
<section>
<h3 class="slide-title"> Takeaway message</h3>
<ul>
<li class="fragment"> We are <b class="alert">combining physical and deep learning components</b> to model
the Large-Scale Structure in a <b>fast and differentiable way</b>. <br>
$\Longrightarrow$ This is <b>a necessary backbone</b> for large scale simulation-based inference.
</li>
<br>
<li class="fragment"> We are demonstrating that large-scale simulations can be
implemented in distributed autodiff frameworks.
<br>$\Longrightarrow$ We hope that this will one day become the norm.
</li>
<!-- <br>
<li class="fragment"> Our community has unique needs and limited resources,
we will all gain by working collaboratively !
</li> -->
</ul>
<br>
<br>
<div class="fragment">
Check out this blogpost to learn more <br> <a href="https://blog.tensorflow.org/2020/03/simulating-universe-in-tensorflow.html">
https://blog.tensorflow.org/2020/03/simulating-universe-in-tensorflow.html</a>
<div class="container">
<div class="col">
<img data-src="assets/init_field.png" style='height:250px;'/>
<br> True initial conditions <br> $z_0$
</div>
<div class="col">
<img data-src="assets/reconim_init.gif" style='height:250px;' />
<br> Reconstructed initial conditions $z$
</div>
<div class="col">
<img data-src="assets/reconim_fin.gif" style='height:250px;'/>
<br> Reconstructed dark matter distribution $x = f(z)$
</div>
<div class="col">
<img data-src="assets/fin_field.png" style='height:250px;' />
<br> Data <br> $x_{DM} = f(z_0)$
</div>
</div>
</div>
</section>
<section>
<h2>jax-cosmo: End-to-end automatically differentiable and GPU accelerated cosmology library</h2>
<br>
<a href="https://gitter.im/DifferentiableUniverseInitiative/jax_cosmo?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge"><img src="https://badges.gitter.im/DifferentiableUniverseInitiative/jax_cosmo.svg" class="plain" style="height:25px;"/></a>
<a href="https://github.com/DifferentiableUniverseInitiative/jax_cosmo"><img src="https://badgen.net/badge/icon/github?icon=github&label" class="plain" style="height:25px;"/></a>
<a href="https://colab.research.google.com/github/DifferentiableUniverseInitiative/jax_cosmo/blob/master/docs/notebooks/jax-cosmo-intro.ipynb" target="_parent"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab" class="plain" style="height:25px;"/></a>
<hr>
<br>
<div class="container">
<div class="col">
<div align="left" style="margin-left: 20px;">
<h3>Collaborative project, join us :-)</h3>
</div>
</div>
<div>
<img data-src="assets/jax_cosmo_people.png"/>
</div>
</div>
<br>
</section>
<section>
<h3 class="slide-title">A motivating example: Fisher forecasts</h3>
<div class="container">
<div class="col">
<ul>
<li class="fragment"> Fisher matrices are notoriously unstable<br>
$\Longrightarrow$ they rely on evaluating gradients by finite differences.
</li>
<br>
<br>
<li class="fragment"> They do not scale well to large number of parameters.
</li>
<br>
<br>
<li class="fragment"> The main reason for using Fisher matrices is that full MCMCs are extremely slow, and get slower with dimensionality.</li>
</ul>
</div>
<div class="col">
<img data-src="assets/lanusse2015.png" style="width:700px" class="plain" /><br>
<em>3D galaxy clustering with future wide-field surveys: Advantages of
a spherical Fourier-Bessel analysis</em>,<br> <b><NAME></b>, <NAME>, <NAME> (2015)
</div>
</section>
<section class="inverted" data-background="#000">
<h2>How can we scale cosmological analyses to the hundreds of parameters of modern
surveys ?</h2>
</section>
<section>
<h3 class="slide-title">the hammer behind the Deep Learning revolution: Automatic Differentation</h3>
<ul>
<li> <b>Automatic differentiation</b> allows you to compute analytic derivatives of arbitrary expressions:<br>
If I form the expression $y = a * x + b$, it is separated into fundamental ops:
$$ y = u + b \qquad u = a * x $$
then gradients can be obtained by the chain rule:
$$\frac{\partial y}{\partial x} = \frac{\partial y}{\partial u} \frac{ \partial u}{\partial x} = 1 \times a = a$$
</li>
<br>
<li class="fragment"> This is a fundamental tool in Machine Learning, and autodiff frameworks include TensorFlow and PyTorch.
</li>
</ul>
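The chain-rule bookkeeping above can be made concrete with forward-mode dual numbers (an illustrative toy, not how TensorFlow or PyTorch are implemented internally):

```python
# Minimal forward-mode autodiff with dual numbers, applied to y = a * x + b.
class Dual:
    """A number carrying its value and its derivative with respect to x."""
    def __init__(self, value, deriv=0.0):
        self.value = value
        self.deriv = deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__

a, b = 3.0, 2.0
x = Dual(5.0, deriv=1.0)   # seed dx/dx = 1
y = a * x + b              # u = a * x, then y = u + b
print(y.value, y.deriv)    # 17.0 3.0  -> dy/dx = a, as the chain rule says
```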
<br>
<br>
<div class="block fragment">
<div class="block-title">
Enter JAX: NumPy + Autograd + GPU
</div>
<div class="block-content">
<div class="container">
<div class="col">
<ul>
<li>JAX follows the NumPy API!
<pre class="python"><code data-trim data-noescape>
import jax.numpy as np
</code></pre>
</li>
<li>Arbitrary-order derivatives</li>
<li>Accelerated execution on GPU and TPU</li>
</ul>
</div>
<div class="col" align="center">
<img data-src="https://raw.githubusercontent.com/google/jax/master/images/jax_logo_250px.png" class="plain"/>
</div>
</div>
</div>
</div>
</section>
<section>
<section>
<h3 class="slide-title"> jax-cosmo: Finally a differentiable cosmology library, and it's in JAX!</h3>
<div class="container">
<div class="col">
<img data-src="assets/github.png" class="plain" style="height:70px"/>
<div> <a href="https://github.com/DifferentiableUniverseInitiative/jax_cosmo/">https://github.com/DifferentiableUniverseInitiative/jax_cosmo</a>
</div>
<pre class="python"><code data-trim data-noescape>
import jax.numpy as np
import jax_cosmo as jc
# Defining a Cosmology
cosmo = jc.Planck15()
# Define a redshift distribution with smail_nz(a, b, z0)
nz = jc.redshift.smail_nz(1., 2., 1.)
# Build a lensing tracer with a single redshift bin
probe = jc.probes.WeakLensing([nz])
# Compute angular Cls for some ell
ell = np.logspace(0.1,3)
cls = jc.angular_cl.angular_cl(cosmo, ell, [probe])
</code></pre>
<div class="block fragment">
<div class="block-title">
Current main features
</div>
<div class="block-content">
<ul>
<li>Weak Lensing and Number counts probes</li>
<li>Eisenstein & Hu (1998) power spectrum + halofit</li>
<li>Angular $C_\ell$ under Limber approximation </li>
</ul>
<div class="fragment">$\Longrightarrow$ 3x2pt DES Y1 capable </div>
</div>
</div>
</div>
<div class="col">
<img class="plain" data-src="assets/jc_vs_ccl_lensing.png"/>
<img class="plain" data-src="assets/jc_vs_ccl_clustering.png"/>
<br>
Validating against the <a href="https://github.com/LSSTDESC/CCL">DESC Core Cosmology Library</a>
</div>
</div>
</section>
<section>
<h3 class="slide-title"> let's compute a Fisher matrix</h3>
<br>
$$F = - \mathbb{E}_{p(x | \theta)}[ H_\theta(\log p(x| \theta)) ] $$
<br>
<div class="container">
<div class="col fragment">
<pre class="python"><code data-trim data-noescape>
import jax
import jax.numpy as np
import jax_cosmo as jc
# .... define probes, and load a data vector
def gaussian_likelihood( theta ):
# Build the cosmology for given parameters
cosmo = jc.Planck15(Omega_c=theta[0], sigma8=theta[1])
# Compute mean and covariance
mu, cov = jc.angular_cl.gaussian_cl_covariance_and_mean(cosmo,
ell, probes)
# returns likelihood of data under model
return jc.likelihood.gaussian_likelihood(data, mu, cov)
# Fisher matrix in just one line:
F = - jax.hessian(gaussian_likelihood)(theta)
</code></pre>
<a href="https://colab.research.google.com/github/DifferentiableUniverseInitiative/jax_cosmo/blob/master/docs/notebooks/jax-cosmo-intro.ipynb" target="_parent"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab" class="plain" style="height:25px;"/></a>
</div>
<div class="col fragment">
<img data-src="assets/Fisher_mat.png" class="plain"><br><br>
</div>
</div>
<ul>
<li class="fragment"> <b class="alert">No derivatives were harmed by finite differences in the computation of this Fisher!</b> </li>
<li class="fragment"> Only a small additional compute time compared to one forward evaluation of the model</li>
</ul>
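As a hedged sanity check of the definition above, here is a toy 1D Gaussian model where the Fisher information of the mean is known analytically to be $n/\sigma^2$ (all numbers illustrative):

```python
import math

# Check F = -Hessian(log p) for a 1D Gaussian likelihood with known sigma:
# the Fisher information of the mean parameter is n / sigma**2.
def log_like(mu, data, sigma=2.0):
    return sum(-0.5 * ((x - mu) / sigma) ** 2
               - math.log(sigma * math.sqrt(2.0 * math.pi))
               for x in data)

def hessian_fd(f, mu, eps=1e-4):
    """Second derivative by central finite differences."""
    return (f(mu + eps) - 2.0 * f(mu) + f(mu - eps)) / eps ** 2

data = [1.0, 2.5, 0.3, 1.7, 2.2]
F_numeric = -hessian_fd(lambda m: log_like(m, data), mu=1.5)
F_exact = len(data) / 2.0 ** 2
print(round(F_numeric, 3), F_exact)  # 1.25 1.25
```

For a quadratic log-likelihood the finite difference happens to be exact; for a real pipeline the point of the slide is precisely that one should not rely on that.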
</section>
</section>
<section>
<h3 class="slide-title"> Inference becomes fast and scalable</h3>
<div class="container">
<div class="col">
<ul>
<li>Current cosmological MCMC chains take <b>days</b>, and typically require access
to large computer clusters.</li>
<br>
<li class="fragment" data-fragment-index="1"><b class="alert">Gradients of the log posterior are required for modern efficient and scalable inference</b> techniques:
<ul>
<li>Variational Inference</li>
<li>Hamiltonian Monte-Carlo</li>
</ul>
</li>
<br>
<li class="fragment" data-fragment-index="2">In jax-cosmo, we can trivially obtain <b>exact</b> gradients:
<pre class="python"><code data-trim data-noescape>
def log_posterior( theta ):
return gaussian_likelihood( theta ) + log_prior(theta)
score = jax.grad(log_posterior)(theta)
</code></pre>
</li>
<br>
<li class="fragment" data-fragment-index="3">On a DES Y1 analysis, we find convergence in 70,000 samples with vanilla HMC, 140,000 with Metropolis-Hastings</li>
</ul>
</div>
<div class="col" >
<div class="fragment" data-fragment-index="3">
<img data-src="assets/jc_3x2pt_hmc.png" class="plain"/><br>
DES Y1 posterior, jax-cosmo HMC vs Cobaya MH <br>(credit: <NAME>)
</div>
</div>
</div>
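A vanilla HMC update like the one discussed above can be sketched with the Python standard library (toy 1D standard-normal target; the step size and trajectory length are illustrative tunings, not those of the DES Y1 analysis):

```python
import math
import random

# Toy Hamiltonian Monte Carlo on a 1D standard-normal target, where the
# gradient of -log p(x) is simply x.
def leapfrog(x, p, step, n_steps, grad):
    p -= 0.5 * step * grad(x)          # half kick
    for i in range(n_steps):
        x += step * p                  # drift
        if i < n_steps - 1:
            p -= step * grad(x)        # full kick
    p -= 0.5 * step * grad(x)          # final half kick
    return x, p

def hmc(n_samples, step=0.2, n_steps=10, seed=1):
    rng = random.Random(seed)
    grad = lambda x: x                          # d/dx of 0.5 * x**2
    energy = lambda x, p: 0.5 * (x * x + p * p)
    x, samples = 0.0, []
    for _ in range(n_samples):
        p = rng.gauss(0.0, 1.0)                 # resample the momentum
        x_new, p_new = leapfrog(x, p, step, n_steps, grad)
        # Metropolis accept/reject on the change in the Hamiltonian
        if rng.random() < math.exp(min(0.0, energy(x, p) - energy(x_new, p_new))):
            x = x_new
        samples.append(x)
    return samples

samples = hmc(5000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

The only model-specific ingredient is `grad`, which is exactly what `jax.grad(log_posterior)` supplies in the jax-cosmo case.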
</section>
<!--
<section>
<h3 class="slide-title"> Example of application: the DESC 3x2pt tomography challenge</h3>
<ul>
<li>
</li>
</ul>
</section> -->
<section>
<h3 class="slide-title">Takeaway message</h3>
<br>
<br>
<ul>
<li><b>Automatic differentiation</b> has been a <b class="alert">game changer</b> for Machine Learning,
and it can be for physics as well. <br>
$\Longrightarrow$ Fast inference; instant and accurate forecasts; survey optimization; data compression; etc.
</li>
<br>
<li> jax-cosmo proposes a numpy-like cosmology library in the intuitive and powerful JAX framework.
</li>
</ul>
<br>
<br>
<div class="block fragment">
<div class="block-title">
We need you!
</div>
<div class="block-content">
Our goal is to build a high quality tool for the community
<br>
<br>
<ul>
<li> Completely open and collaborative development on GitHub: <a href="https://github.com/DifferentiableUniverseInitiative/jax_cosmo/">https://github.com/DifferentiableUniverseInitiative/jax_cosmo</a>
</li>
<br>
<br>
<li> Easily get in touch on Gitter for any questions <a href="https://gitter.im/DifferentiableUniverseInitiative/jax_cosmo?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge">https://gitter.im/DifferentiableUniverseInitiative/jax_cosmo</a>
</li>
</ul>
</div>
</div>
</section>
<section>
<h1> Conclusion </h1>
</section>
<section>
<h3 class="slide-title"> Conclusion </h3>
<div class="block ">
<div class="block-title">
What can be gained by merging Deep Learning and Physical Models?
</div>
<div class="block-content">
<br>
<ul>
<li class="fragment"> Complement known physical models with data-driven components
<ul>
<li>Learn galaxy morphologies from noisy and PSF-convolved data, for simulation purposes.</li>
<li>Use data-driven models as priors for solving inverse problems such as deblending or deconvolution.</li>
</ul>
</li>
<br>
<br>
<li class="fragment"> Differentiable physical models for fast inference
<ul>
<li> We are developing fast and distributed N-body simulations in TensorFlow.</li>
<li> Even analytic cosmological computations can benefit from differentiability.<br> <a href="https://github.com/DifferentiableUniverseInitiative/jax_cosmo">https://github.com/DifferentiableUniverseInitiative/jax_cosmo</a> </li>
</ul>
</li>
<br>
</ul>
</div>
</div>
<br>
<div class="fragment">
<u>Advertisement:</u>
<br>
<br>
<ul>
<li> <a href="https://gitter.im/DifferentiableUniverseInitiative">Differentiable Universe Initiative (gitter.im/DifferentiableUniverseInitiative)</a>: Group of people interested in making everything differentiable! join the conversation :-)
</li>
</ul>
</div>
<br>
<p class="fragment">Thank you ! </p>
<br> <br> <br>
</section>
</div>
</div>
<style>
/* .reveal .slides {
border: 5px solid red;
min-height: 100%;
width: 128mm;
height: 96mm;
} */
.reveal .block {
background-color: #191919;
margin-left: 20px;
margin-right: 20px;
text-align: left;
padding-bottom: 0.1em;
}
.reveal .block-title {
background-color: #333333;
padding:8px 35px 8px 14px;
color: #FFAA7F;
font-weight: bold;
}
.reveal .block-content {
padding:8px 35px 8px 14px;
}
.reveal .slide-title {
border-left: 5px solid white;
text-align: left;
margin-left: 20px;
padding-left: 20px;
}
.reveal .alert {
color: #FFAA7F;
font-weight: bold;
}
.reveal .inverted {
filter: invert(100%);
}
/*
/* .reveal .alert {
padding:8px 35px 8px 14px; margin-bottom:18px;
text-shadow:0 1px 0 rgba(255,255,255,1);
border:5px solid #FFAA7F;
-webkit-border-radius: 14px; -moz-border-radius: 14px;
border-radius:14px
background-position: 10px 10px;
background-repeat: no-repeat;
background-size: 38px;
padding-left: 30px; /* 55px; if icon
}
.reveal .alert-block {padding-top:14px; padding-bottom:14px}
.reveal .alert-block > p, .alert-block > ul {margin-bottom:1em}
/*.reveal .alert li {margin-top: 1em}
.reveal .alert-block p+p {margin-top:5px} */
</style>
<script src="reveal.js/js/reveal.js"></script>
<script>
// More info about config & dependencies:
// - https://github.com/hakimel/reveal.js#configuration
// - https://github.com/hakimel/reveal.js#dependencies
Reveal.initialize({
controls: false,
//center: false,
hash: true,
// Visibility rule for backwards navigation arrows; "faded", "hidden"
// or "visible"
controlsBackArrows: 'hidden',
// Display a presentation progress bar
progress: true,
// Display the page number of the current slide
slideNumber: true,
transition: 'slide', // none/fade/slide/convex/concave/zoom
// The "normal" size of the presentation, aspect ratio will be preserved
// when the presentation is scaled to fit different resolutions. Can be
// specified using percentage units.
width: 1280,
height: 720,
// Factor of the display size that should remain empty around the content
margin: 0.1,
// Bounds for smallest/largest possible scale to apply to content
minScale: 0.2,
maxScale: 1.5,
dependencies: [
{ src: 'reveal.js/plugin/markdown/marked.js' },
{ src: 'reveal.js/plugin/markdown/markdown.js' },
{ src: 'reveal.js/plugin/notes/notes.js', async: true },
{ src: 'reveal.js/plugin/math/math.js', async: true },
{ src: 'reveal.js/plugin/reveal.js-d3/reveald3.js' },
{ src: 'reveal.js/plugin/reveal.js-plugins/chart/Chart.min.js' },
{ src: 'reveal.js/plugin/reveal.js-plugins/chart/csv2chart.js' },
{ src: 'reveal.js/plugin/highlight/highlight.js', async: true },
]
});
</script>
</body>
</html>
| html |
Worried that you'll have to buy a new PC in order to be able to run Windows 8? Don't! Microsoft has said that the Windows 8 system requirements will be the same, or perhaps even lower, than those of Windows 7.
Speaking at the 2011 Worldwide Partner Conference, Tami Reller, corporate vice president of Microsoft's Windows division, had this to say:
"In both of our Windows 8 previews, we talked about continuing with the important trend that we started with Windows 7, keeping system requirements either flat or reducing them over time. Windows 8 will be able to run on a wide range of machines because it will have the same requirements or lower."
So if Windows 8 has the same system requirements that Windows 7 had, that should mean that Windows 8 will also run on systems that currently have Windows Vista installed on them.
As a reminder, here are the Windows 7 system requirements: a 1 GHz or faster 32-bit (x86) or 64-bit (x64) processor; 1 GB of RAM (32-bit) or 2 GB (64-bit); 16 GB of available disk space (32-bit) or 20 GB (64-bit); and a DirectX 9 graphics device with a WDDM 1.0 or higher driver.
How will Windows 8 pull this trick off?
"We've also built intelligence into Windows 8 so that it can adapt to the user experience based on the hardware of the user. So, whether you're upgrading an existing PC, or buying a new one, Windows will adapt to make the most of that hardware."
Good news for those wanting a new OS without having to buy new hardware.
More from Microsoft Partner Conference:
| english |
<reponame>bincrafters/bincrafters.github.io<filename>_posts/2017-10-05-Conan-Package-MSYS2.md
---
layout: post
title: 'Conan Package - MSYS2'
tags: [C++, Conan.io, Bintray]
---
MSYS2 is a software distribution and a building platform for Windows. It provides a Unix-like environment, a command-line interface, and a software repository, making it easier to install, use, build, and port software on Windows. Many C++ projects are set up to be built with MSYS2 on Windows rather than with MSBuild. This package is primarily intended to be used as a `build_requirement` in Conan recipes for such projects. One project that requires MSYS2 to build from source is Google's build system, Bazel.
## Package Reference
msys2_installer/latest@bincrafters/stable
## Usage Information
To get started with using the Bincrafters public Conan repository, please see this post:
[Bincrafters Conan Instructions](https://bincrafters.github.io/2017/06/06/using-bincrafters-conan-repository)
## Notes About this Package
This excerpt is taken from [the `README.md` file of the github repository for this package](https://github.com/bincrafters/conan-msys2_installer).
MSYS2 never adopted semantic versioning, so this package offers a unique versioning option on the packages by using a "conan alias" named "latest".
[Conan Alias feature Explained](http://conanio.readthedocs.io/en/latest/reference/commands/alias.html?highlight=conan%20alias)
In summary, users can reference the version "latest" in their requirements to get the latest release of MSYS2. "latest" is just an alias that redirects to an actual version of an MSYS2 package. MSYS2 rarely releases new versions, but when it does, Bincrafters will compile, create, and upload binaries for the package, and "latest" will be updated to point to the new version. Because MSYS2 does not use semantic versioning, a datestamp is used as the version number on the concrete Bincrafters packages for MSYS2, and the `source()` method of each version of the recipe uses the latest tarball from the msys2.org repository here:
[MSYS2 Distribution Repository](http://repo.msys2.org/distrib)
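To pull this package into your own project as a build requirement, a minimal `conanfile.txt` might look like the following (the generator choice is illustrative, not prescribed by this package):

```ini
[build_requires]
msys2_installer/latest@bincrafters/stable

[generators]
cmake
```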
## Status - Stable
This package has been tested and is currently considered stable. Please report any issues through the github repository for this recipe.
## Disclaimers
Prior to using Bincrafters packages, please see the [Bincrafters Disclaimer Page](https://bincrafters.github.io/2017/05/01/bincrafters-package-disclaimers/). | markdown |
<reponame>BahamJustin/ReadMe-Generator<gh_stars>0
// TODO: Include packages needed for this application
const inquirer = require("inquirer");
const generateMarkdown = require("./src/readme-template");
const writeFile = require("./utils/writeFile");
const writeSample = require("./test/sample");
// TODO: Create an array of questions for user input
const questions = [
{
type: "input",
name: "title",
message: "What is the name of your project?",
validate: (nameInput) => {
if (nameInput) {
return true;
} else {
console.log("You need to enter a project name!");
return false;
}
},
},
{
type: "input",
name: "descript",
message: "Please enter a description:",
},
{
type: "input",
name: "install",
message: "Please provide installation instructions:",
},
{
type: "input",
name: "usage",
message: "Please provide usage information:",
},
{
type: "input",
name: "contribute",
message: "Please provide contribution guidelines:",
},
{
type: "input",
name: "test",
message: "Please provide test instructions:",
},
{
type: "list",
name: "license",
message: "Please choose a license:",
choices: ["GNU", "MIT", "None"],
},
{
type: "input",
name: "username",
message: "Please enter a GitHub username:",
validate: (gitInput) => {
if (gitInput) {
return true;
} else {
console.log("You need to enter a valid GitHub username!");
return false;
}
},
},
{
type: "input",
name: "email",
message: "Please enter a email address:",
validate: (emailInput) => {
if (emailInput) {
return true;
} else {
console.log("You need to enter a valid email address!");
return false;
}
},
},
];
// TODO: Create a function to initialize app
function init() {
return inquirer.prompt(questions).then((answerData) => {
return answerData;
});
}
// Function call to initialize app
init()
.then((answerData) => {
return generateMarkdown(answerData);
})
.then((markdown) => {
return writeFile(markdown);
})
.then((writeFileResponse) => {
console.log(writeFileResponse);
});
// writeSample();
| javascript |
[{"title":"Test Title","text":"Test text","id":"81baffa4-ffac-49fa-8972-c67f2e5be4e5"},{"title":"hello mate","text":"thank for looking at my app\n","id":"b11c3570-28f6-42c3-a241-f7c0264ac9c4"},{"title":"Thank you for looking","text":"this app was brought to you by michael ","id":"3e0f81df-93c3-470c-88e2-bdfb4606b5de"},{"title":"Create title","text":"Write some body of text here !!","id":"579a248c-f34d-4f1c-8f09-cdfab2fc8bf1"}] | json |
<reponame>solidchristian/genclone2<filename>accounts/44.json
{
"type": "service_account",
"project_id": "genclone",
"private_key_id": "<KEY>",
"private_key": "-----<KEY>",
"client_email": "<EMAIL>",
"client_id": "104587333679351173622",
"auth_uri": "https://accounts.google.com/o/oauth2/auth",
"token_uri": "https://oauth2.googleapis.com/token",
"auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
"client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/mfc-a9thzacxgblboktf5-n3l6vt5t%40genclone.iam.gserviceaccount.com"
}
| json |
<gh_stars>0
{"key": "/books/OL11355632M", "title": "The Year Book of Hand Surgery, 1989", "subtitle": null, "publishers": ["Mosby-Year Book"], "authors": ["<NAME>"], "isbn_10": "0815126425", "isbn_13": "9780815126423", "publish_date": "January 1989", "subjects": ["Orthopaedics & fractures", "Reference works", "Surgery - General", "Medical / Nursing", "Science/Mathematics"]} | json |
<reponame>changeworld/azure-docs.cs-cz
---
title: Telemetry correlation in Azure Application Insights | Microsoft Docs
description: Application Insights telemetry correlation
ms.topic: conceptual
author: lgayhardt
ms.author: lagayhar
ms.date: 06/07/2019
ms.reviewer: sergkanz
ms.openlocfilehash: 6ceace1ee93fab8c0a46ed4a67850fc87a5cdad2
ms.sourcegitcommit: a53fe6e9e4a4c153e9ac1a93e9335f8cf762c604
ms.translationtype: MT
ms.contentlocale: cs-CZ
ms.lasthandoff: 04/09/2020
ms.locfileid: "80991224"
---
# <a name="telemetry-correlation-in-application-insights"></a>Telemetry correlation in Application Insights

In the world of microservices, every logical operation requires work done in various components of the service. You can monitor each of these components separately by using [Application Insights](../../azure-monitor/app/app-insights-overview.md). Application Insights supports distributed telemetry correlation, which you use to detect which component is responsible for failures or performance degradation.

This article explains the data model used by Application Insights to correlate telemetry sent by multiple components. It covers context-propagation techniques and protocols. It also covers the implementation of correlation tactics on different languages and platforms.

## <a name="data-model-for-telemetry-correlation"></a>Data model for telemetry correlation

Application Insights defines a [data model](../../azure-monitor/app/data-model.md) for distributed telemetry correlation. To associate telemetry with a logical operation, every telemetry item has a context field called `operation_Id`. This identifier is shared by every telemetry item in the distributed trace, so even if you lose telemetry from a single layer, you can still associate the telemetry reported by other components.

A distributed logical operation typically consists of a set of smaller operations that are requests processed by one of the components. These operations are defined by [request telemetry](../../azure-monitor/app/data-model-request-telemetry.md). Every request telemetry item has its own `id` that identifies it uniquely and globally, and all telemetry items (such as traces and exceptions) that are associated with the request should set their `operation_parentId` to the value of the request `id`.

Every outgoing operation, such as an HTTP call to another component, is represented by [dependency telemetry](../../azure-monitor/app/data-model-dependency-telemetry.md). Dependency telemetry also defines its own `id`, which is globally unique. The request telemetry initiated by this dependency call uses that `id` as its `operation_parentId`.

You can build a view of the distributed logical operation by using `operation_Id`, `operation_parentId`, and `request.id` together with `dependency.id`. These fields also define the causality order of telemetry calls.

In a microservices environment, traces from the components can go to different storage items, and every component can have its own instrumentation key in Application Insights. To get the telemetry for a logical operation, Application Insights queries data from every storage item. When the number of storage items is large, you need a hint about where to look next. The Application Insights data model defines two fields to solve this problem: `request.source` and `dependency.target`. The first field identifies the component that initiated the dependency request; the second identifies which component returned the response to the dependency call.

## <a name="example"></a>Example

Let's look at an example. An application called Stock Prices shows the current market price of a stock by using an external API called Stock. The Stock Prices application has a page called Stock Page that the client web browser opens by using `GET /Home/Stock`. The application queries the Stock API by using the HTTP call `GET /api/stock/value`.

You can analyze the resulting telemetry by running a query:
```kusto
(requests | union dependencies | union pageViews)
| where operation_Id == "STYz"
| project timestamp, itemType, name, id, operation_ParentId, operation_Id
```
In the results, note that all telemetry items share the root `operation_Id`. When an Ajax call is made from the page, a new unique ID (`qJSXU`) is assigned to the dependency telemetry, and the ID of the pageView is used as `operation_ParentId`. The server request then uses the Ajax ID as its `operation_ParentId`.
| itemType   | name                 | ID           | operation_ParentId | operation_Id |
|------------|----------------------|--------------|--------------------|--------------|
| pageView   | Stock page           |              | STYz               | STYz         |
| dependency | GET /Home/Stock      | qJSXU        | STYz               | STYz         |
| request    | GET Home/Stock       | KqKwlrSt9PA= | qJSXU              | STYz         |
| dependency | GET /api/stock/value | bBrf2L7mm2g= | KqKwlrSt9PA=       | STYz         |
When the call `GET /api/stock/value` is made to an external service, you need to know the identity of that server so you can set the `dependency.target` field appropriately. When the external service doesn't support monitoring, `target` is set to the host name of the service (for example, `stock-prices-api.com`). But if the service identifies itself by returning a predefined HTTP header, `target` contains the service identity that allows Application Insights to build a distributed trace by querying telemetry from that service.
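As a rough sketch of the host-name fallback described above (the helper name is illustrative, not part of the SDK):

```python
from urllib.parse import urlparse

def fallback_dependency_target(url: str) -> str:
    """When the callee doesn't identify itself in its response headers,
    fall back to the service host name as the dependency.target value."""
    return urlparse(url).netloc

print(fallback_dependency_target("https://stock-prices-api.com/api/stock/value"))
# → stock-prices-api.com
```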
## <a name="correlation-headers"></a>Correlation headers
Application Insights is transitioning to [W3C Trace-Context](https://w3c.github.io/trace-context/), which defines:
- `traceparent`: Carries the globally unique operation ID and a unique identifier of the call.
- `tracestate`: Carries system-specific tracing context.
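To make the header concrete, here's a minimal sketch (not a spec-compliant parser) that splits a `traceparent` value into the four fields the W3C format defines:

```python
def parse_traceparent(header: str) -> dict:
    """Split a W3C traceparent header into version, trace-id,
    parent-id, and trace-flags (sketch only; no validation)."""
    version, trace_id, parent_id, trace_flags = header.split("-")
    return {"version": version, "trace-id": trace_id,
            "parent-id": parent_id, "trace-flags": trace_flags}

fields = parse_traceparent("00-4bf92f3577b34da6a3ce929d0e0e4736-00f067aa0ba902b7-01")
print(fields["trace-id"])  # → 4bf92f3577b34da6a3ce929d0e0e4736
```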
The latest version of the Application Insights SDK supports the Trace-Context protocol, but you might need to opt in to it. (Backward compatibility with the previous correlation protocol supported by the Application Insights SDK is maintained.)
The [correlation HTTP protocol, also called Request-Id](https://github.com/dotnet/runtime/blob/master/src/libraries/System.Diagnostics.DiagnosticSource/src/HttpCorrelationProtocol.md), is being deprecated. This protocol defines two headers:
- `Request-Id`: Carries the globally unique ID of the call.
- `Correlation-Context`: Carries the name-value pair collection of distributed trace properties.
Application Insights also defines the [extension](https://github.com/lmolkova/correlation/blob/master/http_protocol_proposal_v2.md) for the correlation HTTP protocol. It uses `Request-Context` name-value pairs to propagate a collection of properties used by the immediate caller or callee. The Application Insights SDK uses this header to set the `dependency.target` and `request.source` fields.
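As an illustration, the `Request-Context` header carries comma-separated name-value pairs; a hypothetical parser (the sample `appId` value below is made up) might look like this:

```python
def parse_request_context(header: str) -> dict:
    """Parse comma-separated name=value pairs from a Request-Context
    header (sketch; the real SDK reads the appId pair to populate
    dependency.target and request.source)."""
    pairs = (item.split("=", 1) for item in header.split(","))
    return {name.strip(): value.strip() for name, value in pairs}

ctx = parse_request_context("appId=cid-v1:00000000-demo-app")
print(ctx["appId"])  # → cid-v1:00000000-demo-app
```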
### <a name="enable-w3c-distributed-tracing-support-for-classic-aspnet-apps"></a>Enable W3C distributed tracing support for classic ASP.NET apps
> [!NOTE]
> Starting with `Microsoft.ApplicationInsights.Web` and `Microsoft.ApplicationInsights.DependencyCollector`, no configuration is needed.
W3C Trace-Context support is implemented in a backward-compatible manner. Correlation is expected to work with applications that are instrumented with previous versions of the SDK (without W3C support).
If you want to keep using the legacy `Request-Id` protocol, you can disable Trace-Context with this configuration:
```csharp
Activity.DefaultIdFormat = ActivityIdFormat.Hierarchical;
Activity.ForceDefaultIdFormat = true;
```
If you run an older version of the SDK, we recommend that you update it or apply the following configuration to enable Trace-Context.
This feature is available in the `Microsoft.ApplicationInsights.Web` and `Microsoft.ApplicationInsights.DependencyCollector` packages, starting with version 2.8.0-beta1.
It's disabled by default. To enable it, make these changes in `ApplicationInsights.config`:
- Under `RequestTrackingTelemetryModule`, add the `EnableW3CHeadersExtraction` element and set its value to `true`.
- Under `DependencyTrackingTelemetryModule`, add the `EnableW3CHeadersInjection` element and set its value to `true`.
- Add `W3COperationCorrelationTelemetryInitializer` under `TelemetryInitializers`. It will look similar to this example:
```xml
<TelemetryInitializers>
<Add Type="Microsoft.ApplicationInsights.Extensibility.W3C.W3COperationCorrelationTelemetryInitializer, Microsoft.ApplicationInsights"/>
...
</TelemetryInitializers>
```
### <a name="enable-w3c-distributed-tracing-support-for-aspnet-core-apps"></a>Enable W3C distributed tracing support for ASP.NET Core apps
> [!NOTE]
> Starting with `Microsoft.ApplicationInsights.AspNetCore` version 2.8.0, no configuration is needed.
W3C Trace-Context support is implemented in a backward-compatible manner. Correlation is expected to work with applications that are instrumented with previous versions of the SDK (without W3C support).
If you want to keep using the legacy `Request-Id` protocol, you can disable Trace-Context with this configuration:
```csharp
Activity.DefaultIdFormat = ActivityIdFormat.Hierarchical;
Activity.ForceDefaultIdFormat = true;
```
If you run an older version of the SDK, we recommend that you update it or apply the following configuration to enable Trace-Context.
This feature is in `Microsoft.ApplicationInsights.AspNetCore` version 2.5.0-beta1 and in `Microsoft.ApplicationInsights.DependencyCollector` version 2.8.0-beta1.
It's disabled by default. To enable it, set `ApplicationInsightsServiceOptions.RequestCollectionOptions.EnableW3CDistributedTracing` to `true`:
```csharp
public void ConfigureServices(IServiceCollection services)
{
    services.AddApplicationInsightsTelemetry(o =>
        o.RequestCollectionOptions.EnableW3CDistributedTracing = true);
    // ....
}
```
### <a name="enable-w3c-distributed-tracing-support-for-java-apps"></a>Enable W3C distributed tracing support for Java apps
#### <a name="java-30-agent"></a>Java 3.0 agent
The Java 3.0 agent supports W3C out of the box, and no more configuration is needed.
#### <a name="java-sdk"></a>Java SDK
- **Incoming configuration**
  - For Java EE apps, add the following to the `<TelemetryModules>` tag in ApplicationInsights.xml:
```xml
<Add type="com.microsoft.applicationinsights.web.extensibility.modules.WebRequestTrackingTelemetryModule">
   <Param name="W3CEnabled" value="true"/>
   <Param name="enableW3CBackCompat" value="true" />
</Add>
```
  - For Spring Boot apps, add these properties:
- `azure.application-insights.web.enable-W3C=true`
- `azure.application-insights.web.enable-W3C-backcompat-mode=true`
- **Outgoing configuration**
  Add the following to the AI-Agent.xml file:
```xml
<Instrumentation>
  <BuiltIn enabled="true">
    <HTTP enabled="true" W3C="true" enableW3CBackCompat="true"/>
  </BuiltIn>
</Instrumentation>
```
> [!NOTE]
> Backward compatibility mode is enabled by default, and the `enableW3CBackCompat` parameter is optional. Use it only when you want to turn backward compatibility off.
>
> Ideally, you'll turn this off when all your services have been updated to newer versions of SDKs that support the W3C protocol. We highly recommend that you move to these newer SDKs as soon as possible.
> [!IMPORTANT]
> Make sure both the incoming and outgoing configurations are exactly the same.
### <a name="enable-w3c-distributed-tracing-support-for-web-apps"></a>Enable W3C distributed tracing support for web apps
This feature is in `Microsoft.ApplicationInsights.JavaScript`. It's disabled by default. To enable it, use the `distributedTracingMode` config. AI_AND_W3C is provided for backward compatibility with any legacy services instrumented by Application Insights.
- **npm setup (ignore if using snippet setup)**
```javascript
import { ApplicationInsights, DistributedTracingModes } from '@microsoft/applicationinsights-web';
const appInsights = new ApplicationInsights({ config: {
  instrumentationKey: 'YOUR_INSTRUMENTATION_KEY_GOES_HERE',
  distributedTracingMode: DistributedTracingModes.W3C
  /* ...other configuration options... */
} });
appInsights.loadAppInsights();
```
- **Snippet setup (ignore if using npm setup)**
```
<script type="text/javascript">
var sdkInstance="appInsightsSDK";window[sdkInstance]="appInsights";var aiName=window[sdkInstance],aisdk=window[aiName]||function(e){function n(e){i[e]=function(){var n=arguments;i.queue.push(function(){i[e].apply(i,n)})}}var i={config:e};i.initialize=!0;var a=document,t=window;setTimeout(function(){var n=a.createElement("script");n.src=e.url||"https://az416426.vo.msecnd.net/scripts/b/ai.2.min.js",a.getElementsByTagName("script")[0].parentNode.appendChild(n)});try{i.cookie=a.cookie}catch(e){}i.queue=[],i.version=2;for(var r=["Event","PageView","Exception","Trace","DependencyData","Metric","PageViewPerformance"];r.length;)n("track"+r.pop());n("startTrackPage"),n("stopTrackPage");var o="Track"+r[0];if(n("start"+o),n("stop"+o),!(!0===e.disableExceptionTracking||e.extensionConfig&&e.extensionConfig.ApplicationInsightsAnalytics&&!0===e.extensionConfig.ApplicationInsightsAnalytics.disableExceptionTracking)){n("_"+(r="onerror"));var s=t[r];t[r]=function(e,n,a,t,o){var c=s&&s(e,n,a,t,o);return!0!==c&&i["_"+r]({message:e,url:n,lineNumber:a,columnNumber:t,error:o}),c},e.autoExceptionInstrumented=!0}return i}
(
  {
    instrumentationKey:"INSTRUMENTATION_KEY",
    distributedTracingMode: 2 // DistributedTracingModes.W3C
    /* ...other configuration options... */
  }
);
window[aiName]=aisdk,aisdk.queue&&0===aisdk.queue.length&&aisdk.trackPageView({});
</script>
```
## <a name="opentracing-and-application-insights"></a>OpenTracing and Application Insights
The [OpenTracing data model specification](https://opentracing.io/) and Application Insights data models map in the following way:

| Application Insights               | OpenTracing                                     |
|------------------------------------|-------------------------------------------------|
| `Request`, `PageView`              | `Span` with `span.kind = server`                |
| `Dependency`                       | `Span` with `span.kind = client`                |
| `Id` of `Request` and `Dependency` | `SpanId`                                        |
| `Operation_Id`                     | `TraceId`                                       |
| `Operation_ParentId`               | `Reference` of type `ChildOf` (the parent span) |

For more information, see the [Application Insights telemetry data model](../../azure-monitor/app/data-model.md).
For definitions of OpenTracing concepts, see the OpenTracing [specification](https://github.com/opentracing/specification/blob/master/specification.md) and [semantic conventions](https://github.com/opentracing/specification/blob/master/semantic_conventions.md).
## <a name="telemetry-correlation-in-opencensus-python"></a>Telemetry correlation in OpenCensus Python
OpenCensus Python follows the OpenTracing data model specifications described earlier. It also supports [W3C Trace-Context](https://w3c.github.io/trace-context/) without requiring any configuration.
### <a name="incoming-request-correlation"></a>Incoming request correlation
OpenCensus Python correlates W3C Trace-Context headers from incoming requests to the spans that are generated from the requests themselves. OpenCensus does this automatically with integrations for these popular web application frameworks: Flask, Django, and Pyramid. You just need to populate the W3C Trace-Context headers in the [correct format](https://www.w3.org/TR/trace-context/#trace-context-http-headers-format) and send them with the request. Here's a sample Flask application that demonstrates this:
```python
from flask import Flask
from opencensus.ext.azure.trace_exporter import AzureExporter
from opencensus.ext.flask.flask_middleware import FlaskMiddleware
from opencensus.trace.samplers import ProbabilitySampler
app = Flask(__name__)
middleware = FlaskMiddleware(
    app,
    exporter=AzureExporter(),
    sampler=ProbabilitySampler(rate=1.0),
)

@app.route('/')
def hello():
    return 'Hello World!'

if __name__ == '__main__':
    app.run(host='localhost', port=8080, threaded=True)
```
This code runs a sample Flask application on your local machine, listening to port `8080`. To correlate trace context, you send a request to the endpoint. In this example, you can use a `curl` command:
```
curl --header "traceparent: 00-4bf92f3577b34da6a3ce929d0e0e4736-00f067aa0ba902b7-01" localhost:8080
```
Looking at the [Trace-Context header format](https://www.w3.org/TR/trace-context/#trace-context-http-headers-format), you can derive the following information:

- `version`: `00`
- `trace-id`: `4bf92f3577b34da6a3ce929d0e0e4736`
- `parent-id/span-id`: `00f067aa0ba902b7`
- `trace-flags`: `01`

If you look at the request entry that was sent to Azure Monitor, you can see fields populated with the trace header information. You can find this data under Logs (Analytics) in the Azure Monitor Application Insights resource.

The `id` field is in the format `<trace-id>.<span-id>`, where the `trace-id` is taken from the trace header that was passed in the request and the `span-id` is a generated 8-byte array for this span.
The `operation_ParentId` field is in the format `<trace-id>.<parent-id>`, where both the `trace-id` and the `parent-id` are taken from the trace header that was passed in the request.
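A small sketch shows how either field splits into its two parts (the span-id value here is illustrative):

```python
def split_ai_field(value: str):
    """Split an Application Insights id or operation_ParentId of the
    form '<trace-id>.<second-part>' into its two components."""
    trace_id, second_part = value.split(".", 1)
    return trace_id, second_part

trace_id, span_id = split_ai_field("4bf92f3577b34da6a3ce929d0e0e4736.70da28f5a4831014")
print(trace_id)  # → 4bf92f3577b34da6a3ce929d0e0e4736
```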
### <a name="log-correlation"></a>Log correlation
OpenCensus Python enables you to correlate logs by adding a trace ID, a span ID, and a sampling flag to log records. You add these attributes by installing the OpenCensus [logging integration](https://pypi.org/project/opencensus-ext-logging/). The following attributes will be added to Python `LogRecord` objects: `traceId`, `spanId`, and `traceSampled`. Note that this takes effect only for loggers that are created after the integration.
Here's a sample application that demonstrates this:
```python
import logging
from opencensus.trace import config_integration
from opencensus.trace.samplers import AlwaysOnSampler
from opencensus.trace.tracer import Tracer
config_integration.trace_integrations(['logging'])
logging.basicConfig(format='%(asctime)s traceId=%(traceId)s spanId=%(spanId)s %(message)s')
tracer = Tracer(sampler=AlwaysOnSampler())
logger = logging.getLogger(__name__)
logger.warning('Before the span')
with tracer.span(name='hello'):
    logger.warning('In the span')
logger.warning('After the span')
```
When this code runs, the following output appears in the console:
```
2019-10-17 11:25:59,382 traceId=c54cb1d4bbbec5864bf0917c64aeacdc spanId=0000000000000000 Before the span
2019-10-17 11:25:59,384 traceId=c54cb1d4bbbec5864bf0917c64aeacdc spanId=70da28f5a4831014 In the span
2019-10-17 11:25:59,385 traceId=c54cb1d4bbbec5864bf0917c64aeacdc spanId=0000000000000000 After the span
```
Notice that there's a `spanId` present for the log message that's within the span. It's the same `spanId` that belongs to the span named `hello`.
You can export the log data by using `AzureLogHandler`. For more information, see [this article](https://docs.microsoft.com/azure/azure-monitor/app/opencensus-python#logs).
## <a name="telemetry-correlation-in-net"></a>Telemetry correlation in .NET
Over time, .NET has defined several ways to correlate telemetry and diagnostics logs:
- `System.Diagnostics.CorrelationManager` allows tracking of [LogicalOperationStack and ActivityId](https://msdn.microsoft.com/library/system.diagnostics.correlationmanager.aspx).
- `System.Diagnostics.Tracing.EventSource` and Event Tracing for Windows (ETW) define the [SetCurrentThreadActivityId](https://msdn.microsoft.com/library/system.diagnostics.tracing.eventsource.setcurrentthreadactivityid.aspx) method.
- `ILogger` uses [log scopes](https://docs.microsoft.com/aspnet/core/fundamentals/logging#log-scopes).
- Windows Communication Foundation (WCF) and HTTP wire up "current" context propagation.
But these methods didn't enable automatic distributed tracing support. `DiagnosticSource` supports automatic cross-machine correlation. .NET libraries support `DiagnosticSource` and allow automatic cross-machine propagation of the correlation context via the transport, such as HTTP.
The [Activity User Guide](https://github.com/dotnet/corefx/blob/master/src/System.Diagnostics.DiagnosticSource/src/ActivityUserGuide.md) in `DiagnosticSource` explains the basics of tracking activities.
ASP.NET Core 2.0 supports extraction of HTTP headers and starting new activities.
`System.Net.Http.HttpClient`, starting with version 4.1.0, supports automatic injection of correlation HTTP headers and tracking HTTP calls as activities.
There's a new HTTP module, [Microsoft.AspNet.TelemetryCorrelation](https://www.nuget.org/packages/Microsoft.AspNet.TelemetryCorrelation/), for classic ASP.NET. This module implements telemetry correlation by using `DiagnosticSource`. It starts an activity based on incoming request headers. It also correlates telemetry from the different stages of request processing, even when every stage of Internet Information Services (IIS) processing runs on a different managed thread.
The Application Insights SDK, starting with version 2.4.0-beta1, uses `DiagnosticSource` and `Activity` to collect telemetry and associate it with the current activity.
<a name="java-correlation"></a>
## <a name="telemetry-correlation-in-java"></a>Telemetry correlation in Java
The [Java agent](https://docs.microsoft.com/azure/azure-monitor/app/java-in-process-agent) and the [Java SDK](../../azure-monitor/app/java-get-started.md) version 2.0.0 or later support automatic correlation of telemetry. They automatically populate `operation_id` for all telemetry (like traces, exceptions, and custom events) issued within the scope of a request. They also propagate the correlation headers (described earlier) for service-to-service calls via HTTP, if the [Java SDK agent](../../azure-monitor/app/java-agent.md) is configured.
> [!NOTE]
> The Application Insights Java agent auto-collects requests and dependencies for JMS, Kafka, Netty/Webflux, and more. For the Java SDK, only calls made via Apache HttpClient are supported for the correlation feature. Automatic context propagation across messaging technologies (like Kafka, RabbitMQ, and Azure Service Bus) isn't supported in the SDK.
> [!NOTE]
> To collect custom telemetry, you need to instrument the application with the Java 2.6 SDK.
### <a name="role-names"></a>Role names
You might want to customize the way component names are displayed in the [Application Map](../../azure-monitor/app/app-map.md). To do so, you can manually set `cloud_RoleName` by taking one of the following actions:
- For the Application Insights Java agent 3.0, set the cloud role name as follows:
```json
{
  "instrumentationSettings": {
    "preview": {
      "roleName": "my cloud role name"
    }
  }
}
```
  You can also set the cloud role name by using the `APPLICATIONINSIGHTS_ROLE_NAME` environment variable.
- With Application Insights Java SDK 2.5.0 and later, you can specify the `cloud_RoleName` by adding `<RoleName>` to your ApplicationInsights.xml file:
```xml
<?xml version="1.0" encoding="utf-8"?>
<ApplicationInsights xmlns="http://schemas.microsoft.com/ApplicationInsights/2013/Settings" schemaVersion="2014-05-30">
   <InstrumentationKey>** Your instrumentation key **</InstrumentationKey>
   <RoleName>** Your role name **</RoleName>
   ...
</ApplicationInsights>
```
- If you use Spring Boot with the Application Insights Spring Boot Starter, you just need to set your custom name for the application in the application.properties file:
`spring.application.name=<name-of-app>`
The Spring Boot Starter automatically assigns `cloudRoleName` to the value you enter for the `spring.application.name` property.
## <a name="next-steps"></a>Next steps
- Write [custom telemetry](../../azure-monitor/app/api-custom-events-metrics.md).
- For advanced correlation scenarios in ASP.NET Core and ASP.NET, see [Track custom operations](custom-operations-tracking.md).
- Learn more about [setting cloud_RoleName](../../azure-monitor/app/app-map.md#set-cloud-role-name) for other SDKs.
- Onboard all components of your microservice on Application Insights. Check out the [supported platforms](../../azure-monitor/app/platforms.md).
- See the [data model](../../azure-monitor/app/data-model.md) for Application Insights types.
- Learn how to [extend and filter telemetry](../../azure-monitor/app/api-filtering-sampling.md).
- Review the [Application Insights config reference](configuration-with-applicationinsights-config.md).