hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1cf7cd31d835d561c98ce25133f916ba86e3848f | 304 | py | Python | examen_2/p2/p2.py | Jhoselyn-Carballo/computacion_para_ingenieria | 4b5ed7d4aa0017fb4993ccfdcc9fcef0fb5b3898 | [
"Apache-2.0"
] | null | null | null | examen_2/p2/p2.py | Jhoselyn-Carballo/computacion_para_ingenieria | 4b5ed7d4aa0017fb4993ccfdcc9fcef0fb5b3898 | [
"Apache-2.0"
] | null | null | null | examen_2/p2/p2.py | Jhoselyn-Carballo/computacion_para_ingenieria | 4b5ed7d4aa0017fb4993ccfdcc9fcef0fb5b3898 | [
"Apache-2.0"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Thu Feb 17 09:10:05 2022
@author: JHOSS
"""
from tkinter import *
def contador(accion, contador):
    if accion == 'countUp':
        contador = contador + 1
    elif accion == 'countDown':
        contador = contador - 1
    elif accion == 'reset':
        contador = 0
    return contador
| 17.882353 | 35 | 0.644737 | 40 | 304 | 4.9 | 0.7 | 0.163265 | 0.173469 | 0.214286 | 0.27551 | 0 | 0 | 0 | 0 | 0 | 0 | 0.065844 | 0.200658 | 304 | 16 | 36 | 19 | 0.740741 | 0.243421 | 0 | 0 | 0 | 0 | 0.094595 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.111111 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
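The counter helper in the row above can be exercised as below, assuming the `==` comparisons in its branches were meant to be `=` assignments and `'coundDown'` was a typo for `'countDown'` (as written in the original row, the function always returns its input unchanged):

```python
def contador(accion, contador):
    # 'accion' selects the operation; the counter value is passed in and returned
    if accion == 'countUp':
        contador = contador + 1
    elif accion == 'countDown':
        contador = contador - 1
    elif accion == 'reset':
        contador = 0
    return contador

print(contador('countUp', 0))    # 1
print(contador('countDown', 5))  # 4
print(contador('reset', 99))     # 0
```

Note that an unrecognized action falls through every branch and returns the counter untouched, which matches the original control flow.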
1cf95ca934131253ff13ec5e02a3e110c4125705 | 1,838 | py | Python | scripts/preprocess.py | umd-lib/solr-irroc | 860be84ea1847cbb96c1a7a70b03f59dc6e0366b | [
"Apache-2.0"
] | null | null | null | scripts/preprocess.py | umd-lib/solr-irroc | 860be84ea1847cbb96c1a7a70b03f59dc6e0366b | [
"Apache-2.0"
] | null | null | null | scripts/preprocess.py | umd-lib/solr-irroc | 860be84ea1847cbb96c1a7a70b03f59dc6e0366b | [
"Apache-2.0"
] | 1 | 2019-11-04T13:19:34.000Z | 2019-11-04T13:19:34.000Z | #!/usr/bin/env python3
# -*- coding: utf8 -*-
#===================================================#
# cleanup.py #
# Joshua Westgard #
# 2015-08-13 #
# #
# Data preprocessing script for IRRoC DB #
# Usage: python3 cleanup.py [in.csv] [out.csv] #
#===================================================#
import sys, csv, re
infields = ['id', 'str_resource', 'str_description', 'website', 'meta_title',
            'meta_description', 'stage_list', 'task_list']
outfields = infields + ['stage_list_facet', 'task_list_facet']

with open(sys.argv[1], 'r') as infile, open(sys.argv[2], 'w') as outfile:
    # skip header row in order to use own fieldnames
    next(infile)

    # instantiate the reader and writer objects
    dr = csv.DictReader(infile, fieldnames=infields)
    dw = csv.DictWriter(outfile, fieldnames=outfields)
    dw.writeheader()
    exp = re.compile(r'\d+::([^\b])')

    # loop over the input file, writing results to output file
    for row in dr:
        # remove hash marks from URL
        m = re.search('#(.+)#', row['website'])
        if m:
            row['website'] = m.group(1)

        # remove spaces from all multivalued fields
        row['stage_list_facet'] = row['stage_list'].replace('; ', ';')
        row['task_list_facet'] = row['task_list'].replace('; ', ';')
        row['meta_description'] = row['meta_description'].replace(', ', ',')

        # create stage_list_facet and task_list_facet cols and strip numbers
        row['stage_list'] = re.sub(exp, r'\1', row['stage_list_facet'])
        row['task_list'] = re.sub(exp, r'\1', row['task_list_facet'])

        # write row
        dw.writerow(row)
| 36.039216 | 77 | 0.516322 | 207 | 1,838 | 4.449275 | 0.487923 | 0.078176 | 0.060803 | 0.036916 | 0.103149 | 0.036916 | 0.036916 | 0 | 0 | 0 | 0 | 0.012232 | 0.288357 | 1,838 | 50 | 78 | 36.76 | 0.691896 | 0.404788 | 0 | 0 | 0 | 0 | 0.272217 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.05 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
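The preprocessing script above normalizes multivalued fields by tightening `'; '` separators and stripping numeric `NN::` prefixes with its compiled pattern. A standalone sketch of those two transforms on made-up data (not from the IRRoC DB):

```python
import re

# same pattern as the script: drop a numeric prefix like '12::' while
# keeping the first captured character after it
exp = re.compile(r'\d+::([^\b])')

stage_list = '1::Planning; 2::Execution'
stage_list_facet = stage_list.replace('; ', ';')          # '1::Planning;2::Execution'
stage_list_clean = re.sub(exp, r'\1', stage_list_facet)

print(stage_list_clean)  # Planning;Execution
```

Inside a character class `\b` is the backspace character, so `[^\b]` effectively matches any character; the capture group exists only to preserve the character that follows the `::`.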
1cf9fe0aeeda656a56200c7ffd6e72f3aa0fd6c3 | 783 | py | Python | fmoe/gates/utils.py | GODVIX/fastmoe | 7f6463f0367205a1e95139c6d7e930be6e7fa746 | [
"Apache-2.0"
] | null | null | null | fmoe/gates/utils.py | GODVIX/fastmoe | 7f6463f0367205a1e95139c6d7e930be6e7fa746 | [
"Apache-2.0"
] | 1 | 2021-05-24T03:13:50.000Z | 2021-05-24T03:13:50.000Z | fmoe/gates/utils.py | Co1lin/fastmoe | ff7333c7a164a8e1f54954b1b56095dc4cde7bfc | [
"Apache-2.0"
] | null | null | null | r"""
Utilities that may be used in the gates
"""
import torch
from fmoe.functions import count_by_gate
import fmoe_cuda as fmoe_native
def limit_by_capacity(topk_idx, num_expert, world_size, capacity):
    capacity = torch.ones(num_expert, dtype=torch.int32,
                          device=topk_idx.device) * capacity
    pos, lec, gec = count_by_gate(topk_idx, num_expert, world_size,
                                  require_pos=False)
    new_gec, = fmoe_native.limit_by_capacity(gec, capacity,
                                             num_expert, world_size)
    if world_size > 1:
        new_lec, = fmoe_native.expert_exchange(new_gec, num_expert, world_size)
    else:
        new_lec = new_gec
    fmoe_native.prune_gate_by_capacity(topk_idx,
                                       new_lec.to(torch.int32), num_expert, world_size)
    return new_lec, new_gec
| 30.115385 | 79 | 0.711367 | 119 | 783 | 4.327731 | 0.386555 | 0.104854 | 0.135922 | 0.174757 | 0.097087 | 0.097087 | 0 | 0 | 0 | 0 | 0 | 0.008065 | 0.208174 | 783 | 25 | 80 | 31.32 | 0.822581 | 0.049808 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0 | 0.166667 | 0 | 0.277778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
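The `limit_by_capacity` routine above delegates the actual work to CUDA kernels in `fmoe_cuda`. As a rough illustration of what capacity limiting means for a mixture-of-experts gate, here is a pure-Python sketch (an assumption-laden stand-in, not the fastmoe implementation, and ignoring the multi-worker exchange path):

```python
def limit_by_capacity(expert_ids, num_expert, capacity):
    """Keep at most `capacity` tokens per expert; overflow tokens get -1.

    Pure-Python sketch of the idea behind fastmoe's limit_by_capacity /
    prune_gate_by_capacity (the real code runs on GPU tensors).
    """
    counts = [0] * num_expert
    pruned = []
    for e in expert_ids:
        if counts[e] < capacity:
            counts[e] += 1
            pruned.append(e)
        else:
            pruned.append(-1)  # token dropped: this expert is already full
    return pruned, counts

print(limit_by_capacity([0, 0, 1, 0, 1], num_expert=2, capacity=2))
# ([0, 0, 1, -1, 1], [2, 2])
```

The fourth token wants expert 0, which already holds its two allowed tokens, so it is marked dropped; each expert ends up with at most `capacity` tokens.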
1cff9155874f3a3bbd07b635d9ad399b58f00a6d | 1,581 | py | Python | decorator.py | zengboming/python | 13018f476554adc3bff831af27c08f7c216d4b09 | [
"Apache-2.0"
] | null | null | null | decorator.py | zengboming/python | 13018f476554adc3bff831af27c08f7c216d4b09 | [
"Apache-2.0"
] | null | null | null | decorator.py | zengboming/python | 13018f476554adc3bff831af27c08f7c216d4b09 | [
"Apache-2.0"
] | null | null | null | #decorator
def now():
    print "2015-11-18"

f = now
f()
print now.__name__
print f.__name__

def log(func):
    def wrapper(*args, **kw):
        print 'begin call %s():' % func.__name__
        func(*args, **kw)
        print 'end call %s():' % func.__name__
    return wrapper

@log
def now1():
    print now1.__name__

now1()
now1 = log(now1)
now1()

def log1(text):
    def decorator(func):
        def wrapper(*args, **kw):
            print '%s %s():' % (text, func.__name__)
            return func(*args, **kw)
        return wrapper
    return decorator

@log1('execute')
def now2():
    print now2.__name__

now2()

import functools

def log2(func):
    @functools.wraps(func)
    def wrapper(*args, **kw):
        print 'call %s():' % func.__name__
        return func(*args, **kw)
    return wrapper

@log2
def now3():
    print now3.__name__

now3()

def log3(text):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kw):
            print '%s %s():' % (text, func.__name__)
            return func(*args, **kw)
        return wrapper
    return decorator

@log3('execute')
def now4():
    print now4.__name__

now4()

def log4(text):
    if callable(text):
        @functools.wraps(text)
        def wrapper(*args, **kw):
            print 'begin call %s:' % text.__name__
            text(*args, **kw)
            print 'end call ' + text.__name__
        return wrapper
    else:
        def decorator(func):
            @functools.wraps(func)
            def wrapper(*args, **kw):
                print 'begin call %s %s():' % (text, func.__name__)
                func(*args, **kw)
                print 'end call %s %s():' % (text, func.__name__)
            return wrapper
        return decorator

@log4
def now5():
    print 'doing' + now5.__name__

now5()

@log4('execute')
def now6():
    print 'doing' + now6.__name__
now6() | 17.566667 | 53 | 0.655914 | 228 | 1,581 | 4.25 | 0.166667 | 0.074303 | 0.102167 | 0.099071 | 0.546956 | 0.508772 | 0.484004 | 0.484004 | 0.413829 | 0.25903 | 0 | 0.02881 | 0.165718 | 1,581 | 90 | 54 | 17.566667 | 0.705838 | 0.005693 | 0 | 0.38961 | 0 | 0 | 0.099237 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.012987 | null | null | 0.233766 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
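The row above is a Python 2 decorator tutorial. For readers on Python 3, here is a sketch of the same dual-mode pattern its `log4` demonstrates — one decorator usable both bare and with a text argument. The names are illustrative, not from the original file:

```python
import functools

def log(arg):
    """Usable both as @log and as @log('some tag')."""
    if callable(arg):
        # Used bare: 'arg' is the decorated function itself
        @functools.wraps(arg)
        def wrapper(*args, **kw):
            print('call %s():' % arg.__name__)
            return arg(*args, **kw)
        return wrapper

    # Used with an argument: 'arg' is the tag; return a real decorator
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kw):
            print('%s %s():' % (arg, func.__name__))
            return func(*args, **kw)
        return wrapper
    return decorator

@log
def now():
    return 'now'

@log('execute')
def later():
    return 'later'
```

Thanks to `functools.wraps`, both `now.__name__` and `later.__name__` keep their original values instead of reporting `wrapper`.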
e80430c85971413d707279b7c247a402de4c954e | 916 | py | Python | core/migrations/0004_auto_20210929_2354.py | codefair114/Inventory-App-Django | f09f43ca282f82be981cac26a92d614fdf2ff5ef | [
"MIT"
] | null | null | null | core/migrations/0004_auto_20210929_2354.py | codefair114/Inventory-App-Django | f09f43ca282f82be981cac26a92d614fdf2ff5ef | [
"MIT"
] | null | null | null | core/migrations/0004_auto_20210929_2354.py | codefair114/Inventory-App-Django | f09f43ca282f82be981cac26a92d614fdf2ff5ef | [
"MIT"
] | null | null | null | # Generated by Django 3.2.7 on 2021-09-29 23:54
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
    dependencies = [
        ('core', '0003_auto_20210929_2353'),
    ]

    operations = [
        migrations.AlterField(
            model_name='order',
            name='client',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='core.orderclient'),
        ),
        migrations.AlterField(
            model_name='order',
            name='payment',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='core.paymentmethod'),
        ),
        migrations.AlterField(
            model_name='order',
            name='shipment',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='core.shipment'),
        ),
    ]
| 30.533333 | 117 | 0.618996 | 100 | 916 | 5.58 | 0.43 | 0.071685 | 0.100358 | 0.157706 | 0.591398 | 0.591398 | 0.387097 | 0.387097 | 0.387097 | 0.387097 | 0 | 0.045388 | 0.254367 | 916 | 29 | 118 | 31.586207 | 0.771596 | 0.049127 | 0 | 0.391304 | 1 | 0 | 0.126582 | 0.026467 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.086957 | 0 | 0.217391 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e80867dca8c2989991af1d5f2e487d3e8fa8b0f5 | 13,314 | py | Python | sdk/python/pulumi_azure/lb/outbound_rule.py | suresh198526/pulumi-azure | bf27206a38d7a5c58b3c2c57ec8769fe3d0fc5d7 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_azure/lb/outbound_rule.py | suresh198526/pulumi-azure | bf27206a38d7a5c58b3c2c57ec8769fe3d0fc5d7 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | sdk/python/pulumi_azure/lb/outbound_rule.py | suresh198526/pulumi-azure | bf27206a38d7a5c58b3c2c57ec8769fe3d0fc5d7 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union
from .. import _utilities, _tables
from . import outputs
from ._inputs import *
__all__ = ['OutboundRule']
class OutboundRule(pulumi.CustomResource):
    def __init__(__self__,
                 resource_name: str,
                 opts: Optional[pulumi.ResourceOptions] = None,
                 allocated_outbound_ports: Optional[pulumi.Input[int]] = None,
                 backend_address_pool_id: Optional[pulumi.Input[str]] = None,
                 enable_tcp_reset: Optional[pulumi.Input[bool]] = None,
                 frontend_ip_configurations: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['OutboundRuleFrontendIpConfigurationArgs']]]]] = None,
                 idle_timeout_in_minutes: Optional[pulumi.Input[int]] = None,
                 loadbalancer_id: Optional[pulumi.Input[str]] = None,
                 name: Optional[pulumi.Input[str]] = None,
                 protocol: Optional[pulumi.Input[str]] = None,
                 resource_group_name: Optional[pulumi.Input[str]] = None,
                 __props__=None,
                 __name__=None,
                 __opts__=None):
        """
        Manages a Load Balancer Outbound Rule.

        > **NOTE** When using this resource, the Load Balancer needs to have a FrontEnd IP Configuration and a Backend Address Pool Attached.

        ## Example Usage

        ```python
        import pulumi
        import pulumi_azure as azure

        example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West US")
        example_public_ip = azure.network.PublicIp("examplePublicIp",
            location="West US",
            resource_group_name=example_resource_group.name,
            allocation_method="Static")
        example_load_balancer = azure.lb.LoadBalancer("exampleLoadBalancer",
            location="West US",
            resource_group_name=example_resource_group.name,
            frontend_ip_configurations=[azure.lb.LoadBalancerFrontendIpConfigurationArgs(
                name="PublicIPAddress",
                public_ip_address_id=example_public_ip.id,
            )])
        example_backend_address_pool = azure.lb.BackendAddressPool("exampleBackendAddressPool",
            resource_group_name=example_resource_group.name,
            loadbalancer_id=example_load_balancer.id)
        example_outbound_rule = azure.lb.OutboundRule("exampleOutboundRule",
            resource_group_name=example_resource_group.name,
            loadbalancer_id=example_load_balancer.id,
            protocol="Tcp",
            backend_address_pool_id=example_backend_address_pool.id,
            frontend_ip_configurations=[azure.lb.OutboundRuleFrontendIpConfigurationArgs(
                name="PublicIPAddress",
            )])
        ```

        ## Import

        Load Balancer Outbound Rules can be imported using the `resource id`, e.g.

        ```sh
        $ pulumi import azure:lb/outboundRule:OutboundRule example /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/group1/providers/Microsoft.Network/loadBalancers/lb1/outboundRules/rule1
        ```

        :param str resource_name: The name of the resource.
        :param pulumi.ResourceOptions opts: Options for the resource.
        :param pulumi.Input[int] allocated_outbound_ports: The number of outbound ports to be used for NAT.
        :param pulumi.Input[str] backend_address_pool_id: The ID of the Backend Address Pool. Outbound traffic is randomly load balanced across IPs in the backend IPs.
        :param pulumi.Input[bool] enable_tcp_reset: Receive bidirectional TCP Reset on TCP flow idle timeout or unexpected connection termination. This element is only used when the protocol is set to TCP.
        :param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['OutboundRuleFrontendIpConfigurationArgs']]]] frontend_ip_configurations: One or more `frontend_ip_configuration` blocks as defined below.
        :param pulumi.Input[int] idle_timeout_in_minutes: The timeout for the TCP idle connection
        :param pulumi.Input[str] loadbalancer_id: The ID of the Load Balancer in which to create the Outbound Rule. Changing this forces a new resource to be created.
        :param pulumi.Input[str] name: Specifies the name of the Outbound Rule. Changing this forces a new resource to be created.
        :param pulumi.Input[str] protocol: The transport protocol for the external endpoint. Possible values are `Udp`, `Tcp` or `All`.
        :param pulumi.Input[str] resource_group_name: The name of the resource group in which to create the resource. Changing this forces a new resource to be created.
        """
        if __name__ is not None:
            warnings.warn("explicit use of __name__ is deprecated", DeprecationWarning)
            resource_name = __name__
        if __opts__ is not None:
            warnings.warn("explicit use of __opts__ is deprecated, use 'opts' instead", DeprecationWarning)
            opts = __opts__
        if opts is None:
            opts = pulumi.ResourceOptions()
        if not isinstance(opts, pulumi.ResourceOptions):
            raise TypeError('Expected resource options to be a ResourceOptions instance')
        if opts.version is None:
            opts.version = _utilities.get_version()
        if opts.id is None:
            if __props__ is not None:
                raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
            __props__ = dict()

            __props__['allocated_outbound_ports'] = allocated_outbound_ports
            if backend_address_pool_id is None:
                raise TypeError("Missing required property 'backend_address_pool_id'")
            __props__['backend_address_pool_id'] = backend_address_pool_id
            __props__['enable_tcp_reset'] = enable_tcp_reset
            __props__['frontend_ip_configurations'] = frontend_ip_configurations
            __props__['idle_timeout_in_minutes'] = idle_timeout_in_minutes
            if loadbalancer_id is None:
                raise TypeError("Missing required property 'loadbalancer_id'")
            __props__['loadbalancer_id'] = loadbalancer_id
            __props__['name'] = name
            if protocol is None:
                raise TypeError("Missing required property 'protocol'")
            __props__['protocol'] = protocol
            if resource_group_name is None:
                raise TypeError("Missing required property 'resource_group_name'")
            __props__['resource_group_name'] = resource_group_name
        super(OutboundRule, __self__).__init__(
            'azure:lb/outboundRule:OutboundRule',
            resource_name,
            __props__,
            opts)
    @staticmethod
    def get(resource_name: str,
            id: pulumi.Input[str],
            opts: Optional[pulumi.ResourceOptions] = None,
            allocated_outbound_ports: Optional[pulumi.Input[int]] = None,
            backend_address_pool_id: Optional[pulumi.Input[str]] = None,
            enable_tcp_reset: Optional[pulumi.Input[bool]] = None,
            frontend_ip_configurations: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['OutboundRuleFrontendIpConfigurationArgs']]]]] = None,
            idle_timeout_in_minutes: Optional[pulumi.Input[int]] = None,
            loadbalancer_id: Optional[pulumi.Input[str]] = None,
            name: Optional[pulumi.Input[str]] = None,
            protocol: Optional[pulumi.Input[str]] = None,
            resource_group_name: Optional[pulumi.Input[str]] = None) -> 'OutboundRule':
        """
        Get an existing OutboundRule resource's state with the given name, id, and optional extra
        properties used to qualify the lookup.

        :param str resource_name: The unique name of the resulting resource.
        :param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
        :param pulumi.ResourceOptions opts: Options for the resource.
        :param pulumi.Input[int] allocated_outbound_ports: The number of outbound ports to be used for NAT.
        :param pulumi.Input[str] backend_address_pool_id: The ID of the Backend Address Pool. Outbound traffic is randomly load balanced across IPs in the backend IPs.
        :param pulumi.Input[bool] enable_tcp_reset: Receive bidirectional TCP Reset on TCP flow idle timeout or unexpected connection termination. This element is only used when the protocol is set to TCP.
        :param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['OutboundRuleFrontendIpConfigurationArgs']]]] frontend_ip_configurations: One or more `frontend_ip_configuration` blocks as defined below.
        :param pulumi.Input[int] idle_timeout_in_minutes: The timeout for the TCP idle connection
        :param pulumi.Input[str] loadbalancer_id: The ID of the Load Balancer in which to create the Outbound Rule. Changing this forces a new resource to be created.
        :param pulumi.Input[str] name: Specifies the name of the Outbound Rule. Changing this forces a new resource to be created.
        :param pulumi.Input[str] protocol: The transport protocol for the external endpoint. Possible values are `Udp`, `Tcp` or `All`.
        :param pulumi.Input[str] resource_group_name: The name of the resource group in which to create the resource. Changing this forces a new resource to be created.
        """
        opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))

        __props__ = dict()

        __props__["allocated_outbound_ports"] = allocated_outbound_ports
        __props__["backend_address_pool_id"] = backend_address_pool_id
        __props__["enable_tcp_reset"] = enable_tcp_reset
        __props__["frontend_ip_configurations"] = frontend_ip_configurations
        __props__["idle_timeout_in_minutes"] = idle_timeout_in_minutes
        __props__["loadbalancer_id"] = loadbalancer_id
        __props__["name"] = name
        __props__["protocol"] = protocol
        __props__["resource_group_name"] = resource_group_name
        return OutboundRule(resource_name, opts=opts, __props__=__props__)
    @property
    @pulumi.getter(name="allocatedOutboundPorts")
    def allocated_outbound_ports(self) -> pulumi.Output[Optional[int]]:
        """
        The number of outbound ports to be used for NAT.
        """
        return pulumi.get(self, "allocated_outbound_ports")

    @property
    @pulumi.getter(name="backendAddressPoolId")
    def backend_address_pool_id(self) -> pulumi.Output[str]:
        """
        The ID of the Backend Address Pool. Outbound traffic is randomly load balanced across IPs in the backend IPs.
        """
        return pulumi.get(self, "backend_address_pool_id")

    @property
    @pulumi.getter(name="enableTcpReset")
    def enable_tcp_reset(self) -> pulumi.Output[Optional[bool]]:
        """
        Receive bidirectional TCP Reset on TCP flow idle timeout or unexpected connection termination. This element is only used when the protocol is set to TCP.
        """
        return pulumi.get(self, "enable_tcp_reset")

    @property
    @pulumi.getter(name="frontendIpConfigurations")
    def frontend_ip_configurations(self) -> pulumi.Output[Optional[Sequence['outputs.OutboundRuleFrontendIpConfiguration']]]:
        """
        One or more `frontend_ip_configuration` blocks as defined below.
        """
        return pulumi.get(self, "frontend_ip_configurations")

    @property
    @pulumi.getter(name="idleTimeoutInMinutes")
    def idle_timeout_in_minutes(self) -> pulumi.Output[Optional[int]]:
        """
        The timeout for the TCP idle connection
        """
        return pulumi.get(self, "idle_timeout_in_minutes")

    @property
    @pulumi.getter(name="loadbalancerId")
    def loadbalancer_id(self) -> pulumi.Output[str]:
        """
        The ID of the Load Balancer in which to create the Outbound Rule. Changing this forces a new resource to be created.
        """
        return pulumi.get(self, "loadbalancer_id")

    @property
    @pulumi.getter
    def name(self) -> pulumi.Output[str]:
        """
        Specifies the name of the Outbound Rule. Changing this forces a new resource to be created.
        """
        return pulumi.get(self, "name")

    @property
    @pulumi.getter
    def protocol(self) -> pulumi.Output[str]:
        """
        The transport protocol for the external endpoint. Possible values are `Udp`, `Tcp` or `All`.
        """
        return pulumi.get(self, "protocol")

    @property
    @pulumi.getter(name="resourceGroupName")
    def resource_group_name(self) -> pulumi.Output[str]:
        """
        The name of the resource group in which to create the resource. Changing this forces a new resource to be created.
        """
        return pulumi.get(self, "resource_group_name")

    def translate_output_property(self, prop):
        return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop

    def translate_input_property(self, prop):
        return _tables.SNAKE_TO_CAMEL_CASE_TABLE.get(prop) or prop
| 53.043825 | 207 | 0.686646 | 1,580 | 13,314 | 5.532911 | 0.148101 | 0.052848 | 0.035232 | 0.032029 | 0.630634 | 0.596774 | 0.582819 | 0.560284 | 0.531686 | 0.513612 | 0 | 0.003521 | 0.232011 | 13,314 | 250 | 208 | 53.256 | 0.851443 | 0.435106 | 0 | 0.244094 | 1 | 0 | 0.178529 | 0.078676 | 0 | 0 | 0 | 0 | 0 | 1 | 0.102362 | false | 0.007874 | 0.055118 | 0.015748 | 0.259843 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e8094e0dbe640e5e2094dda3ca82426a4117f2ae | 21,377 | py | Python | orbit_predictor/predictors/base.py | Juanlu001/orbit-predictor | ca67e2e859932938627ed24e5cbf58c887cd99c0 | [
"MIT"
] | null | null | null | orbit_predictor/predictors/base.py | Juanlu001/orbit-predictor | ca67e2e859932938627ed24e5cbf58c887cd99c0 | [
"MIT"
] | null | null | null | orbit_predictor/predictors/base.py | Juanlu001/orbit-predictor | ca67e2e859932938627ed24e5cbf58c887cd99c0 | [
"MIT"
] | null | null | null | # MIT License
#
# Copyright (c) 2017 Satellogic SA
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
import datetime as dt
import logging
import warnings
from collections import namedtuple
from math import pi, acos, degrees, radians
import numpy as np
try:
    from scipy.optimize import brentq, minimize_scalar
except ImportError:
    warnings.warn('scipy module was not found, some features may not work properly.',
                  ImportWarning)
from orbit_predictor.constants import MU_E
from orbit_predictor.exceptions import NotReachable, PropagationError
from orbit_predictor import coordinate_systems
from orbit_predictor.keplerian import rv2coe
from orbit_predictor.utils import (
    angle_between,
    cross_product,
    dot_product,
    reify,
    vector_diff,
    vector_norm,
    gstime_from_datetime,
    get_shadow,
    get_sun,
    eclipse_duration,
    get_satellite_minus_penumbra_verticals,
)
logger = logging.getLogger(__name__)
ONE_SECOND = dt.timedelta(seconds=1)
def round_datetime(dt_):
    return dt_
class Position(namedtuple(
        "Position", ['when_utc', 'position_ecef', 'velocity_ecef', 'error_estimate'])):

    @reify
    def position_llh(self):
        """Latitude (deg), longitude (deg), altitude (km)."""
        return coordinate_systems.ecef_to_llh(self.position_ecef)

    @reify
    def osculating_elements(self):
        """Osculating Keplerian orbital elements.

        Semimajor axis (km), eccentricity, inclination (deg),
        right ascension of the ascending node or RAAN (deg),
        argument of perigee (deg), true anomaly (deg).
        """
        gmst = gstime_from_datetime(self.when_utc)
        position_eci = coordinate_systems.ecef_to_eci(self.position_ecef, gmst)
        velocity_eci = coordinate_systems.ecef_to_eci(self.velocity_ecef, gmst)

        # Convert position to Keplerian osculating elements
        p, ecc, inc, raan, argp, ta = rv2coe(
            MU_E, np.array(position_eci), np.array(velocity_eci)
        )
        # Transform to more familiar semimajor axis
        sma = p / (1 - ecc ** 2)

        return sma, ecc, degrees(inc), degrees(raan), degrees(argp), degrees(ta)
class PredictedPass:
def __init__(self, location, sate_id,
max_elevation_deg,
aos, los, duration_s,
max_elevation_position=None,
max_elevation_date=None):
self.location = location
self.sate_id = sate_id
self.max_elevation_position = max_elevation_position
self.max_elevation_date = max_elevation_date
self.max_elevation_deg = max_elevation_deg
self.aos = aos
self.los = los
self.duration_s = duration_s
@property
def midpoint(self):
"""Returns a datetime of the midpoint of the pass"""
return self.aos + (self.los - self.aos) / 2
def __repr__(self):
return "<PredictedPass {} over {} on {}>".format(self.sate_id, self.location, self.aos)
def __eq__(self, other):
return all([issubclass(other.__class__, PredictedPass),
self.location == other.location,
self.sate_id == other.sate_id,
self.max_elevation_position == other.max_elevation_position,
self.max_elevation_date == other.max_elevation_date,
self.max_elevation_deg == other.max_elevation_deg,
self.aos == other.aos,
self.los == other.los,
self.duration_s == other.duration_s])
def get_off_nadir_angle(self):
warnings.warn("This method is deprecated!", DeprecationWarning)
return self.off_nadir_deg
@reify
def off_nadir_deg(self):
"""Computes off-nadir angle calculation
Given satellite position ``sate_pos``, velocity ``sate_vel``, and
location ``target`` in a common frame, off-nadir angle ``off_nadir_angle``
is given by:
t2b = sate_pos - target
cos(off_nadir_angle) = (sate_pos · t2b) # Vectorial dot product
_______________________
|| sate_pos || || t2b||
Sign for the rotation is calculated this way
cross = target ⨯ sate_pos
sign = cross · sate_vel
____________________
| cross · sate_vel |
"""
sate_pos = self.max_elevation_position.position_ecef
sate_vel = self.max_elevation_position.velocity_ecef
target = self.location.position_ecef
t2b = vector_diff(sate_pos, target)
angle = acos(
dot_product(sate_pos, t2b) / (vector_norm(sate_pos) * vector_norm(t2b))
)
cross = cross_product(target, sate_pos)
dot = dot_product(cross, sate_vel)
try:
sign = dot / abs(dot)
except ZeroDivisionError:
sign = 1
return degrees(angle) * sign
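The docstring formula above can be exercised with plain tuples; a self-contained sketch (the helper names here are illustrative, not the module's `vector_*` utilities):

```python
from math import acos, degrees

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    return dot(a, a) ** 0.5

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def off_nadir_deg(sate_pos, sate_vel, target):
    """Signed off-nadir angle, following the docstring formula above."""
    t2b = tuple(s - t for s, t in zip(sate_pos, target))
    angle = acos(dot(sate_pos, t2b) / (norm(sate_pos) * norm(t2b)))
    d = dot(cross(target, sate_pos), sate_vel)
    sign = 1 if d >= 0 else -1   # d == 0 falls back to +1, like the try/except above
    return degrees(angle) * sign

# Satellite directly above the target: zero off-nadir angle.
assert off_nadir_deg((7000.0, 0.0, 0.0), (0.0, 7.5, 0.0), (6371.0, 0.0, 0.0)) == 0.0
```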
class Predictor:
@property
def sate_id(self):
raise NotImplementedError
def propagate_eci(self, when_utc=None):
raise NotImplementedError
def get_position(self, when_utc=None):
raise NotImplementedError("You have to implement it!")
def get_shadow(self, when_utc=None):
"""Gives illumination at given time (2 for illuminated, 1 for penumbra, 0 for umbra)."""
if when_utc is None:
when_utc = dt.datetime.utcnow()
return get_shadow(
self.get_position(when_utc).position_ecef,
when_utc
)
def get_normal_vector(self, when_utc=None):
"""Gets unitary normal vector (orthogonal to orbital plane) at given time."""
if when_utc is None:
when_utc = dt.datetime.utcnow()
position, velocity = self.propagate_eci(when_utc)
orbital_plane_normal = np.cross(position, velocity)
return orbital_plane_normal / vector_norm(orbital_plane_normal)
def get_beta(self, when_utc=None):
"""Gets angle between orbital plane and Sun direction (beta) at given time, in degrees."""
if when_utc is None:
when_utc = dt.datetime.utcnow()
# Here we calculate the complementary angle of beta,
# because we use the normal vector of the orbital plane
beta_comp = angle_between(
get_sun(when_utc),
self.get_normal_vector(when_utc)
)
# We subtract from 90 degrees to return the real beta angle
return 90 - beta_comp
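A pure-math sketch of the same beta computation, assuming unit-consistent ECI-like vectors (names are illustrative): the orbital-plane normal is position × velocity, and beta is 90° minus the angle between that normal and the Sun direction.

```python
from math import acos, degrees, sqrt

def beta_deg(sun_dir, position, velocity):
    """Beta = 90 deg minus the angle between the Sun direction and the
    orbital-plane normal (position x velocity), as in get_beta above."""
    n = (position[1] * velocity[2] - position[2] * velocity[1],
         position[2] * velocity[0] - position[0] * velocity[2],
         position[0] * velocity[1] - position[1] * velocity[0])
    dot_sn = sum(s * c for s, c in zip(sun_dir, n))
    norms = sqrt(sum(s * s for s in sun_dir)) * sqrt(sum(c * c for c in n))
    cos_angle = max(-1.0, min(1.0, dot_sn / norms))
    return 90.0 - degrees(acos(cos_angle))

# Sun in the orbital plane -> beta = 0; Sun along the normal -> beta = 90.
assert abs(beta_deg((1.0, 0.0, 0.0), (7000.0, 0.0, 0.0), (0.0, 7.5, 0.0))) < 1e-9
assert abs(beta_deg((0.0, 0.0, 1.0), (7000.0, 0.0, 0.0), (0.0, 7.5, 0.0)) - 90.0) < 1e-9
```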
class CartesianPredictor(Predictor):
def _propagate_ecef(self, when_utc=None):
"""Return position and velocity in the given date using ECEF coordinate system."""
if when_utc is None:
when_utc = dt.datetime.utcnow()
position_eci, velocity_eci = self.propagate_eci(when_utc)
gmst = gstime_from_datetime(when_utc)
position_ecef = coordinate_systems.eci_to_ecef(position_eci, gmst)
velocity_ecef = coordinate_systems.eci_to_ecef(velocity_eci, gmst)
return position_ecef, velocity_ecef
@reify
def mean_motion(self):
"""Mean motion, in radians per minute"""
raise NotImplementedError
@reify
def period(self):
"""Orbital period, in minutes"""
return 2 * pi / self.mean_motion
def get_position(self, when_utc=None):
"""Return a Position namedtuple in ECEF coordinate system"""
if when_utc is None:
when_utc = dt.datetime.utcnow()
position_ecef, velocity_ecef = self._propagate_ecef(when_utc)
return Position(when_utc=when_utc, position_ecef=position_ecef,
velocity_ecef=velocity_ecef, error_estimate=None)
def get_only_position(self, when_utc=None):
"""Return a tuple in ECEF coordinate system"""
return self.get_position(when_utc).position_ecef
def get_eclipse_duration(self, when_utc=None, tolerance=1e-1):
"""Gets eclipse duration at given time, in minutes"""
ecc = self.get_position(when_utc).osculating_elements[1]
if ecc > tolerance:
raise NotImplementedError("Non circular orbits are not supported")
beta = self.get_beta(when_utc)
return eclipse_duration(beta, self.period)
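The body of `eclipse_duration` is not shown here. As a hedged illustration only (an assumption about the model, not necessarily this library's formula), a common textbook cylindrical-shadow approximation for a circular orbit gives a closed form in beta, period, and altitude:

```python
from math import acos, asin, cos, pi, radians, sqrt

R_E = 6371.0  # mean Earth radius in km (illustrative constant)

def eclipse_minutes(beta_deg, period_min, altitude_km):
    """Cylindrical-shadow approximation for a circular orbit: the eclipsed
    fraction of the period is acos(sqrt(h**2 + 2*R*h) / (r*cos(beta))) / pi,
    and zero once |beta| exceeds asin(R / r)."""
    r = R_E + altitude_km
    beta = radians(beta_deg)
    if abs(beta) >= asin(R_E / r):   # Sun high enough above the plane: no eclipse
        return 0.0
    h = altitude_km
    frac = acos(sqrt(h * h + 2.0 * R_E * h) / (r * cos(beta))) / pi
    return period_min * frac

# A ~500 km orbit (period ~94.6 min) at beta = 0 is shadowed for ~35-36 min.
assert 30.0 < eclipse_minutes(0.0, 94.6, 500.0) < 40.0
assert eclipse_minutes(70.0, 94.6, 500.0) == 0.0
```

The guard condition is exactly what keeps the `acos` argument in range: sqrt(h² + 2Rh) equals sqrt(r² − R²), which is at most r·cos(beta) precisely when |beta| < asin(R/r).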
def passes_over(self, location, when_utc, limit_date=None, max_elevation_gt=0, aos_at_dg=0):
return LocationPredictor(location, self, when_utc, limit_date,
max_elevation_gt, aos_at_dg)
def get_next_pass(self, location, when_utc=None, max_elevation_gt=5,
aos_at_dg=0, limit_date=None):
"""Return a PredictedPass instance with the data of the next pass over the given location
location: point on Earth we want to see from the satellite.
when_utc: datetime UTC after which the pass is calculated, default to now.
max_elevation_gt: filter passes with max_elevation under it.
aos_at_dg: elevation at which the pass is considered to start.
The next pass with a LOS strictly after when_utc will be returned,
possibly the current pass.
"""
if when_utc is None:
when_utc = dt.datetime.utcnow()
for pass_ in self.passes_over(location, when_utc, limit_date,
max_elevation_gt=max_elevation_gt,
aos_at_dg=aos_at_dg):
return pass_
else:
raise NotReachable('Propagation limit date exceeded')
def eclipses_since(self, when_utc=None, limit_date=None):
"""
An iterator that yields all eclipses start and end times between
when_utc and limit_date.
The next eclipse with an end strictly after when_utc will be returned,
possibly the current eclipse.
The last eclipse returned starts before limit_date, but it can end
strictly after limit_date.
Non-circular orbits are not supported, and will raise NotImplementedError.
"""
def _get_illumination(t):
my_start = start + dt.timedelta(seconds=t)
result = get_satellite_minus_penumbra_verticals(
self.get_only_position(my_start),
my_start
)
return result
if when_utc is None:
when_utc = dt.datetime.utcnow()
orbital_period_s = self.period * 60
# A third of the orbit period is used as the base window of the search.
# This window ensures the function get_satellite_minus_penumbra_verticals
# will not have more than one local minimum (one in the illuminated phase and
# the other in penumbra).
base_search_window_s = orbital_period_s / 3
start = when_utc
while limit_date is None or start < limit_date:
# A negative minimum is approximately the middle point of the eclipse
minimum_illumination = minimize_scalar(
_get_illumination,
bounds=(0, base_search_window_s),
method="bounded",
options={"xatol": 1e-2},
)
eclipse_center_candidate_delta_s = minimum_illumination.x
# If the minimum found is not illuminated, there is an eclipse here
if _get_illumination(eclipse_center_candidate_delta_s) < 0:
# The small time interval to search zeros around the center
# is estimated from the expected eclipse duration (which is
# generally an underestimate, hence the 1.5 coefficient).
# A minimum of 180 seconds is also used because in some cases
# the estimate is 0 even though there is an eclipse.
eclipse_duration_estimation_s = self.get_eclipse_duration(start) * 60
zero_search_window_s = max(180, 1.5 * eclipse_duration_estimation_s)
# Search now both zeros to get the start and end of the eclipse
eclipse_start_delta_s = brentq(
_get_illumination,
eclipse_center_candidate_delta_s - zero_search_window_s,
eclipse_center_candidate_delta_s,
xtol=1e-2,
full_output=False,
)
eclipse_end_delta_s = brentq(
_get_illumination,
eclipse_center_candidate_delta_s,
eclipse_center_candidate_delta_s + zero_search_window_s,
xtol=1e-2,
full_output=False,
)
eclipse_start = start + dt.timedelta(seconds=eclipse_start_delta_s)
eclipse_end = start + dt.timedelta(seconds=eclipse_end_delta_s)
yield eclipse_start, eclipse_end
start = eclipse_end + dt.timedelta(seconds=base_search_window_s)
else:
start += dt.timedelta(seconds=base_search_window_s)
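The search pattern above — bounded minimisation to locate the eclipse centre, then two root searches for its edges — can be sketched without SciPy. A toy illumination curve stands in for `get_satellite_minus_penumbra_verticals`, and crude grid/bisection routines play the roles of `minimize_scalar` and `brentq`:

```python
from math import cos, pi

def illumination(t):
    # Toy stand-in: negative while "eclipsed"; one eclipse per 90-unit period.
    return cos(2 * pi * t / 90.0) + 0.5

def argmin_on(f, lo, hi, steps=2000):
    # Crude bounded minimisation (minimize_scalar plays this role above).
    return min((lo + (hi - lo) * i / steps for i in range(steps + 1)), key=f)

def bisect_zero(f, lo, hi, tol=1e-6):
    # One sign change assumed in [lo, hi], as brentq requires.
    flo = f(lo)
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if (f(mid) > 0) == (flo > 0):
            lo, flo = mid, f(mid)
        else:
            hi = mid
    return (lo + hi) / 2.0

center = argmin_on(illumination, 0.0, 90.0)          # eclipse midpoint, ~45
start = bisect_zero(illumination, center - 45.0, center)
end = bisect_zero(illumination, center, center + 45.0)
assert abs(start - 30.0) < 1e-3 and abs(end - 60.0) < 1e-3
```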
class GPSPredictor(Predictor):
pass
class LocationPredictor:
"""Predicts passes over a given location
Exposes an iterable interface
"""
def __init__(self, location, predictor, start_date, limit_date=None,
max_elevation_gt=0, aos_at_dg=0, *, propagator=None):
if propagator is not None:
warnings.warn(
"propagator parameter was renamed to predictor "
"and will be removed in a future release",
DeprecationWarning
)
predictor = propagator
self.location = location
self.predictor = predictor
self.start_date = start_date
self.limit_date = limit_date
self.max_elevation_gt = radians(max([max_elevation_gt, aos_at_dg]))
self.aos_at = radians(aos_at_dg)
@property
def propagator(self):
warnings.warn(
"propagator parameter was renamed to predictor "
"and will be removed in a future release",
DeprecationWarning
)
return self.predictor
def __iter__(self):
"""Returns one pass each time"""
current_date = self.start_date
while True:
if self.is_ascending(current_date):
# we need a descending point
ascending_date = current_date
descending_date = self._find_nearest_descending(ascending_date)
pass_ = self._refine_pass(ascending_date, descending_date)
if pass_.valid:
if self.limit_date is not None and pass_.aos > self.limit_date:
break
yield self._build_predicted_pass(pass_)
if self.limit_date is not None and current_date > self.limit_date:
break
current_date = pass_.tca + self._orbit_step(0.6)
else:
current_date = self._find_nearest_ascending(current_date)
def _build_predicted_pass(self, accuratepass):
"""Returns a classic predicted pass"""
tca_position = self.predictor.get_position(accuratepass.tca)
return PredictedPass(self.location, self.predictor.sate_id,
max_elevation_deg=accuratepass.max_elevation_deg,
aos=accuratepass.aos,
los=accuratepass.los,
duration_s=accuratepass.duration.total_seconds(),
max_elevation_position=tca_position,
max_elevation_date=accuratepass.tca,
)
def _find_nearest_descending(self, ascending_date):
for candidate in self._sample_points(ascending_date):
if not self.is_ascending(candidate):
return candidate
else:
logger.error('Could not find a descending pass over %s start date: %s - TLE: %s',
self.location, ascending_date, self.predictor.tle)
raise PropagationError("Can not find an descending phase")
def _find_nearest_ascending(self, descending_date):
for candidate in self._sample_points(descending_date):
if self.is_ascending(candidate):
return candidate
else:
logger.error('Could not find an ascending pass over %s start date: %s - TLE: %s',
self.location, descending_date, self.predictor.tle)
raise PropagationError('Cannot find an ascending phase')
def _sample_points(self, date):
"""Helper method to found ascending or descending phases of elevation"""
start = date
end = date + self._orbit_step(0.99)
mid = self.midpoint(start, end)
mid_right = self.midpoint(mid, end)
mid_left = self.midpoint(start, mid)
return [end, mid, mid_right, mid_left]
def _refine_pass(self, ascending_date, descending_date):
tca = self._find_tca(ascending_date, descending_date)
elevation = self._elevation_at(tca)
if elevation > self.max_elevation_gt:
aos = self._find_aos(tca)
los = self._find_los(tca)
else:
aos = los = None
return AccuratePredictedPass(aos, tca, los, elevation)
def _find_tca(self, ascending_date, descending_date):
while not self._precision_reached(ascending_date, descending_date):
midpoint = self.midpoint(ascending_date, descending_date)
if self.is_ascending(midpoint):
ascending_date = midpoint
else:
descending_date = midpoint
return ascending_date
def _precision_reached(self, start, end):
# TODO: Allow the precision to change from the outside
return end - start <= ONE_SECOND
@staticmethod
def midpoint(start, end):
"""Returns the midpoint between two dates"""
return start + (end - start) / 2
def _elevation_at(self, when_utc):
position = self.predictor.get_only_position(when_utc)
return self.location.elevation_for(position)
def is_passing(self, when_utc):
"""Returns a boolean indicating if satellite is actually visible"""
return bool(self._elevation_at(when_utc))
def is_ascending(self, when_utc):
"""Check is elevation is ascending or descending on a given point"""
elevation = self._elevation_at(when_utc)
next_elevation = self._elevation_at(when_utc + ONE_SECOND)
return elevation <= next_elevation
def _orbit_step(self, size):
"""Returns a time step, that will make the satellite advance a given number of orbits"""
step_in_radians = size * 2 * pi
seconds = (step_in_radians / self.predictor.mean_motion) * 60
return dt.timedelta(seconds=seconds)
def _find_aos(self, tca):
end = tca
start = tca - self._orbit_step(0.34) # One third of the orbit
elevation = self._elevation_at(start)
assert elevation < 0
while not self._precision_reached(start, end):
midpoint = self.midpoint(start, end)
elevation = self._elevation_at(midpoint)
if elevation < self.aos_at:
start = midpoint
else:
end = midpoint
return end
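The same one-second bisection used by `_find_aos` and `_find_los` can be shown on a hypothetical, monotonically rising elevation function (the function and times below are illustrative):

```python
import datetime as dt

ONE_SECOND = dt.timedelta(seconds=1)

def find_crossing(elevation_at, start, end, threshold=0.0):
    """Bisect [start, end] down to one second, assuming elevation is below
    `threshold` at `start` and at or above it at `end` (as in _find_aos)."""
    while end - start > ONE_SECOND:
        midpoint = start + (end - start) / 2
        if elevation_at(midpoint) < threshold:
            start = midpoint
        else:
            end = midpoint
    return end

t0 = dt.datetime(2020, 1, 1, 12, 0, 0)
# Hypothetical elevation rising linearly through zero 600 s after t0.
elevation = lambda when: (when - t0).total_seconds() - 600.0
aos = find_crossing(elevation, t0, t0 + dt.timedelta(minutes=30))
# The returned time is at most one second past the true zero crossing.
assert dt.timedelta(0) <= aos - (t0 + dt.timedelta(seconds=600)) <= ONE_SECOND
```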
def _find_los(self, tca):
start = tca
end = tca + self._orbit_step(0.34)
while not self._precision_reached(start, end):
midpoint = self.midpoint(start, end)
elevation = self._elevation_at(midpoint)
if elevation < self.aos_at:
end = midpoint
else:
start = midpoint
return start
class AccuratePredictedPass:
def __init__(self, aos, tca, los, max_elevation):
self.aos = round_datetime(aos) if aos is not None else None
self.tca = round_datetime(tca)
self.los = round_datetime(los) if los is not None else None
self.max_elevation = max_elevation
@property
def valid(self):
return self.max_elevation > 0 and self.aos is not None and self.los is not None
@reify
def max_elevation_deg(self):
return degrees(self.max_elevation)
@reify
def duration(self):
return self.los - self.aos
| 38.105169 | 98 | 0.635449 | 2,626 | 21,377 | 4.925362 | 0.178218 | 0.029767 | 0.012757 | 0.011597 | 0.263801 | 0.201253 | 0.166615 | 0.118679 | 0.109943 | 0.105149 | 0 | 0.004975 | 0.29485 | 21,377 | 560 | 99 | 38.173214 | 0.852793 | 0.224821 | 0 | 0.211699 | 0 | 0 | 0.040015 | 0 | 0 | 0 | 0 | 0.001786 | 0.002786 | 1 | 0.128134 | false | 0.075209 | 0.038997 | 0.022284 | 0.289694 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
# === File: python/GafferUI/ScriptEditor.py (PaulDoessel/gaffer-play, BSD-3-Clause) ===
##########################################################################
#
# Copyright (c) 2011-2012, John Haddon. All rights reserved.
# Copyright (c) 2011-2013, Image Engine Design Inc. All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
# * Redistributions of source code must retain the above
# copyright notice, this list of conditions and the following
# disclaimer.
#
# * Redistributions in binary form must reproduce the above
# copyright notice, this list of conditions and the following
# disclaimer in the documentation and/or other materials provided with
# the distribution.
#
# * Neither the name of John Haddon nor the names of
# any other contributors to this software may be used to endorse or
# promote products derived from this software without specific prior
# written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS
# IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
# THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
# PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
# LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
# NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
#
##########################################################################
import ast
import sys
import traceback
import IECore
import Gaffer
import GafferUI
QtGui = GafferUI._qtImport( "QtGui" )
QtCore = GafferUI._qtImport( "QtCore" )
## \todo Custom right click menu with script load, save, execute file, undo, redo etc.
## \todo Standard way for users to customise all menus
## \todo Tab completion and popup help. rlcompleter module should be useful for tab completion. Completer( dict ) constructs a completer
# that works in a specific namespace.
class ScriptEditor( GafferUI.EditorWidget ) :
def __init__( self, scriptNode, **kw ) :
self.__splittable = GafferUI.SplitContainer()
GafferUI.EditorWidget.__init__( self, self.__splittable, scriptNode, **kw )
self.__outputWidget = GafferUI.MultiLineTextWidget(
editable = False,
wrapMode = GafferUI.MultiLineTextWidget.WrapMode.None,
role = GafferUI.MultiLineTextWidget.Role.Code,
)
self.__inputWidget = GafferUI.MultiLineTextWidget(
wrapMode = GafferUI.MultiLineTextWidget.WrapMode.None,
role = GafferUI.MultiLineTextWidget.Role.Code,
)
self.__splittable.append( self.__outputWidget )
self.__splittable.append( self.__inputWidget )
self.__inputWidgetActivatedConnection = self.__inputWidget.activatedSignal().connect( Gaffer.WeakMethod( self.__activated ) )
self.__inputWidgetDropTextConnection = self.__inputWidget.dropTextSignal().connect( Gaffer.WeakMethod( self.__dropText ) )
self.__executionDict = {
"IECore" : IECore,
"Gaffer" : Gaffer,
"GafferUI" : GafferUI,
"script" : scriptNode,
"parent" : scriptNode
}
def inputWidget( self ) :
return self.__inputWidget
def execute( self ) :
# decide what to execute
haveSelection = True
toExecute = self.__inputWidget.selectedText()
if not toExecute :
haveSelection = False
toExecute = self.__inputWidget.getText()
# parse it first. this lets us give better error formatting
# for syntax errors, and also figure out whether we can eval()
# and display the result or must exec() only.
try :
parsed = ast.parse( toExecute )
except SyntaxError, e :
self.__outputWidget.appendHTML( self.__syntaxErrorToHTML( e ) )
return
# execute it
self.__outputWidget.appendHTML( self.__codeToHTML( toExecute ) )
with Gaffer.OutputRedirection( stdOut = Gaffer.WeakMethod( self.__redirectOutput ), stdErr = Gaffer.WeakMethod( self.__redirectOutput ) ) :
with _MessageHandler( self.__outputWidget ) :
with Gaffer.UndoContext( self.scriptNode() ) :
with self.getContext() :
try :
if len( parsed.body ) == 1 and isinstance( parsed.body[0], ast.Expr ) :
result = eval( toExecute, self.__executionDict, self.__executionDict )
if result is not None :
self.__outputWidget.appendText( str( result ) )
else :
exec( toExecute, self.__executionDict, self.__executionDict )
if not haveSelection :
self.__inputWidget.setText( "" )
except Exception, e :
self.__outputWidget.appendHTML( self.__exceptionToHTML() )
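The eval-vs-exec decision above is the standard trick for REPL-style widgets: parse the input once with `ast`, and only `eval` (so the result can be echoed) when the whole input is a single expression. A minimal standalone sketch in Python 3 syntax (the class above is Python 2):

```python
import ast

def run_snippet(source, namespace):
    """Return the value of a single expression; otherwise just execute."""
    parsed = ast.parse(source)
    if len(parsed.body) == 1 and isinstance(parsed.body[0], ast.Expr):
        return eval(source, namespace, namespace)
    exec(source, namespace, namespace)
    return None

ns = {}
assert run_snippet("x = 2 + 2", ns) is None   # statement: exec, no value
assert run_snippet("x * 10", ns) == 40        # expression: eval returns 40
```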
def __repr__( self ) :
return "GafferUI.ScriptEditor( scriptNode )"
def __activated( self, widget ) :
self.execute()
return True
def __dropText( self, widget, dragData ) :
if isinstance( dragData, IECore.StringVectorData ) :
return repr( list( dragData ) )
elif isinstance( dragData, Gaffer.GraphComponent ) :
if self.scriptNode().isAncestorOf( dragData ) :
return "script['" + dragData.relativeName( self.scriptNode() ).replace( ".", "']['" ) + "']"
elif isinstance( dragData, Gaffer.Set ) :
if len( dragData ) == 1 :
return self.__dropText( widget, dragData[0] )
else :
return "[ " + ", ".join( [ self.__dropText( widget, d ) for d in dragData ] ) + " ]"
elif isinstance( dragData, IECore.Data ) and hasattr( dragData, "value" ) :
return repr( dragData.value )
return None
def __codeToHTML( self, code ) :
code = code.replace( "<", "&lt;" ).replace( ">", "&gt;" )
return "<pre>" + code + "</pre>"
def __syntaxErrorToHTML( self, syntaxError ) :
formatted = traceback.format_exception_only( SyntaxError, syntaxError )
lineNumber = formatted[0].rpartition( "," )[2].strip()
headingText = formatted[-1].replace( ":", " : " + lineNumber + " : ", 1 )
result = "<h1 class='ERROR'>%s</h1>" % headingText
result += "<br>" + self.__codeToHTML( "".join( formatted[1:-1] ) )
return result
def __exceptionToHTML( self ) :
t = traceback.extract_tb( sys.exc_info()[2] )
lineNumber = str( t[1][1] )
headingText = traceback.format_exception_only( *(sys.exc_info()[:2]) )[0].replace( ":", " : line " + lineNumber + " : ", 1 )
result = "<h1 class='ERROR'>%s</h1>" % headingText
if len( t ) > 2 :
result += "<br>" + self.__codeToHTML( "".join( traceback.format_list( t[2:] ) ) )
return result
def __redirectOutput( self, output ) :
if output != "\n" :
self.__outputWidget.appendText( output )
# update the gui so messages are output as they occur, rather than all getting queued
# up till the end.
QtGui.QApplication.instance().processEvents( QtCore.QEventLoop.ExcludeUserInputEvents )
GafferUI.EditorWidget.registerType( "ScriptEditor", ScriptEditor )
class _MessageHandler( IECore.MessageHandler ) :
def __init__( self, textWidget ) :
IECore.MessageHandler.__init__( self )
self.__textWidget = textWidget
def handle( self, level, context, message ) :
html = "<h1 class='%s'>%s : %s </h1><span class='message'>%s</span><br>" % (
IECore.Msg.levelAsString( level ),
IECore.Msg.levelAsString( level ),
context,
message.replace( "\n", "<br>" )
)
self.__textWidget.appendHTML( html )
# update the gui so messages are output as they occur, rather than all getting queued
# up till the end.
QtGui.QApplication.instance().processEvents( QtCore.QEventLoop.ExcludeUserInputEvents )
e81446098f632747c9bb69739c5cdbc90d7e2461 | 946 | py | Python | superset/superset_config.py | panchohumeres/dynamo-covid | cf473be3eeca436efccd8891a61b721192cf6d34 | [
"MIT"
] | 4 | 2020-08-10T07:35:10.000Z | 2022-03-31T23:03:32.000Z | superset/superset_config.py | panchohumeres/superJupyter | cc0ce98da2c58a7ab0ae502b5b07250db0c78b89 | [
"MIT"
] | 1 | 2021-06-02T02:51:12.000Z | 2021-06-02T02:51:12.000Z | superset/superset_config.py | panchohumeres/dynamo-covid | cf473be3eeca436efccd8891a61b721192cf6d34 | [
"MIT"
] | 1 | 2022-02-08T01:46:19.000Z | 2022-02-08T01:46:19.000Z | import os
SERVER_NAME = os.getenv('DOMAIN_SUPERSET')
PUBLIC_ROLE_LIKE_GAMMA = True
SESSION_COOKIE_SAMESITE = None # One of [None, 'Lax', 'Strict']
SESSION_COOKIE_HTTPONLY = False
MAPBOX_API_KEY = os.getenv('MAPBOX_API_KEY', '')
POSTGRES_DB=os.getenv('POSTGRES_DB')
POSTGRES_PASSWORD=os.getenv('POSTGRES_PASSWORD')
POSTGRES_USER=os.getenv('POSTGRES_USER')
POSTGRES_PORT=str(os.getenv('POSTGRES_PORT'))
HTTP_HEADERS = {'X-Frame-Options': 'ALLOWALL'}
sql_alchemy_string='postgresql+psycopg2://'+POSTGRES_USER+':'+POSTGRES_PASSWORD+'@postgres:'+POSTGRES_PORT+'/'+POSTGRES_DB
CACHE_CONFIG = {
'CACHE_TYPE': 'redis',
'CACHE_DEFAULT_TIMEOUT': 300,
'CACHE_KEY_PREFIX': 'superset_',
'CACHE_REDIS_HOST': 'redis',
'CACHE_REDIS_PORT': 6379,
'CACHE_REDIS_DB': 1,
'CACHE_REDIS_URL': 'redis://redis:6379/1'}
SQLALCHEMY_DATABASE_URI = \
sql_alchemy_string
SQLALCHEMY_TRACK_MODIFICATIONS = True
SECRET_KEY = 'thisISaSECRET_1234'
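The connection string above is plain string concatenation over environment variables; if any of them is unset, `os.getenv` returns `None` and the literal text `'None'` leaks into the URI. A defensive sketch with illustrative defaults (the default values are assumptions, not Superset's):

```python
import os

def postgres_uri(env=None):
    env = os.environ if env is None else env
    user = env.get('POSTGRES_USER', 'superset')
    password = env.get('POSTGRES_PASSWORD', '')
    port = env.get('POSTGRES_PORT', '5432')
    db = env.get('POSTGRES_DB', 'superset')
    return 'postgresql+psycopg2://{}:{}@postgres:{}/{}'.format(user, password, port, db)

assert postgres_uri({'POSTGRES_USER': 'u', 'POSTGRES_PASSWORD': 'p',
                     'POSTGRES_PORT': '5432', 'POSTGRES_DB': 'd'}) == \
    'postgresql+psycopg2://u:p@postgres:5432/d'
```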
# === File: dataclassses_howto.py (CvanderStoep/VideosSampleCode, MIT) ===
import dataclasses
import inspect
from dataclasses import dataclass, field
from pprint import pprint
import attr
class ManualComment:
def __init__(self, id: int, text: str):
self.id: int = id
self.text: str = text
def __repr__(self):
return "{}(id={}, text={})".format(self.__class__.__name__, self.id, self.text)
def __eq__(self, other):
if other.__class__ is self.__class__:
return (self.id, self.text) == (other.id, other.text)
else:
return NotImplemented
def __ne__(self, other):
result = self.__eq__(other)
if result is NotImplemented:
return NotImplemented
else:
return not result
def __hash__(self):
return hash((self.__class__, self.id, self.text))
def __lt__(self, other):
if other.__class__ is self.__class__:
return (self.id, self.text) < (other.id, other.text)
else:
return NotImplemented
def __le__(self, other):
if other.__class__ is self.__class__:
return (self.id, self.text) <= (other.id, other.text)
else:
return NotImplemented
def __gt__(self, other):
if other.__class__ is self.__class__:
return (self.id, self.text) > (other.id, other.text)
else:
return NotImplemented
def __ge__(self, other):
if other.__class__ is self.__class__:
return (self.id, self.text) >= (other.id, other.text)
else:
return NotImplemented
@dataclass(frozen=True, order=True)
class Comment:
id: int
text: str = ""
replies: list[int] = field(default_factory=list, repr=False, compare=False)
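The `field(default_factory=list, repr=False, compare=False)` declaration above is what keeps the mutable default safe (each instance gets its own list) and keeps it out of equality checks and the repr. A quick demonstration (the class name here is illustrative):

```python
from dataclasses import dataclass, field, replace

@dataclass(frozen=True, order=True)
class Note:
    id: int
    text: str = ""
    replies: list = field(default_factory=list, repr=False, compare=False)

a = Note(1, "hi")
b = Note(1, "hi", replies=[99])
assert a == b                      # replies excluded from comparison
assert "replies" not in repr(a)    # ...and from the repr
assert replace(a, id=2).id == 2    # the frozen-safe way to "modify"
```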
@attr.s(frozen=True, order=True, slots=True)
class AttrComment:
id: int = 0
text: str = ""
def main():
comment = Comment(1, "I just subscribed!")
# comment.id = 3  # raises FrozenInstanceError: the dataclass is frozen (immutable)
print(comment)
print(dataclasses.astuple(comment))
print(dataclasses.asdict(comment))
copy = dataclasses.replace(comment, id=3)
print(copy)
pprint(inspect.getmembers(Comment, inspect.isfunction))
if __name__ == '__main__':
main()
# === File: corehq/apps/domain/views.py (johan--/commcare-hq, BSD-3-Clause) ===
import copy
import datetime
from decimal import Decimal
import logging
import uuid
import json
import cStringIO
from couchdbkit import ResourceNotFound
import dateutil
from django.core.paginator import Paginator
from django.views.generic import View
from django.db.models import Sum
from django.conf import settings
from django.template.loader import render_to_string
from django.utils.decorators import method_decorator
from django.utils.safestring import mark_safe
from django.core.urlresolvers import reverse
from django.http import HttpResponseRedirect, HttpResponse, Http404
from django.shortcuts import redirect, render
from django.contrib import messages
from django.views.decorators.http import require_POST
from PIL import Image
from django.utils.translation import ugettext as _, ugettext_noop, ugettext_lazy
from corehq.const import USER_DATE_FORMAT
from custom.dhis2.forms import Dhis2SettingsForm
from custom.dhis2.models import Dhis2Settings
from casexml.apps.case.mock import CaseBlock
from casexml.apps.case.xml import V2
from corehq.apps.accounting.async_handlers import Select2BillingInfoHandler
from corehq.apps.accounting.invoicing import DomainWireInvoiceFactory
from corehq.apps.accounting.decorators import (
requires_privilege_with_fallback,
)
from corehq.apps.hqwebapp.tasks import send_mail_async
from corehq.apps.accounting.exceptions import (
NewSubscriptionError,
PaymentRequestError,
)
from corehq.apps.accounting.payment_handlers import (
BulkStripePaymentHandler,
CreditStripePaymentHandler,
InvoiceStripePaymentHandler,
)
from corehq.apps.accounting.subscription_changes import DomainDowngradeStatusHandler
from corehq.apps.accounting.forms import EnterprisePlanContactForm
from corehq.apps.accounting.utils import (
get_change_status, get_privileges, fmt_dollar_amount,
quantize_accounting_decimal, get_customer_cards,
)
from corehq.apps.hqwebapp.async_handler import AsyncHandlerMixin
from corehq.apps.smsbillables.async_handlers import SMSRatesAsyncHandler, SMSRatesSelect2AsyncHandler
from corehq.apps.smsbillables.forms import SMSRateCalculatorForm
from corehq.apps.users.models import DomainInvitation
from corehq.apps.fixtures.models import FixtureDataType
from corehq.toggles import NAMESPACE_DOMAIN, all_toggles, CAN_EDIT_EULA, TRANSFER_DOMAIN
from corehq.util.context_processors import get_domain_type
from dimagi.utils.couch.resource_conflict import retry_resource
from corehq import privileges, feature_previews
from django_prbac.utils import has_privilege
from corehq.apps.accounting.models import (
Subscription, CreditLine, SoftwareProductType, SubscriptionType,
DefaultProductPlan, SoftwarePlanEdition, BillingAccount,
BillingAccountType,
Invoice, BillingRecord, InvoicePdf, PaymentMethodType,
PaymentMethod, EntryPoint, WireInvoice, SoftwarePlanVisibility, FeatureType,
StripePaymentMethod,
)
from corehq.apps.accounting.usage import FeatureUsageCalculator
from corehq.apps.accounting.user_text import (
get_feature_name,
PricingTable,
DESC_BY_EDITION,
get_feature_recurring_interval,
)
from corehq.apps.hqwebapp.models import ProjectSettingsTab
from corehq.apps import receiverwrapper
from corehq.apps.domain.calculations import CALCS, CALC_FNS, CALC_ORDER, dom_calc
from corehq.apps.domain.decorators import (
domain_admin_required, login_required, require_superuser, login_and_domain_required
)
from corehq.apps.domain.forms import (
DomainGlobalSettingsForm, DomainMetadataForm, SnapshotSettingsForm,
SnapshotApplicationForm, DomainInternalForm, PrivacySecurityForm,
ConfirmNewSubscriptionForm, ProBonoForm, EditBillingAccountInfoForm,
ConfirmSubscriptionRenewalForm, SnapshotFixtureForm, TransferDomainForm,
SelectSubscriptionTypeForm, INTERNAL_SUBSCRIPTION_MANAGEMENT_FORMS)
from corehq.apps.domain.models import Domain, LICENSES, TransferDomainRequest
from corehq.apps.domain.utils import normalize_domain_name
from corehq.apps.hqwebapp.views import BaseSectionPageView, BasePageView, CRUDPaginatedViewMixin
from corehq.apps.orgs.models import Organization, OrgRequest, Team
from corehq.apps.domain.forms import ProjectSettingsForm
from dimagi.utils.decorators.memoized import memoized
from dimagi.utils.web import get_ip, json_response, get_site_domain
from corehq.apps.users.decorators import require_can_edit_web_users
from corehq.apps.receiverwrapper.forms import GenericRepeaterForm, FormRepeaterForm
from corehq.apps.receiverwrapper.models import FormRepeater, CaseRepeater, ShortFormRepeater, AppStructureRepeater, \
RepeatRecord
from dimagi.utils.post import simple_post
from toggle.models import Toggle
from corehq.apps.hqwebapp.tasks import send_html_email_async
accounting_logger = logging.getLogger('accounting')
PAYMENT_ERROR_MESSAGES = {
400: ugettext_lazy('Your request was not formatted properly.'),
403: ugettext_lazy('Forbidden.'),
404: ugettext_lazy('Page not found.'),
500: ugettext_lazy("There was an error processing your request."
" We're working quickly to fix the issue. Please try again shortly."),
}
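# Illustrative use of the mapping above (hypothetical `status_code` from a
# failed payment request; unknown codes fall back to the generic 500 message):
#
#     message = PAYMENT_ERROR_MESSAGES.get(status_code,
#                                          PAYMENT_ERROR_MESSAGES[500])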
# Domain is not required here - we could be selecting it for the first time.
# See the notes in domain/decorators.py about why we need this custom
# login_required decorator.
@login_required
def select(request, domain_select_template='domain/select.html', do_not_redirect=False):
domains_for_user = Domain.active_for_user(request.user)
if not domains_for_user:
return redirect('registration_domain', domain_type=get_domain_type(None, request))
email = request.couch_user.get_email()
open_invitations = [e for e in DomainInvitation.by_email(email) if not e.is_expired]
additional_context = {
'domains_for_user': domains_for_user,
'open_invitations': open_invitations,
}
last_visited_domain = request.session.get('last_visited_domain')
if open_invitations \
or do_not_redirect \
or not last_visited_domain:
return render(request, domain_select_template, additional_context)
else:
domain = Domain.get_by_name(last_visited_domain)
if domain and domain.is_active:
# mirrors logic in login_and_domain_required
if (
request.couch_user.is_member_of(domain) or domain.is_public
or (request.user.is_superuser and not domain.restrict_superusers)
or domain.is_snapshot
):
try:
from corehq.apps.dashboard.views import dashboard_default
return dashboard_default(request, last_visited_domain)
except Http404:
pass
del request.session['last_visited_domain']
return render(request, domain_select_template, additional_context)
@require_superuser
def incomplete_email(request,
incomplete_email_template='domain/incomplete_email.html'):
from corehq.apps.domain.tasks import (
incomplete_self_started_domains,
incomplete_domains_to_email
)
context = {
'self_started': incomplete_self_started_domains,
'dimagi_owned': incomplete_domains_to_email,
}
return render(request, incomplete_email_template, context)
class DomainViewMixin(object):
"""
Paving the way for a world of entirely class-based views.
Let's do this, guys. :-)
Set strict_domain_fetching to True in subclasses to bypass the cache.
"""
strict_domain_fetching = False
@property
@memoized
def domain(self):
domain = self.args[0] if len(self.args) > 0 else self.kwargs.get('domain', "")
return normalize_domain_name(domain)
@property
@memoized
def domain_object(self):
domain = Domain.get_by_name(self.domain, strict=self.strict_domain_fetching)
if not domain:
raise Http404()
return domain
class LoginAndDomainMixin(object):
@method_decorator(login_and_domain_required)
def dispatch(self, *args, **kwargs):
return super(LoginAndDomainMixin, self).dispatch(*args, **kwargs)
class SubscriptionUpgradeRequiredView(LoginAndDomainMixin, BasePageView,
DomainViewMixin):
page_title = ugettext_lazy("Upgrade Required")
template_name = "domain/insufficient_privilege_notification.html"
@property
def page_url(self):
        return self.request.get_full_path()
@property
def page_name(self):
return _("Sorry, you do not have access to %(feature_name)s") % {
'feature_name': self.feature_name,
}
@property
def is_domain_admin(self):
if not hasattr(self.request, 'couch_user'):
return False
return self.request.couch_user.is_domain_admin(self.domain)
@property
def page_context(self):
return {
'domain': self.domain,
'feature_name': self.feature_name,
'plan_name': self.required_plan_name,
'change_subscription_url': reverse(SelectPlanView.urlname,
args=[self.domain]),
'is_domain_admin': self.is_domain_admin,
}
@property
def missing_privilege(self):
return self.args[1]
@property
def feature_name(self):
return privileges.Titles.get_name_from_privilege(self.missing_privilege)
@property
def required_plan_name(self):
return DefaultProductPlan.get_lowest_edition_by_domain(
self.domain_object, [self.missing_privilege]
)
def get(self, request, *args, **kwargs):
self.request = request
self.args = args
return super(SubscriptionUpgradeRequiredView, self).get(
request, *args, **kwargs
)
class BaseDomainView(LoginAndDomainMixin, BaseSectionPageView, DomainViewMixin):
@property
def main_context(self):
main_context = super(BaseDomainView, self).main_context
main_context.update({
'domain': self.domain,
})
return main_context
@property
@memoized
def page_url(self):
if self.urlname:
return reverse(self.urlname, args=[self.domain])
class BaseProjectSettingsView(BaseDomainView):
section_name = ugettext_lazy("Project Settings")
template_name = "settings/base_template.html"
@property
def main_context(self):
main_context = super(BaseProjectSettingsView, self).main_context
main_context.update({
'active_tab': ProjectSettingsTab(
self.request,
self.urlname,
domain=self.domain,
couch_user=self.request.couch_user,
project=self.request.project
),
'is_project_settings': True,
})
return main_context
@property
@memoized
def section_url(self):
return reverse(EditMyProjectSettingsView.urlname, args=[self.domain])
class DefaultProjectSettingsView(BaseDomainView):
urlname = 'domain_settings_default'
def get(self, request, *args, **kwargs):
if request.couch_user.is_domain_admin(self.domain):
return HttpResponseRedirect(reverse(EditBasicProjectInfoView.urlname, args=[self.domain]))
return HttpResponseRedirect(reverse(EditMyProjectSettingsView.urlname, args=[self.domain]))
class BaseAdminProjectSettingsView(BaseProjectSettingsView):
"""
The base class for all project settings views that require administrative
access.
"""
@method_decorator(domain_admin_required)
def dispatch(self, request, *args, **kwargs):
return super(BaseProjectSettingsView, self).dispatch(request, *args, **kwargs)
class BaseEditProjectInfoView(BaseAdminProjectSettingsView):
"""
The base class for all the edit project information views.
"""
strict_domain_fetching = True
@property
def autocomplete_fields(self):
return []
@property
def main_context(self):
context = super(BaseEditProjectInfoView, self).main_context
context.update({
'autocomplete_fields': self.autocomplete_fields,
'commtrack_enabled': self.domain_object.commtrack_enabled,
            # Ideally the template would get access to the domain doc through
            # some other means; otherwise it has to be supplied to every view
            # reachable from that sidebar (every view whose template extends
            # users_base.html). This area is slated for refactoring.
'call_center_enabled': self.domain_object.call_center_config.enabled,
'cloudcare_releases': self.domain_object.cloudcare_releases,
})
return context
class EditBasicProjectInfoView(BaseEditProjectInfoView):
template_name = 'domain/admin/info_basic.html'
urlname = 'domain_basic_info'
page_title = ugettext_lazy("Basic")
@property
def can_user_see_meta(self):
return self.request.couch_user.is_previewer()
@property
def can_use_custom_logo(self):
return has_privilege(self.request, privileges.CUSTOM_BRANDING)
@property
@memoized
def basic_info_form(self):
initial = {
'hr_name': self.domain_object.hr_name or self.domain_object.name,
'default_timezone': self.domain_object.default_timezone,
'case_sharing': json.dumps(self.domain_object.case_sharing),
'call_center_enabled': self.domain_object.call_center_config.enabled,
'call_center_type': self.initial_call_center_type,
'call_center_case_owner': self.initial_call_center_case_owner,
'call_center_case_type': self.domain_object.call_center_config.case_type,
'commtrack_enabled': self.domain_object.commtrack_enabled,
}
if self.request.method == 'POST':
if self.can_user_see_meta:
return DomainMetadataForm(
self.request.POST,
self.request.FILES,
user=self.request.couch_user,
domain=self.domain_object.name,
can_use_custom_logo=self.can_use_custom_logo,
)
return DomainGlobalSettingsForm(
self.request.POST,
self.request.FILES,
domain=self.domain_object.name,
can_use_custom_logo=self.can_use_custom_logo
)
if self.can_user_see_meta:
initial.update({
'is_test': self.domain_object.is_test,
'cloudcare_releases': self.domain_object.cloudcare_releases,
})
return DomainMetadataForm(
can_use_custom_logo=self.can_use_custom_logo,
user=self.request.couch_user,
domain=self.domain_object.name,
initial=initial
)
return DomainGlobalSettingsForm(
initial=initial,
domain=self.domain_object.name,
can_use_custom_logo=self.can_use_custom_logo
)
@property
@memoized
def initial_call_center_case_owner(self):
config = self.domain_object.call_center_config
if config.use_user_location_as_owner:
return DomainGlobalSettingsForm.USE_LOCATIONS_CHOICE
return self.domain_object.call_center_config.case_owner_id
@property
@memoized
def initial_call_center_type(self):
if self.domain_object.call_center_config.use_fixtures:
return DomainGlobalSettingsForm.CASES_AND_FIXTURES_CHOICE
return DomainGlobalSettingsForm.CASES_ONLY_CHOICE
@property
def page_context(self):
return {
'basic_info_form': self.basic_info_form,
}
def post(self, request, *args, **kwargs):
if self.basic_info_form.is_valid():
if self.basic_info_form.save(request, self.domain_object):
messages.success(request, _("Project settings saved!"))
else:
messages.error(request, _("There seems to have been an error saving your settings. Please try again!"))
return self.get(request, *args, **kwargs)
class EditMyProjectSettingsView(BaseProjectSettingsView):
template_name = 'domain/admin/my_project_settings.html'
urlname = 'my_project_settings'
page_title = ugettext_lazy("My Timezone")
@property
@memoized
def my_project_settings_form(self):
        initial = {'global_timezone': self.domain_object.default_timezone}
if self.domain_membership:
initial.update({
'override_global_tz': self.domain_membership.override_global_tz,
'user_timezone': (self.domain_membership.timezone if self.domain_membership.override_global_tz
else self.domain_object.default_timezone),
})
else:
initial.update({
'override_global_tz': False,
'user_timezone': initial["global_timezone"],
})
if self.request.method == 'POST':
return ProjectSettingsForm(self.request.POST, initial=initial)
return ProjectSettingsForm(initial=initial)
@property
@memoized
def domain_membership(self):
return self.request.couch_user.get_domain_membership(self.domain)
@property
def page_context(self):
return {
'my_project_settings_form': self.my_project_settings_form,
'override_global_tz': self.domain_membership.override_global_tz if self.domain_membership else False,
'no_domain_membership': not self.domain_membership,
}
def post(self, request, *args, **kwargs):
if self.my_project_settings_form.is_valid():
self.my_project_settings_form.save(self.request.couch_user, self.domain)
messages.success(request, _("Your project settings have been saved!"))
return self.get(request, *args, **kwargs)
class EditDhis2SettingsView(BaseProjectSettingsView):
template_name = 'domain/admin/dhis2_settings.html'
urlname = 'dhis2_settings'
page_title = ugettext_lazy("DHIS2 API settings")
@property
@memoized
def dhis2_settings_form(self):
settings_ = Dhis2Settings.for_domain(self.domain_object.name)
initial = settings_.dhis2 if settings_ else {'enabled': False}
if self.request.method == 'POST':
return Dhis2SettingsForm(self.request.POST, initial=initial)
return Dhis2SettingsForm(initial=initial)
@property
def page_context(self):
return {
'dhis2_settings_form': self.dhis2_settings_form,
}
def post(self, request, *args, **kwargs):
if self.dhis2_settings_form.is_valid():
if self.dhis2_settings_form.save(self.domain_object):
messages.success(request, _('DHIS2 API settings successfully updated'))
else:
messages.error(request, _('There seems to have been an error. Please try again.'))
return self.get(request, *args, **kwargs)
@require_POST
@require_can_edit_web_users
def drop_repeater(request, domain, repeater_id):
rep = FormRepeater.get(repeater_id)
rep.retire()
messages.success(request, "Form forwarding stopped!")
return HttpResponseRedirect(reverse(DomainForwardingOptionsView.urlname, args=[domain]))
@require_POST
@require_can_edit_web_users
def test_repeater(request, domain):
url = request.POST["url"]
repeater_type = request.POST['repeater_type']
format = request.POST['format']
form = GenericRepeaterForm(
{"url": url, "format": format},
domain=domain,
repeater_class=receiverwrapper.models.repeater_types[repeater_type]
)
if form.is_valid():
url = form.cleaned_data["url"]
# now we fake a post
def _stub(repeater_type):
if 'case' in repeater_type.lower():
return CaseBlock(
case_id='test-case-%s' % uuid.uuid4().hex,
create=True,
case_type='test',
case_name='test case',
).as_string()
else:
return "<?xml version='1.0' ?><data id='test'><TestString>Test post from CommCareHQ on %s</TestString></data>" % \
(datetime.datetime.utcnow())
fake_post = _stub(repeater_type)
try:
resp = simple_post(fake_post, url)
if 200 <= resp.status < 300:
return HttpResponse(json.dumps({"success": True,
"response": resp.read(),
"status": resp.status}))
else:
return HttpResponse(json.dumps({"success": False,
"response": resp.read(),
"status": resp.status}))
        except Exception as e:
            errors = str(e)
return HttpResponse(json.dumps({"success": False, "response": errors}))
else:
return HttpResponse(json.dumps({"success": False, "response": "Please enter a valid url."}))
def autocomplete_fields(request, field):
prefix = request.GET.get('prefix', '')
results = Domain.field_by_prefix(field, prefix)
return HttpResponse(json.dumps(results))
def logo(request, domain):
logo = Domain.get_by_name(domain).get_custom_logo()
if logo is None:
raise Http404()
return HttpResponse(logo[0], content_type=logo[1])
class DomainAccountingSettings(BaseAdminProjectSettingsView):
@method_decorator(login_and_domain_required)
def dispatch(self, request, *args, **kwargs):
return super(DomainAccountingSettings, self).dispatch(request, *args, **kwargs)
@property
@memoized
def product(self):
return SoftwareProductType.get_type_by_domain(self.domain_object)
@property
@memoized
def account(self):
return BillingAccount.get_account_by_domain(self.domain)
@property
def current_subscription(self):
return Subscription.get_subscribed_plan_by_domain(self.domain_object)[1]
class DomainSubscriptionView(DomainAccountingSettings):
urlname = 'domain_subscription_view'
template_name = 'domain/current_subscription.html'
page_title = ugettext_lazy("Current Subscription")
@property
def can_purchase_credits(self):
return self.request.couch_user.is_domain_admin(self.domain)
@property
def plan(self):
plan_version, subscription = Subscription.get_subscribed_plan_by_domain(self.domain_object)
date_end = None
next_subscription = {
'exists': False,
'can_renew': False,
'name': None,
'price': None,
}
cards = None
general_credits = None
if subscription:
cards = get_customer_cards(self.account, self.request.user.username, self.domain)
date_end = (subscription.date_end.strftime(USER_DATE_FORMAT)
if subscription.date_end is not None else "--")
if subscription.date_end is not None:
if subscription.is_renewed:
next_product = self.get_product_summary(subscription.next_subscription.plan_version,
self.account,
subscription)
next_subscription.update({
'exists': True,
'date_start': subscription.next_subscription.date_start.strftime(USER_DATE_FORMAT),
'name': subscription.next_subscription.plan_version.plan.name,
'price': next_product['monthly_fee'],
})
else:
days_left = (subscription.date_end - datetime.date.today()).days
next_subscription.update({
'can_renew': days_left <= 30,
'renew_url': reverse(SubscriptionRenewalView.urlname, args=[self.domain]),
})
general_credits = CreditLine.get_credits_by_subscription_and_features(subscription)
elif self.account is not None:
general_credits = CreditLine.get_credits_for_account(self.account)
if general_credits:
general_credits = self._fmt_credit(self._credit_grand_total(general_credits))
info = {
'products': [self.get_product_summary(plan_version, self.account, subscription)],
'features': self.get_feature_summary(plan_version, self.account, subscription),
'general_credit': general_credits,
'css_class': "label-plan %s" % plan_version.plan.edition.lower(),
'do_not_invoice': subscription.do_not_invoice if subscription is not None else False,
'is_trial': subscription.is_trial if subscription is not None else False,
'date_start': (subscription.date_start.strftime(USER_DATE_FORMAT)
if subscription is not None else None),
'date_end': date_end,
'cards': cards,
'next_subscription': next_subscription,
}
info.update(plan_version.user_facing_description)
return info
def _fmt_credit(self, credit_amount=None):
if credit_amount is None:
return {
'amount': "--",
}
return {
'amount': fmt_dollar_amount(credit_amount),
'is_visible': credit_amount != Decimal('0.0'),
}
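    # Illustrative return shapes of _fmt_credit (the exact amount string comes
    # from fmt_dollar_amount, so the second example is approximate):
    #
    #     self._fmt_credit()                  -> {'amount': "--"}
    #     self._fmt_credit(Decimal('10.00'))  -> {'amount': "USD 10.00",
    #                                             'is_visible': True}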
def _credit_grand_total(self, credit_lines):
return sum([c.balance for c in credit_lines]) if credit_lines else Decimal('0.00')
def get_product_summary(self, plan_version, account, subscription):
product_rates = plan_version.product_rates.all()
if len(product_rates) > 1:
# Models and UI are both written to support multiple products,
# but for now, each subscription can only have one product.
            accounting_logger.error(
                "[BILLING] "
                "Multiple product rates were found on the plan version for "
                "subscriber %s, but each subscription currently supports only "
                "one product. The first rate was used; consider this an issue."
                % self.account
            )
product_rate = product_rates[0]
product_info = {
'name': product_rate.product.product_type,
'monthly_fee': _("USD %s /month") % product_rate.monthly_fee,
'credit': None,
'type': product_rate.product.product_type,
}
credit_lines = None
if subscription is not None:
credit_lines = CreditLine.get_credits_by_subscription_and_features(
subscription, product_type=product_rate.product.product_type
)
elif account is not None:
credit_lines = CreditLine.get_credits_for_account(
account, product_type=product_rate.product.product_type
)
if credit_lines:
product_info['credit'] = self._fmt_credit(self._credit_grand_total(credit_lines))
return product_info
def get_feature_summary(self, plan_version, account, subscription):
feature_summary = []
for feature_rate in plan_version.feature_rates.all():
usage = FeatureUsageCalculator(feature_rate, self.domain).get_usage()
feature_info = {
'name': get_feature_name(feature_rate.feature.feature_type, self.product),
'usage': usage,
'remaining': (
feature_rate.monthly_limit - usage
if feature_rate.monthly_limit != -1
else _('Unlimited')
),
'credit': self._fmt_credit(),
'type': feature_rate.feature.feature_type,
'recurring_interval': get_feature_recurring_interval(feature_rate.feature.feature_type),
}
credit_lines = None
if subscription is not None:
credit_lines = CreditLine.get_credits_by_subscription_and_features(
subscription, feature_type=feature_rate.feature.feature_type
)
elif account is not None:
credit_lines = CreditLine.get_credits_for_account(
account, feature_type=feature_rate.feature.feature_type)
if credit_lines:
feature_info['credit'] = self._fmt_credit(self._credit_grand_total(credit_lines))
feature_summary.append(feature_info)
return feature_summary
@property
def page_context(self):
return {
'plan': self.plan,
'change_plan_url': reverse(SelectPlanView.urlname, args=[self.domain]),
'can_purchase_credits': self.can_purchase_credits,
'credit_card_url': reverse(CreditsStripePaymentView.urlname, args=[self.domain]),
'wire_url': reverse(CreditsWireInvoiceView.urlname, args=[self.domain]),
'stripe_public_key': settings.STRIPE_PUBLIC_KEY,
'payment_error_messages': PAYMENT_ERROR_MESSAGES,
'sms_rate_calc_url': reverse(SMSRatesView.urlname,
args=[self.domain]),
'user_email': self.request.couch_user.username,
}
class EditExistingBillingAccountView(DomainAccountingSettings, AsyncHandlerMixin):
template_name = 'domain/update_billing_contact_info.html'
urlname = 'domain_update_billing_info'
page_title = ugettext_lazy("Billing Information")
async_handlers = [
Select2BillingInfoHandler,
]
@property
@memoized
def billing_info_form(self):
if self.request.method == 'POST':
return EditBillingAccountInfoForm(
self.account, self.domain, self.request.couch_user.username, data=self.request.POST
)
return EditBillingAccountInfoForm(self.account, self.domain, self.request.couch_user.username)
def dispatch(self, request, *args, **kwargs):
if self.account is None:
raise Http404()
return super(EditExistingBillingAccountView, self).dispatch(request, *args, **kwargs)
@property
def page_context(self):
return {
'billing_account_info_form': self.billing_info_form,
'cards': self._get_cards(),
'stripe_public_key': settings.STRIPE_PUBLIC_KEY,
'card_base_url': reverse(CardsView.url_name, args=[self.domain]),
}
def _get_cards(self):
user = self.request.user.username
        payment_method, _created = StripePaymentMethod.objects.get_or_create(
web_user=user,
method_type=PaymentMethodType.STRIPE,
)
return payment_method.all_cards_serialized(self.account)
def post(self, request, *args, **kwargs):
if self.async_response is not None:
return self.async_response
if self.billing_info_form.is_valid():
is_saved = self.billing_info_form.save()
if not is_saved:
messages.error(
request, _("It appears that there was an issue updating your contact information. "
"We've been notified of the issue. Please try submitting again, and if the problem "
"persists, please try in a few hours."))
else:
messages.success(
request, _("Billing contact information was successfully updated.")
)
return HttpResponseRedirect(reverse(EditExistingBillingAccountView.urlname, args=[self.domain]))
return self.get(request, *args, **kwargs)
class DomainBillingStatementsView(DomainAccountingSettings, CRUDPaginatedViewMixin):
template_name = 'domain/billing_statements.html'
urlname = 'domain_billing_statements'
page_title = ugettext_lazy("Billing Statements")
limit_text = ugettext_lazy("statements per page")
empty_notification = ugettext_lazy("No Billing Statements match the current criteria.")
loading_message = ugettext_lazy("Loading statements...")
@property
def parameters(self):
return self.request.POST if self.request.method == 'POST' else self.request.GET
@property
def stripe_cards(self):
return get_customer_cards(self.account, self.request.user.username, self.domain)
@property
def show_hidden(self):
if not self.request.user.is_superuser:
return False
return bool(self.request.POST.get('additionalData[show_hidden]'))
@property
def show_unpaid(self):
try:
return json.loads(self.request.POST.get('additionalData[show_unpaid]'))
except TypeError:
return False
@property
def invoices(self):
invoices = Invoice.objects.filter(subscription__subscriber__domain=self.domain)
if not self.show_hidden:
invoices = invoices.filter(is_hidden=False)
if self.show_unpaid:
invoices = invoices.filter(date_paid__exact=None)
return invoices.order_by('-date_start', '-date_end')
@property
def total(self):
return self.paginated_invoices.count
@property
@memoized
def paginated_invoices(self):
return Paginator(self.invoices, self.limit)
@property
def total_balance(self):
"""
Returns the total balance of unpaid, unhidden invoices.
Doesn't take into account the view settings on the page.
"""
invoices = (Invoice.objects
.filter(subscription__subscriber__domain=self.domain)
.filter(date_paid__exact=None)
.filter(is_hidden=False))
return invoices.aggregate(
total_balance=Sum('balance')
).get('total_balance') or 0.00
@property
def column_names(self):
return [
_("Statement No."),
_("Plan"),
_("Billing Period"),
_("Date Due"),
_("Payment Status"),
_("PDF"),
]
@property
def page_context(self):
pagination_context = self.pagination_context
pagination_context.update({
'stripe_public_key': settings.STRIPE_PUBLIC_KEY,
'payment_error_messages': PAYMENT_ERROR_MESSAGES,
'process_invoice_payment_url': reverse(
InvoiceStripePaymentView.urlname,
args=[self.domain],
),
'process_bulk_payment_url': reverse(
BulkStripePaymentView.urlname,
args=[self.domain],
),
'process_wire_invoice_url': reverse(
WireInvoiceView.urlname,
args=[self.domain],
),
'stripe_cards': self.stripe_cards,
'total_balance': self.total_balance,
})
return pagination_context
@property
def can_pay_invoices(self):
return self.request.couch_user.is_domain_admin(self.domain)
@property
def paginated_list(self):
for invoice in self.paginated_invoices.page(self.page).object_list:
try:
last_billing_record = BillingRecord.objects.filter(
invoice=invoice
).latest('date_created')
if invoice.is_paid:
payment_status = (_("Paid on %s.")
% invoice.date_paid.strftime(USER_DATE_FORMAT))
payment_class = "label label-inverse"
else:
payment_status = _("Not Paid")
payment_class = "label label-important"
date_due = (
(invoice.date_due.strftime(USER_DATE_FORMAT)
if not invoice.is_paid else _("Already Paid"))
if invoice.date_due else _("None")
)
yield {
'itemData': {
'id': invoice.id,
'invoice_number': invoice.invoice_number,
'start': invoice.date_start.strftime(USER_DATE_FORMAT),
'end': invoice.date_end.strftime(USER_DATE_FORMAT),
'plan': invoice.subscription.plan_version.user_facing_description,
'payment_status': payment_status,
'payment_class': payment_class,
'date_due': date_due,
'pdfUrl': reverse(
BillingStatementPdfView.urlname,
args=[self.domain, last_billing_record.pdf_data_id]
),
'canMakePayment': (not invoice.is_paid
and self.can_pay_invoices),
'balance': "%s" % quantize_accounting_decimal(invoice.balance),
},
'template': 'statement-row-template',
}
except BillingRecord.DoesNotExist:
logging.error(
"An invoice was generated for %(invoice_id)d "
"(domain: %(domain)s), but no billing record!" % {
'invoice_id': invoice.id,
'domain': self.domain,
})
def refresh_item(self, item_id):
pass
def post(self, *args, **kwargs):
return self.paginate_crud_response
def dispatch(self, request, *args, **kwargs):
if self.account is None:
raise Http404()
return super(DomainBillingStatementsView, self).dispatch(request, *args, **kwargs)
class BaseStripePaymentView(DomainAccountingSettings):
http_method_names = ['post']
@property
def account(self):
raise NotImplementedError("you must impmement the property account")
@property
@memoized
def domain_admin(self):
if self.request.couch_user.is_domain_admin(self.domain):
return self.request.couch_user.username
else:
raise PaymentRequestError(
"The logged in user was not a domain admin."
)
def get_or_create_payment_method(self):
return StripePaymentMethod.objects.get_or_create(
web_user=self.domain_admin,
method_type=PaymentMethodType.STRIPE,
)[0]
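    # Note: get_or_create returns an (instance, created) tuple; only the
    # instance is needed here. Illustrative:
    #
    #     method = self.get_or_create_payment_method()
    #     # method.web_user == self.domain_admin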
def get_payment_handler(self):
"""Returns a StripePaymentHandler object
"""
raise NotImplementedError("You must impmenent get_payment_handler()")
def post(self, request, *args, **kwargs):
try:
payment_handler = self.get_payment_handler()
response = payment_handler.process_request(request)
except PaymentRequestError as e:
accounting_logger.error(
"[BILLING] Failed to process Stripe Payment due to bad "
"request for domain %(domain)s user %(web_user)s: "
"%(error)s" % {
'domain': self.domain,
'web_user': self.request.user.username,
'error': e,
}
)
response = {
'error': {
'message': _(
"There was an issue processing your payment. No "
"charges were made. We're looking into the issue "
"as quickly as possible. Sorry for the inconvenience."
)
}
}
return json_response(response)
class CreditsStripePaymentView(BaseStripePaymentView):
urlname = 'domain_credits_payment'
@property
@memoized
def account(self):
return BillingAccount.get_or_create_account_by_domain(
self.domain,
created_by=self.request.user.username,
account_type=BillingAccountType.USER_CREATED,
entry_point=EntryPoint.SELF_STARTED,
)[0]
def get_payment_handler(self):
return CreditStripePaymentHandler(
self.get_or_create_payment_method(),
self.domain,
self.account,
subscription=Subscription.get_subscribed_plan_by_domain(self.domain_object)[1],
post_data=self.request.POST.copy(),
)
class CreditsWireInvoiceView(DomainAccountingSettings):
http_method_names = ['post']
urlname = 'domain_wire_payment'
@method_decorator(login_and_domain_required)
def dispatch(self, request, *args, **kwargs):
return super(CreditsWireInvoiceView, self).dispatch(request, *args, **kwargs)
def post(self, request, *args, **kwargs):
        emails = request.POST.get('emails', '').split()
amount = Decimal(request.POST.get('amount', 0))
wire_invoice_factory = DomainWireInvoiceFactory(request.domain, contact_emails=emails)
try:
wire_invoice_factory.create_wire_credits_invoice(self._get_items(request), amount)
except Exception as e:
return json_response({'error': {'message': str(e)}})
return json_response({'success': True})
def _get_items(self, request):
product_type = SoftwareProductType.get_type_by_domain(Domain.get_by_name(self.domain))
features = [{'type': get_feature_name(feature_type[0], product_type),
'amount': Decimal(request.POST.get(feature_type[0], 0))}
for feature_type in FeatureType.CHOICES
if Decimal(request.POST.get(feature_type[0], 0)) > 0]
products = [{'type': pt[0],
'amount': Decimal(request.POST.get(pt[0], 0))}
for pt in SoftwareProductType.CHOICES
if Decimal(request.POST.get(pt[0], 0)) > 0]
return products + features
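    # Illustrative POST payload that _get_items parses: one optional amount per
    # FeatureType/SoftwareProductType choice key, alongside 'emails' and
    # 'amount' (the keys and values below are hypothetical):
    #
    #     {'emails': 'admin@example.com billing@example.com',
    #      'amount': '100.00', 'User': '25.00', 'SMS': '10.00'}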
class InvoiceStripePaymentView(BaseStripePaymentView):
urlname = 'domain_invoice_payment'
@property
@memoized
def invoice(self):
try:
            invoice_id = self.request.POST['invoice_id']
        except KeyError:
            raise PaymentRequestError("invoice_id is required")
try:
return Invoice.objects.get(pk=invoice_id)
except Invoice.DoesNotExist:
raise PaymentRequestError(
"Could not find a matching invoice for invoice_id '%s'"
% invoice_id
)
@property
def account(self):
return self.invoice.subscription.account
def get_payment_handler(self):
return InvoiceStripePaymentHandler(
self.get_or_create_payment_method(), self.domain, self.invoice
)
class BulkStripePaymentView(BaseStripePaymentView):
urlname = 'domain_bulk_payment'
@property
def account(self):
return BillingAccount.get_account_by_domain(self.domain)
def get_payment_handler(self):
return BulkStripePaymentHandler(
self.get_or_create_payment_method(), self.domain
)
class WireInvoiceView(View):
http_method_names = ['post']
urlname = 'domain_wire_invoice'
@method_decorator(login_and_domain_required)
@method_decorator(domain_admin_required)
def dispatch(self, request, *args, **kwargs):
return super(WireInvoiceView, self).dispatch(request, *args, **kwargs)
def post(self, request, *args, **kwargs):
        emails = request.POST.get('emails', '').split()
balance = Decimal(request.POST.get('customPaymentAmount', 0))
wire_invoice_factory = DomainWireInvoiceFactory(request.domain, contact_emails=emails)
try:
wire_invoice_factory.create_wire_invoice(balance)
        except Exception as e:
            return json_response({'error': {'message': str(e)}})
return json_response({'success': True})
class BillingStatementPdfView(View):
urlname = 'domain_billing_statement_download'
@method_decorator(login_and_domain_required)
@method_decorator(domain_admin_required)
def dispatch(self, request, *args, **kwargs):
return super(BillingStatementPdfView, self).dispatch(request, *args, **kwargs)
def get(self, request, *args, **kwargs):
domain = args[0]
statement_id = kwargs.get('statement_id')
if statement_id is None or domain is None:
raise Http404()
try:
invoice_pdf = InvoicePdf.get(statement_id)
except ResourceNotFound:
raise Http404()
try:
if invoice_pdf.is_wire:
invoice = WireInvoice.objects.get(
pk=invoice_pdf.invoice_id,
domain=domain
)
else:
invoice = Invoice.objects.get(
pk=invoice_pdf.invoice_id,
subscription__subscriber__domain=domain
)
except (Invoice.DoesNotExist, WireInvoice.DoesNotExist):
raise Http404()
if invoice.is_wire:
edition = 'Bulk'
else:
edition = DESC_BY_EDITION[invoice.subscription.plan_version.plan.edition]['name']
filename = "%(pdf_id)s_%(domain)s_%(edition)s_%(filename)s" % {
'pdf_id': invoice_pdf._id,
'domain': domain,
'edition': edition,
'filename': invoice_pdf.get_filename(invoice),
}
try:
data = invoice_pdf.get_data(invoice)
response = HttpResponse(data, content_type='application/pdf')
            response['Content-Disposition'] = 'inline; filename="%s"' % filename
except Exception as e:
logging.error('[Billing] Fetching invoice PDF failed: %s' % e)
return HttpResponse(_("Could not obtain billing statement. "
"An issue has been submitted."))
return response
class InternalSubscriptionManagementView(BaseAdminProjectSettingsView):
template_name = 'domain/internal_subscription_management.html'
urlname = 'internal_subscription_mgmt'
page_title = ugettext_lazy("Dimagi Internal Subscription Management")
form_classes = INTERNAL_SUBSCRIPTION_MANAGEMENT_FORMS
@method_decorator(require_superuser)
def get(self, request, *args, **kwargs):
return super(InternalSubscriptionManagementView, self).get(request, *args, **kwargs)
@method_decorator(require_superuser)
def post(self, request, *args, **kwargs):
form = self.get_post_form
if form.is_valid():
try:
form.process_subscription_management()
return HttpResponseRedirect(reverse(DomainSubscriptionView.urlname, args=[self.domain]))
except NewSubscriptionError as e:
messages.error(self.request, e.message)
return self.get(request, *args, **kwargs)
@property
def page_context(self):
return {
'plan_name': Subscription.get_subscribed_plan_by_domain(self.domain)[0],
'select_subscription_type_form': self.select_subscription_type_form,
'subscription_management_forms': self.slug_to_form.values(),
'today': datetime.date.today(),
}
@property
def get_post_form(self):
return self.slug_to_form[self.request.POST.get('slug')]
@property
@memoized
def slug_to_form(self):
def create_form(form_class):
if self.request.method == 'POST' and form_class.slug == self.request.POST.get('slug'):
return form_class(self.domain, self.request.couch_user.username, self.request.POST)
return form_class(self.domain, self.request.couch_user.username)
return {form_class.slug: create_form(form_class) for form_class in self.form_classes}
@property
@memoized
def select_subscription_type_form(self):
if self.request.method == 'POST':
for form_slug in self.slug_to_form:
if form_slug in self.request.POST:
return SelectSubscriptionTypeForm({
'subscription_type': form_slug,
})
subscription_type = None
subscription = Subscription.get_subscribed_plan_by_domain(self.domain_object)[1]
if subscription is None:
subscription_type = None
else:
plan = subscription.plan_version.plan
if subscription.service_type == SubscriptionType.CONTRACTED:
subscription_type = "contracted_partner"
elif plan.edition == SoftwarePlanEdition.ENTERPRISE:
subscription_type = "dimagi_only_enterprise"
elif (plan.edition == SoftwarePlanEdition.ADVANCED
and plan.visibility == SoftwarePlanVisibility.TRIAL_INTERNAL):
subscription_type = "advanced_extended_trial"
return SelectSubscriptionTypeForm({'subscription_type': subscription_type})
class SelectPlanView(DomainAccountingSettings):
template_name = 'domain/select_plan.html'
urlname = 'domain_select_plan'
page_title = ugettext_lazy("Change Plan")
step_title = ugettext_lazy("Select Plan")
edition = None
lead_text = ugettext_lazy("Please select a plan below that fits your organization's needs.")
@property
def edition_name(self):
if self.edition:
return DESC_BY_EDITION[self.edition]['name']
@property
def is_non_ops_superuser(self):
if not self.request.couch_user.is_superuser:
return False
return not has_privilege(self.request, privileges.ACCOUNTING_ADMIN)
@property
def parent_pages(self):
return [
{
'title': DomainSubscriptionView.page_title,
'url': reverse(DomainSubscriptionView.urlname, args=[self.domain]),
}
]
@property
def steps(self):
edition_name = u" (%s)" % self.edition_name if self.edition_name else ""
return [
{
'title': _(u"1. Select a Plan%(edition_name)s") % {
"edition_name": edition_name
},
'url': reverse(SelectPlanView.urlname, args=[self.domain]),
}
]
@property
def main_context(self):
context = super(SelectPlanView, self).main_context
context.update({
'steps': self.steps,
'step_title': self.step_title,
'lead_text': self.lead_text,
})
return context
@property
def page_context(self):
return {
'pricing_table': PricingTable.get_table_by_product(self.product, domain=self.domain),
'current_edition': (self.current_subscription.plan_version.plan.edition.lower()
if self.current_subscription is not None
and not self.current_subscription.is_trial
else ""),
'is_non_ops_superuser': self.is_non_ops_superuser,
}
class EditPrivacySecurityView(BaseAdminProjectSettingsView):
template_name = "domain/admin/project_privacy.html"
urlname = "privacy_info"
page_title = ugettext_lazy("Privacy and Security")
@property
@memoized
def privacy_form(self):
initial = {
"secure_submissions": self.domain_object.secure_submissions,
"restrict_superusers": self.domain_object.restrict_superusers,
"allow_domain_requests": self.domain_object.allow_domain_requests,
}
if self.request.method == 'POST':
return PrivacySecurityForm(self.request.POST, initial=initial)
return PrivacySecurityForm(initial=initial)
@property
def page_context(self):
return {
'privacy_form': self.privacy_form
}
def post(self, request, *args, **kwargs):
if self.privacy_form.is_valid():
self.privacy_form.save(self.domain_object)
messages.success(request, _("Your project settings have been saved!"))
return self.get(request, *args, **kwargs)
class SelectedEnterprisePlanView(SelectPlanView):
template_name = 'domain/selected_enterprise_plan.html'
urlname = 'enterprise_request_quote'
step_title = ugettext_lazy("Contact Dimagi")
edition = SoftwarePlanEdition.ENTERPRISE
@property
def steps(self):
last_steps = super(SelectedEnterprisePlanView, self).steps
last_steps.append({
'title': _("2. Contact Dimagi"),
'url': reverse(SelectedEnterprisePlanView.urlname, args=[self.domain]),
})
return last_steps
@property
@memoized
def is_not_redirect(self):
return 'plan_edition' not in self.request.POST
@property
@memoized
def enterprise_contact_form(self):
if self.request.method == 'POST' and self.is_not_redirect:
return EnterprisePlanContactForm(self.domain, self.request.couch_user, data=self.request.POST)
return EnterprisePlanContactForm(self.domain, self.request.couch_user)
@property
def page_context(self):
return {
'enterprise_contact_form': self.enterprise_contact_form,
}
def post(self, request, *args, **kwargs):
if self.is_not_redirect and self.enterprise_contact_form.is_valid():
self.enterprise_contact_form.send_message()
messages.success(request, _("Your request was sent to Dimagi. "
"We will try our best to follow up in a timely manner."))
return HttpResponseRedirect(reverse(DomainSubscriptionView.urlname, args=[self.domain]))
return self.get(request, *args, **kwargs)
class ConfirmSelectedPlanView(SelectPlanView):
template_name = 'domain/confirm_plan.html'
urlname = 'confirm_selected_plan'
step_title = ugettext_lazy("Confirm Plan")
@property
def steps(self):
last_steps = super(ConfirmSelectedPlanView, self).steps
last_steps.append({
'title': _("2. Confirm Plan"),
'url': reverse(SelectPlanView.urlname, args=[self.domain]),
})
return last_steps
@property
@memoized
def edition(self):
edition = self.request.POST.get('plan_edition', '').title()
if edition not in [e[0] for e in SoftwarePlanEdition.CHOICES]:
raise Http404()
return edition
@property
@memoized
def selected_plan_version(self):
return DefaultProductPlan.get_default_plan_by_domain(self.domain, self.edition).plan.get_version()
@property
def downgrade_messages(self):
current_plan_version, subscription = Subscription.get_subscribed_plan_by_domain(self.domain_object)
if subscription is None:
current_plan_version = None
downgrades = get_change_status(current_plan_version, self.selected_plan_version)[1]
downgrade_handler = DomainDowngradeStatusHandler(
self.domain_object, self.selected_plan_version, downgrades,
web_user=self.request.user.username
)
return downgrade_handler.get_response()
@property
def page_context(self):
return {
'downgrade_messages': self.downgrade_messages,
'current_plan': (self.current_subscription.plan_version.user_facing_description
if self.current_subscription is not None else None),
'show_community_notice': (self.edition == SoftwarePlanEdition.COMMUNITY
and self.current_subscription is None),
}
@property
def main_context(self):
context = super(ConfirmSelectedPlanView, self).main_context
context.update({
'plan': self.selected_plan_version.user_facing_description,
})
return context
def get(self, request, *args, **kwargs):
return HttpResponseRedirect(reverse(SelectPlanView.urlname, args=[self.domain]))
def post(self, request, *args, **kwargs):
if self.edition == SoftwarePlanEdition.ENTERPRISE and not self.request.couch_user.is_superuser:
return HttpResponseRedirect(reverse(SelectedEnterprisePlanView.urlname, args=[self.domain]))
return super(ConfirmSelectedPlanView, self).get(request, *args, **kwargs)
class ConfirmBillingAccountInfoView(ConfirmSelectedPlanView, AsyncHandlerMixin):
template_name = 'domain/confirm_billing_info.html'
urlname = 'confirm_billing_account_info'
step_title = ugettext_lazy("Confirm Billing Information")
is_new = False
async_handlers = [
Select2BillingInfoHandler,
]
@property
def steps(self):
last_steps = super(ConfirmBillingAccountInfoView, self).steps
last_steps.append({
'title': _("3. Confirm Billing Account"),
'url': reverse(ConfirmBillingAccountInfoView.urlname, args=[self.domain]),
})
return last_steps
@property
@memoized
def account(self):
if self.current_subscription:
return self.current_subscription.account
account, self.is_new = BillingAccount.get_or_create_account_by_domain(
self.domain,
created_by=self.request.couch_user.username,
account_type=BillingAccountType.USER_CREATED,
entry_point=EntryPoint.SELF_STARTED,
)
return account
@property
def payment_method(self):
user = self.request.user.username
payment_method, __ = StripePaymentMethod.objects.get_or_create(
web_user=user,
method_type=PaymentMethodType.STRIPE,
)
return payment_method
@property
@memoized
def is_form_post(self):
return 'company_name' in self.request.POST
@property
@memoized
def billing_account_info_form(self):
initial = None
if self.edition == SoftwarePlanEdition.ENTERPRISE and self.request.couch_user.is_superuser:
initial = {
'company_name': "Dimagi",
'first_line': "585 Massachusetts Ave",
'second_line': "Suite 4",
'city': "Cambridge",
'state_province_region': "MA",
'postal_code': "02139",
'country': "US",
}
if self.request.method == 'POST' and self.is_form_post:
return ConfirmNewSubscriptionForm(
self.account, self.domain, self.request.couch_user.username,
self.selected_plan_version, self.current_subscription, data=self.request.POST, initial=initial
)
return ConfirmNewSubscriptionForm(self.account, self.domain, self.request.couch_user.username,
self.selected_plan_version, self.current_subscription, initial=initial)
@property
def page_context(self):
return {
'billing_account_info_form': self.billing_account_info_form,
'stripe_public_key': settings.STRIPE_PUBLIC_KEY,
'cards': self.payment_method.all_cards_serialized(self.account)
}
def post(self, request, *args, **kwargs):
if self.async_response is not None:
return self.async_response
if self.edition == SoftwarePlanEdition.ENTERPRISE and not self.request.couch_user.is_superuser:
return HttpResponseRedirect(reverse(SelectedEnterprisePlanView.urlname, args=[self.domain]))
if self.is_form_post and self.billing_account_info_form.is_valid():
is_saved = self.billing_account_info_form.save()
software_plan_name = DESC_BY_EDITION[self.selected_plan_version.plan.edition]['name'].encode('utf-8')
if not is_saved:
messages.error(
request, _("It appears there was an issue subscribing your project to the %s Software Plan. You "
"may try resubmitting, but if that doesn't work, rest assured someone will be "
"contacting you shortly.") % software_plan_name)
else:
messages.success(
request, _("Your project has been successfully subscribed to the %s Software Plan.")
% software_plan_name
)
return HttpResponseRedirect(reverse(DomainSubscriptionView.urlname, args=[self.domain]))
return super(ConfirmBillingAccountInfoView, self).post(request, *args, **kwargs)
class SubscriptionMixin(object):
@property
@memoized
def subscription(self):
subscription = Subscription.get_subscribed_plan_by_domain(self.domain_object)[1]
if subscription is None:
raise Http404
if subscription.is_renewed:
raise Http404
return subscription
class SubscriptionRenewalView(SelectPlanView, SubscriptionMixin):
urlname = "domain_subscription_renewal"
page_title = ugettext_lazy("Renew Plan")
step_title = ugettext_lazy("Renew or Change Plan")
@property
def lead_text(self):
return _("Based on your current usage we recommend you use the "
"<strong>{plan}</strong> plan").format(
plan=self.current_subscription.plan_version.plan.edition)
@property
def main_context(self):
context = super(SubscriptionRenewalView, self).main_context
context.update({'is_renewal': True})
return context
@property
def page_context(self):
context = super(SubscriptionRenewalView, self).page_context
current_privs = get_privileges(self.subscription.plan_version)
plan = DefaultProductPlan.get_lowest_edition_by_domain(
self.domain, current_privs, return_plan=False,
).lower()
context['current_edition'] = (plan
if self.current_subscription is not None
and not self.current_subscription.is_trial
else "")
return context
class ConfirmSubscriptionRenewalView(DomainAccountingSettings, AsyncHandlerMixin, SubscriptionMixin):
template_name = 'domain/confirm_subscription_renewal.html'
urlname = 'domain_subscription_renewal_confirmation'
page_title = ugettext_lazy("Renew Plan")
async_handlers = [
Select2BillingInfoHandler,
]
@property
@memoized
def next_plan_version(self):
new_edition = self.request.POST.get('plan_edition', '').title()
plan_version = DefaultProductPlan.get_default_plan_by_domain(self.domain, new_edition)
if plan_version is None:
logging.error("[BILLING] Could not find a matching renewable plan "
"for %(domain)s, subscription number %(sub_pk)s." % {
'domain': self.domain,
'sub_pk': self.subscription.pk
})
raise Http404
return plan_version
@property
@memoized
def confirm_form(self):
if self.request.method == 'POST' and "from_plan_page" not in self.request.POST:
return ConfirmSubscriptionRenewalForm(
self.account, self.domain, self.request.couch_user.username,
self.subscription, self.next_plan_version,
data=self.request.POST,
)
return ConfirmSubscriptionRenewalForm(
self.account, self.domain, self.request.couch_user.username,
self.subscription, self.next_plan_version,
)
@property
def page_context(self):
return {
'subscription': self.subscription,
'plan': self.subscription.plan_version.user_facing_description,
'confirm_form': self.confirm_form,
'next_plan': self.next_plan_version.user_facing_description,
}
def post(self, request, *args, **kwargs):
if self.async_response is not None:
return self.async_response
if self.confirm_form.is_valid():
is_saved = self.confirm_form.save()
if not is_saved:
messages.error(
request, _(
"There was an issue renewing your subscription. We "
"have been notified of the issue. Please try "
"submitting again, and if the problem persists, "
"please try in a few hours."
)
)
else:
messages.success(
request, _("Your subscription was successfully renewed!")
)
return HttpResponseRedirect(
reverse(DomainSubscriptionView.urlname, args=[self.domain])
)
return self.get(request, *args, **kwargs)
class ExchangeSnapshotsView(BaseAdminProjectSettingsView):
template_name = 'domain/snapshot_settings.html'
urlname = 'domain_snapshot_settings'
page_title = ugettext_lazy("CommCare Exchange")
@property
def page_context(self):
return {
'project': self.domain_object,
'snapshots': list(self.domain_object.snapshots()),
'published_snapshot': self.domain_object.published_snapshot(),
}
class CreateNewExchangeSnapshotView(BaseAdminProjectSettingsView):
template_name = 'domain/create_snapshot.html'
urlname = 'domain_create_snapshot'
page_title = ugettext_lazy("Publish New Version")
strict_domain_fetching = True
@property
def parent_pages(self):
return [{
'title': ExchangeSnapshotsView.page_title,
'url': reverse(ExchangeSnapshotsView.urlname, args=[self.domain]),
}]
@property
def page_context(self):
context = {
'form': self.snapshot_settings_form,
'app_forms': self.app_forms,
'fixture_forms': self.fixture_forms,
'can_publish_as_org': self.can_publish_as_org,
'autocomplete_fields': ('project_type', 'phone_model', 'user_type', 'city', 'countries', 'region'),
}
if self.published_snapshot:
context.update({
'published_as_org': self.published_snapshot.publisher == 'organization',
'author': self.published_snapshot.author,
})
elif self.request.method == 'POST':
context.update({
'published_as_org': self.request.POST.get('publisher', '') == 'organization',
'author': self.request.POST.get('author', '')
})
return context
@property
def can_publish_as_org(self):
return (self.domain_object.get_organization()
and self.request.couch_user.is_org_admin(self.domain_object.get_organization().name))
@property
@memoized
def snapshots(self):
return list(self.domain_object.snapshots())
@property
@memoized
def published_snapshot(self):
return self.snapshots[0] if self.snapshots else self.domain_object
@property
@memoized
def published_apps(self):
published_apps = {}
if self.published_snapshot:
for app in self.published_snapshot.full_applications():
base_app_id = app.copy_of if self.domain_object == self.published_snapshot else app.copied_from.copy_of
if base_app_id:
published_apps[base_app_id] = app
return published_apps
@property
def app_forms(self):
app_forms = []
for app in self.domain_object.applications():
if self.request.method == 'POST':
app_forms.append((app, SnapshotApplicationForm(self.request.POST, prefix=app.id)))
elif self.published_snapshot and app.copy_of in self.published_apps:
original = self.published_apps[app.copy_of]
app_forms.append((app, SnapshotApplicationForm(initial={
'publish': True,
'name': original.name,
'description': original.description,
'deployment_date': original.deployment_date,
'user_type': original.user_type,
'attribution_notes': original.attribution_notes,
'phone_model': original.phone_model,
}, prefix=app.id)))
else:
app_forms.append((app,
SnapshotApplicationForm(
initial={
'publish': (self.published_snapshot is None
or self.published_snapshot == self.domain_object)
}, prefix=app.id)))
return app_forms
@property
@memoized
def published_fixtures(self):
return [f.copy_from for f in FixtureDataType.by_domain(self.published_snapshot._id)]
@property
def fixture_forms(self):
fixture_forms = []
for fixture in FixtureDataType.by_domain(self.domain_object.name):
fixture.id = fixture._id
if self.request.method == 'POST':
fixture_forms.append((fixture,
SnapshotFixtureForm(self.request.POST, prefix=fixture._id)))
else:
fixture_forms.append((fixture,
SnapshotFixtureForm(
initial={
'publish': (self.published_snapshot == self.domain_object
or fixture._id in self.published_fixtures)
}, prefix=fixture._id)))
return fixture_forms
@property
@memoized
def snapshot_settings_form(self):
if self.request.method == 'POST':
form = SnapshotSettingsForm(self.request.POST,
self.request.FILES,
domain=self.domain_object,
is_superuser=self.request.user.is_superuser)
return form
proj = self.published_snapshot if self.published_snapshot else self.domain_object
initial = {
'case_sharing': json.dumps(proj.case_sharing),
'publish_on_submit': True,
'share_multimedia': self.published_snapshot.multimedia_included if self.published_snapshot else True,
}
init_attribs = ['default_timezone', 'project_type', 'license']
if self.published_snapshot:
init_attribs.extend(['title', 'description', 'short_description'])
if self.published_snapshot.yt_id:
initial['video'] = 'http://www.youtube.com/watch?v=%s' % self.published_snapshot.yt_id
for attr in init_attribs:
initial[attr] = getattr(proj, attr)
return SnapshotSettingsForm(initial=initial,
domain=self.domain_object,
is_superuser=self.request.user.is_superuser)
@property
@memoized
def has_published_apps(self):
for app in self.domain_object.applications():
if self.request.POST.get("%s-publish" % app.id, False):
return True
messages.error(self.request, _("Cannot publish a project without applications to CommCare Exchange"))
return False
@property
def has_signed_eula(self):
eula_signed = self.request.couch_user.is_eula_signed()
if not eula_signed:
messages.error(self.request, _("You must agree to our EULA to publish a project to Exchange"))
return eula_signed
@property
def has_valid_form(self):
is_valid = self.snapshot_settings_form.is_valid()
if not is_valid:
messages.error(self.request, _("There are some problems with your form. "
"Please address these issues and try again."))
return is_valid
def post(self, request, *args, **kwargs):
if self.has_published_apps and self.has_signed_eula and self.has_valid_form:
new_license = request.POST['license']
if request.POST.get('share_multimedia', False):
app_ids = self.snapshot_settings_form._get_apps_to_publish()
media = self.domain_object.all_media(from_apps=app_ids)
for m_file in media:
if self.domain not in m_file.shared_by:
m_file.shared_by.append(self.domain)
# set the license of every multimedia file that doesn't yet have a license set
if not m_file.license:
m_file.update_or_add_license(self.domain, type=new_license, should_save=False)
m_file.save()
share_reminders = bool(request.POST.get('share_reminders', False))
copy_by_id = set()
for k in request.POST.keys():
if k.endswith("-publish"):
copy_by_id.add(k[:-len("-publish")])
old = self.domain_object.published_snapshot()
new_domain = self.domain_object.save_snapshot(
share_reminders=share_reminders, copy_by_id=copy_by_id)
new_domain.license = new_license
new_domain.description = request.POST['description']
new_domain.short_description = request.POST['short_description']
new_domain.project_type = request.POST['project_type']
new_domain.title = request.POST['title']
new_domain.multimedia_included = request.POST.get('share_multimedia', '') == 'on'
new_domain.publisher = request.POST.get('publisher', None) or 'user'
if request.POST.get('video'):
new_domain.yt_id = self.snapshot_settings_form.cleaned_data['video']
new_domain.author = request.POST.get('author', None)
new_domain.is_approved = False
new_domain.is_starter_app = request.POST.get('is_starter_app', '') == 'on'
publish_on_submit = request.POST.get('publish_on_submit', "no") == "yes"
image = self.snapshot_settings_form.cleaned_data['image']
if image:
new_domain.image_path = image.name
new_domain.image_type = image.content_type
elif request.POST.get('old_image', False):
new_domain.image_path = old.image_path
new_domain.image_type = old.image_type
new_domain.save()
documentation_file = self.snapshot_settings_form.cleaned_data['documentation_file']
if documentation_file:
new_domain.documentation_file_path = documentation_file.name
new_domain.documentation_file_type = documentation_file.content_type
elif request.POST.get('old_documentation_file', False):
new_domain.documentation_file_path = old.documentation_file_path
new_domain.documentation_file_type = old.documentation_file_type
new_domain.save()
if publish_on_submit:
_publish_snapshot(request, self.domain_object, published_snapshot=new_domain)
else:
new_domain.published = False
new_domain.save()
if image:
im = Image.open(image)
out = cStringIO.StringIO()
im.thumbnail((200, 200), Image.ANTIALIAS)
im.save(out, new_domain.image_type.split('/')[-1])
new_domain.put_attachment(content=out.getvalue(), name=image.name)
elif request.POST.get('old_image', False):
new_domain.put_attachment(content=old.fetch_attachment(old.image_path), name=new_domain.image_path)
if documentation_file:
new_domain.put_attachment(content=documentation_file, name=documentation_file.name)
elif request.POST.get('old_documentation_file', False):
new_domain.put_attachment(content=old.fetch_attachment(old.documentation_file_path),
name=new_domain.documentation_file_path)
for application in new_domain.full_applications():
original_id = application.copied_from._id
name_field = "%s-name" % original_id
if name_field not in request.POST:
continue
application.name = request.POST[name_field]
application.description = request.POST["%s-description" % original_id]
date_picked = request.POST["%s-deployment_date" % original_id]
try:
date_picked = dateutil.parser.parse(date_picked)
if date_picked.year > 2009:
application.deployment_date = date_picked
except Exception:
pass
application.phone_model = request.POST["%s-phone_model" % original_id]
application.attribution_notes = request.POST["%s-attribution_notes" % original_id]
application.user_type = request.POST["%s-user_type" % original_id]
if not new_domain.multimedia_included:
application.multimedia_map = {}
application.save()
for fixture in FixtureDataType.by_domain(new_domain.name):
old_id = FixtureDataType.by_domain_tag(self.domain_object.name,
fixture.tag).first()._id
fixture.description = request.POST["%s-description" % old_id]
fixture.save()
if new_domain is None:
messages.error(request, _("Version creation failed; please try again"))
else:
messages.success(request, (_("Created a new version of your app. This version will be posted to "
"CommCare Exchange pending approval by admins.") if publish_on_submit
else _("Created a new version of your app.")))
return redirect(ExchangeSnapshotsView.urlname, self.domain)
return self.get(request, *args, **kwargs)
class ManageProjectMediaView(BaseAdminProjectSettingsView):
urlname = 'domain_manage_multimedia'
page_title = ugettext_lazy("Multimedia Sharing")
template_name = 'domain/admin/media_manager.html'
@property
def project_media_data(self):
return [{
'license': m.license.type if m.license else 'public',
'shared': self.domain in m.shared_by,
'url': m.url(),
'm_id': m._id,
'tags': m.tags.get(self.domain, []),
'type': m.doc_type,
} for m in self.request.project.all_media()]
@property
def page_context(self):
return {
'media': self.project_media_data,
'licenses': LICENSES.items(),
}
@retry_resource(3)
def post(self, request, *args, **kwargs):
for m_file in request.project.all_media():
if '%s_tags' % m_file._id in request.POST:
m_file.tags[self.domain] = request.POST.get('%s_tags' % m_file._id, '').split(' ')
if self.domain not in m_file.shared_by and request.POST.get('%s_shared' % m_file._id, False):
m_file.shared_by.append(self.domain)
elif self.domain in m_file.shared_by and not request.POST.get('%s_shared' % m_file._id, False):
m_file.shared_by.remove(self.domain)
if '%s_license' % m_file._id in request.POST:
m_file.update_or_add_license(self.domain,
type=request.POST.get('%s_license' % m_file._id, 'public'),
should_save=True)
m_file.save()
messages.success(request, _("Multimedia updated successfully!"))
return self.get(request, *args, **kwargs)
class RepeaterMixin(object):
@property
def friendly_repeater_names(self):
return {
'FormRepeater': _("Forms"),
'CaseRepeater': _("Cases"),
'ShortFormRepeater': _("Form Stubs"),
'AppStructureRepeater': _("App Schema Changes"),
}
class DomainForwardingOptionsView(BaseAdminProjectSettingsView, RepeaterMixin):
urlname = 'domain_forwarding'
page_title = ugettext_lazy("Data Forwarding")
template_name = 'domain/admin/domain_forwarding.html'
@property
def repeaters(self):
available_repeaters = [
FormRepeater, CaseRepeater, ShortFormRepeater, AppStructureRepeater,
]
return [(r.__name__, r.by_domain(self.domain), self.friendly_repeater_names[r.__name__])
for r in available_repeaters]
@property
def page_context(self):
return {
'repeaters': self.repeaters,
'pending_record_count': RepeatRecord.count(self.domain),
}
class AddRepeaterView(BaseAdminProjectSettingsView, RepeaterMixin):
urlname = 'add_repeater'
page_title = ugettext_lazy("Forward Data")
template_name = 'domain/admin/add_form_repeater.html'
repeater_form_class = GenericRepeaterForm
@property
def page_url(self):
return reverse(self.urlname, args=[self.domain, self.repeater_type])
@property
def parent_pages(self):
return [{
'title': DomainForwardingOptionsView.page_title,
'url': reverse(DomainForwardingOptionsView.urlname, args=[self.domain]),
}]
@property
def repeater_type(self):
return self.kwargs['repeater_type']
@property
def page_name(self):
return "Forward %s" % self.friendly_repeater_names.get(self.repeater_type, "Data")
@property
@memoized
def repeater_class(self):
try:
return receiverwrapper.models.repeater_types[self.repeater_type]
except KeyError:
raise Http404()
@property
@memoized
def add_repeater_form(self):
if self.request.method == 'POST':
return self.repeater_form_class(
self.request.POST,
domain=self.domain,
repeater_class=self.repeater_class
)
return self.repeater_form_class(
domain=self.domain,
repeater_class=self.repeater_class
)
@property
def page_context(self):
return {
'form': self.add_repeater_form,
'repeater_type': self.repeater_type,
}
def make_repeater(self):
repeater = self.repeater_class(
domain=self.domain,
url=self.add_repeater_form.cleaned_data['url'],
use_basic_auth=self.add_repeater_form.cleaned_data['use_basic_auth'],
username=self.add_repeater_form.cleaned_data['username'],
password=self.add_repeater_form.cleaned_data['password'],
format=self.add_repeater_form.cleaned_data['format']
)
return repeater
def post(self, request, *args, **kwargs):
if self.add_repeater_form.is_valid():
repeater = self.make_repeater()
repeater.save()
messages.success(request, _("Forwarding set up to %s") % repeater.url)
return HttpResponseRedirect(reverse(DomainForwardingOptionsView.urlname, args=[self.domain]))
return self.get(request, *args, **kwargs)
class AddFormRepeaterView(AddRepeaterView):
urlname = 'add_form_repeater'
repeater_form_class = FormRepeaterForm
@property
def page_url(self):
return reverse(self.urlname, args=[self.domain])
def make_repeater(self):
repeater = super(AddFormRepeaterView, self).make_repeater()
repeater.exclude_device_reports = self.add_repeater_form.cleaned_data['exclude_device_reports']
repeater.include_app_id_param = self.add_repeater_form.cleaned_data['include_app_id_param']
return repeater
class OrgSettingsView(BaseAdminProjectSettingsView):
template_name = 'domain/orgs_settings.html'
urlname = 'domain_org_settings'
page_title = ugettext_lazy("Organization")
@method_decorator(requires_privilege_with_fallback(privileges.CROSS_PROJECT_REPORTS))
def dispatch(self, request, *args, **kwargs):
return super(OrgSettingsView, self).dispatch(request, *args, **kwargs)
@property
def page_context(self):
domain = self.domain_object
org_users = []
teams = Team.get_by_domain(domain.name)
for team in teams:
for user in team.get_members():
user.team_id = team.get_id
user.team = team.name
org_users.append(user)
for user in org_users:
user.current_domain = domain.name
all_orgs = Organization.get_all()
return {
"project": domain,
'domain': domain.name,
"organization": Organization.get_by_name(getattr(domain, "organization", None)),
"org_users": org_users,
"all_orgs": all_orgs,
}
class BaseInternalDomainSettingsView(BaseProjectSettingsView):
strict_domain_fetching = True
@method_decorator(login_and_domain_required)
@method_decorator(require_superuser)
def dispatch(self, request, *args, **kwargs):
return super(BaseInternalDomainSettingsView, self).dispatch(request, *args, **kwargs)
@property
def main_context(self):
context = super(BaseInternalDomainSettingsView, self).main_context
context.update({
'project': self.domain_object,
})
return context
@property
def page_name(self):
return mark_safe("%s <small>Internal</small>" % self.page_title)
class EditInternalDomainInfoView(BaseInternalDomainSettingsView):
urlname = 'domain_internal_settings'
page_title = ugettext_lazy("Project Information")
template_name = 'domain/internal_settings.html'
strict_domain_fetching = True
@property
def autocomplete_fields(self):
return ['countries']
@property
@memoized
def internal_settings_form(self):
can_edit_eula = CAN_EDIT_EULA.enabled(self.request.couch_user.username)
if self.request.method == 'POST':
return DomainInternalForm(can_edit_eula, self.request.POST)
initial = {
'deployment_date': self.domain_object.deployment.date.date
if self.domain_object.deployment.date else '',
'countries': self.domain_object.deployment.countries,
'is_test': self.domain_object.is_test,
}
internal_attrs = [
'sf_contract_id',
'sf_account_id',
'services',
'initiative',
'self_started',
'area',
'sub_area',
'organization_name',
'notes',
'phone_model',
'commtrack_domain',
'business_unit',
'workshop_region',
]
if can_edit_eula:
internal_attrs += [
'custom_eula',
'can_use_data',
]
for attr in internal_attrs:
val = getattr(self.domain_object.internal, attr)
if isinstance(val, bool):
val = 'true' if val else 'false'
initial[attr] = val
return DomainInternalForm(can_edit_eula, initial=initial)
@property
def page_context(self):
return {
'project': self.domain_object,
'form': self.internal_settings_form,
'areas': dict([(a["name"], a["sub_areas"]) for a in settings.INTERNAL_DATA["area"]]),
}
def post(self, request, *args, **kwargs):
if self.internal_settings_form.is_valid():
old_attrs = copy.copy(self.domain_object.internal)
self.internal_settings_form.save(self.domain_object)
eula_props_changed = (bool(old_attrs.custom_eula) != bool(self.domain_object.internal.custom_eula) or
bool(old_attrs.can_use_data) != bool(self.domain_object.internal.can_use_data))
if eula_props_changed and settings.EULA_CHANGE_EMAIL:
message = '\n'.join([
'{user} changed either the EULA or data sharing properties for domain {domain}.',
'',
'The properties changed were:',
'- Custom eula: {eula_old} --> {eula_new}',
'- Can use data: {can_use_data_old} --> {can_use_data_new}'
]).format(
user=self.request.couch_user.username,
domain=self.domain,
eula_old=old_attrs.custom_eula,
eula_new=self.domain_object.internal.custom_eula,
can_use_data_old=old_attrs.can_use_data,
can_use_data_new=self.domain_object.internal.can_use_data,
)
send_mail_async.delay(
'Custom EULA or data use flags changed for {}'.format(self.domain),
message, settings.DEFAULT_FROM_EMAIL, [settings.EULA_CHANGE_EMAIL]
)
messages.success(request, _("The internal information for project %s was successfully updated!")
% self.domain)
else:
messages.error(request, _(
"Your settings are not valid, see below for errors. Correct them and try again!"))
return self.get(request, *args, **kwargs)
class EditInternalCalculationsView(BaseInternalDomainSettingsView):
urlname = 'domain_internal_calculations'
page_title = ugettext_lazy("Calculated Properties")
template_name = 'domain/internal_calculations.html'
@property
def page_context(self):
return {
'calcs': CALCS,
'order': CALC_ORDER,
}
@login_and_domain_required
@require_superuser
def calculated_properties(request, domain):
calc_tag = request.GET.get("calc_tag", '').split('--')
extra_arg = calc_tag[1] if len(calc_tag) > 1 else ''
calc_tag = calc_tag[0]
if not calc_tag or calc_tag not in CALC_FNS:
data = {"error": 'This tag does not exist'}
else:
data = {"value": dom_calc(calc_tag, domain, extra_arg)}
return json_response(data)
def _publish_snapshot(request, domain, published_snapshot=None):
snapshots = domain.snapshots()
for snapshot in snapshots:
if snapshot.published:
snapshot.published = False
if not published_snapshot or snapshot.name != published_snapshot.name:
snapshot.save()
if published_snapshot:
if published_snapshot.copied_from.name != domain.name:
messages.error(request, "Invalid snapshot")
return False
# cda stuff. In order to publish a snapshot, a user must have agreed to this
published_snapshot.cda.signed = True
published_snapshot.cda.date = datetime.datetime.utcnow()
published_snapshot.cda.type = 'Content Distribution Agreement'
if request.couch_user:
published_snapshot.cda.user_id = request.couch_user.get_id
published_snapshot.cda.user_ip = get_ip(request)
published_snapshot.published = True
published_snapshot.save()
_notification_email_on_publish(domain, published_snapshot, request.couch_user)
return True
def _notification_email_on_publish(domain, snapshot, published_by):
params = {"domain": domain, "snapshot": snapshot,
"published_by": published_by, "url_base": get_site_domain()}
text_content = render_to_string(
"domain/email/published_app_notification.txt", params)
html_content = render_to_string(
"domain/email/published_app_notification.html", params)
recipients = settings.EXCHANGE_NOTIFICATION_RECIPIENTS
subject = "New App on Exchange: %s" % snapshot.title
try:
for recipient in recipients:
send_html_email_async.delay(subject, recipient, html_content,
text_content=text_content,
email_from=settings.DEFAULT_FROM_EMAIL)
except Exception:
logging.warning("Can't send notification email, "
"but the message was:\n%s" % text_content)
@domain_admin_required
def set_published_snapshot(request, domain, snapshot_name=''):
domain = request.project
snapshots = domain.snapshots()
if request.method == 'POST':
if snapshot_name != '':
published_snapshot = Domain.get_by_name(snapshot_name)
_publish_snapshot(request, domain, published_snapshot=published_snapshot)
else:
_publish_snapshot(request, domain)
return redirect('domain_snapshot_settings', domain.name)
class ProBonoMixin(object):
page_title = ugettext_lazy("Pro-Bono Application")
is_submitted = False
url_name = None
@property
def requesting_domain(self):
raise NotImplementedError
@property
@memoized
def pro_bono_form(self):
if self.request.method == 'POST':
return ProBonoForm(self.use_domain_field, self.request.POST)
return ProBonoForm(self.use_domain_field)
@property
def page_context(self):
return {
'pro_bono_form': self.pro_bono_form,
'is_submitted': self.is_submitted,
}
@property
def page_url(self):
return self.url_name
def post(self, request, *args, **kwargs):
if self.pro_bono_form.is_valid():
self.pro_bono_form.process_submission(domain=self.requesting_domain)
self.is_submitted = True
return self.get(request, *args, **kwargs)
class ProBonoStaticView(ProBonoMixin, BasePageView):
template_name = 'domain/pro_bono/static.html'
urlname = 'pro_bono_static'
use_domain_field = True
@property
def requesting_domain(self):
return self.pro_bono_form.cleaned_data['domain']
class ProBonoView(ProBonoMixin, DomainAccountingSettings):
template_name = 'domain/pro_bono/domain.html'
urlname = 'pro_bono'
use_domain_field = False
@property
def requesting_domain(self):
return self.domain
@property
def parent_pages(self):
return [
{
'title': DomainSubscriptionView.page_title,
'url': reverse(DomainSubscriptionView.urlname, args=[self.domain]),
}
]
@property
def section_url(self):
return self.page_url
class FeaturePreviewsView(BaseAdminProjectSettingsView):
urlname = 'feature_previews'
page_title = ugettext_lazy("Feature Previews")
template_name = 'domain/admin/feature_previews.html'
@memoized
def features(self):
features = []
for preview_name in dir(feature_previews):
if not preview_name.startswith('__'):
preview = getattr(feature_previews, preview_name)
if isinstance(preview, feature_previews.FeaturePreview) and preview.has_privilege(self.request):
features.append((preview, preview.enabled(self.domain)))
return features
def get_toggle(self, slug):
if slug not in [f.slug for f, _ in self.features()]:
raise Http404()
try:
return Toggle.get(slug)
except ResourceNotFound:
return Toggle(slug=slug)
@property
def page_context(self):
return {
'features': self.features(),
}
def post(self, request, *args, **kwargs):
for feature, enabled in self.features():
self.update_feature(feature, enabled, feature.slug in request.POST)
return redirect('feature_previews', domain=self.domain)
def update_feature(self, feature, current_state, new_state):
if current_state != new_state:
feature.set(self.domain, new_state, NAMESPACE_DOMAIN)
if feature.save_fn is not None:
feature.save_fn(self.domain, new_state)
class FeatureFlagsView(BaseAdminProjectSettingsView):
urlname = 'domain_feature_flags'
page_title = ugettext_lazy("Feature Flags")
template_name = 'domain/admin/feature_flags.html'
@method_decorator(require_superuser)
def dispatch(self, request, *args, **kwargs):
return super(FeatureFlagsView, self).dispatch(request, *args, **kwargs)
@memoized
def enabled_flags(self):
def _sort_key(toggle_enabled_tuple):
return (not toggle_enabled_tuple[1], not toggle_enabled_tuple[2], toggle_enabled_tuple[0].label)
return sorted(
[(toggle, toggle.enabled(self.domain), toggle.enabled(self.request.couch_user.username))
for toggle in all_toggles()],
key=_sort_key,
)
@property
def page_context(self):
return {
'flags': self.enabled_flags(),
}
class TransferDomainView(BaseAdminProjectSettingsView):
urlname = 'transfer_domain_view'
page_title = ugettext_lazy("Transfer Project")
template_name = 'domain/admin/transfer_domain.html'
@property
@memoized
def active_transfer(self):
return TransferDomainRequest.get_active_transfer(self.domain,
self.request.user.username)
@property
@memoized
def transfer_domain_form(self):
return TransferDomainForm(self.domain,
self.request.user.username,
self.request.POST or None)
def get(self, request, *args, **kwargs):
if self.active_transfer:
self.template_name = 'domain/admin/transfer_domain_pending.html'
if request.GET.get('resend', None):
self.active_transfer.send_transfer_request()
messages.info(request,
_(u"Resent transfer request for project '{domain}'").format(domain=self.domain))
return super(TransferDomainView, self).get(request, *args, **kwargs)
def post(self, request, *args, **kwargs):
form = self.transfer_domain_form
if form.is_valid():
# Initiate domain transfer
transfer = form.save()
transfer.send_transfer_request()
return HttpResponseRedirect(self.page_url)
context = self.get_context_data(**kwargs)
return self.render_to_response(context)
@property
def page_context(self):
if self.active_transfer:
return {'transfer': self.active_transfer.as_dict()}
else:
return {'form': self.transfer_domain_form}
@method_decorator(domain_admin_required)
def dispatch(self, request, *args, **kwargs):
if not TRANSFER_DOMAIN.enabled(request.domain):
raise Http404()
return super(TransferDomainView, self).dispatch(request, *args, **kwargs)
class ActivateTransferDomainView(BasePageView):
urlname = 'activate_transfer_domain'
page_title = 'Activate Domain Transfer'
template_name = 'domain/activate_transfer_domain.html'
@property
@memoized
def active_transfer(self):
return TransferDomainRequest.get_by_guid(self.guid)
@property
def page_context(self):
if self.active_transfer:
return {'transfer': self.active_transfer.as_dict()}
else:
return {}
@property
def page_url(self):
return self.request.get_full_path()
def get(self, request, guid, *args, **kwargs):
self.guid = guid
if (self.active_transfer and
self.active_transfer.to_username != request.user.username and
not request.user.is_superuser):
return HttpResponseRedirect(reverse("no_permissions"))
return super(ActivateTransferDomainView, self).get(request, *args, **kwargs)
def post(self, request, guid, *args, **kwargs):
self.guid = guid
if not self.active_transfer:
raise Http404()
if self.active_transfer.to_username != request.user.username and not request.user.is_superuser:
return HttpResponseRedirect(reverse("no_permissions"))
self.active_transfer.transfer_domain(ip=get_ip(request))
messages.success(request, _(u"Successfully transferred ownership of project '{domain}'")
.format(domain=self.active_transfer.domain))
return HttpResponseRedirect(reverse('dashboard_default', args=[self.active_transfer.domain]))
@method_decorator(login_required)
def dispatch(self, *args, **kwargs):
return super(ActivateTransferDomainView, self).dispatch(*args, **kwargs)
class DeactivateTransferDomainView(View):
def post(self, request, guid, *args, **kwargs):
transfer = TransferDomainRequest.get_by_guid(guid)
if not transfer:
return HttpResponseRedirect(request.META.get('HTTP_REFERER', '/'))
if (transfer.to_username != request.user.username and
transfer.from_username != request.user.username and
not request.user.is_superuser):
return HttpResponseRedirect(reverse("no_permissions"))
transfer.active = False
transfer.save()
referer = request.META.get('HTTP_REFERER', '/')
# Do not want to send them back to the activate page
if referer.endswith(reverse('activate_transfer_domain', args=[guid])):
messages.info(request,
_(u"Declined ownership of project '{domain}'").format(domain=transfer.domain))
return HttpResponseRedirect('/')
else:
return HttpResponseRedirect(referer)
@method_decorator(login_required)
def dispatch(self, *args, **kwargs):
return super(DeactivateTransferDomainView, self).dispatch(*args, **kwargs)
from corehq.apps.smsbillables.forms import PublicSMSRateCalculatorForm
from corehq.apps.smsbillables.async_handlers import PublicSMSRatesAsyncHandler
class PublicSMSRatesView(BasePageView, AsyncHandlerMixin):
urlname = 'public_sms_rates_view'
page_title = ugettext_lazy("SMS Rate Calculator")
template_name = 'domain/admin/global_sms_rates.html'
async_handlers = [PublicSMSRatesAsyncHandler]
@property
def page_url(self):
return reverse(self.urlname)
@property
def page_context(self):
return {
'rate_calc_form': PublicSMSRateCalculatorForm()
}
def post(self, request, *args, **kwargs):
return self.async_response or self.get(request, *args, **kwargs)
class SMSRatesView(BaseAdminProjectSettingsView, AsyncHandlerMixin):
urlname = 'domain_sms_rates_view'
page_title = ugettext_lazy("SMS Rate Calculator")
template_name = 'domain/admin/sms_rates.html'
async_handlers = [
SMSRatesAsyncHandler,
SMSRatesSelect2AsyncHandler,
]
@property
@memoized
def rate_calc_form(self):
if self.request.method == 'POST':
return SMSRateCalculatorForm(self.domain, self.request.POST)
return SMSRateCalculatorForm(self.domain)
@property
def page_context(self):
return {
'rate_calc_form': self.rate_calc_form,
}
def post(self, request, *args, **kwargs):
if self.async_response is not None:
return self.async_response
return self.get(request, *args, **kwargs)
@require_POST
@domain_admin_required
def org_request(request, domain):
org_name = request.POST.get("org_name", None)
org = Organization.get_by_name(org_name)
if org:
org_request = OrgRequest.get_requests(org_name, domain=domain, user_id=request.couch_user.get_id)
if not org_request:
org_request = OrgRequest(organization=org_name, domain=domain,
requested_by=request.couch_user.get_id, requested_on=datetime.datetime.utcnow())
org_request.save()
_send_request_notification_email(request, org, domain)
messages.success(request,
"Your request was submitted. The admin of organization %s can now choose to manage the project %s" %
(org_name, domain))
else:
messages.error(request, "You've already submitted a request to this organization")
else:
messages.error(request, "The organization '%s' does not exist" % org_name)
return HttpResponseRedirect(reverse('domain_org_settings', args=[domain]))
def _send_request_notification_email(request, org, dom):
params = {"org": org, "dom": dom, "requestee": request.couch_user,
"url_base": get_site_domain()}
text_content = render_to_string(
"domain/email/org_request_notification.txt", params)
html_content = render_to_string(
"domain/email/org_request_notification.html", params)
recipients = [member.email for member in org.get_members()
if member.is_org_admin(org.name)]
subject = "New request to add a project to your organization! -- CommcareHQ"
try:
for recipient in recipients:
send_html_email_async.delay(subject, recipient, html_content,
text_content=text_content,
email_from=settings.DEFAULT_FROM_EMAIL)
except Exception:
logging.warning("Can't send notification email, "
"but the message was:\n%s" % text_content)
class BaseCardView(DomainAccountingSettings):
@property
def payment_method(self):
payment_method, __ = StripePaymentMethod.objects.get_or_create(
web_user=self.request.user.username,
method_type=PaymentMethodType.STRIPE,
)
return payment_method
def _generic_error(self):
error = ("Something went wrong while processing your request. "
"We're working quickly to resolve the issue. "
"Please try again in a few hours.")
return json_response({'error': error}, status_code=500)
def _stripe_error(self, e):
body = e.json_body
err = body['error']
return json_response({'error': err['message'],
'cards': self.payment_method.all_cards_serialized(self.account)},
status_code=502)
class CardView(BaseCardView):
"""View for dealing with a single Credit Card"""
url_name = "card_view"
def post(self, request, domain, card_token):
try:
card = self.payment_method.get_card(card_token)
if request.POST.get("is_autopay") == 'true':
self.payment_method.set_autopay(card, self.account)
elif request.POST.get("is_autopay") == 'false':
self.payment_method.unset_autopay(card, self.account)
except self.payment_method.STRIPE_GENERIC_ERROR as e:
return self._stripe_error(e)
except Exception:
return self._generic_error()
return json_response({'cards': self.payment_method.all_cards_serialized(self.account)})
def delete(self, request, domain, card_token):
try:
self.payment_method.remove_card(card_token)
except self.payment_method.STRIPE_GENERIC_ERROR as e:
return self._stripe_error(e)
return json_response({'cards': self.payment_method.all_cards_serialized(self.account)})
class CardsView(BaseCardView):
"""View for dealing Credit Cards"""
url_name = "cards_view"
def get(self, request, domain):
return json_response({'cards': self.payment_method.all_cards_serialized(self.account)})
def post(self, request, domain):
stripe_token = request.POST.get('token')
autopay = request.POST.get('autopay') == 'true'
try:
self.payment_method.create_card(stripe_token, self.account, autopay)
except self.payment_method.STRIPE_GENERIC_ERROR as e:
return self._stripe_error(e)
except Exception:
return self._generic_error()
return json_response({'cards': self.payment_method.all_cards_serialized(self.account)})
# --- src/anmi/T2/funcs_met_iters.py (repo: alexmascension/ANMI, license: BSD-3-Clause) ---
from sympy import simplify, zeros
from sympy import Matrix as mat
import numpy as np
from ..genericas import print_verbose, matriz_inversa
def criterio_radio_espectral(H, verbose=True):
eigs = [simplify(i) for i in list(H.eigenvals().keys())]
print_verbose("||Criterio de radio espectral||", verbose)
try:
print_verbose(
f"El mayor autovalor es {np.max(np.array(eigs, dtype=float))}. Si ese valor es < 1 entonces los métodos iterativos convergen.",
verbose,
)
except Exception:
print_verbose(
f"Los autovalores son {eigs}. Si el mayor autovalor es < 1, entonces el método converge.",
verbose,
)
def criterio_diagonal_dominante(A, verbose=True):
print_verbose(
"||Criterio de Diagonal Dominante||\n Si la matriz es dominante por filas, los métodos de Jacobi y Gauss-Seidel convergen.",
verbose,
)
A_abs = abs(A)
try:
np.array(A_abs, dtype=float)
for r in range(A.shape[0]):
diff = 2 * A_abs[r, r] - sum(A_abs[r, :])
if diff <= 0:
print_verbose(
f"La fila {r} NO es dominante por filas: diff = {diff}.", verbose
)
return
print_verbose("La matriz CUMPLE EL CRITERIO DIAGONAL DOMINANTE", verbose)
except Exception:
print_verbose(
"La matriz tiene complejos o simbolos. Hay que verificar el criterio a mano.",
verbose,
)
def criterio_simetrica_definida_positiva(A, verbose=True):
print_verbose(
"||Criterio de Sim Def Pos||\n Si la matriz es simétrica y definida positiva, el método de Gauss-Seidel es convergente.",
verbose,
)
if A != A.T:
print_verbose("La matriz NO es simétrica.", verbose)
return
det_A = A.det()
print_verbose(f"El determinante de A es {det_A}.", verbose)
try:
if float(det_A) > 0:
print_verbose(
"La matriz es DEFINIDA POSITIVA (el determinante es positivo).",
verbose,
)
print_verbose("La matriz CUMPLE EL CRITERIO SIM DEF POS", verbose)
else:
print_verbose(
"La matriz NO es DEFINIDA POSITIVA (el determinante no es positivo).",
verbose,
)
except Exception:
print_verbose(
"No podemos determinar la positividad porque hay símbolos o complejos.",
verbose,
)
def criterio_SOR(verbose):
print_verbose(
"||Criterio SOR||\n Si la matriz es simétrica y definida positiva y w in (0, 2) el método SOR es convergente.\nSi w no (0, 2) el método SOR no converge.",
verbose,
)
def criterio_m_matriz(A, verbose):
print_verbose(
"||Criterio M matriz||\n Si la A es M-matriz entonces las descomposiciones de Jacobi y Gauss-Seidel son convergentes.\nA^-1 >= 0\naij < 0 para todo i =/= j",
verbose,
)
A_inv = matriz_inversa(A)
try:
np.array(A, dtype=float)
if np.min(A_inv) >= 0:
print_verbose("A^-1 >= 0", verbose)
else:
print_verbose("A^-1 < 0. La matriz NO CUMPLE el criterio", verbose)
A_null_diag = A.copy()
for i in range(A.shape[0]):
A_null_diag[i, i] = 0
if np.max(A_null_diag) > 0:
print_verbose(
"La matriz tiene elementos no diagonales positivos. NO CUMPLE el criterio.",
verbose,
)
else:
print_verbose("Los elementos no diagonales son negativos.", verbose)
except Exception:
print_verbose(
"La matriz tiene complejos o símbolos, no podemos verificar el criterio.",
verbose,
)
def metodo_iterativo(
A, b=None, x0=None, metodo="jacobi", w=1.5, n_iter=10, verbose=True,
):
"""Apply the chosen iterative method.
Args:
A (matrix): Coefficient matrix.
b (vector, optional): Right-hand-side vector. Defaults to 1, 1, ..., 1.
x0 (vector, optional): Starting vector for the first iteration. Defaults to 1, 1, ..., 1.
metodo (str, optional): Solution method; one of "jacobi", "gs" or "sor".
w (float, optional): Relaxation weight for the SOR method. Defaults to 1.5.
n_iter (int, optional): Number of iterations of the method. Defaults to 10.
verbose (bool, optional): Print intermediate results. Defaults to True.
Returns:
dict: 'x': solution vector for Ax=b, 'diff': difference between Ax and b at each iteration.
"""
if b is None:
b = mat([[1] * A.shape[0]]).T
if x0 is None:
x0 = mat([[1] * A.shape[1]]).T
D, L, U = (
zeros(A.shape[0], A.shape[1]),
zeros(A.shape[0], A.shape[1]),
zeros(A.shape[0], A.shape[1]),
)
for r in range(A.shape[0]):
for c in range(A.shape[1]):
if r == c:
D[r, c] = A[r, c]
elif r < c:
U[r, c] = -A[r, c]
else:
L[r, c] = -A[r, c]
if metodo == "jacobi":
M = D
elif metodo == "gs":
M = D - L
elif metodo == "sor":
M = D / w - L
N = simplify(M - A)
# Apply the convergence criteria
criterio_radio_espectral(matriz_inversa(M) * N, verbose)
criterio_diagonal_dominante(A, verbose)
criterio_simetrica_definida_positiva(A, verbose)
criterio_SOR(verbose)
criterio_m_matriz(A, verbose)
diff = []
for _ in range(n_iter):  # run the chosen iterative method
x0 = (matriz_inversa(M)) * (N * x0 + b)
diff.append(np.sum(np.abs(A * x0 - b)))
return {"x": x0, "diff": diff}
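As a quick numerical sanity check of the splitting `metodo_iterativo` builds (M = D for Jacobi, N = M - A), here is a minimal self-contained sketch in plain NumPy; the matrix, vector and iteration count are illustrative choices, not taken from the module:

```python
import numpy as np

# Jacobi splitting A = M - N with M = D (the diagonal part), as in metodo_iterativo.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 5.0, 2.0],
              [0.0, 2.0, 6.0]])  # strictly diagonally dominant, so Jacobi converges
b = np.array([1.0, 1.0, 1.0])

M = np.diag(np.diag(A))   # Jacobi: M = D
N = M - A                 # N = L + U under the sign convention used above
x = np.ones(3)            # x0 = (1, 1, ..., 1), the module's default start
for _ in range(100):
    x = np.linalg.solve(M, N @ x + b)   # x_{k+1} = M^{-1} (N x_k + b)

residual = float(np.linalg.norm(A @ x - b))  # shrinks toward 0 as the iteration converges
```

Because the iteration matrix here has infinity-norm 0.6 < 1, the error contracts by at least that factor per step, which is exactly what `criterio_radio_espectral` and `criterio_diagonal_dominante` test for.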
# --- trainer/dataset.py (repo: vinay-swamy/gMVP, license: MIT) ---
import tensorflow as tf
import os
import pickle
import numpy as np
from constant_params import input_feature_dim, window_size
def build_dataset(input_tfrecord_files, batch_size):
drop_remainder = False
feature_description = {
'label': tf.io.FixedLenFeature([], tf.int64),
'ref_aa': tf.io.FixedLenFeature([], tf.int64),
'alt_aa': tf.io.FixedLenFeature([], tf.int64),
'feature': tf.io.FixedLenFeature([], tf.string),
'mask': tf.io.FixedLenFeature([], tf.string),
'var_id': tf.io.FixedLenFeature([], tf.string),
}
def _parser(example_proto):
parsed = tf.io.parse_single_example(example_proto, feature_description)
label, ref_aa, alt_aa = parsed['label'], parsed['ref_aa'], parsed[
'alt_aa']
var_id = parsed['var_id']
ref_aa, alt_aa, label = tf.cast(ref_aa, tf.int32), tf.cast(
alt_aa, tf.int32), tf.cast(label, tf.float32)
feature = tf.io.decode_raw(parsed['feature'], tf.float32)
feature = tf.reshape(feature, (window_size, input_feature_dim))
mask = tf.io.decode_raw(parsed['mask'], tf.float32)
mask = tf.reshape(mask, (window_size, ))
h = window_size // 2
# mask the position of interest
mask = tf.concat(
[mask[:h],
tf.cast([
1,
], dtype=tf.float32), mask[h + 1:]],
axis=-1)
'''
pos_encoding = 1.0 + tf.cast(
tf.math.abs(window_size // 2 - tf.range(window_size)),
dtype=tf.float32)
#pos_encoding = tf.math.log() / tf.math.log(2.0)
feature = tf.concat([feature, pos_encoding[:, tf.newaxis]], axis=-1)
'''
return var_id, ref_aa, alt_aa, feature, label, mask
dataset = tf.data.TFRecordDataset(input_tfrecord_files)
options = tf.data.Options()
options.experimental_threading.max_intra_op_parallelism = 1
dataset = dataset.with_options(options)
dataset = dataset.shuffle(2048)
dataset = dataset.map(_parser, num_parallel_calls=8)
dataset = dataset.batch(batch_size, drop_remainder=drop_remainder)
#dataset = dataset.prefetch(4)
return dataset
def build_all_possible_missenses_dataset(tr_list, feature_dir, batch_size):
amino_acid_order = 'ACDEFGHIKLMNPQRSTVWY*'
def _gen_data():
for transcript_id in tr_list:
feature_path = f'{feature_dir}/{transcript_id}.pickle'
if not os.path.exists(feature_path):
continue
print(feature_path, flush=True)
with open(feature_path, 'rb') as fr:
feature = pickle.load(fr)
L = feature.shape[0]
w = window_size // 2
for aa_pos in range(L):
ref_aa = int(feature[aa_pos, 0])
start = max(aa_pos - w, 0)
end = min(L, aa_pos + 1 + w)
var_start = start - (aa_pos - w)
var_end = var_start + (end - start)
var_feature = np.zeros([w * 2 + 1, feature.shape[1]])
var_feature[var_start:var_end] = feature[start:end]
mask = np.ones((w * 2 + 1, ), dtype=np.float32)
mask[var_start:var_end] = 0.0
mask[w] = 1.0
for alt_aa in range(20):
var_id = f'{transcript_id}_{str(aa_pos+1)}_{amino_acid_order[ref_aa]}_{amino_acid_order[alt_aa]}'.encode(
'utf-8')
yield var_id, np.int32(ref_aa), np.int32(
alt_aa), np.float32(var_feature), np.float32(mask)
dataset = tf.data.Dataset.from_generator(
_gen_data, (tf.string, tf.int32, tf.int32, tf.float32, tf.float32),
(tf.TensorShape(()), tf.TensorShape(()), tf.TensorShape(
()), tf.TensorShape((window_size, input_feature_dim)),
tf.TensorShape((window_size, ))))
options = tf.data.Options()
options.experimental_threading.max_intra_op_parallelism = 1
dataset = dataset.with_options(options)
#dataset = dataset.map(_parser, num_parallel_calls=8)
dataset = dataset.batch(batch_size)
dataset = dataset.prefetch(4)
return dataset
def build_test_dataset(input_tfrecord_files, batch_size):
drop_remainder = False
feature_description = {
'ref_aa': tf.io.FixedLenFeature([], tf.int64),
'alt_aa': tf.io.FixedLenFeature([], tf.int64),
'feature': tf.io.FixedLenFeature([], tf.string),
'mask': tf.io.FixedLenFeature([], tf.string),
'var_id': tf.io.FixedLenFeature([], tf.string),
}
def _parser(example_proto):
parsed = tf.io.parse_single_example(example_proto, feature_description)
ref_aa, alt_aa = parsed['ref_aa'], parsed['alt_aa']
var_id = parsed['var_id']
ref_aa, alt_aa = tf.cast(ref_aa, tf.int32), tf.cast(alt_aa, tf.int32)
feature = tf.io.decode_raw(parsed['feature'], tf.float32)
feature = tf.reshape(feature, (window_size, input_feature_dim))
mask = tf.io.decode_raw(parsed['mask'], tf.float32)
mask = tf.reshape(mask, (window_size, ))
h = window_size // 2
# mask the position of interest
mask = tf.concat(
[mask[:h],
tf.cast([
1,
], dtype=tf.float32), mask[h + 1:]],
axis=-1)
return var_id, ref_aa, alt_aa, feature, mask
dataset = tf.data.TFRecordDataset(input_tfrecord_files)
options = tf.data.Options()
options.experimental_threading.max_intra_op_parallelism = 1
dataset = dataset.with_options(options)
dataset = dataset.map(_parser, num_parallel_calls=8)
dataset = dataset.batch(batch_size, drop_remainder=drop_remainder)
#dataset = dataset.prefetch(4)
return dataset
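The fixed-window extraction inside `build_all_possible_missenses_dataset` (zero-pad at the sequence edges, flag padding and the centre position in the mask) can be exercised on its own with plain NumPy; this is an illustrative sketch with made-up toy data, not part of the module:

```python
import numpy as np

def centered_window(feature, aa_pos, w):
    """Return a (2w+1)-row window centred on aa_pos, zero-padded at the
    edges, plus a mask with 1.0 on padding and on the centre position."""
    L = feature.shape[0]
    start = max(aa_pos - w, 0)
    end = min(L, aa_pos + 1 + w)
    var_start = start - (aa_pos - w)      # number of padded rows on the left
    var_end = var_start + (end - start)
    window = np.zeros((2 * w + 1, feature.shape[1]), dtype=np.float32)
    window[var_start:var_end] = feature[start:end]
    mask = np.ones(2 * w + 1, dtype=np.float32)
    mask[var_start:var_end] = 0.0
    mask[w] = 1.0                         # the variant position itself stays masked
    return window, mask

feature = np.arange(10, dtype=np.float32).reshape(5, 2)  # toy sequence of length 5
win, mask = centered_window(feature, aa_pos=0, w=2)
# positions -2 and -1 fall off the left edge, so the first two rows are zero padding
```

This mirrors the generator's indexing arithmetic exactly, so edge cases (a window hanging off either end of the protein) can be unit-tested without building the TFRecord pipeline.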
# --- setup.py (repo: flother/pdf-search, license: MIT) ---
from setuptools import setup
setup(
name='espdf',
version='0.1.0-dev',
url='https://github.com/flother/pdf-search',
py_modules=(
'espdf',
),
install_requires=(
'certifi',
'elasticsearch-dsl',
),
entry_points={
'console_scripts': (
'espdf=espdf:cli',
),
},
)
# --- pywallet/network.py (repo: martexcoin/pywallet, license: MIT) ---
class BitcoinGoldMainNet(object):
"""Bitcoin Gold MainNet version bytes. """
NAME = "Bitcoin Gold Main Net"
COIN = "BTG"
SCRIPT_ADDRESS = 0x17 # int(0x17) = 23
PUBKEY_ADDRESS = 0x26 # int(0x26) = 38 # Used to create payment addresses
SECRET_KEY = 0x80 # int(0x80) = 128 # Used for WIF format
EXT_PUBLIC_KEY = 0x0488b21E # Used to serialize public BIP32 addresses
EXT_SECRET_KEY = 0x0488ADE4 # Used to serialize private BIP32 addresses
BIP32_PATH = "m/44'/0'/0'/"
class BitcoinCashMainNet(object):
"""Bitcoin Cash MainNet version bytes."""
NAME = "Bitcoin Cash Main Net"
COIN = "BCH"
SCRIPT_ADDRESS = 0x28 # int(0x28) = 40
PUBKEY_ADDRESS = 0x1C # int(0x1C) = 28 # Used to create payment addresses
SECRET_KEY = 0x80 # int(0x80) = 128 # Used for WIF format
EXT_PUBLIC_KEY = 0x0488b21E # Used to serialize public BIP32 addresses
EXT_SECRET_KEY = 0x0488ADE4 # Used to serialize private BIP32 addresses
BIP32_PATH = "m/44'/145'/0'/"
class DashMainNet(object):
"""Dash MainNet version bytes."""
NAME = "Dash Main Net"
COIN = "DASH"
SCRIPT_ADDRESS = 0x10 # int(0x10) = 16
PUBKEY_ADDRESS = 0x4C # int(0x4C) = 76 # Used to create payment addresses
SECRET_KEY = 0xCC # int(0xCC) = 204 # Used for WIF format
EXT_PUBLIC_KEY = 0X0488B21E # Used to serialize public BIP32 addresses
EXT_SECRET_KEY = 0X0488ADE4 # Used to serialize private BIP32 addresses
BIP32_PATH = "m/44'/5'/0'/"
class DashTestNet(object):
"""Dash TestNet version bytes."""
NAME = "Dash Test Net"
COIN = "DASH"
SCRIPT_ADDRESS = 0x13 # int(0x13) = 19
PUBKEY_ADDRESS = 0x8C # int(0x8C) = 140 # Used to create payment addresses
SECRET_KEY = 0xEF # int(0xEF) = 239 # Used for WIF format
EXT_PUBLIC_KEY = 0x043587CF # Used to serialize public BIP32 addresses
EXT_SECRET_KEY = 0x04358394 # Used to serialize private BIP32 addresses
BIP32_PATH = "m/44'/1'/0'/"
class MarteXMainNet(object):
"""MarteX MainNet version bytes."""
NAME = "MarteX Main Net"
COIN = "MXT"
    SCRIPT_ADDRESS = 0x05 # int(0x05) = 5
PUBKEY_ADDRESS = 0x32 # int(0x32) = 50 # Used to create payment addresses
SECRET_KEY = 0xB2 # int(0xB2) = 178 # Used for WIF format
    EXT_PUBLIC_KEY = 0x0488B21E # Used to serialize public BIP32 addresses
    EXT_SECRET_KEY = 0x0488ADE4 # Used to serialize private BIP32 addresses
BIP32_PATH = "m/44'/180'/0'/"
class MarteXTestNet(object):
"""MarteX TestNet version bytes."""
NAME = "MarteX Test Net"
COIN = "MXT"
SCRIPT_ADDRESS = 0xC4 # int(0xC4) = 196
    PUBKEY_ADDRESS = 0x6F # int(0x6F) = 111 # Used to create payment addresses
    SECRET_KEY = 0xEF # int(0xEF) = 239 # Used for WIF format
EXT_PUBLIC_KEY = 0x043587CF # Used to serialize public BIP32 addresses
EXT_SECRET_KEY = 0x04358394 # Used to serialize private BIP32 addresses
BIP32_PATH = "m/44'/1'/0'/"
class OmniMainNet(object):
"""Bitcoin MainNet version bytes.
From https://github.com/OmniLayer/omnicore/blob/develop/src/chainparams.cpp
"""
NAME = "Omni Main Net"
COIN = "USDT"
    SCRIPT_ADDRESS = 0x05 # int(0x05) = 5
    PUBKEY_ADDRESS = 0x00 # int(0x00) = 0 # Used to create payment addresses
SECRET_KEY = 0x80 # int(0x80) = 128 # Used for WIF format
EXT_PUBLIC_KEY = 0x0488B21E # Used to serialize public BIP32 addresses
EXT_SECRET_KEY = 0x0488ADE4 # Used to serialize private BIP32 addresses
BIP32_PATH = "m/44'/0'/0'/"
class OmniTestNet(object):
"""Bitcoin MainNet version bytes.
From https://github.com/OmniLayer/omnicore/blob/develop/src/chainparams.cpp
"""
NAME = "Omni Test Net"
COIN = "USDT"
    SCRIPT_ADDRESS = 0xc4 # int(0xc4) = 196
    PUBKEY_ADDRESS = 0x6f # int(0x6f) = 111 # Used to create payment addresses
SECRET_KEY = 0xef # int(0xef) = 239 # Used for WIF format
EXT_PUBLIC_KEY = 0x043587CF # Used to serialize public BIP32 addresses
EXT_SECRET_KEY = 0x04358394 # Used to serialize private BIP32 addresses
BIP32_PATH = "m/44'/0'/0'/"
class BitcoinMainNet(object):
"""Bitcoin MainNet version bytes.
From https://github.com/bitcoin/bitcoin/blob/v0.9.0rc1/src/chainparams.cpp
"""
NAME = "Bitcoin Main Net"
COIN = "BTC"
SCRIPT_ADDRESS = 0x05 # int(0x05) = 5
PUBKEY_ADDRESS = 0x00 # int(0x00) = 0 # Used to create payment addresses
SECRET_KEY = 0x80 # int(0x80) = 128 # Used for WIF format
EXT_PUBLIC_KEY = 0x0488B21E # Used to serialize public BIP32 addresses
EXT_SECRET_KEY = 0x0488ADE4 # Used to serialize private BIP32 addresses
BIP32_PATH = "m/44'/0'/0'/"
class FeathercoinMainNet(object):
"""Feathercoin MainNet version bytes.
From https://github.com/FeatherCoin/Feathercoin/blob/master-0.13/src/chainparams.cpp
"""
NAME = "Feathercoin Main Net"
COIN = "FTC"
SCRIPT_ADDRESS = 0x05 # int(0x05) = 5
PUBKEY_ADDRESS = 0x0E # int(0x0E) = 14 # Used to create payment addresses
SECRET_KEY = 0x8E # int(0x8E) = 142 # Used for WIF format
EXT_PUBLIC_KEY = 0x0488BC26 # Used to serialize public BIP32 addresses
EXT_SECRET_KEY = 0x0488DAEE # Used to serialize private BIP32 addresses
BIP32_PATH = "m/44'/4'/0'/"
class BitcoinTestNet(object):
"""Bitcoin TestNet version bytes.
From https://github.com/bitcoin/bitcoin/blob/v0.9.0rc1/src/chainparams.cpp
"""
NAME = "Bitcoin Test Net"
COIN = "BTC"
SCRIPT_ADDRESS = 0xc4 # int(0xc4) = 196
PUBKEY_ADDRESS = 0x6f # int(0x6f) = 111
SECRET_KEY = 0xEF # int(0xef) = 239
EXT_PUBLIC_KEY = 0x043587CF
EXT_SECRET_KEY = 0x04358394
BIP32_PATH = "m/44'/1'/0'/"
class LitecoinMainNet(object):
"""Litecoin MainNet version bytes
Primary version bytes from:
https://github.com/litecoin-project/litecoin/blob/master-0.8/src/base58.h
Unofficial extended version bytes from
https://bitcointalk.org/index.php?topic=453395.0
"""
NAME = "Litecoin Main Net"
COIN = "LTC"
SCRIPT_ADDRESS = 0x05 # int(0x05) = 5
PUBKEY_ADDRESS = 0x30 # int(0x30) = 48
SECRET_KEY = PUBKEY_ADDRESS + 128 # = int(0xb0) = 176
# Unofficial extended version bytes taken from
# https://bitcointalk.org/index.php?topic=453395.0
# EXT_PUBLIC_KEY = 0x019da462
# EXT_SECRET_KEY = 0x019d9cfe
# same as Bitcoin's
# https://github.com/ranaroussi/pywallet/issues/6
EXT_PUBLIC_KEY = 0x0488B21E
EXT_SECRET_KEY = 0x0488ADE4
BIP32_PATH = "m/44'/2'/0'/"
class LitecoinTestNet(object):
"""Litecoin TestNet version bytes
Primary version bytes from:
https://github.com/litecoin-project/litecoin/blob/master-0.8/src/base58.h
Unofficial extended version bytes from
https://bitcointalk.org/index.php?topic=453395.0
"""
NAME = "Litecoin Test Net"
COIN = "LTC"
SCRIPT_ADDRESS = 0xc4 # int(0xc4) = 196
PUBKEY_ADDRESS = 0x6f # int(0x6f) = 111
SECRET_KEY = PUBKEY_ADDRESS + 128 # = int(0xef) = 239
# Unofficial extended version bytes taken from
# https://bitcointalk.org/index.php?topic=453395.0
# EXT_PUBLIC_KEY = 0x0436f6e1
# EXT_SECRET_KEY = 0x0436ef7d
# same as Bitcoin's
# https://github.com/ranaroussi/pywallet/issues/6
EXT_PUBLIC_KEY = 0x043587CF
EXT_SECRET_KEY = 0x04358394
BIP32_PATH = "m/44'/1'/0'/"
class DogecoinMainNet(object):
"""Dogecoin MainNet version bytes
Primary version bytes from:
https://github.com/dogecoin/dogecoin/blob/1.5.2/src/base58.h
Unofficial extended version bytes from
https://bitcointalk.org/index.php?topic=409731
"""
NAME = "Dogecoin Main Net"
COIN = "DOGE"
SCRIPT_ADDRESS = 0x16 # int(0x16) = 22
PUBKEY_ADDRESS = 0x1e # int(0x1e) = 30
SECRET_KEY = PUBKEY_ADDRESS + 128 # int(0x9e) = 158
# Unofficial extended version bytes taken from
# https://bitcointalk.org/index.php?topic=409731
EXT_PUBLIC_KEY = 0x02facafd
EXT_SECRET_KEY = 0x02fac398
BIP32_PATH = "m/44'/3'/0'/"
class DogecoinTestNet(object):
"""Dogecoin TestNet version bytes
Primary version bytes from:
https://github.com/dogecoin/dogecoin/blob/1.5.2/src/base58.h
Unofficial extended version bytes from
https://bitcointalk.org/index.php?topic=409731
"""
NAME = "Dogecoin Test Net"
COIN = "DOGE"
SCRIPT_ADDRESS = 0xc4 # int(0xc4) = 196
PUBKEY_ADDRESS = 0x71 # int(0x71) = 113
SECRET_KEY = PUBKEY_ADDRESS + 128 # int(0xf1) = 241
# Unofficial extended version bytes taken from
# https://bitcointalk.org/index.php?topic=409731
EXT_PUBLIC_KEY = 0x0432a9a8
EXT_SECRET_KEY = 0x0432a243
BIP32_PATH = "m/44'/1'/0'/"
class BlockCypherTestNet(object):
"""BlockCypher TestNet version bytes.
From http://dev.blockcypher.com/#testing
"""
NAME = "BlockCypher Test Net"
COIN = "BlockCypher"
SCRIPT_ADDRESS = 0x1f # int(0x1f) = 31
PUBKEY_ADDRESS = 0x1b # int(0x1b) = 27 # Used to create payment addresses
SECRET_KEY = 0x49 # int(0x49) = 73 # Used for WIF format
EXT_PUBLIC_KEY = 0x2d413ff # Used to serialize public BIP32 addresses
EXT_SECRET_KEY = 0x2d40fc3 # Used to serialize private BIP32 addresses
BIP32_PATH = "m/44'/1'/0'/"
class QtumMainNet(object):
"""Qtum MainNet version bytes
Primary version bytes from:
https://github.com/qtumproject/qtum/blob/master/src/chainparams.cpp
"""
NAME = "Qtum Main Net"
COIN = "QTUM"
SCRIPT_ADDRESS = 0x32 # int(0x32) = 50
PUBKEY_ADDRESS = 0x3A # int(0x3A) = 58 # Used to create payment addresses
SECRET_KEY = 0x80 # int(0x80) = 128 # Used for WIF format
EXT_PUBLIC_KEY = 0x0488B21E # Used to serialize public BIP32 addresses
EXT_SECRET_KEY = 0x0488ADE4 # Used to serialize private BIP32 addresses
BIP32_PATH = "m/44'/88'/0'/"
class QtumTestNet(object):
"""Qtum TestNet version bytes
Primary version bytes from:
https://github.com/qtumproject/qtum/blob/master/src/chainparams.cpp
"""
NAME = "Qtum Test Net"
COIN = "QTUM"
SCRIPT_ADDRESS = 0x6E # int(0x6e) = 110
PUBKEY_ADDRESS = 0x78 # int(0x78) = 120
SECRET_KEY = 0xEF # int(0xef) = 239
EXT_PUBLIC_KEY = 0x043587CF
EXT_SECRET_KEY = 0x04358394
BIP32_PATH = "m/44'/88'/0'/"
| 37.189964 | 88 | 0.673092 | 1,377 | 10,376 | 4.958606 | 0.147422 | 0.050088 | 0.052724 | 0.031634 | 0.762009 | 0.698448 | 0.682484 | 0.638987 | 0.610574 | 0.593878 | 0 | 0.112875 | 0.223015 | 10,376 | 278 | 89 | 37.323741 | 0.734061 | 0.487278 | 0 | 0.462963 | 0 | 0 | 0.117623 | 0 | 0 | 0 | 0.112975 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
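Several comments in the network classes above note that `SECRET_KEY` is "Used for WIF format". As a concrete illustration — a stdlib-only sketch, not pywallet's own implementation, with the base58 helper written inline — the network's `SECRET_KEY` byte is simply the first byte of the checksummed payload:

```python
import hashlib

B58_ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def b58encode(data: bytes) -> str:
    """Base58-encode bytes, preserving each leading zero byte as '1'."""
    n = int.from_bytes(data, "big")
    out = ""
    while n:
        n, rem = divmod(n, 58)
        out = B58_ALPHABET[rem] + out
    pad = len(data) - len(data.lstrip(b"\x00"))
    return "1" * pad + out

def to_wif(secret_key: int, privkey: bytes, compressed: bool = False) -> str:
    """Prefix a 32-byte private key with a network's SECRET_KEY byte,
    append a 4-byte double-SHA256 checksum, and base58-encode the whole."""
    payload = bytes([secret_key]) + privkey + (b"\x01" if compressed else b"")
    checksum = hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4]
    return b58encode(payload + checksum)

# With BitcoinMainNet.SECRET_KEY == 0x80, uncompressed WIF strings start with '5',
# compressed ones with 'K' or 'L' -- the version byte determines the leading char.
wif = to_wif(0x80, b"\x11" * 32)
```

The same `to_wif` works unchanged for the altcoin classes above by passing their `SECRET_KEY` values (e.g. `0xB2` for MarteX mainnet).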
e820480eb7c2b0f9e9f4f3e9695e4d3110de174e | 20,078 | py | Python | yacos/algorithm/metaheuristics.py | ComputerSystemsLaboratory/YaCoS | abd5d3c6e227e5c7a563493f7855ebf58ba3de05 | [
"Apache-2.0"
] | 8 | 2022-02-03T16:41:01.000Z | 2022-02-09T11:29:20.000Z | yacos/algorithm/metaheuristics.py | ComputerSystemsLaboratory/YaCoS | abd5d3c6e227e5c7a563493f7855ebf58ba3de05 | [
"Apache-2.0"
] | null | null | null | yacos/algorithm/metaheuristics.py | ComputerSystemsLaboratory/YaCoS | abd5d3c6e227e5c7a563493f7855ebf58ba3de05 | [
"Apache-2.0"
] | null | null | null | """
Copyright 2021 Anderson Faustino da Silva.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
import os
from dataclasses import dataclass
import pygmo as pg
from yacos.essential import Sequence
from yacos.essential import IO
from yacos.essential import Engine
class Pygmo:
"""A Pygmo's strategy."""
__version__ = '1.0.0'
__flags = None
# {key: {'goal': float,
# 'seq': list}}
__results = None
# SGA
# {gen = {'fevals': int,
# 'best': float,
# 'improvement': float}}
#
# PSO
# {gen: {'fevals': int,
# 'gbest': float,
# 'meanvel': float,
# 'meanlbest': float,
# 'avgdist': float}
__log = None
class Problem:
"""Pygmo's problem."""
def __init__(self,
first_key,
last_key,
passes_dict,
dimension,
goal,
compiler,
benchmark_directory,
working_set,
times,
tool,
verify_output):
"""Construct a Pygmo problem.
Parameters
----------
first_key : int
The index of the first pass.
last_key : int
The index of the last pass.
passes_dict : dict
The dictionary with the available passes.
dimension : int
The length of a sequence.
goal : str
compiler : str
benchmark_directory : str
working_set : int
times: int
tool: str
Execution tool
verify_output: bool
The goal is valid only if the execution status is OK.
"""
self.first_key = first_key
self.last_key = last_key
self.passes_dict = passes_dict
self.dimension = dimension
self.goal = goal
self.compiler = compiler
self.benchmark_directory = benchmark_directory
self.working_set = working_set
self.times = times
self.tool = tool
self.verify_output = verify_output
def __deepcopy__(self,
*args,
**kwargs):
"""Deeep copy."""
return self
def fitness(self,
sequence):
"""Calculate and return the fitness."""
sequence = Sequence.fix_index(list(sequence))
sequence = Sequence.sanitize(sequence)
sequence = Sequence.index_pass_to_list(sequence,
self.passes_dict)
goal_value = Engine.evaluate(self.goal,
Sequence.name_pass_to_string(
sequence
),
self.compiler,
self.benchmark_directory,
self.working_set,
self.times,
self.tool,
self.verify_output)
return [goal_value]
def get_nix(self):
"""Integer dimension of the problem."""
return self.dimension
def get_bounds(self):
"""Box-bounds."""
return ([self.first_key] * self.dimension,
[self.last_key] * self.dimension)
def get_name(self):
"""Problem name."""
return 'Optimization Selection'
def get_extra_info(self):
"""Info."""
return '\tDimensions: ' + str(self.dimension)
@dataclass
class PygmoFlags:
"""Pygmo flags.
Parameters
----------
first_key : int
The index of the first pass.
last_key : int
The index of the last pass.
passes_dict : dict
The dictionary with the available passes.
dimension : int
The length of a sequence.
population : int
goals : dict
compiler : str
benchmarks_directory : str
working_set : int
The dataset to execute the benchmark.
times: int
Execution times
tool : str
Execution tool
verify_output: bool
The goal is valid only if the execution status is OK.
"""
first_key: int
last_key: int
passes_dict: dict
dimension: int
population: int
goals: dict
compiler: str
benchmarks_directory: str
working_set: int
times: int
tool: str
verify_output: bool
def __init__(self,
dimension,
population,
passes_filename,
goals,
compiler,
benchmarks_directory,
working_set,
times,
tool,
verify_output):
"""Initialize the arguments.
Parameters
----------
dimension : int
The length of a sequence.
population : int
passes_filename : str
The file that describes the passes to use.
goals : dict
compiler : str
benchmarks_directory : str
working_set : int
The dataset to execute the benchmark.
times: int
Execution times
tool: str
Execution tool
verify_output: bool
The goal is valid only if the execution status is OK.
"""
first_key, last_key, passes_dict = IO.load_passes(passes_filename)
# When the goal is obtained during compile time
# and the working set is not defined during compilation,
# we do not need the working set.
self.__flags = self.PygmoFlags(first_key,
last_key,
passes_dict,
dimension,
population,
goals,
compiler,
benchmarks_directory,
working_set,
times,
tool,
verify_output)
@property
def results(self):
"""Getter."""
return self.__results
@property
def log(self):
"""Getter."""
return self.__log
def exec(self, algorithm, benchmark):
"""Execute the algorithm.
        Parameters
        ----------
algorithm : Pygmo algorithm
benchmark : str
"""
# Step 1: Algorithm
algorithm = pg.algorithm(algorithm)
# algorithm.set_verbosity(1)
# Step 2: Instantiate a pygmo problem
index = benchmark.find('.')
        # Benchmark directory
bench_dir = os.path.join(self.__flags.benchmarks_directory,
benchmark[:index],
benchmark[index+1:])
problem = self.Problem(self.__flags.first_key,
self.__flags.last_key,
self.__flags.passes_dict,
self.__flags.dimension,
self.__flags.goals,
self.__flags.compiler,
bench_dir,
self.__flags.working_set,
self.__flags.times,
self.__flags.tool,
self.__flags.verify_output)
problem = pg.problem(problem)
# Step 3: The initial population
population = pg.population(problem,
self.__flags.population)
# Step 4: Evolve the population
population = algorithm.evolve(population)
# Step 5: Get the results
sga_sequence = population.get_x().tolist()
sga_fitness = population.get_f().tolist()
self.__results = {}
for index in range(self.__flags.population):
sequence = Sequence.index_pass_to_list(sga_sequence[index],
self.__flags.passes_dict)
goal_value = sga_fitness[index][0]
if goal_value == float('inf'):
continue
self.__results[index] = {'seq': sequence,
'goal': goal_value}
# Step 6: Get the log
self.__log = {}
if algorithm.get_name() == 'SGA: Genetic Algorithm':
uda = algorithm.extract(pg.sga)
log = uda.get_log()
for (gen, fevals, best, improvement) in log:
self.__log[gen] = {'fevals': fevals,
'best': best,
'improvement': improvement}
elif algorithm.get_name() == 'PSO: Particle Swarm Optimization':
uda = algorithm.extract(pg.pso)
log = uda.get_log()
for (gen, fevals, gbest, meanvel, meanlbest, avgdist) in log:
self.__log[gen] = {'fevals': fevals,
'gbest': gbest,
'meanvel': meanvel,
'meanlbest': meanlbest,
'avgdist': avgdist}
class SGA(Pygmo):
"""Simple Genetic Algorithm."""
__version__ = '1.0.0'
__flags = None
@dataclass
class Flags:
"""Pygmo flags.
Parameters
----------
generations : int
cr : float
Crossover probability
m : float
Mutation probability
param_m : float
Distribution index (polynomial mutation),
gaussian width (gaussian mutation) or
inactive (uniform mutation)
param_s : float
The number of best individuals to use in “truncated”
selection or the size of the tournament in
tournament selection.
crossover : str
exponential, binomial or single
mutation : str
gaussian, polynomial or uniform
selection : str
tournament or truncated
seed : int
"""
generations: int
cr: float
m: float
param_m: float
param_s: float
crossover: str
mutation: str
selection: str
seed: int
def __init__(self,
generations,
population,
cr,
m,
param_m,
param_s,
crossover,
mutation,
selection,
seed,
dimension,
passes_filename,
goals,
compiler,
benchmarks_directory,
working_set,
times,
tool,
verify_output):
"""Initialize a SGA object.
Parameters
----------
generations : int
population : int
cr : float
Crossover probability
m : float
Mutation probability
param_m : float
Distribution index (polynomial mutation),
gaussian width (gaussian mutation) or
inactive (uniform mutation)
param_s : float
The number of best individuals to use in “truncated”
selection or the size of the tournament in
tournament selection.
crossover : str
exponential, binomial or single
mutation : str
gaussian, polynomial or uniform
selection : str
tournament or truncated
seed : int
dimension : int
The length of a sequence.
passes_filename : str
The file that describes the passes to use.
goals : dict
compiler : str
benchmarks_directory : str
working_set : int
The dataset to execute the benchmark.
times : int
Execution times
tool : str
Execution tool
verify_output: bool
The goal is valid only if the execution status is OK.
"""
self.__flags = self.Flags(generations,
cr,
m,
param_m,
param_s,
crossover,
mutation,
selection,
seed)
super().__init__(dimension,
population,
passes_filename,
goals,
compiler,
benchmarks_directory,
working_set,
times,
tool,
verify_output)
def run(self, benchmark):
"""Execute the algorithm.
        Parameters
        ----------
benchmark: str
"""
if self.__flags.seed is None:
algorithm = pg.sga(gen=self.__flags.generations,
cr=self.__flags.cr,
m=self.__flags.m,
param_m=self.__flags.param_m,
param_s=self.__flags.param_s,
crossover=self.__flags.crossover,
mutation=self.__flags.mutation,
selection=self.__flags.selection)
else:
algorithm = pg.sga(gen=self.__flags.generations,
cr=self.__flags.cr,
m=self.__flags.m,
param_m=self.__flags.param_m,
param_s=self.__flags.param_s,
crossover=self.__flags.crossover,
mutation=self.__flags.mutation,
selection=self.__flags.selection,
seed=self.__flags.seed)
# Execute
super().exec(algorithm, benchmark)
class PSO(Pygmo):
"""Particle Swarm Optimization."""
__version__ = '1.0.0'
__flags = None
@dataclass
class Flags:
"""PSO flags.
Parameters
----------
generations : int
omega : float
Inertia weight (or constriction factor)
eta1 : float
Social component
eta2 : float
Cognitive component
max_vel : float
Maximum allowed particle velocities
(normalized with respect to the bounds width)
variant : int
Algorithmic variant
neighb_type : int
Swarm topology (defining each particle’s neighbours)
neighb_param : int
Topology parameter (defines how many neighbours to consider)
memory : bool
When true the velocities are not reset between successive
calls to the evolve method
seed : int
Seed used by the internal random number generator.
"""
generations: int
omega: float
eta1: float
eta2: float
max_vel: float
variant: int
neighb_type: int
neighb_param: int
memory: bool
seed: int
def __init__(self,
generations,
population,
omega,
eta1,
eta2,
max_vel,
variant,
neighb_type,
neighb_param,
memory,
seed,
dimension,
passes_filename,
goals,
compiler,
benchmarks_directory,
working_set,
times,
tool,
verify_output):
"""Initialize a PSO object.
Parameters
----------
generations : int
population : int
omega : float
Inertia weight (or constriction factor)
eta1 : float
Social component
eta2 : float
Cognitive component
max_vel : float
Maximum allowed particle velocities
(normalized with respect to the bounds width)
variant : int
Algorithmic variant
neighb_type : int
Swarm topology (defining each particle’s neighbours)
neighb_param : int
Topology parameter (defines how many neighbours to consider)
memory : bool
When true the velocities are not reset between successive
calls to the evolve method
seed : int
Seed used by the internal random number generator.
"""
self.__flags = self.Flags(generations,
omega,
eta1,
eta2,
max_vel,
variant,
neighb_type,
neighb_param,
memory,
seed)
super().__init__(dimension,
population,
passes_filename,
goals,
compiler,
benchmarks_directory,
working_set,
times,
tool,
verify_output)
def run(self, benchmark):
"""Execute the algorithm.
        Parameters
        ----------
benchmark : str
"""
if self.__flags.seed:
algorithm = pg.pso(self.__flags.generations,
self.__flags.omega,
self.__flags.eta1,
self.__flags.eta2,
self.__flags.max_vel,
self.__flags.variant,
self.__flags.neighb_type,
self.__flags.neighb_param,
self.__flags.memory,
self.__flags.seed)
else:
algorithm = pg.pso(self.__flags.generations,
self.__flags.omega,
self.__flags.eta1,
self.__flags.eta2,
self.__flags.max_vel,
self.__flags.variant,
self.__flags.neighb_type,
self.__flags.neighb_param,
self.__flags.memory)
# Execute
super().exec(algorithm, benchmark)
| 28.60114 | 76 | 0.456121 | 1,688 | 20,078 | 5.227488 | 0.165284 | 0.058137 | 0.019946 | 0.019039 | 0.603128 | 0.577856 | 0.557684 | 0.521419 | 0.521419 | 0.502267 | 0 | 0.003841 | 0.481373 | 20,078 | 701 | 77 | 28.64194 | 0.843561 | 0.295896 | 0 | 0.540193 | 0 | 0 | 0.013773 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.048232 | false | 0.051447 | 0.019293 | 0 | 0.141479 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
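The nested `Problem` class above follows pygmo's user-defined-problem protocol: `fitness` returns a one-element list, and `get_bounds` returns parallel lower/upper vectors that `pg.problem` turns into an integer box. A stdlib-only sketch of an optimizer driving that same interface — plain random search standing in for `pg.sga`, and a toy goal standing in for `Engine.evaluate`, so the names here are illustrative only:

```python
import random

class ToyProblem:
    """Minimal stand-in mirroring the pygmo user-defined-problem protocol."""

    def __init__(self, first_key, last_key, dimension):
        self.first_key = first_key
        self.last_key = last_key
        self.dimension = dimension

    def fitness(self, sequence):
        # Toy goal instead of Engine.evaluate: minimize the sum of pass indices.
        return [sum(sequence)]

    def get_bounds(self):
        # Box bounds, exactly as in Pygmo.Problem.get_bounds above.
        return ([self.first_key] * self.dimension,
                [self.last_key] * self.dimension)

def random_search(problem, evaluations=200, seed=0):
    """Consume fitness/get_bounds the way an evolve loop would, minus evolution."""
    rng = random.Random(seed)
    lower, upper = problem.get_bounds()
    best_x, best_f = None, float("inf")
    for _ in range(evaluations):
        x = [rng.randint(lo, hi) for lo, hi in zip(lower, upper)]
        f = problem.fitness(x)[0]
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

best_x, best_f = random_search(ToyProblem(first_key=0, last_key=10, dimension=4))
```

Swapping the toy loop for `pg.algorithm(pg.sga(...)).evolve(population)` is the only change needed to run the real thing, since both consume the same two methods.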
e8213e750200485b12b34726d6bd13c4476d653d | 843 | py | Python | aries_cloudagent/protocols/actionmenu/v1_0/messages/menu_request.py | panickervinod/aries-cloudagent-python | bb4627fe62ee42ffeeb435cf3d8bfbd66c10d02f | [
"Apache-2.0"
] | null | null | null | aries_cloudagent/protocols/actionmenu/v1_0/messages/menu_request.py | panickervinod/aries-cloudagent-python | bb4627fe62ee42ffeeb435cf3d8bfbd66c10d02f | [
"Apache-2.0"
] | 1 | 2020-06-16T20:20:55.000Z | 2020-06-16T20:20:55.000Z | aries_cloudagent/protocols/actionmenu/v1_0/messages/menu_request.py | panickervinod/aries-cloudagent-python | bb4627fe62ee42ffeeb435cf3d8bfbd66c10d02f | [
"Apache-2.0"
] | null | null | null | """Represents a request for an action menu."""
from .....messaging.agent_message import AgentMessage, AgentMessageSchema
from ..message_types import MENU_REQUEST, PROTOCOL_PACKAGE
HANDLER_CLASS = f"{PROTOCOL_PACKAGE}.handlers.menu_request_handler.MenuRequestHandler"
class MenuRequest(AgentMessage):
"""Class representing a request for an action menu."""
class Meta:
"""Metadata for action menu request."""
handler_class = HANDLER_CLASS
message_type = MENU_REQUEST
schema_class = "MenuRequestSchema"
def __init__(self, **kwargs):
"""Initialize a menu request object."""
super().__init__(**kwargs)
class MenuRequestSchema(AgentMessageSchema):
"""MenuRequest schema class."""
class Meta:
"""MenuRequest schema metadata."""
model_class = MenuRequest
| 26.34375 | 86 | 0.702254 | 87 | 843 | 6.551724 | 0.425287 | 0.096491 | 0.038596 | 0.045614 | 0.080702 | 0.080702 | 0 | 0 | 0 | 0 | 0 | 0 | 0.194543 | 843 | 31 | 87 | 27.193548 | 0.83947 | 0.251483 | 0 | 0.153846 | 0 | 0 | 0.14 | 0.111667 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.153846 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
e8221f351de78d1c4212826efb9039a3b827c105 | 410 | py | Python | joulia/unit_conversions_test.py | willjschmitt/joulia-webserver | 712decb749c2d1bda71af49ecab245378bf30078 | [
"FTL"
] | null | null | null | joulia/unit_conversions_test.py | willjschmitt/joulia-webserver | 712decb749c2d1bda71af49ecab245378bf30078 | [
"FTL"
] | 95 | 2016-08-04T01:59:37.000Z | 2021-06-10T18:41:46.000Z | joulia/unit_conversions_test.py | willjschmitt/joulia-webserver | 712decb749c2d1bda71af49ecab245378bf30078 | [
"FTL"
] | null | null | null | """Tests joulia.unit_conversions.
"""
from django.test import TestCase
from joulia import unit_conversions
class GramsToPoundsTest(TestCase):
def test_grams_to_pounds(self):
self.assertEquals(unit_conversions.grams_to_pounds(1000.0), 2.20462)
class GramsToOuncesTest(TestCase):
def test_grams_to_ounces(self):
self.assertEquals(unit_conversions.grams_to_ounces(1000.0), 35.27392)
| 24.117647 | 77 | 0.77561 | 54 | 410 | 5.62963 | 0.462963 | 0.197368 | 0.098684 | 0.131579 | 0.421053 | 0.276316 | 0.276316 | 0 | 0 | 0 | 0 | 0.064426 | 0.129268 | 410 | 16 | 78 | 25.625 | 0.787115 | 0.073171 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
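The expected values in the two tests above pin down the conversion constants: 1000 g maps to 2.20462 lb, and the ounce figure is exactly 16 times that (2.20462 × 16 = 35.27392). A sketch of conversion functions consistent with those expectations — the truncated pounds-per-gram factor here is inferred from the test values, not taken from the repo:

```python
POUNDS_PER_GRAM = 0.00220462  # truncated factor implied by the expected test values
OUNCES_PER_POUND = 16

def grams_to_pounds(grams: float) -> float:
    return grams * POUNDS_PER_GRAM

def grams_to_ounces(grams: float) -> float:
    # Ounces derived from the pounds factor, matching 35.27392 for 1000 g.
    return grams * POUNDS_PER_GRAM * OUNCES_PER_POUND
```

Note that `assertEquals` on floats only passes because both sides come from the same truncated constant; a tolerance-based comparison (`assertAlmostEqual`) would be more robust.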
e824b59433bad7c9a547eb3af6d1a2e1fb6af9c5 | 2,504 | py | Python | scripts/train_presets/beads.py | kreshuklab/hylfm-net | 9f1013640e40e998674b65176023367b1e978782 | [
"MIT"
] | 8 | 2020-11-13T05:46:59.000Z | 2022-01-30T06:12:04.000Z | scripts/train_presets/beads.py | kreshuklab/hylfm-net | 9f1013640e40e998674b65176023367b1e978782 | [
"MIT"
] | 1 | 2020-11-13T08:29:23.000Z | 2022-02-10T16:45:19.000Z | scripts/train_presets/beads.py | kreshuklab/hylfm-net | 9f1013640e40e998674b65176023367b1e978782 | [
"MIT"
] | 2 | 2020-10-30T11:02:42.000Z | 2021-01-12T06:51:33.000Z | from pathlib import Path
from hylfm.hylfm_types import (
CriterionChoice,
DatasetChoice,
LRSchedThresMode,
LRSchedulerChoice,
MetricChoice,
OptimizerChoice,
PeriodUnit,
)
from hylfm.model import HyLFM_Net
from hylfm.train import train
if __name__ == "__main__":
train(
dataset=DatasetChoice.beads_highc_b,
batch_multiplier=2,
batch_size=1,
crit_apply_weight_above_threshold=False,
crit_beta=1.0,
crit_decay_weight_by=0.8,
crit_decay_weight_every_unit=PeriodUnit.epoch,
crit_decay_weight_every_value=1,
crit_decay_weight_limit=1.0,
crit_ms_ssim_weight=0.01,
crit_threshold=0.5,
crit_weight=0.001,
criterion=CriterionChoice.WeightedSmoothL1,
data_range=1.0,
eval_batch_size=1,
interpolation_order=2,
lr_sched_factor=0.5,
lr_sched_patience=10,
lr_sched_thres=0.0001,
lr_sched_thres_mode=LRSchedThresMode.abs,
lr_scheduler=LRSchedulerChoice.ReduceLROnPlateau,
max_epochs=10,
model_weights=None, # Path()
opt_lr=3e-4,
opt_momentum=0.0,
opt_weight_decay=0.0,
optimizer=OptimizerChoice.Adam,
patience=5,
score_metric=MetricChoice.MS_SSIM,
seed=None,
validate_every_unit=PeriodUnit.epoch,
validate_every_value=1,
win_sigma=1.5,
win_size=11,
# model
nnum=19,
z_out=51,
kernel2d=3,
c00_2d=976,
c01_2d=976,
c02_2d=0,
c03_2d=0,
c04_2d=0,
up0_2d=488,
c10_2d=488,
c11_2d=0,
c12_2d=0,
c13_2d=0,
c14_2d=0,
up1_2d=244,
c20_2d=244,
c21_2d=0,
c22_2d=0,
c23_2d=0,
c24_2d=0,
up2_2d=0,
c30_2d=0,
c31_2d=0,
c32_2d=0,
c33_2d=0,
c34_2d=0,
last_kernel2d=1,
cin_3d=7,
kernel3d=3,
c00_3d=7,
c01_3d=0,
c02_3d=0,
c03_3d=0,
c04_3d=0,
up0_3d=7,
c10_3d=7,
c11_3d=7,
c12_3d=0,
c13_3d=0,
c14_3d=0,
up1_3d=0,
c20_3d=0,
c21_3d=0,
c22_3d=0,
c23_3d=0,
c24_3d=0,
up2_3d=0,
c30_3d=0,
c31_3d=0,
c32_3d=0,
c33_3d=0,
c34_3d=0,
init_fn=HyLFM_Net.InitName.xavier_uniform_,
final_activation=None,
)
| 23.185185 | 57 | 0.5623 | 341 | 2,504 | 3.777126 | 0.381232 | 0.044255 | 0.046584 | 0.031056 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.152147 | 0.349042 | 2,504 | 107 | 58 | 23.401869 | 0.638037 | 0.004792 | 0 | 0 | 0 | 0 | 0.003214 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.038835 | 0 | 0.038835 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
e82d0945219e52cb12d826deeb2607fdf4a302a4 | 1,353 | py | Python | bot/ganjoor/category_choose.py | MmeK/ganjoor-telegram-bot | 3992bdd860ea3626dccd79b0c1993a3662e92aa5 | [
"MIT"
] | null | null | null | bot/ganjoor/category_choose.py | MmeK/ganjoor-telegram-bot | 3992bdd860ea3626dccd79b0c1993a3662e92aa5 | [
"MIT"
] | 3 | 2021-11-17T08:03:59.000Z | 2021-11-17T14:00:23.000Z | bot/ganjoor/category_choose.py | MmeK/ganjoor-telegram-bot | 3992bdd860ea3626dccd79b0c1993a3662e92aa5 | [
"MIT"
] | null | null | null |
# Copyright 2021 Mohammad Kazemi <kazemi.me.222@gmail.com>.
# SPDX-License-Identifier: MIT
# Telegram API framework core imports
from collections import namedtuple
from functools import partial
from ganjoor.ganjoor import Ganjoor
from telegram.ext import Dispatcher, CallbackContext
from telegram import Update
# Helper methods import
from utils.logger import get_logger
from utils.telegram.keyboards import category_keyboard
# Telegram API framework handlers imports
from telegram.ext import CallbackQueryHandler
# Init logger
logger = get_logger(__name__)
CallbackData = namedtuple('CallbackData', "menu_name doto")
def init(dispatcher: Dispatcher, ganjoor: Ganjoor):
"""Provide handlers initialization."""
dispatcher.add_handler(CallbackQueryHandler(
partial(category_id, ganjoor=ganjoor), pattern=r'^category_*'))
def category_id(update: Update, context: CallbackContext, ganjoor: Ganjoor) -> None:
    """Handle a category selection callback query."""
query = update.callback_query
message_id = '_'.join(query.data.split('_')[2:])
cat_id = query.data.split('_')[1]
cat = ganjoor.find_category_by_id(cat_id, with_poems=True)
# query.answer()
query.answer()
context.bot.edit_message_reply_markup(
inline_message_id=message_id, reply_markup=category_keyboard(cat, message_id))
# query.edit_reply_markup()
| 33 | 86 | 0.764967 | 168 | 1,353 | 5.958333 | 0.458333 | 0.055944 | 0.03996 | 0.041958 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007705 | 0.136733 | 1,353 | 40 | 87 | 33.825 | 0.849315 | 0.219512 | 0 | 0 | 0 | 0 | 0.038573 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.095238 | false | 0 | 0.380952 | 0 | 0.47619 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
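`category_id` above unpacks `query.data` of the shape `category_<cat_id>_<message_id>`, where the trailing message id may itself contain underscores — hence the re-join over `split('_')[2:]` rather than a plain unpack. A self-contained sketch of that pack/unpack convention (function names are illustrative, not part of the bot):

```python
def pack_callback(menu_name: str, *parts: object) -> str:
    """Join a menu name and its arguments into a callback_data string."""
    return "_".join([menu_name, *map(str, parts)])

def unpack_callback(data: str):
    """Mirror of category_id's parsing: field 1 is the id, the rest re-joins."""
    fields = data.split("_")
    menu_name, cat_id = fields[0], fields[1]
    message_id = "_".join(fields[2:])  # inline message ids can contain '_'
    return menu_name, cat_id, message_id

print(unpack_callback(pack_callback("category", 42, "AAA_BBB")))
# -> ('category', '42', 'AAA_BBB')
```

Keeping pack and unpack side by side avoids off-by-one field bugs; Telegram also caps `callback_data` at 64 bytes, so long ids need a lookup table instead.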
e8308aa03da4eb51baf9d04fa863c427d1808871 | 257,168 | py | Python | scripts/emoji-to-scl.py | SilverWingedSeraph/sws-dotfiles | 6bee2b2ece03439101848673d6bcd9196359f7c4 | [
"Apache-2.0"
] | 3 | 2018-03-07T15:15:50.000Z | 2018-05-22T17:50:34.000Z | scripts/emoji-to-scl.py | SilverWingedSeraph/sws-dotfiles | 6bee2b2ece03439101848673d6bcd9196359f7c4 | [
"Apache-2.0"
] | null | null | null | scripts/emoji-to-scl.py | SilverWingedSeraph/sws-dotfiles | 6bee2b2ece03439101848673d6bcd9196359f7c4 | [
"Apache-2.0"
] | 1 | 2020-01-30T15:07:29.000Z | 2020-01-30T15:07:29.000Z | #!/usr/bin/python3
# -*- coding: utf-8 -*-
from subprocess import Popen, PIPE
emojis="""⛑🏻 Helmet With White Cross, Type-1-2
⛑🏼 Helmet With White Cross, Type-3
⛑🏽 Helmet With White Cross, Type-4
⛑🏾 Helmet With White Cross, Type-5
⛑🏿 Helmet With White Cross, Type-6
💏🏻 Kiss, Type-1-2
💏🏼 Kiss, Type-3
💏🏽 Kiss, Type-4
💏🏾 Kiss, Type-5
💏🏿 Kiss, Type-6
💑🏻 Couple With Heart, Type-1-2
💑🏼 Couple With Heart, Type-3
💑🏽 Couple With Heart, Type-4
💑🏾 Couple With Heart, Type-5
💑🏿 Couple With Heart, Type-6
⛷🏻 Skier, Type-1-2
⛷🏼 Skier, Type-3
⛷🏽 Skier, Type-4
⛷🏾 Skier, Type-5
⛷🏿 Skier, Type-6
😀 Grinning Face
😁 Grinning Face With Smiling Eyes
😂 Face With Tears of Joy
🤣 Rolling on the Floor Laughing
😃 Smiling Face With Open Mouth
😄 Smiling Face With Open Mouth & Smiling Eyes
😅 Smiling Face With Open Mouth & Cold Sweat
😆 Smiling Face With Open Mouth & Closed Eyes
😉 Winking Face
😊 Smiling Face With Smiling Eyes
😋 Face Savouring Delicious Food
😎 Smiling Face With Sunglasses
😍 Smiling Face With Heart-Eyes
😘 Face Blowing a Kiss
😗 Kissing Face
😙 Kissing Face With Smiling Eyes
😚 Kissing Face With Closed Eyes
☺ Smiling Face
🙂 Slightly Smiling Face
🤗 Hugging Face
🤩 Star-Struck
🤔 Thinking Face
🤨 Face With Raised Eyebrow
😐 Neutral Face
😑 Expressionless Face
😶 Face Without Mouth
🙄 Face With Rolling Eyes
😏 Smirking Face
😣 Persevering Face
😥 Disappointed but Relieved Face
😮 Face With Open Mouth
🤐 Zipper-Mouth Face
😯 Hushed Face
😪 Sleepy Face
😫 Tired Face
😴 Sleeping Face
😌 Relieved Face
😛 Face With Stuck-Out Tongue
😜 Face With Stuck-Out Tongue & Winking Eye
😝 Face With Stuck-Out Tongue & Closed Eyes
🤤 Drooling Face
😒 Unamused Face
😓 Face With Cold Sweat
😔 Pensive Face
😕 Confused Face
🙃 Upside-Down Face
🤑 Money-Mouth Face
😲 Astonished Face
☹ Frowning Face
🙁 Slightly Frowning Face
😖 Confounded Face
😞 Disappointed Face
😟 Worried Face
😤 Face With Steam From Nose
😢 Crying Face
😭 Loudly Crying Face
😦 Frowning Face With Open Mouth
😧 Anguished Face
😨 Fearful Face
😩 Weary Face
🤯 Exploding Head
😬 Grimacing Face
😰 Face With Open Mouth & Cold Sweat
😱 Face Screaming in Fear
😳 Flushed Face
🤪 Crazy Face
😵 Dizzy Face
😡 Pouting Face
😠 Angry Face
🤬 Face With Symbols Over Mouth
😷 Face With Medical Mask
🤒 Face With Thermometer
🤕 Face With Head-Bandage
🤢 Nauseated Face
🤮 Face Vomiting
🤧 Sneezing Face
😇 Smiling Face With Halo
🤠 Cowboy Hat Face
🤡 Clown Face
🤥 Lying Face
🤫 Shushing Face
🤭 Face With Hand Over Mouth
🧐 Face With Monocle
🤓 Nerd Face
😈 Smiling Face With Horns
👿 Angry Face With Horns
👹 Ogre
👺 Goblin
💀 Skull
☠ Skull and Crossbones
👻 Ghost
👽 Alien
👾 Alien Monster
🤖 Robot Face
💩 Pile of Poo
😺 Smiling Cat Face With Open Mouth
😸 Grinning Cat Face With Smiling Eyes
😹 Cat Face With Tears of Joy
😻 Smiling Cat Face With Heart-Eyes
😼 Cat Face With Wry Smile
😽 Kissing Cat Face With Closed Eyes
🙀 Weary Cat Face
😿 Crying Cat Face
😾 Pouting Cat Face
🙈 See-No-Evil Monkey
🙉 Hear-No-Evil Monkey
🙊 Speak-No-Evil Monkey
👶 Baby
👶🏻 Baby: Light Skin Tone
👶🏼 Baby: Medium-Light Skin Tone
👶🏽 Baby: Medium Skin Tone
👶🏾 Baby: Medium-Dark Skin Tone
👶🏿 Baby: Dark Skin Tone
🧒 Child
🧒🏻 Child: Light Skin Tone
🧒🏼 Child: Medium-Light Skin Tone
🧒🏽 Child: Medium Skin Tone
🧒🏾 Child: Medium-Dark Skin Tone
🧒🏿 Child: Dark Skin Tone
👦 Boy
👦🏻 Boy: Light Skin Tone
👦🏼 Boy: Medium-Light Skin Tone
👦🏽 Boy: Medium Skin Tone
👦🏾 Boy: Medium-Dark Skin Tone
👦🏿 Boy: Dark Skin Tone
👧 Girl
👧🏻 Girl: Light Skin Tone
👧🏼 Girl: Medium-Light Skin Tone
👧🏽 Girl: Medium Skin Tone
👧🏾 Girl: Medium-Dark Skin Tone
👧🏿 Girl: Dark Skin Tone
🧑 Adult
🧑🏻 Adult: Light Skin Tone
🧑🏼 Adult: Medium-Light Skin Tone
🧑🏽 Adult: Medium Skin Tone
🧑🏾 Adult: Medium-Dark Skin Tone
🧑🏿 Adult: Dark Skin Tone
👨 Man
👨🏻 Man: Light Skin Tone
👨🏼 Man: Medium-Light Skin Tone
👨🏽 Man: Medium Skin Tone
👨🏾 Man: Medium-Dark Skin Tone
👨🏿 Man: Dark Skin Tone
👩 Woman
👩🏻 Woman: Light Skin Tone
👩🏼 Woman: Medium-Light Skin Tone
👩🏽 Woman: Medium Skin Tone
👩🏾 Woman: Medium-Dark Skin Tone
👩🏿 Woman: Dark Skin Tone
🧓 Older Adult
🧓🏻 Older Adult: Light Skin Tone
🧓🏼 Older Adult: Medium-Light Skin Tone
🧓🏽 Older Adult: Medium Skin Tone
🧓🏾 Older Adult: Medium-Dark Skin Tone
🧓🏿 Older Adult: Dark Skin Tone
👴 Old Man
👴🏻 Old Man: Light Skin Tone
👴🏼 Old Man: Medium-Light Skin Tone
👴🏽 Old Man: Medium Skin Tone
👴🏾 Old Man: Medium-Dark Skin Tone
👴🏿 Old Man: Dark Skin Tone
👵 Old Woman
👵🏻 Old Woman: Light Skin Tone
👵🏼 Old Woman: Medium-Light Skin Tone
👵🏽 Old Woman: Medium Skin Tone
👵🏾 Old Woman: Medium-Dark Skin Tone
👵🏿 Old Woman: Dark Skin Tone
👨⚕️ Man Health Worker
👨🏻⚕️ Man Health Worker: Light Skin Tone
👨🏼⚕️ Man Health Worker: Medium-Light Skin Tone
👨🏽⚕️ Man Health Worker: Medium Skin Tone
👨🏾⚕️ Man Health Worker: Medium-Dark Skin Tone
👨🏿⚕️ Man Health Worker: Dark Skin Tone
👩⚕️ Woman Health Worker
👩🏻⚕️ Woman Health Worker: Light Skin Tone
👩🏼⚕️ Woman Health Worker: Medium-Light Skin Tone
👩🏽⚕️ Woman Health Worker: Medium Skin Tone
👩🏾⚕️ Woman Health Worker: Medium-Dark Skin Tone
👩🏿⚕️ Woman Health Worker: Dark Skin Tone
👨🎓 Man Student
👨🏻🎓 Man Student: Light Skin Tone
👨🏼🎓 Man Student: Medium-Light Skin Tone
👨🏽🎓 Man Student: Medium Skin Tone
👨🏾🎓 Man Student: Medium-Dark Skin Tone
👨🏿🎓 Man Student: Dark Skin Tone
👩🎓 Woman Student
👩🏻🎓 Woman Student: Light Skin Tone
👩🏼🎓 Woman Student: Medium-Light Skin Tone
👩🏽🎓 Woman Student: Medium Skin Tone
👩🏾🎓 Woman Student: Medium-Dark Skin Tone
👩🏿🎓 Woman Student: Dark Skin Tone
👨🏫 Man Teacher
👨🏻🏫 Man Teacher: Light Skin Tone
👨🏼🏫 Man Teacher: Medium-Light Skin Tone
👨🏽🏫 Man Teacher: Medium Skin Tone
👨🏾🏫 Man Teacher: Medium-Dark Skin Tone
👨🏿🏫 Man Teacher: Dark Skin Tone
👩🏫 Woman Teacher
👩🏻🏫 Woman Teacher: Light Skin Tone
👩🏼🏫 Woman Teacher: Medium-Light Skin Tone
👩🏽🏫 Woman Teacher: Medium Skin Tone
👩🏾🏫 Woman Teacher: Medium-Dark Skin Tone
👩🏿🏫 Woman Teacher: Dark Skin Tone
👨⚖️ Man Judge
👨🏻⚖️ Man Judge: Light Skin Tone
👨🏼⚖️ Man Judge: Medium-Light Skin Tone
👨🏽⚖️ Man Judge: Medium Skin Tone
👨🏾⚖️ Man Judge: Medium-Dark Skin Tone
👨🏿⚖️ Man Judge: Dark Skin Tone
👩⚖️ Woman Judge
👩🏻⚖️ Woman Judge: Light Skin Tone
👩🏼⚖️ Woman Judge: Medium-Light Skin Tone
👩🏽⚖️ Woman Judge: Medium Skin Tone
👩🏾⚖️ Woman Judge: Medium-Dark Skin Tone
👩🏿⚖️ Woman Judge: Dark Skin Tone
👨🌾 Man Farmer
👨🏻🌾 Man Farmer: Light Skin Tone
👨🏼🌾 Man Farmer: Medium-Light Skin Tone
👨🏽🌾 Man Farmer: Medium Skin Tone
👨🏾🌾 Man Farmer: Medium-Dark Skin Tone
👨🏿🌾 Man Farmer: Dark Skin Tone
👩🌾 Woman Farmer
👩🏻🌾 Woman Farmer: Light Skin Tone
👩🏼🌾 Woman Farmer: Medium-Light Skin Tone
👩🏽🌾 Woman Farmer: Medium Skin Tone
👩🏾🌾 Woman Farmer: Medium-Dark Skin Tone
👩🏿🌾 Woman Farmer: Dark Skin Tone
👨🍳 Man Cook
👨🏻🍳 Man Cook: Light Skin Tone
👨🏼🍳 Man Cook: Medium-Light Skin Tone
👨🏽🍳 Man Cook: Medium Skin Tone
👨🏾🍳 Man Cook: Medium-Dark Skin Tone
👨🏿🍳 Man Cook: Dark Skin Tone
👩🍳 Woman Cook
👩🏻🍳 Woman Cook: Light Skin Tone
👩🏼🍳 Woman Cook: Medium-Light Skin Tone
👩🏽🍳 Woman Cook: Medium Skin Tone
👩🏾🍳 Woman Cook: Medium-Dark Skin Tone
👩🏿🍳 Woman Cook: Dark Skin Tone
👨🔧 Man Mechanic
👨🏻🔧 Man Mechanic: Light Skin Tone
👨🏼🔧 Man Mechanic: Medium-Light Skin Tone
👨🏽🔧 Man Mechanic: Medium Skin Tone
👨🏾🔧 Man Mechanic: Medium-Dark Skin Tone
👨🏿🔧 Man Mechanic: Dark Skin Tone
👩🔧 Woman Mechanic
👩🏻🔧 Woman Mechanic: Light Skin Tone
👩🏼🔧 Woman Mechanic: Medium-Light Skin Tone
👩🏽🔧 Woman Mechanic: Medium Skin Tone
👩🏾🔧 Woman Mechanic: Medium-Dark Skin Tone
👩🏿🔧 Woman Mechanic: Dark Skin Tone
👨🏭 Man Factory Worker
👨🏻🏭 Man Factory Worker: Light Skin Tone
👨🏼🏭 Man Factory Worker: Medium-Light Skin Tone
👨🏽🏭 Man Factory Worker: Medium Skin Tone
👨🏾🏭 Man Factory Worker: Medium-Dark Skin Tone
👨🏿🏭 Man Factory Worker: Dark Skin Tone
👩🏭 Woman Factory Worker
👩🏻🏭 Woman Factory Worker: Light Skin Tone
👩🏼🏭 Woman Factory Worker: Medium-Light Skin Tone
👩🏽🏭 Woman Factory Worker: Medium Skin Tone
👩🏾🏭 Woman Factory Worker: Medium-Dark Skin Tone
👩🏿🏭 Woman Factory Worker: Dark Skin Tone
👨💼 Man Office Worker
👨🏻💼 Man Office Worker: Light Skin Tone
👨🏼💼 Man Office Worker: Medium-Light Skin Tone
👨🏽💼 Man Office Worker: Medium Skin Tone
👨🏾💼 Man Office Worker: Medium-Dark Skin Tone
👨🏿💼 Man Office Worker: Dark Skin Tone
👩💼 Woman Office Worker
👩🏻💼 Woman Office Worker: Light Skin Tone
👩🏼💼 Woman Office Worker: Medium-Light Skin Tone
👩🏽💼 Woman Office Worker: Medium Skin Tone
👩🏾💼 Woman Office Worker: Medium-Dark Skin Tone
👩🏿💼 Woman Office Worker: Dark Skin Tone
👨🔬 Man Scientist
👨🏻🔬 Man Scientist: Light Skin Tone
👨🏼🔬 Man Scientist: Medium-Light Skin Tone
👨🏽🔬 Man Scientist: Medium Skin Tone
👨🏾🔬 Man Scientist: Medium-Dark Skin Tone
👨🏿🔬 Man Scientist: Dark Skin Tone
👩🔬 Woman Scientist
👩🏻🔬 Woman Scientist: Light Skin Tone
👩🏼🔬 Woman Scientist: Medium-Light Skin Tone
👩🏽🔬 Woman Scientist: Medium Skin Tone
👩🏾🔬 Woman Scientist: Medium-Dark Skin Tone
👩🏿🔬 Woman Scientist: Dark Skin Tone
👨💻 Man Technologist
👨🏻💻 Man Technologist: Light Skin Tone
👨🏼💻 Man Technologist: Medium-Light Skin Tone
👨🏽💻 Man Technologist: Medium Skin Tone
👨🏾💻 Man Technologist: Medium-Dark Skin Tone
👨🏿💻 Man Technologist: Dark Skin Tone
👩💻 Woman Technologist
👩🏻💻 Woman Technologist: Light Skin Tone
👩🏼💻 Woman Technologist: Medium-Light Skin Tone
👩🏽💻 Woman Technologist: Medium Skin Tone
👩🏾💻 Woman Technologist: Medium-Dark Skin Tone
👩🏿💻 Woman Technologist: Dark Skin Tone
👨🎤 Man Singer
👨🏻🎤 Man Singer: Light Skin Tone
👨🏼🎤 Man Singer: Medium-Light Skin Tone
👨🏽🎤 Man Singer: Medium Skin Tone
👨🏾🎤 Man Singer: Medium-Dark Skin Tone
👨🏿🎤 Man Singer: Dark Skin Tone
👩🎤 Woman Singer
👩🏻🎤 Woman Singer: Light Skin Tone
👩🏼🎤 Woman Singer: Medium-Light Skin Tone
👩🏽🎤 Woman Singer: Medium Skin Tone
👩🏾🎤 Woman Singer: Medium-Dark Skin Tone
👩🏿🎤 Woman Singer: Dark Skin Tone
👨🎨 Man Artist
👨🏻🎨 Man Artist: Light Skin Tone
👨🏼🎨 Man Artist: Medium-Light Skin Tone
👨🏽🎨 Man Artist: Medium Skin Tone
👨🏾🎨 Man Artist: Medium-Dark Skin Tone
👨🏿🎨 Man Artist: Dark Skin Tone
👩🎨 Woman Artist
👩🏻🎨 Woman Artist: Light Skin Tone
👩🏼🎨 Woman Artist: Medium-Light Skin Tone
👩🏽🎨 Woman Artist: Medium Skin Tone
👩🏾🎨 Woman Artist: Medium-Dark Skin Tone
👩🏿🎨 Woman Artist: Dark Skin Tone
👨✈️ Man Pilot
👨🏻✈️ Man Pilot: Light Skin Tone
👨🏼✈️ Man Pilot: Medium-Light Skin Tone
👨🏽✈️ Man Pilot: Medium Skin Tone
👨🏾✈️ Man Pilot: Medium-Dark Skin Tone
👨🏿✈️ Man Pilot: Dark Skin Tone
👩✈️ Woman Pilot
👩🏻✈️ Woman Pilot: Light Skin Tone
👩🏼✈️ Woman Pilot: Medium-Light Skin Tone
👩🏽✈️ Woman Pilot: Medium Skin Tone
👩🏾✈️ Woman Pilot: Medium-Dark Skin Tone
👩🏿✈️ Woman Pilot: Dark Skin Tone
👨🚀 Man Astronaut
👨🏻🚀 Man Astronaut: Light Skin Tone
👨🏼🚀 Man Astronaut: Medium-Light Skin Tone
👨🏽🚀 Man Astronaut: Medium Skin Tone
👨🏾🚀 Man Astronaut: Medium-Dark Skin Tone
👨🏿🚀 Man Astronaut: Dark Skin Tone
👩🚀 Woman Astronaut
👩🏻🚀 Woman Astronaut: Light Skin Tone
👩🏼🚀 Woman Astronaut: Medium-Light Skin Tone
👩🏽🚀 Woman Astronaut: Medium Skin Tone
👩🏾🚀 Woman Astronaut: Medium-Dark Skin Tone
👩🏿🚀 Woman Astronaut: Dark Skin Tone
👨🚒 Man Firefighter
👨🏻🚒 Man Firefighter: Light Skin Tone
👨🏼🚒 Man Firefighter: Medium-Light Skin Tone
👨🏽🚒 Man Firefighter: Medium Skin Tone
👨🏾🚒 Man Firefighter: Medium-Dark Skin Tone
👨🏿🚒 Man Firefighter: Dark Skin Tone
👩🚒 Woman Firefighter
👩🏻🚒 Woman Firefighter: Light Skin Tone
👩🏼🚒 Woman Firefighter: Medium-Light Skin Tone
👩🏽🚒 Woman Firefighter: Medium Skin Tone
👩🏾🚒 Woman Firefighter: Medium-Dark Skin Tone
👩🏿🚒 Woman Firefighter: Dark Skin Tone
👮 Police Officer
👮🏻 Police Officer: Light Skin Tone
👮🏼 Police Officer: Medium-Light Skin Tone
👮🏽 Police Officer: Medium Skin Tone
👮🏾 Police Officer: Medium-Dark Skin Tone
👮🏿 Police Officer: Dark Skin Tone
👮♂️ Man Police Officer
👮🏻♂️ Man Police Officer: Light Skin Tone
👮🏼♂️ Man Police Officer: Medium-Light Skin Tone
👮🏽♂️ Man Police Officer: Medium Skin Tone
👮🏾♂️ Man Police Officer: Medium-Dark Skin Tone
👮🏿♂️ Man Police Officer: Dark Skin Tone
👮♀️ Woman Police Officer
👮🏻♀️ Woman Police Officer: Light Skin Tone
👮🏼♀️ Woman Police Officer: Medium-Light Skin Tone
👮🏽♀️ Woman Police Officer: Medium Skin Tone
👮🏾♀️ Woman Police Officer: Medium-Dark Skin Tone
👮🏿♀️ Woman Police Officer: Dark Skin Tone
🕵 Detective
🕵🏻 Detective: Light Skin Tone
🕵🏼 Detective: Medium-Light Skin Tone
🕵🏽 Detective: Medium Skin Tone
🕵🏾 Detective: Medium-Dark Skin Tone
🕵🏿 Detective: Dark Skin Tone
🕵️♂️ Man Detective
🕵🏻♂️ Man Detective: Light Skin Tone
🕵🏼♂️ Man Detective: Medium-Light Skin Tone
🕵🏽♂️ Man Detective: Medium Skin Tone
🕵🏾♂️ Man Detective: Medium-Dark Skin Tone
🕵🏿♂️ Man Detective: Dark Skin Tone
🕵️♀️ Woman Detective
🕵🏻♀️ Woman Detective: Light Skin Tone
🕵🏼♀️ Woman Detective: Medium-Light Skin Tone
🕵🏽♀️ Woman Detective: Medium Skin Tone
🕵🏾♀️ Woman Detective: Medium-Dark Skin Tone
🕵🏿♀️ Woman Detective: Dark Skin Tone
💂 Guard
💂🏻 Guard: Light Skin Tone
💂🏼 Guard: Medium-Light Skin Tone
💂🏽 Guard: Medium Skin Tone
💂🏾 Guard: Medium-Dark Skin Tone
💂🏿 Guard: Dark Skin Tone
💂♂️ Man Guard
💂🏻♂️ Man Guard: Light Skin Tone
💂🏼♂️ Man Guard: Medium-Light Skin Tone
💂🏽♂️ Man Guard: Medium Skin Tone
💂🏾♂️ Man Guard: Medium-Dark Skin Tone
💂🏿♂️ Man Guard: Dark Skin Tone
💂♀️ Woman Guard
💂🏻♀️ Woman Guard: Light Skin Tone
💂🏼♀️ Woman Guard: Medium-Light Skin Tone
💂🏽♀️ Woman Guard: Medium Skin Tone
💂🏾♀️ Woman Guard: Medium-Dark Skin Tone
💂🏿♀️ Woman Guard: Dark Skin Tone
👷 Construction Worker
👷🏻 Construction Worker: Light Skin Tone
👷🏼 Construction Worker: Medium-Light Skin Tone
👷🏽 Construction Worker: Medium Skin Tone
👷🏾 Construction Worker: Medium-Dark Skin Tone
👷🏿 Construction Worker: Dark Skin Tone
👷♂️ Man Construction Worker
👷🏻♂️ Man Construction Worker: Light Skin Tone
👷🏼♂️ Man Construction Worker: Medium-Light Skin Tone
👷🏽♂️ Man Construction Worker: Medium Skin Tone
👷🏾♂️ Man Construction Worker: Medium-Dark Skin Tone
👷🏿♂️ Man Construction Worker: Dark Skin Tone
👷♀️ Woman Construction Worker
👷🏻♀️ Woman Construction Worker: Light Skin Tone
👷🏼♀️ Woman Construction Worker: Medium-Light Skin Tone
👷🏽♀️ Woman Construction Worker: Medium Skin Tone
👷🏾♀️ Woman Construction Worker: Medium-Dark Skin Tone
👷🏿♀️ Woman Construction Worker: Dark Skin Tone
🤴 Prince
🤴🏻 Prince: Light Skin Tone
🤴🏼 Prince: Medium-Light Skin Tone
🤴🏽 Prince: Medium Skin Tone
🤴🏾 Prince: Medium-Dark Skin Tone
🤴🏿 Prince: Dark Skin Tone
👸 Princess
👸🏻 Princess: Light Skin Tone
👸🏼 Princess: Medium-Light Skin Tone
👸🏽 Princess: Medium Skin Tone
👸🏾 Princess: Medium-Dark Skin Tone
👸🏿 Princess: Dark Skin Tone
👳 Person Wearing Turban
👳🏻 Person Wearing Turban: Light Skin Tone
👳🏼 Person Wearing Turban: Medium-Light Skin Tone
👳🏽 Person Wearing Turban: Medium Skin Tone
👳🏾 Person Wearing Turban: Medium-Dark Skin Tone
👳🏿 Person Wearing Turban: Dark Skin Tone
👳♂️ Man Wearing Turban
👳🏻♂️ Man Wearing Turban: Light Skin Tone
👳🏼♂️ Man Wearing Turban: Medium-Light Skin Tone
👳🏽♂️ Man Wearing Turban: Medium Skin Tone
👳🏾♂️ Man Wearing Turban: Medium-Dark Skin Tone
👳🏿♂️ Man Wearing Turban: Dark Skin Tone
👳♀️ Woman Wearing Turban
👳🏻♀️ Woman Wearing Turban: Light Skin Tone
👳🏼♀️ Woman Wearing Turban: Medium-Light Skin Tone
👳🏽♀️ Woman Wearing Turban: Medium Skin Tone
👳🏾♀️ Woman Wearing Turban: Medium-Dark Skin Tone
👳🏿♀️ Woman Wearing Turban: Dark Skin Tone
👲 Man With Chinese Cap
👲🏻 Man With Chinese Cap: Light Skin Tone
👲🏼 Man With Chinese Cap: Medium-Light Skin Tone
👲🏽 Man With Chinese Cap: Medium Skin Tone
👲🏾 Man With Chinese Cap: Medium-Dark Skin Tone
👲🏿 Man With Chinese Cap: Dark Skin Tone
🧕 Woman With Headscarf
🧕🏻 Woman With Headscarf: Light Skin Tone
🧕🏼 Woman With Headscarf: Medium-Light Skin Tone
🧕🏽 Woman With Headscarf: Medium Skin Tone
🧕🏾 Woman With Headscarf: Medium-Dark Skin Tone
🧕🏿 Woman With Headscarf: Dark Skin Tone
🧔 Bearded Person
🧔🏻 Bearded Person: Light Skin Tone
🧔🏼 Bearded Person: Medium-Light Skin Tone
🧔🏽 Bearded Person: Medium Skin Tone
🧔🏾 Bearded Person: Medium-Dark Skin Tone
🧔🏿 Bearded Person: Dark Skin Tone
👱 Blond-Haired Person
👱🏻 Blond-Haired Person: Light Skin Tone
👱🏼 Blond-Haired Person: Medium-Light Skin Tone
👱🏽 Blond-Haired Person: Medium Skin Tone
👱🏾 Blond-Haired Person: Medium-Dark Skin Tone
👱🏿 Blond-Haired Person: Dark Skin Tone
👱♂️ Blond-Haired Man
👱🏻♂️ Blond-Haired Man: Light Skin Tone
👱🏼♂️ Blond-Haired Man: Medium-Light Skin Tone
👱🏽♂️ Blond-Haired Man: Medium Skin Tone
👱🏾♂️ Blond-Haired Man: Medium-Dark Skin Tone
👱🏿♂️ Blond-Haired Man: Dark Skin Tone
👱♀️ Blond-Haired Woman
👱🏻♀️ Blond-Haired Woman: Light Skin Tone
👱🏼♀️ Blond-Haired Woman: Medium-Light Skin Tone
👱🏽♀️ Blond-Haired Woman: Medium Skin Tone
👱🏾♀️ Blond-Haired Woman: Medium-Dark Skin Tone
👱🏿♀️ Blond-Haired Woman: Dark Skin Tone
🤵 Man in Tuxedo
🤵🏻 Man in Tuxedo: Light Skin Tone
🤵🏼 Man in Tuxedo: Medium-Light Skin Tone
🤵🏽 Man in Tuxedo: Medium Skin Tone
🤵🏾 Man in Tuxedo: Medium-Dark Skin Tone
🤵🏿 Man in Tuxedo: Dark Skin Tone
👰 Bride With Veil
👰🏻 Bride With Veil: Light Skin Tone
👰🏼 Bride With Veil: Medium-Light Skin Tone
👰🏽 Bride With Veil: Medium Skin Tone
👰🏾 Bride With Veil: Medium-Dark Skin Tone
👰🏿 Bride With Veil: Dark Skin Tone
🤰 Pregnant Woman
🤰🏻 Pregnant Woman: Light Skin Tone
🤰🏼 Pregnant Woman: Medium-Light Skin Tone
🤰🏽 Pregnant Woman: Medium Skin Tone
🤰🏾 Pregnant Woman: Medium-Dark Skin Tone
🤰🏿 Pregnant Woman: Dark Skin Tone
🤱 Breast-Feeding
🤱🏻 Breast-Feeding: Light Skin Tone
🤱🏼 Breast-Feeding: Medium-Light Skin Tone
🤱🏽 Breast-Feeding: Medium Skin Tone
🤱🏾 Breast-Feeding: Medium-Dark Skin Tone
🤱🏿 Breast-Feeding: Dark Skin Tone
👼 Baby Angel
👼🏻 Baby Angel: Light Skin Tone
👼🏼 Baby Angel: Medium-Light Skin Tone
👼🏽 Baby Angel: Medium Skin Tone
👼🏾 Baby Angel: Medium-Dark Skin Tone
👼🏿 Baby Angel: Dark Skin Tone
🎅 Santa Claus
🎅🏻 Santa Claus: Light Skin Tone
🎅🏼 Santa Claus: Medium-Light Skin Tone
🎅🏽 Santa Claus: Medium Skin Tone
🎅🏾 Santa Claus: Medium-Dark Skin Tone
🎅🏿 Santa Claus: Dark Skin Tone
🤶 Mrs. Claus
🤶🏻 Mrs. Claus: Light Skin Tone
🤶🏼 Mrs. Claus: Medium-Light Skin Tone
🤶🏽 Mrs. Claus: Medium Skin Tone
🤶🏾 Mrs. Claus: Medium-Dark Skin Tone
🤶🏿 Mrs. Claus: Dark Skin Tone
🧙 Mage
🧙🏻 Mage: Light Skin Tone
🧙🏼 Mage: Medium-Light Skin Tone
🧙🏽 Mage: Medium Skin Tone
🧙🏾 Mage: Medium-Dark Skin Tone
🧙🏿 Mage: Dark Skin Tone
🧙♀️ Woman Mage
🧙🏻♀️ Woman Mage: Light Skin Tone
🧙🏼♀️ Woman Mage: Medium-Light Skin Tone
🧙🏽♀️ Woman Mage: Medium Skin Tone
🧙🏾♀️ Woman Mage: Medium-Dark Skin Tone
🧙🏿♀️ Woman Mage: Dark Skin Tone
🧙♂️ Man Mage
🧙🏻♂️ Man Mage: Light Skin Tone
🧙🏼♂️ Man Mage: Medium-Light Skin Tone
🧙🏽♂️ Man Mage: Medium Skin Tone
🧙🏾♂️ Man Mage: Medium-Dark Skin Tone
🧙🏿♂️ Man Mage: Dark Skin Tone
🧚 Fairy
🧚🏻 Fairy: Light Skin Tone
🧚🏼 Fairy: Medium-Light Skin Tone
🧚🏽 Fairy: Medium Skin Tone
🧚🏾 Fairy: Medium-Dark Skin Tone
🧚🏿 Fairy: Dark Skin Tone
🧚♀️ Woman Fairy
🧚🏻♀️ Woman Fairy: Light Skin Tone
🧚🏼♀️ Woman Fairy: Medium-Light Skin Tone
🧚🏽♀️ Woman Fairy: Medium Skin Tone
🧚🏾♀️ Woman Fairy: Medium-Dark Skin Tone
🧚🏿♀️ Woman Fairy: Dark Skin Tone
🧚♂️ Man Fairy
🧚🏻♂️ Man Fairy: Light Skin Tone
🧚🏼♂️ Man Fairy: Medium-Light Skin Tone
🧚🏽♂️ Man Fairy: Medium Skin Tone
🧚🏾♂️ Man Fairy: Medium-Dark Skin Tone
🧚🏿♂️ Man Fairy: Dark Skin Tone
🧛 Vampire
🧛🏻 Vampire: Light Skin Tone
🧛🏼 Vampire: Medium-Light Skin Tone
🧛🏽 Vampire: Medium Skin Tone
🧛🏾 Vampire: Medium-Dark Skin Tone
🧛🏿 Vampire: Dark Skin Tone
🧛♀️ Woman Vampire
🧛🏻♀️ Woman Vampire: Light Skin Tone
🧛🏼♀️ Woman Vampire: Medium-Light Skin Tone
🧛🏽♀️ Woman Vampire: Medium Skin Tone
🧛🏾♀️ Woman Vampire: Medium-Dark Skin Tone
🧛🏿♀️ Woman Vampire: Dark Skin Tone
🧛♂️ Man Vampire
🧛🏻♂️ Man Vampire: Light Skin Tone
🧛🏼♂️ Man Vampire: Medium-Light Skin Tone
🧛🏽♂️ Man Vampire: Medium Skin Tone
🧛🏾♂️ Man Vampire: Medium-Dark Skin Tone
🧛🏿♂️ Man Vampire: Dark Skin Tone
🧜 Merperson
🧜🏻 Merperson: Light Skin Tone
🧜🏼 Merperson: Medium-Light Skin Tone
🧜🏽 Merperson: Medium Skin Tone
🧜🏾 Merperson: Medium-Dark Skin Tone
🧜🏿 Merperson: Dark Skin Tone
🧜♀️ Mermaid
🧜🏻♀️ Mermaid: Light Skin Tone
🧜🏼♀️ Mermaid: Medium-Light Skin Tone
🧜🏽♀️ Mermaid: Medium Skin Tone
🧜🏾♀️ Mermaid: Medium-Dark Skin Tone
🧜🏿♀️ Mermaid: Dark Skin Tone
🧜♂️ Merman
🧜🏻♂️ Merman: Light Skin Tone
🧜🏼♂️ Merman: Medium-Light Skin Tone
🧜🏽♂️ Merman: Medium Skin Tone
🧜🏾♂️ Merman: Medium-Dark Skin Tone
🧜🏿♂️ Merman: Dark Skin Tone
🧝 Elf
🧝🏻 Elf: Light Skin Tone
🧝🏼 Elf: Medium-Light Skin Tone
🧝🏽 Elf: Medium Skin Tone
🧝🏾 Elf: Medium-Dark Skin Tone
🧝🏿 Elf: Dark Skin Tone
🧝♀️ Woman Elf
🧝🏻♀️ Woman Elf: Light Skin Tone
🧝🏼♀️ Woman Elf: Medium-Light Skin Tone
🧝🏽♀️ Woman Elf: Medium Skin Tone
🧝🏾♀️ Woman Elf: Medium-Dark Skin Tone
🧝🏿♀️ Woman Elf: Dark Skin Tone
🧝♂️ Man Elf
🧝🏻♂️ Man Elf: Light Skin Tone
🧝🏼♂️ Man Elf: Medium-Light Skin Tone
🧝🏽♂️ Man Elf: Medium Skin Tone
🧝🏾♂️ Man Elf: Medium-Dark Skin Tone
🧝🏿♂️ Man Elf: Dark Skin Tone
👯🏻 Woman With Bunny Ears, Type-1-2
👯🏼 Woman With Bunny Ears, Type-3
👯🏽 Woman With Bunny Ears, Type-4
👯🏾 Woman With Bunny Ears, Type-5
👯🏿 Woman With Bunny Ears, Type-6
👯🏻♂️ Men With Bunny Ears Partying, Type-1-2
👯🏼♂️ Men With Bunny Ears Partying, Type-3
👯🏽♂️ Men With Bunny Ears Partying, Type-4
👯🏾♂️ Men With Bunny Ears Partying, Type-5
👯🏿♂️ Men With Bunny Ears Partying, Type-6
👯🏻♀️ Women With Bunny Ears Partying, Type-1-2
👯🏼♀️ Women With Bunny Ears Partying, Type-3
👯🏽♀️ Women With Bunny Ears Partying, Type-4
👯🏾♀️ Women With Bunny Ears Partying, Type-5
👯🏿♀️ Women With Bunny Ears Partying, Type-6
👫🏻 Man and Woman Holding Hands, Type-1-2
👫🏼 Man and Woman Holding Hands, Type-3
👫🏽 Man and Woman Holding Hands, Type-4
👫🏾 Man and Woman Holding Hands, Type-5
👫🏿 Man and Woman Holding Hands, Type-6
👬🏻 Two Men Holding Hands, Type-1-2
👬🏼 Two Men Holding Hands, Type-3
👬🏽 Two Men Holding Hands, Type-4
👬🏾 Two Men Holding Hands, Type-5
👬🏿 Two Men Holding Hands, Type-6
👭🏻 Two Women Holding Hands, Type-1-2
👭🏼 Two Women Holding Hands, Type-3
👭🏽 Two Women Holding Hands, Type-4
👭🏾 Two Women Holding Hands, Type-5
👭🏿 Two Women Holding Hands, Type-6
👪🏻 Family, Type-1-2
👪🏼 Family, Type-3
👪🏽 Family, Type-4
👪🏾 Family, Type-5
👪🏿 Family, Type-6
🧞 Genie
🧞♀️ Woman Genie
🧞♂️ Man Genie
🧟 Zombie
🧟♀️ Woman Zombie
🧟♂️ Man Zombie
🙍 Person Frowning
🙍🏻 Person Frowning: Light Skin Tone
🙍🏼 Person Frowning: Medium-Light Skin Tone
🙍🏽 Person Frowning: Medium Skin Tone
🙍🏾 Person Frowning: Medium-Dark Skin Tone
🙍🏿 Person Frowning: Dark Skin Tone
🙍♂️ Man Frowning
🙍🏻♂️ Man Frowning: Light Skin Tone
🙍🏼♂️ Man Frowning: Medium-Light Skin Tone
🙍🏽♂️ Man Frowning: Medium Skin Tone
🙍🏾♂️ Man Frowning: Medium-Dark Skin Tone
🙍🏿♂️ Man Frowning: Dark Skin Tone
🏻 Light Skin Tone
🏼 Medium-Light Skin Tone
🏽 Medium Skin Tone
🏾 Medium-Dark Skin Tone
🏿 Dark Skin Tone
🙍♀️ Woman Frowning
🙍🏻♀️ Woman Frowning: Light Skin Tone
🙍🏼♀️ Woman Frowning: Medium-Light Skin Tone
🙍🏽♀️ Woman Frowning: Medium Skin Tone
🙍🏾♀️ Woman Frowning: Medium-Dark Skin Tone
🙍🏿♀️ Woman Frowning: Dark Skin Tone
🙎 Person Pouting
🙎🏻 Person Pouting: Light Skin Tone
🙎🏼 Person Pouting: Medium-Light Skin Tone
🙎🏽 Person Pouting: Medium Skin Tone
🙎🏾 Person Pouting: Medium-Dark Skin Tone
🙎🏿 Person Pouting: Dark Skin Tone
🙎♂️ Man Pouting
🙎🏻♂️ Man Pouting: Light Skin Tone
🙎🏼♂️ Man Pouting: Medium-Light Skin Tone
🙎🏽♂️ Man Pouting: Medium Skin Tone
🙎🏾♂️ Man Pouting: Medium-Dark Skin Tone
🙎🏿♂️ Man Pouting: Dark Skin Tone
🙎♀️ Woman Pouting
🙎🏻♀️ Woman Pouting: Light Skin Tone
🙎🏼♀️ Woman Pouting: Medium-Light Skin Tone
🙎🏽♀️ Woman Pouting: Medium Skin Tone
🙎🏾♀️ Woman Pouting: Medium-Dark Skin Tone
🙎🏿♀️ Woman Pouting: Dark Skin Tone
🙅 Person Gesturing No
🙅🏻 Person Gesturing No: Light Skin Tone
🙅🏼 Person Gesturing No: Medium-Light Skin Tone
🙅🏽 Person Gesturing No: Medium Skin Tone
🙅🏾 Person Gesturing No: Medium-Dark Skin Tone
🙅🏿 Person Gesturing No: Dark Skin Tone
🙅♂️ Man Gesturing No
🙅🏻♂️ Man Gesturing No: Light Skin Tone
🙅🏼♂️ Man Gesturing No: Medium-Light Skin Tone
🙅🏽♂️ Man Gesturing No: Medium Skin Tone
🙅🏾♂️ Man Gesturing No: Medium-Dark Skin Tone
🙅🏿♂️ Man Gesturing No: Dark Skin Tone
🙅♀️ Woman Gesturing No
🙅🏻♀️ Woman Gesturing No: Light Skin Tone
🙅🏼♀️ Woman Gesturing No: Medium-Light Skin Tone
🙅🏽♀️ Woman Gesturing No: Medium Skin Tone
🙅🏾♀️ Woman Gesturing No: Medium-Dark Skin Tone
🙅🏿♀️ Woman Gesturing No: Dark Skin Tone
🙆 Person Gesturing OK
🙆🏻 Person Gesturing OK: Light Skin Tone
🙆🏼 Person Gesturing OK: Medium-Light Skin Tone
🙆🏽 Person Gesturing OK: Medium Skin Tone
🙆🏾 Person Gesturing OK: Medium-Dark Skin Tone
🙆🏿 Person Gesturing OK: Dark Skin Tone
🙆♂️ Man Gesturing OK
🙆🏻♂️ Man Gesturing OK: Light Skin Tone
🙆🏼♂️ Man Gesturing OK: Medium-Light Skin Tone
🙆🏽♂️ Man Gesturing OK: Medium Skin Tone
🙆🏾♂️ Man Gesturing OK: Medium-Dark Skin Tone
🙆🏿♂️ Man Gesturing OK: Dark Skin Tone
🙆♀️ Woman Gesturing OK
🙆🏻♀️ Woman Gesturing OK: Light Skin Tone
🙆🏼♀️ Woman Gesturing OK: Medium-Light Skin Tone
🙆🏽♀️ Woman Gesturing OK: Medium Skin Tone
🙆🏾♀️ Woman Gesturing OK: Medium-Dark Skin Tone
🙆🏿♀️ Woman Gesturing OK: Dark Skin Tone
💁 Person Tipping Hand
💁🏻 Person Tipping Hand: Light Skin Tone
💁🏼 Person Tipping Hand: Medium-Light Skin Tone
💁🏽 Person Tipping Hand: Medium Skin Tone
💁🏾 Person Tipping Hand: Medium-Dark Skin Tone
💁🏿 Person Tipping Hand: Dark Skin Tone
💁♂️ Man Tipping Hand
💁🏻♂️ Man Tipping Hand: Light Skin Tone
💁🏼♂️ Man Tipping Hand: Medium-Light Skin Tone
💁🏽♂️ Man Tipping Hand: Medium Skin Tone
💁🏾♂️ Man Tipping Hand: Medium-Dark Skin Tone
💁🏿♂️ Man Tipping Hand: Dark Skin Tone
💁♀️ Woman Tipping Hand
💁🏻♀️ Woman Tipping Hand: Light Skin Tone
💁🏼♀️ Woman Tipping Hand: Medium-Light Skin Tone
💁🏽♀️ Woman Tipping Hand: Medium Skin Tone
💁🏾♀️ Woman Tipping Hand: Medium-Dark Skin Tone
💁🏿♀️ Woman Tipping Hand: Dark Skin Tone
🙋 Person Raising Hand
🙋🏻 Person Raising Hand: Light Skin Tone
🙋🏼 Person Raising Hand: Medium-Light Skin Tone
🙋🏽 Person Raising Hand: Medium Skin Tone
🙋🏾 Person Raising Hand: Medium-Dark Skin Tone
🙋🏿 Person Raising Hand: Dark Skin Tone
🙋♂️ Man Raising Hand
🙋🏻♂️ Man Raising Hand: Light Skin Tone
🙋🏼♂️ Man Raising Hand: Medium-Light Skin Tone
🙋🏽♂️ Man Raising Hand: Medium Skin Tone
🙋🏾♂️ Man Raising Hand: Medium-Dark Skin Tone
🙋🏿♂️ Man Raising Hand: Dark Skin Tone
🙋♀️ Woman Raising Hand
🙋🏻♀️ Woman Raising Hand: Light Skin Tone
🙋🏼♀️ Woman Raising Hand: Medium-Light Skin Tone
🙋🏽♀️ Woman Raising Hand: Medium Skin Tone
🙋🏾♀️ Woman Raising Hand: Medium-Dark Skin Tone
🙋🏿♀️ Woman Raising Hand: Dark Skin Tone
🙇 Person Bowing
🙇🏻 Person Bowing: Light Skin Tone
🙇🏼 Person Bowing: Medium-Light Skin Tone
🙇🏽 Person Bowing: Medium Skin Tone
🙇🏾 Person Bowing: Medium-Dark Skin Tone
🙇🏿 Person Bowing: Dark Skin Tone
🙇♂️ Man Bowing
🙇🏻♂️ Man Bowing: Light Skin Tone
🙇🏼♂️ Man Bowing: Medium-Light Skin Tone
🙇🏽♂️ Man Bowing: Medium Skin Tone
🙇🏾♂️ Man Bowing: Medium-Dark Skin Tone
🙇🏿♂️ Man Bowing: Dark Skin Tone
🤝🏻 Handshake, Type-1-2
🤝🏼 Handshake, Type-3
🤝🏽 Handshake, Type-4
🤝🏾 Handshake, Type-5
🤝🏿 Handshake, Type-6
🙇♀️ Woman Bowing
🙇🏻♀️ Woman Bowing: Light Skin Tone
🙇🏼♀️ Woman Bowing: Medium-Light Skin Tone
🙇🏽♀️ Woman Bowing: Medium Skin Tone
🙇🏾♀️ Woman Bowing: Medium-Dark Skin Tone
🙇🏿♀️ Woman Bowing: Dark Skin Tone
🤦 Person Facepalming
🤦🏻 Person Facepalming: Light Skin Tone
🤦🏼 Person Facepalming: Medium-Light Skin Tone
🤦🏽 Person Facepalming: Medium Skin Tone
🤦🏾 Person Facepalming: Medium-Dark Skin Tone
🤦🏿 Person Facepalming: Dark Skin Tone
🤦♂️ Man Facepalming
🤦🏻♂️ Man Facepalming: Light Skin Tone
🤦🏼♂️ Man Facepalming: Medium-Light Skin Tone
🤦🏽♂️ Man Facepalming: Medium Skin Tone
🤦🏾♂️ Man Facepalming: Medium-Dark Skin Tone
🤦🏿♂️ Man Facepalming: Dark Skin Tone
🤦♀️ Woman Facepalming
🤦🏻♀️ Woman Facepalming: Light Skin Tone
🤦🏼♀️ Woman Facepalming: Medium-Light Skin Tone
🤦🏽♀️ Woman Facepalming: Medium Skin Tone
🤦🏾♀️ Woman Facepalming: Medium-Dark Skin Tone
🤦🏿♀️ Woman Facepalming: Dark Skin Tone
🤷 Person Shrugging
🤷🏻 Person Shrugging: Light Skin Tone
🤷🏼 Person Shrugging: Medium-Light Skin Tone
🤷🏽 Person Shrugging: Medium Skin Tone
🤷🏾 Person Shrugging: Medium-Dark Skin Tone
🤷🏿 Person Shrugging: Dark Skin Tone
🤷♂️ Man Shrugging
🤷🏻♂️ Man Shrugging: Light Skin Tone
🤷🏼♂️ Man Shrugging: Medium-Light Skin Tone
🤷🏽♂️ Man Shrugging: Medium Skin Tone
🤷🏾♂️ Man Shrugging: Medium-Dark Skin Tone
🤷🏿♂️ Man Shrugging: Dark Skin Tone
🤷♀️ Woman Shrugging
🤷🏻♀️ Woman Shrugging: Light Skin Tone
🤷🏼♀️ Woman Shrugging: Medium-Light Skin Tone
🤷🏽♀️ Woman Shrugging: Medium Skin Tone
🤷🏾♀️ Woman Shrugging: Medium-Dark Skin Tone
🤷🏿♀️ Woman Shrugging: Dark Skin Tone
💆 Person Getting Massage
💆🏻 Person Getting Massage: Light Skin Tone
💆🏼 Person Getting Massage: Medium-Light Skin Tone
💆🏽 Person Getting Massage: Medium Skin Tone
💆🏾 Person Getting Massage: Medium-Dark Skin Tone
💆🏿 Person Getting Massage: Dark Skin Tone
💆♂️ Man Getting Massage
💆🏻♂️ Man Getting Massage: Light Skin Tone
💆🏼♂️ Man Getting Massage: Medium-Light Skin Tone
💆🏽♂️ Man Getting Massage: Medium Skin Tone
💆🏾♂️ Man Getting Massage: Medium-Dark Skin Tone
💆🏿♂️ Man Getting Massage: Dark Skin Tone
💆♀️ Woman Getting Massage
💆🏻♀️ Woman Getting Massage: Light Skin Tone
💆🏼♀️ Woman Getting Massage: Medium-Light Skin Tone
💆🏽♀️ Woman Getting Massage: Medium Skin Tone
💆🏾♀️ Woman Getting Massage: Medium-Dark Skin Tone
💆🏿♀️ Woman Getting Massage: Dark Skin Tone
💇 Person Getting Haircut
💇🏻 Person Getting Haircut: Light Skin Tone
💇🏼 Person Getting Haircut: Medium-Light Skin Tone
💇🏽 Person Getting Haircut: Medium Skin Tone
💇🏾 Person Getting Haircut: Medium-Dark Skin Tone
💇🏿 Person Getting Haircut: Dark Skin Tone
💇♂️ Man Getting Haircut
💇🏻♂️ Man Getting Haircut: Light Skin Tone
💇🏼♂️ Man Getting Haircut: Medium-Light Skin Tone
💇🏽♂️ Man Getting Haircut: Medium Skin Tone
💇🏾♂️ Man Getting Haircut: Medium-Dark Skin Tone
💇🏿♂️ Man Getting Haircut: Dark Skin Tone
💇♀️ Woman Getting Haircut
💇🏻♀️ Woman Getting Haircut: Light Skin Tone
💇🏼♀️ Woman Getting Haircut: Medium-Light Skin Tone
💇🏽♀️ Woman Getting Haircut: Medium Skin Tone
💇🏾♀️ Woman Getting Haircut: Medium-Dark Skin Tone
💇🏿♀️ Woman Getting Haircut: Dark Skin Tone
🚶 Person Walking
🚶🏻 Person Walking: Light Skin Tone
🚶🏼 Person Walking: Medium-Light Skin Tone
🚶🏽 Person Walking: Medium Skin Tone
🚶🏾 Person Walking: Medium-Dark Skin Tone
🚶🏿 Person Walking: Dark Skin Tone
🚶♂️ Man Walking
🚶🏻♂️ Man Walking: Light Skin Tone
🚶🏼♂️ Man Walking: Medium-Light Skin Tone
🚶🏽♂️ Man Walking: Medium Skin Tone
🚶🏾♂️ Man Walking: Medium-Dark Skin Tone
🚶🏿♂️ Man Walking: Dark Skin Tone
🚶♀️ Woman Walking
🚶🏻♀️ Woman Walking: Light Skin Tone
🚶🏼♀️ Woman Walking: Medium-Light Skin Tone
🚶🏽♀️ Woman Walking: Medium Skin Tone
🚶🏾♀️ Woman Walking: Medium-Dark Skin Tone
🚶🏿♀️ Woman Walking: Dark Skin Tone
🏃 Person Running
🏃🏻 Person Running: Light Skin Tone
🏃🏼 Person Running: Medium-Light Skin Tone
🏃🏽 Person Running: Medium Skin Tone
🏃🏾 Person Running: Medium-Dark Skin Tone
🏃🏿 Person Running: Dark Skin Tone
🏃♂️ Man Running
🏃🏻♂️ Man Running: Light Skin Tone
🏃🏼♂️ Man Running: Medium-Light Skin Tone
🏃🏽♂️ Man Running: Medium Skin Tone
🏃🏾♂️ Man Running: Medium-Dark Skin Tone
🏃🏿♂️ Man Running: Dark Skin Tone
🏃♀️ Woman Running
🏃🏻♀️ Woman Running: Light Skin Tone
🏃🏼♀️ Woman Running: Medium-Light Skin Tone
🏃🏽♀️ Woman Running: Medium Skin Tone
🏃🏾♀️ Woman Running: Medium-Dark Skin Tone
🏃🏿♀️ Woman Running: Dark Skin Tone
💃 Woman Dancing
💃🏻 Woman Dancing: Light Skin Tone
💃🏼 Woman Dancing: Medium-Light Skin Tone
💃🏽 Woman Dancing: Medium Skin Tone
💃🏾 Woman Dancing: Medium-Dark Skin Tone
💃🏿 Woman Dancing: Dark Skin Tone
🕺 Man Dancing
🕺🏻 Man Dancing: Light Skin Tone
🕺🏼 Man Dancing: Medium-Light Skin Tone
🕺🏽 Man Dancing: Medium Skin Tone
🕺🏾 Man Dancing: Medium-Dark Skin Tone
🕺🏿 Man Dancing: Dark Skin Tone
👯 People With Bunny Ears Partying
👯♂️ Men With Bunny Ears Partying
👯♀️ Women With Bunny Ears Partying
🧖 Person in Steamy Room
🧖🏻 Person in Steamy Room: Light Skin Tone
🧖🏼 Person in Steamy Room: Medium-Light Skin Tone
🧖🏽 Person in Steamy Room: Medium Skin Tone
🧖🏾 Person in Steamy Room: Medium-Dark Skin Tone
🧖🏿 Person in Steamy Room: Dark Skin Tone
🧖♀️ Woman in Steamy Room
🧖🏻♀️ Woman in Steamy Room: Light Skin Tone
🧖🏼♀️ Woman in Steamy Room: Medium-Light Skin Tone
🧖🏽♀️ Woman in Steamy Room: Medium Skin Tone
🧖🏾♀️ Woman in Steamy Room: Medium-Dark Skin Tone
🧖🏿♀️ Woman in Steamy Room: Dark Skin Tone
🧖♂️ Man in Steamy Room
🧖🏻♂️ Man in Steamy Room: Light Skin Tone
🧖🏼♂️ Man in Steamy Room: Medium-Light Skin Tone
🧖🏽♂️ Man in Steamy Room: Medium Skin Tone
🧖🏾♂️ Man in Steamy Room: Medium-Dark Skin Tone
🧖🏿♂️ Man in Steamy Room: Dark Skin Tone
🧗 Person Climbing
🧗🏻 Person Climbing: Light Skin Tone
🧗🏼 Person Climbing: Medium-Light Skin Tone
🧗🏽 Person Climbing: Medium Skin Tone
🧗🏾 Person Climbing: Medium-Dark Skin Tone
🧗🏿 Person Climbing: Dark Skin Tone
🧗♀️ Woman Climbing
🧗🏻♀️ Woman Climbing: Light Skin Tone
🧗🏼♀️ Woman Climbing: Medium-Light Skin Tone
🧗🏽♀️ Woman Climbing: Medium Skin Tone
🧗🏾♀️ Woman Climbing: Medium-Dark Skin Tone
🧗🏿♀️ Woman Climbing: Dark Skin Tone
🧗♂️ Man Climbing
🧗🏻♂️ Man Climbing: Light Skin Tone
🧗🏼♂️ Man Climbing: Medium-Light Skin Tone
🧗🏽♂️ Man Climbing: Medium Skin Tone
🧗🏾♂️ Man Climbing: Medium-Dark Skin Tone
🧗🏿♂️ Man Climbing: Dark Skin Tone
🧘 Person in Lotus Position
🧘🏻 Person in Lotus Position: Light Skin Tone
🧘🏼 Person in Lotus Position: Medium-Light Skin Tone
🧘🏽 Person in Lotus Position: Medium Skin Tone
🧘🏾 Person in Lotus Position: Medium-Dark Skin Tone
🧘🏿 Person in Lotus Position: Dark Skin Tone
🧘♀️ Woman in Lotus Position
🧘🏻♀️ Woman in Lotus Position: Light Skin Tone
🧘🏼♀️ Woman in Lotus Position: Medium-Light Skin Tone
🧘🏽♀️ Woman in Lotus Position: Medium Skin Tone
🧘🏾♀️ Woman in Lotus Position: Medium-Dark Skin Tone
🧘🏿♀️ Woman in Lotus Position: Dark Skin Tone
🧘♂️ Man in Lotus Position
🧘🏻♂️ Man in Lotus Position: Light Skin Tone
🧘🏼♂️ Man in Lotus Position: Medium-Light Skin Tone
🧘🏽♂️ Man in Lotus Position: Medium Skin Tone
🧘🏾♂️ Man in Lotus Position: Medium-Dark Skin Tone
🧘🏿♂️ Man in Lotus Position: Dark Skin Tone
🛀 Person Taking Bath
🛀🏻 Person Taking Bath: Light Skin Tone
🛀🏼 Person Taking Bath: Medium-Light Skin Tone
🛀🏽 Person Taking Bath: Medium Skin Tone
🛀🏾 Person Taking Bath: Medium-Dark Skin Tone
🛀🏿 Person Taking Bath: Dark Skin Tone
🛌 Person in Bed
🛌🏻 Person in Bed: Light Skin Tone
🛌🏼 Person in Bed: Medium-Light Skin Tone
🛌🏽 Person in Bed: Medium Skin Tone
🛌🏾 Person in Bed: Medium-Dark Skin Tone
🛌🏿 Person in Bed: Dark Skin Tone
🕴 Man in Business Suit Levitating
🕴🏻 Man in Business Suit Levitating: Light Skin Tone
🕴🏼 Man in Business Suit Levitating: Medium-Light Skin Tone
🕴🏽 Man in Business Suit Levitating: Medium Skin Tone
🕴🏾 Man in Business Suit Levitating: Medium-Dark Skin Tone
🕴🏿 Man in Business Suit Levitating: Dark Skin Tone
🗣 Speaking Head
👤 Bust in Silhouette
👥 Busts in Silhouette
🤺 Person Fencing
🏇 Horse Racing
🏇🏻 Horse Racing: Light Skin Tone
🏇🏼 Horse Racing: Medium-Light Skin Tone
🏇🏽 Horse Racing: Medium Skin Tone
🏇🏾 Horse Racing: Medium-Dark Skin Tone
🏇🏿 Horse Racing: Dark Skin Tone
⛷ Skier
🏂 Snowboarder
🏂🏻 Snowboarder: Light Skin Tone
🏂🏼 Snowboarder: Medium-Light Skin Tone
🏂🏽 Snowboarder: Medium Skin Tone
🏂🏾 Snowboarder: Medium-Dark Skin Tone
🏂🏿 Snowboarder: Dark Skin Tone
🏌 Person Golfing
🏌🏻 Person Golfing: Light Skin Tone
🏌🏼 Person Golfing: Medium-Light Skin Tone
🏌🏽 Person Golfing: Medium Skin Tone
🏌🏾 Person Golfing: Medium-Dark Skin Tone
🏌🏿 Person Golfing: Dark Skin Tone
🏌️‍♂️ Man Golfing
🏌🏻‍♂️ Man Golfing: Light Skin Tone
🏌🏼‍♂️ Man Golfing: Medium-Light Skin Tone
🏌🏽‍♂️ Man Golfing: Medium Skin Tone
🏌🏾‍♂️ Man Golfing: Medium-Dark Skin Tone
🏌🏿‍♂️ Man Golfing: Dark Skin Tone
🏌️‍♀️ Woman Golfing
🏌🏻‍♀️ Woman Golfing: Light Skin Tone
🏌🏼‍♀️ Woman Golfing: Medium-Light Skin Tone
🏌🏽‍♀️ Woman Golfing: Medium Skin Tone
🏌🏾‍♀️ Woman Golfing: Medium-Dark Skin Tone
🏌🏿‍♀️ Woman Golfing: Dark Skin Tone
🏄 Person Surfing
🏄🏻 Person Surfing: Light Skin Tone
🏄🏼 Person Surfing: Medium-Light Skin Tone
🏄🏽 Person Surfing: Medium Skin Tone
🏄🏾 Person Surfing: Medium-Dark Skin Tone
🏄🏿 Person Surfing: Dark Skin Tone
🏄‍♂️ Man Surfing
🏄🏻‍♂️ Man Surfing: Light Skin Tone
🏄🏼‍♂️ Man Surfing: Medium-Light Skin Tone
🏄🏽‍♂️ Man Surfing: Medium Skin Tone
🏄🏾‍♂️ Man Surfing: Medium-Dark Skin Tone
🏄🏿‍♂️ Man Surfing: Dark Skin Tone
🏄‍♀️ Woman Surfing
🏄🏻‍♀️ Woman Surfing: Light Skin Tone
🏄🏼‍♀️ Woman Surfing: Medium-Light Skin Tone
🏄🏽‍♀️ Woman Surfing: Medium Skin Tone
🏄🏾‍♀️ Woman Surfing: Medium-Dark Skin Tone
🏄🏿‍♀️ Woman Surfing: Dark Skin Tone
🚣 Person Rowing Boat
🚣🏻 Person Rowing Boat: Light Skin Tone
🚣🏼 Person Rowing Boat: Medium-Light Skin Tone
🚣🏽 Person Rowing Boat: Medium Skin Tone
🚣🏾 Person Rowing Boat: Medium-Dark Skin Tone
🚣🏿 Person Rowing Boat: Dark Skin Tone
🚣‍♂️ Man Rowing Boat
🚣🏻‍♂️ Man Rowing Boat: Light Skin Tone
🚣🏼‍♂️ Man Rowing Boat: Medium-Light Skin Tone
🚣🏽‍♂️ Man Rowing Boat: Medium Skin Tone
🚣🏾‍♂️ Man Rowing Boat: Medium-Dark Skin Tone
🚣🏿‍♂️ Man Rowing Boat: Dark Skin Tone
🚣‍♀️ Woman Rowing Boat
🚣🏻‍♀️ Woman Rowing Boat: Light Skin Tone
🚣🏼‍♀️ Woman Rowing Boat: Medium-Light Skin Tone
🚣🏽‍♀️ Woman Rowing Boat: Medium Skin Tone
🚣🏾‍♀️ Woman Rowing Boat: Medium-Dark Skin Tone
🚣🏿‍♀️ Woman Rowing Boat: Dark Skin Tone
🏊 Person Swimming
🏊🏻 Person Swimming: Light Skin Tone
🏊🏼 Person Swimming: Medium-Light Skin Tone
🏊🏽 Person Swimming: Medium Skin Tone
🏊🏾 Person Swimming: Medium-Dark Skin Tone
🏊🏿 Person Swimming: Dark Skin Tone
🏊‍♂️ Man Swimming
🏊🏻‍♂️ Man Swimming: Light Skin Tone
🏊🏼‍♂️ Man Swimming: Medium-Light Skin Tone
🏊🏽‍♂️ Man Swimming: Medium Skin Tone
🏊🏾‍♂️ Man Swimming: Medium-Dark Skin Tone
🏊🏿‍♂️ Man Swimming: Dark Skin Tone
🏊‍♀️ Woman Swimming
🏊🏻‍♀️ Woman Swimming: Light Skin Tone
🏊🏼‍♀️ Woman Swimming: Medium-Light Skin Tone
🏊🏽‍♀️ Woman Swimming: Medium Skin Tone
🏊🏾‍♀️ Woman Swimming: Medium-Dark Skin Tone
🏊🏿‍♀️ Woman Swimming: Dark Skin Tone
⛹ Person Bouncing Ball
⛹🏻 Person Bouncing Ball: Light Skin Tone
⛹🏼 Person Bouncing Ball: Medium-Light Skin Tone
⛹🏽 Person Bouncing Ball: Medium Skin Tone
⛹🏾 Person Bouncing Ball: Medium-Dark Skin Tone
⛹🏿 Person Bouncing Ball: Dark Skin Tone
⛹️‍♂️ Man Bouncing Ball
⛹🏻‍♂️ Man Bouncing Ball: Light Skin Tone
⛹🏼‍♂️ Man Bouncing Ball: Medium-Light Skin Tone
⛹🏽‍♂️ Man Bouncing Ball: Medium Skin Tone
⛹🏾‍♂️ Man Bouncing Ball: Medium-Dark Skin Tone
⛹🏿‍♂️ Man Bouncing Ball: Dark Skin Tone
⛹️‍♀️ Woman Bouncing Ball
⛹🏻‍♀️ Woman Bouncing Ball: Light Skin Tone
⛹🏼‍♀️ Woman Bouncing Ball: Medium-Light Skin Tone
⛹🏽‍♀️ Woman Bouncing Ball: Medium Skin Tone
⛹🏾‍♀️ Woman Bouncing Ball: Medium-Dark Skin Tone
⛹🏿‍♀️ Woman Bouncing Ball: Dark Skin Tone
🏋 Person Lifting Weights
🏋🏻 Person Lifting Weights: Light Skin Tone
🏋🏼 Person Lifting Weights: Medium-Light Skin Tone
🏋🏽 Person Lifting Weights: Medium Skin Tone
🏋🏾 Person Lifting Weights: Medium-Dark Skin Tone
🏋🏿 Person Lifting Weights: Dark Skin Tone
🏋️‍♂️ Man Lifting Weights
🏋🏻‍♂️ Man Lifting Weights: Light Skin Tone
🏋🏼‍♂️ Man Lifting Weights: Medium-Light Skin Tone
🏋🏽‍♂️ Man Lifting Weights: Medium Skin Tone
🏋🏾‍♂️ Man Lifting Weights: Medium-Dark Skin Tone
🏋🏿‍♂️ Man Lifting Weights: Dark Skin Tone
🏋️‍♀️ Woman Lifting Weights
🏋🏻‍♀️ Woman Lifting Weights: Light Skin Tone
🏋🏼‍♀️ Woman Lifting Weights: Medium-Light Skin Tone
🏋🏽‍♀️ Woman Lifting Weights: Medium Skin Tone
🏋🏾‍♀️ Woman Lifting Weights: Medium-Dark Skin Tone
🏋🏿‍♀️ Woman Lifting Weights: Dark Skin Tone
🚴 Person Biking
🚴🏻 Person Biking: Light Skin Tone
🚴🏼 Person Biking: Medium-Light Skin Tone
🚴🏽 Person Biking: Medium Skin Tone
🚴🏾 Person Biking: Medium-Dark Skin Tone
🚴🏿 Person Biking: Dark Skin Tone
🚴‍♂️ Man Biking
🚴🏻‍♂️ Man Biking: Light Skin Tone
🚴🏼‍♂️ Man Biking: Medium-Light Skin Tone
🚴🏽‍♂️ Man Biking: Medium Skin Tone
🚴🏾‍♂️ Man Biking: Medium-Dark Skin Tone
🚴🏿‍♂️ Man Biking: Dark Skin Tone
🚴‍♀️ Woman Biking
🚴🏻‍♀️ Woman Biking: Light Skin Tone
🚴🏼‍♀️ Woman Biking: Medium-Light Skin Tone
🚴🏽‍♀️ Woman Biking: Medium Skin Tone
🚴🏾‍♀️ Woman Biking: Medium-Dark Skin Tone
🚴🏿‍♀️ Woman Biking: Dark Skin Tone
🚵 Person Mountain Biking
🚵🏻 Person Mountain Biking: Light Skin Tone
🚵🏼 Person Mountain Biking: Medium-Light Skin Tone
🚵🏽 Person Mountain Biking: Medium Skin Tone
🚵🏾 Person Mountain Biking: Medium-Dark Skin Tone
🚵🏿 Person Mountain Biking: Dark Skin Tone
🚵‍♂️ Man Mountain Biking
🚵🏻‍♂️ Man Mountain Biking: Light Skin Tone
🚵🏼‍♂️ Man Mountain Biking: Medium-Light Skin Tone
🚵🏽‍♂️ Man Mountain Biking: Medium Skin Tone
🚵🏾‍♂️ Man Mountain Biking: Medium-Dark Skin Tone
🚵🏿‍♂️ Man Mountain Biking: Dark Skin Tone
🚵‍♀️ Woman Mountain Biking
🚵🏻‍♀️ Woman Mountain Biking: Light Skin Tone
🚵🏼‍♀️ Woman Mountain Biking: Medium-Light Skin Tone
🚵🏽‍♀️ Woman Mountain Biking: Medium Skin Tone
🚵🏾‍♀️ Woman Mountain Biking: Medium-Dark Skin Tone
🚵🏿‍♀️ Woman Mountain Biking: Dark Skin Tone
🏎 Racing Car
🏍 Motorcycle
🤸 Person Cartwheeling
🤸🏻 Person Cartwheeling: Light Skin Tone
🤸🏼 Person Cartwheeling: Medium-Light Skin Tone
🤸🏽 Person Cartwheeling: Medium Skin Tone
🤸🏾 Person Cartwheeling: Medium-Dark Skin Tone
🤸🏿 Person Cartwheeling: Dark Skin Tone
🤸‍♂️ Man Cartwheeling
🤸🏻‍♂️ Man Cartwheeling: Light Skin Tone
🤸🏼‍♂️ Man Cartwheeling: Medium-Light Skin Tone
🤸🏽‍♂️ Man Cartwheeling: Medium Skin Tone
🤸🏾‍♂️ Man Cartwheeling: Medium-Dark Skin Tone
🤸🏿‍♂️ Man Cartwheeling: Dark Skin Tone
🤸‍♀️ Woman Cartwheeling
🤸🏻‍♀️ Woman Cartwheeling: Light Skin Tone
🤸🏼‍♀️ Woman Cartwheeling: Medium-Light Skin Tone
🤸🏽‍♀️ Woman Cartwheeling: Medium Skin Tone
🤸🏾‍♀️ Woman Cartwheeling: Medium-Dark Skin Tone
🤸🏿‍♀️ Woman Cartwheeling: Dark Skin Tone
🤼 People Wrestling
🤼‍♂️ Men Wrestling
🤼‍♀️ Women Wrestling
🤽 Person Playing Water Polo
🤽🏻 Person Playing Water Polo: Light Skin Tone
🤽🏼 Person Playing Water Polo: Medium-Light Skin Tone
🤽🏽 Person Playing Water Polo: Medium Skin Tone
🤽🏾 Person Playing Water Polo: Medium-Dark Skin Tone
🤽🏿 Person Playing Water Polo: Dark Skin Tone
🤽‍♂️ Man Playing Water Polo
🤽🏻‍♂️ Man Playing Water Polo: Light Skin Tone
🤽🏼‍♂️ Man Playing Water Polo: Medium-Light Skin Tone
🤽🏽‍♂️ Man Playing Water Polo: Medium Skin Tone
🤽🏾‍♂️ Man Playing Water Polo: Medium-Dark Skin Tone
🤽🏿‍♂️ Man Playing Water Polo: Dark Skin Tone
🤽‍♀️ Woman Playing Water Polo
🤽🏻‍♀️ Woman Playing Water Polo: Light Skin Tone
🤽🏼‍♀️ Woman Playing Water Polo: Medium-Light Skin Tone
🤽🏽‍♀️ Woman Playing Water Polo: Medium Skin Tone
🤽🏾‍♀️ Woman Playing Water Polo: Medium-Dark Skin Tone
🤽🏿‍♀️ Woman Playing Water Polo: Dark Skin Tone
🤾 Person Playing Handball
🤾🏻 Person Playing Handball: Light Skin Tone
🤾🏼 Person Playing Handball: Medium-Light Skin Tone
🤾🏽 Person Playing Handball: Medium Skin Tone
🤾🏾 Person Playing Handball: Medium-Dark Skin Tone
🤾🏿 Person Playing Handball: Dark Skin Tone
🤾‍♂️ Man Playing Handball
🤾🏻‍♂️ Man Playing Handball: Light Skin Tone
🤾🏼‍♂️ Man Playing Handball: Medium-Light Skin Tone
🤾🏽‍♂️ Man Playing Handball: Medium Skin Tone
🤾🏾‍♂️ Man Playing Handball: Medium-Dark Skin Tone
🤾🏿‍♂️ Man Playing Handball: Dark Skin Tone
🤾‍♀️ Woman Playing Handball
🤾🏻‍♀️ Woman Playing Handball: Light Skin Tone
🤾🏼‍♀️ Woman Playing Handball: Medium-Light Skin Tone
🤾🏽‍♀️ Woman Playing Handball: Medium Skin Tone
🤾🏾‍♀️ Woman Playing Handball: Medium-Dark Skin Tone
🤾🏿‍♀️ Woman Playing Handball: Dark Skin Tone
🤹 Person Juggling
🤹🏻 Person Juggling: Light Skin Tone
🤹🏼 Person Juggling: Medium-Light Skin Tone
🤹🏽 Person Juggling: Medium Skin Tone
🤹🏾 Person Juggling: Medium-Dark Skin Tone
🤹🏿 Person Juggling: Dark Skin Tone
🤹‍♂️ Man Juggling
🤹🏻‍♂️ Man Juggling: Light Skin Tone
🤹🏼‍♂️ Man Juggling: Medium-Light Skin Tone
🤹🏽‍♂️ Man Juggling: Medium Skin Tone
🤹🏾‍♂️ Man Juggling: Medium-Dark Skin Tone
🤹🏿‍♂️ Man Juggling: Dark Skin Tone
🤹‍♀️ Woman Juggling
🤹🏻‍♀️ Woman Juggling: Light Skin Tone
🤹🏼‍♀️ Woman Juggling: Medium-Light Skin Tone
🤹🏽‍♀️ Woman Juggling: Medium Skin Tone
🤹🏾‍♀️ Woman Juggling: Medium-Dark Skin Tone
🤹🏿‍♀️ Woman Juggling: Dark Skin Tone
🤼🏻 Wrestlers, Type-1-2
🤼🏼 Wrestlers, Type-3
🤼🏽 Wrestlers, Type-4
🤼🏾 Wrestlers, Type-5
🤼🏿 Wrestlers, Type-6
🤼🏻‍♂️ Men Wrestling, Type-1-2
🤼🏼‍♂️ Men Wrestling, Type-3
🤼🏽‍♂️ Men Wrestling, Type-4
🤼🏾‍♂️ Men Wrestling, Type-5
🤼🏿‍♂️ Men Wrestling, Type-6
🤼🏻‍♀️ Women Wrestling, Type-1-2
🤼🏼‍♀️ Women Wrestling, Type-3
🤼🏽‍♀️ Women Wrestling, Type-4
🤼🏾‍♀️ Women Wrestling, Type-5
🤼🏿‍♀️ Women Wrestling, Type-6
👫 Man and Woman Holding Hands
👬 Two Men Holding Hands
👭 Two Women Holding Hands
💏 Kiss
👩‍❤️‍💋‍👨 Kiss: Woman, Man
👨‍❤️‍💋‍👨 Kiss: Man, Man
👩‍❤️‍💋‍👩 Kiss: Woman, Woman
💑 Couple With Heart
👩‍❤️‍👨 Couple With Heart: Woman, Man
👨‍❤️‍👨 Couple With Heart: Man, Man
👩‍❤️‍👩 Couple With Heart: Woman, Woman
👪 Family
👨‍👩‍👦 Family: Man, Woman, Boy
👨‍👩‍👧 Family: Man, Woman, Girl
👨‍👩‍👧‍👦 Family: Man, Woman, Girl, Boy
👨‍👩‍👦‍👦 Family: Man, Woman, Boy, Boy
👨‍👩‍👧‍👧 Family: Man, Woman, Girl, Girl
👨‍👨‍👦 Family: Man, Man, Boy
👨‍👨‍👧 Family: Man, Man, Girl
👨‍👨‍👧‍👦 Family: Man, Man, Girl, Boy
👨‍👨‍👦‍👦 Family: Man, Man, Boy, Boy
👨‍👨‍👧‍👧 Family: Man, Man, Girl, Girl
👩‍👩‍👦 Family: Woman, Woman, Boy
👩‍👩‍👧 Family: Woman, Woman, Girl
👩‍👩‍👧‍👦 Family: Woman, Woman, Girl, Boy
👩‍👩‍👦‍👦 Family: Woman, Woman, Boy, Boy
👩‍👩‍👧‍👧 Family: Woman, Woman, Girl, Girl
👨‍👦 Family: Man, Boy
👨‍👦‍👦 Family: Man, Boy, Boy
👨‍👧 Family: Man, Girl
👨‍👧‍👦 Family: Man, Girl, Boy
👨‍👧‍👧 Family: Man, Girl, Girl
👩‍👦 Family: Woman, Boy
👩‍👦‍👦 Family: Woman, Boy, Boy
👩‍👧 Family: Woman, Girl
👩‍👧‍👦 Family: Woman, Girl, Boy
👩‍👧‍👧 Family: Woman, Girl, Girl
🤳 Selfie
🤳🏻 Selfie: Light Skin Tone
🤳🏼 Selfie: Medium-Light Skin Tone
🤳🏽 Selfie: Medium Skin Tone
🤳🏾 Selfie: Medium-Dark Skin Tone
🤳🏿 Selfie: Dark Skin Tone
💪 Flexed Biceps
💪🏻 Flexed Biceps: Light Skin Tone
💪🏼 Flexed Biceps: Medium-Light Skin Tone
💪🏽 Flexed Biceps: Medium Skin Tone
💪🏾 Flexed Biceps: Medium-Dark Skin Tone
💪🏿 Flexed Biceps: Dark Skin Tone
👈 Backhand Index Pointing Left
👈🏻 Backhand Index Pointing Left: Light Skin Tone
👈🏼 Backhand Index Pointing Left: Medium-Light Skin Tone
👈🏽 Backhand Index Pointing Left: Medium Skin Tone
👈🏾 Backhand Index Pointing Left: Medium-Dark Skin Tone
👈🏿 Backhand Index Pointing Left: Dark Skin Tone
👉 Backhand Index Pointing Right
👉🏻 Backhand Index Pointing Right: Light Skin Tone
👉🏼 Backhand Index Pointing Right: Medium-Light Skin Tone
👉🏽 Backhand Index Pointing Right: Medium Skin Tone
👉🏾 Backhand Index Pointing Right: Medium-Dark Skin Tone
👉🏿 Backhand Index Pointing Right: Dark Skin Tone
☝ Index Pointing Up
☝🏻 Index Pointing Up: Light Skin Tone
☝🏼 Index Pointing Up: Medium-Light Skin Tone
☝🏽 Index Pointing Up: Medium Skin Tone
☝🏾 Index Pointing Up: Medium-Dark Skin Tone
☝🏿 Index Pointing Up: Dark Skin Tone
👆 Backhand Index Pointing Up
👆🏻 Backhand Index Pointing Up: Light Skin Tone
👆🏼 Backhand Index Pointing Up: Medium-Light Skin Tone
👆🏽 Backhand Index Pointing Up: Medium Skin Tone
👆🏾 Backhand Index Pointing Up: Medium-Dark Skin Tone
👆🏿 Backhand Index Pointing Up: Dark Skin Tone
🖕 Middle Finger
🖕🏻 Middle Finger: Light Skin Tone
🖕🏼 Middle Finger: Medium-Light Skin Tone
🖕🏽 Middle Finger: Medium Skin Tone
🖕🏾 Middle Finger: Medium-Dark Skin Tone
🖕🏿 Middle Finger: Dark Skin Tone
👇 Backhand Index Pointing Down
👇🏻 Backhand Index Pointing Down: Light Skin Tone
👇🏼 Backhand Index Pointing Down: Medium-Light Skin Tone
👇🏽 Backhand Index Pointing Down: Medium Skin Tone
👇🏾 Backhand Index Pointing Down: Medium-Dark Skin Tone
👇🏿 Backhand Index Pointing Down: Dark Skin Tone
✌ Victory Hand
✌🏻 Victory Hand: Light Skin Tone
✌🏼 Victory Hand: Medium-Light Skin Tone
✌🏽 Victory Hand: Medium Skin Tone
✌🏾 Victory Hand: Medium-Dark Skin Tone
✌🏿 Victory Hand: Dark Skin Tone
🤞 Crossed Fingers
🤞🏻 Crossed Fingers: Light Skin Tone
🤞🏼 Crossed Fingers: Medium-Light Skin Tone
🤞🏽 Crossed Fingers: Medium Skin Tone
🤞🏾 Crossed Fingers: Medium-Dark Skin Tone
🤞🏿 Crossed Fingers: Dark Skin Tone
🖖 Vulcan Salute
🖖🏻 Vulcan Salute: Light Skin Tone
🖖🏼 Vulcan Salute: Medium-Light Skin Tone
🖖🏽 Vulcan Salute: Medium Skin Tone
🖖🏾 Vulcan Salute: Medium-Dark Skin Tone
🖖🏿 Vulcan Salute: Dark Skin Tone
🤘 Sign of the Horns
🤘🏻 Sign of the Horns: Light Skin Tone
🤘🏼 Sign of the Horns: Medium-Light Skin Tone
🤘🏽 Sign of the Horns: Medium Skin Tone
🤘🏾 Sign of the Horns: Medium-Dark Skin Tone
🤘🏿 Sign of the Horns: Dark Skin Tone
🤙 Call Me Hand
🤙🏻 Call Me Hand: Light Skin Tone
🤙🏼 Call Me Hand: Medium-Light Skin Tone
🤙🏽 Call Me Hand: Medium Skin Tone
🤙🏾 Call Me Hand: Medium-Dark Skin Tone
🤙🏿 Call Me Hand: Dark Skin Tone
🖐 Raised Hand With Fingers Splayed
🖐🏻 Raised Hand With Fingers Splayed: Light Skin Tone
🖐🏼 Raised Hand With Fingers Splayed: Medium-Light Skin Tone
🖐🏽 Raised Hand With Fingers Splayed: Medium Skin Tone
🖐🏾 Raised Hand With Fingers Splayed: Medium-Dark Skin Tone
🖐🏿 Raised Hand With Fingers Splayed: Dark Skin Tone
✋ Raised Hand
✋🏻 Raised Hand: Light Skin Tone
✋🏼 Raised Hand: Medium-Light Skin Tone
✋🏽 Raised Hand: Medium Skin Tone
✋🏾 Raised Hand: Medium-Dark Skin Tone
✋🏿 Raised Hand: Dark Skin Tone
👌 OK Hand
👌🏻 OK Hand: Light Skin Tone
👌🏼 OK Hand: Medium-Light Skin Tone
👌🏽 OK Hand: Medium Skin Tone
👌🏾 OK Hand: Medium-Dark Skin Tone
👌🏿 OK Hand: Dark Skin Tone
👍 Thumbs Up
👍🏻 Thumbs Up: Light Skin Tone
👍🏼 Thumbs Up: Medium-Light Skin Tone
👍🏽 Thumbs Up: Medium Skin Tone
👍🏾 Thumbs Up: Medium-Dark Skin Tone
👍🏿 Thumbs Up: Dark Skin Tone
👎 Thumbs Down
👎🏻 Thumbs Down: Light Skin Tone
👎🏼 Thumbs Down: Medium-Light Skin Tone
👎🏽 Thumbs Down: Medium Skin Tone
👎🏾 Thumbs Down: Medium-Dark Skin Tone
👎🏿 Thumbs Down: Dark Skin Tone
✊ Raised Fist
✊🏻 Raised Fist: Light Skin Tone
✊🏼 Raised Fist: Medium-Light Skin Tone
✊🏽 Raised Fist: Medium Skin Tone
✊🏾 Raised Fist: Medium-Dark Skin Tone
✊🏿 Raised Fist: Dark Skin Tone
👊 Oncoming Fist
👊🏻 Oncoming Fist: Light Skin Tone
👊🏼 Oncoming Fist: Medium-Light Skin Tone
👊🏽 Oncoming Fist: Medium Skin Tone
👊🏾 Oncoming Fist: Medium-Dark Skin Tone
👊🏿 Oncoming Fist: Dark Skin Tone
🤛 Left-Facing Fist
🤛🏻 Left-Facing Fist: Light Skin Tone
🤛🏼 Left-Facing Fist: Medium-Light Skin Tone
🤛🏽 Left-Facing Fist: Medium Skin Tone
🤛🏾 Left-Facing Fist: Medium-Dark Skin Tone
🤛🏿 Left-Facing Fist: Dark Skin Tone
🤜 Right-Facing Fist
🤜🏻 Right-Facing Fist: Light Skin Tone
🤜🏼 Right-Facing Fist: Medium-Light Skin Tone
🤜🏽 Right-Facing Fist: Medium Skin Tone
🤜🏾 Right-Facing Fist: Medium-Dark Skin Tone
🤜🏿 Right-Facing Fist: Dark Skin Tone
🤚 Raised Back of Hand
🤚🏻 Raised Back of Hand: Light Skin Tone
🤚🏼 Raised Back of Hand: Medium-Light Skin Tone
🤚🏽 Raised Back of Hand: Medium Skin Tone
🤚🏾 Raised Back of Hand: Medium-Dark Skin Tone
🤚🏿 Raised Back of Hand: Dark Skin Tone
👋 Waving Hand
👋🏻 Waving Hand: Light Skin Tone
👋🏼 Waving Hand: Medium-Light Skin Tone
👋🏽 Waving Hand: Medium Skin Tone
👋🏾 Waving Hand: Medium-Dark Skin Tone
👋🏿 Waving Hand: Dark Skin Tone
🤟 Love-You Gesture
🤟🏻 Love-You Gesture: Light Skin Tone
🤟🏼 Love-You Gesture: Medium-Light Skin Tone
🤟🏽 Love-You Gesture: Medium Skin Tone
🤟🏾 Love-You Gesture: Medium-Dark Skin Tone
🤟🏿 Love-You Gesture: Dark Skin Tone
✍ Writing Hand
✍🏻 Writing Hand: Light Skin Tone
✍🏼 Writing Hand: Medium-Light Skin Tone
✍🏽 Writing Hand: Medium Skin Tone
✍🏾 Writing Hand: Medium-Dark Skin Tone
✍🏿 Writing Hand: Dark Skin Tone
👏 Clapping Hands
👏🏻 Clapping Hands: Light Skin Tone
👏🏼 Clapping Hands: Medium-Light Skin Tone
👏🏽 Clapping Hands: Medium Skin Tone
👏🏾 Clapping Hands: Medium-Dark Skin Tone
👏🏿 Clapping Hands: Dark Skin Tone
👐 Open Hands
👐🏻 Open Hands: Light Skin Tone
👐🏼 Open Hands: Medium-Light Skin Tone
👐🏽 Open Hands: Medium Skin Tone
👐🏾 Open Hands: Medium-Dark Skin Tone
👐🏿 Open Hands: Dark Skin Tone
🙌 Raising Hands
🙌🏻 Raising Hands: Light Skin Tone
🙌🏼 Raising Hands: Medium-Light Skin Tone
🙌🏽 Raising Hands: Medium Skin Tone
🙌🏾 Raising Hands: Medium-Dark Skin Tone
🙌🏿 Raising Hands: Dark Skin Tone
🤲 Palms Up Together
🤲🏻 Palms Up Together: Light Skin Tone
🤲🏼 Palms Up Together: Medium-Light Skin Tone
🤲🏽 Palms Up Together: Medium Skin Tone
🤲🏾 Palms Up Together: Medium-Dark Skin Tone
🤲🏿 Palms Up Together: Dark Skin Tone
🙏 Folded Hands
🙏🏻 Folded Hands: Light Skin Tone
🙏🏼 Folded Hands: Medium-Light Skin Tone
🙏🏽 Folded Hands: Medium Skin Tone
🙏🏾 Folded Hands: Medium-Dark Skin Tone
🙏🏿 Folded Hands: Dark Skin Tone
🤝 Handshake
💅 Nail Polish
💅🏻 Nail Polish: Light Skin Tone
💅🏼 Nail Polish: Medium-Light Skin Tone
💅🏽 Nail Polish: Medium Skin Tone
💅🏾 Nail Polish: Medium-Dark Skin Tone
💅🏿 Nail Polish: Dark Skin Tone
👂 Ear
👂🏻 Ear: Light Skin Tone
👂🏼 Ear: Medium-Light Skin Tone
👂🏽 Ear: Medium Skin Tone
👂🏾 Ear: Medium-Dark Skin Tone
👂🏿 Ear: Dark Skin Tone
👃 Nose
👃🏻 Nose: Light Skin Tone
👃🏼 Nose: Medium-Light Skin Tone
👃🏽 Nose: Medium Skin Tone
👃🏾 Nose: Medium-Dark Skin Tone
👃🏿 Nose: Dark Skin Tone
👣 Footprints
👀 Eyes
👁 Eye
👁️‍🗨️ Eye in Speech Bubble
🧠 Brain
👅 Tongue
👄 Mouth
💋 Kiss Mark
💘 Heart With Arrow
❤ Red Heart
💓 Beating Heart
💔 Broken Heart
💕 Two Hearts
💖 Sparkling Heart
💗 Growing Heart
💙 Blue Heart
💚 Green Heart
💛 Yellow Heart
🧡 Orange Heart
💜 Purple Heart
🖤 Black Heart
💝 Heart With Ribbon
💞 Revolving Hearts
💟 Heart Decoration
❣ Heavy Heart Exclamation
💌 Love Letter
💤 Zzz
💢 Anger Symbol
💣 Bomb
💥 Collision
💦 Sweat Droplets
💨 Dashing Away
💫 Dizzy
💬 Speech Balloon
🗨 Left Speech Bubble
🗯 Right Anger Bubble
💭 Thought Balloon
🕳 Hole
👓 Glasses
🕶 Sunglasses
👔 Necktie
👕 T-Shirt
👖 Jeans
🧣 Scarf
🧤 Gloves
🧥 Coat
🧦 Socks
👗 Dress
👘 Kimono
👙 Bikini
👚 Woman’s Clothes
👛 Purse
👜 Handbag
👝 Clutch Bag
🛍 Shopping Bags
🎒 School Backpack
👞 Man’s Shoe
👟 Running Shoe
👠 High-Heeled Shoe
👡 Woman’s Sandal
👢 Woman’s Boot
👑 Crown
👒 Woman’s Hat
🎩 Top Hat
🎓 Graduation Cap
🧢 Billed Cap
⛑ Rescue Worker’s Helmet
📿 Prayer Beads
💄 Lipstick
💍 Ring
💎 Gem Stone
🐵 Monkey Face
🐒 Monkey
🦍 Gorilla
🐶 Dog Face
🐕 Dog
🐩 Poodle
🐺 Wolf Face
🦊 Fox Face
🐱 Cat Face
🐈 Cat
🦁 Lion Face
🐯 Tiger Face
🐅 Tiger
🐆 Leopard
🐴 Horse Face
🐎 Horse
🦄 Unicorn Face
🦓 Zebra
🦌 Deer
🐮 Cow Face
🐂 Ox
🐃 Water Buffalo
🐄 Cow
🐷 Pig Face
🐖 Pig
🐗 Boar
🐽 Pig Nose
🐏 Ram
🐑 Ewe
🐐 Goat
🐪 Camel
🐫 Two-Hump Camel
🦒 Giraffe
🐘 Elephant
🦏 Rhinoceros
🐭 Mouse Face
🐁 Mouse
🐀 Rat
🐹 Hamster Face
🐰 Rabbit Face
🐇 Rabbit
🐿 Chipmunk
🦔 Hedgehog
🦇 Bat
🐻 Bear Face
🐨 Koala
🐼 Panda Face
🐾 Paw Prints
🦃 Turkey
🐔 Chicken
🐓 Rooster
🐣 Hatching Chick
🐤 Baby Chick
🐥 Front-Facing Baby Chick
🐦 Bird
🐧 Penguin
🕊 Dove
🦅 Eagle
🦆 Duck
🦉 Owl
🐸 Frog Face
🐊 Crocodile
🐢 Turtle
🦎 Lizard
🐍 Snake
🐲 Dragon Face
🐉 Dragon
🦕 Sauropod
🦖 T-Rex
🐳 Spouting Whale
🐋 Whale
🐬 Dolphin
🐟 Fish
🐠 Tropical Fish
🐡 Blowfish
🦈 Shark
🐙 Octopus
🐚 Spiral Shell
🦀 Crab
🦐 Shrimp
🦑 Squid
🐌 Snail
🦋 Butterfly
🐛 Bug
🐜 Ant
🐝 Honeybee
🐞 Lady Beetle
🦗 Cricket
🕷 Spider
🕸 Spider Web
🦂 Scorpion
💐 Bouquet
🌸 Cherry Blossom
💮 White Flower
🏵 Rosette
🌹 Rose
🥀 Wilted Flower
🌺 Hibiscus
🌻 Sunflower
🌼 Blossom
🌷 Tulip
🌱 Seedling
🌲 Evergreen Tree
🌳 Deciduous Tree
🌴 Palm Tree
🌵 Cactus
🌾 Sheaf of Rice
🌿 Herb
☘ Shamrock
🍀 Four Leaf Clover
🍁 Maple Leaf
🍂 Fallen Leaf
🍃 Leaf Fluttering in Wind
🍇 Grapes
🍈 Melon
🍉 Watermelon
🍊 Tangerine
🍋 Lemon
🍌 Banana
🍍 Pineapple
🍎 Red Apple
🍏 Green Apple
🍐 Pear
🍑 Peach
🍒 Cherries
🍓 Strawberry
🥝 Kiwi Fruit
🍅 Tomato
🥥 Coconut
🥑 Avocado
🍆 Eggplant
🥔 Potato
🥕 Carrot
🌽 Ear of Corn
🌶 Hot Pepper
🥒 Cucumber
🥦 Broccoli
🍄 Mushroom
🥜 Peanuts
🌰 Chestnut
🍞 Bread
🥐 Croissant
🥖 Baguette Bread
🥨 Pretzel
🥞 Pancakes
🧀 Cheese Wedge
🍖 Meat on Bone
🍗 Poultry Leg
🥩 Cut of Meat
🥓 Bacon
🍔 Hamburger
🍟 French Fries
🍕 Pizza
🌭 Hot Dog
🥪 Sandwich
🌮 Taco
🌯 Burrito
🥙 Stuffed Flatbread
🥚 Egg
🍳 Cooking
🥘 Shallow Pan of Food
🍲 Pot of Food
🥣 Bowl With Spoon
🥗 Green Salad
🍿 Popcorn
🥫 Canned Food
🍱 Bento Box
🍘 Rice Cracker
🍙 Rice Ball
🍚 Cooked Rice
🍛 Curry Rice
🍜 Steaming Bowl
🍝 Spaghetti
🍠 Roasted Sweet Potato
🍢 Oden
🍣 Sushi
🍤 Fried Shrimp
🍥 Fish Cake With Swirl
🍡 Dango
🥟 Dumpling
🥠 Fortune Cookie
🥡 Takeout Box
🍦 Soft Ice Cream
🍧 Shaved Ice
🍨 Ice Cream
🍩 Doughnut
🍪 Cookie
🎂 Birthday Cake
🍰 Shortcake
🥧 Pie
🍫 Chocolate Bar
🍬 Candy
🍭 Lollipop
🍮 Custard
🍯 Honey Pot
🍼 Baby Bottle
🥛 Glass of Milk
☕ Hot Beverage
🍵 Teacup Without Handle
🍶 Sake
🍾 Bottle With Popping Cork
🍷 Wine Glass
🍸 Cocktail Glass
🍹 Tropical Drink
🍺 Beer Mug
🍻 Clinking Beer Mugs
🥂 Clinking Glasses
🥃 Tumbler Glass
🥤 Cup With Straw
🥢 Chopsticks
🍽 Fork and Knife With Plate
🍴 Fork and Knife
🥄 Spoon
🔪 Kitchen Knife
🏺 Amphora
🌍 Globe Showing Europe-Africa
🌎 Globe Showing Americas
🌏 Globe Showing Asia-Australia
🌐 Globe With Meridians
🗺 World Map
🗾 Map of Japan
🏔 Snow-Capped Mountain
⛰ Mountain
🌋 Volcano
🗻 Mount Fuji
🏕 Camping
🏖 Beach With Umbrella
🏜 Desert
🏝 Desert Island
🏞 National Park
🏟 Stadium
🏛 Classical Building
🏗 Building Construction
🏘 House
🏙 Cityscape
🏚 Derelict House
🏠 House
🏡 House With Garden
🏢 Office Building
🏣 Japanese Post Office
🏤 Post Office
🏥 Hospital
🏦 Bank
🏨 Hotel
🏩 Love Hotel
🏪 Convenience Store
🏫 School
🏬 Department Store
🏭 Factory
🏯 Japanese Castle
🏰 Castle
💒 Wedding
🗼 Tokyo Tower
🗽 Statue of Liberty
⛪ Church
🕌 Mosque
🕍 Synagogue
⛩ Shinto Shrine
🕋 Kaaba
⛲ Fountain
⛺ Tent
🌁 Foggy
🌃 Night With Stars
🌄 Sunrise Over Mountains
🌅 Sunrise
🌆 Cityscape at Dusk
🌇 Sunset
🌉 Bridge at Night
♨ Hot Springs
🌌 Milky Way
🎠 Carousel Horse
🎡 Ferris Wheel
🎢 Roller Coaster
💈 Barber Pole
🎪 Circus Tent
🎭 Performing Arts
🖼 Framed Picture
🎨 Artist Palette
🎰 Slot Machine
🚂 Locomotive
🚃 Railway Car
🚄 High-Speed Train
🚅 High-Speed Train With Bullet Nose
🚆 Train
🚇 Metro
🚈 Light Rail
🚉 Station
🚊 Tram
🚝 Monorail
🚞 Mountain Railway
🚋 Tram Car
🚌 Bus
🚍 Oncoming Bus
🚎 Trolleybus
🚐 Minibus
🚑 Ambulance
🚒 Fire Engine
🚓 Police Car
🚔 Oncoming Police Car
🚕 Taxi
🚖 Oncoming Taxi
🚗 Automobile
🚘 Oncoming Automobile
🚙 Sport Utility Vehicle
🚚 Delivery Truck
🚛 Articulated Lorry
🚜 Tractor
🚲 Bicycle
🛴 Kick Scooter
🛵 Motor Scooter
🚏 Bus Stop
🛣 Motorway
🛤 Railway Track
⛽ Fuel Pump
🚨 Police Car Light
🚥 Horizontal Traffic Light
🚦 Vertical Traffic Light
🚧 Construction
🛑 Stop Sign
⚓ Anchor
⛵ Sailboat
🛶 Canoe
🚤 Speedboat
🛳 Passenger Ship
⛴ Ferry
🛥 Motor Boat
🚢 Ship
✈ Airplane
🛩 Small Airplane
🛫 Airplane Departure
🛬 Airplane Arrival
💺 Seat
🚁 Helicopter
🚟 Suspension Railway
🚠 Mountain Cableway
🚡 Aerial Tramway
🛰 Satellite
🚀 Rocket
🛸 Flying Saucer
🛎 Bellhop Bell
🚪 Door
🛏 Bed
🛋 Couch and Lamp
🚽 Toilet
🚿 Shower
🛁 Bathtub
⌛ Hourglass
⏳ Hourglass With Flowing Sand
⌚ Watch
⏰ Alarm Clock
⏱ Stopwatch
⏲ Timer Clock
🕰 Mantelpiece Clock
🕛 Twelve O’clock
🕧 Twelve-Thirty
🕐 One O’clock
🕜 One-Thirty
🕑 Two O’clock
🕝 Two-Thirty
🕒 Three O’clock
🕞 Three-Thirty
🕓 Four O’clock
🕟 Four-Thirty
🕔 Five O’clock
🕠 Five-Thirty
🕕 Six O’clock
🕡 Six-Thirty
🕖 Seven O’clock
🕢 Seven-Thirty
🕗 Eight O’clock
🕣 Eight-Thirty
🕘 Nine O’clock
🕤 Nine-Thirty
🕙 Ten O’clock
🕥 Ten-Thirty
🕚 Eleven O’clock
🕦 Eleven-Thirty
🌑 New Moon
🌒 Waxing Crescent Moon
🌓 First Quarter Moon
🌔 Waxing Gibbous Moon
🌕 Full Moon
🌖 Waning Gibbous Moon
🌗 Last Quarter Moon
🌘 Waning Crescent Moon
🌙 Crescent Moon
🌚 New Moon Face
🌛 First Quarter Moon With Face
🌜 Last Quarter Moon With Face
🌡 Thermometer
☀ Sun
🌝 Full Moon With Face
🌞 Sun With Face
⭐ White Medium Star
🌟 Glowing Star
🌠 Shooting Star
☁ Cloud
⛅ Sun Behind Cloud
⛈ Cloud With Lightning and Rain
🌤 Sun Behind Small Cloud
🌥 Sun Behind Large Cloud
🌦 Sun Behind Rain Cloud
🌧 Cloud With Rain
🌨 Cloud With Snow
🌩 Cloud With Lightning
🌪 Tornado
🌫 Fog
🌬 Wind Face
🌀 Cyclone
🌈 Rainbow
🌂 Closed Umbrella
☂ Umbrella
☔ Umbrella With Rain Drops
⛱ Umbrella on Ground
⚡ High Voltage
❄ Snowflake
☃ Snowman
⛄ Snowman Without Snow
☄ Comet
🔥 Fire
💧 Droplet
🌊 Water Wave
🎃 Jack-O-Lantern
🎄 Christmas Tree
🎆 Fireworks
🎇 Sparkler
✨ Sparkles
🎈 Balloon
🎉 Party Popper
🎊 Confetti Ball
🎋 Tanabata Tree
🎍 Pine Decoration
🎎 Japanese Dolls
🎏 Carp Streamer
🎐 Wind Chime
🎑 Moon Viewing Ceremony
🎀 Ribbon
🎁 Wrapped Gift
🎗 Reminder Ribbon
🎟 Admission Tickets
🎫 Ticket
🎖 Military Medal
🏆 Trophy
🏅 Sports Medal
🥇 1st Place Medal
🥈 2nd Place Medal
🥉 3rd Place Medal
⚽ Soccer Ball
⚾ Baseball
🏀 Basketball
🏐 Volleyball
🏈 American Football
🏉 Rugby Football
🎾 Tennis
🎱 Pool 8 Ball
🎳 Bowling
🏏 Cricket
🏑 Field Hockey
🏒 Ice Hockey
🏓 Ping Pong
🏸 Badminton
🥊 Boxing Glove
🥋 Martial Arts Uniform
🥅 Goal Net
🎯 Direct Hit
⛳ Flag in Hole
⛸ Ice Skate
🎣 Fishing Pole
🎽 Running Shirt
🎿 Skis
🛷 Sled
🥌 Curling Stone
🎮 Video Game
🕹 Joystick
🎲 Game Die
♠ Spade Suit
♥ Heart Suit
♦ Diamond Suit
♣ Club Suit
🃏 Joker
🀄 Mahjong Red Dragon
🎴 Flower Playing Cards
🔇 Muted Speaker
🔈 Speaker Low Volume
🔉 Speaker Medium Volume
🔊 Speaker High Volume
📢 Loudspeaker
📣 Megaphone
📯 Postal Horn
🔔 Bell
🔕 Bell With Slash
🎼 Musical Score
🎵 Musical Note
🎶 Musical Notes
🎙 Studio Microphone
🎚 Level Slider
🎛 Control Knobs
🎤 Microphone
🎧 Headphone
📻 Radio
🎷 Saxophone
🎸 Guitar
🎹 Musical Keyboard
🎺 Trumpet
🎻 Violin
🥁 Drum
📱 Mobile Phone
📲 Mobile Phone With Arrow
☎ Telephone
📞 Telephone Receiver
📟 Pager
📠 Fax Machine
🔋 Battery
🔌 Electric Plug
💻 Laptop Computer
🖥 Desktop Computer
🖨 Printer
⌨ Keyboard
🖱 Computer Mouse
🖲 Trackball
💽 Computer Disk
💾 Floppy Disk
💿 Optical Disk
📀 DVD
🎥 Movie Camera
🎞 Film Frames
📽 Film Projector
🎬 Clapper Board
📺 Television
📷 Camera
📸 Camera With Flash
📹 Video Camera
📼 Videocassette
🔍 Left-Pointing Magnifying Glass
🔎 Right-Pointing Magnifying Glass
🔬 Microscope
🔭 Telescope
📡 Satellite Antenna
🕯 Candle
💡 Light Bulb
🔦 Flashlight
🏮 Red Paper Lantern
📔 Notebook With Decorative Cover
📕 Closed Book
📖 Open Book
📗 Green Book
📘 Blue Book
📙 Orange Book
📚 Books
📓 Notebook
📒 Ledger
📃 Page With Curl
📜 Scroll
📄 Page Facing Up
📰 Newspaper
🗞 Rolled-Up Newspaper
📑 Bookmark Tabs
🔖 Bookmark
🏷 Label
💰 Money Bag
💴 Yen Banknote
💵 Dollar Banknote
💶 Euro Banknote
💷 Pound Banknote
💸 Money With Wings
💳 Credit Card
💹 Chart Increasing With Yen
💱 Currency Exchange
💲 Heavy Dollar Sign
✉ Envelope
📧 E-Mail
📨 Incoming Envelope
📩 Envelope With Arrow
📤 Outbox Tray
📥 Inbox Tray
📦 Package
📫 Closed Mailbox With Raised Flag
📪 Closed Mailbox With Lowered Flag
📬 Open Mailbox With Raised Flag
📭 Open Mailbox With Lowered Flag
📮 Postbox
🗳 Ballot Box With Ballot
✏ Pencil
✒ Black Nib
🖋 Fountain Pen
🖊 Pen
🖌 Paintbrush
🖍 Crayon
📝 Memo
💼 Briefcase
📁 File Folder
📂 Open File Folder
🗂 Card Index Dividers
📅 Calendar
📆 Tear-Off Calendar
🗒 Spiral Notepad
🗓 Spiral Calendar
📇 Card Index
📈 Chart Increasing
📉 Chart Decreasing
📊 Bar Chart
📋 Clipboard
📌 Pushpin
📍 Round Pushpin
📎 Paperclip
🖇 Linked Paperclips
📏 Straight Ruler
📐 Triangular Ruler
✂ Scissors
🗃 Card File Box
🗄 File Cabinet
🗑 Wastebasket
🔒 Locked
🔓 Unlocked
🔏 Locked With Pen
🔐 Locked With Key
🔑 Key
🗝 Old Key
🔨 Hammer
⛏ Pick
⚒ Hammer and Pick
🛠 Hammer and Wrench
🗡 Dagger
⚔ Crossed Swords
🔫 Pistol
🏹 Bow and Arrow
🛡 Shield
🔧 Wrench
🔩 Nut and Bolt
⚙ Gear
🗜 Clamp
⚗ Alembic
⚖ Balance Scale
🔗 Link
⛓ Chains
💉 Syringe
💊 Pill
🚬 Cigarette
⚰ Coffin
⚱ Funeral Urn
🗿 Moai
🛢 Oil Drum
🔮 Crystal Ball
🛒 Shopping Cart
🏧 ATM Sign
🚮 Litter in Bin Sign
🚰 Potable Water
♿ Wheelchair Symbol
🚹 Men’s Room
🚺 Women’s Room
🚻 Restroom
🚼 Baby Symbol
🚾 Water Closet
🛂 Passport Control
🛃 Customs
🛄 Baggage Claim
🛅 Left Luggage
⚠ Warning
🚸 Children Crossing
⛔ No Entry
🚫 Prohibited
🚳 No Bicycles
🚭 No Smoking
🚯 No Littering
🚱 Non-Potable Water
🚷 No Pedestrians
📵 No Mobile Phones
🔞 No One Under Eighteen
☢ Radioactive
☣ Biohazard
⬆ Up Arrow
↗ Up-Right Arrow
➡ Right Arrow
↘ Down-Right Arrow
⬇ Down Arrow
↙ Down-Left Arrow
⬅ Left Arrow
↖ Up-Left Arrow
↕ Up-Down Arrow
↔ Left-Right Arrow
↩ Right Arrow Curving Left
↪ Left Arrow Curving Right
⤴ Right Arrow Curving Up
⤵ Right Arrow Curving Down
🔃 Clockwise Vertical Arrows
🔄 Anticlockwise Arrows Button
🔙 Back Arrow
🔚 End Arrow
🔛 On! Arrow
🔜 Soon Arrow
🔝 Top Arrow
🛐 Place of Worship
⚛ Atom Symbol
🕉 Om
✡ Star of David
☸ Wheel of Dharma
☯ Yin Yang
✝ Latin Cross
☦ Orthodox Cross
☪ Star and Crescent
☮ Peace Symbol
🕎 Menorah
🔯 Dotted Six-Pointed Star
♈ Aries
♉ Taurus
♊ Gemini
♋ Cancer
♌ Leo
♍ Virgo
♎ Libra
♏ Scorpius
♐ Sagittarius
♑ Capricorn
♒ Aquarius
♓ Pisces
⛎ Ophiuchus
🔀 Shuffle Tracks Button
🔁 Repeat Button
🔂 Repeat Single Button
▶ Play Button
⏩ Fast-Forward Button
⏭ Next Track Button
⏯ Play or Pause Button
◀ Reverse Button
⏪ Fast Reverse Button
⏮ Last Track Button
🔼 Up Button
⏫ Fast Up Button
🔽 Down Button
⏬ Fast Down Button
⏸ Pause Button
⏹ Stop Button
⏺ Record Button
⏏ Eject Button
🎦 Cinema
🔅 Dim Button
🔆 Bright Button
📶 Antenna Bars
📳 Vibration Mode
📴 Mobile Phone Off
♀ Female Sign
♂ Male Sign
⚕ Medical Symbol
♻ Recycling Symbol
⚜ Fleur-de-lis
🔱 Trident Emblem
📛 Name Badge
🔰 Japanese Symbol for Beginner
⭕ Heavy Large Circle
✅ White Heavy Check Mark
☑ Ballot Box With Check
✔ Heavy Check Mark
✖ Heavy Multiplication X
❌ Cross Mark
❎ Cross Mark Button
➕ Heavy Plus Sign
➖ Heavy Minus Sign
➗ Heavy Division Sign
➰ Curly Loop
➿ Double Curly Loop
〽 Part Alternation Mark
✳ Eight-Spoked Asterisk
✴ Eight-Pointed Star
❇ Sparkle
‼ Double Exclamation Mark
⁉ Exclamation Question Mark
❓ Question Mark
❔ White Question Mark
❕ White Exclamation Mark
❗ Exclamation Mark
〰 Wavy Dash
© Copyright
® Registered
™ Trade Mark
#️⃣ Keycap Number Sign
*️⃣ Keycap Asterisk
0️⃣ Keycap Digit Zero
1️⃣ Keycap Digit One
2️⃣ Keycap Digit Two
3️⃣ Keycap Digit Three
4️⃣ Keycap Digit Four
5️⃣ Keycap Digit Five
6️⃣ Keycap Digit Six
7️⃣ Keycap Digit Seven
8️⃣ Keycap Digit Eight
9️⃣ Keycap Digit Nine
🔟 Keycap 10
💯 Hundred Points
🔠 Input Latin Uppercase
🔡 Input Latin Lowercase
🔢 Input Numbers
🔣 Input Symbols
🔤 Input Latin Letters
🅰 A Button (Blood Type)
🆎 AB Button (Blood Type)
🅱 B Button (Blood Type)
🆑 CL Button
🆒 Cool Button
🆓 Free Button
ℹ Information
🆔 ID Button
Ⓜ Circled M
🆕 New Button
🆖 NG Button
🅾 O Button (Blood Type)
🆗 OK Button
🅿 P Button
🆘 SOS Button
🆙 Up! Button
🆚 VS Button
🈁 Japanese “Here” Button
🈂 Japanese “Service Charge” Button
🈷 Japanese “Monthly Amount” Button
🈶 Japanese “Not Free of Charge” Button
🈯 Japanese “Reserved” Button
🉐 Japanese “Bargain” Button
🈹 Japanese “Discount” Button
🈚 Japanese “Free of Charge” Button
🈲 Japanese “Prohibited” Button
🉑 Japanese “Acceptable” Button
🈸 Japanese “Application” Button
🈴 Japanese “Passing Grade” Button
🈳 Japanese “Vacancy” Button
㊗ Japanese “Congratulations” Button
㊙ Japanese “Secret” Button
🈺 Japanese “Open for Business” Button
🈵 Japanese “No Vacancy” Button
▪ Black Small Square
▫ White Small Square
◻ White Medium Square
◼ Black Medium Square
◽ White Medium-Small Square
◾ Black Medium-Small Square
⬛ Black Large Square
⬜ White Large Square
🔶 Large Orange Diamond
🔷 Large Blue Diamond
🔸 Small Orange Diamond
🔹 Small Blue Diamond
🔺 Red Triangle Pointed Up
🔻 Red Triangle Pointed Down
💠 Diamond With a Dot
🔘 Radio Button
🔲 Black Square Button
🔳 White Square Button
⚪ White Circle
⚫ Black Circle
🔴 Red Circle
🔵 Blue Circle
🏁 Chequered Flag
🚩 Triangular Flag
🎌 Crossed Flags
🏴 Black Flag
🏳 White Flag
🏳️‍🌈 Rainbow Flag
🇦🇨 Ascension Island
🇦🇩 Andorra
🇦🇪 United Arab Emirates
🇦🇫 Afghanistan
🇦🇬 Antigua & Barbuda
🇦🇮 Anguilla
🇦🇱 Albania
🇦🇲 Armenia
🇦🇴 Angola
🇦🇶 Antarctica
🇦🇷 Argentina
🇦🇸 American Samoa
🇦🇹 Austria
🇦🇺 Australia
🇦🇼 Aruba
🇦🇽 Åland Islands
🇦🇿 Azerbaijan
🇧🇦 Bosnia & Herzegovina
🇧🇧 Barbados
🇧🇩 Bangladesh
🇧🇪 Belgium
🇧🇫 Burkina Faso
🇧🇬 Bulgaria
🇧🇭 Bahrain
🇧🇮 Burundi
🇧🇯 Benin
🇧🇱 St. Barthélemy
🇧🇲 Bermuda
🇧🇳 Brunei
🇧🇴 Bolivia
🇧🇶 Caribbean Netherlands
🇧🇷 Brazil
🇧🇸 Bahamas
🇧🇹 Bhutan
🇧🇻 Bouvet Island
🇧🇼 Botswana
🇧🇾 Belarus
🇧🇿 Belize
🇨🇦 Canada
🇨🇨 Cocos (Keeling) Islands
🇨🇩 Congo - Kinshasa
🇨🇫 Central African Republic
🇨🇬 Congo - Brazzaville
🇨🇭 Switzerland
🇨🇮 Côte d’Ivoire
🇨🇰 Cook Islands
🇨🇱 Chile
🇨🇲 Cameroon
🇨🇳 China
🇨🇴 Colombia
🇨🇵 Clipperton Island
🇨🇷 Costa Rica
🇨🇺 Cuba
🇨🇻 Cape Verde
🇨🇼 Curaçao
🇨🇽 Christmas Island
🇨🇾 Cyprus
🇨🇿 Czechia
🇩🇪 Germany
🇩🇬 Diego Garcia
🇩🇯 Djibouti
🇩🇰 Denmark
🇩🇲 Dominica
🇩🇴 Dominican Republic
🇩🇿 Algeria
🇪🇦 Ceuta & Melilla
🇪🇨 Ecuador
🇪🇪 Estonia
🇪🇬 Egypt
🇪🇭 Western Sahara
🇪🇷 Eritrea
🇪🇸 Spain
🇪🇹 Ethiopia
🇪🇺 European Union
🇫🇮 Finland
🇫🇯 Fiji
🇫🇰 Falkland Islands
🇫🇲 Micronesia
🇫🇴 Faroe Islands
🇫🇷 France
🇬🇦 Gabon
🇬🇧 United Kingdom
🇬🇩 Grenada
🇬🇪 Georgia
🇬🇫 French Guiana
🇬🇬 Guernsey
🇬🇭 Ghana
🇬🇮 Gibraltar
🇬🇱 Greenland
🇬🇲 Gambia
🇬🇳 Guinea
🇬🇵 Guadeloupe
🇬🇶 Equatorial Guinea
🇬🇷 Greece
🇬🇸 South Georgia & South Sandwich Islands
🇬🇹 Guatemala
🇬🇺 Guam
🇬🇼 Guinea-Bissau
🇬🇾 Guyana
🇭🇰 Hong Kong SAR China
🇭🇲 Heard & McDonald Islands
🇭🇳 Honduras
🇭🇷 Croatia
🇭🇹 Haiti
🇭🇺 Hungary
🇮🇨 Canary Islands
🇮🇩 Indonesia
🇮🇪 Ireland
🇮🇱 Israel
🇮🇲 Isle of Man
🇮🇳 India
🇮🇴 British Indian Ocean Territory
🇮🇶 Iraq
🇮🇷 Iran
🇮🇸 Iceland
🇮🇹 Italy
🇯🇪 Jersey
🇯🇲 Jamaica
🇯🇴 Jordan
🇯🇵 Japan
🇰🇪 Kenya
🇰🇬 Kyrgyzstan
🇰🇭 Cambodia
🇰🇮 Kiribati
🇰🇲 Comoros
🇰🇳 St. Kitts & Nevis
🇰🇵 North Korea
🇰🇷 South Korea
🇰🇼 Kuwait
🇰🇾 Cayman Islands
🇰🇿 Kazakhstan
🇱🇦 Laos
🇱🇧 Lebanon
🇱🇨 St. Lucia
🇱🇮 Liechtenstein
🇱🇰 Sri Lanka
🇱🇷 Liberia
🇱🇸 Lesotho
🇱🇹 Lithuania
🇱🇺 Luxembourg
🇱🇻 Latvia
🇱🇾 Libya
🇲🇦 Morocco
🇲🇨 Monaco
🇲🇩 Moldova
🇲🇪 Montenegro
🇲🇫 St. Martin
🇲🇬 Madagascar
🇲🇭 Marshall Islands
🇲🇰 Macedonia
🇲🇱 Mali
🇲🇲 Myanmar (Burma)
🇲🇳 Mongolia
🇲🇴 Macau SAR China
🇲🇵 Northern Mariana Islands
🇲🇶 Martinique
🇲🇷 Mauritania
🇲🇸 Montserrat
🇲🇹 Malta
🇲🇺 Mauritius
🇲🇻 Maldives
🇲🇼 Malawi
🇲🇽 Mexico
🇲🇾 Malaysia
🇲🇿 Mozambique
🇳🇦 Namibia
🇳🇨 New Caledonia
🇳🇪 Niger
🇳🇫 Norfolk Island
🇳🇬 Nigeria
🇳🇮 Nicaragua
🇳🇱 Netherlands
🇳🇴 Norway
🇳🇵 Nepal
🇳🇷 Nauru
🇳🇺 Niue
🇳🇿 New Zealand
🇴🇲 Oman
🇵🇦 Panama
🇵🇪 Peru
🇵🇫 French Polynesia
🇵🇬 Papua New Guinea
🇵🇭 Philippines
🇵🇰 Pakistan
🇵🇱 Poland
🇵🇲 St. Pierre & Miquelon
🇵🇳 Pitcairn Islands
🇵🇷 Puerto Rico
🇵🇸 Palestinian Territories
🇵🇹 Portugal
🇵🇼 Palau
🇵🇾 Paraguay
🇶🇦 Qatar
🇷🇪 Réunion
🇷🇴 Romania
🇷🇸 Serbia
🇷🇺 Russia
🇷🇼 Rwanda
🇸🇦 Saudi Arabia
🇸🇧 Solomon Islands
🇸🇨 Seychelles
🇸🇩 Sudan
🇸🇪 Sweden
🇸🇬 Singapore
🇸🇭 St. Helena
🇸🇮 Slovenia
🇸🇯 Svalbard & Jan Mayen
🇸🇰 Slovakia
🇸🇱 Sierra Leone
🇸🇲 San Marino
🇸🇳 Senegal
🇸🇴 Somalia
🇸🇷 Suriname
🇸🇸 South Sudan
🇸🇹 São Tomé & Príncipe
🇸🇻 El Salvador
🇸🇽 Sint Maarten
🇸🇾 Syria
🇸🇿 Swaziland
🇹🇦 Tristan da Cunha
🇹🇨 Turks & Caicos Islands
🇹🇩 Chad
🇹🇫 French Southern Territories
🇹🇬 Togo
🇹🇭 Thailand
🇹🇯 Tajikistan
🇹🇰 Tokelau
🇹🇱 Timor-Leste
🇹🇲 Turkmenistan
🇹🇳 Tunisia
🇹🇴 Tonga
🇹🇷 Turkey
🇹🇹 Trinidad & Tobago
🇹🇻 Tuvalu
🇹🇼 Taiwan
🇹🇿 Tanzania
🇺🇦 Ukraine
🇺🇬 Uganda
🇺🇲 U.S. Outlying Islands
🇺🇳 United Nations
🇺🇸 United States
🇺🇾 Uruguay
🇺🇿 Uzbekistan
🇻🇦 Vatican City
🇻🇨 St. Vincent & Grenadines
🇻🇪 Venezuela
🇻🇬 British Virgin Islands
🇻🇮 U.S. Virgin Islands
🇻🇳 Vietnam
🇻🇺 Vanuatu
🇼🇫 Wallis & Futuna
🇼🇸 Samoa
🇽🇰 Kosovo
🇾🇪 Yemen
🇾🇹 Mayotte
🇿🇦 South Africa
🇿🇲 Zambia
🇿🇼 Zimbabwe
🏴 Flag for England (GB-ENG)
🏴 Flag for Scotland (GB-SCT)
🏴 Flag for Wales (GB-WLS)
🥆 Rifle
🤻 Modern Pentathlon
🏴‍☠️ Pirate Flag
🇦 Regional Indicator Symbol Letter A
🇧 Regional Indicator Symbol Letter B
🇨 Regional Indicator Symbol Letter C
🇩 Regional Indicator Symbol Letter D
🇪 Regional Indicator Symbol Letter E
🇫 Regional Indicator Symbol Letter F
🇬 Regional Indicator Symbol Letter G
🇭 Regional Indicator Symbol Letter H
🇮 Regional Indicator Symbol Letter I
🇯 Regional Indicator Symbol Letter J
🇰 Regional Indicator Symbol Letter K
🇱 Regional Indicator Symbol Letter L
🇲 Regional Indicator Symbol Letter M
🇳 Regional Indicator Symbol Letter N
🇴 Regional Indicator Symbol Letter O
🇵 Regional Indicator Symbol Letter P
🇶 Regional Indicator Symbol Letter Q
🇷 Regional Indicator Symbol Letter R
🇸 Regional Indicator Symbol Letter S
🇹 Regional Indicator Symbol Letter T
🇺 Regional Indicator Symbol Letter U
🇻 Regional Indicator Symbol Letter V
🇼 Regional Indicator Symbol Letter W
🇽 Regional Indicator Symbol Letter X
🇾 Regional Indicator Symbol Letter Y
🇿 Regional Indicator Symbol Letter Z
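The regional indicator letters above combine in pairs to form the country flags listed earlier: a two-letter ISO 3166-1 code becomes a flag by shifting each ASCII letter to the U+1F1E6..U+1F1FF range. A minimal Python sketch of that mapping (the `flag` helper is illustrative, not part of this list):

```python
def flag(iso_code: str) -> str:
    """Map a two-letter ISO 3166-1 alpha-2 code to its flag emoji
    by converting each letter to a Regional Indicator Symbol."""
    offset = 0x1F1E6 - ord("A")  # distance from 'A' to Regional Indicator A
    return "".join(chr(ord(c) + offset) for c in iso_code.upper())

print(flag("JP"))  # 🇯 + 🇵 renders as the flag of Japan
```

Rendering the pair as a single flag glyph is up to the font; the underlying string is always just the two regional indicator code points.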
🐱‍🐉 Dino Cat
🐱‍🚀 Astro Cat
🐱‍👤 Ninja Cat
🐱‍💻 Hacker Cat
🐱‍🏍 Stunt Cat
🐱‍👓 Hipster Cat
◯◯◯◯◯ Olympic Rings
🏴 Flag for Baiti (NR-05)
🏴 Flag for Nord-Trøndelag (NO-17)
🏴 Flag for Hordaland (NO-12)
🏴 Flag for Akershus (NO-02)
🏴 Flag for Sør-Trøndelag (NO-16)
🏴 Flag for Telemark (NO-08)
🏴 Flag for Utrecht (NL-UT)
🏴 Flag for Møre og Romsdal (NO-15)
🏴 Flag for Svalbard (NO-21)
🏴 Flag for Purwanchal (NP-4)
🏴 Flag for Central (NP-1)
🏴 Flag for Oslo (NO-03)
🏴 Flag for Boe (NR-06)
👨🏾‍👨🏾‍👦🏾‍👧🏾 Family - Man: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
🏴 Flag for North Brabant (NL-NB)
🏴 Flag for Aust-Agder (NO-09)
🏴 Flag for Anabar (NR-02)
🏴 Flag for Limburg (NL-LI)
🏴 Flag for Buskerud (NO-06)
🏴 Flag for Hedmark (NO-04)
🏴 Flag for Vestfold (NO-07)
🏴 Flag for Anibare (NR-04)
🏴 Flag for Finnmark (NO-20)
🏴 Flag for Overijssel (NL-OV)
🏴 Flag for Rogaland (NO-11)
🏴 Flag for Østfold (NO-01)
🏴 Flag for Aiwo (NR-01)
🏴 Flag for Zeeland (NL-ZE)
🏴 Flag for Buada (NR-07)
🏴 Flag for Troms (NO-19)
🏴 Flag for Oppland (NO-05)
🏴 Flag for Madhya Pashchimanchal (NP-2)
🏴 Flag for Anetan (NR-03)
🏴 Flag for Western (NP-3)
🏴 Flag for Jan Mayen (NO-22)
🏴 Flag for Nordland (NO-18)
🏴 Flag for Bocas del Toro (PA-1)
🏴 Flag for Colón (PA-3)
🏴 Flag for Ad Dakhiliyah (OM-DA)
🏴 Flag for Muscat (OM-MA)
🏴 Flag for Ewa (NR-09)
🏴 Flag for Taranaki (NZ-TKI)
🏴 Flag for Ijuw (NR-10)
🏴 Flag for West Coast (NZ-WTC)
🏴 Flag for Southland (NZ-STL)
🏴 Flag for Tasman (NZ-TAS)
🏴 Flag for Manawatu-Wanganui (NZ-MWT)
🏴 Flag for Waikato (NZ-WKO)
🏴 Flag for Marlborough (NZ-MBH)
🏴 Flag for Bay of Plenty (NZ-BOP)
🏴 Flag for Nibok (NR-12)
🏴 Flag for Al Buraimi (OM-BU)
🏴 Flag for Auckland (NZ-AUK)
🏴 Flag for Janub ash Sharqiyah (OM-SJ)
🏴 Flag for Shamal ash Sharqiyah (OM-SS)
🏴 Flag for Coclé (PA-2)
🏴 Flag for Meneng (NR-11)
🏴 Flag for West Panamá (PA-10)
🏴 Flag for Ad Dhahirah (OM-ZA)
🏴 Flag for Northland (NZ-NTL)
🏴 Flag for Canterbury (NZ-CAN)
🏴 Flag for Gisborne (NZ-GIS)
🏴 Flag for Chatham Islands (NZ-CIT)
🏴 Flag for Uaboe (NR-13)
🏴 Flag for Denigomodu (NR-08)
🏴 Flag for Musandam (OM-MU)
🏴 Flag for Shamal al Batinah (OM-BS)
🏴 Flag for Hawke’s Bay (NZ-HKB)
🏴 Flag for Otago (NZ-OTA)
🏴 Flag for Janub al Batinah (OM-BJ)
🏴 Flag for Dhofar (OM-ZU)
🏴 Flag for Darién (PA-5)
🏴 Flag for El Callao (PE-CAL)
🏴 Flag for Herrera (PA-6)
🏴 Flag for Guna Yala (PA-KY)
🏴 Flag for Emberá (PA-EM)
🏴 Flag for La Libertad (PE-LAL)
🏴 Flag for Veraguas (PA-9)
🏴 Flag for Loreto (PE-LOR)
🏴 Flag for Amazonas (PE-AMA)
🏴 Flag for Chiriquí (PA-4)
🏴 Flag for Chimbu (PG-CPK)
🏴 Flag for Eastern Highlands (PG-EHG)
🏴 Flag for San Martín (PE-SAM)
🏴 Flag for Junín (PE-JUN)
🏴 Flag for Huánuco (PE-HUC)
🏴 Flag for Pasco (PE-PAS)
🏴 Flag for Ngöbe-Buglé (PA-NB)
🏴 Flag for Cajamarca (PE-CAJ)
🏴 Flag for Ica (PE-ICA)
🏴 Flag for Lima Region (PE-LIM)
🏴 Flag for Moquegua (PE-MOQ)
🏴 Flag for Puno (PE-PUN)
🏴 Flag for Ucayali (PE-UCA)
🏴 Flag for Lima (PE-LMA)
🏴 Flag for Piura (PE-PIU)
🏴 Flag for Tumbes (PE-TUM)
🏴 Flag for Cusco (PE-CUS)
🏴 Flag for Panamá (PA-8)
🏴 Flag for Tacna (PE-TAC)
🏴 Flag for Central (PG-CPM)
🏴 Flag for Los Santos (PA-7)
🏴 Flag for Lambayeque (PE-LAM)
🏴 Flag for Huancavelica (PE-HUV)
🏴 Flag for Ancash (PE-ANC)
🏴 Flag for Hela (PG-HLA)
🏴 Flag for Port Moresby (PG-NCD)
🏴 Flag for Islamabad (PK-IS)
🏴 Flag for Metro Manila (PH-00)
🏴 Flag for Bicol (PH-05)
🏴 Flag for Gulf (PG-GPK)
🏴 Flag for Zamboanga Peninsula (PH-09)
🏴 Flag for Bougainville (PG-NSB)
🏴 Flag for Gilgit-Baltistan (PK-GB)
🏴 Flag for Madang (PG-MPM)
🏴 Flag for Western (FJ-W)
🏴 Flag for Soccsksargen (PH-12)
🏴 Flag for Eastern Visayas (PH-08)
🏴 Flag for Enga (PG-EPW)
🏴 Flag for Milne Bay (PG-MBA)
🏴 Flag for Calabarzon (PH-40)
🏴 Flag for Jiwaka (PG-JWK)
🏴 Flag for Cagayan Valley (PH-02)
👨🏿‍👨🏿‍👦🏿‍👧🏿 Family - Man: Dark Skin Tone, Man: Dark Skin Tone, Boy: Dark Skin Tone, Girl: Dark Skin Tone
🏴 Flag for Morobe (PG-MPL)
🏴 Flag for Northern Mindanao (PH-10)
🏴 Flag for Mimaropa (PH-41)
🏴 Flag for Balochistan (PK-BA)
🏴 Flag for Caraga (PH-13)
🏴 Flag for East Sepik (PG-ESW)
🏴 Flag for Western Visayas (PH-06)
🏴 Flag for Central Luzon (PH-03)
🏴 Flag for Muslim Mindanao (PH-14)
🏴 Flag for Southern Highlands (PG-SHM)
🏴 Flag for Western (PG-WPD)
🏴 Flag for Sandaun (PG-SAN)
🏴 Flag for New Ireland (PG-NIK)
🏴 Flag for Oro (PG-NPP)
🏴 Flag for Manus (PG-MRL)
🏴 Flag for Western Highlands (PG-WHM)
🏴 Flag for Davao (PH-11)
🏴 Flag for Punjab (PK-PB)
🏴 Flag for Federal Capital Territory (PL-PM)
🏴 Flag for Silesia (PL-SL)
🏴 Flag for Kuyavian-Pomerania (PL-KP)
🏴 Flag for Tubas (PS-TBS)
🏴 Flag for Ramallah and al-Bireh (PS-RBH)
🏴 Flag for Gaza (PS-GZA)
🏴 Flag for Rafah (PS-RFH)
🏴 Flag for Hebron (PS-HBN)
🏴 Flag for Podlaskie (PL-PD)
🏴 Flag for Subcarpathia (PL-PK)
🏴 Flag for Jenin (PS-JEN)
🏴 Flag for Lower Silesian (PL-DS)
🏴 Flag for Khan Yunis (PS-KYS)
🏴 Flag for Łódź (PL-LD)
🏴 Flag for North Gaza (PS-NGZ)
🏴 Flag for West Pomerania (PL-ZP)
🏴 Flag for Azad Kashmir (PK-JK)
🏴 Flag for Salfit (PS-SLT)
🏴 Flag for Mazovia (PL-MZ)
🏴 Flag for Lesser Poland (PL-MA)
🏴 Flag for Qalqilya (PS-QQA)
🏴 Flag for Aveiro (PT-01)
🏴 Flag for Greater Poland (PL-WP)
🏴 Flag for Opole (PL-OP)
🏴 Flag for Bethlehem (PS-BTH)
🏴 Flag for Khyber Pakhtunkhwa (PK-KP)
🏴 Flag for Tulkarm (PS-TKM)
🏴 Flag for Nablus (PS-NBS)
🏴 Flag for Warmian-Masuria (PL-WN)
🏴 Flag for Jericho (PS-JRH)
🏴 Flag for Sindh (PK-SD)
🏴 Flag for Lublin (PL-LU)
🏴 Flag for Jerusalem (PS-JEM)
🏴 Flag for Lubusz (PL-LB)
🏴 Flag for Świętokrzyskie (PL-SK)
🏴 Flag for Melekeok (PW-212)
🏴 Flag for Faro (PT-08)
🏴 Flag for Central (PY-11)
🏴 Flag for Évora (PT-07)
🏴 Flag for Ngiwal (PW-228)
🏴 Flag for Ñeembucú (PY-12)
🏴 Flag for Viana do Castelo (PT-16)
🏴 Flag for Lisbon (PT-11)
🏴 Flag for Presidente Hayes (PY-15)
🏴 Flag for Vila Real (PT-17)
🏴 Flag for Viseu (PT-18)
🏴 Flag for Airai (PW-004)
🏴 Flag for Amambay (PY-13)
🏴 Flag for Ngatpang (PW-224)
🏴 Flag for Coimbra (PT-06)
🏴 Flag for Portalegre (PT-12)
🏴 Flag for Peleliu (PW-350)
🏴 Flag for Ngardmau (PW-222)
🏴 Flag for Ngaraard (PW-214)
🏴 Flag for Canindeyú (PY-14)
🏴 Flag for Angaur (PW-010)
🏴 Flag for Sonsorol (PW-370)
🏴 Flag for Bragança (PT-04)
🏴 Flag for Castelo Branco (PT-05)
🏴 Flag for Santarém (PT-14)
🏴 Flag for Braga (PT-03)
🏴 Flag for Hatohobei (PW-050)
🏴 Flag for Koror (PW-150)
🏴 Flag for Alto Paraná (PY-10)
🏴 Flag for Ngeremlengui (PW-227)
🏴 Flag for Leiria (PT-10)
🏴 Flag for Porto (PT-13)
🏴 Flag for Setúbal (PT-15)
🏴 Flag for Aimeliik (PW-002)
🏴 Flag for Ngchesar (PW-226)
🏴 Flag for Guarda (PT-09)
🏴 Flag for San Pedro (PY-2)
🏴 Flag for Caaguazú (PY-5)
🏴 Flag for Guairá (PY-4)
🏴 Flag for Bacău (RO-BC)
🏴 Flag for Itapúa (PY-7)
🏴 Flag for Caraș-Severin (RO-CS)
🏴 Flag for Caazapá (PY-6)
🏴 Flag for Al Khor (QA-KH)
🏴 Flag for Covasna (RO-CV)
🏴 Flag for Alba (RO-AB)
🏴 Flag for Doha (QA-DA)
🏴 Flag for Dolj (RO-DJ)
🏴 Flag for Cordillera (PY-3)
🏴 Flag for Madinat ash Shamal (QA-MS)
🏴 Flag for Bihor (RO-BH)
🏴 Flag for Harghita (RO-HR)
🏴 Flag for Brăila (RO-BR)
🏴 Flag for Argeș (RO-AG)
🏴 Flag for Al Daayen (QA-ZA)
🏴 Flag for Bistriţa-Năsăud (RO-BN)
🏴 Flag for Călărași (RO-CL)
🏴 Flag for Asunción (PY-ASU)
🏴 Flag for Concepción (PY-1)
🏴 Flag for Botoşani (RO-BT)
🏴 Flag for Galați (RO-GL)
🏴 Flag for Giurgiu (RO-GR)
🏴 Flag for Boquerón (PY-19)
🏴 Flag for Misiones (PY-8)
🏴 Flag for Bucharest (RO-B)
🏴 Flag for Paraguarí (PY-9)
🏴 Flag for Al Rayyan (QA-RA)
🏴 Flag for Constanța (RO-CT)
🏴 Flag for Hunedoara (RO-HD)
🏴 Flag for Dâmbovița (RO-DB)
🏴 Flag for Arad (RO-AR)
🏴 Flag for Cluj (RO-CJ)
🏴 Flag for Buzău (RO-BZ)
🏴 Flag for Al Wakrah (QA-WA)
🏴 Flag for Vâlcea (RO-VL)
🏴 Flag for Iași (RO-IS)
🏴 Flag for Mehedinți (RO-MH)
🏴 Flag for Kosovo-Metohija (RS-KM)
🏴 Flag for Ialomița (RO-IL)
🏴 Flag for Teleorman (RO-TR)
🏴 Flag for Šumadija (RS-12)
🏴 Flag for Nišava (RS-20)
🏴 Flag for Altai (RU-AL)
🏴 Flag for Vrancea (RO-VN)
🏴 Flag for Vaslui (RO-VS)
🏴 Flag for Ilfov (RO-IF)
🏴 Flag for Mačva (RS-08)
🏴 Flag for Kolubara (RS-09)
🏴 Flag for Prahova (RO-PH)
🏴 Flag for Braničevo (RS-11)
🏴 Flag for Beograd (RS-00)
🏴 Flag for Zaječar (RS-15)
🏴 Flag for Moravica (RS-17)
🏴 Flag for Pomoravlje (RS-13)
🏴 Flag for Olt (RO-OT)
🏴 Flag for Satu Mare (RO-SM)
🏴 Flag for Toplica (RS-21)
🏴 Flag for Sălaj (RO-SJ)
🏴 Flag for Mureş (RO-MS)
🏴 Flag for Pirot (RS-22)
🏴 Flag for Rasina (RS-19)
🏴 Flag for Pčinja (RS-24)
🏴 Flag for Maramureş (RO-MM)
🏴 Flag for Suceava (RO-SV)
🏴 Flag for Raška (RS-18)
🏴 Flag for Bor (RS-14)
🏴 Flag for Podunavlje (RS-10)
🏴 Flag for Neamţ (RO-NT)
🏴 Flag for Zlatibor (RS-16)
🏴 Flag for Vojvodina (RS-VO)
🏴 Flag for Jablanica (RS-23)
🏴 Flag for Tulcea (RO-TL)
🏴 Flag for Adygea (RU-AD)
🏴 Flag for Timiș (RO-TM)
👩🏼‍👦🏼‍👶🏼 Family - Woman: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
🏴 Flag for Karachay-Cherkess (RU-KC)
🏴 Flag for Khakassia (RU-KK)
🏴 Flag for Buryat (RU-BU)
🏴 Flag for Kalmykia (RU-KL)
🏴 Flag for Belgorod (RU-BEL)
🏴 Flag for Khanty-Mansi (RU-KHM)
🏴 Flag for Leningrad (RU-LEN)
🏴 Flag for Kurgan (RU-KGN)
🏴 Flag for Ivanovo (RU-IVA)
🏴 Flag for Ingushetia (RU-IN)
🏴 Flag for Kirov (RU-KIR)
🏴 Flag for Krasnodar Krai (RU-KDA)
🏴 Flag for Karelia (RU-KR)
🏴 Flag for Magadan (RU-MAG)
🏴 Flag for Krasnoyarsk Krai (RU-KYA)
🏴 Flag for Kemerovo (RU-KEM)
🏴 Flag for Astrakhan (RU-AST)
🏴 Flag for Amur (RU-AMU)
🏴 Flag for Mordovia (RU-MO)
🏴 Flag for Komi (RU-KO)
🏴 Flag for Chelyabinsk (RU-CHE)
🏴 Flag for Khabarovsk Krai (RU-KHA)
🏴 Flag for Kursk (RU-KRS)
🏴 Flag for Mari El (RU-ME)
🏴 Flag for Chukotka Okrug (RU-CHU)
🏴 Flag for Kaliningrad (RU-KGD)
🏴 Flag for Irkutsk (RU-IRK)
🏴 Flag for Kaluga (RU-KLU)
🏴 Flag for Kabardino-Balkar (RU-KB)
🏴 Flag for Lipetsk (RU-LIP)
🏴 Flag for Bashkortostan (RU-BA)
🏴 Flag for Chuvash (RU-CU)
🏴 Flag for Kamchatka Krai (RU-KAM)
🏴 Flag for Kostroma (RU-KOS)
🏴 Flag for Sakhalin (RU-SAK)
🏴 Flag for Tver (RU-TVE)
🏴 Flag for Novosibirsk (RU-NVS)
🏴 Flag for Vladimir (RU-VLA)
🏴 Flag for Oryol (RU-ORL)
🏴 Flag for Stavropol Krai (RU-STA)
🏴 Flag for Nizhny Novgorod (RU-NIZ)
🏴 Flag for Saratov (RU-SAR)
🏴 Flag for Orenburg (RU-ORE)
🏴 Flag for Nenets (RU-NEN)
🏴 Flag for Volgograd (RU-VGG)
🏴 Flag for Tomsk (RU-TOM)
🏴 Flag for Sverdlovsk (RU-SVE)
🏴 Flag for Saint Petersburg (RU-SPE)
🏴 Flag for Yamalo-Nenets Okrug (RU-YAN)
🏴 Flag for Sakha (RU-SA)
🏴 Flag for Moscow (RU-MOW)
🏴 Flag for Penza (RU-PNZ)
🏴 Flag for Smolensk (RU-SMO)
🏴 Flag for Tatarstan (RU-TA)
🏴 Flag for Vologda (RU-VLG)
🏴 Flag for Tula (RU-TUL)
🏴 Flag for Yaroslavl (RU-YAR)
🏴 Flag for Tyumen (RU-TYU)
🏴 Flag for Pskov (RU-PSK)
🏴 Flag for Udmurt (RU-UD)
🏴 Flag for Samara (RU-SAM)
🏴 Flag for Ulyanovsk (RU-ULY)
🏴 Flag for Ryazan (RU-RYA)
🏴 Flag for Omsk (RU-OMS)
🏴 Flag for Perm Krai (RU-PER)
🏴 Flag for Voronezh (RU-VOR)
🏴 Flag for Novgorod (RU-NGR)
🏴 Flag for Tambov (RU-TAM)
🏴 Flag for Tuva (RU-TY)
🏴 Flag for Rostov (RU-ROS)
🏴 Flag for Murmansk (RU-MUR)
🏴 Flag for Kigali (RW-01)
🏴 Flag for Anse Etoile (SC-03)
🏴 Flag for Isabel (SB-IS)
🏴 Flag for Anse Boileau (SC-02)
🏴 Flag for Tabuk (SA-07)
🏴 Flag for Guadalcanal (SB-GU)
🏴 Flag for Northern (RW-03)
🏴 Flag for Southern (RW-05)
🏴 Flag for Central (SB-CE)
🏴 Flag for Ha’il (SA-06)
🏴 Flag for Bel Air (SC-09)
🏴 Flag for Malaita (SB-ML)
🏴 Flag for Najran (SA-10)
🏴 Flag for Al Jawf (SA-12)
🏴 Flag for Honiara (SB-CT)
🏴 Flag for Western (SB-WE)
🏴 Flag for Northern Borders (SA-08)
🏴 Flag for Riyadh (SA-01)
🏴 Flag for Rennell and Bellona (SB-RB)
🏴 Flag for Au Cap (SC-04)
🏴 Flag for Eastern (RW-02)
🏴 Flag for Anse Royale (SC-05)
🏴 Flag for Jewish (RU-YEV)
🏴 Flag for Bel Ombre (SC-10)
🏴 Flag for Al-Qassim (SA-05)
🏴 Flag for Temotu (SB-TE)
🏴 Flag for Baie Sainte Anne (SC-07)
🏴 Flag for Choiseul (SB-CH)
🏴 Flag for Western (RW-04)
🏴 Flag for Makira-Ulawa (SB-MK)
🏴 Flag for Makkah (SA-02)
🏴 Flag for Jizan (SA-09)
🏴 Flag for Anse aux Pins (SC-01)
🏴 Flag for Eastern (SA-04)
🏴 Flag for Asir (SA-14)
🏴 Flag for Zabaykalsky Krai (RU-ZAB)
🏴 Flag for Beau Vallon (SC-08)
🏴 Flag for Al Madinah (SA-03)
🏴 Flag for Baie Lazare (SC-06)
🏴 Flag for Plaisance (SC-19)
🏴 Flag for Södermanland (SE-D)
🏴 Flag for La Rivière Anglaise (SC-16)
🏴 Flag for Saint Louis (SC-22)
🏴 Flag for Mont Fleuri (SC-18)
🏴 Flag for Northern (SD-NO)
🏴 Flag for Grand’Anse Mahé (SC-13)
🏴 Flag for Takamaka (SC-23)
🏴 Flag for West Darfur (SD-DW)
🏴 Flag for Al Qadarif (SD-GD)
🏴 Flag for South Darfur (SD-DS)
🏴 Flag for River Nile (SD-NR)
🏴 Flag for West Kurdufan (SD-GK)
🏴 Flag for Kassala (SD-KA)
🏴 Flag for Khartoum (SD-KH)
🏴 Flag for La Digue (SC-15)
🏴 Flag for Les Mamelles (SC-24)
🏴 Flag for Port Glaud (SC-21)
🏴 Flag for Västerbotten (SE-AC)
🏴 Flag for Jönköping (SE-F)
🏴 Flag for Stockholm (SE-AB)
🏴 Flag for Glacis (SC-12)
🏴 Flag for Pointe La Rue (SC-20)
🏴 Flag for White Nile (SD-NW)
🏴 Flag for Al Jazirah (SD-GZ)
🏴 Flag for Östergötland (SE-E)
🏴 Flag for Norrbotten (SE-BD)
🏴 Flag for Uppsala (SE-C)
🏴 Flag for Mont Buxton (SC-17)
🏴 Flag for Grand’Anse Praslin (SC-14)
🏴 Flag for South Kurdufan (SD-KS)
🏴 Flag for Cascade (SC-11)
🏴 Flag for North Kurdufan (SD-KN)
🏴 Flag for Sennar (SD-SI)
🏴 Flag for East Darfur (SD-DE)
🏴 Flag for Blue Nile (SD-NB)
🏴 Flag for North Darfur (SD-DN)
🏴 Flag for Central Darfur (SD-DC)
🏴 Flag for Västmanland (SE-U)
🏴 Flag for Värmland (SE-S)
🏴 Flag for Črnomelj (SI-017)
🏴 Flag for Västernorrland (SE-Y)
🏴 Flag for South West (SG-05)
🏴 Flag for Črna na Koroškem (SI-016)
🏴 Flag for Västra Götaland (SE-O)
🏴 Flag for Gävleborg (SE-X)
🏴 Flag for North East (SG-02)
🏴 Flag for Brda (SI-007)
🏴 Flag for Kalmar (SE-H)
🏴 Flag for Destrnik (SI-018)
🏴 Flag for Beltinci (SI-002)
🏴 Flag for Bohinj (SI-004)
🏴 Flag for Brežice (SI-009)
🏴 Flag for North West (SG-03)
🏴 Flag for Ascension Island (SH-AC)
👩🏽‍👦🏽‍👶🏽 Family - Woman: Medium Skin Tone, Boy: Medium Skin Tone, Baby: Medium Skin Tone
🏴 Flag for Cerklje na Gorenjskem (SI-012)
🏴 Flag for Cerknica (SI-013)
🏴 Flag for Bovec (SI-006)
🏴 Flag for Črenšovci (SI-015)
🏴 Flag for Kronoberg (SE-G)
🏴 Flag for Ajdovščina (SI-001)
🏴 Flag for Tišina (SI-010)
🏴 Flag for South East (SG-04)
🏴 Flag for Brezovica (SI-008)
🏴 Flag for Saint Helena (SH-HL)
🏴 Flag for Jämtland (SE-Z)
🏴 Flag for Gotland (SE-I)
🏴 Flag for Dalarna (SE-W)
🏴 Flag for Blekinge (SE-K)
🏴 Flag for Borovnica (SI-005)
🏴 Flag for Tristan da Cunha (SH-TA)
🏴 Flag for Bled (SI-003)
🏴 Flag for Cerkno (SI-014)
🏴 Flag for Örebro (SE-T)
🏴 Flag for Domžale (SI-023)
🏴 Flag for Izola (SI-040)
🏴 Flag for Kuzma (SI-056)
🏴 Flag for Dravograd (SI-025)
🏴 Flag for Duplek (SI-026)
🏴 Flag for Jesenice (SI-041)
🏴 Flag for Gorišnica (SI-028)
🏴 Flag for Gornja Radgona (SI-029)
🏴 Flag for Dobrepolje (SI-020)
🏴 Flag for Gornji Petrovci (SI-031)
🏴 Flag for Dornava (SI-024)
🏴 Flag for Hrastnik (SI-034)
🏴 Flag for Ivančna Gorica (SI-039)
🏴 Flag for Komen (SI-049)
🏴 Flag for Kozje (SI-051)
🏴 Flag for Divača (SI-019)
🏴 Flag for Idrija (SI-036)
🏴 Flag for Kidričevo (SI-045)
🏴 Flag for Kobarid (SI-046)
🏴 Flag for Kobilje (SI-047)
🏴 Flag for Koper (SI-050)
🏴 Flag for Ig (SI-037)
🏴 Flag for Kungota (SI-055)
🏴 Flag for Grosuplje (SI-032)
🏴 Flag for Dobrova–Polhov Gradec (SI-021)
🏴 Flag for Juršinci (SI-042)
🏴 Flag for Krško (SI-054)
🏴 Flag for Šalovci (SI-033)
🏴 Flag for Kranjska Gora (SI-053)
🏴 Flag for Kočevje (SI-048)
🏴 Flag for Ilirska Bistrica (SI-038)
🏴 Flag for Kamnik (SI-043)
🏴 Flag for Hrpelje–Kozina (SI-035)
🏴 Flag for Gornji Grad (SI-030)
🏴 Flag for Kanal (SI-044)
🏴 Flag for Dol pri Ljubljani (SI-022)
🏴 Flag for Pesnica (SI-089)
🏴 Flag for Piran (SI-090)
🏴 Flag for Mežica (SI-074)
🏴 Flag for Muta (SI-081)
🏴 Flag for Ljubno (SI-062)
🏴 Flag for Ormož (SI-087)
🏴 Flag for Postojna (SI-094)
🏴 Flag for Mislinja (SI-076)
👩🏾‍👦🏾‍👶🏾 Family - Woman: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
🏴 Flag for Majšperk (SI-069)
🏴 Flag for Mengeš (SI-072)
🏴 Flag for Metlika (SI-073)
🏴 Flag for Moravče (SI-077)
🏴 Flag for Moravske Toplice (SI-078)
🏴 Flag for Ljubljana (SI-061)
🏴 Flag for Murska Sobota (SI-080)
🏴 Flag for Naklo (SI-082)
🏴 Flag for Nova Gorica (SI-084)
🏴 Flag for Osilnica (SI-088)
🏴 Flag for Pivka (SI-091)
🏴 Flag for Nazarje (SI-083)
🏴 Flag for Miren–Kostanjevica (SI-075)
🏴 Flag for Logatec (SI-064)
🏴 Flag for Litija (SI-060)
🏴 Flag for Maribor (SI-070)
🏴 Flag for Ljutomer (SI-063)
🏴 Flag for Loški Potok (SI-066)
🏴 Flag for Luče (SI-067)
🏴 Flag for Podčetrtek (SI-092)
🏴 Flag for Podvelka (SI-093)
🏴 Flag for Medvode (SI-071)
🏴 Flag for Loška Dolina (SI-065)
🏴 Flag for Laško (SI-057)
🏴 Flag for Lendava (SI-059)
🏴 Flag for Mozirje (SI-079)
🏴 Flag for Lukovica (SI-068)
🏴 Flag for Tržič (SI-131)
🏴 Flag for Šentilj (SI-118)
🏴 Flag for Rače–Fram (SI-098)
🏴 Flag for Puconci (SI-097)
👩🏿‍👦🏿‍👶🏿 Family - Woman: Dark Skin Tone, Boy: Dark Skin Tone, Baby: Dark Skin Tone
🏴 Flag for Rogašovci (SI-105)
🏴 Flag for Slovenska Bistrica (SI-113)
🏴 Flag for Rogatec (SI-107)
🏴 Flag for Ptuj (SI-096)
🏴 Flag for Šentjernej (SI-119)
🏴 Flag for Sežana (SI-111)
🏴 Flag for Škofljica (SI-123)
🏴 Flag for Slovenj Gradec (SI-112)
🏴 Flag for Starše (SI-115)
🏴 Flag for Sveti Jurij (SI-116)
🏴 Flag for Trebnje (SI-130)
🏴 Flag for Sevnica (SI-110)
🏴 Flag for Radeče (SI-099)
🏴 Flag for Škocjan (SI-121)
🏴 Flag for Šmarje pri Jelšah (SI-124)
🏴 Flag for Šoštanj (SI-126)
🏴 Flag for Štore (SI-127)
🏴 Flag for Rogaška Slatina (SI-106)
🏴 Flag for Preddvor (SI-095)
🏴 Flag for Turnišče (SI-132)
🏴 Flag for Radovljica (SI-102)
🏴 Flag for Ruše (SI-108)
🏴 Flag for Slovenske Konjice (SI-114)
🏴 Flag for Šentjur (SI-120)
🏴 Flag for Tolmin (SI-128)
🏴 Flag for Ribnica (SI-104)
🏴 Flag for Radlje ob Dravi (SI-101)
🏴 Flag for Trbovlje (SI-129)
🏴 Flag for Semič (SI-109)
🏴 Flag for Šenčur (SI-117)
🏴 Flag for Ravne na Koroškem (SI-103)
🏴 Flag for Miklavž na Dravskem Polju (SI-169)
🏴 Flag for Vodice (SI-138)
🏴 Flag for Velenje (SI-133)
🏴 Flag for Zagorje ob Savi (SI-142)
🏴 Flag for Vuzenica (SI-141)
🏴 Flag for Vrhnika (SI-140)
👩🏻‍👧🏻 Family - Woman: Light Skin Tone, Girl: Light Skin Tone
🏴 Flag for Železniki (SI-146)
🏴 Flag for Žiri (SI-147)
🏴 Flag for Benedikt (SI-148)
🏴 Flag for Velike Lašče (SI-134)
🏴 Flag for Vitanje (SI-137)
🏴 Flag for Komenda (SI-164)
🏴 Flag for Dobrna (SI-155)
🏴 Flag for Dobrovnik (SI-156)
🏴 Flag for Dolenjske Toplice (SI-157)
🏴 Flag for Hajdina (SI-159)
🏴 Flag for Oplotnica (SI-171)
🏴 Flag for Videm (SI-135)
🏴 Flag for Jezersko (SI-163)
🏴 Flag for Cankova (SI-152)
🏴 Flag for Kostel (SI-165)
🏴 Flag for Križevci (SI-166)
🏴 Flag for Vojnik (SI-139)
🏴 Flag for Markovci (SI-168)
🏴 Flag for Mirna Peč (SI-170)
🏴 Flag for Vipava (SI-136)
🏴 Flag for Horjul (SI-162)
🏴 Flag for Cerkvenjak (SI-153)
🏴 Flag for Bloke (SI-150)
🏴 Flag for Zavrč (SI-143)
🏴 Flag for Bistrica ob Sotli (SI-149)
🏴 Flag for Zreče (SI-144)
🏴 Flag for Hodoš (SI-161)
🏴 Flag for Hoče–Slivnica (SI-160)
🏴 Flag for Grad (SI-158)
🏴 Flag for Podlehnik (SI-172)
🏴 Flag for Cirkulane (SI-196)
🏴 Flag for Prebold (SI-174)
🏴 Flag for Razkrižje (SI-176)
🏴 Flag for Veržej (SI-188)
🏴 Flag for Žalec (SI-190)
🏴 Flag for Solčava (SI-180)
🏴 Flag for Sveta Ana (SI-181)
🏴 Flag for Šempeter–Vrtojba (SI-183)
🏴 Flag for Trnovska Vas (SI-185)
🏴 Flag for Sodražica (SI-179)
🏴 Flag for Makole (SI-198)
🏴 Flag for Straža (SI-203)
🏴 Flag for Selnica ob Dravi (SI-178)
🏴 Flag for Žužemberk (SI-193)
🏴 Flag for Kostanjevica na Krki (SI-197)
🏴 Flag for Prevalje (SI-175)
🏴 Flag for Šmartno pri Litiji (SI-194)
🏴 Flag for Žetale (SI-191)
🏴 Flag for Vransko (SI-189)
🏴 Flag for Renče–Vogrsko (SI-201)
🏴 Flag for Središče ob Dravi (SI-202)
🏴 Flag for Trzin (SI-186)
🏴 Flag for Sveta Trojica v Slovenskih Goricah (SI-204)
🏴 Flag for Sveti Tomaž (SI-205)
🏴 Flag for Ribnica na Pohorju (SI-177)
🏴 Flag for Gorje (SI-207)
🏴 Flag for Tabor (SI-184)
🏴 Flag for Mokronog–Trebelno (SI-199)
🏴 Flag for Polzela (SI-173)
🏴 Flag for Poljčane (SI-200)
🏴 Flag for Apače (SI-195)
🏴 Flag for Velika Polana (SI-187)
🏴 Flag for Trnava (SK-TA)
🏴 Flag for Rečica ob Savinji (SI-209)
🏴 Flag for Serravalle (SM-09)
🏴 Flag for Chiesanuova (SM-02)
🏴 Flag for Kaffrine (SN-KA)
🏴 Flag for Nitra (SK-NI)
🏴 Flag for Šentrupert (SI-211)
🏴 Flag for Borgo Maggiore (SM-06)
🏴 Flag for Košice (SK-KI)
🏴 Flag for Banská Bystrica (SK-BC)
🏴 Flag for Montegiardino (SM-08)
🏴 Flag for Dakar (SN-DK)
🏴 Flag for Prešov (SK-PV)
🏴 Flag for Mirna (SI-212)
🏴 Flag for Fiorentino (SM-05)
🏴 Flag for Thiès (SN-TH)
🏴 Flag for Ankaran (SI-213)
🏴 Flag for Tambacounda (SN-TC)
🏴 Flag for Fatick (SN-FK)
🏴 Flag for Trenčín (SK-TC)
🏴 Flag for Kaolack (SN-KL)
🏴 Flag for Faetano (SM-04)
🏴 Flag for Žilina (SK-ZI)
🏴 Flag for Southern (SL-S)
🏴 Flag for Sédhiou (SN-SE)
🏴 Flag for Bratislava (SK-BL)
🏴 Flag for Diourbel (SN-DB)
🏴 Flag for Kédougou (SN-KE)
🏴 Flag for Northern (SL-N)
🏴 Flag for Western Area (SL-W)
🏴 Flag for Matam (SN-MT)
🏴 Flag for Eastern (SL-E)
🏴 Flag for Acquaviva (SM-01)
🏴 Flag for Kolda (SN-KD)
🏴 Flag for Saint-Louis (SN-SL)
🏴 Flag for San Marino (SM-07)
🏴 Flag for Louga (SN-LG)
🏴 Flag for Domagnano (SM-03)
🏴 Flag for Eastern Equatoria (SS-EE)
🏴 Flag for Saramacca (SR-SA)
👩🏾‍👧🏾 Family - Woman: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
🏴 Flag for Marowijne (SR-MA)
🏴 Flag for Middle Juba (SO-JD)
🏴 Flag for Mudug (SO-MU)
🏴 Flag for Lower Shebelle (SO-SH)
🏴 Flag for Hiran (SO-HI)
🏴 Flag for Central Equatoria (SS-EC)
🏴 Flag for Ziguinchor (SN-ZG)
🏴 Flag for Coronie (SR-CR)
🏴 Flag for Middle Shebelle (SO-SD)
🏴 Flag for Upper Nile (SS-NU)
🏴 Flag for Wanica (SR-WA)
🏴 Flag for Awdal (SO-AW)
🏴 Flag for Sanaag (SO-SA)
🏴 Flag for Lower Juba (SO-JH)
🏴 Flag for Lakes (SS-LK)
🏴 Flag for Warrap (SS-WR)
🏴 Flag for Príncipe (ST-P)
🏴 Flag for Sipaliwini (SR-SI)
🏴 Flag for Western Bahr el Ghazal (SS-BW)
🏴 Flag for Western Equatoria (SS-EW)
🏴 Flag for Bari (SO-BR)
🏴 Flag for Jonglei (SS-JG)
🏴 Flag for Paramaribo (SR-PM)
🏴 Flag for Commewijne (SR-CM)
🏴 Flag for Galguduud (SO-GA)
🏴 Flag for Nickerie (SR-NI)
🏴 Flag for Para (SR-PR)
🏴 Flag for Woqooyi Galbeed (SO-WO)
🏴 Flag for Gedo (SO-GE)
🏴 Flag for Bay (SO-BY)
🏴 Flag for Brokopondo (SR-BR)
🏴 Flag for Nugal (SO-NU)
🏴 Flag for Togdheer (SO-TO)
🏴 Flag for Bakool (SO-BK)
🏴 Flag for Sool (SO-SO)
🏴 Flag for Hhohho (SZ-HH)
🏴 Flag for Ennedi-Ouest (TD-EO)
🏴 Flag for Guéra (TD-GR)
🏴 Flag for Shiselweni (SZ-SH)
🏴 Flag for Daraa (SY-DR)
🏴 Flag for Ar-Raqqah (SY-RA)
🏴 Flag for Sonsonate (SV-SO)
🏴 Flag for La Unión (SV-UN)
🏴 Flag for San Miguel (SV-SM)
🏴 Flag for Morazán (SV-MO)
🏴 Flag for San Salvador (SV-SS)
🏴 Flag for Deir ez-Zor (SY-DY)
🏴 Flag for Cabañas (SV-CA)
🏴 Flag for Lubombo (SZ-LU)
🏴 Flag for Chalatenango (SV-CH)
🏴 Flag for Rif Dimashq (SY-RD)
🏴 Flag for Tartus (SY-TA)
🏴 Flag for Borkou (TD-BO)
🏴 Flag for Manzini (SZ-MA)
🏴 Flag for Batha (TD-BA)
🏴 Flag for Homs (SY-HI)
🏴 Flag for Ennedi-Est (TD-EE)
🏴 Flag for Bahr el Gazel (TD-BG)
🏴 Flag for Kanem (TD-KA)
🏴 Flag for Hama (SY-HM)
🏴 Flag for Latakia (SY-LA)
🏴 Flag for Idlib (SY-ID)
🏴 Flag for La Libertad (SV-LI)
🏴 Flag for Aleppo (SY-HL)
🏴 Flag for Ahuachapán (SV-AH)
🏴 Flag for Chari-Baguirmi (TD-CB)
🏴 Flag for La Paz (SV-PA)
🏴 Flag for As-Suwayda (SY-SU)
🏴 Flag for Damascus (SY-DI)
🏴 Flag for Quneitra (SY-QU)
🏴 Flag for Al-Hasakah (SY-HA)
🏴 Flag for Santa Ana (SV-SA)
🏴 Flag for Cuscatlán (SV-CU)
🏴 Flag for Logone Occidental (TD-LO)
🏴 Flag for Chanthaburi (TH-22)
🏴 Flag for Mayo-Kebbi Est (TD-ME)
🏴 Flag for Moyen-Chari (TD-MC)
🏴 Flag for Logone Oriental (TD-LR)
🏴 Flag for Savanes (TG-S)
🏴 Flag for Phra Nakhon Si Ayutthaya (TH-14)
🏴 Flag for Centrale (TG-C)
🏴 Flag for Sa Kaeo (TH-27)
🏴 Flag for Nonthaburi (TH-12)
🏴 Flag for Buri Ram (TH-31)
🏴 Flag for Chon Buri (TH-20)
🏴 Flag for Sila (TD-SI)
🏴 Flag for Lac (TD-LC)
🏴 Flag for Rayong (TH-21)
🏴 Flag for Prachin Buri (TH-25)
🏴 Flag for Nakhon Ratchasima (TH-30)
🏴 Flag for Kara (TG-K)
🏴 Flag for Ang Thong (TH-15)
🏴 Flag for Bangkok (TH-10)
🏴 Flag for Mandoul (TD-MA)
🏴 Flag for Pathum Thani (TH-13)
🏴 Flag for Chachoengsao (TH-24)
🏴 Flag for Sing Buri (TH-17)
🏴 Flag for Mayo-Kebbi Ouest (TD-MO)
🏴 Flag for Ouaddaï (TD-OD)
🏴 Flag for Surin (TH-32)
🏴 Flag for Nakhon Nayok (TH-26)
🏴 Flag for Salamat (TD-SA)
🏴 Flag for Tandjilé (TD-TA)
🏴 Flag for Wadi Fira (TD-WF)
🏴 Flag for Saraburi (TH-19)
🏴 Flag for Samut Prakan (TH-11)
🏴 Flag for Tibesti (TD-TI)
🏴 Flag for Plateaux (TG-P)
🏴 Flag for N’Djamena (TD-ND)
🏴 Flag for Chai Nat (TH-18)
🏴 Flag for Kamphaeng Phet (TH-62)
🏴 Flag for Suphanburi (TH-72)
🏴 Flag for Samut Sakhon (TH-74)
🏴 Flag for Phetchabun (TH-67)
🏴 Flag for Kanchanaburi (TH-71)
🏴 Flag for Phrae (TH-54)
🏴 Flag for Tak (TH-63)
🏴 Flag for Nakhon Phanom (TH-48)
🏴 Flag for Lampang (TH-52)
🏴 Flag for Mae Hong Son (TH-58)
🏴 Flag for Sakon Nakhon (TH-47)
🏴 Flag for Phayao (TH-56)
🏴 Flag for Udon Thani (TH-41)
🏴 Flag for Mukdahan (TH-49)
🏴 Flag for Nakhon Pathom (TH-73)
🏴 Flag for Chiang Mai (TH-50)
🏴 Flag for Khon Kaen (TH-40)
🏴 Flag for Amnat Charoen (TH-37)
🏴 Flag for Ratchaburi (TH-70)
🏴 Flag for Yasothon (TH-35)
🏴 Flag for Lamphun (TH-51)
🏴 Flag for Loei (TH-42)
🏴 Flag for Nakhon Sawan (TH-60)
🏴 Flag for Ubon Ratchathani (TH-34)
🏴 Flag for Maha Sarakham (TH-44)
🏴 Flag for Roi Et (TH-45)
🏴 Flag for Kalasin (TH-46)
🏴 Flag for Phichit (TH-66)
🏴 Flag for Nan (TH-55)
🏴 Flag for Uthai Thani (TH-61)
🏴 Flag for Bueng Kan (TH-38)
🏴 Flag for Si Sa Ket (TH-33)
🏴 Flag for Nong Bua Lam Phu (TH-39)
🏴 Flag for Uttaradit (TH-53)
🏴 Flag for Chiang Rai (TH-57)
🏴 Flag for Sukhothai (TH-64)
🏴 Flag for Nong Khai (TH-43)
🏴 Flag for Phitsanulok (TH-65)
🏴 Flag for Ermera (TL-ER)
🏴 Flag for Oecusse (TL-OE)
🏴 Flag for Liquiçá (TL-LI)
🏴 Flag for Aileu (TL-AL)
🏴 Flag for Ahal (TM-A)
🏴 Flag for Surat Thani (TH-84)
🏴 Flag for Phetchaburi (TH-76)
🏴 Flag for Bobonaro (TL-BO)
🏴 Flag for Manatuto (TL-MT)
🏴 Flag for Khatlon (TJ-KT)
🏴 Flag for Ainaro (TL-AN)
🏴 Flag for Phang Nga (TH-82)
🏴 Flag for Cova Lima (TL-CO)
🏴 Flag for Tunis (TN-11)
🏴 Flag for Ranong (TH-85)
🏴 Flag for Nakhon Si Thammarat (TH-80)
🏴 Flag for Prachuap Khiri Khan (TH-77)
🏴 Flag for Dushanbe (TJ-DU)
🏴 Flag for Yala (TH-95)
🏴 Flag for Songkhla (TH-90)
🏴 Flag for Lebap (TM-L)
🏴 Flag for Narathiwat (TH-96)
🏴 Flag for Mary (TM-M)
🏴 Flag for Manufahi (TL-MF)
👨🏼‍👨🏼‍👦🏼‍👶🏼 Family - Man: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
🏴 Flag for Balkan (TM-B)
🏴 Flag for Baucau (TL-BA)
🏴 Flag for Nohiyahoi Tobei Jumhurí (TJ-RA)
🏴 Flag for Trang (TH-92)
🏴 Flag for Sughd (TJ-SU)
🏴 Flag for Viqueque (TL-VI)
🏴 Flag for Pattani (TH-94)
🏴 Flag for Krabi (TH-81)
🏴 Flag for Dili (TL-DI)
🏴 Flag for Phuket (TH-83)
🏴 Flag for Satun (TH-91)
🏴 Flag for Pattaya (TH-S)
🏴 Flag for Daşoguz (TM-D)
🏴 Flag for Kairouan (TN-41)
🏴 Flag for Monastir (TN-52)
🏴 Flag for Aydın (TR-09)
🏴 Flag for Béja (TN-31)
🏴 Flag for Antalya (TR-07)
🏴 Flag for Nabeul (TN-21)
🏴 Flag for Mahdia (TN-53)
🏴 Flag for Haʻapai (TO-02)
🏴 Flag for Amasya (TR-05)
🏴 Flag for Bitlis (TR-13)
🏴 Flag for Ariana (TN-12)
🏴 Flag for Kebili (TN-73)
🏴 Flag for Adana (TR-01)
🏴 Flag for ʻEua (TO-01)
🏴 Flag for Bingöl (TR-12)
🏴 Flag for Tataouine (TN-83)
🏴 Flag for Artvin (TR-08)
🏴 Flag for Sousse (TN-51)
🏴 Flag for Gabès (TN-81)
🏴 Flag for Ağrı (TR-04)
🏴 Flag for Bilecik (TR-11)
🏴 Flag for Jendouba (TN-32)
🏴 Flag for Tongatapu (TO-04)
🏴 Flag for Adıyaman (TR-02)
🏴 Flag for Kef (TN-33)
🏴 Flag for Zaghouan (TN-22)
🏴 Flag for Balıkesir (TR-10)
🏴 Flag for Ben Arous (TN-13)
🏴 Flag for Niuas (TO-03)
🏴 Flag for Tozeur (TN-72)
🏴 Flag for Manouba (TN-14)
🏴 Flag for Kasserine (TN-42)
🏴 Flag for Bolu (TR-14)
🏴 Flag for Siliana (TN-34)
🏴 Flag for Vavaʻu (TO-05)
🏴 Flag for Ankara (TR-06)
🏴 Flag for Sfax (TN-61)
🏴 Flag for Sidi Bouzid (TN-43)
🏴 Flag for Medenine (TN-82)
🏴 Flag for Bizerte (TN-23)
🏴 Flag for Erzincan (TR-24)
🏴 Flag for Kahramanmaraş (TR-46)
🏴 Flag for Kars (TR-36)
🏴 Flag for Niğde (TR-51)
🏴 Flag for Kayseri (TR-38)
🏴 Flag for Kocaeli (TR-41)
🏴 Flag for Çankırı (TR-18)
🏴 Flag for Muğla (TR-48)
🏴 Flag for Konya (TR-42)
🏴 Flag for Malatya (TR-44)
🏴 Flag for Gümüşhane (TR-29)
🏴 Flag for Edirne (TR-22)
🏴 Flag for Kırklareli (TR-39)
🏴 Flag for Gaziantep (TR-27)
🏴 Flag for Samsun (TR-55)
🏴 Flag for Diyarbakır (TR-21)
🏴 Flag for Bursa (TR-16)
🏴 Flag for Çorum (TR-19)
🏴 Flag for Ordu (TR-52)
🏴 Flag for Manisa (TR-45)
🏴 Flag for Erzurum (TR-25)
🏴 Flag for Burdur (TR-15)
🏴 Flag for Isparta (TR-32)
🏴 Flag for Istanbul (TR-34)
🏴 Flag for Hakkâri (TR-30)
🏴 Flag for Hatay (TR-31)
🏴 Flag for Muş (TR-49)
🏴 Flag for Mersin (TR-33)
🏴 Flag for Siirt (TR-56)
🏴 Flag for Nevşehir (TR-50)
🏴 Flag for Elazığ (TR-23)
🏴 Flag for Giresun (TR-28)
🏴 Flag for Denizli (TR-20)
🏴 Flag for Mardin (TR-47)
🏴 Flag for Kastamonu (TR-37)
🏴 Flag for Sakarya (TR-54)
🏴 Flag for Kırşehir (TR-40)
🏴 Flag for Çanakkale (TR-17)
🏴 Flag for Rize (TR-53)
🏴 Flag for Eskişehir (TR-26)
🏴 Flag for Van (TR-65)
🏴 Flag for Princes Town (TT-PRT)
🏴 Flag for Couva-Tabaquite-Talparo (TT-CTT)
🏴 Flag for Tobago (TT-TOB)
🏴 Flag for Şanlıurfa (TR-63)
🏴 Flag for Arima (TT-ARI)
🏴 Flag for Zonguldak (TR-67)
🏴 Flag for Siparia (TT-SIP)
🏴 Flag for Ardahan (TR-75)
🏴 Flag for Kilis (TR-79)
🏴 Flag for Port of Spain (TT-POS)
🏴 Flag for Aksaray (TR-68)
🏴 Flag for Diego Martin (TT-DMN)
🏴 Flag for Bayburt (TR-69)
🏴 Flag for Tekirdağ (TR-59)
🏴 Flag for Batman (TR-72)
🏴 Flag for Chaguanas (TT-CHA)
🏴 Flag for Osmaniye (TR-80)
🏴 Flag for Yalova (TR-77)
🏴 Flag for San Juan-Laventille (TT-SJL)
🏴 Flag for Karabük (TR-78)
🏴 Flag for Yozgat (TR-66)
🏴 Flag for Mayaro-Rio Claro (TT-MRC)
🏴 Flag for Uşak (TR-64)
🏴 Flag for Sinop (TR-57)
🏴 Flag for Tunapuna-Piarco (TT-TUP)
🏴 Flag for Bartın (TR-74)
🏴 Flag for Kırıkkale (TR-71)
🏴 Flag for Penal-Debe (TT-PED)
🏴 Flag for Iğdır (TR-76)
🏴 Flag for Şırnak (TR-73)
🏴 Flag for Trabzon (TR-61)
🏴 Flag for Point Fortin (TT-PTF)
🏴 Flag for Tunceli (TR-62)
🏴 Flag for Tokat (TR-60)
🏴 Flag for Karaman (TR-70)
🏴 Flag for San Fernando (TT-SFO)
🏴 Flag for Sivas (TR-58)
🏴 Flag for Zanzibar North (TZ-07)
🏴 Flag for Changhua (TW-CHA)
🏴 Flag for Vaitupu (TV-VAI)
🏴 Flag for Kaohsiung (TW-KHH)
🏴 Flag for Kilimanjaro (TZ-09)
🏴 Flag for Kinmen (TW-KIN)
🏴 Flag for Penghu (TW-PEN)
🏴 Flag for Tainan (TW-TNN)
🏴 Flag for Nukufetau (TV-NKF)
🏴 Flag for Kigoma (TZ-08)
🏴 Flag for Taipei (TW-TPE)
🏴 Flag for Pingtung (TW-PIF)
🏴 Flag for Yilan (TW-ILA)
🏴 Flag for Taoyuan (TW-TAO)
🏴 Flag for Dodoma (TZ-03)
🏴 Flag for Nui (TV-NUI)
🏴 Flag for Niutao (TV-NIT)
🏴 Flag for North Pemba (TZ-06)
🏴 Flag for New Taipei (TW-NWT)
🏴 Flag for Iringa (TZ-04)
🏴 Flag for Kagera (TZ-05)
🏴 Flag for Yunlin (TW-YUN)
🏴 Flag for Lienchiang (TW-LIE)
🏴 Flag for Nanumanga (TV-NMG)
🏴 Flag for Dar es Salaam (TZ-02)
🏴 Flag for Nanumea (TV-NMA)
🏴 Flag for Taitung (TW-TTT)
🏴 Flag for Nantou (TW-NAN)
🏴 Flag for Chiayi (TW-CYQ)
🏴 Flag for Arusha (TZ-01)
🏴 Flag for Hualien (TW-HUA)
🏴 Flag for Chiayi County (TW-CYI)
🏴 Flag for Taichung (TW-TXG)
🏴 Flag for Keelung (TW-KEE)
🏴 Flag for Miaoli (TW-MIA)
🏴 Flag for Crimea (UA-43)
🏴 Flag for Lindi (TZ-12)
🏴 Flag for Manyara (TZ-26)
🏴 Flag for Luhanshchyna (UA-09)
🏴 Flag for Rukwa (TZ-20)
🏴 Flag for Dnipropetrovshchyna (UA-12)
🏴 Flag for Volyn (UA-07)
🏴 Flag for Shinyanga (TZ-22)
🏴 Flag for Vinnychchyna (UA-05)
🏴 Flag for Ruvuma (TZ-21)
🏴 Flag for Katavi (TZ-28)
🏴 Flag for Zaporizhzhya (UA-23)
🏴 Flag for Kyivshchyna (UA-32)
🏴 Flag for Singida (TZ-23)
🏴 Flag for Tabora (TZ-24)
🏴 Flag for Mara (TZ-13)
🏴 Flag for Geita (TZ-27)
🏴 Flag for Simiyu (TZ-30)
🏴 Flag for Mykolayivschyna (UA-48)
🏴 Flag for Kirovohradschyna (UA-35)
🏴 Flag for Rivnenshchyna (UA-56)
🏴 Flag for Poltavshchyna (UA-53)
🏴 Flag for Mbeya (TZ-14)
🏴 Flag for Mwanza (TZ-18)
🏴 Flag for Zakarpattia (UA-21)
🏴 Flag for South Pemba (TZ-10)
🏴 Flag for Pwani (TZ-19)
🏴 Flag for Mtwara (TZ-17)
🏴 Flag for Sevastopol (UA-40)
🏴 Flag for Odeshchyna (UA-51)
🏴 Flag for Lvivshchyna (UA-46)
🏴 Flag for Donechchyna (UA-14)
🏴 Flag for Prykarpattia (UA-26)
🏴 Flag for Zanzibar Urban/West (TZ-15)
🏴 Flag for Morogoro (TZ-16)
🏴 Flag for Njombe (TZ-29)
🏴 Flag for Chernivtsi Oblast (UA-77)
🏴 Flag for Palmyra Atoll (UM-95)
🏴 Flag for Kansas (US-KS)
👨🏽👨🏽👦🏽👶🏽 Family - Man: Medium Skin Tone, Man: Medium Skin Tone, Boy: Medium Skin Tone, Baby: Medium Skin Tone
🏴 Flag for Arizona (US-AZ)
🏴 Flag for Johnston Atoll (UM-67)
🏴 Flag for Chernihivshchyna (UA-74)
🏴 Flag for Howland Island (UM-84)
🏴 Flag for Georgia (US-GA)
🏴 Flag for Hawaii (US-HI)
🏴 Flag for Midway Atoll (UM-71)
🏴 Flag for American Samoa (US-AS)
🏴 Flag for Connecticut (US-CT)
🏴 Flag for Iowa (US-IA)
🏴 Flag for Ternopilshchyna (UA-61)
🏴 Flag for Northern (UG-N)
🏴 Flag for Guam (US-GU)
🏴 Flag for Baker Island (UM-81)
🏴 Flag for Eastern (UG-E)
🏴 Flag for Khersonshchyna (UA-65)
🏴 Flag for Sumshchyna (UA-59)
🏴 Flag for Indiana (US-IN)
🏴 Flag for Arkansas (US-AR)
🏴 Flag for Delaware (US-DE)
🏴 Flag for Kharkivshchyna (UA-63)
🏴 Flag for Alabama (US-AL)
🏴 Flag for Western (UG-W)
🏴 Flag for Khmelnychchyna (UA-68)
🏴 Flag for Navassa Island (UM-76)
🏴 Flag for Jarvis Island (UM-86)
🏴 Flag for Idaho (US-ID)
🏴 Flag for Kingman Reef (UM-89)
🏴 Flag for Florida (US-FL)
🏴 Flag for Wake Island (UM-79)
🏴 Flag for Illinois (US-IL)
🏴 Flag for Washington DC (US-DC)
🏴 Flag for Cherkashchyna (UA-71)
🏴 Flag for New York (US-NY)
👨🏾👨🏾👦🏾👶🏾 Family - Man: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
🏴 Flag for North Carolina (US-NC)
🏴 Flag for Mississippi (US-MS)
🏴 Flag for Massachusetts (US-MA)
🏴 Flag for Nevada (US-NV)
🏴 Flag for Wisconsin (US-WI)
🏴 Flag for Maryland (US-MD)
🏴 Flag for New Mexico (US-NM)
🏴 Flag for Puerto Rico (US-PR)
🏴 Flag for U.S. Outlying Islands (US-UM)
🏴 Flag for Wyoming (US-WY)
🏴 Flag for Ohio (US-OH)
🏴 Flag for Kentucky (US-KY)
🏴 Flag for New Jersey (US-NJ)
🏴 Flag for Oregon (US-OR)
🏴 Flag for Michigan (US-MI)
🏴 Flag for U.S. Virgin Islands (US-VI)
🏴 Flag for Missouri (US-MO)
🏴 Flag for Pennsylvania (US-PA)
🏴 Flag for Virginia (US-VA)
🏴 Flag for Artigas (UY-AR)
🏴 Flag for Canelones (UY-CA)
🏴 Flag for Washington (US-WA)
🏴 Flag for South Carolina (US-SC)
🏴 Flag for Maine (US-ME)
🏴 Flag for Louisiana (US-LA)
🏴 Flag for Minnesota (US-MN)
🏴 Flag for Rhode Island (US-RI)
🏴 Flag for West Virginia (US-WV)
🏴 Flag for Texas (US-TX)
🏴 Flag for Utah (US-UT)
🏴 Flag for Oklahoma (US-OK)
🏴 Flag for New Hampshire (US-NH)
🏴 Flag for Samarqand (UZ-SA)
🏴 Flag for Maldonado (UY-MA)
🏴 Flag for Namangan (UZ-NG)
🏴 Flag for Charlotte (VC-01)
🏴 Flag for Salto (UY-SA)
🏴 Flag for Cerro Largo (UY-CL)
🏴 Flag for Tacuarembó (UY-TA)
🏴 Flag for Capital (VE-A)
🏴 Flag for Anzoátegui (VE-B)
🏴 Flag for Saint Andrew (VC-02)
🏴 Flag for Soriano (UY-SO)
🏴 Flag for Rocha (UY-RO)
🏴 Flag for Saint David (VC-03)
🏴 Flag for San José (UY-SJ)
🏴 Flag for Florida (UY-FD)
🏴 Flag for Colonia (UY-CO)
🏴 Flag for Flores (UY-FS)
🏴 Flag for Xorazm (UZ-XO)
🏴 Flag for Durazno (UY-DU)
🏴 Flag for Andijan (UZ-AN)
🏴 Flag for Aragua (VE-D)
🏴 Flag for Sirdaryo (UZ-SI)
🏴 Flag for Paysandú (UY-PA)
🏴 Flag for Grenadines (VC-06)
🏴 Flag for Rivera (UY-RV)
🏴 Flag for Lavalleja (UY-LA)
🏴 Flag for Surxondaryo (UZ-SU)
🏴 Flag for Tashkent Province (UZ-TO)
🏴 Flag for Qashqadaryo (UZ-QA)
🏴 Flag for Treinta y Tres (UY-TT)
🏴 Flag for Montevideo (UY-MO)
🏴 Flag for Bukhara (UZ-BU)
🏴 Flag for Fergana (UZ-FA)
🏴 Flag for Karakalpakstan (UZ-QR)
🏴 Flag for Jizzakh (UZ-JI)
🏴 Flag for Río Negro (UY-RN)
🏴 Flag for Tashkent (UZ-TK)
🏴 Flag for Saint Patrick (VC-05)
🏴 Flag for Navoiy (UZ-NW)
🏴 Flag for Lara (VE-K)
🏴 Flag for Nueva Esparta (VE-O)
🏴 Flag for Táchira (VE-S)
🏴 Flag for Bolívar (VE-F)
🏴 Flag for Thanh Hóa (VN-21)
🏴 Flag for Hòa Bình (VN-14)
🏴 Flag for Guárico (VE-J)
🏴 Flag for Cojedes (VE-H)
🏴 Flag for Thừa Thiên–Huế (VN-26)
🏴 Flag for Portuguesa (VE-P)
🏴 Flag for Ninh Bình (VN-18)
🏴 Flag for Sucre (VE-R)
🏴 Flag for Lai Châu (VN-01)
🏴 Flag for Lạng Sơn (VN-09)
🏴 Flag for Miranda (VE-M)
🏴 Flag for Quảng Bình (VN-24)
🏴 Flag for Barinas (VE-E)
🏴 Flag for Monagas (VE-N)
🏴 Flag for Nghệ An (VN-22)
🏴 Flag for Lào Cai (VN-02)
🏴 Flag for Tuyên Quang (VN-07)
🏴 Flag for Sơn La (VN-05)
🏴 Flag for Thái Bình (VN-20)
🏴 Flag for Federal Dependencies (VE-W)
🏴 Flag for Quảng Ngãi (VN-29)
🏴 Flag for Mérida (VE-L)
🏴 Flag for Falcón (VE-I)
🏴 Flag for Cao Bằng (VN-04)
🏴 Flag for Amazonas (VE-Z)
🏴 Flag for Yên Bái (VN-06)
🏴 Flag for Hà Tĩnh (VN-23)
🏴 Flag for Kon Tum (VN-28)
🏴 Flag for Vargas (VE-X)
🏴 Flag for Yaracuy (VE-U)
🏴 Flag for Trujillo (VE-T)
🏴 Flag for Quảng Ninh (VN-13)
🏴 Flag for Hà Giang (VN-03)
🏴 Flag for Quảng Nam (VN-27)
🏴 Flag for Bắc Ninh (VN-56)
🏴 Flag for Ninh Thuận (VN-36)
🏴 Flag for Thái Nguyên (VN-69)
🏴 Flag for Nam Định (VN-67)
🏴 Flag for Lâm Đồng (VN-35)
🏴 Flag for Hải Dương (VN-61)
🏴 Flag for Sóc Trăng (VN-52)
🏴 Flag for Hậu Giang (VN-73)
🏴 Flag for Vĩnh Phúc (VN-70)
🏴 Flag for Bến Tre (VN-50)
🏴 Flag for Bắc Kạn (VN-53)
🏴 Flag for Bắc Giang (VN-54)
🏴 Flag for Đắk Lắk (VN-33)
🏴 Flag for Bình Dương (VN-57)
🏴 Flag for Da Nang (VN-DN)
🏴 Flag for Tiền Giang (VN-46)
🏴 Flag for Bà Rịa–Vũng Tàu (VN-43)
🏴 Flag for Điện Biên (VN-71)
🏴 Flag for Bình Phước (VN-58)
🏴 Flag for Can Tho (VN-CT)
🏴 Flag for Bạc Liêu (VN-55)
🏴 Flag for Phú Yên (VN-32)
🏴 Flag for An Giang (VN-44)
🏴 Flag for Hà Nam (VN-63)
🏴 Flag for Cà Mau (VN-59)
🏴 Flag for Kiên Giang (VN-47)
🏴 Flag for Khánh Hòa (VN-34)
🏴 Flag for Đồng Tháp (VN-45)
🏴 Flag for Đồng Nai (VN-39)
🏴 Flag for Hanoi (VN-HN)
🏴 Flag for Vĩnh Long (VN-49)
🏴 Flag for Phú Thọ (VN-68)
🏴 Flag for Tây Ninh (VN-37)
🏴 Flag for Gia Lai (VN-30)
🏴 Flag for Đắk Nông (VN-72)
🏴 Flag for Bình Thuận (VN-40)
🏴 Flag for Long An (VN-41)
🏴 Flag for Bình Định (VN-31)
🏴 Flag for Uvea (WF-UV)
🏴 Flag for Sa’dah (YE-SD)
🏴 Flag for Abyan (YE-AB)
🏴 Flag for Hajjah (YE-HJ)
🏴 Flag for Malampa (VU-MAP)
🏴 Flag for Atua (WS-AT)
🏴 Flag for Va’a-o-Fonoti (WS-VF)
🏴 Flag for Al Hudaydah (YE-HU)
🏴 Flag for Palauli (WS-PA)
🏴 Flag for Satupa’itea (WS-SA)
🏴 Flag for Dhale (YE-DA)
🏴 Flag for Tombouctou (ML-6)
🏴 Flag for Raymah (YE-RA)
🏴 Flag for Sanma (VU-SAM)
🏴 Flag for Alo (WF-AL)
🏴 Flag for Al Mahrah (YE-MR)
👨🏻👨🏻👧🏻 Family - Man: Light Skin Tone, Man: Light Skin Tone, Girl: Light Skin Tone
🏴 Flag for ’Adan (YE-AD)
🏴 Flag for Shabwah (YE-SH)
🏴 Flag for Tafea (VU-TAE)
🏴 Flag for Amran (YE-AM)
🏴 Flag for Penama (VU-PAM)
🏴 Flag for Al Mahwit (YE-MW)
🏴 Flag for Gaga’emauga (WS-GE)
🏴 Flag for Hadramaut (YE-HD)
🏴 Flag for Aiga-i-le-Tai (WS-AL)
🏴 Flag for Ma’rib (YE-MA)
🏴 Flag for Al Bayda (YE-BA)
🏴 Flag for Haiphong (VN-HP)
🏴 Flag for A’ana (WS-AA)
🏴 Flag for Sigave (WF-SG)
🏴 Flag for Lahij (YE-LA)
🏴 Flag for Shefa (VU-SEE)
🏴 Flag for Ibb (YE-IB)
🏴 Flag for Torba (VU-TOB)
🏴 Flag for Al Jawf (YE-JA)
🏴 Flag for Tuamasaga (WS-TU)
🏴 Flag for Dhamar (YE-DH)
🏴 Flag for Western Cape (ZA-WC)
🏴 Flag for Arkhabil Suqutra (YE-SU)
🏴 Flag for Matabeleland North (ZW-MN)
🏴 Flag for Mashonaland East (ZW-ME)
🏴 Flag for North-Western (ZM-06)
🏴 Flag for Sana’a (YE-SN)
🏴 Flag for Limpopo (ZA-LP)
🏴 Flag for Eastern (ZM-03)
🏴 Flag for Midlands (ZW-MI)
🏴 Flag for Bulawayo (ZW-BU)
🏴 Flag for Northern (ZM-05)
🏴 Flag for Southern (ZM-07)
🏴 Flag for Free State (ZA-FS)
🏴 Flag for Matabeleland South (ZW-MS)
🏴 Flag for Eastern Cape (ZA-EC)
🏴 Flag for Western (ZM-01)
👨🏼👨🏼👧🏼 Family - Man: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
🏴 Flag for Copperbelt (ZM-08)
🏴 Flag for North West (ZA-NW)
🏴 Flag for Muchinga (ZM-10)
🏴 Flag for Gauteng (ZA-GT)
🏴 Flag for Lusaka (ZM-09)
🏴 Flag for Central (ZM-02)
🏴 Flag for Northern Cape (ZA-NC)
🏴 Flag for Mpumalanga (ZA-MP)
🏴 Flag for Taiz (YE-TA)
🏴 Flag for KwaZulu-Natal (ZA-NL)
🏴 Flag for Manicaland (ZW-MA)
🏴 Flag for Masvingo (ZW-MV)
🏴 Flag for Luapula (ZM-04)
🏴 Flag for Mashonaland West (ZW-MW)
🏴 Flag for Harare (ZW-HA)
👨🏽👨🏽👧🏽 Family - Man: Medium Skin Tone, Man: Medium Skin Tone, Girl: Medium Skin Tone
👨🏾👨🏾👧🏾 Family - Man: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
🏴 Flag for Pays-de-la-Loire (FR-PDL)
🏴 Flag for Klaipėdos Municipality (LT-20)
🏴 Flag for Crete (GR-M)
Tag Latin Small Letter X
🏴 Flag for Mazandaran (IR-21)
🏴 Flag for Primorsky Krai (RU-PRI)
🏴 Flag for Fukushima (JP-07)
🏴 Flag for Manitoba (CA-MB)
👨🏻👨🏻👦🏻👦🏻 Family - Man: Light Skin Tone, Man: Light Skin Tone, Boy: Light Skin Tone, Boy: Light Skin Tone
👩🏻❤️👩🏻 Couple With Heart - Woman: Light Skin Tone, Woman: Light Skin Tone
🏴 Flag for Quebec (CA-QC)
👨👩👶 Family: Man, Woman, Baby
🏴 Flag for Kavango East (NA-KE)
🏴 Flag for San Luis Potosí (MX-SLP)
🏴 Flag for Lääne-Viru (EE-59)
🏴 Flag for Bong (LR-BG)
🏴 Flag for Deir al-Balah (PS-DEB)
👨🏿👨🏿👧🏿 Family - Man: Dark Skin Tone, Man: Dark Skin Tone, Girl: Dark Skin Tone
🏴 Flag for Saint Thomas (JM-03)
🏴 Flag for Kayangel (PW-100)
🏴 Flag for Pool (CG-12)
👨❤️👨🏾 Couple With Heart - Man, Man: Medium-Dark Skin Tone
🏴 Flag for Balearic Islands (ES-IB)
👩👨👦 Family: Woman, Man, Boy
🏴 Flag for Uusimaa (FI-18)
👨🏻👩🏻👦🏻👧🏻 Family - Man: Light Skin Tone, Woman: Light Skin Tone, Boy: Light Skin Tone, Girl: Light Skin Tone
🏴 Flag for Ceará (BR-CE)
👨👩👦👶 Family: Man, Woman, Boy, Baby
👨🏻👨🏻👧🏻👦🏻 Family - Man: Light Skin Tone, Man: Light Skin Tone, Girl: Light Skin Tone, Boy: Light Skin Tone
🏴 Flag for Demir Hisar (MK-25)
🏴 Flag for Antofagasta (CL-AN)
🏴 Flag for Christ Church (BB-01)
🏴 Flag for Harju (EE-37)
👨🏿❤️💋👩🏽 Kiss - Man: Dark Skin Tone, Woman: Medium Skin Tone
🏴 Flag for Yaren (NR-14)
👩❤️👩🏻 Couple With Heart - Woman, Woman: Light Skin Tone
🏴 Flag for Selangor (MY-10)
👨🏼👨🏼👧🏼👦🏼 Family - Man: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
🏴 Flag for Apurímac (PE-APU)
👩👨👦👧 Family: Woman, Man, Boy, Girl
👨🏿👩🏿👧🏿 Family - Man: Dark Skin Tone, Woman: Dark Skin Tone, Girl: Dark Skin Tone
🏴 Flag for Abkhazia (GE-AB)
🏴 Flag for Schellenberg (LI-08)
🏴 Flag for Düzce (TR-81)
👩🏾👧🏾👧🏾 Family - Woman: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
👩👨👶👦 Family: Woman, Man, Baby, Boy
🏴 Flag for Sonora (MX-SON)
🏴 Flag for Sassandra-Marahoué (CI-SM)
🏴 Flag for Arequipa (PE-ARE)
👩🏽❤️👩🏼 Couple With Heart - Woman: Medium Skin Tone, Woman: Medium-Light Skin Tone
🏴 Flag for Bouenza (CG-11)
🏴 Flag for Saint Catherine (JM-14)
🏴 Flag for Škofja Loka (SI-122)
👩🏻❤️💋👨🏼 Kiss - Woman: Light Skin Tone, Man: Medium-Light Skin Tone
🏴 Flag for Hsinchu (TW-HSZ)
👩🏼👧🏼👦🏼 Family - Woman: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
🏴 Flag for Southern (LK-3)
👨❤️💋👨🏼 Kiss - Man, Man: Medium-Light Skin Tone
👨🏽👨🏽👧🏽👦🏽 Family - Man: Medium Skin Tone, Man: Medium Skin Tone, Girl: Medium Skin Tone, Boy: Medium Skin Tone
🏴 Flag for León (NI-LE)
🏴 Flag for Varaždin (HR-05)
🏴 Flag for Antioquia (CO-ANT)
🏴 Flag for Sainte-Dévote Chapel (MC-SD)
🏴 Flag for Plasnica (MK-61)
👨🏾❤️👨🏻 Couple With Heart - Man: Medium-Dark Skin Tone, Man: Light Skin Tone
🏴 Flag for West Greece (GR-G)
🏴 Flag for North Province (MV-NO)
👨❤️👩🏻 Couple With Heart - Man, Woman: Light Skin Tone
🏴 Flag for Apure (VE-C)
☿️ Mercury
🏴 Flag for Montana (US-MT)
👩🏼❤️👨🏾 Couple With Heart - Woman: Medium-Light Skin Tone, Man: Medium-Dark Skin Tone
👨🏾👨🏾👧🏾👦🏾 Family - Man: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
🏴 Flag for Esmeraldas (EC-E)
🏴 Flag for Béchar (DZ-08)
🏴 Flag for North Holland (NL-NH)
🏴 Flag for St. Barthélemy (FR-BL)
🏴 Flag for Ouaka (CF-UK)
🏴 Flag for Red Sea (SD-RS)
🏴 Flag for Tabasco (MX-TAB)
🏴 Flag for Macau SAR China (CN-92)
🏴 Flag for Eger (HU-EG)
🏴 Flag for North Ossetia-Alania (RU-SE)
🏴 Flag for Équateur (CD-EQ)
👨🏿👨🏿👧🏿👦🏿 Family - Man: Dark Skin Tone, Man: Dark Skin Tone, Girl: Dark Skin Tone, Boy: Dark Skin Tone
🏴 Flag for Basque Country (ES-PV)
👨🏽❤️💋👨🏻 Kiss - Man: Medium Skin Tone, Man: Light Skin Tone
🏴 Flag for Gafsa (TN-71)
🏴 Flag for Tavastia Proper (FI-06)
🏴 Flag for Razavi Khorasan (IR-30)
🏴 Flag for Dobje (SI-154)
👨🏼❤️💋👨🏻 Kiss - Man: Medium-Light Skin Tone, Man: Light Skin Tone
🏴 Flag for Retalhuleu (GT-RE)
🏴 Flag for Line Islands (KI-L)
🏴 Flag for West Azarbaijan (IR-02)
🏴 Flag for Nariño (CO-NAR)
🏴 Flag for Mashonaland Central (ZW-MC)
👨🏻❤️👨🏻 Couple With Heart - Man: Light Skin Tone, Man: Light Skin Tone
🏴 Flag for Emilia-Romagna (IT-45)
🏴 Flag for Valencian Community (ES-VC)
🏴 Flag for Samut Songkhram (TH-75)
🏴 Flag for Île-de-France (FR-IDF)
🏴 Flag for Maseru (LS-A)
🏴 Flag for Marsabit (KE-25)
🏴 Flag for Adrar (DZ-01)
🏴 Flag for Usulután (SV-US)
🏴 Flag for Mazsalaca (LV-060)
👩🏻❤️💋👩🏾 Kiss - Woman: Light Skin Tone, Woman: Medium-Dark Skin Tone
👨🏾👦🏾👦🏾 Family - Man: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
🏴 Flag for Chaiyaphum (TH-36)
🏴 Flag for Central Visayas (PH-07)
🏴 Flag for Chumphon (TH-86)
🏴 Flag for Zanzan (CI-ZZ)
🏴 Flag for Castile and León (ES-CL)
👨🏻👨🏻👧🏻👧🏻 Family - Man: Light Skin Tone, Man: Light Skin Tone, Girl: Light Skin Tone, Girl: Light Skin Tone
🏴 Flag for Al Bahah (SA-11)
🏴 Flag for Sint Eustatius (BQ-SE)
🏴 Flag for Åland Islands (FI-01)
🏴 Flag for Heredia (CR-H)
🏴 Flag for Kütahya (TR-43)
🏴 Flag for Vaisigano (WS-VS)
👨🏿❤️💋👩🏼 Kiss - Man: Dark Skin Tone, Woman: Medium-Light Skin Tone
🏴 Flag for Kranj (SI-052)
🏴 Flag for Zulia (VE-V)
👩🏽❤️💋👨🏼 Kiss - Woman: Medium Skin Tone, Man: Medium-Light Skin Tone
🏴 Flag for Capellen (LU-CA)
👩🏽❤️👩🏾 Couple With Heart - Woman: Medium Skin Tone, Woman: Medium-Dark Skin Tone
👨🏼👨🏼👧🏼👧🏼 Family - Man: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
🏴 Flag for East Berbice-Corentyne (GY-EB)
🏴 Flag for Lopburi (TH-16)
🏴 Flag for Luqa (MT-25)
👨🏻❤️👨🏼 Couple With Heart - Man: Light Skin Tone, Man: Medium-Light Skin Tone
👨🏽👨🏽👧🏽👧🏽 Family - Man: Medium Skin Tone, Man: Medium Skin Tone, Girl: Medium Skin Tone, Girl: Medium Skin Tone
👩🏻❤️👩🏽 Couple With Heart - Woman: Light Skin Tone, Woman: Medium Skin Tone
🏴 Flag for Baja California Sur (MX-BCS)
🏴 Flag for Beni Suef (EG-BNS)
🏴 Flag for Phatthalung (TH-93)
🏴 Flag for Tanga (TZ-25)
🏴 Flag for Oriental (MA-04)
👨🏾👨🏾👧🏾👧🏾 Family - Man: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
👨🏿👩🏿👶🏿 Family - Man: Dark Skin Tone, Woman: Dark Skin Tone, Baby: Dark Skin Tone
🏴 Flag for Gorenja Vas–Poljane (SI-027)
🏴 Flag for Sangre Grande (TT-SGE)
🏴 Flag for Koknese (LV-046)
🏴 Flag for Odranci (SI-086)
🏴 Flag for Nelson (NZ-NSN)
🏴 Flag for Szabolcs-Szatmár-Bereg (HU-SZ)
👩🏾❤️💋👨🏽 Kiss - Woman: Medium-Dark Skin Tone, Man: Medium Skin Tone
🏴 Flag for Sveti Jurij v Slovenskih Goricah (SI-210)
߷ NKo Symbol Gbakurunen
🏴 Flag for Delta (NG-DE)
🏴 Flag for Căușeni (MD-CS)
👩🏽👧🏽👦🏽 Family - Woman: Medium Skin Tone, Girl: Medium Skin Tone, Boy: Medium Skin Tone
🏴 Flag for Isla de la Juventud (CU-99)
🏴 Flag for Svay Rieng (KH-20)
🏴 Flag for Hadjer-Lamis (TD-HL)
🏴 Flag for Gifu (JP-21)
🏴 Flag for Jelgava Municipality (LV-041)
🏴 Flag for Federally Administered Tribal Areas (PK-TA)
🏴 Flag for Xewkija (MT-62)
🏴 Flag for Guidimaka (MR-10)
🏴 Flag for Aračinovo (MK-02)
🏴 Flag for Log–Dragomer (SI-208)
🏴 Flag for Šmartno ob Paki (SI-125)
🏴 Flag for Capital District (CO-DC)
🏴 Flag for Ventspils Municipality (LV-106)
🏴 Flag for South Central Province (MV-SC)
🏴 Flag for Assam (IN-AS)
🏴 Flag for Alytus Municipality (LT-02)
🏴 Flag for Hưng Yên (VN-66)
👨🏻👨🏻👧🏻👶🏻 Family - Man: Light Skin Tone, Man: Light Skin Tone, Girl: Light Skin Tone, Baby: Light Skin Tone
👨🏼👨🏼👧🏼👶🏼 Family - Man: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
🏴 Flag for San Marcos (GT-SM)
👨🏼👨🏼👦🏼👦🏼 Family - Man: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
🏴 Flag for Schleswig-Holstein (DE-SH)
👨👨👶👧 Family: Man, Man, Baby, Girl
️ Variation Selector-16
👨🏽👨🏽👧🏽👶🏽 Family - Man: Medium Skin Tone, Man: Medium Skin Tone, Girl: Medium Skin Tone, Baby: Medium Skin Tone
👨🏾👨🏾👧🏾👶🏾 Family - Man: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
👨🏿👨🏿👧🏿👶🏿 Family - Man: Dark Skin Tone, Man: Dark Skin Tone, Girl: Dark Skin Tone, Baby: Dark Skin Tone
👨🏻👨🏻👶🏻 Family - Man: Light Skin Tone, Man: Light Skin Tone, Baby: Light Skin Tone
👨🏼👨🏼👶🏼 Family - Man: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
👨🏽👨🏽👶🏽 Family - Man: Medium Skin Tone, Man: Medium Skin Tone, Baby: Medium Skin Tone
👨🏾👨🏾👶🏾 Family - Man: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
👨🏿👨🏿👶🏿 Family - Man: Dark Skin Tone, Man: Dark Skin Tone, Baby: Dark Skin Tone
👩❤️👨🏿 Couple With Heart - Woman, Man: Dark Skin Tone
🏴 Flag for Cantabria (ES-CB)
🏴 Flag for Unity (SS-UY)
👩🏼👶🏼👦🏼 Family - Woman: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
👩🏽👶🏽👦🏽 Family - Woman: Medium Skin Tone, Baby: Medium Skin Tone, Boy: Medium Skin Tone
👩🏾👶🏾👦🏾 Family - Woman: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
👩🏿👶🏿👦🏿 Family - Woman: Dark Skin Tone, Baby: Dark Skin Tone, Boy: Dark Skin Tone
👩🏻👶🏻👧🏻 Family - Woman: Light Skin Tone, Baby: Light Skin Tone, Girl: Light Skin Tone
👩🏼👶🏼👧🏼 Family - Woman: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
👩🏽👶🏽👧🏽 Family - Woman: Medium Skin Tone, Baby: Medium Skin Tone, Girl: Medium Skin Tone
👩🏾👶🏾👧🏾 Family - Woman: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
👩🏽👶🏽👶🏽 Family - Woman: Medium Skin Tone, Baby: Medium Skin Tone, Baby: Medium Skin Tone
👩🏾👶🏾👶🏾 Family - Woman: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
👩🏿👶🏿👶🏿 Family - Woman: Dark Skin Tone, Baby: Dark Skin Tone, Baby: Dark Skin Tone
👩🏻👨🏻👦🏻 Family - Woman: Light Skin Tone, Man: Light Skin Tone, Boy: Light Skin Tone
👩🏼👨🏼👦🏼 Family - Woman: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
👩🏽👨🏽👦🏽 Family - Woman: Medium Skin Tone, Man: Medium Skin Tone, Boy: Medium Skin Tone
👩🏾👨🏾👦🏾 Family - Woman: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
👩🏿👨🏿👦🏿 Family - Woman: Dark Skin Tone, Man: Dark Skin Tone, Boy: Dark Skin Tone
👩🏻👨🏻👦🏻👦🏻 Family - Woman: Light Skin Tone, Man: Light Skin Tone, Boy: Light Skin Tone, Boy: Light Skin Tone
👩🏼👶🏼👶🏼 Family - Woman: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
👩🏼👨🏼👦🏼👦🏼 Family - Woman: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
👩🏽👨🏽👦🏽👦🏽 Family - Woman: Medium Skin Tone, Man: Medium Skin Tone, Boy: Medium Skin Tone, Boy: Medium Skin Tone
👩🏾👨🏾👦🏾👦🏾 Family - Woman: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
👩🏿👨🏿👦🏿👦🏿 Family - Woman: Dark Skin Tone, Man: Dark Skin Tone, Boy: Dark Skin Tone, Boy: Dark Skin Tone
👩🏽👨🏽👦🏽👧🏽 Family - Woman: Medium Skin Tone, Man: Medium Skin Tone, Boy: Medium Skin Tone, Girl: Medium Skin Tone
👩🏾👨🏾👦🏾👧🏾 Family - Woman: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
👩🏿👨🏿👦🏿👧🏿 Family - Woman: Dark Skin Tone, Man: Dark Skin Tone, Boy: Dark Skin Tone, Girl: Dark Skin Tone
👩🏼👨🏼👦🏼👶🏼 Family - Woman: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
👩🏽👨🏽👦🏽👶🏽 Family - Woman: Medium Skin Tone, Man: Medium Skin Tone, Boy: Medium Skin Tone, Baby: Medium Skin Tone
👩🏾👨🏾👦🏾👶🏾 Family - Woman: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
👩🏻👨🏻👧🏻 Family - Woman: Light Skin Tone, Man: Light Skin Tone, Girl: Light Skin Tone
👩🏽👨🏽👧🏽 Family - Woman: Medium Skin Tone, Man: Medium Skin Tone, Girl: Medium Skin Tone
👩🏻👨🏻👦🏻👶🏻 Family - Woman: Light Skin Tone, Man: Light Skin Tone, Boy: Light Skin Tone, Baby: Light Skin Tone
👩🏼👨🏼👦🏼👧🏼 Family - Woman: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
👩🏼👨🏼👧🏼 Family - Woman: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
👩🏻👨🏻👧🏻👦🏻 Family - Woman: Light Skin Tone, Man: Light Skin Tone, Girl: Light Skin Tone, Boy: Light Skin Tone
👩🏼👨🏼👧🏼👦🏼 Family - Woman: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
👩🏽👨🏽👧🏽👦🏽 Family - Woman: Medium Skin Tone, Man: Medium Skin Tone, Girl: Medium Skin Tone, Boy: Medium Skin Tone
👩🏾👨🏾👧🏾👦🏾 Family - Woman: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
👩🏿👨🏿👧🏿👦🏿 Family - Woman: Dark Skin Tone, Man: Dark Skin Tone, Girl: Dark Skin Tone, Boy: Dark Skin Tone
👩🏻👨🏻👧🏻👧🏻 Family - Woman: Light Skin Tone, Man: Light Skin Tone, Girl: Light Skin Tone, Girl: Light Skin Tone
👩🏽👨🏽👧🏽👧🏽 Family - Woman: Medium Skin Tone, Man: Medium Skin Tone, Girl: Medium Skin Tone, Girl: Medium Skin Tone
👩🏿👨🏿👧🏿👧🏿 Family - Woman: Dark Skin Tone, Man: Dark Skin Tone, Girl: Dark Skin Tone, Girl: Dark Skin Tone
👩🏻👨🏻👧🏻👶🏻 Family - Woman: Light Skin Tone, Man: Light Skin Tone, Girl: Light Skin Tone, Baby: Light Skin Tone
👩🏼👨🏼👧🏼👶🏼 Family - Woman: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
👩🏽👨🏽👧🏽👶🏽 Family - Woman: Medium Skin Tone, Man: Medium Skin Tone, Girl: Medium Skin Tone, Baby: Medium Skin Tone
👩🏾👨🏾👧🏾👶🏾 Family - Woman: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
👩🏿👨🏿👧🏿👶🏿 Family - Woman: Dark Skin Tone, Man: Dark Skin Tone, Girl: Dark Skin Tone, Baby: Dark Skin Tone
👩🏻👨🏻👶🏻 Family - Woman: Light Skin Tone, Man: Light Skin Tone, Baby: Light Skin Tone
👩🏼👨🏼👶🏼 Family - Woman: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
👩🏻👨🏻👶🏻👦🏻 Family - Woman: Light Skin Tone, Man: Light Skin Tone, Baby: Light Skin Tone, Boy: Light Skin Tone
👩🏼👨🏼👶🏼👦🏼 Family - Woman: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
👩🏾👨🏾👶🏾👦🏾 Family - Woman: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
👩🏿👨🏿👶🏿👦🏿 Family - Woman: Dark Skin Tone, Man: Dark Skin Tone, Baby: Dark Skin Tone, Boy: Dark Skin Tone
👩🏻👨🏻👶🏻👧🏻 Family - Woman: Light Skin Tone, Man: Light Skin Tone, Baby: Light Skin Tone, Girl: Light Skin Tone
👩🏼👨🏼👶🏼👧🏼 Family - Woman: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
👩🏽👨🏽👶🏽👧🏽 Family - Woman: Medium Skin Tone, Man: Medium Skin Tone, Baby: Medium Skin Tone, Girl: Medium Skin Tone
👩🏿👨🏿👶🏿👧🏿 Family - Woman: Dark Skin Tone, Man: Dark Skin Tone, Baby: Dark Skin Tone, Girl: Dark Skin Tone
👩🏻👨🏻👶🏻👶🏻 Family - Woman: Light Skin Tone, Man: Light Skin Tone, Baby: Light Skin Tone, Baby: Light Skin Tone
👩🏼👨🏼👶🏼👶🏼 Family - Woman: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
👩🏽👨🏽👶🏽👶🏽 Family - Woman: Medium Skin Tone, Man: Medium Skin Tone, Baby: Medium Skin Tone, Baby: Medium Skin Tone
👩🏾👨🏾👶🏾👶🏾 Family - Woman: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
👩🏿👨🏿👶🏿👶🏿 Family - Woman: Dark Skin Tone, Man: Dark Skin Tone, Baby: Dark Skin Tone, Baby: Dark Skin Tone
👩🏼👩🏼👦🏼 Family - Woman: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
👩🏻👩🏻👦🏻👧🏻 Family - Woman: Light Skin Tone, Woman: Light Skin Tone, Boy: Light Skin Tone, Girl: Light Skin Tone
👩🏼👩🏼👦🏼👦🏼 Family - Woman: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
👩🏾👩🏾👦🏾👦🏾 Family - Woman: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
👩🏿👩🏿👦🏿👦🏿 Family - Woman: Dark Skin Tone, Woman: Dark Skin Tone, Boy: Dark Skin Tone, Boy: Dark Skin Tone
👩🏿👩🏿👦🏿👶🏿 Family - Woman: Dark Skin Tone, Woman: Dark Skin Tone, Boy: Dark Skin Tone, Baby: Dark Skin Tone
👩🏽👩🏽👦🏽👧🏽 Family - Woman: Medium Skin Tone, Woman: Medium Skin Tone, Boy: Medium Skin Tone, Girl: Medium Skin Tone
👩🏾👩🏾👦🏾👧🏾 Family - Woman: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
👩🏿👩🏿👦🏿👧🏿 Family - Woman: Dark Skin Tone, Woman: Dark Skin Tone, Boy: Dark Skin Tone, Girl: Dark Skin Tone
👩🏻👩🏻👦🏻👶🏻 Family - Woman: Light Skin Tone, Woman: Light Skin Tone, Boy: Light Skin Tone, Baby: Light Skin Tone
👩🏼👩🏼👦🏼👶🏼 Family - Woman: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
👩🏽👩🏽👦🏽👶🏽 Family - Woman: Medium Skin Tone, Woman: Medium Skin Tone, Boy: Medium Skin Tone, Baby: Medium Skin Tone
👩🏾👩🏾👦🏾👶🏾 Family - Woman: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
👩🏻👩🏻👧🏻 Family - Woman: Light Skin Tone, Woman: Light Skin Tone, Girl: Light Skin Tone
👩🏼👩🏼👧🏼 Family - Woman: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
👩🏽👩🏽👦🏽👦🏽 Family - Woman: Medium Skin Tone, Woman: Medium Skin Tone, Boy: Medium Skin Tone, Boy: Medium Skin Tone
👩🏽👩🏽👧🏽👦🏽 Family - Woman: Medium Skin Tone, Woman: Medium Skin Tone, Girl: Medium Skin Tone, Boy: Medium Skin Tone
👩🏻👩🏻👧🏻👧🏻 Family - Woman: Light Skin Tone, Woman: Light Skin Tone, Girl: Light Skin Tone, Girl: Light Skin Tone
👩🏽👩🏽👧🏽👧🏽 Family - Woman: Medium Skin Tone, Woman: Medium Skin Tone, Girl: Medium Skin Tone, Girl: Medium Skin Tone
👩🏾👩🏾👧🏾👧🏾 Family - Woman: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
👩🏿👩🏿👧🏿👧🏿 Family - Woman: Dark Skin Tone, Woman: Dark Skin Tone, Girl: Dark Skin Tone, Girl: Dark Skin Tone
👩🏻👩🏻👧🏻👶🏻 Family - Woman: Light Skin Tone, Woman: Light Skin Tone, Girl: Light Skin Tone, Baby: Light Skin Tone
👩🏼👩🏼👧🏼👶🏼 Family - Woman: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
👩🏽👩🏽👧🏽👶🏽 Family - Woman: Medium Skin Tone, Woman: Medium Skin Tone, Girl: Medium Skin Tone, Baby: Medium Skin Tone
👩🏾👩🏾👧🏾👶🏾 Family - Woman: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
👩🏿👩🏿👧🏿👶🏿 Family - Woman: Dark Skin Tone, Woman: Dark Skin Tone, Girl: Dark Skin Tone, Baby: Dark Skin Tone
👩🏻👩🏻👶🏻 Family - Woman: Light Skin Tone, Woman: Light Skin Tone, Baby: Light Skin Tone
👨🏾👩🏾👧🏾👧🏾 Family - Man: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
👩🏼👩🏼👶🏼 Family - Woman: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
👩🏽👩🏽👶🏽 Family - Woman: Medium Skin Tone, Woman: Medium Skin Tone, Baby: Medium Skin Tone
👩🏾👩🏾👶🏾 Family - Woman: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
👩🏿👩🏿👶🏿 Family - Woman: Dark Skin Tone, Woman: Dark Skin Tone, Baby: Dark Skin Tone
👩🏻👩🏻👶🏻👦🏻 Family - Woman: Light Skin Tone, Woman: Light Skin Tone, Baby: Light Skin Tone, Boy: Light Skin Tone
👩🏼👩🏼👶🏼👦🏼 Family - Woman: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
👩🏽👩🏽👶🏽👦🏽 Family - Woman: Medium Skin Tone, Woman: Medium Skin Tone, Baby: Medium Skin Tone, Boy: Medium Skin Tone
👩🏾👩🏾👶🏾👦🏾 Family - Woman: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
👩🏿👩🏿👶🏿👦🏿 Family - Woman: Dark Skin Tone, Woman: Dark Skin Tone, Baby: Dark Skin Tone, Boy: Dark Skin Tone
👩🏻👩🏻👶🏻👧🏻 Family - Woman: Light Skin Tone, Woman: Light Skin Tone, Baby: Light Skin Tone, Girl: Light Skin Tone
👩🏽👩🏽👶🏽👧🏽 Family - Woman: Medium Skin Tone, Woman: Medium Skin Tone, Baby: Medium Skin Tone, Girl: Medium Skin Tone
👩🏾👩🏾👶🏾👧🏾 Family - Woman: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
👩🏿👩🏿👶🏿👧🏿 Family - Woman: Dark Skin Tone, Woman: Dark Skin Tone, Baby: Dark Skin Tone, Girl: Dark Skin Tone
👩🏻👩🏻👶🏻👶🏻 Family - Woman: Light Skin Tone, Woman: Light Skin Tone, Baby: Light Skin Tone, Baby: Light Skin Tone
👩🏼👩🏼👶🏼👶🏼 Family - Woman: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
👩🏽👩🏽👶🏽👶🏽 Family - Woman: Medium Skin Tone, Woman: Medium Skin Tone, Baby: Medium Skin Tone, Baby: Medium Skin Tone
👩🏾👩🏾👶🏾👶🏾 Family - Woman: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
👩🏿👩🏿👶🏿👶🏿 Family - Woman: Dark Skin Tone, Woman: Dark Skin Tone, Baby: Dark Skin Tone, Baby: Dark Skin Tone
👩🏼👧🏼 Family - Woman: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
🏴 Flag for Maluku Islands (ID-ML)
👩🏿👶🏿👧🏿 Family - Woman: Dark Skin Tone, Baby: Dark Skin Tone, Girl: Dark Skin Tone
🏴 Flag for Southern Denmark (DK-83)
🏴 Flag for Skopje (MK-85)
👨🏼❤️💋👩 Kiss - Man: Medium-Light Skin Tone, Woman
🏴 Flag for Beja (PT-02)
🏴 Flag for Sardinia (IT-88)
🏴 Flag for Bavaria (DE-BY)
🏴 Flag for East New Britain (PG-EBR)
🏴 Flag for Trentino-South Tyrol (IT-32)
🏴 Flag for Tennessee (US-TN)
🏴 Flag for Saskatchewan (CA-SK)
🏴 Flag for Funafuti (TV-FUN)
🏴 Flag for Gorno-Badakhshan (TJ-GB)
🏴 Flag for Banaadir (SO-BN)
🏴 Flag for Radenci (SI-100)
🏴 Flag for Baden-Württemberg (DE-BW)
👩🏿👧🏿 Family - Woman: Dark Skin Tone, Girl: Dark Skin Tone
🏴 Flag for Carabobo (VE-G)
Zero Width Joiner
🏴 Flag for Nakuru (KE-31)
🏴 Flag for Maritime (TG-M)
🏴 Flag for Borno (NG-BO)
🏴 Flag for Transnistria (MD-SN)
🏴 Flag for Tehran (IR-07)
🏴 Flag for Dagestan (RU-DA)
🏴 Flag for Al Wusta (OM-WU)
🏴 Flag for Ústecký kraj (CZ-42)
🏴 Flag for Kuala Lumpur (MY-14)
🏴 Flag for Ayacucho (PE-AYA)
🏴 Flag for Kiev (UA-30)
🏴 Flag for Saint Philip (AG-08)
🏴 Flag for Mdina (MT-29)
🏴 Flag for Northern Ireland (GB-NIR)
🏴 Flag for Auvergne-Rhône-Alpes (FR-ARA)
🏴 Flag for Durango (MX-DUR)
👨🏼👩🏼👧🏼 Family - Man: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
🏴 Flag for Eastern (LK-5)
🏴 Flag for Ogun (NG-OG)
🏴 Flag for Jafara (LY-JI)
🏴 Flag for Skåne (SE-M)
👨🏽👩🏽👧🏽👦🏽 Family - Man: Medium Skin Tone, Woman: Medium Skin Tone, Girl: Medium Skin Tone, Boy: Medium Skin Tone
👩🏾👩🏾👧🏾👦🏾 Family - Woman: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
🏴 Flag for Mato Grosso do Sul (BR-MS)
🏴 Flag for Santa Rosa (GT-SR)
👨🏼👩🏼👧🏼👧🏼 Family - Man: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
🏴 Flag for Braslovče (SI-151)
🏴 Flag for Madeira (PT-30)
🏴 Flag for San Vicente (SV-SV)
🏴 Flag for Alborz (IR-32)
🏴 Flag for Fa’asaleleaga (WS-FA)
👨🏼👨🏼👦🏼 Family - Man: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
🏴 Flag for Newfoundland and Labrador (CA-NL)
🏴 Flag for Peloponnese (GR-J)
🏴 Flag for Sint Maarten (NL-SX)
🏴 Flag for St. Julian’s (MT-48)
🏴 Flag for Adamawa (NG-AD)
👩🏿👩🏿👧🏿👦🏿 Family - Woman: Dark Skin Tone, Woman: Dark Skin Tone, Girl: Dark Skin Tone, Boy: Dark Skin Tone
🏴 Flag for São Tomé (ST-S)
👩🏻👩🏻👧🏻👦🏻 Family - Woman: Light Skin Tone, Woman: Light Skin Tone, Girl: Light Skin Tone, Boy: Light Skin Tone
🏴 Flag for Auce (LV-010)
🏴 Flag for Cordillera Administrative (PH-15)
🏴 Flag for Fukui (JP-18)
👨🏿👩🏿👦🏿 Family - Man: Dark Skin Tone, Woman: Dark Skin Tone, Boy: Dark Skin Tone
🏴 Flag for Kakheti (GE-KA)
🏴 Flag for Jeju (KR-49)
🏴 Flag for Souss-Massa-Drâa (MA-13)
🏴 Flag for Inčukalns (LV-037)
🏴 Flag for French Southern Territories (FR-TF)
🏴 Flag for Quintana Roo (MX-ROO)
👩🏻👶🏻👶🏻 Family - Woman: Light Skin Tone, Baby: Light Skin Tone, Baby: Light Skin Tone
👨🏾👨🏾👦🏾👦🏾 Family - Man: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
🏴 Flag for Győr-Moson-Sopron (HU-GS)
👩🏿👩🏿👧🏿 Family - Woman: Dark Skin Tone, Woman: Dark Skin Tone, Girl: Dark Skin Tone
👩🏻👩🏻👦🏻👦🏻 Family - Woman: Light Skin Tone, Woman: Light Skin Tone, Boy: Light Skin Tone, Boy: Light Skin Tone
👩❤️👨🏽 Couple With Heart - Woman, Man: Medium Skin Tone
🏴 Flag for Gaga’ifomauga (WS-GI)
🏴 Flag for Nord-Est (HT-NE)
🏴 Flag for Central Singapore (SG-01)
🏴 Flag for Tungurahua (EC-T)
# Number Sign
👨🏻👨🏻👶🏻👦🏻 Family - Man: Light Skin Tone, Man: Light Skin Tone, Baby: Light Skin Tone, Boy: Light Skin Tone
1 Digit One
🏴 Flag for Tarija (BO-T)
👨🏾👩🏾👧🏾 Family - Man: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
🏴 Flag for Cibitoke (BI-CI)
🏴 Flag for Upper South Province (MV-US)
🏴 Flag for Canillo (AD-02)
🏴 Flag for Bamyan (AF-BAM)
🏴 Flag for Encamp (AD-03)
🏴 Flag for Northern Mariana Islands (US-MP)
🏴 Flag for Babīte (LV-012)
🏴 Flag for Cotopaxi (EC-X)
🏴 Flag for Ngounié (GA-4)
* Asterisk
Tag Latin Small Letter Z
🏴 Flag for La Massana (AD-04)
Tag Digit Three
👩🏼❤️💋👩🏻 Kiss - Woman: Medium-Light Skin Tone, Woman: Light Skin Tone
🏴 Flag for Berane (ME-03)
👨🏿❤️💋👨🏽 Kiss - Man: Dark Skin Tone, Man: Medium Skin Tone
🏴 Flag for El Valle (DO-37)
👩🏾❤️👩🏻 Couple With Heart - Woman: Medium-Dark Skin Tone, Woman: Light Skin Tone
🏴 Flag for Baringo (KE-01)
🏴 Flag for Amanat Al Asimah (YE-SA)
👨🏼👨🏼👶🏼👦🏼 Family - Man: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
Tag Digit Two
🏴 Flag for Senglea (MT-20)
🕴️♀️ Woman in Business Suit Levitating
🏴 Flag for Haut-Mbomou (CF-HM)
Tag Digit One
Tag Digit Four
🏴 Flag for Absheron (AZ-ABS)
6 Digit Six
🏴 Flag for Savannakhet (LA-SV)
🏴 Flag for Kayes (ML-1)
🏴 Flag for Abu Dhabi (AE-AZ)
🏴 Flag for Asturias (ES-AS)
🏴 Flag for Kirkuk (IQ-KI)
👩❤️👩🏽 Couple With Heart - Woman, Woman: Medium Skin Tone
🏴 Flag for Berlin (DE-BE)
8 Digit Eight
🏴 Flag for Escaldes-Engordany (AD-08)
🏴 Flag for Ningxia (CN-64)
🏴 Flag for Cañar (EC-F)
🏴 Flag for Ajman (AE-AJ)
🕴🏻♀️ Woman in Business Suit Levitating: Light Skin Tone
👨🏻❤️💋👩 Kiss - Man: Light Skin Tone, Woman
Tag Digit Eight
🏴 Flag for Fars (IR-14)
🏴 Flag for Fujairah (AE-FU)
👨🏼👦🏼👦🏼 Family - Man: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
🏴 Flag for Virovitica-Podravina (HR-10)
Tag Latin Small Letter I
7 Digit Seven
Tag Digit Seven
Tag Latin Small Letter E
👩🏼👩🏼👧🏼👦🏼 Family - Woman: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
🏴 Flag for Ratak Chain (MH-T)
🏴 Flag for Sharjah (AE-SH)
Tag Latin Small Letter F
🏴 Flag for Vilniaus Municipality (LT-57)
🏴 Flag for Westfjords (IS-4)
🏴 Flag for British Columbia (CA-BC)
4 Digit Four
🏴 Flag for Balkh (AF-BAL)
👨👶👦 Family: Man, Baby, Boy
🏴 Flag for Hsinchu County (TW-HSQ)
👩👶👧 Family: Woman, Baby, Girl
🏴 Flag for Jalisco (MX-JAL)
🏴 Flag for Kitui (KE-18)
🏴 Flag for Azores (PT-20)
🏴 Flag for Manipur (IN-MN)
🏴 Flag for Badakhshan (AF-BDS)
👩🏻❤️👩🏼 Couple With Heart - Woman: Light Skin Tone, Woman: Medium-Light Skin Tone
🏴 Flag for Ordino (AD-05)
👩🏽❤️💋👩 Kiss - Woman: Medium Skin Tone, Woman
🏴 Flag for Baghlan (AF-BGL)
🏴 Flag for Cross River (NG-CR)
🏴 Flag for Colorado (US-CO)
Tag Latin Small Letter T
🏴 Flag for Radoviš (MK-64)
🏴 Flag for Wellington (NZ-WGN)
👨🏽👨🏽👶🏽👦🏽 Family - Man: Medium Skin Tone, Man: Medium Skin Tone, Baby: Medium Skin Tone, Boy: Medium Skin Tone
🏴 Flag for Kurdistan (IR-16)
👨🏽❤️💋👨🏿 Kiss - Man: Medium Skin Tone, Man: Dark Skin Tone
Tag Latin Small Letter S
👩👶👶 Family: Woman, Baby, Baby
🏴 Flag for Daykundi (AF-DAY)
👨🏻❤️💋👨🏾 Kiss - Man: Light Skin Tone, Man: Medium-Dark Skin Tone
🏴 Flag for Farah (AF-FRA)
Tag Latin Small Letter Q
🏴 Flag for Guatemala (GT-GU)
🏴 Flag for Thurgau (CH-TG)
🏴 Flag for Chechen (RU-CE)
Tag Digit Five
🏴 Flag for Ghōr (AF-GHO)
🏴 Flag for Vienna (AT-9)
🏴 Flag for Ghazni (AF-GHA)
Tag Latin Small Letter U
🏴 Flag for Gaborone (BW-GA)
Tag Latin Small Letter Y
Cancel Tag
Tag Latin Small Letter W
👩🏽❤️👩🏿 Couple With Heart - Woman: Medium Skin Tone, Woman: Dark Skin Tone
🏴 Flag for Amazonas (CO-AMA)
Tag Latin Small Letter N
👩❤️💋👩🏽 Kiss - Woman, Woman: Medium Skin Tone
👨👶 Family: Man, Baby
🏴 Flag for Burgenland (AT-1)
🏴 Flag for Helmand (AF-HEL)
Tag Digit Six
🏴 Flag for Jowzjan (AF-JOW)
🧕♀️ Woman With Headscarf
Tag Latin Small Letter B
Tag Digit Zero
🏴 Flag for Herat (AF-HER)
🏴 Flag for Saint Mark (GD-05)
3 Digit Three
Tag Latin Small Letter G
🕴🏾♀️ Woman in Business Suit Levitating: Medium-Dark Skin Tone
👩🏽❤️💋👨🏽 Kiss - Woman: Medium Skin Tone, Man: Medium Skin Tone
🏴 Flag for Alaska (US-AK)
Tag Latin Small Letter R
🏴 Flag for Lautém (TL-LA)
🏴 Flag for Kabul (AF-KAB)
👨❤️💋👨🏿 Kiss - Man, Man: Dark Skin Tone
🧕♂️ Man With Headscarf
Tag Latin Small Letter V
Tag Latin Small Letter D
🏴 Flag for Kandahar (AF-KAN)
🏴 Flag for Kapisa (AF-KAP)
🏴 Flag for Saint Roman (MC-SR)
🏴 Flag for Hiiu (EE-39)
Tag Latin Small Letter M
🏴 Flag for Khost (AF-KHO)
🧕🏻♂️ Man With Headscarf: Light Skin Tone
🏴 Flag for Kunduz (AF-KDZ)
👩🏿❤️👨 Couple With Heart - Woman: Dark Skin Tone, Man
🏴 Flag for South Dakota (US-SD)
🏴 Flag for Badghis (AF-BDG)
🏴 Flag for Southern (IS-8)
🏴 Flag for Kunar (AF-KNR)
👨👨👶👶 Family: Man, Man, Baby, Baby
🏴 Flag for Tokyo (JP-13)
🏴 Flag for Laghman (AF-LAG)
🧕🏽♂️ Man With Headscarf: Medium Skin Tone
🏴 Flag for Logar (AF-LOG)
5 Digit Five
Tag Latin Small Letter C
🏴 Flag for Faryab (AF-FYB)
Tag Latin Small Letter P
🏴 Flag for Nangarhar (AF-NAN)
Tag Digit Nine
🏴 Flag for Navarra Chartered Community (ES-NC)
👩🏼👦🏼👧🏼 Family - Woman: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
🏴 Flag for Nayarit (MX-NAY)
🏴 Flag for Pernambuco (BR-PE)
🏴 Flag for Campania (IT-72)
🧕🏾♂️ Man With Headscarf: Medium-Dark Skin Tone
👩🏽❤️💋👩🏾 Kiss - Woman: Medium Skin Tone, Woman: Medium-Dark Skin Tone
🏴 Flag for Nuristan (AF-NUR)
👨👨👧👶 Family: Man, Man, Girl, Baby
🏴 Flag for West New Britain (PG-WBK)
👨🏼👩🏼👧🏼👦🏼 Family - Man: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
🏴 Flag for Upper Demerara-Berbice (GY-UD)
👨❤️💋👩 Kiss - Man, Woman
🏴 Flag for Afar (ET-AF)
🏴 Flag for Parwan (AF-PAR)
🏴 Flag for Nimruz (AF-NIM)
🏴 Flag for Karlovac (HR-04)
🏴 Flag for Paktia (AF-PIA)
🧕🏿♂️ Man With Headscarf: Dark Skin Tone
🧕🏼♂️ Man With Headscarf: Medium-Light Skin Tone
🏴 Flag for Baja California (MX-BCN)
🏴 Flag for Paktika (AF-PKA)
🏴 Flag for Phoenix Islands (KI-P)
Tag Latin Small Letter O
🏴 Flag for Panjshir (AF-PAN)
🏴 Flag for Ticino (CH-TI)
🏴 Flag for Žirovnica (SI-192)
🏴 Flag for Halland (SE-N)
Tag Latin Small Letter J
👩🏽❤️💋👩🏻 Kiss - Woman: Medium Skin Tone, Woman: Light Skin Tone
👨🏾👨🏾👶🏾👦🏾 Family - Man: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
👨🏿👨🏿👶🏿👦🏿 Family - Man: Dark Skin Tone, Man: Dark Skin Tone, Baby: Dark Skin Tone, Boy: Dark Skin Tone
🏴 Flag for Northern Bahr el Ghazal (SS-BN)
👨🏽❤️💋👩 Kiss - Man: Medium Skin Tone, Woman
🏴 Flag for Basse-Kotto (CF-BK)
👨❤️👨🏻 Couple With Heart - Man, Man: Light Skin Tone
👨🏽❤️👨 Couple With Heart - Man: Medium Skin Tone, Man
🏴 Flag for Butnan (LY-BU)
👩👶 Family: Woman, Baby
🏴 Flag for Sabaragamuwa (LK-9)
🏴 Flag for Samangan (AF-SAM)
🏴 Flag for Nukulaelae (TV-NKL)
🏴 Flag for Ras al-Khaimah (AE-RK)
🏴 Flag for Ceuta (ES-CE)
🏴 Flag for Dubai (AE-DU)
👨🏻👨🏻👶🏻👧🏻 Family - Man: Light Skin Tone, Man: Light Skin Tone, Baby: Light Skin Tone, Girl: Light Skin Tone
🏴 Flag for Okinawa (JP-47)
🏴 Flag for Sar-e Pol (AF-SAR)
👩🏼👩🏼👦🏼👧🏼 Family - Woman: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
Tag Latin Small Letter L
🏴 Flag for Urozgan (AF-URU)
9 Digit Nine
👩🏾👦🏾 Family - Woman: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
👨❤️💋👨🏽 Kiss - Man, Man: Medium Skin Tone
🏴 Flag for Saint Joseph (DM-06)
🏴 Flag for Saint John (AG-04)
🏴 Flag for Vichada (CO-VID)
🏴 Flag for Ngarchelong (PW-218)
🏴 Flag for Arkhangelsk (RU-ARK)
🏴 Flag for Zabul (AF-ZAB)
🏴 Flag for Saint George (AG-03)
🏴 Flag for Lombardy (IT-25)
👨🏻❤️💋👨🏻 Kiss - Man: Light Skin Tone, Man: Light Skin Tone
🏴 Flag for Pardubický kraj (CZ-53)
🏴 Flag for Saint Paul (AG-06)
🏴 Flag for Trà Vinh (VN-51)
👩👨👶👧 Family: Woman, Man, Baby, Girl
🏴 Flag for South Gyeongsang (KR-48)
🏴 Flag for Saint Mary (AG-05)
🏴 Flag for North Aegean (GR-K)
👩👩👶👧 Family: Woman, Woman, Baby, Girl
🏴 Flag for Zamora-Chinchipe (EC-Z)
🏴 Flag for Masaya (NI-MS)
🏴 Flag for Gilbert Islands (KI-G)
🏴 Flag for Chihuahua (MX-CHH)
👨🏼👨🏼👶🏼👧🏼 Family - Man: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
👨🏽👨🏽👶🏽👧🏽 Family - Man: Medium Skin Tone, Man: Medium Skin Tone, Baby: Medium Skin Tone, Girl: Medium Skin Tone
👩🏽👧🏽👧🏽 Family - Woman: Medium Skin Tone, Girl: Medium Skin Tone, Girl: Medium Skin Tone
👩👨👶👶 Family: Woman, Man, Baby, Baby
🏴 Flag for Redonda (AG-11)
👩👩👶 Family: Woman, Woman, Baby
👨❤️💋👩🏻 Kiss - Man, Woman: Light Skin Tone
👨❤️💋👨🏾 Kiss - Man, Man: Medium-Dark Skin Tone
🏴 Flag for Berat County (AL-01)
Tag Latin Small Letter A
🏴 Flag for Barbuda (AG-10)
🏴 Flag for San Andrés & Providencia (CO-SAP)
🏴 Flag for Elbasan County (AL-03)
👨🏾👨🏾👶🏾👧🏾 Family - Man: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
👨🏿👨🏿👦🏿👶🏿 Family - Man: Dark Skin Tone, Man: Dark Skin Tone, Boy: Dark Skin Tone, Baby: Dark Skin Tone
🏴 Flag for Karnataka (IN-KA)
🏴 Flag for Gjirokastër County (AL-05)
🏴 Flag for Hokkaidō (JP-01)
👩🏾👨🏾👶🏾👧🏾 Family - Woman: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
🏴 Flag for Central (UG-C)
👨🏼❤️💋👨 Kiss - Man: Medium-Light Skin Tone, Man
🏴 Flag for Durrës County (AL-02)
🏴 Flag for Fier County (AL-04)
🏴 Flag for Korçë County (AL-06)
🏴 Flag for Alto Paraguay (PY-16)
🏴 Flag for Kukës County (AL-07)
👨🏿❤️💋👨 Kiss - Man: Dark Skin Tone, Man
🏴 Flag for Upper Takutu-Upper Essequibo (GY-UT)
👨🏾👶🏾👧🏾 Family - Man: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
👨🏿👨🏿👶🏿👧🏿 Family - Man: Dark Skin Tone, Man: Dark Skin Tone, Baby: Dark Skin Tone, Girl: Dark Skin Tone
👨🏻👨🏻👶🏻👶🏻 Family - Man: Light Skin Tone, Man: Light Skin Tone, Baby: Light Skin Tone, Baby: Light Skin Tone
🏴 Flag for Dibër County (AL-09)
🏴 Flag for Lezhë County (AL-08)
👨🏼👨🏼👶🏼👶🏼 Family - Man: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
🏴 Flag for Tirana County (AL-11)
🏴 Flag for Sant Julià de Lòria (AD-06)
🏴 Flag for Bahia (BR-BA)
🏴 Flag for Shkodër County (AL-10)
👩❤️💋👨🏿 Kiss - Woman, Man: Dark Skin Tone
👨🏽👨🏽👶🏽👶🏽 Family - Man: Medium Skin Tone, Man: Medium Skin Tone, Baby: Medium Skin Tone, Baby: Medium Skin Tone
👨🏾👨🏾👶🏾👶🏾 Family - Man: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
👩❤️💋👨🏽 Kiss - Woman, Man: Medium Skin Tone
🏴 Flag for Vlorë County (AL-12)
🏴 Flag for Trat (TH-23)
🏴 Flag for Gegharkunik (AM-GR)
👨🏿👨🏿👶🏿👶🏿 Family - Man: Dark Skin Tone, Man: Dark Skin Tone, Baby: Dark Skin Tone, Baby: Dark Skin Tone
🏴 Flag for Aragatsotn (AM-AG)
🏴 Flag for Ararat (AM-AR)
🏴 Flag for Yerevan (AM-ER)
🏴 Flag for Kotayk (AM-KT)
🏴 Flag for Corse (FR-COR)
🏴 Flag for Armavir (AM-AV)
👩❤️💋👩🏿 Kiss - Woman, Woman: Dark Skin Tone
🏴 Flag for Minas Gerais (BR-MG)
🏴 Flag for Pointe-Noire (CG-16)
🏴 Flag for Lori (AM-LO)
🏴 Flag for Skikda (DZ-21)
🏴 Flag for Shirak (AM-SH)
👩❤️💋👩🏾 Kiss - Woman, Woman: Medium-Dark Skin Tone
🏴 Flag for Andorra la Vella (AD-07)
🏴 Flag for Altai Krai (RU-ALT)
🏴 Flag for Lovrenc na Pohorju (SI-167)
👩❤️💋👩🏼 Kiss - Woman, Woman: Medium-Light Skin Tone
👨🏿❤️💋👩🏻 Kiss - Man: Dark Skin Tone, Woman: Light Skin Tone
🏴 Flag for Panevėžys County (LT-PN)
🏴 Flag for Cibao Norte (DO-35)
🏴 Flag for Vest-Agder (NO-10)
👨❤️💋👩🏿 Kiss - Man, Woman: Dark Skin Tone
🏴 Flag for Vayots Dzor (AM-VD)
👩🏻❤️💋👩🏻 Kiss - Woman: Light Skin Tone, Woman: Light Skin Tone
🏴 Flag for Vermont (US-VT)
👨🏽❤️💋👨 Kiss - Man: Medium Skin Tone, Man
🏴 Flag for Bengo (AO-BGO)
👩🏻❤️💋👩 Kiss - Woman: Light Skin Tone, Woman
🏴 Flag for Meta (CO-MET)
🏴 Flag for Saba (NL-BQ2)
👩🏽❤️💋👩🏼 Kiss - Woman: Medium Skin Tone, Woman: Medium-Light Skin Tone
👨🏽👩🏽👦🏽 Family - Man: Medium Skin Tone, Woman: Medium Skin Tone, Boy: Medium Skin Tone
🏴 Flag for Benguela (AO-BGU)
🏴 Flag for Sucre (CO-SUC)
🏴 Flag for Cuando Cubango (AO-CCU)
🏴 Flag for Madre de Dios (PE-MDD)
🏴 Flag for Vaud (CH-VD)
🏴 Flag for Bié (AO-BIE)
🏴 Flag for Cabinda (AO-CAB)
🏴 Flag for Huíla (AO-HUI)
🏴 Flag for Cuanza Sul (AO-CUS)
👨❤️💋👩🏽 Kiss - Man, Woman: Medium Skin Tone
👩👩👦👶 Family: Woman, Woman, Boy, Baby
🏴 Flag for Huambo (AO-HUA)
👨🏼❤️👩🏾 Couple With Heart - Man: Medium-Light Skin Tone, Woman: Medium-Dark Skin Tone
🏴 Flag for Kyrenia (CY-06)
👩🏼❤️💋👨🏻 Kiss - Woman: Medium-Light Skin Tone, Man: Light Skin Tone
🏴 Flag for Umm al-Quwain (AE-UQ)
🏴 Flag for Lunda Sul (AO-LSU)
🏴 Flag for Grand Cape Mount (LR-CM)
🏴 Flag for Lunda Norte (AO-LNO)
👩🏽❤️👨🏿 Couple With Heart - Woman: Medium Skin Tone, Man: Dark Skin Tone
👨🏾❤️👩🏾 Couple With Heart - Man: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone
🏴 Flag for Cuanza Norte (AO-CNO)
🏴 Flag for Malanje (AO-MAL)
👩🏼❤️💋👩 Kiss - Woman: Medium-Light Skin Tone, Woman
👨🏼👩🏼👦🏼👦🏼 Family - Man: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
🏴 Flag for Moxico (AO-MOX)
🏴 Flag for Namibe (AO-NAM)
👨🏾👩🏾👦🏾👦🏾 Family - Man: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
Tag Latin Small Letter K
🕴🏼♀️ Woman in Business Suit Levitating: Medium-Light Skin Tone
🏴 Flag for Salta (AR-A)
👨🏾👩🏾👦🏾 Family - Man: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
🏴 Flag for Lualaba (CD-LU)
🏴 Flag for Buenos Aires Province (AR-B)
👨🏿👩🏿👦🏿👦🏿 Family - Man: Dark Skin Tone, Woman: Dark Skin Tone, Boy: Dark Skin Tone, Boy: Dark Skin Tone
🏴 Flag for San Luis (AR-D)
🏴 Flag for Zaire (AO-ZAI)
🏴 Flag for Afyonkarahisar (TR-03)
0 Digit Zero
🏴 Flag for Quảng Trị (VN-25)
🕴🏿♀️ Woman in Business Suit Levitating: Dark Skin Tone
🏴 Flag for Uíge (AO-UIG)
👩🏾👧🏾👦🏾 Family - Woman: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
🏴 Flag for Zhytomyrshchyna (UA-18)
👨🏾❤️💋👨🏽 Kiss - Man: Medium-Dark Skin Tone, Man: Medium Skin Tone
🏴 Flag for Cesar (CO-CES)
🏴 Flag for Syunik (AM-SU)
🏴 Flag for Entre Ríos (AR-E)
👨🏿❤️💋👩 Kiss - Man: Dark Skin Tone, Woman
🏴 Flag for La Rioja (AR-F)
🏴 Flag for East Kazakhstan (KZ-VOS)
🏴 Flag for Maidan Wardak (AF-WAR)
🏴 Flag for San Juan (AR-J)
👩🏾👩🏾👧🏾 Family - Woman: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
🏴 Flag for Luanda (AO-LUA)
🏴 Flag for La Pampa (AR-L)
👩🏼❤️💋👩🏽 Kiss - Woman: Medium-Light Skin Tone, Woman: Medium Skin Tone
👨🏼👩🏼👦🏼👧🏼 Family - Man: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
👨🏼👩🏼👦🏼 Family - Man: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
🏴 Flag for Catamarca (AR-K)
🏴 Flag for Río Negro (AR-R)
🏴 Flag for Chaco (AR-H)
🏴 Flag for Formosa (AR-P)
🏴 Flag for Mendoza (AR-M)
🏴 Flag for Misiones (AR-N)
🏴 Flag for Neuquén (AR-Q)
👨🏽👩🏽👦🏽👧🏽 Family - Man: Medium Skin Tone, Woman: Medium Skin Tone, Boy: Medium Skin Tone, Girl: Medium Skin Tone
🏴 Flag for Tucumán (AR-T)
🏴 Flag for Santa Fe (AR-S)
🏴 Flag for Corrientes (AR-W)
🏴 Flag for Jujuy (AR-Y)
🏴 Flag for Tierra del Fuego (AR-V)
🏴 Flag for Chubut (AR-U)
🏴 Flag for Córdoba (AR-X)
🏴 Flag for Santa Cruz (AR-Z)
🏴 Flag for Santiago del Estero (AR-G)
🏴 Flag for Carinthia (AT-2)
🏴 Flag for Basel-Landschaft (CH-BL)
👩🏿👧🏿👧🏿 Family - Woman: Dark Skin Tone, Girl: Dark Skin Tone, Girl: Dark Skin Tone
👨🏻👩🏻👦🏻👶🏻 Family - Man: Light Skin Tone, Woman: Light Skin Tone, Boy: Light Skin Tone, Baby: Light Skin Tone
👩🏻👧🏻👶🏻 Family - Woman: Light Skin Tone, Girl: Light Skin Tone, Baby: Light Skin Tone
👨👨👦👧 Family: Man, Man, Boy, Girl
🏴 Flag for Lower Austria (AT-3)
👩👶👦 Family: Woman, Baby, Boy
🏴 Flag for Nouakchott Ouest (MR-13)
👨🏼👩🏼👦🏼👶🏼 Family - Man: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
🏴 Flag for Mbomou (CF-MB)
🏴 Flag for Styria (AT-6)
🏴 Flag for Ilocos (PH-01)
🏴 Flag for Tyrol (AT-7)
🏴 Flag for Guizhou (CN-52)
🏴 Flag for Xaisomboun (LA-XS)
🏴 Flag for Vorarlberg (AT-8)
👨🏼👨🏼👦🏼👧🏼 Family - Man: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
🏴 Flag for Salzburg (AT-5)
👨🏿👩🏿👦🏿👧🏿 Family - Man: Dark Skin Tone, Woman: Dark Skin Tone, Boy: Dark Skin Tone, Girl: Dark Skin Tone
👩👩👶👶 Family: Woman, Woman, Baby, Baby
👩👨👧👦 Family: Woman, Man, Girl, Boy
👩👨👧 Family: Woman, Man, Girl
👩👦👶 Family: Woman, Boy, Baby
🏴 Flag for New South Wales (AU-NSW)
👩👨👧👶 Family: Woman, Man, Girl, Baby
👩🏽👧🏽👶🏽 Family - Woman: Medium Skin Tone, Girl: Medium Skin Tone, Baby: Medium Skin Tone
🏴 Flag for Northern Territory (AU-NT)
👩🏿👧🏿👦🏿 Family - Woman: Dark Skin Tone, Girl: Dark Skin Tone, Boy: Dark Skin Tone
🏴 Flag for Queensland (AU-QLD)
2 Digit Two
👩👨👧👧 Family: Woman, Man, Girl, Girl
👩🏼👧🏼👶🏼 Family - Woman: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
🏴 Flag for Upper Austria (AT-4)
🏴 Flag for East Macedonia and Thrace (GR-A)
👨🏽👩🏽👦🏽👶🏽 Family - Man: Medium Skin Tone, Woman: Medium Skin Tone, Boy: Medium Skin Tone, Baby: Medium Skin Tone
👨🏾👩🏾👦🏾👶🏾 Family - Man: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
👨👶👧 Family: Man, Baby, Girl
👨🏻👩🏻👧🏻 Family - Man: Light Skin Tone, Woman: Light Skin Tone, Girl: Light Skin Tone
👨🏿👩🏿👦🏿👶🏿 Family - Man: Dark Skin Tone, Woman: Dark Skin Tone, Boy: Dark Skin Tone, Baby: Dark Skin Tone
👩👨👶 Family: Woman, Man, Baby
🏴 Flag for Nebraska (US-NE)
🏴 Flag for Agstafa (AZ-AGA)
🏴 Flag for Takhar (AF-TAK)
🏴 Flag for Western Australia (AU-WA)
🏴 Flag for Aghjabadi (AZ-AGC)
🏴 Flag for Astara (AZ-AST)
🏴 Flag for Balakan (AZ-BAL)
👩❤️💋👨🏼 Kiss - Woman, Man: Medium-Light Skin Tone
🏴 Flag for California (US-CA)
🏴 Flag for Agdash (AZ-AGS)
🏴 Flag for Baku (AZ-BA)
👨🏻❤️💋👩🏿 Kiss - Man: Light Skin Tone, Woman: Dark Skin Tone
🏴 Flag for Victoria (AU-VIC)
🏴 Flag for Agdam (AZ-AGM)
👨🏻👧🏻 Family - Man: Light Skin Tone, Girl: Light Skin Tone
🏴 Flag for Barda (AZ-BAR)
👨🏽👩🏽👧🏽 Family - Man: Medium Skin Tone, Woman: Medium Skin Tone, Girl: Medium Skin Tone
👩🏾👧🏾👶🏾 Family - Woman: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
🏴 Flag for Agsu (AZ-AGU)
🏴 Flag for Tanganyika (CD-TA)
👩🏻❤️👨🏼 Couple With Heart - Woman: Light Skin Tone, Man: Medium-Light Skin Tone
🏴 Flag for Bilasuvar (AZ-BIL)
🏴 Flag for Jalilabad (AZ-CAL)
🏴 Flag for Jabrayil (AZ-CAB)
🏴 Flag for Beylagan (AZ-BEY)
🏴 Flag for Novo Mesto (SI-085)
🏴 Flag for Niari (CG-9)
🏴 Flag for Dashkasan (AZ-DAS)
🏴 Flag for Fizuli (AZ-FUZ)
👩🏿❤️💋👨🏽 Kiss - Woman: Dark Skin Tone, Man: Medium Skin Tone
👨🏿❤️👨🏾 Couple With Heart - Man: Dark Skin Tone, Man: Medium-Dark Skin Tone
🏴 Flag for Goychay (AZ-GOY)
🏴 Flag for Goranboy (AZ-GOR)
🏴 Flag for Ganja (AZ-GA)
🏴 Flag for Umm Salal (QA-US)
🏴 Flag for Eastern (FJ-E)
🏴 Flag for Goygol (AZ-GYG)
🏴 Flag for Hajigabul (AZ-HAC)
👩🏿❤️💋👩 Kiss - Woman: Dark Skin Tone, Woman
🏴 Flag for Rēzekne Municipality (LV-077)
🏴 Flag for Australian Capital Territory (AU-ACT)
👨🏽❤️💋👩🏾 Kiss - Man: Medium Skin Tone, Woman: Medium-Dark Skin Tone
🏴 Flag for Federal Capital Territory (NG-FC)
🏴 Flag for Bryansk (RU-BRY)
🏴 Flag for Tavush (AM-TV)
🏴 Flag for Santo Domingo de los Tsáchilas (EC-SD)
👩🏼❤️👩 Couple With Heart - Woman: Medium-Light Skin Tone, Woman
🏴 Flag for Imishli (AZ-IMI)
🏴 Flag for Aşgabat (TM-S)
👨❤️👩🏾 Couple With Heart - Man, Woman: Medium-Dark Skin Tone
🏴 Flag for Sekong (LA-XE)
🏴 Flag for Gorj (RO-GJ)
👨🏻❤️👨 Couple With Heart - Man: Light Skin Tone, Man
🏴 Flag for Kurdamir (AZ-KUR)
👩🏻👨🏻👦🏻👧🏻 Family - Woman: Light Skin Tone, Man: Light Skin Tone, Boy: Light Skin Tone, Girl: Light Skin Tone
🏴 Flag for Kalbajar (AZ-KAL)
🏴 Flag for Gadabay (AZ-GAD)
🏴 Flag for Lachin (AZ-LAC)
🏴 Flag for Lankaran (AZ-LA)
🏴 Flag for Ho Chi Minh City (VN-SG)
🏴 Flag for Lerik (AZ-LER)
🏴 Flag for Mingachevir (AZ-MI)
👩🏾👨🏾👧🏾👧🏾 Family - Woman: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
🏴 Flag for Naftalan (AZ-NA)
🏴 Flag for Masally (AZ-MAS)
👨❤️👩 Couple With Heart - Man, Woman
🏴 Flag for Lankaran District (AZ-LAN)
👩🏼👨🏼👧🏼👧🏼 Family - Woman: Medium-Light Skin Tone, Man: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
👩🏽❤️💋👨🏾 Kiss - Woman: Medium Skin Tone, Man: Medium-Dark Skin Tone
👩🏿👧🏿👶🏿 Family - Woman: Dark Skin Tone, Girl: Dark Skin Tone, Baby: Dark Skin Tone
🏴 Flag for Neftchala (AZ-NEF)
🏴 Flag for Nakhchivan AR (AZ-NX)
🏴 Flag for Celje (SI-011)
🏴 Flag for Panevėžio Municipality (LT-32)
👩🏿❤️💋👩🏽 Kiss - Woman: Dark Skin Tone, Woman: Medium Skin Tone
👨🏻❤️👩🏿 Couple With Heart - Man: Light Skin Tone, Woman: Dark Skin Tone
🏴 Flag for Ismailli (AZ-ISM)
Tag Latin Small Letter H
👩🏾❤️👨🏻 Couple With Heart - Woman: Medium-Dark Skin Tone, Man: Light Skin Tone
👩🏻👶🏻 Family - Woman: Light Skin Tone, Baby: Light Skin Tone
🏴 Flag for Nana-Mambéré (CF-NM)
🏴 Flag for Gobustan (AZ-QOB)
👩🏿❤️💋👨🏻 Kiss - Woman: Dark Skin Tone, Man: Light Skin Tone
👩🏿❤️💋👩🏿 Kiss - Woman: Dark Skin Tone, Woman: Dark Skin Tone
🏴 Flag for Qubadli (AZ-QBI)
🏴 Flag for Qazakh (AZ-QAZ)
🏴 Flag for Braşov (RO-BV)
👨👩👧👶 Family: Man, Woman, Girl, Baby
🏴 Flag for Quba (AZ-QBA)
🏴 Flag for Qabala (AZ-QAB)
🏴 Flag for Uri (CH-UR)
🏴 Flag for Oghuz (AZ-OGU)
🏴 Flag for Qakh (AZ-QAX)
🏴 Flag for Šmarješke Toplice (SI-206)
👨🏾❤️💋👩🏿 Kiss - Man: Medium-Dark Skin Tone, Woman: Dark Skin Tone
🏴 Flag for Saint Peter (AG-07)
👨🏻👩🏻👧🏻👧🏻 Family - Man: Light Skin Tone, Woman: Light Skin Tone, Girl: Light Skin Tone, Girl: Light Skin Tone
🏴 Flag for Maryland (LR-MY)
🏴 Flag for South Australia (AU-SA)
🏴 Flag for Qusar (AZ-QUS)
🏴 Flag for Sabirabad (AZ-SAB)
👨❤️👩🏽 Couple With Heart - Man, Woman: Medium Skin Tone
👨❤️👩🏼 Couple With Heart - Man, Woman: Medium-Light Skin Tone
🏴 Flag for Saatly (AZ-SAT)
🏴 Flag for Shabran (AZ-SBN)
👨🏼❤️👩🏽 Couple With Heart - Man: Medium-Light Skin Tone, Woman: Medium Skin Tone
🏴 Flag for Shaki District (AZ-SAK)
🏴 Flag for Casanare (CO-CAS)
👨👩👶👶 Family: Man, Woman, Baby, Baby
🏴 Flag for Shirvan (AZ-SR)
🏴 Flag for Shusha (AZ-SUS)
🏴 Flag for Valais (CH-VS)
👩🏽👶🏽 Family - Woman: Medium Skin Tone, Baby: Medium Skin Tone
👩🏻❤️💋👨🏿 Kiss - Woman: Light Skin Tone, Man: Dark Skin Tone
🏴 Flag for Shaki (AZ-SA)
🏴 Flag for Martinique (FR-MQ)
🏴 Flag for Sumqayit (AZ-SM)
🏴 Flag for Siazan (AZ-SIY)
🏴 Flag for Shamakhi (AZ-SMI)
👩🏿❤️💋👨 Kiss - Woman: Dark Skin Tone, Man
🏴 Flag for Samukh (AZ-SMX)
👨🏻👩🏻👧🏻👶🏻 Family - Man: Light Skin Tone, Woman: Light Skin Tone, Girl: Light Skin Tone, Baby: Light Skin Tone
🏴 Flag for Tovuz (AZ-TOV)
🏴 Flag for Khachmaz (AZ-XAC)
🏴 Flag for Ujar (AZ-UCA)
🏴 Flag for Tartar (AZ-TAR)
👨🏿❤️💋👨🏻 Kiss - Man: Dark Skin Tone, Man: Light Skin Tone
👩🏼👧🏼👧🏼 Family - Woman: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
👨🏽👩🏽👧🏽👶🏽 Family - Man: Medium Skin Tone, Woman: Medium Skin Tone, Girl: Medium Skin Tone, Baby: Medium Skin Tone
🏴 Flag for Khizi (AZ-XIZ)
👨🏽❤️👨🏼 Couple With Heart - Man: Medium Skin Tone, Man: Medium-Light Skin Tone
🏴 Flag for Khojali (AZ-XCI)
🏴 Flag for Delta Amacuro (VE-Y)
🏴 Flag for Stepanakert (AZ-XA)
🏴 Flag for Yardymli (AZ-YAR)
🏴 Flag for Yevlakh District (AZ-YEV)
🏴 Flag for Zaqatala (AZ-ZAQ)
👩🏾👶🏾 Family - Woman: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
🏴 Flag for Yevlakh (AZ-YE)
🏴 Flag for Federation of Bosnia and Herzegovina (BA-BIH)
🏴 Flag for Zardab (AZ-ZAR)
🏴 Flag for Salyan (AZ-SAL)
🏴 Flag for Zug (CH-ZG)
👨🏾👩🏾👧🏾👶🏾 Family - Man: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
👨🏿👩🏿👧🏿👶🏿 Family - Man: Dark Skin Tone, Woman: Dark Skin Tone, Girl: Dark Skin Tone, Baby: Dark Skin Tone
👩🏿👶🏿 Family - Woman: Dark Skin Tone, Baby: Dark Skin Tone
🏴 Flag for Republika Srpska (BA-SRP)
👨🏽❤️👩 Couple With Heart - Man: Medium Skin Tone, Woman
👨🏻👩🏻👶🏻 Family - Man: Light Skin Tone, Woman: Light Skin Tone, Baby: Light Skin Tone
🏴 Flag for Andalusia (ES-AN)
👨🏼👩🏼👶🏼 Family - Man: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
🏴 Flag for Saint James (BB-04)
👨🏾❤️👩🏼 Couple With Heart - Man: Medium-Dark Skin Tone, Woman: Medium-Light Skin Tone
🏴 Flag for Saint George (BB-03)
🏴 Flag for Saint Andrew (BB-02)
👨👩👶👦 Family: Man, Woman, Baby, Boy
👨🏽👩🏽👶🏽 Family - Man: Medium Skin Tone, Woman: Medium Skin Tone, Baby: Medium Skin Tone
🏴 Flag for Saint John (BB-05)
👨🏾👩🏾👶🏾 Family - Man: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
🏴 Flag for Saint Joseph (BB-06)
🏴 Flag for Western (LK-1)
🏴 Flag for Brest (BY-BR)
🏴 Flag for Shamkir (AZ-SKR)
🏴 Flag for Saint Lucy (BB-07)
👩🏻👶🏻👦🏻 Family - Woman: Light Skin Tone, Baby: Light Skin Tone, Boy: Light Skin Tone
🏴 Flag for Castile-La Mancha (ES-CM)
🏴 Flag for Saint Philip (BB-10)
🏴 Flag for Saint George (VC-04)
👨🏻👩🏻👶🏻👦🏻 Family - Man: Light Skin Tone, Woman: Light Skin Tone, Baby: Light Skin Tone, Boy: Light Skin Tone
👩🏻👧🏻👧🏻 Family - Woman: Light Skin Tone, Girl: Light Skin Tone, Girl: Light Skin Tone
🏴 Flag for Barisal (BD-A)
🏴 Flag for Zangilan (AZ-ZAN)
🏴 Flag for Kingston (JM-01)
👨🏼👩🏼👶🏼👦🏼 Family - Man: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
🏴 Flag for Rajshahi Division (BD-E)
🏴 Flag for Rangpur Division (BD-F)
🏴 Flag for Dhaka Division (BD-C)
🏴 Flag for Khulna Division (BD-D)
🏴 Flag for Saint Peter (BB-09)
🏴 Flag for Lenart (SI-058)
👩🏼👶🏼 Family - Woman: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
🏴 Flag for Cascades (BF-02)
🏴 Flag for Mymensingh Division (BD-H)
🏴 Flag for Wallonia (BE-WAL)
🏴 Flag for Beau-Bassin Rose-Hill (MU-BR)
🏴 Flag for Centre-Est (BF-04)
🏴 Flag for Hong Kong SAR China (CN-91)
🏴 Flag for Boucle du Mouhoun (BF-01)
🏴 Flag for Centre (BF-03)
🏴 Flag for Central Denmark (DK-82)
🏴 Flag for Centre-Sud (BF-07)
👨🏽👩🏽👶🏽👦🏽 Family - Man: Medium Skin Tone, Woman: Medium Skin Tone, Baby: Medium Skin Tone, Boy: Medium Skin Tone
🏴 Flag for Centre-Ouest (BF-06)
🏴 Flag for Centre-Nord (BF-05)
🏴 Flag for Saint Michael (BB-08)
🏴 Flag for Saint Thomas (BB-11)
👨🏽❤️👩🏿 Couple With Heart - Man: Medium Skin Tone, Woman: Dark Skin Tone
🏴 Flag for Est (BF-08)
🏴 Flag for Brussels (BE-BRU)
🏴 Flag for Sylhet Division (BD-G)
🏴 Flag for Plateau-Central (BF-11)
🏴 Flag for Chittagong Division (BD-B)
🏴 Flag for Sud-Ouest (BF-13)
👨🏾👩🏾👶🏾👦🏾 Family - Man: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
🏴 Flag for Vidin (BG-05)
🏴 Flag for Varna (BG-03)
👨🏿❤️👩🏽 Couple With Heart - Man: Dark Skin Tone, Woman: Medium Skin Tone
🏴 Flag for Burgas (BG-02)
🏴 Flag for Nord (BF-10)
🏴 Flag for Veliko Tarnovo (BG-04)
👨🏽👩🏽👧🏽👧🏽 Family - Man: Medium Skin Tone, Woman: Medium Skin Tone, Girl: Medium Skin Tone, Girl: Medium Skin Tone
🏴 Flag for Gabrovo (BG-07)
👨🏿👩🏿👶🏿👦🏿 Family - Man: Dark Skin Tone, Woman: Dark Skin Tone, Baby: Dark Skin Tone, Boy: Dark Skin Tone
🏴 Flag for Dobrich (BG-08)
🏴 Flag for Sahel (BF-12)
🏴 Flag for Tasmania (AU-TAS)
👨🏿❤️👩🏻 Couple With Heart - Man: Dark Skin Tone, Woman: Light Skin Tone
👩🏻👧🏻👦🏻 Family - Woman: Light Skin Tone, Girl: Light Skin Tone, Boy: Light Skin Tone
👨🏻👩🏻👶🏻👧🏻 Family - Man: Light Skin Tone, Woman: Light Skin Tone, Baby: Light Skin Tone, Girl: Light Skin Tone
👨🏼👩🏼👶🏼👧🏼 Family - Man: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
👨🏾❤️💋👩🏾 Kiss - Man: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone
🏴 Flag for Khojavend (AZ-XVD)
🏴 Flag for Lovech (BG-11)
🏴 Flag for Libertador General Bernardo O’Higgins (CL-LI)
🏴 Flag for Pazardzhik (BG-13)
👨🏿❤️👩🏿 Couple With Heart - Man: Dark Skin Tone, Woman: Dark Skin Tone
🏴 Flag for Pernik (BG-14)
🏴 Flag for Kyustendil (BG-10)
🏴 Flag for Red Sea (EG-BA)
🏴 Flag for Zanzibar Central/South (TZ-11)
👨🏿👩🏿👧🏿👦🏿 Family - Man: Dark Skin Tone, Woman: Dark Skin Tone, Girl: Dark Skin Tone, Boy: Dark Skin Tone
🏴 Flag for Pleven (BG-15)
👨🏿👨🏿👦🏿👦🏿 Family - Man: Dark Skin Tone, Man: Dark Skin Tone, Boy: Dark Skin Tone, Boy: Dark Skin Tone
👨🏽👩🏽👶🏽👧🏽 Family - Man: Medium Skin Tone, Woman: Medium Skin Tone, Baby: Medium Skin Tone, Girl: Medium Skin Tone
🏴 Flag for Smolyan (BG-21)
🏴 Flag for Blagoevgrad (BG-01)
🏴 Flag for Bordj Bou Arréridj (DZ-34)
🏴 Flag for Plovdiv (BG-16)
🏴 Flag for Vallée du Bandama (CI-VB)
🏴 Flag for Silistra (BG-19)
👩❤️👨🏼 Couple With Heart - Woman, Man: Medium-Light Skin Tone
🏴 Flag for Razgrad (BG-17)
👨🏾❤️👨 Couple With Heart - Man: Medium-Dark Skin Tone, Man
🏴 Flag for Cunene (AO-CNN)
🏴 Flag for Sliven (BG-20)
🧕🏻♀️ Woman With Headscarf: Light Skin Tone
🏴 Flag for Targovishte (BG-25)
👩🏼👩🏼👶🏼👧🏼 Family - Woman: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
👨🏾👩🏾👶🏾👧🏾 Family - Man: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
🏴 Flag for Sofia District (BG-23)
🏴 Flag for Sofia (BG-22)
👨🏿👩🏿👧🏿👧🏿 Family - Man: Dark Skin Tone, Woman: Dark Skin Tone, Girl: Dark Skin Tone, Girl: Dark Skin Tone
👨🏻❤️💋👩🏾 Kiss - Man: Light Skin Tone, Woman: Medium-Dark Skin Tone
🧕🏽♀️ Woman With Headscarf: Medium Skin Tone
🏴 Flag for Yambol (BG-28)
🏴 Flag for Capital (BH-13)
🏴 Flag for Haskovo (BG-26)
🏴 Flag for Schaan (LI-07)
👨🏿👩🏿👶🏿👧🏿 Family - Man: Dark Skin Tone, Woman: Dark Skin Tone, Baby: Dark Skin Tone, Girl: Dark Skin Tone
🏴 Flag for Muharraq (BH-15)
🏴 Flag for Southern (BH-14)
🧕🏾♀️ Woman With Headscarf: Medium-Dark Skin Tone
🏴 Flag for Sibiu (RO-SB)
🧕🏼♀️ Woman With Headscarf: Medium-Light Skin Tone
👩🏻❤️👨🏿 Couple With Heart - Woman: Light Skin Tone, Man: Dark Skin Tone
🏴 Flag for Northern (BH-17)
🏴 Flag for Bubanza (BI-BB)
👩🏻❤️👩 Couple With Heart - Woman: Light Skin Tone, Woman
🏴 Flag for Flanders (BE-VLG)
👩🏽👧🏽 Family - Woman: Medium Skin Tone, Girl: Medium Skin Tone
👨🏻👩🏻👶🏻👶🏻 Family - Man: Light Skin Tone, Woman: Light Skin Tone, Baby: Light Skin Tone, Baby: Light Skin Tone
🏴 Flag for Bujumbura (BI-BM)
🧕🏿♀️ Woman With Headscarf: Dark Skin Tone
🏴 Flag for Bujumbura Rural (BI-BL)
👨🏾❤️💋👩🏽 Kiss - Man: Medium-Dark Skin Tone, Woman: Medium Skin Tone
👨🏼👩🏼👶🏼👶🏼 Family - Man: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
👨🏻👨🏻👦🏻👶🏻 Family - Man: Light Skin Tone, Man: Light Skin Tone, Boy: Light Skin Tone, Baby: Light Skin Tone
🏴 Flag for Cankuzo (BI-CA)
🏴 Flag for Montana (BG-12)
🏴 Flag for Sala (LV-085)
⃣ Combining Enclosing Keycap
🏴 Flag for Bururi (BI-BR)
🏴 Flag for Kardzhali (BG-09)
🏴 Flag for Rumonge (BI-RM)
🏴 Flag for Aruba (NL-AW)
🏴 Flag for Muyinga (BI-MY)
🏴 Flag for Rutana (BI-RT)
🏴 Flag for Ruyigi (BI-RY)
🏴 Flag for Kirundo (BI-KI)
🏴 Flag for Kayanza (BI-KY)
🏴 Flag for Mwaro (BI-MW)
🏴 Flag for Shumen (BG-27)
🏴 Flag for Ngozi (BI-NG)
🏴 Flag for Karuzi (BI-KR)
🏴 Flag for Muramvya (BI-MU)
🏴 Flag for Laâyoune-Boujdour-Sakia El Hamra (MA-15)
👨🏽👩🏽👶🏽👶🏽 Family - Man: Medium Skin Tone, Woman: Medium Skin Tone, Baby: Medium Skin Tone, Baby: Medium Skin Tone
👩🏾👨🏾👧🏾 Family - Woman: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
👨🏾👩🏾👶🏾👶🏾 Family - Man: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
🏴 Flag for Donga (BJ-DO)
👩🏽👨🏽👶🏽👦🏽 Family - Woman: Medium Skin Tone, Man: Medium Skin Tone, Baby: Medium Skin Tone, Boy: Medium Skin Tone
👨🏽❤️💋👩🏼 Kiss - Man: Medium Skin Tone, Woman: Medium-Light Skin Tone
🏴 Flag for Hauts-de-France (FR-HDF)
🏴 Flag for Alibori (BJ-AL)
🏴 Flag for Atakora (BJ-AK)
👨🏿👩🏿👶🏿👶🏿 Family - Man: Dark Skin Tone, Woman: Dark Skin Tone, Baby: Dark Skin Tone, Baby: Dark Skin Tone
🏴 Flag for Littoral (BJ-LI)
🏴 Flag for Borgou (BJ-BO)
👩👩👧👶 Family: Woman, Woman, Girl, Baby
🏴 Flag for North Dakota (US-ND)
👨🏼❤️💋👨🏾 Kiss - Man: Medium-Light Skin Tone, Man: Medium-Dark Skin Tone
🏴 Flag for Kouffo (BJ-KO)
🏴 Flag for Plateau (BJ-PL)
🏴 Flag for Carriacou and Petite Martinique (GD-10)
🏴 Flag for Zou (BJ-ZO)
👩🏼❤️👨🏻 Couple With Heart - Woman: Medium-Light Skin Tone, Man: Light Skin Tone
👩🏽❤️👨🏽 Couple With Heart - Woman: Medium Skin Tone, Man: Medium Skin Tone
👨🏽❤️👩🏼 Couple With Heart - Man: Medium Skin Tone, Woman: Medium-Light Skin Tone
👩🏽❤️👨🏻 Couple With Heart - Woman: Medium Skin Tone, Man: Light Skin Tone
🏴 Flag for Beqaa (LB-BI)
🏴 Flag for Temburong (BN-TE)
👩🏻👦🏻 Family - Woman: Light Skin Tone, Boy: Light Skin Tone
🏴 Flag for Tutong (BN-TU)
🏴 Flag for Brunei-Muara (BN-BM)
👨🏻👩🏻👦🏻👦🏻 Family - Man: Light Skin Tone, Woman: Light Skin Tone, Boy: Light Skin Tone, Boy: Light Skin Tone
🏴 Flag for Vratsa (BG-06)
👩🏽❤️👨🏼 Couple With Heart - Woman: Medium Skin Tone, Man: Medium-Light Skin Tone
🏴 Flag for Beni (BO-B)
🏴 Flag for Belait (BN-BE)
👩🏼❤️👨 Couple With Heart - Woman: Medium-Light Skin Tone, Man
🏴 Flag for Ouémé (BJ-OU)
👩🏼👦🏼 Family - Woman: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
🏴 Flag for Roche Caiman (SC-25)
👩🏻❤️👨🏾 Couple With Heart - Woman: Light Skin Tone, Man: Medium-Dark Skin Tone
🏴 Flag for Cochabamba (BO-C)
👨🏾👩🏾👧🏾👦🏾 Family - Man: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
🏴 Flag for Pando (BO-N)
👩🏽❤️👩🏻 Couple With Heart - Woman: Medium Skin Tone, Woman: Light Skin Tone
👩🏾❤️👨🏽 Couple With Heart - Woman: Medium-Dark Skin Tone, Man: Medium Skin Tone
🏴 Flag for Chuquisaca (BO-H)
🏴 Flag for La Paz (BO-L)
🏴 Flag for Khentii (MN-039)
🕴🏽♀️ Woman in Business Suit Levitating: Medium Skin Tone
🏴 Flag for Dolneni (MK-27)
🏴 Flag for Stara Zagora (BG-24)
👩🏽👦🏽 Family - Woman: Medium Skin Tone, Boy: Medium Skin Tone
🏴 Flag for Sistan and Baluchestan (IR-13)
👩🏾❤️👨🏼 Couple With Heart - Woman: Medium-Dark Skin Tone, Man: Medium-Light Skin Tone
🏴 Flag for Potosí (BO-P)
🏴 Flag for Bonaire (BQ-BO)
👩❤️💋👨🏻 Kiss - Woman, Man: Light Skin Tone
👩🏾❤️👨 Couple With Heart - Woman: Medium-Dark Skin Tone, Man
👩🏼👦🏼👦🏼 Family - Woman: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
🏴 Flag for Brčko District (BA-BRC)
🏴 Flag for Saba (BQ-SA)
👩🏽❤️👨🏾 Couple With Heart - Woman: Medium Skin Tone, Man: Medium-Dark Skin Tone
👩🏾❤️👨🏿 Couple With Heart - Woman: Medium-Dark Skin Tone, Man: Dark Skin Tone
🏴 Flag for Acre (BR-AC)
🏴 Flag for Gitega (BI-GI)
👩🏿👦🏿 Family - Woman: Dark Skin Tone, Boy: Dark Skin Tone
👩🏿❤️👨🏻 Couple With Heart - Woman: Dark Skin Tone, Man: Light Skin Tone
🏴 Flag for Amazonas (BR-AM)
🏴 Flag for Buenos Aires (AR-C)
👨🏼👩🏼👧🏼👶🏼 Family - Man: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
👨🏼❤️💋👨🏼 Kiss - Man: Medium-Light Skin Tone, Man: Medium-Light Skin Tone
🏴 Flag for Espírito Santo (BR-ES)
👨🏿❤️💋👨🏾 Kiss - Man: Dark Skin Tone, Man: Medium-Dark Skin Tone
👨🏼❤️💋👨🏽 Kiss - Man: Medium-Light Skin Tone, Man: Medium Skin Tone
👩🏾👦🏾👦🏾 Family - Woman: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
👨🏻❤️👩 Couple With Heart - Man: Light Skin Tone, Woman
👨🏿❤️💋👩🏾 Kiss - Man: Dark Skin Tone, Woman: Medium-Dark Skin Tone
👩🏻❤️💋👩🏽 Kiss - Woman: Light Skin Tone, Woman: Medium Skin Tone
👨🏼❤️💋👨🏿 Kiss - Man: Medium-Light Skin Tone, Man: Dark Skin Tone
👩🏽👦🏽👦🏽 Family - Woman: Medium Skin Tone, Boy: Medium Skin Tone, Boy: Medium Skin Tone
👩🏿❤️👩🏼 Couple With Heart - Woman: Dark Skin Tone, Woman: Medium-Light Skin Tone
🏴 Flag for Maranhão (BR-MA)
👩🏿❤️👩🏽 Couple With Heart - Woman: Dark Skin Tone, Woman: Medium Skin Tone
👩🏿❤️👩 Couple With Heart - Woman: Dark Skin Tone, Woman
🏴 Flag for Amapá (BR-AP)
👨🏽❤️👨🏻 Couple With Heart - Man: Medium Skin Tone, Man: Light Skin Tone
👩🏻❤️💋👨🏻 Kiss - Woman: Light Skin Tone, Man: Light Skin Tone
👨🏽❤️💋👨🏽 Kiss - Man: Medium Skin Tone, Man: Medium Skin Tone
👩🏿❤️💋👩🏻 Kiss - Woman: Dark Skin Tone, Woman: Light Skin Tone
👨🏽❤️💋👩🏿 Kiss - Man: Medium Skin Tone, Woman: Dark Skin Tone
👩🏼❤️💋👨🏾 Kiss - Woman: Medium-Light Skin Tone, Man: Medium-Dark Skin Tone
👨🏿❤️💋👨🏼 Kiss - Man: Dark Skin Tone, Man: Medium-Light Skin Tone
👨🏾❤️💋👨🏿 Kiss - Man: Medium-Dark Skin Tone, Man: Dark Skin Tone
👩🏽❤️💋👩🏿 Kiss - Woman: Medium Skin Tone, Woman: Dark Skin Tone
👩🏼❤️💋👨🏿 Kiss - Woman: Medium-Light Skin Tone, Man: Dark Skin Tone
👨🏽❤️💋👩🏽 Kiss - Man: Medium Skin Tone, Woman: Medium Skin Tone
👨🏾❤️💋👨🏼 Kiss - Man: Medium-Dark Skin Tone, Man: Medium-Light Skin Tone
👨🏽❤️💋👩🏻 Kiss - Man: Medium Skin Tone, Woman: Light Skin Tone
👨🏾❤️💋👨 Kiss - Man: Medium-Dark Skin Tone, Man
👨🏾❤️💋👨🏾 Kiss - Man: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone
👩❤️💋👨🏾 Kiss - Woman, Man: Medium-Dark Skin Tone
👩❤️💋👩🏻 Kiss - Woman, Woman: Light Skin Tone
👩🏽❤️💋👨🏻 Kiss - Woman: Medium Skin Tone, Man: Light Skin Tone
👩🏿❤️💋👨🏿 Kiss - Woman: Dark Skin Tone, Man: Dark Skin Tone
👩🏻❤️💋👩🏿 Kiss - Woman: Light Skin Tone, Woman: Dark Skin Tone
👩🏻❤️💋👩🏼 Kiss - Woman: Light Skin Tone, Woman: Medium-Light Skin Tone
👩🏾❤️💋👩🏿 Kiss - Woman: Medium-Dark Skin Tone, Woman: Dark Skin Tone
👩🏾❤️💋👩 Kiss - Woman: Medium-Dark Skin Tone, Woman
👩🏾❤️💋👩🏻 Kiss - Woman: Medium-Dark Skin Tone, Woman: Light Skin Tone
👩🏻❤️👨 Couple With Heart - Woman: Light Skin Tone, Man
👩🏻👩🏻👦🏻 Family - Woman: Light Skin Tone, Woman: Light Skin Tone, Boy: Light Skin Tone
👩🏾❤️💋👨🏾 Kiss - Woman: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone
👨🏻❤️👨🏽 Couple With Heart - Man: Light Skin Tone, Man: Medium Skin Tone
🏴 Flag for Mato Grosso (BR-MT)
👨🏽❤️👩🏻 Couple With Heart - Man: Medium Skin Tone, Woman: Light Skin Tone
👨❤️👨🏿 Couple With Heart - Man, Man: Dark Skin Tone
👩🏿❤️💋👨🏼 Kiss - Woman: Dark Skin Tone, Man: Medium-Light Skin Tone
👩🏿❤️💋👩🏾 Kiss - Woman: Dark Skin Tone, Woman: Medium-Dark Skin Tone
👩🏻👦🏻👧🏻 Family - Woman: Light Skin Tone, Boy: Light Skin Tone, Girl: Light Skin Tone
🏴 Flag for Santa Cruz (BO-S)
👨🏻❤️👩🏽 Couple With Heart - Man: Light Skin Tone, Woman: Medium Skin Tone
👨🏽❤️👩🏽 Couple With Heart - Man: Medium Skin Tone, Woman: Medium Skin Tone
👩🏾❤️💋👩🏽 Kiss - Woman: Medium-Dark Skin Tone, Woman: Medium Skin Tone
🏴 Flag for Collines (BJ-CO)
👨🏻👩🏻👦🏻 Family - Man: Light Skin Tone, Woman: Light Skin Tone, Boy: Light Skin Tone
👨❤️👨🏽 Couple With Heart - Man, Man: Medium Skin Tone
👨🏾👩🏾👦🏾👧🏾 Family - Man: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
👨🏼❤️👨 Couple With Heart - Man: Medium-Light Skin Tone, Man
👨🏾❤️👩🏽 Couple With Heart - Man: Medium-Dark Skin Tone, Woman: Medium Skin Tone
🏴 Flag for Pará (BR-PA)
👩🏽👦🏽👧🏽 Family - Woman: Medium Skin Tone, Boy: Medium Skin Tone, Girl: Medium Skin Tone
👨🏼❤️👨🏼 Couple With Heart - Man: Medium-Light Skin Tone, Man: Medium-Light Skin Tone
👨🏿❤️👨🏻 Couple With Heart - Man: Dark Skin Tone, Man: Light Skin Tone
👩🏽❤️👩🏽 Couple With Heart - Woman: Medium Skin Tone, Woman: Medium Skin Tone
👨🏾❤️👨🏽 Couple With Heart - Man: Medium-Dark Skin Tone, Man: Medium Skin Tone
👨🏽❤️👨🏽 Couple With Heart - Man: Medium Skin Tone, Man: Medium Skin Tone
👨🏻❤️👩🏼 Couple With Heart - Man: Light Skin Tone, Woman: Medium-Light Skin Tone
👨🏾❤️👩🏿 Couple With Heart - Man: Medium-Dark Skin Tone, Woman: Dark Skin Tone
👨🏾❤️👨🏼 Couple With Heart - Man: Medium-Dark Skin Tone, Man: Medium-Light Skin Tone
👩🏾❤️💋👩🏾 Kiss - Woman: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone
👩🏼❤️👩🏻 Couple With Heart - Woman: Medium-Light Skin Tone, Woman: Light Skin Tone
👨🏿❤️👩🏼 Couple With Heart - Man: Dark Skin Tone, Woman: Medium-Light Skin Tone
👨🏼❤️👨🏾 Couple With Heart - Man: Medium-Light Skin Tone, Man: Medium-Dark Skin Tone
👨🏽❤️👨🏾 Couple With Heart - Man: Medium Skin Tone, Man: Medium-Dark Skin Tone
👩❤️👨🏾 Couple With Heart - Woman, Man: Medium-Dark Skin Tone
🏴 Flag for Alagoas (BR-AL)
👩❤️👨🏻 Couple With Heart - Woman, Man: Light Skin Tone
🏴 Flag for Hauts-Bassins (BF-09)
👨🏼👦🏼 Family - Man: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
👩🏾❤️👩🏾 Couple With Heart - Woman: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone
🏴 Flag for Rio de Janeiro (BR-RJ)
👨🏾❤️💋👩🏻 Kiss - Man: Medium-Dark Skin Tone, Woman: Light Skin Tone
🏴 Flag for Rondônia (BR-RO)
👨🏾❤️👨🏿 Couple With Heart - Man: Medium-Dark Skin Tone, Man: Dark Skin Tone
👨🏽👦🏽 Family - Man: Medium Skin Tone, Boy: Medium Skin Tone
👨🏼❤️👨🏽 Couple With Heart - Man: Medium-Light Skin Tone, Man: Medium Skin Tone
🏴 Flag for Piauí (BR-PI)
👨🏽👩🏽👦🏽👦🏽 Family - Man: Medium Skin Tone, Woman: Medium Skin Tone, Boy: Medium Skin Tone, Boy: Medium Skin Tone
🏴 Flag for Rio Grande do Norte (BR-RN)
👩🏻❤️👨🏻 Couple With Heart - Woman: Light Skin Tone, Man: Light Skin Tone
👨🏻👦🏻 Family - Man: Light Skin Tone, Boy: Light Skin Tone
👩🏼❤️👩🏾 Couple With Heart - Woman: Medium-Light Skin Tone, Woman: Medium-Dark Skin Tone
👨🏿❤️👩🏾 Couple With Heart - Man: Dark Skin Tone, Woman: Medium-Dark Skin Tone
🏴 Flag for Sergipe (BR-SE)
🏴 Flag for Paraná (BR-PR)
👨🏿👦🏿 Family - Man: Dark Skin Tone, Boy: Dark Skin Tone
👩🏼❤️👩🏽 Couple With Heart - Woman: Medium-Light Skin Tone, Woman: Medium Skin Tone
👩🏾❤️👩🏼 Couple With Heart - Woman: Medium-Dark Skin Tone, Woman: Medium-Light Skin Tone
🏴 Flag for Moscow Province (RU-MOS)
👩🏽❤️💋👩🏽 Kiss - Woman: Medium Skin Tone, Woman: Medium Skin Tone
👩🏿👦🏿👦🏿 Family - Woman: Dark Skin Tone, Boy: Dark Skin Tone, Boy: Dark Skin Tone
🏴 Flag for São Paulo (BR-SP)
🏴 Flag for East Azerbaijan (IR-01)
🏴 Flag for Rio Grande do Sul (BR-RS)
👩🏼❤️👨🏿 Couple With Heart - Woman: Medium-Light Skin Tone, Man: Dark Skin Tone
🏴 Flag for Sogn og Fjordane (NO-14)
🏴 Flag for Tocantins (BR-TO)
🏴 Flag for Sveti Andraž v Slovenskih Goricah (SI-182)
👨🏼❤️👩🏻 Couple With Heart - Man: Medium-Light Skin Tone, Woman: Light Skin Tone
👨🏿❤️👨🏽 Couple With Heart - Man: Dark Skin Tone, Man: Medium Skin Tone
👨🏽👦🏽👦🏽 Family - Man: Medium Skin Tone, Boy: Medium Skin Tone, Boy: Medium Skin Tone
👨🏿👦🏿👦🏿 Family - Man: Dark Skin Tone, Boy: Dark Skin Tone, Boy: Dark Skin Tone
🏴 Flag for Bimini (BS-BI)
👨🏿❤️👩 Couple With Heart - Man: Dark Skin Tone, Woman
👩🏻👦🏻👦🏻 Family - Woman: Light Skin Tone, Boy: Light Skin Tone, Boy: Light Skin Tone
🏴 Flag for Roraima (BR-RR)
🏴 Flag for Oruro (BO-O)
🏴 Flag for Exuma (BS-EX)
👨🏾👦🏾 Family - Man: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
👩🏽❤️👨 Couple With Heart - Woman: Medium Skin Tone, Man
🏴 Flag for Central Eleuthera (BS-CE)
🏴 Flag for Berry Islands (BS-BY)
🏴 Flag for Makamba (BI-MA)
🏴 Flag for Federal District (BR-DF)
👩🏻❤️👩🏾 Couple With Heart - Woman: Light Skin Tone, Woman: Medium-Dark Skin Tone
👨🏼❤️💋👩🏼 Kiss - Man: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone
🏴 Flag for Central Abaco (BS-CO)
🏴 Flag for East Grand Bahama (BS-EG)
🏴 Flag for Central Andros (BS-CS)
👨🏻👦🏻👧🏻 Family - Man: Light Skin Tone, Boy: Light Skin Tone, Girl: Light Skin Tone
🏴 Flag for Crooked Island (BS-CK)
🏴 Flag for Black Point (BS-BP)
👨🏼👦🏼👧🏼 Family - Man: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
👨🏽👦🏽👧🏽 Family - Man: Medium Skin Tone, Boy: Medium Skin Tone, Girl: Medium Skin Tone
👩🏿❤️👨🏾 Couple With Heart - Woman: Dark Skin Tone, Man: Medium-Dark Skin Tone
👩🏾❤️💋👨🏼 Kiss - Woman: Medium-Dark Skin Tone, Man: Medium-Light Skin Tone
🏴 Flag for North Eleuthera (BS-NE)
🏴 Flag for North Abaco (BS-NO)
🏴 Flag for Mayaguana (BS-MG)
👨🏾👦🏾👧🏾 Family - Man: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
👨🏼❤️💋👩🏻 Kiss - Man: Medium-Light Skin Tone, Woman: Light Skin Tone
🏴 Flag for Grand Cay (BS-GC)
🏴 Flag for Freeport (BS-FP)
🏴 Flag for Inagua (BS-IN)
🏴 Flag for Hope Town (BS-HT)
👩🏾❤️👩🏿 Couple With Heart - Woman: Medium-Dark Skin Tone, Woman: Dark Skin Tone
🏴 Flag for Long Island (BS-LI)
👨🏿👦🏿👧🏿 Family - Man: Dark Skin Tone, Boy: Dark Skin Tone, Girl: Dark Skin Tone
👨🏾❤️👩 Couple With Heart - Man: Medium-Dark Skin Tone, Woman
👩🏿❤️👨🏿 Couple With Heart - Woman: Dark Skin Tone, Man: Dark Skin Tone
👨🏻👦🏻👶🏻 Family - Man: Light Skin Tone, Boy: Light Skin Tone, Baby: Light Skin Tone
👨👨👶 Family: Man, Man, Baby
👩👧👶 Family: Woman, Girl, Baby
👨👦👶 Family: Man, Boy, Baby
👨👨👶👦 Family: Man, Man, Baby, Boy
👨👦👧 Family: Man, Boy, Girl
👨👶👶 Family: Man, Baby, Baby
🏴 Flag for Ragged Island (BS-RI)
👩🏿❤️👩🏿 Couple With Heart - Woman: Dark Skin Tone, Woman: Dark Skin Tone
👩🏿❤️👨🏽 Couple With Heart - Woman: Dark Skin Tone, Man: Medium Skin Tone
👩🏼❤️👨🏼 Couple With Heart - Woman: Medium-Light Skin Tone, Man: Medium-Light Skin Tone
🏴 Flag for North Andros (BS-NS)
👩🏿❤️👩🏻 Couple With Heart - Woman: Dark Skin Tone, Woman: Light Skin Tone
👨🏻❤️💋👨 Kiss - Man: Light Skin Tone, Man
🏴 Flag for South Andros (BS-SA)
👨🏻❤️💋👨🏼 Kiss - Man: Light Skin Tone, Man: Medium-Light Skin Tone
🏴 Flag for South Eleuthera (BS-SE)
👨🏼👦🏼👶🏼 Family - Man: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
👨🏻❤️💋👩🏻 Kiss - Man: Light Skin Tone, Woman: Light Skin Tone
👨🏼❤️💋👩🏾 Kiss - Man: Medium-Light Skin Tone, Woman: Medium-Dark Skin Tone
👨🏾❤️💋👩🏼 Kiss - Man: Medium-Dark Skin Tone, Woman: Medium-Light Skin Tone
👨🏾❤️💋👨🏻 Kiss - Man: Medium-Dark Skin Tone, Man: Light Skin Tone
🏴 Flag for Santa Catarina (BR-SC)
👩👩👦👧 Family: Woman, Woman, Boy, Girl
👨❤️💋👩🏾 Kiss - Man, Woman: Medium-Dark Skin Tone
🏴 Flag for Rum Cay (BS-RC)
👩👩👶👦 Family: Woman, Woman, Baby, Boy
👨🏻❤️💋👩🏽 Kiss - Man: Light Skin Tone, Woman: Medium Skin Tone
🏴 Flag for Cat Island (BS-CI)
👩🏽❤️👩 Couple With Heart - Woman: Medium Skin Tone, Woman
👨🏽👦🏽👶🏽 Family - Man: Medium Skin Tone, Boy: Medium Skin Tone, Baby: Medium Skin Tone
👩👨👦👶 Family: Woman, Man, Boy, Baby
👨🏾❤️💋👩 Kiss - Man: Medium-Dark Skin Tone, Woman
👨❤️💋👨🏻 Kiss - Man, Man: Light Skin Tone
👨🏻❤️💋👨🏿 Kiss - Man: Light Skin Tone, Man: Dark Skin Tone
👨🏼❤️💋👩🏽 Kiss - Man: Medium-Light Skin Tone, Woman: Medium Skin Tone
👨🏾👦🏾👶🏾 Family - Man: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
🏴 Flag for South Abaco (BS-SO)
👩🏾❤️💋👩🏼 Kiss - Woman: Medium-Dark Skin Tone, Woman: Medium-Light Skin Tone
👨🏻❤️👨🏿 Couple With Heart - Man: Light Skin Tone, Man: Dark Skin Tone
👨🏿❤️💋👨🏿 Kiss - Man: Dark Skin Tone, Man: Dark Skin Tone
👩🏾❤️💋👨🏿 Kiss - Woman: Medium-Dark Skin Tone, Man: Dark Skin Tone
👩🏼❤️💋👨🏽 Kiss - Woman: Medium-Light Skin Tone, Man: Medium Skin Tone
👩🏾❤️💋👨🏻 Kiss - Woman: Medium-Dark Skin Tone, Man: Light Skin Tone
👩🏽❤️💋👨 Kiss - Woman: Medium Skin Tone, Man
👨👧👶 Family: Man, Girl, Baby
👩🏻❤️💋👨🏾 Kiss - Woman: Light Skin Tone, Man: Medium-Dark Skin Tone
👨❤️👨🏼 Couple With Heart - Man, Man: Medium-Light Skin Tone
👩🏼❤️💋👩🏼 Kiss - Woman: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone
👨🏿❤️💋👩🏿 Kiss - Man: Dark Skin Tone, Woman: Dark Skin Tone
👨❤️💋👩🏼 Kiss - Man, Woman: Medium-Light Skin Tone
🏴 Flag for Abidjan (CI-AB)
👩🏻❤️💋👨 Kiss - Woman: Light Skin Tone, Man
👩🏼❤️💋👩🏾 Kiss - Woman: Medium-Light Skin Tone, Woman: Medium-Dark Skin Tone
👨🏻❤️💋👩🏼 Kiss - Man: Light Skin Tone, Woman: Medium-Light Skin Tone
👩🏽❤️💋👨🏿 Kiss - Woman: Medium Skin Tone, Man: Dark Skin Tone
👩🏿❤️💋👩🏼 Kiss - Woman: Dark Skin Tone, Woman: Medium-Light Skin Tone
👩🏿❤️💋👨🏾 Kiss - Woman: Dark Skin Tone, Man: Medium-Dark Skin Tone
👩🏼❤️💋👨 Kiss - Woman: Medium-Light Skin Tone, Man
👩❤️👩🏾 Couple With Heart - Woman, Woman: Medium-Dark Skin Tone
👨🏿❤️👨🏼 Couple With Heart - Man: Dark Skin Tone, Man: Medium-Light Skin Tone
👨🏿👦🏿👶🏿 Family - Man: Dark Skin Tone, Boy: Dark Skin Tone, Baby: Dark Skin Tone
👨🏼❤️👩🏼 Couple With Heart - Man: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone
👩🏼❤️👨🏽 Couple With Heart - Woman: Medium-Light Skin Tone, Man: Medium Skin Tone
🏴 Flag for Spanish Wells (BS-SW)
👨🏿❤️👨🏿 Couple With Heart - Man: Dark Skin Tone, Man: Dark Skin Tone
👨🏼❤️👨🏿 Couple With Heart - Man: Medium-Light Skin Tone, Man: Dark Skin Tone
👨🏼❤️👩 Couple With Heart - Man: Medium-Light Skin Tone, Woman
👩🏼❤️👩🏼 Couple With Heart - Woman: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone
👨🏼❤️👨🏻 Couple With Heart - Man: Medium-Light Skin Tone, Man: Light Skin Tone
👨🏾❤️👨🏾 Couple With Heart - Man: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone
👩❤️👩🏼 Couple With Heart - Woman, Woman: Medium-Light Skin Tone
👨🏼❤️👩🏿 Couple With Heart - Man: Medium-Light Skin Tone, Woman: Dark Skin Tone
👨🏻❤️👨🏾 Couple With Heart - Man: Light Skin Tone, Man: Medium-Dark Skin Tone
👨🏽❤️👩🏾 Couple With Heart - Man: Medium Skin Tone, Woman: Medium-Dark Skin Tone
👩❤️👩🏿 Couple With Heart - Woman, Woman: Dark Skin Tone
👨🏽❤️👨🏿 Couple With Heart - Man: Medium Skin Tone, Man: Dark Skin Tone
👨👨👦👶 Family: Man, Man, Boy, Baby
👨🏿❤️👨 Couple With Heart - Man: Dark Skin Tone, Man
👩🏻❤️👩🏿 Couple With Heart - Woman: Light Skin Tone, Woman: Dark Skin Tone
🏴 Flag for San Salvador (BS-SS)
🏴 Flag for Samtse (BT-14)
👩🏻❤️👨🏽 Couple With Heart - Woman: Light Skin Tone, Man: Medium Skin Tone
👩🏼❤️👩🏿 Couple With Heart - Woman: Medium-Light Skin Tone, Woman: Dark Skin Tone
👨❤️👩🏿 Couple With Heart - Man, Woman: Dark Skin Tone
🏴 Flag for Paro (BT-11)
👨🏻❤️👩🏾 Couple With Heart - Man: Light Skin Tone, Woman: Medium-Dark Skin Tone
👨🏼👧🏼 Family - Man: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
🏴 Flag for Thimphu (BT-15)
👩🏾❤️👩🏽 Couple With Heart - Woman: Medium-Dark Skin Tone, Woman: Medium Skin Tone
🏴 Flag for West Grand Bahama (BS-WG)
🏴 Flag for Haa (BT-13)
🏴 Flag for Chukha (BT-12)
👨🏻❤️💋👨🏽 Kiss - Man: Light Skin Tone, Man: Medium Skin Tone
👨🏾👧🏾 Family - Man: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
👨🏽👧🏽 Family - Man: Medium Skin Tone, Girl: Medium Skin Tone
🏴 Flag for Acklins (BS-AK)
🏴 Flag for Trongsa (BT-32)
🏴 Flag for Trashigang (BT-41)
🏴 Flag for Punakha (BT-23)
🏴 Flag for Wangdue Phodrang (BT-24)
🏴 Flag for Bumthang (BT-33)
🏴 Flag for Zhemgang (BT-34)
👩🏼❤️💋👨🏼 Kiss - Woman: Medium-Light Skin Tone, Man: Medium-Light Skin Tone
🏴 Flag for Mongar (BT-42)
🏴 Flag for Paraíba (BR-PB)
👩🏿❤️👨🏼 Couple With Heart - Woman: Dark Skin Tone, Man: Medium-Light Skin Tone
🏴 Flag for Zürich (CH-ZH)
🏴 Flag for Sarpang (BT-31)
🏴 Flag for Dagana (BT-22)
👩🏻❤️💋👨🏽 Kiss - Woman: Light Skin Tone, Man: Medium Skin Tone
👨🏿👨🏿👧🏿👧🏿 Family - Man: Dark Skin Tone, Man: Dark Skin Tone, Girl: Dark Skin Tone, Girl: Dark Skin Tone
🏴 Flag for Central (BW-CE)
🏴 Flag for Gasa (BT-GA)
🏴 Flag for Chobe (BW-CH)
🏴 Flag for Samdrup Jongkhar (BT-45)
🏴 Flag for Francistown (BW-FR)
🏴 Flag for Lhuntse (BT-44)
🏴 Flag for Trashiyangtse (BT-TY)
🏴 Flag for Tsirang (BT-21)
🏴 Flag for Pemagatshel (BT-43)
👨🏿👧🏿 Family - Man: Dark Skin Tone, Girl: Dark Skin Tone
🏴 Flag for North East (BW-NE)
🏴 Flag for Kgatleng (BW-KL)
🏴 Flag for Kgalagadi (BW-KG)
🏴 Flag for South East (BW-SE)
🏴 Flag for Kweneng (BW-KW)
👨🏻👧🏻👦🏻 Family - Man: Light Skin Tone, Girl: Light Skin Tone, Boy: Light Skin Tone
🏴 Flag for North West (BW-NW)
🏴 Flag for Jwaneng (BW-JW)
🏴 Flag for Mangrove Cay (BS-MC)
👩🏼❤️💋👩🏿 Kiss - Woman: Medium-Light Skin Tone, Woman: Dark Skin Tone
🏴 Flag for Ghanzi (BW-GH)
👨🏻❤️👩🏻 Couple With Heart - Man: Light Skin Tone, Woman: Light Skin Tone
🏴 Flag for Atlantique (BJ-AQ)
👨🏼👧🏼👦🏼 Family - Man: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
👨🏾👧🏾👦🏾 Family - Man: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
👨🏿👧🏿👦🏿 Family - Man: Dark Skin Tone, Girl: Dark Skin Tone, Boy: Dark Skin Tone
🏴 Flag for Southern (BW-SO)
👨🏽👧🏽👦🏽 Family - Man: Medium Skin Tone, Girl: Medium Skin Tone, Boy: Medium Skin Tone
👩🏾❤️👩 Couple With Heart - Woman: Medium-Dark Skin Tone, Woman
👨👩👶👧 Family: Man, Woman, Baby, Girl
👨🏽❤️💋👨🏾 Kiss - Man: Medium Skin Tone, Man: Medium-Dark Skin Tone
🏴 Flag for Sowa Town (BW-ST)
🏴 Flag for Selibe Phikwe (BW-SP)
👩🏿❤️👩🏾 Couple With Heart - Woman: Dark Skin Tone, Woman: Medium-Dark Skin Tone
👩👨👦👦 Family: Woman, Man, Boy, Boy
👩🏿👨🏿👦🏿👶🏿 Family - Woman: Dark Skin Tone, Man: Dark Skin Tone, Boy: Dark Skin Tone, Baby: Dark Skin Tone
🏴 Flag for Minsk (BY-HM)
🏴 Flag for Homel (BY-HO)
👨🏻👦🏻👦🏻 Family - Man: Light Skin Tone, Boy: Light Skin Tone, Boy: Light Skin Tone
👨🏻👩🏻👧🏻👦🏻 Family - Man: Light Skin Tone, Woman: Light Skin Tone, Girl: Light Skin Tone, Boy: Light Skin Tone
🏴 Flag for Izmir (TR-35)
🏴 Flag for Hrodna (BY-HR)
🏴 Flag for Magileu (BY-MA)
🏴 Flag for Minsk Region (BY-MI)
👨🏼❤️💋👩🏿 Kiss - Man: Medium-Light Skin Tone, Woman: Dark Skin Tone
👨🏾❤️👩🏻 Couple With Heart - Man: Medium-Dark Skin Tone, Woman: Light Skin Tone
🏴 Flag for Belize (BZ-BZ)
🏴 Flag for Lobatse (BW-LO)
👩👦👧 Family: Woman, Boy, Girl
👨🏼👧🏼👧🏼 Family - Man: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
🏴 Flag for Moore’s Island (BS-MI)
🏴 Flag for Mono (BJ-MO)
👨🏽👧🏽👧🏽 Family - Man: Medium Skin Tone, Girl: Medium Skin Tone, Girl: Medium Skin Tone
🏴 Flag for Vitebsk (BY-VI)
🏴 Flag for Stann Creek (BZ-SC)
👨🏾👧🏾👧🏾 Family - Man: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
🏴 Flag for Corozal (BZ-CZL)
👨🏻👧🏻👶🏻 Family - Man: Light Skin Tone, Girl: Light Skin Tone, Baby: Light Skin Tone
👨🏿👧🏿👧🏿 Family - Man: Dark Skin Tone, Girl: Dark Skin Tone, Girl: Dark Skin Tone
🏴 Flag for Toledo (BZ-TOL)
🏴 Flag for Sudur Pashchimanchal (NP-5)
🏴 Flag for Harbour Island (BS-HI)
🏴 Flag for Alberta (CA-AB)
👩🏾❤️👨🏾 Couple With Heart - Woman: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone
👨🏽❤️💋👨🏼 Kiss - Man: Medium Skin Tone, Man: Medium-Light Skin Tone
🏴 Flag for Vientiane Province (LA-VI)
👨👩👦👧 Family: Man, Woman, Boy, Girl
👨🏻👧🏻👧🏻 Family - Man: Light Skin Tone, Girl: Light Skin Tone, Girl: Light Skin Tone
👨🏼👧🏼👶🏼 Family - Man: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
👨🏽👧🏽👶🏽 Family - Man: Medium Skin Tone, Girl: Medium Skin Tone, Baby: Medium Skin Tone
🏴 Flag for Prince Edward Island (CA-PE)
🏴 Flag for Kwango (CD-KG)
🏴 Flag for Nova Scotia (CA-NS)
👨🏾👧🏾👶🏾 Family - Man: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
🏴 Flag for Haut-Uélé (CD-HU)
🏴 Flag for Bas-Congo (CD-BC)
🏴 Flag for Sud-Ubangi (CD-SU)
🏴 Flag for Maniema (CD-MA)
🏴 Flag for Sankuru (CD-SA)
🏴 Flag for Tshuapa (CD-TU)
🏴 Flag for Yukon (CA-YT)
🏴 Flag for Mongala (CD-MO)
🏴 Flag for Bamingui-Bangoran (CF-BB)
🏴 Flag for Mai-Ndombe (CD-MN)
🏴 Flag for Nunavut (CA-NU)
🏴 Flag for Kwilu (CD-KL)
🏴 Flag for New Brunswick (CA-NB)
🏴 Flag for Bangui (CF-BGF)
🏴 Flag for Kinshasa (CD-KN)
🏴 Flag for North Kivu (CD-NK)
🏴 Flag for Northwest Territories (CA-NT)
🏴 Flag for Tshopo (CD-TO)
🏴 Flag for Bas-Uélé (CD-BU)
🏴 Flag for Haut-Lomami (CD-HL)
🏴 Flag for Haut-Katanga (CD-HK)
🏴 Flag for Kasaï-Oriental (CD-KE)
🏴 Flag for South Kivu (CD-SK)
🏴 Flag for Ontario (CA-ON)
🏴 Flag for Ouham (CF-AC)
🏴 Flag for Mambéré-Kadéï (CF-HS)
🏴 Flag for Kasaï Central (CD-KC)
🏴 Flag for Nord-Ubangi (CD-NU)
🏴 Flag for Kasaï (CD-KS)
🏴 Flag for Ituri (CD-IT)
🏴 Flag for Bern (CH-BE)
🏴 Flag for Lékoumou (CG-2)
🏴 Flag for Appenzell Innerrhoden (CH-AI)
🏴 Flag for Ombella-M’Poko (CF-MP)
👨🏻👶🏻 Family - Man: Light Skin Tone, Baby: Light Skin Tone
🏴 Flag for Kémo (CF-KG)
🏴 Flag for Sangha (CG-13)
🏴 Flag for Lucerne (CH-LU)
🏴 Flag for Geneva (CH-GE)
🏴 Flag for Nidwalden (CH-NW)
🏴 Flag for Kouilou (CG-5)
🏴 Flag for Likouala (CG-7)
🏴 Flag for Brazzaville (CG-BZV)
🏴 Flag for Schaffhausen (CH-SH)
🏴 Flag for Lomami (CD-LO)
🏴 Flag for Appenzell Ausserrhoden (CH-AR)
🏴 Flag for Schwyz (CH-SZ)
🏴 Flag for Neuchâtel (CH-NE)
🏴 Flag for Ouham-Pendé (CF-OP)
🏴 Flag for Graubünden (CH-GR)
🏴 Flag for Solothurn (CH-SO)
🏴 Flag for Fribourg (CH-FR)
🏴 Flag for Plateaux (CG-14)
🏴 Flag for Sangha-Mbaéré (CF-SE)
👨🏿👧🏿👶🏿 Family - Man: Dark Skin Tone, Girl: Dark Skin Tone, Baby: Dark Skin Tone
🏴 Flag for Aargau (CH-AG)
🏴 Flag for Cuvette-Ouest (CG-15)
🏴 Flag for St. Gallen (CH-SG)
🏴 Flag for Cuvette (CG-8)
🏴 Flag for Obwalden (CH-OW)
🏴 Flag for Basel-Stadt (CH-BS)
🏴 Flag for Lobaye (CF-LB)
🏴 Flag for Valparaíso (CL-VS)
🏴 Flag for Northwest (CM-NW)
🏴 Flag for Denguélé (CI-DN)
🏴 Flag for North (CM-NO)
🏴 Flag for Yamoussoukro (CI-YM)
🏴 Flag for East (CM-ES)
👨🏼👶🏼 Family - Man: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
🏴 Flag for Woroba (CI-WR)
🏴 Flag for Lagunes (CI-LG)
🏴 Flag for Gôh-Djiboua (CI-GD)
🏴 Flag for Comoé (CI-CM)
🏴 Flag for Southwest (CM-SW)
🏴 Flag for Bío Bío (CL-BI)
🏴 Flag for Aysén (CL-AI)
🏴 Flag for Santiago Metropolitan (CL-RM)
🏴 Flag for Tarapacá (CL-TA)
🏴 Flag for South (CM-SU)
🏴 Flag for Atacama (CL-AT)
🏴 Flag for Tianjin (CN-12)
🏴 Flag for Lacs (CI-LC)
🏴 Flag for Coquimbo (CL-CO)
🏴 Flag for Arica y Parinacota (CL-AP)
🏴 Flag for Littoral (CM-LT)
🏴 Flag for Centre (CM-CE)
🏴 Flag for Far North (CM-EN)
🏴 Flag for Magallanes Region (CL-MA)
🏴 Flag for Maule (CL-ML)
🏴 Flag for Montagnes (CI-MG)
🏴 Flag for Bas-Sassandra (CI-BS)
🏴 Flag for Adamawa (CM-AD)
🏴 Flag for Los Ríos (CL-LR)
🏴 Flag for West (CM-OU)
🏴 Flag for Savanes (CI-SV)
🏴 Flag for Los Lagos (CL-LL)
🏴 Flag for Shandong (CN-37)
🏴 Flag for Gansu (CN-62)
🏴 Flag for Shanghai (CN-31)
🏴 Flag for Jiangxi (CN-36)
🏴 Flag for Taiwan (CN-71)
🏴 Flag for Boyacá (CO-BOY)
🏴 Flag for Beijing (CN-11)
🏴 Flag for Ruse (BG-18)
🏴 Flag for Guangdong (CN-44)
🏴 Flag for Qinghai (CN-63)
🏴 Flag for Heilongjiang (CN-23)
🏴 Flag for Sichuan (CN-51)
🏴 Flag for Caldas (CO-CAL)
🏴 Flag for Bolívar (CO-BOL)
🏴 Flag for Yunnan (CN-53)
🏴 Flag for Atlántico (CO-ATL)
🏴 Flag for Hubei (CN-42)
🏴 Flag for Jilin (CN-22)
🏴 Flag for Caquetá (CO-CAQ)
🏴 Flag for Zhejiang (CN-33)
🏴 Flag for Hebei (CN-13)
🏴 Flag for Inner Mongolia (CN-15)
🏴 Flag for Hunan (CN-43)
🏴 Flag for Haute-Kotto (CF-HK)
🏴 Flag for Xinjiang (CN-65)
🏴 Flag for Chongqing (CN-50)
🏴 Flag for Guangxi (CN-45)
🏴 Flag for Tibet (CN-54)
🏴 Flag for Jiangsu (CN-32)
🏴 Flag for Arauca (CO-ARA)
🏴 Flag for Fujian (CN-35)
🏴 Flag for Henan (CN-41)
🏴 Flag for Hainan (CN-46)
🏴 Flag for Shanxi (CN-14)
🏴 Flag for Magdalena (CO-MAG)
🏴 Flag for Chocó (CO-CHO)
🏴 Flag for Guainía (CO-GUA)
🏴 Flag for Córdoba (CO-COR)
🏴 Flag for Putumayo (CO-PUT)
🏴 Flag for Santander (CO-SAN)
🏴 Flag for Villa Clara (CU-05)
🏴 Flag for Valle del Cauca (CO-VAC)
🏴 Flag for Quindío (CO-QUI)
🏴 Flag for Risaralda (CO-RIS)
🏴 Flag for Cundinamarca (CO-CUN)
👨🏽👶🏽 Family - Man: Medium Skin Tone, Baby: Medium Skin Tone
🏴 Flag for Alajuela (CR-A)
🏴 Flag for Puntarenas (CR-P)
🏴 Flag for Huila (CO-HUI)
🏴 Flag for Vaupés (CO-VAU)
🏴 Flag for Cauca (CO-CAU)
🏴 Flag for Sancti Spíritus (CU-07)
🏴 Flag for Limón (CR-L)
🏴 Flag for Norte de Santander (CO-NSA)
🏴 Flag for Matanzas (CU-04)
🏴 Flag for Guanacaste (CR-G)
🏴 Flag for Havana (CU-03)
👩🏾❤️💋👨 Kiss - Woman: Medium-Dark Skin Tone, Man
🏴 Flag for Ciego de Ávila (CU-08)
🏴 Flag for Tolima (CO-TOL)
🏴 Flag for Camagüey (CU-09)
🏴 Flag for Cienfuegos (CU-06)
🏴 Flag for Guaviare (CO-GUV)
🏴 Flag for Cayo (BZ-CY)
🏴 Flag for Southern Nations, Nationalities, and Peoples (ET-SN)
🏴 Flag for Pinar del Río (CU-01)
🏴 Flag for San José (CR-SJ)
🏴 Flag for Cartago (CR-C)
🏴 Flag for La Guajira (CO-LAG)
🏴 Flag for Limassol (CY-02)
🏴 Flag for Lower Saxony (DE-NI)
🏴 Flag for Orange Walk (BZ-OW)
🏴 Flag for Kraj Vysočina (CZ-63)
🏴 Flag for Liberecký kraj (CZ-51)
🏴 Flag for Las Tunas (CU-10)
🏴 Flag for Santiago de Cuba (CU-13)
👨🏾👶🏾 Family - Man: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
🏴 Flag for Nicosia (CY-01)
🏴 Flag for Středočeský kraj (CZ-20)
🏴 Flag for Vakaga (CF-VK)
🏴 Flag for Královéhradecký kraj (CZ-52)
🏴 Flag for Karlovarský kraj (CZ-41)
🏴 Flag for Artemisa (CU-15)
🏴 Flag for Famagusta (CY-04)
🏴 Flag for Bremen (DE-HB)
🏴 Flag for Hesse (DE-HE)
🏴 Flag for Holguín (CU-11)
👨🏿👶🏿 Family - Man: Dark Skin Tone, Baby: Dark Skin Tone
🏴 Flag for Moravskoslezský kraj (CZ-80)
🏴 Flag for Jihočeský kraj (CZ-31)
🏴 Flag for Glarus (CH-GL)
🏴 Flag for Praha, Hlavní město (CZ-10)
🏴 Flag for Larnaca (CY-03)
🏴 Flag for Hamburg (DE-HH)
🏴 Flag for Mecklenburg-Vorpommern (DE-MV)
🏴 Flag for Barlavento Islands (CV-B)
🏴 Flag for Sotavento Islands (CV-S)
🏴 Flag for Mayabeque (CU-16)
🏴 Flag for Olomoucký kraj (CZ-71)
🏴 Flag for Guantánamo (CU-14)
🏴 Flag for Brandenburg (DE-BB)
🏴 Flag for Plzeňský kraj (CZ-32)
🏴 Flag for Ali Sabieh (DJ-AS)
🏴 Flag for Rhineland-Palatinate (DE-RP)
🏴 Flag for Saxony (DE-SN)
🏴 Flag for Zealand (DK-85)
🏴 Flag for Saxony-Anhalt (DE-ST)
🏴 Flag for Chlef (DZ-02)
🏴 Flag for Saint Luke (DM-07)
🏴 Flag for Arta (DJ-AR)
🏴 Flag for Capital Region (DK-84)
🏴 Flag for Saint Paul (DM-10)
🏴 Flag for Cibao Sur (DO-36)
🏴 Flag for Enriquillo (DO-38)
🏴 Flag for Saint Patrick (DM-09)
🏴 Flag for Cibao Noroeste (DO-34)
🏴 Flag for Cibao Nordeste (DO-33)
🏴 Flag for Saint John (DM-05)
🏴 Flag for Yuma (DO-42)
🏴 Flag for Obock (DJ-OB)
🏴 Flag for Thuringia (DE-TH)
🏴 Flag for Ozama (DO-40)
🏴 Flag for Saarland (DE-SL)
🏴 Flag for Saint George (DM-04)
🏴 Flag for Saint David (DM-03)
🏴 Flag for Saint Andrew (DM-02)
🏴 Flag for Dikhil (DJ-DI)
🏴 Flag for Saint Mark (DM-08)
🏴 Flag for Tadjourah (DJ-TA)
🏴 Flag for Saint Peter (DM-11)
🏴 Flag for Valdesia (DO-41)
🏴 Flag for Higüamo (DO-39)
🏴 Flag for Laghouat (DZ-03)
🏴 Flag for M’Sila (DZ-28)
🏴 Flag for Illizi (DZ-33)
👩🏿👨🏿👧🏿 Family - Woman: Dark Skin Tone, Man: Dark Skin Tone, Girl: Dark Skin Tone
🏴 Flag for Tizi Ouzou (DZ-15)
🏴 Flag for Tiaret (DZ-14)
🏴 Flag for Sétif (DZ-19)
🏴 Flag for Djelfa (DZ-17)
🏴 Flag for Constantine (DZ-25)
🏴 Flag for Guelma (DZ-24)
🏴 Flag for Tipasa (DZ-42)
🏴 Flag for Batna (DZ-05)
🏴 Flag for Tébessa (DZ-12)
🏴 Flag for Biskra (DZ-07)
🏴 Flag for Ouargla (DZ-30)
🏴 Flag for Sidi Bel Abbès (DZ-22)
🏴 Flag for Tamanghasset (DZ-11)
🏴 Flag for Médéa (DZ-26)
🏴 Flag for El Bayadh (DZ-32)
🏴 Flag for Khenchela (DZ-40)
🏴 Flag for Tissemsilt (DZ-38)
🏴 Flag for El Oued (DZ-39)
🏴 Flag for Souk Ahras (DZ-41)
🏴 Flag for Tlemcen (DZ-13)
🏴 Flag for Béjaïa (DZ-06)
🏴 Flag for Mila (DZ-43)
🏴 Flag for Saïda (DZ-20)
🏴 Flag for Oran (DZ-31)
🏴 Flag for Bouira (DZ-10)
🏴 Flag for Boumerdès (DZ-35)
🏴 Flag for El Tarf (DZ-36)
🏴 Flag for Algiers (DZ-16)
🏴 Flag for Tindouf (DZ-37)
🏴 Flag for Annaba (DZ-23)
🏴 Flag for Blida (DZ-09)
🏴 Flag for Oum El Bouaghi (DZ-04)
🏴 Flag for Mostaganem (DZ-27)
🏴 Flag for Chimborazo (EC-H)
🏴 Flag for Ghardaïa (DZ-47)
🏴 Flag for Bolívar (EC-B)
🏴 Flag for Carchi (EC-C)
🏴 Flag for Aïn Defla (DZ-44)
🏴 Flag for Paphos (CY-05)
🏴 Flag for Relizane (DZ-48)
🏴 Flag for Morona-Santiago (EC-S)
🏴 Flag for Jura (CH-JU)
🏴 Flag for Santa Elena (EC-SE)
🏴 Flag for Lääne (EE-57)
🏴 Flag for Imbabura (EC-I)
🏴 Flag for Aïn Témouchent (DZ-46)
🏴 Flag for Galápagos (EC-W)
🏴 Flag for Napo (EC-N)
👨🏽👶🏽👦🏽 Family - Man: Medium Skin Tone, Baby: Medium Skin Tone, Boy: Medium Skin Tone
🏴 Flag for Pärnu (EE-67)
🏴 Flag for Tartu (EE-78)
🏴 Flag for Azuay (EC-A)
🏴 Flag for Manabí (EC-M)
🏴 Flag for El Oro (EC-O)
🏴 Flag for Pichincha (EC-P)
🏴 Flag for Rapla (EE-70)
🏴 Flag for Saare (EE-74)
👨🏾👶🏾👦🏾 Family - Man: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
🏴 Flag for Põlva (EE-65)
🏴 Flag for Pastaza (EC-Y)
🏴 Flag for Guayas (EC-G)
🏴 Flag for Los Ríos (EC-R)
🏴 Flag for Sucumbíos (EC-U)
🏴 Flag for Jõgeva (EE-49)
🏴 Flag for Valga (EE-82)
🏴 Flag for Loja (EC-L)
🏴 Flag for Orellana (EC-D)
👨🏼👶🏼👦🏼 Family - Man: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone, Boy: Medium-Light Skin Tone
🏴 Flag for Naama (DZ-45)
🏴 Flag for Järva (EE-51)
🏴 Flag for North Sinai (EG-SIN)
🏴 Flag for South Sinai (EG-JS)
🏴 Flag for Qena (EG-KN)
🏴 Flag for Viljandi (EE-84)
🏴 Flag for Ismailia (EG-IS)
🏴 Flag for Aswan (EG-ASN)
🏴 Flag for Dakahlia (EG-DK)
🏴 Flag for Gharbia (EG-GH)
🏴 Flag for Beheira (EG-BH)
🏴 Flag for Võru (EE-86)
🏴 Flag for Asyut (EG-AST)
🏴 Flag for Qalyubia (EG-KB)
🏴 Flag for Giza (EG-GZ)
👨🏿👶🏿👦🏿 Family - Man: Dark Skin Tone, Baby: Dark Skin Tone, Boy: Dark Skin Tone
🏴 Flag for Anseba (ER-AN)
🏴 Flag for Kafr el-Sheikh (EG-KFS)
🏴 Flag for Matrouh (EG-MT)
🏴 Flag for Gash-Barka (ER-GB)
🏴 Flag for Minya (EG-MN)
🏴 Flag for Alexandria (EG-ALX)
🏴 Flag for Southern Red Sea (ER-DK)
🏴 Flag for Port Said (EG-PTS)
🏴 Flag for Sohag (EG-SHG)
🏴 Flag for New Valley (EG-WAD)
🏴 Flag for Northern Red Sea (ER-SK)
🏴 Flag for Suez (EG-SUZ)
🏴 Flag for Monufia (EG-MNF)
🏴 Flag for Luxor (EG-LX)
🏴 Flag for Maekel (ER-MA)
🏴 Flag for Damietta (EG-DT)
🏴 Flag for Al Sharqia (EG-SHR)
🏴 Flag for Faiyum (EG-FYM)
🏴 Flag for Debub (ER-DU)
🏴 Flag for Aragon (ES-AR)
🏴 Flag for Anhui (CN-34)
🏴 Flag for Northern Denmark (DK-81)
👨🏻👶🏻👧🏻 Family - Man: Light Skin Tone, Baby: Light Skin Tone, Girl: Light Skin Tone
👨🏼👶🏼👧🏼 Family - Man: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
👨🏽👶🏽👧🏽 Family - Man: Medium Skin Tone, Baby: Medium Skin Tone, Girl: Medium Skin Tone
🏴 Flag for Tigray (ET-TI)
🏴 Flag for Liaoning (CN-21)
🏴 Flag for Gambela (ET-GA)
🏴 Flag for Melilla (ES-ML)
🏴 Flag for Murcia Region (ES-MC)
🏴 Flag for Lapland (FI-10)
🏴 Flag for Central Ostrobothnia (FI-07)
🏴 Flag for Amhara (ET-AM)
🏴 Flag for Benishangul-Gumuz (ET-BE)
🏴 Flag for Oromia (ET-OR)
🏴 Flag for La Rioja (ES-RI)
🏴 Flag for Djibouti (DJ-DJ)
🏴 Flag for Madrid Autonomous Community (ES-MD)
🏴 Flag for Dire Dawa (ET-DD)
🏴 Flag for Mascara (DZ-29)
🏴 Flag for Kainuu (FI-05)
🏴 Flag for Kymenlaakso (FI-09)
🏴 Flag for Southern Ostrobothnia (FI-03)
🏴 Flag for Pirkanmaa (FI-11)
🏴 Flag for Southern Savonia (FI-04)
🏴 Flag for North Karelia (FI-13)
🏴 Flag for South Karelia (FI-02)
🏴 Flag for Harari (ET-HA)
🏴 Flag for Zlínský kraj (CZ-72)
🏴 Flag for Somali (ET-SO)
🏴 Flag for Catalonia (ES-CT)
🏴 Flag for Kosrae (FM-KSA)
🏴 Flag for New Caledonia (FR-NC)
🏴 Flag for Occitanie (FR-OCC)
🏴 Flag for Provence-Alpes-Côte d’Azur (FR-PAC)
🏴 Flag for Northern Savonia (FI-15)
🏴 Flag for Chuuk (FM-TRK)
🏴 Flag for Bourgogne-Franche-Comté (FR-BFC)
🏴 Flag for Northern Ostrobothnia (FI-14)
🏴 Flag for Rotuma (FJ-R)
🏴 Flag for Mayotte (FR-MAY)
🏴 Flag for Nouvelle-Aquitaine (FR-NAQ)
🏴 Flag for Central (FJ-C)
🏴 Flag for Grand-Est (FR-GES)
🏴 Flag for Northern (FJ-N)
🏴 Flag for Guadeloupe (FR-GUA)
🏴 Flag for Yap (FM-YAP)
🏴 Flag for Bretagne (FR-BRE)
🏴 Flag for French Polynesia (FR-PF)
🏴 Flag for Normandie (FR-NOR)
🏴 Flag for French Guiana (FR-GF)
🏴 Flag for Centre-Val de Loire (FR-CVL)
🏴 Flag for Clipperton Island (FR-CP)
🏴 Flag for St. Martin (FR-MF)
🏴 Flag for Päijänne Tavastia (FI-16)
🏴 Flag for Southwest Finland (FI-19)
🏴 Flag for La Réunion (FR-LRE)
🏴 Flag for Satakunta (FI-17)
🏴 Flag for Shida Kartli (GE-SK)
🏴 Flag for Moyen-Ogooué (GA-3)
👨🏿👶🏿👧🏿 Family - Man: Dark Skin Tone, Baby: Dark Skin Tone, Girl: Dark Skin Tone
🏴 Flag for Saint George (GD-03)
🏴 Flag for Nyanga (GA-5)
🏴 Flag for Ogooué-Ivindo (GA-6)
🏴 Flag for Brong-Ahafo (GH-BA)
🏴 Flag for Haut-Ogooué (GA-2)
🏴 Flag for Saint Andrew (GD-01)
🏴 Flag for Saint Patrick (GD-06)
🏴 Flag for Galicia (ES-GA)
🏴 Flag for Wallis & Futuna (FR-WF)
👨🏻👶🏻👶🏻 Family - Man: Light Skin Tone, Baby: Light Skin Tone, Baby: Light Skin Tone
🏴 Flag for St. Pierre & Miquelon (FR-PM)
🏴 Flag for Saint John (GD-04)
🏴 Flag for Tbilisi (GE-TB)
👨🏼👶🏼👶🏼 Family - Man: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone, Baby: Medium-Light Skin Tone
🏴 Flag for Saint David (GD-02)
🏴 Flag for Guria (GE-GU)
🏴 Flag for Woleu-Ntem (GA-9)
🏴 Flag for Racha-Lechkhumi and Kvemo Svaneti (GE-RL)
🏴 Flag for Samtskhe-Javakheti (GE-SJ)
🏴 Flag for Mtskheta-Mtianeti (GE-MM)
🏴 Flag for Imereti (GE-IM)
🏴 Flag for Ogooué-Maritime (GA-8)
🏴 Flag for Shaanxi (CN-61)
🏴 Flag for Greater Accra (GH-AA)
🏴 Flag for Jihomoravský kraj (CZ-64)
🏴 Flag for Adjara (GE-AJ)
🏴 Flag for Samegrelo-Zemo Svaneti (GE-SZ)
🏴 Flag for Estuaire (GA-1)
🏴 Flag for Ogooué-Lolo (GA-7)
🏴 Flag for Kindia Region (GN-D)
🏴 Flag for Mamou Region (GN-M)
👨🏽👶🏽👶🏽 Family - Man: Medium Skin Tone, Baby: Medium Skin Tone, Baby: Medium Skin Tone
🏴 Flag for Qaasuitsup (GL-QA)
🏴 Flag for North Bank Division (GM-N)
🏴 Flag for Sermersooq (GL-SM)
🏴 Flag for Northern (GH-NP)
🏴 Flag for Ionian Islands (GR-F)
🏴 Flag for Central Greece (GR-H)
🏴 Flag for Central (GH-CP)
🏴 Flag for Kankan Region (GN-K)
🏴 Flag for South Aegean (GR-L)
🏴 Flag for Attica (GR-I)
🏴 Flag for Upper River Division (GM-U)
🏴 Flag for Eastern (GH-EP)
🏴 Flag for Nzérékoré Region (GN-N)
🏴 Flag for Western (GH-WP)
🏴 Flag for West Macedonia (GR-C)
🏴 Flag for Río Muni (GQ-C)
🏴 Flag for Lower River Division (GM-L)
🏴 Flag for Upper East (GH-UE)
🏴 Flag for Conakry (GN-C)
🏴 Flag for Central Macedonia (GR-B)
🏴 Flag for Central River Division (GM-M)
🏴 Flag for Upper West (GH-UW)
🏴 Flag for Kujalleq (GL-KU)
🏴 Flag for Boké Region (GN-B)
🏴 Flag for Qeqqata (GL-QE)
🏴 Flag for Epirus (GR-D)
🏴 Flag for Ashanti (GH-AH)
🏴 Flag for Volta (GH-TV)
🏴 Flag for Mount Athos (GR-69)
🏴 Flag for Insular (GQ-I)
🏴 Flag for West Coast Division (GM-W)
🏴 Flag for Banjul (GM-B)
🏴 Flag for Labé Region (GN-L)
🏴 Flag for Thessaly (GR-E)
🏴 Flag for Faranah Region (GN-F)
🏴 Flag for Cuyuni-Mazaruni (GY-CU)
🏴 Flag for Atlántida (HN-AT)
👨🏾👶🏾👶🏾 Family - Man: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
🏴 Flag for Huehuetenango (GT-HU)
🏴 Flag for Alta Verapaz (GT-AV)
🏴 Flag for El Progreso (GT-PR)
🏴 Flag for Norte (GW-N)
🏴 Flag for Suchitepéquez (GT-SU)
🏴 Flag for Pomeroon-Supenaam (GY-PM)
🏴 Flag for Izabal (GT-IZ)
🏴 Flag for Potaro-Siparuni (GY-PT)
🏴 Flag for Quetzaltenango (GT-QZ)
🏴 Flag for Chimaltenango (GT-CM)
🏴 Flag for Addis Ababa (ET-AA)
🏴 Flag for Bissau (GW-BS)
🏴 Flag for Quiché (GT-QC)
🏴 Flag for Totonicapán (GT-TO)
🏴 Flag for Barima-Waini (GY-BA)
🏴 Flag for Essequibo Islands-West Demerara (GY-ES)
👨🏿👶🏿👶🏿 Family - Man: Dark Skin Tone, Baby: Dark Skin Tone, Baby: Dark Skin Tone
🏴 Flag for Choluteca (HN-CH)
🏴 Flag for Demerara-Mahaica (GY-DE)
👨🏻👨🏻👦🏻 Family - Man: Light Skin Tone, Man: Light Skin Tone, Boy: Light Skin Tone
🏴 Flag for Sacatepéquez (GT-SA)
🏴 Flag for Jutiapa (GT-JU)
🏴 Flag for Chiquimula (GT-CQ)
🏴 Flag for Baja Verapaz (GT-BV)
🏴 Flag for Escuintla (GT-ES)
🏴 Flag for Zacapa (GT-ZA)
🏴 Flag for Sul (GW-S)
🏴 Flag for Leste (GW-L)
🏴 Flag for Jalapa (GT-JA)
🏴 Flag for Petén (GT-PE)
🏴 Flag for Sololá (GT-SO)
🏴 Flag for Comayagua (HN-CM)
🏴 Flag for Koprivnica-Križevci (HR-06)
🏴 Flag for Copán (HN-CP)
🏴 Flag for Bay Islands (HN-IB)
🏴 Flag for Lika-Senj (HR-09)
🏴 Flag for Santa Bárbara (HN-SB)
🏴 Flag for Intibucá (HN-IN)
🏴 Flag for Francisco Morazán (HN-FM)
🏴 Flag for Zagreb County (HR-01)
🏴 Flag for Colón (HN-CL)
🏴 Flag for Centre (HT-CE)
🏴 Flag for Primorje-Gorski Kotar (HR-08)
🏴 Flag for Lempira (HN-LE)
🏴 Flag for Osijek-Baranja (HR-14)
🏴 Flag for Brod-Posavina (HR-12)
🏴 Flag for Split-Dalmatia (HR-17)
🏴 Flag for Olancho (HN-OL)
🏴 Flag for La Paz (HN-LP)
🏴 Flag for Međimurje (HR-20)
🏴 Flag for El Paraíso (HN-EP)
🏴 Flag for Zagreb (HR-21)
🏴 Flag for Šibenik-Knin (HR-15)
🏴 Flag for Ida-Viru (EE-44)
🏴 Flag for Cortés (HN-CR)
🏴 Flag for Sisak-Moslavina (HR-03)
🏴 Flag for Zadar (HR-13)
🏴 Flag for Istria (HR-18)
🏴 Flag for Krapina-Zagorje (HR-02)
🏴 Flag for Vukovar-Syrmia (HR-16)
🏴 Flag for Yoro (HN-YO)
🏴 Flag for Artibonite (HT-AR)
🏴 Flag for Gracias a Dios (HN-GD)
🏴 Flag for Valle (HN-VA)
🏴 Flag for Jijel (DZ-18)
🏴 Flag for Dubrovnik-Neretva (HR-19)
🏴 Flag for Požega-Slavonia (HR-11)
🏴 Flag for Bjelovar-Bilogora (HR-07)
🏴 Flag for Ocotepeque (HN-OC)
🏴 Flag for Budapest (HU-BU)
🏴 Flag for Hódmezővásárhely (HU-HV)
🏴 Flag for Fejér (HU-FE)
🏴 Flag for Baranya (HU-BA)
🏴 Flag for Székesfehérvár (HU-SF)
🏴 Flag for Borsod-Abaúj-Zemplén (HU-BZ)
🏴 Flag for Csongrád (HU-CS)
🏴 Flag for Sopron (HU-SN)
🏴 Flag for Dunaújváros (HU-DU)
🏴 Flag for Kaposvár (HU-KV)
🏴 Flag for Nyíregyháza (HU-NY)
🏴 Flag for Hajdú-Bihar (HU-HB)
🏴 Flag for Ouest (HT-OU)
🏴 Flag for Szeged (HU-SD)
🏴 Flag for Pest (HU-PE)
🏴 Flag for Komárom-Esztergom (HU-KE)
🏴 Flag for Nagykanizsa (HU-NK)
🏴 Flag for Grand’Anse (HT-GA)
🏴 Flag for Békéscsaba (HU-BC)
🏴 Flag for Sud (HT-SD)
🏴 Flag for Nord-Ouest (HT-NO)
🏴 Flag for Heves (HU-HE)
🏴 Flag for Bács-Kiskun (HU-BK)
🏴 Flag for Miskolc (HU-MI)
🏴 Flag for Érd (HU-ER)
👨🏽👨🏽👦🏽 Family - Man: Medium Skin Tone, Man: Medium Skin Tone, Boy: Medium Skin Tone
🏴 Flag for Nippes (HT-NI)
🏴 Flag for Szolnok (HU-SK)
🏴 Flag for Nord (HT-ND)
🏴 Flag for Sud-Est (HT-SE)
🏴 Flag for Jász-Nagykun-Szolnok (HU-JN)
🏴 Flag for Pécs (HU-PS)
🏴 Flag for Kecskemét (HU-KM)
🏴 Flag for Debrecen (HU-DE)
🏴 Flag for Békés (HU-BE)
🏴 Flag for Nógrád (HU-NO)
🏴 Flag for Szombathely (HU-SH)
🏴 Flag for Győr (HU-GY)
🏴 Flag for Lesser Sunda Islands (ID-NU)
🏴 Flag for Tatabánya (HU-TB)
🏴 Flag for Java (ID-JW)
🏴 Flag for Chandigarh (IN-CH)
🏴 Flag for Gujarat (IN-GJ)
🏴 Flag for Leinster (IE-L)
🏴 Flag for Zala (HU-ZA)
🏴 Flag for Daman and Diu (IN-DD)
🏴 Flag for Tel Aviv District (IL-TA)
🏴 Flag for Sulawesi (ID-SL)
🏴 Flag for Arunachal Pradesh (IN-AR)
🏴 Flag for Veszprém County (HU-VE)
🏴 Flag for Andaman and Nicobar Islands (IN-AN)
🏴 Flag for Somogy (HU-SO)
🏴 Flag for Vas (HU-VA)
🏴 Flag for Jerusalem (IL-JM)
🏴 Flag for Dadra and Nagar Haveli (IN-DN)
🏴 Flag for Veszprém (HU-VM)
🏴 Flag for Salgótarján (HU-ST)
🏴 Flag for Chhattisgarh (IN-CT)
🏴 Flag for Ulster (IE-U)
🏴 Flag for Delhi (IN-DL)
🏴 Flag for Munster (IE-M)
🏴 Flag for Connacht (IE-C)
🏴 Flag for Haifa District (IL-HA)
🏴 Flag for Kalimantan (ID-KA)
🏴 Flag for Goa (IN-GA)
🏴 Flag for Sumatra (ID-SM)
🏴 Flag for Papua Islands (ID-PP)
🏴 Flag for Szekszárd (HU-SS)
🏴 Flag for Northern District (IL-Z)
🏴 Flag for Tolna (HU-TO)
🏴 Flag for Central District (IL-M)
🏴 Flag for Southern District (IL-D)
🏴 Flag for Bihar (IN-BR)
🏴 Flag for Zalaegerszeg (HU-ZE)
🏴 Flag for Andhra Pradesh (IN-AP)
🏴 Flag for Dohuk (IQ-DA)
🏴 Flag for Jharkhand (IN-JH)
🏴 Flag for Kerala (IN-KL)
🏴 Flag for West Bengal (IN-WB)
🏴 Flag for Odisha (IN-OR)
🏴 Flag for Puducherry (IN-PY)
🏴 Flag for Karbala (IQ-KA)
🏴 Flag for Saladin (IQ-SD)
🏴 Flag for Mizoram (IN-MZ)
🏴 Flag for Himachal Pradesh (IN-HP)
🏴 Flag for Madhya Pradesh (IN-MP)
🏴 Flag for Punjab (IN-PB)
🏴 Flag for Nagaland (IN-NL)
🏴 Flag for Al-Qādisiyyah (IQ-QA)
🏴 Flag for Diyala (IQ-DI)
🏴 Flag for Nineveh (IQ-NI)
🏴 Flag for Dhi Qar (IQ-DQ)
🏴 Flag for Meghalaya (IN-ML)
🏴 Flag for Tamil Nadu (IN-TN)
🏴 Flag for Najaf (IQ-NA)
🏴 Flag for Al Muthanna (IQ-MU)
🏴 Flag for Telangana (IN-TG)
🏴 Flag for Haryana (IN-HR)
🏴 Flag for Uttarakhand (IN-UT)
🏴 Flag for Tripura (IN-TR)
🏴 Flag for Baghdad (IQ-BG)
🏴 Flag for Lakshadweep (IN-LD)
🏴 Flag for Maysan (IQ-MA)
🏴 Flag for Basra (IQ-BA)
🏴 Flag for Erbil (IQ-AR)
🏴 Flag for Maharashtra (IN-MH)
🏴 Flag for Al Anbar (IQ-AN)
🏴 Flag for Sikkim (IN-SK)
🏴 Flag for Babylon (IQ-BB)
🏴 Flag for Uttar Pradesh (IN-UP)
🏴 Flag for Sulaymaniyah (IQ-SU)
🏴 Flag for Rajasthan (IN-RJ)
🏴 Flag for Jammu and Kashmir (IN-JK)
🏴 Flag for Chaharmahal and Bakhtiari (IR-08)
🏴 Flag for Qom (IR-26)
🏴 Flag for Capital (IS-1)
👨🏾👨🏾👦🏾 Family - Man: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
🏴 Flag for Ardabil (IR-03)
🏴 Flag for Yazd (IR-25)
🏴 Flag for South Khorasan (IR-29)
👨🏿👨🏿👦🏿 Family - Man: Dark Skin Tone, Man: Dark Skin Tone, Boy: Dark Skin Tone
🏴 Flag for Hamadan (IR-24)
🏴 Flag for Mahaica-Berbice (GY-MA)
🏴 Flag for Western (IS-3)
🏴 Flag for Golestan (IR-27)
🏴 Flag for Zanjan (IR-11)
🏴 Flag for Lorestan (IR-20)
🏴 Flag for Kermanshah (IR-17)
🏴 Flag for Kohgiluyeh and Boyer-Ahmad (IR-18)
🏴 Flag for Cairo (EG-C)
🏴 Flag for North Khorasan (IR-31)
🏴 Flag for Bushehr (IR-06)
🏴 Flag for Extremadura (ES-EX)
🏴 Flag for Canary Islands (ES-CN)
🏴 Flag for Eastern (IS-7)
🏴 Flag for Ilam (IR-05)
🏴 Flag for Qazvin (IR-28)
🏴 Flag for Isfahan (IR-04)
🏴 Flag for Kerman (IR-15)
🏴 Flag for Hormozgan (IR-23)
🏴 Flag for Wasit (IQ-WA)
🏴 Flag for Piedmont (IT-21)
🏴 Flag for Northeastern (IS-6)
🏴 Flag for Northwestern (IS-5)
🏴 Flag for Markazi (IR-22)
🏴 Flag for Gilan (IR-19)
🏴 Flag for Khuzestan (IR-10)
🏴 Flag for Semnan (IR-12)
🏴 Flag for Southern Peninsula (IS-2)
🏴 Flag for Manchester (JM-12)
🏴 Flag for Irbid (JO-IR)
🏴 Flag for Saint Mary (JM-05)
🏴 Flag for Basilicata (IT-77)
🏴 Flag for Friuli–Venezia Giulia (IT-36)
🏴 Flag for Clarendon (JM-13)
🏴 Flag for Marche (IT-57)
🏴 Flag for Portland (JM-04)
🏴 Flag for Sicily (IT-82)
🏴 Flag for Veneto (IT-34)
🏴 Flag for Abruzzo (IT-65)
🏴 Flag for Molise (IT-67)
🏴 Flag for Balqa (JO-BA)
🏴 Flag for Apulia (IT-75)
🏴 Flag for Calabria (IT-78)
🏴 Flag for Tuscany (IT-52)
🏴 Flag for Hanover (JM-09)
🏴 Flag for Saint Andrew (JM-02)
🏴 Flag for Tafilah (JO-AT)
🏴 Flag for Umbria (IT-55)
🏴 Flag for Saint James (JM-08)
🏴 Flag for Saint Ann (JM-06)
🏴 Flag for Saint Elizabeth (JM-11)
🏴 Flag for Zarqa (JO-AZ)
🏴 Flag for Ostrobothnia (FI-12)
🏴 Flag for Lazio (IT-62)
🏴 Flag for Ajloun (JO-AJ)
🏴 Flag for Liguria (IT-42)
🏴 Flag for Trelawny (JM-07)
🏴 Flag for Aqaba (JO-AQ)
🏴 Flag for Jerash (JO-JA)
🏴 Flag for Amman (JO-AM)
🏴 Flag for Aosta Valley (IT-23)
🏴 Flag for Westmoreland (JM-10)
🏴 Flag for Ibaraki (JP-08)
🏴 Flag for Madaba (JO-MD)
🏴 Flag for Shimane (JP-32)
🏴 Flag for Kyōto (JP-26)
🏴 Flag for Araucanía (CL-AR)
🏴 Flag for Tochigi (JP-09)
🏴 Flag for Akita (JP-05)
🏴 Flag for Chiba (JP-12)
🏴 Flag for Miyagi (JP-04)
🏴 Flag for Niigata (JP-15)
🏴 Flag for Toyama (JP-16)
🏴 Flag for Aichi (JP-23)
🏴 Flag for Tokushima (JP-36)
🏴 Flag for Nagano (JP-20)
🏴 Flag for Tottori (JP-31)
🏴 Flag for Iwate (JP-03)
🏴 Flag for Okayama (JP-33)
🏴 Flag for Ishikawa (JP-17)
🏴 Flag for Wakayama (JP-30)
🏴 Flag for Gunma (JP-10)
🏴 Flag for Mafraq (JO-MA)
🏴 Flag for Yamaguchi (JP-35)
🏴 Flag for Granma (CU-12)
🏴 Flag for Shiga (JP-25)
🏴 Flag for Aomori (JP-02)
🏴 Flag for Saitama (JP-11)
🏴 Flag for Nara (JP-29)
🏴 Flag for Yamanashi (JP-19)
🏴 Flag for Hiroshima (JP-34)
🏴 Flag for Ma’an (JO-MN)
🏴 Flag for Shizuoka (JP-22)
🏴 Flag for Ōsaka (JP-27)
🏴 Flag for Mie (JP-24)
🏴 Flag for Yamagata (JP-06)
🏴 Flag for Hyōgo (JP-28)
🏴 Flag for Karak (JO-KA)
🏴 Flag for Ehime (JP-38)
🏴 Flag for Kanagawa (JP-14)
🏴 Flag for Kagawa (JP-37)
🏴 Flag for Garissa (KE-07)
🏴 Flag for Mandera (KE-24)
🏴 Flag for Kagoshima (JP-46)
🏴 Flag for Kisumu (KE-17)
🏴 Flag for Kilifi (KE-14)
🏴 Flag for Kirinyaga (KE-15)
🏴 Flag for Kajiado (KE-10)
🏴 Flag for Bungoma (KE-03)
🏴 Flag for Nandi (KE-32)
🏴 Flag for Kiambu (KE-13)
🏴 Flag for Laikipia (KE-20)
🏴 Flag for Lamu (KE-21)
🏴 Flag for Fukuoka (JP-40)
🏴 Flag for Busia (KE-04)
🏴 Flag for Saga (JP-41)
🏴 Flag for Migori (KE-27)
🏴 Flag for Embu (KE-06)
👩🏾👦🏾👧🏾 Family - Woman: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone, Girl: Medium-Dark Skin Tone
🏴 Flag for Kericho (KE-12)
🏴 Flag for Isiolo (KE-09)
🏴 Flag for Kwale (KE-19)
🏴 Flag for Nagasaki (JP-42)
🏴 Flag for Nairobi County (KE-30)
🏴 Flag for Makueni (KE-23)
🏴 Flag for Murang’a (KE-29)
🏴 Flag for Kōchi (JP-39)
🏴 Flag for Bomet (KE-02)
🏴 Flag for Mombasa (KE-28)
🏴 Flag for Homa Bay (KE-08)
🏴 Flag for Kakamega (KE-11)
🏴 Flag for Machakos (KE-22)
🏴 Flag for Kisii (KE-16)
🏴 Flag for Elgeyo-Marakwet (KE-05)
🏴 Flag for Ōita (JP-44)
🏴 Flag for Narok (KE-33)
🏴 Flag for Meru (KE-26)
🏴 Flag for Kumamoto (JP-43)
🏴 Flag for Miyazaki (JP-45)
🏴 Flag for Stung Treng (KH-19)
🏴 Flag for Samburu (KE-37)
🏴 Flag for West Pokot (KE-47)
🏴 Flag for Taita-Taveta (KE-39)
🏴 Flag for Prey Veng (KH-14)
🏴 Flag for Tharaka-Nithi (KE-41)
🏴 Flag for Osh Region (KG-O)
🏴 Flag for Tbong Khmum (KH-25)
🏴 Flag for Talas (KG-T)
🏴 Flag for Phnom Penh (KH-12)
🏴 Flag for Bishkek (KG-GB)
🏴 Flag for Uasin Gishu (KE-44)
🏴 Flag for Kep (KH-23)
🏴 Flag for Kratié (KH-10)
🏴 Flag for Takéo (KH-21)
🏴 Flag for Battambang (KH-2)
🏴 Flag for Nyeri (KE-36)
🏴 Flag for Preah Vihear (KH-13)
🏴 Flag for Tana River (KE-40)
🏴 Flag for Pailin (KH-24)
🏴 Flag for Ratanakiri (KH-16)
🏴 Flag for Oddar Meanchey (KH-22)
🏴 Flag for Trans Nzoia (KE-42)
🏴 Flag for Sihanoukville (KH-18)
🏴 Flag for Vihiga (KE-45)
🏴 Flag for Osh (KG-GO)
🏴 Flag for Batken (KG-B)
🏴 Flag for Jalal-Abad (KG-J)
🏴 Flag for Mondulkiri (KH-11)
🏴 Flag for Siem Reap (KH-17)
🏴 Flag for Turkana (KE-43)
🏴 Flag for Banteay Meanchey (KH-1)
🏴 Flag for Naryn (KG-N)
🏴 Flag for Nyandarua (KE-35)
🏴 Flag for Siaya (KE-38)
🏴 Flag for Nyamira (KE-34)
🏴 Flag for Pursat (KH-15)
🏴 Flag for Wajir (KE-46)
🏴 Flag for Issyk-Kul (KG-Y)
🏴 Flag for Chuy (KG-C)
🏴 Flag for Mohéli (KM-M)
🏴 Flag for Seoul (KR-11)
🏴 Flag for Kampong Chhnang (KH-4)
🏴 Flag for Daejeon (KR-30)
🏴 Flag for South Hwanghae (KP-05)
🏴 Flag for Kampot (KH-7)
🏴 Flag for Nevis (KN-N)
🏴 Flag for Chagang (KP-04)
🏴 Flag for South Jeolla (KR-46)
🏴 Flag for North Hwanghae (KP-06)
🏴 Flag for Saint Kitts (KN-K)
🏴 Flag for Kampong Speu (KH-5)
🏴 Flag for North Jeolla (KR-45)
🏴 Flag for North Pyongan (KP-03)
🏴 Flag for Koh Kong (KH-9)
🏴 Flag for Kangwon (KP-07)
🏴 Flag for Busan (KR-26)
🏴 Flag for Gwangju City (KR-29)
🏴 Flag for Kampong Cham (KH-3)
🏴 Flag for North Chungcheong (KR-43)
🏴 Flag for Kandal (KH-8)
🏴 Flag for Kampong Thom (KH-6)
🏴 Flag for Ryanggang (KP-10)
🏴 Flag for South Pyongan (KP-02)
🏴 Flag for Grande Comore (KM-G)
🏴 Flag for South Hamgyong (KP-08)
🏴 Flag for Rason (KP-13)
🏴 Flag for Daegu (KR-27)
🏴 Flag for Incheon (KR-28)
🏴 Flag for Gangwon (KR-42)
🏴 Flag for Pyongyang (KP-01)
🏴 Flag for Ulsan (KR-31)
🏴 Flag for South Chungcheong (KR-44)
🏴 Flag for Anjouan (KM-A)
🏴 Flag for Gyeonggi (KR-41)
🏴 Flag for North Gyeongsang (KR-47)
🏴 Flag for North Hamgyong (KP-09)
🏴 Flag for Houaphanh (LA-HO)
🏴 Flag for Bayqongyr (KZ-BAY)
🏴 Flag for Champasak (LA-CH)
🏴 Flag for Vientiane (LA-VT)
🏴 Flag for Hawalli (KW-HA)
🏴 Flag for Phongsaly (LA-PH)
🏴 Flag for Pavlodar (KZ-PAV)
🏴 Flag for Almaty Region (KZ-ALM)
🏴 Flag for Al Asimah (KW-KU)
🏴 Flag for Bokeo (LA-BK)
🏴 Flag for Attapeu (LA-AT)
🏴 Flag for Aktobe (KZ-AKT)
🏴 Flag for Atyrau (KZ-ATY)
🏴 Flag for Al Jahra (KW-JA)
🏴 Flag for Bolikhamsai (LA-BL)
🏴 Flag for Oudomxay (LA-OU)
🏴 Flag for Mangystau (KZ-MAN)
🏴 Flag for West Kazakhstan (KZ-ZAP)
🏴 Flag for Jambyl (KZ-ZHA)
🏴 Flag for Astana (KZ-AST)
🏴 Flag for Luang Prabang (LA-LP)
🏴 Flag for Al Farwaniyah (KW-FA)
🏴 Flag for Kostanay (KZ-KUS)
🏴 Flag for Almaty (KZ-ALA)
🏴 Flag for Karagandy (KZ-KAR)
🏴 Flag for Kyzylorda (KZ-KZY)
🏴 Flag for Salavan (LA-SL)
🏴 Flag for Luang Namtha (LA-LM)
🏴 Flag for Sejong (KR-50)
🏴 Flag for Mubarak Al-Kabeer (KW-MU)
🏴 Flag for North Kazakhstan (KZ-SEV)
👩🏿👦🏿👧🏿 Family - Woman: Dark Skin Tone, Boy: Dark Skin Tone, Girl: Dark Skin Tone
🏴 Flag for Al Ahmadi (KW-AH)
🏴 Flag for Khammouane (LA-KH)
🏴 Flag for Akmola (KZ-AKM)
🏴 Flag for South Kazakhstan (KZ-YUZ)
🏴 Flag for Triesen (LI-09)
👨🏽👨🏽👦🏽👦🏽 Family - Man: Medium Skin Tone, Man: Medium Skin Tone, Boy: Medium Skin Tone, Boy: Medium Skin Tone
👩🏻👦🏻👶🏻 Family - Woman: Light Skin Tone, Boy: Light Skin Tone, Baby: Light Skin Tone
🏴 Flag for North Central (LK-7)
🏴 Flag for Sainyabuli (LA-XA)
🏴 Flag for Akkar (LB-AK)
🏴 Flag for Laborie (LC-07)
🏴 Flag for Gros Islet (LC-06)
🏴 Flag for North (LB-AS)
🏴 Flag for Balzers (LI-01)
🏴 Flag for Central (LK-2)
🏴 Flag for Mauren (LI-04)
🏴 Flag for Nabatieh (LB-NA)
🏴 Flag for Dennery (LC-05)
🏴 Flag for South (LB-JA)
🏴 Flag for Vaduz (LI-11)
🏴 Flag for Castries (LC-02)
🏴 Flag for Uva (LK-8)
🏴 Flag for Triesenberg (LI-10)
🏴 Flag for Planken (LI-05)
🏴 Flag for Vieux Fort (LC-11)
🏴 Flag for Baalbek-Hermel (LB-BH)
🏴 Flag for North Western (LK-6)
🏴 Flag for Ruggell (LI-06)
🏴 Flag for Micoud (LC-08)
🏴 Flag for Eschen (LI-02)
🏴 Flag for Canaries (LC-12)
🏴 Flag for Beirut (LB-BA)
🏴 Flag for Xiangkhouang (LA-XI)
🏴 Flag for Soufrière (LC-10)
🏴 Flag for Anse la Raye (LC-01)
🏴 Flag for Choiseul (LC-03)
🏴 Flag for Gamprin (LI-03)
🏴 Flag for Northern (LK-4)
🏴 Flag for Grand Bassa (LR-GB)
🏴 Flag for Gbarpolu (LR-GP)
🏴 Flag for Grand Gedeh (LR-GG)
🏴 Flag for Jurbarkas (LT-12)
🏴 Flag for Nimba (LR-NI)
🏴 Flag for Central Finland (FI-08)
🏴 Flag for Jonava (LT-10)
🏴 Flag for Margibi (LR-MG)
🏴 Flag for Sinoe (LR-SI)
🏴 Flag for Montserrado (LR-MO)
🏴 Flag for Kaunas (LT-16)
🏴 Flag for Thaba-Tseka (LS-K)
🏴 Flag for Birštonas (LT-05)
🏴 Flag for Mohale’s Hoek (LS-F)
🏴 Flag for Bomi (LR-BM)
🏴 Flag for Druskininkai (LT-07)
🏴 Flag for Kalvarija (LT-14)
🏴 Flag for Kauno Municipality (LT-15)
🏴 Flag for Qacha’s Nek (LS-H)
🏴 Flag for Anykščiai (LT-04)
🏴 Flag for Leribe (LS-C)
🏴 Flag for Joniškis (LT-11)
🏴 Flag for Lofa (LR-LO)
🏴 Flag for Rivercess (LR-RI)
🏴 Flag for Kaišiadorys (LT-13)
🏴 Flag for Elektrėnai (LT-08)
🏴 Flag for Grand Kru (LR-GK)
🏴 Flag for Berea (LS-D)
🏴 Flag for Quthing (LS-G)
🏴 Flag for Butha-Buthe (LS-B)
🏴 Flag for Akmenė (LT-01)
🏴 Flag for Ignalina (LT-09)
🏴 Flag for Mafeteng (LS-E)
🏴 Flag for Mokhotlong (LS-J)
🏴 Flag for Alytus (LT-03)
🏴 Flag for Biržai (LT-06)
🏴 Flag for Nana-Grébizi (CF-KB)
🏴 Flag for River Gee (LR-RG)
🏴 Flag for Utena (LT-54)
🏴 Flag for Molėtai (LT-27)
🏴 Flag for Šakiai (LT-41)
🏴 Flag for Kelmė (LT-19)
🏴 Flag for Kupiškis (LT-23)
🏴 Flag for Vilkaviškis (LT-56)
🏴 Flag for Neringa (LT-28)
🏴 Flag for Panevėžys (LT-33)
🏴 Flag for Pagėgiai (LT-29)
🏴 Flag for Šiaulių Municipality (LT-43)
🏴 Flag for Palanga (LT-31)
🏴 Flag for Kėdainiai (LT-18)
🏴 Flag for Rokiškis (LT-40)
🏴 Flag for Šilalė (LT-45)
🏴 Flag for Trakai (LT-52)
🏴 Flag for Pohnpei (FM-PNI)
🏴 Flag for Prienai (LT-36)
🏴 Flag for Telšiai (LT-51)
🏴 Flag for Klaipėda (LT-21)
🏴 Flag for Kazlų Rūda (LT-17)
🏴 Flag for Širvintos (LT-47)
🏴 Flag for Pakruojis (LT-30)
🏴 Flag for Šiauliai (LT-44)
🏴 Flag for Kretinga (LT-22)
🏴 Flag for Šilutė (LT-46)
🏴 Flag for Šalčininkai (LT-42)
🏴 Flag for Raseiniai (LT-38)
🏴 Flag for Varėna (LT-55)
🏴 Flag for Pasvalys (LT-34)
🏴 Flag for Plungė (LT-35)
🏴 Flag for Švenčionys (LT-49)
🏴 Flag for Radviliškis (LT-37)
🏴 Flag for Lazdijai (LT-24)
🏴 Flag for Tauragė (LT-50)
🏴 Flag for Skuodas (LT-48)
🏴 Flag for Ukmergė (LT-53)
🏴 Flag for Rietavas (LT-39)
🏴 Flag for Marijampolė (LT-25)
🏴 Flag for Mažeikiai (LT-26)
🏴 Flag for Baldone (LV-013)
🏴 Flag for Vilnius County (LT-VL)
🏴 Flag for Alsunga (LV-006)
🏴 Flag for Vilnius (LT-58)
🏴 Flag for Tauragė County (LT-TA)
🏴 Flag for Utena County (LT-UT)
🏴 Flag for Aizkraukle (LV-002)
🏴 Flag for Diekirch (LU-DI)
🏴 Flag for Marijampolė County (LT-MR)
👩🏽👨🏽👶🏽 Family - Woman: Medium Skin Tone, Man: Medium Skin Tone, Baby: Medium Skin Tone
🏴 Flag for Šiauliai County (LT-SA)
🏴 Flag for Echternach (LU-EC)
🏴 Flag for Redange (LU-RD)
🏴 Flag for Clervaux (LU-CL)
🏴 Flag for Visaginas (LT-59)
🏴 Flag for Ape (LV-009)
🏴 Flag for Amata (LV-008)
🏴 Flag for Alytus County (LT-AL)
🏴 Flag for Grevenmacher (LU-GR)
🏴 Flag for Aglona (LV-001)
🏴 Flag for Mersch (LU-ME)
🏴 Flag for Vianden (LU-VD)
🏴 Flag for Aloja (LV-005)
🏴 Flag for Mount Lebanon (LB-JL)
🏴 Flag for Kaunas County (LT-KU)
🏴 Flag for Zarasai (LT-60)
🏴 Flag for Wiltz (LU-WI)
🏴 Flag for Ādaži (LV-011)
🏴 Flag for Luxembourg (LU-LU)
🏴 Flag for Telšiai County (LT-TE)
🏴 Flag for Alūksne (LV-007)
🏴 Flag for Remich (LU-RM)
🏴 Flag for Aknīste (LV-004)
🏴 Flag for Esch-sur-Alzette (LU-ES)
🏴 Flag for Aizpute (LV-003)
🏴 Flag for Klaipėda County (LT-KL)
🏴 Flag for Dundaga (LV-027)
🏴 Flag for Jaunpils (LV-040)
🏴 Flag for Burtnieki (LV-019)
🏴 Flag for Balvi (LV-015)
🏴 Flag for Beverīna (LV-017)
🏴 Flag for Daugavpils Municipality (LV-025)
🏴 Flag for Cesvaine (LV-021)
🏴 Flag for Ilūkste (LV-036)
🏴 Flag for Kuldīga (LV-050)
🏴 Flag for Grobiņa (LV-032)
🏴 Flag for Gulbene (LV-033)
🏴 Flag for Kandava (LV-043)
🏴 Flag for Brocēni (LV-018)
🏴 Flag for Krimulda (LV-048)
🏴 Flag for Carnikava (LV-020)
🏴 Flag for Krustpils (LV-049)
👩🏾👨🏾👶🏾 Family - Woman: Medium-Dark Skin Tone, Man: Medium-Dark Skin Tone, Baby: Medium-Dark Skin Tone
🏴 Flag for Dobele (LV-026)
🏴 Flag for Kocēni (LV-045)
🏴 Flag for Garkalne (LV-031)
🏴 Flag for Ērgļi (LV-030)
🏴 Flag for Durbe (LV-028)
🏴 Flag for Krāslava (LV-047)
🏴 Flag for Dagda (LV-024)
🏴 Flag for Jaunjelgava (LV-038)
🏴 Flag for Bauska (LV-016)
🏴 Flag for Baltinava (LV-014)
🏴 Flag for Jēkabpils Municipality (LV-042)
🏴 Flag for Jaunpiebalga (LV-039)
🏴 Flag for Cēsis (LV-022)
🏴 Flag for Iecava (LV-034)
🏴 Flag for Ķegums (LV-051)
🏴 Flag for Ikšķile (LV-035)
🏴 Flag for Cibla (LV-023)
🏴 Flag for Kārsava (LV-044)
🏴 Flag for Engure (LV-029)
🏴 Flag for Līgatne (LV-055)
🏴 Flag for Nīca (LV-066)
🏴 Flag for Mālpils (LV-061)
🏴 Flag for Kvemo Kartli (GE-KK)
🏴 Flag for Pārgauja (LV-070)
🏴 Flag for Lielvārde (LV-053)
🏴 Flag for Pļaviņas (LV-072)
🏴 Flag for Pāvilosta (LV-071)
🏴 Flag for Madona (LV-059)
🏴 Flag for Rauna (LV-076)
🏴 Flag for Limbaži (LV-054)
🏴 Flag for Naukšēni (LV-064)
🏴 Flag for Ķekava (LV-052)
🏴 Flag for Salaspils (LV-087)
🏴 Flag for Mērsrags (LV-063)
🏴 Flag for Olaine (LV-068)
🏴 Flag for Roja (LV-079)
🏴 Flag for Rucava (LV-081)
🏴 Flag for Rugāji (LV-082)
🏴 Flag for Ogre (LV-067)
🏴 Flag for Rūjiena (LV-084)
🏴 Flag for Saulkrasti (LV-089)
🏴 Flag for Saldus (LV-088)
🏴 Flag for Rundāle (LV-083)
🏴 Flag for Nereta (LV-065)
🏴 Flag for Ozolnieki (LV-069)
🏴 Flag for Ropaži (LV-080)
🏴 Flag for Riebiņi (LV-078)
🏴 Flag for Līvāni (LV-056)
🏴 Flag for Priekuļi (LV-075)
🏴 Flag for Ludza (LV-058)
🏴 Flag for Sēja (LV-090)
🏴 Flag for Priekule (LV-074)
🏴 Flag for Lubāna (LV-057)
🏴 Flag for Salacgrīva (LV-086)
🏴 Flag for Mārupe (LV-062)
🏴 Flag for Preiļi (LV-073)
🏴 Flag for Viesīte (LV-107)
🏴 Flag for Smiltene (LV-094)
🏴 Flag for Kufra (LY-KF)
🏴 Flag for Daugavpils (LV-DGV)
🏴 Flag for Tukums (LV-099)
👩🏿👨🏿👶🏿 Family - Woman: Dark Skin Tone, Man: Dark Skin Tone, Baby: Dark Skin Tone
🏴 Flag for Liepāja (LV-LPX)
🏴 Flag for Valka (LV-101)
🏴 Flag for Vārkava (LV-103)
🏴 Flag for Murqub (LY-MB)
🏴 Flag for Ventspils (LV-VEN)
🏴 Flag for Jabal al Akhdar (LY-JA)
🏴 Flag for Jēkabpils (LV-JKB)
🏴 Flag for Sigulda (LV-091)
🏴 Flag for Jabal al Gharbi (LY-JG)
🏴 Flag for Ghat (LY-GT)
🏴 Flag for Stopiņi (LV-095)
🏴 Flag for Riga (LV-RIX)
🏴 Flag for Derna (LY-DR)
🏴 Flag for Vaiņode (LV-100)
🏴 Flag for Varakļāni (LV-102)
🏴 Flag for Jelgava (LV-JEL)
🏴 Flag for Skrīveri (LV-092)
🏴 Flag for Talsi (LV-097)
🏴 Flag for Valmiera (LV-VMR)
🏴 Flag for Benghazi (LY-BA)
🏴 Flag for Rēzekne (LV-REZ)
🏴 Flag for Skrunda (LV-093)
🏴 Flag for Zilupe (LV-110)
🏴 Flag for Strenči (LV-096)
🏴 Flag for Jufra (LY-JU)
🏴 Flag for Vecpiebalga (LV-104)
🏴 Flag for Vecumnieki (LV-105)
🏴 Flag for Viļaka (LV-108)
🏴 Flag for Jūrmala (LV-JUR)
🏴 Flag for Viļāni (LV-109)
🏴 Flag for Tērvete (LV-098)
🏴 Flag for Grand Casablanca (MA-08)
🏴 Flag for Marj (LY-MJ)
🏴 Flag for Al Wahat (LY-WA)
🏴 Flag for Monte Carlo (MC-MC)
🏴 Flag for Guelmim-Es Semara (MA-14)
🏴 Flag for Zawiya (LY-ZA)
🏴 Flag for Gharb-Chrarda-Béni Hssen (MA-02)
🏴 Flag for Marrakesh-Tensift-El Haouz (MA-11)
🏴 Flag for Doukkala-Abda (MA-10)
👩🏽👩🏽👦🏽 Family - Woman: Medium Skin Tone, Woman: Medium Skin Tone, Boy: Medium Skin Tone
🏴 Flag for Rabat-Salé-Zemmour-Zaer (MA-07)
🏴 Flag for Oued Ed-Dahab-Lagouira (MA-16)
🏴 Flag for Nalut (LY-NL)
🏴 Flag for Sabha (LY-SB)
🏴 Flag for Taza-Al Hoceima-Taounate (MA-03)
🏴 Flag for Jardin Exotique de Monaco (MC-JE)
🏴 Flag for Wadi al Shatii (LY-WS)
🏴 Flag for Larvotto (MC-LA)
🏴 Flag for Nuqat al Khams (LY-NQ)
🏴 Flag for Malbousquet (MC-MA)
🏴 Flag for Tadla-Azilal (MA-12)
🏴 Flag for La Condamine (MC-CO)
🏴 Flag for Monaco-Ville (MC-MO)
🏴 Flag for Chaouia-Ouardigha (MA-09)
🏴 Flag for Tangier-Tétouan (MA-01)
🏴 Flag for Moneghetti (MC-MG)
🏴 Flag for Murzuq (LY-MQ)
🏴 Flag for Meknès-Tafilalet (MA-06)
🏴 Flag for Fontvieille (MC-FO)
🏴 Flag for Wadi al Hayaa (LY-WD)
🏴 Flag for La Colle (MC-CL)
🏴 Flag for Sirte (LY-SR)
🏴 Flag for Misrata (LY-MI)
🏴 Flag for Fès-Boulemane (MA-05)
🏴 Flag for Tripoli (LY-TB)
🏴 Flag for La Gare (MC-GA)
👩🏾👩🏾👦🏾 Family - Woman: Medium-Dark Skin Tone, Woman: Medium-Dark Skin Tone, Boy: Medium-Dark Skin Tone
🏴 Flag for Edineț (MD-ED)
🏴 Flag for Hîncești (MD-HI)
🏴 Flag for Fălești (MD-FA)
🏴 Flag for Criuleni (MD-CR)
🏴 Flag for Sîngerei (MD-SI)
🏴 Flag for Soroca (MD-SO)
🏴 Flag for Cantemir (MD-CT)
🏴 Flag for Rezina (MD-RE)
🏴 Flag for Șoldănești (MD-SD)
🏴 Flag for Briceni (MD-BR)
🏴 Flag for Vallon de la Rousse (MC-VR)
🏴 Flag for Bălți (MD-BA)
🏴 Flag for Dubăsari (MD-DU)
🏴 Flag for Călărași (MD-CL)
🏴 Flag for Spélugues (MC-SP)
🏴 Flag for Cahul (MD-CA)
🏴 Flag for Ialoveni (MD-IA)
🏴 Flag for Orhei (MD-OR)
🏴 Flag for Drochia (MD-DR)
🏴 Flag for Gagauzia (MD-GA)
🏴 Flag for Cimișlia (MD-CM)
🏴 Flag for Ocnița (MD-OC)
🏴 Flag for Basarabeasca (MD-BS)
🏴 Flag for Strășeni (MD-ST)
🏴 Flag for Anenii Noi (MD-AN)
🏴 Flag for Moulins (MC-MU)
🏴 Flag for Bender (MD-BD)
🏴 Flag for Glodeni (MD-GL)
🏴 Flag for La Source (MC-SO)
🏴 Flag for Chișinău (MD-CU)
🏴 Flag for Dondușeni (MD-DO)
🏴 Flag for Florești (MD-FL)
🏴 Flag for Port Hercules (MC-PH)
🏴 Flag for Nisporeni (MD-NI)
🏴 Flag for Rîșcani (MD-RI)
🏴 Flag for Leova (MD-LE)
🏴 Flag for Ștefan Vodă (MD-SV)
🏴 Flag for Ungheni (MD-UN)
🏴 Flag for Toamasina (MG-A)
🏴 Flag for Antananarivo (MG-T)
🏴 Flag for Cetinje (ME-06)
🏴 Flag for Bogdanci (MK-05)
🏴 Flag for Ulcinj (ME-20)
🏴 Flag for Kolašin (ME-09)
🏴 Flag for Bosilovo (MK-07)
🏴 Flag for Pljevlja (ME-14)
🏴 Flag for Telenești (MD-TE)
🏴 Flag for Bogovinje (MK-06)
🏴 Flag for Žabljak (ME-21)
🏴 Flag for Herceg Novi (ME-08)
🏴 Flag for Petnjica (ME-23)
🏴 Flag for Rožaje (ME-17)
🏴 Flag for Budva (ME-05)
🏴 Flag for Bar (ME-02)
🏴 Flag for Berovo (MK-03)
🏴 Flag for Tivat (ME-19)
🏴 Flag for Plužine (ME-15)
🏴 Flag for Kotor (ME-10)
🏴 Flag for Ralik Chain (MH-L)
🏴 Flag for Danilovgrad (ME-07)
🏴 Flag for Plav (ME-13)
🏴 Flag for Bitola (MK-04)
🏴 Flag for Bijelo Polje (ME-04)
🏴 Flag for Andrijevica (ME-01)
👩🏿👩🏿👦🏿 Family - Woman: Dark Skin Tone, Woman: Dark Skin Tone, Boy: Dark Skin Tone
🏴 Flag for Nikšić (ME-12)
🏴 Flag for Taraclia (MD-TA)
🏴 Flag for Mojkovac (ME-11)
🏴 Flag for Mahajanga (MG-M)
🏴 Flag for Gusinje (ME-22)
🏴 Flag for Fianarantsoa (MG-F)
🏴 Flag for Šavnik (ME-18)
🏴 Flag for Podgorica (ME-16)
🏴 Flag for Toliara (MG-U)
🏴 Flag for Antsiranana (MG-D)
🏴 Flag for Kratovo (MK-43)
🏴 Flag for Kriva Palanka (MK-44)
🏴 Flag for Makedonski Brod (MK-52)
🏴 Flag for Jegunovce (MK-35)
🏴 Flag for Lozovo (MK-49)
🏴 Flag for Kumanovo (MK-47)
🏴 Flag for Vevčani (MK-12)
🏴 Flag for Demir Kapija (MK-24)
🏴 Flag for Vasilevo (MK-11)
🏴 Flag for Želino (MK-30)
🏴 Flag for Kavadarci (MK-36)
🏴 Flag for Zelenikovo (MK-32)
🏴 Flag for Konče (MK-41)
🏴 Flag for Vinica (MK-14)
🏴 Flag for Valandovo (MK-10)
🏴 Flag for Novaci (MK-55)
🏴 Flag for Novo Selo (MK-56)
🏴 Flag for Ilinden (MK-34)
🏴 Flag for Makedonska Kamenica (MK-51)
🏴 Flag for Vrapčište (MK-16)
🏴 Flag for Brvenica (MK-08)
🏴 Flag for Gradsko (MK-20)
🏴 Flag for Mavrovo and Rostuša (MK-50)
🏴 Flag for Debarca (MK-22)
🏴 Flag for Gostivar (MK-19)
🏴 Flag for Mogila (MK-53)
🏴 Flag for Lipkovo (MK-48)
🏴 Flag for Karbinci (MK-37)
🏴 Flag for Zrnovci (MK-33)
🏴 Flag for Negotino (MK-54)
🏴 Flag for Kičevo (MK-40)
🏴 Flag for Debar (MK-21)
🏴 Flag for Veles (MK-13)
🏴 Flag for Dojran (MK-26)
🏴 Flag for Gevgelija (MK-18)
🏴 Flag for Kočani (MK-42)
🏴 Flag for Krivogaštani (MK-45)
🏴 Flag for Delčevo (MK-23)
🏴 Flag for Kruševo (MK-46)
🏴 Flag for Čučer-Sandevo (MK-82)
🏴 Flag for Prilep (MK-62)
🏴 Flag for Centar Župa (MK-78)
🏴 Flag for Mandalay (MM-04)
🏴 Flag for Ségou (ML-4)
🏴 Flag for Petrovec (MK-59)
🏴 Flag for Češinovo-Obleševo (MK-81)
🏴 Flag for Kidal (ML-8)
🏴 Flag for Bago (MM-02)
🏴 Flag for Struga (MK-72)
🏴 Flag for Tearce (MK-75)
🏴 Flag for Studeničani (MK-74)
🏴 Flag for Ohrid (MK-58)
🏴 Flag for Sveti Nikole (MK-69)
🏴 Flag for Strumica (MK-73)
🏴 Flag for Sikasso (ML-3)
🏴 Flag for Kachin (MM-11)
🏴 Flag for Resen (MK-66)
🏴 Flag for Bamako (ML-BKO)
🏴 Flag for Magway (MM-03)
🏴 Flag for Sopište (MK-70)
🏴 Flag for Staro Nagoričane (MK-71)
🏴 Flag for Ayeyarwady (MM-07)
🏴 Flag for Gao (ML-7)
🏴 Flag for Mopti (ML-5)
🏴 Flag for Štip (MK-83)
🏴 Flag for Kayah (MM-12)
🏴 Flag for Tanintharyi (MM-05)
🏴 Flag for Koulikoro (ML-2)
🏴 Flag for Probištip (MK-63)
🏴 Flag for Pehčevo (MK-60)
🏴 Flag for Sagaing (MM-01)
🏴 Flag for Čaška (MK-80)
🏴 Flag for Rankovce (MK-65)
🏴 Flag for Yangon (MM-06)
🏴 Flag for Tetovo (MK-76)
🏴 Flag for Rosoman (MK-67)
🏴 Flag for Assaba (MR-03)
🏴 Flag for Shan (MM-17)
🏴 Flag for Rakhine (MM-16)
🏴 Flag for Khövsgöl (MN-041)
🏴 Flag for Bayan-Ölgii (MN-071)
🏴 Flag for Bayankhongor (MN-069)
🏴 Flag for Dornod (MN-061)
🏴 Flag for Selenge (MN-049)
🏴 Flag for Ulaanbaatar (MN-1)
🏴 Flag for Darkhan-Uul (MN-037)
🏴 Flag for Töv (MN-047)
🏴 Flag for Mon (MM-15)
🏴 Flag for Trarza (MR-06)
🏴 Flag for Sükhbaatar (MN-051)
🏴 Flag for Gorgol (MR-04)
🏴 Flag for Övörkhangai (MN-055)
🏴 Flag for Chin (MM-14)
🏴 Flag for Bulgan (MN-067)
🏴 Flag for Zavkhan (MN-057)
🏴 Flag for Dornogovi (MN-063)
🏴 Flag for Ömnögovi (MN-053)
🏴 Flag for Kayin (MM-13)
🏴 Flag for Govi-Altai (MN-065)
🏴 Flag for Tiris Zemmour (MR-11)
🏴 Flag for Dundgovi (MN-059)
🏴 Flag for Arkhangai (MN-073)
🏴 Flag for Tagant (MR-09)
🏴 Flag for Khovd (MN-043)
🏴 Flag for Uvs (MN-046)
🏴 Flag for Govisümber (MN-064)
🏴 Flag for Brakna (MR-05)
🏴 Flag for Dakhlet Nouadhibou (MR-08)
🏴 Flag for Hodh Ech Chargui (MR-01)
🏴 Flag for Orkhon (MN-035)
🏴 Flag for Hodh El Gharbi (MR-02)
🏴 Flag for Naypyidaw (MM-18)
🏴 Flag for Adrar (MR-07)
🏴 Flag for Inchiri (MR-12)
🏴 Flag for Iklin (MT-19)
🏴 Flag for Għarb (MT-14)
🏴 Flag for Mqabba (MT-33)
🏴 Flag for Kerċem (MT-22)
🏴 Flag for Għasri (MT-16)
🏴 Flag for Lija (MT-24)
🏴 Flag for Birżebbuġa (MT-05)
🏴 Flag for Birkirkara (MT-04)
🏴 Flag for Mġarr (MT-31)
🏴 Flag for Balzan (MT-02)
🏴 Flag for Munxar (MT-36)
🏴 Flag for Għajnsielem (MT-13)
🏴 Flag for Naxxar (MT-38)
🏴 Flag for Floriana (MT-09)
🏴 Flag for Marsa (MT-26)
🏴 Flag for Dingli (MT-07)
🏴 Flag for Gudja (MT-11)
🏴 Flag for Kirkop (MT-23)
🏴 Flag for Marsaskala (MT-27)
🏴 Flag for Paola (MT-39)
🏴 Flag for Fontana (MT-10)
🏴 Flag for Msida (MT-34)
🏴 Flag for Nadur (MT-37)
🏴 Flag for Mosta (MT-32)
🏴 Flag for Imtarfa (MT-35)
🏴 Flag for Cospicua (MT-06)
🏴 Flag for Birgu (MT-03)
🏴 Flag for Nouakchott Nord (MR-14)
🏴 Flag for Gżira (MT-12)
🏴 Flag for Mellieħa (MT-30)
🏴 Flag for Għaxaq (MT-17)
🏴 Flag for Ħamrun (MT-18)
🏴 Flag for Fgura (MT-08)
🏴 Flag for Attard (MT-01)
🏴 Flag for Għargħur (MT-15)
🏴 Flag for Kalkara (MT-21)
🏴 Flag for Nouakchott Sud (MR-15)
🏴 Flag for Marsaxlokk (MT-28)
🏴 Flag for Victoria (MT-45)
🏴 Flag for Qala (MT-42)
🏴 Flag for Żabbar (MT-64)
🏴 Flag for Agaléga (MU-AG)
🏴 Flag for Ta’ Xbiex (MT-58)
🏴 Flag for Pietà (MT-41)
🏴 Flag for Sannat (MT-52)
🏴 Flag for Port Louis District (MU-PL)
🏴 Flag for Xagħra (MT-61)
🏴 Flag for Rivière Noire (MU-BL)
🏴 Flag for Sliema (MT-56)
🏴 Flag for Safi (MT-47)
🏴 Flag for Flacq (MU-FL)
🏴 Flag for Pembroke (MT-40)
🏴 Flag for Swieqi (MT-57)
🏴 Flag for Curepipe (MU-CU)
🏴 Flag for Żurrieq (MT-68)
🏴 Flag for San Ġwann (MT-49)
🏴 Flag for Grand Port (MU-GP)
🏴 Flag for Cargados Carajos (MU-CC)
🏴 Flag for Qrendi (MT-44)
🏴 Flag for Valletta (MT-60)
🏴 Flag for Pamplemousses (MU-PA)
🏴 Flag for Qormi (MT-43)
🏴 Flag for Port Louis (MU-PU)
🏴 Flag for Tarxien (MT-59)
🏴 Flag for Żebbuġ Gozo (MT-65)
🏴 Flag for Saint Lawrence (MT-50)
🏴 Flag for Żejtun (MT-67)
🏴 Flag for St. Paul’s Bay (MT-51)
🏴 Flag for Santa Luċija (MT-53)
🏴 Flag for Żebbuġ (MT-66)
🏴 Flag for Rabat (MT-46)
🏴 Flag for Siġġiewi (MT-55)
👩🏽👩🏽👧🏽 Family - Woman: Medium Skin Tone, Woman: Medium Skin Tone, Girl: Medium Skin Tone
🏴 Flag for Santa Venera (MT-54)
🏴 Flag for Xgħajra (MT-63)
🏴 Flag for Moka (MU-MO)
🏴 Flag for Michoacán (MX-MIC)
🏴 Flag for Northern (MW-N)
🏴 Flag for Upper North Province (MV-UN)
🏴 Flag for Colima (MX-COL)
🏴 Flag for Rodrigues (MU-RO)
🏴 Flag for Guanajuato (MX-GUA)
🏴 Flag for Ciudad de Mexico (MX-CMX)
🏴 Flag for Puebla (MX-PUE)
🏴 Flag for Quatre Bornes (MU-QB)
🏴 Flag for Oaxaca (MX-OAX)
🏴 Flag for Central (MW-C)
🏴 Flag for Savanne (MU-SA)
🏴 Flag for Morelos (MX-MOR)
🏴 Flag for Hidalgo (MX-HID)
🏴 Flag for Aguascalientes (MX-AGU)
🏴 Flag for Campeche (MX-CAM)
🏴 Flag for Nuevo León (MX-NLE)
🏴 Flag for Malé (MV-MLE)
🏴 Flag for Guerrero (MX-GRO)
🏴 Flag for Vacoas-Phoenix (MU-VP)
👨🏻👨🏻👦🏻👧🏻 Family - Man: Light Skin Tone, Man: Light Skin Tone, Boy: Light Skin Tone, Girl: Light Skin Tone
🏴 Flag for North Central Province (MV-NC)
🏴 Flag for Mexico State (MX-MEX)
🏴 Flag for Plaines Wilhems (MU-PW)
🏴 Flag for Central Province (MV-CE)
🏴 Flag for Coahuila (MX-COA)
🏴 Flag for South Province (MV-SU)
🏴 Flag for Chiapas (MX-CHP)
🏴 Flag for Southern (MW-S)
🏴 Flag for Sofala (MZ-S)
🏴 Flag for Perlis (MY-09)
🏴 Flag for Veracruz (MX-VER)
🏴 Flag for Sarawak (MY-13)
🏴 Flag for Kelantan (MY-03)
🏴 Flag for Zambezi (NA-CA)
🏴 Flag for Manica (MZ-B)
🏴 Flag for Labuan (MY-15)
🏴 Flag for Cabo Delgado (MZ-P)
🏴 Flag for Hardap (NA-HA)
🏴 Flag for Tete (MZ-T)
🏴 Flag for Kedah (MY-02)
🏴 Flag for Pahang (MY-06)
🏴 Flag for Penang (MY-07)
🏴 Flag for Perak (MY-08)
🏴 Flag for Maputo Province (MZ-L)
🏴 Flag for Goiás (BR-GO)
🏴 Flag for Terengganu (MY-11)
🏴 Flag for Inhambane (MZ-I)
🏴 Flag for Malacca (MY-04)
🏴 Flag for Erongo (NA-ER)
🏴 Flag for Tlaxcala (MX-TLA)
🏴 Flag for Negeri Sembilan (MY-05)
🏴 Flag for Zacatecas (MX-ZAC)
🏴 Flag for Tamaulipas (MX-TAM)
🏴 Flag for Niassa (MZ-A)
🏴 Flag for Maputo (MZ-MPM)
🏴 Flag for Nampula (MZ-N)
🏴 Flag for Putrajaya (MY-16)
🏴 Flag for Sinaloa (MX-SIN)
🏴 Flag for Yucatán (MX-YUC)
🏴 Flag for Sabah (MY-12)
👩🏼👩🏼👧🏼👧🏼 Family - Woman: Medium-Light Skin Tone, Woman: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone, Girl: Medium-Light Skin Tone
🏴 Flag for Zambezia (MZ-Q)
🏴 Flag for Querétaro (MX-QUE)
🏴 Flag for Gaza (MZ-G)
🏴 Flag for Otjozondjupa (NA-OD)
🏴 Flag for Maradi (NE-4)
🏴 Flag for Kunene (NA-KU)
🏴 Flag for Akwa Ibom (NG-AK)
🏴 Flag for Tahoua (NE-5)
🏴 Flag for Rivière du Rempart (MU-RR)
🏴 Flag for Imo (NG-IM)
🏴 Flag for Katsina (NG-KT)
🏴 Flag for Dosso (NE-3)
🏴 Flag for Tillabéri (NE-6)
🏴 Flag for Ekiti (NG-EK)
🏴 Flag for Omaheke (NA-OH)
🏴 Flag for Bauchi (NG-BA)
🏴 Flag for Karas (NA-KA)
🏴 Flag for Bayelsa (NG-BY)
🏴 Flag for Ohangwena (NA-OW)
🏴 Flag for Benue (NG-BE)
🏴 Flag for Enugu (NG-EN)
🏴 Flag for Oshana (NA-ON)
🏴 Flag for Kaduna (NG-KD)
👨🏻👶🏻👦🏻 Family - Man: Light Skin Tone, Baby: Light Skin Tone, Boy: Light Skin Tone
🏴 Flag for Kebbi (NG-KE)
🏴 Flag for Jigawa (NG-JI)
🏴 Flag for Niamey (NE-8)
🏴 Flag for Anambra (NG-AN)
🏴 Flag for Gombe (NG-GO)
🏴 Flag for Agadez (NE-1)
🏴 Flag for Khomas (NA-KH)
🏴 Flag for Diffa (NE-2)
🏴 Flag for Johor (MY-01)
🏴 Flag for Kano (NG-KN)
🏴 Flag for Omusati (NA-OS)
🏴 Flag for Kogi (NG-KO)
🏴 Flag for Edo (NG-ED)
🏴 Flag for Abia (NG-AB)
🏴 Flag for Oshikoto (NA-OT)
🏴 Flag for Kavango West (NA-KW)
🏴 Flag for Ebonyi (NG-EB)
🏴 Flag for Zinder (NE-7)
🏴 Flag for Jinotega (NI-JI)
🏴 Flag for Nasarawa (NG-NA)
🏴 Flag for Friesland (NL-FR)
🏴 Flag for Sokoto (NG-SO)
🏴 Flag for Rivas (NI-RI)
🏴 Flag for Nueva Segovia (NI-NS)
🏴 Flag for Plateau (NG-PL)
🏴 Flag for Yobe (NG-YO)
🏴 Flag for Bonaire (NL-BQ1)
🏴 Flag for Atlántico Norte (NI-AN)
🏴 Flag for Zamfara (NG-ZA)
🏴 Flag for Gelderland (NL-GE)
🏴 Flag for Oyo (NG-OY)
🏴 Flag for Madriz (NI-MD)
🏴 Flag for Chinandega (NI-CI)
🏴 Flag for Ondo (NG-ON)
👨🏽👨🏽👦🏽👧🏽 Family - Man: Medium Skin Tone, Man: Medium Skin Tone, Boy: Medium Skin Tone, Girl: Medium Skin Tone
🏴 Flag for North Rhine-Westphalia (DE-NW)
🏴 Flag for Lagos (NG-LA)
🏴 Flag for Managua (NI-MN)
🏴 Flag for Atlántico Sur (NI-AS)
🏴 Flag for Curaçao (NL-CW)
🏴 Flag for Boaco (NI-BO)
🏴 Flag for Rivers (NG-RI)
🏴 Flag for Granada (NI-GR)
🏴 Flag for Chontales (NI-CO)
🏴 Flag for Groningen (NL-GR)
🏴 Flag for Sint Eustatius (NL-BQ3)
🏴 Flag for Río San Juan (NI-SJ)
🏴 Flag for Osun (NG-OS)
🏴 Flag for Taraba (NG-TA)
🏴 Flag for Flevoland (NL-FL)
🏴 Flag for Matagalpa (NI-MT)
🏴 Flag for Drenthe (NL-DR)
🏴 Flag for Carazo (NI-CA)
🏴 Flag for Kwara (NG-KW)
🏴 Flag for Niger (NG-NI)
🏴 Flag for Estelí (NI-ES)
🏴 Flag for South Holland (NL-ZH)
"""
for line in emojis.splitlines():
words = line.split()
char = words[0]
desc = " ".join(words[1:])
print("{}\t:{}".format(desc, char))
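
# A quick sanity check of the splitting logic above, on one hypothetical sample
# line (the real flag emoji are tag sequences, which contain no whitespace, so
# split() isolates them the same way):
sample = "🏴 Flag for Bar (ME-02)"
sample_words = sample.split()
assert sample_words[0] == "🏴"
assert " ".join(sample_words[1:]) == "Flag for Bar (ME-02)"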
# =============================================================================
# openquake.hazardlib/openquake/hazardlib/tests/gsim/campbell_2003_test.py
# from rainzhop/ConvNetQuake (MIT)
# =============================================================================
# -*- coding: utf-8 -*-
# vim: tabstop=4 shiftwidth=4 softtabstop=4
#
# Copyright (C) 2012-2016 GEM Foundation
#
# OpenQuake is free software: you can redistribute it and/or modify it
# under the terms of the GNU Affero General Public License as published
# by the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# OpenQuake is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with OpenQuake. If not, see <http://www.gnu.org/licenses/>.
from openquake.hazardlib.gsim.campbell_2003 import (
Campbell2003,
Campbell2003SHARE,
Campbell2003MblgAB1987NSHMP2008,
Campbell2003MblgJ1996NSHMP2008,
Campbell2003MwNSHMP2008
)
from openquake.hazardlib.tests.gsim.utils import BaseGSIMTestCase
import numpy
# Test data generated from OpenSHA implementation.
class Campbell2003TestCase(BaseGSIMTestCase):
GSIM_CLASS = Campbell2003
def test_mean(self):
self.check('C03/C03_MEAN.csv',
max_discrep_percentage=0.1)
def test_std_total(self):
self.check('C03/C03_STD_TOTAL.csv',
max_discrep_percentage=0.1)
class Campbell2003SHARETestCase(BaseGSIMTestCase):
GSIM_CLASS = Campbell2003SHARE
def test_mean(self):
self.check('C03/C03SHARE_MEAN.csv',
max_discrep_percentage=0.1)
def test_std_total(self):
self.check('C03/C03SHARE_STD_TOTAL.csv',
max_discrep_percentage=0.1)
class Campbell2003MblgAB1987NSHMP2008TestCase(BaseGSIMTestCase):
GSIM_CLASS = Campbell2003MblgAB1987NSHMP2008
# test data generated from ``subroutine getCampCEUS`` in ``hazgridXnga2.f``
def test_mean(self):
self.check('C03/C03MblgAB1987NSHMP2008_MEAN.csv',
max_discrep_percentage=0.1)
def test_std_total(self):
self.check('C03/C03MblgAB1987NSHMP2008_STD_TOTAL.csv',
max_discrep_percentage=0.1)
class Campbell2003MblgJ1996NSHMP2008TestCase(BaseGSIMTestCase):
GSIM_CLASS = Campbell2003MblgJ1996NSHMP2008
# test data generated from ``subroutine getCampCEUS`` in ``hazgridXnga2.f``
def test_mean(self):
self.check('C03/C03MblgJ1996NSHMP2008_MEAN.csv',
max_discrep_percentage=0.1)
def test_std_total(self):
self.check('C03/C03MblgJ1996NSHMP2008_STD_TOTAL.csv',
max_discrep_percentage=0.1)
class Campbell2003MwNSHMP2008TestCase(BaseGSIMTestCase):
GSIM_CLASS = Campbell2003MwNSHMP2008
# test data generated from ``subroutine getCampCEUS`` in ``hazgridXnga2.f``
def test_mean(self):
self.check('C03/C03MwNSHMP2008_MEAN.csv',
max_discrep_percentage=0.1)
def test_std_total(self):
self.check('C03/C03MwNSHMP2008_STD_TOTAL.csv',
max_discrep_percentage=0.1)
# =============================================================================
# SimpleCV/MachineLearning/query_imgs/get_imgs_geo_gps_search.py
# from nikhilgk/SimpleCV (BSD-3-Clause)
# =============================================================================
#!/usr/bin/python
#
# So this script is in a bit of a hack state right now.
# This script reads
#
#
#
# Graciously copied and modified from:
# http://graphics.cs.cmu.edu/projects/im2gps/flickr_code.html
#Image querying script written by Tamara Berg,
#and extended heavily by James Hays
#9/26/2007 added dynamic timeslices to query more efficiently.
#8/18/2008 added new fields and set maximum time slice.
#8/19/2008 this is a much simpler function which gets ALL geotagged photos of
# sufficient accuracy. No queries, no negative constraints.
# divides up the query results into multiple files
# 1/5/2009
# now uses date_taken instead of date_upload to get more diverse blocks of images
# 1/13/2009 - uses the original im2gps keywords, not as negative constraints though
import sys, string, math, time, socket
from flickrapi2 import FlickrAPI
from datetime import datetime
import pycurl
import os
import shutil
socket.setdefaulttimeout(30) #30 second time out on sockets before they throw
#an exception. I've been having trouble with urllib.urlopen hanging in the
#flickr API. This will show up as exceptions.IOError.
#the time out needs to be pretty long, it seems, because the flickr servers can be slow
#to respond to our big searches.
#returns a query and the search times to attempt to get a desired number of photos
#this needs serious refactoring -KAS
def DoSearch(fapi,query_string,desired_photos):
# number of seconds to skip per query
#timeskip = 62899200 #two years
#timeskip = 604800 #one week
timeskip = 172800 #two days
#timeskip = 86400 #one day
#timeskip = 3600 #one hour
#timeskip = 2257 #for resuming previous query
#mintime = 1121832000 #from im2gps
#mintime = 1167407788 # resume crash england
#mintime = 1177828976 #resume crash japan
#mintime = 1187753798 #resume crash greece
mintime = 1171416400 #resume crash WashingtonDC
maxtime = mintime+timeskip
endtime = 1192165200 #10/12/2007, at the end of im2gps queries
print datetime.fromtimestamp(mintime)
print datetime.fromtimestamp(endtime)
while (maxtime < endtime):
#new approach - adjust maxtime until we get the desired number of images
        #within a block. We'll need to keep upper and lower bounds for it.
        #The lower bound is well defined (mintime), but the upper bound is not. We can't
#search all the way from endtime.
lower_bound = mintime + 900 #lower bound OF the upper time limit. must be at least 15 minutes or zero results
upper_bound = mintime + timeskip * 20 #upper bound of the upper time limit
maxtime = .95 * lower_bound + .05 * upper_bound
print '\nBinary search on time range upper bound'
print 'Lower bound is ' + str(datetime.fromtimestamp(lower_bound))
print 'Upper bound is ' + str(datetime.fromtimestamp(upper_bound))
keep_going = 6 #search stops after a fixed number of iterations
while( keep_going > 0 and maxtime < endtime):
try:
rsp = fapi.photos_search(api_key=flickrAPIKey,
ispublic="1",
media="photos",
per_page="250",
page="1",
has_geo = "0", #bbox="-180, -90, 180, 90",
text=query_string,
accuracy="6", #6 is region level.
min_upload_date=str(mintime),
max_upload_date=str(maxtime))
#we want to catch these failures somehow and keep going.
time.sleep(1)
fapi.testFailure(rsp)
total_images = rsp.photos[0]['total'];
null_test = int(total_images); #want to make sure this won't crash later on for some reason
null_test = float(total_images);
print '\nnumimgs: ' + total_images
print 'mintime: ' + str(mintime) + ' maxtime: ' + str(maxtime) + ' timeskip: ' + str(maxtime - mintime)
if( int(total_images) > desired_photos ):
print 'too many photos in block, reducing maxtime'
upper_bound = maxtime
maxtime = (lower_bound + maxtime) / 2 #midpoint between current value and lower bound.
if( int(total_images) < desired_photos):
print 'too few photos in block, increasing maxtime'
lower_bound = maxtime
maxtime = (upper_bound + maxtime) / 2
print 'Lower bound is ' + str(datetime.fromtimestamp(lower_bound))
print 'Upper bound is ' + str(datetime.fromtimestamp(upper_bound))
if( int(total_images) > 0): #only if we're not in a degenerate case
keep_going = keep_going - 1
else:
upper_bound = upper_bound + timeskip;
except KeyboardInterrupt:
print('Keyboard exception while querying for images, exiting\n')
raise
except:
print sys.exc_info()[0]
#print type(inst) # the exception instance
#print inst.args # arguments stored in .args
#print inst # __str__ allows args to printed directly
print ('Exception encountered while querying for images\n')
#end of while binary search
print 'finished binary search'
return([mintime,maxtime,total_images,rsp])
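
#The adjustment rule inside the loop above, restated as a standalone helper
#for clarity (illustrative only -- DoSearch interleaves it with the API calls;
#returns updated (lower_bound, upper_bound, maxtime)):
def adjust_upper_bound(lower_bound, upper_bound, maxtime, total_images, desired_photos):
    if total_images > desired_photos: #too many photos, pull maxtime toward the lower bound
        return (lower_bound, maxtime, (lower_bound + maxtime) / 2)
    if total_images < desired_photos: #too few photos, push maxtime toward the upper bound
        return (maxtime, upper_bound, (upper_bound + maxtime) / 2)
    return (lower_bound, upper_bound, maxtime) #exactly on target, leave unchanged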
###########################################################################
# Modify this section to reflect your data and specific search
###########################################################################
# flickr auth information:
# change these to your flickr api keys and secret
flickrAPIKey = "fa33550d413b36b3fddc473a931a3b3b" # API key
flickrSecret = "7fd481bff0916055" # shared "secret"
rootpath = "../data/" #where do you want the data
desired_photos = 1000 #how many photos do you want to try and get
query_file_name = 'query.dat' #The file to get the queries from
#query_file_name = 'place_rec_queries_fall08.txt'
query_file = open(query_file_name, 'r')
#aggregate all of the positive and negative queries together.
pos_queries = [] #an empty list
neg_queries = '' #a string
num_queries = 0
for line in query_file:
if line[0] != '#' and len(line) > 1: #line end character is 2 long?
print line[0:len(line)-1]
if line[0] != '-':
pos_queries = pos_queries + [line[0:len(line)-1]]
num_queries = num_queries + 1
if line[0] == '-':
neg_queries = neg_queries + ' ' + line[0:len(line)-1]
query_file.close()
print 'positive queries: '
print pos_queries
print 'negative queries: ' + neg_queries
print 'num_queries = ' + str(num_queries)
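
#Quick sanity check of the parsing rules above on a hypothetical two-line
#query file (the slice just strips the trailing newline from each line):
demo_pos = []
demo_neg = ''
for demo_line in ['WashingtonDC\n', '-beach\n']:
    if demo_line[0] != '#' and len(demo_line) > 1:
        if demo_line[0] != '-':
            demo_pos = demo_pos + [demo_line[0:len(demo_line)-1]]
        if demo_line[0] == '-':
            demo_neg = demo_neg + ' ' + demo_line[0:len(demo_line)-1]
assert demo_pos == ['WashingtonDC']
assert demo_neg == ' -beach'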
#this is the desired number of photos in each block
# make a new FlickrAPI instance
fapi = FlickrAPI(flickrAPIKey, flickrSecret)
for current_tag in range(0, num_queries):
print('TOP OF LOOP')
# change this to the location where you want to put your output file
try:
stats = os.stat(rootpath)
except OSError:
os.mkdir(rootpath)
outpath = rootpath+pos_queries[current_tag]+'/'
try:
os.mkdir(outpath)
except OSError:
shutil.rmtree(outpath,True)
os.mkdir(outpath)
out_file = open(rootpath + pos_queries[current_tag] + '.txt','w')
###########################################################################
#form the query string.
query_string = pos_queries[current_tag] + ' ' + neg_queries
print '\n\nquery_string is ' + query_string
total_images_queried = 0;
[mintime,maxtime,total_images,rsp] = DoSearch(fapi,query_string,desired_photos)
    print('GETTING TOTAL IMAGES:'+str(total_images))
s = '\nmintime: ' + str(mintime) + ' maxtime: ' + str(maxtime)
print s
out_file.write(s + '\n')
i = getattr(rsp,'photos',None)
if i:
s = 'numimgs: ' + total_images
print s
out_file.write(s + '\n')
current_image_num = 1;
num = 4 # CHANGE THIS BACK int(rsp.photos[0]['pages'])
s = 'total pages: ' + str(num)
print s
out_file.write(s + '\n')
#only visit 16 pages max, to try and avoid the dreaded duplicate bug
#16 pages = 4000 images, should be duplicate safe. Most interesting pictures will be taken.
num_visit_pages = min(16,num)
s = 'visiting only ' + str(num_visit_pages) + ' pages ( up to ' + str(num_visit_pages * 250) + ' images)'
print s
out_file.write(s + '\n')
total_images_queried = total_images_queried + min((num_visit_pages * 250), int(total_images))
#print 'stopping before page ' + str(int(math.ceil(num/3) + 1)) + '\n'
pagenum = 1;
counter = -1
while( pagenum <= num_visit_pages ):
#for pagenum in range(1, num_visit_pages + 1): #page one is searched twice
print ' page number ' + str(pagenum)
try:
print("PAGE")
print(pagenum)
# WARNING THIS QUERY HAS TO MATCH THE SEARCH QUERY!!!!
rsp = fapi.photos_search(api_key=flickrAPIKey,
ispublic="1",
media="photos",
per_page="250",
page=str(pagenum),
has_geo = "0",
text=query_string,
#extras = "tags, original_format, license, geo, date_taken, date_upload, o_dims, views",
#accuracy="6", #6 is region level.
min_upload_date=str(1121832000),#mintime),
max_upload_date=str(1192165200))#maxtime))
#rsp = fapi.photos_search(api_key=flickrAPIKey,
# ispublic="1",
# media="photos",
# per_page="250",
# page='0', #str(pagenum),
# sort="interestingness-desc",
# has_geo = "0", #bbox="-180, -90, 180, 90",
# text=query_string,
# #accuracy="6", #6 is region level. most things seem 10 or better.
# extras = "tags, original_format, license, geo, date_taken, date_upload, o_dims, views",
# min_upload_date=str(mintime),
# max_upload_date=str(maxtime))
##min_taken_date=str(datetime.fromtimestamp(mintime)),
##max_taken_date=str(datetime.fromtimestamp(maxtime)))
time.sleep(1)
fapi.testFailure(rsp)
except KeyboardInterrupt:
print('Keyboard exception while querying for images, exiting\n')
raise
except:
print sys.exc_info()[0]
#print type(inst) # the exception instance
#print inst.args # arguments stored in .args
#print inst # __str__ allows args to printed directly
print ('Exception encountered while querying for images\n')
else:
print('got a response')
# and print them
k = getattr(rsp,'photos',None)
if k:
print('In K')
m = getattr(rsp.photos[0],'photo',None)
if m:
print('In M')
for b in rsp.photos[0].photo:
print('In b')
if b!=None:
counter = counter + 1
##print(http://farm{farm-id}.static.flickr.com/{server-id}/{id}_{secret}.jpg)
myurl = 'http://farm'+b['farm']+".static.flickr.com/"+b['server']+"/"+b['id']+"_"+b['secret']+'.jpg'
fname = outpath+pos_queries[current_tag]+str(counter)+'.jpg' #b['id']+"_"+b['secret']+'.jpg'
print(myurl)
print(fname)
mycurl = pycurl.Curl()
mycurl.setopt(pycurl.URL, str(myurl))
myfile = open(fname,"wb")
mycurl.setopt(pycurl.WRITEDATA, myfile)
mycurl.setopt(pycurl.FOLLOWLOCATION, 1)
mycurl.setopt(pycurl.MAXREDIRS, 5)
mycurl.setopt(pycurl.NOSIGNAL, 1)
mycurl.perform()
mycurl.close()
myfile.close()
out_file.write('URL: '+myurl+'\n')
out_file.write('File: '+ fname+'\n')
out_file.write('photo: ' + b['id'] + ' ' + b['secret'] + ' ' + b['server'] + '\n')
out_file.write('owner: ' + b['owner'] + '\n')
out_file.write('title: ' + b['title'].encode("ascii","replace") + '\n')
out_file.write('originalsecret: ' + b['originalsecret'] + '\n')
out_file.write('originalformat: ' + b['originalformat'] + '\n')
out_file.write('o_height: ' + b['o_height'] + '\n')
out_file.write('o_width: ' + b['o_width'] + '\n')
out_file.write('datetaken: ' + b['datetaken'].encode("ascii","replace") + '\n')
out_file.write('dateupload: ' + b['dateupload'].encode("ascii","replace") + '\n')
out_file.write('tags: ' + b['tags'].encode("ascii","replace") + '\n')
out_file.write('license: ' + b['license'].encode("ascii","replace") + '\n')
out_file.write('latitude: ' + b['latitude'].encode("ascii","replace") + '\n')
out_file.write('longitude: ' + b['longitude'].encode("ascii","replace") + '\n')
out_file.write('accuracy: ' + b['accuracy'].encode("ascii","replace") + '\n')
out_file.write('views: ' + b['views'] + '\n')
out_file.write('interestingness: ' + str(current_image_num) + ' out of ' + str(total_images) + '\n');
out_file.write('\n')
current_image_num = current_image_num + 1;
print('')
                pagenum = pagenum + 1; #this is in the else exception block. It won't increment for a failure.
#this block is indented such that it will only run if there are no exceptions
#in the original query. That means if there are exceptions, mintime won't be incremented
#and it will try again
timeskip = maxtime - mintime #used for initializing next binary search
mintime = maxtime
out_file.write('Total images queried: ' + str(total_images_queried) + '\n')
    out_file.close()
# =============================================================================
# safemasks/resources/rest/router.py
# from Safemasks/safemasks-app (BSD-3-Clause)
# =============================================================================
"""
"""
from rest_framework import routers
from safemasks.resources.rest.serializers import SupplierViewSet, TrustedSupplierViewSet
# Routers provide an easy way of automatically determining the URL conf.
ROUTER = routers.DefaultRouter()
ROUTER.register(r"suppliers", SupplierViewSet, "suppliers")
ROUTER.register(r"suppliers-trusted", TrustedSupplierViewSet, "suppliers-trusted")
# =============================================================================
# waterApp/migrations/0011_auto_20210911_1043.py
# from csisarep/groundwater_dashboard (MIT)
# =============================================================================
# Generated by Django 2.2 on 2021-09-11 04:58
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('waterApp', '0010_auto_20210911_1041'),
]
operations = [
migrations.AlterField(
model_name='gwmonitoring',
name='id',
field=models.BigAutoField(primary_key=True, serialize=False),
),
]
# =============================================================================
# src/exporter/management/commands/test_export.py
# from xmdy/h9eNi8F5Ut (Unlicense)
# =============================================================================
from django.core.management import BaseCommand
import logging
# These two lines enable debugging at httplib level (requests->urllib3->http.client)
# You will see the REQUEST, including HEADERS and DATA, and RESPONSE with HEADERS but without DATA.
# The only thing missing will be the response.body which is not logged.
try:
import http.client as http_client
except ImportError:
# Python 2
import httplib as http_client
http_client.HTTPConnection.debuglevel = 1
# You must initialize logging, otherwise you'll not see debug output.
logging.basicConfig()
logging.getLogger().setLevel(logging.DEBUG)
requests_log = logging.getLogger("requests.packages.urllib3")
requests_log.setLevel(logging.DEBUG)
requests_log.propagate = True
class Command(BaseCommand):
def handle(self, *args, **options):
from exporter.tasks import GenerateModelExportTask
gmet = GenerateModelExportTask()
        gmet.run(1)

# =============================================================================
# tools/generate_serialization_header.py
# from StableCoder/vulkan-mini-libs-2 (Apache-2.0)
# =============================================================================
#!/usr/bin/env python3
import sys
import getopt
import xml.etree.ElementTree as ET
def processVendors(outFile, vendors):
    outFile.writelines(["\nconstexpr std::array<std::string_view, ", str(
        len(vendors)), "> vendors = {{\n"])
    for vendor in vendors:
        outFile.writelines([' \"', vendor.tag, '\",\n'])
    outFile.write('}};\n')
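A minimal sketch of what `processVendors` emits, mirroring the function above with an in-memory buffer standing in for the output file (the two-vendor input is a made-up example):

```python
import io
import xml.etree.ElementTree as ET


# Mirror of processVendors above: writes a C++ constexpr array of
# vendor-tag string views into any file-like object.
def process_vendors(out, vendors):
    out.writelines(["\nconstexpr std::array<std::string_view, ",
                    str(len(vendors)), "> vendors = {{\n"])
    for vendor in vendors:
        out.writelines([' "', vendor.tag, '",\n'])
    out.write('}};\n')


buf = io.StringIO()
process_vendors(buf, [ET.Element('KHR'), ET.Element('EXT')])
# buf.getvalue() now holds the C++ array declaration for the two tags.
```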
def processEnumValue(outFile, enum, value):
    if not value.get('value') is None:
        # Spitting out plain values
        outFile.write(value.get('value'))
    elif not value.get('bitpos') is None:
        # Bitflag
        outFile.writelines(
            ['0x', format(1 << int(value.get('bitpos')), '08X')])
    elif not value.get('alias') is None:
        processEnumValue(outFile, enum, enum.find(value.get('alias')))
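A standalone sketch of the 'bitpos' branch above: a bit position from the XML becomes a zero-padded hexadecimal flag literal (`bitpos_to_hex` is a helper name introduced here for illustration only):

```python
# Mirrors the writelines call in processEnumValue:
# ['0x', format(1 << int(bitpos), '08X')]
def bitpos_to_hex(bitpos):
    return '0x' + format(1 << int(bitpos), '08X')

# bitpos_to_hex(3) -> '0x00000008'
```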
def processEnums(outFile, enums, vendors, first, last):
    for enum in enums:
        # Skip VkResult
        if enum.tag == 'VkResult':
            continue
        # Skip if there's no values, MSVC can't do zero-sized arrays
        if len(enum.findall('./')) == 0:
            continue

        outFile.writelines(
            ['\nconstexpr EnumValueSet ', enum.tag, 'Sets[] = {\n'])

        # Determine how much to chop off the front
        strName = enum.tag
        typeDigit = ''
        # Determine if type ends with vendor tag
        vendorName = ''
        for vendor in vendors:
            if strName.endswith(vendor.tag):
                vendorName = vendor.tag
                strName = strName[:-len(vendorName)]

        if strName[-1].isdigit():
            typeDigit = strName[-1]
            strName = strName[:-1]

        if strName.endswith('FlagBits'):
            strName = strName[:-8]

        # Construct most likely enum prefix
        mainPrefix = ''
        for char in strName:
            if mainPrefix == '':
                mainPrefix += char
            elif char.isupper():
                mainPrefix += '_'
                mainPrefix += char.upper()
            else:
                mainPrefix += char.upper()
        mainPrefix += '_'
        if typeDigit != '':
            mainPrefix += typeDigit
            mainPrefix += '_'

        current = first
        while current <= last:
            for value in enum.findall('./'):
                if int(value.get('first')) != current:
                    continue

                outFile.write(" {\"")
                valueStr = value.tag
                if valueStr.startswith(mainPrefix):
                    valueStr = valueStr[len(mainPrefix):]
                if vendorName != '' and valueStr.endswith(vendorName):
                    valueStr = valueStr[:-len(vendorName)-1]
                if valueStr.endswith('_BIT'):
                    valueStr = valueStr[:-4]

                outFile.write(valueStr)
                outFile.write("\", ")
                processEnumValue(outFile, enum, value)
                outFile.write("},\n")
            current += 1

        outFile.write('};\n')
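A standalone sketch of the prefix construction inside `processEnums`: a CamelCase Vulkan type name is split at capital letters and upper-cased to recover the common prefix of its enumerator names (`enum_prefix` is an illustrative helper name, not part of the script):

```python
# Mirrors the mainPrefix loop in processEnums, for a name that has
# already had any vendor tag / digit / 'FlagBits' suffix stripped.
def enum_prefix(str_name):
    main_prefix = ''
    for char in str_name:
        if main_prefix == '':
            main_prefix += char
        elif char.isupper():
            main_prefix += '_' + char.upper()
        else:
            main_prefix += char.upper()
    return main_prefix + '_'

# enum_prefix('VkImageLayout') -> 'VK_IMAGE_LAYOUT_'
```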
def main(argv):
    inputFile = ''
    outputFile = ''

    try:
        opts, args = getopt.getopt(argv, 'i:o:', [])
    except getopt.GetoptError:
        print('Error parsing options')
        sys.exit(1)
    for opt, arg in opts:
        if opt == '-i':
            inputFile = arg
        elif opt == '-o':
            outputFile = arg

    if(inputFile == ''):
        print("Error: No Vulkan XML file specified")
        sys.exit(1)
    if(outputFile == ''):
        print("Error: No output file specified")
        sys.exit(1)

    try:
        dataXml = ET.parse(inputFile)
        dataRoot = dataXml.getroot()
    except:
        print("Error: Could not open input file: ", inputFile)
        sys.exit(1)

    firstVersion = int(dataRoot.get('first'))
    lastVersion = int(dataRoot.get('last'))

    outFile = open(outputFile, "w")

    # Common Header
    with open("common_header.txt") as fd:
        outFile.write(fd.read())
        outFile.write('\n')

    outFile.write("""#ifndef VK_VALUE_SERIALIZATION_HPP
#define VK_VALUE_SERIALIZATION_HPP
/* USAGE:
To use, include this header where the declarations for the boolean checks are required.
On *ONE* compilation unit, include the definition of `#define VK_VALUE_SERIALIZATION_CONFIG_MAIN`
so that the definitions are compiled somewhere following the one definition rule.
*/
#include <vulkan/vulkan.h>
#include <string>
#include <string_view>
""")
    # Static Asserts
    outFile.writelines(["\nstatic_assert(VK_HEADER_VERSION >= ", str(
        firstVersion), ", \"VK_HEADER_VERSION is from before the supported range.\");\n"])
    outFile.writelines(["static_assert(VK_HEADER_VERSION <= ", str(
        lastVersion), ", \"VK_HEADER_VERSION is from after the supported range.\");\n"])

    # Function Declarations
    outFile.write("""
/**
* @brief Macro that automatically stringifies the given Vulkan type for serialization
* @param VKTYPE Actual Vulkan type
* @param VALUE Value to be serialized
* @param STRPTR Pointer to the string to store the serialization in. Only modified if true is
* returned.
* @return True if serialization was successful. False otherwise.
*/
#define VK_SERIALIZE(VKTYPE, VALUE, STRPTR) vk_serialize<VKTYPE>(#VKTYPE, VALUE, STRPTR)
/**
* @brief Macro that automatically stringifies the given Vulkan type for parsing
* @param VKTYPE Actual Vulkan type
* @param STRING String to be parsed
* @param VALPTR Pointer to the value to store the parsed value in. Only modified if true is
* returned.
* @return True if parsing was successful. False otherwise.
*/
#define VK_PARSE(VKTYPE, STRING, VALPTR) vk_parse<VKTYPE>(#VKTYPE, STRING, VALPTR)
/**
* @brief Serializes a Vulkan enumerator/flag type (32-bit)
* @param vkType Name of the Vulkan enumerator/flag type
* @param vkValue Value being serialized
* @param pString Pointer to a string that will be modified with the serialized value. Only modified
* if true is returned.
* @return True if the value was successfully serialized. False otherwise.
*/
bool vk_serialize(std::string_view vkType, uint32_t vkValue, std::string *pString);
/**
* @brief Parses a Vulkan enumerator/flag serialized string (32-bit)
* @param vkType Name of the Vulkan enumerator/flag type
* @param vkString String being parsed
* @param pValue Pointer to a value that will be modified with the parsed value. Only modified if
* true is returned.
* @return True if the value was successfully parsed. False otherwise.
*/
bool vk_parse(std::string_view vkType, std::string vkString, uint32_t *pValue);
/**
* @brief Serializes a Vulkan enumerator/flag type (64-bit)
* @param vkType Name of the Vulkan enumerator/flag type
* @param vkValue Value being serialized
* @param pString Pointer to a string that will be modified with the serialized value. Only modified
* if true is returned.
* @return True if the value was successfully serialized. False otherwise.
*/
bool vk_serialize(std::string_view vkType, uint64_t vkValue, std::string *pString);
/**
* @brief Parses a Vulkan enumerator/flag serialized string (64-bit)
* @param vkType Name of the Vulkan enumerator/flag type
* @param vkString String being parsed
* @param pValue Pointer to a value that will be modified with the parsed value. Only modified if
* true is returned.
* @return True if the value was successfully parsed. False otherwise.
*/
bool vk_parse(std::string_view vkType, std::string vkString, uint64_t *pValue);
/**
* @brief Serializes a Vulkan enumerator/flag type
* @tparam Vulkan type being serialized
* @param vkType Name of the Vulkan enumerator/flag type
* @param vkValue Value being serialized
* @param pString Pointer to a string that will be modified with the serialized value. Only modified
* if true is returned.
* @return True if the value was successfully serialized. False otherwise.
*/
template <typename T>
bool vk_serialize(std::string_view vkType, T vkValue, std::string *pString) {
return vk_serialize(vkType, static_cast<uint32_t>(vkValue), pString);
}
/**
* @brief Parses a Vulkan enumerator/flag serialized string
* @tparam Vulkan type being parsed
* @param vkType Name of the Vulkan enumerator/flag type
* @param vkString String being parsed
* @param pValue Pointer to a value that will be modified with the parsed value. Only modified if
* true is returned.
* @return True if the value was successfully parsed. False otherwise.
*/
template <typename T>
bool vk_parse(std::string_view vkType, std::string vkString, T *pValue) {
uint32_t retVal = 0;
auto found = vk_parse(vkType, vkString, &retVal);
if (found) {
*pValue = static_cast<T>(retVal);
}
return found;
}
""")
    # Definition Start
    outFile.write("\n#ifdef VK_VALUE_SERIALIZATION_CONFIG_MAIN\n")
    outFile.write("\n#include <algorithm>\n")
    outFile.write("#include <array>\n")
    outFile.write("#include <cstring>\n")
    outFile.write("\nnamespace {\n")

    # Vendors
    vendors = dataRoot.findall('vendors/')
    processVendors(outFile, vendors)

    # EnumSet Declaration
    outFile.write("\nstruct EnumValueSet {\n")
    outFile.write(" std::string_view name;\n")
    outFile.write(" int64_t value;\n")
    outFile.write("};\n")

    # Enums
    enums = dataRoot.findall('enums/')
    processEnums(outFile, enums, vendors, firstVersion, lastVersion)
    # Enum Type Declaration
    outFile.write("\nstruct EnumType {\n")
    outFile.write(" std::string_view name;\n")
    outFile.write(" EnumValueSet const* data;\n")
    outFile.write(" uint32_t count;\n")
    outFile.write(" bool allowEmpty;\n")
    outFile.write("};\n")

    # Enum Pointer Array
    outFile.writelines(["\nconstexpr std::array<EnumType, ", str(
        len(enums)-1), "> enumTypes = {{\n"])  # -1 for not doing VkResult
    for enum in enums:
        if enum.tag == 'VkResult':
            continue

        valueCount = len(enum.findall('./'))
        if valueCount == 0:
            outFile.writelines(
                [" {\"", str(enum.tag), "\", nullptr, 0, true},\n"])
        else:
            allowEmpty = "true"
            for enumVal in enum.findall('./'):
                if enumVal.get('first') == enum.get('first'):
                    allowEmpty = "false"
            outFile.writelines([" {\"", str(enum.tag), "\", ", str(
                enum.tag), "Sets, ", str(valueCount), ", ", allowEmpty, "},\n"])
    outFile.write('}};\n')

    # Function definitions
    outFile.write("""
/**
* @brief Removes a vendor tag from the end of the given string view
* @param view String view to remove the vendor tag from
* @return A string_view without the vendor tag, if it was suffixed
*/
std::string_view stripVendor(std::string_view view) {
for (auto const &it : vendors) {
// Don't strip if it's all that's left
if (view == it)
break;
if (strncmp(view.data() + view.size() - it.size(), it.data(), it.size()) == 0) {
view = view.substr(0, view.size() - it.size());
break;
}
}
return view;
}
/**
* @brief Strips '_BIT' from the end of a string, if there
*/
std::string_view stripBit(std::string_view view) {
if (view.size() > strlen("_BIT")) {
if (view.substr(view.size() - strlen("_BIT")) == "_BIT") {
return view.substr(0, view.size() - strlen("_BIT"));
}
}
return view;
}
bool getEnumType(std::string_view vkType,
EnumValueSet const **ppStart,
EnumValueSet const **ppEnd,
bool *pAllowEmpty) {
// Check for a conversion from Flags -> FlagBits
std::string localString;
if (vkType.rfind("Flags") != std::string::npos) {
localString = vkType;
auto it = localString.rfind("Flags");
localString = localString.replace(it, strlen("Flags"), "FlagBits");
vkType = localString;
}
// Try the original name
for (auto const &it : enumTypes) {
if (vkType == std::string_view{it.name}) {
*ppStart = it.data;
*ppEnd = it.data + it.count;
*pAllowEmpty = it.allowEmpty;
return true;
}
}
// Try a vendor-stripped name
vkType = stripVendor(vkType);
for (auto const &it : enumTypes) {
if (vkType == std::string_view{it.name}) {
*ppStart = it.data;
*ppEnd = it.data + it.count;
*pAllowEmpty = it.allowEmpty;
return true;
}
}
return false;
}
/**
* @brief Converts a Vulkan Flag typename into the prefix that is used for its enums
* @param typeName Name of the type to generate the Vk enum prefix for
* @return Generated prefix string
*
* Any capitalized letters except for the first has an underscore inserted before it, an underscore
* is added to the end, and all characters are converted to upper case.
*
* It also removes the 'Flags' or 'FlagBits' suffixes.
*/
std::string processEnumPrefix(std::string_view typeName) {
// Flag Bits
std::size_t flagBitsSize = strlen("FlagBits");
if (typeName.size() > flagBitsSize) {
if (strncmp(typeName.data() + typeName.size() - flagBitsSize, "FlagBits", flagBitsSize) ==
0) {
typeName = typeName.substr(0, typeName.size() - strlen("FlagBits"));
}
}
// Flags
std::size_t flagsSize = strlen("Flags");
if (typeName.size() > flagsSize) {
if (strncmp(typeName.data() + typeName.size() - flagsSize, "Flags", flagsSize) == 0) {
typeName = typeName.substr(0, typeName.size() - strlen("Flags"));
}
}
std::string retStr;
for (auto it = typeName.begin(); it != typeName.end(); ++it) {
if (it == typeName.begin()) {
retStr += ::toupper(*it);
} else if (::isupper(*it)) {
retStr += '_';
retStr += *it;
} else {
retStr += toupper(*it);
}
}
retStr += '_';
return retStr;
}
bool findValue(std::string_view findValue,
std::string_view prefix,
uint64_t *pValue,
EnumValueSet const *start,
EnumValueSet const *end) {
// Remove the vendor tag suffix if it's on the value
findValue = stripVendor(findValue);
if (findValue[findValue.size() - 1] == '_')
findValue = findValue.substr(0, findValue.size() - 1);
// Remove '_BIT' if it's there
findValue = stripBit(findValue);
// Iterate until we find the value
while (start != end) {
if (findValue == start->name) {
*pValue |= start->value;
return true;
}
std::string prefixedName{prefix};
prefixedName += start->name;
if (findValue == prefixedName) {
*pValue |= start->value;
return true;
}
++start;
}
return false;
}
/**
* @brief Takes a given string and formats it for use with parsing
* @param str The string to format
* @return Formatted string
*
* First, any non-alphanumeric characters are trimmed from both ends of the string.
* After that, any spaces are replaced with underscores, and finally all the characters are
* capitalized. This will generate the string closest to the original ones found in the XML spec.
*/
std::string formatString(std::string str) {
// Trim left
std::size_t cutOffset = 0;
for (auto c : str) {
if (::isalnum(c))
break;
else
++cutOffset;
}
str = str.substr(cutOffset);
// Trim right
cutOffset = 0;
for (std::size_t i = 0; i < str.size(); ++i) {
if (::isalnum(str[i]))
cutOffset = i + 1;
}
str = str.substr(0, cutOffset);
std::replace(str.begin(), str.end(), ' ', '_');
std::for_each(str.begin(), str.end(), [](char &c) { c = ::toupper(c); });
return str;
}
bool serializeBitmask(EnumValueSet const *end,
EnumValueSet const *start,
bool allowEmpty,
uint64_t vkValue,
std::string *pString) {
--end;
--start;
if(start == end) {
// If this is a non-existing bitmask, then return an empty string
*pString = {};
return true;
}
std::string retStr;
while (start != end) {
if(vkValue == 0 && !retStr.empty()) {
break;
}
if ((start->value & vkValue) == start->value) {
// Found a compatible bit mask, add it
if (!retStr.empty()) {
retStr += " | ";
}
retStr += start->name;
vkValue = vkValue ^ start->value;
}
--start;
}
if (vkValue != 0 || (retStr.empty() && !allowEmpty)) {
// Failed to find a valid bitmask for the value
return false;
}
*pString = retStr;
return true;
}
bool serializeEnum(EnumValueSet const *start,
EnumValueSet const *end,
uint64_t vkValue,
std::string *pString) {
while (start != end) {
if (start->value == vkValue) {
*pString = start->name;
return true;
}
++start;
}
return false;
}
bool parseBitmask(std::string_view vkString,
EnumValueSet const *start,
EnumValueSet const *end,
std::string_view prefix,
uint64_t *pValue) {
uint64_t retVal = 0;
auto startCh = vkString.begin();
auto endCh = startCh;
for (; endCh != vkString.end(); ++endCh) {
if (*endCh == '|') {
std::string token(startCh, endCh);
token = formatString(token);
bool foundVal = findValue(token, prefix, &retVal, start, end);
if (!foundVal)
return false;
startCh = endCh + 1;
}
}
if (startCh != endCh) {
std::string token(startCh, endCh);
token = formatString(token);
bool foundVal = findValue(token, prefix, &retVal, start, end);
if (!foundVal)
return false;
}
*pValue = retVal;
return true;
}
bool parseEnum(std::string_view vkString,
EnumValueSet const *start,
EnumValueSet const *end,
std::string_view prefix,
uint64_t *pValue) {
uint64_t retVal = 0;
std::string token = formatString(std::string{vkString});
bool found = findValue(token, prefix, &retVal, start, end);
if (found) {
*pValue = retVal;
}
return found;
}
} // namespace
bool vk_serialize(std::string_view vkType, uint64_t vkValue, std::string *pString) {
if (vkType.empty()) {
return false;
}
EnumValueSet const *start, *end;
bool allowEmpty;
if (!getEnumType(vkType, &start, &end, &allowEmpty)) {
return false;
}
if (vkType.find("Flags") != std::string::npos || vkType.find("FlagBits") != std::string::npos) {
return serializeBitmask(start, end, allowEmpty, vkValue, pString);
}
return serializeEnum(start, end, vkValue, pString);
}
bool vk_serialize(std::string_view vkType, uint32_t vkValue, std::string *pString) {
return vk_serialize(vkType, static_cast<uint64_t>(vkValue), pString);
}
bool vk_parse(std::string_view vkType, std::string vkString, uint64_t *pValue) {
if (vkType.empty()) {
return false;
}
EnumValueSet const *start, *end;
bool allowEmpty;
if (!getEnumType(vkType, &start, &end, &allowEmpty)) {
return false;
}
if (vkString.empty()) {
if (allowEmpty) {
*pValue = 0;
return true;
} else {
return false;
}
}
std::string prefix = processEnumPrefix(stripVendor(vkType));
if (vkType.find("Flags") != std::string::npos || vkType.find("FlagBits") != std::string::npos) {
return parseBitmask(vkString, start, end, prefix, pValue);
}
return parseEnum(vkString, start, end, prefix, pValue);
}
bool vk_parse(std::string_view vkType, std::string vkString, uint32_t *pValue) {
uint64_t tempValue;
if (vk_parse(vkType, vkString, &tempValue)) {
*pValue = static_cast<uint32_t>(tempValue);
return true;
}
return false;
}
""")
    # endif
    outFile.write("\n#endif // VK_VALUE_SERIALIZATION_CONFIG_MAIN\n")
    outFile.write("#endif // VK_VALUE_SERIALIZATION_HPP\n")

    outFile.close()


if __name__ == "__main__":
    main(sys.argv[1:])
# === tests/test_charge.py (fossabot/MolVS, MIT) ===
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""Tests for charge.py"""
from __future__ import print_function
from __future__ import unicode_literals
from __future__ import division
import logging
from rdkit import Chem
from molvs.standardize import Standardizer, standardize_smiles
from molvs.charge import Reionizer
logging.basicConfig(level=logging.DEBUG)
def charge_parent_smiles(smiles, prefer_organic=False):
    """Utility function that returns the charge parent SMILES for a given SMILES string."""
    mol = Chem.MolFromSmiles(smiles.encode('utf8'), sanitize=False)
    mol = Standardizer(prefer_organic=prefer_organic).charge_parent(mol)
    if mol:
        return Chem.MolToSmiles(mol, isomericSmiles=True)


def test_charge_parent():
    """Test neutralization of ionized acids and bases."""
    assert charge_parent_smiles('C(C(=O)[O-])(Cc1n[n-]nn1)(C[NH3+])(C[N+](=O)[O-])') == 'NCC(Cc1nn[nH]n1)(C[N+](=O)[O-])C(=O)O'


def test_charge_parent2():
    """Test preservation of zwitterion."""
    assert charge_parent_smiles('n(C)1cc[n+]2cccc([O-])c12') == 'Cn1cc[n+]2cccc([O-])c12'


def test_charge_parent3():
    """Choline should be left with a positive charge."""
    assert charge_parent_smiles('C[N+](C)(C)CCO') == 'C[N+](C)(C)CCO'


def test_charge_parent4():
    """This should have the hydrogen removed to give deanol as a charge parent."""
    assert charge_parent_smiles('C[NH+](C)CCO') == 'CN(C)CCO'


def test_charge_parent5():
    """Sodium benzoate to benzoic acid."""
    assert charge_parent_smiles('[Na+].O=C([O-])c1ccccc1') == 'O=C(O)c1ccccc1'


def test_charge_parent6():
    """Benzoate ion to benzoic acid."""
    assert charge_parent_smiles('O=C([O-])c1ccccc1') == 'O=C(O)c1ccccc1'


def test_charge_parent7():
    """Charges in histidine should be neutralized."""
    assert charge_parent_smiles('[NH3+]C(Cc1cnc[nH]1)C(=O)[O-]') == 'NC(Cc1cnc[nH]1)C(=O)O'


def test_charge_parent8():
    """Ammonium salt with a chloride counterion should be neutralized."""
    assert charge_parent_smiles('C[NH+](C)(C).[Cl-]') == 'CN(C)C'


def test_charge_parent9():
    """No organic fragments."""
    assert charge_parent_smiles('[N+](=O)([O-])[O-]') == 'O=[N+]([O-])[O-]'


def test_charge_parent10():
    """No organic fragments."""
    assert charge_parent_smiles('[N+](=O)([O-])[O-]', prefer_organic=True) == 'O=[N+]([O-])[O-]'


def test_charge_parent11():
    """Larger inorganic fragment should be chosen."""
    assert charge_parent_smiles('[N+](=O)([O-])[O-].[CH2]') == 'O=[N+]([O-])[O-]'


def test_charge_parent12():
    """Smaller organic fragment should be chosen over larger inorganic fragment."""
    assert charge_parent_smiles('[N+](=O)([O-])[O-].[CH2]', prefer_organic=True) == '[CH2]'


def test_standardize():
    """Test table salt."""
    assert standardize_smiles('[Na].[Cl]') == '[Cl-].[Na+]'


def test_reionize():
    """Test reionizer moves proton to weaker acid."""
    mol = Chem.MolFromSmiles('C1=C(C=CC(=C1)[S]([O-])=O)[S](O)(=O)=O')
    r = Reionizer()
    mol = r.reionize(mol)
    assert Chem.MolToSmiles(mol) == 'O=S(O)c1ccc(S(=O)(=O)[O-])cc1'


def test_reionize2():
    """Test charged carbon doesn't get recognised as alpha-carbon-hydrogen-keto."""
    mol = Chem.MolFromSmiles('CCOC(=O)C(=O)[CH-]C#N')
    r = Reionizer()
    mol = r.reionize(mol)
    assert Chem.MolToSmiles(mol) == 'CCOC(=O)C(=O)[CH-]C#N'


def test_reionize3():
    """Test reionization of a charged nitro-imidazoline system."""
    mol = Chem.MolFromSmiles('C[N+]1=C[CH-]N(C(=N)N)/C1=C/[N+](=O)[O-]')
    r = Reionizer()
    mol = r.reionize(mol)
    assert Chem.MolToSmiles(mol) == 'C[N+]1=CCN(C(=N)N)C1=[C-][N+](=O)[O-]'


def test_should_complete():
    """Reionization should not infinitely loop forever on these molecules."""
    # GitHub Issue #14
    assert standardize_smiles('CCCCCCCCCCCCCCCCCC(=O)CC(=C)C(=O)O[Ti](=O)(OC(C)C)C(C)C') == 'C=C(CC(=O)[CH-]CCCCCCCCCCCCCCCC)C(=O)[O-].CC(C)[O-].CCC.[O-2].[Ti+5]'
    assert standardize_smiles('OP(=O)(O)[O-].OP(=O)([O-])[O-].[O-]S(=O)(=O)[O-].[Na+].[Na+].[Na+].[Mg+2].[Cl-].[Cl-].[K+].[K+]') == 'O=P([O-])(O)O.O=P([O-])([O-])O.O=S(=O)([O-])[O-].[Cl-].[Cl-].[K+].[K+].[Mg+2].[Na+].[Na+].[Na+]'


def test_forced_charge1():
    """Test forced charge correction maintaining overall neutral charge."""
    assert standardize_smiles('[Na].O=C(O)c1ccccc1') == 'O=C([O-])c1ccccc1.[Na+]'


def test_forced_charge2():
    """Test forced charge correction with no corresponding proton for neutralization."""
    # GitHub Issue #15
    assert standardize_smiles('[Na].[Na]') == '[Na+].[Na+]'
    # TODO: Arguably should become selenite ion... O=[Se]([O-])[O-]. Need an AcidBasePair?
    assert standardize_smiles('[Na].[Na].O[Se](O)=O') == 'O=[Se](O)O.[Na+].[Na+]'
# def test_reionize3():
# """Test canonical ionization position when multiple equivalent possibilities."""
# mol = Chem.MolFromSmiles('CC1=CC(=CC=C1S(O)=O)S([O-])=O')
# mol2 = Chem.MolFromSmiles('CC1=CC(=CC=C1S([O-])=O)S(O)=O')
# r = Reionizer()
# mol = r.reionize(mol)
# mol2 = r.reionize(mol2)
# assert Chem.MolToSmiles(mol) == 'Cc1cc(S(=O)[O-])ccc1S(=O)O'
# assert Chem.MolToSmiles(mol2) == 'Cc1cc(S(=O)[O-])ccc1S(=O)O'
# assert Chem.MolToSmiles(mol) == Chem.MolToSmiles(mol2)
#
#
# def test_reionize4():
# """Test canonical ionization position when multiple equivalent possibilities."""
# mol = Chem.MolFromSmiles('CCOC(=O)C(=O)[CH-]C#N')
# mol2 = Chem.MolFromSmiles('[CH2-]COC(=O)C(=O)CC#N')
# r = Reionizer()
# mol = r.reionize(mol)
# mol2 = r.reionize(mol2)
# assert Chem.MolToSmiles(mol) == '[CH2-]COC(=O)C(=O)CC#N'
# assert Chem.MolToSmiles(mol2) == ''
# assert Chem.MolToSmiles(mol) == Chem.MolToSmiles(mol2)
# === gym_flock/envs/old/mapping.py (katetolstaya/gym-flock, MIT) ===
from gym import spaces, error, utils
from gym.utils import seeding
import numpy as np
import configparser
from os import path
import matplotlib.pyplot as plt
from matplotlib.pyplot import gca
font = {'family': 'sans-serif',
        'weight': 'bold',
        'size': 14}


class MappingEnv(gym.Env):
    def __init__(self):
        # config_file = path.join(path.dirname(__file__), "params_flock.cfg")
        # config = configparser.ConfigParser()
        # config.read(config_file)
        # config = config['flock']

        self.nearest_agents = 7
        self.nearest_targets = 7
        self.mean_pooling = True  # normalize the adjacency matrix by the number of neighbors or not
        self.centralized = True

        # number of states per agent
        self.nx_system = 4
        # number of actions per agent
        self.nu = 2

        # default problem parameters
        self.n_agents = 100  # int(config['network_size'])
        # self.comm_radius = 0.9  # float(config['comm_radius'])
        self.dt = 0.1  # float(config['system_dt'])
        self.v_max = 5.0  # float(config['max_vel_init'])
        self.v_bias = self.v_max

        # initialize state matrices
        self.x = None
        self.u = None
        self.mean_vel = None
        self.init_vel = None
        self.greedy_action = None
        self.diff = None
        self.r2 = None
        self.adj_mat = None
        self.adj_mat_mean = None
        self.diff_targets = None
        self.r2_targets = None
        self.target_observed = None
        self.state_network = None
        self.state_values = None
        self.reward = None

        self.max_accel = 1

        # self.action_space = spaces.Box(low=-self.max_accel, high=self.max_accel, shape=(2 * self.n_agents,),
        #                                dtype=np.float32)
        # self.observation_space = spaces.Box(low=-np.Inf, high=np.Inf, shape=(self.n_agents,),
        #                                     dtype=np.float32)

        # target initialization
        self.px_max = 100
        self.py_max = 100

        x = np.linspace(-1.0 * self.px_max, self.px_max, self.n_agents)
        y = np.linspace(-1.0 * self.py_max, self.py_max, self.n_agents)
        tx, ty = np.meshgrid(x, y)
        tx = tx.reshape((-1, 1))
        ty = ty.reshape((-1, 1))

        self.obs_rad = 2.0
        self.obs_rad2 = self.obs_rad * self.obs_rad
        self.target_x = np.stack((tx, ty), axis=1).reshape((-1, 2))
        self.target_unobserved = np.ones((self.n_agents * self.n_agents, 2), dtype=np.bool)

        # rendering initialization
        self.fig = None
        self.ax = None
        self.line1 = None
        self.line2 = None
        self.action_scalar = 10.0

        self.seed()
    def reset(self):
        x = np.zeros((self.n_agents, self.nx_system))
        self.target_unobserved = np.ones((self.n_agents * self.n_agents, 2), dtype=np.bool)

        x[:, 0] = np.random.uniform(low=-self.px_max, high=self.px_max, size=(self.n_agents,))
        x[:, 1] = np.random.uniform(low=-self.py_max, high=self.py_max, size=(self.n_agents,))

        # bias = np.random.uniform(low=-self.v_bias, high=self.v_bias, size=(2,))
        x[:, 2] = np.random.uniform(low=-self.v_max, high=self.v_max, size=(self.n_agents,))  # + bias[0]
        x[:, 3] = np.random.uniform(low=-self.v_max, high=self.v_max, size=(self.n_agents,))  # + bias[1]

        # keep good initialization
        self.mean_vel = np.mean(x[:, 2:4], axis=0)
        self.init_vel = x[:, 2:4]
        self.x = x
        # self.a_net = self.get_connectivity(self.x)
        self.compute_helpers()
        return self.state_values, self.state_network
    def params_from_cfg(self, args):
        # TODO
        pass
        # self.comm_radius = args.getfloat('comm_radius')
        # self.comm_radius2 = self.comm_radius * self.comm_radius
        # self.vr = 1 / self.comm_radius2 + np.log(self.comm_radius2)
        #
        # self.n_agents = args.getint('n_agents')
        # self.r_max = self.r_max * np.sqrt(self.n_agents)
        #
        # self.action_space = spaces.Box(low=-self.max_accel, high=self.max_accel, shape=(2 * self.n_agents,),
        #                                dtype=np.float32)
        # self.observation_space = spaces.Box(low=-np.Inf, high=np.Inf, shape=(self.n_agents, self.n_features),
        #                                     dtype=np.float32)
        #
        # self.v_max = args.getfloat('v_max')
        # self.v_bias = self.v_max
        # self.dt = args.getfloat('dt')

    def seed(self, seed=None):
        self.np_random, seed = seeding.np_random(seed)
        return [seed]
    def step(self, u):
        # u = np.reshape(u, (-1, 2))
        assert u.shape == (self.n_agents, self.nu)
        u = np.clip(u, a_min=-self.max_accel, a_max=self.max_accel)
        self.u = u * self.action_scalar

        old_x = np.copy(self.x)

        # x position
        self.x[:, 0] = self.x[:, 0] + self.x[:, 2] * self.dt + self.u[:, 0] * self.dt * self.dt * 0.5
        # y position
        self.x[:, 1] = self.x[:, 1] + self.x[:, 3] * self.dt + self.u[:, 1] * self.dt * self.dt * 0.5
        # x velocity
        self.x[:, 2] = self.x[:, 2] + self.u[:, 0] * self.dt
        # y velocity
        self.x[:, 3] = self.x[:, 3] + self.u[:, 1] * self.dt
        # clip velocities
        self.x[:, 2:4] = np.clip(self.x[:, 2:4], -1.0 * self.v_max, self.v_max)

        dist_traveled = np.sum(np.linalg.norm(self.x[:, 0:2] - old_x[:, 0:2], axis=1))

        self.compute_helpers()
        done = (0 == np.sum(self.target_unobserved))

        return (self.state_values, self.state_network), 10.0 * self.reward - dist_traveled, done, {}
    def compute_helpers(self):
        # TODO - check this, and initialize stuff in the init(), and try to make more efficient

        # Neighbors computations
        self.diff = self.x.reshape((self.n_agents, 1, self.nx_system)) - self.x.reshape(
            (1, self.n_agents, self.nx_system))
        self.r2 = np.multiply(self.diff[:, :, 0], self.diff[:, :, 0]) + np.multiply(self.diff[:, :, 1],
                                                                                    self.diff[:, :, 1])
        np.fill_diagonal(self.r2, np.Inf)

        nearest = np.argsort(self.r2, axis=1)
        obs_neigh = np.zeros((self.n_agents, self.nearest_agents * 4))
        self.adj_mat = np.zeros((self.n_agents, self.n_agents))
        for i in range(self.nearest_agents):
            ind2, ind3 = np.meshgrid(nearest[:, i], range(4), indexing='ij')
            ind1, _ = np.meshgrid(range(self.n_agents), range(4), indexing='ij')
            obs_neigh[:, i * self.nx_system:(i + 1) * self.nx_system] = np.reshape(
                self.diff[ind1.flatten(), ind2.flatten(), ind3.flatten()], (-1, 4))
            self.adj_mat[:, nearest[:, i]] = 1.0

        # Normalize the adjacency matrix by the number of neighbors - results in mean pooling, instead of sum pooling
        n_neighbors = np.reshape(np.sum(self.adj_mat, axis=1), (self.n_agents, 1))  # correct - checked this
        n_neighbors[n_neighbors == 0] = 1
        self.adj_mat_mean = self.adj_mat / n_neighbors

        # Targets computations
        self.diff_targets = self.x[:, 0:2].reshape((self.n_agents, 1, 2)) - self.target_x[
            self.target_unobserved].reshape((1, -1, 2))
        self.r2_targets = np.multiply(self.diff_targets[:, :, 0], self.diff_targets[:, :, 0]) + np.multiply(
            self.diff_targets[:, :, 1], self.diff_targets[:, :, 1])

        nearest_targets = np.argsort(self.r2_targets, axis=1)
        obs_target = np.zeros((self.n_agents, self.nearest_targets * 2))
        for i in range(min(self.nearest_targets, np.shape(nearest_targets)[1])):
            ind2, ind3 = np.meshgrid(nearest_targets[:, i], range(2), indexing='ij')
            ind1, _ = np.meshgrid(range(self.n_agents), range(2), indexing='ij')
            obs_target[:, i * 2:(i + 1) * 2] = np.reshape(
                self.diff_targets[ind1.flatten(), ind2.flatten(), ind3.flatten()], (-1, 2))

        self.target_observed = np.any(self.r2_targets < self.obs_rad2, axis=0).reshape((-1, 1))
        self.target_unobserved[self.target_unobserved] = np.tile(np.logical_not(self.target_observed), (1, 2)).flatten()

        self.reward = np.sum(self.target_observed.astype(np.int))

        self.state_values = np.hstack((obs_neigh, obs_target))
        self.greedy_action = -1.0 * obs_target[:, 0:2]

        if self.mean_pooling:
            self.state_network = self.adj_mat_mean
        else:
            self.state_network = self.adj_mat
def controller(self):
"""
The controller for flocking from Turner 2003.
Returns: the optimal action
"""
# TODO
# return np.zeros((self.n_agents, 2))
return self.greedy_action / 10.0

    def render(self, mode='human'):
        """
        Render the environment with agents as points in 2D space
        """
        if self.fig is None:
            plt.ion()
            fig = plt.figure()
            self.ax = fig.add_subplot(111)
            line1, = self.ax.plot(self.x[:, 0], self.x[:, 1], 'bo')
            locs = self.target_x[self.target_unobserved].reshape((-1, 2))
            line2, = self.ax.plot(locs[:, 0], locs[:, 1], 'rx')
            plt.ylim(-1.0 * self.py_max, 1.0 * self.py_max)
            plt.xlim(-1.0 * self.px_max, 1.0 * self.px_max)
            a = plt.gca()
            a.set_xticklabels(a.get_xticks(), font)
            a.set_yticklabels(a.get_yticks(), font)
            plt.title('GNN Controller')
            self.fig = fig
            self.line1 = line1
            self.line2 = line2
            # TODO render unobserved targets
        else:
            self.line1.set_xdata(self.x[:, 0])
            self.line1.set_ydata(self.x[:, 1])
            locs = self.target_x[self.target_unobserved].reshape((-1, 2))
            self.line2.set_xdata(locs[:, 0])
            self.line2.set_ydata(locs[:, 1])
        self.fig.canvas.draw()
        self.fig.canvas.flush_events()

    def close(self):
        pass


# File: 1-Chapter/htmlcomponents.py (repo: DSandovalFlavio/Dashboards-Plotly-Dash, license: MIT)
import dash
from dash import html
app = dash.Dash(__name__)
app.layout = html.Div(children=[html.H1('Data Science',
                                        style={'textAlign': 'center',
                                               'color': '#0FD08D',
                                               'font-size': '50px'}),
                                html.H2('The sexiest career of the 21st century',
                                        style={'textAlign': 'center',
                                               'color': '#009A64'}),
                                html.P('Key factors:'),
                                html.Ul(children=[html.Li('Factor 1'),
                                                  html.Li('Factor 2'),
                                                  html.Li('Factor 3'),
                                                  html.Li(['Source: ',
                                                           html.A('https://www.excelsior.com.mx/nacional/ciencia-de-datos-la-carrera-mas-sexy-del-xxi-en-la-unam/1323946',
                                                                  href='https://www.excelsior.com.mx/nacional/ciencia-de-datos-la-carrera-mas-sexy-del-xxi-en-la-unam/1323946')
                                                           ])
                                                  ])
                                ])
if __name__ == '__main__':
    app.run_server(debug=True)


# File: baadalinstallation/baadal/modules/vm_helper.py (repo: iitd-plos/baadal2.0, license: Apache-2.0)
] | 7 | 2016-11-08T13:38:10.000Z | 2019-11-26T04:33:17.000Z | # -*- coding: utf-8 -*-
###################################################################################
from gluon import current
from helper import get_constant, execute_remote_cmd, config, get_datetime, \
log_exception, is_pingable, get_context_path
from libvirt import * # @UnusedWildImport
from log_handler import logger
from nat_mapper import create_mapping, remove_mapping
import math, shutil, libvirt, os, time, random
import xml.etree.ElementTree as etree

def _choose_datastore():
    """
    Chooses a datastore from the list of available datastores
    """
    # datastore_capacity = current.db(current.db.datastore.id >= 0).select(orderby = current.db.datastore.used
    datastores = current.db(current.db.datastore.id >= 0).select()
    datastore_length = len(datastores)
    logger.debug("datastore_length: " + str(datastore_length))
    if datastore_length == 0:
        raise Exception("No datastore found.")
    else:
        count = datastore_length
        available_datastores = {}
        while count != 0:
            available = datastores[datastore_length - count].capacity - datastores[datastore_length - count].used
            available_datastores[datastores[datastore_length - count]] = available
            count = count - 1
        # Sort datastores by available space and pick the one with the most room
        z = [(i, available_datastores[i]) for i in available_datastores]
        z.sort(key=lambda x: x[1])
        available_datastores = z
        logger.debug("most available datastore: " + str(available_datastores[-1]))
        first_elts = available_datastores[-1]
        first_elts = first_elts[0]
        logger.debug("selected datastore: " + str(first_elts))
        return first_elts

def host_resources_used(host_id):
    """
    Returns resources utilization of a host in MB, Count
    """
    RAM = 0.0
    CPU = 0.0
    vms = current.db((current.db.vm_data.host_id == host_id) &
                     (current.db.vm_data.status != current.VM_STATUS_UNKNOWN) &
                     (current.db.vm_data.status != current.VM_STATUS_IN_QUEUE)).select()
    logger.debug("vms selected are: " + str(vms))
    for vm_data in vms:
        RAM += vm_data.RAM
        CPU += vm_data.vCPU
    return (math.ceil(RAM), math.ceil(CPU))


def getVirshDomainConn(vm_details, host_ip=None, domain_name=None):
    """
    Generic method to establish a libvirt connection and look up a domain
    """
    if vm_details != None:
        host_ip = vm_details.host_id.host_ip.private_ip
        domain_name = vm_details.vm_identity
    connection_object = libvirt.open("qemu+ssh://root@" + host_ip + "/system")
    domain = connection_object.lookupByName(domain_name)
    return (connection_object, domain)


def getVirshDomain(vm_details):
    """
    Generic method to look up a domain, closing the libvirt connection afterwards
    """
    (connection_object, domain) = getVirshDomainConn(vm_details)
    connection_object.close()
    return domain

def _set_portgroup_in_vm(domain_name, portgroup, host_ip, vlan_tag):
    """
    Set the vlan tag in the network configuration of a VM.
    This is required to ensure that the VM fetches the IP of its vlan from DHCP.
    """
    (connection_object, domain) = getVirshDomainConn(None, host_ip, domain_name)
    xml = etree.fromstring(domain.XMLDesc(0))
    source_network_element = xml.find('.//interface/source')
    source_network_string = etree.tostring(source_network_element)
    logger.debug("Source network is " + source_network_string)
    if source_network_string.find(" bridge=") != -1:
        logger.debug("Source is set to bridge; adding <vlan><tag id=...> to the interface tag")
        root_new = xml.find('.//interface')
        root_new_vlan = etree.SubElement(root_new, 'vlan')
        root_new_tag = etree.SubElement(root_new_vlan, 'tag')
        root_new_tag.set('id', vlan_tag)
        logger.debug("After append root_new_vlan is " + etree.tostring(root_new_vlan))
    elif source_network_string.find(" network=") != -1:
        logger.debug("Source is set to network; adding portgroup to the source tag")
        source_network_element.set('portgroup', portgroup)
        logger.debug("Changed source network is " + etree.tostring(source_network_element))
    else:
        logger.debug("Neither portgroup nor vlan tag id was added to the xml")
    domain = connection_object.defineXML(etree.tostring(xml))
    domain.destroy()
    domain.create()
    domain.isActive()
    connection_object.close()

def _get_private_ip_mac(security_domain_id):
    """
    Chooses a random Private IP from the pool, such that:
    - It is not assigned to any VM or host
    - It belongs to the VLAN of the given security domain
    """
    vlans = current.db(current.db.security_domain.id == security_domain_id)._select(current.db.security_domain.vlan)
    private_ip_pool = current.db((~current.db.private_ip_pool.id.belongs(current.db(current.db.vm_data.private_ip != None)._select(current.db.vm_data.private_ip)))
                                 & (~current.db.private_ip_pool.id.belongs(current.db(current.db.host.host_ip != None)._select(current.db.host.host_ip)))
                                 & (current.db.private_ip_pool.vlan.belongs(vlans))).select(current.db.private_ip_pool.ALL, orderby='<random>').first()
    if private_ip_pool:
        return private_ip_pool
    else:
        sd = current.db.security_domain[security_domain_id]
        raise Exception(("Available MACs are exhausted for security domain '%s'." % sd.name))


def _choose_random_public_ip():
    """
    Chooses a random Public IP from the pool, such that:
    - It is not assigned to any VM
    - It is not assigned to any host
    - The IP is marked active
    """
    public_ip_pool = current.db((~current.db.public_ip_pool.id.belongs(current.db(current.db.vm_data.public_ip != None)._select(current.db.vm_data.public_ip)))
                                & (~current.db.public_ip_pool.id.belongs(current.db(current.db.host.public_ip != None)._select(current.db.host.public_ip)))
                                & (current.db.public_ip_pool.is_active == True)) \
                                .select(current.db.public_ip_pool.ALL, orderby='<random>').first()
    return public_ip_pool

def _choose_mac_ip(vm_properties):
    """
    Chooses a mac address and ip address for a vm to be installed.
    Also chooses a random public IP if one was requested.
    """
    if not 'private_ip' in vm_properties:
        private_ip_info = _get_private_ip_mac(vm_properties['security_domain'])
        vm_properties['private_ip'] = private_ip_info.private_ip
        vm_properties['mac_addr'] = private_ip_info.mac_addr
        vm_properties['vlan_name'] = private_ip_info.vlan.name
        vm_properties['vlan_tag'] = private_ip_info.vlan.vlan_tag
    if vm_properties['public_ip_req']:
        if 'public_ip' not in vm_properties:
            public_ip_pool = _choose_random_public_ip()
            if public_ip_pool:
                vm_properties['public_ip'] = public_ip_pool.public_ip
            else:
                raise Exception("Available Public IPs are exhausted.")
    else:
        vm_properties['public_ip'] = None


def _choose_mac_ip_vncport(vm_properties):
    """
    Chooses mac address, ip address and vnc port for a vm to be installed
    """
    _choose_mac_ip(vm_properties)
    start_range = int(get_constant('vncport_start_range'))
    end_range = int(get_constant('vncport_end_range'))
    # Collect the taken ports as plain integers; comparing an int against the
    # DAL Rows object directly would never match, making the check a no-op
    vnc_ports_taken = set(int(row.vnc_port) for row in
                          current.db().select(current.db.vm_data.vnc_port) if row.vnc_port)
    while True:
        random_vnc_port = random.randrange(start_range, end_range, 1)
        if not random_vnc_port in vnc_ports_taken:
            break
    vm_properties['vnc_port'] = str(random_vnc_port)
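The draw-until-free loop above can be sketched in isolation. `pick_free_port` is a hypothetical stand-alone version: the `taken` set and the range bounds stand in for the database query and the `vncport_start_range`/`vncport_end_range` constants.

```python
import random

def pick_free_port(taken, start=6000, end=6100):
    """Draw random ports from [start, end) until one not in `taken` is found."""
    while True:
        port = random.randrange(start, end, 1)
        if port not in taken:
            return port
```

Note that the loop never terminates if every port in the range is taken; the real code has the same property.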

def find_new_host(RAM, vCPU):
    """
    Select the best host out of (at most) 3 randomly chosen hosts with enough
    available RAM and CPU. Availability is checked with 200 percent over-commitment.
    """
    hosts = current.db(current.db.host.status == 1).select()
    hosts = hosts.as_list(True, False)
    count = 3
    selected_hosts = []
    while count != 0 and hosts:
        host = random.choice(hosts)
        logger.debug("Checking host = " + host['host_name'])
        (used_ram, used_cpu) = host_resources_used(host['id'])
        logger.debug("used ram: " + str(used_ram) + " used cpu: " + str(used_cpu) +
                     " host ram: " + str(host['RAM']) + " host cpu: " + str(host['CPUs']))
        host_ram_after_200_percent_overcommitment = math.floor((host['RAM'] * 1024) * 2)
        host_cpu_after_200_percent_overcommitment = math.floor(host['CPUs'] * 2)
        logger.debug("ram available: %s cpu available: %s cpu < max cpu: %s" %
                     (((host_ram_after_200_percent_overcommitment - used_ram) >= RAM),
                      ((host_cpu_after_200_percent_overcommitment - used_cpu) >= vCPU),
                      (vCPU <= host['CPUs'])))
        if (((host_ram_after_200_percent_overcommitment - used_ram) >= RAM) and
                ((host_cpu_after_200_percent_overcommitment - used_cpu) >= vCPU) and
                (vCPU <= host['CPUs'])):
            selected_hosts.append(host)
            count = count - 1
        hosts.remove(host)
    if selected_hosts:
        # Sort the selected hosts by RAM and pick the smallest host that fits
        selected_host = sorted(selected_hosts, key=lambda k: k['RAM'])[0]
        return selected_host['id']
    # If no suitable host was found
    raise Exception("No active host is available for a new vm.")
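The admission check inside the loop can be isolated as a small predicate. This is an illustrative sketch (the name `host_can_fit` is not part of this module): doubling the host's physical RAM (GB, converted to MB) and CPU count models the 200 percent over-commitment, and the requested vCPUs must additionally not exceed the physical core count.

```python
import math

def host_can_fit(host_ram_gb, host_cpus, used_ram_mb, used_cpus, req_ram_mb, req_vcpus):
    # 200% over-commitment caps, mirroring find_new_host()
    ram_cap = math.floor((host_ram_gb * 1024) * 2)
    cpu_cap = math.floor(host_cpus * 2)
    return ((ram_cap - used_ram_mb) >= req_ram_mb and
            (cpu_cap - used_cpus) >= req_vcpus and
            req_vcpus <= host_cpus)
```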

def allocate_vm_properties(vm_details):
    """
    Allocates vm properties (datastore, host, ip address, mac address, vnc port, ram, vcpus)
    """
    logger.debug("Inside allocate_vm_properties()...")
    vm_properties = {}
    vm_properties['datastore'] = _choose_datastore()
    logger.debug("Datastore selected is: " + str(vm_properties['datastore']))
    vm_properties['host'] = find_new_host(vm_details.RAM, vm_details.vCPU)
    logger.debug("Host selected is: " + str(vm_properties['host']))
    vm_properties['public_ip_req'] = False if (vm_details.public_ip == None) else True
    vm_properties['security_domain'] = vm_details.security_domain
    _choose_mac_ip_vncport(vm_properties)
    logger.debug("MAC is: " + str(vm_properties['mac_addr']) + " IP is: " + str(vm_properties['private_ip']) +
                 " VNC port is: " + str(vm_properties['vnc_port']) + " VLAN tag is: " + str(vm_properties['vlan_tag']))
    vm_properties['ram'] = vm_details.RAM
    vm_properties['vcpus'] = vm_details.vCPU
    return vm_properties

def create_vm_image(vm_details, datastore):
    """
    Create a VM image:
    - Creates a directory for the new VM using vm_identity
    - Finds the location of the template image requested
    - Copies the template image from its location to the new vm directory
    """
    # Creates a directory for the new vm
    vm_directory_path = datastore.system_mount_point + '/' + get_constant('vms') + '/' + vm_details.vm_identity
    logger.debug("Creating vm directory...")
    if not os.path.exists(vm_directory_path):
        os.makedirs(vm_directory_path)
    else:
        raise Exception("Directory with same name as vmname already exists.")

    # Finds the location of the template image that the user has requested for its vm.
    template = current.db.template[vm_details.template_id]
    vm_image_name = vm_directory_path + '/' + vm_details.vm_identity + '.qcow2'

    # Copies the template image from its location to the new vm directory
    storage_type = config.get("GENERAL_CONF", "storage_type")
    copy_command = 'ndmpcopy ' if storage_type == current.STORAGE_NETAPP_NFS else 'cp '
    # template_dir = get_constant('vm_templates_datastore')
    if copy_command == 'cp ':
        template_location = datastore.system_mount_point + '/' + get_constant('templates_dir') + '/' + template.hdfile
        logger.debug("cp %s %s" % (template_location, vm_image_name))
        rc = os.system("cp %s %s" % (template_location, vm_image_name))
        if rc != 0:
            logger.error("Copy not successful")
            raise Exception("Copy not successful")
        else:
            logger.debug("Copied successfully")
    elif copy_command == 'ndmpcopy ':
        template_dir = template.datastore_id.path
        logger.debug(template_dir)
        logger.debug("Copy in progress when storage type is " + str(storage_type))
        command_to_execute = copy_command + template_dir + '/' + get_constant("templates_dir") + '/' + \
            template.hdfile + ' ' + datastore.path + '/' + get_constant('vms') + '/' + \
            vm_details.vm_identity
        logger.debug("ndmpcopy command: " + str(command_to_execute))
        command_output = execute_remote_cmd(datastore.ds_ip, datastore.username, command_to_execute, datastore.password)
        logger.debug(command_output)
        logger.debug("Copied successfully.")
        # ndmpcopy keeps the template file name, so rename the copy to the vm image name
        try:
            vm_template_name = datastore.system_mount_point + '/' + get_constant('vms') + '/' + vm_details.vm_identity + '/' + template.hdfile
            os.rename(vm_template_name, vm_image_name)
            logger.debug("Template renamed successfully")
        except:
            logger.debug("Template rename not successful")
            raise Exception("Template rename not successful")
    return (template, vm_image_name)

def _get_install_command(vm_details, vm_image_location, vm_properties):
    """
    Generates the virt-install command for a vm
    """
    template = vm_properties['template']
    bus = ',bus=virtio'
    optional = ' --import --os-type=' + template.os
    model = ',model=virtio'
    if (template.arch != 'amd64' and template.os == 'Linux'):
        optional = optional + ' --arch=' + template.arch + ' '
    format_command = ''
    if (template.type == 'QCOW2'):
        format_command = ',format=qcow2'
    if (template.os == 'Windows'):
        bus = ''
        model = ''
    install_command = 'virt-install \
        --name=' + vm_details.vm_identity + ' \
        --ram=' + str(vm_properties['ram']) + ' \
        --vcpus=' + str(vm_properties['vcpus']) + optional + ' \
        --disk path=' + vm_image_location + format_command + bus + ',cache=none' + ' \
        --network network=' + current.LIBVIRT_NETWORK + model + ',mac=' + vm_properties['mac_addr'] + ' \
        --graphics vnc,port=' + vm_properties['vnc_port'] + ',listen=0.0.0.0,password=duolc \
        --noautoconsole \
        --autostart \
        --force'
    return install_command

def _generate_disk_xml(diskpath, target_disk):
    """
    Generates the libvirt <disk> xml for defining a new disk
    """
    root_element = etree.Element('disk', attrib={'type': 'block', 'device': 'disk'})
    etree.SubElement(root_element, 'driver', attrib={'name': 'qemu', 'cache': 'none', 'type': 'qcow2'})
    etree.SubElement(root_element, 'source', attrib={'dev': diskpath})
    etree.SubElement(root_element, 'target', attrib={'dev': target_disk})
    return (etree.tostring(root_element))
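For reference, here is the shape of the XML this helper produces, rebuilt with the same element structure; the `/dev/vg0/lv1` path and `vdb` target are made-up example values.

```python
import xml.etree.ElementTree as etree

def generate_disk_xml(diskpath, target_disk):
    # Same element structure as _generate_disk_xml() above
    root = etree.Element('disk', attrib={'type': 'block', 'device': 'disk'})
    etree.SubElement(root, 'driver', attrib={'name': 'qemu', 'cache': 'none', 'type': 'qcow2'})
    etree.SubElement(root, 'source', attrib={'dev': diskpath})
    etree.SubElement(root, 'target', attrib={'dev': target_disk})
    return etree.tostring(root)

xml_bytes = generate_disk_xml('/dev/vg0/lv1', 'vdb')
# Parsed back, the element carries the expected device/source/target attributes
parsed = etree.fromstring(xml_bytes)
```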

def create_extra_disk_image(vm_details, disk_name, size, datastore):
    """
    Create an extra disk image with qemu-img
    """
    vm_extra_disks_directory_path = datastore.system_mount_point + '/' + get_constant('extra_disks_dir') + '/' + \
        datastore.ds_name + '/' + vm_details.vm_identity
    if not os.path.exists(vm_extra_disks_directory_path):
        logger.debug("Making Directory")
        os.makedirs(vm_extra_disks_directory_path)
    diskpath = vm_extra_disks_directory_path + '/' + disk_name
    command = "qemu-img create -f qcow2 " + diskpath + " " + str(size) + "G"
    output = os.system(command)
    return False if output != 0 else True
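The command string assembled above is plain concatenation; a quick check of its shape (the mount-point path below is hypothetical):

```python
def qemu_img_create_cmd(diskpath, size_gb):
    # Mirrors the string built in create_extra_disk_image()
    return "qemu-img create -f qcow2 " + diskpath + " " + str(size_gb) + "G"
```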

def attach_disk(vm_details, disk_name, hostip, already_attached_disks, new_vm):
    """
    Attach the given disk to the VM
    """
    try:
        (connection_object, domain) = getVirshDomainConn(None, hostip, vm_details.vm_identity)
        # already_attached_disks = len(current.db(current.db.attached_disks.vm_id == vm.id).select())
        logger.debug("Value of already_attached_disks is: " + str(already_attached_disks))
        (diskpath, device_present, disk_size) = get_extra_disk_location(vm_details.datastore_id, vm_details.vm_identity, disk_name, True)
        if not device_present:
            raise Exception("Device to be attached %s missing" % (diskpath))
        # Attaching disk to vm using the libvirt API; "vda" is the root disk,
        # so the first extra disk becomes "vdb", the second "vdc", and so on
        target_disk = "vd" + chr(97 + already_attached_disks + 1)
        logger.debug(target_disk)
        xmlDescription = _generate_disk_xml(diskpath, target_disk)
        logger.debug(xmlDescription)
        logger.debug("new vm is %s " % new_vm)
        if new_vm:
            logger.debug("Starting to attach disk on new vm request.")
            domain.destroy()
            logger.debug("VM destroyed")
            domain.attachDeviceFlags(xmlDescription, VIR_DOMAIN_AFFECT_CONFIG)
            logger.debug("Disk attached")
            logger.debug("Turn on vm")
            domain.create()
            logger.debug("VM started")
            domain.isActive()
        elif vm_details.status == current.VM_STATUS_SHUTDOWN:
            logger.debug("Starting to attach disk while vm is shutdown.")
            domain.attachDeviceFlags(xmlDescription, VIR_DOMAIN_AFFECT_CONFIG)
            logger.debug("Disk attached")
        else:
            raise Exception("VM is not in shutdown state. Check its status on host")
        xmlfile = domain.XMLDesc(0)
        domain = connection_object.defineXML(xmlfile)
        logger.debug("VM XML redefined")
        connection_object.close()
        return disk_size
    except:
        logger.exception('Exception: ')
        return 0

def serve_extra_disk_request(vm_details, disk_size, host_ip, new_vm=False):
    """
    Serves an extra disk request and updates the db
    """
    logger.debug("Starting to serve extra disk request...")
    logger.debug("new vm is %s " % new_vm)
    datastore = _choose_datastore()
    already_attached_disks = len(current.db(current.db.attached_disks.vm_id == vm_details.id).select())
    disk_name = vm_details.vm_identity + "_disk" + str(already_attached_disks + 1) + ".qcow2"
    disk_created = create_extra_disk_image(vm_details, disk_name, disk_size, datastore)
    vm_details.datastore_id = datastore.id
    if disk_created:
        if (attach_disk(vm_details, disk_name, host_ip, already_attached_disks, new_vm)):
            current.db.attached_disks.insert(vm_id=vm_details.id, datastore_id=datastore.id, attached_disk_name=disk_name, capacity=disk_size)
            current.db(current.db.datastore.id == datastore.id).update(used=int(datastore.used) + int(disk_size))
            return True
    return False

def launch_vm_on_host(vm_details, vm_image_location, vm_properties):
    """
    Launches a vm image on the selected host
    """
    attach_disk_status_message = ''
    install_command = _get_install_command(vm_details, vm_image_location, vm_properties)
    # Starts installing a vm
    host_ip = current.db.host[vm_properties['host']].host_ip.private_ip
    logger.debug("Installation started...")
    logger.debug("Host is " + host_ip)
    logger.debug("Installation command: " + install_command)
    command_output = execute_remote_cmd(host_ip, 'root', install_command)
    logger.debug(command_output)

    logger.debug("Starting to set portgroup in vm...")
    _set_portgroup_in_vm(vm_details['vm_identity'], vm_properties['vlan_name'], host_ip, vm_properties['vlan_tag'])
    logger.debug("Portgroup set in vm")

    # Serving HDD request
    if (int(vm_details.extra_HDD) != 0):
        if (serve_extra_disk_request(vm_details, vm_details.extra_HDD, host_ip, new_vm=True)):
            message = "Attached extra disk successfully."
            attach_disk_status_message += message
            logger.debug(message)
        else:
            attach_disk_status_message += "Attaching extra disk failed."
    return attach_disk_status_message

def check_if_vm_defined(hostip, vmname):
    """
    Checks if a newly created vm is successfully defined
    """
    vm_defined = False
    try:
        connection_object = libvirt.openReadOnly('qemu+ssh://root@' + hostip + '/system')
        domain = connection_object.lookupByName(vmname)
        if domain.ID() in connection_object.listDomainsID():
            vm_defined = True
        connection_object.close()
        return vm_defined
    except:
        return False

def _free_vm_properties(vm_details, vm_properties):
    """
    Frees vm properties in case the installation has failed mid-way
    """
    logger.debug("VM installation failed. Starting to free vm properties")
    if vm_properties:
        host_ip_of_vm = current.db.host[vm_properties['host']].host_ip.private_ip
        logger.debug("Host IP of vm is " + str(host_ip_of_vm))
        if check_if_vm_defined(host_ip_of_vm, vm_details.vm_identity):
            connection_object = libvirt.open('qemu+ssh://root@' + host_ip_of_vm + '/system')
            domain = connection_object.lookupByName(vm_details.vm_identity)
            logger.debug("Starting to delete vm from host..")
            domain.destroy()
            domain.undefine()
            connection_object.close()
            logger.debug("VM deleted.")
        current.db(current.db.attached_disks.vm_id == vm_details.id).delete()
        if 'datastore' in vm_properties:
            vm_directory_path = vm_properties['datastore'].system_mount_point + '/' + get_constant('vms') + '/' + vm_details.vm_identity
            vm_extra_disk_dir_path = vm_properties['datastore'].system_mount_point + '/' + get_constant('extra_disks_dir') + '/' + vm_properties['datastore'].ds_name + '/' + vm_details.vm_identity
            if os.path.exists(vm_directory_path):
                logger.debug("Starting to delete vm directory.")
                shutil.rmtree(vm_directory_path)
            if os.path.exists(vm_extra_disk_dir_path):
                logger.debug("Starting to delete vm extra disk directory.")
                shutil.rmtree(vm_extra_disk_dir_path)
    return

def update_db_after_vm_installation(vm_details, vm_properties, parent_id=None):
    """
    Updates the db after a vm is installed successfully
    """
    logger.debug("Starting to update db after vm installation..")
    hostid = vm_properties['host']
    datastore = vm_properties['datastore']
    template_hdd = vm_properties['template'].hdd
    logger.debug(vm_properties)

    # Updating the used entry of the datastore
    current.db(current.db.datastore.id == datastore.id).update(
        used=int(datastore.used) + int(vm_details.extra_HDD) + int(template_hdd))

    private_ip_id = current.db.private_ip_pool(private_ip=vm_properties['private_ip']).id
    public_ip_id = None
    if vm_properties['public_ip'] != None:
        public_ip_id = current.db.public_ip_pool(public_ip=vm_properties['public_ip']).id

    if parent_id:
        vm_status = current.VM_STATUS_SHUTDOWN
    else:
        vm_status = current.VM_STATUS_RUNNING

    # Update the vm_data table
    current.db(current.db.vm_data.id == vm_details.id).update(host_id=hostid,
                                                              extra_HDD=vm_details.extra_HDD,
                                                              datastore_id=datastore.id,
                                                              vnc_port=vm_properties['vnc_port'],
                                                              private_ip=private_ip_id,
                                                              public_ip=public_ip_id,
                                                              start_time=get_datetime(),
                                                              parent_id=parent_id,
                                                              status=vm_status)
    logger.debug("Updated db")
    return

def create_object_store(parameters, object_data):
    """
    Creates an object store and records its access keys
    """
    try:
        logger.debug("In create_object_store() function...")
        object_name = object_data['object_store_name']
        size_limit = object_data['object_store_size']
        sh_path = os.path.join(get_context_path(), 'private/object_storage.sh')
        command = 'sh %s %s %s' % (sh_path, object_name, str(size_limit))
        logger.debug("command: %s" % command)
        file_name = object_data['object_store_name'] + "_key.txt"
        file_path = os.path.join(get_context_path(), 'private/Object_keys/' + file_name)
        cp = os.open(file_path, os.O_RDWR | os.O_CREAT)
        co = os.fdopen(cp, "r+")  # "rw+" is not a valid mode; "r+" gives read/write access
        fd = os.open('/home/key.txt', os.O_RDWR | os.O_CREAT)
        fo = os.fdopen(fd, "r+")
        key_s3_secret = fo.readline()
        co.write(key_s3_secret)
        key_s3_access = fo.readline()
        co.write(key_s3_access)
        key_swift_secret = fo.readline()
        co.write(key_swift_secret)
        swift_user = 'Swift_user: ' + object_name + ':swift'
        co.write(swift_user)
        co.close()
        # Each line looks like "<label> <key>"; keep only the key part
        a, b, key_swift_secret = key_swift_secret.partition(' ')  # @UnusedVariable
        a, b, key_s3_secret = key_s3_secret.partition(' ')  # @UnusedVariable
        a, b, key_s3_access = key_s3_access.partition(' ')  # @UnusedVariable
        object_data.update_record(swift_access_key=key_swift_secret.strip(), s3_secret_key=key_s3_secret.strip(), s3_access_key=key_s3_access.strip(), status=3)
        fo.close()
        message = "Object Store is created successfully."
        return (current.TASK_QUEUE_STATUS_SUCCESS, message)
    except:
        logger.debug("Task Status: FAILED Error: %s " % log_exception())
        return (current.TASK_QUEUE_STATUS_FAILED, log_exception())

# Installs a vm
def install(parameters):
    """
    Installs a vm
    """
    vmid = parameters['vm_id']
    logger.debug("In install() function...")
    vm_details = current.db.vm_data[vmid]
    vm_properties = None
    try:
        # Fetches vm details from the vm_data table
        logger.debug("VM details are: " + str(vm_details))
        # Calling allocate_vm_properties function
        vm_properties = allocate_vm_properties(vm_details)
        # Calling create_vm_image function
        (vm_properties['template'], vm_image_location) = create_vm_image(vm_details, vm_properties['datastore'])
        # Calling launch_vm_on_host
        attach_disk_status_message = launch_vm_on_host(vm_details, vm_image_location, vm_properties)
        # Checking if the vm has been installed successfully
        assert(check_if_vm_defined(current.db.host[vm_properties['host']].host_ip.private_ip, vm_details.vm_identity)), "VM is not installed. Check logs."
        if vm_properties['public_ip_req']:
            create_mapping(vm_properties['public_ip'], vm_properties['private_ip'])
        # Update database after vm installation
        update_db_after_vm_installation(vm_details, vm_properties)
        message = "VM is installed successfully. " + attach_disk_status_message
        logger.debug("Task Status: SUCCESS Message: %s " % message)
        return (current.TASK_QUEUE_STATUS_SUCCESS, message)
    except:
        if vm_properties != None:
            _free_vm_properties(vm_details, vm_properties)
        logger.debug("Task Status: FAILED Error: %s " % log_exception())
        return (current.TASK_QUEUE_STATUS_FAILED, log_exception())

def start(parameters):
    """
    Starts a vm
    """
    logger.debug("Inside start() function")
    vm_id = parameters['vm_id']
    vm_details = current.db.vm_data[vm_id]
    try:
        domain = getVirshDomain(vm_details)
        if domain.info()[0] == VIR_DOMAIN_RUNNING:
            raise Exception("VM is already running. Check vm status on host.")
        domain.create()
        current.db(current.db.vm_data.id == vm_id).update(status=current.VM_STATUS_RUNNING)
        message = vm_details.vm_identity + " is started successfully."
        logger.debug("Task Status: SUCCESS Message: %s " % message)
        return (current.TASK_QUEUE_STATUS_SUCCESS, message)
    except:
        logger.debug("Task Status: FAILED Error: %s " % log_exception())
        return (current.TASK_QUEUE_STATUS_FAILED, log_exception())


def suspend(parameters):
    """
    Suspends a vm
    """
    logger.debug("Inside suspend() function")
    vm_id = parameters['vm_id']
    vm_details = current.db.vm_data[vm_id]
    try:
        domain = getVirshDomain(vm_details)
        if domain.info()[0] == VIR_DOMAIN_PAUSED:
            raise Exception("VM is already paused. Check vm status on host.")
        domain.suspend()
        current.db(current.db.vm_data.id == vm_id).update(status=current.VM_STATUS_SUSPENDED)
        message = vm_details.vm_identity + " is suspended successfully."
        logger.debug("Task Status: SUCCESS Message: %s " % message)
        return (current.TASK_QUEUE_STATUS_SUCCESS, message)
    except:
        logger.debug("Task Status: FAILED Error: %s " % log_exception())
        return (current.TASK_QUEUE_STATUS_FAILED, log_exception())


def resume(parameters):
    """
    Resumes a vm
    """
    logger.debug("Inside resume() function")
    vm_id = parameters['vm_id']
    vm_details = current.db.vm_data[vm_id]
    try:
        domain = getVirshDomain(vm_details)
        if domain.info()[0] == VIR_DOMAIN_RUNNING:
            raise Exception("VM is already running. Check vm status on host.")
        domain.resume()
        current.db(current.db.vm_data.id == vm_id).update(status=current.VM_STATUS_RUNNING)
        message = vm_details.vm_identity + " is resumed successfully."
        logger.debug("Task Status: SUCCESS Message: %s " % message)
        return (current.TASK_QUEUE_STATUS_SUCCESS, message)
    except:
        logger.debug("Task Status: FAILED Error: %s " % log_exception())
        return (current.TASK_QUEUE_STATUS_FAILED, log_exception())

def destroy(parameters):
    """
    Stops a vm forcefully (virsh destroy)
    """
    logger.debug("Inside destroy() function")
    vm_id = parameters['vm_id']
    vm_details = current.db.vm_data[vm_id]
    logger.debug(str(vm_details))
    try:
        domain = getVirshDomain(vm_details)
        if domain.info()[0] == VIR_DOMAIN_SHUTOFF:
            raise Exception("VM is already shutoff. Check vm status on host.")
        domain.destroy()
        current.db(current.db.vm_data.id == vm_id).update(status=current.VM_STATUS_SHUTDOWN)
        message = vm_details.vm_identity + " is destroyed successfully."
        logger.debug("Task Status: SUCCESS Message: %s " % message)
        return (current.TASK_QUEUE_STATUS_SUCCESS, message)
    except:
        logger.debug("Task Status: FAILED Error: %s " % log_exception())
        return (current.TASK_QUEUE_STATUS_FAILED, log_exception())


def shutdown(parameters):
    """
    Shuts down a vm gracefully, saving its state (managed save)
    """
    logger.debug("Inside shutdown() function")
    vm_id = parameters['vm_id']
    vm_details = current.db.vm_data[vm_id]
    logger.debug(str(vm_details))
    try:
        domain = getVirshDomain(vm_details)
        if domain.info()[0] == VIR_DOMAIN_SHUTOFF:
            raise Exception("VM is already shutoff. Check vm status on host.")
        domain.managedSave()
        current.db(current.db.vm_data.id == vm_id).update(status=current.VM_STATUS_SHUTDOWN)
        message = vm_details.vm_identity + " is shutdown successfully."
        logger.debug("Task Status: SUCCESS Message: %s " % message)
        return (current.TASK_QUEUE_STATUS_SUCCESS, message)
    except:
        logger.debug("Task Status: FAILED Error: %s " % log_exception())
        return (current.TASK_QUEUE_STATUS_FAILED, log_exception())

def _clean_up_database_after_vm_deletion(vm_details):
    """
    Cleans up the database after vm deletion
    """
    logger.debug("Inside clean up database after vm deletion() function...")
    # Moving the vm image folder to the archives folder
    archive_directory_path = vm_details.datastore_id.system_mount_point + '/' + get_constant('archives_dir')
    if not os.path.exists(archive_directory_path):
        os.makedirs(archive_directory_path)
    source_file = vm_details.datastore_id.system_mount_point + '/' + get_constant('vms') + '/' + vm_details.vm_identity
    archive_filename = vm_details.vm_identity + str(get_datetime())
    logger.debug(archive_filename)
    destination_file = archive_directory_path + '/' + archive_filename
    shutil.move(source_file, destination_file)
    # Removing the extra disks
    vm_extra_disks_directory_path = vm_details.datastore_id.system_mount_point + '/' + get_constant('extra_disks_dir') + '/' + \
        vm_details.datastore_id.ds_name + "/" + vm_details.vm_identity
    if os.path.exists(vm_extra_disks_directory_path):
        shutil.rmtree(vm_extra_disks_directory_path)
    # Updating the used entry of the datastore
    current.db(current.db.datastore.id == vm_details.datastore_id).update(used=int(vm_details.datastore_id.used) -
                                                                          (int(vm_details.extra_HDD) + int(vm_details.template_id.hdd)))
    # Updating task_queue_event entries to remove references to the VM
    current.db(current.db.task_queue_event.vm_id == vm_details.id).update(vm_id=None)
    # Deleting the entries for the vm's extra disks
    current.db(current.db.attached_disks.vm_id == vm_details.id).delete()
    logger.debug("Database cleaned")


def vm_has_snapshots(vm_id):
    """
    Checks if a vm has snapshot(s)
    """
    if (current.db(current.db.snapshot.vm_id == vm_id).select()):
        return True
    else:
        return False

def delete(parameters):
    """
    Deletes a vm
    """
    logger.debug("Inside delete() function")
    vm_id = parameters['vm_id']
    vm_details = current.db.vm_data[vm_id]
    try:
        domain = getVirshDomain(vm_details)
        logger.debug(str(vm_details.status))
        if (vm_details.status == current.VM_STATUS_RUNNING or vm_details.status == current.VM_STATUS_SUSPENDED):
            logger.debug("VM is not shutoff. Shutting it off first.")
            domain.destroy()
        logger.debug("Starting to delete it...")
        domain.undefineFlags(VIR_DOMAIN_UNDEFINE_SNAPSHOTS_METADATA)
        if vm_details.public_ip:
            remove_mapping(vm_details.public_ip.public_ip, vm_details.private_ip.private_ip)
        message = vm_details.vm_identity + " is deleted successfully."
        logger.debug(message)
        _clean_up_database_after_vm_deletion(vm_details)
        current.db(current.db.vm_data.id == vm_id).delete()
        current.db.commit()
        logger.debug("Task Status: SUCCESS Message: %s " % message)
        return (current.TASK_QUEUE_STATUS_SUCCESS, message)
    except:
        logger.debug("Task Status: FAILED Error: %s " % log_exception())
        return (current.TASK_QUEUE_STATUS_FAILED, log_exception())
def migrate_domain_with_snapshots(vm_details, destination_host_ip, domain, domain_snapshots_list, current_snapshot_name, flags, vm_backup_during_migration):
    """
    Migrate a domain along with its snapshots
    """
    # Take an XML dump of each snapshot of the VM
    logger.debug("Starting to take xml dump of the snapshot(s) of the vm...")
    if not os.path.exists(vm_backup_during_migration):
        os.makedirs(vm_backup_during_migration)
    for domain_snapshot in domain_snapshots_list:
        logger.debug("snapshot name is " + str(domain_snapshot))
        dump_xml_path = vm_backup_during_migration + '/' + 'dump_' + domain_snapshot
        snapshot_dumpxml_command = 'virsh snapshot-dumpxml %s %s > %s' % (vm_details.vm_identity, domain_snapshot, dump_xml_path)
        logger.debug("Taking xml dump of " + str(domain_snapshot))
        command_output = execute_remote_cmd(vm_details.host_id.host_ip.private_ip, 'root', snapshot_dumpxml_command)
        logger.debug(command_output)
        logger.debug("XML dump of " + str(domain_snapshot) + " succeeded.")
    # Delete the snapshot(s) of the VM and migrate it to the destination host
    logger.debug("Starting to delete snapshots of the vm....")
    for domain_snapshot in domain_snapshots_list:
        snapshot = domain.snapshotLookupByName(domain_snapshot, 0)
        snapshot.delete(0)
    logger.debug("Migrating the vm to destination host...")
    domain.migrateToURI("qemu+ssh://root@" + destination_host_ip + "/system", flags, None, 0)
    # Redefine all the snapshot(s) of the VM on the destination host and set the current snapshot
    logger.debug("Starting to redefine all the snapshot(s) of the domain...")
    for domain_snapshot in domain_snapshots_list:
        redefine_xml_path = vm_backup_during_migration + '/' + 'dump_' + domain_snapshot
        snapshot_redefine_command = 'virsh snapshot-create --redefine %s %s' % (vm_details.vm_identity, redefine_xml_path)
        command_output = execute_remote_cmd(destination_host_ip, 'root', snapshot_redefine_command)
        logger.debug(command_output)
    snapshot_current_command = 'virsh snapshot-current %s %s' % (vm_details.vm_identity, current_snapshot_name)
    command_output = execute_remote_cmd(destination_host_ip, 'root', snapshot_current_command)
    logger.debug(command_output)
    return
def _clean_migration_directory(vm_backup_during_migration):
    """
    Delete the directory created for storing dumpxml of VM snapshots
    """
    if os.path.exists(vm_backup_during_migration):
        shutil.rmtree(vm_backup_during_migration)
    return
def undo_migration(vm_details, domain_snapshots_list, current_snapshot_name, vm_backup_during_migration):
    """
    Undo the migration
    """
    if domain_snapshots_list:
        # Redefine the snapshots of the VM on the source host
        logger.debug("Starting to redefine all the snapshot(s) of the vm on the source host...")
        for domain_snapshot in domain_snapshots_list:
            redefine_xml_path = vm_backup_during_migration + '/' + 'dump_' + domain_snapshot
            snapshot_redefine_command = 'virsh snapshot-create --redefine %s %s' % (vm_details.vm_identity, redefine_xml_path)
            command_output = execute_remote_cmd(vm_details.host_id.host_ip.private_ip, 'root', snapshot_redefine_command, None, True)
            logger.debug(command_output)
        snapshot_current_command = 'virsh snapshot-current %s %s' % (vm_details.vm_identity, current_snapshot_name)
        command_output = execute_remote_cmd(vm_details.host_id.host_ip.private_ip, 'root', snapshot_current_command, None, True)
        logger.debug(command_output)
    # Delete the directory created for storing dumpxml of VM snapshots
    _clean_migration_directory(vm_backup_during_migration)
    return
def migrate_domain(vm_id, destination_host_id=None, live_migration=False):
    """
    Migrate a domain
    """
    vm_details = current.db.vm_data[vm_id]
    domain_snapshots_list = []
    current_snapshot_name = ''
    vm_migration_directory = get_constant('vm_migration_data')
    vm_backup_during_migration = vm_details.datastore_id.system_mount_point + '/' + vm_migration_directory + '/' + \
                                 vm_details.vm_identity
    if destination_host_id is None:
        destination_host_id = find_new_host(vm_details.RAM, vm_details.vCPU)
    destination_host_ip = current.db.host[destination_host_id].host_ip.private_ip
    flags = VIR_MIGRATE_PEER2PEER | VIR_MIGRATE_PERSIST_DEST | VIR_MIGRATE_UNDEFINE_SOURCE | VIR_MIGRATE_UNSAFE
    if live_migration:
        flags |= VIR_MIGRATE_TUNNELLED | VIR_MIGRATE_LIVE
    if vm_details.status == current.VM_STATUS_SUSPENDED:
        logger.debug("VM is suspended")
        flags |= VIR_MIGRATE_TUNNELLED | VIR_MIGRATE_PAUSED
    elif vm_details.status == current.VM_STATUS_SHUTDOWN:
        logger.debug("VM is shut off")
        flags |= VIR_MIGRATE_OFFLINE
    logger.debug("Flags: " + str(flags))
    try:
        domain = getVirshDomain(vm_details)
        dom_snapshot_names = domain.snapshotListNames(0)
        for snapshot in current.db(current.db.snapshot.vm_id == vm_id).select():
            logger.debug("snapshot:" + str(snapshot.snapshot_name))
            domain_snapshots_list.append(snapshot.snapshot_name)
            dom_snapshot_names.remove(snapshot.snapshot_name)
        logger.debug("domain snapshot list is " + str(domain_snapshots_list))
        # Snapshots known to libvirt but absent from the database are orphans; delete them
        for dom_snapshot in dom_snapshot_names:
            logger.debug("Deleting orphan snapshot %s" % (dom_snapshot))
            snapshot = domain.snapshotLookupByName(dom_snapshot, 0)
            snapshot.delete(0)
        if domain_snapshots_list:
            current_snapshot = domain.snapshotCurrent(0)
            current_snapshot_name = current_snapshot.getName()
            migrate_domain_with_snapshots(vm_details, destination_host_ip, domain, domain_snapshots_list, current_snapshot_name, flags, vm_backup_during_migration)
        else:
            domain.migrateToURI("qemu+ssh://root@" + destination_host_ip + "/system", flags, None, 0)
        vm_details.update_record(host_id = destination_host_id)
        current.db.commit()
        # Delete the directory created for storing dumpxml of VM snapshots
        _clean_migration_directory(vm_backup_during_migration)
        message = vm_details.vm_identity + " is migrated successfully."
        logger.debug("Task Status: SUCCESS Message: %s " % message)
        return (current.TASK_QUEUE_STATUS_SUCCESS, message)
    except:
        undo_migration(vm_details, domain_snapshots_list, current_snapshot_name, vm_backup_during_migration)
        logger.debug("Task Status: FAILED Error: %s " % log_exception())
        return (current.TASK_QUEUE_STATUS_FAILED, log_exception())
def migrate_domain_datastore(vmid, destination_datastore_id, live_migration=False):
    """
    Migrate a VM domain from one datastore to another.
    - Copy the VM image to the new datastore
    - Update the VM XML definition
    - Update the database
    """
    logger.debug(sys.path)
    vm_details = current.db.vm_data[vmid]
    # datastore_id = vm_details["datastore_id"]
    logger.debug("Inside live disk migration block")
    try:
        (connection_object, domain) = getVirshDomainConn(vm_details)
        datastore = current.db.datastore[destination_datastore_id]
        vm_directory_path = datastore.system_mount_point + get_constant('vms') + '/' + vm_details.vm_identity
        logger.debug("Creating vm directory on other datastore...")
        if not os.path.exists(vm_directory_path):
            os.makedirs(vm_directory_path)
        diskpath = vm_directory_path + '/' + vm_details.vm_identity + '.qcow2'
        current_disk_path = vm_details.datastore_id.system_mount_point + get_constant('vms') + '/' + vm_details.vm_identity
        current_disk_file = current_disk_path + '/' + vm_details.vm_identity + '.qcow2'
        logger.debug(current_disk_file)
        xmlfile = domain.XMLDesc(0)
        # Parse the domain XML up front; both branches below need it to rewrite the disk path
        root = etree.fromstring(xmlfile)
        if not live_migration:
            rc = os.system("cp %s %s" % (current_disk_file, diskpath))
            if rc != 0:
                logger.error("Copy not successful")
                raise Exception("Copy not successful")
            else:
                logger.debug("Copied successfully")
        else:
            if domain.isActive():
                domain.undefine()
            target_elem = root.find("devices/disk/target")
            target_disk = target_elem.get('dev')
            # destxml = generate_blockcopy_xml(diskpath, target_disk)
            flag = VIR_DOMAIN_BLOCK_REBASE_SHALLOW | VIR_DOMAIN_BLOCK_REBASE_COPY
            domain.blockRebase(target_disk, diskpath, 0, flag)
            block_info_list = domain.blockJobInfo(current_disk_file, 0)
            # Wait for the block copy job to converge before pivoting
            while block_info_list['end'] != block_info_list['cur']:
                logger.debug("time to sleep")
                time.sleep(60)
                block_info_list = domain.blockJobInfo(current_disk_file, 0)
            domain.blockJobAbort(current_disk_file, VIR_DOMAIN_BLOCK_JOB_ABORT_PIVOT)
        source_elem = root.find("devices/disk/source")
        source_elem.set('file', diskpath)
        newxml_file = etree.tostring(root)
        domain = connection_object.defineXML(newxml_file)
        vm_details.update_record(datastore_id=destination_datastore_id)
        if os.path.exists(diskpath):
            os.remove(current_disk_file)
            restore_symboltable_path = current_disk_path + "/restore_symboltable"
            if os.path.exists(restore_symboltable_path):
                logger.debug(restore_symboltable_path)
                os.remove(restore_symboltable_path)
            os.rmdir(current_disk_path)
        connection_object.close()
        message = vm_details.vm_identity + " is migrated successfully to new datastore."
        logger.debug("Task Status: SUCCESS Message: %s " % message)
        return (current.TASK_QUEUE_STATUS_SUCCESS, message)
    except:
        # undo_datastore_migration(vm_details, domain, diskpath, current_disk_file, vm_directory_path, datastore_id)
        connection_object.close()
        logger.debug("Task Status: FAILED Error: %s " % log_exception())
        return (current.TASK_QUEUE_STATUS_FAILED, log_exception())
def undo_datastore_migration(vm_details, domain, diskpath, current_disk_file, vm_directory_path, datastore_id):
    """
    Undo the datastore migration in case of any issue
    """
    # Undo database changes
    vm_details.update_record(datastore_id=datastore_id)
    if domain.isActive():
        logger.debug("domain is active")
        block_info_list = domain.blockJobInfo(current_disk_file, 0)
        if block_info_list:
            # Wait for any in-flight block job to finish, then abort it
            while block_info_list['end'] != block_info_list['cur']:
                logger.debug("time to sleep")
                time.sleep(60)
                block_info_list = domain.blockJobInfo(current_disk_file, 0)
            if block_info_list['end'] == block_info_list['cur']:
                domain.blockJobAbort(current_disk_file)
                block_info_list = domain.blockJobInfo(current_disk_file, 0)
    if os.path.exists(diskpath):
        os.remove(diskpath)
        os.rmdir(vm_directory_path)
def migrate(parameters):
    """
    Migrates a VM to a new host
    """
    vmid = parameters['vm_id']
    logger.debug("Inside migrate() function for vm_id: " + str(vmid))
    destination_host_id = parameters['destination_host']
    live_migration = (parameters['live_migration'] == 'on')
    return migrate_domain(vmid, destination_host_id, live_migration)
def migrate_datastore(parameters):
    """
    Migrates a VM to a new datastore
    """
    logger.debug("Inside migrate_datastore() function")
    vmid = parameters['vm_id']
    destination_ds_id = parameters['destination_ds']
    live_migration = (parameters['live_migration'] == 'on')
    return migrate_domain_datastore(vmid, destination_ds_id, live_migration)
def snapshot(parameters):
    """
    Snapshots a VM
    """
    logger.debug("Inside snapshot() function")
    vm_id = parameters['vm_id']
    snapshot_type = parameters['snapshot_type']
    try:
        vm_details = current.db.vm_data[vm_id]
        if is_pingable(str(vm_details.private_ip.private_ip)):
            logger.debug("VM is pingable. Proceeding with snapshotting...")
            if snapshot_type != current.SNAPSHOT_USER:
                # Delete the existing Daily/Monthly/Yearly snapshot of this type
                snapshots = current.db((current.db.snapshot.vm_id == vm_id) & (current.db.snapshot.type == snapshot_type)).select()
                for snapshot_cron in snapshots:
                    logger.debug(snapshot_cron)
                    delete_snapshot({'vm_id': vm_id, 'snapshot_id': snapshot_cron.id})
            snapshot_name = get_datetime().strftime("%I:%M%p_%B%d,%Y")
            domain = getVirshDomain(vm_details)
            xmlDesc = "<domainsnapshot><name>%s</name></domainsnapshot>" % (snapshot_name)
            domain.snapshotCreateXML(xmlDesc, 0)
            message = "Snapshotted successfully."
            current.db.snapshot.insert(vm_id = vm_id, datastore_id = vm_details.datastore_id, snapshot_name = snapshot_name, type = snapshot_type)
            logger.debug("Task Status: SUCCESS Message: %s " % message)
            return (current.TASK_QUEUE_STATUS_SUCCESS, message)
        else:
            raise Exception("Unable to ping VM before snapshotting: %s" % (vm_details.private_ip.private_ip))
    except:
        logger.debug("Task Status: FAILED Error: %s " % log_exception())
        return (current.TASK_QUEUE_STATUS_FAILED, log_exception())
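# A minimal, self-contained sketch of the snapshot naming scheme used by
# snapshot() above, assuming a plain standard-library datetime in place of
# this module's get_datetime(). The format "%I:%M%p_%B%d,%Y" yields names
# like "09:30AM_January05,2020" (exact AM/PM and month text depend on locale).

```python
from datetime import datetime

def make_snapshot_name(now):
    # Same format string as snapshot() uses for auto-generated names
    return now.strftime("%I:%M%p_%B%d,%Y")

name = make_snapshot_name(datetime(2020, 1, 5, 9, 30))
```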
def revert(parameters):
    """
    Reverts a VM to a snapshot
    """
    logger.debug("Inside revert() function")
    vm_id = parameters['vm_id']
    snapshotid = parameters['snapshot_id']
    vm_details = current.db.vm_data[vm_id]
    try:
        domain = getVirshDomain(vm_details)
        snapshot_name = current.db(current.db.snapshot.id == snapshotid).select().first()['snapshot_name']
        snapshot = domain.snapshotLookupByName(snapshot_name, 0)
        domain.revertToSnapshot(snapshot, 0)
        message = "Reverted to snapshot successfully."
        logger.debug("Task Status: SUCCESS Message: %s " % message)
        return (current.TASK_QUEUE_STATUS_SUCCESS, message)
    except:
        logger.debug("Task Status: FAILED Error: %s " % log_exception())
        return (current.TASK_QUEUE_STATUS_FAILED, log_exception())
def delete_snapshot(parameters):
    """
    Deletes a snapshot
    """
    logger.debug("Inside delete_snapshot() function")
    vm_id = parameters['vm_id']
    snapshotid = parameters['snapshot_id']
    vm_details = current.db.vm_data[vm_id]
    logger.debug(str(vm_details))
    try:
        domain = getVirshDomain(vm_details)
        snapshot_name = current.db(current.db.snapshot.id == snapshotid).select().first()['snapshot_name']
        snapshot = None
        try:
            snapshot = domain.snapshotLookupByName(snapshot_name, 0)
        except libvirtError:
            logger.debug("Snapshot %s not found" % (snapshot_name))
        if snapshot is not None:
            snapshot.delete(0)
        message = "Deleted snapshot successfully."
        logger.debug(message)
        current.db(current.db.snapshot.id == snapshotid).delete()
        logger.debug("Task Status: SUCCESS Message: %s " % message)
        return (current.TASK_QUEUE_STATUS_SUCCESS, message)
    except:
        logger.debug("Task Status: FAILED Error: %s " % log_exception())
        return (current.TASK_QUEUE_STATUS_FAILED, log_exception())
def update_security_domain(vm_details, security_domain_id, xmlDesc=None):
    """
    Get a new IP for the given security domain.
    Update the VM XML with the new mac address and update the information in DB.
    """
    # Fetch a new private IP from the given security domain
    private_ip_info = _get_private_ip_mac(security_domain_id)
    # Update the VM config to use the new mac address
    root = etree.fromstring(xmlDesc)
    mac_elem = root.find("devices/interface[@type='bridge']/mac")
    mac_elem.set('address', private_ip_info.mac_addr)
    vlan_tag_elem = root.find("devices/interface[@type='bridge']/vlan/tag")
    vlan_tag_elem.set('id', private_ip_info.vlan.vlan_tag)
    # Update the NAT IP mapping, if a public IP is present
    if vm_details.public_ip:
        remove_mapping(vm_details.public_ip.public_ip, vm_details.private_ip.private_ip)
        create_mapping(vm_details.public_ip.public_ip, private_ip_info.private_ip)
    # Update vm_data
    current.db(current.db.vm_data.id == vm_details.id).update(security_domain = security_domain_id,
                                                              private_ip = private_ip_info.id)
    return etree.tostring(root)
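# A self-contained sketch of the XML rewrite performed by update_security_domain():
# parse a trimmed-down libvirt domain XML, point the bridge interface at a new MAC
# address and VLAN tag via attribute-predicate paths, and serialize it back. The
# XML, MAC, and tag values below are illustrative placeholders, not real dumps;
# the stdlib ElementTree is assumed here in place of this module's etree import.

```python
from xml.etree import ElementTree as etree

xml = """<domain>
  <devices>
    <interface type='bridge'>
      <mac address='52:54:00:aa:bb:cc'/>
      <vlan><tag id='101'/></vlan>
    </interface>
  </devices>
</domain>"""

root = etree.fromstring(xml)
# Same find() paths as update_security_domain() uses
mac_elem = root.find("devices/interface[@type='bridge']/mac")
mac_elem.set('address', '52:54:00:11:22:33')
vlan_tag_elem = root.find("devices/interface[@type='bridge']/vlan/tag")
vlan_tag_elem.set('id', '202')
new_xml = etree.tostring(root)
```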
def edit_vm_config(parameters):
    """
    Edits VM configuration
    """
    logger.debug("Inside edit_vm_config() function")
    vm_id = parameters['vm_id']
    vm_details = current.db.vm_data[vm_id]
    message = ""
    try:
        connection_object, domain = getVirshDomainConn(vm_details)
        if 'vcpus' in parameters:
            new_vcpus = int(parameters['vcpus'])
            domain.setVcpusFlags(new_vcpus, VIR_DOMAIN_VCPU_MAXIMUM)
            domain.setVcpusFlags(new_vcpus, VIR_DOMAIN_AFFECT_CONFIG)
            message += "Edited vCPU successfully. "
            current.db(current.db.vm_data.id == vm_id).update(vCPU = new_vcpus)
        if 'ram' in parameters:
            new_ram = int(parameters['ram']) * 1024
            logger.debug(str(new_ram))
            domain.setMemoryFlags(new_ram, VIR_DOMAIN_MEM_MAXIMUM)
            domain.setMemoryFlags(new_ram, VIR_DOMAIN_AFFECT_CONFIG)
            message += "Edited RAM successfully. "
            current.db(current.db.vm_data.id == vm_id).update(RAM = int(parameters['ram']))
        if 'public_ip' in parameters:
            enable_public_ip = parameters['public_ip']
            if enable_public_ip:
                public_ip_pool = _choose_random_public_ip()
                if public_ip_pool:
                    create_mapping(public_ip_pool.public_ip, vm_details.private_ip.private_ip)
                    current.db.vm_data[vm_id] = dict(public_ip=public_ip_pool.id)
                    message += "Edited Public IP successfully. "
                else:
                    raise Exception("Available Public IPs are exhausted.")
            else:
                remove_mapping(vm_details.public_ip.public_ip, vm_details.private_ip.private_ip)
                current.db.vm_data[vm_id] = dict(public_ip = None)
        if 'security_domain' in parameters:
            logger.debug('Updating security domain')
            xmlfile = update_security_domain(vm_details, parameters['security_domain'], domain.XMLDesc(0))
            domain = connection_object.defineXML(xmlfile)
            if domain.isActive():
                domain.reboot(0)
            message += "Edited security domain successfully."
        connection_object.close()
        logger.debug("Task Status: SUCCESS Message: %s " % message)
        return (current.TASK_QUEUE_STATUS_SUCCESS, message)
    except:
        logger.debug("Task Status: FAILED Error: %s " % log_exception())
        return (current.TASK_QUEUE_STATUS_FAILED, log_exception())
def _get_clone_properties(vm_details, cloned_vm_details, vm_properties):
    """
    Get properties for the cloned VM.
    """
    datastore = _choose_datastore()
    vm_properties['datastore'] = datastore
    logger.debug("Datastore selected is: " + str(datastore))
    vm_properties['security_domain'] = vm_details.security_domain
    vm_properties['public_ip_req'] = False
    # Find mac address, ip address and vnc port for the cloned VM
    _choose_mac_ip_vncport(vm_properties)
    logger.debug("MAC is : " + str(vm_properties['mac_addr']) + " IP is : " + str(vm_properties['private_ip']) + \
                 " VNCPORT is : " + str(vm_properties['vnc_port']))
    # Template and host of the parent VM
    vm_properties['template'] = current.db(current.db.template.id == vm_details.template_id).select()[0]
    vm_properties['vm_host_details'] = current.db.host[vm_details.host_id]
    vm_properties['host'] = vm_properties['vm_host_details'].id
    # Create a directory for the cloned VM
    logger.debug("Creating directory for cloned vm...")
    cloned_vm_directory_path = datastore.system_mount_point + '/' + get_constant('vms') + '/' + cloned_vm_details.vm_identity
    if not os.path.exists(cloned_vm_directory_path):
        os.makedirs(cloned_vm_directory_path)
        clone_file_parameters = ' --file ' + cloned_vm_directory_path + '/' + cloned_vm_details.vm_identity + '.qcow2'
    else:
        raise Exception("Directory with same name as vmname already exists.")
    # Create a folder for additional disks of the cloned VM
    vm = current.db(current.db.vm_data.vm_identity == vm_details.vm_identity).select().first()
    disk_details_of_cloning_vm = current.db(current.db.attached_disks.vm_id == vm.id).select(orderby=current.db.attached_disks.attached_disk_name)
    logger.debug(disk_details_of_cloning_vm)
    already_attached_disks = len(disk_details_of_cloning_vm)
    cloned_vm_extra_disks_directory = datastore.system_mount_point + '/' + get_constant('extra_disks_dir') + '/' + \
                                      datastore.ds_name + '/' + cloned_vm_details.vm_identity
    if already_attached_disks > 0:
        if not os.path.exists(cloned_vm_extra_disks_directory):
            logger.debug("Making Directory")
            os.makedirs(cloned_vm_extra_disks_directory)
        count = already_attached_disks
        while already_attached_disks > 0:
            disk_name = cloned_vm_details.vm_identity + '_disk' + str(count - already_attached_disks + 1) + '.qcow2'
            clone_file_parameters += ' --file ' + cloned_vm_extra_disks_directory + '/' + disk_name
            current.db.attached_disks.insert(vm_id = cloned_vm_details.id,
                                             datastore_id = datastore.id,
                                             attached_disk_name = disk_name,
                                             capacity = disk_details_of_cloning_vm[count - already_attached_disks].capacity)
            already_attached_disks -= 1
    return (clone_file_parameters)
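# A standalone sketch of how _get_clone_properties() accumulates virt-clone
# '--file' arguments: one for the primary qcow2 image plus one per extra disk,
# numbered _disk1, _disk2, and so on. The directory paths and VM name below are
# illustrative placeholders, not this deployment's real mount points.

```python
def build_clone_file_parameters(vm_dir, extra_dir, vm_identity, n_extra_disks):
    # Primary disk image of the clone
    params = ' --file ' + vm_dir + '/' + vm_identity + '.qcow2'
    # One additional --file argument per extra disk, numbered from 1
    for i in range(1, n_extra_disks + 1):
        disk_name = vm_identity + '_disk' + str(i) + '.qcow2'
        params += ' --file ' + extra_dir + '/' + disk_name
    return params

args = build_clone_file_parameters('/mnt/ds1/vms/web01', '/mnt/ds1/extra/ds1/web01', 'web01', 2)
```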
def migrate_clone_to_new_host(vm_details, cloned_vm_details, new_host_id_for_cloned_vm, vm_properties):
    """
    Migrates the cloned VM to a new host
    """
    try:
        new_host_ip_for_cloned_vm = current.db.host[new_host_id_for_cloned_vm].host_ip.private_ip
        logger.debug("New host ip for cloned vm is: " + str(new_host_ip_for_cloned_vm))
        flags = VIR_MIGRATE_PEER2PEER | VIR_MIGRATE_PERSIST_DEST | VIR_MIGRATE_UNDEFINE_SOURCE | VIR_MIGRATE_OFFLINE | VIR_MIGRATE_UNSAFE
        logger.debug("Clone currently on: " + str(vm_details.host_id.host_ip))
        (current_host_connection_object, domain) = getVirshDomainConn(None, vm_details.host_id.host_ip, cloned_vm_details.vm_identity)
        logger.debug("Starting to migrate cloned vm to host " + str(new_host_ip_for_cloned_vm))
        domain.migrateToURI("qemu+ssh://root@" + new_host_ip_for_cloned_vm + "/system", flags, None, 0)
        current_host_connection_object.close()
        logger.debug("Successfully migrated cloned vm to host " + str(new_host_ip_for_cloned_vm))
        cloned_vm_details.update_record(host_id = new_host_id_for_cloned_vm)
        vm_properties['host'] = new_host_id_for_cloned_vm
        return True
    except libvirt.libvirtError as e:
        message = e.get_error_message()
        logger.debug("Error: " + message)
        return False
def clone(vmid):
    """
    Clones a VM
    """
    vm_properties = {}
    logger.debug("Inside clone() function")
    cloned_vm_details = current.db.vm_data[vmid]
    vm_details = current.db(current.db.vm_data.id == cloned_vm_details.parent_id).select().first()
    try:
        domain = getVirshDomain(vm_details)
        if domain.info()[0] != VIR_DOMAIN_SHUTOFF:
            raise Exception("VM is not shutoff. Check vm status.")
        clone_file_parameters = _get_clone_properties(vm_details, cloned_vm_details, vm_properties)
        logger.debug("cloned vm properties after clone_file_parameters " + str(vm_properties))
        host = vm_properties['vm_host_details']
        logger.debug("host details are: " + str(host))
        (used_ram, used_cpu) = host_resources_used(host.id)
        logger.debug("used_ram: " + str(used_ram) + " used_cpu: " + str(used_cpu) + " host ram: " + str(host.RAM) + " host cpu: " + str(host.CPUs))
        host_ram_after_200_percent_overcommitment = math.floor((host.RAM * 1024) * 2)
        host_cpu_after_200_percent_overcommitment = math.floor(host.CPUs * 2)
        logger.debug("host_ram_after_200_percent_overcommitment in MB " + str(host_ram_after_200_percent_overcommitment))
        logger.debug("host_cpu_after_200_percent_overcommitment " + str(host_cpu_after_200_percent_overcommitment))
        logger.debug("Available RAM on host: %s, Requested RAM: %s" % ((host_ram_after_200_percent_overcommitment - used_ram), vm_details.RAM))
        logger.debug("Available CPUs on host: %s, Requested CPU: %s " % ((host_cpu_after_200_percent_overcommitment - used_cpu), vm_details.vCPU))
        if (((host_ram_after_200_percent_overcommitment - used_ram) >= vm_details.RAM) and
                ((host_cpu_after_200_percent_overcommitment - used_cpu) >= vm_details.vCPU) and
                (vm_details.vCPU <= host.CPUs)):
            clone_command = "virt-clone --original " + vm_details.vm_identity + " --name " + cloned_vm_details.vm_identity + \
                            clone_file_parameters + " --mac " + vm_properties['mac_addr']
            command_output = execute_remote_cmd(vm_details.host_id.host_ip.private_ip, 'root', clone_command, None, True)
            logger.debug(command_output)
            logger.debug("Updating db after cloning")
            update_db_after_vm_installation(cloned_vm_details, vm_properties, parent_id = vm_details.id)
            message = "Cloned successfully. "
            try:
                new_host_id_for_cloned_vm = find_new_host(cloned_vm_details.RAM, cloned_vm_details.vCPU)
                if new_host_id_for_cloned_vm != host.id:
                    if migrate_clone_to_new_host(vm_details, cloned_vm_details, new_host_id_for_cloned_vm, vm_properties):
                        message += "Found new host and migrated successfully."
                    else:
                        message += "Found new host but migration was not successful."
                else:
                    message += "New host selected to migrate cloned vm is same as the host on which it currently resides."
            except:
                message += "Could not find host to migrate cloned vm."
            logger.debug("Task Status: SUCCESS Message: %s " % message)
            return (current.TASK_QUEUE_STATUS_SUCCESS, message)
        else:
            raise Exception("Host resources exhausted. Migrate the host vms and then try.")
    except:
        _free_vm_properties(cloned_vm_details, vm_properties)
        logger.debug("Task Status: FAILED Error: %s " % log_exception())
        return (current.TASK_QUEUE_STATUS_FAILED, log_exception())
def attach_extra_disk(parameters):
    """
    Attaches an extra disk to a VM
    """
    logger.debug("Inside attach_extra_disk() function")
    vmid = parameters['vm_id']
    disk_size = parameters['disk_size']
    vm_details = current.db.vm_data[vmid]
    logger.debug(str(vm_details))
    try:
        if serve_extra_disk_request(vm_details, disk_size, vm_details.host_id.host_ip.private_ip):
            current.db(current.db.vm_data.id == vmid).update(extra_HDD = vm_details.extra_HDD + disk_size)
            message = "Attached extra disk successfully"
            logger.debug("Task Status: SUCCESS Message: %s " % message)
            return (current.TASK_QUEUE_STATUS_SUCCESS, message)
        else:
            message = "Your request for additional HDD could not be completed at this moment. Check logs."
            logger.debug("Task Status: FAILED Message: %s " % message)
            return (current.TASK_QUEUE_STATUS_FAILED, message)
    except:
        logger.debug("Task Status: FAILED Error: %s " % log_exception())
        return (current.TASK_QUEUE_STATUS_FAILED, log_exception())
def get_vm_image_location(datastore_id, vm_identity):
    """
    Get the file path for the qcow2 image of a VM
    """
    datastore = current.db.datastore[datastore_id]
    vm_directory_path = datastore.system_mount_point + '/' + get_constant('vms') + '/' + vm_identity
    vm_image_name = vm_directory_path + '/' + vm_identity + '.qcow2'
    image_present = os.path.exists(vm_image_name)
    return (vm_image_name, image_present)
def get_extra_disk_location(datastore_id, vm_identity, disk_name, get_disk_size=False):
    """
    Get the file path for the qcow2 image of the extra disk
    """
    datastore = current.db.datastore[datastore_id]
    if datastore:
        vm_extra_disks_directory_path = datastore.system_mount_point + '/' + get_constant('extra_disks_dir') + '/' + \
                                        datastore.ds_name + '/' + vm_identity
        ext = '' if disk_name.endswith('.qcow2') else '.qcow2'
        disk_image_path = vm_extra_disks_directory_path + '/' + disk_name + ext
        image_present = os.path.exists(disk_image_path)
        disk_size = 0
        if image_present and get_disk_size:
            command = "qemu-img info " + disk_image_path + " | grep 'virtual size'"
            ret = os.popen(command).read()  # Returns e.g. "virtual size: 40G (42949672960 bytes)"
            disk_size = int(ret[ret.index(':') + 1:ret.index('G ')].strip())
        return (disk_image_path, image_present, disk_size)
    else:
        return (None, False, 0)
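# A self-contained sketch of the 'virtual size' parsing done in
# get_extra_disk_location(). qemu-img prints a line such as
# "virtual size: 40G (42949672960 bytes)"; the code extracts the integer
# gigabyte figure between ':' and 'G '. The sample line is illustrative.

```python
def parse_virtual_size_gb(line):
    # Same slicing approach as above: text between ':' and 'G ', stripped
    return int(line[line.index(':') + 1:line.index('G ')].strip())

size = parse_virtual_size_gb("virtual size: 40G (42949672960 bytes)")
```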
def launch_existing_vm_image(vm_details):
    """
    Launch an existing VM image
    - Choose new private_ip & mac_addr if not provided
    - Get location of the VM image
    - Launch the VM on the given host
    - Attach extra disk(s) to the VM if defined
    - Create mapping between public IP and private IP if required
    """
    logger.debug('Launch existing VM image')
    vm_properties = {}
    vm_properties['ram'] = vm_details.RAM
    vm_properties['vcpus'] = vm_details.vCPU
    vm_properties['security_domain'] = vm_details.security_domain
    # If a private IP was already chosen previously and the DHCP entry is done
    if vm_details.private_ip is not None:
        private_ip_info = current.db.private_ip_pool[vm_details.private_ip]
        if private_ip_info:
            vm_properties['private_ip'] = private_ip_info.private_ip
            vm_properties['mac_addr'] = private_ip_info.mac_addr
            vm_properties['vlan_name'] = private_ip_info.vlan.name
            vm_properties['vlan_tag'] = private_ip_info.vlan.vlan_tag
    if vm_details.public_ip is None:
        vm_properties['public_ip_req'] = False
    else:
        vm_properties['public_ip_req'] = True
        if vm_details.public_ip.is_active:
            vm_properties['public_ip'] = vm_details.public_ip.public_ip
    _choose_mac_ip_vncport(vm_properties)
    vm_properties['template'] = current.db.template[vm_details.template_id]
    vm_properties['datastore'] = current.db.datastore[vm_details.datastore_id]
    vm_properties['host'] = find_new_host(vm_details.RAM, vm_details.vCPU)
    (vm_image_name, image_present) = get_vm_image_location(vm_details.datastore_id, vm_details.vm_identity)
    if image_present:
        launch_vm_on_host(vm_details, vm_image_name, vm_properties)
        # Check if extra disk(s) need to be attached
        attached_disks = current.db((current.db.attached_disks.vm_id == vm_details.id)).select()
        if attached_disks:
            # Extra disk(s) to be attached to the VM
            host_ip = current.db.host[vm_properties['host']].host_ip.private_ip
            disk_counter = 1
            for attached_disk in attached_disks:
                disk_size = attach_disk(vm_details, attached_disk.attached_disk_name, host_ip, disk_counter, True)
                current.db((current.db.attached_disks.vm_id == attached_disk.vm_id) &
                           (current.db.attached_disks.attached_disk_name == attached_disk.attached_disk_name)
                           ).update(capacity = disk_size)
                vm_details.extra_HDD += disk_size
                disk_counter += 1
        # Create the mapping of Private_IP and Public_IP
        if vm_properties['public_ip_req']:
            create_mapping(vm_properties['public_ip'], vm_properties['private_ip'])
        update_db_after_vm_installation(vm_details, vm_properties)
def save_vm_as_template(parameters):
    """
    Save a VM as a template.
    If a template for the given VM already exists, replace it with the new template.
    """
    logger.debug("Inside save_vm_as_template() function")
    vm_id = parameters['vm_id']
    user_list = []
    vm_details = current.db.vm_data[vm_id]
    logger.debug(str(vm_details))
    try:
        (is_template_created, new_template, old_template) = create_new_template(vm_details)
        if is_template_created:
            if os.path.exists(old_template):
                # A template already existed; remove the old template image
                os.remove(old_template)
            else:
                # First save: create a new template record owned by the VM's users
                for user in current.db(current.db.user_vm_map.vm_id == vm_id).select(current.db.user_vm_map.user_id):
                    user_list.append(user.user_id)
                new_template_id = current.db.template.insert(name = vm_details.vm_name + "_template",
                                                             os = vm_details.template_id.os,
                                                             os_name = vm_details.template_id.os_name,
                                                             os_version = vm_details.template_id.os_version,
                                                             os_type = vm_details.template_id.os_type,
                                                             arch = vm_details.template_id.arch,
                                                             hdd = vm_details.template_id.hdd,
                                                             hdfile = new_template,
                                                             type = vm_details.template_id.type,
                                                             tag = vm_details.vm_name + "_template",
                                                             datastore_id = vm_details.template_id.datastore_id,
                                                             owner = user_list)
                current.db.vm_data[vm_id] = dict(saved_template = new_template_id)
            message = "User Template saved successfully"
            logger.debug(message)
            return (current.TASK_QUEUE_STATUS_SUCCESS, message)
        else:
            message = "VM Template not saved"
            logger.debug("Task Status: %s " % message)
            return (current.TASK_QUEUE_STATUS_FAILED, message)
    except:
        logger.debug("Task Status: FAILED Error: %s " % log_exception())
        return (current.TASK_QUEUE_STATUS_FAILED, log_exception())
def delete_template(parameters):
    """
    Delete a template
    """
    logger.debug("Inside delete_template() function")
    template_id = parameters['template_id']
    template_details = current.db.template[template_id]
    template_path = template_details["hdfile"]
    if os.path.exists(template_path):
        os.remove(template_path)
    # Clear the reference in the database as well
    parent_vm = current.db.vm_data(saved_template = template_id)
    if parent_vm:
        parent_vm.update_record(saved_template = None)
    del current.db.template[template_id]
    return (current.TASK_QUEUE_STATUS_SUCCESS, "")
def create_new_template(vm_details):
    """
    Create a new template from the VM image.
    - Create the template directory
    - Copy the VM image to the directory (live copy if the VM is running)
    - Update the database to define the new template
    """
    try:
        (connection_object, domain) = getVirshDomainConn(vm_details)
        xmlfile = domain.XMLDesc(0)
        logger.debug("connection object created")
        datastore = _choose_datastore()
        logger.debug(datastore)
        new_template_dir = datastore.system_mount_point + '/' + get_constant('templates_dir') + '/' + vm_details.requester_id.first_name
        logger.debug("Creating user template directory...")
        if not os.path.exists(new_template_dir):
            os.makedirs(new_template_dir)
        template = new_template_dir + '/' + vm_details.vm_identity + '_template.qcow2'
        template_location = '/' + vm_details.requester_id.first_name + '/' + vm_details.vm_identity + '_template.qcow2'
        old_template = new_template_dir + '/' + vm_details.vm_identity + '_template_old.qcow2'
        if os.path.exists(template):
            # Move the existing template out of the way
            logger.debug("move template to some other file")
            shutil.move(template, old_template)
        logger.debug("template " + template)
        current_disk_path = vm_details.datastore_id.system_mount_point + get_constant('vms') + '/' + vm_details.vm_identity
        current_disk_file = current_disk_path + '/' + vm_details.vm_identity + '.qcow2'
        if vm_details.status in (current.VM_STATUS_RUNNING, current.VM_STATUS_SUSPENDED):
            logger.debug("vm is active in db")
            if domain.isActive():
                domain.undefine()
                root = etree.fromstring(xmlfile)
                target_elem = root.find("devices/disk/target")
                target_disk = target_elem.get('dev')
                flag = VIR_DOMAIN_BLOCK_REBASE_SHALLOW | VIR_DOMAIN_BLOCK_REBASE_COPY
                domain.blockRebase(target_disk, template, 0, flag)
                block_info_list = domain.blockJobInfo(current_disk_file, 0)
                # Wait for the live block copy to converge
                while block_info_list['end'] != block_info_list['cur']:
                    logger.debug("time to sleep")
                    time.sleep(60)
                    block_info_list = domain.blockJobInfo(current_disk_file, 0)
                domain.blockJobAbort(current_disk_file)
                domain = connection_object.defineXML(xmlfile)
                connection_object.close()
                return (True, template_location, old_template)
            else:
                logger.debug("domain is not running on host")
                return (False, template_location, old_template)
        elif vm_details.status == current.VM_STATUS_SHUTDOWN:
            if domain.isActive():
                logger.debug("Domain is still active... Please try again after some time!")
                return (False, template_location, old_template)
            else:
                logger.debug("copying")
                copy_command = "cp " + current_disk_file + " " + template
                logger.debug("copy_command " + copy_command)
                #rc = os.system("cp %s %s" % (current_disk_file, template))
                logger.debug("copy command running on " + vm_details.host_id.host_ip.private_ip + " host")
                command_output = execute_remote_cmd(vm_details.host_id.host_ip.private_ip, 'root', copy_command)
                logger.debug(command_output)
                return (True, template_location, old_template)
    except:
        if not domain.isPersistent():
            domain = connection_object.defineXML(xmlfile)
        connection_object.close()
        logger.debug("Task Status: FAILED Error: %s " % log_exception())
        return (False, template_location, old_template)
| 44.789067 | 237 | 0.655399 | 9,377 | 75,380 | 4.97046 | 0.0642 | 0.046537 | 0.015813 | 0.02079 | 0.614036 | 0.524331 | 0.447863 | 0.396305 | 0.356634 | 0.329171 | 0 | 0.003589 | 0.242266 | 75,380 | 1,682 | 238 | 44.815696 | 0.812406 | 0.034014 | 0 | 0.366234 | 0 | 0 | 0.136538 | 0.005075 | 0 | 0 | 0 | 0 | 0.000866 | 0 | null | null | 0.001732 | 0.006926 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
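The `create_new_template` routine above performs a live disk copy for a running VM: it starts a shallow `blockRebase` copy, then polls `blockJobInfo` until the job's `cur` counter catches up with `end`, and only then calls `blockJobAbort` to finish the copy. A minimal sketch of that polling pattern, using an invented `FakeDomain` stand-in in place of a real libvirt domain (the class and its progress increments are illustrative assumptions, not libvirt API):

```python
import time

class FakeDomain:
    """Stand-in for a libvirt domain whose block-copy job advances on each poll."""
    def __init__(self, total):
        self.total = total      # bytes the copy job must transfer
        self.done = 0
        self.aborted = False

    def blockJobInfo(self, disk, flags):
        # Each poll, the simulated job makes some progress.
        self.done = min(self.done + 40, self.total)
        return {'end': self.total, 'cur': self.done}

    def blockJobAbort(self, disk):
        # For a completed copy job, "abort" finalizes the copy.
        self.aborted = True

def wait_for_block_copy(domain, disk, poll_delay=0.0):
    """Poll the block job until the copy catches up, then finish it."""
    info = domain.blockJobInfo(disk, 0)
    while info['end'] != info['cur']:
        time.sleep(poll_delay)  # the script above sleeps 60s between polls
        info = domain.blockJobInfo(disk, 0)
    domain.blockJobAbort(disk)
    return info

dom = FakeDomain(total=100)
result = wait_for_block_copy(dom, 'vda')
```

The same loop shape appears in the file above between `blockRebase` and `blockJobAbort`; the real code additionally re-defines the transient domain XML afterwards.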
1c073d575249e6f524c3e4fa1ac84edb0ff05cc7 | 984 | py | Python | UAS/UAS 11 & 12/main.py | Archedar/UAS | 3237d9304026340acc93c8f36b358578dc0ae66f | [
"BSD-Source-Code"
] | null | null | null | UAS/UAS 11 & 12/main.py | Archedar/UAS | 3237d9304026340acc93c8f36b358578dc0ae66f | [
"BSD-Source-Code"
] | null | null | null | UAS/UAS 11 & 12/main.py | Archedar/UAS | 3237d9304026340acc93c8f36b358578dc0ae66f | [
"BSD-Source-Code"
] | null | null | null | #Main Program
from Class import Barang
import Menu
histori = list()
listBarang = [
Barang('Rinso', 5000, 20),
Barang('Sabun', 3000, 20),
Barang('Pulpen', 2500, 20),
Barang('Tisu', 10000, 20),
Barang('Penggaris', 1000, 20)
]
while True:
print('''
Menu
1. Tampilkan Barang
2. Tambahkan Barang
3. Tambah Stock Barang
4. Hapus Barang
5. Cari Barang Berdasarkan Keyword
6. Hitung Barang Belanjaan
7. Histori Keluar Masuk Barang
0. Keluar Program
''')
choice = input('Masukan No Menu: ')
if choice == '1':
Menu.menu1(listBarang)
elif choice == '2':
Menu.menu2(listBarang, histori)
elif choice == '3':
Menu.menu3(listBarang, histori)
elif choice == '4':
Menu.menu4(listBarang, histori)
elif choice == '5':
Menu.menu5(listBarang)
elif choice == '6':
Menu.menu6(listBarang, histori)
elif choice == '7':
Menu.menu7(histori)
elif choice == '0':
print('Keluar Program')
break
else:
print('Invalid Input!') | 20.93617 | 37 | 0.645325 | 127 | 984 | 5 | 0.472441 | 0.110236 | 0.133858 | 0.170079 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.069767 | 0.213415 | 984 | 47 | 38 | 20.93617 | 0.750646 | 0.012195 | 0 | 0 | 0 | 0 | 0.311015 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.047619 | 0 | 0.047619 | 0.071429 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
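The menu loop above routes each choice through a long `if/elif` chain to a `Menu.menuN` handler. The same routing can be expressed as a dispatch table; the sketch below uses placeholder handlers (`show_items`, `add_item` are invented names — the real handlers live in the `Menu` module):

```python
def show_items(items, history):
    return 'show'

def add_item(items, history):
    return 'add'

# Map each menu choice to its handler, mirroring the elif chain above.
handlers = {
    '1': show_items,
    '2': add_item,
}

def dispatch(choice, items, history):
    handler = handlers.get(choice)
    if handler is None:
        return 'Invalid Input!'
    return handler(items, history)
```

A dict lookup keeps adding a new menu entry to a one-line change instead of another `elif` branch.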
1c075f34dca283714195a979ceda054e43bd4f75 | 13,010 | py | Python | original/baselines/train/JointE+ONE.py | thunlp/JointNRE | 29e2070910d0940bf4d32a8b8c97800bceff98fb | [
"MIT"
] | 186 | 2018-01-29T09:33:59.000Z | 2022-03-17T08:20:44.000Z | original/baselines/train/JointE+ONE.py | thunlp/JointNRE | 29e2070910d0940bf4d32a8b8c97800bceff98fb | [
"MIT"
] | 19 | 2018-03-01T01:55:08.000Z | 2022-02-17T03:38:21.000Z | original/baselines/train/JointE+ONE.py | thunlp/JointNRE | 29e2070910d0940bf4d32a8b8c97800bceff98fb | [
"MIT"
] | 36 | 2018-02-02T06:29:29.000Z | 2021-01-22T08:36:00.000Z | #coding:utf-8
import numpy as np
import tensorflow as tf
import os
import time
import datetime
import ctypes
import threading
import json
ll1 = ctypes.cdll.LoadLibrary
lib_cnn = ll1("./init_cnn.so")
ll2 = ctypes.cdll.LoadLibrary
lib_kg = ll2("./init_know.so")
class Config(object):
def __init__(self):
self.instanceTot = lib_cnn.getInstanceTot()
self.sequence_size = lib_cnn.getLenLimit()
self.num_classes = lib_cnn.getRelationTotal()
self.num_words = lib_cnn.getWordTotal()
self.num_positions = 2 * lib_cnn.getPositionLimit() + 1
self.word_size = lib_cnn.getWordDimension()
self.position_size = 5
self.embedding_size = self.word_size + self.position_size * 2
self.filter_size = 3
self.num_filters = 230
self.relation_size = self.word_size#230
self.dropout_keep_prob = 0.5
self.l2_lambda = 0.0001
self.NA = 51
lib_cnn.setNA(self.NA)
lib_cnn.setRate(3)
self.margin = 1.0
self.nbatches = 100
self.trainTimes = 15
self.entityTotal = 0
self.relationTotal = 0
class Model(object):
def __init__(self, config):
sequence_size = config.sequence_size
num_classes = config.num_classes
num_words = config.num_words
num_positions = config.num_positions
embedding_size = config.embedding_size
word_size = config.word_size
position_size = config.position_size
relation_size = config.relation_size
filter_size = config.filter_size
num_filters = config.num_filters
dropout_keep_prob = config.dropout_keep_prob
margin = config.margin
l2_lambda = config.l2_lambda
self.input_x = tf.placeholder(tf.int32, [None, sequence_size], name = "input_x")
self.input_p_h = tf.placeholder(tf.int32, [None, sequence_size], name = "input_p_h")
self.input_p_t = tf.placeholder(tf.int32, [None, sequence_size], name = "input_p_t")
self.input_r = tf.placeholder(tf.float32, [1, 1], name = "input_r")
self.input_r_n = tf.placeholder(tf.float32, [1, 1], name = "input_r_n")
self.input_h = tf.placeholder(tf.int32, [1, 1], name = "input_h")
self.input_t = tf.placeholder(tf.int32, [1, 1], name = "input_t")
self.input_y = tf.placeholder(tf.float32, [1, num_classes], name = "input_y")
self.pos_h = tf.placeholder(tf.int32, [None])
self.pos_t = tf.placeholder(tf.int32, [None])
self.pos_r = tf.placeholder(tf.int32, [None])
self.neg_h = tf.placeholder(tf.int32, [None])
self.neg_t = tf.placeholder(tf.int32, [None])
self.neg_r = tf.placeholder(tf.int32, [None])
l2_loss = tf.constant(0.0)
with tf.name_scope("embedding-lookup"):
self.word_embeddings = tf.Variable(word_embeddings, name="word_embeddings")
self.relation_embeddings = tf.get_variable("relation_embeddings", [config.relationTotal, word_size])
self.position_embeddings = tf.get_variable("position_embeddings", [num_positions, position_size])
self.relation_attention = tf.get_variable("relation_attention", [num_classes, relation_size])
self.NAattention = tf.get_variable("NAattention", [relation_size, 1])
self.attention = tf.get_variable("attention", [num_filters, relation_size])
#know
pos_h_e = tf.nn.embedding_lookup(self.word_embeddings, self.pos_h)
pos_t_e = tf.nn.embedding_lookup(self.word_embeddings, self.pos_t)
pos_r_e = tf.nn.embedding_lookup(self.relation_embeddings, self.pos_r)
neg_h_e = tf.nn.embedding_lookup(self.word_embeddings, self.neg_h)
neg_t_e = tf.nn.embedding_lookup(self.word_embeddings, self.neg_t)
neg_r_e = tf.nn.embedding_lookup(self.relation_embeddings, self.neg_r)
#cnn
self.x_initial = tf.nn.embedding_lookup(self.word_embeddings, self.input_x)
self.x_p_h = tf.nn.embedding_lookup(self.position_embeddings, self.input_p_h)
self.x_p_t = tf.nn.embedding_lookup(self.position_embeddings, self.input_p_t)
self.x = tf.expand_dims(tf.concat(2, [self.x_initial, self.x_p_h, self.x_p_t]), -1)
self.head = tf.nn.embedding_lookup(self.word_embeddings, self.input_h)
self.tail = tf.nn.embedding_lookup(self.word_embeddings, self.input_t)
l2_loss += tf.nn.l2_loss(self.attention)
with tf.name_scope("conv-maxpool"):
self.W = tf.get_variable("W", [filter_size, embedding_size, 1, num_filters])
self.b = tf.get_variable("b", [num_filters])
conv = tf.nn.conv2d(self.x, self.W, strides=[1, 1, 1, 1], padding="VALID", name="conv")
h = tf.nn.tanh(tf.nn.bias_add(conv, self.b), name="tanh")
self.y = tf.nn.max_pool(h, ksize=[1, sequence_size - filter_size + 1, 1, 1], strides=[1, 1, 1, 1], padding='VALID', name="pool")
l2_loss += tf.nn.l2_loss(self.W)
l2_loss += tf.nn.l2_loss(self.b)
self.y = tf.reshape(self.y, [-1, num_filters])
with tf.name_scope('attention'):
self.y_attention = tf.reduce_max(self.y, 0 , keep_dims = True)
with tf.name_scope("dropout"):
self.y_attention = tf.nn.l2_normalize(self.y_attention, 1)
self.h_drop = tf.nn.dropout(self.y_attention, dropout_keep_prob)
self.transfer_w = tf.get_variable("transfer_w", [num_filters, num_classes])
self.scores = tf.matmul(self.h_drop, self.transfer_w)
l2_loss += tf.nn.l2_loss(self.transfer_w)
with tf.name_scope("loss"):
cross_entropy = tf.nn.softmax_cross_entropy_with_logits(self.scores, self.input_y)
self.loss_cnn = tf.reduce_mean(cross_entropy) + l2_lambda * l2_loss
pos = tf.reduce_sum(abs(pos_h_e + pos_r_e - pos_t_e), 1, keep_dims = True)
neg = tf.reduce_sum(abs(neg_h_e + neg_r_e - neg_t_e), 1, keep_dims = True)
self.loss_kg = tf.reduce_sum(tf.maximum(pos - neg + margin, 0))
with tf.name_scope("accuracy"):
self.predictions = tf.argmax(self.scores, 1, name="predictions")
correct_predictions = tf.equal(self.predictions, tf.argmax(self.input_y, 1))
self.accuracy = tf.reduce_mean(tf.cast(correct_predictions, "float"), name="accuracy")
bags_sum = 0.0
bags_hit_NA = 0.0
sum_NA = 0.0
sum_fNA = 0.0
bags_hit = 0.0
loss_sum = 0.0
if __name__ == "__main__":
lib_cnn.readWordVec()
lib_cnn.readFromFile()
lib_kg.init()
np.random.seed(0)
tf.set_random_seed(0)
config = Config()
word_embeddings = np.zeros(config.num_words * config.word_size, dtype = np.float32)
lib_cnn.getWordVec.argtypes = [ctypes.c_void_p]
lib_cnn.getWordVec(word_embeddings.__array_interface__['data'][0])
word_embeddings.resize((config.num_words,config.word_size))
config.batch_size = lib_kg.getTripleTotal() / config.nbatches
config.entityTotal = lib_kg.getEntityTotal()
config.relationTotal = lib_kg.getRelationTotal()
with tf.Graph().as_default():
conf = tf.ConfigProto()
sess = tf.Session(config=conf)
with sess.as_default():
initializer = tf.contrib.layers.xavier_initializer()
with tf.variable_scope("model", reuse=None, initializer = initializer):
m = Model(config = config)
global_step_cnn = tf.Variable(0, name="global_step_cnn", trainable=False)
optimizer_cnn = tf.train.GradientDescentOptimizer(0.01)
grads_and_vars_cnn = optimizer_cnn.compute_gradients(m.loss_cnn)
train_op_cnn = optimizer_cnn.apply_gradients(grads_and_vars_cnn, global_step = global_step_cnn)
global_step_kg = tf.Variable(0, name="global_step_kg", trainable=False)
optimizer_kg = tf.train.GradientDescentOptimizer(0.001)
grads_and_vars_kg = optimizer_kg.compute_gradients(m.loss_kg)
train_op_kg = optimizer_kg.apply_gradients(grads_and_vars_kg, global_step=global_step_kg)
sess.run(tf.initialize_all_variables())
def outEmbedding(str1):
word_embeddings, relation_embeddings, position_embeddings, relation_attention, attention, W, B, transfer_w, transfer_b, softmax_w, softmax_b = sess.run([m.word_embeddings, m.relation_embeddings, m.position_embeddings, m.relation_attention, m.attention, m.W, m.b, m.transfer_w, m.transfer_b, m.softmax_w, m.softmax_b])
log = open("log"+str1+".txt", "w")
log.write(json.dumps(word_embeddings.tolist())+"\n")
log.write(json.dumps(relation_embeddings.tolist())+"\n")
log.write(json.dumps(position_embeddings.tolist())+"\n")
log.write(json.dumps(relation_attention.tolist())+"\n")
log.write(json.dumps(attention.tolist())+"\n")
log.write(json.dumps(W.tolist())+"\n")
log.write(json.dumps(B.tolist())+"\n")
log.write(json.dumps(transfer_w.tolist())+"\n")
NAattention = sess.run(m.NAattention)
log.write(json.dumps(NAattention.tolist()) + "\n")
log.close()
x_batch = np.zeros((config.instanceTot,config.sequence_size), dtype = np.int32)
p_t_batch = np.zeros((config.instanceTot,config.sequence_size), dtype = np.int32)
p_h_batch = np.zeros((config.instanceTot,config.sequence_size), dtype = np.int32)
r_batch = np.zeros((1, 1), dtype = np.int32)
y_batch = np.zeros((1, config.num_classes), dtype = np.int32)
r_n_batch = np.zeros((1, 1), dtype = np.float32)
h_batch = np.zeros((1, 1), dtype = np.int32)
t_batch = np.zeros((1, 1), dtype = np.int32)
x_batch_addr = x_batch.__array_interface__['data'][0]
p_t_batch_addr = p_t_batch.__array_interface__['data'][0]
p_h_batch_addr = p_h_batch.__array_interface__['data'][0]
y_batch_addr = y_batch.__array_interface__['data'][0]
r_batch_addr = r_batch.__array_interface__['data'][0]
r_n_batch_addr = r_n_batch.__array_interface__['data'][0]
h_batch_addr = h_batch.__array_interface__['data'][0]
t_batch_addr = t_batch.__array_interface__['data'][0]
lib_cnn.batch_iter.argtypes = [ctypes.c_void_p, ctypes.c_void_p, ctypes.c_void_p, ctypes.c_void_p, ctypes.c_void_p, ctypes.c_void_p, ctypes.c_void_p, ctypes.c_void_p]
tipTotal = lib_cnn.getTipTotal()
loop = 0
def train_cnn(coord):
def train_step_cnn(x_batch, p_h_batch, p_t_batch, y_batch, r_batch, r_n_batch, h_batch, t_batch):
global bags_sum, bags_hit, loss_sum, bags_hit_NA, bags_hit, sum_fNA, sum_NA
feed_dict = {
m.input_x: x_batch,
m.input_p_h: p_h_batch,
m.input_p_t: p_t_batch,
m.input_r: r_batch,
m.input_r_n: r_n_batch,
m.input_y: y_batch,
m.input_h: h_batch,
m.input_t: t_batch
}
_, step, loss, accuracy = sess.run(
[train_op_cnn, global_step_cnn, m.loss_cnn, m.accuracy], feed_dict)
time_str = datetime.datetime.now().isoformat()
loss_sum += loss
bags_sum += 1
if (r_batch[0]!=config.NA):
sum_fNA += 1
if accuracy > 0.5:
bags_hit += 1.0
else:
sum_NA += 1
if accuracy > 0.5:
bags_hit_NA += 1.0
if bags_sum % 1000 == 0:
if (sum_NA == 0):
sum_NA+=1
if (sum_fNA == 0):
sum_fNA+=1
print("{}: step {}, loss {:g}, acc {:g} acc {:g} {} {}".format(time_str, step, loss_sum/bags_sum, bags_hit_NA/sum_NA, bags_hit/sum_fNA, sum_NA, sum_fNA))
global loop
while not coord.should_stop():
print 'Looping ', loop
outEmbedding(str(loop))
for i in range(tipTotal):
length = lib_cnn.batch_iter(x_batch_addr, p_h_batch_addr, p_t_batch_addr, y_batch_addr, r_batch_addr, r_n_batch_addr, h_batch_addr, t_batch_addr)
train_step_cnn(x_batch[0:length,], p_h_batch[0:length,], p_t_batch[0:length,], y_batch, r_batch, r_n_batch, h_batch, t_batch)
global bags_sum, bags_hit, loss_sum, bags_hit_NA, bags_hit, sum_fNA, sum_NA
bags_sum = 0
bags_hit = 0
bags_hit_NA = 0
loss_sum = 0
sum_fNA = 0
sum_NA = 0
loop += 1
if loop == config.trainTimes:
coord.request_stop()
ph = np.zeros(config.batch_size * 2, dtype = np.int32)
pt = np.zeros(config.batch_size * 2, dtype = np.int32)
pr = np.zeros(config.batch_size * 2, dtype = np.int32)
nh = np.zeros(config.batch_size * 2, dtype = np.int32)
nt = np.zeros(config.batch_size * 2, dtype = np.int32)
nr = np.zeros(config.batch_size * 2, dtype = np.int32)
ph_addr = ph.__array_interface__['data'][0]
pt_addr = pt.__array_interface__['data'][0]
pr_addr = pr.__array_interface__['data'][0]
nh_addr = nh.__array_interface__['data'][0]
nt_addr = nt.__array_interface__['data'][0]
nr_addr = nr.__array_interface__['data'][0]
lib_kg.getBatch.argtypes = [ctypes.c_void_p, ctypes.c_void_p, ctypes.c_void_p, ctypes.c_void_p, ctypes.c_void_p, ctypes.c_void_p, ctypes.c_int]
times_kg = 0
def train_kg(coord):
def train_step_kg(pos_h_batch, pos_t_batch, pos_r_batch, neg_h_batch, neg_t_batch, neg_r_batch):
feed_dict = {
m.pos_h: pos_h_batch,
m.pos_t: pos_t_batch,
m.pos_r: pos_r_batch,
m.neg_h: neg_h_batch,
m.neg_t: neg_t_batch,
m.neg_r: neg_r_batch
}
_, step, loss = sess.run(
[train_op_kg, global_step_kg, m.loss_kg], feed_dict)
return loss
global times_kg
while not coord.should_stop():
times_kg += 1
res = 0.0
for batch in range(config.nbatches):
lib_kg.getBatch(ph_addr, pt_addr, pr_addr, nh_addr, nt_addr, nr_addr, config.batch_size)
res += train_step_kg(ph, pt, pr, nh, nt, nr)
coord = tf.train.Coordinator()
threads = []
threads.append(threading.Thread(target=train_kg, args=(coord,)))
threads.append(threading.Thread(target=train_cnn, args=(coord,)))
for t in threads: t.start()
coord.join(threads)
| 41.301587 | 321 | 0.711299 | 2,131 | 13,010 | 4.012201 | 0.119193 | 0.010292 | 0.019298 | 0.021053 | 0.362573 | 0.310643 | 0.254152 | 0.209708 | 0.172281 | 0.118363 | 0 | 0.021304 | 0.152114 | 13,010 | 314 | 322 | 41.433121 | 0.753785 | 0.001691 | 0 | 0.044444 | 0 | 0 | 0.036746 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.02963 | null | null | 0.007407 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
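The knowledge-graph half of the model above computes a TransE-style margin ranking loss: the L1 distance `|h + r - t|` scores a triple, and the loss is `max(pos - neg + margin, 0)` so that true triples score lower than corrupted ones by at least the margin. A pure-Python sketch of that computation for a single triple pair (the TF code sums this hinge over a batch):

```python
def l1_score(h, r, t):
    """TransE plausibility: L1 distance between h + r and t (lower is better)."""
    return sum(abs(hi + ri - ti) for hi, ri, ti in zip(h, r, t))

def margin_loss(pos_triple, neg_triple, margin=1.0):
    """Hinge on (positive score - negative score + margin), as in loss_kg above."""
    pos = l1_score(*pos_triple)
    neg = l1_score(*neg_triple)
    return max(pos - neg + margin, 0.0)

pos_triple = ([0.0, 0.0], [1.0, 1.0], [1.0, 1.0])   # h + r == t: perfect triple
neg_triple = ([0.0, 0.0], [1.0, 1.0], [3.0, 3.0])   # corrupted tail
loss = margin_loss(pos_triple, neg_triple, margin=1.0)
```

When the positive triple already beats the corrupted one by more than the margin, the hinge clips the loss to zero and contributes no gradient, which is what the `tf.maximum(pos - neg + margin, 0)` line above implements.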
1c07e719407fcda373a642abe4461b09f4086e6c | 4,041 | py | Python | CRNitschke/get_sextract_thresholds.py | deapplegate/wtgpipeline | 9693e8562022cc97bf5a96427e22965e1a5e8497 | [
"MIT"
] | 1 | 2019-03-15T04:01:19.000Z | 2019-03-15T04:01:19.000Z | CRNitschke/get_sextract_thresholds.py | deapplegate/wtgpipeline | 9693e8562022cc97bf5a96427e22965e1a5e8497 | [
"MIT"
] | 5 | 2017-12-11T00:11:39.000Z | 2021-07-09T17:05:16.000Z | CRNitschke/get_sextract_thresholds.py | deapplegate/wtgpipeline | 9693e8562022cc97bf5a96427e22965e1a5e8497 | [
"MIT"
] | 2 | 2017-08-15T21:19:11.000Z | 2017-10-12T00:36:35.000Z | #! /usr/bin/env python
#adam-does# runs SeeingClearly to get the seeing and rms of the image, then uses those to get sextractor thresholds for CR detection
#adam-use# use with CRNitschke pipeline
#adam-call_example# call it like ./get_sextract_thresholds.py /path/flname.fits output_file.txt
#IO stuff:
import sys ; sys.path.append('/u/ki/awright/InstallingSoftware/pythons')
###saveout = sys.stdout
saveout = sys.stdout
###logout = open('SeeingClearly_stdout.log','w')
###sys.stdout = logout
saveerr = sys.stderr
###logerr = open('SeeingClearly_stderr.log','w')
###sys.stderr = logerr
sys.stdout = sys.stderr
#the basics
import hashlib
import os
import SeeingClearly
from copy import deepcopy
def seeing_to_ft_dt(x):
y1_dt,m_dt,x1_dt= 5900, -16551.7, 0.48
min_dt= 3500
max_dt= 6000
yy_dts=y1_dt+m_dt*(x-x1_dt)
if yy_dts<min_dt:yy_dts=min_dt
if yy_dts>max_dt:yy_dts=max_dt
y1_ft,m_ft,x1_ft,min_ft= 850, -7000.0, 0.48, 450
min_ft= 450
max_ft= 1000
yy_fts=y1_ft+m_ft*(x-x1_ft)
if yy_fts<min_ft:yy_fts=min_ft
if yy_fts>max_ft:yy_fts=max_ft
return yy_fts,yy_dts
import imagetools
import glob
import astropy
from astropy.io import ascii
from numpy import asarray
if __name__ == "__main__":
args=deepcopy(sys.argv[1:])
for false_arg in ['-i', '--']:
if false_arg in args: args.remove(false_arg)
if len(args)<1:
sys.exit()
if not os.path.isfile(args[0]):
print "sys.argv[1]=",args[0]
raise Exception(args[0]+" is not a file!")
else:
fl=args[0]
fl2save=args[1]
#start tmp
print "Using SeeingClearly to get seeing for: "+fl
print "saving output to: " +fl2save
try:
FILTER=astropy.io.fits.open(fl)[0].header['FILTER']
except:
FILTER="UnknownFilt"
BASE,ending=os.path.basename(fl).split('OCF')
ending="OCF"+ending
ending=ending.replace('.fits','')
fls_dir=os.path.dirname(fl)
basename=os.path.basename(fl)
CCDnum=imagetools.GetCCD(fl)
globthis='_'+str(CCDnum)
glob_basename=basename.replace(globthis,'_*')
fls=sorted(glob.glob(fls_dir+"/"+glob_basename))
if not len(fls)==10:
raise Exception('cannot find 10 files like this from different CCDs')
#adam-old# seeing,back_rms=SeeingClearly.seeing_clearly_withplot(fls,checkplots=1,saveas='pltSeeingClearly_%s_%s' % (FILTER,BASE[:-1]+"ALL"))
import adam_stars_from_cat
import numpy
seeing,back_rms=adam_stars_from_cat.get_seeing_backrms(fls)
back_rms=numpy.array(back_rms)
ft,dt=seeing_to_ft_dt(seeing)
detect_thresh=dt/back_rms #convert to S2N ratio
filter_thresh=ft/back_rms #convert to S2N ratio
if FILTER=='W-J-B':
detect_thresh=asarray([min(170.0,detect_thresh[i]) for i in range(len(detect_thresh))])
filter_thresh=asarray([min(20.0,filter_thresh[i]) for i in range(len(filter_thresh))])
elif (detect_thresh>170.0).any() or (filter_thresh>20.0).any():
print 'checkit: filter=%s and %.2f %% of the detection thresholds are above 170.0 and %.2f %% of the filter thresholds are above 20.0' % (FILTER,(detect_thresh>170.0).mean()*100, (filter_thresh>20.0).mean()*100)
dict_out={}
dict_out['seeing']=[seeing]*10
dict_out['rms']=back_rms
dict_out['dt']=detect_thresh
dict_out['ft']=filter_thresh
dict_out['#files']=fls
t=astropy.table.Table(data=dict_out,names=['#files','rms','seeing','dt','ft'],dtype=[str,float,float,float,float])
t.write(fl2save,format="ascii.basic")
#adam-2014#detect_thresh_cap=min(detect_thresh,150.0) #cap is now set in the function seeing_to_ft_dt
#PIXSCALE=float(os.environ['PIXSCALE'])
#if seeing>PIXSCALE*2.5: #I have no check for being undersampled, should I?
#if seeing>.4:
# sys.stdout=saveout #back to printing to terminal
# ###sys.stdout.write(str(seeing))
# print "'0 "+str(back_rms)+" "+str(seeing)+" "+str(detect_thresh)+" "+str(filter_thresh)+"'"
#
#else:
# #print "exit 1;"
# #raise Exception('Seeing less than 2.5xPIXSCALE. The image is undersampled')
# #sys.stderr=saveerr #back to printing to terminal
# #sys.stderr.write('1')
# sys.stdout=saveout #back to printing to terminal
# print "0 "+str(back_rms)+" "+str(seeing)+" "+str(detect_thresh)+" "+str(filter_thresh)
| 36.736364 | 213 | 0.732739 | 690 | 4,041 | 4.12029 | 0.308696 | 0.04643 | 0.016884 | 0.012663 | 0.10904 | 0.10904 | 0.081604 | 0.066831 | 0.038692 | 0.038692 | 0 | 0.034751 | 0.109874 | 4,041 | 109 | 214 | 37.073395 | 0.75563 | 0.329374 | 0 | 0 | 0 | 0.013158 | 0.150019 | 0.015077 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.157895 | null | null | 0.052632 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
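The `seeing_to_ft_dt` function in the script above maps the measured seeing onto detection and filtering thresholds with a clamped linear law: `y = y1 + m * (x - x1)`, clipped into `[min, max]`. A small sketch isolating that shape, reusing the detection-threshold constants from the script:

```python
def clamped_linear(x, y1, slope, x1, lo, hi):
    """y = y1 + slope * (x - x1), clipped to [lo, hi] — the shape of seeing_to_ft_dt."""
    y = y1 + slope * (x - x1)
    return max(lo, min(hi, y))

def detect_threshold(seeing):
    # Constants taken from the detect-threshold branch of seeing_to_ft_dt above.
    return clamped_linear(seeing, 5900, -16551.7, 0.48, 3500, 6000)
```

Because the slope is negative, good seeing (small values) pushes the threshold to its ceiling and poor seeing drives it down to the floor; the script then divides by the background RMS to convert these counts into a per-CCD S/N threshold.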
1c10af158381d3bf41fbdb21d408b7a2f5c450b9 | 2,336 | py | Python | python/swap_header.py | daniestevez/gr-csp | 0a10e4d2e5cf4a51256e5dc72aa42f8d3d54c232 | [
"Unlicense"
] | 19 | 2016-05-27T15:12:31.000Z | 2021-04-19T09:42:35.000Z | python/swap_header.py | daniestevez/gr-csp | 0a10e4d2e5cf4a51256e5dc72aa42f8d3d54c232 | [
"Unlicense"
] | null | null | null | python/swap_header.py | daniestevez/gr-csp | 0a10e4d2e5cf4a51256e5dc72aa42f8d3d54c232 | [
"Unlicense"
] | 5 | 2017-04-26T22:48:40.000Z | 2022-02-19T23:49:33.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# Copyright 2016 Daniel Estevez <daniel@destevez.net>.
#
# This is free and unencumbered software released into the public domain.
#
# Anyone is free to copy, modify, publish, use, compile, sell, or
# distribute this software, either in source code form or as a compiled
# binary, for any purpose, commercial or non-commercial, and by any
# means.
#
# In jurisdictions that recognize copyright laws, the author or authors
# of this software dedicate any and all copyright interest in the
# software to the public domain. We make this dedication for the benefit
# of the public at large and to the detriment of our heirs and
# successors. We intend this dedication to be an overt act of
# relinquishment in perpetuity of all present and future rights to this
# software under copyright law.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
# IN NO EVENT SHALL THE AUTHORS BE LIABLE FOR ANY CLAIM, DAMAGES OR
# OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
# ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
# OTHER DEALINGS IN THE SOFTWARE.
#
# For more information, please refer to <http://unlicense.org>
#
import numpy
from gnuradio import gr
import pmt
import array
class swap_header(gr.basic_block):
"""
docstring for block swap_header
"""
def __init__(self):
gr.basic_block.__init__(self,
name="swap_crc",
in_sig=[],
out_sig=[])
self.message_port_register_in(pmt.intern('in'))
self.set_msg_handler(pmt.intern('in'), self.handle_msg)
self.message_port_register_out(pmt.intern('out'))
def handle_msg(self, msg_pmt):
msg = pmt.cdr(msg_pmt)
if not pmt.is_u8vector(msg):
print "[ERROR] Received invalid message type. Expected u8vector"
return
packet = array.array("B", pmt.u8vector_elements(msg))
header = packet[:4]
header.reverse()
packet = header + packet[4:]
msg_pmt = pmt.cons(pmt.PMT_NIL, pmt.init_u8vector(len(packet), bytearray(packet)))
self.message_port_pub(pmt.intern('out'), msg_pmt)
| 36.5 | 90 | 0.704195 | 342 | 2,336 | 4.707602 | 0.494152 | 0.018634 | 0.02795 | 0.028571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005959 | 0.20976 | 2,336 | 63 | 91 | 37.079365 | 0.866197 | 0.558647 | 0 | 0 | 0 | 0 | 0.078782 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.166667 | null | null | 0.041667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
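The `handle_msg` method above reverses the byte order of the first four bytes of each packet (the CSP header) and leaves the payload untouched. The same transformation on plain `bytes`, without the GNU Radio / PMT plumbing:

```python
def swap_csp_header(packet: bytes) -> bytes:
    """Reverse the 4-byte header, leaving the rest of the packet untouched."""
    if len(packet) < 4:
        raise ValueError("packet shorter than a 4-byte header")
    return packet[:4][::-1] + packet[4:]

swapped = swap_csp_header(bytes([0x01, 0x02, 0x03, 0x04, 0xAA, 0xBB]))
```

The swap is its own inverse: applying it twice returns the original packet, so the block can be used on either side of an endianness mismatch.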
1c10ce60247554a61b0b5d48488c8c6ceac709b6 | 2,149 | py | Python | start.py | gleenn/dfplayer | dd390a6f54b3bd8b2a3397fddd6caacfba01b29d | [
"MIT"
] | null | null | null | start.py | gleenn/dfplayer | dd390a6f54b3bd8b2a3397fddd6caacfba01b29d | [
"MIT"
] | null | null | null | start.py | gleenn/dfplayer | dd390a6f54b3bd8b2a3397fddd6caacfba01b29d | [
"MIT"
] | null | null | null | #!/usr/bin/python
#
# Start dfplayer.
import argparse
import os
import shutil
import subprocess
import sys
import time
_PROJ_DIR = os.path.dirname(__file__)
def main():
os.chdir(_PROJ_DIR)
os.environ['LD_LIBRARY_PATH'] = '/lib:/usr/lib:/usr/local/lib'
arg_parser = argparse.ArgumentParser(description='Start player')
arg_parser.add_argument('--gdb', action='store_true')
arg_parser.add_argument('--no-reset', action='store_true')
arg_parser.add_argument('--disable-net', action='store_true')
arg_parser.add_argument('--mpd', action='store_true')
arg_parser.add_argument('--disable-fin', action='store_true')
arg_parser.add_argument('--max', action='store_true')
arg_parser.add_argument('--no-sound', action='store_true')
arg_parser.add_argument('--no-sound-config', action='store_true')
arg_parser.add_argument('--prod', action='store_true')
arg_parser.add_argument('--enable-kinect', action='store_true')
args = arg_parser.parse_args()
if args.prod:
print 'dfplayer is sleeping for 30 seconds before startup'
time.sleep(30)
if not args.no_sound_config and not args.no_sound:
shutil.copyfile(
'dfplayer/asoundrc.sample', '/home/' + os.getlogin() + '/.asoundrc')
params = ['env/bin/dfplayer', '--listen=0.0.0.0:8081']
if args.no_reset:
params.append('--no-reset')
if args.no_sound:
params.append('--no-sound')
if args.disable_net:
params.append('--disable-net')
if args.disable_fin:
params.append('--disable-fin')
if args.enable_kinect or args.prod:
params.append('--enable-kinect')
if args.mpd:
params.append('--mpd')
if args.max or args.prod:
params.append('--max')
try:
if args.gdb:
subprocess.check_call(
['gdb', '-ex', 'run', '--args', 'env/bin/python'] + params)
#['gdb', '--args', 'env/bin/python'] + params)
else:
subprocess.check_call(params)
except KeyboardInterrupt:
print 'Player is exiting via KeyboardInterrupt'
except Exception, err:
print sys.exc_info()[0]
if args.prod:
print 'dfplayer has exited and start.py script is now sleeping'
time.sleep(3600)
main()
| 28.653333 | 76 | 0.68497 | 302 | 2,149 | 4.695364 | 0.311258 | 0.076164 | 0.084626 | 0.141044 | 0.3378 | 0.2433 | 0.2433 | 0.14457 | 0.059238 | 0 | 0 | 0.00933 | 0.152164 | 2,149 | 74 | 77 | 29.040541 | 0.768935 | 0.035831 | 0 | 0.035088 | 0 | 0 | 0.278181 | 0.035317 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.105263 | null | null | 0.070175 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
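`start.py` above translates parsed boolean flags into a command-line parameter list before handing it to `subprocess.check_call`. A sketch of that flag-to-argument translation for a subset of its flags (note `argparse` turns `--no-reset` into the attribute `no_reset`):

```python
import argparse

def build_params(args):
    """Mirror start.py's translation of parsed flags into dfplayer arguments."""
    params = ['env/bin/dfplayer', '--listen=0.0.0.0:8081']
    if args.no_reset:
        params.append('--no-reset')
    if args.max or args.prod:
        params.append('--max')   # --prod implies --max, as in the script above
    return params

parser = argparse.ArgumentParser()
parser.add_argument('--no-reset', action='store_true')
parser.add_argument('--max', action='store_true')
parser.add_argument('--prod', action='store_true')
args = parser.parse_args(['--prod'])
params = build_params(args)
```

Keeping the translation in one function makes the implication rules (such as `--prod` forcing `--max`) testable without launching the player.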
1c119d6282e07a22b49176d0f6616aca7099e5dc | 3,159 | py | Python | options/base_option.py | lime-j/YTMT-Strategy-1 | aacc38c4e61b91e187cac81aa95500e0422d4d0f | [
"Apache-2.0"
] | 26 | 2021-11-08T07:49:34.000Z | 2022-03-28T14:09:27.000Z | options/base_option.py | lime-j/YTMT-Strategy-1 | aacc38c4e61b91e187cac81aa95500e0422d4d0f | [
"Apache-2.0"
] | 2 | 2021-10-22T02:53:10.000Z | 2021-12-29T12:35:13.000Z | options/base_option.py | lime-j/YTMT-Strategy-1 | aacc38c4e61b91e187cac81aa95500e0422d4d0f | [
"Apache-2.0"
] | 1 | 2021-10-18T08:00:22.000Z | 2021-10-18T08:00:22.000Z | import argparse
import models
model_names = sorted(name for name in models.__dict__
if name.islower() and not name.startswith("__")
and callable(models.__dict__[name]))
class BaseOptions():
def __init__(self):
self.parser = argparse.ArgumentParser(formatter_class=argparse.ArgumentDefaultsHelpFormatter)
self.initialized = False
def initialize(self):
# experiment specifics
self.parser.add_argument('--name', type=str, default=None,
help='name of the experiment. It decides where to store samples and models')
self.parser.add_argument('--gpu_ids', type=str, default='0', help='gpu ids: e.g. 0 0,1,2, 0,2. use -1 for CPU')
self.parser.add_argument('--model', type=str, default='errnet_model', help='chooses which model to use.')
self.parser.add_argument('--checkpoints_dir', type=str, default='./checkpoints', help='models are saved here')
self.parser.add_argument('--resume', '-r', action='store_true', help='resume from checkpoint')
self.parser.add_argument('--resume_epoch', '-re', type=int, default=None,
help='checkpoint to use. (default: latest')
self.parser.add_argument('--seed', type=int, default=2018, help='random seed to use. Default=2018')
self.parser.add_argument('--supp_eval', action='store_true', help='supplementary evaluation')
self.parser.add_argument('--start_now', action='store_true', help='supplementary evaluation')
self.parser.add_argument('--testr', action='store_true', help='test for reflections')
self.parser.add_argument('--select', type=str, default=None)
# for setting input
self.parser.add_argument('--serial_batches', action='store_true',
help='if true, takes images in order to make batches, otherwise takes them randomly')
self.parser.add_argument('--nThreads', default=8, type=int, help='# threads for loading data')
self.parser.add_argument('--max_dataset_size', type=int, default=None,
help='Maximum number of samples allowed per dataset. If the dataset directory contains more than max_dataset_size, only a subset is loaded.')
# for display
self.parser.add_argument('--no-log', action='store_true', help='disable tf logger?')
self.parser.add_argument('--no-verbose', action='store_true', help='disable verbose info?')
self.parser.add_argument('--display_winsize', type=int, default=256, help='display window size')
self.parser.add_argument('--display_port', type=int, default=8097, help='visdom port of the web display')
self.parser.add_argument('--display_id', type=int, default=0,
help='window id of the web display (use 0 to disable visdom)')
self.parser.add_argument('--display_single_pane_ncols', type=int, default=0,
help='if positive, display all images in a single visdom web panel with certain number of images per row.')
self.initialized = True
| 65.8125 | 174 | 0.652738 | 404 | 3,159 | 4.955446 | 0.351485 | 0.104895 | 0.12987 | 0.20979 | 0.242757 | 0.062937 | 0.062937 | 0.062937 | 0.062937 | 0.062937 | 0 | 0.010958 | 0.220006 | 3,159 | 47 | 175 | 67.212766 | 0.801542 | 0.015828 | 0 | 0 | 0 | 0.054054 | 0.365217 | 0.008696 | 0 | 0 | 0 | 0 | 0 | 1 | 0.054054 | false | 0 | 0.054054 | 0 | 0.135135 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
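`BaseOptions` above wraps an `argparse.ArgumentParser` with `ArgumentDefaultsHelpFormatter` and a lazy `initialize` step that registers all flags. A scaled-down sketch of the same pattern with a `parse` method added for illustration (the real class defers parsing to subclasses):

```python
import argparse

class MiniOptions:
    """Scaled-down version of the BaseOptions pattern above (illustrative only)."""
    def __init__(self):
        self.parser = argparse.ArgumentParser(
            formatter_class=argparse.ArgumentDefaultsHelpFormatter)
        self.initialized = False

    def initialize(self):
        # A few flags borrowed from BaseOptions above.
        self.parser.add_argument('--gpu_ids', type=str, default='0')
        self.parser.add_argument('--seed', type=int, default=2018)
        self.parser.add_argument('--resume', '-r', action='store_true')
        self.initialized = True

    def parse(self, argv=None):
        if not self.initialized:
            self.initialize()
        return self.parser.parse_args(argv)

opt = MiniOptions().parse(['--seed', '7', '-r'])
```

The `ArgumentDefaultsHelpFormatter` makes `--help` print each flag's default, which is why the class above sets defaults directly in `add_argument` rather than documenting them by hand.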
1c11a09e91a9f24c73ca32bb8e2bc358e52c7c63 | 2,277 | py | Python | bookstore/__init__.py | JanhaviSoni/Book-Recommendation-Analysis | d2697e1f2eb9b9b4e0bafc0dd43d486ceb3d1707 | [
"MIT"
] | 23 | 2021-01-15T15:46:45.000Z | 2021-11-16T12:26:58.000Z | bookstore/__init__.py | JanhaviSoni/Book-Recommendation-Analysis | d2697e1f2eb9b9b4e0bafc0dd43d486ceb3d1707 | [
"MIT"
] | 108 | 2021-01-13T11:02:31.000Z | 2022-03-21T17:47:24.000Z | bookstore/__init__.py | JanhaviSoni/Book-Recommendation-Analysis | d2697e1f2eb9b9b4e0bafc0dd43d486ceb3d1707 | [
"MIT"
] | 46 | 2021-01-14T17:27:28.000Z | 2022-03-20T10:12:24.000Z |
from flask import Flask, Response
from flask_basicauth import BasicAuth
from flask_cors import CORS, cross_origin
import os
#from flask_admin import Admin,AdminIndexView
#from flask_admin.contrib.sqla import ModelView
from flask_sqlalchemy import SQLAlchemy as _BaseSQLAlchemy
from flask_migrate import Migrate, MigrateCommand
from flask_script import Manager
from werkzeug.exceptions import HTTPException
from flask_login import LoginManager
from itsdangerous import URLSafeSerializer
# import psycopg2
# import pymysql
# import logging
# import warnings
# warnings.filterwarnings("ignore")
# Initializing Flask App
app = Flask(__name__)
app.secret_key="Vampire"
# This video demonstrates why we use CORS in our Flask App - https://www.youtube.com/watch?v=vWl5XcvQBx0
CORS(app)
app.config.from_object("config.DevelopmentConfig")
class SQLAlchemy(_BaseSQLAlchemy):
"""
    This class is defined so that we can set "pool_pre_ping" to True.
    pool_pre_ping is a boolean flag which, when set to True,
    enables the connection pool's 'pre-ping' feature,
    testing connections for liveness upon each checkout.
    This prevents the database connection from being dropped while our app is running.
    This class inherits from the original SQLAlchemy class;
    nothing else is changed except the pool_pre_ping flag.
https://docs.sqlalchemy.org/en/13/core/pooling.html#dealing-with-disconnects
https://github.com/pallets/flask-sqlalchemy/issues/589
"""
def apply_pool_defaults(self, app, options):
super(SQLAlchemy, self).apply_pool_defaults(app, options)
options["pool_pre_ping"] = True
# Creating and initializing db object of SQLAlchemy class
db = SQLAlchemy(app)  # passing app here already runs init_app internally
migrate = Migrate()
with app.app_context():
    if db.engine.url.drivername == 'sqlite':
        # render_as_batch enables batch-mode ALTER TABLE, which SQLite needs
        migrate.init_app(app, db, render_as_batch=True)
    else:
        migrate.init_app(app, db)
manager = Manager(app)
manager.add_command('db', MigrateCommand)
# Creating serializer object of URLSafeSerializer class for serializing session_token
serializer = URLSafeSerializer(app.secret_key)
# Here we set session_token as our user_loader.
from bookstore.client.views import client
from bookstore.admin.views import admin
app.register_blueprint(client)
app.register_blueprint(admin)
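The SQLAlchemy subclass above relies on a simple subclass-hook pattern: override one defaults method, call super(), then tweak the options dict. A dependency-free sketch of that pattern; the class names below are stand-ins, not flask_sqlalchemy's real classes:

```python
class BaseEngine:
    """Stand-in for the base class whose defaults hook we override."""
    def apply_pool_defaults(self, app, options):
        options.setdefault("pool_size", 5)
        return options

class PingingEngine(BaseEngine):
    """Same shape as the SQLAlchemy subclass above: defer to super,
    then force pool_pre_ping on."""
    def apply_pool_defaults(self, app, options):
        super().apply_pool_defaults(app, options)
        options["pool_pre_ping"] = True
        return options

opts = PingingEngine().apply_pool_defaults(app=None, options={})
print(opts)  # → {'pool_size': 5, 'pool_pre_ping': True}
```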
# research/object_detection/core/freezable_batch_norm_test.py (baranshad/models, Apache-2.0)
# Copyright 2018 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Tests for object_detection.core.freezable_batch_norm."""
import numpy as np
import tensorflow as tf
from object_detection.core import freezable_batch_norm
class FreezableBatchNormTest(tf.test.TestCase):
"""Tests for FreezableBatchNorm operations."""
def _build_model(self, training=None):
model = tf.keras.models.Sequential()
norm = freezable_batch_norm.FreezableBatchNorm(training=training,
input_shape=(10,),
momentum=0.8)
model.add(norm)
return model, norm
def _train_freezable_batch_norm(self, training_mean, training_var):
model, _ = self._build_model()
model.compile(loss='mse', optimizer='sgd')
# centered on training_mean, variance training_var
train_data = np.random.normal(
loc=training_mean,
scale=training_var,
size=(1000, 10))
model.fit(train_data, train_data, epochs=4, verbose=0)
return model.weights
def _test_batchnorm_layer(
self, norm, should_be_training, test_data,
testing_mean, testing_var, training_arg, training_mean, training_var):
out_tensor = norm(tf.convert_to_tensor(test_data, dtype=tf.float32),
training=training_arg)
out = tf.keras.backend.eval(out_tensor)
out -= tf.keras.backend.eval(norm.beta)
out /= tf.keras.backend.eval(norm.gamma)
if not should_be_training:
out *= training_var
out += (training_mean - testing_mean)
out /= testing_var
np.testing.assert_allclose(out.mean(), 0.0, atol=1.5e-1)
np.testing.assert_allclose(out.std(), 1.0, atol=1.5e-1)
def test_batchnorm_freezing_training_none(self):
with self.test_session():
training_mean = 5.0
training_var = 10.0
testing_mean = -10.0
testing_var = 5.0
# Initially train the batch norm, and save the weights
trained_weights = self._train_freezable_batch_norm(training_mean,
training_var)
# Load the batch norm weights, freezing training to True.
# Apply the batch norm layer to testing data and ensure it is normalized
# according to the batch statistics.
model, norm = self._build_model(training=True)
for trained_weight, blank_weight in zip(trained_weights, model.weights):
weight_copy = blank_weight.assign(tf.keras.backend.eval(trained_weight))
tf.keras.backend.eval(weight_copy)
# centered on testing_mean, variance testing_var
test_data = np.random.normal(
loc=testing_mean,
scale=testing_var,
size=(1000, 10))
# Test with training=True passed to the call method:
training_arg = True
should_be_training = True
self._test_batchnorm_layer(norm, should_be_training, test_data,
testing_mean, testing_var, training_arg,
training_mean, training_var)
# Test with training=False passed to the call method:
training_arg = False
should_be_training = False
self._test_batchnorm_layer(norm, should_be_training, test_data,
testing_mean, testing_var, training_arg,
training_mean, training_var)
# Test the layer in various Keras learning phase scopes:
training_arg = None
should_be_training = False
self._test_batchnorm_layer(norm, should_be_training, test_data,
testing_mean, testing_var, training_arg,
training_mean, training_var)
tf.keras.backend.set_learning_phase(True)
should_be_training = True
self._test_batchnorm_layer(norm, should_be_training, test_data,
testing_mean, testing_var, training_arg,
training_mean, training_var)
tf.keras.backend.set_learning_phase(False)
should_be_training = False
self._test_batchnorm_layer(norm, should_be_training, test_data,
testing_mean, testing_var, training_arg,
training_mean, training_var)
def test_batchnorm_freezing_training_false(self):
with self.test_session():
training_mean = 5.0
training_var = 10.0
testing_mean = -10.0
testing_var = 5.0
# Initially train the batch norm, and save the weights
trained_weights = self._train_freezable_batch_norm(training_mean,
training_var)
# Load the batch norm back up, freezing training to False.
# Apply the batch norm layer to testing data and ensure it is normalized
# according to the training data's statistics.
model, norm = self._build_model(training=False)
for trained_weight, blank_weight in zip(trained_weights, model.weights):
weight_copy = blank_weight.assign(tf.keras.backend.eval(trained_weight))
tf.keras.backend.eval(weight_copy)
# centered on testing_mean, variance testing_var
test_data = np.random.normal(
loc=testing_mean,
scale=testing_var,
size=(1000, 10))
# Make sure that the layer is never training
# Test with training=True passed to the call method:
training_arg = True
should_be_training = False
self._test_batchnorm_layer(norm, should_be_training, test_data,
testing_mean, testing_var, training_arg,
training_mean, training_var)
# Test with training=False passed to the call method:
training_arg = False
should_be_training = False
self._test_batchnorm_layer(norm, should_be_training, test_data,
testing_mean, testing_var, training_arg,
training_mean, training_var)
# Test the layer in various Keras learning phase scopes:
training_arg = None
should_be_training = False
self._test_batchnorm_layer(norm, should_be_training, test_data,
testing_mean, testing_var, training_arg,
training_mean, training_var)
tf.keras.backend.set_learning_phase(True)
should_be_training = False
self._test_batchnorm_layer(norm, should_be_training, test_data,
testing_mean, testing_var, training_arg,
training_mean, training_var)
tf.keras.backend.set_learning_phase(False)
should_be_training = False
self._test_batchnorm_layer(norm, should_be_training, test_data,
testing_mean, testing_var, training_arg,
training_mean, training_var)
if __name__ == '__main__':
tf.test.main()
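The assertions in _test_batchnorm_layer boil down to checking that (x - mean) / std has roughly zero mean and unit standard deviation. That arithmetic can be verified without TensorFlow; note that, as in the tests above, the "var" values are passed as standard deviations to the Gaussian sampler:

```python
import random
import statistics

random.seed(0)
# Mirrors the test data: centered on training_mean=5.0 with spread 10.0.
data = [random.gauss(5.0, 10.0) for _ in range(5000)]

mean = statistics.fmean(data)
std = statistics.pstdev(data)
normalized = [(x - mean) / std for x in data]

# After normalization the sample mean is ~0 and the sample std is ~1,
# which is what np.testing.assert_allclose checks above.
print(round(statistics.fmean(normalized), 6), round(statistics.pstdev(normalized), 6))
```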
# unittests/test_apiv2_user.py (mtcolman/django-DefectDojo, BSD-3-Clause)
from rest_framework.test import APITestCase, APIClient
from django.urls import reverse
from rest_framework.authtoken.models import Token
class UserTest(APITestCase):
"""
Test the User APIv2 endpoint.
"""
fixtures = ['dojo_testdata.json']
def setUp(self):
token = Token.objects.get(user__username='admin')
self.client = APIClient()
self.client.credentials(HTTP_AUTHORIZATION='Token ' + token.key)
def test_user_list(self):
r = self.client.get(reverse('user-list'))
self.assertEqual(r.status_code, 200, r.content[:1000])
user_list = r.json()['results']
self.assertTrue(len(user_list) >= 1, r.content[:1000])
for user in user_list:
for item in ['username', 'first_name', 'last_name', 'email']:
self.assertIn(item, user, r.content[:1000])
for item in ['password']:
self.assertNotIn(item, user, r.content[:1000])
def test_user_add(self):
# simple user without password
r = self.client.post(reverse('user-list'), {
"username": "api-user-1"
}, format='json')
self.assertEqual(r.status_code, 201, r.content[:1000])
# user with good password
password = 'testTEST1234!@#$'
r = self.client.post(reverse('user-list'), {
"username": "api-user-2",
"password": password
}, format='json')
self.assertEqual(r.status_code, 201, r.content[:1000])
# test password by fetching API key
r = self.client.post(reverse('api-token-auth'), {
"username": "api-user-2",
"password": password
}, format='json')
self.assertEqual(r.status_code, 200, r.content[:1000])
# user with weak password
r = self.client.post(reverse('user-list'), {
"username": "api-user-3",
"password": "weakPassword"
}, format='json')
self.assertEqual(r.status_code, 400, r.content[:1000])
self.assertIn('The password must contain at least 1 digit, 0-9.', r.content.decode("utf-8"))
def test_user_change_password(self):
# some user
r = self.client.post(reverse('user-list'), {
"username": "api-user-4"
}, format='json')
self.assertEqual(r.status_code, 201, r.content[:1000])
user_id = r.json()['id']
r = self.client.put("{}{}/".format(reverse('user-list'), user_id), {
"username": "api-user-4",
"first_name": "first"
}, format='json',)
self.assertEqual(r.status_code, 200, r.content[:1000])
r = self.client.patch("{}{}/".format(reverse('user-list'), user_id), {
"last_name": "last"
}, format='json')
self.assertEqual(r.status_code, 200, r.content[:1000])
r = self.client.put("{}{}/".format(reverse('user-list'), user_id), {
"username": "api-user-4",
"password": "testTEST1234!@#$"
}, format='json')
self.assertEqual(r.status_code, 400, r.content[:1000])
self.assertIn("Update of password though API is not allowed", r.content.decode("utf-8"))
r = self.client.patch("{}{}/".format(reverse('user-list'), user_id), {
"password": "testTEST1234!@#$"
}, format='json')
self.assertEqual(r.status_code, 400, r.content[:1000])
self.assertIn("Update of password though API is not allowed", r.content.decode("utf-8"))
# src/init.py (ankit-kushwaha-51/RESTful_API, MIT)
from flask import Flask
from src.models import db
from . import config
def create_app():
flask_app = Flask(__name__)
flask_app.config['SQLALCHEMY_DATABASE_URI'] = config.DATABASE_CONNECTION_URI
flask_app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False
flask_app.app_context().push()
db.init_app(flask_app)
db.create_all()
return flask_app
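create_app follows the application-factory pattern: build the app object, apply configuration, wire up extensions, return it. The structure can be sketched without Flask; the App class and the config values below are stand-ins:

```python
class App:
    """Tiny stand-in for a Flask application object."""
    def __init__(self, name):
        self.name = name
        self.config = {}

def create_app():
    # Build, configure, then return the app, mirroring create_app() above.
    app = App(__name__)
    app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite://'  # placeholder URI
    app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False
    return app

app = create_app()
print(app.config['SQLALCHEMY_TRACK_MODIFICATIONS'])  # → False
```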
# utils.py (jiangycTarheel/Compositional-Auxseq, MIT)
import os
import json
import gzip
from copy import deepcopy, copy
import numpy as np
import csv
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import Dataset, DataLoader, RandomSampler
from transformers.tokenization_utils import trim_batch
class LabelSmoothingLoss(nn.Module):
def __init__(self, label_smooth, tgt_vocab_size, ignore_index=-100):
        assert 0. < label_smooth <= 1.
        super(LabelSmoothingLoss, self).__init__()
        self.ignore_index = ignore_index
smoothing_value = label_smooth / (tgt_vocab_size - 2)
one_hot = torch.full((tgt_vocab_size,), smoothing_value)
one_hot[self.ignore_index] = 0
self.register_buffer('one_hot', one_hot.unsqueeze(0).unsqueeze(0))
self.confidence = 1.0 - label_smooth
self.lossfct = torch.nn.KLDivLoss(reduction='none')
def forward(self, pred, target):
"""
Args:
pred: [bsz, seq_len, vocab_size]
target: [bsz, seq_len]
Returns:
"""
model_prob = self.one_hot.repeat(target.size(0), target.size(1), 1) # [bsz, seq_len, vocab_size]
model_prob.scatter_(2, target.unsqueeze(2), self.confidence)
model_prob.masked_fill_((target == self.ignore_index).unsqueeze(2), 0)
pred_prob = F.log_softmax(pred, dim=2)
#return F.kl_div(pred_prob, model_prob, reduction='mean')
loss = self.lossfct(pred_prob, model_prob)
loss = torch.sum(loss, dim=2).masked_fill_((target == self.ignore_index), 0)
avg_loss = torch.sum(loss) / torch.sum((target != self.ignore_index).to(torch.float))
return avg_loss
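The smoothed target distribution that LabelSmoothingLoss builds (confidence mass on the gold token, the remainder spread over the vocab minus the padding and gold slots) can be sanity-checked in plain Python; the vocab size, smoothing value, and gold index below are illustrative:

```python
label_smooth = 0.1
tgt_vocab_size = 6
ignore_index = 0  # plays the role of self.ignore_index

# Same construction as __init__: spread label_smooth over
# (tgt_vocab_size - 2) slots and zero out the ignored index.
smoothing_value = label_smooth / (tgt_vocab_size - 2)
one_hot = [smoothing_value] * tgt_vocab_size
one_hot[ignore_index] = 0.0

# forward()'s scatter_ writes the confidence at the gold token.
gold = 3
model_prob = list(one_hot)
model_prob[gold] = 1.0 - label_smooth

# The result is a proper probability distribution.
print(round(sum(model_prob), 6))  # → 1.0
```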
# Special symbols
SOS_token = "<SOS>" # start of sentence
EOS_token = "<EOS>" # end of sentence
PAD_token = SOS_token # padding symbol
INPUT_TOKENS_SCAN = ['jump', 'opposite', 'right', 'twice', 'and', 'turn', 'thrice', 'run', 'after', 'around', 'left', 'walk', 'look']
OUTPUT_TOKENS_SCAN = ['I_TURN_RIGHT', 'I_JUMP', 'I_TURN_LEFT', 'I_RUN', 'I_WALK', 'I_LOOK']
# ACTION_TO_TEXT = {'I_TURN_RIGHT': 'right', 'I_JUMP': 'jump', 'I_TURN_LEFT': 'left', 'I_RUN': 'run', 'I_WALK': 'walk', 'I_LOOK': 'look'}
class Lang:
# Class for converting strings/words to numerical indices, and vice versa.
# Should use separate class for input language (English) and output language (actions)
#
def __init__(self, symbols, io_type):
# symbols : list of all possible symbols
n = len(symbols)
self.symbols = [_s.strip('\n') for _s in symbols]
self.io_type = io_type
if SOS_token not in self.symbols:
assert EOS_token not in self.symbols
self.index2symbol = {n: SOS_token, n+1: EOS_token}
self.symbol2index = {SOS_token: n, EOS_token: n + 1}
self.sos_id, self.eos_id = n, n + 1
else:
self.index2symbol = {}
self.symbol2index = {}
self.sos_id, self.eos_id = 0, 1
self.pad_token_id = self.sos_id
for idx,s in enumerate(self.symbols):
self.index2symbol[idx] = s
self.symbol2index[s] = idx
self.n_symbols = len(self.index2symbol)
def variableFromSymbols(self, mylist, add_eos=True):
# Convert a list of symbols to a tensor of indices (adding a EOS token at end)
#
# Input
# mylist : list of m symbols
# add_eos : true/false, if true add the EOS symbol at end
#
# Output
# output : [m or m+1 LongTensor] indices of each symbol (plus EOS if appropriate)
mylist = copy(mylist)
if add_eos:
mylist.append(EOS_token)
indices = [self.symbol2index[s] for s in mylist]
output = torch.LongTensor(indices)
        output = output.cuda()  # assumes CUDA is available (original guarded this with a USE_CUDA flag)
return output
def symbolsFromVector(self, v):
# Convert indices to symbols, breaking where we get a EOS token
#
# Input
# v : list of m indices
#
# Output
# mylist : list of m or m-1 symbols (excluding EOS)
mylist = []
for x in v:
s = self.index2symbol[x]
if s == EOS_token:
break
mylist.append(s)
return mylist
def encode_scan_file(self, data, max_length):
encoded_data = []
for dp in data:
input, output = dp[0], dp[1]
if self.io_type == 'input':
raw = input
else:
assert self.io_type == 'output'
raw = output
encoded = self.variableFromSymbols(raw.split(' '))
encoded_data.append(encoded)
return encoded_data
def encode_scan_file_2_seg(self, data, max_length, cutoffs):
encoded_data_1, encoded_data_2 = [], []
for _id, dp in enumerate(data):
input, output, cutoff = dp[0], dp[1], cutoffs[_id]
assert self.io_type == 'output'
raw = output
encoded_1 = self.variableFromSymbols(raw.split(' ')[:cutoff])
encoded_2 = self.variableFromSymbols(raw.split(' ')[cutoff:])
encoded_data_1.append(encoded_1)
encoded_data_2.append(encoded_2)
return encoded_data_1, encoded_data_2
def encode_cfq_file(self, data, max_length):
encoded_data = []
for dp in data:
input, output = dp['query_ids'], dp['sparql_ids']
if self.io_type == 'input':
raw = input
else:
assert self.io_type == 'output'
raw = output + [self.eos_id]
encoded = torch.LongTensor(raw).cuda()
encoded_data.append(encoded)
return encoded_data
def encode_cogs_file(self, data, max_length):
encoded_data = []
for dp in data:
input, output = dp['src'], dp['trg']
if self.io_type == 'input':
raw = input
else:
assert self.io_type == 'output'
raw = output
encoded = self.variableFromSymbols(raw.split(' '))
encoded_data.append(encoded)
return encoded_data
def decode(self, ids):
out = self.symbolsFromVector(ids.cpu().numpy())
if out == []:
return out
if out[0] in ['<SOS>', '<SOS_2>']:
out = out[1:]
return out
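A CPU-only sketch of the round trip Lang performs: variableFromSymbols appends EOS and maps symbols to indices, and symbolsFromVector/decode maps back, stopping at the first EOS (the real class additionally builds a LongTensor and moves it to CUDA; the symbol list here is illustrative):

```python
symbols = ['jump', 'run', 'walk']
SOS, EOS = '<SOS>', '<EOS>'
vocab = symbols + [SOS, EOS]
symbol2index = {s: i for i, s in enumerate(vocab)}
index2symbol = {i: s for i, s in enumerate(vocab)}
eos_id = symbol2index[EOS]

def encode(tokens):
    # variableFromSymbols with add_eos=True, minus the tensor/CUDA step
    return [symbol2index[t] for t in tokens] + [eos_id]

def decode(ids):
    # symbolsFromVector: stop at the first EOS
    out = []
    for i in ids:
        if i == eos_id:
            break
        out.append(index2symbol[i])
    return out

print(decode(encode(['jump', 'run'])))  # → ['jump', 'run']
```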
def calculate_accuracy(preds, gts):
assert len(preds) == len(gts)
match = 0
for pred, gt in zip(preds, gts):
if pred == gt:
match += 1
return match / len(preds)
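calculate_accuracy scores exact sequence match, not token-level overlap; for example (the function is reproduced so the snippet runs standalone, with illustrative data):

```python
def calculate_accuracy(preds, gts):
    # Exact-match accuracy over paired predictions and ground truths.
    assert len(preds) == len(gts)
    match = 0
    for pred, gt in zip(preds, gts):
        if pred == gt:
            match += 1
    return match / len(preds)

preds = [['I_JUMP'], ['I_RUN', 'I_RUN'], ['I_WALK']]
gts   = [['I_JUMP'], ['I_RUN'],          ['I_WALK']]
# Only two of the three sequences match exactly, so accuracy is 2/3.
print(calculate_accuracy(preds, gts))
```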
def encode_file(tokenizer, data_path, max_length, pad_to_max_length=True, return_tensors="pt", max_examples=None):
examples = []
if data_path[-3:] == '.gz':
print('Data file is gzipped')
f = gzip.open(data_path, "rt")
else:
print('Data file is plain text')
print(data_path)
f = open(data_path, "r", encoding='utf-8')
for i, text in enumerate(f.readlines()):
tokenized = tokenizer.batch_encode_plus( [text + ' </s>'], max_length=max_length,
pad_to_max_length=pad_to_max_length, return_tensors=return_tensors )
if max_examples and i >= max_examples:
break
examples.append(tokenized)
f.close()
return examples
# def encode_file_iterator(tokenizer, data_path, max_length, pad_to_max_length=True, return_tensors="pt", max_examples=None):
# '''
# This provides a low-memory usage way of iterating thru all of the source/target lines for processing by JIT loader.
# '''
# if data_path[-3:] == '.gz':
# print('Data file is gzipped')
# f = gzip.open(data_path, "rt")
# else:
# print('Data file is plain text')
# f = open(data_path, "r", encoding='utf-8')
#
# for i, text in enumerate(f):
#
# tokenized = tokenizer.batch_encode_plus( [text + ' </s>'], max_length=max_length,
# pad_to_max_length=pad_to_max_length, return_tensors=return_tensors )
#
# yield tokenized
#
# if max_examples and i >= max_examples:
# break
#
# f.close()
# def convert_scan_actions_to_text(actions):
# return ' '.join([ACTION_TO_TEXT[_action] for _action in actions.split(' ')])
# def encode_scan_file(tokenizer, data, io_type, max_length, pad_to_max_length=True, return_tensors="pt", max_examples=None):
# examples = []
# # a = tokenizer.batch_encode_plus( ['right jump left run walk look' + ' <s> </s>'], max_length=max_length,
# # pad_to_max_length=pad_to_max_length, return_tensors=return_tensors )
# # print(a)
# # exit()
# for dp in data:
# input, output = dp[0], dp[1]
# if io_type == 'input':
# raw = input
# else:
# assert io_type == 'output'
# raw = convert_scan_actions_to_text(output)
#
# tokenized = tokenizer.batch_encode_plus( [raw + ' </s>'], max_length=max_length,
# pad_to_max_length=pad_to_max_length, return_tensors=return_tensors )
#
# if max_examples and i >= max_examples:
# break
# examples.append(tokenized)
#
# return examples
def load_scan_file(mytype, split):
# Load SCAN dataset from file
#
# Input
# mytype : type of SCAN experiment
# split : 'train' or 'test'
#
# Output
# commands : list of input/output strings (as tuples)
    assert mytype in ['simple', 'addprim_jump', 'length', 'addprim_turn_left', 'all', 'template_around_right', 'viz',
                      'examine', 'template_jump_around_right', 'template_right',
                      'mcd1', 'mcd2', 'mcd3', 'mcd1.1', 'mcd1.2', 'debug', 'attn_vis']
assert split in ['train', 'test', 'val']
if split == 'val' and mytype not in ['mcd1', 'mcd2', 'mcd3', 'mcd1.1', 'mcd1.2']:
split = 'test'
fn = 'data/scan/tasks_' + split + '_' + mytype + '.txt'
fid = open(fn, 'r')
lines = fid.readlines()
fid.close()
lines = [l.strip() for l in lines]
    lines = [l[len('IN: '):] if l.startswith('IN: ') else l for l in lines]  # lstrip('IN: ') would strip a character set, not the prefix
commands = [l.split(' OUT: ') for l in lines]
return commands
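Each SCAN line has the shape "IN: &lt;command&gt; OUT: &lt;actions&gt;"; the code above strips the prefix and splits on ' OUT: '. A standalone sketch of that per-line parse (the sample line is illustrative):

```python
def parse_scan_line(line):
    # Strip the 'IN: ' prefix, then split the command from its actions.
    line = line.strip()
    if line.startswith('IN: '):
        line = line[len('IN: '):]
    command, actions = line.split(' OUT: ')
    return command, actions

cmd, acts = parse_scan_line('IN: jump twice OUT: I_JUMP I_JUMP\n')
print(cmd, '->', acts)  # → jump twice -> I_JUMP I_JUMP
```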
class CompositionDataset(Dataset):
def __init__(
self,
src_lang,
trg_lang,
data_dir,
type_path,
sub_task,
max_source_length=20,
max_target_length=20,
tokenized=False,
):
super().__init__()
self.max_source_length = max_source_length
self.max_target_length = max_target_length
self.tokenized = tokenized
self.src_lang = src_lang
self.trg_lang = trg_lang
def __len__(self):
if self.tokenized:
return len(self.dataset)
else:
return len(self.source)
def __getitem__(self, index):
if self.tokenized:
dp = self.dataset[index]
source_ids, src_mask, target_ids = dp[0], dp[1], dp[2]
source_ids = source_ids[:self.max_source_length]
#src_mask = src_mask[:self.max_source_length]
target_ids = target_ids[:self.max_target_length]
else:
source_ids = self.source[index]
target_ids = self.target[index]
return {"source_ids": source_ids, "target_ids": target_ids}
@staticmethod
def trim_seq2seq_batch(batch, src_pad_token_id, trg_pad_token_id, trim_y=True):
if trim_y:
y = trim_batch(batch["target_ids"], trg_pad_token_id)
else:
y = batch["target_ids"]
source_ids, source_mask = trim_batch(batch["source_ids"], src_pad_token_id, attention_mask=batch["source_mask"])
return source_ids, source_mask, y
def pad_to_max_len(self, ids, max_len, pad_token_id):
ids_length = ids.size(0)
if ids_length == max_len:
return ids
pad_tokens = torch.tensor([pad_token_id] * (max_len - ids_length))
# if ids.type() == 'torch.cuda.FloatTensor':
# print(ids)
# exit()
padded_ids = torch.cat([ids, pad_tokens.cuda()])
return padded_ids
def create_mask(self, ids, max_len):
ids_length = ids.size(0)
mask = torch.tensor([1] * ids_length + [0] * (max_len - ids_length)).cuda()
return mask
def collate_fn(self, batch):
max_src_len = max(map(len, [x["source_ids"] for x in batch]))
max_trg_len = max(map(len, [x["target_ids"] for x in batch]))
src_mask = torch.stack([self.create_mask(x["source_ids"], max_src_len) for x in batch])
src_ids = torch.stack([self.pad_to_max_len(x["source_ids"], max_src_len, self.src_lang.pad_token_id) for x in batch])
#masks = torch.stack([x["source_mask"] for x in batch])
trg_ids = torch.stack([self.pad_to_max_len(x["target_ids"], max_trg_len, self.trg_lang.pad_token_id) for x in batch])
y = trim_batch(trg_ids, self.trg_lang.pad_token_id)
src_ids, src_mask = trim_batch(src_ids, self.src_lang.pad_token_id, attention_mask=src_mask)
return {"source_ids": src_ids, "source_mask": src_mask, "target_ids": y}
class ScanDataset(CompositionDataset):
def __init__(
self,
src_lang,
trg_lang,
data_dir="./data/scan/",
type_path="train",
sub_task="addprim_jump",
max_source_length=20,
max_target_length=20,
tokenized=False,
):
super().__init__(src_lang, trg_lang, data_dir, type_path, sub_task, max_source_length,
max_target_length, tokenized)
scan_data = load_scan_file(sub_task, type_path)
print(len(scan_data))
all_scan_dict = self.convert_to_dict(load_scan_file('all', 'train'))
self.action_count_labels, self.action_group_labels, self.action_type_labels = self.construct_count_label(scan_data, all_scan_dict)
if not tokenized:
self.source = self.src_lang.encode_scan_file(scan_data, max_source_length)
self.target = self.trg_lang.encode_scan_file(scan_data, max_target_length)
else:
self.dataset = torch.load(os.path.join(data_dir, type_path))
def construct_count_label(self, raw_data, all_data_dict):
all_count_labels = []
count_label_scheme = "v1"
group_label_scheme = "v2"
type_label_scheme = "v2"
all_action_group_labels, all_action_type_labels = [], []
# Group 1: single prim (jump), Group 2: prim + direction (jump left), Group 3: prim opposite, Group 4: prim around
#no_skip_id = np.random.randint(0, len(raw_data), int(len(raw_data)*0.05))
#no_skip_id = np.random.choice(range(len(raw_data)), int(len(raw_data)*0.07), replace=False)
# no_skip_id = np.random.choice(range(len(raw_data)), 10, replace=False)
skip_cnt, sup_cnt = 0, 0
for _id, dp in enumerate(raw_data):
input_text, output_text = dp[0], dp[1]
input_tok, output_tok = input_text.split(' '), output_text.split(' ')
count_labels, group_labels, type_labels = [], [], []
first_part_output_text, second_part_output_text = '', ''
if 'and' in input_tok:
first_part_input_tok = input_tok[:input_tok.index('and')]
second_part_input_tok = input_tok[input_tok.index('and')+1:]
first_part_output_text = all_data_dict[' '.join(first_part_input_tok)]
second_part_output_text = all_data_dict[' '.join(second_part_input_tok)]
elif 'after' in input_tok:
second_part_input_tok = input_tok[:input_tok.index('after')]
first_part_input_tok = input_tok[input_tok.index('after') + 1:]
first_part_output_text = all_data_dict[' '.join(first_part_input_tok)]
second_part_output_text = all_data_dict[' '.join(second_part_input_tok)]
else:
first_part_input_tok, second_part_input_tok = input_tok, []
first_part_output_text = output_text
first_part_output_tok, second_part_output_tok = first_part_output_text.split(' '), second_part_output_text.split(' ')
if second_part_output_text == '':
second_part_output_tok = []
assert len(first_part_output_tok) + len(second_part_output_tok) == len(output_tok), \
(len(first_part_output_tok), len(second_part_output_tok), len(output_tok), first_part_output_text, second_part_output_text, output_text)
### 1. Build the action count labels ###
if count_label_scheme == 'v1':
### For the first part output
if 'twice' in first_part_input_tok:
if 'after' in input_tok:
count_labels += ([4] * int(len(first_part_output_tok) / 2) + [3] * int(len(first_part_output_tok) / 2))
else:
count_labels += ([1] * int(len(first_part_output_tok) / 2) + [0] * int(len(first_part_output_tok) / 2))
# count_labels += ([1] + [0] * (int(len(first_part_output_tok) / 2) - 1)) * 2
elif 'thrice' in first_part_input_tok:
if 'after' in input_tok:
count_labels += ([5] * int(len(first_part_output_tok) / 3) + [4] * int(len(first_part_output_tok) / 3) + \
[3] * int(len(first_part_output_tok) / 3))
else:
count_labels += ([2] * int(len(first_part_output_tok) / 3) + [1] * int(len(first_part_output_tok) / 3) + \
[0] * int(len(first_part_output_tok) / 3))
# count_labels += ([1] + [0] * (int(len(first_part_output_tok) / 3) - 1)) * 3
else:
if 'after' in input_tok:
count_labels += ([3] * len(first_part_output_tok))
else:
count_labels += ([0] * len(first_part_output_tok))
# count_labels += ([1] + [0] * (int(len(first_part_output_tok)) - 1))
### For the second part output
if len(second_part_output_tok) > 0:
if 'twice' in second_part_input_tok:
if 'after' in input_tok:
count_labels += ([1] * int(len(second_part_output_tok) / 2) + [0] * int(len(second_part_output_tok) / 2))
else:
count_labels += ([4] * int(len(second_part_output_tok) / 2) + [3] * int(len(second_part_output_tok) / 2))
# count_labels += ([1] + [0] * (int(len(second_part_output_tok) / 2) - 1)) * 2
elif 'thrice' in second_part_input_tok:
if 'after' in input_tok:
count_labels += ([2] * int(len(second_part_output_tok) / 3) + [1] * int(len(second_part_output_tok) / 3) + \
[0] * int(len(second_part_output_tok) / 3))
else:
count_labels += ([5] * int(len(second_part_output_tok) / 3) + [4] * int(len(second_part_output_tok) / 3) + \
[3] * int(len(second_part_output_tok) / 3))
# count_labels += ([1] + [0] * (int(len(second_part_output_tok) / 3) - 1)) * 3
else:
if 'after' in input_tok:
count_labels += ([0] * len(second_part_output_tok))
else:
count_labels += ([3] * len(second_part_output_tok))
# count_labels += ([1] + [0] * (int(len(second_part_output_tok)) - 1))
            elif count_label_scheme == 'v2':
                ### For the first part output
                if 'twice' in first_part_input_tok:
                    count_labels += ([1] * int(len(first_part_output_tok) / 2) +
                                     [0] * int(len(first_part_output_tok) / 2))
                elif 'thrice' in first_part_input_tok:
                    count_labels += ([2] * int(len(first_part_output_tok) / 3) +
                                     [1] * int(len(first_part_output_tok) / 3) +
                                     [0] * int(len(first_part_output_tok) / 3))
                else:
                    count_labels += ([0] * len(first_part_output_tok))
                ### For the second part output
                if len(second_part_output_tok) > 0:
                    if 'twice' in second_part_input_tok:
                        count_labels += ([1] * int(len(second_part_output_tok) / 2) +
                                         [0] * int(len(second_part_output_tok) / 2))
                    elif 'thrice' in second_part_input_tok:
                        count_labels += ([2] * int(len(second_part_output_tok) / 3) +
                                         [1] * int(len(second_part_output_tok) / 3) +
                                         [0] * int(len(second_part_output_tok) / 3))
                    else:
                        count_labels += ([0] * len(second_part_output_tok))
            elif count_label_scheme == 'v3':
                ### For the first part output
                if 'thrice' in first_part_input_tok and 'thrice' in second_part_input_tok:
                    start_count = 5
                elif ('thrice' in first_part_input_tok and 'twice' in second_part_input_tok) or \
                        ('twice' in first_part_input_tok and 'thrice' in second_part_input_tok):
                    start_count = 4
                elif ('twice' in first_part_input_tok and 'twice' in second_part_input_tok) or \
                        ('thrice' in first_part_input_tok) or ('thrice' in second_part_input_tok):
                    start_count = 3
                elif 'twice' in first_part_input_tok or 'twice' in second_part_input_tok:
                    start_count = 2
                else:
                    start_count = 1
                if 'twice' in first_part_input_tok:
                    if 'after' in input_tok:
                        count_labels += ([start_count] * int(len(first_part_output_tok) / 2) +
                                         [start_count - 1] * int(len(first_part_output_tok) / 2))
                    else:
                        count_labels += ([1] * int(len(first_part_output_tok) / 2) + [0] * int(len(first_part_output_tok) / 2))
                    # count_labels += ([1] + [0] * (int(len(first_part_output_tok) / 2) - 1)) * 2
                elif 'thrice' in first_part_input_tok:
                    if 'after' in input_tok:
                        count_labels += ([start_count] * int(len(first_part_output_tok) / 3) +
                                         [start_count - 1] * int(len(first_part_output_tok) / 3) +
                                         [start_count - 2] * int(len(first_part_output_tok) / 3))
                    else:
                        count_labels += ([2] * int(len(first_part_output_tok) / 3) + [1] * int(len(first_part_output_tok) / 3) +
                                         [0] * int(len(first_part_output_tok) / 3))
                    # count_labels += ([1] + [0] * (int(len(first_part_output_tok) / 3) - 1)) * 3
                else:
                    if 'after' in input_tok:
                        count_labels += ([start_count] * len(first_part_output_tok))
                    else:
                        count_labels += ([0] * len(first_part_output_tok))
                    # count_labels += ([1] + [0] * (int(len(first_part_output_tok)) - 1))
                ### For the second part output
                if len(second_part_output_tok) > 0:
                    if 'twice' in second_part_input_tok:
                        if 'after' in input_tok:
                            count_labels += ([1] * int(len(second_part_output_tok) / 2) + [0] * int(len(second_part_output_tok) / 2))
                        else:
                            count_labels += ([start_count] * int(len(second_part_output_tok) / 2) +
                                             [start_count - 1] * int(len(second_part_output_tok) / 2))
                        # count_labels += ([1] + [0] * (int(len(second_part_output_tok) / 2) - 1)) * 2
                    elif 'thrice' in second_part_input_tok:
                        if 'after' in input_tok:
                            count_labels += ([2] * int(len(second_part_output_tok) / 3) + [1] * int(len(second_part_output_tok) / 3) +
                                             [0] * int(len(second_part_output_tok) / 3))
                        else:
                            count_labels += ([start_count] * int(len(second_part_output_tok) / 3) +
                                             [start_count - 1] * int(len(second_part_output_tok) / 3) +
                                             [start_count - 2] * int(len(second_part_output_tok) / 3))
                        # count_labels += ([1] + [0] * (int(len(second_part_output_tok) / 3) - 1)) * 3
                    else:
                        if 'after' in input_tok:
                            count_labels += ([0] * len(second_part_output_tok))
                        else:
                            count_labels += ([start_count] * len(second_part_output_tok))
                        # count_labels += ([1] + [0] * (int(len(second_part_output_tok)) - 1))
            elif count_label_scheme == 'v3.1':
                ### For the first part output
                if 'thrice' in first_part_input_tok and 'thrice' in second_part_input_tok:
                    start_count = 5
                elif ('thrice' in first_part_input_tok and 'twice' in second_part_input_tok) or \
                        ('twice' in first_part_input_tok and 'thrice' in second_part_input_tok):
                    start_count = 4
                elif ('twice' in first_part_input_tok and 'twice' in second_part_input_tok) or \
                        ('thrice' in first_part_input_tok) or ('thrice' in second_part_input_tok):
                    start_count = 3
                elif 'twice' in first_part_input_tok or 'twice' in second_part_input_tok:
                    start_count = 2
                else:
                    start_count = 1
                if 'twice' in first_part_input_tok:
                    count_labels += ([start_count] * int(len(first_part_output_tok) / 2) +
                                     [start_count - 1] * int(len(first_part_output_tok) / 2))
                    # count_labels += ([1] + [0] * (int(len(first_part_output_tok) / 2) - 1)) * 2
                elif 'thrice' in first_part_input_tok:
                    count_labels += ([start_count] * int(len(first_part_output_tok) / 3) +
                                     [start_count - 1] * int(len(first_part_output_tok) / 3) +
                                     [start_count - 2] * int(len(first_part_output_tok) / 3))
                else:
                    count_labels += ([start_count] * len(first_part_output_tok))
                ### For the second part output
                if len(second_part_output_tok) > 0:
                    if 'twice' in second_part_input_tok:
                        count_labels += ([1] * int(len(second_part_output_tok) / 2) +
                                         [0] * int(len(second_part_output_tok) / 2))
                        # count_labels += ([1] + [0] * (int(len(second_part_output_tok) / 2) - 1)) * 2
                    elif 'thrice' in second_part_input_tok:
                        count_labels += ([2] * int(len(second_part_output_tok) / 3) +
                                         [1] * int(len(second_part_output_tok) / 3) +
                                         [0] * int(len(second_part_output_tok) / 3))
                    else:
                        count_labels += ([0] * len(second_part_output_tok))
            else:
                ### For the first part output
                # the 'after' and non-'after' branches were identical here, so no split is needed
                if 'twice' in first_part_input_tok:
                    new_count_labels = list(range(int(len(first_part_output_tok) / 2)))[::-1] * 2
                elif 'thrice' in first_part_input_tok:
                    new_count_labels = list(range(int(len(first_part_output_tok) / 3)))[::-1] * 3
                else:
                    new_count_labels = list(range(len(first_part_output_tok)))[::-1]
                count_labels += new_count_labels
                ### For the second part output
                if len(second_part_output_tok) > 0:
                    if 'twice' in second_part_input_tok:
                        new_count_labels = list(range(int(len(second_part_output_tok) / 2)))[::-1] * 2
                    elif 'thrice' in second_part_input_tok:
                        new_count_labels = list(range(int(len(second_part_output_tok) / 3)))[::-1] * 3
                    else:
                        new_count_labels = list(range(len(second_part_output_tok)))[::-1]
                    # second-part labels are shifted by 8 in every branch
                    new_count_labels = [_c + 8 for _c in new_count_labels]
                    count_labels += new_count_labels
                # count_labels = []
                # count_labels += list(range(len(first_part_output_tok)))[::-1]
                # count_labels += list(range(len(second_part_output_tok)))[::-1]
            assert len(count_labels) == len(output_tok), (
                len(count_labels), len(output_tok), input_text, first_part_input_tok, count_labels, output_tok,
                first_part_output_text, first_part_output_tok, second_part_output_text, second_part_output_tok)
            count_labels.append(-1)  # For the EOS token
            # count_labels.append(7)  # For the EOS token
            ### 2. Build the action group labels ###
            if group_label_scheme == 'v1':  ## As used in exp 9.0-9.4
                if 'around' in first_part_input_tok:
                    if 'after' in input_tok:
                        group_labels += ([4] * len(first_part_output_tok))
                    else:
                        group_labels += ([0] * len(first_part_output_tok))
                elif 'opposite' in first_part_input_tok:
                    if 'after' in input_tok:
                        group_labels += ([5] * len(first_part_output_tok))
                    else:
                        group_labels += ([1] * len(first_part_output_tok))
                elif 'left' in first_part_input_tok or 'right' in first_part_input_tok:
                    if 'after' in input_tok:
                        group_labels += ([6] * len(first_part_output_tok))
                    else:
                        group_labels += ([2] * len(first_part_output_tok))
                else:
                    if 'after' in input_tok:
                        group_labels += ([7] * len(first_part_output_tok))
                    else:
                        group_labels += ([3] * len(first_part_output_tok))
                if 'around' in second_part_input_tok:
                    if 'after' in input_tok:
                        group_labels += ([0] * len(second_part_output_tok))
                    else:
                        group_labels += ([4] * len(second_part_output_tok))
                elif 'opposite' in second_part_input_tok:
                    if 'after' in input_tok:
                        group_labels += ([1] * len(second_part_output_tok))
                    else:
                        group_labels += ([5] * len(second_part_output_tok))
                elif 'left' in second_part_input_tok or 'right' in second_part_input_tok:
                    if 'after' in input_tok:
                        group_labels += ([2] * len(second_part_output_tok))
                    else:
                        group_labels += ([6] * len(second_part_output_tok))
                else:
                    if 'after' in input_tok:
                        group_labels += ([3] * len(second_part_output_tok))
                    else:
                        group_labels += ([7] * len(second_part_output_tok))
            else:
                ### For the first part output
                if 'twice' in first_part_input_tok:
                    if 'after' in input_tok:
                        new_group_labels = list(range(int(len(first_part_output_tok) / 2)))[::-1] * 2
                        new_group_labels = [_c + 8 for _c in new_group_labels]
                    else:
                        new_group_labels = list(range(int(len(first_part_output_tok) / 2)))[::-1] * 2
                elif 'thrice' in first_part_input_tok:
                    if 'after' in input_tok:
                        new_group_labels = list(range(int(len(first_part_output_tok) / 3)))[::-1] * 3
                        new_group_labels = [_c + 8 for _c in new_group_labels]
                    else:
                        new_group_labels = list(range(int(len(first_part_output_tok) / 3)))[::-1] * 3
                else:
                    if 'after' in input_tok:
                        new_group_labels = list(range(len(first_part_output_tok)))[::-1]
                        new_group_labels = [_c + 8 for _c in new_group_labels]
                    else:
                        new_group_labels = list(range(len(first_part_output_tok)))[::-1]
                group_labels += new_group_labels
                ### For the second part output
                if len(second_part_output_tok) > 0:
                    if 'twice' in second_part_input_tok:
                        if 'after' in input_tok:
                            new_group_labels = list(range(int(len(second_part_output_tok) / 2)))[::-1] * 2
                        else:
                            new_group_labels = list(range(int(len(second_part_output_tok) / 2)))[::-1] * 2
                            new_group_labels = [_c + 8 for _c in new_group_labels]
                    elif 'thrice' in second_part_input_tok:
                        if 'after' in input_tok:
                            new_group_labels = list(range(int(len(second_part_output_tok) / 3)))[::-1] * 3
                        else:
                            new_group_labels = list(range(int(len(second_part_output_tok) / 3)))[::-1] * 3
                            new_group_labels = [_c + 8 for _c in new_group_labels]
                    else:
                        if 'after' in input_tok:
                            new_group_labels = list(range(len(second_part_output_tok)))[::-1]
                        else:
                            new_group_labels = list(range(len(second_part_output_tok)))[::-1]
                            new_group_labels = [_c + 8 for _c in new_group_labels]
                    group_labels += new_group_labels
            assert len(group_labels) == len(output_tok)
            group_labels.append(-1)  # For the EOS token
            # group_labels.append(17)  # For the EOS token
            ### 3. Build the action type labels ###
            ### For the first part output
            if type_label_scheme == 'v1':
                if 'around' in first_part_input_tok:
                    new_type_labels = [3] * len(first_part_output_tok)
                elif 'opposite' in first_part_input_tok:
                    new_type_labels = [2] * len(first_part_output_tok)
                elif 'left' in first_part_input_tok or 'right' in first_part_input_tok:
                    new_type_labels = [1] * len(first_part_output_tok)
                else:
                    new_type_labels = [0] * len(first_part_output_tok)
                # if 'after' in input_tok:
                #     new_type_labels = [_c + 4 for _c in new_type_labels]
                type_labels += new_type_labels
                ### For the second part output
                if len(second_part_output_tok) > 0:
                    if 'around' in second_part_input_tok:
                        new_type_labels = [3] * len(second_part_output_tok)
                    elif 'opposite' in second_part_input_tok:
                        new_type_labels = [2] * len(second_part_output_tok)
                    elif 'left' in second_part_input_tok or 'right' in second_part_input_tok:
                        new_type_labels = [1] * len(second_part_output_tok)
                    else:
                        new_type_labels = [0] * len(second_part_output_tok)
                    # if 'after' not in input_tok:
                    #     new_type_labels = [_c + 4 for _c in new_type_labels]
                    type_labels += new_type_labels
            elif type_label_scheme == 'v2':
                if 'twice' in first_part_input_tok:
                    type_labels += ([1] * int(len(first_part_output_tok) / 2) +
                                    [0] * int(len(first_part_output_tok) / 2))
                elif 'thrice' in first_part_input_tok:
                    type_labels += ([2] * int(len(first_part_output_tok) / 3) +
                                    [1] * int(len(first_part_output_tok) / 3) +
                                    [0] * int(len(first_part_output_tok) / 3))
                else:
                    type_labels += ([0] * len(first_part_output_tok))
                ### For the second part output
                if len(second_part_output_tok) > 0:
                    if 'twice' in second_part_input_tok:
                        type_labels += ([1] * int(len(second_part_output_tok) / 2) +
                                        [0] * int(len(second_part_output_tok) / 2))
                    elif 'thrice' in second_part_input_tok:
                        type_labels += ([2] * int(len(second_part_output_tok) / 3) +
                                        [1] * int(len(second_part_output_tok) / 3) +
                                        [0] * int(len(second_part_output_tok) / 3))
                    else:
                        type_labels += ([0] * len(second_part_output_tok))
            assert len(type_labels) == len(output_tok)
            type_labels.append(-1)  # For the EOS token
            # group_labels.append(17)  # For the EOS token
            # if _id not in no_skip_id:
            #     count_labels = [-1] * len(count_labels)
            #     group_labels = [-1] * len(group_labels)
            #     skip_cnt += 1
            # else:
            #     sup_cnt += 1
            all_action_type_labels.append(torch.tensor(type_labels).cuda())
            all_count_labels.append(torch.tensor(count_labels).cuda())
            all_action_group_labels.append(torch.tensor(group_labels).cuda())
        print(skip_cnt, sup_cnt)
        return all_count_labels, all_action_group_labels, all_action_type_labels
    def convert_to_dict(self, raw_data):
        dict_data = {}
        for dp in raw_data:
            input, output = dp[0], dp[1]
            assert input not in dict_data
            dict_data[input] = output
        return dict_data

    def __getitem__(self, index):
        if self.tokenized:
            dp = self.dataset[index]
            source_ids, src_mask, target_ids = dp[0], dp[1], dp[2]
            source_ids = source_ids[:self.max_source_length]
            # src_mask = src_mask[:self.max_source_length]
            target_ids = target_ids[:self.max_target_length]
        else:
            source_ids = self.source[index]
            target_ids = self.target[index]
        count_labels = self.action_count_labels[index]
        group_labels = self.action_group_labels[index]
        type_labels = self.action_type_labels[index]
        return {"source_ids": source_ids, "target_ids": target_ids, "action_count_labels": count_labels,
                "action_group_labels": group_labels, "action_type_labels": type_labels}

    @staticmethod
    def trim_seq2seq_batch(batch, src_pad_token_id, trg_pad_token_id, trim_y=True):
        if trim_y:
            y = trim_batch(batch["target_ids"], trg_pad_token_id)
        else:
            y = batch["target_ids"]
        source_ids, source_mask = trim_batch(batch["source_ids"], src_pad_token_id, attention_mask=batch["source_mask"])
        return source_ids, source_mask, y
    def collate_fn(self, batch):
        max_src_len = max(map(len, [x["source_ids"] for x in batch]))
        max_trg_len = max(map(len, [x["target_ids"] for x in batch]))
        src_mask = torch.stack([self.create_mask(x["source_ids"], max_src_len) for x in batch])
        trg_mask = torch.stack([self.create_mask(x["target_ids"], max_trg_len) for x in batch])
        src_ids = torch.stack([self.pad_to_max_len(x["source_ids"], max_src_len, self.src_lang.pad_token_id) for x in batch])
        # masks = torch.stack([x["source_mask"] for x in batch])
        trg_ids = torch.stack([self.pad_to_max_len(x["target_ids"], max_trg_len, self.trg_lang.pad_token_id) for x in batch])
        action_count_labels = torch.stack([self.pad_to_max_len(x["action_count_labels"], max_trg_len, -1) for x in batch])
        action_group_labels = torch.stack([self.pad_to_max_len(x["action_group_labels"], max_trg_len, -1) for x in batch])
        action_type_labels = torch.stack(
            [self.pad_to_max_len(x["action_type_labels"], max_trg_len, -1) for x in batch])
        y = trim_batch(trg_ids, self.trg_lang.pad_token_id)
        # action_count_labels = trim_batch(action_count_labels, -1)
        # _src_ids, src_mask = trim_batch(src_ids, self.src_lang.pad_token_id, attention_mask=src_mask)
        # print(_src_ids.size(), src_ids.size())
        return {"source_ids": src_ids, "source_mask": src_mask, "target_ids": y, "target_mask": trg_mask,
                "action_count_labels": action_count_labels, "action_group_labels": action_group_labels,
                "action_type_labels": action_type_labels}

# === python/graphscope/experimental/nx/tests/algorithms/forward/operators/test_product.py (wenyuanyu/GraphScope, Apache-2.0) ===
import networkx.algorithms.operators.tests.test_product
import pytest

from graphscope.experimental import nx  # assumed: the tests below reference `nx`, the graphscope networkx-compatible module
from graphscope.experimental.nx.utils.compat import import_as_graphscope_nx
import_as_graphscope_nx(networkx.algorithms.operators.tests.test_product,
                        decorators=pytest.mark.usefixtures("graphscope_session"))
def test_tensor_product_combinations():
# basic smoke test, more realistic tests would be useful
P5 = nx.path_graph(5)
K3 = nx.complete_graph(3)
G = nx.tensor_product(P5, K3)
assert nx.number_of_nodes(G) == 5 * 3
G = nx.tensor_product(nx.DiGraph(P5), nx.DiGraph(K3))
assert nx.number_of_nodes(G) == 5 * 3
@pytest.mark.skip(reason="not support multigraph")
def test_cartesian_product_multigraph():
pass
def test_lexicographic_product_combinations():
P5 = nx.path_graph(5)
K3 = nx.complete_graph(3)
G = nx.lexicographic_product(P5, K3)
assert nx.number_of_nodes(G) == 5 * 3
def test_strong_product_combinations():
P5 = nx.path_graph(5)
K3 = nx.complete_graph(3)
G = nx.strong_product(P5, K3)
assert nx.number_of_nodes(G) == 5 * 3
@pytest.mark.skip(reason="not support multigraph")
def test_graph_power_raises():
pass

# === manpages.py (mba811/dash-manpages-zh, Apache-2.0) ===
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
@author: Wu Liang
@contact:
@date: 2014/06/23
"""
import os
import sqlite3
import shutil
import tarfile
import hashlib
import codecs
from mako.template import Template
from pyquery import PyQuery
currentPath = os.path.join(os.path.dirname(os.path.realpath(__file__)))
name = "manpages"
baseName = "manpages-zh"
output = baseName + ".docset"
appName = "dash-" + baseName
tarFileName = baseName + ".tgz"
feedName = baseName + ".xml"
version = "1.5.0"
docsetPath = os.path.join(currentPath, output, "Contents", "Resources", "Documents")
# Step 2: Copy the HTML Documentation
fin = codecs.open(os.path.join(docsetPath, "index.html"), "r", "utf-8")
content = fin.read()
fin.close()
jQuery = PyQuery(content)
jQuery.find("body").empty()
fileNames = []
itemTemplate = Template("<a href='html/${fileName}'>${name}</a><br />\n")
for fileName in os.listdir(os.path.join(docsetPath, "html")):
fileNames.append({
"name": fileName.split(".")[0],
"fileName": fileName
})
jQuery.find("body").append(itemTemplate.render(name = fileName.split(".")[0], fileName = fileName))
fin = codecs.open(os.path.join(docsetPath, "index.html"), "w", "utf-8")
newContent = jQuery.html()
fin.write(newContent)
fin.close()
# Step 3: create the Info.plist file
infoTemplate = Template('''<?xml version="1.0" encoding="UTF-8"?>
<plist version="1.0">
<dict>
<key>CFBundleIdentifier</key>
<string>${name}</string>
<key>CFBundleName</key>
<string>${name}</string>
<key>DocSetPlatformFamily</key>
<string>${name}</string>
<key>dashIndexFilePath</key>
<string>index.html</string>
<key>dashIndexFilePath</key>
<string>index.html</string>
<key>isDashDocset</key><true/>
<key>isJavaScriptEnabled</key><true/>
</dict>
</plist>''')
infoPlistFile = os.path.join(currentPath, output, "Contents", "Info.plist")
fin = open(infoPlistFile, "w")
fin.write(infoTemplate.render(name = name))
fin.close()
# Step 4: Create the SQLite Index
dbFile = os.path.join(currentPath, output, "Contents", "Resources", "docSet.dsidx")
if os.path.exists(dbFile):
os.remove(dbFile)
db = sqlite3.connect(dbFile)
cursor = db.cursor()
try:
cursor.execute("DROP TABLE searchIndex;")
except Exception:
pass
cursor.execute('CREATE TABLE searchIndex(id INTEGER PRIMARY KEY, name TEXT, type TEXT, path TEXT);')
cursor.execute('CREATE UNIQUE INDEX anchor ON searchIndex (name, type, path);')
insertTemplate = Template("INSERT OR IGNORE INTO searchIndex(name, type, path) VALUES ('${name}', '${type}', '${path}');")
# Step 5: Populate the SQLite Index
for result in fileNames:
sql = insertTemplate.render(name = result["name"], type = "Builtin", path = "html/" + result["fileName"])
    print(sql)
cursor.execute(sql)
db.commit()
db.close()
# Step 6: copy icon
shutil.copyfile(os.path.join(currentPath, "icon.png"),
os.path.join(currentPath, output, "icon.png"))
shutil.copyfile(os.path.join(currentPath, "icon@2x.png"),
os.path.join(currentPath, output, "icon@2x.png"))
# Step 7: Package the docset into a tarball
if not os.path.exists(os.path.join(currentPath, "dist")):
os.makedirs(os.path.join(currentPath, "dist"))
tarFile = tarfile.open(os.path.join(currentPath, "dist", tarFileName), "w:gz")
for root, dirNames, fileNames in os.walk(output):
for fileName in fileNames:
fullPath = os.path.join(root, fileName)
tarFile.add(fullPath)
tarFile.close()
# Step 8: Update the feed URL
feedTemplate = Template('''<entry>
<version>${version}</version>
<sha1>${sha1Value}</sha1>
<url>https://raw.githubusercontent.com/magicsky/${appName}/master/dist/${tarFileName}</url>
</entry>''')
fout = open(os.path.join(currentPath, "dist", tarFileName), "rb")
sha1Value = hashlib.sha1(fout.read()).hexdigest()
fout.close()
fin = open(os.path.join(currentPath, feedName), "w")
fin.write(feedTemplate.render(sha1Value = sha1Value, appName = appName, tarFileName = tarFileName, version = version))
fin.close()

# === docs/conf.py (alexweav/nisystemlink-clients-python, MIT) ===
import os
import sys
sys.path.insert(0, os.path.abspath(".."))
# --------------------------------------------------------------------------------------
project = "nisystemlink"
copyright = "2020, National Instruments"
author = "National Instruments"
# The short X.Y version
version = "0.1"
# The full version, including alpha/beta/rc tags
release = "0.1.3"
# --------------------------------------------------------------------------------------
extensions = [
"sphinx.ext.autodoc",
"sphinx.ext.napoleon",
"sphinx.ext.viewcode",
"sphinx_autodoc_typehints",
"docs.cleanup",
]
master_doc = "index"
html_theme = "sphinx_rtd_theme"
html_extra_path = [
"../LICENSE",
]
nitpicky = True
nitpick_ignore = [
("py:class", "datetime.datetime"),
("py:class", "datetime.timedelta"),
("py:class", "pathlib.Path"),
("py:data", "typing.Any"),
("py:data", "typing.Awaitable"),
("py:data", "typing.Dict"),
("py:data", "typing.Iterable"),
("py:data", "typing.List"),
("py:data", "typing.Optional"),
("py:data", "typing.Sequence"),
("py:data", "typing.Tuple"),
("py:data", "typing.Union"),
]
autodoc_default_options = {
"inherited-members": True,
"special-members": "__init__",
"no-private-members": True,
}
# Don't let napoleon force methods to be included in the docs; use autodoc flags and our
# own docs.cleanup module for that.
napoleon_include_init_with_doc = False
napoleon_include_private_with_doc = False
napoleon_include_special_with_doc = False
| 26.789474 | 88 | 0.587426 | 173 | 1,527 | 5.028902 | 0.537572 | 0.062069 | 0.124138 | 0.045977 | 0.062069 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007675 | 0.146693 | 1,527 | 56 | 89 | 27.267857 | 0.660015 | 0.237721 | 0 | 0 | 0 | 0 | 0.432152 | 0.020743 | 0 | 0 | 0 | 0.017857 | 0 | 1 | 0 | false | 0 | 0.046512 | 0 | 0.046512 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |

# === sdk/applicationinsights/azure-mgmt-applicationinsights/azure/mgmt/applicationinsights/v2015_05_01/models/_application_insights_management_client_enums.py (iscai-msft/azure-sdk-for-python, MIT) ===
# coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
from enum import Enum
class ApplicationType(str, Enum):
web = "web"
other = "other"
class FlowType(str, Enum):
bluefield = "Bluefield"
class RequestSource(str, Enum):
rest = "rest"
class PurgeState(str, Enum):
pending = "pending"
completed = "completed"
class FavoriteType(str, Enum):
shared = "shared"
user = "user"
class WebTestKind(str, Enum):
ping = "ping"
multistep = "multistep"
class ItemScope(str, Enum):
shared = "shared"
user = "user"
class ItemType(str, Enum):
query = "query"
function = "function"
folder = "folder"
recent = "recent"
class SharedTypeKind(str, Enum):
user = "user"
shared = "shared"
class FavoriteSourceType(str, Enum):
retention = "retention"
notebook = "notebook"
sessions = "sessions"
events = "events"
userflows = "userflows"
funnel = "funnel"
impact = "impact"
segmentation = "segmentation"
class ItemScopePath(str, Enum):
analytics_items = "analyticsItems"
myanalytics_items = "myanalyticsItems"
class ItemTypeParameter(str, Enum):
none = "none"
query = "query"
function = "function"
folder = "folder"
recent = "recent"
class CategoryType(str, Enum):
workbook = "workbook"
tsg = "TSG"
performance = "performance"
    retention = "retention"
| 17.960784 | 76 | 0.600437 | 179 | 1,832 | 6.134078 | 0.497207 | 0.082878 | 0.023679 | 0.034608 | 0.15847 | 0.15847 | 0.15847 | 0.100182 | 0.100182 | 0 | 0 | 0.000697 | 0.217249 | 1,832 | 101 | 77 | 18.138614 | 0.764993 | 0.246725 | 0 | 0.313725 | 0 | 0 | 0.189189 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.019608 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
1c30848fe8db838bf2ea7ab14ebea0d07ae3d297 | 2,311 | py | Python | setup.py | mark-mishyn/django-axes | dfaf67810abd21a0e76200a4906c1bffdd4fa9c9 | [
"MIT"
] | null | null | null | setup.py | mark-mishyn/django-axes | dfaf67810abd21a0e76200a4906c1bffdd4fa9c9 | [
"MIT"
] | null | null | null | setup.py | mark-mishyn/django-axes | dfaf67810abd21a0e76200a4906c1bffdd4fa9c9 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
from setuptools import setup, find_packages
setup(
name="django-axes",
description="Keep track of failed login attempts in Django-powered sites.",
long_description="\n".join(
[
open("README.rst", encoding="utf-8").read(),
open("CHANGES.rst", encoding="utf-8").read(),
]
),
keywords="authentication django pci security",
author=", ".join(
[
"Josh VanderLinden",
"Philip Neustrom",
"Michael Blume",
"Alex Clark",
"Camilo Nova",
"Aleksi Hakli",
]
),
author_email="security@jazzband.co",
maintainer="Jazzband",
maintainer_email="security@jazzband.co",
url="https://github.com/jazzband/django-axes",
project_urls={
"Documentation": "https://django-axes.readthedocs.io/",
"Source": "https://github.com/jazzband/django-axes",
"Tracker": "https://github.com/jazzband/django-axes/issues",
},
license="MIT",
package_dir={"axes": "axes"},
use_scm_version=True,
setup_requires=["setuptools_scm"],
python_requires="~=3.6",
install_requires=["django>=1.11", "django-appconf>=1.0.3", "django-ipware>=2.0.2"],
include_package_data=True,
packages=find_packages(),
classifiers=[
"Development Status :: 5 - Production/Stable",
"Environment :: Web Environment",
"Environment :: Plugins",
"Framework :: Django",
"Framework :: Django :: 1.11",
"Framework :: Django :: 2.2",
"Framework :: Django :: 3.0",
"Intended Audience :: Developers",
"Intended Audience :: System Administrators",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
"Programming Language :: Python",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.6",
"Programming Language :: Python :: 3.7",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: Implementation :: CPython",
"Programming Language :: Python :: Implementation :: PyPy",
"Topic :: Internet :: Log Analysis",
"Topic :: Security",
"Topic :: System :: Logging",
],
zip_safe=False,
)
| 34.492537 | 87 | 0.581134 | 229 | 2,311 | 5.79476 | 0.528384 | 0.100226 | 0.131876 | 0.078372 | 0.10098 | 0.072344 | 0 | 0 | 0 | 0 | 0 | 0.016384 | 0.260493 | 2,311 | 66 | 88 | 35.015152 | 0.760094 | 0.008654 | 0 | 0.031746 | 0 | 0 | 0.541048 | 0.00917 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.015873 | 0 | 0.015873 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1c33dae046d778c2acefa8efab3c4ae7565e1bc3 | 348 | py | Python | spark_work.py | nszceta/spark-python-celery-demo | c5b03be4bb96699f8e41aa8a42fecd4c25c76331 | [
"MIT"
] | 8 | 2016-01-19T15:59:36.000Z | 2018-04-25T09:00:57.000Z | spark_work.py | nszceta/spark-python-celery-demo | c5b03be4bb96699f8e41aa8a42fecd4c25c76331 | [
"MIT"
] | null | null | null | spark_work.py | nszceta/spark-python-celery-demo | c5b03be4bb96699f8e41aa8a42fecd4c25c76331 | [
"MIT"
] | null | null | null |
import sys
from pyspark import SparkContext
import json
print('spark got python path -> ' + str(sys.executable))
logfile = sys.argv[1]
sc = SparkContext()
logdata = sc.textFile(logfile).cache()
a_count = logdata.filter(lambda s: 'a' in s).count()
b_count = logdata.filter(lambda s: 'b' in s).count()
print(json.dumps({'a': a_count, 'b': b_count}))
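The same filter-and-count logic can be sanity-checked without a Spark cluster by mirroring the RDD filters with plain Python (the sample log lines below are made up for illustration):

```python
import json

# Stand-in for the RDD: a small in-memory list of log lines.
lines = ["apple", "banana", "cherry", "grape"]

# Same predicates as the two Spark filters above.
a_count = sum(1 for s in lines if 'a' in s)
b_count = sum(1 for s in lines if 'b' in s)

print(json.dumps({'a': a_count, 'b': b_count}))  # {"a": 3, "b": 1}
```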
1c3521323cf7d57dc8b2b240d95a181b90cc3144 | 1,188 | py | Python | src/recognizeDigit.py | RsTaK/Sudoku | 8daa0a06906ce61d9a71586a8d28a3931ca4e5e3 | [
"MIT"
] | 2 | 2020-01-22T14:32:40.000Z | 2021-12-23T20:42:52.000Z | src/recognizeDigit.py | RsTaK/Sudoku | 8daa0a06906ce61d9a71586a8d28a3931ca4e5e3 | [
"MIT"
] | 4 | 2020-11-13T18:54:24.000Z | 2022-02-10T02:10:00.000Z | src/recognizeDigit.py | RsTaK/Sudoku | 8daa0a06906ce61d9a71586a8d28a3931ca4e5e3 | [
"MIT"
] | 1 | 2020-01-22T14:02:50.000Z | 2020-01-22T14:02:50.000Z |
from keras.models import load_model
import cv2
import pickle
import keras.backend as K
import numpy as np
from src.model_path import MODEL_PATH
'''def predict(self, cell):
model = load_model('./model/Model.h5')
f = K.function([model.layers[0].input, K.learning_phase()],[model.layers[-1].output])
rescaled_cell = self.rescale(cell)
result = []
for _ in range(10):
result.append(f([rescaled_cell, 1]))
result = np.array(result)
prediction = result.mean(axis=0)
uncertainty = result.var(axis=0)
if uncertainty.argmax() > 3:
new_prediction = 0
print(prediction.argmax(),uncertainty.argmax(),new_prediction)
else:
print(prediction.argmax(),uncertainty.argmax())'''
class recognizeDigit:
def __init__(self, cell):
self._prediction = self.predict(cell)
def predict(self, cell):
model = load_model(MODEL_PATH)
rescaled_cell = self.rescale(cell)
pred = model.predict(rescaled_cell)
return pred.argmax()
def rescale(self, cell):
resized_cell = cv2.resize(cell, (28, 28))
return resized_cell.reshape(1, resized_cell.shape[0], resized_cell.shape[1], 1)
@property
def prediction(self):
return self._prediction
1c357d3712292b01ee95a5bca2342315acb4f8ef | 623 | py | Python | dojo/db_migrations/0147_rename_sslyze_parser.py | dant24/django-DefectDojo | caf5c91b3f8870d5f466dfaaf5a3a096f8812ad9 | [
"BSD-3-Clause"
] | 249 | 2016-09-06T21:04:40.000Z | 2018-01-19T15:59:44.000Z | dojo/db_migrations/0147_rename_sslyze_parser.py | dant24/django-DefectDojo | caf5c91b3f8870d5f466dfaaf5a3a096f8812ad9 | [
"BSD-3-Clause"
] | 255 | 2016-09-06T21:36:37.000Z | 2018-01-19T19:57:57.000Z | dojo/db_migrations/0147_rename_sslyze_parser.py | dant24/django-DefectDojo | caf5c91b3f8870d5f466dfaaf5a3a096f8812ad9 | [
"BSD-3-Clause"
] | 152 | 2016-09-06T21:04:54.000Z | 2018-01-18T08:52:24.000Z |
from django.db import migrations
def rename_sslyze_parser(apps, schema_editor):
Test_Type_model = apps.get_model('dojo', 'Test_Type')
try:
test_type_sslyze = Test_Type_model.objects.get(name='SSLyze 3 Scan (JSON)')
test_type_sslyze.name = 'SSLyze Scan (JSON)'
test_type_sslyze.save()
except Test_Type_model.DoesNotExist:
# This happens when a new instance of DD is initialized
pass
class Migration(migrations.Migration):
dependencies = [
('dojo', '0146_lead_optional'),
]
operations = [
migrations.RunPython(rename_sslyze_parser),
]
1c38a65740967a1e49c94a99e84549d3470de0b7 | 493 | py | Python | TwoPointers/Leetcode11.py | Rylie-W/LeetRecord | 623c4efe88b3af54b8a65f6ec23db850b8c6f46f | [
"Apache-2.0"
] | null | null | null | TwoPointers/Leetcode11.py | Rylie-W/LeetRecord | 623c4efe88b3af54b8a65f6ec23db850b8c6f46f | [
"Apache-2.0"
] | null | null | null | TwoPointers/Leetcode11.py | Rylie-W/LeetRecord | 623c4efe88b3af54b8a65f6ec23db850b8c6f46f | [
"Apache-2.0"
] | null | null | null |
class Solution:
def maxArea(self, height) -> int:
left=0
right=len(height)-1
res=min(height[left],height[right])*(right-left)
while right>left:
res=max(res,(right-left)*min(height[right],height[left]))
if height[left]<height[right]:
left+=1
else: right-=1
return res
if __name__ == '__main__':
sol=Solution()
# height = [1, 1]
height=[1,3,2,5,25,24,5]
print(sol.maxArea(height))
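The two-pointer scan above can be checked against a few known container-with-most-water cases (the extra test heights are hypothetical inputs, not part of the original file):

```python
def max_area(height):
    # Two pointers move inward from both ends; the shorter wall advances,
    # since moving the taller wall can never increase the area.
    left, right = 0, len(height) - 1
    res = 0
    while left < right:
        res = max(res, (right - left) * min(height[left], height[right]))
        if height[left] < height[right]:
            left += 1
        else:
            right -= 1
    return res

assert max_area([1, 3, 2, 5, 25, 24, 5]) == 24
assert max_area([1, 8, 6, 2, 5, 4, 8, 3, 7]) == 49
assert max_area([1, 1]) == 1
```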
1c3c2ebbf2a88dc388bb0314813d8b32b385e4b0 | 3,133 | py | Python | rqalpha/data/instrument_mixin.py | mysky528/rqalpha | ecd550fc30aee96f9995e8152e2c48f5512f8b11 | [
"Apache-2.0"
] | 3 | 2017-07-11T15:37:24.000Z | 2021-11-22T14:21:13.000Z | rqalpha/data/instrument_mixin.py | mysky528/rqalpha | ecd550fc30aee96f9995e8152e2c48f5512f8b11 | [
"Apache-2.0"
] | null | null | null | rqalpha/data/instrument_mixin.py | mysky528/rqalpha | ecd550fc30aee96f9995e8152e2c48f5512f8b11 | [
"Apache-2.0"
] | 2 | 2019-04-26T07:51:08.000Z | 2020-12-01T20:59:04.000Z |
# -*- coding: utf-8 -*-
#
# Copyright 2017 Ricequant, Inc
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import six
class InstrumentMixin(object):
def __init__(self, instruments):
self._instruments = {i.order_book_id: i for i in instruments}
self._sym_id_map = {i.symbol: k for k, i in six.iteritems(self._instruments)
# filter out CSI300, SSE50, CSI500, SSE180
if not i.order_book_id.endswith('INDX')}
try:
# FIXME
# CSI 300 and CSI 500 are fixed to use the Shanghai Stock Exchange codes
for o in ['000300.XSHG', '000905.XSHG']:
self._sym_id_map[self._instruments[o].symbol] = o
# both the 'SSE 180' and 'SSE 180 Index' symbols point to 000010.XSHG
self._sym_id_map[self._instruments['SSE180.INDX'].symbol] = '000010.XSHG'
except KeyError:
pass
def sector(self, code):
return [v.order_book_id for v in self._instruments.values()
if v.type == 'CS' and v.sector_code == code]
def industry(self, code):
return [v.order_book_id for v in self._instruments.values()
if v.type == 'CS' and v.industry_code == code]
def concept(self, *concepts):
return [v.order_book_id for v in self._instruments.values()
if v.type == 'CS' and any(c in v.concept_names.split('|') for c in concepts)]
def all_instruments(self, types, dt=None):
return [i for i in self._instruments.values()
if ((dt is None or i.listed_date.date() <= dt.date() <= i.de_listed_date.date()) and
(types is None or i.type in types))]
def _instrument(self, sym_or_id):
try:
return self._instruments[sym_or_id]
except KeyError:
try:
sym_or_id = self._sym_id_map[sym_or_id]
return self._instruments[sym_or_id]
except KeyError:
return None
def instruments(self, sym_or_ids):
if isinstance(sym_or_ids, six.string_types):
return self._instrument(sym_or_ids)
return [i for i in [self._instrument(sid) for sid in sym_or_ids] if i is not None]
def get_future_contracts(self, underlying, date):
date = date.replace(hour=0, minute=0, second=0)
futures = [v for o, v in six.iteritems(self._instruments)
if v.type == 'Future' and v.underlying_symbol == underlying and
not o.endswith('88') and not o.endswith('99')]
if not futures:
return []
return sorted(i.order_book_id for i in futures if i.listed_date <= date <= i.de_listed_date)
1c3e669806f961c690e3e607d0c5ebaae5ffefbe | 2,503 | py | Python | articles/views.py | qwghlm/CommentIsMee | 2c11be1376ec693df28123727c3d86b38404fd71 | [
"MIT"
] | null | null | null | articles/views.py | qwghlm/CommentIsMee | 2c11be1376ec693df28123727c3d86b38404fd71 | [
"MIT"
] | null | null | null | articles/views.py | qwghlm/CommentIsMee | 2c11be1376ec693df28123727c3d86b38404fd71 | [
"MIT"
] | null | null | null |
from django.http import HttpResponse
from django.template import RequestContext, loader
from django.shortcuts import render, get_object_or_404, redirect
from django.core.urlresolvers import reverse
from django.core.cache import cache
from articles.models import CIFArticle
from .forms import CIFArticleForm
def index(request):
"""
Handle requests to the homepage
"""
article = None
# If a user has submitted a URL...
if request.POST:
form = CIFArticleForm(request.POST)
if (form.is_valid()):
try:
article = form.save(commit=False)
existing_articles = CIFArticle.objects.filter(url=article.url).count()
if existing_articles:
article = CIFArticle.objects.get(url=article.url)
else:
article.measure_ego()
article.save()
except ValueError as e:
article = None
form._errors["url"] = form.error_class([str(e)])
# If no URL submitted, just set up a blank form
else:
form = CIFArticleForm()
# If an article is found or created due to a user submission, redirect there
if article:
return redirect(reverse("articles:detail", args=(article.id,)))
# Else show the homepage & rendered form
else:
top_articles = cache.get('cim:top_articles')
if top_articles is None:
top_articles = CIFArticle.objects.filter(is_cif=1).order_by('-score')[:10]
cache.set('cim:top_articles', top_articles, 60)
latest_articles = cache.get('cim:latest_articles')
if latest_articles is None:
latest_articles = CIFArticle.objects.filter(is_cif=1).order_by('-id')[:5]
cache.set('cim:latest_articles', latest_articles, 30)
return render(request, 'articles/index.html', {
'form' : form ,
'top_articles' : top_articles,
'latest_articles' : latest_articles
})
def detail(request, article_id):
"""
Handle detail view for an article
"""
# Quite simple, set up article and form
form = CIFArticleForm()
article_key = 'cim:article:%s' % article_id
article = cache.get(article_key)
if article is None:
article = get_object_or_404(CIFArticle, id=article_id)
cache.set(article_key, article, 300)
return render(request, 'articles/detail.html', {
'article' : article,
'form' : form })
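The cache.get / cache.set calls in both views follow the cache-aside pattern. A framework-free sketch of the same flow (the dict-backed cache and the loader function are stand-ins, not Django's cache API):

```python
cache = {}
load_calls = []

def load_article(article_id):
    # Stand-in for the get_object_or_404 database hit.
    load_calls.append(article_id)
    return {"id": article_id, "title": "example"}

def get_article(article_id):
    key = 'cim:article:%s' % article_id
    article = cache.get(key)
    if article is None:
        article = load_article(article_id)
        cache[key] = article  # the real code also passes a TTL, e.g. 300s
    return article

first = get_article(7)
second = get_article(7)  # served from the cache; no second load
```

The second call returns the cached dictionary, so `load_article` runs only once per key until the entry expires or is evicted.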
1c41c0dd3400c46c01883be0652a07078deef3cb | 2,616 | py | Python | pydoc_fork/__main__.py | matthewdeanmartin/pydoc_fork | 174475b15be966f3751d5563b4db0beecc3ab1f9 | [
"MIT"
] | null | null | null | pydoc_fork/__main__.py | matthewdeanmartin/pydoc_fork | 174475b15be966f3751d5563b4db0beecc3ab1f9 | [
"MIT"
] | 1 | 2022-01-17T16:28:45.000Z | 2022-01-17T16:28:45.000Z | pydoc_fork/__main__.py | matthewdeanmartin/pydoc_fork | 174475b15be966f3751d5563b4db0beecc3ab1f9 | [
"MIT"
] | null | null | null |
# noinspection PyPep8
"""pydoc_fork
A fork of pydoc that is optimized for generating html documentation in a CI context
Usage:
pydoc_fork <package>... [options]
pydoc_fork (-h | --help)
pydoc_fork --version
Options:
-h --help Show this screen.
-v --version Show version.
--quiet No printing or logging.
--verbose Crank up the logging.
--config <config> pyproject.toml or other toml config.
--document_internals respect underscore or __all__ private
--prefer_docs_python_org link to python.org or generate own stdlib docs
-o --output <folder> where to write files
"""
# TODO: implement this
# pydoc_fork dot_notation <importable>... [--output=<folder>] [--document_internals]
# pydoc_fork source_path <path>... [--output=<folder>] [--document_internals]
import logging
import sys
import docopt
from pydoc_fork import commands, settings
from pydoc_fork.settings import load_config
LOGGER = logging.getLogger(__name__)
LOGGERS = []
__version__ = "3.0.0"
def main() -> int:
"""Get the args object from command parameters"""
arguments = docopt.docopt(__doc__, version=f"pydoc_fork {__version__}")
config_path = arguments.get("<config>")
if config_path:
load_config(config_path)
LOGGER.debug(f"Invoking with docopts: {str(arguments)}")
output_folder = arguments["--output"]
# TODO: add lists of packages
package = arguments["<package>"] or []
# quiet = bool(arguments.get("--quiet", False))
if arguments.get("--document_internals"):
settings.DOCUMENT_INTERNALS = arguments["--document_internals"]
if arguments.get("--prefer_docs_python_org"):
settings.PREFER_DOCS_PYTHON_ORG = arguments["--prefer_docs_python_org"]
if arguments.get("--verbose"):
# root logger, all modules
for root in ("pydoc_fork", "__main__"):
logger = logging.getLogger(root)
logger.setLevel(logging.DEBUG)
handler = logging.StreamHandler()
handler.setLevel(logging.DEBUG)
log_format = "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
formatter = logging.Formatter(log_format)
handler.setFormatter(formatter)
logger.addHandler(handler)
LOGGERS.append(logger)
commands.process_path_or_dot_name(
package,
output_folder=output_folder,
)
# # TODO
# print("Don't recognize that command.")
# return -1
return 0
if __name__ == "__main__":
sys.exit(main())
1c4262cdeb92ebd6c335d957cdc8fd8bfca03129 | 190 | py | Python | Learning Python/Exercise Files/Ch2/helloworld_my.py | RomanShevtsiv/linkedin-learning | d7ec85953b7e88905f87928ede067d32344b984f | [
"MIT"
] | null | null | null | Learning Python/Exercise Files/Ch2/helloworld_my.py | RomanShevtsiv/linkedin-learning | d7ec85953b7e88905f87928ede067d32344b984f | [
"MIT"
] | null | null | null | Learning Python/Exercise Files/Ch2/helloworld_my.py | RomanShevtsiv/linkedin-learning | d7ec85953b7e88905f87928ede067d32344b984f | [
"MIT"
] | null | null | null |
#
# Example file for HelloWorld
#
def main():
print("Hello World")
name = input("What is your name? ")
print("Nice to meet you,", name)
if __name__ == "__main__":
main()
1c49c9837d339902372100015afa8dd09aa825df | 718 | py | Python | tests/main.py | deeso/json-search-replace | d1dd75cfaecb65bf8fcbad0c80a0bd839eccaa8d | [
"Apache-2.0"
] | 1 | 2019-02-08T14:42:45.000Z | 2019-02-08T14:42:45.000Z | tests/main.py | deeso/manipin-json | d1dd75cfaecb65bf8fcbad0c80a0bd839eccaa8d | [
"Apache-2.0"
] | null | null | null | tests/main.py | deeso/manipin-json | d1dd75cfaecb65bf8fcbad0c80a0bd839eccaa8d | [
"Apache-2.0"
] | null | null | null | from wrapper_tests.upsert_test import *
from wrapper_tests.upsertvaluedict_test import *
import os
import logging
import sys
import argparse
import signal
logging.getLogger().setLevel(logging.DEBUG)
ch = logging.StreamHandler(sys.stdout)
ch.setLevel(logging.DEBUG)
formatter = logging.Formatter('[%(asctime)s - %(name)s] %(message)s')
ch.setFormatter(formatter)
logging.getLogger().addHandler(ch)
parser = argparse.ArgumentParser(
description='Unit testing for fiery snap.')
parser.add_argument('-config', type=str, default=None,
help='toml config for keys and such, see key.toml')
if __name__ == '__main__':
unittest.main()
os.kill(os.getpid(), signal.SIGKILL)
1c4b4d3e7fde53ff67c2f5b9ffd3aee5b505137c | 598 | py | Python | Sending_email/email.py | Satyam-Bhalla/Python-Scripts | 39c46a362acd63cc5d1b9ab57ecb7250eaff35f7 | [
"MIT"
] | 8 | 2018-09-25T16:30:12.000Z | 2022-03-25T05:13:43.000Z | Sending_email/email.py | Satyam-Bhalla/Python-Scripts | 39c46a362acd63cc5d1b9ab57ecb7250eaff35f7 | [
"MIT"
] | 1 | 2021-03-31T18:43:43.000Z | 2021-03-31T18:43:43.000Z | Sending_email/email.py | Satyam-Bhalla/Python-Scripts | 39c46a362acd63cc5d1b9ab57ecb7250eaff35f7 | [
"MIT"
] | 6 | 2018-01-29T19:00:42.000Z | 2022-03-25T05:13:47.000Z |
import smtplib
gmail_user = 'your email'
gmail_password = 'your password'
sent_from = gmail_user
to = ['reciever email'] #Create a list for all the recievers
subject = 'OMG Super Important Message'
body = 'Hey, what\'s up?\n- You'
email_text = """\
From: %s
To: %s
Subject: %s
%s
""" % (sent_from, ", ".join(to), subject, body)
try:
server = smtplib.SMTP_SSL('smtp.gmail.com', 465)
server.ehlo()
server.login(gmail_user, gmail_password)
server.sendmail(sent_from, to, email_text)
server.close()
print('Email sent!')
except Exception as e:
print(e)
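Assembling the headers by string interpolation, as above, is easy to get subtly wrong (missing blank line, bad encoding). A sketch of the same message built with the stdlib's `email.message.EmailMessage` instead (the addresses here are placeholders):

```python
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "sender@example.com"
msg["To"] = ", ".join(["receiver@example.com"])
msg["Subject"] = "OMG Super Important Message"
msg.set_content("Hey, what's up?\n- You")

# server.send_message(msg) would then replace server.sendmail(...) above,
# letting the library handle header formatting and encoding.
```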
1c4b532b4b156dd08dc3bcca54d167230e8c8b2a | 4,532 | py | Python | FlaskDaemon/load_test.py | caffeinate/test-pylot | 3380208ea0e7ee5fed4299f22ab592a3d3232b3a | [
"MIT"
] | null | null | null | FlaskDaemon/load_test.py | caffeinate/test-pylot | 3380208ea0e7ee5fed4299f22ab592a3d3232b3a | [
"MIT"
] | 1 | 2021-10-31T17:46:54.000Z | 2021-10-31T17:46:54.000Z | FlaskDaemon/load_test.py | caffeinate/test-pylot | 3380208ea0e7ee5fed4299f22ab592a3d3232b3a | [
"MIT"
] | 1 | 2020-07-20T04:10:40.000Z | 2020-07-20T04:10:40.000Z |
'''
Created on 11 Sep 2015
@author: si
'''
import json
import random
import time
from threading import Thread
# import urllib
import urllib2
from Queue import Queue
import logging
logger = logging.getLogger(__name__)
API_URL = "http://127.0.0.1:5000/"
class LoadTest(object):
"""
Create a single process with one thread per test user.
"""
def __init__(self, test_users_count, requests_per_user):
"""
@param test_users_count: int
@param requests_per_user: int
"""
self.thread_table = []
self.test_users_count = test_users_count
self.requests_per_user = requests_per_user
self.stats = { 'return_codes' : {},
'requests_made' : 0,
'total_seconds_waiting' : 0.0
}
self.stats_q = Queue(0)
def go(self):
start_time = time.time()
msg = "%s test users with %s requests each..." % \
(self.test_users_count, self.requests_per_user)
self.logger(msg)
for i in range(self.test_users_count):
p = TestUser(i, self.requests_per_user, self.stats_q)
p.start()
self.thread_table.append(p)
end_time = time.time()
self.logger("time taken to create threads : %s" % (end_time-start_time,))
start_time = time.time()
# wait for threads to complete
while True:
alive_count = len(self.thread_table)
# could time.sleep(0.5) or just wait for all threads to finish
for p in self.thread_table:
if not p.is_alive():
alive_count -= 1
else:
p.join()
if alive_count == 0:
break
#print "alive:%s" % alive_count
end_time = time.time()
time_taken = end_time-start_time
self.logger("finished. Time taken : %s" % time_taken)
while not self.stats_q.empty():
user_stats = self.stats_q.get()
for http_status, count in user_stats['return_codes'].iteritems():
if http_status not in self.stats['return_codes']:
self.stats['return_codes'][http_status] = 0
self.stats['return_codes'][http_status] += count
self.stats['requests_made'] += user_stats['requests_made']
self.stats['total_seconds_waiting'] += user_stats['total_seconds_waiting']
print self.stats
# time_taken is real time not CPU
req_per_sec = float(self.stats['requests_made'])/time_taken
print "Requests per second: %s" % req_per_sec
def logger(self, msg):
logger.info(msg)
print msg
class TestUser(Thread):
"""
Act like a user. Bit over simplified at the moment.
"""
def __init__(self, user_id, requests_count, stats_queue):
super(TestUser, self).__init__()
self.remaining_request = requests_count
self.base_url = API_URL
self.stats_queue = stats_queue
self.user_id = user_id
def logger(self, msg):
logger.info(msg)
#print msg
def run(self):
"""
@return: dictionary of stats to be collected by main process
"""
stats = { 'return_codes' : {},
'requests_made': self.remaining_request,
'total_seconds_waiting' : 0.0, # waiting for requests
}
while self.remaining_request > 0:
# sleep for average of half a second
time.sleep(random.random())
start_time = time.time()
# for POST
#raw = {}
#d = json.dumps(raw)
#h = {'Content-type': 'application/json'}
#req = urllib2.Request(self.base_url, data=d, headers=h)
# for GET
req = urllib2.Request(self.base_url)
f = urllib2.urlopen(req)
end_time = time.time()
d = end_time-start_time
stats['total_seconds_waiting'] += d
http_status = f.getcode()
if http_status not in stats['return_codes']:
stats['return_codes'][http_status] = 0
stats['return_codes'][http_status] += 1
self.remaining_request -= 1
self.logger("Thread %s finished: %s" % (self.user_id, stats))
self.stats_queue.put(stats, False)
if __name__ == '__main__':
l = LoadTest(10,30)
l.go()
1c4b641cf08d14aaba12ee7b055b0523dd40710b | 407 | py | Python | urls.py | jeylani99/Real-Estate | 5ccb4bf23c73b4acb77427faa202a15216ef58c3 | [
"Apache-2.0"
] | null | null | null | urls.py | jeylani99/Real-Estate | 5ccb4bf23c73b4acb77427faa202a15216ef58c3 | [
"Apache-2.0"
] | null | null | null | urls.py | jeylani99/Real-Estate | 5ccb4bf23c73b4acb77427faa202a15216ef58c3 | [
"Apache-2.0"
] | null | null | null |
from django.contrib import admin
from django.conf.urls import include,url
from .import views
urlpatterns = [
url(r'^$', views.IndexView.as_view(),name='index'),
#homeapp_detail_view_url
url(r'^(?P<pk>[0-9]+)/$',views.LocationView.as_view(),name='property'),
#homeapp/detailview/moredetailview
url(r'^([0-9]+)/(?P<pk>[0-9]+)/$',views.PropertyView.as_view(),name='propertyview'),
]
1c5289b76fb10d8b256a4000027a462353b8a389 | 1,342 | py | Python | SSOKeyGen/ssokeygendialog.py | chrcoe/sso-keygen | c149f6202fbecb38874c75bf82e0d4857d1249f9 | [
"MIT"
] | null | null | null | SSOKeyGen/ssokeygendialog.py | chrcoe/sso-keygen | c149f6202fbecb38874c75bf82e0d4857d1249f9 | [
"MIT"
] | null | null | null | SSOKeyGen/ssokeygendialog.py | chrcoe/sso-keygen | c149f6202fbecb38874c75bf82e0d4857d1249f9 | [
"MIT"
] | null | null | null |
# -*- coding: utf-8 -*-
# Form implementation generated from reading ui file 'ssokeygendialog.ui'
#
# Created: Sun Feb 1 12:33:36 2015
# by: PyQt5 UI code generator 5.4
#
# WARNING! All changes made in this file will be lost!
from PyQt5 import QtCore, QtGui, QtWidgets
class Ui_Dialog(object):
def setupUi(self, Dialog):
Dialog.setObjectName("Dialog")
Dialog.resize(400, 300)
self.buttonBox = QtWidgets.QDialogButtonBox(Dialog)
self.buttonBox.setGeometry(QtCore.QRect(30, 240, 341, 32))
self.buttonBox.setOrientation(QtCore.Qt.Horizontal)
self.buttonBox.setStandardButtons(QtWidgets.QDialogButtonBox.Cancel|QtWidgets.QDialogButtonBox.Ok)
self.buttonBox.setObjectName("buttonBox")
self.testLabel = QtWidgets.QLabel(Dialog)
self.testLabel.setGeometry(QtCore.QRect(50, 40, 181, 31))
self.testLabel.setObjectName("testLabel")
self.retranslateUi(Dialog)
self.buttonBox.accepted.connect(Dialog.accept)
self.buttonBox.rejected.connect(Dialog.reject)
QtCore.QMetaObject.connectSlotsByName(Dialog)
def retranslateUi(self, Dialog):
_translate = QtCore.QCoreApplication.translate
Dialog.setWindowTitle(_translate("Dialog", "Dialog Test"))
self.testLabel.setText(_translate("Dialog", "TextLabel"))
1c54af7d2bc1fc02891b6239b955a52d082c20b2 | 936 | py | Python | pycuda/characterize.py | grlee77/pycuda | cfb787ac73a523fe4b32eff31ecffac485388bbf | [
"Apache-2.0"
] | null | null | null | pycuda/characterize.py | grlee77/pycuda | cfb787ac73a523fe4b32eff31ecffac485388bbf | [
"Apache-2.0"
] | null | null | null | pycuda/characterize.py | grlee77/pycuda | cfb787ac73a523fe4b32eff31ecffac485388bbf | [
"Apache-2.0"
] | 1 | 2020-08-31T08:52:24.000Z | 2020-08-31T08:52:24.000Z |
from __future__ import division
from __future__ import absolute_import
from pycuda.tools import context_dependent_memoize
import numpy as np
def platform_bits():
return tuple.__itemsize__ * 8
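The `tuple.__itemsize__` trick relies on each tuple slot holding one `PyObject*` pointer in CPython; it can be cross-checked against `struct`'s native pointer size:

```python
import struct

def platform_bits():
    return tuple.__itemsize__ * 8

# struct's "P" format code is the size of a C void* in bytes.
assert platform_bits() == struct.calcsize("P") * 8
assert platform_bits() in (32, 64)
```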
def has_stack():
from pycuda.driver import Context
return Context.get_device().compute_capability() >= (2, 0)
def has_double_support():
from pycuda.driver import Context
return Context.get_device().compute_capability() >= (1, 3)
@context_dependent_memoize
def sizeof(type_name, preamble=""):
from pycuda.compiler import SourceModule
mod = SourceModule("""
%s
extern "C"
__global__ void write_size(size_t *output)
{
*output = sizeof(%s);
}
""" % (preamble, type_name), no_extern_c=True)
import pycuda.gpuarray as gpuarray
output = gpuarray.empty((), dtype=np.uintp)
mod.get_function("write_size")(output, block=(1, 1, 1), grid=(1, 1))
return int(output.get())
1c556489b0d99f41db32e59ce5f01f383067703c | 2,797 | py | Python | pygs/graphserver/compiler/dedupe.py | abyrd/graphserver | 42edcad2618635310c57fa6ab4a13974025248ba | [
"BSD-3-Clause-Clear"
] | 2 | 2015-02-25T21:46:02.000Z | 2019-04-27T20:22:33.000Z | pygs/graphserver/compiler/dedupe.py | ninowalker/graphserver | dc08070bc6e295986633cf510ca46a2f8d451b92 | [
"BSD-3-Clause-Clear"
] | null | null | null | pygs/graphserver/compiler/dedupe.py | ninowalker/graphserver | dc08070bc6e295986633cf510ca46a2f8d451b92 | [
"BSD-3-Clause-Clear"
] | null | null | null |
# eliminate duplicate service periods from a GTFS database
from graphserver.ext.gtfs.gtfsdb import GTFSDatabase
import sys
from optparse import OptionParser
def main():
usage = """usage: python dedupe.py <graphdb_filename>"""
parser = OptionParser(usage=usage)
(options, args) = parser.parse_args()
if len(args) != 1:
parser.print_help()
exit(-1)
graphdb_filename = args[0]
gtfsdb = GTFSDatabase( graphdb_filename )
query = """
SELECT count(*), monday, tuesday, wednesday, thursday, friday, saturday, sunday, start_date, end_date
FROM calendar
GROUP BY monday, tuesday, wednesday, thursday, friday, saturday, sunday, start_date, end_date"""
duped_periods = gtfsdb.execute( query )
equivilants = []
for count, m,t,w,th,f,s,su,start_date,end_date in duped_periods:
# no need to check for dupes if there's only one
if count==1:
continue
#print count, m, t, w, th, f, s, su, start_date, end_date
# get service_ids for this dow/start_date/end_date combination
service_ids = [x[0] for x in list( gtfsdb.execute( "SELECT service_id FROM calendar where monday=? and tuesday=? and wednesday=? and thursday=? and friday=? and saturday=? and sunday=? and start_date=? and end_date=?", (m,t,w,th,f,s,su,start_date,end_date) ) ) ]
# group by service periods with the same set of exceptions
exception_set_grouper = {}
for service_id in service_ids:
exception_set = list(gtfsdb.execute( "SELECT date, exception_type FROM calendar_dates WHERE service_id=?", (service_id,) ) )
exception_set.sort()
exception_set = tuple(exception_set)
exception_set_grouper[exception_set] = exception_set_grouper.get(exception_set,[])
exception_set_grouper[exception_set].append( service_id )
# extend list of equivilants
for i, exception_set_group in enumerate( exception_set_grouper.values() ):
equivilants.append( ("%d%d%d%d%d%d%d-%s-%s-%d"%(m,t,w,th,f,s,su,start_date,end_date,i), exception_set_group) )
for new_name, old_names in equivilants:
for old_name in old_names:
            print(old_name, new_name)
c = gtfsdb.conn.cursor()
c.execute( "UPDATE calendar SET service_id=? WHERE service_id=?", (new_name, old_name) )
c.execute( "UPDATE calendar_dates SET service_id=? WHERE service_id=?", (new_name, old_name) )
c.execute( "UPDATE trips SET service_id=? WHERE service_id=?", (new_name, old_name) )
gtfsdb.conn.commit()
c.close()
if __name__ == '__main__':
    main()
| 39.957143 | 271 | 0.631748 | 370 | 2,797 | 4.556757 | 0.3 | 0.099644 | 0.049822 | 0.066429 | 0.297746 | 0.286477 | 0.282325 | 0.231317 | 0.231317 | 0.231317 | 0 | 0.002431 | 0.264569 | 2,797 | 70 | 272 | 39.957143 | 0.81721 | 0.108688 | 0 | 0 | 0 | 0.02439 | 0.274226 | 0.009248 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.073171 | null | null | 0.04878 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1c59215728acaff76dbcdca05ce20bf9c254f9f4 | 1,627 | py | Python | tests/test_deepsv.py | lsantuari/deepsv | debaa1442d1d97b8220be70e12321cf047d3e6a0 | [
"Apache-2.0"
] | null | null | null | tests/test_deepsv.py | lsantuari/deepsv | debaa1442d1d97b8220be70e12321cf047d3e6a0 | [
"Apache-2.0"
] | null | null | null | tests/test_deepsv.py | lsantuari/deepsv | debaa1442d1d97b8220be70e12321cf047d3e6a0 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
import pytest
from deepsv import deepsv
from unittest.mock import patch
"""Tests for the deepsv module.
"""
def test_something():
assert True
def test_adding_numbers():
assert deepsv.add_numbers(1, 1) == 2
assert deepsv.add_numbers(1, 2) != 2
def test_with_error():
with pytest.raises(ValueError):
# Do something that raises a ValueError
raise ValueError
# Fixture example
@pytest.fixture
def an_object():
return {}
def test_deepsv(an_object):
assert an_object == {}
def side_effect_function(mock):
print('This part of the code runs when patched')
return 'Some text that I want to test with'
def test_word_count_of_book_base():
book = 'https://www.gutenberg.org/files/59560/59560-0.txt'
wc = deepsv.word_count(book)
assert wc == 30577
@patch('deepsv.deepsv.download_text', side_effect=side_effect_function)
def test_word_count_of_book(mock):
# book = 'https://www.gutenberg.org/files/59560/59560-0.txt'
wc = deepsv.word_count(mock.text)
assert wc == 8
def test_count_single_base():
sequence = 'TTAGGACCA'
assert deepsv.count_single_base('A', sequence) == 3
assert deepsv.count_single_base('C', sequence) == 2
assert deepsv.count_single_base('G', sequence) == 2
assert deepsv.count_single_base('T', sequence) == 2
def side_effect_get_sequence():
return 'GTACGTCAG'
@patch('deepsv.deepsv.get_sequence', return_value='GTACGTCAG')
def test_count_bases(sequence):
seq_dict = {'A': 2, 'C': 2, 'G': 3, 'T': 2}
assert deepsv.count_bases(sequence) == seq_dict
| 22.287671 | 71 | 0.695144 | 238 | 1,627 | 4.546218 | 0.357143 | 0.051756 | 0.069316 | 0.085028 | 0.356747 | 0.218115 | 0.177449 | 0.110906 | 0.110906 | 0.110906 | 0 | 0.032138 | 0.177628 | 1,627 | 72 | 72 | 22.597222 | 0.776532 | 0.095267 | 0 | 0 | 0 | 0 | 0.146648 | 0.037011 | 0 | 0 | 0 | 0 | 0.282051 | 1 | 0.282051 | false | 0 | 0.076923 | 0.051282 | 0.435897 | 0.025641 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1c5cd63de747901926f8ddd0a4d149ca05999677 | 2,575 | py | Python | python-framework/handlers/base/auth.py | huangxingx/python-framework | a62618b0ee5ecff9de426327892cdd690d10510d | [
"MIT"
] | 7 | 2019-10-24T03:26:22.000Z | 2019-10-27T14:55:07.000Z | python-framework/handlers/base/auth.py | PJoemu/python-framework | a62618b0ee5ecff9de426327892cdd690d10510d | [
"MIT"
] | 3 | 2021-06-08T19:13:10.000Z | 2022-01-13T00:38:48.000Z | python-framework/handlers/base/auth.py | PJoemu/python-framework | a62618b0ee5ecff9de426327892cdd690d10510d | [
"MIT"
] | 2 | 2019-10-25T03:54:51.000Z | 2020-06-28T08:50:12.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
# @author: x.huang
# @date:17-8-4
import logging
from pony.orm import db_session
from handlers.base.base import BaseRequestHandler
class LoginRequireError(Exception):
pass
class AuthBaseHandler(BaseRequestHandler):
""" 登录验证的基类 """
def prepare(self):
if not self.current_user and self.request.method.lower() != 'options':
self.render_error('Auth Error.', status_code=401)
super(AuthBaseHandler, self).prepare()
class Authentication(object):
def __init__(self, handler):
self.handler = handler
def admin_auth(self, username, password):
try:
with db_session:
user_obj = self.handler.m_useradmin.get(username=username, is_delete=False)
if user_obj:
is_auth = user_obj.check_password(password)
if is_auth:
user_dict = user_obj.to_dict(exclude=self.handler.m_useradmin.password.column)
user_dict['permission'] = user_obj.role_id.permission if user_obj.role_id else None
return user_dict
else:
return None
except Exception as e:
logging.error(str(e))
return None
def api_auth(self, phone, password, sc_auth=False):
try:
with db_session:
user_obj = self.handler.m_appuser.get(phone=phone, is_delete=False)
if user_obj:
is_auth = False
if password:
is_auth = user_obj.check_password(password)
if sc_auth or is_auth:
user_dict = user_obj.to_dict()
return user_dict
else:
return None
except Exception as e:
logging.error(str(e))
return None
def web_auth(self, username, password):
try:
with db_session:
user_obj = self.handler.m_comuser.get(com_username=username, is_delete=False)
if user_obj:
is_auth = False
if password:
is_auth = user_obj.check_password(password)
if is_auth:
user_dict = user_obj.to_dict()
return user_dict
else:
return None
except Exception as e:
logging.error(str(e))
return None
| 32.1875 | 107 | 0.533204 | 282 | 2,575 | 4.652482 | 0.308511 | 0.074695 | 0.045732 | 0.036585 | 0.536585 | 0.536585 | 0.536585 | 0.536585 | 0.536585 | 0.509909 | 0 | 0.005109 | 0.391845 | 2,575 | 79 | 108 | 32.594937 | 0.832695 | 0.031456 | 0 | 0.633333 | 0 | 0 | 0.011272 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0.166667 | 0.05 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
1c5e34faccefb41600dc36e2445e46683f4cb6c1 | 5,213 | py | Python | tests/test_command.py | paulfurley/Mailpile | f89611d916e41e74dd00997327a2c2d042a96399 | [
"Apache-2.0"
] | 1 | 2017-04-19T11:10:05.000Z | 2017-04-19T11:10:05.000Z | tests/test_command.py | paulfurley/Mailpile | f89611d916e41e74dd00997327a2c2d042a96399 | [
"Apache-2.0"
] | null | null | null | tests/test_command.py | paulfurley/Mailpile | f89611d916e41e74dd00997327a2c2d042a96399 | [
"Apache-2.0"
] | null | null | null | import unittest
import mailpile
from mock import patch
from mailpile.commands import Action as action
from tests import MailPileUnittest
class TestCommands(MailPileUnittest):
def test_index(self):
res = self.mp.rescan()
self.assertEqual(res.as_dict()["status"], 'success')
def test_search(self):
# A random search must return results in less than 0.2 seconds.
res = self.mp.search("foo")
self.assertLess(float(res.as_dict()["elapsed"]), 0.2)
def test_optimize(self):
res = self.mp.optimize()
self.assertEqual(res.as_dict()["result"], True)
def test_set(self):
self.mp.set("prefs.num_results=1")
results = self.mp.search("twitter")
self.assertEqual(results.result['stats']['count'], 1)
def test_unset(self):
self.mp.unset("prefs.num_results")
results = self.mp.search("twitter")
self.assertEqual(results.result['stats']['count'], 3)
def test_add(self):
res = self.mp.add("tests")
self.assertEqual(res.as_dict()["result"], True)
def test_add_mailbox_already_in_pile(self):
res = self.mp.add("tests")
self.assertEqual(res.as_dict()["result"], True)
def test_add_mailbox_no_such_directory(self):
res = self.mp.add("wut?")
self.assertEqual(res.as_dict()["result"], False)
def test_output(self):
res = self.mp.output("json")
self.assertEqual(res.as_dict()["result"], {'output': 'json'})
def test_help(self):
res = self.mp.help()
self.assertEqual(len(res.result), 3)
def test_help_variables(self):
res = self.mp.help_variables()
self.assertGreater(len(res.result['variables']), 1)
def test_help_with_param_search(self):
res = self.mp.help('search')
self.assertEqual(res.result['pre'], 'Search your mail!')
def test_help_urlmap_as_text(self):
res = self.mp.help_urlmap()
self.assertEqual(len(res.result), 1)
self.assertGreater(res.as_text(), 0)
def test_crypto_policy_auto_set_all_action(self):
res = self.mp.crypto_policy_auto_set_all()
self.assertEqual(res.as_dict()["message"], u'Discovered crypto policy')
self.assertEqual(set(), res.as_dict()['result'])
def test_crypto_policy_action(self):
res = self.mp.crypto_policy("foobar")
self.assertEqual(res.as_dict()["message"], u'Crypto policy for foobar is none')
self.assertEqual(res.as_dict()["result"], 'none')
class TestCommandResult(MailPileUnittest):
def test_command_result_as_dict(self):
res = self.mp.help_splash()
self.assertGreater(len(res.as_dict()), 0)
def test_command_result_as_text(self):
res = self.mp.help_splash()
self.assertGreater(res.as_text(), 0)
def test_command_result_as_text_for_boolean_result(self):
res = self.mp.rescan()
        self.assertEqual(res.result['messages'], 0)
        self.assertEqual(res.result['mailboxes'], 0)
        self.assertEqual(res.result['vcards'], 0)
def test_command_result_non_zero(self):
res = self.mp.help_splash()
self.assertTrue(res)
def test_command_result_as_json(self):
res = self.mp.help_splash()
self.assertGreater(res.as_json(), 0)
def test_command_result_as_html(self):
res = self.mp.help_splash()
self.assertGreater(res.as_html(), 0)
class TestTagging(MailPileUnittest):
def test_addtag(self):
pass
class TestGPG(MailPileUnittest):
def test_key_search(self):
gpg_result = {
"D13C70DA": {
"uids": [
{
"email": "smari@mailpile.is"
}
]
}
}
with patch('mailpile.commands.GnuPG') as gpg_mock:
gpg_mock.return_value.search_key.return_value = gpg_result
res = action(self.mp._session, "crypto/gpg/searchkey", "D13C70DA")
email = res.result["D13C70DA"]["uids"][0]["email"]
self.assertEqual(email, "smari@mailpile.is")
gpg_mock.return_value.search_key.assert_called_with("D13C70DA")
def test_key_receive(self):
gpg_result = {
"updated": [
{
"fingerprint": "08A650B8E2CBC1B02297915DC65626EED13C70DA"
}
]
}
with patch('mailpile.commands.GnuPG') as gpg_mock:
gpg_mock.return_value.recv_key.return_value = gpg_result
res = action(self.mp._session, "crypto/gpg/receivekey", "D13C70DA")
self.assertEqual(res.result[0]["updated"][0]["fingerprint"],
"08A650B8E2CBC1B02297915DC65626EED13C70DA")
gpg_mock.return_value.recv_key.assert_called_with("D13C70DA")
def test_key_import(self):
res = action(self.mp._session, "crypto/gpg/importkey",
'testing/pub.key')
self.assertEqual(res.result["results"]["count"], 1)
def test_nicknym_get_key(self):
pass
def test_nicknym_refresh_key(self):
pass
if __name__ == '__main__':
unittest.main()
| 32.378882 | 87 | 0.619029 | 636 | 5,213 | 4.850629 | 0.201258 | 0.061264 | 0.05543 | 0.075851 | 0.514749 | 0.420746 | 0.351053 | 0.269044 | 0.21718 | 0.20389 | 0 | 0.02454 | 0.249568 | 5,213 | 160 | 88 | 32.58125 | 0.764059 | 0.011702 | 0 | 0.190083 | 0 | 0 | 0.127961 | 0.028544 | 0 | 0 | 0 | 0 | 0.256198 | 1 | 0.223141 | false | 0.024793 | 0.057851 | 0 | 0.31405 | 0.016529 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1c6036ce4a4bea03f2bf60037b8ba69bf71a83e1 | 713 | py | Python | tests/backends/test_cookie.py | euri10/starsessions | 6bd258a0f94d30b6ec4a8da41910f97c5dabbe54 | [
"MIT"
] | 31 | 2021-07-15T13:00:06.000Z | 2022-03-17T08:25:52.000Z | tests/backends/test_cookie.py | euri10/starsessions | 6bd258a0f94d30b6ec4a8da41910f97c5dabbe54 | [
"MIT"
] | 6 | 2021-09-01T15:25:20.000Z | 2022-03-13T07:29:19.000Z | tests/backends/test_cookie.py | euri10/starsessions | 6bd258a0f94d30b6ec4a8da41910f97c5dabbe54 | [
"MIT"
] | 5 | 2021-08-19T04:46:35.000Z | 2022-03-09T15:27:22.000Z | import pytest
from starsessions import SessionBackend
@pytest.mark.asyncio
async def test_cookie_read_write(cookie: SessionBackend, session_payload: dict) -> None:
new_id = await cookie.write(session_payload, "session_id")
assert await cookie.read(new_id) == session_payload
@pytest.mark.asyncio
async def test_cookie_remove(cookie: SessionBackend) -> None:
await cookie.remove("session_id")
@pytest.mark.asyncio
async def test_cookie_exists(cookie: SessionBackend) -> None:
assert await cookie.exists("session_id") is False
@pytest.mark.asyncio
async def test_cookie_generate_id(cookie: SessionBackend) -> None:
new_id = await cookie.generate_id()
assert isinstance(new_id, str)
| 27.423077 | 88 | 0.775596 | 96 | 713 | 5.541667 | 0.291667 | 0.103383 | 0.12782 | 0.165414 | 0.338346 | 0.263158 | 0.263158 | 0 | 0 | 0 | 0 | 0 | 0.130435 | 713 | 25 | 89 | 28.52 | 0.858065 | 0 | 0 | 0.25 | 0 | 0 | 0.042076 | 0 | 0 | 0 | 0 | 0 | 0.1875 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1c60d6b7074a5670b3d1308323fd21a043a33869 | 4,888 | py | Python | sqlalchemy_dremio/db.py | thbeh/sqlalchemy_dremio | 180169a86200977a8087d39afe67d3594bd66523 | [
"Apache-2.0"
] | 14 | 2020-04-19T16:14:37.000Z | 2021-11-14T01:45:51.000Z | sqlalchemy_dremio/db.py | thbeh/sqlalchemy_dremio | 180169a86200977a8087d39afe67d3594bd66523 | [
"Apache-2.0"
] | 13 | 2020-04-18T14:44:49.000Z | 2022-03-14T13:45:22.000Z | sqlalchemy_dremio/db.py | thbeh/sqlalchemy_dremio | 180169a86200977a8087d39afe67d3594bd66523 | [
"Apache-2.0"
] | 6 | 2020-04-29T10:18:59.000Z | 2021-08-19T13:46:30.000Z | from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
import logging
from pyarrow import flight
from sqlalchemy_dremio.exceptions import Error, NotSupportedError
from sqlalchemy_dremio.flight_auth import HttpDremioClientAuthHandler
from sqlalchemy_dremio.query import execute
logger = logging.getLogger(__name__)
paramstyle = 'qmark'
def connect(c):
return Connection(c)
def check_closed(f):
"""Decorator that checks if connection/cursor is closed."""
def g(self, *args, **kwargs):
if self.closed:
raise Error(
'{klass} already closed'.format(klass=self.__class__.__name__))
return f(self, *args, **kwargs)
return g
def check_result(f):
"""Decorator that checks if the cursor has results from `execute`."""
def d(self, *args, **kwargs):
if self._results is None:
raise Error('Called before `execute`')
return f(self, *args, **kwargs)
return d
class Connection(object):
def __init__(self, connection_string):
# TODO: Find a better way to extend to addition flight parameters
splits = connection_string.split(";")
client = flight.FlightClient('grpc+tcp://{0}:{1}'.format(splits[2].split("=")[1], splits[3].split("=")[1]))
client.authenticate(HttpDremioClientAuthHandler(splits[0].split("=")[1], splits[1].split("=")[1]))
self.flightclient = client
self.closed = False
self.cursors = []
@check_closed
def rollback(self):
pass
@check_closed
def close(self):
"""Close the connection now."""
self.closed = True
for cursor in self.cursors:
try:
cursor.close()
except Error:
pass # already closed
@check_closed
def commit(self):
pass
@check_closed
def cursor(self):
"""Return a new Cursor Object using the connection."""
cursor = Cursor(self.flightclient)
self.cursors.append(cursor)
return cursor
@check_closed
def execute(self, query):
cursor = self.cursor()
return cursor.execute(query)
def __enter__(self):
return self
def __exit__(self, *exc):
self.commit() # no-op
self.close()
class Cursor(object):
"""Connection cursor."""
def __init__(self, flightclient=None):
self.flightclient = flightclient
# This read/write attribute specifies the number of rows to fetch at a
# time with .fetchmany(). It defaults to 1 meaning to fetch a single
# row at a time.
self.arraysize = 1
self.closed = False
# this is updated only after a query
self.description = None
# this is set to a list of rows after a successful query
self._results = None
@property
@check_result
@check_closed
def rowcount(self):
return len(self._results)
@check_closed
def close(self):
"""Close the cursor."""
self.closed = True
@check_closed
def execute(self, query, params=None):
self.description = None
self._results, self.description = execute(
query, self.flightclient)
return self
@check_closed
def executemany(self, query):
raise NotSupportedError(
'`executemany` is not supported, use `execute` instead')
@check_result
@check_closed
def fetchone(self):
"""
Fetch the next row of a query result set, returning a single sequence,
or `None` when no more data is available.
"""
try:
return self._results.pop(0)
except IndexError:
return None
@check_result
@check_closed
def fetchmany(self, size=None):
"""
Fetch the next set of rows of a query result, returning a sequence of
sequences (e.g. a list of tuples). An empty sequence is returned when
no more rows are available.
"""
size = size or self.arraysize
out = self._results[:size]
self._results = self._results[size:]
return out
@check_result
@check_closed
def fetchall(self):
"""
Fetch all (remaining) rows of a query result, returning them as a
sequence of sequences (e.g. a list of tuples). Note that the cursor's
arraysize attribute can affect the performance of this operation.
"""
out = self._results[:]
self._results = []
return out
@check_closed
def setinputsizes(self, sizes):
# not supported
pass
@check_closed
    def setoutputsize(self, size, column=None):
# not supported
pass
@check_closed
def __iter__(self):
return iter(self._results)
| 25.591623 | 115 | 0.618658 | 579 | 4,888 | 5.069085 | 0.29361 | 0.059966 | 0.07155 | 0.024532 | 0.201704 | 0.12879 | 0.07155 | 0.050426 | 0.02385 | 0.02385 | 0 | 0.003743 | 0.289484 | 4,888 | 190 | 116 | 25.726316 | 0.841348 | 0.21829 | 0 | 0.344828 | 0 | 0 | 0.034314 | 0 | 0 | 0 | 0 | 0.005263 | 0 | 1 | 0.206897 | false | 0.043103 | 0.077586 | 0.034483 | 0.431034 | 0.008621 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1c61e6c641ff5d3b13cd3eb58254039918bc75f6 | 2,081 | py | Python | docker-images/rasa2/snips_services/tts_server.py | sanyaade-machine-learning/opensnips_original | 3c7d4aa2ef7dec7b0b8c532a537b79c3ef9df7cc | [
"MIT"
] | 57 | 2017-12-28T22:50:20.000Z | 2022-01-25T16:05:36.000Z | docker-images/rasa2/snips_services/tts_server.py | sanyaade-machine-learning/opensnips_original | 3c7d4aa2ef7dec7b0b8c532a537b79c3ef9df7cc | [
"MIT"
] | 28 | 2018-04-18T06:45:20.000Z | 2022-03-08T22:50:50.000Z | docker-images/rasa2/snips_services/tts_server.py | sanyaade-machine-learning/opensnips_original | 3c7d4aa2ef7dec7b0b8c532a537b79c3ef9df7cc | [
"MIT"
] | 18 | 2017-12-27T01:57:14.000Z | 2021-03-02T14:13:06.000Z | #!/opt/rasa/anaconda/bin/python
# -*-: coding utf-8 -*-
""" Snips core and nlu server. """
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
import json
import time
import os
from socket import error as socket_error
from SnipsMqttServer import SnipsMqttServer
import paho.mqtt.client as mqtt
from thread_handler import ThreadHandler
import sys,warnings
# apt-get install sox libsox-fmt-all
import sox
class SnipsTTSServer(SnipsMqttServer):
def __init__(self,
mqtt_hostname='mosquitto',
mqtt_port=1883,
):
SnipsMqttServer.__init__(self,mqtt_hostname,mqtt_port)
self.subscribe_to='hermes/tts/say'
def on_message(self, client, userdata, msg):
#print("MESSAGEtts: {}".format(msg.topic))
if msg.topic is not None and msg.topic=="hermes/tts/say":
print("MESSAGE OK: {}".format(msg.topic))
payload = json.loads(msg.payload)
# .decode('utf-8')
sessionId = payload.get('sessionId')
siteId = payload.get('siteId','default')
lang = payload.get('lang','en-GB')
theId = sessionId
fileName = '/tmp/speaking.wav'
os.system('/usr/bin/pico2wave -w=' + fileName + ' "{}" '.format(payload.get('text')))
#pubCommand = "mosquitto_pub -h " +self.mqtt_hostname+" -t 'hermes/audioServer/default/playBytes/0049a91e-8449-4398-9752-07c858234' -f '" + fileName + "'"
#print(pubCommand)
#os.system(pubCommand)
            fp = open(fileName, 'rb')
            f = fp.read()
            fp.close()
topic = 'hermes/audioServer/{}/playBytes'.format(siteId)
if theId is not None:
topic = topic + '/{}'.format(theId[::-1])
self.client.publish(topic, payload=bytes(f),qos=0)
#print("PUBLISHED on " + topic)
os.remove(fileName)
server = SnipsTTSServer()
server.start()
| 29.728571 | 166 | 0.605478 | 233 | 2,081 | 5.248927 | 0.480687 | 0.032706 | 0.05233 | 0.032706 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023087 | 0.271504 | 2,081 | 69 | 167 | 30.15942 | 0.783641 | 0.189332 | 0 | 0 | 0 | 0 | 0.099099 | 0.018619 | 0 | 0 | 0 | 0 | 0 | 1 | 0.051282 | false | 0 | 0.333333 | 0 | 0.410256 | 0.051282 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
1c625b305422a96fe496b35f015f87dde84dd1cd | 462 | py | Python | gtd/migrations/0018_context_color.py | jimbofreedman/naggingnelly-api | 510d801791dcce39560bac227c12e5f6d9e80dcc | [
"BSD-3-Clause"
] | null | null | null | gtd/migrations/0018_context_color.py | jimbofreedman/naggingnelly-api | 510d801791dcce39560bac227c12e5f6d9e80dcc | [
"BSD-3-Clause"
] | null | null | null | gtd/migrations/0018_context_color.py | jimbofreedman/naggingnelly-api | 510d801791dcce39560bac227c12e5f6d9e80dcc | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.11.9 on 2018-08-02 17:53
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('gtd', '0017_auto_20180108_1508'),
]
operations = [
migrations.AddField(
model_name='context',
name='color',
field=models.CharField(default='ffffff', max_length=6),
),
]
| 22 | 67 | 0.614719 | 52 | 462 | 5.269231 | 0.846154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.099707 | 0.261905 | 462 | 20 | 68 | 23.1 | 0.703812 | 0.147186 | 0 | 0 | 1 | 0 | 0.112532 | 0.058824 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.153846 | 0 | 0.384615 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1c63f92718131c9edb3951c411929fc66600dca1 | 607 | py | Python | cogs/TieThePie.py | Engineer152/Engineer-Bot | 9654666776d5ba91b1c8afdb32c86a7aedad7143 | [
"MIT"
] | null | null | null | cogs/TieThePie.py | Engineer152/Engineer-Bot | 9654666776d5ba91b1c8afdb32c86a7aedad7143 | [
"MIT"
] | null | null | null | cogs/TieThePie.py | Engineer152/Engineer-Bot | 9654666776d5ba91b1c8afdb32c86a7aedad7143 | [
"MIT"
] | null | null | null | import discord
from discord.ext import commands
client = commands.Bot(command_prefix='your prefix',owner_ids = {your user id},case_insensitive=True )
class TieThePie(commands.Cog):
def __init__(self,client):
self.client=client
@commands.command()
async def tiethepie(self,ctx):
embed=discord.Embed(title="**Tie The Pie**",color=0x46e2ec,description='Subscribe to Dude Perfect to see the reveal of Panda\n**[Details](https://youtu.be/bFUZ5gruc0E)**ㅤㅤㅤㅤ**[Subscribe](http://bit.ly/SubDudePerfect)**')
await ctx.send(embed=embed)
def setup(client):
client.add_cog(TieThePie(client)) | 33.722222 | 224 | 0.742998 | 85 | 607 | 5.211765 | 0.658824 | 0.063205 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011132 | 0.112026 | 607 | 18 | 225 | 33.722222 | 0.810761 | 0 | 0 | 0 | 0 | 0.083333 | 0.282895 | 0 | 0 | 0 | 0.013158 | 0 | 0 | 0 | null | null | 0 | 0.166667 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1c6537848455d77ed4e22e5c61b4d2a5153fa5e0 | 3,359 | py | Python | python/lsst/eotest/simulation/generate_Fe55_images.py | tguillemLSST/eotest | c6f150984fa5dff85b9805028645bf46fc846f11 | [
"BSD-3-Clause-LBNL"
] | 3 | 2016-04-21T07:05:45.000Z | 2020-08-05T08:37:37.000Z | python/lsst/eotest/simulation/generate_Fe55_images.py | tguillemLSST/eotest | c6f150984fa5dff85b9805028645bf46fc846f11 | [
"BSD-3-Clause-LBNL"
] | 70 | 2015-03-26T09:48:53.000Z | 2020-04-22T16:29:43.000Z | python/lsst/eotest/simulation/generate_Fe55_images.py | tguillemLSST/eotest | c6f150984fa5dff85b9805028645bf46fc846f11 | [
"BSD-3-Clause-LBNL"
] | 5 | 2017-08-15T20:52:44.000Z | 2022-03-25T12:54:07.000Z | """
@brief Generate Fe55 images and associated darks and bias images
according to section 5.4 of the E/O document (Dec 19, 2012 version).
@author J. Chiang <jchiang@slac.stanford.edu>
"""
import os
import numpy as np
from sim_inputs import *
from sim_tools import *
def generate_Fe55_images(exptimes, nxrays, outdir, sensorid, gain=gain,
bias_level=bias_level, sys_noise=sys_noise,
dark_current=dark_current):
nexp = len(exptimes)
for i, exptime, nxray in zip(list(range(nexp)), exptimes, nxrays):
#
# Bias images
#
outfile = "Fe55_bias_%s_%02i.fits" % (sensorid, i)
bias_file = os.path.join(outdir, outfile)
bias_segs = []
for hdu in range(nhdu):
seg = SegmentExposure(exptime=0, gain=gain)
seg.add_bias(level=bias_level, sigma=sys_noise) # electronics
seg.add_bias(level=0, sigma=read_noise) # read noise
bias_segs.append(seg)
bias_output = fitsFile(bias_segs)
bias_output[0].header['GAIN'] = gain
bias_output[0].header['BIASLVL'] = bias_level
bias_output[0].header['SYSNOISE'] = sys_noise
bias_output[0].header['RDNOISE'] = read_noise
bias_output.writeto(bias_file, overwrite=True)
#
# Dark images
#
outfile = "Fe55_dark_%s_%02i.fits" % (sensorid, i)
dark_file = os.path.join(outdir, outfile)
dark_segs = []
for hdu in range(nhdu):
seg = SegmentExposure(exptime=exptime, gain=gain)
seg.add_bias(level=bias_level, sigma=sys_noise) # electronics
seg.add_bias(level=0, sigma=read_noise) # read noise
seg.add_dark_current(level=dark_current) # dark current
dark_segs.append(seg)
dark_output = fitsFile(dark_segs)
dark_output[0].header['GAIN'] = gain
dark_output[0].header['BIASLVL'] = bias_level
dark_output[0].header['SYSNOISE'] = sys_noise
dark_output[0].header['RDNOISE'] = read_noise
dark_output[0].header['DARKCURR'] = dark_current
dark_output.writeto(dark_file, overwrite=True)
#
# Fe55 exposures
#
outfile = "Fe55_exp_%s_%02i.fits" % (sensorid, i)
Fe55_file = os.path.join(outdir, outfile)
fe55_segs = []
for hdu in range(nhdu):
seg = SegmentExposure(exptime=exptime, gain=gain)
seg.add_bias(level=bias_level, sigma=sys_noise) # electronics
seg.add_bias(level=0, sigma=read_noise) # read noise
seg.add_dark_current(level=dark_current) # dark current
seg.add_Fe55_hits(nxrays=nxray)
fe55_segs.append(seg)
fe55_output = fitsFile(fe55_segs)
fe55_output[0].header['GAIN'] = gain
fe55_output[0].header['BIASLVL'] = bias_level
fe55_output[0].header['SYSNOISE'] = sys_noise
fe55_output[0].header['RDNOISE'] = read_noise
fe55_output[0].header['DARKCURR'] = dark_current
fe55_output[0].header['FE55HITS'] = nxray
fe55_output.writeto(Fe55_file, overwrite=True)
if __name__ == '__main__':
nexp = 10
exptimes = np.linspace(1, 5, nexp)
nxrays = [int(x*1000) for x in exptimes]
generate_Fe55_images(exptimes, nxrays, '.', 'xxx-xx')
| 39.988095 | 74 | 0.61953 | 430 | 3,359 | 4.609302 | 0.237209 | 0.052977 | 0.098385 | 0.045409 | 0.567608 | 0.469728 | 0.264884 | 0.264884 | 0.264884 | 0.241675 | 0 | 0.034525 | 0.267044 | 3,359 | 83 | 75 | 40.46988 | 0.770512 | 0.093778 | 0 | 0.209677 | 1 | 0 | 0.060265 | 0.021523 | 0 | 0 | 0 | 0 | 0 | 1 | 0.016129 | false | 0 | 0.064516 | 0 | 0.080645 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1c67babe06797acaab8d0e9b738376ce3cb3ee88 | 376 | py | Python | lessons/day_05/python/app.py | jiaguilera/a-walk-in-graphql | ed4f44b4f4bf283cc7342141eb8127a2745ea2d7 | [
"MIT"
] | 16 | 2020-06-16T17:12:16.000Z | 2021-12-03T14:19:38.000Z | lessons/day_05/python/app.py | martinarnesi/a-walk-in-graphql | 56cd949cbeb4c4322882bd15398a867b16900ccd | [
"MIT"
] | 8 | 2020-06-11T21:53:03.000Z | 2020-07-26T01:47:10.000Z | lessons/day_05/python/app.py | martinarnesi/a-walk-in-graphql | 56cd949cbeb4c4322882bd15398a867b16900ccd | [
"MIT"
] | 9 | 2020-06-15T13:09:57.000Z | 2022-03-06T14:49:17.000Z | from ariadne import make_executable_schema, load_schema_from_path
from ariadne.asgi import GraphQL
from resolvers import query, skill, person, eye_color, mutation
# import schema from GraphQL file
type_defs = load_schema_from_path("./schema.gql")
schema = make_executable_schema(
type_defs, query, skill, person, eye_color, mutation
)
app = GraphQL(schema, debug=True)
| 28.923077 | 65 | 0.800532 | 54 | 376 | 5.314815 | 0.444444 | 0.10453 | 0.139373 | 0.125436 | 0.222997 | 0.222997 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 376 | 12 | 66 | 31.333333 | 0.87234 | 0.082447 | 0 | 0 | 0 | 0 | 0.034985 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.375 | 0 | 0.375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
1c6bbe01f2a25c56bbd4e7b84c94d14c49d0cee9 | 1,127 | py | Python | src/__main__.py | andreaswatch/piTomation | 140bff77ad0b84ad17898106c7be7dc48a2d0783 | [
"MIT"
] | null | null | null | src/__main__.py | andreaswatch/piTomation | 140bff77ad0b84ad17898106c7be7dc48a2d0783 | [
"MIT"
] | null | null | null | src/__main__.py | andreaswatch/piTomation | 140bff77ad0b84ad17898106c7be7dc48a2d0783 | [
"MIT"
] | null | null | null | import importlib
import time
from pathlib import Path
import os
import sys
def import_plugins():
#find actual path
realpath = os.path.realpath(__file__)
dirname = os.path.dirname(realpath)
#add modules & plugins
plugin_path = os.path.join(dirname, "plugins")
for dir_path in Path(plugin_path).rglob('*.py'):
dp = str(dir_path)
if dp.lower().endswith("__init__.py"):
continue
path = dp[len(dirname)+1:-3].replace(os.sep,".")
if len(path.split('.')) < 4:
'''only import the top level plugin directory, so that potential submodules are
only imported if they are imported by the plugins.'''
print(" > " + path)
importlib.import_module(path)
print("Import plugins ..")
import_plugins()
print("Import app ..")
import modules.app.App as piTomation
app: piTomation.App
print("Start app ..")
app = piTomation.App()
#try:
# app = piTomation.App()
#except Exception as ex:
# print(ex)
# exit()
try:
while not app.is_disposed:
time.sleep(1)
except Exception as ex:
print(ex)
| 21.673077 | 92 | 0.624667 | 148 | 1,127 | 4.648649 | 0.452703 | 0.075581 | 0.069767 | 0.055233 | 0.075581 | 0.075581 | 0 | 0 | 0 | 0 | 0 | 0.004728 | 0.249335 | 1,127 | 51 | 93 | 22.098039 | 0.808511 | 0.100266 | 0 | 0 | 0 | 0 | 0.080233 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.034483 | false | 0 | 0.37931 | 0 | 0.413793 | 0.172414 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
1c7b1f4e4b7bfbf72b788463867c6a1ec1a46c6d | 900 | py | Python | testing/python/telBuggyScript2.py | sys-bio/rrplugins | 03af6ea70d73462ad88103f1e446dc0c5f3f971c | [
"Apache-2.0"
] | null | null | null | testing/python/telBuggyScript2.py | sys-bio/rrplugins | 03af6ea70d73462ad88103f1e446dc0c5f3f971c | [
"Apache-2.0"
] | 8 | 2015-12-02T18:20:43.000Z | 2021-08-20T17:13:34.000Z | testing/python/telBuggyScript2.py | sys-bio/telPlugins | 03af6ea70d73462ad88103f1e446dc0c5f3f971c | [
"Apache-2.0"
] | 3 | 2015-01-27T18:53:45.000Z | 2015-07-13T17:07:50.000Z | import roadrunner
import teplugins as tel
i = 0
#for i in range(100):
try:
noisePlugin = tel.Plugin ("tel_add_noise")
    print(noisePlugin.listOfProperties())
# Create a roadrunner instance
rr = roadrunner.RoadRunner()
rr.load("sbml_test_0001.xml")
# Generate data
data = rr.simulate(0, 10, 511) # Want 512 points
# Get the dataseries from roadrunner
d = tel.getDataSeries (data)
# Assign the dataseries to the plugin inputdata
noisePlugin.InputData = d
# Set parameter for the 'size' of the noise
noisePlugin.Sigma = 3.e-6
# Add the noise
noisePlugin.execute()
# Get the data to plot
noisePlugin.InputData.plot()
# tel.show()
d.writeDataSeries ("testData2.dat")
d.readDataSeries ("testData2.dat")
    print("done")
    print(i)
except Exception as e:
    print('Problem: ' + repr(e))
| 20 | 52 | 0.637778 | 113 | 900 | 5.044248 | 0.557522 | 0.021053 | 0.066667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.031963 | 0.27 | 900 | 45 | 53 | 20 | 0.835616 | 0.278889 | 0 | 0 | 0 | 0 | 0.109375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.1 | null | null | 0.2 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1c7ea7eccdeaa85272171df846b591a0afd65d34 | 9,843 | py | Python | francoralite/apps/francoralite_front/tools.py | Francoralite/francoralite | f8c5eeffe6d395c7e4222a9f5a4a7a01841b503c | [
"BSD-3-Clause"
] | 2 | 2021-07-26T08:29:26.000Z | 2021-07-26T08:29:27.000Z | francoralite/apps/francoralite_front/tools.py | lluc/telemeta-integration | c2fb116471235674eae597abac84a7113e0f7c82 | [
"BSD-3-Clause"
] | 167 | 2018-10-20T14:34:46.000Z | 2021-06-01T10:40:55.000Z | francoralite/apps/francoralite_front/tools.py | Francoralite/francoralite | f8c5eeffe6d395c7e4222a9f5a4a7a01841b503c | [
"BSD-3-Clause"
] | 1 | 2021-06-06T12:16:49.000Z | 2021-06-06T12:16:49.000Z | # -*- coding: utf-8 -*-
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
# Authors: Luc LEGER / Coopérative ARTEFACTS <artefacts.lle@gmail.com>
import requests
from django.conf import settings
from django.contrib import messages
from django.core.exceptions import PermissionDenied
from django.http import HttpResponseRedirect, Http404
from django.utils.translation import gettext as _
from requests.exceptions import RequestException
from rest_framework import status
from francoralite.apps.francoralite_front.errors import APPLICATION_ERRORS
from .views.related import (
write_fond_related,
write_mission_related,
write_collection_related,
write_item_related)
HTTP_ERRORS = {
status.HTTP_400_BAD_REQUEST: APPLICATION_ERRORS['HTTP_API_400'],
status.HTTP_401_UNAUTHORIZED: APPLICATION_ERRORS['HTTP_API_401'],
status.HTTP_403_FORBIDDEN: APPLICATION_ERRORS['HTTP_API_403'],
status.HTTP_404_NOT_FOUND: APPLICATION_ERRORS['HTTP_API_404'],
status.HTTP_409_CONFLICT: APPLICATION_ERRORS['HTTP_API_409'],
}
PROBLEM_NAMES = [
"legal_rights",
"recording_context",
"location_gis",
]
class UserMessageError(RequestException):
    """Request error carrying a message meant to be shown to the user."""
def get_token_header(request):
    """Return an Authorization header dict built from the session's OIDC token."""
auth_token = request.session.get('oidc_access_token')
if auth_token:
return {'Authorization': 'Bearer ' + auth_token}
else:
return {}
def check_status_code(status_code, allowed_codes=(status.HTTP_200_OK,)):
    """Raise an exception appropriate to the given HTTP status code."""
if status_code == status.HTTP_403_FORBIDDEN:
raise PermissionDenied(_('Accès interdit.'))
if status_code == status.HTTP_404_NOT_FOUND:
raise Http404(_('Cette fiche n’existe pas.'))
if status_code == status.HTTP_409_CONFLICT:
raise UserMessageError(_('Une fiche avec ce code existe déjà.'))
if status.HTTP_400_BAD_REQUEST <= status_code < status.HTTP_500_INTERNAL_SERVER_ERROR:
raise RequestException()
if status_code not in allowed_codes:
raise Exception(HTTP_ERRORS[status_code])
def handle_message_from_exception(request, exception):
    """Attach a user-facing error message for the given exception to the request."""
if isinstance(exception, UserMessageError):
messages.add_message(request, messages.ERROR, exception)
elif exception is not None:
messages.add_message(request, messages.ERROR,
_('Une erreur indéterminée est survenue.'))
def request_api(endpoint):
    """GET the given API endpoint and return the decoded JSON payload."""
response = requests.get(settings.FRONT_HOST_URL + endpoint)
check_status_code(response.status_code)
return response.json()
def post(entity, form_entity, request, *args, **kwargs):
    """Validate the form, POST the new entity to the API and redirect."""
form = form_entity(request.POST, request.FILES)
entity_api = entity
entity_url = entity
    # The API endpoint names for these entities drop the underscore
    if entity in PROBLEM_NAMES:
        entity_api = entity.replace('_', '')
    # Processing URL for Fond entity
if entity == 'fond':
entity_url = 'institution/' + kwargs['id_institution'] \
+ '/' + entity
# Processing URL for Mission entity
if entity == 'mission':
entity_url = 'institution/' + kwargs['id_institution'] \
+ '/fond/' + kwargs['id_fond']\
+ '/' + entity
# Processing URL for Collection entity
if entity == 'collection':
entity_url = 'institution/' + kwargs['id_institution'] \
+ '/fond/' + kwargs['id_fond']\
+ '/mission/' + kwargs['id_mission'] \
+ '/' + entity
# Processing URL for Item entity
if entity == 'item':
entity_url = 'institution/' + kwargs['id_institution'] \
+ '/fond/' + kwargs['id_fond']\
+ '/mission/' + kwargs['id_mission'] \
+ '/collection/' + kwargs['id_collection'] \
+ '/' + entity
# Problem with old Telemeta fields/entities
if form.is_valid():
if entity == 'item':
# Concatenate domains
form.cleaned_data['domain'] = ''.join(form.cleaned_data['domain'])
            # Remove the 'file' entry: leaving it in causes bugs downstream
del form.cleaned_data['file']
try:
post_api(settings.FRONT_HOST_URL + '/api/' + entity_api,
data=form.cleaned_data,
request=request,
entity=entity)
if entity == 'fond':
return HttpResponseRedirect(
'/institution/' +
str(form.cleaned_data['institution']))
            # Redirect to the most recent previous page that is not an "add" page
if len(request.session["referers"]) > 1:
try:
for referer in request.session["referers"]:
if 'add' not in referer.split('/'):
return HttpResponseRedirect(referer)
except Exception:
return HttpResponseRedirect('/' + entity)
return HttpResponseRedirect('/' + entity)
except RequestException as e:
handle_message_from_exception(request, e)
return HttpResponseRedirect('/' + entity_url + '/add')
return HttpResponseRedirect('/' + entity_url + '/add')
def post_api(endpoint, data, request, entity):
    """POST entity data to the API and write the related records."""
headers = get_token_header(request=request)
response = requests.post(
endpoint,
data=data,
files=request.FILES,
headers=headers,
)
check_status_code(response.status_code,
allowed_codes=(status.HTTP_200_OK, status.HTTP_201_CREATED))
entity_json = response.json()
if entity == "fond":
write_fond_related(entity_json, request, headers)
if entity == "mission":
write_mission_related(entity_json, request, headers)
if entity == "collection":
write_collection_related(entity_json, request, headers)
if entity == "item":
write_item_related(entity_json, request, headers)
return entity_json
def patch(entity, form_entity, request, *args, **kwargs):
    """Validate the form, PATCH the existing entity through the API and redirect."""
form = form_entity(request.POST)
if entity == 'item':
form.fields['file'].required = False
id = kwargs.get('id')
entity_api = entity
if entity in PROBLEM_NAMES:
entity_api = entity.replace('_', '')
if form.is_valid():
if entity == "collection":
form.cleaned_data['recorded_from_year'] = \
form.data['recorded_from_year']
form.cleaned_data['recorded_to_year'] = \
form.data['recorded_to_year']
if form.cleaned_data['year_published'] is None:
form.cleaned_data['year_published'] = ''
if entity == "item":
# Concatenate domains
form.cleaned_data['domain'] = ''.join(form.cleaned_data['domain'])
try:
response = patch_api(
settings.FRONT_HOST_URL + '/api/' + entity_api + '/' + str(id),
data=form.cleaned_data,
request=request,
entity=entity
)
            if response.status_code != status.HTTP_200_OK:
                return HttpResponseRedirect('/' + entity + '/edit/' + str(id))
            # Redirect to the most recent previous page that is not an "edit" page
if len(request.session["referers"]) > 1:
for referer in request.session["referers"]:
if 'edit' not in referer.split('/'):
return HttpResponseRedirect(referer)
return HttpResponseRedirect('/' + entity)
except RequestException as e:
handle_message_from_exception(request, e)
return HttpResponseRedirect('/' + entity + '/edit/' + str(id))
return HttpResponseRedirect('/' + entity + '/edit/' + str(id))
def patch_api(endpoint, data, request, entity):
    """PATCH entity data to the API and write the related records."""
response = requests.patch(
endpoint,
data=data,
headers=get_token_header(request=request),
)
check_status_code(response.status_code)
entity_json = response.json()
if entity == "fond":
write_fond_related(
entity_json,
request,
headers=get_token_header(request=request),
)
if entity == "mission":
write_mission_related(
entity_json,
request,
headers=get_token_header(request=request),
)
if entity == "collection":
write_collection_related(
entity_json,
request,
headers=get_token_header(request=request),
)
if entity == "item":
write_item_related(
entity_json,
request,
headers=get_token_header(request=request),
)
return response
def delete(entity, request, *args, **kwargs):
    """Delete an entity through the API and redirect to the referring page."""
id = kwargs.get('id')
entity_api = entity
if entity in PROBLEM_NAMES:
entity_api = entity.replace('_', '')
try:
delete_api(
settings.FRONT_HOST_URL + '/api/' + entity_api + '/' + str(id),
request=request,
)
return HttpResponseRedirect(request.META.get('HTTP_REFERER'))
except RequestException as e:
handle_message_from_exception(request, e)
return HttpResponseRedirect('/' + entity)
def delete_api(endpoint, request):
    """Send a DELETE request to the given API endpoint."""
response = requests.delete(
endpoint,
headers=get_token_header(request=request),
)
check_status_code(response.status_code)
return response
| 30.286154 | 90 | 0.607843 | 1,034 | 9,843 | 5.559961 | 0.193424 | 0.027831 | 0.03131 | 0.029222 | 0.573491 | 0.503392 | 0.451557 | 0.402157 | 0.357627 | 0.333623 | 0 | 0.009347 | 0.282637 | 9,843 | 324 | 91 | 30.37963 | 0.804844 | 0.082089 | 0 | 0.46729 | 0 | 0 | 0.093021 | 0 | 0 | 0 | 0 | 0.030864 | 0 | 1 | 0.046729 | false | 0.004673 | 0.046729 | 0 | 0.186916 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1c7f5af1d319f74fdb488cde790b4cffce3502aa | 5,173 | py | Python | python2.7/site-packages/twisted/internet/iocpreactor/client.py | 84KaliPleXon3/sslstrip-hsts-openwrt | f875ded48078a3ed84bffef1e69dcbeaf2e77ae3 | [
"MIT"
] | 4 | 2020-10-31T19:52:05.000Z | 2021-09-22T11:39:27.000Z | python2.7/site-packages/twisted/internet/iocpreactor/client.py | 84KaliPleXon3/sslstrip-hsts-openwrt | f875ded48078a3ed84bffef1e69dcbeaf2e77ae3 | [
"MIT"
] | null | null | null | python2.7/site-packages/twisted/internet/iocpreactor/client.py | 84KaliPleXon3/sslstrip-hsts-openwrt | f875ded48078a3ed84bffef1e69dcbeaf2e77ae3 | [
"MIT"
] | 2 | 2020-02-27T08:28:35.000Z | 2020-09-13T12:39:26.000Z | # Copyright (c) 2001-2004 Twisted Matrix Laboratories.
# See LICENSE for details.
import socket
from twisted.persisted import styles
from twisted.internet.base import BaseConnector
from twisted.internet import defer, interfaces, error
from twisted.python import failure
from abstract import ConnectedSocket
from ops import ConnectExOp
from util import StateEventMachineType
from zope.interface import implements
class ClientSocket(ConnectedSocket):
def __init__(self, sock, protocol, sf):
ConnectedSocket.__init__(self, sock, protocol, sf)
self.repstr = '<%s to %s at %x>' % (self.__class__, self.sf.addr, id(self))
self.logstr = protocol.__class__.__name__+",client"
self.startReading()
class _SubConnector:
state = "connecting"
socket = None
def __init__(self, sf):
self.sf = sf
def startConnecting(self):
d = defer.maybeDeferred(self.sf.resolveAddress)
d.addCallback(self._cbResolveDone)
d.addErrback(self._ebResolveErr)
def _cbResolveDone(self, addr):
if self.state == "dead":
return
try:
skt = socket.socket(*self.sf.sockinfo)
except socket.error, se:
raise error.ConnectBindError(se[0], se[1])
try:
if self.sf.bindAddress is None:
self.sf.bindAddress = ("", 0) # necessary for ConnectEx
skt.bind(self.sf.bindAddress)
except socket.error, se:
raise error.ConnectBindError(se[0], se[1])
self.socket = skt
op = ConnectExOp(self)
op.initiateOp(self.socket, addr)
def _ebResolveErr(self, fail):
if self.state == "dead":
return
self.sf.connectionFailed(fail)
def connectDone(self):
if self.state == "dead":
return
self.sf.connectionSuccess()
def connectErr(self, err):
if self.state == "dead":
return
self.sf.connectionFailed(err)
class SocketConnector(styles.Ephemeral, object):
__metaclass__ = StateEventMachineType
implements(interfaces.IConnector)
transport_class = ClientSocket
events = ["stopConnecting", "disconnect", "connect"]
sockinfo = None
factoryStarted = False
timeoutID = None
def __init__(self, addr, factory, timeout, bindAddress):
from twisted.internet import reactor
self.state = "disconnected"
self.addr = addr
self.factory = factory
self.timeout = timeout
self.bindAddress = bindAddress
self.reactor = reactor
self.prepareAddress()
def handle_connecting_stopConnecting(self):
self.connectionFailed(failure.Failure(error.UserError()))
def handle_disconnected_stopConnecting(self):
raise error.NotConnectingError
handle_connected_stopConnecting = handle_disconnected_stopConnecting
handle_connecting_disconnect = handle_connecting_stopConnecting
def handle_connected_disconnect(self):
self.transport.loseConnection()
def handle_disconnected_disconnect(self):
pass
def handle_connecting_connect(self):
raise RuntimeError, "can't connect in this state"
handle_connected_connect = handle_connecting_connect
def handle_disconnected_connect(self):
self.state = "connecting"
if not self.factoryStarted:
self.factory.doStart()
self.factoryStarted = True
if self.timeout is not None:
self.timeoutID = self.reactor.callLater(self.timeout, self.connectionFailed, failure.Failure(error.TimeoutError()))
self.sub = _SubConnector(self)
self.sub.startConnecting()
self.factory.startedConnecting(self)
def prepareAddress(self):
raise NotImplementedError
def resolveAddress(self):
raise NotImplementedError
def connectionLost(self, reason):
self.state = "disconnected"
self.factory.clientConnectionLost(self, reason)
if self.state == "disconnected":
# factory hasn't called our connect() method
self.factory.doStop()
self.factoryStarted = 0
def connectionFailed(self, reason):
if self.sub.socket:
self.sub.socket.close()
self.sub.state = "dead"
del self.sub
self.state = "disconnected"
self.cancelTimeout()
self.factory.clientConnectionFailed(self, reason)
if self.state == "disconnected":
# factory hasn't called our connect() method
self.factory.doStop()
self.factoryStarted = 0
def cancelTimeout(self):
if self.timeoutID:
try:
self.timeoutID.cancel()
except ValueError:
pass
del self.timeoutID
def connectionSuccess(self):
socket = self.sub.socket
self.sub.state = "dead"
del self.sub
self.state = "connected"
self.cancelTimeout()
p = self.factory.buildProtocol(self.buildAddress(socket.getpeername()))
self.transport = self.transport_class(socket, p, self)
p.makeConnection(self.transport)
| 30.429412 | 127 | 0.64972 | 532 | 5,173 | 6.197368 | 0.272556 | 0.020018 | 0.020018 | 0.018198 | 0.204125 | 0.150743 | 0.150743 | 0.142554 | 0.11647 | 0.095238 | 0 | 0.003925 | 0.261164 | 5,173 | 169 | 128 | 30.609467 | 0.858713 | 0.036149 | 0 | 0.263566 | 0 | 0 | 0.038964 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.015504 | 0.077519 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1c7f78b673d9e154cc86707fcd75f178c99f6089 | 2,678 | py | Python | pypika/tests/dialects/test_mssql.py | uhrm/pypika | b390aa33c980704555d75d27ade5bfa4d1d4bae7 | [
"Apache-2.0"
] | null | null | null | pypika/tests/dialects/test_mssql.py | uhrm/pypika | b390aa33c980704555d75d27ade5bfa4d1d4bae7 | [
"Apache-2.0"
] | null | null | null | pypika/tests/dialects/test_mssql.py | uhrm/pypika | b390aa33c980704555d75d27ade5bfa4d1d4bae7 | [
"Apache-2.0"
] | null | null | null | import unittest
from pypika import Table
from pypika.analytics import Count
from pypika.dialects import MSSQLQuery
from pypika.utils import QueryException
class SelectTests(unittest.TestCase):
def test_normal_select(self):
q = MSSQLQuery.from_("abc").select("def")
self.assertEqual('SELECT "def" FROM "abc"', str(q))
def test_distinct_select(self):
q = MSSQLQuery.from_("abc").select("def").distinct()
self.assertEqual('SELECT DISTINCT "def" FROM "abc"', str(q))
def test_top_distinct_select(self):
q = MSSQLQuery.from_("abc").select("def").top(10).distinct()
self.assertEqual('SELECT DISTINCT TOP (10) "def" FROM "abc"', str(q))
def test_top_select(self):
q = MSSQLQuery.from_("abc").select("def").top(10)
self.assertEqual('SELECT TOP (10) "def" FROM "abc"', str(q))
def test_top_select_non_int(self):
with self.assertRaisesRegex(QueryException, "TOP value must be an integer"):
MSSQLQuery.from_("abc").select("def").top("a")
def test_limit(self):
q = MSSQLQuery.from_("abc").select("def").orderby("def").limit(10)
self.assertEqual('SELECT "def" FROM "abc" ORDER BY "def" OFFSET 0 ROWS FETCH NEXT 10 ROWS ONLY', str(q))
def test_fetch_next(self):
q = MSSQLQuery.from_("abc").select("def").orderby("def").fetch_next(10)
self.assertEqual('SELECT "def" FROM "abc" ORDER BY "def" OFFSET 0 ROWS FETCH NEXT 10 ROWS ONLY', str(q))
def test_offset(self):
q = MSSQLQuery.from_("abc").select("def").orderby("def").offset(10)
self.assertEqual('SELECT "def" FROM "abc" ORDER BY "def" OFFSET 10 ROWS', str(q))
def test_fetch_next_with_offset(self):
q = MSSQLQuery.from_("abc").select("def").orderby("def").fetch_next(10).offset(10)
self.assertEqual('SELECT "def" FROM "abc" ORDER BY "def" OFFSET 10 ROWS FETCH NEXT 10 ROWS ONLY', str(q))
def test_groupby_alias_False_does_not_group_by_alias_with_standard_query(self):
t = Table('table1')
col = t.abc.as_('a')
q = MSSQLQuery.from_(t).select(col, Count('*')).groupby(col)
self.assertEqual('SELECT "abc" "a",COUNT(\'*\') FROM "table1" GROUP BY "abc"', str(q))
def test_groupby_alias_False_does_not_group_by_alias_when_subqueries_are_present(self):
t = Table('table1')
subquery = MSSQLQuery.from_(t).select(t.abc)
col = subquery.abc.as_('a')
q = MSSQLQuery.from_(subquery).select(col, Count('*')).groupby(col)
self.assertEqual(
'SELECT "sq0"."abc" "a",COUNT(\'*\') FROM (SELECT "abc" FROM "table1") "sq0" GROUP BY "sq0"."abc"', str(q)
)
| 38.257143 | 118 | 0.647872 | 375 | 2,678 | 4.466667 | 0.170667 | 0.071045 | 0.089552 | 0.123582 | 0.698507 | 0.652537 | 0.580299 | 0.567761 | 0.477612 | 0.421493 | 0 | 0.017098 | 0.191934 | 2,678 | 69 | 119 | 38.811594 | 0.756932 | 0 | 0 | 0.086957 | 0 | 0.086957 | 0.249066 | 0 | 0 | 0 | 0 | 0 | 0.23913 | 1 | 0.23913 | false | 0 | 0.108696 | 0 | 0.369565 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1c828f50eb6739af4655f73774016c25d4ee4ac9 | 1,389 | py | Python | suplemon/helpers.py | johnmbaughman/suplemon | fdde20f2181c280236d40f89b89b9bbe5843440e | [
"MIT"
] | null | null | null | suplemon/helpers.py | johnmbaughman/suplemon | fdde20f2181c280236d40f89b89b9bbe5843440e | [
"MIT"
] | null | null | null | suplemon/helpers.py | johnmbaughman/suplemon | fdde20f2181c280236d40f89b89b9bbe5843440e | [
"MIT"
] | null | null | null | # -*- encoding: utf-8
"""
Various helper constants and functions.
"""
import os
import re
import sys
import time
import traceback
def curr_time():
"""Current time in %H:%M"""
return time.strftime("%H:%M")
def curr_time_sec():
"""Current time in %H:%M:%S"""
return time.strftime("%H:%M:%S")
def multisplit(data, delimiters):
pattern = "|".join(map(re.escape, delimiters))
return re.split(pattern, data)
def get_error_info():
"""Return info about last error."""
msg = "{0}\n{1}".format(str(traceback.format_exc()), str(sys.exc_info()))
return msg
def get_string_between(start, stop, s):
    """Return the substring of s between the start and stop delimiters, or False if not found."""
i1 = s.find(start)
if i1 == -1:
return False
s = s[i1 + len(start):]
i2 = s.find(stop)
if i2 == -1:
return False
s = s[:i2]
return s
def whitespace(line):
    """Return the index of the first non-space character on a line."""
i = 0
for char in line:
if char != " ":
break
i += 1
return i
def parse_path(path):
"""Parse a relative path and return full directory and filename as a tuple."""
if path[:2] == "~" + os.sep:
p = os.path.expanduser("~")
path = os.path.join(p+os.sep, path[2:])
ab = os.path.abspath(path)
parts = os.path.split(ab)
return parts
| 21.369231 | 83 | 0.592513 | 206 | 1,389 | 3.946602 | 0.427184 | 0.00984 | 0.02706 | 0.03444 | 0.120541 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014423 | 0.25126 | 1,389 | 64 | 84 | 21.703125 | 0.767308 | 0.24622 | 0 | 0.051282 | 0 | 0 | 0.024777 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.179487 | false | 0 | 0.128205 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
1c8bf817623bc83ae0e3cfb38c83d93d7647579a | 1,068 | py | Python | madlib/main.py | FredericIV/PythonPractice | 36b3a321eb8fefc38befe83b15a7596418250756 | [
"CC0-1.0"
] | null | null | null | madlib/main.py | FredericIV/PythonPractice | 36b3a321eb8fefc38befe83b15a7596418250756 | [
"CC0-1.0"
] | null | null | null | madlib/main.py | FredericIV/PythonPractice | 36b3a321eb8fefc38befe83b15a7596418250756 | [
"CC0-1.0"
] | null | null | null | #!/bin/python3
# Libraries
import sys
import array
import textwrap
# Variable Declaration
madlib_selection = "example.txt"
madlib_array = array.array('i')
copy_state = False
user_filler = ""
new_madlib = []
if len(sys.argv) != 1:
print(len(sys.argv))
if sys.argv[1] == "-":
print("This program takes the path to a madlib as an argument. Showing default now.")
        ## TODO: Add input validation, i.e. make sure the input is actually text.
else:
## TODO: Add pipe as input option.
madlib_selection = sys.argv[1]
with open(madlib_selection, 'r') as madlib:
read_madlib = madlib.read()
for i in range(read_madlib.count("#")//2):
first = read_madlib.index("#")
second = read_madlib.index("#", first+1)
replacement = input("Please give me " + read_madlib[first+1:second] + ":")
new_madlib = read_madlib[0:first] + replacement + read_madlib[second+1:]
read_madlib = new_madlib
print("\n\n\n")
print(textwrap.fill(read_madlib, drop_whitespace=False, replace_whitespace=False))
| 31.411765 | 93 | 0.659176 | 149 | 1,068 | 4.590604 | 0.489933 | 0.131579 | 0.035088 | 0.038012 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010626 | 0.206929 | 1,068 | 33 | 94 | 32.363636 | 0.79693 | 0.136704 | 0 | 0 | 0 | 0 | 0.125683 | 0 | 0 | 0 | 0 | 0.030303 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0.166667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1c8e802ab7e5ab17bb7b662f2406ded9d3de6507 | 11,773 | py | Python | mcp/augmentation/album.py | j20232/moco_image_pipeline | 997ae76e795548e75f95e862284c1fc0a3c7541a | [
"BSD-3-Clause"
] | 5 | 2020-03-18T14:36:12.000Z | 2022-01-26T09:36:11.000Z | mcp/augmentation/album.py | j20232/moco_image_pipeline | 997ae76e795548e75f95e862284c1fc0a3c7541a | [
"BSD-3-Clause"
] | null | null | null | mcp/augmentation/album.py | j20232/moco_image_pipeline | 997ae76e795548e75f95e862284c1fc0a3c7541a | [
"BSD-3-Clause"
] | null | null | null | import numpy as np
from PIL import Image, ImageOps, ImageEnhance
import albumentations as A
# ndarray: H x W x C
def apply_aug(aug, image):
return aug(image=image)["image"]
# ----------------------------------- Blur -------------------------------------------
class RandomBlur():
def __init__(self, prob, blur_limit=9):
self.prob = np.clip(prob, 0.0, 1.0)
self.blur_limit = blur_limit
def __call__(self, img):
if np.random.uniform() < self.prob:
r = np.random.uniform()
if r < 0.4:
img = apply_aug(A.Blur(blur_limit=self.blur_limit, always_apply=True), img)
elif r < 0.6:
img = apply_aug(A.GaussianBlur(blur_limit=self.blur_limit, always_apply=True), img)
else:
img = apply_aug(A.MotionBlur(blur_limit=self.blur_limit, always_apply=True), img)
return img
# ----------------------------------- Noise -------------------------------------------
class GaussNoise():
def __init__(self, prob, var_limit=(0.0, 0.07)):
self.prob = np.clip(prob, 0.0, 1.0)
self.var_limit = var_limit
def __call__(self, img):
return apply_aug(A.GaussNoise(var_limit=self.var_limit, p=self.prob), img)
class MultiplicativeNoise():
def __init__(self, prob, var_limit=(0.6, 1.1)):
self.prob = np.clip(prob, 0.0, 1.0)
self.var_limit = var_limit
def __call__(self, img):
return apply_aug(A.MultiplicativeNoise(multiplier=self.var_limit, p=self.prob), img)
# ---------------------------------- Distortion ---------------------------------------
class GridDistortion():
def __init__(self, prob, num_steps=10, distort_limit=0.7):
self.prob = np.clip(prob, 0.0, 1.0)
self.num_steps = num_steps
self.distort_limit = distort_limit
def __call__(self, img):
return apply_aug(A.GridDistortion(p=self.prob, num_steps=self.num_steps,
distort_limit=self.distort_limit), img)
class ElasticTransform():
def __init__(self, prob, sigma=40, alpha=1, alpha_affine=15):
self.prob = np.clip(prob, 0.0, 1.0)
self.sigma = sigma
self.alpha = alpha
self.alpha_affine = alpha_affine
def __call__(self, img):
return apply_aug(A.ElasticTransform(p=self.prob, sigma=self.sigma,
alpha=self.alpha, alpha_affine=self.alpha_affine), img)
class ShiftScaleRotate():
def __init__(self, prob, shift_limit=0.0625, scale_limit=0.2, rotate_limit=20):
self.prob = prob
self.shift_limit = shift_limit
self.scale_limit = scale_limit
self.rotate_limit = rotate_limit
def __call__(self, img):
return apply_aug(A.ShiftScaleRotate(p=self.prob, shift_limit=self.shift_limit,
scale_limit=self.scale_limit,
rotate_limit=self.rotate_limit), img)
# ----------------------------------- Histogram ----------------------------------------
class HueSaturationValue():
def __init__(self, prob, hue_shift_limit=20, sat_shift_limit=40, val_shift_limit=100):
self.prob = np.clip(prob, 0.0, 1.0)
self.hue_shift_limit = hue_shift_limit
self.sat_shift_limit = sat_shift_limit
self.val_shift_limit = val_shift_limit
def __call__(self, img):
out = img if img.dtype == "uint8" else (img * 255).astype(np.uint8)
out = apply_aug(A.HueSaturationValue(p=self.prob, hue_shift_limit=self.hue_shift_limit,
sat_shift_limit=self.sat_shift_limit,
val_shift_limit=self.val_shift_limit), out)
return out if img.dtype == "uint8" else (out / 255).astype(np.float64)
class RandomBrightnessContrast():
def __init__(self, prob, brightness_limit=2.0, contrast_limit=0.6):
self.prob = np.clip(prob, 0.0, 1.0)
self.brightness_limit = brightness_limit
self.contrast_limit = contrast_limit
def __call__(self, img):
return apply_aug(A.RandomBrightnessContrast(p=self.prob,
brightness_limit=self.brightness_limit,
contrast_limit=self.contrast_limit,
brightness_by_max=False,
), img)
class RandomCLAHE():
def __init__(self, prob, clip_limit=40.0, tile_grid_size=(16, 16)):
self.prob = np.clip(prob, 0.0, 1.0)
self.clip_limit = clip_limit
self.tile_grid_size = tile_grid_size
def __call__(self, img):
out = img if img.dtype == "uint8" else (img * 255).astype(np.uint8)
out = apply_aug(A.CLAHE(p=self.prob, clip_limit=self.clip_limit,
tile_grid_size=self.tile_grid_size), out)
return out if img.dtype == "uint8" else (out / 255).astype(np.float64)
# ------------------------------------- Removal ------------------------------------------
class CoarseDropout():
def __init__(self, prob, max_holes=10, max_height=12, max_width=12):
self.prob = np.clip(prob, 0.0, 1.0)
self.max_holes = max_holes
self.max_height = max_height
self.max_width = max_width
def __call__(self, img):
return apply_aug(A.CoarseDropout(p=self.prob, max_holes=self.max_holes,
max_height=self.max_height, max_width=self.max_width,
fill_value=np.median(img)), img)
# ------------------------------------------- Augmix -------------------------------------------
# Reference: https://www.kaggle.com/haqishen/augmix-based-on-albumentations
def int_parameter(level, maxval):
    """Helper function to scale `level` between 0 and maxval.
Args:
level: Level of the operation that will be between [0, `PARAMETER_MAX`].
maxval: Maximum value that the operation can have. This will be scaled to
level/PARAMETER_MAX.
Returns:
An int that results from scaling `maxval` according to `level`.
"""
return int(level * maxval / 10)
def float_parameter(level, maxval):
    """Helper function to scale `level` between 0 and maxval.
Args:
level: Level of the operation that will be between [0, `PARAMETER_MAX`].
maxval: Maximum value that the operation can have. This will be scaled to
level/PARAMETER_MAX.
Returns:
A float that results from scaling `maxval` according to `level`.
"""
return float(level) * maxval / 10.
def sample_level(n):
return np.random.uniform(low=0.1, high=n)
def autocontrast(pil_img, _):
return ImageOps.autocontrast(pil_img)
def equalize(pil_img, _):
return ImageOps.equalize(pil_img)
def posterize(pil_img, level):
level = int_parameter(sample_level(level), 4)
return ImageOps.posterize(pil_img, 4 - level)
def rotate(pil_img, level):
degrees = int_parameter(sample_level(level), 30)
if np.random.uniform() > 0.5:
degrees = -degrees
return pil_img.rotate(degrees, resample=Image.BILINEAR)
def solarize(pil_img, level):
level = int_parameter(sample_level(level), 256)
return ImageOps.solarize(pil_img, 256 - level)
def shear_x(pil_img, level):
level = float_parameter(sample_level(level), 0.3)
if np.random.uniform() > 0.5:
level = -level
return pil_img.transform(pil_img.size,
Image.AFFINE, (1, level, 0, 0, 1, 0),
resample=Image.BILINEAR)
def shear_y(pil_img, level):
level = float_parameter(sample_level(level), 0.3)
if np.random.uniform() > 0.5:
level = -level
return pil_img.transform(pil_img.size,
Image.AFFINE, (1, 0, 0, level, 1, 0),
resample=Image.BILINEAR)
def translate_x(pil_img, level):
level = int_parameter(sample_level(level), pil_img.size[0] / 3)
if np.random.random() > 0.5:
level = -level
return pil_img.transform(pil_img.size,
Image.AFFINE, (1, 0, level, 0, 1, 0),
resample=Image.BILINEAR)
def translate_y(pil_img, level):
level = int_parameter(sample_level(level), pil_img.size[0] / 3)
if np.random.random() > 0.5:
level = -level
return pil_img.transform(pil_img.size,
Image.AFFINE, (1, 0, 0, 0, 1, level),
resample=Image.BILINEAR)
# operation that overlaps with ImageNet-C's test set
def color(pil_img, level):
level = float_parameter(sample_level(level), 1.8) + 0.1
return ImageEnhance.Color(pil_img).enhance(level)
# operation that overlaps with ImageNet-C's test set
def contrast(pil_img, level):
level = float_parameter(sample_level(level), 1.8) + 0.1
return ImageEnhance.Contrast(pil_img).enhance(level)
# operation that overlaps with ImageNet-C's test set
def brightness(pil_img, level):
level = float_parameter(sample_level(level), 1.8) + 0.1
return ImageEnhance.Brightness(pil_img).enhance(level)
# operation that overlaps with ImageNet-C's test set
def sharpness(pil_img, level):
level = float_parameter(sample_level(level), 1.8) + 0.1
return ImageEnhance.Sharpness(pil_img).enhance(level)
def normalize(image):
    """Shift image values toward zero mean by subtracting the midpoint value 127."""
return image - 127
def apply_op(image, op, severity):
# image = np.clip(image, 0, 255)
pil_img = Image.fromarray(image) # Convert to PIL.Image
pil_img = op(pil_img, severity)
return np.asarray(pil_img)
def augment_and_mix(image, severity=3, width=3, depth=-1, alpha=1.):
"""Perform AugMix augmentations and compute mixture.
Args:
image: Raw input image as float32 np.ndarray of shape (h, w, c)
severity: Severity of underlying augmentation operators (between 1 to 10).
width: Width of augmentation chain
depth: Depth of augmentation chain. -1 enables stochastic depth uniformly
from [1, 3]
alpha: Probability coefficient for Beta and Dirichlet distributions.
Returns:
mixed: Augmented and mixed image.
"""
augmentations = [
autocontrast, equalize, posterize, rotate, solarize, shear_x, shear_y,
translate_x, translate_y
]
ws = np.float32(np.random.dirichlet([alpha] * width))
m = np.float32(np.random.beta(alpha, alpha))
mix = np.zeros_like(image).astype(np.float32)
for i in range(width):
image_aug = image.copy()
depth = depth if depth > 0 else np.random.randint(1, 4)
for _ in range(depth):
op = np.random.choice(augmentations)
image_aug = apply_op(image_aug, op, severity)
# Preprocessing commutes since all coefficients are convex
mix += ws[i] * image_aug
# mix += ws[i] * normalize(image_aug)
mixed = (1 - m) * image + m * mix
# mixed = (1 - m) * normalize(image) + m * mix
return mixed
class RandomAugMix():
def __init__(self, prob=0.1, severity=2, width=3, depth=2, alpha=1.):
self.prob = prob
self.severity = severity
self.width = width
self.depth = depth
self.alpha = alpha
def __call__(self, img):
if np.random.uniform() > self.prob:
return img
tmp = (img * 255).astype(np.uint8) if img.dtype != "uint8" else img
out = augment_and_mix(tmp, self.severity, self.width, self.depth, self.alpha)
if type(img) is np.ndarray:
if img.dtype != "uint8":
out = (out / 255).astype(np.float64)
return out
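The core of `augment_and_mix` above is a convex combination: Dirichlet-distributed weights over `width` augmented copies, followed by a Beta-distributed blend with the original image. A minimal standalone sketch of just that mixing math (NumPy only; `toy_augment_and_mix` is a hypothetical name, and identity "augmentations" stand in for the PIL ops above):

```python
import numpy as np

def toy_augment_and_mix(image, width=3, alpha=1.0, rng=None):
    # Convex-combination weights from a Dirichlet, final blend from a Beta,
    # exactly as in augment_and_mix, but with identity augmentations.
    rng = np.random.default_rng(rng)
    ws = rng.dirichlet([alpha] * width).astype(np.float32)
    m = np.float32(rng.beta(alpha, alpha))
    mix = np.zeros_like(image, dtype=np.float32)
    for w in ws:
        image_aug = image.astype(np.float32)  # real code applies random ops here
        mix += w * image_aug
    return (1 - m) * image.astype(np.float32) + m * mix

img = np.random.randint(0, 256, (8, 8, 3), dtype=np.uint8)
out = toy_augment_and_mix(img, rng=0)
# With identity augmentations the convex combination reproduces the input.
assert out.shape == img.shape
assert np.allclose(out, img.astype(np.float32), atol=1e-3)
```

Because the Dirichlet weights sum to one, the blend is always inside the convex hull of the augmented copies, which is what keeps AugMix outputs in a plausible pixel range.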
# --- website/util/sanitize.py (repo: bdyetton/prettychart, license: Apache-2.0) ---
# -*- coding: utf-8 -*-
import bleach
import json


def strip_html(unclean):
    """Sanitize a string, removing (as opposed to escaping) HTML tags

    :param unclean: A string to be stripped of HTML tags
    :return: stripped string
    :rtype: str
    """
    return bleach.clean(unclean, strip=True, tags=[], attributes=[], styles=[])


def clean_tag(data):
    """Format as a valid Tag

    :param data: A string to be cleaned
    :return: cleaned string
    :rtype: str
    """
    # TODO: make this a method of Tag?
    return escape_html(data).replace('"', '&quot;').replace("'", '&#39;')


def is_iterable_but_not_string(obj):
    """Return True if ``obj`` is an iterable object that isn't a string."""
    return (hasattr(obj, '__iter__') and not hasattr(obj, 'strip'))


def escape_html(data):
    """Escape HTML characters in data.

    :param data: A string, dict, or list to clean of HTML characters
    :return: A cleaned object
    :rtype: str or list or dict
    """
    if isinstance(data, dict):
        return {
            key: escape_html(value)
            for (key, value) in data.iteritems()
        }
    if is_iterable_but_not_string(data):
        return [
            escape_html(value)
            for value in data
        ]
    if isinstance(data, basestring):
        return bleach.clean(data)
    return data


def assert_clean(data):
    """Ensure that data is cleaned

    :raise: AssertionError
    """
    def _ensure_clean(value):
        if value != bleach.clean(value):
            raise ValueError
    return escape_html(data)


# TODO: Remove safe_unescape_html when mako html safe comes in
def safe_unescape_html(value):
    """
    Return data without html escape characters.

    :param value: A string, dict, or list
    :return: A string or list or dict without html escape characters
    """
    safe_characters = {
        '&amp;': '&',
        '&lt;': '<',
        '&gt;': '>',
    }
    if isinstance(value, dict):
        return {
            key: safe_unescape_html(value)
            for (key, value) in value.iteritems()
        }
    if is_iterable_but_not_string(value):
        return [
            safe_unescape_html(each)
            for each in value
        ]
    if isinstance(value, basestring):
        for escape_sequence, character in safe_characters.items():
            value = value.replace(escape_sequence, character)
        return value
    return value


def safe_json(value):
    """
    Dump a string to JSON in a manner that can be used for JS strings in mako templates.

    Providing additional forward-slash escaping to prevent injection of closing markup in strings. See:
    http://benalpert.com/2012/08/03/preventing-xss-json.html

    :param value: A string to be converted
    :return: A JSON-formatted string that explicitly escapes forward slashes when needed
    """
    return json.dumps(value).replace('</', '<\\/')  # Fix injection of closing markup in strings
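The escape/unescape pair above is the usual HTML-entity round trip. A minimal standalone sketch of the same idea (standard-library `html.escape` stands in for `bleach.clean`, which is not assumed to be installed):

```python
import html

# The same mapping safe_unescape_html uses: escape sequence -> character.
safe_characters = {'&amp;': '&', '&lt;': '<', '&gt;': '>'}

def unescape(value):
    # Mirror of safe_unescape_html's string branch.
    for escape_sequence, character in safe_characters.items():
        value = value.replace(escape_sequence, character)
    return value

escaped = html.escape('<b>hi & bye</b>', quote=False)
assert escaped == '&lt;b&gt;hi &amp; bye&lt;/b&gt;'
assert unescape(escaped) == '<b>hi & bye</b>'
```

Note that, as in the original, naive `str.replace` unescaping is not safe for doubly escaped input; it is only meant to undo a single escaping pass.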
# --- dags/exercise1.py (repo: mikef-nl/airflow-training-skeleton, license: Apache-2.0) ---
import airflow
from airflow.models import DAG
from airflow.operators.dummy_operator import DummyOperator

args = {
    'owner': 'Mike',
    'start_date': airflow.utils.dates.days_ago(2),
}

dag = DAG(
    dag_id='exercise1',
    default_args=args,
    schedule_interval=None
)

t1 = DummyOperator(task_id='task1', dag=dag)
t2 = DummyOperator(task_id='task2', dag=dag)
t3 = DummyOperator(task_id='task3', dag=dag)
t4 = DummyOperator(task_id='task4', dag=dag)
t5 = DummyOperator(task_id='task5', dag=dag)

t1 >> t2 >> [t3, t4] >> t5
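The final line builds a fan-out/fan-in dependency graph: `t2` after `t1`, `t3` and `t4` in parallel after `t2`, and `t5` after both. A plain-Python sketch of that graph and a topological ordering over it (no Airflow required; `topo_order` is an illustrative helper, not an Airflow API):

```python
# Upstream dependencies implied by t1 >> t2 >> [t3, t4] >> t5.
deps = {'task1': [], 'task2': ['task1'],
        'task3': ['task2'], 'task4': ['task2'],
        'task5': ['task3', 'task4']}

def topo_order(deps):
    # Depth-first topological sort: visit upstreams before the task itself.
    order, seen = [], set()
    def visit(t):
        for d in deps[t]:
            if d not in seen:
                visit(d)
        if t not in seen:
            seen.add(t)
            order.append(t)
    for t in sorted(deps):
        visit(t)
    return order

order = topo_order(deps)
assert order.index('task2') > order.index('task1')
assert order.index('task5') > order.index('task3')
assert order.index('task5') > order.index('task4')
```

This is essentially the ordering constraint the Airflow scheduler enforces when it runs the DAG.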
# --- stix_shifter_modules/elastic/entry_point.py (repo: 6un9-h0-Dan/stix-shifter, license: Apache-2.0) ---
from stix_shifter_utils.utils.entry_point_base import EntryPointBase
from stix_shifter_utils.modules.cim.stix_translation.cim_data_mapper import CimDataMapper
from stix_shifter_utils.modules.car.stix_translation.car_data_mapper import CarDataMapper
from .stix_translation.stix_to_elastic import StixToElastic


class EntryPoint(EntryPointBase):
    def __init__(self, connection={}, configuration={}, options={}):
        super().__init__(options)
        self.add_dialect('default', query_translator=StixToElastic(), data_mapper=CarDataMapper(options), default=True)
        self.add_dialect('cim', query_translator=StixToElastic(), data_mapper=CimDataMapper(options), default_include=False)
        self.add_dialect('car', query_translator=StixToElastic(), data_mapper=CarDataMapper(options), default_include=False)
# --- genetic_pwdcrack.py (repo: robotenique/AI-programming, license: Unlicense) ---
"""
Crack a password using a genetic algorithm!
"""
import random as rnd


def main():
    """
    This file implements a genetic algorithm to solve the problem of
    cracking a given password, by creating 'generations' of different
    words, selecting the best, breeding them, applying a simple crossover
    (randomized) and a mutation chance.
    """
    # variables dict: define the problem constants
    genetic_variables = {
        'password': "verylongwordpass",
        'size_population': 100,
        'best_sample': 20,
        'lucky_few': 20,
        'number_of_child': 5,
        'number_of_generations': 10000,  # Overkill >:D
        'chance_of_mutation': .5
    }
    prob = genetic_variables
    # program
    if (prob['best_sample'] + prob['lucky_few']) / 2 * prob['number_of_child'] != prob['size_population']:
        print("population size not stable")
        return
    last_gen, _ = genetic_algorithm(**genetic_variables)
    print("Last generation: \n\n")
    print(last_gen)


def genetic_algorithm(**kwargs):
    """
    Execute the genetic algorithm.

    This algorithm takes a dict as an argument.
    It will iterate based on the variable 'number_of_generations', and return
    the last_gen and the historic
    """
    # Unpack the values from the dict
    password = kwargs['password']
    size_population = kwargs['size_population']
    best_sample = kwargs['best_sample']
    lucky_few = kwargs['lucky_few']
    number_of_child = kwargs['number_of_child']
    number_of_generations = kwargs['number_of_generations']
    chance_of_mutation = kwargs['chance_of_mutation']
    hist = []
    # The genetic algorithm
    curr_pop = initial_pop(size_population, password)
    hist = curr_pop
    last_found = -1
    for gen in range(number_of_generations):
        curr_pop = next_gen(curr_pop, password, best_sample, lucky_few, number_of_child, chance_of_mutation)
        hist.append(curr_pop)
        if check_solution(curr_pop, password):
            last_found = gen
            break
    if last_found != -1:
        print(f"Found a solution in the {last_found} generation!!")
    else:
        print("No solution found! D':")
    return curr_pop, hist


def next_gen(curr_pop, password, best_sample, lucky_few, number_of_child, chance_of_mutation):
    """
    -> This is the main task of the Genetic Algorithm <-

    Given the current population, apply the following steps:
        - Compute the fitness of each individual in the population
        - Select the best ones (and some lucky guys)
        - Make them reproduce
        - Mutate the children
        - Return this new population
    """
    pop_sorted = compute_perf_pop(curr_pop, password)
    next_breeders = select_from_population(pop_sorted, best_sample, lucky_few)
    next_pop = create_children(next_breeders, number_of_child)
    return mutate_pop(next_pop, chance_of_mutation)


def initial_pop(size, password):
    """
    Generate a population consisting of random words, each with the same
    length as the password, and the population has the size specified.
    """
    return [word_generate(len(password)) for _ in range(size)]


def fitness(password, test_word):
    """
    The fitness function:
        fitness(test_word): (# of correct chars) / (total number of chars)
        fitness(test_word) = 0 if # of correct chars = 0
        fitness(test_word) = 100 if # of correct chars = total number of chars
    """
    if len(test_word) != len(password):
        print("Incompatible password...")
        return
    else:
        score = (1 if password[i] == test_word[i] else 0 for i in range(len(password)))
        return sum(score) * 100 / len(password)


def compute_perf_pop(population, password):
    """
    Return the population, sorted by the fitness from each individual
    """
    population_perf = {ind: fitness(password, ind) for ind in population}
    # Sort by fitness, reversed (best ones in the beginning of the list)
    return sorted(population_perf.items(), key=lambda it: it[1], reverse=True)


def select_from_population(pop_sorted, best_sample, lucky_few):
    """
    Create the next breeders, with 'best_sample' individuals which have the
    top fitness value from the population, and 'lucky_few' individuals which
    are randomly selected.
    """
    next_gen = []
    for i in range(best_sample):
        next_gen.append(pop_sorted[i][0])
    # Simple lucky few: randomly select some elements from the population
    for i in range(lucky_few):
        next_gen.append(rnd.choice(pop_sorted)[0])
    rnd.shuffle(next_gen)
    return next_gen


def create_children(breeders, nof_childs):
    """
    Create the next population of individuals, by breeding two by two
    """
    next_pop = []
    mid_pos = len(breeders) // 2  # len(breeders) must be an even number
    for ind_1, ind_2 in zip(breeders[:mid_pos], breeders[mid_pos:]):
        for _ in range(nof_childs):
            next_pop.append(create_child(ind_1, ind_2))
    return next_pop


def mutate_pop(population, chance):
    """
    Given a chance for mutation, this applies the mutation layer
    to the genetic algorithm, by generating a mutation with the chance
    specified.
    """
    for i in range(len(population)):
        if rnd.random() < chance:
            population[i] = mutate_word(population[i])
    return population


def mutate_word(word):
    """
    Mutate a letter (gene) from the word, then return it
    """
    pos = int(rnd.random() * len(word))
    word = word[:pos] + chr(97 + int(26 * rnd.random())) + word[pos + 1:]
    return word


def create_child(ind_1, ind_2):
    """
    For each letter of the child, get a random gene from ind_1 or ind_2
    in the i-th position.
    """
    temp = [ind_1[i] if rnd.random() < 0.5 else ind_2[i] for i in range(len(ind_1))]
    return "".join(temp)


def word_generate(length):
    """
    Generate a string with random lowercase letters, with length = length!
    """
    # Generate a random lowercase letter and add it to the result.
    # randint is inclusive on both ends, so the upper bound must be 25
    # to stay within 'a'..'z' (97 + 25 == ord('z')).
    return "".join((chr(97 + rnd.randint(0, 25)) for _ in range(length)))


def check_solution(population, password):
    """
    Check if the population found a solution to the problem
    """
    return any(ind == password for ind in population)


if __name__ == '__main__':
    main()
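The fitness metric that drives selection is simply the percentage of character positions where a candidate matches the password. A standalone check of that formula, reproduced from the function above:

```python
# Percentage of positions where the candidate matches the password,
# as computed by the fitness() function above.
def fitness(password, test_word):
    score = (1 if password[i] == test_word[i] else 0 for i in range(len(password)))
    return sum(score) * 100 / len(password)

assert fitness("abcd", "abcd") == 100.0
assert fitness("abcd", "abzz") == 50.0
assert fitness("abcd", "zzzz") == 0.0
```

A positional-match score like this gives the search a smooth gradient: partially correct guesses score higher than random ones, which is what lets the population converge letter by letter.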
# --- scaffolds/__init__.py (repo: chhsiao1981/frontend_template, license: MIT) ---
# API
from pyramid.scaffolds import PyramidTemplate

import os
import re
import logging


def _camelcase_to_upper_camel_case(the_str):
    if not the_str:
        return ''
    return the_str[0].upper() + the_str[1:]


def _upper_camelcase_to_camelcase(the_str):
    if not the_str:
        return ''
    return the_str[0].lower() + the_str[1:]


def _camelcase_to_constant(the_str):
    s1 = re.sub('(.)([A-Z][a-z]+)', r'\1_\2', the_str)
    return re.sub('([a-z0-9])([A-Z])', r'\1_\2', s1).upper()


class MyTemplate(PyramidTemplate):
    def pre(self, command, output_dir, vars):
        the_args = command.args
        module_name = '' if not isinstance(the_args, list) or len(the_args) < 2 else the_args[1]
        logging.warning('command: %s output_dir: %s vars: %s args: %s module_name: %s', command, output_dir, vars, command.args, module_name)
        self._setup_module(vars, module_name)
        return PyramidTemplate.pre(self, command, output_dir, vars)

    def _setup_module(self, vars, full_module_name):
        full_module_path = full_module_name.replace('.', os.path.sep)
        module_name = os.path.basename(full_module_path)
        class_name = _camelcase_to_upper_camel_case(module_name)
        constant_name = _camelcase_to_constant(module_name)
        sub_pkg_dir = os.path.dirname(full_module_path)
        sub_pkg_name = sub_pkg_dir.replace(os.path.sep, '.')
        test_name = '' if not module_name else 'test' + class_name
        sub_pkg_dir_list = [] if not sub_pkg_dir else sub_pkg_dir.split(os.path.sep)
        test_dir_list = ['test_' + each_pkg for each_pkg in sub_pkg_dir_list]
        test_dir = os.path.sep.join(test_dir_list)
        pkg_name = vars['package']
        if sub_pkg_name:
            pkg_name += '.' + sub_pkg_name
        project_name = vars['project']

        vars['module_name'] = module_name
        vars['class_name'] = class_name
        vars['sub_pkg_name'] = sub_pkg_name
        vars['sub_pkg_dir'] = sub_pkg_dir
        vars['constant_name'] = constant_name
        vars['test_name'] = test_name
        vars['test_dir'] = test_dir
        vars['pkg_name'] = pkg_name
        vars['project_name'] = project_name


class ComponentProjectTemplate(MyTemplate):
    _template_dir = 'component'
    summary = 'component'


class ContainerProjectTemplate(MyTemplate):
    _template_dir = 'container'
    summary = 'container'


class SubContainerProjectTemplate(MyTemplate):
    _template_dir = 'subcontainer'
    summary = 'subcontainer'


class ModuleProjectTemplate(MyTemplate):
    _template_dir = 'module'
    summary = 'module'


class InitStarterProjectTemplate(MyTemplate):
    _template_dir = 'init_starter'
    summary = 'including store / middleware / utils'


class InitDevProjectTemplate(MyTemplate):
    _template_dir = 'init_dev'
    summary = 'starting project'
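The two-regex CamelCase-to-CONSTANT conversion in `_camelcase_to_constant` can be exercised on its own; the first pass splits before runs like `Module`, the second splits a lowercase or digit followed by an uppercase letter:

```python
import re

# Same two substitutions as _camelcase_to_constant above.
def camelcase_to_constant(the_str):
    s1 = re.sub('(.)([A-Z][a-z]+)', r'\1_\2', the_str)
    return re.sub('([a-z0-9])([A-Z])', r'\1_\2', s1).upper()

assert camelcase_to_constant('myModuleName') == 'MY_MODULE_NAME'
assert camelcase_to_constant('parser2020Beta') == 'PARSER2020_BETA'
```

The second pattern exists because the first one only fires when an uppercase letter is followed by lowercase letters, so boundaries such as `0B` in `parser2020Beta` would otherwise be missed.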
# --- tests/periodicities/gen_makefile.py (repo: jmabry/pyaf, license: BSD-3-Clause) ---
import os
import glob

subdirs = glob.glob("tests/periodicities/*")
subdirs = ['tests/periodicities/Month',
           'tests/periodicities/Minute',
           'tests/periodicities/Week',
           'tests/periodicities/Business_Hour',
           'tests/periodicities/Business_Day',
           'tests/periodicities/Second',
           'tests/periodicities/Semi_Month',
           'tests/periodicities/Hour',
           'tests/periodicities/Day']

# print(subdirs)
print("PYTHON=python3\n\n")

lAllTarget = ""
for subdir1 in sorted(subdirs):
    lBase = os.path.basename(subdir1)
    test_target = ""
    for filename in sorted(glob.glob(subdir1 + "/*.py")):
        bn = os.path.basename(filename)
        logfile = bn.replace("/", "_")
        logfile = "logs/periodicities_" + logfile.replace(".py", ".log")
        print("#PROCESSING FILE : ", filename, bn, logfile)
        print(bn, " : ", "\n\t", "-$(PYTHON) ", filename, " > ", logfile, " 2>&1")
        test_target = bn + " " + test_target
    lAllTarget = lAllTarget + " " + lBase
    print("\n\n", lBase, ": ", test_target, "\n", "\n")

print("\n# ********************************************** \n")
print("all: ", lAllTarget, "\n\t\n")
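Each inner-loop iteration above emits one Makefile rule that runs a test script and redirects its output to a per-test log. A standalone sketch approximating the rule text for one hypothetical test file (the path below is illustrative, not from the repo):

```python
# Hypothetical test file path, for illustration only.
filename = "tests/periodicities/Day/example.py"
bn = filename.rsplit("/", 1)[-1]
logfile = "logs/periodicities_" + bn.replace(".py", ".log")
# Target : recipe, matching the shape of the print() in the loop above.
rule = bn + " : \n\t-$(PYTHON) " + filename + " > " + logfile + " 2>&1"
assert rule.splitlines()[0] == "example.py : "
assert "logs/periodicities_example.log" in rule
```

The leading `-` in the recipe tells make to keep going even if one test script exits non-zero, so a single failure does not abort the whole test run.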
# --- core/log.py (repo: dl-stuff/dl9, license: MIT) ---
"""Simulation logs"""
from __future__ import annotations  # default once 3.10

import sys
from enum import Enum
from typing import Type, TYPE_CHECKING

if TYPE_CHECKING:
    from core.timeline import Timeline


class LogKind(Enum):
    def __str__(self) -> str:
        return self.name

    DEBUG = 0
    SIM = 1


class LogData:
    pass


class Logger:
    __slots__ = ["_timeline", "_entries", "_data"]
    PRINT_ASAP = True

    def __init__(self, timeline: Timeline):
        self._timeline = timeline
        self.reset()

    def reset(self):
        self._entries = []
        self._data = LogData()

    def __call__(self, fmt: str, kind: LogKind, *args, **kwargs) -> None:
        entry = LogEntry(self._timeline.now, fmt, kind, *args, **kwargs)
        if self.PRINT_ASAP:
            print(entry.fmt(), flush=True)
        entry.process(self._data)
        self._entries.append(entry)

    def write(self, output=sys.stdout):
        # Iterate the stored entries explicitly; Logger itself is not iterable.
        for entry in self._entries:
            output.write(entry.fmt())
            output.write("\n")


class LogEntry:
    """1 row in the log"""

    __slots__ = ["_timestamp", "_kind", "_fmt", "_args", "_kwargs"]

    def __init__(self, timestamp: float, fmt: str, kind: LogKind, *args, **kwargs) -> None:
        self._timestamp = timestamp
        self._fmt = "{ts:>8.3f}{kind:>6}| " + fmt
        self._kind = kind
        self._args = args
        self._kwargs = kwargs

    def fmt(self) -> str:
        """Format this line of log"""
        return self._fmt.format(ts=self._timestamp, kind=self._kind, *self._args, **self._kwargs)

    def process(self, data: LogData) -> None:
        """Does any kind of updates to log data"""
        pass
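Every log line gets the fixed prefix `"{ts:>8.3f}{kind:>6}| "`: a right-aligned timestamp with three decimals in eight columns, then the kind name right-aligned in six. A standalone check of that prefix format, using the same format string as `LogEntry` above:

```python
# Same prefix as LogEntry._fmt, with a sample message template appended.
fmt = "{ts:>8.3f}{kind:>6}| " + "hit for {amount}"
line = fmt.format(ts=1.5, kind="SIM", amount=200)
assert line == "   1.500   SIM| hit for 200"
```

The fixed-width prefix keeps timestamps and kinds vertically aligned when many entries are written out, which makes the simulation log easy to scan.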
# --- linter.py (repo: CudaText-addons/cuda_lint_htmltidy, license: MIT) ---
# Copyright (c) 2013 Aparajita Fishman
# Change for CudaLint: Alexey T.
# License: MIT

import os
from cuda_lint import Linter, util

if os.name == 'nt':
    _exe = os.path.join(os.path.dirname(__file__), 'tidy_win32', 'tidy')
else:
    _exe = 'tidy'


class HtmlTidy(Linter):
    syntax = ('HTML', 'HTML_')
    cmd = (_exe, '-errors', '-quiet', '-utf8')
    regex = r'^line (?P<line>\d+) column (?P<col>\d+) - (?:(?P<error>Error)|(?P<warning>Warning)): (?P<message>.+)'
    error_stream = util.STREAM_STDERR
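The `regex` class attribute parses tidy's diagnostic lines into named groups. A standalone check against a hypothetical diagnostic line (the sample message is invented for illustration):

```python
import re

# Same pattern as HtmlTidy.regex above.
regex = r'^line (?P<line>\d+) column (?P<col>\d+) - (?:(?P<error>Error)|(?P<warning>Warning)): (?P<message>.+)'
m = re.match(regex, 'line 12 column 3 - Warning: missing </li>')
assert m is not None
assert m.group('line') == '12'
assert m.group('col') == '3'
assert m.group('warning') == 'Warning'
assert m.group('message') == 'missing </li>'
```

The alternation between the `error` and `warning` groups is how the linter framework distinguishes severities: exactly one of the two groups is non-None per match.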
# --- airtech_api/utils/error_messages/serialization_errors.py (repo: chidioguejiofor/airtech-api, license: MIT) ---
msg_dict = {
    'resource_not_found':
        'The resource you specified was not found',
    'invalid_gender':
        "The gender you specified is invalid!!",
    'many_invalid_fields':
        'Some errors occurred while validating some fields. Please check and try again',
    'unique':
        'The {} you inputted already exists',
    'user_not_found':
        'The user with that username/email and password combination was not found',
    'email_not_found':
        'A user with email `{}` does not exist',
    'user_already_verified':
        'The user with that email has already been verified',
    'invalid_flight_type':
        'Flight type must be either international or local',
    'invalid_flight_schedule':
        'Flight schedule must be at least 12 hours before it is created',
    'resource_id_not_found':
        'The {} with that id was not found',
    'user_book_flight_twice':
        'You had previously booked for this Flight and thus cannot do it again',
    'flight_booking_expired':
        'You cannot book for a flight less than 24 hours before the flight',
    'flight_schedule_expired':
        'The schedule of this flight has already passed and thus you cannot book it',
    'missing_field':
        'You forgot to include this field',
    'value_not_a_file':
        'The value you inputted is not a file',
    'not_an_image':
        'The file you uploaded is not a valid image',
    'image_too_large':
        'Image must not be more than 2MB',
    'payment_link_error':
        'An error occurred while creating payment link',
    'booking_already_paid':
        'You have already paid for this flight',
    'booking_expired':
        'Your booking has expired, thus you cannot pay for this ticket',
    'invalid_url':
        'The `{}` field must be a valid URL with protocols `http` or `https`',
    "invalid_url_field":
        'This field must be a valid URL with protocols `http` or `https`',
    'paystack_threw_error':
        "There was an unexpected error while processing request. "
        "Please raise this as an issue at "
        "https://github.com/chidioguejiofor/airtech-api/issues",
    'empty_request':
        'You did not specify any `{}` data in your request',
    'paid_booking_cannot_be_deleted':
        'You cannot delete this Booking because you have already paid for it',
    'cannot_delete_expired_booking':
        'You cannot delete an expired booking',
    'cannot_delete_flight_with_bookings':
        'You cannot delete this flight because users have started booking it',
    'cannot_delete_flight_that_has_flown':
        'You cannot delete this flight because the schedule date has been passed',
    'cannot_update_flight_field_with_bookings':
        'You cannot update the `{}` of this flight because it has already been booked',
    'cannot_update_field':
        'You cannot update a {} {}',
    'regular_user_only':
        'This endpoint is for only regular users',
    'profile_not_updated':
        'You need to update your profile picture before you can do this',
    'only_alpha_and_numbers':
        'This field can contain only alphabets and numbers'
}
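Several of the messages above are templates with `{}` placeholders, presumably filled in with `str.format` at the point where the error is raised. A minimal sketch of how such lookups would be used (the field names are illustrative):

```python
# Templated messages as in msg_dict above; field names are hypothetical.
msg = 'The {} you inputted already exists'
assert msg.format('email') == 'The email you inputted already exists'

two = 'You cannot update a {} {}'
assert two.format('paid', 'booking') == 'You cannot update a paid booking'
```

Keeping every user-facing message in one dict like this makes the wording easy to audit and test, since serializers reference keys rather than hard-coding strings.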
# --- cogs/memes.py (repo: Code-Cecilia/botman-rewrite, license: MIT) ---
import json
import discord
from discord.ext import commands

from assets import internet_funcs
from assets.list_funcs import chunks


class Memes(commands.Cog, description="Memes from https://imgflip.com/"):
    def __init__(self, bot):
        self.bot = bot
        with open("config.json") as configFile:
            config = json.load(configFile)
        self.username = config.get("imgflip_username")
        self.password = config.get("imgflip_password")
        self.memetemps = {}

    @commands.Cog.listener()
    async def on_ready(self):
        result = json.loads(await internet_funcs.get_response("https://api.imgflip.com/get_memes"))
        if result["success"] is not True:
            return
        result = result["data"]["memes"]
        for k in result:
            self.memetemps[k["id"]] = {"name": k["name"], "box_count": k["box_count"]}

    @commands.command(name="memetemplates", aliases=["memetemps"])
    async def meme_temps(self, ctx):
        """Fetches top 100 meme templates from imgflip.com"""
        # TODO: pagination for meme templates
        result = list(self.memetemps.items())
        if not result:
            await self.on_ready()
            result = list(self.memetemps.items())
        n = 0
        split_entries = list(chunks(result, 25))
        for entry in split_entries:
            embed = discord.Embed(title="Meme Templates", color=0x00ff00)
            for meme in entry:
                n += 1
                meme_id = meme[0]
                meme_name = meme[1]["name"]
                embed.add_field(name=f"{n}. {meme_name}", value=f"ID: `{meme_id}`", inline=False)
            try:
                await ctx.author.send(embed=embed)
            except discord.Forbidden:
                await ctx.send("I can't DM you! Please enable DMs and try again.")
                return

    @commands.command(name="memegen", aliases=["memegenerator"])
    async def meme_gen(self, ctx, meme_id, *text):
        """Generates a meme from imgflip. For template IDs, see the `memetemplates` command"""
        text = list(text)
        if self.memetemps == {}:
            await self.on_ready()
        if len(text) > 20:
            text = text[:20]
        if not str(meme_id).isnumeric():
            found = False
            for k, v in self.memetemps.items():
                if str(meme_id).lower() == str(v["name"]).lower():
                    meme_id = int(k)
                    found = True
                    break
            if not found:
                return await ctx.send("Meme not found. Please check the ID and try again.")

        # clean up the number of boxes to send
        if meme_id in self.memetemps.keys():
            if len(text) > self.memetemps[meme_id]["box_count"]:
                text = text[:int(self.memetemps[meme_id]["box_count"])]
            if len(text) < self.memetemps[meme_id]["box_count"]:
                text += [""] * int(self.memetemps[meme_id]["box_count"] - len(text))

        # ready the text boxes
        boxes_dict = {}
        for box_count in range(len(text)):
            boxes_dict[f"boxes[{box_count}][text]"] = text[box_count]
            boxes_dict[f"boxes[{box_count}][color]"] = "#000000"
            boxes_dict[f"boxes[{box_count}][outline_color]"] = "#FFFFFF"

        # send the request
        payload = {"template_id": meme_id, "username": self.username, "password": self.password}
        payload.update(boxes_dict)
        result = json.loads(await internet_funcs.post("https://api.imgflip.com/caption_image", data=payload))
        if result["success"] is not True:
            await ctx.send("An error occurred:" + " " + "**" + result["error_message"] + "**")
            return
        await ctx.send(result["data"]["url"])


def setup(bot):
    bot.add_cog(Memes(bot))
| 38.877551 | 109 | 0.574803 | 475 | 3,810 | 4.494737 | 0.303158 | 0.033724 | 0.022482 | 0.035597 | 0.179391 | 0.153162 | 0.067447 | 0.067447 | 0.037471 | 0.037471 | 0 | 0.008886 | 0.291076 | 3,810 | 97 | 110 | 39.278351 | 0.781562 | 0.028871 | 0 | 0.118421 | 0 | 0 | 0.165917 | 0.02306 | 0 | 0 | 0.00225 | 0.010309 | 0 | 1 | 0.026316 | false | 0.026316 | 0.065789 | 0 | 0.157895 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
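The cog above relies on a `chunks()` helper (used to split the template list into pages of 25 fields per embed) that is not defined in this fragment; presumably it lives in a shared utility module. A minimal sketch of what such a helper could look like, assuming it only needs to yield fixed-size slices:

```python
def chunks(seq, size):
    """Yield successive fixed-size slices of seq (the last one may be shorter)."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

# 60 templates -> pages of 25, 25 and 10, i.e. one Discord embed per page
pages = list(chunks(list(range(60)), 25))
print([len(p) for p in pages])  # [25, 25, 10]
```

Discord embeds are capped at 25 fields, which is presumably why the page size is 25.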
98c28b91f69f483e365aff3baa53ce90ba90427d | 937 | py | Python | aquarius/app/auth_util.py | oceanprotocol/provider-backend | f9e36e3d6b880de548c6b92c38d10d76daf369ba | [
"Apache-2.0"
] | null | null | null | aquarius/app/auth_util.py | oceanprotocol/provider-backend | f9e36e3d6b880de548c6b92c38d10d76daf369ba | [
"Apache-2.0"
] | 1 | 2018-08-15T09:57:01.000Z | 2018-08-15T09:57:01.000Z | aquarius/app/auth_util.py | oceanprotocol/provider-backend | f9e36e3d6b880de548c6b92c38d10d76daf369ba | [
"Apache-2.0"
] | null | null | null | #
# Copyright 2021 Ocean Protocol Foundation
# SPDX-License-Identifier: Apache-2.0
#
from eth_utils import is_address
from web3 import Web3


def sanitize_addresses(addresses):
    return [Web3.toChecksumAddress(a) for a in addresses if is_address(a)]


def compare_eth_addresses(address, checker, logger):
    """
    Compare two addresses and return True if they match.

    :param str address: Address
    :param str checker: Address to compare with
    :param logger: instance of logging
    :return: boolean
    """
    logger.debug("compare_eth_addresses address: %s" % address)
    logger.debug("compare_eth_addresses checker: %s" % checker)
    if not is_address(address):
        logger.debug("Address is not web3 valid")
        return False
    if not is_address(checker):
        logger.debug("Checker is not web3 valid")
        return False
    return Web3.toChecksumAddress(address) == Web3.toChecksumAddress(checker)
| 31.233333 | 77 | 0.71825 | 124 | 937 | 5.330645 | 0.387097 | 0.054463 | 0.086233 | 0.078669 | 0.166415 | 0.075643 | 0 | 0 | 0 | 0 | 0 | 0.01731 | 0.198506 | 937 | 29 | 78 | 32.310345 | 0.86285 | 0.276414 | 0 | 0.142857 | 0 | 0 | 0.180404 | 0.065319 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.142857 | 0.071429 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
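The point of converting both sides through `Web3.toChecksumAddress` before comparing is to make the check case-insensitive: the same account can be written in lowercase, uppercase, or EIP-55 mixed-case form. A rough dependency-free sketch of that comparison logic (using plain lowercasing as the normalizer, which is an assumption — the real helper uses web3's checksum form and `eth_utils.is_address` validation):

```python
def addresses_match(a, b):
    """Illustrative only: normalize both addresses, then compare."""
    def norm(addr):
        # crude stand-in for is_address() + toChecksumAddress()
        return addr.lower() if addr.startswith("0x") and len(addr) == 42 else None
    na, nb = norm(a), norm(b)
    return na is not None and na == nb

mixed = "0xAb5801a7D398351b8bE11C439e05C5B3259aeC9B"
print(addresses_match(mixed, mixed.lower()))         # True
print(addresses_match(mixed, "not-an-address"))      # False
```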
98c9bbdcbfc1d4a76b6ddc9df442f68e0236c7a7 | 519 | py | Python | prayer_times_v2.py | danish09/request_api | 67aac9079cb30fc0069a9273c8b4074122ea4d3b | [
"MIT"
] | null | null | null | prayer_times_v2.py | danish09/request_api | 67aac9079cb30fc0069a9273c8b4074122ea4d3b | [
"MIT"
] | null | null | null | prayer_times_v2.py | danish09/request_api | 67aac9079cb30fc0069a9273c8b4074122ea4d3b | [
"MIT"
] | null | null | null | import json
import requests
from datetime import datetime
from playsound import playsound

tday = datetime.today().strftime('%Y-%m-%d')
right_now = datetime.today().strftime('%I-%M-%p')

response = requests.get("https://www.londonprayertimes.com/api/times/?format=json&key=0239f686-4423-408e-9a0c-7968a403d197&year=&month=")
data = response.json()

for key, value in data.items():
    if value >= '03:30' and value < '06:00':
        print('It is asr time')
#playsound('/home/danish/Downloads/adan.mp3') | 23.590909 | 137 | 0.693642 | 74 | 519 | 4.851351 | 0.72973 | 0.072423 | 0.116992 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0783 | 0.138728 | 519 | 22 | 138 | 23.590909 | 0.724832 | 0.084778 | 0 | 0 | 0 | 0.090909 | 0.315789 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.363636 | 0 | 0.363636 | 0.090909 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
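The script above compares the API's time strings lexicographically, which also runs the check against non-time fields (such as the date entry in the response). A hedged sketch of a more robust check that parses values as real times and silently skips anything that isn't one — the window bounds and `%H:%M` format are assumptions about the API's output:

```python
from datetime import datetime

def in_window(value, start="03:30", end="06:00", fmt="%H:%M"):
    """Return True if value parses as a time inside [start, end)."""
    try:
        t = datetime.strptime(value, fmt).time()
    except ValueError:
        return False  # skip non-time fields such as a date string
    return datetime.strptime(start, fmt).time() <= t < datetime.strptime(end, fmt).time()

print(in_window("04:15"))       # True
print(in_window("06:00"))       # False (end is exclusive)
print(in_window("2021-05-01"))  # False (not a time at all)
```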
98d56156be74bebcd376e40f41b92a8ab49e898e | 5,833 | py | Python | wificontrol/utils/networkstranslate.py | patrislav1/pywificontrol | 1edf9cdb95158804033dba8fcb860e5214ded10f | [
"BSD-3-Clause"
] | 1 | 2019-02-12T14:08:08.000Z | 2019-02-12T14:08:08.000Z | wificontrol/utils/networkstranslate.py | patrislav1/pywificontrol | 1edf9cdb95158804033dba8fcb860e5214ded10f | [
"BSD-3-Clause"
] | null | null | null | wificontrol/utils/networkstranslate.py | patrislav1/pywificontrol | 1edf9cdb95158804033dba8fcb860e5214ded10f | [
"BSD-3-Clause"
] | 2 | 2018-12-05T15:55:22.000Z | 2019-01-28T03:44:21.000Z | # Written by Ivan Sapozhkov and Denis Chagin <denis.chagin@emlid.com>
#
# Copyright (c) 2016, Emlid Limited
# All rights reserved.
#
# Redistribution and use in source and binary forms,
# with or without modification,
# are permitted provided that the following conditions are met:
#
# 1. Redistributions of source code must retain the above copyright notice,
# this list of conditions and the following disclaimer.
#
# 2. Redistributions in binary form must reproduce the above copyright notice,
# this list of conditions and the following disclaimer in the documentation
# and/or other materials provided with the distribution.
#
# 3. Neither the name of the copyright holder nor the names of its contributors
# may be used to endorse or promote products derived from this software
# without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
# THE IMPLIED WARRANTIES OF MERCHANTABILITY AND
# FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
# IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS
# BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY,
# OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED
# AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT,
# STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE,
# EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
def create_security(proto, key_mgmt, group):
    if not proto:
        return 'open'
    if not key_mgmt:
        if "wep" in group:
            return 'wep'
        else:
            return None
    else:
        if "wpa-psk" in key_mgmt:
            if proto == "WPA":
                return "wpapsk"
            elif proto == "RSN":
                return "wpa2psk"
            else:
                return None
        elif "wpa-eap" in key_mgmt:
            return 'wpaeap'
        else:
            return None


def convert_to_wpas_network(network):
    return dict(WpasNetworkConverter(network))


def convert_to_wificontrol_network(network, current_network):
    wifinetwork = dict(WifiControlNetworkConverter(network))
    try:
        if wifinetwork['ssid'] == current_network['ssid']:
            wifinetwork.update(current_network)
            wifinetwork["connected"] = True
    except TypeError:
        pass
    finally:
        return wifinetwork


class WpasNetworkConverter(object):
    def __init__(self, network_dict):
        def rawUtf8(s):
            return "{}".format(s.encode('utf-8'))[2:-1]

        self.security = network_dict.get('security')
        self.name = rawUtf8(network_dict.get('ssid', ''))
        self.password = rawUtf8(network_dict.get('password', ''))
        self.identity = rawUtf8(network_dict.get('identity', ''))

    def __iter__(self):
        if (self.security == 'open'):
            yield "ssid", "{}".format(self.name)
            yield "key_mgmt", "NONE"
        elif (self.security == 'wep'):
            yield "ssid", "{}".format(self.name)
            yield "key_mgmt", "NONE"
            yield "group", "WEP104 WEP40"
            yield "wep_key0", "{}".format(self.password)
        elif (self.security == 'wpapsk'):
            yield "ssid", "{}".format(self.name)
            yield "key_mgmt", "WPA-PSK"
            yield "pairwise", "CCMP TKIP"
            yield "group", "CCMP TKIP"
            yield "eap", "TTLS PEAP TLS"
            yield "psk", "{}".format(self.password)
        elif (self.security == 'wpa2psk'):
            yield "ssid", "{}".format(self.name)
            yield "proto", "RSN"
            yield "key_mgmt", "WPA-PSK"
            yield "pairwise", "CCMP TKIP"
            yield "group", "CCMP TKIP"
            yield "eap", "TTLS PEAP TLS"
            yield "psk", "{}".format(self.password)
        elif (self.security == 'wpaeap'):
            yield "ssid", "{}".format(self.name)
            yield "key_mgmt", "WPA-EAP"
            yield "pairwise", "CCMP TKIP"
            yield "group", "CCMP TKIP"
            yield "eap", "TTLS PEAP TLS"
            yield "identity", "{}".format(self.identity)
            yield "password", "{}".format(self.password)
            yield "phase1", "peaplable=0"
        else:
            yield "ssid", "{}".format(self.name)
            yield "psk", "{}".format(self.password)


class WifiControlNetworkConverter(object):
    def __init__(self, network_dict):
        self.name = network_dict.get('ssid')
        self.key_mgmt = network_dict.get('key_mgmt')
        self.proto = network_dict.get('proto')
        self.group = network_dict.get('group')

    def __iter__(self):
        if (self.key_mgmt == 'NONE'):
            if not self.group:
                yield "ssid", self.name
                yield "security", "Open"
            else:
                yield "ssid", self.name
                yield "security", "WEP"
        elif (self.key_mgmt == 'WPA-PSK'):
            if not self.proto:
                yield "ssid", self.name
                yield "security", "WPA-PSK"
            else:
                yield "ssid", self.name
                yield "security", "WPA2-PSK"
        elif (self.key_mgmt == 'WPA-EAP'):
            yield "ssid", self.name
            yield "security", "WPA-EAP"
        else:
            yield "ssid", self.name
            yield "security", "NONE"
        yield "connected", False


if __name__ == '__main__':
    network = {'ssid': "MySSID", 'password': "NewPassword", 'security': "wpaeap", 'identity': "alex@example.com"}
    conv = convert_to_wpas_network(network)
    # convert_to_wificontrol_network() takes a current_network argument;
    # None is tolerated via the TypeError fallback above
    reconv = convert_to_wificontrol_network(conv, None)
    print(conv, reconv)
| 36.006173 | 113 | 0.603806 | 680 | 5,833 | 5.083824 | 0.301471 | 0.028348 | 0.045126 | 0.032977 | 0.33584 | 0.265548 | 0.214637 | 0.16604 | 0.16604 | 0.125832 | 0 | 0.005967 | 0.281673 | 5,833 | 161 | 114 | 36.229814 | 0.819093 | 0.265044 | 0 | 0.394495 | 0 | 0 | 0.162003 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.073395 | false | 0.073395 | 0 | 0.018349 | 0.192661 | 0.009174 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
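The converter classes above use a compact idiom worth noting: a class whose `__iter__` yields `(key, value)` pairs can be passed straight to `dict()`, which is how `convert_to_wpas_network` turns a converter instance into a wpa_supplicant network block. A self-contained sketch of just that pattern (class and field names here are illustrative, not from the library):

```python
class PairConverter(object):
    """dict() consumes any iterable of 2-tuples, including a custom __iter__."""
    def __init__(self, ssid, security):
        self.ssid = ssid
        self.security = security

    def __iter__(self):
        yield "ssid", self.ssid
        yield "security", self.security

net = dict(PairConverter("MySSID", "wpa2psk"))
print(net)  # {'ssid': 'MySSID', 'security': 'wpa2psk'}
```

The advantage over returning a dict directly is that the branching logic can `yield` fields lazily per security mode, keeping each mode's field list readable.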
98df6d63c240e8262eac8f0396a8b8f0ecd76ac8 | 10,728 | py | Python | PrometheusScrapper/scrapper.py | masterchef/webscraper | f47220e941980e2a6dda593d74696062784062e1 | [
"MIT"
] | null | null | null | PrometheusScrapper/scrapper.py | masterchef/webscraper | f47220e941980e2a6dda593d74696062784062e1 | [
"MIT"
] | null | null | null | PrometheusScrapper/scrapper.py | masterchef/webscraper | f47220e941980e2a6dda593d74696062784062e1 | [
"MIT"
] | null | null | null | import datetime
import getpass
import logging
import os
import pathlib
import platform
import re
import smtplib
import sys
from contextlib import contextmanager
from email.message import EmailMessage
from functools import wraps
import azure.functions as func
import click
import gspread
import pandas as pd
from apscheduler.schedulers.background import BlockingScheduler
from oauth2client.service_account import ServiceAccountCredentials
from selenium import webdriver
from selenium.common.exceptions import TimeoutException
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.ui import WebDriverWait
log = logging.getLogger(__name__)
log.setLevel(logging.DEBUG)
handler = logging.StreamHandler(sys.stdout)
handler.setLevel(logging.DEBUG)
formatter = logging.Formatter('%(asctime)s - %(levelname)s - %(message)s')
handler.setFormatter(formatter)
log.addHandler(handler)
@contextmanager
def get_driver(*args, **kwargs):
    options = Options()
    options.headless = True
    options.add_argument("--window-size=1920,1200")
    options.add_argument('--no-sandbox')
    options.add_argument('--disable-dev-shm-usage')
    options.add_argument('--disable-crash-reporter')
    options.add_argument('--disable-logging')
    options.add_argument('--log-level=3')
    if platform.system() == 'Linux':
        DRIVER_PATH = 'chromedriver'
    elif platform.system() == "Darwin":
        DRIVER_PATH = (pathlib.Path(__file__).parent.parent /
                       'chromedriver').resolve()
    else:
        log.error('Unsupported OS')
        exit(0)
    driver = webdriver.Chrome(options=options, executable_path=DRIVER_PATH)
    yield driver
    driver.close()
    driver.quit()


def get_browser(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        with get_driver() as d:
            kwargs['driver'] = d
            return func(*args, **kwargs)
    return wrapper


@click.group()
@click.option('--email', is_flag=True, help='A flag for sending email with results.')
@click.option('--email_to', help='CSV of email addresses to send notification to.')
@click.option('--username', help='SMTP account username.')
@click.option('--gsheet', is_flag=True, help='A flag for updating google sheet with results')
@click.option('--doc_key', help='Google Doc Key to update')
@click.pass_context
def cli(ctx, email, email_to, username, gsheet, doc_key):
    ctx.ensure_object(dict)
    if email and (not username or not email_to):
        log.error('Please provide email sending parameters')
        exit(0)
    elif email:
        password = getpass.getpass(
            "Please enter your google account password for sending email:\n")
        ctx.obj['password'] = password
    if gsheet and not doc_key:
        log.error('Please provide a gsheet doc key')
        exit(0)


@cli.command('schedule')
@click.option('--hour', default='*/1', help='Cron hour expression')
@click.pass_context
def schedule(ctx, hour):
    email = ctx.parent.params['email']
    username = ctx.parent.params['username']
    email_to = ctx.parent.params['email_to']
    password = ctx.obj.get('password', None)
    gsheet = ctx.parent.params['gsheet']
    doc_key = ctx.parent.params['doc_key']
    schedule = BlockingScheduler()
    schedule.add_job(run, kwargs={"email": email, "gsheet": gsheet, "doc_key": doc_key,
                                  "username": username, "email_to": email_to,
                                  "password": password},
                     trigger='cron', hour=hour)
    try:
        schedule.start()
    except (KeyboardInterrupt, SystemExit):
        schedule.shutdown()


@cli.command('run')
@click.pass_context
def once(ctx):
    email = ctx.parent.params['email']
    gsheet = ctx.parent.params['gsheet']
    username = ctx.parent.params['username']
    email_to = ctx.parent.params['email_to']
    password = ctx.obj.get('password', None)
    doc_key = ctx.parent.params['doc_key']
    run(email, username, email_to, password, gsheet, doc_key)


def run(email, username, email_to, password, gsheet, doc_key):
    log.info('In run')
    content = []
    for link in os.environ["searchLinks"].split():
        content += get_prometheus_apartments(link)
    formatted_content = format_email(content)
    if gsheet:
        log.info('Updating gsheet')
        update_historical_data(doc_key, content)
        formatted_content += 'For historical data click the link below:\nhttps://docs.google.com/spreadsheets/d/1XZocxmyQ91e1exBvwDAaSR8Rhavy9WPnwLSz0Z5SKsM/edit?usp=sharing'
    if email:
        log.info('Sending email')
        send_email(username, password, email_to, formatted_content)
    log.info(content)


@get_browser
def get_prometheus_apartments(url, driver):
    driver.get(url)
    content = []
    log.info(f'Getting apartments: {url}')
    try:
        anchors = driver.find_elements_by_xpath(
            "//div[@id='results-cards']/div/a[@class='card-wrapper']")
    except Exception as e:
        log.exception(f'{e}')
        return content
    links = [a.get_attribute('href') for a in anchors]
    apartments = []
    for apt in links:
        name = apt.strip('/').split('/')[-1]
        apartments.append({'name': name, 'url': f'{apt}lease'})
    # Scrape each apartment sequentially (parallel version kept below for reference)
    for apt in apartments:
        results = get_availability(apt)
        if results:
            content.append(results)
    # with Pool() as pool:
    #     results = [pool.apply_async(get_availability, args=(apt,)) for apt in apartments]
    #     for result in results:
    #         data = result.get()
    #         if data:
    #             content.append(data)
    return content


def update_historical_data(doc_key, content):
    date = datetime.datetime.today().strftime('%Y-%m-%d')
    all_content = []
    for apt in content:
        complex = apt['meta']['name']
        data = apt['data']
        for row in data:
            cleaned_values = [f'{date}', f'{complex}'] + \
                [value.replace('$', '').replace(',', '') for value in row]
            all_content.append(cleaned_values)
    update_gdoc(doc_key, all_content)


def format_email(content):
    result = ''
    for apt in content:
        complex = apt['meta']['name']
        data = apt['data']
        if complex != 'mansion-grove':
            continue
        result += f'------------ {complex} ----------------\n'
        total_available = sum(int(row[-1]) for row in data)
        result += '\n'.join(', '.join(row) for row in data)
        result += f'\nTotal Available: {total_available}\n'
    return result


@get_browser
def get_availability(data, driver):
    """
    Returns apartment availability information
    """
    url = data['url']
    content = []
    log.info(f'Processing {url}')
    driver.get(url)
    delay = 60  # seconds
    try:
        WebDriverWait(driver, delay).until(
            EC.frame_to_be_available_and_switch_to_it('rp-leasing-widget'))
        WebDriverWait(driver, delay).until(EC.presence_of_element_located(
            (By.XPATH, "//button[contains(@class, 'primary')][contains(text(), 'Start')]")))
    except TimeoutException:
        log.info(f'Page did not load: {url}')
        return content
    try:
        driver.find_element_by_xpath(
            "//button[contains(@class, 'primary')][contains(text(), 'Start')]").click()
        WebDriverWait(driver, delay).until(
            EC.presence_of_element_located((By.XPATH, "//div[contains(@class, 'floorplan-tile')]/div/span[contains(@class, 'name')]")))
        # Collect plan names, specs, prices and availability
        names = [n.text for n in driver.find_elements_by_xpath(
            "//div[contains(@class, 'floorplan-tile')]/div/span[contains(@class, 'name')]")]
        specs = [n.text for n in driver.find_elements_by_xpath(
            "//div[contains(@class, 'floorplan-tile')]/div/span[contains(@class, 'specs')]")]
        prices = [n.text for n in driver.find_elements_by_xpath(
            "//div[contains(@class, 'floorplan-tile')]/div/span[contains(@class, 'range')]")]
        availability = [n.text for n in driver.find_elements_by_xpath(
            "//div[contains(@class, 'floorplan-tile')]/div[@class='tile-buttons']/button")]
    except Exception:
        log.exception(f'Unable to parse {url}')
        return content
    for i in range(len(names)):
        match = re.match(
            r'\((\d+)\).*', availability[i]) if len(availability) > i else None
        units = int(match.groups()[0]) if match else '0'
        match = re.match(
            r'(\$\d*)( \- \$\d*\*)*', prices[i].split(' - ')[0].replace(',', '').replace('From ', '')) if len(prices) > i else None
        min_price = match.groups()[0] if match else '$0'
        content.append((names[i], specs[i], min_price, str(units)))
    return {'meta': data, 'data': content}


def send_email(username, password, to, content):
    if not content:
        log.info('Nothing to send')
        return
    msg = EmailMessage()
    msg.set_content(content)
    msg['Subject'] = 'Apartment availability'
    msg['From'] = username
    msg['To'] = to
    # Send the message via our own SMTP server.
    s = smtplib.SMTP_SSL('smtp.gmail.com', 465)
    s.login(username, password)
    s.send_message(msg)
    s.quit()


def update_gdoc(doc_key, cells):
    scope = [
        "https://spreadsheets.google.com/feeds",
        "https://www.googleapis.com/auth/drive",
    ]
    CREDENTIALS_PATH = pathlib.Path(__file__).parent.parent / 'credentials.json'
    credentials = ServiceAccountCredentials.from_json_keyfile_name(
        CREDENTIALS_PATH.resolve(), scope,
    )
    docs = gspread.authorize(credentials)
    sheet = docs.open_by_key(doc_key).sheet1
    new = pd.DataFrame(cells)
    new.columns = ['Date', 'Complex', 'Plan', 'Specs', 'Price', 'Availability']
    existing = pd.DataFrame(sheet.get_all_values()[1:])
    if existing.size:
        existing.columns = ['Date', 'Complex',
                            'Plan', 'Specs', 'Price', 'Availability']
    updated = existing.append(new)
    updated = updated.groupby(['Date', 'Complex', 'Plan', 'Specs']).min()
    updated.reset_index(inplace=True)
    sheet.update([updated.columns.values.tolist()] +
                 updated.values.tolist(), value_input_option='USER_ENTERED')


if __name__ == '__main__':
    cli()


def azurefunc(PrometheusScrapper: func.TimerRequest) -> None:
    email = os.environ["SendEmail"]
    email_to = os.environ["EmailTo"]
    username = os.environ["GmailUsername"]
    password = os.environ["GmailPassword"]
    gsheet = os.environ["UpdateGSheet"]
    doc_key = os.environ["GSheetKey"]
    run(email, username, email_to, password, gsheet, doc_key)
| 34.495177 | 175 | 0.646905 | 1,316 | 10,728 | 5.156535 | 0.259119 | 0.017683 | 0.022104 | 0.014736 | 0.238285 | 0.203655 | 0.180666 | 0.146183 | 0.146183 | 0.113469 | 0 | 0.004236 | 0.207774 | 10,728 | 310 | 176 | 34.606452 | 0.794211 | 0.033464 | 0 | 0.18 | 0 | 0.024 | 0.20785 | 0.055008 | 0 | 0 | 0 | 0 | 0 | 1 | 0.056 | false | 0.072 | 0.096 | 0 | 0.188 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
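The trickiest part of the scraper above is the string parsing in `get_availability`: unit counts arrive as text like `(3) Available` and prices as ranges like `From $2,650 - $3,100*`. That logic is pure `re` and can be exercised without Selenium; the function name below is illustrative, but the two regexes are taken directly from the scraper:

```python
import re

def parse_tile(price_text, availability_text):
    """Extract (min_price, unit_count) from the widget's display strings."""
    m = re.match(r'\((\d+)\).*', availability_text)
    units = int(m.groups()[0]) if m else 0
    # keep only the low end of the range, strip thousands separators and "From "
    cleaned = price_text.split(' - ')[0].replace(',', '').replace('From ', '')
    m = re.match(r'(\$\d*)( \- \$\d*\*)*', cleaned)
    min_price = m.groups()[0] if m else '$0'
    return min_price, units

print(parse_tile('From $2,650 - $3,100*', '(3) Available'))  # ('$2650', 3)
print(parse_tile('$1,995', 'Waitlist'))                      # ('$1995', 0)
```

Keeping parsing separable like this is also what makes it unit-testable without spinning up a headless browser.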
98e581895367116db85fb5bcc24f1ed7b42ed751 | 2,181 | py | Python | bbio/bbio.py | timgates42/PyBBIO | 0d46115059ed7ec0c17afb6dd7ed2f507b4f2b8a | [
"MIT"
] | 102 | 2015-01-29T04:28:49.000Z | 2022-01-03T18:27:50.000Z | bbio/bbio.py | timgates42/PyBBIO | 0d46115059ed7ec0c17afb6dd7ed2f507b4f2b8a | [
"MIT"
] | 62 | 2015-01-29T11:05:13.000Z | 2019-12-03T04:30:34.000Z | bbio/bbio.py | timgates42/PyBBIO | 0d46115059ed7ec0c17afb6dd7ed2f507b4f2b8a | [
"MIT"
] | 58 | 2015-02-10T14:31:18.000Z | 2022-03-29T13:24:03.000Z | """
PyBBIO - bbio.py
Copyright (c) 2012-2015 - Alexander Hiam <alex@graycat.io>
Released under the MIT license
https://github.com/graycatlabs/PyBBIO
"""
import sys, atexit
from .platform import platform_init, platform_cleanup
from .common import ADDITIONAL_CLEANUP, util_init
def bbio_init():
  """ Pre-run initialization, i.e. starting module clocks, etc. """
  util_init()
  platform_init()


def bbio_cleanup():
  """ Post-run cleanup, i.e. stopping module clocks, etc. """
  # Run user cleanup routines:
  for cleanup in ADDITIONAL_CLEANUP:
    try:
      cleanup()
    except Exception as e:
      # Something went wrong with one of the cleanup routines, but we
      # want to keep going; just print the error and continue
      print "*Exception raised trying to call cleanup routine '%s':\n %s" %\
            (cleanup, e)
  platform_cleanup()


# The following code detects if Python is running interactively,
# and if so initializes PyBBIO on import and registers PyBBIO's
# cleanup to be called at exit, otherwise it defines the run() and
# stop() methods for the file based control flow:
import __main__
if not hasattr(__main__, '__file__'):
  # We're in the interpreter, see:
  # http://stackoverflow.com/questions/2356399/tell-if-python-is-in-interactive-mode
  bbio_init()
  print "PyBBIO initialized"

  def interactive_cleanup():
    bbio_cleanup()
    print "Finished PyBBIO cleanup"
  atexit.register(interactive_cleanup)

else:
  bbio_init()
  atexit.register(bbio_cleanup)

  # Imported in a Python file, define run() and stop():
  def run(setup, loop):
    """ The main loop; must be passed a setup and a loop function.
    First the setup function will be called once, then the loop
    function will be called continuously until a stop signal is
    raised, e.g. CTRL-C or a call to the stop() function from
    within the loop. """
    try:
      setup()
      while (True):
        loop()
    except KeyboardInterrupt:
      # Manual exit signal, clean up and exit happy
      exit(0)

  def stop():
    """ Preferred way for a program to stop itself. """
    raise KeyboardInterrupt  # Expected happy stop condition in run()
| 32.073529 | 85 | 0.692343 | 305 | 2,181 | 4.859016 | 0.495082 | 0.016194 | 0.014845 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009373 | 0.217332 | 2,181 | 67 | 86 | 32.552239 | 0.858817 | 0.2884 | 0 | 0.117647 | 0 | 0 | 0.119256 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.117647 | null | null | 0.088235 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
98ee2fa044a20258e55e590fef0af310684f4e34 | 433 | py | Python | tests/unit_tests/cx_core/integration/integration_test.py | clach04/controllerx | b5cd92d3371c352c50f7d5ba7dae4538d7c15dfe | [
"MIT"
] | 204 | 2020-01-18T10:12:13.000Z | 2022-03-27T09:40:17.000Z | tests/unit_tests/cx_core/integration/integration_test.py | clach04/controllerx | b5cd92d3371c352c50f7d5ba7dae4538d7c15dfe | [
"MIT"
] | 329 | 2020-01-17T17:18:53.000Z | 2022-03-29T11:20:30.000Z | tests/unit_tests/cx_core/integration/integration_test.py | clach04/controllerx | b5cd92d3371c352c50f7d5ba7dae4538d7c15dfe | [
"MIT"
] | 66 | 2020-01-19T20:17:21.000Z | 2022-03-13T15:03:41.000Z | from cx_core import integration as integration_module
from cx_core.controller import Controller
def test_get_integrations(fake_controller: Controller):
    integrations = integration_module.get_integrations(fake_controller, {})
    integration_names = {i.name for i in integrations}
    assert integration_names == {
        "z2m",
        "zha",
        "deconz",
        "state",
        "mqtt",
        "lutron_caseta",
    }
| 27.0625 | 75 | 0.678984 | 46 | 433 | 6.130435 | 0.586957 | 0.042553 | 0.070922 | 0.205674 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003012 | 0.233256 | 433 | 15 | 76 | 28.866667 | 0.846386 | 0 | 0 | 0 | 0 | 0 | 0.078522 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 1 | 0.076923 | false | 0 | 0.153846 | 0 | 0.230769 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
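The test above collapses the returned integration objects to a set of names before asserting, so neither ordering nor duplicates can cause a spurious failure. A tiny standalone illustration of that pattern, with a `namedtuple` standing in for the real integration objects (an assumption — the library's classes carry more than a name):

```python
from collections import namedtuple

Integration = namedtuple("Integration", "name")

def get_names(integrations):
    # Set comprehension: order-insensitive, duplicate-insensitive comparison key
    return {i.name for i in integrations}

found = get_names([Integration("z2m"), Integration("zha"), Integration("z2m")])
print(found == {"z2m", "zha"})  # True
```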
98fc678951f86f4c4317fc775c6ba763f66da302 | 8,717 | py | Python | ambari-server/src/test/python/stacks/2.3/ATLAS/test_metadata_server.py | gcxtx/ambari | 133d9c4661b21182482c25f96c3f0bf0a9740a9f | [
"Apache-2.0"
] | 1 | 2021-05-06T06:24:04.000Z | 2021-05-06T06:24:04.000Z | ambari-server/src/test/python/stacks/2.3/ATLAS/test_metadata_server.py | gcxtx/ambari | 133d9c4661b21182482c25f96c3f0bf0a9740a9f | [
"Apache-2.0"
] | null | null | null | ambari-server/src/test/python/stacks/2.3/ATLAS/test_metadata_server.py | gcxtx/ambari | 133d9c4661b21182482c25f96c3f0bf0a9740a9f | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
'''
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
'''
from mock.mock import MagicMock, call, patch
from stacks.utils.RMFTestCase import *
import json
import sys
from only_for_platform import not_for_platform, PLATFORM_WINDOWS
@not_for_platform(PLATFORM_WINDOWS)
class TestMetadataServer(RMFTestCase):
  COMMON_SERVICES_PACKAGE_DIR = "ATLAS/0.1.0.2.3/package"
  STACK_VERSION = "2.3"

  def configureResourcesCalled(self):
    self.assertResourceCalled('Directory', '/var/run/atlas',
                              owner='atlas',
                              group='hadoop',
                              create_parents=True,
                              cd_access='a',
                              mode=0755
    )
    self.assertResourceCalled('Directory', '/etc/atlas/conf',
                              owner='atlas',
                              group='hadoop',
                              create_parents=True,
                              cd_access='a',
                              mode=0755
    )
    self.assertResourceCalled('Directory', '/var/log/atlas',
                              owner='atlas',
                              group='hadoop',
                              create_parents=True,
                              cd_access='a',
                              mode=0755
    )
    self.assertResourceCalled('Directory', '/usr/hdp/current/atlas-server/hbase/logs',
                              owner='atlas',
                              group='hadoop',
                              create_parents=True,
                              cd_access='a',
                              mode=0755
    )
    self.assertResourceCalled('Directory', '/usr/hdp/current/atlas-server/data',
                              owner='atlas',
                              group='hadoop',
                              create_parents=True,
                              cd_access='a',
                              mode=0755
    )
    self.assertResourceCalled('Directory', '/usr/hdp/current/atlas-server/data',
                              owner='atlas',
                              group='hadoop',
                              create_parents=True,
                              cd_access='a',
                              mode=0644
    )
    self.assertResourceCalled('Directory', '/usr/hdp/current/atlas-server/server/webapp',
                              owner='atlas',
                              group='hadoop',
                              create_parents=True,
                              cd_access='a',
                              mode=0644
    )
    self.assertResourceCalled('File', '/usr/hdp/current/atlas-server/server/webapp/atlas.war',
                              content=StaticFile('/usr/hdp/current/atlas-server/server/webapp/atlas.war'),
    )
    appprops = dict(self.getConfig()['configurations']['application-properties'])
    appprops['atlas.server.bind.address'] = 'c6401.ambari.apache.org'
    self.assertResourceCalled('PropertiesFile',
                              '/etc/atlas/conf/application.properties',
                              properties=appprops,
                              owner='atlas',
                              group='hadoop',
                              mode=0644,
    )
    self.assertResourceCalled('File', '/etc/atlas/conf/atlas-env.sh',
                              content=InlineTemplate(
                                  self.getConfig()['configurations']['atlas-env']['content']),
                              owner='atlas',
                              group='hadoop',
                              mode=0755,
    )
    self.assertResourceCalled('File', '/etc/atlas/conf/atlas-log4j.xml',
                              content=InlineTemplate(
                                  self.getConfig()['configurations']['atlas-log4j']['content']),
                              owner='atlas',
                              group='hadoop',
                              mode=0644,
    )
    self.assertResourceCalled('File', '/etc/atlas/conf/users-credentials.properties',
                              content=StaticFile('users-credentials.properties'),
                              owner='atlas',
                              group='hadoop',
                              mode=0644,
    )
    self.assertResourceCalled('File', '/etc/atlas/conf/policy-store.txt',
                              content=StaticFile('policy-store.txt'),
                              owner='atlas',
                              group='hadoop',
                              mode=0644,
    )
    self.assertResourceCalled('XmlConfig', 'hbase-site.xml',
                              owner='atlas',
                              group='hadoop',
                              conf_dir='/usr/hdp/current/atlas-server/hbase/conf',
                              configurations=self.getConfig()['configurations']['atlas-hbase-site'],
                              configuration_attributes=self.getConfig()['configuration_attributes']['atlas-hbase-site']
    )

  def test_configure_default(self):
    self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + "/scripts/metadata_server.py",
                       classname="MetadataServer",
                       command="configure",
                       config_file="default.json",
                       stack_version=self.STACK_VERSION,
                       target=RMFTestCase.TARGET_COMMON_SERVICES
    )
    self.configureResourcesCalled()
    self.assertNoMoreResources()

  def test_configure_secure(self):
    self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + "/scripts/metadata_server.py",
                       classname="MetadataServer",
                       command="configure",
                       config_file="secure.json",
                       stack_version=self.STACK_VERSION,
                       target=RMFTestCase.TARGET_COMMON_SERVICES
    )
    self.configureResourcesCalled()
    self.assertResourceCalled('TemplateConfig', '/etc/atlas/conf/atlas_jaas.conf',
                              owner='atlas',
    )
    self.assertNoMoreResources()

  def test_start_default(self):
    self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + "/scripts/metadata_server.py",
                       classname="MetadataServer",
                       command="start",
                       config_file="default.json",
                       stack_version=self.STACK_VERSION,
                       target=RMFTestCase.TARGET_COMMON_SERVICES
    )
    self.configureResourcesCalled()
    self.assertResourceCalled('Execute', 'source /etc/atlas/conf/atlas-env.sh ; /usr/hdp/current/atlas-server/bin/atlas_start.py',
                              not_if='ls /var/run/atlas/atlas.pid >/dev/null 2>&1 && ps -p `cat /var/run/atlas/atlas.pid` >/dev/null 2>&1',
                              user='atlas',
    )

  def test_stop_default(self):
    self.executeScript(self.COMMON_SERVICES_PACKAGE_DIR + "/scripts/metadata_server.py",
                       classname="MetadataServer",
                       command="stop",
                       config_file="default.json",
                       stack_version=self.STACK_VERSION,
                       target=RMFTestCase.TARGET_COMMON_SERVICES
    )
    self.assertResourceCalled('Execute', 'source /etc/atlas/conf/atlas-env.sh; /usr/hdp/current/atlas-server/bin/atlas_stop.py',
                              user='atlas',
    )
    self.assertResourceCalled('File', '/var/run/atlas/atlas.pid',
                              action=['delete'],
    )
| 46.367021 | 141 | 0.494207 | 725 | 8,717 | 5.835862 | 0.256552 | 0.102104 | 0.046088 | 0.064524 | 0.597967 | 0.578823 | 0.535334 | 0.520681 | 0.500355 | 0.46632 | 0 | 0.013406 | 0.409545 | 8,717 | 187 | 142 | 46.614973 | 0.808626 | 0.002294 | 0 | 0.547771 | 0 | 0.019108 | 0.212278 | 0.123409 | 0 | 0 | 0 | 0 | 0.127389 | 0 | null | null | 0 | 0.031847 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
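One detail worth isolating from the test above: before asserting on `application.properties`, it copies the stack configuration into a fresh dict and overrides `atlas.server.bind.address` with the current host, leaving the shared config untouched. A minimal standalone sketch of that copy-then-override pattern (the helper name and sample keys are invented for illustration):

```python
def build_app_properties(configured, hostname):
    """Copy the stack config and pin the bind address to this host."""
    props = dict(configured)  # shallow copy so the source dict is not mutated
    props['atlas.server.bind.address'] = hostname
    return props

base = {'atlas.enableTLS': 'false', 'atlas.server.bind.address': 'localhost'}
props = build_app_properties(base, 'c6401.ambari.apache.org')
print(props['atlas.server.bind.address'])  # c6401.ambari.apache.org
print(base['atlas.server.bind.address'])   # localhost (original untouched)
```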
c70662701931e0df30976bfadaca0ac6c230e738 | 1,401 | py | Python | Day3/Day3.py | ErAgOn-AmAnSiRoHi/Advent-of-Code-2021 | 0f0d59483d93f6fce4aa06fb36101aea08b02fc3 | [
"MIT"
] | null | null | null | Day3/Day3.py | ErAgOn-AmAnSiRoHi/Advent-of-Code-2021 | 0f0d59483d93f6fce4aa06fb36101aea08b02fc3 | [
"MIT"
] | null | null | null | Day3/Day3.py | ErAgOn-AmAnSiRoHi/Advent-of-Code-2021 | 0f0d59483d93f6fce4aa06fb36101aea08b02fc3 | [
"MIT"
] | null | null | null | with open("inputday3.txt") as f:
    data = [x for x in f.read().split()]

gamma = ""
epsilon = ""
for b in range(0, len(data[0])):
    one = 0
    zero = 0
    for c in range(0, len(data)):
        if data[c][b] == '0':
            zero += 1
        else:
            one += 1
    if zero > one:
        gamma += '0'
        epsilon += '1'
    else:
        gamma += '1'
        epsilon += '0'

g = int(gamma, 2)
e = int(epsilon, 2)
print("PART 1", g * e)

gamma = ""
epsilon = ""
data2 = data.copy()
index = 0
while len(data) > 1:
    one = 0
    zero = 0
    ones = []
    zeroes = []
    for c in range(0, len(data)):
        if data[c][index] == "0":
            zero += 1
            zeroes.append(data[c])
        else:
            one += 1
            ones.append(data[c])
    if zero > one:
        data = zeroes
    else:
        data = ones
    index += 1

oxygen = int(data[0], 2)

data = data2
index = 0
while len(data) > 1:
    one = 0
    zero = 0
    ones = []
    zeroes = []
    for c in range(0, len(data)):
        if data[c][index] == '0':
            zero += 1
            zeroes.append(data[c])
        else:
            one += 1
            ones.append(data[c])
    if one < zero:
        data = ones
    else:
        data = zeroes
    index += 1

co2 = int(data[0], 2)
print("PART 2", oxygen * co2)
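The per-position bit counting in PART 1 can be condensed by iterating column-wise with `zip`. A minimal standalone sketch (the helper name `gamma_epsilon` is illustrative, not from the original), using the well-known AoC 2021 day 3 sample input:

```python
from collections import Counter

def gamma_epsilon(data):
    """Compute the (gamma, epsilon) rates from equal-length bit strings."""
    gamma = ""
    for bits in zip(*data):                 # iterate over bit positions, column-wise
        counts = Counter(bits)
        gamma += max("10", key=counts.get)  # most common bit; ties resolve to '1',
                                            # matching the else-branch in the loop above
    epsilon = "".join('1' if b == '0' else '0' for b in gamma)
    return int(gamma, 2), int(epsilon, 2)

sample = ["00100", "11110", "10110", "10111", "10101", "01111",
          "00111", "11100", "10000", "11001", "00010", "01010"]
g, e = gamma_epsilon(sample)
print(g * e)  # 198 for the sample input (gamma=22, epsilon=9)
```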
# ==== keras2pytorch_dataset.py (MPCAICDM/MPCA, MIT) ====

from __future__ import print_function
from PIL import Image
import os
import os.path
import numpy as np
import sys
from misc import AverageMeter
from eval_accuracy import simple_accuracy

if sys.version_info[0] == 2:
    import cPickle as pickle
else:
    import pickle

import torch.utils.data as data
import torch
from multiprocessing import Value


def softmax(input_tensor):
    act = torch.nn.Softmax(dim=1)
    return act(input_tensor).numpy()
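The `softmax` helper above just wraps `torch.nn.Softmax(dim=1)` and converts to NumPy. For reference, the row-wise computation it performs can be sketched in plain Python (a dependency-free illustration, not the module's implementation):

```python
import math

def softmax_rows(matrix):
    """Row-wise softmax: exponentiate (shifted for stability) and normalize."""
    out = []
    for row in matrix:
        m = max(row)  # subtracting the row max avoids overflow in exp()
        exps = [math.exp(x - m) for x in row]
        s = sum(exps)
        out.append([e / s for e in exps])
    return out

probs = softmax_rows([[1.0, 1.0], [0.0, math.log(3)]])
print(probs)  # approximately [[0.5, 0.5], [0.25, 0.75]]
```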
class dataset_pytorch(data.Dataset):
    def __init__(self, train_data, train_labels, test_data, test_labels, train=True,
                 transform=None, target_transform=None):
        self.transform = transform
        self.target_transform = target_transform
        self.train = train  # training set or test set
        self.train_data = train_data  # ndarray
        self.train_labels = train_labels
        self.test_data = test_data
        self.test_labels = test_labels

    def __getitem__(self, index):
        """
        Args:
            index (int): Index

        Returns:
            tuple: (image, target) where target is index of the target class.
        """
        if self.train:
            img, target = self.train_data[index], self.train_labels[index]
        else:
            img, target = self.test_data[index], self.test_labels[index]

        # doing this so that it is consistent with all other datasets
        # to return a PIL Image
        img = Image.fromarray(img)

        if self.transform is not None:
            img = self.transform(img)

        if self.target_transform is not None:
            target = self.target_transform(target)

        return img, target

    def __len__(self):
        if self.train:
            return len(self.train_data)
        else:
            return len(self.test_data)

    def __repr__(self):
        fmt_str = 'Dataset ' + self.__class__.__name__ + '\n'
        fmt_str += '    Number of datapoints: {}\n'.format(self.__len__())
        tmp = 'train' if self.train is True else 'test'
        fmt_str += '    Split: {}\n'.format(tmp)
        # this class never sets `root`; guard against AttributeError
        fmt_str += '    Root Location: {}\n'.format(getattr(self, 'root', 'n/a'))
        tmp = '    Transforms (if any): '
        fmt_str += '{0}{1}\n'.format(tmp, self.transform.__repr__().replace('\n', '\n' + ' ' * len(tmp)))
        tmp = '    Target Transforms (if any): '
        fmt_str += '{0}{1}'.format(tmp, self.target_transform.__repr__().replace('\n', '\n' + ' ' * len(tmp)))
        return fmt_str
class transformer_score_dataset(data.Dataset):
    def __init__(self, train_data, train_labels, data_transformer, aux_labels=None, transform=None,
                 target_transform=None, train_sequential=False):
        self.transform = transform
        self.target_transform = target_transform
        self.train_data = train_data
        self.train_labels = train_labels
        self.aux_labels = aux_labels
        self.transfomer = data_transformer
        self.n_transforms = self.transfomer.n_transforms
        self.train_sequential = train_sequential
        if train_sequential:
            self.length = self.train_data.shape[0]
            self.transform_idx = 0
            self.iter_count = Value('i', 0)
        else:
            self.length = self.train_data.shape[0] * self.transfomer.n_transforms
        assert self.length == len(self.train_labels)

    def __len__(self):
        return self.length

    def __getitem__(self, idx):
        if self.train_sequential:
            with self.iter_count.get_lock():
                self.iter_count.value += 1
                if self.iter_count.value == self.length:
                    self.transform_idx = (self.transform_idx + 1) % self.n_transforms
                    self.iter_count.value = 0
                image_idx, transform_idx = idx, self.transform_idx
                nidx = image_idx * self.n_transforms + transform_idx
        else:
            image_idx, transform_idx = idx // self.n_transforms, idx % self.n_transforms
            nidx = idx
        img, target = self.transfomer.transform_one(self.train_data[image_idx], transform_idx).copy(), self.train_labels[nidx]

        if self.transform is not None:
            img = self.transform(img)
        if self.target_transform is not None:
            target = self.target_transform(target)
        if self.aux_labels is not None:
            return img, (target, self.aux_labels[idx])
        return img, target
class transformer_dataset(data.Dataset):
    def __init__(self, train_data, train_labels, data_transformer, aux_labels=None, transform=None,
                 target_transform=None, train_sequential=False, is_padding=False):
        self.transform = transform
        self.target_transform = target_transform
        self.train_data = train_data
        self.train_labels = train_labels
        self.aux_labels = aux_labels
        self.transfomer = data_transformer
        self.n_transforms = self.transfomer.n_transforms
        self.train_sequential = train_sequential
        self.is_padding = is_padding
        if train_sequential:
            self.length = self.train_data.shape[0]
            self.transform_idx = 0
            self.iter_count = Value('i', 0)
        else:
            self.length = self.train_data.shape[0] * self.transfomer.n_transforms
        assert self.length == len(self.train_labels)

    def __len__(self):
        return self.length

    def __getitem__(self, idx):
        if self.train_sequential:
            with self.iter_count.get_lock():
                self.iter_count.value += 1
                if self.iter_count.value == self.length:
                    self.transform_idx = (self.transform_idx + 1) % self.n_transforms
                    self.iter_count.value = 0
                image_idx, transform_idx = idx, self.transform_idx
                nidx = image_idx * self.n_transforms + transform_idx
        else:
            image_idx, transform_idx = idx // self.n_transforms, idx % self.n_transforms
            nidx = idx
        if self.is_padding:
            img = np.pad(self.train_data[image_idx].copy(), ((2, 2), (2, 2), (0, 0)), 'constant')
            # print(img.shape)
            img, target = self.transfomer.transform_one(img, transform_idx).copy(), self.train_labels[nidx]
        else:
            img, target = self.transfomer.transform_one(self.train_data[image_idx], transform_idx).copy(), self.train_labels[nidx]

        if self.transform is not None:
            img = self.transform(img)
        if self.target_transform is not None:
            target = self.target_transform(target)
        if self.aux_labels is not None:
            return img, (target, self.aux_labels[idx])
        return img, target
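Both transformer datasets map a flat index onto an (image, transform) pair with the same floor-division/modulo arithmetic, and the sequential branch uses the inverse mapping for `nidx`. The arithmetic can be checked in isolation (a standalone sketch with illustrative helper names, not part of the original module):

```python
def split_index(idx, n_transforms):
    """Flat dataset index -> (image_idx, transform_idx), as in __getitem__ above."""
    return idx // n_transforms, idx % n_transforms

def flat_index(image_idx, transform_idx, n_transforms):
    """Inverse mapping, as used for `nidx` in the sequential branch."""
    return image_idx * n_transforms + transform_idx

n = 8  # e.g. 8 geometric transforms per image
for idx in range(4 * n):
    img, t = split_index(idx, n)
    assert flat_index(img, t, n) == idx  # the two mappings are inverses

print(split_index(19, 8))  # (2, 3): image 2, transform 3
```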
class h5idx_dataset(data.Dataset):
    def __init__(self, train_index, train_labels, total_data, aux_labels=None, transform=None, target_transform=None):
        self.transform = transform
        self.target_transform = target_transform
        self.train_index = train_index  # just an index
        self.train_labels = train_labels
        self.aux_labels = aux_labels
        self.total_data = total_data
        self.length = self.train_index.shape[0] * self.total_data.shape[1]
        self.n_transform = self.total_data.shape[1]
        assert self.length == len(self.train_labels)

    def __len__(self):
        return self.length

    def __getitem__(self, idx):
        image_idx, transform_idx = idx // self.n_transform, idx % self.n_transform
        img, target = np.array(self.total_data[self.train_index[image_idx], transform_idx, :]), self.train_labels[idx]
        if self.transform is not None:
            img = self.transform(img)
        if self.target_transform is not None:
            target = self.target_transform(target)
        if self.aux_labels is not None:
            return img, (target, self.aux_labels[idx])
        return img, target
class trainset_pytorch(data.Dataset):
    def __init__(self, train_data, train_labels, aux_labels=None, transform=None, target_transform=None):
        self.transform = transform
        self.target_transform = target_transform
        self.train_data = train_data  # ndarray
        self.train_labels = train_labels
        self.aux_labels = aux_labels

    def __getitem__(self, index):
        """
        Args:
            index (int): Index

        Returns:
            tuple: (image, target) where target is index of the target class.
        """
        img, target = self.train_data[index], self.train_labels[index]

        # doing this so that it is consistent with all other datasets
        # to return a PIL Image
        # img = Image.fromarray(img)  # used if the img is [H, W, C] and the dtype is uint8

        if self.transform is not None:
            img = self.transform(img)
        if self.target_transform is not None:
            target = self.target_transform(target)
        if self.aux_labels is not None:
            return img, (target, self.aux_labels[index])
        return img, target

    def __len__(self):
        return len(self.train_data)
class testset_pytorch(data.Dataset):
    def __init__(self, test_data, transform=None):
        self.transform = transform
        self.test_data = test_data  # ndarray

    def __getitem__(self, index):
        """
        Args:
            index (int): Index

        Returns:
            image: the (optionally transformed) test sample; no target is returned.
        """
        img = self.test_data[index]

        # doing this so that it is consistent with all other datasets
        # to return a PIL Image
        # img = Image.fromarray(img)

        if self.transform is not None:
            img = self.transform(img)
        return img

    def __len__(self):
        return len(self.test_data)
class dataset_reorganized(data.Dataset):
    def __init__(self, data, transform=None):
        self.transform = transform
        self.data = data  # ndarray

    def __getitem__(self, index):
        """
        Args:
            index (int): Index

        Returns:
            Tensor: the stacked, transformed views of sample `index`.
        """
        imgs = self.data[index]
        if self.transform is not None:
            new_imgs = []
            for i in range(imgs.shape[0]):
                img = imgs[i]
                img = self.transform(img)
                new_imgs.append(img.unsqueeze(0))
            new_imgs = torch.cat(new_imgs, dim=0)
        else:
            raise NotImplementedError
        return new_imgs

    def __len__(self):
        return len(self.data)
def train_reorganized(trainloader, model, criterion, optimizer, epochs):
    # train the model
    model.train()
    top1 = AverageMeter()
    losses = AverageMeter()
    for epoch in range(epochs):
        for batch_idx, (inputs) in enumerate(trainloader):
            targets = torch.LongTensor(np.tile(np.arange(inputs.size(1)), inputs.size(0)))
            inputs = inputs.reshape(-1, inputs.size(-3), inputs.size(-2), inputs.size(-1))
            inputs, targets = torch.autograd.Variable(inputs.cuda()), torch.autograd.Variable(targets.cuda())
            outputs, _ = model(inputs)
            loss = criterion(outputs, targets)

            prec1 = simple_accuracy(outputs.data.cpu(), targets.data.cpu())
            top1.update(prec1, inputs.size(0))
            losses.update(loss.data.cpu(), inputs.size(0))

            # compute gradient and do SGD step
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

            if batch_idx % 10 == 0:
                print('Epoch: [{} | {}], batch: {}, loss: {}, Accuracy: {}'.format(epoch + 1, epochs, batch_idx + 1, losses.avg, top1.avg))
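In `train_reorganized`, each batch element contributes one sample per transform, so the label tensor is just `0..T-1` repeated once per image, which is what `np.tile(np.arange(T), B)` produces. A dependency-free sketch of that label layout (illustrative helper, not part of the module):

```python
def make_targets(batch_size, n_transforms):
    """Pure-Python equivalent of np.tile(np.arange(n_transforms), batch_size)."""
    return [t for _ in range(batch_size) for t in range(n_transforms)]

print(make_targets(2, 4))  # [0, 1, 2, 3, 0, 1, 2, 3]
```

Each reshaped input row i then lines up with label `i % n_transforms`, i.e. the index of the transform that produced it.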
def test_reorganized(testloader, model):
    model.eval()
    res = torch.Tensor()
    for batch_idx, (inputs) in enumerate(testloader):
        inputs = inputs.reshape(-1, inputs.size(-3), inputs.size(-2), inputs.size(-1))
        inputs = torch.autograd.Variable(inputs.cuda())
        outputs, _ = model(inputs)
        res = torch.cat((res, outputs.data.cpu()), dim=0)
    return res


def get_scores(outputs, targets):
    scores = []
    for i in range(outputs.shape[0]):
        scores.append(outputs[i, targets[i]])
    return np.array(scores)

# ==== sdk/python/pulumi_azure_nextgen/marketplace/private_store_offer.py (pulumi/pulumi-azure-nextgen, Apache-2.0) ====

# coding=utf-8
# *** WARNING: this file was generated by the Pulumi SDK Generator. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***

import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union
from .. import _utilities, _tables
from . import outputs
from ._enums import *
from ._inputs import *

__all__ = ['PrivateStoreOffer']


class PrivateStoreOffer(pulumi.CustomResource):
    def __init__(__self__,
                 resource_name: str,
                 opts: Optional[pulumi.ResourceOptions] = None,
                 e_tag: Optional[pulumi.Input[str]] = None,
                 icon_file_uris: Optional[pulumi.Input[Mapping[str, pulumi.Input[str]]]] = None,
                 offer_id: Optional[pulumi.Input[str]] = None,
                 plans: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['PlanArgs']]]]] = None,
                 private_store_id: Optional[pulumi.Input[str]] = None,
                 specific_plan_ids_limitation: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
                 update_suppressed_due_idempotence: Optional[pulumi.Input[bool]] = None,
                 __props__=None,
                 __name__=None,
                 __opts__=None):
        """
        The privateStore offer data structure.
        API Version: 2020-01-01.

        :param str resource_name: The name of the resource.
        :param pulumi.ResourceOptions opts: Options for the resource.
        :param pulumi.Input[str] e_tag: Identifier for purposes of race condition
        :param pulumi.Input[Mapping[str, pulumi.Input[str]]] icon_file_uris: Icon File Uris
        :param pulumi.Input[str] offer_id: The offer ID to update or delete
        :param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['PlanArgs']]]] plans: Offer plans
        :param pulumi.Input[str] private_store_id: The store ID - must use the tenant ID
        :param pulumi.Input[Sequence[pulumi.Input[str]]] specific_plan_ids_limitation: Plan ids limitation for this offer
        :param pulumi.Input[bool] update_suppressed_due_idempotence: Indicating whether the offer was not updated to db (true = not updated). If the allow list is identical to the existed one in db, the offer would not be updated.
        """
        if __name__ is not None:
            warnings.warn("explicit use of __name__ is deprecated", DeprecationWarning)
            resource_name = __name__
        if __opts__ is not None:
            warnings.warn("explicit use of __opts__ is deprecated, use 'opts' instead", DeprecationWarning)
            opts = __opts__
        if opts is None:
            opts = pulumi.ResourceOptions()
        if not isinstance(opts, pulumi.ResourceOptions):
            raise TypeError('Expected resource options to be a ResourceOptions instance')
        if opts.version is None:
            opts.version = _utilities.get_version()
        if opts.id is None:
            if __props__ is not None:
                raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
            __props__ = dict()

            __props__['e_tag'] = e_tag
            __props__['icon_file_uris'] = icon_file_uris
            __props__['offer_id'] = offer_id
            __props__['plans'] = plans
            if private_store_id is None and not opts.urn:
                raise TypeError("Missing required property 'private_store_id'")
            __props__['private_store_id'] = private_store_id
            __props__['specific_plan_ids_limitation'] = specific_plan_ids_limitation
            __props__['update_suppressed_due_idempotence'] = update_suppressed_due_idempotence
            __props__['created_at'] = None
            __props__['modified_at'] = None
            __props__['name'] = None
            __props__['offer_display_name'] = None
            __props__['publisher_display_name'] = None
            __props__['type'] = None
            __props__['unique_offer_id'] = None
        alias_opts = pulumi.ResourceOptions(aliases=[pulumi.Alias(type_="azure-nextgen:marketplace/latest:PrivateStoreOffer"), pulumi.Alias(type_="azure-nextgen:marketplace/v20200101:PrivateStoreOffer")])
        opts = pulumi.ResourceOptions.merge(opts, alias_opts)
        super(PrivateStoreOffer, __self__).__init__(
            'azure-nextgen:marketplace:PrivateStoreOffer',
            resource_name,
            __props__,
            opts)
    @staticmethod
    def get(resource_name: str,
            id: pulumi.Input[str],
            opts: Optional[pulumi.ResourceOptions] = None) -> 'PrivateStoreOffer':
        """
        Get an existing PrivateStoreOffer resource's state with the given name, id, and optional extra
        properties used to qualify the lookup.

        :param str resource_name: The unique name of the resulting resource.
        :param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
        :param pulumi.ResourceOptions opts: Options for the resource.
        """
        opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))

        __props__ = dict()

        return PrivateStoreOffer(resource_name, opts=opts, __props__=__props__)
    @property
    @pulumi.getter(name="createdAt")
    def created_at(self) -> pulumi.Output[str]:
        """
        Private store offer creation date
        """
        return pulumi.get(self, "created_at")

    @property
    @pulumi.getter(name="eTag")
    def e_tag(self) -> pulumi.Output[Optional[str]]:
        """
        Identifier for purposes of race condition
        """
        return pulumi.get(self, "e_tag")

    @property
    @pulumi.getter(name="iconFileUris")
    def icon_file_uris(self) -> pulumi.Output[Optional[Mapping[str, str]]]:
        """
        Icon File Uris
        """
        return pulumi.get(self, "icon_file_uris")

    @property
    @pulumi.getter(name="modifiedAt")
    def modified_at(self) -> pulumi.Output[str]:
        """
        Private store offer modification date
        """
        return pulumi.get(self, "modified_at")

    @property
    @pulumi.getter
    def name(self) -> pulumi.Output[str]:
        """
        The name of the resource.
        """
        return pulumi.get(self, "name")

    @property
    @pulumi.getter(name="offerDisplayName")
    def offer_display_name(self) -> pulumi.Output[str]:
        """
        It will be displayed prominently in the marketplace
        """
        return pulumi.get(self, "offer_display_name")

    @property
    @pulumi.getter
    def plans(self) -> pulumi.Output[Optional[Sequence['outputs.PlanResponse']]]:
        """
        Offer plans
        """
        return pulumi.get(self, "plans")

    @property
    @pulumi.getter(name="privateStoreId")
    def private_store_id(self) -> pulumi.Output[str]:
        """
        Private store unique id
        """
        return pulumi.get(self, "private_store_id")

    @property
    @pulumi.getter(name="publisherDisplayName")
    def publisher_display_name(self) -> pulumi.Output[str]:
        """
        Publisher name that will be displayed prominently in the marketplace
        """
        return pulumi.get(self, "publisher_display_name")

    @property
    @pulumi.getter(name="specificPlanIdsLimitation")
    def specific_plan_ids_limitation(self) -> pulumi.Output[Optional[Sequence[str]]]:
        """
        Plan ids limitation for this offer
        """
        return pulumi.get(self, "specific_plan_ids_limitation")

    @property
    @pulumi.getter
    def type(self) -> pulumi.Output[str]:
        """
        The type of the resource.
        """
        return pulumi.get(self, "type")

    @property
    @pulumi.getter(name="uniqueOfferId")
    def unique_offer_id(self) -> pulumi.Output[str]:
        """
        Offers unique id
        """
        return pulumi.get(self, "unique_offer_id")

    @property
    @pulumi.getter(name="updateSuppressedDueIdempotence")
    def update_suppressed_due_idempotence(self) -> pulumi.Output[Optional[bool]]:
        """
        Indicating whether the offer was not updated to db (true = not updated). If the allow list is identical to the existed one in db, the offer would not be updated.
        """
        return pulumi.get(self, "update_suppressed_due_idempotence")

    def translate_output_property(self, prop):
        return _tables.CAMEL_TO_SNAKE_CASE_TABLE.get(prop) or prop

    def translate_input_property(self, prop):
        return _tables.SNAKE_TO_CAMEL_CASE_TABLE.get(prop) or prop
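`translate_output_property` and `translate_input_property` above look property names up in generated tables (`_tables.CAMEL_TO_SNAKE_CASE_TABLE` and its inverse). A generic regex-based conversion with the same effect for simple names can be sketched as follows; this is illustrative only, since the SDK relies on its generated tables rather than on string manipulation:

```python
import re

def camel_to_snake(name):
    """e.g. 'privateStoreId' -> 'private_store_id'."""
    return re.sub(r'(?<!^)(?=[A-Z])', '_', name).lower()

def snake_to_camel(name):
    """e.g. 'private_store_id' -> 'privateStoreId'."""
    head, *rest = name.split('_')
    return head + ''.join(part.capitalize() for part in rest)

print(camel_to_snake("updateSuppressedDueIdempotence"))  # update_suppressed_due_idempotence
print(snake_to_camel("icon_file_uris"))                  # iconFileUris
```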