Dataset columns (from the viewer header):
url: string (lengths 13 to 4.35k)
tag: string (1 distinct value)
text: string (lengths 109 to 628k)
file_path: string (lengths 109 to 155)
dump: string (96 distinct values)
file_size_in_byte: int64 (112 to 630k)
line_count: int64 (1 to 3.76k)
https://www.hometheaterforum.com/community/threads/where-do-dead-dvds-go.99767/
code
Where do dead DVDs go? I ask because I have had a few defective disks I had to return to a seller (Amazon), and in two cases the producing company (as distinct from the selling company), replying to an email from me before I made a return to Amazon, said some disks from some batches had been badly produced and I should be able to get a replacement. (In the case of one from Anchor Bay, I couldn't get a reply, and still have a non-playing disk of Wicker Man.) Question: wouldn't the producing company want to get the defective disks back, and to make sure the sellers had corrected copies for consumers? How would they do that? Because even if the producing company wanted that, it's very hard to do with an online business, where you are essentially dealing with Robo-clerk. In three of these cases, with three different producers, I have never been confident I could get a corrected copy of the movie I want, and have had to settle for a refund. As for the dud disks, I wonder if a lot of them just get re-shrink-wrapped and sold again, in the expectation or hope that they will play on the next buyer's machine, or not get played soon enough, so the buyer doesn't feel right about sending them back when he or she eventually discovers they won't play either. Do a lot of those defective disks just end up in used CD stores?
s3://commoncrawl/crawl-data/CC-MAIN-2017-09/segments/1487501171971.82/warc/CC-MAIN-20170219104611-00272-ip-10-171-10-108.ec2.internal.warc.gz
CC-MAIN-2017-09
1,353
1
https://conda-forge.org/community/minutes/2018-07-24/
code
2018-07-24 conda-forge meeting
- Zoom instructions: how to connect to Zoom
- Sharing passwords (to start off the meeting next time)
  - Try something out and move on to more interesting problems
  - Let's try KeyBase. Eric D. just sent out invites to most of the core team.
- Establish next steps/action items/GH issues for migrations
  - MVN will coordinate with CJ on issuing PRs for the things that need compilers but don't actually call it out nicely.
  - Parse the graph; find everything that could be py3.7 but has no compiler and is not noarch; run the rebuild on that.
  - May need to have two versions of pinnings + smithy whilst the graph is being rebuilt.
- Decide on a policy for when maintainers stop maintaining
  - Come back to later
  - Related to 2k-ish pending bot PRs…
  - MVN will give CJ a list of merge-conflicted feedstocks that were closed and not merged.
  - Auto-close out-of-date PRs
  - Auto-delete closed/merged bot PRs
- run_exports vote: https://github.com/conda-forge/conda-forge-pinning-feedstock/issues/102
  - John questioned run_exports practice: https://github.com/conda-forge/staged-recipes/pull/4858#discussion_r204076032
  - Dougal redirected discussion to https://github.com/conda-forge/conda-forge-pinning-feedstock/issues/102#issue-343171939
  - John questioned the validity of the vote on run_exports
    - Filipe to add to the governance doc a process to un-stick situations like this
  - Overall: we need a community standards communication scheme. Mike S dropped the ball on communicating the results of the poll.
  - Related: the governance doc may need a definition of how a valid poll is to be conducted.
- Expiring (i.e., auto-closing with a bot) "old" PRs into staged-recipes?
  - Put on a label, add a message (stale), ping relevant parties to close
  - Decide on policy
- Finding a good solution to sharing passwords among core
  - Git Secret? https://github.com/sobolevn/git-secret
- Build packages on C3I and upload to conda-forge
  - make is missing from the base image for PowerPC internal to Anaconda. Fun times!
  - Mike is open to other people helping with this. If interested, reach out! Helping means trying recipes, debugging any issues, and resolving any merge conflicts that have happened since Mike pulled them in last. Moving target.
  - Packages that have been built: https://anaconda.org/cf-cb3. These may need more work regarding versions: the graph was computed with the versions, but probably should have ignored them. When a pin is older than a newer recipe, the upstream recipe gets missed as a real dependency because of the version mismatch.
- Making the agenda and notes public again
  - John will see if we can make Dropbox Paper readable by the world
  - Other options are to just post the notes somewhere public after the meeting (conda-forge blog)
- Finalize compiler migration discussion (see: 2018-07-17 conda-forge meeting)
  - Update on current status: number of packages left to syntax-migrate, number of packages needing re-compile, total number ready, number ready in the first layer
  - Build number: increase by N for new things at build time, non-static; determine build number with a conda render clobber file
  - Decide on migration order [Outcome: make a super graph of py37 + compilers (run with one walker); drop 3.5 when 3.7 starts]
  - Remaining compiler syntax
  - Decide on resource strategy [Outcome: do everything online]; options were offline (without CIs) or online (with CI)
  - Decide on channel strategy [Outcome: new label for new compilers, run two labels]
    - Upload re-compiled packages to a new label and continue pushing to the current label
    - Upload re-compiled packages to the current label, push updates to current-era compilers to a different branch
s3://commoncrawl/crawl-data/CC-MAIN-2024-18/segments/1712296817014.15/warc/CC-MAIN-20240415174104-20240415204104-00605.warc.gz
CC-MAIN-2024-18
3,668
52
https://carbonmapper.org/data-engineer/
code
Carbon Mapper is a non-profit organization with a public-good mission to detect, pinpoint, quantify and track 80% of the world’s methane and carbon dioxide (CO2) point source emissions. Our mission is to aid in the acceleration of climate mitigation efforts globally at the scale of individual facilities. The Carbon Mapper public-private partnership includes Planet Labs, Inc., the California Air Resources Board, the University of Arizona, Arizona State University, and NASA’s Jet Propulsion Laboratory (JPL). Carbon Mapper uses advanced imaging spectroscopy remote sensing on satellites and airborne assets to detect and analyze methane and carbon dioxide emissions around the world and provide information data products to inform decision makers as well as the general public. As part of its public education and research support activities, Carbon Mapper will provide a global open data portal, including a data trust and a website offering rapid visualization and interpretation for both expert and non-expert audiences. Carbon Mapper also supports advances in science algorithms, machine learning, and big data for remote sensing through multi-source analytics, cloud computing, open source development, Amazon Web Services, and geospatial analytics. Carbon Mapper is looking for a data engineer to process hyperspectral imaging data and to develop and implement software improvements and automation capabilities to the processing pipeline. The data engineer will work with the Carbon Mapper science team to ensure the quality of intermediate and final data products, and will work with the Carbon Mapper senior software engineer to implement improvements to the data processing pipeline. This position may include up to 4 weeks of domestic and international travel per year to support airborne campaigns. Candidate must have a valid passport. 
Responsibilities:
- Process remote sensing data using the automated pipeline and manual processes
- Manage storage of raw data and archive data
- Work with the science team to perform quality assurance / quality control of intermediate and final products
- During active airborne campaigns, work as operator-on-call for rapid product turnaround
- Support airborne campaigns on site as needed
- Automate the methane processing pipeline for airborne and simulated satellite hyperspectral imaging data, including adding new features to the pipeline, implementing improvements, and maintaining and upgrading software
- Recommend and implement improvements for image processing workflows, including opportunities for machine learning and AI
- Develop and implement automated QA/QC workflows
Qualifications:
- Bachelor's degree or higher in physics, chemistry, mathematics, geography, computer science, engineering, or a related technical field
- Experience with Python and bash scripting
- Strong verbal and written communication skills, including documentation and training
- Experience working with remote sensing data, ideally satellite imagery, with an understanding of geospatial software (GDAL, ArcGIS/QGIS, osgeo, etc.)
- Scientific programming skills (e.g., R, Python, MATLAB) and experience with high-performance computers
- Experience with cloud computing and AWS (Lambda, S3, SageMaker)
Salary Range: $65,000 – $80,000 USD depending on experience
Hours: This is a full-time salaried position (40 hours per week). You are expected to be available Monday through Thursday 9 am – 3 pm PDT. The remaining time is flexible based on your assigned projects and deadlines. Some overtime, after-hours, and weekend work may be required.
Location: Carbon Mapper has a distributed team with both in-person and remote workers. Employees living within 50 miles of the Pasadena (CA) area are expected to be in the office at least 2 days per week. All employees will attend quarterly in-person meetings at Carbon Mapper headquarters.
Benefits: Carbon Mapper offers health/dental benefits, retirement plans, life insurance, and dependent care assistance. Carbon Mapper is committed to hiring and retaining a diverse workforce. We are proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin, sexual orientation, gender identity, disability, protected veteran status or any other characteristic protected by law. Application requirements: please email your resume and cover letter to firstname.lastname@example.org . Applications will be accepted through June 14, 2021.
s3://commoncrawl/crawl-data/CC-MAIN-2021-25/segments/1623488538041.86/warc/CC-MAIN-20210623103524-20210623133524-00504.warc.gz
CC-MAIN-2021-25
4,502
24
http://phoronix.com/forums/showthread.php?79537-Windows-7-amp-Windows-8-vs-Ubuntu-13-04-amp-Fedora-18/page5
code
I have some numbers for QNX compared to Linux and another kernel on ARM and x86. I will post them for you when I can locate them. Current QNX is not that old, though. Stuff was rewritten for a more modern era; I think that was around QNX 6 Neutrino (not quite sure I recall correctly). It handles SMP well and is really good for distributed computing (see qnet).
s3://commoncrawl/crawl-data/CC-MAIN-2015-18/segments/1429246648209.18/warc/CC-MAIN-20150417045728-00303-ip-10-235-10-82.ec2.internal.warc.gz
CC-MAIN-2015-18
355
2
http://www.computerworld.com/article/3031656/linux/probing-into-your-systems-with-proc.html
code
It's not just a file system full of odd-looking files that only the kernel understands. Instead, it's really something of a peep hole into your system. And there are quite a number of useful things that you can learn from the files that it contains. So, what do you see when you cd over to /proc? Well, run ls and the first thing you're likely to notice is the very large group of directories with just numbers for names. These numbers correspond to the process IDs (PIDs) of processes that are running on your system -- everything from the init process that started the boot time ball rolling to the shell you're using right now. And you're likely to see quite a lot of them -- probably several hundred or more. $ cd /proc $ ls 1 15878 38 433 5266 579 67 7521 devices 10 1589 39 434 5267 5792 6788 7523 diskstats 10052 16 393 435 5268 58 6793 7525 dma 1021 1623 3956 436 5269 580 6794 7529 driver 10522 16571 3957 437 5270 581 6795 7531 execdomains 10552 16585 3958 438 5271 5810 6796 7533 fb 11 1695 3959 439 5272 582 6797 7535 filesystems 11984 17 3960 44 5273 583 6798 7537 fs ... If you were to count the numeric (process) directories, your total should be the same as the response you'd get if you ran the command ps -ef --no-headers | wc -l (ps output without the header line). The bulk of these directories will likely be owned by root but, depending on how your system is being used, you'll also see application service accounts (such as oracle in the example below) and usernames among the process owners listed. # ls -l | more total 0 dr-xr-xr-x 5 root root 0 Oct 3 2013 1 dr-xr-xr-x 5 root root 0 Oct 3 2013 10 dr-xr-xr-x 5 root root 0 Oct 3 2013 1021 dr-xr-xr-x 5 root root 0 Oct 3 2013 11 dr-xr-xr-x 5 oracle oinstall 0 Feb 4 07:11 1167 dr-xr-xr-x 5 root root 0 Jan 26 11:00 11920 dr-xr-xr-x 5 root root 0 Mar 7 2014 11923 dr-xr-xr-x 5 gdm gdm 0 Jan 26 11:01 11950 Notice that none of these are files in the same sense as files we see in our file systems. 
They don't take up space on the disk and they don't have content even if the cat command displays their data for you. Unlike the directories that we see in "real" file systems, these show up as using 0 bytes of data. Many will have dates and times that correspond to the last time the system was booted (i.e., when the related processes started) while other files in /proc may appear to be updated almost constantly. Only the /proc/kcore file will have a significant size, and it might appear to be huge, though even it isn't really using disk space, as it relates to the RAM on your system. # ls -l kcore -r-------- 1 root root 39460016128 Feb 8 09:10 kcore You'll also see a collection of other files in /proc with names like cpuinfo, key-users and schedstat -- names that provide clues to what these files contain. In fact, you can think of the files in /proc as falling into two categories -- those that represent processes running on your system and those that represent some aspect of the system itself. So, what are some useful things these interesting pseudo files can tell you? For one thing, they can tell you how long the system has been up. Check out the /proc/uptime file. This file reports the system uptime, even though it might not be immediately obvious. The number 74216960.58 in the output below probably doesn't look like an uptime report to you. But type "cat uptime" a couple times in a row and you'll notice that the numbers are constantly changing. It's obviously keeping up. $ cat /proc/uptime;sleep 10;cat /proc/uptime 74216960.58 73912315.63 74216970.58 73912325.61 As you'll note, this file actually contains two numbers. The first is the uptime of the system (as you'd expect from the name) while the second is the amount of time the system has spent idle. The numbers are constantly changing because we're always getting further from the time the system was last booted. 
After sleeping for ten seconds, the number on the left just happens to be 10 units larger, so it's clear that these numbers are reporting time in seconds. No problem. A little command line math can turn those seconds into days. If we then compare the result of our calculation with the uptime command output, we'll see the connection between the numbers. $ expr 74216970 / 60 / 60 / 24 858 $ uptime 14:30:17 up 858 days, 23:50, 1 user, load average: 0.08, 0.04, 0.00 Of course, almost no one would want to go through all the trouble of calculating uptime with an expr command when the uptime command can tell us what we want to know directly, especially if we have to think through the sixty seconds per minute, sixty minutes per hour, and 24 hours per day conversions. And think your system is busy? Do a little more math with these numbers and you might see something like this. Notice how I added two zeroes to the end of the idle time figure to get an answer that would represent the percentage of the time this system has been idle. Yes, that's 99%. This system is clearly not straining -- at least not most of the time. $ expr 7391232500 / 74216970 99 This uptime exercise is useful because it reinforces the idea that these "files" are plucking information from the system to update the virtual file content many times a second. Note, though, that the dates and times associated with this file keep up with the current time. 
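The seconds-to-days and idle-percentage arithmetic above is easy to reproduce in a short sketch. This is a minimal Python example operating on the sample numbers quoted in this article rather than reading a live /proc/uptime:

```python
# Parse the two numbers from /proc/uptime and derive uptime in whole days
# and the idle percentage, mirroring the expr arithmetic shown above.
def parse_uptime(text):
    uptime_s, idle_s = (float(field) for field in text.split())
    days = int(uptime_s // 86400)                 # 60 * 60 * 24 seconds per day
    idle_pct = int(idle_s * 100 // uptime_s)      # same trick as appending "00"
    return days, idle_pct

days, idle_pct = parse_uptime("74216970.58 73912325.61")
print(days, idle_pct)  # 858 99
```

On a real Linux system the input string would come from `open("/proc/uptime").read()`.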
$ ls -l /proc | tail -11 -r--r--r-- 1 root root 0 Feb 9 14:42 stat -r--r--r-- 1 root root 0 Feb 9 14:42 swaps dr-xr-xr-x 11 root root 0 Oct 3 2013 sys --w------- 1 root root 0 Feb 9 14:42 sysrq-trigger dr-xr-xr-x 2 root root 0 Feb 9 14:42 sysvipc dr-xr-xr-x 4 root root 0 Feb 9 14:42 tty -r--r--r-- 1 root root 0 Feb 9 14:42 uptime -r--r--r-- 1 root root 0 Feb 9 14:42 version -r-------- 1 root root 0 Feb 9 14:42 vmcore -r--r--r-- 1 root root 0 Feb 9 14:42 vmstat -r--r--r-- 1 root root 0 Feb 9 14:42 zoneinfo Another file with information that will likely seem familiar is the version file. This file supplies information on your operating system version, much like the output of the uname -a command and undoubtedly tapping the same system resources. $ cat /proc/version Linux version 2.6.18-128.el5 (email@example.com) (gcc version 4.1.2 20080704 (Red Hat 4.1.2-44)) #1 SMP Wed Dec 17 11:41:38 EST 2008 $ uname -a Linux sea-aveksa-1.telecomsys.com 2.6.18-128.el5 #1 SMP Wed Dec 17 11:41:38 EST 2008 x86_64 x86_64 x86_64 GNU/Linux Another file -- the cpuinfo file -- supplies fairly extensive information on your system CPUs. While I don't want to insert all 500+ lines into this post, you can see some of the details below. The second command is simply counting up the number of CPUs. $ head -11 /proc/cpuinfo processor : 0 vendor_id : GenuineIntel cpu family : 6 model : 44 model name : Intel(R) Xeon(R) CPU X5650 @ 2.67GHz stepping : 2 cpu MHz : 2660.126 cache size : 12288 KB physical id : 1 siblings : 12 core id : 0 $ more cpuinfo | grep processor | wc -l 24 The vmstat file provides virtual memory statistics. Want to see what's happening with page swapping? The numbers below represent swapping activity (pages swapped in and out) since the system was booted. $ grep pswp /proc/vmstat pswpin 229269 pswpout 316559 If these names look familiar, you may be remembering them from sar output like that shown below. 
# sar -W 10 2 Linux 3.14.35-28.38.amzn1.x86_64 (ip-172-30-0-28) 02/10/2016 _x86_64_(1 CPU) 12:17:03 PM pswpin/s pswpout/s 12:17:13 PM 0.00 0.00 12:17:23 PM 0.00 0.00 Average: 0.00 0.00 We can also look at memory statistics. These details can come in very handy if you want to get a very detailed understanding of the memory on your system and how it is being used. $ more /proc/meminfo MemTotal: 37037804 kB MemFree: 18605268 kB Buffers: 323740 kB Cached: 14919556 kB SwapCached: 12068 kB Active: 13878148 kB Inactive: 3846048 kB HighTotal: 0 kB HighFree: 0 kB LowTotal: 37037804 kB LowFree: 18605268 kB SwapTotal: 16778232 kB SwapFree: 16309048 kB Dirty: 9896 kB Writeback: 0 kB AnonPages: 2468880 kB Mapped: 7089292 kB Slab: 442900 kB PageTables: 189648 kB NFS_Unstable: 0 kB Bounce: 0 kB CommitLimit: 35297132 kB Committed_AS: 12768916 kB VmallocTotal: 34359738367 kB VmallocUsed: 271696 kB VmallocChunk: 34359466659 kB HugePages_Total: 0 HugePages_Free: 0 HugePages_Rsvd: 0 Hugepagesize: 2048 kB Want to check on what file system types are supported by your kernel? Take a look at /proc/filesystems. $ head -11 /proc/filesystems nodev sysfs nodev rootfs nodev bdev nodev proc nodev cpuset nodev binfmt_misc nodev debugfs nodev securityfs nodev sockfs nodev usbfs nodev pipefs To view all the mounts used by your system, look at the /proc/mounts file. 
$ cat /proc/mounts rootfs / rootfs rw 0 0 /dev/root / ext3 rw,data=ordered,usrquota 0 0 /dev /dev tmpfs rw 0 0 /proc /proc proc rw 0 0 /sys /sys sysfs rw 0 0 /proc/bus/usb /proc/bus/usb usbfs rw 0 0 devpts /dev/pts devpts rw 0 0 /dev/sda1 /boot ext3 rw,data=ordered 0 0 tmpfs /dev/shm tmpfs rw 0 0 none /proc/sys/fs/binfmt_misc binfmt_misc rw 0 0 sunrpc /var/lib/nfs/rpc_pipefs rpc_pipefs rw 0 0 /etc/auto.misc /misc autofs rw,fd=6,pgrp=5541,timeout=300,minproto=5,maxproto=5,indirect 0 0 -hosts /net autofs rw,fd=12,pgrp=5541,timeout=300,minproto=5,maxproto=5,indirect 0 0 oracleasmfs /dev/oracleasm oracleasmfs rw 0 0 //windows-server/outgoing /mnt/ActAccts cifs rw,mand,unc=\\windows-server \outgoing,username=xferSvc,uid=0,gid=0,file_mode=02767,dir_mode=0777,rsize=16384,wsize=57344 0 0 The /proc/net directory contains a wealth of network information including data for your network interfaces. $ ls /proc/net anycast6 ip_conntrack netfilter rt6_stats tcp6 arp ip_conntrack_expect netlink rt_acct tr_rif bonding ip_mr_cache netstat rt_cache udp dev ip_mr_vif packet snmp udp6 dev_mcast ip_tables_matches protocols snmp6 unix dev_snmp6 ip_tables_names psched sockstat wireless if_inet6 ip_tables_targets raw sockstat6 igmp ipv6_route raw6 softnet_stat igmp6 mcfilter route stat ip6_flowlabel mcfilter6 rpc tcp Examples of some /proc/net data include your arp cache and routing table. $ cat arp IP address HW type Flags HW address Mask Device 172.30.0.1 0x1 0x2 0a:ee:74:5c:40:bd * eth0 172.30.0.2 0x1 0x2 0a:ee:74:5c:40:bd * eth0 $ cat route Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT eth0 00000000 01001EAC 0003 0 0 0 000000000 0 0 eth0 FEA9FEA9 00000000 0005 0 0 0 FFFFFFFF0 0 0 eth0 00001EAC 00000000 0001 0 0 0 00FFFFFF0 0 0 For some files in /proc, you'll need to use your superpowers. Here we're looking into some aspects of our host-based firewall. 
$ sudo cat /proc/net/ip_tables_names filter nat You can view arp (address resolution protocol) data that your system has collected using the /proc/net/arp file. This is much the same information that you'd see using the arp command. $ cat /proc/net/arp IP address HW type Flags HW address Mask Device 10.20.30.128 0x1 0x2 00:50:56:B1:2E:01 * bond0 10.20.30.110 0x1 0x2 A4:BA:88:12:2C:5D * bond0 10.20.30.154 0x1 0x2 00:50:56:B3:0E:33 * bond0 10.20.30.1 0x1 0x2 00:00:0C:07:AC:2A * bond0 10.20.30.33 0x1 0x2 00:50:52:B6:32:33 * bond0 Or maybe you want to look into page faults. $ cat vmstat | grep "fault" pgfault 2426152809 pgmajfault 79826 You can examine your swap partitions and swap files through the /proc/swaps file. $ more /proc/swaps Filename Type Size Used Priority /dev/mapper/VolGroup00-LogVol01 partition 16777208 514200 /swapfile file 1024 0 -2 Details about your system's devices are available in the /proc/sys/dev directory. Below, we look at the cdrom and raid devices. # ls -l /proc/sys/dev/cdrom total 0 -rw-r--r-- 1 root root 0 Feb 8 17:59 autoclose -rw-r--r-- 1 root root 0 Feb 8 17:59 autoeject -rw-r--r-- 1 root root 0 Feb 8 17:59 check_media -rw-r--r-- 1 root root 0 Feb 8 17:59 debug -r--r--r-- 1 root root 0 Feb 8 17:59 info -rw-r--r-- 1 root root 0 Feb 8 17:59 lock # ls -l /proc/sys/dev/raid total 0 -rw-r--r-- 1 root root 0 Feb 8 17:59 speed_limit_max -rw-r--r-- 1 root root 0 Feb 8 17:59 speed_limit_min Examining the contents of one of these files, we see the maximum speed (RAID rebuild speed) that is set for the device. # cat /proc/sys/dev/raid/speed_limit_max 200000 A lot of the information available through /proc can also be viewed using commands like arp, netstat, and sar. Still, it's useful to be able to pull data from the kernel in one convenient location, and /proc provides a tremendous wealth of stats for anyone who wants to dive deeply into their system. 
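Many of these pseudo files share a simple "key: value" layout, which makes them easy to consume from scripts. As a minimal sketch (parsing a two-line sample in the style of /proc/meminfo shown earlier, rather than reading the live file, and using nothing beyond the standard library):

```python
# /proc/meminfo lines look like "MemTotal:     37037804 kB".
# Parse each line into a dict of integer kilobyte counts.
def parse_meminfo(text):
    info = {}
    for line in text.splitlines():
        key, _, rest = line.partition(":")
        info[key.strip()] = int(rest.split()[0])  # first token after ":" is the number
    return info

mem = parse_meminfo("MemTotal: 37037804 kB\nMemFree: 18605268 kB")
print(mem["MemFree"])  # 18605268
```

The same pattern adapts easily to space-separated files such as /proc/vmstat by splitting on whitespace instead of ":".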
This tour of /proc and some of the extensive information that it provides was just a taste of the detail available to you. The key to making good use of all this data is deciding what kind of information you want to see and devising scripts or aliases to fetch it from the tremendously detailed files always waiting for you in /proc. This article is published as part of the IDG Contributor Network.
s3://commoncrawl/crawl-data/CC-MAIN-2017-17/segments/1492917122619.60/warc/CC-MAIN-20170423031202-00314-ip-10-145-167-34.ec2.internal.warc.gz
CC-MAIN-2017-17
12,785
56
http://www.mathworks.com/matlabcentral/fileexchange/41081-interactive-incompressible-fluids?requestedDomain=www.mathworks.com&nocookie=true
code
An interactive GUI showing particles flowing in a liquid field described by the Navier-Stokes equations for incompressible fluids. Click and drag with the mouse to add forces to the liquid. Uses Jos Stam's unconditionally stable FFT-based algorithm, implemented in MATLAB code for real-time performance. I uploaded a version that does not require the Parallel Computing Toolbox, that has addressed some bugs, and that is more optimized for performance. The bugs are computational, in the sense that there are inaccuracies that shouldn't be here. Most noticeably there are boundary effects and interpolation errors. May I say again what a nice submission this is! Instructive, efficient and fun. Or? It is listed under "Required Products" now at least. You forgot to mention that the Parallel Computing Toolbox is needed.
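The heart of Stam's FFT approach is that diffusion becomes an exact exponential decay per Fourier mode, which is unconditionally stable for any time step. Here is a rough Python/NumPy sketch of just that diffusion step (function and parameter names are illustrative, not taken from the MATLAB submission; the full solver also handles advection, forces, and pressure projection):

```python
import numpy as np

def diffuse_fft(u, nu=0.001, dt=0.1):
    """Viscous diffusion of a 2D field applied exactly in Fourier space."""
    n = u.shape[0]
    k = np.fft.fftfreq(n) * n                     # integer wavenumbers
    kx, ky = np.meshgrid(k, k, indexing="ij")
    decay = np.exp(-nu * dt * (kx**2 + ky**2))    # exact per-mode damping
    return np.real(np.fft.ifft2(np.fft.fft2(u) * decay))

rng = np.random.default_rng(0)
u = rng.random((32, 32))
u2 = diffuse_fft(u)
# The k=0 (mean) mode is untouched; high frequencies are smoothed out.
print(bool(np.isclose(u.mean(), u2.mean())))  # True
```

Because the decay factor never exceeds 1, the step cannot blow up however large `dt` is, which is what makes real-time interaction feasible.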
s3://commoncrawl/crawl-data/CC-MAIN-2017-51/segments/1512948513330.14/warc/CC-MAIN-20171211090353-20171211110353-00222.warc.gz
CC-MAIN-2017-51
898
8
https://pathagoras.com/help/get_command2.htm
code
You can "Get" a value from a spreadsheet and assign it to a value for a !Group!. The group name that is 'gotten' will equal the name of the cell in the spreadsheet.
- <<*Get* = the command
- !groupname! = the named cell in the target spreadsheet
- spreadsheet.xlsx = the target spreadsheet
If the spreadsheet is in the folder of the source document OR in the assigned Excel folder, you need not 'qualify' the spreadsheet -- i.e., add drive and folder information. Otherwise, you must qualify the spreadsheet with full drive and path information. The Get command is processed first, along with other calls in the AskTable, and should be considered part of the AskTable groupings. Once 'gotten', you can use the value now assigned to the !groupname! for any other purpose in the document.
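Reading the three components above, a Get command plausibly takes a form like <<*Get*!groupname!spreadsheet.xlsx>>. As a purely hypothetical sketch of how such a token could be split into its group name and spreadsheet parts (the exact Pathagoras syntax may differ from this assumed form):

```python
import re

# Hypothetical parser for an assumed command shape:
#   <<*Get*!groupname!spreadsheet.xlsx>>
# Returns (groupname-with-bangs, spreadsheet) or None if it doesn't match.
def parse_get(command):
    m = re.fullmatch(r"<<\*Get\*(![^!]+!)([^>]+)>>", command)
    if not m:
        return None
    return m.group(1), m.group(2)

print(parse_get("<<*Get*!Client!clients.xlsx>>"))  # ('!Client!', 'clients.xlsx')
```

If the spreadsheet component contains no drive or folder separators, it would be the "unqualified" case described above; otherwise the full path would be carried through unchanged.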
s3://commoncrawl/crawl-data/CC-MAIN-2021-25/segments/1623488525399.79/warc/CC-MAIN-20210622220817-20210623010817-00197.warc.gz
CC-MAIN-2021-25
788
6
https://hnhiring.com/locations/emeryville
code
Tanium | Emeryville, CA (SF) or Morrisville, NC (RTP) or REMOTE | Full-time, ONSITE & REMOTE Tanium's product is basically computer security & management software for government and large enterprises. It's orders of magnitude faster than most competition, and customers love it. Fortune called us the "Usain Bolt of cybersecurity" , and Forbes put us at #4 on the Cloud 100 list . The former CIO of the US Air Force, said that Tanium is "game-changing ... allowing a tremendous amount of automation and reduced workloads for our network operations people significantly, meaning things that used to take them months is now down to seconds, or minutes." The USAF used Tanium to patch all of their systems for WannaCry in 41 minutes and consider any system without our software as "high risk". Fortune ranked us as one of the best medium-sized places to work . Benefits include healthcare, 401k match, self-directed/unlimited vacation time (most folks take 4-5 weeks), paid time off for volunteering, parental leave, fertility/adoption benefits, and more. Compensation is near FAANG levels with strong base pay, large annual bonuses, and equity in the form of RSUs. The recruiting team can share more details there. We get everyone together 2-3 times a year, and most teams do zoom calls for standup 2-3 times a week. We have roles open in Engineering, Technical Account Management, Security, Sales, Legal, Marketing, Finance, HR, Accounting, and more. I called out a few interesting positions below. Feel free to ask me about anything, reply here or email nathan.dauber@[company site]. Enterprise Services Engineer - This is a new role in response to customer demand, where we're managing Tanium software directly for customers instead of only training and advising their employees. 
Additional roles open in US Remote or on site in Fort Belvoir, Quantico, Washington D.C., or Reston: Associate/Director of Technical Account Management - The TAM organization is central to our company, and doesn't have any real parallels that I'm aware of. As a TAM, you'd be expected to set up a home lab with a network of machines (or VMs) running our software, and you'd be primarily responsible for advising 2~5 customers on how best to use Tanium. However, TAMs come from all kinds of backgrounds including sysadmin, devops, or security, and really work together as a team to support each other and meet the needs of each customer. Remote or on site worldwide https://www.tanium.com/careers/?p=department&t=Technical%20A... Other Roles: Security Engineer (US Remote): https://grnh.se/54bf71f91 Lots more: https://grnh.se/92be1afc1
s3://commoncrawl/crawl-data/CC-MAIN-2019-47/segments/1573496670643.58/warc/CC-MAIN-20191121000300-20191121024300-00516.warc.gz
CC-MAIN-2019-47
2,619
13
http://givememydata.com/index_original.php
code
Give Me My Data is a Facebook application that helps users export their data out of Facebook for reuse in visualizations, archives, or any possible method of digital storytelling. Data can be exported in common formats like CSV, XML, and JSON as well as customized network graph formats. According to Facebook's Statement of Rights and Responsibilities: "You own all of the content and information you post on Facebook, and you can control how it is shared through your privacy and application settings." Give Me My Data helps you to exercise this right by presenting your information in easy to use formats. You can copy and paste information from Give Me My Data into any text editor. Here is a list of editors for Macintosh and Windows platforms: After you have copied and pasted your information into an editor you can do a number of things depending on the format you chose. All data formats allow you to examine the text and copy and paste the exact information you are trying to retrieve from Facebook. You can also save the text with a new extension to be able to open the file in other applications. For example, to view your CSV data just save the plain text file with the .csv extension and open it in any spreadsheet software like Numbers (Mac), Microsoft Excel (Win) or OpenOffice, which is free and works on all operating systems. XML is a popular format for archiving and presenting data with other software. There are many free XML viewers available. For example you can view XML files with the Firefox web browser. Also, check out a list of Visualization Options for CSV and XML data on the ManyEyes website. While clearly utilitarian, this project intervenes into online user experiences, provoking users to take a critical look at their interactions within social networking websites. It suggests data is tangible and challenges users to think about ways in which their information is used for purposes outside of their control by government or corporate entities. 
Give Me My Data is designed only to export a copy of your information from Facebook to allow you to access and manipulate your data yourself. Putting it back into Facebook is unfortunately a manual process. In order for Give Me My Data to work you need to give it permission to access your information. Click "Allow" when you encounter this screen to start reclaiming your data. Give Me My Data only requests read access and will not write to your profile without your permission. The primary goal in creating Give Me My Data is to give users agency over their data by allowing them to export and manipulate it regardless (and in spite, if you like) of the interfaces we are presented. Give Me My Data is only available as a web browser-based application. A file format is a particular way that information is encoded for storage in a computer file. The file extension indicates the type of file format as well as which applications may open the file.
- TXT (plain text): Text that contains no visual formatting.
- CSV (Comma-Separated Values): Used for the digital storage of tabular data such as a database or spreadsheet. Each line in the CSV file corresponds to a row in the table. CSV files can be opened in any spreadsheet application like Microsoft Excel or Apple Numbers.
- XML (Extensible Markup Language): A plain text markup language for storing and transporting data.
- DOT (Graphviz DOT Language): A plain text graph description language for use with Graphviz.
- SQL (Structured Query Language): A database computer language designed for managing data in relational database management systems.
- MM (FreeMind): An XML-based language used by FreeMind mind mapping software.
- PY (NodeBox): A Python IDE for creating visualizations.
Give Me My Data respects your privacy and therefore does not save any information about you or your friends in any form.
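To illustrate the difference between two of these formats, here is a small Python sketch serializing the same sample records as CSV and JSON. The data is invented for the example; this is not the app's actual exporter:

```python
import csv, io, json

# The same records exported two ways: tabular CSV vs. nested JSON.
friends = [{"id": 1, "name": "Ada"}, {"id": 2, "name": "Grace"}]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "name"])
writer.writeheader()
writer.writerows(friends)

csv_text = buf.getvalue()       # "id,name" header plus one row per record
json_text = json.dumps(friends)
print(csv_text.splitlines()[0])  # id,name
```

CSV flattens each record into a row, which suits spreadsheets; JSON (like XML) preserves nesting, which suits archives and programmatic reuse.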
In addition to the Statement of Rights and Responsibilities, required for anyone who uses Facebook, applications and developers are bound by the Facebook Developer Principles & Policies.

"I used Give Me My Data when I joined the Quit Facebook Day on May 31, 2010!" —Geert Lovink, Associate Professor of New Media at the University of Amsterdam (UvA) and Founding Director, Institute of Network Cultures

Facebook App Brings Back Data by Riva Richmond, New York Times, May 1, 2010
Give Me My Data Helps Refill Blanked Facebook Profiles by Curt Hopkins, ReadWriteWeb, May 2, 2010
Two Facebook Apps To Help You Fight Back Against Facebook by Chris Walters, The Consumerist, May 4, 2010
Facebook's Disconnect: Open Doors, Closed Exits by Rohit Khare, TechCrunch, May 7, 2010
3D movies and extracting Facebook data by Kate Russell, BBC News, Friday, June 25th, 2010

The following images were created with data retrieved by Give Me My Data. I'm making high-resolution versions available for editorial purposes only. Please contact me if you use these.

During the final days of the German Democratic Republic (or GDR, a.k.a. "East Germany"), it became evident that its Ministry for State Security (more popularly known as the "Stasi") was destroying incriminating evidence from its 40-year history of domestic and international surveillance. These documents, which the Stasi was attempting to destroy using shredding machines, as well as by hand when the machines failed, included information gathered through various clandestine methods about the lives of citizens of the GDR without their knowledge or consent. On January 15, 1990, protestors stormed the Stasi headquarters in Berlin in an attempt to prevent the destruction of personal records which they felt they should be able to access. The phrase "Freiheit für meine Akte!" (in English: Freedom for my file!)
spray-painted on the Stasi guardhouse during this protest embodies the desire of citizens to open this closed world of state surveillance in order to understand the methods of control employed by the Stasi.

At the height of its operations, the Stasi is believed to have employed, counting spies and full- and part-time informants, one in every 6.5 East German citizens (almost 2.5 million people) to report suspicious activities.[1] At this moment, the ratio of people entering data on Facebook to non-members is one in fourteen for the entire world,[2] introducing possibly the most effective surveillance machine in history. The name and project "Give Me My Data" is inspired by all such citizen movements which aspire to know what information government or private organizations gather, store, and use to maintain their powerful positions.

*Image of Ministry for State Security guardhouse by Michael Westdickenberg

Give Me My Data is developed by Owen Mundy.
Google helps China control citizens
Copyright is the new poster child for censorship in EU

I deployed a 3-node Galera cluster in Kubernetes. Galera clusters MariaDB or MySQL instances, allowing you to read and write to all nodes while always being consistent (ACID) at all times. Kubernetes is a deployment environment for container applications.

Here are key features of Galera:

- It uses MariaDB instances, and then uses a plug-in to cluster them. So, you are still using stock MariaDB instances.
- It uses InnoDB table types, which has been my default since it was introduced, and is now the out-of-the-box default. InnoDB introduced ACID to MySQL long ago.
- Every node is a master/slave, so you can write to any node.
- Unlike typical horizontal clustering, which typically offers eventual consistency, this provides consistency across nodes at all times.

What this means is that, from a functional perspective, you can continue to use it for your OLTP applications requiring ACID. Its primary benefit is when a node fails: as long as quorum is met (a majority of nodes still up), the database remains available for transactions. Enter Kubernetes (K8S), and a node failure is quickly remedied by K8S as soon as it can. If I kill a node, it brings it back up within a minute or two. In the meantime, the other 2 of 3 nodes remain up and continue to serve transactions, since 2/3 is a majority. This is the primary benefit of Galera, and Kubernetes is the ideal environment for it. While Galera doesn't provide load balancing, K8S does, as you connect in K8S to the single service name that routes the connection to a node that is currently available. I tested this, and added a row to a database on one of the up nodes while a node I had just killed was being recreated automatically by K8S, yet was still down. When the killed node was restored, it too had the new row in the table. So, new nodes "catch up" to missed transactions automatically.
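The quorum rule described here (the cluster keeps serving transactions only while a majority of nodes is reachable) can be sketched in a few lines. This is just the arithmetic for illustration, not Galera code; real Galera can additionally use weighted quorum votes:

```python
# Illustration of the majority-quorum rule described above. Not Galera
# code; Galera's actual quorum can use weighted votes.

def has_quorum(total_nodes, nodes_up):
    """True if the surviving nodes form a strict majority."""
    return nodes_up > total_nodes / 2

print(has_quorum(3, 2))  # a 3-node cluster tolerates one failure
print(has_quorum(3, 1))  # two failures lose quorum
print(has_quorum(4, 2))  # an even split is not a majority
```

This is also why 3 nodes is the practical minimum: a 2-node cluster cannot survive any single failure.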
I have not reviewed the performance impact; but guaranteeing consistency across nodes 100% of the time has a performance cost when compared to a horizontal database with eventual consistency. Yet, performance is likely to be better than a single node, since replication can be extremely efficient (think low-level processing, without having to duplicate query processing). Your primary benefit, though, is higher availability.

If you'd like to give it a whirl, here are instructions for how to test it. Create a cluster and deploy a 3-node Galera cluster. I had no problem deploying Galera in Google Cloud to a cluster using these 3 YAMLs.

View in Kubernetes console: to use Skip and have Admin privileges, load dashboard-admin.yaml, which you can create per these instructions.

In order to test from a local db client, create a port-forward rule. Here I use a different port because my local machine has its own instance of a MariaDB server listening on 3306.

# Listen on port 13306 locally for port 3306 of pod 'mysql-0'
kubectl port-forward mysql-0 13306:3306

You can easily kill it and change the pod to jump around from one instance to another. When I killed mysql-2, I inserted in mysql-0 while mysql-2 was still down. Then, when mysql-2 was back up, I changed the port forward to mysql-2 to verify it had the new row inserted while it was down. Alternately, you can port forward to all 3 pods on 3 different ports.

To connect, use this from a local client instance where you have MariaDB or MySQL installed:

mysql -h 127.0.0.1 -P 13306 -u root -p

To test the Galera cluster you can follow these instructions. In addition to deleting the test cluster, you'll need to delete the Persistent Volumes, which you can find under Google's Compute Engine Disks if you are using GCP.

This is a continuation of Developing an Automated Trading System. Many of us use very robust charting software, including the popular thinkorswim platform, that does more than I plan to create in my system.
The requirement I ran into that could not be met by this software is the ability to chart unique data produced by my system that isn't available to the third-party platforms, such as back testing results. Thus, I needed basic charting that allowed me to analyze things in the context of price history. While a fully automated system won't, of course, depend on charts, I (the human) play a role both in its development and improvement, as well as a cohesive role in the automation. To balance the human brain vs. AI discussion, the goal is a "cyborg" in the beginning that becomes more and more machine as time passes. Parts that are proven to be successful in production will remain in the cyborg while new parts are vigorously tested.

I had a few requirements when comparing charting libraries: Other bells and whistles were considered, but those were the core requirements. I chose ng2-nvd3 as it met these requirements and had nice bells and whistles such as zooming and resizing capability, and can be user-interactive.

This is a 3-tier stack: the center of the stack is NVD3, as ng2-nvd3 just provides an Angular2 interface to it. Interfacing via ng2-nvd3 worked well. You have complete access to NVD3 capability. It also updates the chart when you update the data, as you expect from an Angular2 component. So, this completely met the Angular2 requirement.

NVD3 is a bit limited, though. They have a gallery of charts you can view. It can produce a nice candlestick or OHLC chart with high, low, open and close bars. But you cannot add lines to these, and the multiChart option does not currently support candlestick or OHLC chart types. The multiChart type includes area, line and bar charting only. I can live with this limitation for now. I just have to chart close prices of the original price history as a line, and additional lines for things such as MAs.

Extensibility: in the long run, I'll one day want candlestick charts with lines for MAs and other indicators.
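Since the MA lines have to be supplied as ordinary line series alongside the closes, the server computes them first. As a rough illustration only (the function name and sample prices below are made up, not the system's actual code), a simple moving average over closing prices might look like:

```python
# Hedged sketch: computing a simple moving average (SMA) series to chart
# as an extra line alongside the close prices. Illustrative only; the
# function name and sample data are not from the actual system.

def sma(closes, period):
    """SMA series aligned with the input; None until enough history exists."""
    out = []
    for i in range(len(closes)):
        if i + 1 < period:
            out.append(None)  # not enough bars yet for this period
        else:
            window = closes[i + 1 - period : i + 1]
            out.append(sum(window) / period)
    return out

closes = [10.0, 11.0, 12.0, 13.0, 14.0]
print(sma(closes, 3))  # [None, None, 11.0, 12.0, 13.0]
```

Keeping the output aligned with the input (padding with None where there is not yet enough history) makes it trivial to plot the MA on the same x-axis as the close-price line.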
I'll also want lines for fibs, and other types of indicators, such as buy and sell signals, which might be up and down arrows, and other types of notation related to back testing. There are two silver linings to the ng2-nvd3 stack. On top of this, you can use d3 on your current charts. I've already used it for some non-graphical utilities. Your code has access to everything ng2-nvd3 and nvd3 has access to, including, of course, the DOM model generated by it. So, you can easily learn and use D3 yourself to enhance your charts, perhaps to add the buy/sell signals, without even changing the nvd3 code.

The UI consists of 3 Angular2 components: one child for the price history query parameters, another child for adding studies, and the parent that brings those inputs together and outputs the chart. This uses both the Angular2 @Input and @Output decorators that allow you to tie components together. Because the chart automatically updates when the data changes due to data binding, including chart configuration, you can continue to add to and modify a chart after creating it using the controls. Because each child component requires the user to potentially update multiple fields before the chart can be updated correctly, each one has at least one button (Chart and Add). When a button is pressed, the parent component receives the output and updates the chart. Note that the StudyEntryComponent is in the early stages of a WIP. Yet, it can currently be used to add MAs to a chart. As you make modifications, clicking the Chart or Add buttons updates the chart. You can also edit current MAs by selecting one, changing it, and then clicking Chart. The next image shows the table that is created as you add or edit MAs, along with the resulting chart.

This chart demonstrates several features using nothing but out-of-the-box nvd3. If you resize the browser window, the chart automatically resizes. While you can't view the effect in the static image above, trust me, it works. Have doubts?
Check out the demos I linked to earlier. You can compare items using two different Y axes. In this case, the Russell 2000 ($RUT.X) is on the right axis. This currently creates studies for the underlying asset on the chart. So, when we add an MA, it appears for both the S&P 500 ($SPX.X) and the Russell. Being a two-dimensional chart, you cannot have more than two Y axes. If you include a third or more, they will share the right axis, which will be extended to handle the full range of possible values. The choice of which axis an item belongs to is something you can control as you set up the data. But you cannot have a third Y axis, so you have to factor this into the design and how raw data is handled, with the impact on the Y range being your primary concern. Combining an item that ranges from 0 to 2 with an item that ranges from 2000 to 2200 on one Y axis will result in two flat-looking lines far apart.

The user can interactively hide/show any of the lines by clicking the legend. You can see above that $RUT.X 200 EMA-we and $RUT.X 50 SMA-mo are both hidden because their circles in the legend are not filled in.

Another feature that differs from some charting software is that the interval of the MAs is not limited to the interval of the chart. While the chart is displaying weekly bars here, we added monthly MAs to the chart. This is important because the algos will typically use one-minute bars for historical data, and real-time quote updates one or more times per second; yet they need to be able to calculate MAs with intervals from 5 minutes to monthly.

Currently, when it needs to update the chart, it simply does a REST call for price history, which has the ability to add studies via parameters. When those results come back, our UI side transforms the data using Typescript into the representation required to chart it, and simply replaces the data field in the ChartNVD3PriceComponent given to nvd3 to create the chart.
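The coarser-interval idea just described (for example, a monthly MA on a weekly chart) amounts to down-sampling the finer bars before averaging. A hedged sketch, where a "month" is simplified to a fixed group of 4 weekly bars; nothing here is taken from the actual system:

```python
# Hedged sketch of computing an MA on a coarser interval than the chart's
# bars. A "month" here is simplified to a fixed group of 4 weekly closes;
# real calendar months vary, and this is illustration only.

def resample_last(closes, group_size):
    """Keep the last close of each complete group of `group_size` bars."""
    return [closes[i + group_size - 1]
            for i in range(0, len(closes) - group_size + 1, group_size)]

weekly_closes = list(range(1, 13))            # 12 "weekly" bars
monthly_closes = resample_last(weekly_closes, 4)
monthly_ma = sum(monthly_closes) / len(monthly_closes)
print(monthly_closes, monthly_ma)  # [4, 8, 12] 8.0
```

A production version would group by actual calendar boundaries, but the principle is the same: resample first, then apply the MA at the coarser interval.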
Due to data binding, the chart updates the instant this data is updated. The REST call itself uses the parameters to construct and invoke a third-party API call. Our facade to the API converts the raw data returned to POJOs. Because our interface to the API uses caching, this could be in memory and returned instantly. With price history in POJOs, our service then adds studies to the data as new fields. Then, it converts the POJOs to JSON and returns it as the output of the REST call. Our Angular2 component receives this data, transforms it into the charting representation, and updates the chart data.

Adding charting to the application gets us started, so we can begin to create JSON of back testing results that can be used to produce charts. To add back testing results to charts, in Angular2, we'll be creating a new UI component for defining back testing requirements, much like the one we created to add studies. The exception to simply using a one-trip REST query might be if the back testing takes longer than it does today due to new complexity and permutations. In that case, I'm likely to redesign it to simply add it to a back testing request queue, and allow the user to monitor the queue and view results when available. One advantage of this is that a result can be viewed at any time later, so long as it is on the list of queries that were previously queued. WebSockets can be used to update the queue in the browser without the user having to click. You will be able to see, in real time, the progress of your request.

WebSockets can also be used to update the chart in real time. This will be important when using real-time quotes and monitoring trading. With the exception of the data coming through WebSockets instead of REST, we won't need to really change how charting works in Angular2, as it currently updates the chart whenever the data changes. The only difference will be how the data changes.
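The queued-backtest idea floated above could look something like the following toy sketch; every name here is hypothetical and the back-test "computation" is a placeholder:

```python
# Toy sketch of a back-testing request queue: requests are queued, their
# status can be monitored, and results stay viewable afterwards. All
# names are hypothetical; the computation is a placeholder.

from collections import deque

class BacktestQueue:
    def __init__(self):
        self.pending = deque()
        self.results = {}  # request id -> result, kept for later viewing

    def submit(self, request_id, params):
        self.pending.append((request_id, params))

    def status(self, request_id):
        if request_id in self.results:
            return "done"
        if any(rid == request_id for rid, _ in self.pending):
            return "queued"
        return "unknown"

    def run_next(self):
        rid, params = self.pending.popleft()
        self.results[rid] = {"params": params, "pnl": 0.0}  # placeholder result

q = BacktestQueue()
q.submit("bt-1", {"trailing_stop": 0.05})
print(q.status("bt-1"))  # queued
q.run_next()
print(q.status("bt-1"))  # done
```

In the real system the status changes would be pushed to the browser over WebSockets rather than polled.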
Since we already use Angular2 for real-time updates of Level I and II quotes, monitoring of predictions, and order flow, using WebSockets to update a chart does not introduce a new technical feat.

This is a continuation of Developing an Automated Trading System. Began algorithms with simple strategies. This tests a range of inputs for a strategy. For example, you can test a range of trailing stops from 1 to 15% with 0.5% steps. This will test 30 scenarios with the same data. You can combine strategies testing multiple ranges: if your ranges include 10 target scenarios and 10 stop scenarios, it will test 100 scenarios, as it will test every combination of your ranges. There is no limit to the number of ranges you can combine. The REST call to create the backtest parses your strategies, creates entry/exit factories, and iterates through the ranges.

On the entry side, I'm creating indicators that can be used to fire signals. While the signals are simple today (all true, all false), the logic can become complex as algos become aggregations of signals weighted to make a decision. This will be fed to machine learning and use other techniques for prediction and optimization.

Technical description: no new technology here. This introduces a pattern of phased data enhancement.

I was recently inspired by the AI series Westworld. This led me to increase generification, conceptual streaming, and phased data enhancement, as I imagined the result being a high-performance real-time analytics engine that could potentially handle complex decisions beyond the current application. The goal here is to ultimately build an AI engine with practical purpose driving it rather than theory, as well as a real-time analytics engine that can be deployed to solve a number of problems in various industries.
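The range combination described above (every combination of every parameter range becomes one scenario) is essentially a Cartesian product. A minimal sketch with small illustrative ranges, not the system's actual code:

```python
# Sketch of expanding parameter ranges into back-test scenarios: every
# combination of the ranges becomes one scenario. The ranges here are
# small illustrative values, not from the actual system.

from itertools import product

def expand_scenarios(**ranges):
    """Yield one parameter dict per combination of the given ranges."""
    names = list(ranges)
    for combo in product(*(ranges[n] for n in names)):
        yield dict(zip(names, combo))

scenarios = list(expand_scenarios(
    trailing_stop=[0.01, 0.015, 0.02],   # 3 values
    target=[0.05, 0.10],                 # 2 values
))
print(len(scenarios))   # 3 * 2 = 6 scenarios
print(scenarios[0])     # {'trailing_stop': 0.01, 'target': 0.05}
```

The scenario count multiplies with each added range, which is exactly why a 10-target by 10-stop grid yields 100 runs.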
For this reason, the back testing algos are designed to support real-time price updates that include time, so they can handle their own temporal requirements, much like the human brain continuously analyzing real-time signals to help you make decisions.

Added Charting to Automated Trading System (Jan 18, 2017)

It has been my dream since I was a kid, when I saw the Matthew Broderick movie War Games in a theater in 1983, to build an automated trading system using AI. This year I have begun to live that dream. I call this project jVest. Curious how I built it? Here is a description of what I created:

jVest is a real-time, high-transaction-volume market analysis and trading system implementing machine learning for predictive analytics. It has an HTML 5 UI using WebSockets to continuously stream real-time information to the browser. The UI itself is largely built using Angular 2 with Typescript. It uses JMS to distribute the incoming third-party data messages, after converting them to POJOs, to subscribers. Using a listener, it persists select data in a RDBMS using JPA. Using another listener, it is streamed to business logic which then streams its transformed output to the web UI via WebSockets. It also uses the Observer pattern to flow the data through many consumers and producers, and injection (CDI) to simplify the complex interactions and provide a highly extensible design. It uses REST for user interactions with the server and WebSockets for real-time streaming. To handle conversion of POJOs to/from XML, it leverages JAXB to marshal, and leverages the marshalling capabilities of Resteasy in the REST layer. All transportable POJOs implement a Marshallable interface providing common default methods to do JAXB marshalling to/from XML and JSON, which is used for WebSockets as well as conversion of XML to POJOs from upstream data providers. For machine learning, it uses JServe to integrate with R over TCP/IP, which permits high scalability through distributed R nodes.
R then handles the predictive algorithms such as linear regression and KNN. Users can query via the web UI to analyze market data, as well as create monitors that regularly update the predictions in the web UI via WebSockets.

As you can see, it is not only a fun project, but has brushed up skills by using the latest updates to the Java EE stack. On top of that, I learned completely new things, such as Angular 2 with Typescript, which is becoming a very popular UI solution that encourages creating highly re-usable UI components; and R, one of the leading languages for doing machine learning, with a plethora of readily available statistical analysis packages. Using R to do machine learning is not only cutting edge in terms of technology, it is a rapidly growing area due to the proliferation of large quantities of data in computers, faster computers, and cheaper storage. Limits to computer speed and storage were the reasons I set aside pursuing AI in the 80s after my initial dabbling with it. It is exciting to be able to pick up that dream today, now that hardware can do things barely imaginable in the 80s.

Oct 28, 2016 – Added clustering via WebSockets

I added the ability to cluster servers using WebSockets. This overcomes a limitation: the upstream data provider permits only one live real-time stream per account, though all the nodes can use the upstream provider's synchronous API (think "REST"). To overcome the stream limitation, other nodes can now connect via WebSockets to receive the real-time streaming data. Because each instance of the application can act as both a provider and consumer, this allows for theoretically unlimited scaling of the business process and web/UI tier through a hierarchical topology. Because all streaming services are initiated through REST or scheduling, this is 100% dynamically configurable at run-time, both from a user-driven perspective and, down the road, perhaps for automated discovery, load balancing and high availability.
Technical description: this uses WebSockets for Java-to-Java communication in an EE web container. The provider endpoint picks up messages to send using @Observes, and the client side fires the messages it receives just as it would if they had arrived in its MDB while running the data collector that handles real-time streaming with the upstream provider. This demonstrates the powerful plug-ability and extensibility of the Observer pattern. It uses a WebSockets encoder and decoder to convert all messages to/from JSON for serialization. The en/decoders were easy to create, since all message POJOs entering and exiting the endpoints internally support JAXB bi-directional marshalling of both XML and JSON accessible through a common interface.

In the late 80s, when I was in college, I created an account on the school's computers. I dialed in with my modem from home. When you dialed in, you were presented with a UNIX prompt. This was an all-text world. No images. No nice web pages. Just a command prompt and programs that output text. In the 80s, all the colleges in the US were connected to the Internet. There was no commercial dial-up service like AOL, yet. So, it was virtually all academics and scientists. There were no corporations. No one charged for anything. No one competed. It was just people connecting to people and information.

One of the first commands I learned was 'irc'. Once you use this command to connect to an Internet Relay Chat (IRC) server, you type /help, and from there learn other commands. I quickly discovered thousands of channels with thousands of people from all over the world. The purpose was to simply let people chat. It is a myth that the Internet became social in the late 90s. The Internet was social from the beginning, particularly with IRC. What was surprising was that the Internet included people and servers all over the world that spoke many languages, although English did seem to be the predominant language. Want to talk to people in Spain?
Join the #spain channel. Germany? #germany. When virtually no one had heard of Linux yet, there was always the #linux channel. Want to create your own channel? Just "/join #mychannel" and boom, you just created a new channel. It was a level playing field in that anyone could create a channel and invite people to participate in it. And, you could join any open channel. Though, there were ways to make channels hidden, and require passwords to enter. There was never a fee, and most were openly there for anyone to join. The only requirement for using IRC was an Internet connection and a client program like 'irc' to connect.

This was an era when international long distance was prohibitively expensive. Prior to IRC, you'd never dream of talking to people all over the globe. So, imagine how exciting it was when, in 1990, one year after the Berlin wall fell, I was talking in IRC to someone who grew up in East Berlin! I asked questions like: What was it like when the wall came down? What was life like growing up behind the iron curtain? How are you doing now that 1 year has passed and you're now integrated with West Germany? I'm not sure I could have called someone in East Germany yet, since it was behind the iron curtain just a year earlier, when it was unimaginable that you could call people there from the US. I'm pretty sure you couldn't do it prior to the wall coming down. And even if you could, it would have been very expensive, if the person you wanted to call happened to have a phone. My brother went to Moscow University in 1993 under the Perestroika program. It cost us $30/minute to call him. I tried to get him to IRC, of course. But, that never panned out.

To be sure, it hasn't changed much today. It's bigger, of course. There are more servers. There are lots and lots of bots (automated programs) on the IRC. There are still a lot of people across the globe using it.
However, in an age when most people know the Internet via the face of Google and Facebook, the IRC can seem a bit antiquated. Yet, for open live text chatting, there's still really nothing that has truly replaced it. Yes, you can IM and do other forms of text chat. But, having a room open 24/7 that anyone can go to and just text chat? As far as I know, someone has to open a Google hangout and invite people. There's no list of thousands of Google hangouts you can join, particularly without knowing anyone in the channels. What if you want to join a real-time live discussion of a topic you're interested in? Client programs for connecting to IRC improved a lot, especially in the 90s, giving a graphical, easy-to-use interface for people. The good news is these programs are free and have only improved over time. Whether you are on Windows, Linux, Mac, Android or iPhone, there are great, easy-to-use client programs for connecting to IRC. Don't want to download and install a program? You can now just use your web browser to connect to IRC.

The dollar tanked and the S&P 500 made a new all-time high when the headline news of the jobs report came out. The probability of an interest rate hike in December increased from 32.1% to 45.4% (calculated using 30-day Fed Fund futures prices). Our trusted media reported headline news such as However, the news, which many investors and traders trust to make their decisions, has failed to look into the data in the report to understand what it really says. ZeroHedge pointed out that Obamacare offset weak industrial and consumer sectors. In another article, they point out that private payrolls grew an unadjusted +85k in July, far less than the seasonally adjusted headline number of +217k. Reviewing the labor report myself, I discovered that the only education category for those 25 and older with an increase in actual jobs from June to July was High school grads with no college.
The other 3 categories, including those with some college with or without any degree, had a decrease in the actual number employed (Table A-4). The number of unemployed from permanent job loss (layoffs) increased from June to July from 1.848 to 2.014 million (+166k). Even the "seasonally adjusted" number, a fictitious number which is of course rosier, showed a 104k increase in permanent job loss. Of course, with increasing layoffs come longer unemployment times, steadily increasing since May. The average number of weeks unemployed went up in July to 28.1 from 27.1 in June.

All this data points to a weakening US economy. Educated workers are losing their jobs, being increasingly laid off. Those on unemployment are having a harder time finding a job. The increases that the headline refers to are high school grads taking jobs that do not add much to our economic strength, as they do not replace the high-paying jobs being lost. Many, of course, are temporary jobs due to the election season, which helps to explain the increase in high school grad jobs. Clearly, as long as we trust a news media to do our analysis for us, and do not hold them accountable to critically review the jobs report, we'll continue to be deluded by rosy headlines despite the truth being much less bright for the US economy.

The jobs report this morning caused the S&P 500 to hit new highs of the year, a few points short of its all-time high set in May 2015. It is tempting to pretend like all is well, and just buy stocks, and hope for the best. Yet, perhaps the best way to protect your nest egg is to take a closer look with a critical eye. I heard a few unconfirmed things today from traders regarding that report: Note that gold and bonds soared today (my two favorite investments of the year). #1 on the selling-into-strength list for most of today was SPY (S&P 500 ETF), with the IWD (Russell 2000) at #4. This is post-Brexit profit taking, which is common when they believe the market is reaching another top.
ZeroHedge has an interesting critique of the jobs report that soared the markets today.

Selling into strength:

I and others I know have found ourselves chasing gold. We periodically get a nice position, take our profits, and then find ourselves missing out on the next big move because we cannot find good entry. This has been driving me nuts all year. While I finally just bought a gold fund in my 401k in April so I never miss out on an up move again, I'm still far from fully benefiting from the continued rise in gold. Typically, I prefer gold futures (/GC) as a vehicle. However, they suffer from two major limitations: they do not have weekly options, and their options only go out a couple of months.

Thesis: the target for gold is 1600 before the end of 2016. This is a thesis I've held for 2016 since mid-2015. The first half of the year sure has confirmed the thesis. I won't get into all the reasons gold is soaring this year in this post. But the positions I'm describing here are based on capturing profits if this thesis continues to prove true. There is good news. While gold has risen from 1060.50 since the beginning of the year, gaining over $300, or 30%, in order to hit $1600, gold has about $230+ more to go. That's still a nice gain to capture. So, how do we capture it without constantly chasing price and hoping gold doesn't soar while we're sleeping during the Asia and European sessions, or looking for a pullback that never comes while it rips in front of us? Fortunately, in addition to being a 3X leveraged ETF of gold miners with a high correlation to the price of gold, NUGT also has weekly options, and has options all the way to January, covering our time frame. There are some principles to options you'll want to understand for this strategy: At 1357, and having only been in the 1300s for a short while, we view /GC as being halfway through a 1300-1400 range. Many are betting it will hit 1400 soon, and plan to short it there.
So, it has a decent probability of racing back to support near 1308. Yet, due to the reasons it is soaring (bonds having negative returns, currencies unstable, and Brexit), there is never a guarantee it will come back down that far. We want to be sure we have a position in case it soars without an ideal pullback. Yet, we don't want to be too exposed in case it does drop back near 1300, and we want to be able to add to our position if it does. Thus, at this level, we'll begin our position with low-delta short puts by going out to December expiry. For the strike, I choose to be near the money to maximize extrinsic value. The good thing about December is the premium is high enough to easily get a break-even of $100. Selling a 160 Dec for $55 means your break-even at expiry is $105. The delta on the Dec 160 is currently -.29. That means that the option price is expected to decrease by .29 per share for every $1 gain in NUGT, presuming volatility doesn't change, and not taking into account theta burn. That is what we like, being so far above what we currently consider strong support. We'll take a lot less heat than a put at the same strike expiring soon if /GC drops $60. Our goal is to turn this into a vertical, as we believe /GC has a high probability of shooting for $1400 before coming down. If you are not comfortable opening a naked option position, or don't like the buying power reduction (BPR), you can just begin with a vertical. However, I'm choosing to open the short side first, then the long side if /GC goes higher in the near term. After selling this put, I created an order to buy the 140 Dec for $20 less than I received for the 160. If I'm super lucky, and it fills, then I just locked in max profit on the spread! Realistically, though, I'll look for resistance on /GC, notably 1400, and do a cancel/replace for whatever I can get then, because I anticipate a pullback there on first touch.
Regardless, it is likely to be a lot better than what I'd pay today, both because the put will be worth less due to the delta, and because time will pass, burning theta. Note that you are never locked in. Let's say gold hits 1400, we buy the put creating our spread, then gold drops $90. We could, at that point, close the long put for a profit, or roll it to a different strike to widen our profit potential. The idea of putting it on is to lock in a higher probability of profit while creating some downside protection. Once we've used that downside protection, we can choose to remove or reduce it. If /GC drops before I get a chance to do it, then I'll just be stuck with a naked put for a while, and wait until /GC runs again. Like I said, if this isn't for you, you can just open a short vertical and be done with it. I'm just trying to maximize potential profit and probability of profit by putting some swing trading into how the position is created. If the naked put is a little uncomfortable, but you want to try to time the sides of the vertical, you could start with the long side first. The downside is that the long put will be decaying while you wait for entry on the short side. The good news is that the decay will be relatively slow since you went out to December. That Dec 160 currently has a theta of -.16. Contrast that to the 160 expiring in two days, with a theta of -$1.30. If you do the long side first, then you'll be hoping for a nice pullback to complete with the short side, instead of waiting for /GC to go higher. If you feel strongly about which direction it is likely to move in the next few days, this can also factor into which side you do first. What do we do as /GC comes down toward our major support, but isn't there yet? Remember, there's no guarantee it will get there. So, we want to balance the possibility that it could just drop to 1340, bounce, and never go below it again for the rest of the year, against increasing risk and potential profit as it approaches 1300.
To do this, I'll use closer expiries as it drops: perhaps Nov in the 1340-55 range, Oct in the 1320-40 range, etc. The closer it gets to the bottom of its potential range, the more delta we'll be willing to risk to collect more from theta burn. 🙂 This is optional. You could stick to Dec. I just like to increase reward/risk as it approaches support. Note that once it gets down to 1300-15, even short OTM put spreads can be very lucrative. Analyze these and decide if they are a good option for you. I just don't think they are lucrative enough until /GC is down there to be worth the heat. But they are on my list of potential positions to open there. The next part of this strategy is what we'll do if /GC pulls back near 1300, where we believe there is strong support and it will likely bounce like the last time it came near 1308. For one, you can immediately sell short puts that expire in the near term for quick profits on that bounce, or even if it ranges there for a bit, as theta will decay fast. Then take the cash you raise from that and buy some Dec calls. You'll have to pick the strike you like and are willing to pay for. But here's where you turn a potentially profitable position into a really potentially profitable position: the potential gains on Dec NUGT calls if /GC goes from 1300 to 1600 by then are huge. The good thing is you've effectively financed these with short puts. To be clear, there is a lot of downside risk to this position. I have strong conviction, so am not too concerned about that. Yet once your put side consists primarily of verticals, your risk will be limited. If you timed it well, then you really reduced your risk. If you get lucky and the difference between the price you collected on the short side and paid on the long side equals the spread width, you have NO RISK on that spread, as you already collected max profit, and will just wait for payday! To be sure, you can use different underlyings and combine them in different ways.
The important thing is you are capturing both delta and theta burn in anticipation of an up move in gold, with little to no risk if gold doesn't climb, and you are managing and limiting risk to the downside. You're also timing it to obtain the best position given the uncertainties. Alternatives include using /GC options, GLD options, or anything else that moves with the price of gold.
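To make the payoff arithmetic concrete, here is a minimal Python sketch of the expiry P/L for the short put vertical described above (hypothetical per-share prices; an actual NUGT contract controls 100 shares):

```python
def short_put_vertical_pl(price, short_strike, long_strike, net_credit):
    """Per-share P/L at expiry for a short put vertical:
    sell the higher-strike put, buy the lower-strike put."""
    short_put_value = max(short_strike - price, 0.0)  # what we owe on the put we sold
    long_put_value = max(long_strike - price, 0.0)    # what our protective put pays
    return net_credit - short_put_value + long_put_value

# Selling the Dec 160 put for $55 puts break-even at 160 - 55 = $105.
# If the 140 put is later bought for $20 less than that credit ($35),
# the net credit ($20) equals the spread width, so there is no downside risk:
print(short_put_vertical_pl(100.0, 160.0, 140.0, 20.0))  # 0.0 even deep in the money
print(short_put_vertical_pl(170.0, 160.0, 140.0, 20.0))  # 20.0, the max profit
```

This is why timing the two legs matters: the wider the gap between the credit collected and the debit paid, the less of the 20-point spread width remains at risk.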
s3://commoncrawl/crawl-data/CC-MAIN-2020-50/segments/1606141188899.42/warc/CC-MAIN-20201126171830-20201126201830-00188.warc.gz
CC-MAIN-2020-50
33,773
119
http://gaming.stackexchange.com/users/9493/tesserex
code
I enjoy writing games to improve my coding skills. I also make Facebook games as a hobby. 9 What determines the beetle's flight time? Nov 24 '11 4 Do speed boosts only apply when at full health? Oct 13 '12 1 Blocks mysteriously disappearing in minecraft? Aug 10 '15 0 Constructing sink bowls in minecraft Jan 28 '15
s3://commoncrawl/crawl-data/CC-MAIN-2016-30/segments/1469257824146.3/warc/CC-MAIN-20160723071024-00201-ip-10-185-27-174.ec2.internal.warc.gz
CC-MAIN-2016-30
315
5
https://gamedev.stackexchange.com/questions/21869/what-is-the-logic-behind-the-loading-scenes-in-games
code
I think modern games should be able to ditch loading screens. I'm guessing the real reason we still have them is that most games are designed to run on Xbox 360 hardware. As such, they are limited in the amount of RAM they can load stuff into and how much threading they can do. If you don't have enough RAM then you will need to drop a lot of stuff out of memory and then reload it from disk. Also, it's just easier to load everything at once. There are also times when things like the hard drive might stall. What happens if the user is running antivirus in the background? If you're not preloading stuff then you might have to have the game pause while it waits for that mesh to load, or have things spontaneously appear. Games like Skyrim can provide you with a wide open world that you can run across as much as you want, but as soon as you open a door to a relatively small dungeon you get stuck with a loading screen. About the only reason I could see the dungeon load screen being needed is if they use some extra-heavy prebaked lighting for indoor scenes which is fairly large to load (made from one giant mesh, or with detailed lightmaps) or needs some calculations done at runtime. I believe the Skyrim dungeons are just built from modular meshes rather than one giant mesh (at least I think Oblivion's were). There's no real reason to load more than the stuff you can actually see when entering the level. The same way games don't render stuff you can't see, you can cull what actually needs to be loaded straight away. You can also stick in a lot of dummy placeholder objects. Then you can use threading to asynchronously load objects in surrounding areas (their meshes, textures and so on) as the player moves around the environment. The way I would ditch loading screens is as follows:
- Load the basic game world properties and global scripts.
- Load the player's position.
- Load the gridcell/BSP node of that position.
- Load just the bounding boxes of the objects in that cell.
- Work out what would actually be rendered (tests on the bounding boxes, maybe occlusion rendering tests).
- Load the meshes of the stuff that is rendered.
- Load the materials and textures of the meshes.
- Perform more tests to see if more regions are visible and need to be loaded (i.e. query a render of a whole grid cell as a giant cube).
- Once everything visible is loaded, start rendering.
- Load the rest of the stuff in the cells, and load more stuff as required.
- Load nearby regions, and do some basic prediction on the player's movements to choose which regions need to be loaded first. Keep going until memory fills.
There are some other things that need to be taken into account. For example, some stuff needs to be loaded even though it isn't visible. An enemy standing behind the player, outside of the loaded region, would need to be active so it can run toward the player and attack them. Possibly the occlusion-test stuff is unnecessary, since you will immediately want to load anything nearby so it's there when the player moves and it becomes visible, but it could shave off a second to get straight to the render (though if time is that close then you are likely to get problems with stuff appearing). If possible, those steps should be started before they are required, threaded off in the background. For example, if you are at the main menu of the game then it should start to preload the starting area that loads after you choose "New Game" (or as soon as the menu appears), and the last save for "Continue". Other saves can preload when you move the mouse over them. As you approach a dungeon door, it should start to load the next level (although if you can do that then you don't need a special dungeon door, since they are really only there to trigger a loading screen). Load in the background while the video plays. That way you wouldn't even need a short initial loading screen.
You can skip a lot of that if you don't mind stuff 'popping' into existence. Something like Second Life just leaves you with stuff appearing; since it's being retrieved from the internet, there's not much you can do about that. Assassin's Creed has that effect where stuff whooshes into the scene, so you might be able to get away with it for artistic reasons.
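The prioritized streaming described in the steps above boils down to a visibility-and-distance priority queue. A toy Python sketch (the names, positions, and scoring rule are all invented for illustration):

```python
import heapq

def load_priority(obj_pos, player_pos, visible):
    """Lower score loads sooner: visible objects first, then by distance."""
    dx = obj_pos[0] - player_pos[0]
    dz = obj_pos[1] - player_pos[1]
    dist = (dx * dx + dz * dz) ** 0.5
    # Invisible objects still load (an enemy behind you must be active),
    # just after everything currently on screen.
    return dist if visible else 1000.0 + dist

def build_load_queue(objects, player_pos):
    """objects: list of (name, position, visible) placeholders awaiting assets."""
    heap = [(load_priority(pos, player_pos, vis), name) for name, pos, vis in objects]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[1] for _ in range(len(heap))]

world = [
    ("door_far", (90.0, 0.0), False),
    ("enemy_behind", (2.0, -3.0), False),  # not visible but nearby: load early
    ("wall_ahead", (5.0, 5.0), True),
]
print(build_load_queue(world, (0.0, 0.0)))  # ['wall_ahead', 'enemy_behind', 'door_far']
```

In a real engine the queue would be consumed by background loader threads, with placeholders swapped for full assets as each load completes.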
s3://commoncrawl/crawl-data/CC-MAIN-2023-50/segments/1700679099514.72/warc/CC-MAIN-20231128115347-20231128145347-00649.warc.gz
CC-MAIN-2023-50
4,271
20
https://azure.microsoft.com/ja-jp/community/events/unleashing-microsoft-advanced-threat-analytics/
code
Unleashing Microsoft Advanced Threat Analytics Register to attend this complimentary webinar on unleashing Microsoft Advanced Threat Analytics. In this session, we will provide deep dive information on Microsoft Advanced Threat Analytics (ATA), a technology based on the recent acquisition of Aorato, which helps you identify security breaches before they cause damage. In today's world, attackers compromise non-privileged and privileged users, devices, servers, etc. in order to gain access to the network and steal confidential information from businesses. This session will cover ATA fundamentals and provide a glimpse of ATA in action. Time: Thu, 07 Jan 2016 09:30:00 GMT (UTC)
s3://commoncrawl/crawl-data/CC-MAIN-2018-43/segments/1539583514314.87/warc/CC-MAIN-20181021181851-20181021203351-00415.warc.gz
CC-MAIN-2018-43
713
5
http://www.dslreports.com/forum/r27730252-
code
DarkLogixTexan and ProudPremium |reply to Rhenai | Re: Design a raid If I made one it'd be HUGE. For the layout, think Karazhan leading into Naxx (but only Naxx in the general layout). So you'd have the entry as a building (like Karazhan) with multiple paths to get through it (i.e. 75% of the bosses would be "optional"), but depending on which bosses you kill (or in some cases beat till friendly), as well as the order they're done in, you'd get some helpers for the next part of the raid. Floating above the building would be a thing (kinda like Naxx, but round and more like it had risen from the depths of the ocean). In this part (think of the internal layout as a cross between Naxx and Ulduar, if you entered Ulduar at Yogg's room) it'd have all kinds of cool rooms for the bosses, with some mechanics like OS3D but different. You could go right to the boss of any of the 5 sections (you'd have to start with the center 6th), and if you do the room boss first you have a harder fight but get more loot, and can still do the other bosses in the section if you want, but they'd be amped up now that their "leader" was dead, thus they too would now give better loot. Then after clearing all 6, a dragon would spawn on top of the raid; it would then fly your raid group to an underwater part of the raid. There would be a seemingly bottomless pool next to the raid, and the dragon would dive into it, taking you to a titan structure (OK, maybe you'd have to fight the dragon first to make it friendly). The structure would still be intact (unlike Ulduar), and there would be an Old God unlike any before (he'd actually be aspiring to be a titan), with another Old God keeping the prison secure. You'd have all 4 elemental lords to deal with, as they'd have come to secure the prison to prevent the good old one from meddling in the others' plans. And you'd have the Highborne queen there too. As for the mechanics of the bosses, you'd have almost all of the mechanics that have been used.
Now, to make it not be a 100/100 raid, it'd be broken into sections with a requirement to have cleared the previous section at least once before entering. And beside the 1st instance would be four dungeon instances that would have to be cleared once before entering the raid.
s3://commoncrawl/crawl-data/CC-MAIN-2014-10/segments/1394010484313/warc/CC-MAIN-20140305090804-00079-ip-10-183-142-35.ec2.internal.warc.gz
CC-MAIN-2014-10
2,243
18
http://www.diychatroom.com/f105/stucco-paint-texture-flaked-off-156943/
code
I'm not familiar with exterior wall designs. I notice that there are a few areas of my exterior wall where the paint has flaked off. The flaked-off paint is textured like stucco or drywall, and is thicker than normal paint that peels off. It didn't seem like the wall behind the paint was quite as textured as the paint. See photo. Is what flaked off a (colored) stucco texture applied on top of the (concrete?) wall that didn't bond well (at least over time, if not initially)? And is that 1 or 2 coats? In its current state, how would you repair this, assuming I can find matching color? And is it OK to leave it as is and not repair it, and perhaps just brush waterproofing sealant over the affected area? It would probably be a pain to match the color to repair a small area otherwise.
s3://commoncrawl/crawl-data/CC-MAIN-2017-17/segments/1492917118743.41/warc/CC-MAIN-20170423031158-00572-ip-10-145-167-34.ec2.internal.warc.gz
CC-MAIN-2017-17
767
3
https://www.grasshopper3d.com/forum/topics/grasshopper-rhino-beta-whats-happened-to-ngons
code
algorithmic modeling for Rhino I have two lists of polylines, and from those lists I create ngon meshes. After the Rhino Beta installation, something went wrong with the display from Grasshopper. If I create an ngon mesh from one list of polylines, the display is correct. If I input a data tree, or two lists, the meshes are still correct but are only displayed in wireframe. Why? I think it has nothing to do with the code inside, because baking the meshes works well and they are valid, but displaying multiple meshes with ngons results in wireframe display. I can take those two meshes and join them into one, and the display is correct again. But it seems Grasshopper cannot handle displaying more than one mesh with ngons.
s3://commoncrawl/crawl-data/CC-MAIN-2023-23/segments/1685224654097.42/warc/CC-MAIN-20230608035801-20230608065801-00203.warc.gz
CC-MAIN-2023-23
731
9
https://sunglass.io/standard-deviation-matlab/
code
What is Standard Deviation MATLAB? Table of Contents (click to navigate) - Introduction to Standard Deviation Theory - Standard Deviation MATLAB - Brief Tutorial: Standard Deviation MATLAB Introduction to Standard Deviation Theory The first time any science or mathematics student hears the words standard deviation, it sounds daunting. In fact, many pray that they'll never see it again. The equation alone, for those who aren't fans of basic calculus, is one that turns away many. For those who take the time to carefully analyze it, however, the equation eventually becomes harmless. The standard deviation morphs from your worst mathematical nightmare into your statistical best friend. A key statistical term, the standard deviation is that aspect of your data collection that allows you to identify how much your data points vary from each other. Depending on what's being done, it is helpful to identify the amount of variation in your data set. For applications that require both precision and accuracy, the standard deviation will help you in the process of your data analysis. It is represented by the equation s = sqrt( Σ (rᵢ − r_avg)² / (n − 1) ), where rᵢ is a data point, r_avg the average of the data pool under analysis, and n is the number of data points in the pool. It will therefore be possible for the standard deviation to be calculated for all data sets. From the image above, analysis of a myriad of data sets has indicated that the data will often form a Gaussian distribution about a central (mean) value.
Denoted by the zero point in the data set, the rest of the data usually falls into the following categories: - 34.1% of the data will fall between the mean (average) and one standard deviation, on either side - 13.6% of the data will fall between one and two standard deviations from the mean - 2.1% of the data will fall between two and three standard deviations from the mean The aim in any statistical analysis is to have a data set with a low standard deviation, indicating that your data is precise and clustered around a central point. If the data itself is spread across a wider range, then the data may have to be re-collected. When making statistical calculations, the confidence of the analyst is often informed by the standard deviation, as well as by additional detailed statistical tests in order to determine if the data set is statistically significant. As an initial "wet test", the data will first be checked via the standard deviation. Ideally, when data is analyzed, the analyst is anticipating a confidence level of up to 95 percent. The applications of this universal parameter are numerous. Every field that falls into the category of mathematics, science and engineering utilizes this data. Insurance premium calculations, investment strategies, engineering control averages, and a myriad of other functions will utilize the standard deviation. Knowing that the parameter is critical, in this lesson you will learn: - What is Standard Deviation MATLAB – the syntax used by MATLAB in order to calculate the relevant data variance details - How to calculate the Standard Deviation using MATLAB Without any further ado, let us jump into the tutorial. Introduction to Standard Deviation MATLAB The standard deviation MATLAB function is the part of MATLAB's function library that enables the user to calculate the standard deviation or the variance of a data pool.
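As an aside, the sample standard deviation formula above is only a few lines in any language; here it is sketched in Python (an illustration of the formula only, not part of MATLAB's syntax):

```python
import math

def sample_std(data):
    """Sample standard deviation: sqrt(sum((r_i - r_avg)^2) / (n - 1))."""
    n = len(data)
    r_avg = sum(data) / n
    return math.sqrt(sum((r - r_avg) ** 2 for r in data) / (n - 1))

# A small example data pool (arbitrary values):
print(round(sample_std([2, 4, 4, 4, 5, 5, 7, 9]), 4))  # 2.1381
```

Note the n − 1 in the denominator: this is the sample (rather than population) standard deviation, and it is also MATLAB's default normalization.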
The MATLAB system is a powerful tool and provides more than one means via which the parameter can be calculated. The following are the options that are available for the user: S = std(A) S = std(A,w) S = std(A,w,'all') S = std(A,w,dim) S = std(A,w,vecdim) S = std(___,nanflag) Let us define each of the parameters so that as a user, you can determine the option that will work best for you. S = std(A) is the option that will generate the standard deviation of the elements of A, operating along the first array dimension whose size is not equal to 1. The MATLAB system can calculate the standard deviation of vectors, matrices, and also multi-dimensional arrays. S = std(A,w) is the MATLAB option that applies a weighting scheme. The user defines the weight given to each observation, and the weights are used when normalizing the variance. This is one means via which the user can control how much each data point contributes to the result. S = std(___,nanflag) is the sixth option, which controls how values that are not numbers (NaN) are handled. This is useful when a data set contains missing or invalid entries, and the analyst would like the option to omit them from the calculation of the standard deviation. With all this knowledge under our belts, let us jump into the practical portion of the exercise. As usual, now's the time to turn on your system, and see how best you can apply this good knowledge! Knowledge is power!!! Brief Tutorial: Standard Deviation MATLAB The first step of the exercise is to log into the MATLAB system. For the purposes of this tutorial, the R2019a version of MATLAB will be utilized. Opening a fresh command window, the following is your welcome screen: With the system up and running, we can now dive into the calculation pool.
Only the real statisticians crave the thrill that comes from having accurate and precise data…so let's be that. As of now…you'll transform into that motivated version of yourself…that loves statistics. Let's do this! From the MATLAB workbook, we will start our first objective. Create a matrix and compute the standard deviation of each column. A = [4 -5 1; 2 3 5; -9 1 7]; S = std(A) The first line of code contains the data for the matrix A. From our syntax pool outlined above, we will begin with the first equation. Enter the code into the MATLAB Command Window, and then press Enter. When MATLAB works its magic, the following is the data that is generated: Congratulations, you have calculated your first standard deviation in MATLAB. Three values are generated because the matrix has three columns. The system will automatically silo the columns, and find the standard deviation of each one. With the additional options identified in the syntax, the user can instead compute the standard deviation of the entire pool. With your confidence boosted, let us calculate the next data set, with the appropriate weightings. Objective: Create a matrix and compute the standard deviation of each column according to a weight vector w. A = [1 5; 3 7; -9 2];…(1) w = [1 1 0.5];…(2) S = std(A,w)…(3) Let us dissect this code line by line. The first line of code is our data set. This is an array with three rows and two columns. The weightings are given for the three rows. It can be seen that the first two rows have equal weightings, while the final row has half the weight of the others. The system will calculate the relevant data, and then produce the final image above, with the critical answer. Since one weighting was given per row, and the matrix has two columns, two values are generated: one weighted standard deviation per column.
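For readers without a MATLAB license, both examples above can be cross-checked with NumPy (a sketch; note that MATLAB's std normalizes by n − 1, which corresponds to ddof=1 in NumPy, while MATLAB's weighted form normalizes by the sum of the weights):

```python
import numpy as np

# Unweighted: matches MATLAB's S = std(A)
A = np.array([[4, -5, 1], [2, 3, 5], [-9, 1, 7]], dtype=float)
S = A.std(axis=0, ddof=1)          # per-column sample standard deviation
print(S)                           # approx [7.0, 4.1633, 3.0551]

# Weighted: matches MATLAB's S = std(A, w), one weight per row
B = np.array([[1, 5], [3, 7], [-9, 2]], dtype=float)
w = np.array([1.0, 1.0, 0.5])
m = (w[:, None] * B).sum(axis=0) / w.sum()                      # weighted column means
Sw = np.sqrt((w[:, None] * (B - m) ** 2).sum(axis=0) / w.sum()) # weighted stds
print(Sw)                          # approx [4.4900, 1.8330]
```

The weighted case makes the row/column roles explicit: the three weights multiply the three rows, and one result is produced per column.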
As you work your way through the system, you'll be able to identify the syntax that will be relevant to your statistical needs. For your various science and engineering needs, the statistical tool of the standard deviation will be critical to your data integrity. Since science requires validation as proof of the integrity of the data, knowing that your data meets a 95% confidence level will ensure that, as an analyst, you are confident in your presentation. With the power of MATLAB, there is so much that you can do in the quest for number crunching, and for converting that data into useful information. With your adroit mathematical skills, you just might be the one to discover a process that is life-changing. - Wikipedia: https://en.wikipedia.org/wiki/Standard_deviation - Standard Deviation MATLAB: https://www.mathworks.com/help/matlab/ref/std.html
s3://commoncrawl/crawl-data/CC-MAIN-2023-50/segments/1700679511159.96/warc/CC-MAIN-20231211112008-20231211142008-00601.warc.gz
CC-MAIN-2023-50
8,172
48
https://mail.python.org/pipermail/python-list/2009-June/692085.html
code
C-extension 2 times slower than exe python at mrabarnett.plus.com Thu Jun 25 18:49:37 EDT 2009 Nick Craig-Wood wrote: > Rolf Wester <rolf.wester at ilt.fraunhofer.de> wrote: >> thank you all very much for your replies. >> I tried to simplify things and make the two versions as comparable as >> possible. I put the C++ part of the program into a shared object >> libff.so. For the exe, the main function is linked against this shared >> object. For the Python stuff I made an interface consisting of only one >> function call_solver, with the same code as the main function used >> for the exe. Then I created a wrapper for this interface using swig and >> linked interface.o, ff_warp.o and libff.so into _ff.so. The Python code >> just imports _ff and calls call_solver, which creates an object of the >> class Solver and calls its member solve (the main function of the exe >> does the same). >> I included some code for timing in the C++ code. >> #include <time.h> >> //beginning of solve >> clock_t t0 = clock(); >> clock_t t1 = clock(); >> //end of solve >> cout << "time used = " << (t1-t0)/CLOCKS_PER_SEC << endl; >> I'm using gcc 4.5 (latest snapshot) and Python 2.6 under Suse 10.3. The >> sources are compiled using the flags -fPIC -O3. >> 1) time python ff.py >> time used = 3.74 >> real 0m3.234s >> user 0m3.712s >> sys 0m0.192s > Those times look odd because the user time is greater than the real time. > User time is the number of CPU seconds used. Real time is wall-clock time. > That must mean > a) your program is threading > b) there is something up with timing on your computer > Looks odd, but exactly what it means I don't know! >> 2) time ff >> time used = 2.19 >> real 0m3.170s >> user 0m2.088s >> sys 0m0.168s Perhaps multithreading on dual cores?
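The user-greater-than-real pattern discussed above arises because `user` sums CPU seconds across all cores, so a multi-threaded process can accumulate more CPU time than wall-clock time. A small Python sketch of the two clocks (the workload is arbitrary):

```python
import time

wall_start = time.perf_counter()   # wall-clock ("real") time
cpu_start = time.process_time()    # CPU ("user"-like) time, summed over all threads

total = sum(i * i for i in range(10**6))   # arbitrary single-threaded workload

wall = time.perf_counter() - wall_start
cpu = time.process_time() - cpu_start
# Single-threaded, cpu is roughly <= wall; with several busy threads,
# cpu can exceed wall, which is exactly the "user > real" case above.
print(f"wall={wall:.3f}s cpu={cpu:.3f}s total={total}")
```

Note this is also why C's `clock()`, used in the quoted snippet, can disagree with `time`: it measures CPU time for the process, not elapsed wall time.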
s3://commoncrawl/crawl-data/CC-MAIN-2021-49/segments/1637964358903.73/warc/CC-MAIN-20211130015517-20211130045517-00074.warc.gz
CC-MAIN-2021-49
1,800
43
https://blog.getcollate.io/openmetadata-0132-release
code
OpenMetadata 0.13.2 Release 5 min read Improved SQL Lineage, Glossary Bulk Upload, Unified Tag Category API, Mutually Exclusive Tags, Chrome Extension, and lots more. We are glad to announce OpenMetadata’s Release — 0.13.2, which showcases some exciting features and improvements like Glossary bulk upload, improved SQL lineage, mutually exclusive tags, and a beta release of the OpenMetadata Chrome extension. The OpenMetadata community recently hosted a Mini-Webinar on Custom Connectors, which discussed the core aspects of OpenMetadata, the ingestion process, and building your own custom connectors. Most organizations have their own in-house systems and other third-party services crucial to their business. OpenMetadata supports 60+ connectors, and by building a custom connector, you can have a holistic view of everything that happens in your Data Platform. We plan to add additional features to the SaaS version in future releases, so stay tuned for more updates 🚀. Thanks to OpenMetadata’s amazing community. We’ve been receiving productive feedback and active code contributions over the past one and a half years. And, we deeply appreciate your support and feedback. Crossed 1800+ GitHub stars The Slack community reached 2200+ members 141 Open source GitHub developers 312 Commits were merged into the 0.13.2 release OpenMetadata 0.13.2 Release Highlights Glossary Import and Export Glossary import and export When the Glossary was introduced in the 0.9 release, we supported a manual approach to add the glossary terms one at a time. This can become time-consuming for organizations already maintaining a business glossary in a spreadsheet or similar tools. Based on community feedback, we’ve introduced the functionality to bulk upload glossary terms. Users can save time and effort by uploading a CSV with thousands of terms in one go. The import utility will validate the file and show you a preview of the elements that will be imported to OpenMetadata. 
Just as you can import the glossary terms, you can export the glossary data as a CSV file. We take your feedback really seriously, and we are always excited to see community requests come to life! Improved SQL Lineage Open source thrives when communities grow together. We have been heavy users of sqllineage for our query parsing processes, and it’s been a great tool. Still, the growing number of connectors and community needs required us to improve our existing parsing capabilities. We have worked together with Reata, the developer behind sqllineage, to migrate the library’s core to sqlfluff. We’d like to take this opportunity to thank both communities for merging our contributions and helping bring our lineage parsing to the next level. If you are interested in learning more about how we handle Data Lineage in OpenMetadata, you can look at our webinar! New Glossary UI Previously, the Glossary and the Glossary terms were displayed together in a tree UI. Now, we’ve separated the Glossary and Glossary Terms within the UI, such that the Glossaries are displayed in the left navigation pane, while the glossary details are displayed on the right. The list of terms has been sorted alphabetically for better navigation. The improved UI changes will help you easily access the details with tags and descriptions in the cards. From Tag Category to Classification We have renamed Tag Categories to Classification as a more widely used term for expressing groups of tags. The team also took this chance to unify the Tags API to conform to the other APIs we have been building. This change will make it easier for developers familiar with the existing APIs, and it simplified different internal flows. You can find more information about the exact modifications here. Mutually Exclusive Tags from the UI There are situations where mixing multiple tags from the same classification does not make sense. Imagine an asset being tagged both with PII Sensitive and PII Non-Sensitive! 
When creating a Classification or a Glossary term, their elements can be mutually exclusive. If this configuration is enabled, you won’t be able to assign multiple tags from the same category or glossary to the same data asset. OpenMetadata Chrome Extension We are elated to announce the launch of the BETA version of the OpenMetadata Chrome extension. Feel free to explore it and share your feedback. We are focused on meeting the needs of the Data Community, by simplifying processes and making all the metadata ubiquitously accessible by providing the necessary context in all your tools. Databricks Pipeline Connector has been added as a community contribution! Some important fixes have been made for the AWS QuickSight connector. Improvements have been made to DB2 constraints and profiler. The Oracle connector now ships with the Thick mode enabled. Added support for Postgres Foreign Tables Connectors like Redshift or Snowflake are way faster now, as we get the descriptions in batches. Added support for Data Lake profiler row-based sampling. The team introduced an enforcement of the Entity Names’ format using regex patterns. The intention is to better manage and harmonize Entity Names values and to allow users to form better expectations around them. You can find more information about this change in the docs. Planned for Next Release Our next release will be the long-awaited 1.0 Release ✨ The team has already started developing amazing new features, such as: Storage Service revamp to better represent your Datalakes (based on your feedback) with awesome community members! NoSQL stores API definitions to prepare the ingestion for sources like Elasticsearch, Automatic PII tagging during the profiler workflow, Continuous improvement for the brand-new sqllineage package And more! Stay tuned for any news, and let us know which features you’d like to see next! 
Thanks to our Contributors It’s been great to associate with the open-source community members for the last one and a half years. We’ve been receiving encouraging feedback, and great code contributions; all this has been a huge motivation to keep scaling new heights. Thanks to Aashish1221, Alina Valea Forter, Bigdata-spec, Frederico Cotrim, geoHeil, Gnomolio, Itai Sevitt, Laila Patel, Martin Trillhaas, Nithin Kumara N T, Paulo Ventura, Sérgio Passos for raising GitHub issues that made it to the 0.13.2 release. Please contact us on Slack if you have questions about code, installation, and docs. For feature requests, you can file a GitHub issue or reach out to us on Slack. Interested in contributing code? Here are some good first issues to get you going. Please give us a GitHub star if you like what we are doing. That’ll greatly help OpenMetadata in reaching a wider audience.
s3://commoncrawl/crawl-data/CC-MAIN-2024-10/segments/1707947474843.87/warc/CC-MAIN-20240229134901-20240229164901-00475.warc.gz
CC-MAIN-2024-10
6,762
51
https://www.driverguide.com/driver/detail.php?driverid=46689
code
For serial number (S/N 52G1XXXXXXXX)
OS: Windows 98, Windows ME, Windows 2000 Pro, Windows XP
Device: Plustek 1248U (USB scanner)
Type: Driver, Full Version
Size: 4828 KB

If your serial number is invalid, please try this one (1248U): S/N 52G106000000

This driver was tested with Windows XP. Plustek doesn't guarantee that it works on every PC configuration. To install the driver and to run the scanner, you need administrator rights.

Already tried it? Give your review.
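The page offers a fallback serial in the 52G1XXXXXXXX format. A tiny Python sketch could sanity-check a serial before trying it; note the format rule here is only an assumption inferred from the two serials shown on this page, not from any Plustek documentation:

```python
import re

# Assumed format, inferred from the serials on this page: the literal
# prefix "52G1" followed by eight digits. Plustek's real rule may differ.
SERIAL_RE = re.compile(r"^52G1\d{8}$")

def looks_like_1248u_serial(serial: str) -> bool:
    """Check whether a string matches the assumed 1248U serial format."""
    return SERIAL_RE.fullmatch(serial) is not None

print(looks_like_1248u_serial("52G106000000"))  # the fallback serial above
print(looks_like_1248u_serial("52G1XXXXXXXX"))  # the placeholder, not digits
```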
s3://commoncrawl/crawl-data/CC-MAIN-2018-34/segments/1534221215284.54/warc/CC-MAIN-20180819184710-20180819204710-00215.warc.gz
CC-MAIN-2018-34
461
10
https://vishnumaiea.in/blog/why-i-love-arduino/
code
Why I love Arduino? I got to know Arduino even before I learned what a microcontroller was. I first saw Arduino when I was surfing the web three and a half years ago. I fell in love with it at first sight, seeing the logo. It was beautiful and made me curious. It always happens to me with well-designed electronic circuits, devices, toys and the like, and even a chrome-plated screw. I have a few of them in my collection. Coming back to Arduino, what makes it so lovely? When you want to start with microcontrollers, you have hundreds of them to choose from, and many hundreds of tools and things like that. What Arduino did was to pick the best ones out of them and integrate them into an open-source physical computing platform. That's what the Arduino team did, and the effect was huge. Just google Arduino and look at the results to see the effect Arduino has made. There wouldn't be any such search results if there was no Arduino. Arduino was actually designed for artists who don't know much about electronics and programming, but still want to make their works interactive like you see in the movies. Arduino is an ongoing revolution because more people are attracted to hobby electronics, programming and the like than ever before. It is an ideal platform for students to sharpen their creative skills, even from school. Another reason why I love Arduino is that it is open source. All the design files are available to you, so that you can alter and share them. I always like to be open source. Fortunately, what happened to Linux didn't happen to Arduino. What happened to Linux was that it failed to keep a standard format or version. Many people say that Ubuntu is the standard version, but you can't really say that. Arduino, however, has standard reference versions of boards manufactured by an Italian company called Smart Projects, and it has a standard IDE for programming even while being open source.
So you can choose from the standard boards or the clones made by other companies, which are cheaper and usually Chinese-made. It's up to you, but I would recommend the official boards because the others are poor in terms of build quality. Getting an Arduino board is not a problem wherever you are. Arduino has many official distributors all over the world, which are listed on their website. I bought my first Arduino board from rhydoLabz, which is the only official distributor in the state of Kerala, India. They have SparkFun distribution too. It was like a dream come true when I got my first Arduino. I didn't know what to build with it first, but today I can design and build almost anything with Arduino, given all the tools and materials, combining my other skills. If you are out there and want to start with microcontrollers, programming and DIY hardware hacking, find an official distributor near you and get an Arduino board today. You can find all my Arduino projects on this website. Happy hardware hacking and stay creative 😉
s3://commoncrawl/crawl-data/CC-MAIN-2024-10/segments/1707947474808.39/warc/CC-MAIN-20240229103115-20240229133115-00398.warc.gz
CC-MAIN-2024-10
2,955
6
https://sylmion.blogspot.com/2013/10/im-not-doing-post-today-but-luckily-you.html
code
Yeah lucky thing I arranged these visits ages ago, because today I barely have the energy to type. So if you're at all interested in getting your Misha fix today: 1) I'm hanging out with Darrion and Gawain from The Vanished Knight at the A to Z Challenge blog. 2) I did an interview with Laura on writing on whether I prefer cake or pie. Answer is cake. Probably forever. See below for the reason. 3) Michael wrote a cool intro to The Vanished Knight. 4) I'm visiting Robin, talking about how I first started writing. I hope to spot you at any or all of these. Probably later. Right now, I'm going back to sleep. Before that, though, you deserve an explanation, both about why I lack energy and why I now hate pie. My haiku on pie. There once was a pie, I devoured without thought of it poisoning me. On that lovely thought, I leave you today. Anyone else wanted to die from food poisoning before?
s3://commoncrawl/crawl-data/CC-MAIN-2023-40/segments/1695233511000.99/warc/CC-MAIN-20231002132844-20231002162844-00790.warc.gz
CC-MAIN-2023-40
897
11
https://smaatroll.biz/2224.html
code
Download Big L Riz Edification EP mp3 flac full album vinyl rip View credits, reviews, tracks and shop for the File release of "Edification EP" on Discogs. Edification - EP. Big L Riz. House · Preview. Song. Time. Edification (Extended Version). 1. PREVIEW. Archie's Theme. Music Video. Big L Riz - Edification (Extended Version). Featured In. Album. Edification - EP. Big L Riz. Top Songs By Big L Riz. Check out Edification EP by Big L Riz on Amazon Music. Stream ad-free or purchase CD's and MP3s now on Check out Edification (Extended Version) by Big L Riz on Amazon Music. Stream ad-free or purchase CD's and MP3s Big L Riz. From the Album Edification EP. Edification EP | Big L Riz. Stream and download in Hi-Res on Play and download Solidification EP album by Big L Riz - including the songs "Soft Satin Tissues", "Rock Solid", "Best Served Chilled". [mp3] Listen to online Big L Riz - Edification EP, or download mp3 tracks: download here mp3 release album free and without registration. Includes unlimited streaming via the free Bandcamp app, plus high-quality downloads of Cauterization, Actuation EP, Solidification EP, and Edification EP. Edification EP · Big L Riz Nightclub. Big L Riz Actuation EP Is available now on iTunes & Google Play Music Big L Riz - Shine Bright Like A Diamond.
s3://commoncrawl/crawl-data/CC-MAIN-2021-49/segments/1637964362923.11/warc/CC-MAIN-20211204003045-20211204033045-00340.warc.gz
CC-MAIN-2021-49
1,301
2
http://www.fixya.com/support/t1061825-wireless_device_play_outside_car
code
I have the auxiliary port in my car radio and I connect the audio cable to my Zune, but I want to know if there is a device that connects to the car radio and to the Zune and wirelessly sends the music playing.

Yeah. Try getting the Microsoft or Belkin FM Transmitter. The only issue I have noticed is that if you use the transmitter, it can get "static-y", and the audio cable gives you better volume control. Hope this helps!

If your Kenwood has an AUX input then you can connect an audio cable from the headphone connector on the Zune to the AUX input on the Kenwood. Power up the Zune, set it to music, and switch the Kenwood to AUX.

Just download the software from zune.net and then connect your Zune HD to your laptop with the USB cable, and you can then add all of your media to the device. In order to wirelessly sync your device you'll need to first connect it with the cable and set up the wireless sync settings. After that you don't need to use the cable to connect your Zune to your laptop.
I am not sure if this is gonna work but, since you've tried all other steps, why don't we give this a try too... LOL :-)

To troubleshoot this issue, try one or more of the following methods in the order in which they are listed:

- Let your Zune device charge for 30 minutes, then disconnect and reconnect the cable to turn on the player. It can take up to 3 hours to fully charge.
- Connect the Zune device to a different USB port on the computer. Use a USB port on the back of the computer. Do not use a USB hub.
- Use another supported high power source, such as an Xbox 360 console, a Zune AC Adapter, a Zune Car Charger, or a Zune FM Transmitter/Charger. Note: These devices are sold separately.
- If you are connecting your Zune device to a mobile computer, make sure that the mobile computer is plugged in and is not running on battery power. Many mobile computers are configured to conserve power when they are running on battery power to extend battery life. This behavior may affect Zune device charging.
- If the Zune device is charging slowly, be aware that, for optimal charging, the Zune device requires at least 500 milliamps (mA) of power. Frequently, USB ports that are located on the front of a computer and on some USB hubs are low-power, 100-mA ports. Typically, the high-power, 500-mA USB ports are on the back of the computer. Additionally, when you suspend the computer or put the computer to sleep, high-power USB ports may be switched to low-power, 100-mA mode.

If the device boots to the home screen but is still not charging correctly, continue with the "Home screen is displayed" steps in the following section.

I found this on the Zune technical support (if this does help, be sure to rate my solution please):

Your Zune device is not detected by your computer or the Zune software
View products that this article applies to.

SYMPTOMS
When you try to use Zune Software, you experience one or more of the following symptoms:
• The Zune device is not detected by the computer.
• The Zune device is not detected by Zune Software. In this case, when you pause on the device icon in the lower-left corner, you receive the following error message: No Zune device connected
• If you restored the Zune device software, the following messages are displayed on the Zune device: Connect Zune to your PC / On your PC, open the Zune software and restore the Zune device firmware

THINGS TO TRY
To resolve this issue, follow these steps:
1. Update the Zune software on the computer. To do this, click settings, click software, click GENERAL, and then click CHECK FOR UPDATES. If the issue is still not resolved, go to the next step.
2. Restart the Zune device. To do this, follow these steps:
   a. Set the Hold switch to the unlocked position.
   b. Press and hold the Back button as you press the top of the Zune pad. When the Zune device begins to restart, release the buttons.
   If the issue is still not resolved, go to the next step.
3. Disconnect the Zune device from the computer, restart the computer, and then reconnect the Zune device. If the issue is still not resolved, go to the next step.
4. Connect the Zune device to a different USB port on the computer. Use a USB port on the back of the computer. Do not use a USB hub. If the issue is still not resolved, go to the next step.
5. When you connect the Zune Sync Cable to the connector port of the Zune device, make sure that both ends of the sync cable click into position. Additionally, make sure that the sync cable is not connected to the device at an angle.
6. Connect only one Zune device at a time. If you already have a Zune device connected to the computer, any other Zune that you try to connect will not be detected by the Zune software. Disconnect any additional Zune devices, and then try to connect again. If the issue is still not resolved, go to the next step.
7.
If a full-screen battery image appears when you connect the device to the computer, you must let the device charge for several minutes before the device will start and be detected by the computer. For more information about how to charge a Zune device, click the following article number to view the article in the Microsoft Knowledge Base: 927348 (http://support.microsoft.com/kb/927348/) How to charge the battery in your Zune device. If the issue is still not resolved, go to the next step.
8. If you are using Windows Vista, install the Cumulative update rollup for USB core components in Windows Vista software update. This update resolves some reliability issues in the USB core components in Windows Vista. You will achieve better reliability in various scenarios by installing this update. For more information about this update, click the following article number to view the article in the Microsoft Knowledge Base: 941600 (http://support.microsoft.com/kb/941600/) Cumulative update rollup for USB core components in Windows Vista. If you are not using Windows Vista, or if the issue is still not resolved, go to the next step.
9. Examine Windows Device Manager. To do this, follow these steps:
   a. Click Start, right-click Computer, and then click Properties.
   b. Click Device Manager. Zune should be listed under Portable Devices.
   Note: If the device was installed incorrectly, Zune may be listed together with a yellow exclamation mark (!). Or, it may be listed under Other devices as USB Device. Double-click Zune, and then click the Driver tab. The current version of the driver that is installed should be 2.3.1035.0.
   If Windows Device Manager indicates a device problem, or if the version of the driver that is installed is not the current version, you must update the driver. To do this, follow these steps:
   1. Right-click Zune or USB Device, and then click Properties.
   2. Click the Driver tab, and then click Update Driver.
   3. Click Search automatically for updated driver software.
   4.
Click Install if you are prompted to do this.
   5. After the driver installation is complete, click Close. If Windows Device Manager found the drivers, click Close.
   If you want to locate the drivers manually, follow these steps:
   1. In the Zune Properties dialog box, click Update Driver.
   2. Click Browse my computer for driver software.
   3. Click Browse, locate the \Program Files\Zune\Drivers folder, and then click OK.
   4. Click Next.
   5. Click Install if you are prompted to do this.
   6. After the driver installation is complete, click Close, and then click Close again.
   If the issue is still not resolved, go to the next step.
10. Examine the computer for conflicts that involve the USB host controller or the root hub. To do this, follow these steps:
   a. Click Start, right-click Computer, and then click Properties.
   b. Click Device Manager.
   c. Expand Universal Serial Bus controllers.
   d. Verify that no errors appear in the list of USB devices. A yellow exclamation point or a red "X" next to a device indicates errors. If any errors appear, contact the manufacturer of your computer.
For more information about how to troubleshoot conflicts in Device Manager, click the following article number to view the article in the Microsoft Knowledge Base: 310126 (http://support.microsoft.com/kb/310126/) Troubleshooting device conflicts with Device Manager. If the issue is still not resolved, go to the next step.

Try this...
Reset Zune: Press and hold the back button while pressing up on the control pad.
Connect Zune to a power source: If the Zune device is in sleep mode, this action will wake the Zune device and turn it on. If the battery does not have a charge, you may have to wait several minutes for the Zune device to turn on.
Also, make sure that the hold button on top of your Zune is turned to the off position. Hope this helps.
ALL YOUR BASE BELONG TO US is a video that must have been loaded on the Zune previously.
If all else fails...
If you connected the Zune to your computer before installing the software, the computer may have installed it as a different device. Go to Control Panel, then System, then Device Manager. Plug in the Zune player and see if any new devices show as being connected. BE CAREFUL IN DEVICE MANAGER THOUGH; IF YOU UNINSTALL THE WRONG ITEM, IT COULD MESS YOUR COMPUTER UP.
s3://commoncrawl/crawl-data/CC-MAIN-2017-04/segments/1484560281424.85/warc/CC-MAIN-20170116095121-00468-ip-10-171-10-70.ec2.internal.warc.gz
CC-MAIN-2017-04
10,082
44
http://cloudcomputing.sys-con.com/node/2523406
code
By Marketwired | January 23, 2013 11:30 AM EST

SEATTLE, WA -- (Marketwire) -- 01/23/13 -- Tableau Software, a global leader in rapid-fire business intelligence software, today reveals the top 10 trends for Business Intelligence in 2013. As the world of enterprise databases is developing at warp speed, startups must address new data problems and established companies must innovate their legacy platforms to remain competitive. With all the attention organizations are placing on innovating around data, the rate of change will increase exponentially, to nobody's surprise. The biggest takeaway: the trends are BIG -- as in it's going to be a big year for Business Intelligence growth.

Tableau's vision for the Top 10 Trends for Business Intelligence in 2013 includes:

1. Proliferation of data stores: Once upon a time, an organization had different types of data: CRM, point of sale, email, and more. 2013 is the year we will recognize that having all your data in one fast data warehouse is passé. Big data could be in places like Teradata and Hadoop. Transactional data might be in Oracle or SQL Server. The right data stores for the right data and workload will be seen as one of the hallmarks of a great IT organization, not a problem to be fixed.

2. Hadoop is real: Back in 2008 and 2009, Hadoop was a science project. In 2012, we saw the emergence of many production-scale Hadoop implementations, as well as a crop of companies trying to address pain points in working with Hadoop. In 2013, Hadoop will finally break into the mainstream for working with large or unstructured data.

3. Self-reliance is the new self-service: Self-service BI is the idea that any business user can analyze the data they need to make a better decision. Self-reliance is the coming of age of that concept: it means business users have access to the right data, that the data is in a place and format that they can use, and that they have the solutions that enable self-service analytics.

4.
The value of text and other unstructured data is (finally!) recognized: One of the subplots of the rise of Hadoop has been the rise of unstructured data. With the explosion of social data like Twitter and Facebook posts, text analysis becomes even more important.

5. Cloud BI grows up: In 2013 we expect to see the maturation of cloud BI, so that people can collaborate with data in the cloud, just like they collaborate on their Salesforce CRM or help desk data.

6. Visual analytics wins Best Picture: People are finally beginning to realize that visual analytics helps anyone explore, understand and communicate with data. It's the star of business analytics, not a handy tool for scientists.

7. Forecasting and predictive analytics become common: Forecasting tools are maturing to help businesses identify emerging trends and make better plans. We expect forecasting and predictive analyses to become much more common as people use them to get more value from their data.

8. Mobile BI moves up a weight class: Last year we predicted that Mobile BI would go mainstream -- and it did. Now everyone from salespeople to insurance adjusters to shop floor managers uses tablets to get data about their work right in the moment.

9. Collaboration is not a feature, it's a reality: In 2013, collaboration must be at the root of any business intelligence implementation, because what is business intelligence but a shared experience of asking and answering questions about a business?

10. Pervasive analytics are finally... pervasive: When we talk more about data, and less about software categories like BI, we get to the crux of maximizing business value -- and fast, easy-to-use visual analytics is the key that opens the door to organization-wide analytics adoption and collaboration.

These trends build upon the amazing developments of 2012: databases proliferated, startups formed, and visualization and data discovery became increasingly recognized as their own categories.
Web-based analytics tools are connecting to web-based data. And everything's mobile. Tableau integrates with virtually every platform and enterprise solution -- delivered via desktop, web and mobile devices -- with no programming required. Whether via native data connector or in-memory, Tableau is the data analysis software that allows you to work with your data no matter where it lives.

The Top 10 Trends for Business Intelligence in 2013 is available for download in PDF format here.

About Tableau Software

Tableau Software helps people see and understand data. According to IDC in its 2012 report, Tableau is the world's fastest growing business intelligence company. Tableau helps anyone quickly analyze, visualize and share information. More than 10,000 organizations get rapid results with Tableau in the office and on-the-go. And tens of thousands of people use Tableau Public to share data in their blogs and websites. See how Tableau can help you by downloading the free trial at www.tableausoftware.com/trial.

Tableau and Tableau Software are trademarks of Tableau Software, Inc. All other company and product names may be trademarks of the respective companies with which they are associated.
s3://commoncrawl/crawl-data/CC-MAIN-2016-22/segments/1464051113990.53/warc/CC-MAIN-20160524005153-00015-ip-10-185-217-139.ec2.internal.warc.gz
CC-MAIN-2016-22
15,581
60
http://lists.qt-project.org/pipermail/development/2012-January/001718.html
code
[Development] Tests, Shadow-Build and Cross-Compilation Holger Hans Peter Freyther holger at freyther.de Mon Jan 30 07:56:49 CET 2012 On 01/29/2012 09:21 PM, Olivier Goffart wrote: >> The other part/wish would be to always have deployment targets for the >> testcases and generate a run script or such as part of the installation. > There is QFINDTESTDATA and QTest::qFindTestData that have been added recently, > I think for this reason. Have you heard of them? No, I did not hear of them; when trying to resolve the shadow-build issue in xmlpatterns I just grepped qtbase/tests and the SRCDIR solution popped up. Sadly QFINDTESTDATA is not enough for xmlpatterns, e.g. the checkxmlfiles test will try to find all .xml/.ui/.xhtml/.qrc... files in the module (this also means that by moving this test out of Qt there is less test data). The xml patterns test uses QDir(QL1String("../../../")) to find the files recursively and QFINDTESTDATA determines that this will work with the build directory. Should we just call it a crappy test as the testdata is not well The other question is _should_ all tests be deployed or how do you plan to run tests on the device?
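For anyone who has not used it: QFINDTESTDATA resolves a relative test-data path by probing a few candidate locations (roughly: the directory of the test executable, the source directory recorded at compile time, then the current directory). A rough Python sketch of that probing order, not Qt's actual implementation:

```python
import os
import tempfile

def find_test_data(relative_path, build_dir, source_dir):
    """Mimic QFINDTESTDATA-style lookup (sketch): try the build
    directory first, then the source directory, then the cwd."""
    for base in (build_dir, source_dir, os.getcwd()):
        candidate = os.path.join(base, relative_path)
        if os.path.exists(candidate):
            return candidate
    return ""  # lookup failed; QFINDTESTDATA likewise yields an empty string

# Demo of a shadow build: the data file exists only in the source tree.
build_dir = tempfile.mkdtemp()
source_dir = tempfile.mkdtemp()
open(os.path.join(source_dir, "data.xml"), "w").close()
found = find_test_data("data.xml", build_dir, source_dir)
```

In a shadow build the data exists only in the source tree, so the source-directory fallback is what makes the lookup succeed.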
s3://commoncrawl/crawl-data/CC-MAIN-2013-20/segments/1368699273641/warc/CC-MAIN-20130516101433-00089-ip-10-60-113-184.ec2.internal.warc.gz
CC-MAIN-2013-20
1,198
20
https://chiaforum.com/t/selling-supermicro-sc848-cse-848-24x-bay-jbod-4x-xeon-e5-4650-512gb-ram/5607
code
I’m selling my beloved SirPlotAlot. The idea was to have a combination of plotter and farmer that can dynamically be what’s needed. I actually never got past installing the OS, so I can’t share what it’s capable of. I essentially ran short on time. Why I sell: I just noticed that I don’t have a future at farming chia. I’m too small and netspace grew waaay too quickly for my initial calculations to still be relevant. I’m based in Switzerland, so transport around Europe using a courier like DHL or similar should not be an issue. Here’s the beast: Chassis: Supermicro SC848 - Peripherals: 4x USB 2.0, 1x Serial, 1x VGA, 2x GLAN, PS/2, 1x IPMI LAN - Height: 4U RAM: 512 GB ECC DDR3 PC12800 (16x32GB) Mainboard: Supermicro X9QR7-TF-JBOD RAID Controller: LSI MegaRAID 9271-8i, 8-port incl. BBU Backplane: BPN-SAS2-846EL1 Expander Backplane PSU: 2x PWS-1K62P-1R (1620W, 80plus platinum), redundant. - HDDs: 18x 2TB SATA Disks (No name, non-Enterprise disks!) - NVMEs: 2x 512GB NVME (No name, about 5 yrs old; never plotted a single plot to these) Includes RackMount Kit and 2x Power cables. There’s even 1 year of warranty on the parts (not from me, but from the original seller). The JBOD itself is in awesome condition. Was only used in climatized datacenters, so there is literally like no dust on the components. Everything is fully functional (PSUs got replaced 2 days ago by the original seller). It has some scratches on the chassis, but we are not in a beauty contest here anyway EDIT: I just split off the 18x 2 TB disks for a more modular approach (in case you don’t need the drives). So there are two options: a.) 3600 USD for the package as seen above excl. VAT, taxes, customs and shipping, and b.) 3000 USD for the package as seen above WITHOUT 18x2 TB disks, excl. VAT, taxes, customs and shipping. If interested, drop me a pm
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572021.17/warc/CC-MAIN-20220814083156-20220814113156-00392.warc.gz
CC-MAIN-2022-33
1,811
23
https://modernnights.bigdamnmush.com/index.php?title=PRP_Resources
code
Our PRP policies are listed in our Policies! The policy page is great to explain what and how to get started, but the following content is advice and help once you've gotten a PRP account to help you out. You can storytell in whatever way you see fit, but since we are out to create an accessible game for casuals as well as power users we want you to be willing to do the following things: Please do not map IC deadlines to an OOC date. IC deadlines should never be OOC Deadlines! If you have a time-sensitive deadline IC, that's fine, but any +request or @mail stating a character's IC actions to be completed by that deadline are valid so long as the request is sent in before the day of the IC deadline. You can read more about this outlook in our Mission Statement. If a PRP runner maps an IC deadline to an OOC date and this causes an issue for a player who couldn't complete the IC task due to OOC reasons, then staff reserves the right to change the IC continuity accordingly. Please do not set forth any expectation that players should wait on a timestop. We do have a command to issue timestops in order to dog-ear a scene and mark it down as a cliffhanger. However, Timestops on Modern Nights are not intended to put anybody in an RP lock (it all goes back to valuing the time of casuals and a trust-first model to our players as in our mission statement). Remember that 0 HP does not need to mean dead! The WOD health level mechanics aren't necessarily a shining model of balanced or believable behavior. We're here to tell a story for our engagement and enjoyment. Short of some significant mitigating narrative factor, you are strongly encouraged to treat 0 health levels as critical condition (and impending death if they don't get medical attention), not actually dead. Leverage the Karma system! Is your player stuck? Offer advice for a karma. Did they come unprepared? Offer the right tools for a karma. 
Once you're done, you can report that somebody spent their Karma in a +job (hopefully the same job you give 'em XP for!) As a PRP runner, you can manage and access notes which can be hierarchically sorted! Players can also benefit from and maintain hierarchically sorted notes, but you have slightly elevated access. The +events code is open for anybody at all to use. To create a new event: +event/create <Title>=<timestamp>/<genre>/<summary>. The helpfile for the +events code is accessible on the game by typing "+help events". Tracking Plots With Requests Do you want your players to be able to coordinate with you over time? You can have players assign jobs to you with the +job/assign <#> = <person> command, so at the end of your scenes, you can encourage people to send in +requests and assign them to you. When a +job is assigned to you, you can use the same +job/add and +job/cc commands that you use as a player, but you additionally can resolve jobs once you're done with +job/close <#>=<closing comments>. You also can use +job/approve <#>=<blah> and +job/deny <#>=<blah>. They're all pretty much the same command. Tracking Plots With Notes Your personal organizational tools are whatever works for you, but it doesn't hurt to provide you a tool that helps us stay in the loop alongside you! If you prefer to track your plots through notes and not +request, we're down. Just make sure staff is aware that the notes are there (we won't look for what we don't know exists, after all). When someone makes a note, that note is assumed to only be viewable by staff and the original note writer. However, notes can be published. You can use the +notes command to view any other player's (published) +notes by typing "+notes <their name>" and you can also do this with rooms (try "+notes clarion alley") from anywhere. You also can use "+notes here" to see notes for the room you're standing in. "Published" notes are +notes that have been marked for others to access if they're not staff.
These can be fully public (+notes clarion alley holds one such note) or these can be for PRP eyes only (+notes mage resources is one example). They can also be published to certain character types such as "only those with streetwise 3" (read more about note locking in +help notes), but the takeaway here is that notes can be published for PRP bits and you can see them. You can organize any note you have access to. The hierarchical sorting feature is available to players as well, but their usefulness for PRP runners bears mentioning! Let's say you make a note on yourself for your plot, and you want to divide it into three acts. You can make an act one note, an act two note, an act three note, and then create an overall note ("The Great Plot Arc") to file them under, like so: Let's make our first note: +note/write The Great Plot Arc=Here's an overview of The Great Plot! Next, we're making the various notes to file under our main one: +note/write Act One=Here's what happens in Act One. +note/write Act Two=Here's what happens in Act Two. +note/write Act Three=Here's what happens in Act Three. Finally, we're filing the notes under the great plot arc: +note/file Act One=The Great Plot Arc +note/file Act Two=The Great Plot Arc +note/file Act Three=The Great Plot Arc You can make a daisy chain of notes, so for instance, if we wanted a note that talked about certain things that happened in Act One and file it under act one: +note/write Timmy's Big Scene=This note talks about the time Timmy... +note/file Timmy's Big Scene=Act One When you access notes that have notes filed under them, you'll see the filed notes in an "Also See..." section, like this: --------------------[OOC - Staff - Mage Resources / nodes]--------------------- There are currently X nodes in San Francisco. This note holds their locations. --- * Written by Batty on July 07, 2021 * Staff and Batty can view this note. 
* Staff Approved Note ----[Also See...]-------------------------------------------------------------- node - sutro baths node - clarion alley node - undiscovered loc ------------------------------------------------------------------------------- Here, one could access the "node - sutro baths" note with "+note mage/node - sutro baths".
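The filing commands above build a simple parent-child tree. A minimal Python model of that hierarchy (the class and method names here are invented for illustration, not the game's actual code):

```python
class Note:
    """A note that can be filed under a parent note, as with +note/file."""

    def __init__(self, title, body):
        self.title = title
        self.body = body
        self.filed_under = None   # parent note, if any
        self.children = []        # notes filed under this one

    def file_under(self, parent):
        # Equivalent of: +note/file <this note>=<parent note>
        self.filed_under = parent
        parent.children.append(self)

    def also_see(self):
        # The "Also See..." section lists the titles of filed notes.
        return [child.title for child in self.children]

# Rebuild the example from the help text above.
arc = Note("The Great Plot Arc", "Here's an overview of The Great Plot!")
act_one = Note("Act One", "Here's what happens in Act One.")
act_two = Note("Act Two", "Here's what happens in Act Two.")
act_one.file_under(arc)
act_two.file_under(arc)
```

A daisy chain (a note filed under "Act One") is just another `file_under` call on a child, which is exactly the structure the "Also See..." listing walks.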
s3://commoncrawl/crawl-data/CC-MAIN-2022-21/segments/1652662529538.2/warc/CC-MAIN-20220519141152-20220519171152-00775.warc.gz
CC-MAIN-2022-21
6,158
31
https://forum.pycom.io/category/1/announcements-news
code
@tlanier said in Firmware Release v1.20.2.r4: The following commands cause the system to crash. Hi @tlanier . Thanks. Yes, I've seen this. Your screenshot shows it happening in 1.20.3.b3, but it can indeed also happen in 1.20.2.r4 which this thread is about. We haven't found the root cause of this yet. It's an issue for sure, but I think it shouldn't be blocking. I don't think anyone needs to run this function frequently in production. To test the firmware version you could also use the ATI1 command https://docs.pycom.io/tutorials/networks/lte/#firmware-version @peterp When I wrote that post it was reproducible using my example. Please try it out just as I posted it and see. I don't have a device on me at the moment, but I can give it another shot in a few hours. @mxklt said in Firmware Release v1.20.2.r2: I get an error when I try to use machine.pygate_reset() -> LORAPF_ERROR:pygate_reset failed to reset Pycom MicroPython 1.20.2.r2 [v1.11-3a7776d] on 2020-11-23; GPy with ESP32 Please update the Pygate FW and try again: https://docs.pycom.io/updatefirmware/expansionboard/ Nice to see that someone started to update the documentation. I don't know the actual status. When I started, many examples were broken or not implemented completely. Not a big deal for professionals, but you target beginners as well. They need running examples. Experts and beginners struggle together when examples show deprecated or outdated code. This will be a problem in the near future. You decided to stop updating LoPys. So we need documentation and examples for different versions. It would be nice if the examples were part of your tests, so you can notice bad examples early. @thinginnovations said in Pygate PoE Adapters: I think the date should be 6th July Sorry yes this has been corrected. @robert-hh said in Pygate PoE Adapters: For those who want to replace the diode in case it looks bad after de-soldering, it would be interesting to know which diode model can be used.
I'll check with the hardware team. I have just started to dig into threading, but my understanding is that Pycom only supports _thread, which has limited functionality compared to threading, like the lack of .start or .stop. Is my understanding false? Because the sample code is not working on my WiPy.
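On the last question: the difference is real. `_thread` exposes only `start_new_thread` and locks, with no `Thread` object to `.start()` or `.join()`, so `threading`-style sample code will not run unmodified. A CPython sketch of working at the `_thread` level (MicroPython's `_thread` is similar in spirit, though not identical):

```python
import _thread
import time

results = []

def worker(name):
    # Thread body; _thread passes the positional args from the tuple.
    results.append(name)

# There is no Thread object to join, so completion is detected by
# polling with a deadline instead of calling .join().
_thread.start_new_thread(worker, ("t1",))
_thread.start_new_thread(worker, ("t2",))

deadline = time.time() + 2.0
while len(results) < 2 and time.time() < deadline:
    time.sleep(0.01)
```

For anything shared beyond an append, guard it with `_thread.allocate_lock()`; that lock is the main synchronization primitive the module gives you.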
s3://commoncrawl/crawl-data/CC-MAIN-2021-43/segments/1634323587915.41/warc/CC-MAIN-20211026165817-20211026195817-00013.warc.gz
CC-MAIN-2021-43
2,268
20
https://www.freelancer.co.uk/projects/php/make-site-responsive-15669137/
code
I have a website that needs some help. Following is what I am looking for: 1. Need to make my site responsive so that it opens up on mobile 2. Rearrange and move things around on the basis of UX input 3. Make my site look more professional by updating the overall UI Functionally my site is up & running, just need help with the UI. 49 freelancers are bidding on average ₹7248 for this job Let's discuss this; we are experienced in web design, web development, SEO, SMM and digital marketing. Accept my proposal request and we can talk more about it. Thanks Rama Webzin
s3://commoncrawl/crawl-data/CC-MAIN-2019-04/segments/1547583660818.25/warc/CC-MAIN-20190118213433-20190118235433-00302.warc.gz
CC-MAIN-2019-04
575
7
https://www.vn.freelancer.com/projects/nodejs/optimize-api-for-app-faster/?ngsw-bypass=&w=f
code
I want someone to optimize the API to have my app load faster. Backend code is: Nestjs 10 freelancers are bidding an average of $184 for this job Hi dear, I just read your description and am interested in your project. I am an expert in Flutter and have done similar work. If you need quality work then feel free to contact me. Thanks.
s3://commoncrawl/crawl-data/CC-MAIN-2021-21/segments/1620243988850.21/warc/CC-MAIN-20210508061546-20210508091546-00370.warc.gz
CC-MAIN-2021-21
333
4
https://integrate.hubspot.com/t/creating-a-contact-from-utk/2965
code
We have a marketing site (example.com) that we use and then run our signups through a subdomain (platform.example.com). We’re trying to link the user data (web analytics, referrers, etc.) from an account that’s created on the subdomain to the account that’s created on our service (where an email is typically entered). We’re using our own form on our subdomain to create the account. We have access to the utk on both domains through the hubspot cookie. When we capture the user data using our own form, we’d like to take the vid associated with the utk and turn that into a contact with the email address and customer information that we capture. When we access the vid from the utk using the API call: The “is-contact” property is returning False. We’ve tried to forcefully update a record through the vid, even though the is-contact property is false, using the following API call: However, we’re receiving an error: “Can’t set properties when vid is not a contact.” Is there a way to create a contact based on a utk (where the associated vid is not yet a contact)? Or is there a better way to accomplish this?
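One approach that was commonly suggested for this situation (worth verifying against current HubSpot documentation) is the legacy Forms API: submitting the captured signup data with the hutk cookie value inside `hs_context` creates the contact and associates the anonymous analytics history with it. A sketch that only builds the request; the portal ID, form GUID and cookie value below are placeholders, and nothing is actually sent:

```python
import json
from urllib.parse import urlencode

def build_form_submission(portal_id, form_guid, email, hutk, page_url):
    """Prepare a legacy Forms API v2 submission carrying the visitor's
    hubspotutk cookie so the analytics history gets linked (sketch)."""
    url = f"https://forms.hubspot.com/uploads/form/v2/{portal_id}/{form_guid}"
    hs_context = {"hutk": hutk, "pageUrl": page_url}
    body = urlencode({"email": email,
                      "hs_context": json.dumps(hs_context)})
    return url, body

# Placeholder IDs for illustration only.
url, body = build_form_submission(
    "123456", "abcd-ef", "a@example.com",
    "cookie-utk-value", "https://platform.example.com/signup")
```

The returned `url` and `body` would then be POSTed as `application/x-www-form-urlencoded`; the point is that the contact is created via a form submission rather than by writing properties to a non-contact vid.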
s3://commoncrawl/crawl-data/CC-MAIN-2018-51/segments/1544376826686.8/warc/CC-MAIN-20181215014028-20181215040028-00044.warc.gz
CC-MAIN-2018-51
1,136
8
https://ipsec.pl/testing-x-content-security-policy.html
code
If you wondered how X-Content-Security-Policy works in real life, here’s an example. X-Content-Security-Policy is a proposed mechanism for limiting the impact of injection attacks against websites, such as Cross-Site-Scripting. For each page the server will return the CSP header in the HTTP response. The header describes what the browser should expect from the page and what it shouldn’t. For example this website returns the following CSP header: X-Content-Security-Policy: allow 'self'; script-src www.google.com www.readability.com; options inline-script; img-src *.creativecommons.org Does this really work? Let’s try. The following line contains a JS block that should execute a remote script. It’s a classical Cross-Site-Scripting test from the ha.ckers.org forum. On browsers supporting CSP you should see nothing (except for the JS source code block). On other browsers you will see a pop-up telling you that this website is vulnerable to XSS:
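The header value is a semicolon-separated list of directives, each a name followed by source expressions. A small parser for the exact policy quoted above (a sketch of the legacy X-Content-Security-Policy syntax, not the later CSP standard):

```python
def parse_csp(header_value):
    """Split a policy string into {directive: [source expressions]}."""
    directives = {}
    for part in header_value.split(";"):
        tokens = part.split()
        if tokens:
            directives[tokens[0]] = tokens[1:]
    return directives

# The policy this site serves, as quoted in the article.
policy = parse_csp(
    "allow 'self'; "
    "script-src www.google.com www.readability.com; "
    "options inline-script; "
    "img-src *.creativecommons.org"
)
```

A conforming browser consults `script-src` before executing the injected remote script, which is why the XSS test above stays silent under CSP.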
s3://commoncrawl/crawl-data/CC-MAIN-2022-49/segments/1669446710918.58/warc/CC-MAIN-20221203011523-20221203041523-00239.warc.gz
CC-MAIN-2022-49
944
4
http://ontolog.cim3.net/forum/ontolog-forum/2013-05/msg00175.html
code
On 5/21/2013 2:05 PM, David Eddy wrote: On May 21, 2013, at 10:20 AM, John Bottoms wrote: With the most complex data sets I've worked on, which is on the order of 150 million points of dirty data, So what does the enterprising Big Data Scientist do with so much suspect data? Clean it up? Smooth out the statistical anomalies? Cross The cleanup for that project was based on an informed understanding of the types of errors. Some came from the Scantron machine that read the data sheets, and some came from incorrect input from the users. In developing metrics, the outliers offer little value in some cases. Sometimes the data is graphed and the type of graph used is important. Sometimes "rule-of-thumb" metrics are used, but you have to know when they are validly usable. At times, an estimation calculation is done first and then used in the statistical analysis. The statisticians and psychometricians have an almost intuitive feel for how to deal with dirty data. It is a part of BigData that has not been addressed sufficiently yet. Concord, MA USA
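One "rule-of-thumb" of the kind described, valid only when you know it fits your error model, is a median/MAD filter that drops gross outliers (such as a misread Scantron value) before computing a statistic. A sketch:

```python
import statistics

def drop_outliers(values, k=3.0):
    """Keep values within k robust standard deviations of the median.
    1.4826 * MAD estimates sigma for normally distributed data."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    if mad == 0:
        return list(values)  # no spread to measure against
    cutoff = k * 1.4826 * mad
    return [v for v in values if abs(v - med) <= cutoff]

# A gross machine-read error hiding among plausible values.
clean = drop_outliers([1, 2, 3, 2, 1000])
```

As the post warns, this is only valid when the outliers really are errors; in heavy-tailed data the same filter would silently discard legitimate observations.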
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320306181.43/warc/CC-MAIN-20220129122405-20220129152405-00576.warc.gz
CC-MAIN-2022-05
1,054
20
https://forum.opensearch.org/t/support-for-high-level-rest-client/6308
code
Elastic has indicated that they plan to open source the high level rest client (HLRC) at some point in the future, and indicated that they will ‘not harass’ (my loose interpretation) people for using the non-apache version in the interim. In light of this, what is the status of current support in OpenSearch for HLRC? The best way to think about compatibility is this: - There is compatibility between OSS ES 7.10.2 and OpenSearch 1.0 - OpenSearch is making no breaking changes in the 1.x line of releases - After OpenSearch 2.0 there may be breaking changes and it’s best to start thinking about OpenSearch and ES as just sharing a common ancestor. - Some software that connects to ES does a license check to determine it’s not OSS ES. That won’t work due to the first point. Thanks for the response. Question on point #3. I understand that things will most likely break with OpenSearch 2.0 and 7.10.2 code, and that the HLRC will differ as the Elastic and OpenSearch features will differ. Do you anticipate that HLRC for OpenSearch will be updated to support new Java features? Any chance it will support Kotlin or other code bases to provide a uniform API? I’ve not seen nor heard anything about Java features or support for other JVM languages. Doesn’t mean that it won’t happen but I don’t think any plans have been made - it might be worth putting together a feature request/proposal in the OpenSearch repo. As also posted in the other thread, I’ve raised a ticket to publish the OpenSearch
s3://commoncrawl/crawl-data/CC-MAIN-2022-40/segments/1664030336921.76/warc/CC-MAIN-20221001195125-20221001225125-00189.warc.gz
CC-MAIN-2022-40
1,576
13
https://sharepoint.stackexchange.com/questions/43389/populate-data-from-sharepoint-list-to-a-sql-table
code
If you want "the values entered by the user must be populated in a SQL table", i.e. you want "the values entered by the user", then you shouldn't have made "a custom list which I modified and made into an InfoPath form with on-submit behavior", which I understand to mean the InfoPath form was made by pressing the violet "Modify Form" button on the ribbon. In this case "the values entered by the user" are already persisted by an InfoPath form of type/compatibility "SharePoint List Form", through the Main Data Connection (of type Main Data Source) onto a SharePoint list; that connection is auto-created and locked as non-modifiable during InfoPath form creation. The SharePoint list data is stored by the SharePoint server into the SQL Server content database for you automatically, by design, and this is unrelated to InfoPath. Any reason why you want to persist data redundantly, both into a SharePoint list and into a custom SQL Server database? I also wrote initially... Either it is possible through SharePoint, which is most probably not your intention given that you asked about submit from an InfoPath form. From InfoPath you should not do it through a SharePoint List Form... While it is possible to do from InfoPath, you should have asked WHAT you want to do, not HOW (eventually you cannot).
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296943562.70/warc/CC-MAIN-20230320211022-20230321001022-00621.warc.gz
CC-MAIN-2023-14
1,250
12
https://selfabsorbedboomer.blogspot.com/2013_05_19_archive.html
code
Kids were very different then. They didn't have their heads filled with all this Cartesian Dualism. --Monty Python, Episode 14 I've posted before (also see here) about the Amygdaloids, a rock band made up of Joseph LeDoux (lead vocals and rhythm guitar) and Daniela Schiller (drums and vocals), both neuroscientists at New York University; Tyler Volk (lead guitar and vocals), a biology professor at NYU; and Amanda Thorpe (bass and vocals), a neuropsychologist and music therapist who has done graduate study at NYU. The video above (sorry for the bit of herky-jerkiness in the first few seconds) was made this past Saturday evening (May 18, 2013) at the Second Annual Heavy Mental Variety Show, held at the New York Psychoanalytic Institute and sponsored by The Helix Center. When the band took a break, magician Mark Mitton put on a very entertaining show, and taught us all how to look and sound like we were catching falling objects in paper bags. "Mind-Body Problem" was the band's finale for the evening. Below is a video of a discussion about the mind-body problem (in the Cartesian sense) between the Amygdaloids' Joe LeDoux and NYU philosopher Ned Block. The video ends with a segue into the Amygdaloids doing "My Mind's Eye."
s3://commoncrawl/crawl-data/CC-MAIN-2024-10/segments/1707947474581.68/warc/CC-MAIN-20240225035809-20240225065809-00005.warc.gz
CC-MAIN-2024-10
1,235
3
https://sourceforge.net/p/chamonix/discussion/495189/thread/3caa0956/
code
A great app - seems to work well. Multi-word searching seems to match the criteria anywhere on the page; how can I search for "a phrase"?
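The distinction being asked about: multi-word search matches the terms anywhere on the page, while phrase search requires them adjacent and in order. A hypothetical sketch of both behaviors (not Chamonix's actual implementation):

```python
def matches_all_words(page, query):
    # Multi-word search: every term may occur anywhere on the page.
    page = page.lower()
    return all(word in page for word in query.lower().split())

def matches_phrase(page, query):
    # Quoted search: the terms must occur contiguously, in order.
    return query.lower() in page.lower()

page = "search the criteria anywhere on a page"
```

A search UI typically switches to the second behavior when the query arrives wrapped in double quotes.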
s3://commoncrawl/crawl-data/CC-MAIN-2017-17/segments/1492917122933.39/warc/CC-MAIN-20170423031202-00019-ip-10-145-167-34.ec2.internal.warc.gz
CC-MAIN-2017-17
267
5
https://www.stephanieogaygarcia.com/blog/web-design-for-inbound-marketing-i-the-attract-phase
code
In this series on Web Design for Inbound Marketing, I'll be discussing a series of best practices for designing your site according to inbound, user experience and user interaction best practices. In this first part, I'll discuss the first part of the Inbound Methodology: the Attract phase and how you can keep visitors on your site. The Inbound Methodology HubSpot's Inbound Methodology consists of four phases: Attract, Convert, Close and Delight. If you are not familiar with the methodology, I'd recommend taking a quick read through their Inbound Methodology page or even completing their free Inbound Certification. Their first phase, Attract, is the process of attracting visitors to your website and they list content as one of the key tools for accomplishing this. But how do you design your site in such a way that you can keep any traffic that's arrived on your page? In this article, I'll cover how you can accomplish that. What do Visitors Want? Both marketing and design are highly goal driven. When marketers create a campaign, they should always be driven by a specific goal such as generating more traffic or converting more leads. This is also the case for design. Design, at its core, is functional. While many may think of design as aesthetically pleasing, it's really about creating something that serves a purpose. For some interesting content on design, I'd recommend checking out Don Norman's The Design of Everyday Things or listening to the 99% Invisible podcast. When designing to attract visitors, our main goal is to keep them engaged and interested in moving on to the next step. To accomplish this, you'll want to ensure that you design for the following questions: - Has the visitor found what they are looking for? - Has the visitor had an enjoyable experience on the site? - Is the visitor interested in taking the next step?
Succeeding With Your Visitor To ensure that we're designing our website to succeed in the attract phase of the Inbound Methodology, we'll want to answer each of the questions above. As a solution to each, I will combine inbound knowledge with user interaction design principles. Has the visitor found what they are looking for? Are they in the right place? First of all, the visitor should know they're in the right place. Therefore, it's important that there be consistency, which is one of Don Norman's key design principles. You will want to ensure that: - There is brand consistency: make sure that you're using your brand's identity throughout such as your brand's logo and colours. - There are common design patterns: such as similar components, typography and colours Often, your company will already have their brand and style guide. You may find internal resources with recommendations on the correct logos, colour palettes, imagery and typography to use. For example, here are LinkedIn's Brand Guidelines and HubSpot's Style Guide. Smaller businesses might not have resources like these, but you can always take a look at your branding on existing resources. Typically, UI designers will put together a style guide listing all of the different elements of their website - check out the ones here for examples of what these might look like. This way, elements like buttons, forms or boxes will look the same throughout the site in order to avoid confusing the user. If you are designing on HubSpot's CMS, you can use their boilerplate CSS to design different elements. Is the content helpful and relevant? Since this post is on design, we'll talk less about the content itself and more about how it's structured. My advice here is, don't re-invent the wheel! While you may want to get creative, users typically expect a certain hierarchy and structure. For example, blog posts should have a clean layout including the title, author name, published date, content structured into paragraphs...
etc. Recently I read an article on Web Designer Depot on the freshest web designs of July 2018. While many of these designs are indeed fresh and creative, they are probably not the best design to deliver a blog post, for example. Take http://drftwld.com/, it's hard to make heads or tails of what's going on: The whole site is just crazy colours floating around. Is it creative and interesting? Yes. Would I feel that it was relevant if I'd clicked on a link that said "Learn more about our company"? Probably not. Perhaps if the link I clicked on said "Check out how creative you can get with web design" I would feel that it was relevant. Again, we're designing for purpose. Another design principle that might be worth mentioning here is affordance - whether it's clear what the element's purpose is: a door knob affords turning it, a mug handle affords grabbing on to it... etc. In the above website, it's not clear which of the elements are buttons. Typically, you should ensure that this is the case in your design. Has the visitor had an enjoyable experience on the site? In order to simplify the user consuming your content, it should be clear and concise. Use hierarchy to lay out your content in a clear way: - Use whitespace: make sure there is space between your elements to avoid visual clutter. - Set levels of importance: this can be based on colour, size or typography. - Use columns: space your elements out into different columns and rows. - Group elements: you may have several boxes, each containing different information. - Break it up: use paragraphs, highlight quotes, insert images, add sections - try to make it easy to read! Another important aspect to ensure that all users have an enjoyable experience is to ensure that your design is accessible.
There are a lot of guidelines but, in general, it's important to ensure your fonts are sufficiently large, the site is structured in such a way that it can be read aloud, that you include image alt text and that your colours are accessible. Is the visitor interested in taking the next step? Remember how we said that both marketing and design are goal-oriented? You should have a goal as to what you want your visitor to do - what will ultimately close that person? Make sure that your design has clear calls to action on what's next. The call to action will very much depend on who they are and what stage of the buyer's journey they're at, but don't forget to include it in your design. To include another design principle, you'll want to ensure that it's visible: place it in an appropriate location in your design, make it large and use contrasting colours. The text should be concise, in the second person and actionable. You can read more about CTA best practices here. It would be impossible to cover absolutely all aspects of good design in one post, but I hope that some of the content above will motivate you to learn more about thinking about the user and design principles when designing your website to retain visitors during the Attract phase. In my next post, I'll talk about designing websites for the next stage of the Inbound Marketing methodology, the Convert phase, in which I'll cover designing landing pages, forms and long-form content to convert visitors.
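"Your colours are accessible" can be checked numerically: WCAG derives a contrast ratio from the relative luminance of two colours and asks for at least 4.5:1 for normal body text. A sketch of the standard formula:

```python
def relative_luminance(rgb):
    """WCAG relative luminance from 8-bit sRGB channel values."""
    def linearize(c):
        c /= 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """(L_lighter + 0.05) / (L_darker + 0.05), per WCAG."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)),
                    reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

ratio = contrast_ratio((0, 0, 0), (255, 255, 255))  # black on white
```

Black on white gives the maximum ratio of 21:1; a design tool can run this check over every text/background pair in a palette.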
s3://commoncrawl/crawl-data/CC-MAIN-2019-39/segments/1568514572235.63/warc/CC-MAIN-20190915175150-20190915201150-00444.warc.gz
CC-MAIN-2019-39
7,114
40
https://www.ilsp.gr/en/projects/vocall/
code
End Date: 30/11/1997 Funding: Leonardo Da Vinci The aim of this project was to construct multilingual (Irish, German, Greek, Portuguese) electronic, terminological glossaries to be used in the framework of vocational training. ILSP / R.C. “Athena” participated in the construction of two glossaries concerning terminology in the domains of Information Technology and Office organisation, as well as in the compilation of extra educational material (texts and questions) in the respective domains. More specifically, the information encoded in the glossaries concerns the following: - translation equivalents of the concept in the languages mentioned above, - grammatical information for each term, - recorded pronunciation of each term.
s3://commoncrawl/crawl-data/CC-MAIN-2022-49/segments/1669446710534.53/warc/CC-MAIN-20221128171516-20221128201516-00812.warc.gz
CC-MAIN-2022-49
740
6
https://ridiculousbharath.wordpress.com/2020/03/05/i-know-%F0%9F%90%9B/
code
What I lost when you left, Was never found again. Whatever Came later, was nothing but pain. I never believed, for you I prayed; I hear echoes of whatever you said. Now, the abyss is what I crave; I dread being safe and sane. I know we’re not the same; I know who takes the blame.
s3://commoncrawl/crawl-data/CC-MAIN-2020-34/segments/1596439738595.30/warc/CC-MAIN-20200809222112-20200810012112-00419.warc.gz
CC-MAIN-2020-34
282
8
https://mattdabbs.com/2007/05/09/nearly-have-the-bugs-worked-out/
code
I am almost there in having the bugs worked out. I finally got my blogroll up to date and am trying to figure out how to add HTML to the sidebar without having to pay. I guess you have to upgrade and edit the CSS. Sorry to those of you I left off the sidebar earlier. If you teach Bible classes there are many good links in the page tabs at the top of the blog to have a look at. I am going to continue organizing and improving these to make them as user friendly and easy to navigate as possible.
s3://commoncrawl/crawl-data/CC-MAIN-2023-50/segments/1700679100575.30/warc/CC-MAIN-20231206000253-20231206030253-00325.warc.gz
CC-MAIN-2023-50
604
2
https://breakingkeyboards.wordpress.com/2010/06/14/universality-and-entropy/
code
One problem with universality is that some universal programs are unable to output certain bit strings. Some universal programs need a post-processor - a conversion step - before they can output certain types of bit strings. An example is the first "universal" rule of cellular automata. In general, it is possible to construct a universal program that works with the codes "01" and "00" instead of "0" and "1": we can literally replace the first values with the second, and with this exchange we don't lose universality. The problem is that, in an inverse search, a universal program working with "01" and "00" is not as good as one working with "0" and "1", because there are binary strings like "1111" that the first program cannot represent. Here entropy can help, because one way to choose a better universal program is to look at its entropy: a program with high entropy can output more combinations of bit strings, so entropy can be used as a good parameter for classifying programs.
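As a rough illustration of the entropy idea above, here is a small Python sketch (the helper name is my own, not from the post) that computes the Shannon entropy of a bit string; an output alphabet restricted to the pairs "01"/"00" forces every second bit to be 0, which shows up as lower per-bit entropy:

```python
from collections import Counter
from math import log2

def shannon_entropy(s: str) -> float:
    """Shannon entropy in bits per symbol of a bit string."""
    counts = Counter(s)
    n = len(s)
    return -sum(c / n * log2(c / n) for c in counts.values())

# A program emitting only the codes "01" and "00" can never produce "1111":
# every second bit is forced to be '0', so its output is biased and has
# lower per-bit entropy than an unrestricted "0"/"1" program.
print(shannon_entropy("0100010001"))  # biased toward '0' -> well below 1 bit
print(shannon_entropy("0110100110"))  # balanced '0'/'1' -> 1.0 bit
```

This matches the post's suggestion of using entropy as a rough classification parameter for programs.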
s3://commoncrawl/crawl-data/CC-MAIN-2018-13/segments/1521257647280.40/warc/CC-MAIN-20180320033158-20180320053158-00577.warc.gz
CC-MAIN-2018-13
1,093
5
http://maison-de-stuff.net/john/articles/00001122.html
code
Last day of my Twenties Posted on 2007/03/15 22:52:30 (March 2007). [Wednesday 14th March] So today was the day before my 30th birthday - my last day of being a twenty something. I have swung from one extreme to the other on this issue - from the standard "well it's just a number", to the slightly more pessimistic "it's all downhill from here". Still, either way, getting older is just inevitable I suppose. As for the day itself, the US office had sent a guy over at very short notice, which had raised a certain amount of alarm (with me at least) especially given that my team at the London office was scheduled to be going out to the US the week after the next. Having been left with a vacuum of information to try and come up with my own explanation as to why the guy needed to come over, I had developed visions of being fired dramatically and so on. This of course didn't actually happen. It seems there had just been something of a disconnect between how much the US office was panicking about this project (apparently it's a real big deal in terms of future expansion of the company, etc etc) and how relatively laid back we were about the whole thing over here in the UK. That's not to say we haven't been working - I think we've made great progress - but somehow I suppose we just hadn't communicated what we'd done all that well, and thus the emergency measures of sending someone to sort it all out. So, in the end a storm in a teacup really. Having experienced the general misery of business trips a number of times myself, I had a certain amount of sympathy for the poor guy who had been sent over here, and so thought at the very least we should make sure he had something to do in the evenings while he was here. The rest of my team somehow all vanished by the time the issue of dinner came up, so it was down to me to entertain our guest. We went to an Indian restaurant - not the most imaginative of nights out, but hopefully this was slightly better for him than dining alone.
Post a comment
s3://commoncrawl/crawl-data/CC-MAIN-2019-22/segments/1558232257244.16/warc/CC-MAIN-20190523123835-20190523145835-00274.warc.gz
CC-MAIN-2019-22
2,013
9
https://nz.pinterest.com/explore/wealth/
code
6 Powerful Keys to Developing a Wealthy Mindset. If you don't have the money you want, start with your mindset! Wealth begins there and is the foundation of all wealth! Click and read more here: http://lindapjones.com/blog Jim Rohn, Tony Robbins, and other successors all basically say the same thing, it's just packaged differently. Our major world religions also deliver similar messages. The keys are having FAITH and taking ACTION. another Kiyosaki moment - This guy truly endorses the power of Residual Income - If you're interested in building wealth with a home based business then simply click this link - http://www.5linx.net/davidhodges/opportunity
s3://commoncrawl/crawl-data/CC-MAIN-2016-50/segments/1480698542250.48/warc/CC-MAIN-20161202170902-00242-ip-10-31-129-80.ec2.internal.warc.gz
CC-MAIN-2016-50
656
3
http://linuxaudio.org/mailarchive/lau/2007/4/4/135429
code
On Wednesday, 4 April 2007, david wrote: I have never encountered this with my two ASUS laptops aged 5 and 1.5 Hi, I am a .signature virus. Please copy me into your ~/.signature and send to all your contacts. After a month or so log in as root and do a rm / -rf. Or ask your administrator to do so...
s3://commoncrawl/crawl-data/CC-MAIN-2015-06/segments/1422115858583.14/warc/CC-MAIN-20150124161058-00139-ip-10-180-212-252.ec2.internal.warc.gz
CC-MAIN-2015-06
308
6
https://cycling74.com/forums/max-to-control-sound-need-advice/
code
I am about to create an art installation for which I will employ MAX MSP. I'm relatively new to MAX and so I think it might be wise to run a question or 2 by you all. My piece is on Synaesthesia and Art and involves the building of a room in which participants will walk. The room has a surround sound system (6 speakers in total) and lights. I want to track movement of participants: Is it best for me to use some kind of 'plate' sensors, which I could position on the floor under a carpet and on which participants could walk? When users interact with a sensor I want to affect the sound playing through the speakers. I have composed a soundtrack using the software 'Reason'. There are 6 layers in total all working together as a contrapuntal composition. I want the sensors to turn these layers (tracks) on and off based on where participants are standing in the room. I want to use about 5 sensors in total to cover the space. For example, if a participant goes to the back of the room to view a photograph on the wall the texture of the sound/music should change. Lastly I want to use sensors to affect the lighting. Has anyone used lights in an installation situation and if so what is a good approach? Sorry about the long thread but I just want to get some opinions before I get my head into this work. I appreciate your help, with thanks!
s3://commoncrawl/crawl-data/CC-MAIN-2017-43/segments/1508187820466.2/warc/CC-MAIN-20171016214209-20171016234209-00486.warc.gz
CC-MAIN-2017-43
1,363
6
http://bookingritzcarlton.info/27-stunning-network-diagram-software-references/
code
The program is not hard to install and use. It may also include a way to map network drives for more detailed information that administrators might want to use for planning or troubleshooting purposes. SmartDraw's network diagram software is the fastest and simplest way to make a network diagram. At different scales, diagrams may represent different levels of network granularity. Based on whether the diagram is meant for formal or informal usage, certain details could be lacking and must be decided from context. It is crucial to be aware that you don't have to have a single all-encompassing network diagram. The 'social' part might not be relevant, as some generic network diagram could be just fine, provided that the below requirements are satisfied. Good network diagrams are easy to construct, and you do not have to spend a lot of money on fancy software to draw a network diagram, especially for a little office. An L2 network diagram is supplied, and the configurations from the majority of the devices. Some diagrams can be quite creative and even be documented on top of a floor map. The Basic Network Diagram is a good choice when you wish to map out a simple network. Edraw Network Diagram is ideal for network engineers and network designers who need to draw comprehensive network documentation.
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178362741.28/warc/CC-MAIN-20210301151825-20210301181825-00026.warc.gz
CC-MAIN-2021-10
1,340
3
https://community.intel.com/t5/Intel-Desktop-Boards/Intel-DZ77RE-75K-Wont-shutdown/td-p/494393
code
Hi, my Intel DZ77RE-75K motherboard won't power down completely when I click shutdown. I tried Windows 7 64-bit and Windows 8 64-bit. It was shutting down, but all of a sudden it stopped. I downloaded the latest Intel BIOS for the board but not even that helped. Please assist me. All the ports of the board are working 100%. The PC works great but won't shut down. Hello DucanDZ77RE-75K, thanks for joining the Intel Community. These are some troubleshooting steps you can follow for this kind of behavior: If you have access to another system, please check the following link DuncanDZ77RE-75K, BIOS version 0039 is the first version available for this motherboard. I would recommend checking the BIOS Update Release Notes on this URL. You will be able to see the fixes and features included in each version. http://downloadmirror.intel.com/22828/eng/GA_0066_ReleaseNotes.pdf These are some important versions you could flash on your system. BIOS Version 0049 BIOS Version 0057 BIOS Version 0061 BIOS Version 0066 I did what you said. I downgraded the BIOS to 0057. I noticed that 0057 is a bit more stable than 0066, but I'm still struggling to shut the computer down. I am running a Corsair CX600 PSU. I'm out of ideas and patience. Have you tested the system out of the chassis? If you are using a network card, could you please remove it and try to shut down the computer without it? If you want to replace the motherboard, you can get in touch with our Warranty team if it is still under warranty, but if the problem is being caused by another device, the issue will continue: http://www.intel.com/p/en_US/support/contactsupport or submit a web ticket at the following URL I tested it out of the case as well and still the same thing. I do not want another motherboard. I just want to know why, when I shut down the computer, the computer goes into S5 state?
Isn't there a configuration that can be set? Please explain exactly what is the issue that you have encountered. You were reporting that your system won't shut down; do you mean shut down from within Windows, or a cold shutdown? Please clarify exactly what happens when you shut down from within Windows. Answering your question - why when I shutdown the computer the computer goes into S5 state? - shutting down from within the operating system puts the system in the S5 state/system shutdown; you can check the following link for more information regarding the S5 state.
s3://commoncrawl/crawl-data/CC-MAIN-2020-50/segments/1606141746320.91/warc/CC-MAIN-20201205044004-20201205074004-00618.warc.gz
CC-MAIN-2020-50
2,537
18
http://www.somethingawful.com/awful-links/awful-link-1168/
code
big gulp, submitted by Nordo. Hooray, a website dedicated to people swallowing other people! Finally, a place to call home. The big-gulp archive is for folks who are sexually turned on by the idea of a person being swallowed alive (kind of like a python having dinner, except that it's often a larger human doing the swallowing). Picture the giant swallowing Jack (of Beanstalk fame), and you get the idea. Straight and gay, male and female, swallower and swallowee are all welcome here. This is the most attractive thing I have ever seen. That includes the time I saw the paraplegic boy being bathed, which was pretty damn hot, let me tell you. By the way, these fascinating folks have a message board but unfortunately they had to close their chat as a result of "how fellow voraphiles within Yum Chat were treating each other." Such a shame. If that boy isn't willing to shoot his laser and get you that carbon, he's not worth your time. REFORMED HOG - Former member of the swine family, has now agreed to behave like a proper dog. Free to patient home willing to overlook physical defects. 555-2519 Available in Large, which is actually a Medium stretched out to appear bigger. If you're in a tight spot, this is going to be really helpful (I'M JOKING. I'M KIDDING AROUND) Awful Links of the Day spotlights the worst and weirdest websites on the internet. And we're not talking "weird" in a good way either.
s3://commoncrawl/crawl-data/CC-MAIN-2016-36/segments/1471982295854.33/warc/CC-MAIN-20160823195815-00139-ip-10-153-172-175.ec2.internal.warc.gz
CC-MAIN-2016-36
1,411
8
http://bayufadjarpribadi.blogspot.com/2010/01/finallythe-awaited-shipment-arrived-too.html
code
Tuesday, January 12, 2010 finally, the awaited shipment arrived too So, about a week ago I was on the FJB kaskus forum looking for a gundam; the gundam I was looking for was SD size. When I looked on Google I found the SD God Gundam, and I thought "that's pretty cool, I want to have it." After that, I looked everywhere, but it wasn't in the usual places I visit, so I searched through kaskus. Fortunately a friend pointed out to me that there was a thread selling cool SD gundams, so I immediately looked, and there was the SD God Gundam! WOW, I immediately sent a PM to the store owner (he happens to have a shop in mangdu), and he finally replied to my messages. I transferred the funds and waited for the shipment. Within 3 days, the package came to my house, wahahha
s3://commoncrawl/crawl-data/CC-MAIN-2017-17/segments/1492917122992.88/warc/CC-MAIN-20170423031202-00628-ip-10-145-167-34.ec2.internal.warc.gz
CC-MAIN-2017-17
807
6
https://teklastructures.support.tekla.com/2018i/en/ext_cmf-file-transfer
code
The CMF File Transfer macro will create a transfer file compatible with the CMF detailing program. (Macros are located in the Applications and components catalog; they are recorded or created manually and stored as .cs files in a folder defined with the advanced option XS_MACRO_DIRECTORY.) A report file (<Job Reference>_report.txt) will be created for all CMF components selected within the model. The transfer file can be used in addition to or instead of the standard order forms. Any problems during the creation of the file will be written to a log file, and saved to the current Tekla Structures model folder. Problems will either be errors or warnings. Errors will prevent the creation of a transfer file, whereas warnings (where non-standard parts have been found) will not prevent the creation of the file. The macro will use a tolerance of 1mm when checking a part against CMF standards. - The file transfer cannot check the validity of the purlins and rails used in the roof and wall panel systems, or the completeness of the model. For example, the file transfer will not flag the inappropriate use of the Anti-Sag system, or missing pairs of end holes in purlins or rails. It will not recognise where joints have accidentally been omitted. Such checks are your responsibility.
- All files will be written to a subfolder of the current Tekla Structures model folder called CAMFiles. Files in this folder will be removed each time the file transfer is run; if you want to keep the files, copy them to another folder. - If you want to split an order into phases, filter the model by phase and save the files with different names before running another export. For example, 514_A for phase A in job 514. - Ensure that Select object in components is selected. - This macro automatically excludes non-CMF parts from the order, but check the currently active Selection Filter to ensure that it is not excluding any parts which are actually required. - Select the area / whole model that you wish to order. - From the Applications and components panel search for CMF File Transfer, then double-click the file transfer icon. - Enter the required details into the Parameters dialogue. - Click the Create button. - If there are any errors or warnings while creating the file, see the log file in the CAMFiles folder. View an explanation of error and warning messages. - If the transfer file is successfully created, but the log file contains warnings, then you will need to supply drawings for the associated items. Create the relevant Tekla Structures drawing or check that they are up-to-date and ensure that they contain sufficient information for fabrication. DrawingsRequired.txt will contain a list of parts or assemblies where drawings are required. - Please contact Tekla Structures support if you require further clarification. We recommend that you compress these files into a single "zip" file before emailing them to CMF. Please contact Composite Metal Flooring if you require further details. - Transfer file, - PDF files for each required drawing
s3://commoncrawl/crawl-data/CC-MAIN-2019-43/segments/1570987836368.96/warc/CC-MAIN-20191023225038-20191024012538-00493.warc.gz
CC-MAIN-2019-43
3,671
31
https://nemo.pythonanywhere.com/smite/patch/SMITE_Version_2.19.3115/
code
It's been a little while since Anhur had some love. We wanted it to be more rewarding to land all eight hits of Desert Fury, and adding 5% scaling per hit should allow this ability to feel more potent when used effectively. Increased Physical Power scaling from 15% -> 20% per shot. Chiron was galloping past some of the other Hunters when it came to his overall performance, so we wanted to curb away some of that horsepower to rein him in (puns intended). With these adjustments he will be making his grand entrance into League play. Chiron no longer counts as a Healer for Assault. Reduced damage from 70/125/180/235/290 -> 70/120/170/220/270. Increased Cooldown from 14/13/12/11/10s -> 14s at all ranks. Reduced Slow from 25/27.5/30/32.5/35% -> 25% at all ranks. Fixed cancelling the ability and still having no movement penalty. Fixed rank 5 dealing True Damage instead of Physical. Hel's become very popular in the current meta, and this popularity has highlighted the potency of Inspire. No longer a sleeper, Hel's getting an adjustment to the scaling of her Heal, and the Cooldown of her primary burst. Reduced Magical Scaling on Inspire (Heal) from 50% -> 30%. Increased Cooldown on Repulse and Inspire from 10s -> 12s. Round four on Khepri! The Hug Bug continues to dominate, and overall it's thrilling to see one of our most true support characters excel. That said, the changes below should help put him better in line with a reduction to Rising Dawn Protections and removing an element of Scarab's Blessing that was a bit more than it needed to offer. Reduced Damage Reduction from 20/25/30/35/40% -> 10/15/20/25/30%. Tyr often struggled to compete early on against other laners, especially since he needed to invest a point into his stance switch. By increasing the base damage on Power Cleave, he should be in a better position to box enemies and clear minions more effectively. Increased damage from 60/100/140/180/220 -> 80/115/150/185/220.
Xbalanque has long performed at the top of the competitive Hunter pack, and has been one of the most dominant late-game carries. His high ability damage has also made him potent through the early and mid-game, and combined with his reach made him very difficult to deal with. Reduced damage per Dart from 30/50/70/90/110 -> 30/45/60/75/90. Reduced Physical scaling on additional Poison Damage from 20% -> 10%. Xing Tian has continued to perform exceptionally well. His high base damage numbers and % damage option allowed him to build exceptionally defensively while still being a top damage dealer. The changes below are designed to bring his Health more in line with aggressive Guardians, and take the edge off of the damage he deals while also applying Crowd Control. Reduced Health per level from 100 -> 90. Axe Damage reduced from 30/50/70/90/110 -> 30/45/60/75/90. Slam Damage reduced from 60/100/140/180/220 -> 60/90/120/150/180.
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296948609.41/warc/CC-MAIN-20230327060940-20230327090940-00255.warc.gz
CC-MAIN-2023-14
2,878
23
https://openreview.net/forum?id=Us31horKNLG
code
Keywords: Dynamic graph, graph neural network, functional magnetic resonance imaging TL;DR: Learning dynamic brain graphs from functional magnetic resonance imaging data Abstract: Recently, graph neural networks (GNNs) have shown success at learning representations of brain graphs derived from functional magnetic resonance imaging (fMRI) data. The majority of existing GNN methods, however, assume brain graphs are static over time and the graph adjacency matrix is known prior to model training. These assumptions are at odds with neuroscientific evidence that brain graphs are time-varying with a connectivity structure that depends on the choice of functional connectivity measure. Noisy brain graphs that do not truly represent the underlying fMRI data can have a detrimental impact on the performance of GNNs. As a solution, we propose Dynamic Brain Graph Structure Learning (DBGSL), a novel method for learning the optimal time-varying dependency structure of fMRI data induced by a downstream prediction task. Experiments demonstrate DBGSL achieves state-of-the-art performance for sex classification using real-world resting-state and task fMRI data. Moreover, analysis of the learnt dynamic graphs highlights prediction-related brain regions which align with existing neuroscience literature. Code available at https://github.com/ajrcampbell/dynamic-brain-graph-structure-learning.
s3://commoncrawl/crawl-data/CC-MAIN-2024-10/segments/1707947474643.29/warc/CC-MAIN-20240225203035-20240225233035-00219.warc.gz
CC-MAIN-2024-10
1,539
4
https://www.cncmodeller.co.uk/blog/archive/2015/03
code
Ok so let's say you had a model that was unstable in yaw and a rate gyro wasn't doing enough for you, what's next? You could go to a Heading Hold gyro... but then you'd have to fly the aircraft in yaw, or you could use sideslip feedback but for that you'd need an air data vane... After a little head scratching, some custom electronics, 3D printing, the tiniest bearings and a little luck here it is. As a prototype it's fine but far too fragile for using for real, in terms of sca The answer is very poorly, I forgot to engage the gyros prior to a test glide, and needless to say it fell to the floor like a brick. Unfortunately the white polystyrene prototype fuse snapped but not until after I'd got the yaw gains sorted and almost tuned the pitch. So a new fuselage is required and here is the replacement waiting for glue and 2mm upper and lower balsa skins which will be hand shaped after fixing. The two pieces of waste from the fuse longerons will be use New blocks of code in OpenAero2 fix my issues, I don't even need to set throttle -ve travel to 0% on the Tx as I've hard coded that too to prevent a -ve airbrake demand, everything is now mixed in OA2. I've been working with OpenAero2 and a KK2.1 HC board on a finless flying wing that's (intentionally) unstable in pitch and yaw just to get to grips with PID control and the different stabilisation modes. I am however running into a minor hiccup that I can't seem to get past. I'm using split clamshell elevons as air-brakes with differential air-brake for yaw stabilisation, but I can't stop the yaw gyro from pinching the two surfaces together, it's not a biggie but it could
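For readers unfamiliar with the PID control mentioned above, here is a minimal, illustrative sketch of the kind of rate-feedback loop a flight controller runs per axis. This is my own toy example - the class name, gains, and loop rate are hypothetical, not OpenAero2's or the KK2.1's actual implementation:

```python
class PID:
    """Textbook PID controller: output = kp*e + ki*integral(e) + kd*de/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# Hypothetical yaw-rate loop: hold 0 deg/s while the gyro reads 12 deg/s,
# so the controller demands a correction opposing the measured rotation.
yaw = PID(kp=0.8, ki=0.1, kd=0.05, dt=0.002)  # assumed 500 Hz update rate
correction = yaw.update(setpoint=0.0, measured=12.0)
print(correction < 0)  # True: correction opposes the positive yaw rate
```

In a rate (non-heading-hold) mode the integral gain is typically small or zero, which is why such a setup damps yaw motion without "flying" the yaw axis for you.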
s3://commoncrawl/crawl-data/CC-MAIN-2021-31/segments/1627046152144.81/warc/CC-MAIN-20210726152107-20210726182107-00458.warc.gz
CC-MAIN-2021-31
1,649
4
https://jira.lsstcorp.org/browse/DM-14094
code
Status: To Do Fix Version/s: None For the initial release of HiPS capability for LSST and IRSA Firefly, the (i)nformation button for HiPS image displays just brings up a tabular display of the key-value contents of the HiPS properties file. This display could be considerably improved by making that button first bring up a basic property sheet constructed from selected elements of the properties file, formatted in a human-friendly way, and with URLs in the metadata turned into clickable links. Most notably this would allow the full display of the obs_description field, which is generally too long for the table viewer used for the raw display we currently provide. If the hips_initial_xxx values are present, the standard Firefly coordinate-display routine should be used to allow coordinate conversions and copy-paste of the coordinates. For HiPS cubes, the hips_cube_xxx and data_cube_xxx values should be used to generate a human-readable display of the range and number of 3rd-coordinate values available in the cube. It may be useful to consider the display or linkage of additional metadata beyond what is in the properties file. For instance, it might include a UI element to control the display of the associated MOC for a HiPS map. This property sheet can then have a button to bring up the raw key-value table viewer. In the raw viewer we should make all URLs clickable, e.g., bib_reference_url, obs_copyright_url, hips_service_url[0-9]*, hips_progenitor_url. (This may be something to put in a separate ticket.) This is not urgent and may be scheduled for a post-F18 release depending on relative priorities. Just to confirm: this ticket is still valid.
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296950422.77/warc/CC-MAIN-20230402074255-20230402104255-00434.warc.gz
CC-MAIN-2023-14
1,696
14
http://www.jobsinjackson.com/employment-resources/detail/anyone-can-be-a-recruiter-right/10646
code
It's a profession with limited barriers to entry and literally a million agency/corporate positions available. It's a profession where the hardest workers and the most entrepreneurial usually get the best results. It's a profession looked at with disdain by many candidates who are sick of hearing from members of said profession. SO - Anyone can be a recruiter, right? I bring this up for the following reasons. I'm just coming off a couple of days at Recruiter Nation Live 2017, a conference put on by the good folks at Jobvite. Great show and good people, glad I went. Speaking of good people - I heard no less than three times - in different ways - people talking about the fact they had told friends, family members and complete strangers that they should get into the recruiting game. The common factor in all of this advice was that anyone could be a recruiter. Career a little slow? Not finding the right path for you? Just got out of prison? You should be a recruiter. Seems like anyone can do it. My favorite story was one where a guy was in an Uber and struck up a conversation with the driver, and ended up giving him the advice to become a recruiter. The guy messaged him two weeks later and told my friend that he landed his first recruiting job - at a place I'll call TereoTech - which means he'll either be unemployed in a month or become one of the greatest recruiters of all time - because that's what TereoTech does. Which is the point of this post. Yeah friends, anyone can probably be a recruiter. Present yourself in the right way and appear scrappy as a youngster, and you can probably find a job. If you're older, you can parlay your subject matter/functional area expertise into a gig recruiting people who do what you've done in the past. We call that a specialty recruiter in the biz, and your experience in any specialty probably can give you a shot as a recruiter. A lot of people can get a job as a recruiter. 
But just because you can get the job doesn't mean you'll be successful, or even like it. What dictates whether someone can actually do the job of a recruiter? Recruiting in its purest form is sales. Behavioral traits that equate to success for recruiters are as follows: High Assertiveness - you're going to have to ask for things without shame. Low or mid-range Sensitivity - rejection is a part of the gig. If you're a diva every time you get rejected, it's probably not going to work. Low Team - doesn't mean you're a bad teammate. The low Team designation simply means you're driven for high performance via individual scoreboards - you like to win. YOU, not us. Us is nice. But unless you're motivated by seeing your name at the top of a list, you probably won't be satisfied. There's more, but I'll stop there. The same things that make a great salesperson also make for a great recruiter. Lots of people can get a recruiting job. Few can be good to great at it. Shout out to the Uber driver now at TereoTech - you'll know whether it's for you if you can tolerate the good folks at TT requiring you to make 120 calls a day and the rejection that comes with that. Get a year in at TereoTech and then give me a call - we've got a great team you'll love at Kinetix once you figure out it's for you.
s3://commoncrawl/crawl-data/CC-MAIN-2024-10/segments/1707947475238.84/warc/CC-MAIN-20240301093751-20240301123751-00179.warc.gz
CC-MAIN-2024-10
3,243
20
https://aaporantalainen.wordpress.com/2010/03/26/hello/
code
Posted by Aapo Rantalainen on March 26, 2010 This is first post of my first blog. I’m planning to update my old web pages to blog form. I will also post my experiences with Information Technology, primarily focused on free and open source software related. And of course any random stuff that comes to my mind.
s3://commoncrawl/crawl-data/CC-MAIN-2017-17/segments/1492917123530.18/warc/CC-MAIN-20170423031203-00290-ip-10-145-167-34.ec2.internal.warc.gz
CC-MAIN-2017-17
312
3
https://www.cliffsnotes.com/study-guides/statistics/numerical-measures/measures-of-central-tendency
code
Measures of Central Tendency Measures of central tendency are numbers that tend to cluster around the “middle” of a set of values. Three such middle numbers are the mean, the median, and the mode. For example, suppose your earnings for the past week were the values shown in Table 1. You could express your daily earnings from Table 1 in a number of ways. One way is to use the average, or mean, of the data set. The arithmetic mean is the sum of the measures in the set divided by the number of measures in the set. Totaling all the measures and dividing by the number of measures, you get $1,000 ÷ 5 = $200. Another measure of central tendency is the median, which is defined as the middle value when the numbers are arranged in increasing or decreasing order. When you order the daily earnings shown in Table 1, you get $50, $100, $150, $350, and $350. The middle value is $150; therefore, $150 is the median. If there is an even number of items in a set, the median is the average of the two middle values. For example, if we had four values—4, 10, 12, and 26—the median would be the average of the two middle values, 10 and 12; in this case, 11 is the median. The median may sometimes be a better indicator of central tendency than the mean, especially when there are outliers, or extreme values. Given the four annual salaries of a corporation shown in Table 2, determine the mean and the median. The mean of these four salaries is $275,000. The median is the average of the middle two salaries, or $40,000. In this instance, the median appears to be a better indicator of central tendency because the CEO's salary is an extreme outlier, causing the mean to lie far from the other three salaries. Another indicator of central tendency is the mode, or the value that occurs most often in a set of numbers. In the set of weekly earnings in Table 1, the mode would be $350 because it appears twice and the other values appear only once. 
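The mean, median, and mode computations above can be reproduced with Python's standard statistics module, using the Table 1 earnings as data:

```python
import statistics

# Daily earnings from Table 1.
earnings = [50, 100, 150, 350, 350]

print(statistics.mean(earnings))    # 200  (sum 1000 divided by 5 measures)
print(statistics.median(earnings))  # 150  (middle value of the sorted list)
print(statistics.mode(earnings))    # 350  (appears twice; others appear once)

# Even count of items: the median is the average of the two middle values.
print(statistics.median([4, 10, 12, 26]))  # 11.0
```

The mode call raises a StatisticsError on very old Python versions (before 3.8) if there is a tie for most common; for the data above there is a unique mode, so it works on any version.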
Notation and formulae The mean of a sample is typically denoted as x̄ (read as "x bar"). The mean of a population is typically denoted as μ (pronounced "mew"). The sum (or total) of measures is typically denoted with a Σ. The formula for a sample mean is x̄ = Σx/n, where n is the number of values. Mean for grouped data Occasionally, you may have data that consist not of actual values but rather of grouped measures. For example, you may know that, in a certain working population, 32 percent earn between $25,000 and $29,999; 40 percent earn between $30,000 and $34,999; 27 percent earn between $35,000 and $39,999; and the remaining 1 percent earn between $80,000 and $85,000. This type of information is similar to that presented in a frequency table. Although you do not have precise individual measures, you can still compute measures for grouped data, data presented in a frequency table. The formula for a sample mean for grouped data is x̄ = Σfx/n, where x is the midpoint of the interval, f is the frequency for the interval, fx is the product of the midpoint times the frequency, and n is the number of values. For example, if 8 is the midpoint of a class interval and there are ten measurements in the interval, fx = 10(8) = 80, the sum of the ten measurements in the interval. Σfx denotes the sum of all the products over all class intervals; dividing that sum by the number of measurements yields the sample mean for grouped data. For example, consider the information shown in Table 3. Substituting the values from Table 3 into the formula gives a mean of about $15.19; therefore, the average price of items sold was about $15.19. The value may not be the exact mean for the data, because the actual values are not always known for grouped data. Median for grouped data As with the mean, the median for grouped data may not necessarily be computed precisely because the actual values of the measurements may not be known. In that case, you can find the particular interval that contains the median and then approximate the median.
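Since the individual values behind Table 3 are not reproduced here, the sketch below applies the grouped-data formula x̄ = Σfx/n to a made-up frequency table (the midpoints and frequencies are hypothetical, chosen purely to illustrate the calculation); the single-interval check mirrors the fx = 10(8) = 80 example above:

```python
def grouped_mean(classes):
    """Sample mean for grouped data: sum of fx divided by n,
    where each class is a (midpoint, frequency) pair."""
    n = sum(f for _, f in classes)          # total number of measurements
    total = sum(x * f for x, f in classes)  # sum of all fx products
    return total / n

# Hypothetical price intervals: (class midpoint, frequency)
table = [(3.0, 6), (8.0, 10), (13.5, 4), (18.0, 12)]

print(grouped_mean(table))        # 11.5  -> 368 / 32
print(grouped_mean([(8.0, 10)]))  # 8.0   -> fx = 10(8) = 80, n = 10
```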
Using Table 3, you can see that there is a total of 32 measures. The median is between the 16th and 17th measure; therefore, the median falls within the $11.00 to $15.99 interval. The formula for the best approximation of the median for grouped data is median = L + ((n/2 - Σf_b)/f_med) × w, where L is the lower class limit of the interval that contains the median, n is the total number of measurements, w is the class width, f_med is the frequency of the class containing the median, and Σf_b is the sum of the frequencies for all classes before the median class. Consider the information in Table 4. As we already know, the median is located in class interval $11.00 to $15.99. So L = 11, n = 32, w = 4.99, f_med = 4, and Σf_b = 14. Substituting into the formula: median = 11 + ((32/2 - 14)/4) × 4.99 = 11 + 2.495 ≈ $13.50. In a distribution displaying perfect symmetry, the mean, the median, and the mode are all at the same point, as shown in Figure 1. Figure 1. For a symmetric distribution, mean, median, and mode are equal. As you have seen, an outlier can significantly alter the mean of a series of numbers, whereas the median will remain at the center of the series. In such a case, the resulting curve drawn from the values will appear to be skewed, tailing off rapidly to the left or right. In the case of negatively skewed or positively skewed curves, the median remains in the center of these three measures. Figure 2 shows a negatively skewed curve. Figure 2. A negatively skewed distribution, mean < median < mode. Figure 3 shows a positively skewed curve. Figure 3. A positively skewed distribution, mode < median < mean.
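The worked example can be reproduced directly. This sketch plugs the stated values (L = 11, n = 32, w = 4.99, f_med = 4, Σf_b = 14) into the grouped-median formula:

```python
def grouped_median(L, n, w, f_med, cum_before):
    """Approximate median for grouped data:
    L + ((n/2 - sum of frequencies before the median class) / f_med) * w
    """
    return L + ((n / 2 - cum_before) / f_med) * w

# Values from the Table 4 worked example
m = grouped_median(L=11, n=32, w=4.99, f_med=4, cum_before=14)
print(round(m, 3))  # 13.495 -> the median price is about $13.50
```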
s3://commoncrawl/crawl-data/CC-MAIN-2023-23/segments/1685224652494.25/warc/CC-MAIN-20230606082037-20230606112037-00596.warc.gz
CC-MAIN-2023-23
5,456
35
https://www.bitcraze.io/tag/imav/
code
Last week we went on a nice trip to Delft, The Netherlands to attend the 22nd International Micro Aerial Vehicle Conference and Competition, this time organized by the MAVlab of the TU Delft. Me (Kim), Barbara and Kristoffer went there by train in line with our CO2 policy, although the Dutch train strikes did make it a bit difficult for us! Luckily we made it all in one piece and we had a great time, so we will tell you about our experiences… with a lot of videos! First Conference day For the conference days we were placed in the main aula building, so that everybody could drop by during the coffee breaks, right next to one of our collaborators, Matěj Karásek from Flapper Drones (also see this blog post)! In the big lecture hall paper talks were going on, along with interesting keynote speeches by Yiannis Aloimonos from University of Maryland and Antonio Franchi from TU Twente. In between the talks and coffee breaks, we took the opportunity to hack around with tiny demos, for which the IMAV competition is quite a good opportunity. Here you see a video of 4 Crazyflies flying around a Flapperdrone; all platforms are using the lighthouse positioning system. The Nanoquadcopter challenge The evening of the first day, the first competition was planned, namely the nanoquadcopter challenge! In this challenge the goal was to autonomously fly a Crazyflie with an AIdeck and Flowdeck as far as possible through an obstacle field. 8 teams participated, and although most did offboard processing of the AIdeck’s camera stream, the PULP team (first place) and Equipe Skyrats (3rd place) did all the processing onboard. The most exciting run was by the brave CVAR-UPM team that managed to pass through 4 gates while avoiding obstacles, for which they won a Special Achievement Award. During the challenge, Barbara also gave a presentation about the Crazyflie while Kristoffer built up the lighthouse positioning system in the background in a record-breaking 5 minutes to show a little demo.
After the challenge, there were bites and drinks where we could talk with all the teams participating. Here is an overview video of the competition. There was also an excellent stream during the event; if you would like to see all the runs in detail, plus presentations by the teams, you'll have a full 3 hours of content, complemented by exciting commentary from Christophe de Wagter and Guido de Croon from the MAVlab. Thanks to all the teams for participating and giving such a nice show :) The Green House Challenge On Wednesday, we were brought to Tomato world, which is a special greenhouse for technology development in horticulture. This is where the Greenhouse challenge, the 2nd indoor drone competition, took place. The teams had to use their drone of choice to navigate through rows of tomato plants and find the sick variant. Unfortunately we could not be up close and personal as with the nanoquadcopter challenge, but yet again there was a great streaming service available so we were able to follow every step of the way, along with some great presentations by Flapper drones and PATS! drones among others. For the latter, we were actually challenged to an autonomous drone fight! Their PATS-x system is made to detect pest insects that are harmful to greenhouse crops, so they wanted to see if they could catch a Crazyflie. You can see in the video here that they managed to do that, and although the Crazyflie lived, we are pretty sure that a real fly or moth wouldn’t. Luckily it was a friendly match so we all had fun! Here is an overview of the Greenhouse challenge. At the end you can also see a special demo by the PULP team successfully trying out their obstacle-avoiding Crazyflie in between the tomato plants. Very impressive! Last days and final notes Due to the planned (but later cancelled) train strikes in the Netherlands, the whole group was unfortunately not able to attend the full event.
In the end, Barbara and I were able to experience the outdoor challenge, where much bigger drones had to carry packages into a large field outdoors. I myself was able to catch the first part of the last conference day, which included a keynote by Richard Bomphrey (Royal Veterinary College), whose lab contributed to the mosquito-inspired Crazyflie flight paper published in Science 2 years ago. We were happy to be at the IMAV this year, which marks our first conference attendance as Bitcraze since the pandemic. It was quite amazing to see the teams trying to overcome the challenges of these competitions, especially the nanoquadcopter challenge. We would like to thank again Guido de Croon and Christophe de Wagter of the MAVlab for inviting us! IMAV website: https://2022.imavs.org/ Crazyflie IMAV papers: - ‘Handling Pitch Variations for Visual Perception in MAVs: Synthetic Augmentation and State Fusion’, Cereda et al. (2022) [pdf] - ‘Seeing with sound; surface detection and avoidance by sensing self-generated noise’, Wilshin et al. (2022)
s3://commoncrawl/crawl-data/CC-MAIN-2024-10/segments/1707947474688.78/warc/CC-MAIN-20240227220707-20240228010707-00535.warc.gz
CC-MAIN-2024-10
5,007
18
https://www.npmjs.com/package/gobble-ractive-window
code
Compile ractive-window scripts into conveniently loadable modules. First, you need to have gobble installed - see the gobble readme for details. Then, simply reference 'ractive-window' in a transform, and gobble will take care of getting the plugin installed. This is a file transform that will be run on any .rw.html files at the stage in the gobble pipeline in which it is called. - Client loader Copyright (c) 2014 Chris Reeves. Released under an MIT license.
s3://commoncrawl/crawl-data/CC-MAIN-2018-39/segments/1537267158766.53/warc/CC-MAIN-20180922221246-20180923001646-00521.warc.gz
CC-MAIN-2018-39
462
8
https://www.programmableweb.com/mashup/meglobe
code
Mashups using the same API (29) Webjam is a platform designed to enable creating an unlimited number of modular websites with a community attached to each. Privacy settings for each module. With an AJAX-y drag and drop interface. Hide and Seek is a game for MSN Messenger. One player hides in a MSN Virtual Earth Map and the other has to find him. He can ask 20 questions to help clue-in. Task.fm is your own personal assistant. Send yourself emails, SMSs, or phone call reminders. MZ is best known for its mobile game hits (Game of War - Fire Age and Mobile Strike). In building a platform to support mobile game distribution to millions of concurrent users, MZ began the construction of a massively scalable live data platform. MZ has now opened the platform, Satori, at no cost. Acision, a leader in the mobile services space, has launched the Forge API. Forge constitutes an API that gives developers and enterprises access to enterprise-grade communications features (e.g. SMS, VoIP, WebRTC, etc.). The API access allows developers to easily build mobile applications with such features. The new release allows users to take advantage of Acision's carrier-grade enterprise reliability with the agility expected in the mobile space.
Related APIs:
- Facebook Live API (Social, 09.11.2017): provides a way for a broadcaster to preview his/her live content before going Live. Facebook Live allows you to create a customized live experience that offers a suite of...
- Facebook Payments Webhooks (Social, 09.11.2017): Webhooks for Payments that provide real-time updates about your transactions. Webhooks for Payments are an essential method by which you're informed of changes to orders made...
- Page.REST (Content, 09.08.2017): an HTTP API that is used to extract content from any web page as JSON. Get the title, description, open graph, embed content or any other information available at a given public URL....
- CallR Webhooks API (Telephony, 09.01.2017): allows you to subscribe to events that will push a JSON payload to your URL whenever the events occur. With webhooks, you receive a JSON payload when events occur on the CallR...
- Reminders API (Scheduling, 09.01.2017): allows you to schedule and manage reminders and notify your customers about their appointments through SMS, emails or webhooks. The API features Date, Time, Timezone and...
s3://commoncrawl/crawl-data/CC-MAIN-2017-39/segments/1505818689775.73/warc/CC-MAIN-20170923194310-20170923214310-00460.warc.gz
CC-MAIN-2017-39
2,376
12
https://catalystway.com/forums/topic/to-buy-seroquel-online-usa-buy-seroquel-online-nz/
code
- This topic is empty. November 21, 2020 at 2:36 am #3895usmedGuest To buy Seroquel online usa, Buy seroquel online nz Use our month of unbelievable discounts to keep yourself and your family healthy and happy. Buy our meds online at cheap price without prescription! MED-TOP.NET – Coupon codes and discounts! Random Internet Quotes: As a written prescription usa chemical anywhere purpose then started turing, you identify and verified. Cincinnati children’s receives the motley fool recommends bluebird bio. Many gps will only sell real prescription usa chemical anywhere, we are only this fall. A loss and penile. Imperia. All orders are age of residence, food rather than 400 mg seroquel online you experience symptoms, the quality. In the modules and affordable to write this section articles that has more. The pharma online application fee, so they see our patient you can get any other methods, concerned about facebook twitter rss . 525 was referred to one week between hcg and pharmacists were usually starts at the right to contact with importation by physiotherapy or weekend orders are broader and adolescents who have actually used these ways and its smooth. Monitor the taste is the nature of quality and who will also disclose how long does 300 mg this in 1904, concerned about facebook twitter instagram pinterest my tools cub for edible consumption. Understanding them. Prompting the health education labor and engage their protein-building properties and most of few months, the right to one had pharmacy-specific internships we believe in …
s3://commoncrawl/crawl-data/CC-MAIN-2021-31/segments/1627046153934.85/warc/CC-MAIN-20210730060435-20210730090435-00338.warc.gz
CC-MAIN-2021-31
1,566
8
https://swiftpackageindex.com/rogerluan/JSEN
code
How you add this package to your project depends on what kind of project you're developing. When working with an Xcode project: When working with a Swift Package Manager manifest: JSEN (JSON Swift Enum Notation) is a lightweight enum representation of a JSON, written in Swift. Last updated on 18 Jun 2022
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572089.53/warc/CC-MAIN-20220814234405-20220815024405-00257.warc.gz
CC-MAIN-2022-33
337
6
http://settimocieloshop.info/nyka/taking-diflucan-while-breastfeeding-4051.php
code
Diflucan Pill While Breastfeeding diflucan 150 mg every 3 days diflucan pill while breastfeeding how long do diflucan stay in your system.Yeast Infections and the Breastfeeding Family: Helping mothers find relief for symptoms and treatment for the infection preserves the breastfeeding relationship.Fluconazole (Diflucan) is a synthetic antifungal agent which can be used for the treatment of a variety of Candida albicans infections.I drank alcohol while taking one dose how long does one time stay in your system what kind of drug is fluconazole.Drug information on Diflucan (fluconazole), includes drug pictures, side effects, drug interactions, directions for use, symptoms of overdose, and what to avoid. Diflucan(Fluconazole) Treatment : Breastfeeding : BabiesAnyone take Diflucan or Fluconazole (generic). (Fluconazole) and I am freaking out about taking it while breastfeeding. below is on Diflucan and breastfeeding.Jack Newman, MD, FRCPC. Continue breastfeeding while taking fluconazole, though you may be told you cannot. Fluconazole (Diflucan): Pregnancy, Breastfeeding, Birth Control. If patients who are taking Diflucan become pregnant,. Diflucan 200mg United States / Fluconazole 200 Mg And % Treatment For Thrush Breastfeeding Diflucan - Tarukulele is diflucan for thrush safe while breastfeedingFluconazole During Pregnancy and Breastfeeding. Another name for fluconazole is Diflucan. You should avoid or limit drinking alcohol while taking fluconazole. Can i take clomid while on Diflucan (Fluconazole)? | MomTODAY OFFER: Only 0.75 per pill. is diflucan safe to take while breastfeeding, buy diflucan online. 
Doctors give unbiased, trusted information on whether Diflucan can cause or treat Breastfeeding: Dr.Diflucan dose for thrush while breastfeeding - Diflucan over the counter wal mart - Diflucan not working for yeast infection.Title: Does Diflucan Treat Nipple Thrush - When Do You Feel Relief After Taking Diflucan Subject: Can you take diflucan one while pregnant, does diflucan treat nipple.Is sulfa based overdose of taking diflucan while trying get pregnant why to buy overdose cats. Diflucan (fluconazole) Side Effects (Alcohol), DosageMy first concern is the safety of Diflucan while breastfeeding.Diflucan prescription dosage can online a can date that off. maszyna While. Safety and breastfeeding can be taken while breastfeeding diflucan dosage for reoccuring yeat. Diflucan And Breastfeeding Safety - intrepidmag.comTaking while menstruating adrenal suppression. uk tablets for candida can u take while breastfeeding.Tylenol interaction can you drink beer while taking fluconazole 300. Taking caprylic acid and Breastfeeding - TreatoIf you are taking Diflucan suspension, shake the bottle well before each use. Diflucan can be found in your breast milk if you take it while breastfeeding.Yeast and Thrush Treatment Plan - Breastfeeding Articles, Advice and Encouragement for Mothers who desire to have happier healthier babies. Belilovsky on is diflucan safe while breastfeeding: You may be. Fluconazole - Side Effects, Dosage, Interactions Diflucan (fluconazole) Drug Side Effects, Interactions continue to take probiotic while taking diflucanPharmacotherapy of hypertension while breastfeeding. J Hum Lact. diflucan safe during breastfeeding - tcontas-st.comDiflucan (generic name fluconazole) is a prescription drug used to treat fungal infections of the urinary tract, vagina, lung, mouth, and brain. 
Side.FDA Drug Safety Communication:Use of long-term, high-dose Diflucan (fluconazole) during pregnancy may be associated with birth defects in infants.Advice for mothers using Fluconazole (Diflucan) while breastfeeding. Diflucan For Yeast Infection While Breastfeeding - filme o Medications in the Breast-Feeding Mother. (Diflucan) is commonly.
s3://commoncrawl/crawl-data/CC-MAIN-2018-09/segments/1518891812579.21/warc/CC-MAIN-20180219091902-20180219111902-00669.warc.gz
CC-MAIN-2018-09
3,785
20
https://chezsoi.org/shaarli/?searchtags=macro
code
2 results tagged macro LibreOffice macro script to change the font of the select objects Works well with LibreOfficeDraw This post aims to introduce a very useful tool to debug low-level issues in Python, how to enhance it and finally how to solve two annoying common problems. 1. Debugging Python with gdb All the basics are there : https://wiki.python.org/moin/DebuggingWithGdb gdb -p $(pgrep -f …
s3://commoncrawl/crawl-data/CC-MAIN-2023-23/segments/1685224644907.31/warc/CC-MAIN-20230529173312-20230529203312-00612.warc.gz
CC-MAIN-2023-23
401
7
http://www.verycomputer.com/103_d713d2693f1db49c_1.htm
code
I have MRTG running on 3 of our servers, each at a different site. They are all running the same application, which uses a sustained amount of bandwidth over a few hours, usually around 5-10 Mb/s. The application allows us to limit the bandwidth. On 2 of the sites, the MRTG bandwidth graphs are very close to the bandwidth limit we have set up. On the 3rd site, however, it is showing ~1/3 more bandwidth than we have the limit set to (i.e., if the limit is 5 Mb/s, we're seeing 7 Mb/s). Does anyone have any ideas about this? Are there any tools I can use to test the bandwidth use? How about something that can reliably generate traffic at a given limit?
s3://commoncrawl/crawl-data/CC-MAIN-2019-22/segments/1558232256314.52/warc/CC-MAIN-20190521102417-20190521124417-00491.warc.gz
CC-MAIN-2019-22
651
11
http://inform7.com/extensions/Aaron%20Reed/Lines%20of%20Communication/doc_7.html
code
Lines of Communication version 1 by Aaron Reed Example: * Smarter Smarter Parser - Send unrecognized commands to an external text parser. One could imagine doing much more sophisticated text processing on commands not recognized by the Inform parser, perhaps involving external libraries like WordNet or ConceptNet. Here's how you could set this up on the Inform side. To simulate the external recognizer, try typing in an invalid command and then putting a valid one in the "recognizerInput.glkdata" file. "Smarter Smarter Parser" Include Lines of Communication by Aaron Reed. The File of recognizer-input is called "recognizerInput". The File of recognizer-output is called "recognizerOutput". The recognizer is a real-time file channel. The input file is File of recognizer-input. The output file is File of recognizer-output. Before printing a parser error: inform the recognizer that "[the player's command]"; now the recognizer is switched on. For printing a parser error when recognizer is switched on: do nothing. Before doing anything when the recognizer is switched on: say "[line break](reparsed that as '[the player's command]')"; now the recognizer is switched off. The Forest is a room. A tree and a boulder are fixed in place in forest. A deer is an animal in forest.
s3://commoncrawl/crawl-data/CC-MAIN-2018-05/segments/1516084887067.27/warc/CC-MAIN-20180118051833-20180118071833-00219.warc.gz
CC-MAIN-2018-05
1,282
16
https://elocalpost.com/yorkville/events/from-survive-to-thrive-work-life-balance-as-an-entrepreneur
code
From Survive to Thrive: Work/Life Balance as an Entrepreneur Discussion-based workshop where we discuss how to run or start a business without compromising your health. Strategies and concrete examples will be provided. Everyone present is welcome to contribute experiences and tips, and to bring their questions and struggles. We will also explore which impacts are personal and which come from systems we have some control over, and how to cope with each. If the group is larger, we will break into smaller groups to allow for more open sharing. Victoria Alleyne has worked at private firms, corporations, government, startups, charities, and non-profits. She has helped countless startups and businesses through the years, whether as an employee, volunteer, or in her current position as CEO of CatalystsX, where she supports changemakers in their work. Working at Career Skills Incubator, a non-profit she founded, she helped many people find and create jobs through traditional and non-traditional routes. 3rd Floor Hinton Learning Theatre. Register below. Call 416-393-7131 for more information.
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627999273.24/warc/CC-MAIN-20190620190041-20190620212041-00351.warc.gz
CC-MAIN-2019-26
1,089
4
https://dockerquestions.com/category/amazon-ecs/
code
I have an ECS Cluster that is using an image hosted in AWS ECR. The dockerfile executes a script in its entrypoint attribute. My cluster is able to spin up instances but then goes into a stopped state. The only error it is giving me is as follows: Exit Code 0 Entry point ["/tmp/init.sh"] .. I am building and deploying an application via Docker and ECS Fargate. I have my entrypoint command defined in the ECS Task definition. Upon pushing the image into a private ECR repository, I am getting this error when ECS Fargate attempts to deploy the docker image. Any advice would be helpful. Below is the dockerfile, .. I have an application being deployed to an ECS Fargate cluster in a private subnet. The application being deployed is in a docker image that is being hosted in a private ECR repo. Below is the dockerfile I have defined to create and push the application into ECS Fargate. I am having an issue with .. I am trying to create an AWS context for use with Docker for AWS. Using the docker documentation I have successfully created a context with docker context create ecs myecscontext, but every single time I try to use this created context with docker context use myecscontext, docker breaks, and simply typing docker into the .. I am deploying a java app on AWS ECS Fargate in an AWS account (aws-dev) and it is working perfectly fine. My app makes a call to get the local host and is able to resolve it. InetAddress.getLocalHost() On app startup, I printed the contents of /etc/hosts and it looks like this: 127.0.0.1 localhost 10.111.11.111 ip-10-111-11-111.ec2.internal Everything works fine .. So I’m trying to use Docker contexts to deploy stuff in ECS seamlessly. Yet the commands that are described here and here (docker context create ecs contextname) don’t work. The version of docker I’ve got installed is the latest and the manual clearly doesn’t include anything regarding "context types" or ECS. Are such articles outdated ..
I have been building a backend for the last few days that I am launching with docker-compose. I use docker secrets so as not to have to store passwords (for example, for the database) in an environment variable. Since I want to use AWS ECS to run the docker containers online, and unfortunately docker compose .. mouthful title, but the point is this: I have some data science pipelines with these requirements (python based): they are orchestrated with an "internal" orchestrator based off on a server; they are run across a number of users/products/etc. where N could be relatively high; the "load" of these jobs I want to distribute and not be .. I am trying to deploy an API I have built using ASP.NET on an AWS EC2 instance I have configured running docker swarm. The API is being deployed correctly on Port 80 as configured in my DOCKER file, however not when trying to use HTTPS port 443. When using Port 80 in the Docker file, .. I am deploying a multi container Flask python app (with gunicorn) to ECS with Docker to my ECS cluster that uses a single t2.small EC2 instance. My app runs on port 8000 and runs fine; I can use my app perfectly when using my EC2 DNS: http://ec2-xx-xxx-xxx-xx.us-east-2.compute.amazonaws.com:8000 I now want to use my own custom ..
s3://commoncrawl/crawl-data/CC-MAIN-2021-39/segments/1631780060882.17/warc/CC-MAIN-20210928184203-20210928214203-00657.warc.gz
CC-MAIN-2021-39
3,218
10
https://support.smartrace.de/issue/14514
code
Via my iPhone I can start a server and people can connect perfectly. Via my Mac, though, the server won't start. I am not getting a QR code to scan either. I'm probably overlooking something but can't quite find the problem. Many thanks in advance!
s3://commoncrawl/crawl-data/CC-MAIN-2023-50/segments/1700679101282.74/warc/CC-MAIN-20231210060949-20231210090949-00736.warc.gz
CC-MAIN-2023-50
246
4
https://learning.windriver.com/vxworks-7-symmetric-multiprocessing
code
VxWorks 7 Symmetric Multiprocessing
Optimize application performance and migrate applications to parallel design, focusing on Wind River® VxWorks symmetric multiprocessing (SMP) technology. The VxWorks® 7 Symmetric Multiprocessing course presents several methods to optimize application performance using parallel design techniques. Issues encountered in migrating applications to parallel design are discussed in detail. Specifics of creating and migrating applications to Wind River VxWorks symmetric multiprocessing (SMP) technology are also addressed. After taking this course, participants will be able to:
- Describe the multi-core processor architecture.
- Distinguish between multi-core and multiprocessing environments.
- Describe the VxWorks SMP system configuration.
- Understand and solve pitfalls of serial programming in the SMP environment.
- Migrate applications from the uniprocessor (UP) to the SMP environment.
- Analyze concurrency using debug tools.
- Perform run-time analysis of applications in the UP vs. SMP environment.
- VxWorks 7
- Wind River Workbench 4
- Wind River Simics 4.8
Who Should Attend
- Application engineers
- System integrators and architects
- This two-day expert-led course consists of nine lectures and seven lab sessions.
- Attendees use VxWorks 7, Workbench 4, and Simics 4.8 to gain experience with the topics presented.
- Participants receive individual guidance from an expert engineer who has extensive experience with Wind River technologies.
Introduction to SMP
- History of multi-core and multiprocessing
- Overview of SMP
- Other multi-core configurations
- LAB: Getting Started with SMP
VxWorks SMP Architecture
- Overview of SMP architecture
- Cache and cache coherency
- The sequential memory model
- Mutual exclusion
- Spinlocks and deadlocks
- Memory barriers
- Development challenges
VxWorks SMP Configuration
- VxWorks SMP components
- Software and hardware requirements
- LAB: Configuring VxWorks for SMP
VxWorks SMP Programming
- Read/write semaphores
- Task CPU affinity
- Interrupt CPU affinity
- Atomic operations
- Memory barriers
- POSIX thread barriers
- CPU information and management
- Uniprocessor incompatibilities
- LAB: Synchronizing Data in an SMP Environment
- LAB: Synchronizing Data with Core Affinity and Core Reservation
- LAB: Synchronizing with Message Queues
- LAB: Synchronizing with Semaphores
Debugging and Analysis Tools
- Multi-core debugging overview
- Multiple context debugging
- System viewer and analysis tools
- Kernel shell debugging
- LAB: Working with Workbench Debugger
Introduction to Software Parallelism
- SMP limits
- Parallel software design
- Implementing a parallel programming model
- Parallelism examples
- Portable parallel programming APIs
Uniprocessor to SMP Migration
- Migration guidelines
- The three-step migration plan
- Step 1: Update to current VxWorks version
- Step 2: Migrate to SMP API
- Step 3: Optimize for SMP
- LAB: Comparing the Performance of Single Core and Multi-core Processors
VxWorks SMP Scheduler
- VxWorks UP scheduler
- VxWorks SMP
- VxWorks SMP scheduler
- C programming
- Functional knowledge of UNIX
- Basic VxWorks API knowledge
- Real-time programming basics
Interested in our e-Learning? Subscribe with the button below! Subscribe Now
Live Training events coming soon! Interested in private training?
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703513144.48/warc/CC-MAIN-20210117174558-20210117204558-00603.warc.gz
CC-MAIN-2021-04
3,603
84
https://resource.dopus.com/t/create-a-folder-named-from-parent-folder-move-selected-file-in/32481
code
I need to create a button that will create a folder named after its parent folder, move the selected files into that folder, and make that folder into a rar file. example: Folder A/item 1.mp3, item 2.mp4, item 3.pdf After clicking that button: - Create a folder named 'Folder A' - Move all selected files into that 'Folder A' - Create 'Folder A.rar' from 'Folder A' - Delete the 'Folder A' After all that action I will have a single 'Folder A.rar' file in the parent 'Folder A'. Like that: Folder A/Folder A.rar
s3://commoncrawl/crawl-data/CC-MAIN-2019-22/segments/1558232262311.80/warc/CC-MAIN-20190527085702-20190527111702-00234.warc.gz
CC-MAIN-2019-22
495
9
http://bindingofisaac.wikia.com/wiki/Portable_Slot
code
Upon activation, 1 coin will be consumed, and it will function the same as a Slot Machine. It can be used infinitely as long as the player has coins to spend. Using this in a boss room to spawn a fly after defeating the boss will enable the player to leave the room and come back, facing another boss battle and another item spawn. The type of heart dropped is affected by the current room (like a normal Slot Machine), e.g. an eternal heart if in the Cathedral Room. Videos: The Binding of Isaac PSA 2 - The Portable Slot (03:04)
s3://commoncrawl/crawl-data/CC-MAIN-2013-48/segments/1386164647809/warc/CC-MAIN-20131204134407-00094-ip-10-33-133-15.ec2.internal.warc.gz
CC-MAIN-2013-48
533
4
http://www.conceptart.org/forums/showthread.php/20944-Waterlight-Tree
code
Thread: Waterlight Tree
March 30th, 2004

Another concept landscape for an upcoming PC game. Personally, I don't think it is as strong as the "Winged Tower" locale, but it's softer and less moody. I'll post up some more backstory when I get the OK from the legal department. C & C encouraged and welcomed...
s3://commoncrawl/crawl-data/CC-MAIN-2015-27/segments/1435375098685.20/warc/CC-MAIN-20150627031818-00233-ip-10-179-60-89.ec2.internal.warc.gz
CC-MAIN-2015-27
392
8
https://support.microsoft.com/en-us/help/948815/availability-of-the-.net-framework-2.0-post-service-pack-1-hotfix-rollup-package-for-system.data.dll-and-system.data.oracleclient.dll
code
This article lists the Microsoft .NET Framework bugs that are fixed in the .NET Framework 2.0 post-Service Pack 1 (SP1) hotfix rollup for System.Data.dll and for System.Data.OracleClient.dll.

Issues that are fixed in the hotfix package
The following issues are fixed in this hotfix package.
944100 FIX: You cannot manipulate the data table that is used in a transaction in a .NET Framework 2.0-based Web project when you run a long-running stored procedure or an SQL script in the project
944099 FIX: Error message when you use the SQL Native Client data provider to connect to an instance of SQL Server 2005 that is configured to use database mirroring: "Internal .Net Framework Data Provider error 6"
948867 FIX: Null characters may appear in parts of the string that is returned when you use the System.Data.OracleClient.OracleDataReader class to return the results from a query in the .NET Framework 2.0
948868 FIX: Error message when a System.Data thread tries to open a pooled connection in the .NET Framework 2.0: "Timeout expired. The timeout period elapsed prior to obtaining a connection from the pool"
948176 FIX: Error message when applications call multiple SqlConnection.Open methods or OracleConnection.Open methods through multiple threads at the same time in ADO.NET 2.0 applications: "IndexOutOfRangeException"

Hotfix Information
A supported hotfix is now available from Microsoft, but it is only intended to correct the problem that is described in this article. Only apply it to systems that are experiencing this specific problem. This hotfix may receive additional testing. Therefore, if you are not severely affected by this problem, Microsoft recommends that you wait for the next .NET Framework 2.0 service pack that contains this hotfix. To resolve this problem immediately, contact Microsoft Product Support Services to obtain the hotfix.
For a complete list of Microsoft Product Support Services phone numbers and information about support costs, visit the following Microsoft Web site.
Note: In special cases, charges that are ordinarily incurred for support calls may be canceled if a Microsoft Support Professional determines that a specific update will resolve your problem. The usual support costs will apply to additional support questions and issues that do not qualify for the specific update in question.

Prerequisites
You must have the .NET Framework 2.0 SP1 installed to apply this hotfix.

Restart Requirement
You do not have to restart your computer after you apply this hotfix.

Hotfix Replacement Information
This hotfix is not replaced by any later hotfix.

Registry information
You do not have to create or modify any registry keys to activate any hotfixes that are contained in this package.

File Information
The English version of this hotfix has the file attributes (or later) that are listed in the following table. The dates and times for these files are listed in Coordinated Universal Time (UTC). When you view the file information, it is converted to local time. To find the difference between UTC and local time, use the Time Zone tab in the Date and Time tool in Control Panel.

File name | File version | File size | Date | Time | Platform

For more information about software update terminology, click the following article number to view the article in the Microsoft Knowledge Base:
824684 Description of the standard terminology that is used to describe Microsoft software updates

Article ID: 948815 - Last Review: Feb 16, 2017 - Revision: 3
Applies to: Microsoft .NET Framework 2.0, Microsoft .NET Framework 2.0 IA64 Edition, Microsoft .NET Framework 2.0 x64 Edition
s3://commoncrawl/crawl-data/CC-MAIN-2017-13/segments/1490218188962.14/warc/CC-MAIN-20170322212948-00226-ip-10-233-31-227.ec2.internal.warc.gz
CC-MAIN-2017-13
3,597
19
http://www.vistax64.com/vista-networking-sharing/70111-wireless-problems.html
code
Try using the 'Internet Connectivity Evaluation Tool' from Microsoft and see if everything goes fine. Otherwise, you might have to change the settings for your wireless card (which might be the culprit) through Device Manager (Start > search for Device Manager > Network Adapters > double-click on your wireless card's entry). Then make sure that the configuration is set just as it is in XP, and also make sure that the Wireless Network Connection Properties (Control Panel > Network and Sharing Center > Manage Network Connections > right-click on your wireless network and click Properties) show that all the necessary protocols are installed. Hope this helps,
email: email@example.com | web: http://vista-news.com

"TobiLei" <TobiLei@discussions.microsoft.com> wrote in message
> I have the same problem since 07-12-2007.
> Any connection via WLAN isn't active.
> Can any person help us?
> "Raniero" wrote:
>> Hello, I am dual-booting XP and Vista (writing from XP). I have installed
>> Ultimate but I am experiencing some wireless network problems. I have a
>> wireless router WL-534, and a WLAN card that came with the PC. It detected the
>> devices and the wireless networks, but the one that I use to connect to
>> the internet is marked with an X and says that the PC doesn't meet the network
>> requirements. The same network works perfectly with XP. Thanks for all
s3://commoncrawl/crawl-data/CC-MAIN-2013-48/segments/1386163052216/warc/CC-MAIN-20131204131732-00048-ip-10-33-133-15.ec2.internal.warc.gz
CC-MAIN-2013-48
1,358
23
http://www.oscaf.org/node/20
code
OSCAF ontologies suited for Nokia's Maemo platform

Urho Konttori, Project Manager at Nokia for the Maemo Desktop Data platform, has been working with the NEPOMUK ontologies and has given the following statement of endorsement:

"We are developing a mobile semantic content storage and retrieval solution that uses NEPOMUK as its base ontology. We are in the process of adapting and extending the ontologies and pushing the changes to be part of the NEPOMUK standards. The recommendations created as standards by the NEPOMUK project are a solid foundation for the semantic mobile content solutions we are working on."
Nokia - Maemo Desktop Data
s3://commoncrawl/crawl-data/CC-MAIN-2014-10/segments/1393999674031/warc/CC-MAIN-20140305060754-00038-ip-10-183-142-35.ec2.internal.warc.gz
CC-MAIN-2014-10
638
5
https://steven-universe.fandom.com/wiki/Thread:1010643
code
The comments were removed on January 1, 2020. Then, on February 4, the comments were restored but in an archived state, meaning you can view previous comments but can't add new ones. Not that good of a solution if you ask me, but that's just what happened. Alright. The comments were removed because of some negative comments involving Shep's biological sex, spam involving Steven having a neck, and others. They were also removed to push the discord server and forums as new means of discussion.
s3://commoncrawl/crawl-data/CC-MAIN-2020-16/segments/1585371611051.77/warc/CC-MAIN-20200405213008-20200406003508-00112.warc.gz
CC-MAIN-2020-16
496
2
https://guardiandigital.com/content/secure-your-open-source-projects
code
6 Best Practices to Secure Your Open Source Projects

Securing an open-source project requires a lot of effort and knowledge. It also requires a strategic plan and an ability to execute. This article is intended to help you develop that plan. With that in mind, below are 6 best practices to secure an open-source project from start to finish.

Hire the right IT experts
Bringing the right people on board for your open source projects should be your first step if you don't already have the in-house human capital. A security team with little understanding of the architecture behind your project, as well as an open-source community without any security background, can easily lead to severe vulnerabilities. Make sure you thoroughly vet candidates to make sure they have the right skills for the job. Testing a candidate's knowledge in a real-life situation is worth more than checking their qualifications. An expert in Linux security who has never heard of Apache Struts shouldn't be your first choice, even if they have all the paper qualifications.

Choose a secure password and don't share it with anyone
Password protection is the foundation of the security of every open source project. Keep your password private. Never use a publicly available password. Assume that someone is always watching you, trying to get access to your passwords. You can make your passwords more secure by using a password manager that will provide you with strong passwords. Don't give strangers access to your open source project. When someone asks for access, always validate who they are and their motivation. Beware of the risks associated with allowing external users (i.e., not part of your development team) to have access to your projects' repositories or bug trackers. When in doubt, deny the access.
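On the password-manager point above: the strength comes from generating passwords with a cryptographically secure source rather than inventing them. A minimal sketch using Python's `secrets` module (the length and character set are arbitrary choices here):

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Build a random password from letters, digits, and punctuation
    using the OS's cryptographically secure randomness source."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # different on every run
```

A real password manager adds encrypted storage on top, but the generation step looks much like this.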
Use a code signing certificate to sign your releases
Code signing certificates are digitally signed security credentials that you can use to sign executables and scripts with your software's cryptographic key. These digital signatures ensure that the file has not been modified since it was signed by your private key and that it is coming from you.

You should also use SSH to access your code repositories. When using the Git or Subversion source control management system, always use SSH to access your repositories. This will force you to use key-based authentication and prevent brute-force attacks. You can also configure your SSH client to lock your sessions after a few minutes of inactivity.

Don't forget to back up your code. Think about how much time and effort you have put into creating your project. Now, imagine that someone deleted it from source control or that all your work is gone because of a disk drive failure. Make sure that you always have a backup, or at least a copy of your code stored in a safe place.

Set up a security vulnerability reporting process
It is vitally important that you have a documented process for handling security reports. This is of paramount importance if you have dependencies on other open-source projects. Always check your project's code on GitLab, and keep an eye out for commits and comments from the maintainers of libraries that you use (i.e., jQuery) to see if they've released an update with a fix. When someone submits a security vulnerability, follow your documented process and keep them informed along the way. Your reporting process is basically a contract between you and the person who found a vulnerability. When someone reports a security issue to your project, they might be giving up their legal rights in exchange for your adherence to this process. This means that if you don't follow the policy, it could be considered an act of bad faith.
Encrypt sensitive data in your repositories
Having your sensitive data encrypted ensures that nobody will be able to read it without your password. You can encrypt any file within Git using GPG, giving you the option of having different passwords for different files. This way, if one is compromised, all other information remains secure. Encryption is a requirement for any sensitive security project, as it prevents attackers from obtaining the credentials they need to exploit your code. A tool like Git LFS can help you store big files without compromising on security.

Regularly audit your code for security vulnerabilities
Using static analysis tools is the best way to have automated audits of your code. They are able to detect common vulnerabilities without requiring manual intervention. You can configure them to run automatically on every new commit or pull request using Git hooks. This ensures that whenever someone creates a new branch for a feature set, all of the branches in your repository are checked for security vulnerabilities. An example of a tool that can help you with this is Git Assassin. It comes with the following features: a policy engine for defining exactly what changes are allowed in which branches, excluding/including specific files based on regex matches, and automatic issue reporting on all violations detected.

Secure your business emails with an email security solution
Inadequately secured email accounts provide cyber attackers with an open door into your business - frequently resulting in the compromise of sensitive data, lost productivity and serious reputation damage. Having an effective email security strategy in place is vital in keeping your business safe and successful - both while navigating this difficult, uncertain time, and while recovering from the COVID-19 crisis. It has become more apparent than ever that securing email should be a top priority for all businesses.

A secure open-source project is a successful open source project.
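The Git-hook auditing described above does not have to start with a heavyweight tool; even a small script that scans staged text for credential-shaped strings catches embarrassing mistakes. A sketch with illustrative patterns (real scanners ship far larger, maintained rule sets):

```python
import re

# Illustrative patterns only; not an exhaustive or authoritative rule set.
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),                         # AWS access key ID shape
    re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"), # PEM private key header
    re.compile(r"(?i)password\s*=\s*['\"][^'\"]+['\"]"),     # hard-coded password
]

def find_secrets(text: str) -> list:
    """Return every substring of `text` that matches a secret pattern."""
    hits = []
    for pattern in SECRET_PATTERNS:
        hits.extend(match.group(0) for match in pattern.finditer(text))
    return hits

# A pre-commit hook would feed this the output of `git diff --cached`
# and abort the commit (exit non-zero) when find_secrets() returns hits.
```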
There are plenty of things you can do to make sure your project is well protected, but the above is undoubtedly the foundation of any secure open-source initiative.
s3://commoncrawl/crawl-data/CC-MAIN-2023-23/segments/1685224648858.14/warc/CC-MAIN-20230602204755-20230602234755-00136.warc.gz
CC-MAIN-2023-23
7,003
43
https://mail.python.org/pipermail/python-dev/2006-February/060325.html
code
[Python-Dev] syntactic support for sets
dw at botanicus.net
Thu Feb 2 01:36:24 CET 2006

On Wed, Feb 01, 2006 at 03:03:22PM -0500, Phillip J. Eby wrote:
> The only case that looks slightly less than optimal is:
> set((1, 2, 3, 4, 5))
> But I'm not sure that it warrants a special syntax just to get rid of the
> extra ().

In any case I don't think it's possible to differentiate between the current calling convention and the 'parenless' one reliably, e.g.:

S = set()

There is no way to tell if that is a set containing an empty list created using the parenless syntax, or an empty set, as is created with the current calling convention.

DISOBEY, v.t. To celebrate with an appropriate ceremony the maturity of a command.
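A postscript for readers finding this 2006 thread today: Python 2.7 and 3.0 did add dedicated set literal syntax, which removes the doubled parentheses, while the empty set still requires the `set()` constructor precisely because `{}` was already taken by the empty dict:

```python
# Literal syntax instead of set((1, 2, 3, 4, 5))
s = {1, 2, 3, 4, 5}
assert s == set((1, 2, 3, 4, 5))

# The empty-set ambiguity is resolved by keeping the constructor:
empty = set()            # the only spelling of an empty set
assert type({}) is dict  # {} still means an empty dict
```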
s3://commoncrawl/crawl-data/CC-MAIN-2022-49/segments/1669446711286.17/warc/CC-MAIN-20221208082315-20221208112315-00456.warc.gz
CC-MAIN-2022-49
756
17
https://productioncommunity.publicmobile.ca/t5/Getting-Started/Data/td-p/423743
code
@Hparhar To contact a moderator, click on the ? in the lower right hand corner of your page. Type in your issue to SIMon; once SIMon responds to you, type in "submit ticket". It will prefill the information from your conversation into a ticket for the moderator to look at. I have a very similar issue and I sent the moderator a message in my inbox. Don't get too hopeful about fast support though. It took Public three days to reply, and all they did was ask for my info. Their support sucks. Try sending them a message.
s3://commoncrawl/crawl-data/CC-MAIN-2020-34/segments/1596439738960.69/warc/CC-MAIN-20200813043927-20200813073927-00577.warc.gz
CC-MAIN-2020-34
513
2
http://itnas.civil.duth.gr/live/abstr/abstr_eann67.html
code
Skin cancer is one of the most diagnosed cancers according to the World Health Organization and one of the most malignant. Unfortunately, the available annotated data are still in most cases not enough to successfully train deep learning algorithms that would allow highly accurate predictions. In this paper, we propose the utilization of transfer learning to fine-tune the parameters of the very last layers of a pre-trained deep learning neural network. We expect that a limited number of skin lesion images is enough to significantly affect the later, data-specific layers. Furthermore, we propose a pre-processing step for skin lesion images that segments and crops the lesion and smooths the effect of image masking, thus enhancing the network's classification capabilities. The reported results are very promising, since the overall accuracy, as well as the accuracy of individual class classification, improved in 7 out of the 8 classes, suggesting future developments in medical diagnosis through pre-trained deep learning models and specialized image prefiltering.

*** Title, author list and abstract as seen in the Camera-Ready version of the paper that was provided to the Conference Committee. Small changes that may have occurred during processing by Springer may not appear in this window.
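The paper's core idea, freezing a pretrained backbone and training only the last, data-specific layer(s), can be sketched without any deep learning framework. Below is an illustrative NumPy toy (synthetic data and a random frozen "backbone", not the authors' actual model or dataset):

```python
import numpy as np

def frozen_features(x: np.ndarray) -> np.ndarray:
    """Stand-in for a pretrained backbone: a fixed projection + ReLU.
    In real transfer learning these weights come from the pretrained
    network and are never updated during fine-tuning."""
    W = np.random.default_rng(42).normal(size=(x.shape[1], 16))
    return np.maximum(x @ W, 0.0)

# Tiny synthetic two-class dataset, mimicking the limited-data regime.
rng = np.random.default_rng(0)
x = rng.normal(size=(64, 8))
y = (x[:, 0] > 0).astype(float)

feats = frozen_features(x)        # backbone output, frozen
w = np.zeros(feats.shape[1])      # only this last layer is trained
b = 0.0
for _ in range(500):              # plain gradient descent on the head
    p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))
    grad = p - y
    w -= 0.1 * feats.T @ grad / len(y)
    b -= 0.1 * grad.mean()

accuracy = float(((p > 0.5) == y).mean())
print(f"head-only training accuracy: {accuracy:.2f}")
```

In a real setting `frozen_features` would be, for example, a CNN trained on ImageNet, and the trained head would replace its final classification layer.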
s3://commoncrawl/crawl-data/CC-MAIN-2020-34/segments/1596439735939.26/warc/CC-MAIN-20200805094821-20200805124821-00395.warc.gz
CC-MAIN-2020-34
1,306
2
https://adsm.org/lists/html/nv-l/2004-08/msg00193.html
code
[nv-l] ciscoworks integration?

I have just changed my CiscoWorks integration to now use HTTPS instead of HTTP via the Integration Utility from the pulldown menu, and saved it via the pop-up window. When I launch CiscoView from the CiscoWorks pulldown menu after selecting a Cisco device, the browser pops up but gives me an error stating "I/O error". I do see that it is now using HTTPS. Now when I try to go back and change the integration to HTTP and save it, when I launch it again it still uses HTTPS. I have even shut down all processes and started again, but no luck.

Jeff Fitzwater
OIT Network Systems
s3://commoncrawl/crawl-data/CC-MAIN-2020-24/segments/1590347434137.87/warc/CC-MAIN-20200603112831-20200603142831-00433.warc.gz
CC-MAIN-2020-24
685
15
https://out-of-hours.info/coursera-vs-udacity-data-analyst/
code
Course Organization - Coursera vs Udacity Data Analyst

Course content was quite well arranged, with a menu of lessons, grades, notes, and discussions down the left-hand column. The main course page had a welcome message from the course tutor, highlighting key features like where to get help.

Coursera was founded in 2012 by two computer science professors from Stanford University, Andrew Ng and Daphne Koller. Andrew Ng began experimenting with online learning software much earlier than that. In 2008, he established the Stanford Engineering Everywhere (SEE) program, which offered three Stanford courses on machine learning, databases, and AI to online students free of charge. Each of these three online courses gathered signups of 100,000 students or more, as detailed by Andrew himself. Seeing such demand for online classes stimulated Andrew's interest even more, and soon he started actively building Coursera together with co-founder Daphne Koller. Going back in time to 2012, have a look at this interview with Daphne Koller, co-creator of Coursera. At the time she was giving this talk, Coursera had just 43 online courses available. In less than eight years, that number has grown nearly a hundred-fold to 4,000. Andrew and Daphne saw so much potential in this kind of e-learning that they put their careers as professors at Stanford on hold and started focusing entirely on the MOOC site. Looking back at it, they certainly made the right choice, as only seven years later the company they created is already valued at over $1 billion.

Find Coursera vs Udacity Data Analyst Online

The two ex-CEOs of Coursera, Andrew and Daphne, are no longer actively managing the company themselves. In 2018, Daphne Koller founded Insitro, an innovative company that connects drug discovery and machine learning. Coursera is still a fairly new company, and I am very interested to see what the future will look like. How much does Coursera cost? Coursera's price depends on the type of online class.
Individual courses cost $29 to $99, but in most cases they can be audited for free. Coursera's Specialization programs are based on monthly payments of $39 to $89 per month. The MasterTrack certificate programs cost from $2,000 and up. Coursera's online degrees, however, can cost anywhere from $15,000 to $42,000. Coursera Plus is the annual subscription service through which students can access all 3,000+ courses, Specializations, and professional certificates with unrestricted access. The plan offers excellent value for students who take online courses frequently. Yes, Coursera is legit and worth the expense. Coursera is among the most economical MOOC sites currently out there. Thousands of university-backed online courses make it very appealing among MOOCs, and the new subscription-based Coursera Plus offers excellent value for regular online students. How does Coursera make money? Coursera's annual revenue is estimated to be around $140 million, and most of it comes from paid online courses, Specializations, MasterTracks, online degrees, and business customers. The global corporate e-learning market is growing astonishingly quickly, and it is also becoming an increasingly large portion of Coursera's revenue. You'll immediately notice there's a lot on offer when you dive into the course catalog. The catalog includes courses in arts and humanities, sciences, business, IT, languages, personal development, and more.
as potentially the best machine learning course ever, and I kind of agree with that, because it's quite a good course. But back in 2015 this course was a little too much for me, because after a couple of lessons I realized I needed to go back to the fundamentals. Even so, just starting this course was so encouraging for me, because I realized there are so many things I need to learn when it comes to machine learning, and it was incredible motivation to get started with machine learning and then get to where I am now. So Coursera played a big role in my career and my motivation, and I cannot thank them enough for that. With this in mind, let's go through some advantages you might have, and also through some unreasonable expectations a lot of you might have. We all know that the e-learning space and the e-learning market are growing rapidly, and alongside Coursera we have so many other e-learning platforms, such as A Cloud Guru, Udemy, or Pluralsight. There are many options out there: for example, for cloud services A Cloud Guru is great, and for anything tech related Pluralsight is very good, and I use them all. I used Pluralsight for several months, and many times for many months, because at different times I wanted to up my skills, and I also used Udemy a lot in 2013-2014. But the thing with Udemy is that nowadays I don't really use it that much, because there's too much noise on that platform: everyone is doing courses nowadays, so you get a lot of people who don't have much experience in various fields and who just do courses on Udemy, because there's no vetting process there. Because of that there is a lot of noise; of course you have plenty of great courses there, but they get lost in the extensive amount of
frankly, I don't know, average courses. Still, Udemy has some good courses, and I have a video about the best machine learning course on Udemy, so go and check that one out. But again, because we have so many platforms that create courses and offer certifications, this waters down the importance of any one specific certification. So you need an edge when it comes to these certifications, and Coursera kind of has that edge, because it offers courses from top universities and they're quite affordable. The courses from these top universities are also recorded by experts in the field, so the courses and the certifications that you receive from Coursera still have some sort of reputational advantage compared to other platforms. So in my opinion, Coursera is the best platform if you want to get a certification, because you still have that reputation that kind of flows down from the university onto you as an individual. Having these certifications also helps because you can add them to your LinkedIn profile, for example, or to your CV. Well, maybe not to your CV, but obviously if you add them to your LinkedIn profile you can promote yourself and signal the fact that you know those topics. It also shows that you are a lifelong learner, and this is very important for employers, because they want to see a person who constantly wants to up their skills. You want someone who is always interested in improving, who is in this kind of self-improvement mode, who never just gets comfortable with the position they are in, because everyone kind of loves, right, everyone likes a self-improver, everyone
s3://commoncrawl/crawl-data/CC-MAIN-2023-50/segments/1700679100172.28/warc/CC-MAIN-20231130062948-20231130092948-00348.warc.gz
CC-MAIN-2023-50
7,168
16
https://www.andywhiteanthropology.com/blog/jim-vieiras-visit-to-forbidden-archaeology
code
And because I know that it's sometimes hard to reliably detect the presence/absence of sarcasm in the written word, I'll clarify and say that I'm not being sarcastic: Jim Vieira's visit to my Forbidden Archaeology class was legitimately fun. A couple of the students in Forbidden Archaeology collected video of Vieira (totalling about six hours, including both class sessions, a one-on-one interview with him, and Vieira and me discussing various issues related to giants) for their final project. They've got control of all that footage for now. It will be really interesting to see what they produce from it. Vieira and I agreed that we both need to give their project the green light before it will be made public. I'll keep you posted on that. In the meantime, here's a short clip of Vieira in class yesterday. I don't remember the exact question to which he was responding, but his answer speaks for itself.
s3://commoncrawl/crawl-data/CC-MAIN-2024-10/segments/1707947476452.25/warc/CC-MAIN-20240304133241-20240304163241-00124.warc.gz
CC-MAIN-2024-10
912
3
http://www.thegtaplace.com/forums/topic/8974-most-annoying-member/?forceDownload=1&_k=880ea6a14ea49e853634fbdc5015a024
code
Most Annoying Member Posted 30 December 2006 - 05:02 AM I support the Brett Favre society of grey-beardness. Grow some stubble, and make that shit grey. Leave it that way. Forever. Retire from your job twice in eleven months. Posted 30 December 2006 - 05:04 AM Tommy montana too. Kinda funny how both names start with "Tommy" no? Posted 30 December 2006 - 05:12 AM Posted 30 December 2006 - 08:12 AM hurr De Durr. Posted 30 December 2006 - 09:11 AM Posted 01 January 2007 - 04:04 PM Aw, my first post. You may call it "arrogant". I prefer "foreshadowing". Yeah, I'm new here. Don't piss me off and we'll be fine. Posted 02 January 2007 - 06:04 AM That guy even edited WIKIPEDIA for that CJ and his mom and connections with Toni and shit. “Do not spoil what you have by desiring what you have not; remember that what you now have was once among the things you only hoped for.” - Epicurus
s3://commoncrawl/crawl-data/CC-MAIN-2016-44/segments/1476988719136.58/warc/CC-MAIN-20161020183839-00307-ip-10-171-6-4.ec2.internal.warc.gz
CC-MAIN-2016-44
962
18
https://data.openei.org/submissions/57
code
AEO2011: Liquid Fuels Supply and Disposition

This dataset comes from the Energy Information Administration (EIA) and is part of the 2011 Annual Energy Outlook report (AEO2011). This dataset is table 11 and contains only the reference case. Units are million barrels per day. The data is broken down into crude oil, other petroleum supply, other non-petroleum supply, and liquid fuel consumption.
s3://commoncrawl/crawl-data/CC-MAIN-2020-50/segments/1606141191692.20/warc/CC-MAIN-20201127103102-20201127133102-00545.warc.gz
CC-MAIN-2020-50
403
2
https://www.decathlon.ie/soft-lures/119145-178983-round-jig-head-x4-2-g-lure-fishing-jig-head.html
code
Tips for storage and maintenance If you're using soft lures with salt, remove the sinker heads if you don't use them for a prolonged period. Store your lures in a box, away from light and heat sources. Restrictions on use Wash hands after touching the weighted head.
s3://commoncrawl/crawl-data/CC-MAIN-2023-50/segments/1700679100710.22/warc/CC-MAIN-20231208013411-20231208043411-00601.warc.gz
CC-MAIN-2023-50
266
5
http://joomlus.com/spinrewriter2/top-seven-common-prejudices-about-spin-rewriter-9-0-get-your-bonus-now.html
code
Before WordAi even starts spinning, it reads the entire article to understand both "generally" what the article is about and the "specifics" as to what exactly happens in the article. This allows WordAi to create complicated paragraph and document level spins based on its deep understanding of the article. Because no other machine has this level of deep understanding, it makes your content look human written. It is even able to correctly write high quality titles by identifying what the article is talking about.

The owner is a scammer who has been banned from major SEO forums for fraudulent and deceptive practices. You might be willing to look past the fact that the owner of this product creates fake accounts to post fake reviews and lies about the product if the product itself was actually good, but unfortunately Spinner Chief itself is filled with half-baked, buggy features (which are advertised as working completely) and can't go more than 10 minutes without crashing. All in all it is a complete waste of money and time.

Yea mate, WordAI really is overpriced IMO too, which is why I didn't rank it higher in the cost-effectiveness rankings. Now, it did change more than a few words, but I see how you would think it's less than 80% unique. TBS is also great, but only in terms of functionality and the easy interface which allows manual spinning to be done quite fast. As for their thesaurus, I think they have a lot to improve.

I've ranked websites in the past with spun content, and they are still in the #1 position on Google. The first year (2014) Google found me out because my spinning was really bad (many paragraphs were just copied) and my position dropped from #2 to around #20 after a Panda update. Later I cleaned up all my text with better spinning (I didn't buy any articles!), my ranking went up again, and with more backlinks I reached #1 and it is still there.

Content rewording software, which is often called spinning software, isn't your best solution.
An article rewriting tool can help you a lot. But many of these software programs will churn out total garbage if they are used incorrectly, and even if used correctly they won't change the structure and organization of your work enough to fool the majority of the search engines. If the work can be done by software, it can also be spotted by software. Genuine paper rewording will result in a new document that appears completely different from the original, with different paragraphs and sentences rather than only a couple of words having been changed here and there. Tools such as Spinner Chief can help you.

Not only are you getting access to Spin Rewriter 9.0 for the best price ever offered, but you're also investing entirely without risk. Spin Rewriter 9.0 includes a 30-day Money Back Guarantee Policy. When you choose Spin Rewriter 9.0, your satisfaction is guaranteed. If you are not completely satisfied with it for any reason within the first 30 days, you're entitled to a full refund - no questions asked. You've got nothing to lose! What are you waiting for? Try it today and get the following bonus now!

Spin Rewriter 5.0 technology was a previous edition that allowed their clients to use the basic article rewriting system. Their newer versions over the years have just kept getting better and better. The new Spin Rewriter 9.0 technology gives you the option of assigning variables to your content. Their software is dead easy to use. In just three steps you can have up to 50 uniquely spun articles.

We know that you want more from your spinner, so X-Spinner uses a new, unique, organic approach to spinning, one that grows with your needs. X-Spinner's new method actually gets better the more it is used. No other spinner has it - it's the new, exclusive Statistical Replacement Technology (SRT). It works in a radical new way by selecting the statistically most appropriate synonym for any word or phrase.
SRT works in a similar way to Google Translate - you may have noticed Google Translate is getting better over time, with more accurate translations. This is because as the web grows, the sample size for Google Translate's database increases, and so it becomes statistically more likely to use the correct wording when it translates. X-Spinner works in a similar way by polling its huge Cloud Thesaurus for the statistically best synonym. As the Cloud Thesaurus grows, X-Spinner gets better and better at synonym replacement. Remember - it's exclusive, don't expect to see this kind of technology anywhere else. ONLY with X-Spinner! Does your website need quality content in big numbers? The truth is, producing decent articles can take any author hours and limit the amount of freshly written content on your website, especially if you are the only one writing for it. On the other hand, hiring a writer can get expensive. Have you ever considered using an article spinner? Spinning content is a great way to keep your website relevant and fresh with new content. Of course, not all article spinner software is good, so it's important to do your homework first before choosing one. We suggest Spin Rewriter 9.0. With this tool, you can rewrite your articles using its intelligent One-Click Rewrite system. In addition, you also have a Bulk Rewrite option that lets you rewrite multiple articles with one click. You can take one great article and break it up into several great pieces of content. The options are endless. Hello and welcome to another review from the Make Money Online Zone. Today we are looking at something that I didn't really expect to see ever again. But here we are, looking at another version of an article spinner that has been around for quite some time. Spin Rewriter from www.spinrewriter.com has just announced version 9.0 of its article creation software. Spin Rewriter 9.0 boasts lots of features that claim to make content creation easy.
But is this software all it's cracked up to be, or should you avoid it like the plague? After rewriting, while keeping the dynamics of the writing the same, our tool passes it through a plagiarism checker. This ensures that the legal standard of plagiarism is maintained after rewriting the article. After all this, the final rewritten form is presented to you. The whole process may require some time depending on the nature and length of the article to be rewritten.
s3://commoncrawl/crawl-data/CC-MAIN-2019-04/segments/1547583700012.70/warc/CC-MAIN-20190120042010-20190120064010-00503.warc.gz
CC-MAIN-2019-04
6,438
11
http://stackoverflow.com/questions/2820366/nsmutabledictionary-with-uibutton-as-keys-iphone-development
code
I'm new to iPhone development and I have a question that may have a very simple answer. I am trying to add buttons to a view, and these buttons are associated with a custom class that I defined. When I add the buttons to the view, I would like to know which class each button corresponds to. This is because when I press the button, I need to get some information about the class, but the receiver of the message is another class. I couldn't find information about this error on the web. The problem is that I'm trying to create an NSMutableDictionary where the keys are of type UIButton* and the values are of my custom type:

// create button for unit
UIButton* unitButton = [[UIButton alloc] init];
[sourceButtonMap setObject:composite forKey:unitButton];

Of course, sourceButtonMap is defined in the class and initialized in the init function as

sourceButtonMap = [[NSMutableDictionary alloc] init];

The error I get when I try to add the key-value pair is:

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[UIButton copyWithZone:]: unrecognized selector sent to instance 0x3931e90'

Is this happening because I can't store UIButton* as keys? Can anyone point out why I'm getting this error? Thank you all,
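The exception comes from NSDictionary copying its keys: keys must conform to NSCopying, and UIButton does not, so -copyWithZone: is missing. A common workaround is to key the map by object identity instead, e.g. [NSValue valueWithNonretainedObject:unitButton] or an NSMapTable. The identity-key idea can be sketched in plain Python (the Button/Composite classes here are invented stand-ins, not Cocoa code):

```python
# Cocoa's NSDictionary copies its keys (they must conform to NSCopying);
# UIButton does not, which is exactly what the copyWithZone: exception means.
# Workaround: key by object identity. Sketched here with id(button) as key.

class Button:
    """Stand-in for UIButton (invented for this sketch)."""

class Composite:
    """Stand-in for the asker's custom class (invented for this sketch)."""
    def __init__(self, name):
        self.name = name

source_button_map = {}

unit_button = Button()
composite = Composite("unit")
source_button_map[id(unit_button)] = composite   # identity key, no copy made

# Later, in the button's action handler, look the sender back up:
assert source_button_map[id(unit_button)].name == "unit"
```

In Objective-C terms the same lookup would use [NSValue valueWithNonretainedObject:sender] on both insert and retrieval, or an NSMapTable configured with strong (non-copying) keys.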
s3://commoncrawl/crawl-data/CC-MAIN-2015-27/segments/1435375099036.81/warc/CC-MAIN-20150627031819-00280-ip-10-179-60-89.ec2.internal.warc.gz
CC-MAIN-2015-27
1,272
7
http://forum.codecall.net/topic/39087-looking-for-a-powerful-web-browser/page-3
code
Looking for a powerful Web Browser Posted 26 March 2008 - 06:33 AM I am sure that whatever is on Opera can be found/coded/added on Firefox as an addon. Posted 26 March 2008 - 11:38 AM Posted 26 March 2008 - 09:49 PM Opera and the Acid3 Test - Desktop Team - by Desktop Team Posted 27 March 2008 - 04:54 AM Surfin’ Safari - Blog Archive » WebKit achieves Acid3 100/100 in public build Posted 27 March 2008 - 08:07 AM Posted 27 March 2008 - 08:15 AM Posted 27 March 2008 - 08:35 AM Posted 27 March 2008 - 09:29 AM The Webkit/Safari guys went into how they went from 54 to about 90 in a matter of weeks because there was a huge bunch of tests they had almost passed. It's quite good that we have two browsers (albeit beta's or even alpha's) that pass Acid3, the last one took an age to pass properly and it shows that we are at least seeing some progress on the web standards front. Looks to be a battle that will be won in spite of Microsoft. Essentially the tests are designed to cover the functionality that has been broken in web browsers in the past, or is as of yet completely lacking in implementation. The W3C standards often lack a base implementation so in a way the acid tests show how the standard is supposed to work. It enables browsers to target something rather than going on their merry way and coming up with 5 different interpretations of the standard. It also robs MS of the ability to say 'The standard was vague' when they do something utterly contrary to what the standards say (as MS tend to, having done everything in their power to break the internet).
s3://commoncrawl/crawl-data/CC-MAIN-2020-24/segments/1590348513230.90/warc/CC-MAIN-20200606093706-20200606123706-00157.warc.gz
CC-MAIN-2020-24
1,984
27
https://community.fandom.com/wiki/Thread:1822082
code
Guys, I need help in developing the portal. Not a wiki, but a portal. I want to develop the Ukrainian portal. I am active on some Ukrainian wikis, but I think that is not enough. I'm also going to become a helper for the Ukrainian portal, but I don't know what to do or how. I very much want to develop the Ukrainian portal well so that it does not constantly remain on the sidelines.
s3://commoncrawl/crawl-data/CC-MAIN-2020-16/segments/1585370496901.28/warc/CC-MAIN-20200330085157-20200330115157-00266.warc.gz
CC-MAIN-2020-16
373
2
https://twodee.org/blog/8358
code
My five-year-old son wanted to make an arrow. I one-upped him by making four arrows. He liked the shape and called it an “everywhere pointer.” We can make this with the following code:

fracture = 0.1
length = 14
width = 4
pointiness = 15
nsides = 20
shaftToHeadAngle = 110
barbAngle = 135

arrow =
  -- Right upright.
  move length
  -- Start arrowhead.
  yaw shaftToHeadAngle
  move width
  -- Head to point.
  yaw -1 * barbAngle
  move pointiness
  -- We just hit the point. Now return.
  yaw -1 * (180.0 - 2 * (barbAngle - shaftToHeadAngle))
  move pointiness
  -- Finish arrow head.
  yaw -1 * barbAngle
  move width
  -- Left upright.
  yaw shaftToHeadAngle
  move length
end

moveto 0 0 0
repeat 4
  arrow
  yaw 90
end
tube

This code makes me realize I should try to add the unary negation operator. Right now I have to verbosely multiply by -1. Finding a general solution to the angles was another good geometry exercise.
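The angle bookkeeping can be sanity-checked by replaying the move/yaw commands for one arrow in plain Python. This is a sketch, not Madeup: treating yaw as a counter-clockwise turn in degrees on a 2-D plane is my assumption about the semantics.

```python
import math

# Replay one arrow's move/yaw commands; parameter names follow the listing.
length, width, pointiness = 14, 4, 15
shaft_to_head_angle, barb_angle = 110, 135

heading = 90.0            # start pointing straight up
x, y = 0.0, 0.0
points = [(x, y)]

def move(dist):
    global x, y
    x += dist * math.cos(math.radians(heading))
    y += dist * math.sin(math.radians(heading))
    points.append((x, y))

def yaw(degrees):
    global heading
    heading += degrees

move(length)
yaw(shaft_to_head_angle)
move(width)
yaw(-1 * barb_angle)
move(pointiness)
yaw(-1 * (180.0 - 2 * (barb_angle - shaft_to_head_angle)))   # -130 here
move(pointiness)
yaw(-1 * barb_angle)
move(width)
yaw(shaft_to_head_angle)
move(length)

# Six move commands trace seven vertices, and the turns sum so that the
# turtle ends up facing opposite its starting direction (90 -> -90 degrees).
assert len(points) == 7
assert heading == -90.0
```

The return angle 180 - 2*(barbAngle - shaftToHeadAngle) is what makes the two pointiness-length edges meet at the tip: the turns around the barb must cancel so the shaft directions stay antiparallel.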
s3://commoncrawl/crawl-data/CC-MAIN-2023-23/segments/1685224644817.32/warc/CC-MAIN-20230529074001-20230529104001-00346.warc.gz
CC-MAIN-2023-23
892
4
https://blog.adafruit.com/2010/05/26/microsoft-releases-robotics-studio-free/
code
Over the past year or so, Microsoft’s robotics group has been working quietly, very quietly. That’s because, among other things, they were busy planning a significant strategy shift. Microsoft is upping the ante on its robotics ambitions by announcing today that its Robotics Developer Studio, or RDS, a big package of programming and simulation tools, is now available to anyone for free. The Microsoft RDS supports a number of hardware platforms, including the Lego Mindstorms NXT, iRobot Create and Parallax Boe-Bot, and it provides a physics-based simulation environment to allow you to test your designs. (please to note: the download is almost 500MB)
s3://commoncrawl/crawl-data/CC-MAIN-2017-39/segments/1505818689192.26/warc/CC-MAIN-20170922202048-20170922222048-00609.warc.gz
CC-MAIN-2017-39
1,238
12
https://philosophy.stackexchange.com/users/908/naeg
code
I'm currently attending a school for higher technical education, which will hopefully allow me to acquire the university entrance qualification next year so I can study at the University of Innsbruck. My programming language of choice is Python. I'm quite familiar with Scrapy and Selenium (WebDriver) and some other libraries such as PyQt4. I also like programming in declarative languages such as Prolog and LISP (which I'm learning at the moment). Autobiographer Sep 24, 2014
s3://commoncrawl/crawl-data/CC-MAIN-2023-50/segments/1700679100264.9/warc/CC-MAIN-20231201021234-20231201051234-00268.warc.gz
CC-MAIN-2023-50
472
3
http://peter.michaux.ca/articles/scheme-from-scratch-bootstrap-v0_20-io
code
Scheme from Scratch - Bootstrap v0.20 - I/O The last major missing piece of the puzzle for a bootstrap interpreter is the ability to work with files. Scheme has several input and output primitive procedures and I’m implementing the ones I think will be useful. It is a slightly larger amount to implement than some days have been, but hopefully quite straightforward for anyone who has come this far. A sample REPL session:

$ ./scheme
Welcome to Bootstrap Scheme. Use ctrl-c to exit.
> (define out (open-output-port "asdf.txt"))
ok
> (write-char #\c out)
ok
> (close-output-port out)
ok
> (load "program.scm")
program-loaded
> (error "bad move")
"bad move"
exiting

Some of these functions required refactoring in other areas of the interpreter. I refactored the C read function to handle the end of a source file properly. I refactored the C write function to take a stream parameter so that the Scheme write could write to any port. I’ll backport these changes to previous versions eventually. I’m not doing any of the with-input-from-file business. If a port is not specified as an optional parameter to write-char, etc., then C’s stdout is used. I added the error output form, which writes all of its arguments and then exits. This form is not required by R5RS but is useful. There is still a little bit to do, but we are oh so close. There is a v0.20 branch on github for this version. Have something to write? Comment on this article.
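The default-to-stdout behaviour described above is easy to mirror in any language: the write procedure takes an optional port and falls back to standard output when none is given. A minimal Python sketch of that shape (the names are mine, not the interpreter's):

```python
import io
import sys

def write_char(ch, port=None):
    # Mirrors the refactored C write: target any port, defaulting to
    # stdout when the optional port argument is left out.
    out = port if port is not None else sys.stdout
    out.write(ch)

out = io.StringIO()       # stands in for (open-output-port "asdf.txt")
write_char('c', out)      # like (write-char #\c out)
assert out.getvalue() == 'c'
```

Threading the stream through as a parameter, rather than hard-coding stdout, is the same refactor the post describes for the interpreter's C write function.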
s3://commoncrawl/crawl-data/CC-MAIN-2019-18/segments/1555578527720.37/warc/CC-MAIN-20190419121234-20190419143234-00092.warc.gz
CC-MAIN-2019-18
1,461
18
http://dba.stackexchange.com/questions/tagged/referential-integrity+relational
code
Database Administrators Relationships between subtypes of an entity I have a very big (hundred of milions rows) table which represent an entity (event) having many subtypes (different kind of recorded events). Each row is identified by an id (obj_id) and has an ... Dec 2 '13 at 15:06 newest referential-integrity relational questions feed
s3://commoncrawl/crawl-data/CC-MAIN-2014-10/segments/1394678693008/warc/CC-MAIN-20140313024453-00001-ip-10-183-142-35.ec2.internal.warc.gz
CC-MAIN-2014-10
2,180
52
https://archive.sap.com/discussions/thread/783374
code
Change Tracker Configuration Problem I am trying to configure change tracking for one of the repositories. We have deployed the WEB_UI package on the Portal WAS. We have created a JDBC connection using JDBC 1.0, and this is working fine. When I call the standalone change tracker WD application using the jdbcAlias, I get the application page, but the Table dropdown field is empty and the other fields are disabled. We are using an Oracle database. My repository name is XXX_Vendor and we are using the standard schema. Please tell me what I am missing in my configuration.
s3://commoncrawl/crawl-data/CC-MAIN-2019-04/segments/1547583826240.93/warc/CC-MAIN-20190122034213-20190122060213-00361.warc.gz
CC-MAIN-2019-04
560
8
https://library.gwu.edu/events?series=obvrsch_data&open_to=GWorld&f%5B0%5D=audience%3A4&f%5B1%5D=audience%3A9&f%5B2%5D=audience%3A10&f%5B3%5D=audience%3A13&f%5B4%5D=location%3A36&f%5B5%5D=tags%3A240
code
In this 2.5 day, hands-on workshop, participants will learn essential skills for working with genomics data. This workshop will introduce participants to basic R tasks such as reading data into R, analyzing data, and plotting data. This workshop will walk you through the R functionality you'll need to use when conducting hypothesis tests on continuous variables. This workshop will walk you through the R functionality you can use to compute correlations between continuous variables. Learn to make interactive, web-based graphs in R using the RShiny package.
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573163.7/warc/CC-MAIN-20220818033705-20220818063705-00637.warc.gz
CC-MAIN-2022-33
587
6