Dataset columns: url (string, lengths 13–4.35k) · tag (string, 1 class) · text (string, lengths 109–628k) · file_path (string, lengths 109–155) · dump (string, 96 values) · file_size_in_byte (int64, 112–630k) · line_count (int64, 1–3.76k)
https://skylerrankin.org/
code
This Jupyter Notebook steps through the process of decoding a JPEG image file. After learning to understand these files byte by byte, it's clear that a very high level of engineering and foresight was needed to achieve the stability and efficiency that allow millions of pixels created by a variety of cameras and applications to be packed into small, uniform files. However, much of the available official documentation for this process is either extremely brief or lacking real-world examples; when it comes to procedures like ZRL encoding or Huffman table parsing, it takes more than a few sentences to get all of the details necessary to implement them. As such, I've created this repository using simple-to-understand functions written in Python with custom diagrams for all of the tricky sections. Hopefully this will be useful to anyone looking for an example of how the bytes are parsed in JPEG decoders. 3D graphics algorithms have reached a point where they can be implemented efficiently on pretty much every platform, but there was once a time when vertex buffers and pixel shaders just weren't in the memory budget of the average computer. In those times, developers created some very clever methods of simulating 3D scenes and objects without actually calculating the distances normally necessary for such a visual. This small Java file implements one such method, 2D ray casting, to simulate an FPS view walking through a 3D maze. The entire file is around 160 lines with no 3D model, and I think it gives an awesome effect for so little cost.
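The post describes the 2D ray-casting trick but not the code. A rough Python sketch of the idea (the maze layout, step size, and screen dimensions here are invented for illustration, not taken from the original Java file): march one ray per screen column through a 2D grid until it hits a wall, then draw a vertical strip whose height is inversely proportional to the hit distance.

```python
import math

# Toy 8x8 maze: '1' = wall, '0' = empty (hypothetical layout).
MAZE = [
    "11111111",
    "10000001",
    "10000001",
    "10000001",
    "10000001",
    "10000001",
    "10000001",
    "11111111",
]

def cast_ray(px, py, angle, step=0.01, max_dist=20.0):
    """March along the ray from (px, py) until it enters a wall cell."""
    dx, dy = math.cos(angle), math.sin(angle)
    dist = 0.0
    while dist < max_dist:
        dist += step
        x, y = px + dx * dist, py + dy * dist
        if not (0 <= int(x) < 8 and 0 <= int(y) < 8):
            return dist                      # left the grid
        if MAZE[int(y)][int(x)] == "1":
            return dist                      # hit a wall
    return max_dist

def render_column_heights(px, py, facing, fov=math.pi / 3,
                          columns=40, screen_h=24):
    """One ray per screen column; wall strip height ~ 1 / distance."""
    heights = []
    for c in range(columns):
        ray_angle = facing - fov / 2 + fov * c / (columns - 1)
        d = cast_ray(px, py, ray_angle)
        d *= math.cos(ray_angle - facing)    # correct fisheye distortion
        heights.append(min(screen_h, int(screen_h / max(d, 1e-6))))
    return heights
```

No 3D model is involved: the entire "scene" is the 2D grid plus one distance per column, which is why the original effect fits in so few lines.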
s3://commoncrawl/crawl-data/CC-MAIN-2019-47/segments/1573496664439.7/warc/CC-MAIN-20191111214811-20191112002811-00448.warc.gz
CC-MAIN-2019-47
1,558
2
https://discussion.tekeli.li/t/the-library-documents-of-interest/1743
code
This is the thread for documents held in the Covenant's Library, and also for requests to see if certain types of books are to be found there, and for the ever-revising catalogue. The only things the librarian is sure are there at the moment are: Summae on all the Arts (Quality 9, Level 5), a matching set of basic texts. A copy of the Lab Texts for: Aegis of the Hearth Level 30, Distillation of the Purest Stream (see Covenants p. 77). EDITED TO ADD: Wizards' Communion 30. The Oath of the Covenant. The Scroll of Vis Sources (also to be found at: (ROGER: is there a better way of doing this other than my Dropbox?) (Some of the stuff in the Vis Sources is taken from COVENANTS and some isn't.)
s3://commoncrawl/crawl-data/CC-MAIN-2020-50/segments/1606141188947.19/warc/CC-MAIN-20201126200910-20201126230910-00276.warc.gz
CC-MAIN-2020-50
704
12
http://stackoverflow.com/questions/2909282/why-are-functional-languages-considered-a-boon-for-multi-threaded-environments?answertab=active
code
Higher-order functions. Consider a simple reduction operation: summing the elements of an array. In an imperative language, programmers typically write themselves a loop and perform the reduction one element at a time. But that code isn't easy to make multi-threaded. When you write a loop you're assuming an order of operations, and you have to spell out how to get from one element to the next. You'd really like to just say "sum the array" and have the compiler, or runtime, or whatever, decide how to work through the array, dividing up the task as necessary between multiple cores and combining the results together. So instead of writing a loop with some addition code embedded inside it, an alternative is to pass something representing "addition" into a function that can do the divvying. As soon as you do that, you're writing functionally: you're passing a function (addition) into another function (the reducer). Writing this way not only makes the code more readable, but when you change architecture, or want to write for a heterogeneous architecture, you don't have to change the summing code, just the reducer. In practice you might have many different algorithms that all share one reducer, so this is a big payoff. This is just a simple example, and you may want to build on it: functions to apply other functions to 2D arrays, functions to apply functions to tree structures, functions to combine functions to apply functions (e.g. if you have a hierarchical structure with trees above and arrays below), and so on.
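A concrete sketch of the "divvying" function in Python (my own illustration, not from the answer — the chunk size and the thread pool standing in for multiple cores are arbitrary choices): the caller supplies only the combining function, and the reducer decides how to split and recombine the work. Note the combiner must be associative for the split to be valid.

```python
import operator
from functools import reduce
from multiprocessing.dummy import Pool  # thread pool, stands in for "multiple cores"

def parallel_reduce(combine, chunks):
    """Reduce each chunk independently, then combine the partial results.
    Only `combine` is passed in, so changing how the work is divided
    never touches the call sites."""
    with Pool() as pool:
        partials = pool.map(lambda chunk: reduce(combine, chunk), chunks)
    return reduce(combine, partials)

data = list(range(1, 101))
chunks = [data[i:i + 25] for i in range(0, len(data), 25)]
total = parallel_reduce(operator.add, chunks)  # same result as sum(data): 5050
```

Swapping `operator.add` for `operator.mul`, `max`, or any other associative combiner reuses the same reducer unchanged, which is the payoff the answer describes.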
s3://commoncrawl/crawl-data/CC-MAIN-2016-07/segments/1454701162808.51/warc/CC-MAIN-20160205193922-00336-ip-10-236-182-209.ec2.internal.warc.gz
CC-MAIN-2016-07
1,546
3
https://www.orderful.com/edi-blog/the-biggest-time-wasters-in-every-edi-implementation/
code
EDI implementation is a time-consuming, costly, and frustrating process for nearly everyone. And what happens when your retailers’ requirements change? You’re expected to make corresponding changes to your EDI implementation. You’ll have to go back in, rewrite code, send a test transaction, get the results, tweak the code, test again, and repeat until you achieve a successful transmission. Rather than just complaining, focus on identifying and eliminating the biggest bottlenecks in setting up EDI connections.
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296950363.89/warc/CC-MAIN-20230401221921-20230402011921-00644.warc.gz
CC-MAIN-2023-14
520
2
http://www.mp3car.com/engine-management-obd-ii-engine-diagnostics-etc/150036-powering-elm327-cable-without-a-car.html
code
I have an ELM327 USB-serial cable. It powers up fine when I connect it in my car (i.e. I get a power LED turning on) and works as expected. I'm trying to do a bit of dev with it, though, and I don't want to be sitting in my car for that [I don't need to query the ECU for anything... just ELM stuff like reset, set BR, etc.]. So I tried to power the cable outside of the car, but I can't seem to get this to work... the power LED doesn't even turn on. I applied 12VDC to pin 16 and grounded pins 4 and 5. Does anyone have any advice? edit: this works fine if you know how to read a pinout and aren't an idiot. Last edited by preet; 12-19-2011 at 07:14 PM.
s3://commoncrawl/crawl-data/CC-MAIN-2015-48/segments/1448398451744.67/warc/CC-MAIN-20151124205411-00085-ip-10-71-132-137.ec2.internal.warc.gz
CC-MAIN-2015-48
682
5
https://wp-dreams.com/forums/reply/1270/
code
I only have about 10 posts, so that shouldn't be an issue. I have narrowed the problem down, however: it only seems to happen when the results are in vertical mode. When the results are set to horizontal, it works absolutely perfectly. So just to recap: when the results are set to vertical, the search form doesn't return anything for about 10 seconds after the page has loaded. FYI, the search form I am using is on the following site – http://www.pilotprize.co.uk/uk-charts/ There are two searches on there, both identical, but the top one is vertical, and the bottom is horizontal.
s3://commoncrawl/crawl-data/CC-MAIN-2022-27/segments/1656103947269.55/warc/CC-MAIN-20220701220150-20220702010150-00795.warc.gz
CC-MAIN-2022-27
590
5
https://steemit.com/academia/@borepstein/a-thousand-dollars-for-an-accounting-book
code
Well, not quite - sorry about not being fully truthful. It is, in actuality, 999 USD. And that is the price the University of Louisiana at Lafayette is charging for an accounting book. Apparently - at least so the claim goes - that is the price of the online version of the book, the print version being much cheaper. The proclaimed reason for this is to encourage students to purchase the paper version of the book and thus prevent them from printing reams of pages from the online version. And apparently the nature of the class is such that they would be forced to do so, because otherwise they won't be able to do the work they are required to do in class. Though I don't see how one wouldn't be able to read the text off a tablet screen. The reasoning sounds strange, to put it mildly. After all, even if a person decides to print the entire online book - wouldn't it only have as many pages as the print version of the book, it being the same book? And the whole idea that you get to control how adults do their reading and how many pages of paper they use to print their reading materials - isn't that a little overly controlling and infantilizing? Be that as it may, one way in which I think this is useful is that if you choose to pay this much for a book containing something as standard and widely available as accounting-related knowledge, then perhaps finance is not really your strong suit and you should major in something else. No offense, but if that is how much you know about money and the relative worth of things at 18 - well, you need to learn a bit more before you can be useful to others as a financial services professional. Insane College Charges $999 For Online Textbook (video), Economic Invincibility, 4 September 2018. UL Lafayette charges $999 for online textbook. Here's why, Leigh Guidry, Lafayette Daily Advertiser, 27 August 2018.
s3://commoncrawl/crawl-data/CC-MAIN-2018-47/segments/1542039743046.43/warc/CC-MAIN-20181116132301-20181116153623-00051.warc.gz
CC-MAIN-2018-47
1,846
8
http://www.experts123.com/q/some-decimal-values-always-print-with-exponential-notation-is-there-a-way-to-get-a-non-exponential-representation.html
code
Some decimal values always print with exponential notation. Is there a way to get a non-exponential representation? A. For some values, exponential notation is the only way to express the number of significant places in the coefficient. For example, expressing 5.0E+3 as 5000 keeps the value constant but cannot show the original's two-place significance. If an application does not care about tracking significance, it is easy to remove the exponent and trailing zeroes, losing significance, but keeping the value unchanged: >>> def remove_exponent(d): ... return d.quantize(Decimal(1)) if d == d.to_integral() else d.normalize()
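A minimal self-contained version of the snippet above, with the import it needs and a couple of sample values (the sample values are my addition):

```python
from decimal import Decimal

def remove_exponent(d):
    """Drop the exponent (and trailing zeroes) while keeping the value.
    Integral values are quantized to exponent 0; others are normalized."""
    return d.quantize(Decimal(1)) if d == d.to_integral() else d.normalize()

print(remove_exponent(Decimal("5E+3")))   # 5000
print(remove_exponent(Decimal("1.5")))    # 1.5
```

The `d == d.to_integral()` test distinguishes the two cases: an integral value like 5E+3 can be rewritten with exponent zero, while a fractional value is just normalized to strip trailing zeroes.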
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703514423.60/warc/CC-MAIN-20210118061434-20210118091434-00522.warc.gz
CC-MAIN-2021-04
958
5
https://mudcat.org/detail.cfm?messages__Message_ID=48208
code
Joe, thanks for the response. I tried to understand the info on the thread link you gave me, but I left most of my "formal" musical skills in high school. The stuff we all (I assume) play is FOLK MUSIC, much of which was communicated through the oral tradition. Even though I have been playing for most of my life I can't seem to get the hang of this ABC XYZ MIDI stuff. Are they the same? What in the world is ASCII? Every time someone gives an "explanation" it's like reading the Microsoft Help pages, where they use the word being defined in the explanation. Does anyone out there know of a MIDI or ABC or whatever site for dummies?
s3://commoncrawl/crawl-data/CC-MAIN-2021-39/segments/1631780056856.4/warc/CC-MAIN-20210919095911-20210919125911-00587.warc.gz
CC-MAIN-2021-39
631
1
http://archive.linuxfromscratch.org/mail-archives/cross-lfs/2005-November/000250.html
code
Booting into x86_64 ken at linuxfromscratch.org Thu Nov 17 12:11:17 PST 2005 (adding the list back as a Cc) On Thu, 17 Nov 2005, jstipins at umich.edu wrote: > Quoting Ken Moffat <ken at linuxfromscratch.org>: >> Just like vanilla LFS, I'm guessing there is something unfortunate in your >> .config. This is probably your first attempt to run x86_64, and therefore >> you are now ready to build chapter 9 (tcl and so forth) ? (If you'd already >> finished the system, you could revert to the 64-bit .config you used when >> you were building it) >> Are you following the 'boot' or 'chroot' option ? If you boot, you >> should point the init= to wherever you installed the /tools files. If you >> chroot, you should point init= to the (32-bit) system containing /sbin/init >> (and also make sure the kernel can emulate 32-bit in the .config). > Hi, thanks for the responses. > It is in fact my first attempt to run x86_64. My current kernel cannot > run 64-bit executables, so my understanding is that my only option is to x86_64, powerpc64, and maybe other arches, will let a 64-bit kernel run a 32-bit userspace, and successfully chroot to a 64-bit system. But, trying to install modules in that situation is not recommended. However, if you've built everything necessary to boot, you should be good to go. > I agree with you that GRUB has done its job by the time I run > into trouble. It looks like there's some problem with the device driver > for my Maxtor 6L200M0, which is a SATA drive. > What I'm especially confused about is that the failure mode changes, > depending on whether or not the kernel is compiled with support for 32-bit > executables. I haven't done enough experimenting to be 100% certain of > this, but it seems like what is happening is as follows: > 1. With 32-bit support, I get the register dump that contains some "scsi_..." > symbols (don't have them handy right now, sorry). Odd, but x86_64 is still new enough to have a few strangenesses. 
If this is indeed an oops, all bets are off for what happens afterwards. > 2. Without 32-bit support, there is no register dump; rather, I just get > an error from VFS that it is unable to mount the root device. In other words, either you haven't included the necessary driver, or it hasn't been loaded by modprobe. > Does any of this make sense? That is, is there any reason why support for > 32-bit executables should have any effect on the loading of a device driver? No obvious reason. My advice is to change your config. Go through menuconfig, build in all the filesystems you use, your network card, and all the DMA and SATA options that might be relevant to your hardware. Take out anything that looks unnecessary. Also, take out support for 32-bit apps temporarily. Build in the correct SATA drivers for your chipset under the SCSI drivers. Build, see if it boots, repeat. When it boots, add in the 32-bit option. If it then oopses, you've identified a bug. The important thing is to reduce the variables in the problem to a manageable level. > Right now, I'm not sure if the problem is 64-bit based, or kernel 184.108.40.206 > based. (My current working LFS distribution is running 220.127.116.11.) I can't > seem to build kernel 18.104.22.168 with my current toolchain, so I can't easily > test whether the new kernel would cause a problem in pure 32-bit mode. It > occurred to me that I could try updating to the "development" LFS that uses > 22.214.171.124, and see if I can boot that (again, all 32-bit). Unfortunately, I > get errors in "make check" on glibc-2.3.6 and gcc-4.0.2, so I abandoned that I worry about your 32-bit toolchain, or the .config, but that is getting O/T for cross-lfs. Remember, the kernel config options for x86_64 and i686 are very different, so problems in one don't necessarily appear in the other. For me, 126.96.36.199 has been reliable on my limited range of x86_64 hardware. 
OTOH, building non-modular 64-bit kernels for newer versions has been problematic (a .config problem, they take forever to boot - the modular versions are absolutely fine). If you have excessively new hardware, you might require a newer kernel (e.g. for PCI-E), but that is probably also true for 32-bit. Otherwise a newer kernel just throws more variables into the equation (once you identify a kernel problem, obviously you need to try newer kernels to see if it has been fixed). > I have a huge hard drive, and lots of time... any suggestions for plans of If you are sure what your hardware contains (lspci?), slimming down the kernel config is probably the quickest method: typically, 3 or 4 recompiles per hour, more on fast hardware. You could, at a pinch, install a 64-bit distro onto a spare partition and see which modules it loads (plus, fixing up grub from a rescue disk if the current options disappear), but that will take significant time. -- "the first time as tragedy, the second time as farce"
s3://commoncrawl/crawl-data/CC-MAIN-2019-13/segments/1552912202450.64/warc/CC-MAIN-20190320170159-20190320192159-00094.warc.gz
CC-MAIN-2019-13
4,933
78
https://lists.debian.org/debian-boot/2001/06/msg00059.html
code
Re: where are uptodate boot disk images for woody On Sat, Jun 02, 2001 at 01:44:05PM +1000, Anthony Towns wrote: > On Tue, May 29, 2001 at 07:44:03PM -0400, Adam Di Carlo wrote: > > Current 2.3.4 i386 boot-floppies for woody testing (qualified testers > > only please) are available at > > http://people.debian.org/~aph/debian/dists/woody/main/disks-i386/current/ > > Good luck. Results to debian-boot or against the boot-floppies (or > > base-config, or debootstrap, or whatever package) please. > I haven't seen any real success or failure reports from these... > Are we at the point where occasional lucky people can actually do woody > installs yet? No. I tried the 2.3.4 set earlier today, and a BusyBox bug prevented the permissions from being set correctly. Please see http://bugs.debian.org/99627 for a description of the problem and a fix. We'll need another BusyBox revision. Erik, may I suggest that we include the fix to this problem *only* to prevent further Also, the af_packet.o module was not included on the compact floppies root disk, so I still had to use the driver floppies. And both panes of the timezone configuration form showed the selections in blue, making it difficult to tell which pane was focused. I also noticed some errors during debootstrap's run, but they scrolled by pretty quickly (exim had some kind of problem opening a logfile, for instance). I'm currently downloading the 42 megabytes of packages required to build the boot-floppies over a 14.4k modem, and then I'll start tracking these down.
s3://commoncrawl/crawl-data/CC-MAIN-2019-43/segments/1570986696339.42/warc/CC-MAIN-20191019141654-20191019165154-00461.warc.gz
CC-MAIN-2019-43
1,533
26
http://askubuntu.com/questions/194459/12-04-i-need-a-new-network-card-and-i-think-this-one-may-work
code
I found this list of compatible USB wireless network adaptors: Linux USB Wireless Compatibility Adapter List. Listed here is the Belkin F5D8053. I'm looking for a local store to buy a suitable adaptor from, and I can't find that particular adaptor - but I can see a Belkin F7D4101az. I think this is simply a newer model - but a Google search for "ubuntu belkin f7d4101az" brings no results - not even people trying to get it to work. Nothing. So I didn't want to purchase it without being sure. There's really only one PC shop that sells this sort of stuff around here - so here's a list of all my available options: Criteria: I'd really like to get one today, because I have a lot of work to do and don't really want to have to wait a week for a new adaptor - I spent 6 or 7 hours last night trying to get my Dell 1450 to work with no luck. Any help would be much appreciated.
s3://commoncrawl/crawl-data/CC-MAIN-2016-26/segments/1466783408828.55/warc/CC-MAIN-20160624155008-00069-ip-10-164-35-72.ec2.internal.warc.gz
CC-MAIN-2016-26
875
6
http://www.webassist.com/forums/post.php?pid=183615
code
"Alternatively, the login saves the User ID to a session, so you could create a recordset that filters the ID column of the users table on the user ID session that is saved, to retrieve the image name." Can you tell me the steps to go about this and what pages to do it on? I noticed if I register and choose a profile image, then log in, the picture does not load. It was working, but now all of a sudden it just doesn't work. After I register or insert a user, when I log in with the account and then go to userLanding.php, all I get is a blank question mark where the picture was supposed to be. I know I am missing something somewhere, but I've followed all of your steps to my knowledge. Here are the pages, if this helps. I appreciate it very much!
s3://commoncrawl/crawl-data/CC-MAIN-2019-04/segments/1547583662124.0/warc/CC-MAIN-20190119034320-20190119060320-00639.warc.gz
CC-MAIN-2019-04
749
7
https://www.medindia.net/news/theranostic-nanoparticles-aid-in-early-detection-of-diseases-174585-1.htm
code
Theranostic nanoparticles aid in early detection and therapy of diseases, and can also identify molecular markers, reveals a new study. A "theranostic" nanoparticle is a nanoparticle that simply has a therapeutic moiety and an imaging or diagnostic moiety on the same particle. The authors of a new SLAS Technology review article pay particular attention to and emphasize the platforms in which self-reporting and disease monitoring are possible in real time through the synergistic nature of the components on the theranostic particles. The evolving nature of the field toward such responsive and "smart" theranostic nanoparticles means they can be used as tools for life sciences researchers, especially in the context of identifying markers and characterizing cells and diseases over the course of their lifetimes. Many clinical imaging technologies have limitations in resolution when detecting small quantities of molecular markers, but theranostic nanoparticles can be used in combination to provide early detection and therapy of diseases, and have the potential to advance imaging platforms for improved performance.
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320302706.62/warc/CC-MAIN-20220120220649-20220121010649-00273.warc.gz
CC-MAIN-2022-05
1,105
5
https://stackoverflow.com/users/3928341/nem035
code
Some of my answers: - TypeError in Node REPL: Cannot read property '0' of undefined when I add a property to Object.prototype - setTimeout, jQuery operation, transitionend executed/fired randomly - Tail recursion in NodeJS - React.js - Syntax error: this is a reserved word in render() function - Get animation status in microseconds - ES6 Map: Why/When to use an object or a function as a key? - Compare Ways of Creating Objects in JS - jQuery validation - syntax error - How do you best convert a recursive function to an iterative one?
s3://commoncrawl/crawl-data/CC-MAIN-2018-39/segments/1537267156460.64/warc/CC-MAIN-20180920101233-20180920121633-00528.warc.gz
CC-MAIN-2018-39
558
11
http://enbseries.enbdev.com/forum/viewtopic.php?f=22&p=75576&sid=4abc73b5fd5e4cf1585c7e82fef3f10a
code
Sorry, but it's strange even to think that the problem is in the mod if all previous versions of the driver were fine. The bug must be fixed by those who made it, not worked around. I get where you're coming from, and in this case I can agree in the short term that it's an Nvidia problem. But for the most part, as far as development goes, shouldn't both the developers of the drivers and the developers of software reliant on those drivers attempt to develop at an equal pace, rather than one saying "I'm staying where I am, you can go off in that direction for all I care"? So I'm sorry, but I don't feel the question I asked is strange in the slightest. Especially if Nvidia has no plans to go back in future updates to "what works", as the changes they've made are instrumental to something else. And as I said in my first post, previous versions are not "fine"; there are still issues, so clearly there's something more underlying than just a botched driver update.
s3://commoncrawl/crawl-data/CC-MAIN-2017-17/segments/1492917121893.62/warc/CC-MAIN-20170423031201-00061-ip-10-145-167-34.ec2.internal.warc.gz
CC-MAIN-2017-17
961
5
https://voiceofthedba.com/2019/03/25/sql-source-control-and-bitbucket-getting-started/
code
This is in response to someone asking about getting started with their database in SQL Source Control and then hosting at BitBucket. I'm going to assume people can set up an account at Bitbucket and won't cover that. Instead, this is part of a series that looks at getting a CI build working. I'm going to start showing how to get a git repo set up locally and connect a database to it with SQL Source Control in this post. I'll then connect that to Bitbucket and move changes around. In a future post, we'll see a CI build taking place from our Bitbucket repo. Linking SQL Source Control I've got a test database set up on a SQL Server 2017 instance, called SQLSourceControlPoC. The script for that database is here: SOCPoC_Create.sql. The first step is to get a repository set up. You need to install git, which is easy. You can get a client tool, like SourceTree or Github Desktop for Windows, but I like the command line and I'll use that. On Windows machines, under your user account, you have a Source, and then a Repos set of folders. I'll change to those in the command line. I'll then create a folder, called "SQLSourceControlPoC". I find it easy to keep the folder name the same as my project, which in this case is the database. I'll create a folder under this to store my actual code. Now I'll set up a git repo, which I do with "git init" in the folder. That's it. Now let's start with SQL Source Control. I'm going to assume you have SQL Source Control installed already. If you don't have that, download an eval and run the setup. From there, I can right-click on the database name and select "Link database to source control". This will open the SQL Source Control tab in SSMS. This shows my database name at the top, and since this database hasn't been linked, we'll start with the wizard for linking. We set up a git repo already, so we'll leave the top item checked. We click Next and get a dialog that asks which VCS we're using and where our repo is. 
I'll select git and browse to the location where I created the repo. Note, I've specified the subdirectory. I like a subdirectory as this allows me to place other code in the repo if I need it (like notes, readme, etc.) and keep the database code from SQL Source Control clean. When I click link, I get a progress bar. When that finishes, I get the Setup tab, which shows me the configuration and gives me some options. We can ignore this for now and select the "Commit" tab instead (top left). This will switch to that tab and look for changes in the database that aren't stored in our version control system. Here's our current VCS view: We only have our SQL Source Control file. There are many more objects in the Commit tab, as we see below. SQL Source Control assumes the database is the source of truth here and tries to capture all changes. There's a lot to see here, but we won't dive into what's here. There are other articles and posts on this. For this article, I'll enter a commit message and click "Commit". Once I do that, my VCS view changes. SQL Source Control has created folders for my objects, with a separate file for each object below the folder. For example, the Tables folder looks like this: The contents of the dbo.Blogs.sql file are shown here in Azure Data Studio. And my git status: We've gotten our code into a git repo, now let's move on. Connecting to BitBucket I'm going to assume you have a Bitbucket account. If not, go do that. When you do, click your Repositories menu item, and you will get a list of repos. I have two already. In the mid left, there's a plus (+) sign. Click that to get the add dialog. Pick repository and then you'll enter some data. I chose a name that's the same as my local repo to ensure some easy tracking. I made this public, so anyone can download my repo if they want to play with the code. Note: I'm not likely to accept any PRs. Once I click Create repository, I get a welcome screen. 
In this case, I get some instructions, and the important ones are for moving my local git repo here. Let's do that. I'll go back to my command line and enter the git remote command (from above) and then the git push. Note the authentication popup. That failed for me, but when I went back to the command line, I entered my password again and it worked. Going back to Bitbucket and refreshing the Source tab, I see code. Right now I have code in a SQL Server database. This is linked to a local git repo on my desktop, which is linked to a remote git repo at Bitbucket. One last thing is to make a change on the local database and get that to Bitbucket. Let's do that. I'll enter this code in SSMS: I execute this and I have a proc in my database, but this isn't in git. If I go to the Commit tab in SQL Source Control, I see one change. I'll check this in the lower window to verify the code, select the item in the middle, and enter a commit message. Once the commit completes, the change is in my local repo, but not in Bitbucket. However, SQL Source Control gives me a "Push" button. If I click this, a git push will execute. Note: I had some authentication issues here. The push may or may not work, depending on how you have authorization set up for Bitbucket. I had to enter a username and password, which sometimes worked, sometimes didn't. Performing a "git push" from the command line worked. In Bitbucket, I now see my procedure. This is a quick look at how to get my database code into Bitbucket via SQL Source Control and git. This should help you begin to understand how to start enabling database development to follow what application developers do. I'll work on getting a CI build in my next post. If you want to see a particular CI system, let me know.
s3://commoncrawl/crawl-data/CC-MAIN-2021-17/segments/1618038072366.31/warc/CC-MAIN-20210413122252-20210413152252-00078.warc.gz
CC-MAIN-2021-17
5,786
39
https://community.nodebb.org/topic/15229/error-can-t-send-mail-all-recipients-were-rejected-501-invalid-rcpt-to-address-provided-at-smtpconnection/1
code
Setting it to more than 2 hours after the original time seemed to work.

Error: Can't send mail - all recipients were rejected: 501 Invalid RCPT TO address provided at SMTPConnection

Varun Ganesh D: I selected all unverified users in the Users section and clicked on send validation email. In the log, I'm getting this:

2020-12-30T11:20:34.940Z [4567/20005] - error: Error: Can't send mail - all recipients were rejected: 501 Invalid RCPT TO address provided
at SMTPConnection._formatError (/root/nodebb/node_modules/nodemailer/lib/smtp-connection/index.js:784:19)
at SMTPConnection._actionRCPT (/root/nodebb/node_modules/nodemailer/lib/smtp-connection/index.js:1613:28)
at SMTPConnection.<anonymous> (/root/nodebb/node_modules/nodemailer/lib/smtp-connection/index.js:1574:26)
at SMTPConnection._processResponse (/root/nodebb/node_modules/nodemailer/lib/smtp-connection/index.js:942:20)
at SMTPConnection._onData (/root/nodebb/node_modules/nodemailer/lib/smtp-connection/index.js:749:14)
at TLSSocket.SMTPConnection._onSocketData (/root/nodebb/node_modules/nodemailer/lib/smtp-connection/index.js:195:44)
at TLSSocket.emit (events.js:315:20)
at addChunk (_stream_readable.js:295:12)
at readableAddChunk (_stream_readable.js:271:9)
at TLSSocket.Readable.push (_stream_readable.js:212:10)

@varun-ganesh-d that means that for at least one of the unverified users, they provided an email that doesn't exist.
s3://commoncrawl/crawl-data/CC-MAIN-2021-21/segments/1620243988759.29/warc/CC-MAIN-20210506175146-20210506205146-00104.warc.gz
CC-MAIN-2021-21
1,425
7
https://foss-backstage.de/member/isabel-drost-fromm-0
code
Isabel Drost-Fromm is Open Source Strategist at Europace AG Germany. She's a member of the Apache Software Foundation, co-founder of Apache Mahout, and has mentored several incubating projects. Isabel is interested in all things FOSS, search, and text mining, with a decent machine learning background. True to the nature of people living in Berlin, she loves having friends fly in for a brief visit - as a result she co-founded and is still one of the creative heads behind Berlin Buzzwords, a tech conference on all things search, scale and storage.
https://vivienlemoine.com/files/resume.html
Led a team of designers in creating user experiences and interfaces for proprietary healthcare platforms. Facilitated an effort to develop an enterprise visual design system for multiple products. Conduct research and interviews with clients to prioritize the product roadmap.

Consulted on and designed the user experience and interface for a proprietary learning application. Supported developers troubleshooting and refining the interface and interactions. Render product features and concepts into user flows and working coded prototypes.

Developed user interface features for a proprietary learning application. Consulted internal stakeholders and clients on design and usability decisions. Built reusable components to improve efficiency, consistency, and scalability of implementations. Responsible for ensuring the interface met WCAG 2.1 accessibility standards and supported multiple devices, especially touch/mobile.

Designed and developed branded user interfaces for a talent management system. Conducted user experience workshops for clients, both on-site and remote. Research software requirements and develop prototypes to determine project feasibility. Develop new features and processes to improve efficiency and scalability of projects. Designed and developed custom user interfaces for a talent management system.

- Accessibility (WCAG 2.1)
- Agile (Scrum, Kanban)
- Experience Design
- Interface Design
- Interaction Design
- Product Design
- Responsive Web Design
- Usability Testing
- User Research
- Adobe Creative Suite
- CSS & Sass (SCSS)
- Component frameworks (Bootstrap, Material)
- Front-end frameworks (Angular, Vue.js)
- Prototyping tools (InVision, Marvel)
- Software version control (Git, GitHub)

Web Design & Interactive Media
Art Institute of Tampa, Branch of Miami International University of Art & Design
Associate Degree | Completed 2012
http://cpa.monetizador.net/lib/algorithms-for-static-and-dynamic-multiplicative
By Simo J.C.

A formulation and algorithmic treatment of static and dynamic plasticity at finite strains based on the multiplicative decomposition is presented which inherits all the features of the classical models of infinitesimal plasticity. The key computational implication is this: the closest-point-projection algorithm of any classical single-surface or multi-surface model of infinitesimal plasticity carries over to the present finite deformation context without modification. In particular, the algorithmic elastoplastic tangent moduli of the infinitesimal theory remain unchanged. For the static problem, the proposed class of algorithms preserves exactly plastic volume changes if the yield criterion is pressure insensitive. For the dynamic problem, a class of time-stepping algorithms is presented which inherits exactly the conservation laws of total linear and angular momentum. The actual performance of the approach is illustrated in a number of representative large scale static and dynamic simulations.

Read or Download Algorithms for static and dynamic multiplicative PDF

Best algorithms and data structures books

String matching is an important subject in the wider domain of text processing. It consists of finding one, or more generally all, the occurrences of a string (more usually called a pattern) in a text. The Handbook of Exact String Matching Algorithms presents 38 methods for solving this problem.

We propose a cascadic multigrid algorithm for a semilinear elliptic problem. The nonlinear equations arising from linear finite element discretizations are solved by Newton's method. Given an approximate solution on the coarsest grid, on each finer grid we perform exactly one Newton step, taking the approximate solution from the previous grid as initial guess.

You can catch up with the latest developments in the number 1, fastest-growing programming language in the world with this fully updated Schaum's guide. Schaum's Outline of Data Structures with Java has been revised to reflect all recent advances and changes in the language.

Organization of data warehouses is an important, but often neglected, aspect of growing an enterprise. Unlike most books on the subject that focus either on the technical aspects of building data warehouses or on business strategies, this valuable reference synthesizes technology with managerial best practices to show how improved alignment between data warehouse plans and business strategies can lead to successful data warehouse adoption capable of supporting an enterprise's entire infrastructure.

- Windsock Datafile Special No. Jagdstaffel 5 Volume Two
- Structural Complexity II
- Problems on algorithms
- Algorithms for Approximation A Iske J Levesley
- Using Neural Networks And Genetic Algorithms To Predict Stock Market Returns

Additional info for Algorithms for static and dynamic multiplicative

1 Introduction. A star graph is a tree of diameter at most two. Equivalently, a star graph consists of a vertex designated the center along with a set of leaves adjacent to it. In particular, a singleton vertex is a star as well. Given an undirected graph, a spanning star forest consists of a set of node-disjoint stars that cover all the nodes in the graph. In the spanning star forest problem, the objective is to maximize the number of edges (or equivalently, leaves) present in the forest. A dominating set of a graph is a subset of the vertices such that every other vertex is adjacent to a vertex in the dominating set. The approximation gap for the metric facility location problem is not yet closed. Operations Research Letters (ORL) 35(3), 379–384 (2007) 4. : Improved combinatorial algorithms for facility location and k-median problems. In: Proc. of the 40th IEEE Symp. on Foundations of Computer Science (FOCS), pp. 378–388. IEEE Computer Society Press, Los Alamitos (1999) 5. : Improved approximation algorithms for uncapacitated facility location. In: Proc. of the 6th Integer Programing and Combinatorial Optimization (IPCO), pp.

Computing the maximum spanning star forest of a graph is NP-hard because computing the minimum dominating set is NP-hard. The spanning star forest problem has found applications in computational biology. Nguyen et al. use the spanning star forest problem to give an algorithm for the problem of aligning multiple genomic sequences, which is a basic bioinformatics task in comparative genomics. The spanning star forest problem and its directed version have found applications in the comparison of phylogenetic trees and the diversity problem in the automobile industry.
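The definitions above can be made concrete with a small sketch that builds a spanning star forest greedily, repeatedly taking a maximum-degree uncovered vertex as a star center. This is a simple illustrative heuristic with an ad-hoc graph representation, not the best known approximation algorithm for the problem:

```python
# Toy illustration: greedily build a spanning star forest of an undirected
# graph by picking the vertex with the most uncovered neighbours as the next
# star center. Vertices left with no uncovered neighbours become singleton
# stars, which the definition above explicitly allows.

def spanning_star_forest(adj):
    """adj: dict vertex -> set of neighbours. Returns a list of (center, leaves)."""
    uncovered = set(adj)
    stars = []
    while uncovered:
        # Pick the uncovered vertex with the most uncovered neighbours.
        center = max(uncovered, key=lambda v: len(adj[v] & uncovered))
        leaves = (adj[center] & uncovered) - {center}
        stars.append((center, leaves))
        uncovered -= leaves | {center}
    return stars

# A path-like graph: 2-1-3 plus the chain 1-4-5.
adj = {1: {2, 3, 4}, 2: {1}, 3: {1}, 4: {1, 5}, 5: {4}}
stars = spanning_star_forest(adj)
print(sum(len(leaves) for _, leaves in stars))  # edges in the forest: 3
```

The objective value is the number of leaves (edges) over all stars; the stars are node-disjoint and cover every vertex.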
https://list.hum.uu.nl/pipermail/elsnet-list/2006-March/001272.html
[Elsnet-list] UITP'06: Second Call for Papers
event at in.tu-clausthal.de
Tue Mar 21 11:35:03 CET 2006

For info on unsubscription, see the info at the bottom of this message. We apologize for multiple copies you could receive.

[Apologies if you receive multiple copies]

CALL FOR PAPERS
User Interfaces for Theorem Provers, UITP 2006
A satellite workshop of FLoC'06
Seattle, USA, Monday August 21st 2006

The User Interfaces for Theorem Provers workshop series brings together researchers interested in designing, developing and evaluating interfaces for interactive proof systems, such as theorem provers, formal method tools, and other tools manipulating and presenting mathematical formulas.

While the reasoning capabilities of interactive proof systems have increased dramatically over the last years, the system interfaces have often not enjoyed the same attention as the proof engines themselves. In many cases, interfaces remain relatively basic and under-designed. Initial studies by HCI (Human-Computer Interaction) practitioners and theorem-prover developers working in collaboration have had promising early results, but much remains to be investigated.

The User Interfaces for Theorem Provers workshop series provides a forum for researchers interested in improving human interaction with proof systems. We welcome participation and contributions from the theorem proving, formal methods and tools, and HCI communities, both to report on experience with existing systems and to discuss new ideas.

UITP 2006 is a one-day workshop to be held on Monday, August 21st 2006 in Seattle, USA, as a FLoC'06 workshop. We encourage submission of short abstracts or papers (from 4--20 pages). Submissions will be reviewed by the program committee. We will invite authors of accepted submissions to talk at the workshop (slots of 20--30 minutes are expected). Submissions presented at the workshop will be included in informal proceedings to be distributed at the workshop and made available electronically afterward.

Suggested topics include, but are not restricted to:
* Novel and traditional interfaces for interactive proof systems
  - command line based user interfaces
  - graphical user interfaces
  - natural language based user interfaces
* Bridging the gap between human-oriented and machine-oriented proof
* Design principles for interfaces
* Representation languages for proofs and mathematical objects
* Tools for exploration, visualization and explanation of mathematical objects and proofs
* User-evaluation of interfaces
* Integration of proof systems into e-learning environments
* Web-based services for proof systems
* Implementation experiences
* System descriptions

Authors are encouraged to bring along versions of their systems suitable for informal demonstration during breaks in the program of the workshop.

The workshop proceedings will be distributed at the workshop as a collection of the accepted papers. Final versions of accepted papers have to be prepared with LaTeX. Following up the workshop, the (revised) accepted papers will be published in a volume of ENTCS devoted to the workshop.

Deadline for submissions: May 15th 2006
Notification: June 20th 2006
Final versions due: July 10th 2006
Workshop: August 21st 2006

Submission is via EasyChair (thanks to Andrei Voronkov).

More information can be found on the UITP web page.

Programme Committee:
David Aspinall (University of Edinburgh, UK)
Yves Bertot (INRIA Sophia Antipolis, France)
Paul Cairns (University College London, UK)
Ewen Denney (NASA Ames Research Center, USA)
Christoph Lüth (University of Bremen, Germany)
Michael Norrish (NICTA, Australia)
Florina Piroi (RISC Linz, Austria)
Aarne Ranta (Chalmers University, Sweden)
Makarius M. M. Wenzel (Technical University Munich, Germany)

Organizers and PC Chairs:
Serge Autexier (DFKI, Germany)
Christoph Benzmüller (Saarland University, Germany)

Christoph Benzmueller, Saarland University, www.ags.uni-sb.de/~chris

This e-mail was delivered to you by event at in.tu-clausthal.de, which is a moderated list run by the Computational Intelligence Group of Clausthal University of Technology, Germany. All event announcements sent through this list are also listed in our conference planner at http://cig.in.tu-clausthal.de/index.php?id=planner. In the case of any requests, questions, or comments, do not hesitate to contact event-owner at in.tu-clausthal.de ASAP. If you want to unsubscribe from this list, please visit http://www2.in.tu-clausthal.de/mailman/listinfo/event, or send an e-mail to event at owner.in.tu-clausthal.de.

* CIG does not take any responsibility for validity *
* of content of messages sent through this list.    *

Computational Intelligence Group
Department of Computer Science
Clausthal University of Technology

More information about the Elsnet-list
https://community.developer.atlassian.com/t/seeking-clarification-around-act-as-user-scope-and-related-actions/71724
I’m trying to understand when the ACT_AS_USER scope is actually needed. Take creating issues, for example: according to the documentation, only the WRITE scope is required for Connect apps. Without user impersonation, on whose behalf is the issue created? Would it be the Connect app itself? And if that is correct, I would assume that in order to create/update/delete any entity on a specific user’s behalf, we would have to leverage the ACT_AS_USER scope. Is that also correct?

AFAIK, the answer to both your questions is YES. Your app can do various operations, based on the permissions it has. So if you have the WRITE permission, you will be able to create an issue. Now, if some user that has no permission to create issues (in certain projects, for example) uses your app, then that user will create issues there, and in the issue it will state that it was your app that created it. Sounds like a kind of workaround for Jira's permission system. If you use ACT_AS_USER, the Jira API will check all required permissions and will block operations (like issue creation) if the specific user has no rights to perform them. On the other hand, there can be operations (for example this one) where your app requires ACT_AS_USER to perform them. More info here: User impersonation for Connect apps (atlassian.com)
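The permission rule described in the answer can be sketched as a toy model. This is purely illustrative: the function, record shapes, and the CREATE_ISSUES string are hypothetical, not Atlassian's API; only the WRITE / ACT_AS_USER scope names come from the thread. The point it encodes is that without impersonation only the app's own scope matters, while with ACT_AS_USER Jira also enforces the impersonated user's permissions:

```python
# Toy model of the scope check discussed above, not Atlassian's API.
def can_create_issue(app_scopes, impersonate_user=None):
    """Return (allowed, acting_as) for an issue-creation attempt."""
    if impersonate_user is None:
        # App acts as itself: only the app's WRITE scope matters.
        return ("WRITE" in app_scopes, "app")
    # Impersonation: the app needs ACT_AS_USER, and the user needs the
    # project permission to create issues (hypothetical permission name).
    allowed = ("ACT_AS_USER" in app_scopes
               and "CREATE_ISSUES" in impersonate_user["permissions"])
    return (allowed, impersonate_user["name"])

app = {"scopes": {"WRITE", "ACT_AS_USER"}}
alice = {"name": "alice", "permissions": set()}  # no create permission
print(can_create_issue(app["scopes"]))                          # (True, 'app')
print(can_create_issue(app["scopes"], impersonate_user=alice))  # (False, 'alice')
```

This mirrors the "workaround" observation: the app-level call succeeds even though alice herself could not create the issue.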
https://www.meetup.com/LA-PureScript/events/259245304/
Please join us for a weekend of Haskell/PureScript lectures and hacking in the spirit of BayHac (but in LA!). We are looking for speakers to do 30-45 minute presentations on any topics related to Haskell and PureScript. The dates and venue are the same as last year's event, but this time with a bigger focus on Haskell and PureScript project demonstrations and time for collective hacking on them.

Important to know: parking will be available. Lunch will be offered to participants for free. Registration required. Seats are limited. Please only RSVP if you are certain you can make it.

We are looking for open source maintainers who would like to present their Haskell/PureScript projects for contributions during our open hacking sessions. To propose a talk, submit your proposal here: https://docs.google.com/forms/d/e/1FAIpQLSfxroAXDfjO8IQJjpU2A9YWhcu5qboKVIzn37xcvHF5dEcopg/viewform

The latest info will always be posted on https://coday.today

Contact us at: [masked]
http://inwavetech.com/site/datalogger.html
Serial web converter with 1 Mb EEPROM memory and a real-time clock. Converts the Inwave TTL serial protocol to TCP/IP. Saves events in local memory in case the server is offline. Synchronizes with the server clock for timestamping buffered events.
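The buffering behaviour described above (server-synchronized timestamps, local queueing while offline, flush on reconnect) can be sketched as follows. Class and method names are hypothetical, not Inwave's firmware API:

```python
import collections

# Hedged sketch of the datalogger behaviour: events are stamped with a clock
# kept in sync with the server and queued locally while the server is
# unreachable, then flushed on reconnect. Names are illustrative only.
class EventBuffer:
    def __init__(self, capacity=1024):
        # Bounded queue: once full, the oldest event is dropped first.
        self.queue = collections.deque(maxlen=capacity)
        self.clock_offset = 0.0  # server time minus local time, in seconds

    def sync_clock(self, server_time, local_time):
        self.clock_offset = server_time - local_time

    def record(self, event, local_time):
        # Timestamp with the server-synchronized clock, even while offline.
        self.queue.append((local_time + self.clock_offset, event))

    def flush(self, send):
        # Drain buffered events, oldest first, once the server is back.
        while self.queue:
            send(self.queue.popleft())

buf = EventBuffer()
buf.sync_clock(server_time=1000.0, local_time=990.0)  # offset = +10 s
buf.record("door_open", local_time=991.0)
sent = []
buf.flush(sent.append)
print(sent)  # [(1001.0, 'door_open')]
```

The offset-based stamping is why buffered events carry server-consistent timestamps even when they were recorded during an outage.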
https://www.sommarskog.se/AbaPerls/doc/history.html
This page lists all changes to the AbaPerls tools and documentation from the first public release L1.0.0009 and on. (OK, not all changes, but the level of detail is fairly high.) The horizontal lines divide external public releases from each other. For a compact history of the external releases, see this page.

The eleventh public release of AbaPerls.

AbaPerls now supports temporal tables. See the new manual page for temporal tables for details. Beware that to use temporal tables, the database must be upgraded to label L1.0.0400 of the ABAPERLS subsystem. There are also some changes related to indexes.

A couple of fixes/changes:

IF rowcount_big() = 0 RETURN first in DML triggers. See the file-load page for more details.

The ‑force option. This is only possible with ABASQL, and only in databases labelled as development or test databases.

AbaPerls did not handle the syntax for the trim() function correctly, so SELECT trim(@chars FROM string) would result in the error that the table source string was not found. This has been fixed.

Previously, if a file included another file multiple times, directly or indirectly through other include files, AbaPerls would attempt to load the include file multiple times into abahistsysobjects, with a risk that this would fail with a duplicate key. This has been changed so that AbaPerls only enters the file once.

The enhancement on 2021-02-22 introduced a bug: .fkey files failed to load if the constraints were defined in multiple ALTER TABLE statements. This has been fixed.

DBBUILD and DBUPDGEN had some special handling for types with names ending in _upduser. These specials have been removed.

The loading of files with foreign keys has been improved, so that keys that have not changed will not be dropped and recreated. If the definition is the same but the name is different, AbaPerls will rename the existing constraint but not reload it. This can save a significant amount of time when loading constraints for big tables.
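The keep/rename/reload decision described in that last entry can be sketched as a small decision function. The record shape and function name here are hypothetical, not AbaPerls' actual ap_compare_fkeys_sp:

```python
# Sketch of the foreign-key load optimization: an unchanged definition is
# kept, a same-definition/different-name key is merely renamed (cheap), and
# only a genuinely changed definition forces a drop + recreate, which
# re-validates the data on a big table.
def fkey_action(existing, incoming):
    """Decide what to do with one incoming FK against the live table."""
    same_def = (existing["columns"] == incoming["columns"]
                and existing["references"] == incoming["references"])
    if same_def and existing["name"] == incoming["name"]:
        return "keep"
    if same_def:
        return "rename"  # a rename needs no constraint re-check
    return "reload"      # drop and recreate, re-validating the data

live = {"name": "fk_ord_cust", "columns": ["custid"], "references": "customers"}
new = {"name": "fk_orders_customers", "columns": ["custid"], "references": "customers"}
print(fkey_action(live, new))  # rename
```

The time saving comes from the middle branch: a rename avoids scanning the table to re-validate an unchanged constraint.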
Note that this improvement only applies if the ABAPERLS subsystem is on at least label L1.0.0385, as it requires a new stored procedure, ap_compare_fkeys_sp.

An enhancement to the $DBPERM directive. If an object was dropped and reloaded, the permissions granted on the object through $DBPERM were lost. Now when you load an object, AbaPerls checks if there are permissions for the object stored in abaspecialperms, and in such case AbaPerls reruns the GRANT statements implied by the saved $DBPERM directives. That is, the permissions will be retained.

Fixed an error when loading a sequence object with a negative increment. The sequence could be restarted, although the current value was beyond the start value for the object.

Added a utility procedure ap_hso_purge_old_sp to permit a DBA to delete old entries from abahistsysobjects in a controlled way.

If you wanted to execute DDL from within an application stored procedure (typically signed with a certificate to package the permission) that was executed by a plain user, there were two permission issues in the DDL triggers that AbaPerls sets up: 1) permission was needed on the AbaPerls tables (because ownership chaining does not apply to DDL triggers), and 2) the DDL triggers had statements that required the permission VIEW SERVER STATE. Both these issues have been addressed. The code for the DDL triggers is now in an SP which resolves the permissions on the tables, and there are alternate solutions in place when the DDL is issued from within a stored procedure.

Sequences are now first-class citizens in AbaPerls. Sequences are stored in .seq files which you put in the TYPE directory. DBBUILD loads .seq files between .typ and .tbltyp files. When you load a .seq file into a database and the sequence already exists, AbaPerls will under some circumstances alter the current sequence to agree with the new definition. See the file-load page for details.
Previously, when you generated a patch script with DBUPDGEN and you specified an explicit version number for a file, and that version number did not match a checkin for that file, you got an error. This has been changed so that you only get a warning. Note, though, that DBUPDGEN does not really honour the version number, but will write a version number that matches the most recent checkin for the file at that version number.

Fixed an issue where a load of a file from a different subsystem would fail, if the file originally had been loaded from version control and now was loaded from an install kit created with ‑get, or vice versa. The issue was due to the MD5 hash being computed differently in the two cases. AbaPerls now computes both hashes when comparing a file to a database, and accepts if one of them matches.

Bugfix: if there was a space in the path to the source file for a .NET file (.cs or .vb), compilation failed, since AbaPerls did not put quotes around the file path. This has been fixed.

When you loaded a file in a subdirectory with ABASQL, and this file had an $INCLUDE or $DEPENDSON of a file in a different subsystem, AbaPerls incorrectly tacked on the subdirectory when looking up the included file, which caused the load to fail. This has been fixed.

AbaPerls did not detect the default schema if the default schema was defined through a Windows group. This has been fixed.

If you connected with permissions less than db_owner/CONTROL, you got an error message about not being able to drop DDL triggers. This has been replaced with an explicit error message that you need to be a member of db_owner or have CONTROL permission in the database.

INSFILGEN now applies "Freeze panes" when creating the Excel file, so that the header line will always remain visible when you scroll, and likewise, all columns defined as keys on the Config sheet are always visible when scrolling sideways. On the Values sheet the frozen pane also includes the %param column.
A change in INSFILGEN: if the column width is very narrow, INSFILGEN will not honour that, but restore the column to a default width. This avoids problems if someone saves the Excel file with some columns hidden.

There is a change in how AbaPerls determines whether a file in TFS is binary or not. Previously, AbaPerls relied on the encoding in TFS. Now AbaPerls looks at the file extension. If this is an extension known to AbaPerls (e.g. .sp), this determines whether the file is to be handled as binary. For file extensions unknown to AbaPerls, e.g. .ps1, AbaPerls still uses the encoding in TFS. This change mainly affects SSGREP, SSREPLACE and VCDBLOAD, which would fail to read files that mistakenly had been marked as binary in TFS. A related change is that the table files created by VCDBLOAD now have a column isbinary which tracks whether a file is considered binary, so that you can determine why a file has not been read when you expected it to be.

Fixed a bug in the library module used by update scripts generated by DBUPDGEN. When reloading a table, the re-targeting of foreign keys could fail if they were defined with ON CASCADE. (This was due to an awkward way of moving the keys that was needed in SQL 6.0!)

DBUPDGEN now includes an OPTION(RECOMPILE) in a table reload if there is a batch column. This can help to improve performance.

A couple of small fixes/changes:

Bugfix: when moving a table from one subsystem to another, and this table had one or more constraints defined within the file, you got a bogus message that OLDSUBSYSTEM!tbl.tbl needed to be reloaded. This has been fixed.

Bugfix in INSFILGEN: it failed to detect that there were extraneous keys in the Macros sheet, and this only resulted in incomprehensible Perl warnings. INSFILGEN will now stop with an error message when there are keys in Macros that are not on the Definition sheet.

Bugfix with object checking: AbaPerls did not recognise CTEs in inline table-functions.
It does now – but only if you do not enclose the query of the function in parentheses.

fail if there was a pending branch.

Bugfix: loading of a table type would fail if a depending object had two parameters of this type, because two DROP commands were generated. This has been fixed.

The tenth public release of AbaPerls.

Bugfix in INSFILGEN: if consecutive keys in the Definition sheet had Delflag = 1, you could get bogus messages about extraneous rows in the Macros or Values sheets, and also lose rows from these sheets.

There was a problem connecting to TFS from AbaPerls on a machine with only Visual Studio 2015, particularly if there had not been any earlier versions of VS installed. This was due to the TFS assemblies in VS2015 not being stored in the GAC. This only caused an error if you tried to use TFS explicitly, but AbaPerls also failed to map your current directory to a folder in TFS. You may not have noticed, but if nothing else, things broke with references to other subsystems. This has been addressed by bundling the TFS assemblies with the Perl executable, and AbaPerls now comes with a Perl installation that you are supposed to use, and not your own Perl installation. Note however, that the current arrangement only supports VS2010 to VS2017. That is, later versions of VS are not supported with the current version, as that requires that AbaPerls bundles the assemblies for that new version of VS.

Fixed a problem where AbaPerls yelled about a private type name when that pattern appeared inside a string literal when loading a file.

When loading an .ix file, AbaPerls did not write a row to abasysobjects and abahistsysobjects for indexes that were unchanged. This was intended to be a feature. However, this had the consequence that the md5hash column was not updated, which could lead to problems further down the road. So this "feature" has been removed, and abasysobjects is now updated, no matter whether the index definition has changed or not.
Fixed an issue that affected both DBUPDGEN and DBBUILD which prevented them from finding checked-out files in TFS 2017. (Because a property that is supposed to return a name returns a GUID with TFS 2017.)

When applying server-level permissions for $SERVERPERM or a privileged assembly, AbaPerls now replicates the server-level commands to all nodes in any availability group the database is a member of.

DBBUILD now only respects ‑use_disk when you build from version control for subsystems where you are building without a label or you explicitly have specified LATEST. Note that if you are building from a config-file and all subsystems but one have a label, DBBUILD will respect ‑use_disk for that last subsystem. If no subsystem is built from latest and you specify ‑use_disk, you will get an error message.

Changed the pattern for how temporary schemas are named, to work around a bug in SQL 2014.

Bugfix in TBLFIX: line breaks in table constraints were lost. This has been fixed.

A new Preppis directive, $DEPENDSON, which is similar to $INCLUDE and $REQUIRE in that it requires a mirroring $USEDBY, but in difference to the other two, $DEPENDSON does not load the referred file, but only serves to check that $USEDBY is present. You use $DEPENDSON to state that a file A is dependent on another file B, in the sense that if B is changed, the object(s) in A must be dropped. The presence of $USEDBY makes it possible for DBUPDGEN to add A to the update script, if B has changed but A is unchanged. $DEPENDSON is mandatory in these cases:

These checks are errors with ABASQL and update scripts generated with DBUPDGEN, but only warnings with DBBUILD and older update scripts.

A benefit of this change is that AbaPerls can now reload a table type, even if the table type has changed and is in use. AbaPerls now drops all dependent objects and issues a warning that the stored procedures and functions in question must be reloaded.
This warning is handled by LISTERRS, which will not display the warning if the objects are indeed recreated. The same applies to objects that are referred to WITH SCHEMABINDING – AbaPerls will drop all referencing objects if needed. Note that in both cases, AbaPerls only drops objects listed in abasysobjects.

Similarly, this change removes the problems with triggers, indexes and foreign keys that are defined in different subsystems than the one of the table, in the case the table is changed and reloaded in an update script. Previously, they could remain un-reloaded if no update script was created for the other subsystem. With the presence of $USEDBY, they are added to the update script for the subsystem of the table.

Change in DBBUILD: previously DBBUILD skipped a subsystem if it had an sql directory but there were no objects. This has been changed, so that DBBUILD in this case creates an empty subsystem. If there is no sql directory for the subsystem, the subsystem is still skipped.

Bugfix in DBUPDGEN: if a file with an extension unknown to AbaPerls was deleted from source control, DBUPDGEN would put it into OBSOLETE-FILES, whereupon abasqlfile would fail on processing it. This has been fixed, so that the file is not entered into the update script.

$INCLUDE is now permitted across subsystems. Assume that the file caller_sp.sp in the subsystem BETA needs to include the file compute.sqlinc in the subsystem ALPHA. The following applies:

Beware that the feature is designed from the assumption that you are using a sysdef-file. If you are not using a sysdef-file, it may not always work that smoothly. For more information, see the topics for Preppis and DBUPDGEN.
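The rule behind $DEPENDSON / $USEDBY amounts to a transitive closure over a dependency graph: if B changes, every file that declared a dependency on B must be added to the update script, even if unchanged itself. A minimal sketch, with a graph representation that is illustrative rather than AbaPerls' internal format:

```python
from collections import deque

# Sketch of the $DEPENDSON rule: used_by maps a file to the files that
# declared $DEPENDSON on it (i.e. that must be dropped/reloaded when it
# changes). A breadth-first walk collects the transitive reload set.
def files_to_reload(changed, used_by):
    out, queue = set(changed), deque(changed)
    while queue:
        f = queue.popleft()
        for dependent in used_by.get(f, ()):
            if dependent not in out:  # follow transitive dependencies once
                out.add(dependent)
                queue.append(dependent)
    return out

used_by = {
    "ordertype.tbltyp": ["caller_sp.sp", "report_sp.sp"],
    "caller_sp.sp": ["wrapper_sp.sp"],
}
print(sorted(files_to_reload({"ordertype.tbltyp"}, used_by)))
```

This is what lets DBUPDGEN pull an unchanged procedure into the update script when a table type it depends on has changed.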
As part of the same development, it is now possible to load a file from a different subsystem with ABASQL using the notation above, for instance:

abasql -d somedatabase -S someserver somefile.sp SUBSYS!someotherfile.sp

Bugfix for DBUPDGEN: table reloads were not correctly generated if there was a computed column or check constraint that spilled out over several lines and one of the lines ended with a comma.

A modification to the changes from 2016-06-28: setup_for_datamove does not attempt to set the database into bulk-logged recovery if the database is a member of an availability group or configured for mirroring, since SQL Server will not permit this.

The way DBUPDGEN generates code for data-move for table updates has been changed to improve performance: setup_for_datamove disables all non-clustered indexes before the move. This ensures that any clustered index is always present when the table is reloaded. The indexes are then rebuilt by the routine wind_down_datamove, called after the data move. setup_for_datamove also sets the database in bulk-logged recovery if the database is set to full recovery, to further speed up the load. Full recovery is restored after the indexes have been rebuilt. These changes do not affect existing update scripts.

There was a bug in aba_check_column which incorrectly caused it to consider tables in any schema, not only dbo.

DBBUILD and the update scripts generated by DBUPDGEN now check when you connect that you have CONTROL permission on the database (which is implied if you have membership in the db_owner role).

Added a predefined macro.

The description of how to suppress the name-clash check when an object has moved from one subsystem to another was incorrect with regards to stored procedures, due to the buffering feature. This section has been updated, and a check has been added so that update scripts now will abort if you attempt to change an AbaSql attribute in the SP section of an update script.
VCDBLOAD now prints a final message with how many files were loaded in how long time. There is also a new option, ‑progress n, which gives you a progress message after every n files loaded.

Some small improvements to DBUPDGEN with regards to how the script is generated for tables:

Bugfix: if you followed the instructions for how to handle a data-type change, your reward was an error message, because AbaPerls had an incorrect check for the existence of the type. This has been fixed.

Bugfix for ABASQL: if your current directory on disk was a subdirectory of one of the special directories in the AbaPerls SQL directory structure, for instance C:\Project\SQL\SP\Subdir, and ABASQL was able to match that to a directory in version control, ABASQL would nevertheless load the file from disk, even if you did not have the file checked out. This has been fixed. The fix does not affect DBBUILD and the update scripts generated from DBUPDGEN.

Small improvement in how Pre-SQL Analysis handles SELECT INTO in triggers with regards to the tables inserted and deleted.

New option for ABASQL, ‑LoadRequire, that permits you to force load of

AbaPerls now prints a level 8 message if a file takes more than 60 seconds to load. The message includes the time elapsed and the current time. The intention is to make it easier to find slow operations in update scripts. The message is however generated with all tools, including ABASQL.

DBUPDGEN no longer generates code to set the property Time_calls, as this property has been superseded by the scheme above.

DBUPDGEN now prints the total execution time as well as the time spent on retrieving files.

Added an improvement to handle the situation that different versions of SQL Server may store the definition of a computed column differently. To take benefit of this improvement, you may need to change your table definition.

Added the option ‑batchmode for INSFILGEN to simplify mass runs.

Improvements to INSFILGEN.
Now when you generate INSERT files from your Excel books, INSFILGEN also generates .srcdata files. You can later recreate the Excel book from the .srcdata file with INSFILGEN. The idea is that you would put the .srcdata files under version control rather than the Excel books, as the latter are difficult to handle, not least with regards to merging. The .srcdata files are XML files, and should be easier to merge.

A side effect of this overhaul is that INSFILGEN is now somewhat stricter when reading the Excel book. Previously, INSFILGEN accepted that the same macro appeared twice in the Macros and Values sheets, but the result of this was undefined. INSFILGEN no longer accepts duplicates, but terminates with an error. The same applies to LANGINSGEN, which accepted that the same language appeared twice, again with undefined results. Here as well, duplicates now result in an error.

There was a restriction, so that if you had something like:

DELETE kti
OUTPUT deleted.katid INTO @out
FROM   katitransactions kti
WHERE  katid = @katid

Object Checking would report the alias kti as missing, because it did not consider the OUTPUT-INTO clause. This has been fixed.

If you were running DBBUILD or an update script over multiple subsystems with ‑use_disk, AbaPerls did not find files for which there was a pending Add or a Rename operation in TFS. This has been fixed.

SSREPLACE can now implicitly map the current folder to a version-control directory in TFS, and in this case, the mapping determines the workspace in which checkouts are performed. There is also a bugfix: the default option now matches the documentation. That is, ‑noexclusive is the default.

Bugfix: attempting to process a dropped table type failed with a Perl run-time error. This has been fixed.

A new feature in Preppis: delimited macro expansion. You can now delimit the macro name with <>, and for the first four you get the macro value enclosed in the given delimiters.
The angle brackets are not retained; you use these when you want to use the macro value as the base for a name, for instance &<message_type>_contract to form the name of a Service Broker contract from a message type.

DBBUILD now includes files with a pending add operation in TFS when you use ‑use_disk together with

Behavioural change for loading of Service Broker files: in databases labelled production, AbaPerls now considers it to be an error if there are active conversations related to any of the services in the file. In test and development databases, AbaPerls still kills the conversations without mercy.

Update scripts generated by DBUPDGEN no longer generate an error if a file that appears in the section OBSOLETE-FILES includes an object name that is defined in another subsystem. This happens if the object has been moved to a different subsystem. You only get an informational message that the object was ignored.

Updated the manual page for Preppis to correct some inaccuracies with regards to macro expansion. Particularly, the topic suggested that macros would be expanded inside string literals and quoted identifiers, which is not the case.

Introduced a new file type, .mty, for Service Broker message types that are used in more than one .sb file. .mty files live in the SERVICEBROKER directory and they follow the regular rule that object and file name must match. When loading a message type that already exists, AbaPerls changes CREATE to ALTER.

Bugfix: if the ‑Password option was preceded by an option without argument, such as ‑force, the password was not masked in the output.

Improvement to the version checks for production and test databases: AbaPerls now stores an MD5 hash for the files, and DBBUILD and the update scripts generated by DBUPDGEN will not perform any version checks if the stored MD5 hash matches the hash computed for the file. This should reduce the noise produced in the log files.
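To illustrate the idea only (in reality AbaPerls computes the hash itself, and the layout of its system tables is not shown here), the check amounts to comparing two MD5 hashes, which can be sketched in T-SQL with HASHBYTES:

```sql
DECLARE @storedhash binary(16),   -- hash stored at the previous load
        @filehash   binary(16);

-- MD5 hash of the file contents about to be loaded.
SELECT @filehash = HASHBYTES('MD5', N'CREATE PROCEDURE dbo.some_sp AS ...');

IF @filehash = @storedhash
   PRINT 'Unchanged file - version check can be skipped.';
ELSE
   PRINT 'File differs - version check is performed.';
```

The point of hashing is that an identical file can be recognised as such regardless of which path or label it was retrieved through.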
New tool VCDBLOAD to load information under version-control to a full-text indexed database.

Updated the instructions for Visual Studio and TextPad. Specifically, you should use $(ItemFileName)$(ItemExt) for Visual Studio and $FileName for TextPad, so that ABASQL only sees the file name and not the full path, as this works better with TFS.

Bugfix in SSGREP: if you used both ‑lang, SSGREP would search no files. This has been fixed.

Enhanced the Preppis directive $MACRO_LONG: if the parameter NOEXPAND is given, short macros inside the long macro are not expanded until the long macro is expanded.

Added a predefined macro &SQL2014.

NEWLABEL would incorrectly report labelling a TFS project as successful, although the operation failed. This has been fixed. However, due to limitations in the interface between Perl and the CLR, NEWLABEL is unable to report the reason for the failure.

Minor adjustment in ap_compare_tables_sp, which failed to drop a constraint if the constraint had been loaded to abasysobjects through ap_zz_sob_load_sp.

Change in DBUPDGEN with regards to how differences in TFS are determined. DBUPDGEN no longer assumes that files have changed just because the changeset ids are different, but instead DBUPDGEN compares length, encoding and the hash value stored in TFS. The check also applies to intermediate versions, to catch rollbacks. See further the TFS section in Determining Whether a File Has Changed on the DBUPDGEN page.

Use ‑header, or put the header in the file tblfix.header in the directory where you run TBLFIX.

With ‑checkedout, DBUPDGEN would fail if it needed to read a file that was only present in TFS as a pending Add.

Bugfixes: 1) Added bypass of the DDL trigger to ap_sob_retarget_synonyms_sp and ap_spp_grant_assemperms_sp, as these procedures drop/alter objects owned by AbaPerls and they are intended to be run from a query window.
2) The procedures to handle modules with special permissions could produce incorrect SQL in report mode, because AbaPerls failed to consider that output from PRINT is truncated after 8000 characters in SSMS. This has been addressed.

Bugfix with DBBUILD ‑restruct: indexes were always rebuilt, even if they were unchanged. This has been fixed, but note that if the index definition includes a WITH, ON or FILESTREAM_ON clause, it will always be rebuilt. A note about this has been added to the topic for DBBUILD and the page about

The DDL trigger has been eased for ALTER QUEUE, so that AbaPerls permits this command if you only use it to enable/disable the queue or activation, or set the execution principal, as this does not affect the application logic. Such an exempted command is not logged in abaddltribypasslog. See the Service Broker page for details.

The patch-script functionality in DBUPDGEN has been revamped. With ‑patch, you can no longer specify ‑subsystem on the command line. Instead you specify the version for the subsystem inside the patch file, and you can also specify the path in the patch file. The format for update scripts has been bumped to 3.6. See further the section Patch Scripts in the page for DBUPDGEN. Note that this section has been moved within the page.

Bugfix in DBUPDGEN. Table-updates were broken: DBUPDGEN failed under most circumstances to find any primary key, why no code for handling foreign keys was added, and unless the clustered index was a non-key index, DBUPDGEN did not generate any batching.

AbaPerls now supports loading of DDL triggers on database level. (Server-level triggers are not supported.) You put DDL triggers in files with the extension .ddltri. These files should reside in the MESSAGE directory. The name of the trigger must agree with the file name.

AbaPerls now installs a DDL trigger to disallow modification of objects loaded with AbaPerls from SQL Server Management Studio and similar tools.
It is possible to bypass the DDL trigger to deploy critical fixes in production databases, in which case the action is logged in the table abaddltribypasslog. See the documentation for this table for further details.

The deprecated configuration options ‑[no]subscriber have now been dropped entirely.

ABASQL, DBBUILD and update scripts generated by DBUPDGEN incorrectly recognised the options ‑notlabel. This has been fixed.

Bugfix: when running ‑get inside an SQL structure, the global grant.template was not extracted to the intended place in the subsystem directory.

The ninth public release of AbaPerls.

Bugfix for SSREPLACE with TFS: the rule that we should primarily pick a workspace that matched the computer name was incorrectly implemented. This has been fixed.

AbaPerls now supports TFS 2012, both server and client.

Bugfix: when loading files, the comments before CREATE PROCEDURE or corresponding were stripped before the batch was sent to SQL Server.

In SYSTEM.DEFINITION you can now use options to define which subsystem references are permitted and which are not. Default is that references must respect the build order, but you can override that to be more restrictive – or less. See further the section Sysdef-file Options.

Improvements and changes in Object Checking: ‑sptwice for DBBUILD is now obsolete. Previously you would use ‑sptwice to ensure that you got complete dependency information. DBBUILD now employs a buffering scheme, and buffers all SPs in a subsystem. If an SP depends on a procedure to be loaded later, the SP is loaded, but requeued. In this way, a procedure is only loaded twice when there is a need to. See further the section Loading Stored Procedures on the DBBUILD page.

For a table with an IDENTITY column, DBUPDGEN now generates code to transfer the current IDENTITY value from the old table to the new table.

The manual page Known Issues has been dropped as it was not very well-maintained. Most of the restrictions are now listed on the Introduction page.
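The forms of ALTER QUEUE that the DDL trigger exempts, as described above, are the ones that only touch operational state. For instance (queue name hypothetical):

```sql
-- Permitted without tripping the DDL trigger, since none of these
-- change the application logic:
ALTER QUEUE dbo.SomeQueue WITH STATUS = OFF;                  -- disable the queue
ALTER QUEUE dbo.SomeQueue WITH ACTIVATION (STATUS = OFF);     -- disable activation
ALTER QUEUE dbo.SomeQueue WITH ACTIVATION (EXECUTE AS OWNER); -- execution principal
```

Changing the activation procedure itself, on the other hand, does affect application logic and is not exempted.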
AbaPerls did not inspect grant.template for permissions on assemblies, scalar types, rules and defaults. This has been corrected.

When looking for SET ROWCOUNT, EXECUTE AS and the AS that concludes the header of CREATE PROCEDURE, AbaPerls failed to consider lowercase or mixed case, and only looked at uppercase. This has been corrected.

Added support for storage properties for tables and indexes, e.g. compression and filegroup placement, that you specify with the WITH and ON clauses, and for tables in some cases also with EXEC sp_tableoption and ALTER TABLE. You can mention these explicitly in the source file. If not, AbaPerls will preserve the database setting, which may have been configured locally. See the page Storage Settings for Tables and Indexes for details. To make use of these changes, the database must have at least version L1.0.0280 of the ABAPERLS subsystem.

AbaPerls now supports columnstore indexes. AbaPerls now also understands XML and spatial indexes, but the support is very rudimentary, and it is only practically useful for small development databases, as XML and spatial indexes are dropped and recreated every time a .ix file is loaded.

Previously, although it was not documented, AbaPerls would not start a transaction when it loaded an index file. This has been changed, so that there is a transaction for index files as well.

Previously, when you loaded an .ix file, AbaPerls would drop indexes not in the .ix file but belonging to the same subsystem. Now AbaPerls also drops indexes not in abasysobjects at all (which is what the documentation has said all the time). Indexes in other subsystems are not touched.

A few changes to update scripts generated by DBUPDGEN:

Bugfix: when a .tbl file was reloaded, it could happen that for some of the constraints the most recent row in abahistsysobjects would say that the object was deleted, and this could later cause an error message with DBBUILD ‑restruct, particularly for objects in the ABAPERLS subsystem.
This problem arose due to a collision on loadtime, and it has been solved by adding a small delay.

Bugfix for DBUPDGEN: when DBUPDGEN compared the paths for two versions of the same TFS item, it considered the paths as different if there was a difference in lower/uppercase only, why DBUPDGEN compared contents instead of changeset ids. If two versions of a file were identical, DBUPDGEN could fail to include the file, even if you wanted to.

Bugfix in SSREPLACE: if you specified neither ‑type, it would not

Bugfix in SSREPLACE: ‑type had incorrectly become mutually exclusive, so that no files would be processed. This has been fixed.

LISTERRS now applies the default for the min-severity argument for each log in the log file, so that if your log file has a mix of logs from DBBUILD and update scripts from DBUPDGEN, the default is 10 for the logs from DBBUILD and 9 for the logs from the update scripts.

Bugfix (sort of): when requesting data from TFS for the current user, AbaPerls now asks TFS what it thinks is the current user, rather than relying on Windows, as they could disagree, for instance in a workgroup. This affects several tools.

New versions of TBLFIX and PDREP to support PowerDesigner 16. This also includes a new report template to use with PDREP. PDREP was previously known as PDREP95, but it has been renamed as it now also supports PowerDesigner 16. The older PDREPDIV has been removed.

Bugfix: when printing the command-line to the log file, AbaPerls could fail to include options with no arguments.

ABASQL now prints a dotted line when it has completed accessing all files, to prevent error messages from drowning in that noise. This line is not printed when you use the

Bugfix in DBUPDGEN: if an assembly-related file such as a .dll file in SourceSafe had changed, but the .assem file was unchanged, the .assem file would always be included in all lowercase in the update script, even if the file name included uppercase. This has been fixed.
This bug did not appear with files in TFS.

Improvement: DBUPDGEN now makes use of the new capabilities when loading a table, and generates update scripts that first attempt to load the table file the regular way, and only if this fails do they enter the long table-update code, which now is known as fallback code. The update scripts now include a section INCLUDE which lists all changed include files. They are commented out, but serve as reference.

Behavioural change: AbaPerls now adds an initial ^ and a closing $ to the regular expression in grant.template, so that the regexp must match the object name from start to end. Previously the pattern zz_.* matched fuzz_sp, but it no longer does. (And this is in agreement with what the examples for grant.template have said all through the years.) If you want this match, you need to change the regexp to .*zz_.*.

If CREATE INDEX statements were not separated by go or blank lines, AbaPerls did not see the subsequent CREATE INDEX statements, why it would only save information about the first index to abasysobjects. On an initial load, all indexes would be saved to the database, but on subsequent loads, changes or additions of indexes (still without a blank line) would not be loaded as long as the first index was unchanged. This bug was introduced with the changes of 2010-02-26.

With ‑get, AbaPerls failed to find the files in the OBSOLETE-FILES section. This bug was introduced with the changes of 2012-06-13.

Bugfixes: 1) Reloading a user-defined aggregate did not work, because AbaPerls tried to use the non-existing command ALTER AGGREGATE. 2) Trigger files did not work if the name was quoted, for instance

Changes to the AbaPerls file-loading process. Now when you load a .xmlsc, .typ or .tbltyp file and the type/schema is in use, AbaPerls will load the definition in a temp schema and compare the new definition with the definition in the database, and raise an error if they are different.
The same applies to .tbl files for tables that have data, with the difference that AbaPerls will investigate whether it is possible to bring the existing definition up-to-date with ALTER TABLE, and in such case attempt such commands. See further the section Pre-SQL Analysis: Reloading a Table on the file-load page.

A new option to DBBUILD, ‑revokeall, and a change in behaviour for ‑restruct permit you to rebuild the database and change your subsystem structure. With ‑revokeall (which only can be used with ‑restruct), DBBUILD revokes all permissions to objects in

‑rebuild now reloads all files and does not skip files for types and tables. This is possible thanks to the changes in the previous paragraph. For more information on the new options, see the section ‑rebuild and ‑restruct options in the topic for DBBUILD. All these changes require that the ABAPERLS subsystem is at least of version L1.0.0270. For lower versions, the old behaviour is retained.

If you do not specify ‑VC with ABASQL, DBBUILD and DBUPDGEN, but your current Windows directory is mapped to a TFS directory in some TFS workspace, AbaPerls will implicitly set the ‑VC option to the TFS directory. This feature is not available for SourceSafe (because SourceSafe does not expose this information through an API like TFS). When the current directory is tied – through ‑VC or mapped implicitly through TFS – to a subsystem, that is picked up as well. That is, in many cases developers that use TFS never have to worry about specifying VC-path and subsystem, but get it for free.

(With ‑nouse_disk, AbaPerls never reads from disk.)

You can use ‑use_disk over several subsystems for experimental builds.

With ‑config, AbaPerls would not stop directly if CONFIG.CFG was missing.

Bugfix: the code to move referring foreign keys failed to consider that the referring table may be in a different schema. Specifically this broke the update script ap_up_1-0-0170-0230.pl, which itself also has been changed to work with the side schema introduced 2012-02-20.
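The essence of that foreign-key fix, sketched with hypothetical names: when a referring foreign key is recreated, the referring table must be addressed in its own schema, which is not necessarily dbo:

```sql
-- The referring table lives in a schema other than dbo; both tables
-- must be correctly schema-qualified when the key is moved.
ALTER TABLE otherschema.referringtbl
  DROP CONSTRAINT fk_referringtbl_sometable;

ALTER TABLE otherschema.referringtbl
  ADD CONSTRAINT fk_referringtbl_sometable
      FOREIGN KEY (someid) REFERENCES dbo.sometable (someid);
```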
Bugfix: when loading an index file, AbaPerls could under some circumstances mistakenly recreate an index, even if the index definition in the file agreed with the one in the database. This has been fixed.

More improvements to SSGREP. SSGREP can now store the output in a database with the ‑database option. When storing the output in a database, you can resume an interrupted search with the

Added the option ‑input to SSGREP to permit SSGREP to read search strings from an input file.

Changes in supplemental SQL checks: AbaPerls no longer reports =* and *= as errors. (SQL Server 2005 and 2008 only permit these operators in compatibility level 80 for outer joins, and *= is a perfectly legit operator in SQL 2008 for combined multiplication and assignment.)

When AbaPerls adds a COLLATE clause to a temp table, AbaPerls now uses COLLATE database_default, rather than hardcoding a collation.

Bugfix in Preppis: macros with one-letter names, e.g. &a, were not permitted. This has been fixed.

Instead of renaming tables with aba_tblrename to a name starting with old_, DBUPDGEN now generates a call to aba_move_aside, which moves the current definition of the table to a special side schema, AbaPerls$SideSchema. Update scripts check when starting and terminating that the side schema is empty, or else they abort with an error. Existing update scripts will pick up the changes automatically, as all code handling the side schema is in library routines called by the update scripts.

aba_tblrename has been reimplemented as a wrapper on aba_move_aside. Use of it is deprecated, and it will be removed further afield. To retain compatibility with old scripts, update scripts still check for objects with a name starting with old_. This check will be removed eventually.

The version level for update scripts has been changed to 3.5.

The eighth public release of AbaPerls.

Bugfix in LISTERRS: when a procedure was missing because it had failed to load previously, the message incorrectly said that it was missing.
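The effect of the COLLATE change can be seen in a small sketch (table and column names hypothetical): with database_default, the temp-table column follows the current database's collation instead of tempdb's:

```sql
-- Without a COLLATE clause the column would get tempdb's collation,
-- which can cause collation-conflict errors when the column is
-- compared with columns from the current database.
CREATE TABLE #somework (somecol nvarchar(50) COLLATE database_default);
```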
Completely rewritten LISTERRS, for a number of enhancements. Among them a new option ‑lax that permits you to match a message about a missing stored procedure, even when the stored procedure is loaded from a different subsystem.

To support these changes, there are a few more changes in AbaPerls: if you use the ‑log option (ABASQL, DBBUILD and update scripts from DBUPDGEN) with ++, an existing file is appended to.

Finally, two unrelated changes:

If two developers used different paths to the same repository, this could cause the version checks for production and test databases to raise false alarms. AbaPerls addresses this by introducing repository IDs, which are stored in the new table abarepositorymappings. AbaPerls accepts different repository paths, as long as the repository IDs are the same. The repository IDs are derived from within the version-control databases; both SourceSafe and TFS have a GUID that identifies the database. When an SQL database is upgraded to this version of AbaPerls (L1.0.0250), the repository ID is initially set to NULL, and it will be filled in over time as you run upgrade scripts or ABASQL on the database. If the ID is missing, AbaPerls only checks the path within the repository.

As a consequence of the previous change, the tables abasysobjects, abahistsysobjects and abainstallhistory have been changed, so that the path to the repository and the path within the repository are stored in separate columns.

The format of the files SS‑FILES.LIS and SUBSYSTEMS.LIS has changed to accommodate the repository IDs. The new versions of the formats are 1.1 and 1.3. They are designed so that older versions of AbaPerls are able to read these files and act correctly. ...or at least the most recent versions of AbaPerls. For the future, AbaPerls now checks the version number of the format, and terminates with an error if the format is not supported.

AbaPerls has changed its philosophy with regards to changeset-ids in TFS.
Previously, if you asked for changeset 7634 of a file, and that file was not part of that changeset, AbaPerls would report the file as missing. Nevertheless, TFS returns the version of the file that was current when that changeset was checked in. AbaPerls now goes with the version returned by TFS, as the original rule made it impossible to use changeset numbers with ABASQL.

INSFILGEN now generates an error message when there is a duplicate entry in any of the sheets, and no file is generated.

INSFILGEN now handles the extensions .xlsx and .xml on equal footing with .xls. This also applies to the sorting in DBUPDGEN.

There was a bug in INSFILGEN, so you would get an error about missing MAXLEN for a parameter with an uppercase character in the name.

Four fixes/changes with DBUPDGEN: DBUPDGEN now understands to strip out ASC and DESC from the definition of PRIMARY KEY and UNIQUE constraints. (DBUPDGEN already handled ASC and DESC in the definition of clustered indexes in .ix files.) The path to the help file in the welcome message was incorrect. This has been fixed.

Added support for Service Broker to AbaPerls. You define a collection of Service Broker objects in .sb files (new extension in AbaPerls) in the SERVICEBROKER directory in the AbaPerls SQL directory structure. See further the page Service Broker in AbaPerls. There are also two new Preppis directives, $PRELUDE and $ENDPRELUDE. While generic in nature, they are currently only permitted in Service Broker files.

If the input file is an 8-bit file, SSREPLACE now retains the encoding. If the file is a Unicode file, SSREPLACE retains the encoding only if you have Perl 5.14 or later. With Perl 5.12, SSREPLACE always writes Unicode files as UTF-8, since a bug in Perl prevents writing UTF-16 files.
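As a hedged illustration of what a collection of Service Broker objects in an .sb file boils down to (all names hypothetical; any AbaPerls-specific directives omitted):

```sql
-- A message type, a contract that uses it, and a service tied to a
-- queue - the kind of objects that go together in one .sb file.
CREATE MESSAGE TYPE some_request VALIDATION = WELL_FORMED_XML;

CREATE CONTRACT some_contract (some_request SENT BY INITIATOR);

CREATE QUEUE dbo.SomeQueue;

CREATE SERVICE some_service ON QUEUE dbo.SomeQueue (some_contract);
```

Keeping these objects in one file makes sense because they form a dependency chain: the service needs the queue and the contract, and the contract needs the message type.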
Bugfix in DBUPDGEN: if the path to a SourceSafe database is a UNC name, DBUPDGEN failed to observe that the leading \\ in the path must be \\\\ in the Perl script, because of the way Perl interprets

INSFILGEN produced a spurious message Missing Index for an Excel book which does not have all four possible sheets. This has been fixed, and the message no longer appears.

Bugfix: Object checking of Service Broker objects incorrectly flagged objects if there were more than one of the same kind referenced in the same file.

CREATE XML SCHEMA COLLECTION is now permitted in .sql files to permit schema collections to be used with Service Broker (which typically are defined in .sql files).

AbaPerls now supports two version control systems: SourceSafe and Team Foundation Server (TFS). This has required a major overhaul of ABASQL, DBBUILD, DBUPDGEN, NEWSUBSYSVER, SSGREP and SSREPLACE and all code that supports these commands. You use TFS with AbaPerls very much in the same way as you work with SourceSafe – at least for now. Since TFS has quite a different mindset, it is conceivable that AbaPerls will add further support for the TFS mindset later on. The support for TFS requires TFS 2010 SP1.

The page Version-control Concepts includes important information about general concepts, how AbaPerls approaches version-control systems, and particularly how you specify the repository for a SourceSafe database or a TFS project collection.

As a result of these changes, AbaPerls now requires Perl 5.12. It also requires the presence of one more extra module, Win32::CLR. Furthermore, AbaPerls now requires the Perl installation to be on a local disk; it cannot be on a network disk (because of the reference to the CLR). This applies even if you only work with SourceSafe.

There are two new commands: NEWLABEL, a command that sets a label on the AbaPerls format LetterMajor.Middle.Minor and which can compute the label for you.
TFSLABELFIX, which is a TFS-specific command that re-scopes TFS labels to fit with AbaPerls.

The general overhaul of the code has also led to a number of other improvements and behavioural changes:

‑VSS is now officially ‑VC in the documentation and usage messages. ‑VSS can still be used as a synonym for the foreseeable future. However, the very old switch ‑project, which has been undocumented for many years, has been discontinued.

‑rebuild now checks that you rebuild all subsystems against the same label.

AbaPerls now checks that the ‑from switches make sense relative to each other.

‑checkedout are now mutually exclusive.

%errorlevel% in BAT files.

‑charset has been dropped, as it proved to not be needed any more, but only served to produce garbage.

The config-options for ANSI settings are now deprecated, and this is also true for the

The following applies with regards to compatibility. Keep in mind that there are changes both to the AbaPerls client and the ABAPERLS subsystem, the database part:

LISTERRS could fail to list errors in an assembly created from a source file in VB .Net, because of the special way VB .Net displays the erroneous code.

Added support to grant permissions on types and XML schemas.

aba_tblfkey(), used in update scripts generated by DBUPDGEN, did not transfer the FK options SET NULL or SET DEFAULT. This has been fixed. (But note that if you are running with any of the ANSI options off, the issue

The seventh public release of AbaPerls, label L1.0.0190.

AbaPerls now supports table types and XML schema collections. They can be defined in the file types .TBLTYP and .XMLSC respectively. These files reside in the TYPE directory. In DBBUILD and DBUPDGEN, XML schema collections are loaded before regular types (.TYP files), whereas table types are loaded after. You can also define a table type or an XML schema collection in a .SP or .SQLFUN file, if the type/schema is to be used by this procedure only. In this case the name must follow certain rules.
See further in the section about the SP directory on the page The AbaPerls Subsystem Structure.

AbaPerls now supports CREATE TYPE to define types. The older way is still supported.

You can now create an assembly directly through a source file written in C# or Visual Basic .Net. To do this, you need to supply the $COMPILE directive in the .ASSEM file. You find more information on the CLR page. As a consequence of this change, version-checks now also include .ASSEM files.

If you run DBBUILD ‑rebuild for a single subsystem, AbaPerls now clears the configuration settings only for that subsystem, and leaves global settings and settings for other subsystems unchanged. Note also that the subsystem is rebuilt using the global configuration settings.

When you had expressions with .nodes(), AbaPerls could produce an internal error about too many iterations. This problem has been fixed.

I'm doing a minor overhaul of the design of the manual pages, and in a transitional period, the design between different pages will vary.

ABASQL now masks any password when printing the command-line options.

Added two new macros $DBPERM and $SERVERPERM that permit you to encapsulate permissions that cannot be handled with regular ownership chaining. Examples include use of dynamic SQL and BULK INSERT. See the new topic Modules with Special Permissions for more details.

Added an argument to the $DLLINCLUDE macro where you can specify the permission set for the assembly. Using this directive, you can install unsafe assemblies on databases that are not marked as trustworthy. See the topic Privileged Assemblies on the CLR page for details.

Fixed bug in RUNSPS which caused RUNSPS to loop indefinitely if you did not supply any parameter definition at all, for instance because the procedure does not have any.

Added an option to DBBUILD. This option specifies an alternate build order where tables and views in all subsystems are built before any functions or procedures are loaded.
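Minimal examples of the definitions discussed above (all names hypothetical): a table type and an XML schema collection as they could appear in .TBLTYP and .XMLSC files, and a scalar type defined with CREATE TYPE:

```sql
-- .TBLTYP: a user-defined table type.
CREATE TYPE dbo.sometable_type AS TABLE
   (id      int          NOT NULL PRIMARY KEY,
    somecol nvarchar(50) NOT NULL);

-- .XMLSC: an XML schema collection.
CREATE XML SCHEMA COLLECTION dbo.some_schemacoll AS
N'<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema">
     <xsd:element name="root" type="xsd:string"/>
  </xsd:schema>';

-- A scalar type defined with CREATE TYPE.
CREATE TYPE dbo.some_id FROM int NOT NULL;
```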
Augmented the AbaPerls file-lookup order, so that AbaPerls now looks for the file in a subsystem directory before trying an AbaPerls SQL directory structure on the same level.

AbaPerls now supports use of WHERE clauses in the definition of indexes and statistics, so-called filtered indexes.

RUNSPS is now able to read regular Excel books, and is not constrained to CSV files. Older CSV files still work with RUNSPS, as long as Excel can read them.

DBBUILD and DOBCP now use the ‑q option with BCP to set the setting QUOTED_IDENTIFIER (which for some reason is not on by default with BCP).

DBUPDGEN no longer complains if the ‑VSS option disagrees with the project in the file, if the only difference is that one of them has /SQL and the other not.

The sixth public release of AbaPerls, label L1.0.0091.

The tool PREPRC has been dropped.

Fixed bug in aba_check_column, so that it now correctly returns 0 when ‑noexec is in effect.

DBUPDGEN now generates update scripts that set an error status on DOS level (i.e. %ERRORLEVEL%) if there are objects with names starting with old_ before the update starts, or when the update has completed.

The rules for ABASQL in the version-check in the AbaPerls file-loading process have been modified. Previously AbaPerls required that when a file was loaded from SourceSafe, the path was the same as when the file was most recently loaded. This has been changed so that the check is now performed against the SourceSafe path the subsystem was most recently loaded from. The check is only effective if the ABAPERLS subsystem in the database is upgraded to label L1.0.0081. For older versions of the ABAPERLS subsystem, the old version check is retained.

The defaults for ‑Apadding have changed, so that ANSI_NULLS and ANSI_PADDING are on by default. Important note: to get the updated version of ap_scriptname, you need to run an update script, ap_up_1-0052-0080.pl. However, this script does not run on SQL 2005 RTM and SP1.
The script runs on SQL 2000 and earlier, SQL 2005 SP2 and SQL 2008. The reason is that sp_rename in SQL 2005 RTM/SP1 checks for referencing stored procedures when you rename a type, and ap_scriptname is used by the procedure that loads objects into abasysobjects, resulting in a catch-22 situation.

ABASQL now requires that you specify ‑subsystem also for .TBL files. (Failing to do so when you create the table the first time could cause problems further down the road.)

INSFILGEN used to insert an extra go after each 40th call. This has been removed. (It was a workaround for problems with the tools in SQL 6.5.)

Added support for synonyms. See the page Using Synonyms for details.

Bugfix: it was not possible to connect with AbaPerls to databases with spaces or other special characters in the name.

Because of changes in SQL 2005 SP2, LISTERRS failed to list calls to missing stored procedures. This has been fixed. AbaPerls also correctly handles the same situation with SQL 2008.

The fifth external release, L1.0.0060.

When using SourceSafe 2005, DBUPDGEN could crash while reading the object histories. This has been addressed.

AbaPerls is now able to understand Common Table Expressions (CTEs). Support for the MERGE statement in SQL 2008 added.

There was a problem with SourceSafe that made it impossible to use dates with ‑label in various tools with SourceSafe 2005. AbaPerls now has a workaround for this problem.

It is now possible to use the @ character as separator for site-specific files, as an alternative to %, which has been outlawed by SourceSafe since 6.0d.

ABASQL now has the options

ABASQL is now able to read SourceSafe information from SS‑FILES.LIS, permitting use of install files at customer sites to which you have no direct connection.

AbaPerls no longer performs object checking for the pattern ident1.ident2.ident3(. Previously, AbaPerls attempted to verify this as a scalar user-defined function with the database part of the name included.
However, in SQL 2005, there is no syntactical difference between the call to a UDF and the invocation of a type method of an XML or CLR UDT column. AbaPerls opts to validate two-part names, but not three-part names, that look like function calls. As a consequence of this, you need to always use a table or alias prefix with your XML and CLR UDT columns when you need to invoke a method. A similar issue with the table-valued XML method .nodes has also been addressed. AbaPerls ignores three-part names that end in nodes and are followed by a right parenthesis. There was a bug with object checking, so that AbaPerls did not see a scalar function when it appeared directly after a keyword. This bug has been fixed. Thanks to Koen de Vos, the update scripts generated by DBUPDGEN will now correctly transfer ON DELETE/UPDATE CASCADE and also NOT FOR REPLICATION when moving foreign keys from one table to another. DBBUILD and the update scripts generated by DBUPDGEN now accept a ‑User option to permit you to specify a different SQL login than sa. AbaPerls verifies that the login you use – SQL login or Windows login – has dbo as the default schema. This applies to ABASQL as well. DBBUILD and the update scripts generated by DBUPDGEN no longer set the database in simple recovery, nor do they set a read-only database in read-write mode; you need to cater for this yourself. As a consequence of this, the update scripts are no longer generated with these statements. DBBUILD now uses the option ‑h "CHECK_CONSTRAINTS" with BCP to make sure that constraints are trusted. Also added the utility procedure ap_zz_enable_constraints_sp to enable all constraints in a database. Added a note to the page for DOBCP about constraints. Previously, ‑force permitted you to override the check between file name and object name for any type of object. You can now only override this check for stored procedures and user-defined functions.
Previously, ‑force also permitted you to override the check that the object agreed with the file type, so you could for instance create a stored procedure from a .SQLFUN file. This possibility has been removed. There is a new predefined macro &SQL2005. (And the page for Preppis has been corrected. It listed &SQL8 as a predefined macro, but that was not the correct name.) The tool SPTRITEST has been removed. It was a useful tool on SQL 6.5, but less so on later versions. Changes to DOBCP: ‑native is now the default. When you copy data out, DOBCP automatically generates format files. When you import data, DOBCP by default assumes that there is a format file for each table. ‑nofmtfile instructs DOBCP to not use format files. There is a new option ‑unicode to force Unicode format for all character data. The option ‑421 has been dropped. AbaPerls now reads the real SRCSAFE.INI for the SourceSafe database, and if there is a definition of a journal file, it is copied to the SRCSAFE.INI that AbaPerls creates, so that actions from tools like SSREPLACE and NEWSUBSYSVER are journaled appropriately. New feature in update scripts from DBUPDGEN: you can now have a final epilogue. A new option, ‑environment, permits you to classify a database as a development, test or production database. For test and production databases, the AbaPerls file-loading process performs checks to prevent update scripts from overwriting newer versions. This also enforces rules when you load files with ABASQL. See the file-load page for details. Bug fix: ABASQL would incorrectly report a file as successfully loaded, when in fact it could not even find the file in SourceSafe. LISTERRS now knows how to filter out messages about missing stored procedures that appear later in the script. As a consequence of this, DBBUILD no longer loads stored procedures twice by default. There is a new option, ‑sptwice, to request this in case you want full dependency information. Note that this is only possible to achieve on SQL 2005, due to flaws in SQL 2000 and earlier. The Preppis directives $MACRO_LONG and $ENDMACRO are now implemented.
There is a new option for SSGREP that causes SSGREP to print matches per unique string. This is good if you are looking for references to a couple of stored procedures or similar. The AbaPerls File-Loading Process now also replaces user-defined data types in schema declarations for OPENXML. LISTERRS no longer prints headers for subsystems in which there are no errors. The File-Loading Process now also checks for improper comparisons with WHERE x = NULL. When generating an update script, DBUPDGEN did not add code for SET IDENTITY_INSERT ON, if the IDENTITY keyword followed NOT NULL in the table definition. This has been fixed. Several checks have been added or enhanced in the AbaPerls File-Loading Process: among other things, it now detects the old-style outer-join operators (*= and =*) and emits an error or a warning, depending on the tool in use. There is a new ‑log option. New tool PDREP95 for handling of reports generated by PowerDesigner 9.5. The fourth public release, L1.0.0040. You must now pass ‑subsystem to ABASQL when you load triggers, indexes and foreign keys. See more about this on the pages for the AbaPerls structure, the file-loading process, DBUPDGEN and ABASQL. Slight change in the table-updates generated by DBUPDGEN: there is now a $batchcol, which makes it a little easier to change the control column for the INSERT-loop. There are two new options for SSGREP, among them ‑lang, that you can use to constrain the tools to only work with files of certain types. For instance, you can constrain a search to SQL files. The AbaPerls file-loading process could incorrectly give a style message for a computed column. In the update scripts generated by DBUPDGEN, the table-updates now use RAISERROR WITH NOWAIT to print their messages, so that they are flushed to the log immediately. The first message for a table-update, which displays the number of rows to copy, uses severity level 8, so that you can use LISTERRS to review the copying. LISTERRS now supports reading logs from update scripts generated by DBUPDGEN. To this end, there are some minor changes to the logs from DBBUILD and the update scripts.
INSFILGEN and LANGINSGEN now want you to declare maximum lengths for parameters with string data, to avoid truncation when you run the INSERT-files. You declare these lengths in a MAXLEN section on the Config sheet. To begin with, MAXLEN is not mandatory, but the plan is that it eventually will be. See further the INSFILGEN page. TBLFIX now formats computed columns nicely. The third public release, L1.0.0030. ‑rebuild now also reloads index files, both for tables and for views. When running AbaPerls with Perl 5.8.3 (or ActivePerl Build 809), you could get a run-time error in Perl just as ABASQL and similar tools were about to exit. I believe the cause is a bug in Perl, but a workaround is now in place. Bug fix in ap_sob_update_sp: when running an update script with a name longer than 35 characters, the script failed with "binary or string data would be truncated". This has been fixed, so that the name is truncated to 35 characters when saved in abasubsystems. Since the change is in the stored procedure, you need to deploy this procedure to benefit from the fix. Changes in DBBUILD: ‑[no]insert permits you to specify whether INSERT-files are to be loaded, independently of the ‑bcp option. Thus, you can now load both INSERT-files and BCP files in the same build. Default is ‑insert, unless you specify ‑bcp, in which case ‑noinsert is the default. Changes in DBUPDGEN and the update scripts generated by DBUPDGEN: aba_check_column() permits you to easily check whether a table-update may have been run by a previous update. There is a new aba_tblfkey2() as an alternative for foreign-key moves in table-updates. See point 4 in the comments to the sample script. LANGINSGEN now supports multi-column keys.
When using a config-file, and searching for the labels of the projects in the file where there is no explicit version specification, the default behaviour of AbaPerls is now to consider only labels that match the standard AbaPerls label format. To permit you to override this behaviour, there are two new configuration options. Bug fix: when DBUPDGEN was reading a table definition to generate a table-update, it failed to recognize CREATE TABLE if these words were in lowercase. Big news for the AbaPerls file-loading process: AbaPerls now replaces user-defined data types with their definitions in temp tables and table variables, thereby relieving you of the need to have user-defined data types in the tempdb and model databases. As a consequence of this, DBBUILD no longer loads user-defined data types in these databases, and neither does DBUPDGEN generate code for loading types in tempdb and model. AbaPerls now adds a COLLATE clause for all character (char, varchar, nchar, nvarchar, text and ntext) columns in temp tables and table variables, for the default collation of the database. This permits you to mix databases with different collations on the same server, and freely move a database from one server to another, without thinking of collation issues. (Obviously, to benefit from this change you need to reload the code in the database.) See the file-load page for an example. AbaPerls now checks that the objects in a file agree with the name of the file and issues a warning if they do not. Likewise, AbaPerls issues a warning if a file contains an object which does not match the file extension according to the AbaPerls SQL Directory Structure. Object checking now also finds objects in other databases, albeit still with some restrictions. See Known Issues for details. You need to install the user-defined function list_to_tbl and stored procedure ap_check_existence_sp in the database for the improved checking to work. Else object checking will revert to the old behaviour.
INSFILGEN: Added property Postlude to the Config sheet. New tool NEWSUBSYSVER that packages the procedure to create new version directories. During file-load, AbaPerls now creates a temp table #current$subsystem(subsystem varchar(80) NULL) which holds the name of the current subsystem. When DBUPDGEN generates code for a table-update, it now adds a check that the number of rows in the new table is the same as in the old table. See further the section on table updates in the article on DBUPDGEN. Fixed the header generated by INSFILGEN, so that it does not include the entry from the check-in of INSFILGEN itself. The second public release of AbaPerls, L1.0.0021. The changes down to 2003-11-03 are the changes since the first public release. Pay particular attention to the changes of 2003-01-22 and 2002-12-05, since they include significant functional changes. An option did not work properly when you used the keywords AFTER or INSTEAD OF in a trigger declaration. If a table file had been renamed, DBUPDGEN did not include that file in OBSOLETE-FILES. This had the effect that the objects in the file were not deleted from abasubsystems. Note that if a file is renamed, but the objects are not, this could lead to the objects being dropped from the database entirely, if the file is loaded in a table-update that appears before OBSOLETE-FILES. Documentation fix: if there is a site-specific version of a file, and the main file changes, DBUPDGEN includes all the site-specific versions in the file. This is not a new feature, but the page for DBUPDGEN failed to mention this. The update scripts generated by DBUPDGEN now create a temp table #update$script that you can test for in triggers, to disable parts of the triggers when running an update script. Use it like this: IF object_id('tempdb..#update$script') IS NULL ... There are several changes to the AbaPerls system tables.
The main purpose of the changes is to permit two subsystems to have labels with the same Major, Middle and Minor, but different letter and/or leading zeroes. The new column abasubsystems.sortorder permits you to define a sort order so that you can view abasubsystems and get the same order as in the config-file. Other database changes are mainly of internal nature. See the database documentation for details. To update the ABAPERLS subsystem to the new format, run the script ap_update_1.0.0020.pl which is in the SQL/SCRIPTS directory of the AbaPerls installation. You run this script as you run any other update script generated by DBUPDGEN. The database update is not mandatory; that is, the tools will run against both the old format of the database (L1.0.0010) and the new format (L1.0.0020). When using the ‑HTML switch with SSGREP, SSGREP did not escape characters such as <, > and &, which have a specific meaning in HTML. The same problem still exists in SSREPLACE. Bug fix in INSFILGEN: when more than one single quote appeared in a value, the generated SQL would be syntactically incorrect. Some minor changes were made to TBLFIX so that it would handle scripts from PowerDesigner 9.5 without flaws. New options, among them ‑Aquoted, permit you to control the ANSI-related settings. By default all such settings are on, except ANSI_NULLS and ANSI_PADDING. There is a new option ‑quoterepl which causes AbaPerls to replace double quotes with single quotes as string delimiters before passing the code to SQL Server. It is on by default. AbaPerls attempts to handle identifiers quoted with "", but there are still many situations where this does not work. See this known issue for details. Add the parameter @replace to your calls to sp_addmessage. See Books Online for details. This is a consequence of ‑quoterepl also being on by default. However, if you have dynamic SQL strings like this: EXEC ('SELECT "Ciao!"') – that is, the inner string is delimited by double quotes – this statement will fail if it appears in a stored procedure that was loaded with the default settings in AbaPerls.
Flipping the quote style remedies the problem. This is further discussed under Known Issues. When you used the ‑get option with DBBUILD and the update scripts generated by DBUPDGEN, the file was not fetched from SourceSafe if it already was on disk, even if this was a different version than the one that should be in the install kit. When you passed ‑label LATEST to DBBUILD, the first configuration option for a subsystem was ignored, if it was on the same line as the subsystem definition. The ‑subscriber option in the database was ignored, and the command line took precedence. DBBUILD and the update scripts generated by DBUPDGEN now write a header to the log file with information about who ran the script, when, and which command-line options were in use. (Old update scripts will not print command-line options, though. Regenerate them to get this information.) The header is included in the output from LISTERRS. First public release of AbaPerls. Copyright © 1996-2021, Erland Sommarskog SQL-Konsult AB. All rights reserved. AbaPerls is available under the Perl Artistic License. This page last updated 22-04-23 12:31
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500250.51/warc/CC-MAIN-20230205063441-20230205093441-00100.warc.gz
CC-MAIN-2023-06
64,704
544
https://mailman.cchmc.org/postorius/lists/tftools.mailman.cchmc.org/
code
User discussion list for Weirauch Lab analysis tools. A discussion list for questions, problems, suggestions, and notifications of scheduled downtime for Weirauch Lab analysis tools, including tf.cchmc.org, RELI, MARIO, and NextGenAligner. To contact the list owners, use the following email address: email@example.com. Subscription / Unsubscription: To subscribe or unsubscribe from this list, please log in first. If you have not previously logged in, you may need to set up an account with the appropriate email address. You can also subscribe without creating an account. If you wish to do so, please use the form below.
s3://commoncrawl/crawl-data/CC-MAIN-2022-21/segments/1652663016853.88/warc/CC-MAIN-20220528123744-20220528153744-00646.warc.gz
CC-MAIN-2022-21
620
6
https://giraict.rw/product/op-laptop-bagpack-18/
code
- DURABLE EXTERIOR: Made of tear-resistant and anti-wrinkle polyester material with a waterproof zipper for durability and protection. - FUNCTIONAL INTERIOR: Lined with TPU scratch-resistant padding for safe and easy storage. - LARGE CAPACITY: Unique military design allows an assortment of compatible pouches and accessories to be attached. - COMFORTABLE DESIGN: Thick padded mesh provides shoulder and back support while remaining breathable. - COMPACT COMPARTMENT: Take gaming and work anywhere with a dedicated slot for your Razer Blade 18 and laptops up to 18”
s3://commoncrawl/crawl-data/CC-MAIN-2023-23/segments/1685224646076.50/warc/CC-MAIN-20230530163210-20230530193210-00032.warc.gz
CC-MAIN-2023-23
565
5
https://www.software.jobs/jobs/american-cybersystems-inc-acs-group-/sr-net-mvc-angular-bootstrap-developer/1545407528048897289
code
Proven experience as a .NET Developer or Application Developer - Good in Angular 2 and above, Angular MVC - Huge plus with Angular 5 - Strong experience with Bootstrap - Familiarity with architecture styles/APIs (Web API/REST API) - Familiarity with the ASP.NET framework, SQL Server - Knowledge of .NET languages (e.g. C#, Visual Studio .NET), HTML, Bootstrap - Understanding of Agile methodologies - Prior experience of working on P&C Insurance - Excellent troubleshooting and communication skills. 10+ years of experience as lead and developer in implementing large-scale IT systems, 15+ years of development experience. Proficient with Microsoft .NET development using C#. Experience with web development technologies including ASP.NET, WCF. Experience with database development including relational database design, SQL technologies. Experience with agile development methodologies. Strong understanding of object-oriented concepts, design patterns, and architectural patterns. Estimation techniques (WBS, FP, use case and their complexity-level decisions). Experience with data security such as tokenization, encryption, masking etc. is good to have and preferred. Able to communicate clearly and concisely, both orally and in writing, to business and technology stakeholders. Ability to multitask and work effectively with little supervision; ability to coordinate onsite and offsite resources, deliverables and activities; ability to work with the client and manage their expectations and priorities. Bachelor's degree or equivalent in Business Management, Computer Science, Information Systems or related field
s3://commoncrawl/crawl-data/CC-MAIN-2019-13/segments/1552912201882.11/warc/CC-MAIN-20190319012213-20190319034213-00061.warc.gz
CC-MAIN-2019-13
1,637
23
http://aboutfamilytreemaker.blogspot.com/2015/04/
code
Saturday, April 4, 2015 For splitting files or deleting unwanted individuals, families, or branches, the report I see recommended most often is the Extended Family Chart. The various forums of FTM and Ancestry repeatedly refer to this as the best tool available. So one would think that this report would be one of the easiest for users to run. Certainly not for this user, and I have been complaining about it for a very long time. Admittedly I am probably not a typical user. I have been using FTM for nearly 20 years and my approach to genealogy is not that of a real genealogist. I do the vast majority of my research online and am not rigorous about who I put into my database. If the data for an individual, a family or a longer lineage looks as if it fits into my tree, that is usually sufficient for me to add as much as I can capture to my tree. Consequently my database is approaching 60,000 individuals. FTM has proven to be horrid at managing a database of this size, and 2014 is no exception. Reports take a very long time, often hours, to produce, and it is not uncommon for internal errors to occur and for the system to automatically abort. When it does work, I can literally sit in front of my computer and visualize the system slowly walking the paths of the database to collect data, or even to reposition the database following a request I made. When I was working for a living I was a computer consultant and database manager, so I am quite familiar with the inner workings of databases, especially databases that manage files with millions of records. For some of my clients and users, 60,000 records was too small of a file to even test with. When I request a report from FTM, it immediately starts running that report using the last settings of that report. If those settings are not what I want this time around, that means I must wait for the report to run so I can change the settings to the way I want them this time. There is NO alternative.
So I have learned that once I have successfully run a report, I carefully reset the settings to use the smallest amount of data the next time the report is opened. That way I do not have to wait an extended amount of time for a meaningless report to finish. However, if I happen to be running a major report like the Extended Family Chart using all of my data and the system fails and automatically aborts, the problem does not end there. Once there is an abort, when I return to the reports system, the Publish option, it immediately returns to the failed report and begins processing automatically. Oh how nice, the little box showing that the system is processing has a Cancel button. The problem is, it does not cancel immediately. As near as I can tell, it continues to walk the database collecting all the data, another hour or more, and only when it begins to format that data for printing does it accept the cancel button. It certainly does NOT immediately cancel the process. No corporate user would long tolerate this from their system, and they would demand that it be fixed. I have been at corporations where, when the vendor of the software failed to do that, management switched to a different vendor and different software. The problem that we FTM users have is that Ancestry has been very successful at cornering the genealogy market, to the point where there is little or no competition. So we have to take what the Advertising Department is willing to provide us. And it is a very small bone that they throw at us. I say Advertising Department because no Chief Information Officer would ignore the technical and user issues that I and so many others encounter. But since FTM is not for internal use but is for the public to purchase and use, there is no need for a CIO to manage and decide on technical priorities. The priorities of Ancestry and Family Tree Maker are entirely different than those of users like myself and others.
I want a functional, efficient and working database with all of the associated tools that one finds with a database. Their priority is to make lots of money.
s3://commoncrawl/crawl-data/CC-MAIN-2018-22/segments/1526794868003.97/warc/CC-MAIN-20180527044401-20180527064401-00046.warc.gz
CC-MAIN-2018-22
4,047
13
http://pseudotheos.com/view_object.php?object_id=1092
code
I should begin by stating what I do believe. I'm a Christian. I believe that God defines good and evil. I believe (for now) that you can define Good and Evil for yourself (just like you can define X). Go ahead. Try. See? Good job. It still doesn't matter. What matters is that God is more powerful than you are (assuming he exists) and that he'll Fedex you to hell if you don't "kiss the Son." Are we surprised that it comes down to power *again?* Just so you know, I believe in Good and Evil. I'm no sociopath. The religious people (those of us who are able to think well) look at moral relativism with fear and disgust. It's a creed nobody has to obey. It's a set of laws which, if violated, obligate the maker to feel bad about stuff. The self-made morality is as effective as self-made religion. It's like...it's like........the UN. All talk. No power. *grins at his constant prodding of the UN* Naturally, moral relativists look at oldthinkful (pardon me but I just read 1984, and this bit of Newspeak is perfect because moral relativism was a necessity for good doublethink) people like myself with disgust as well. We get our creed from a book! An old book! An old book written by some old people a long time ago. Silly. Doubleplus ungoodthinkful. That's funny, because our worst case scenario is their best case scenario. If all our beliefs are false, we're merely people who follow someone else's creed. Moral relativism cannot hope to achieve any universality of creed. Nazis are okay as long as they are okay with what they are doing. Moral relativism is unenforceable on anyone but yourself. Now, I shouldn't ignore the super silly and super common version of moral relativism which attempts to create (pay attention to the fact that someone had to invent this axiom....) one big objective moral axiom (an objective moral axiom that at one moment was not even thought of and the next moment had been thought of by a person....). Usually it revolves around not hurting other people.
Let's pretend for a second that we could all agree on what "hurting" and "people" mean. Where did this axiom come from? If people were not around, would this moral system still hold? No? So when the first person (maybe second person) came into being (however that happened) this axiom merely became true? And so what? If I do something which is "wrong" what would that mean? Am I obligated to feel bad about it? No. Am I obligated to repent? No. Am I obligated to do anything at all? No. Moral relativism is baseless because it has no base. Someone thought it up. Others agreed that it was true. I could say "Sin is when you do anything which doesn't help you." That would have a far greater basis on human life than this nonsense. At least then it would be a good guide for you to follow if you wanted to live and thrive and gain power. So should we be surprised in this morally relativistic world that no one listens to us? Should I be surprised that some people don't care that the government takes more from some people than it does from others? Should you be surprised that some people murder without flinching? That politicians band together and spend tons of money for their own state simply because they can? Such people merely have different moral standards than yourself. What's new? Judge them if you like. It doesn't matter. Kill them if you like. It doesn't matter if morality is like art. Expect natural consequences. But that was already true wasn't it? If you lie in your job interview and are not caught, the natural consequence will be that you will have a better chance at that big job. If you cheat on your wife, and you get caught, the natural consequence will be ...whatever it is. I dunno. People just don't police themselves so well. So don't complain to me that somewhere out there, a homosexual is crying because he or she can't marry the person they love. That'd be like me telling a moral relativist that Jesus loves them and has a wonderful plan for his or her life.
Just makes your skin crawl don't it? Good for getting people to roll their eyes. Not much more (unless you're already a Christian of course). Do you get it yet? It either matters or it doesn't. If it matters, let's figure out what the moral law is and why. If not, then get used to crying in the dark. Because nobody cares what you think if you can't back it. Nobody cares what you define for yourself. Do we care what color the missile is painted? Why should we care what it hits then? Saying to the sociopath "But...you might kill people" is like saying to me "Oh goodness that's just *awful*...what you're doing with those colors is *wrong*!" So prove him wrong already. Oh wait, you're too busy killing him for his "crimes." See? It comes down to power again. The government is "god" and the sociopath is the sinner. Except this time you like it because "god" likes you and wants to protect you. Good of you. After reading all that, ask yourself two or three questions. 1) If we destroyed the entire world and everyone in it, what will have been lost which was truly valuable (and would those things become less valuable really considering that you no longer exist)? 2) Would it be wrong if the person that did it thought it was the only way to purge the world of evil? Oh go play video games, I can't believe you're still reading this. You probably agree with me or something. :) No wait, I just thought of a great way to put it. Imagine that you have to define a set of words to establish a dialog with someone. Imagine if every sentence you used to define these words confused the person further because those sentences were comprised of words as well. The conversation grows more and more detailed and becomes more about words than about anything else.
Eventually, you begin to run out of words because every time you re-use a word your friend insists that you are using cyclical definitions (like, a cow is an animal, an animal is a living thing, a living thing is...well..you know...they move and breathe and reproduce and.......you know....like a cow). Eventually you give up because you realize that language is a cyclical system; the whole thing is man-made and is completely subjective. You can't prove to this person what any one word means because all the words you use to prove it will need definitions themselves. A very similar problem is inherent in moral relativism. It lacks authority the same way my language lacks authority. I can't tell you what "courage" means. If we disagree and neither of us can convince the other, neither of us is right. One of us just has more people on our side, which means very little. Nothing man makes can escape this problem unless it touches Reality. Like buildings, which have to follow the laws of physics and math (which we did not make up but rather discovered). Philosophy, art, literature...without an objective and all-knowing beholder, these things have no value in and of themselves. Now you can go play video games.
s3://commoncrawl/crawl-data/CC-MAIN-2022-21/segments/1652662512249.16/warc/CC-MAIN-20220516204516-20220516234516-00748.warc.gz
CC-MAIN-2022-21
6,967
15
http://suzimcgowen.blogspot.com/2011/06/in-which-i-destroy-my-blog.html
code
Sigh. Trying to change my template was a terrible, terrible exercise in failure. Unless my goal was to make all my blog posts appear twice on my blog, in which case, Epic Win! Sigh. I've emailed Blogger. Someday they may even get back to me. In the meantime, I'm just hoping no one scrolls down that far. You totally scrolled, didn't you? Well, guess what? I also somehow managed to duplicate my header. Now, you're going to look up, aren't you? Update: Well, I managed to get rid of the duplicate header. The extra posts are still there, but hopefully, not as easy to see. I'll try and deal with this later.
s3://commoncrawl/crawl-data/CC-MAIN-2017-17/segments/1492917123560.51/warc/CC-MAIN-20170423031203-00325-ip-10-145-167-34.ec2.internal.warc.gz
CC-MAIN-2017-17
608
4
https://github.com/smira/psycopg2-ctypes/commit/14eac71b670194d788ce8185e27442f14ca05641
code
Report connection as closed if internal flag is set or libpq knows that connection is closed. This fixes a problem where the connection is closed on the backend side and conn.closed still reports False, while conn.fileno() would return -1, indicating that the connection is broken.
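The logic the commit message describes can be sketched roughly as follows (a minimal illustration with hypothetical names, not the actual psycopg2-ctypes code):

```python
# libpq ConnStatusType values: CONNECTION_OK is 0, CONNECTION_BAD is 1.
CONNECTION_OK, CONNECTION_BAD = 0, 1

class Connection:
    """Sketch of the fix: report the connection as closed when either the
    internal flag is set (by close()) or libpq reports the connection as bad."""

    def __init__(self, pq_status=CONNECTION_OK):
        self._closed = False          # internal flag, set by close()
        self._pq_status = pq_status   # stands in for libpq's PQstatus()

    @property
    def closed(self):
        # Before the fix, only self._closed was consulted, so a connection
        # dropped on the backend side still reported closed == False.
        return self._closed or self._pq_status != CONNECTION_OK

    def close(self):
        self._closed = True

conn = Connection(pq_status=CONNECTION_BAD)  # backend dropped the connection
print(conn.closed)  # True, even though close() was never called
```

The design point is simply to OR the two sources of truth together, so a backend-side disconnect is visible through the same attribute callers already check.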
s3://commoncrawl/crawl-data/CC-MAIN-2017-13/segments/1490218187717.19/warc/CC-MAIN-20170322212947-00524-ip-10-233-31-227.ec2.internal.warc.gz
CC-MAIN-2017-13
334
4
https://www.jstage.jst.go.jp/browse/tjsai/25/2/_contents/-char/ja/
code
In this work, we propose a novel multi-objective evolutionary algorithm (MOEA) which improves the search performance of MOEAs, especially for many-objective combinatorial optimization problems. Pareto dominance based MOEAs such as NSGA-II and SPEA2 have difficulty ranking solutions in the population, which noticeably deteriorates search performance as we increase the number of objectives. In the proposed method, we rank solutions by calculating Pareto partial dominance between solutions using r objective functions selected from the m objective functions, to induce appropriate selection pressure in many-objective optimization by a Pareto-based MOEA. Also, we temporarily switch the r objective functions among the mCr combinations at every interval of Ig generations, to optimize all of the objective functions throughout the entire evolution process. In this work, we use many-objective 0/1 knapsack problems to show the search performance of the proposed method and analyze its evolution behavior. Simulation results show that there is an optimum value for the number of objective functions r to be considered in the calculation of Pareto partial dominance, and for the interval (in generations) Ig, to maximize the overall search performance. Also, the search performance of the proposed method is superior to recent state-of-the-art MOEAs, i.e., IBEA, CDAS and MSOPS. Furthermore, the computational time of the proposed method is much less than that of IBEA, CDAS and MSOPS, and comparable to or sometimes less than that of NSGA-II.
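The two mechanisms the abstract describes (dominance restricted to r of the m objectives, and cycling through the mCr subsets every Ig generations) can be sketched as follows. This is a minimal illustration assuming maximization, as in 0/1 knapsack problems; the function names are illustrative, not from the paper:

```python
from itertools import combinations

def partially_dominates(a, b, objs):
    # Pareto dominance restricted to the objective indices in `objs`
    # (maximization assumed): a is no worse on all of them and strictly
    # better on at least one.
    return all(a[i] >= b[i] for i in objs) and any(a[i] > b[i] for i in objs)

def objective_subset(generation, m, r, Ig):
    # Cycle through all C(m, r) subsets of the m objectives, switching to
    # the next subset every Ig generations so that every objective is
    # eventually used during the run.
    subsets = list(combinations(range(m), r))
    return subsets[(generation // Ig) % len(subsets)]

a, b = (5, 3, 2, 4), (4, 3, 2, 4)
print(partially_dominates(a, b, objective_subset(0, m=4, r=2, Ig=10)))  # True
```

With m=4 and r=2 there are six subsets, so after 60 generations with Ig=10 the cycle wraps around to the first subset again.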
s3://commoncrawl/crawl-data/CC-MAIN-2023-23/segments/1685224649439.65/warc/CC-MAIN-20230604025306-20230604055306-00547.warc.gz
CC-MAIN-2023-23
1,504
8
https://crypto.stackexchange.com/questions/70619/is-there-a-lower-limit-on-message-length-for-signature
code
I was working on a tool that signs small messages (~20 bytes) when a question occurred about message size: What would be the risk of using extremely small/restricted input (say, 5 bytes of hexadecimal chars) messages? Since the data is a hash and replay attack is not a concern, the message wasn't originally salted. Would a signature and/or collision be easier (/easy) to generate without the private key? Or would that make the private key vulnerable in any way? In this particular case we are using Ed25519, which does double hashing on the value before encryption, so I don't see an issue for the public key, but I'm still concerned.
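For intuition on the sizes involved, here is a small stdlib-only sketch. It does not implement Ed25519; the point is only that Ed25519 hashes its input with SHA-512 before deriving the signature scalar, so the signed material has a fixed size regardless of message length (reading "5 bytes" as 5 raw bytes is my assumption):

```python
import hashlib

# A restricted message space of 5 raw bytes has 256**5 = 2**40 distinct
# values. That is small enough to enumerate, so an attacker can cover every
# *message* (e.g. for replay, if it were a concern) -- but producing a valid
# *signature* for any of them still requires the private key, or breaking
# Ed25519 itself.
space = 256 ** 5
assert space == 2 ** 40

# Ed25519 computes SHA-512 over (R || A || M) during signing, so whether M
# is 5 bytes or 5 megabytes, the scalar derivation works on a 64-byte digest.
digest = hashlib.sha512(b"\x00\x01\x02\x03\x04").digest()
assert len(digest) == 64
```

So a tiny message space makes the set of possible signed messages enumerable, but it does not make forgery easier or expose the private key.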
s3://commoncrawl/crawl-data/CC-MAIN-2021-17/segments/1618038461619.53/warc/CC-MAIN-20210417162353-20210417192353-00581.warc.gz
CC-MAIN-2021-17
637
3
https://pdfslide.us/data-analytics/storytelling-with-data-see-show-tell-engage.html
code
Stories have been recognized for their power of communication & persuasion for centuries and we need to operate at that intersection of data, visual and stories to fully harness the power of data. I take your through a short tour of the science and the art of visualization and storytelling. Then give you an introduction through examples and exemplar on the four different layers in a data-story: See - Show - Tell - Engage. Used in the session on Business Analytics and Intelligence at IIM Bangalore in July 2014. Amit Kapoor narrativeVIZ Storytelling with Data Data Visual Story * Approach Fundamentals Learn from first principles Know the science Understand the art Experiential I hear and I forget I see and I remember I do and I understand I experience and I learn (for life) Learning the Djembe Source: The Visitor - Learning the Djembe da - da - da - da tak - tak - tak 1 - 2 - 3 - 4 1 - 2 - 3 Linguistic (Verbal) Symbolic (Math-Logic) Interactive (Kinesthetic) Geometric (Visual-Spatial) Linguistic (Verbal) The Pythagoras' theorem is a relation in Euclidean geometry among the three sides of a right triangle. It states: The square of the hypotenuse (the side opposite the right angle) is equal to the sum of the squares of the other two sides. Symbolic (Math-Logic) Geometric (Visual) Book: The First Six Books of The Element of Euclid by Oliver Byrne Interactive (Kinesthetic) Source: Setosa.io - Pythagorean Source: Explorable Explanation - Bret Victor To develop a complete mind, study the science of art, the art of science. Learn how to see. Realize that everything connects to everything else. - Leonardo da Vinci Visual Thinking Spectrum Hand me the Pen I can draw but... 
I'm not visual Pointing, Waving, Grabbing, Holding, Reaching out, Dancing Smiling, Frowning, Disinterest, Concern, Full Attention, Surprise Put this there Gesture with Pen Visual Wired Brain 70% of the sensory receptors are in the eyes 50% of the brain used for visual processing 100ms to get a sense of the visual scene Visual Language While you are travelling down this road there is a chance that one or more rocks of varying size may fall from the slopes. You should be aware of this before you travel this way so that you are cautious of this particular type of hazard. vlazen (noun) Derived from the Latin verb videre, "to look, to see" The act or instance to form a mental image or picture (without an object) Visualization The act or instance to make visible or visual (with an object) Why should we be interested in visualization? Because the human visual system is a pattern seeker of enormous power and subtlety. The eye and the visual cortex of the brain form a massively parallel processor that provides the highest-bandwidth channel into human cognitive centers. At higher levels of processing, perception and cognition are closely interrelated, which is the reason why the words understanding and seeing are synonymous. Colin Ware Pattern Seekers Pattern Recognition Driving a Car Pattern Recognition Facial & Emotion Recognition Pattern Recognition CAPTCHA Completely Automated Public Turing test to tell Computers and Humans Apart Pattern Recognition Chess Go Pattern Recognition Weather Forecasts Patterns in Random Noise Choropleth maps of cancer deaths in Texas, where darker colors = more deaths. Can you spot which of the nine plots is made from a real dataset and not from under the null hypothesis of spatial independence? Source: Graphical Inference for Infovis Transformation of the symbolic into the geometric - McCormick et al. 1987 The use of computer-generated, interactive, visual representations of abstract data to amplify cognition.
- Card, Mackinlay, & Shneiderman 1999 Visualization Value of Visualization Expand memory Answer questions Find patterns See data in context Make decisions Persuade | Tell a story Share | Collaborate Inspire Exploration Explanation Expression Value of Visualization Data Tool for engagement, exploration and discovery Exploration | Interactive Source: Gramener Cricket Stats Source: Strategy& Working Capital Profiler Source: Pindecode Pincode decoder Data Stories for telling a specific and (linear) visual narrative Explanatory | Narrative Source: Hans Rosling The Joy of Stats Source: Politizane Wealth Inequality Source: Pitch Interactive Drone Attacks Data Art for visual expression, delight (and impact, insight) Exhibition | Expression Source: hint.fm/wind Wind Map Source: Aaron Koblin Flight Patterns Source: Internet Census Internet Census The ability to take data, to be able to understand it, to process it, to extract value from it, to visualize it, to communicate it, that's going to be a hugely important skill in the next decades, ... because now we really do have essentially free and ubiquitous data. So the complementary scarce factor is the ability to understand that data and extract value from it. Hal Varian, Google's Chief Economist Making Sense of Data Approach for Creating Data-Visual-Stories Design Framework Word Writer Note Frame Musician Film Maker | | | Data Artist|Datum ??? ??? ??? ???
Datum Data-Visual-Story See the Data Show the Visual Tell the Story Engage the Audience Datum Data-Stories See the Data Pattern Deviation Outlier Trend Data Abstraction Anscombe's Quartet

x1   y1    x2   y2    x3   y3    x4   y4
10.0 8.04  10.0 9.14  10.0 7.46  8.0  6.58
8.0  6.95  8.0  8.14  8.0  6.77  8.0  5.76
13.0 7.58  13.0 8.74  13.0 12.74 8.0  7.71
9.0  8.81  9.0  8.77  9.0  7.11  8.0  8.84
11.0 8.33  11.0 9.26  11.0 7.81  8.0  8.47
14.0 9.96  14.0 8.10  14.0 8.84  8.0  7.04
6.0  7.24  6.0  6.13  6.0  6.08  8.0  5.25
4.0  4.26  4.0  3.10  4.0  5.39  19.0 12.50
12.0 10.84 12.0 9.13  12.0 8.15  8.0  5.56
7.0  4.82  7.0  7.26  7.0  6.42  8.0  7.91

Anscombe's Quartet x(mean) = 9 y(mean) = 7.5 x(var) = 11 y(var) = 4.12 y = 3.00 + 0.500 x Anscombe's Quartet This is hard work "80% perspiration, 10% great idea, 10% output." - Simon Rogers See the Data Acquire Prepare Refine Explore 1 3 2 4 See the Data Acquire Prepare Refine Explore Data Wrangling Exploratory Data Analysis 1 3 2 4 Explore "Visualization gives you answers to questions you didn't know you had." - Ben Shneiderman Directed Approach ? Explore Question Insight Exploratory Approach ? Explore Question Insight Explore Visually Exploring Active Seeing Skill Building over Time Comparison, Deviations Range, Distribution: high, low, shape Ranking: big, medium, small Categorical Comparison: proportion Measurement: absolutes Context: target, average, forecast Hierarchical: category, subcategories Trends Direction: up, down or flat Optima: highs, lows Rate of Change: linear, exponential Fluctuation: seasonal, rhythm Significance: signal vs. noise Intersection: overlap, crossover Patterns, Relationships Exceptions: outliers Boundaries: highs, lows Correlation: weak, strong Association: variables, values Clusters: bunching, gaps Intersection: overlap, crossover Show the Visual Framing Transition Visual Representation How Many? When? Why? Who & What? Where? How? Portrait Dist
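The quartet's punchline — four visually different datasets with the same summary statistics — is easy to verify with the Python standard library. This sketch uses datasets I and II of the published quartet (the slide's table omits the eleventh row, restored here from Anscombe's 1973 paper):

```python
from statistics import mean, variance

# Anscombe's quartet, datasets I and II (full 11-point sets)
x  = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
y1 = [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]
y2 = [9.14, 8.14, 8.74, 8.77, 9.26, 8.10, 6.13, 3.10, 9.13, 7.26, 4.74]

def ols(x, y):
    """Least-squares slope and intercept of y on x."""
    mx, my = mean(x), mean(y)
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

for y in (y1, y2):
    slope, intercept = ols(x, y)
    # same mean, variance, and regression line for visually different data
    print(round(mean(y), 2), round(variance(y), 2),
          round(intercept, 2), round(slope, 3))
```

Both datasets print mean ≈ 7.5, variance ≈ 4.13, and the regression line y ≈ 3.00 + 0.500x — which is exactly why plotting the data matters.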
s3://commoncrawl/crawl-data/CC-MAIN-2021-43/segments/1634323588398.42/warc/CC-MAIN-20211028162638-20211028192638-00687.warc.gz
CC-MAIN-2021-43
6,953
2
https://wiki.alquds.edu/?query=User:Jaypp86/sandbox
code
One classic example of a problem tackled with approximation algorithms is the Knapsack problem. In the Knapsack problem there is a given set of items, each with some weight and some value. The goal is to pack the knapsack to get the maximum total value, under the constraint that the total weight carried is no more than some fixed number X. So we must consider the weights of items as well as their values.
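The 0/1 knapsack described above also has an exact dynamic-programming solution when the weight limit X is an integer; a minimal sketch:

```python
def knapsack(weights, values, capacity):
    """Exact 0/1 knapsack via DP over capacities, O(n * capacity)."""
    dp = [0] * (capacity + 1)  # dp[c] = best value achievable with capacity c
    for w, v in zip(weights, values):
        # iterate capacities in reverse so each item is used at most once
        for c in range(capacity, w - 1, -1):
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[capacity]
```

When capacities are huge, this pseudo-polynomial DP becomes impractical, and greedy heuristics or FPTAS approximation schemes are the usual resort.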
s3://commoncrawl/crawl-data/CC-MAIN-2023-23/segments/1685224652149.61/warc/CC-MAIN-20230605153700-20230605183700-00002.warc.gz
CC-MAIN-2023-23
751
4
https://searchdomino.techtarget.com/answer/How-do-I-pass-attachments-from-one-form-to-another-using-LS
code
Dim s As New notessession
Dim db As notesdatabase
Dim docT As notesdocument
Dim uiw As New notesuiworkspace
Dim uidoc As notesuidocument
Dim tcategory As String
Dim tattach As NotesItem

Set db = s.currentdatabase
Set uidoc = uiw.currentdocument
Call uidoc.refresh
tcategory = uidoc.fieldgettext ("tcategory")
Set tattach = docT.GetFirstItem ( "tattach" )
Set docT = db.createdocument
With docT
    .form = "todo"
    .category = tcategory
    Call docT.CopyItem( tattach, "attach" )
End With

CopyItem is a LotusScript method for copying an item of the type NotesItem. A NotesItem is only available on a background document. Call docT.CopyItem( tattach, "attach" ) is OK except for the bit where it says tattach. Tattach is not a reference to a NotesItem, it is a field on the foreground document -- the UIdoc. What you need to do is something like this:

Call uidoc.Save
Dim backDoc As Notesdocument
Set backDoc = uidoc.Document
Dim tattachItem As NotesItem
Set tattachItem = backDoc.GetFirstItem("tattach")
Call docT.CopyItem(tattachItem, "attach" )

I believe that this would do the trick. Do you have comments on this Ask the Expert Q&A? Let us know.
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320301670.75/warc/CC-MAIN-20220120005715-20220120035715-00483.warc.gz
CC-MAIN-2022-05
1,139
7
https://gov.gitcoin.co/t/grandiose-a-gitcoin-grant-discovery-and-exploration-tool/149/4
code
Grandiose is a Gitcoin grant discovery and exploration tool, aiding users in their quest to find and donate to grants and give an insight to what is present in the grants ecosystem. This is done by experimenting with novel ways to gamify the search for new grants. The app can be found over at https://grandiose.app. The project’s focus is currently on delivering a minimal viable product. The following features are currently live: - An overview of all grants (using a test subset at the moment) - A like / dislike matching algorithm - Grant of the Day - Some cute KPIs / stats Core features that are not yet done but are considered a requirement for the MVP: - Shopping cart with support for zkSync. There’s no use in using Grandiose if you still have to leave the app and go to the Gitcoin website where you manually have to find your grants again. - Collections. A great feature present in the current version of Gitcoin. People want to share what they like. - Flash grants. Yeeted the idea from the flash deals you can find in some online game stores. You get shown 4 different grants, and every 6 hours one of the 4 grants gets (randomly?) replaced by a different one. - Other leaderboard types / KPIs. Not a hard requirement though. With these features I feel like the app would be ready to offer users a proper and satisfying first experience. There’s no possible way this can be completed by GR10, meaning I’ll have until GR11 which should give some extra time. This is a side-project I work on after my normal working hours, so that extra time is a nice boon. There’s two reasons why I’ve decided to create this app. The first reason is that whenever a new funding round kicks off, people are presented and overwhelmed with an enormous collection of grants they might contribute to. You can search by terms and tags, but you still have to manually browse through a big list, which isn’t very fun. 
What usually happens is that people ask around, seeking grant recommendations from people they know and trust. This leads to funding becoming a popularity contest where a lot of hidden gems might not get the attention they deserve. The goal of this app is to introduce gamifications that not only should make exploration fun, but also provide a better experience overall where projects will get the attention they so rightfully deserve. The second reason is that Gitcoin wants to be a credibly neutral platform and focus on solving the problem of public goods (funding), not necessarily showcasing users what grants should be to their likings. Such algorithms could also be accused of being biased in some way or another, damaging Gitcoin’s legitimacy. This app could relieve Gitcoin from that duty. Will it be open source? Naturally. I’m planning to release the source together with the MVP when it is deemed ready. For those interested, the front-end is made with Vue + Typescript. The back-end is written in C# with .NET 5.
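The "flash grants" rotation described in the feature list is easy to prototype. This sketch assumes both the replaced slot and the replacement grant are drawn uniformly at random — my reading of the post's "(randomly?)", not a confirmed design:

```python
import random

def rotate_flash_grants(showcase, pool, rng):
    """Replace one of the showcased grants with a random grant from the pool.

    Intended to run every 6 hours. Assumes the pool is larger than the
    showcase; names and policy are assumptions from the forum post.
    """
    slot = rng.randrange(len(showcase))
    replacement = rng.choice([g for g in pool if g not in showcase])
    rotated = list(showcase)
    rotated[slot] = replacement
    return rotated
```

Excluding already-showcased grants from the replacement draw guarantees that each rotation actually changes exactly one slot.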
s3://commoncrawl/crawl-data/CC-MAIN-2021-31/segments/1627046154219.62/warc/CC-MAIN-20210801190212-20210801220212-00436.warc.gz
CC-MAIN-2021-31
2,951
19
https://sourceforge.net/directory/development/interpreters/language:java/language:fortran/
code
A native Windows port of the GNU Compiler Collection (GCC) MinGW: A native Windows port of the GNU Compiler Collection (GCC), with freely distributable import libraries and header files for building native Windows applications; includes extensions to the MSVC runtime to support C99 functionality. All of MinGW's software will execute on the 64bit Windows platforms. Proteus Cross Compiler system allows the generation and compilation of Java Code from llvm-gcc compatible languages (C/C++/fortran). The generated code will execute at up to 50% of native code.
s3://commoncrawl/crawl-data/CC-MAIN-2018-05/segments/1516084891976.74/warc/CC-MAIN-20180123131643-20180123151643-00310.warc.gz
CC-MAIN-2018-05
560
3
https://forums.gearboxsoftware.com/t/forum-reddit-shift-code/1492685
code
[quote=“tcjgame, post:12, topic:1492685”] Ahh, @JoeKGBX will this forum generally receive the same information given to Reddit? AMAs and such. I recall Riot primarily doing their AMAs and the like over on the League subreddit as opposed to their own site. Which was a little frustrating, as some of us prefer using a game’s dedicated site for finding out the latest news dealing with said game. Will that be the case for these forums? Do you guys prefer revealing information over on Reddit rather than here, if given the choice? [/quote] Not necessarily. We share official announcements via our official channels – Twitter, Facebook, websites, etc. Reddit is a community run thing so we don’t tend to use that platform to announce things. To answer your question though, we’ll announce things here on our official forums and then let the community naturally take it over to Reddit. To your point about AMAs, that’s a Reddit specific kind of thing, so those probably wouldn’t have much of a presence here. We obviously would welcome discussion of AMAs and linking to them, etc though. There has been talk of doing something similar to an AMA here on the forums. If that’s something you guys are all into, I think we could easily see to making that happen sometime soon.
s3://commoncrawl/crawl-data/CC-MAIN-2020-50/segments/1606141187753.32/warc/CC-MAIN-20201126084625-20201126114625-00147.warc.gz
CC-MAIN-2020-50
1,288
6
https://dariusztarczynski.com/level-up-your-learning-gamifying-the-process-of-self-education
code
How can I make learning more engaging with gamification? Treat learning goals as game levels. Break your learning objectives into smaller, manageable 'levels'. Each time you understand a new concept or finish a chapter, it's like progressing to the next level in a game. This helps in making the learning process less overwhelming and more rewarding. “Education is a progressive discovery of our own ignorance.” - Will Durant What can I do to make learning a new topic less daunting? Start with easy 'levels' and gradually increase difficulty. Begin your learning journey with simple concepts before moving on to complex ones, just as you would in a game. This ensures that you build a solid foundation and gain confidence as you progress. “Start where you are. Use what you have. Do what you can.” - Arthur Ashe How can I motivate myself to study regularly? Use rewards and incentives. Reward yourself when you achieve certain learning milestones. For instance, once you've finished a challenging chapter or a project, treat yourself to a break, a favorite snack, or an episode of a favorite show. “The more you praise and celebrate your life, the more there is in life to celebrate.” - Oprah Winfrey What can I do if I'm finding it hard to understand a concept? Try different 'game strategies'. Just as in a game, you might need to try different strategies if one approach isn't working. This might mean using different resources, seeking help from a mentor, or using practical applications to understand a concept better. “I have not failed. I've just found 10,000 ways that won't work.” - Thomas Edison How can technology aid my learning process? Leverage gamified learning platforms. Numerous online learning platforms incorporate gamification elements, like earning badges, points, or climbing leaderboards. These elements can make the learning process more interactive and engaging. “Technology is just a tool. 
In terms of getting the kids working together and motivating them, the teacher is the most important.” - Bill Gates What can I do to ensure I remember what I learn? Incorporate quizzes and tests as 'boss fights'. Regular testing can help reinforce memory and understanding, similar to a 'boss fight' in a game where you test your skills and knowledge. “Tests are not a plot and conspiracy against you. A test is to determine what you know, to see if you're ready for the next level (i.e., the next grade). It's a process of maturation, growth, and development.” - Kimberly Giles How can I encourage myself to explore new topics of interest? Treat them as new 'game worlds'. Each new subject or topic can be seen as a new 'game world' to explore, filled with new challenges and rewards. This approach encourages curiosity and makes learning a fun adventure. “Curiosity is the wick in the candle of learning.” - William Arthur Ward How can I stay focused on long-term learning goals? Track your progress. Keeping a record of your learning journey - like the progress bars in video games - can serve as a great motivator. Seeing how far you've come can push you to go even further. “Without continual growth and progress, such words as improvement, achievement, and success have no meaning.” - Benjamin Franklin Gamification can revolutionize the way you approach self-education, making it more fun and effective. Remember, learning is not just about reaching an end goal, but about enjoying the journey and growing along the way. So, start your game of learning today!
s3://commoncrawl/crawl-data/CC-MAIN-2023-50/segments/1700679515260.97/warc/CC-MAIN-20231211143258-20231211173258-00149.warc.gz
CC-MAIN-2023-50
3,516
25
https://kinja.com/thenerdybird/discussions
code
Wow, had not seen this yet, thanks for bringing to our attn. Btw, if anyone sees stuff we might have missed, best way to reach us is our tips email: email@example.com Don’t be an asshole. You are absolutely correct. Thank you, it’s been edited. I am still working through it all but the book requests overall have been noted! Can’t make any promises of course but we’ll do what we are able to. Thanks for the feedback. Looks like we’re having a problem with the clip. We’ll fix as soon as we can. Sorry, folks! Of course, drop it here! Someone asked “Didnt his ex or something accuse him of abuse?” All I did was drop the link to what she said. Chill.
s3://commoncrawl/crawl-data/CC-MAIN-2020-34/segments/1596439738964.20/warc/CC-MAIN-20200813073451-20200813103451-00001.warc.gz
CC-MAIN-2020-34
667
7
https://www.experts-exchange.com/questions/10049291/I-need-help-with-my-USR-Sprotster-MessagePlus.html
code
I've got a problem with my modem. I'm using a USRobotics Sportster MessagePlus. I want to use the special features, which are: receiving telephone calls and faxes when the PC is not turned on. How can I do this? How can I use the modem as an answering machine and fax receiver? What software can set up my modem? My environment: Intel Pentium 200MMX; 16gig SCSI HD (2 Quantum Atlas1, 1 Quantum GranPrix, 1 Fujitsu), Toshiba 14.4 CD-ROM (3xxx series), Adaptec Host Adapter 2940AU, Matrox Millennium 1 4mb.
s3://commoncrawl/crawl-data/CC-MAIN-2018-34/segments/1534221216724.69/warc/CC-MAIN-20180820180043-20180820200043-00020.warc.gz
CC-MAIN-2018-34
500
3
http://waves-mining.bauxitemining.pw/?bitcoin=40195
code
[$500 Donor] Milotic's High Value Item Shop [UKBT/PP/BTC/Skrill] [$250,000+ vouched!], Welcome to Milotics High Value Item Shop Payment methods are UKBT Skrill Paypal (from trusted users!) Bitcoin Buying AND Selling Spirit shields, third , RuneScape 2007 Item Exchange, I think Bitcoin will go above $1,000 and stay there permanently one day soon. Bitcoin has proven itself to be a safe store of wealth. The value fluctuates, but the technology has proven itself nearly bullet-proof. Since (I've heard) 95% of all bitcoin users reside in China, the only real threat to Bitcoin I see is the Chinese government making it illegal and prosecuting people for holding it. Adventure Quest World. Albion Online. Accounts Stack Exchange network consists of 176 Q&A communities including Stack Overflow, ... When trying to evolve Feebas into Milotic in Pokemon GO, you can either use 100 Feebas Candies to evolve it, or it says: Walk with you buddy to evolve this Pokemon. There is then a progress bar underneath, which you can fill by walking with Feebas as a buddy, and it becomes full once you've walked 20Km. I've ... Teams. Q&A for Work. Stack Overflow for Teams is a private, secure spot for you and your coworkers to find and share information. [index] This Friday's five fast facts we are heading to the waters of Pokemon and doing five fast facts about Milotic. Get a glimpse of what life is like in North Korea, a country rarely seen by foreigners. Britain's fastest snowboarder Jamie Barrow is our guide around the DP... Yo whats gooder my geodudes & nidoqueens! I got coordinates for a Feebas nest in Pokemon Go! These nests are subject to relocate over time! I will leave a link down below for the map to snipe ... Check out our introvert collection merch here: https://www.introvertpalace.com/collections/introvert-merch Original Article: https://psych2go.net/12-things-i... Zabie, el Mareep tipo Fantasma: Hidrobomba no toma prisioneros NKRT, el Beheyeem tipo Dragon/Fuego: HIDROBOMBA. NO. 
TOMA. PRISIONEROS. Hueco, el Herdier tipo Acero: Lagrimas en Giratina SuperTails ...
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320301217.83/warc/CC-MAIN-20220119003144-20220119033144-00011.warc.gz
CC-MAIN-2022-05
2,089
3
http://www.flightsim.com/vbfs/showthread.php?101659-One-more-Delta-737(screenshot-posting-is-addictive)
code
Nice pics. Is that wing view default with that model? [font size="0.5" color="red"]Stefan van Hierden | msn: firstname.lastname@example.org c you guys around next time in the wonderful world of "flying"[/font] Yes it is. If u want the wing view I got, u have to have Active Camera Pro. Thanx btw
s3://commoncrawl/crawl-data/CC-MAIN-2013-48/segments/1386164992771/warc/CC-MAIN-20131204134952-00038-ip-10-33-133-15.ec2.internal.warc.gz
CC-MAIN-2013-48
296
4
http://www.symantec.com/connect/symantec-blogs/symantec-management-platform-%28smp%29-developers/2901/all/all/all/all?device=mobile
code
SMP Developer Team Blog We’re here to support you in developing solutions that integrate with the Symantec Management Platform. Watch this blog for our posts on up-to-date topics that affect your solutions. We also “blog” answers to questions that are common to most solution developers, and we also post information about urgent solution-development-related topics. Updated: SMP Dev Team 18 Nov 2009
s3://commoncrawl/crawl-data/CC-MAIN-2013-20/segments/1368708142617/warc/CC-MAIN-20130516124222-00095-ip-10-60-113-184.ec2.internal.warc.gz
CC-MAIN-2013-20
407
3
http://mailman.isi.edu/pipermail/ns-developers/2008-October/004798.html
code
[Ns-developers] build system proposal gjcarneiro at gmail.com Tue Oct 7 04:05:02 PDT 2008 2008/10/7 Gustavo Carneiro <gjcarneiro at gmail.com> > 2008/10/7 Tom Henderson <tomh at tomh.org> > Mathieu Lacage wrote: >>> comments below, >>> On Tue, 2008-09-30 at 16:36 +0000, Tom Henderson wrote: >>>> After invoking download.py, it would look something like this: >>>> At any time when invoked later, download.py will synchronize all >>>> repos, or if passed an argument or set of arguments, will synchronize >>>> all of >>>> the named repos only. Note: Craig has suggested that maybe a separate >>>> script called "sync.py" or something like that could be used to do the >>>> later synchronization, and not reuse "download.py". >>> I would support instead making download.py require a mandatory argument. >>> ./download.py --> do nothing, print help output. >>> ./download.py all --> download everything >>> ./download.py nsc --> download nsc >>> ./download.py nsc ns-3 --> download nsc and ns-3. >>> To update ns-3-dev, a developer in the ns-3-dev directory may either: >>>> i) hg pull (as usual) >>>> ii) ../sync.py ns-3-dev (same outcome as i) >>>> iii) ../sync.py in which case everything is synchronized >>> I would suggest again "../sync.py all" to avoid spurious unwanted >> Any other comments on this proposal? If not, we'll take steps to >> implement it with Mathieu's additional suggestion. > Just one comment. I really _really_ think you should rename install.py to > build.py. "Installation" usually means something completely different than > what you want this script to do. I don't care if NS 2 has an install > script, if NS 3 is not to be system installed then we shouldn't call the > script install. > Otherwise it sounds fine in principle. Well, just a couple more of > details: 1. I would make the allinone dist a python/shell script instead of > waf wscript (otherwise waf might get confused, god knows it's not hard, by > the nested waf project), 2. 
I'm not sure how easy it will be to figure out > the version of each external component to rename the folders for the > allinone dist. OK, I have one more comment. Right now, when I make pybindgen support something new I can rescan the header files and commit the ns3_module_*.py API definition files. Since a newer pybindgen is required to "understand" the new features being requested, I also change the requested pybindgen version. This means that a unknowing developer will try to build ns-3-dev and then waf will automatically reconfigure the project, notice the too old pybindgen, and try to update pybindgen. If the new allinone build system does not take care of this problem then I'm afraid we will see occasional build problems where people blame pybindgen and I have to tell them to One way to cope with this problem is to keep the pybdingen detection in the ns-3-dev wscript, except that it will just disable python if pybindgen is too old and not try to download anything. Also the download.py script would have to peek into ns-3-dev/bindings/python/wscript and grab the value of REQUIRED_PYBINDGEN_VERSION, since it is a bad idea to have "constants" defined in two different places at the same time. Gustavo J. A. M. Carneiro INESC Porto, Telecommunications and Multimedia Unit "The universe is always one step beyond logic." -- Frank Herbert More information about the Ns-developers
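The proposed interface for download.py — do nothing without arguments, `all` to fetch everything, otherwise named repos — can be mocked up with argparse. The repo list here is a placeholder for illustration, not the project's real set:

```python
import argparse

KNOWN_REPOS = ("ns-3-dev", "nsc", "pybindgen")  # placeholder list

def make_parser():
    """CLI sketch: a mandatory target list, as Mathieu suggested."""
    p = argparse.ArgumentParser(prog="download.py")
    p.add_argument("targets", nargs="+",
                   help="'all' to download everything, or one or more repo names")
    return p

def resolve(targets):
    """Expand 'all' into every known repo and reject unknown names."""
    if targets == ["all"]:
        return list(KNOWN_REPOS)
    unknown = [t for t in targets if t not in KNOWN_REPOS]
    if unknown:
        raise ValueError(f"unknown repos: {unknown}")
    return targets
```

With `nargs="+"`, invoking the script with no arguments fails with a usage message, which matches the "mandatory argument, print help output" behavior in the thread.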
s3://commoncrawl/crawl-data/CC-MAIN-2013-20/segments/1368705043997/warc/CC-MAIN-20130516115043-00047-ip-10-60-113-184.ec2.internal.warc.gz
CC-MAIN-2013-20
3,366
57
https://origin.geeksforgeeks.org/php-output_add_rewrite_var-function/
code
PHP | output_add_rewrite_var() Function The output_add_rewrite_var() function is an inbuilt function in PHP which is used as an output control function to add values to the URL rewriter. This function adds another name/value pair to the URL rewrite mechanism. The name and value will be added to URLs (as a GET parameter) and to forms (as hidden input fields), the same way a session ID is added when transparent URL rewriting is enabled with session.use_trans_sid. The behaviour of this function is controlled by the url_rewriter.tags and url_rewriter.hosts php.ini parameters. In later versions a dedicated output buffer is used; url_rewriter.tags is used solely for output functions, and url_rewriter.hosts is added. Note: Calling the output_add_rewrite_var() function starts output buffering implicitly, even if it is not already active. Syntax: bool output_add_rewrite_var( string $name, string $value ) - $name: It holds the variable name in string format. - $value: It holds the value of the variable in string format. Return Value: This function returns TRUE on success and FALSE on failure. Reference: https://www.php.net/manual/en/function.output-add-rewrite-var.php
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296945472.93/warc/CC-MAIN-20230326111045-20230326141045-00755.warc.gz
CC-MAIN-2023-14
1,284
9
https://climate-dynamics.org/research/
code
Earth’s surface climate is controlled by turbulent atmospheric dynamics. Turbulent dynamics on scales of tens of meters and less control the distribution and properties of clouds, which regulate the energy input into the climate system. Macroturbulence on scales of thousands of kilometers transports heat, angular momentum, and water vapor and thereby shapes the distributions of surface temperatures, surface winds, and precipitation. So any theory of climate must build upon a theory of atmospheric turbulence across a vast range of scales. We use observational data and simulations to develop theories of how turbulent atmospheric dynamics shape climatic features. Questions we are addressing include: How do storminess and precipitation change as atmospheric greenhouse gas concentrations increase? How do monsoons and the intertropical convergence zone respond to the changes in insolation that accompany variations in Earth’s orbit around the sun? How does cloudiness change with climate, and how does that amplify or dampen the climate system’s response to perturbations? Our goal, in short, is to develop a set of fundamental physical laws governing climate. Progress toward this goal helps us understand and interpret the climate changes that occurred over our planet’s history and that are likely to occur in the future. We also strive to translate such progress into improvement of climate and weather forecasting models. Both an introductory lecture on Grand Challenges in Climate Dynamics, which summarizes some of our past and ongoing research on Earth’s climate, as well as a Watson Lecture on Where the Wind Comes From, with a historical perspective, are available on our Talks video page.
s3://commoncrawl/crawl-data/CC-MAIN-2024-10/segments/1707947474669.36/warc/CC-MAIN-20240226225941-20240227015941-00079.warc.gz
CC-MAIN-2024-10
1,716
5
https://community.nxp.com/t5/S32-SDK/Flex-NVM-Partition/td-p/1134944
code
I am using the S32K146. The flash driver is configured (in EB Tresos) and builds successfully, but the fls_init() function throws a flash initialization error. The D-Flash size is 0 [data flash 0 KB, emulated EEPROM backup 64 KB] as per the FlexNVM partition, so the initialization fails. How do I configure the FlexNVM partition? Hello @PARVATHY , It's kinda weird. D_FLASH_SIZE should have a value no matter how you configure the FLS sector, as it's defined by which S32K derivative you choose. Could you specify which package you are using, and share the FLS and Resource configuration in *.epc with us, so we can double-check on our side? If you want to partition FlexNVM, please refer to s32k-rm page 866. But first you have to understand how to execute flash commands. Executing the Program Partition command will change the DEPART value; this is the only way to change DEPART.
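The reply above points at the Program Partition flash command as the only way to change DEPART. As a rough illustration of the command staging described in the S32K reference manual, here is a Python mock of the FCCOB byte sequence. The command code 0x80 is the documented Program Partition command, but the byte layout and the EEESIZE/DEPART values below are assumptions and placeholders; look up the real encodings in the reference manual tables for your part.

```python
# Toy mock of staging the Program Partition flash command (FTFC).
# Command code 0x80 = Program Partition; the FCCOB byte layout is an
# assumption here -- verify it against the S32K1xx reference manual.

PGMPART = 0x80  # Program Partition command code

def stage_program_partition(fccob, eeesize, depart):
    """Fill a mocked FCCOB byte array for the Program Partition
    command. Layout assumed for illustration, not taken from the RM."""
    fccob[0] = PGMPART   # command code
    fccob[1] = 0x00      # reserved / options (assumed 0)
    fccob[2] = 0x00      # reserved (assumed 0)
    fccob[3] = 0x00      # FlexRAM load option (assumed 0)
    fccob[4] = eeesize   # EEPROM data set size code (EEESIZE)
    fccob[5] = depart    # FlexNVM partition code (DEPART)
    return fccob

# Example: "all FlexNVM as EEPROM backup, D-flash 0 KB" -- the codes
# 0x02 / 0x08 are hypothetical; the real values come from the
# DEPART/EEESIZE tables in the reference manual.
fccob = stage_program_partition(bytearray(12), eeesize=0x02, depart=0x08)
```

After staging, the real sequence clears CCIF in FSTAT to launch the command and then polls CCIF until it sets again; the mock above only shows the parameter packing.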
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178362899.14/warc/CC-MAIN-20210301182445-20210301212445-00362.warc.gz
CC-MAIN-2021-10
878
10
https://topic.alibabacloud.com/zqpop/jquery-php-array_58008.html
code
Alibabacloud.com offers a wide variety of articles about jquery php array; easily find your jquery php array information here online.

Document directory

Index of an element: $("#More-Info").index("H3")

The new .has() method (has document, commit) is equivalent to the :has() filter in the selector. It takes a jQuery set and returns the elements containing the specified selector.

Accordion class: developed based on jQuery, a very simple horizontal collapse control. Horizontal accordion (jQuery popular books: www.hotbook.cn); jQuery horizontal accordion; horizontal accordion with Xbox 360 blades.

Collect commonly used jQuery plug-ins (1000): I hope you will like the jQuery plug-ins that have been accumulated for a long time. If you have any new plug-ins, please leave a message and we will record them together. Thank you for your support. 1. Accordion class: developed based on jQuery.

PHP programming notes (if you are interested in PHP tutorials, please refer to them): three methods for obtaining POST data in PHP; PHP image watermarking source code; the simplest example of PHP + Ajax + JSON; PHP Chinese character to Pinyin source code; PHP traverses a directory, generates an MD5 value for each file in the directory, and writes.

jQuery Cycle plugin: a slideshow plug-in. Multiple transition effects are supported: shuffle, zoom, fade, turndown, curtainX, and scrollRight. Gallery slideshow.

Ingrid: a jQuery DataGrid.

$.each(array, [callback]) traversal: unlike the $.each() method of a jQuery object, this method can be used to iterate over any object (not just arrays). The callback function has two parameters: the first is the member's index (or the object's key), the second is the value.
s3://commoncrawl/crawl-data/CC-MAIN-2019-39/segments/1568514573476.67/warc/CC-MAIN-20190919101533-20190919123533-00293.warc.gz
CC-MAIN-2019-39
2,574
11
https://www.yash.com/technology/oracle/fusion-middleware/oracle-webcenter/
code
Catalyze and leverage collaborative working and user engagement. We help you take your organization's web presence to the next level. With our tailored services you can channel, automate, and integrate content across social, mobile, and cloud channels better than ever. YASH has collaborated with Oracle to provide better, more complete support for businesses. YASH is a certified Oracle partner and can offer you an in-depth array of best practices to ensure you maximize ROI. Our services range from migration and upgrades to consultation and implementation of the Oracle solution: - Identification of the combination of products from Oracle’s extensive WebCenter Suite to suit your business needs. - Implementation, upgrades, migration, and support for the entire Oracle WebCenter stack, including mobile and Oracle’s Application Development Framework (ADF). - Integration and customization of your applications at lower cost and risk, while maximizing the value of your IT investments. - Proactive maintenance and support for Oracle WebCenter solutions deployed in the organization.
s3://commoncrawl/crawl-data/CC-MAIN-2020-40/segments/1600402124756.81/warc/CC-MAIN-20201001062039-20201001092039-00187.warc.gz
CC-MAIN-2020-40
1,099
6
http://www.ppdgemini.com/op,closing-to-jay-jay-the-jet-plane-vhs,ak.html
code
We support Windows 7, on 64-bit kit. Be it Microsoft Windows 10 or Flight Simulator: download Flight Simulator 2000 / X Gold. E Sky Simulator: free flight simulator software downloads. Company report & key executives for MaxFlight Corp (0525013D). Esky Flight Simulator Tutorial. Download the Flight Simulator X trial edition, which includes two airports, two missions, and three jets. Download Microsoft Flight Simulator X - Microsoft Flight Simulator X 2016.
s3://commoncrawl/crawl-data/CC-MAIN-2019-13/segments/1552912201812.2/warc/CC-MAIN-20190318232014-20190319013511-00035.warc.gz
CC-MAIN-2019-13
539
2
https://www.wob.com/en-au/books/nell-dale/computer-science-illuminated/9780763776466/GOR005797191
code
Computer Science Illuminated by Nell Dale Revised and updated with the latest information in the field, the Fourth Edition of Computer Science Illuminated continues to engage and enlighten students on the fundamental concepts and diverse capabilities of computing. Written by two of today's most respected computer science educators, Nell Dale and John Lewis, the text provides a broad overview of the many aspects of the discipline from a generic view point. Separate program language chapters are available as bundle items for those instructors who would like to explore a particular programming language with their students. The many layers of computing are thoroughly explained beginning with the information layer, working through the hardware, programming, operating systems, application, and communication layers, and ending with a discussion on the limitations of computing. Perfect for introductory computing and computer science courses, the fourth edition's thorough presentation of computing systems provides computer science majors with a solid foundation for further study, and offers non-majors a comprehensive and complete introduction to computing.
s3://commoncrawl/crawl-data/CC-MAIN-2024-10/segments/1707947476464.74/warc/CC-MAIN-20240304165127-20240304195127-00512.warc.gz
CC-MAIN-2024-10
1,165
2
http://sara-land.blogspot.com/2007/09/new-painting.html
code
This is my piece for the 'Downright' show at the PEP gallery. Opening reception is on October 5th from 8-10 pm at 64 Washington Ave, Brooklyn, NY 11205 (Near Pratt, past the BQE between Park and Flushing). It's a really cute gallery on a surprisingly happenin' block. Unfortunately I won't be there, but that shouldn't stop YOU from coming out! Here are some more images:
s3://commoncrawl/crawl-data/CC-MAIN-2017-17/segments/1492917119225.38/warc/CC-MAIN-20170423031159-00228-ip-10-145-167-34.ec2.internal.warc.gz
CC-MAIN-2017-17
369
3
https://www.tr.freelancer.com/projects/mobile-phone-android/android-developer-for-casual-gaming.2269256/
code
Peanut Gaming is an up and coming mobile gaming studio. We're developing a series of 15+ casual games (ex. Duck Shoot - [url removed, login to view]) and are looking for a talented Android game developer to work with us as a contractor on several of these. You would be responsible for developing three to five of these games working iteratively with our team of game designers and artists. We will provide high quality art assets, sound assets, detailed game specs, and our internal development team will be available to help as well. The project would ideally be a 3-month project for a set amount of time per week (20 to 40 hrs) where we will develop and perfect three to five casual mobile games together. Requirements for the project: *Significant and demonstrable Android game development experience with libgdx (or willingness to learn libgdx - [url removed, login to view]) *Efficient well structured readable code - we work with developers who take code quality seriously *Individual developers only - we strongly prefer to work directly with great developers instead of project managers *Good and regular communication - so we can stay on the same page and make great games *2-3 Month Commitment - we want to see the games through to completion If you are interested, please send us materials that will help us understand your game development capabilities and we will send additional details about the project and next steps. Please include “I AM A REAL DEVELOPER” at the top of your response to let us know you read the full project description and we'll get back to you soon!
s3://commoncrawl/crawl-data/CC-MAIN-2018-30/segments/1531676590866.65/warc/CC-MAIN-20180719105750-20180719125750-00262.warc.gz
CC-MAIN-2018-30
1,592
9
https://bugs.openjdk.java.net/browse/JDK-8025409
code
Sent a code review last week: https://sthinfra10.se.oracle.com/cru/CR-JDK8AWT-6 But then found out SPB's fixes were pushed to the awt forest, not the tl forest, and we have lots of duplicates. To avoid duplicates and future merge conflicts, I started all over in the awt forest. Sent a new code review after switching to the awt forest: https://sthinfra10.se.oracle.com/cru/CR-JDK8AWT-8 . In this code review, 501 errors/warnings are fixed.
s3://commoncrawl/crawl-data/CC-MAIN-2018-09/segments/1518891812405.3/warc/CC-MAIN-20180219052241-20180219072241-00025.warc.gz
CC-MAIN-2018-09
421
4
http://jad.shahroodut.ac.ir/article_1852.html
code
Variable environmental conditions and runtime phenomena require developers of complex business information systems to expose configuration parameters to system administrators. This allows system administrators to intervene by tuning the bottleneck configuration parameters in response to current changes or in anticipation of future changes in order to maintain the system’s performance at an optimum level. However, these manual performance tuning interventions are prone to error and lack of standards due to fatigue, varying levels of expertise and over-reliance on inaccurate predictions of future states of a business information system. As a result, the purpose of this research is to investigate how the capacity of probabilistic reasoning to handle uncertainty can be combined with the capacity of Markov chains to map stochastic environmental phenomena to ideal self-optimization actions. This was done using a comparative experimental research design that involved quantitative data collection through simulations of different algorithm variants. This provided compelling results that indicate that applying the algorithm in a distributed database system improves performance of tuning decisions under uncertainty. The improvement was quantitatively measured by a response-time latency that was 27% lower than average and a transaction throughput that was 17% higher than average.
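The abstract stays at a high level; as a hedged toy sketch (not the paper's actual algorithm), the general idea of mapping a stochastic environment to tuning actions can be illustrated with a small Markov chain whose transition matrix predicts the next load state, and an action chosen to minimize expected latency. All states, actions, and latency figures below are hypothetical.

```python
# Toy illustration: choose the tuning action that minimizes expected
# latency under a Markov model of the next load state. This is a
# generic sketch, not the algorithm evaluated in the paper.

# P[state] -> probability distribution over the next load state
P = {
    "low":  {"low": 0.7, "high": 0.3},
    "high": {"low": 0.4, "high": 0.6},
}

# latency[action][state]: hypothetical measured latencies (ms)
latency = {
    "small_buffer": {"low": 10.0, "high": 40.0},
    "large_buffer": {"low": 25.0, "high": 20.0},
}

def best_action(current_state):
    """Pick the action with the lowest latency expected over the
    Markov-predicted next state."""
    def expected(action):
        return sum(p * latency[action][s]
                   for s, p in P[current_state].items())
    return min(latency, key=expected)
```

Under low load the cheap configuration wins; under high load the robust one does, which is the kind of uncertainty-aware switch the abstract describes.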
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178369523.73/warc/CC-MAIN-20210304205238-20210304235238-00419.warc.gz
CC-MAIN-2021-10
1,395
1
http://lists.runrev.com/pipermail/use-livecode/2014-June/202856.html
code
ScreenRect bug or not ambassador at fourthworld.com Sun Jun 8 18:59:15 EDT 2014 Mark Wieder wrote: > Sunday, June 8, 2014, 11:28:28 AM, you wrote: >> A secret is a willful attempt to conceal, but we have no indication that >> anyone's doing that. >> So before anyone runs off to the hardware store to grab pitchforks for >> storming the castle over some imagined IDE conspiracy, please kindly >> take a moment to consider only what I wrote > Here is what you wrote: >> There is an IDE rewrite underway, and a very large-scope effort to >> improve overall rendering. One of the problems with my admittedly-lengthy writing style is that it can make posts too long to read - I had also written: The IDE rewrite is AFAIK very early-stage, a logical necessity from the Open Language initiative and the implications thereof related to extensibility. I imagine we'll be hearing more about it as it begins to move from sketchpad to code, but right now it's all about supporting OL so I don't believe there's much concrete that can be said about it until OL gets fleshed out more. AFAIK there is no version of the engine in any usable form that supports Open Language (on the contrary, I would imagine there are many deep design decisions still being fleshed out), so it would not be possible for the folks at RunRev to be secretly using an IDE dependent on it. As Jacque noted, the core dev team has been discussing plans for a new IDE for a long time. Evolution of features and design are an inherent part of the process for all software, and a glance at the Road Map makes it clear that it will only become increasingly necessary for RunRev as well. I just think it'll be more productive if we can discuss future development options with a presumption of good intentions. > As you know, I've been pushing for open-sourcing the IDE for over a > year now, but so far I've seen no move in that direction. 
If you're > privy to some information that the rest of us are not, then perhaps > you have a better word for it than "secret", because it's certainly > news to me. If something is merely unknown, using "unknown" may be a good choice. :) As the current acting Community Manager, the nature of the role requires me to help find ways to remove obstacles that may be preventing anyone from doing what they want to do in this open source project. To recap where we are with the IDE in terms of open source process: The IDE files are on GitHub, and even better are licensed under the very permissive MIT license: We use LiveCode because it represents a very different way of working, but that same benefit for us poses unique challenges as an open source project. As you know better than most, off-the-shelf versioning systems don't handle LiveCode's unique structure for stack files, leaving it for us to invent our own way to make that happen. Good work has been done along those lines (and a lot of that by you - thank you for helping to bring it as far as it's come), and many options exist for ways to do productive work even now, before we have an even better system in place. But ultimately the bigger issue here isn't a technical one at all, but the central challenge with all open source projects: finding people with the time and skills to contribute. The skills required go beyond just LiveCode proficiency. As with any open source project, there has to be a willingness to work within a wide range of divergent interests and goals, and a sometimes-dizzying variety of design visions. Very few of us in the LiveCode universe have much hands-on experience with this sort of process. I've made only modest contributions to the Ubuntu project (and thankfully none of them in C++ code <g>), most of LiveCode's user base makes and uses only proprietary software, and RunRev themselves have been open source just over a year. We're all learning as we go. 
It complicates things further that the nature of LiveCode stack files currently precludes us from easily using off-the-shelf systems to help support the process. But I still believe we can do it. There are some very smart, inventive people both here in the community and on the core dev team, and we all share the common vision of both sides working together productively to make the best LiveCode the world has ever seen. To help this along, we have the good fortune of having bugs in the IDE, which are of course annoying but also allow us an opportunity: if we prioritize addressing bugs in the current IDE right now, we'll not only have fewer bugs, but more importantly we will have found the team members and processes that can guide bigger objectives. This email has already gotten too long, so let me outline some of the ways we can work on the IDE today in a separate post. LiveCode Community Manager richard at livecode.org More information about the Use-livecode mailing list
s3://commoncrawl/crawl-data/CC-MAIN-2019-39/segments/1568514574182.31/warc/CC-MAIN-20190921022342-20190921044342-00161.warc.gz
CC-MAIN-2019-39
4,798
83
https://gov20watch.pepperdine.edu/2010/12/2010-gov-2-0-interviews/
code
The Digiphile blog has posted their top 20 interviews on the topic of Gov 2.0: Regardless of the quality of light, image or sound, each interview taught me something new, and I’m proud they’re all available on the Web to the public. The list below isn’t exhaustive, either. There are easily a dozen other excellent interviews on my channel on YouTube, O’Reilly Media’s YouTube channel, uStream and Livestream. Thank you to each and every person who took time to talk to me this past year. See the interviews here.
s3://commoncrawl/crawl-data/CC-MAIN-2017-39/segments/1505818696696.79/warc/CC-MAIN-20170926212817-20170926232817-00128.warc.gz
CC-MAIN-2017-39
523
3
https://ambergriscaye.com/forum/ubbthreads.php/topics/221062/lobsterfest.html
code
It would be nice if the Lobsterfest committee indeed announced officially the dates of the festival. People plan their vacations to be here for the festival and most who want to come need to reserve their flights and hotels in advance. If anyone reading this board is in the planning committee please post any updated information. Thanks and cheers, Auxillou Beach Suites Caye Caulker, Belize [This message has been edited by Auxillou Beach Suites (edited 05-17-2006).]
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764494852.95/warc/CC-MAIN-20230127001911-20230127031911-00766.warc.gz
CC-MAIN-2023-06
469
5
https://daily.jstor.org/tag/skyscrapers/
code
The Real Reason Why NYC’s Skyscrapers Are Where They Are Why does Manhattan have two separate business districts? Turns out that it's not because of the usual story about bedrock depth. On The Black Skyscraper: An Interview with Literary Critic Adrienne Brown Early skyscrapers changed the ways we see race, how we see bodies, how we perceive and make judgments about people in the world.
s3://commoncrawl/crawl-data/CC-MAIN-2023-14/segments/1679296949506.62/warc/CC-MAIN-20230330225648-20230331015648-00158.warc.gz
CC-MAIN-2023-14
390
4
https://www.launchrock.com/community/questions/6881/what-should-i-use-ready-to-use-software-or-a-custom/answers/21877
code
I am planning to develop my own freelance marketplace like Fiverr and found this ready-made Fiverr clone: https://www.logicspice.com/fiverr-clone Should I go for this or create a website from scratch? If you haven't already validated the market, then it would be best to start with something simple and the option with the cheapest costs. If you see that there is in fact a need for your service, you can always start developing your own version whilst the template version is running. If the costs of the template are close to the costs of the 'made from scratch' (MFC) version (which would be strange), then you need to take into account the following: 1. The MFC version usually takes longer. 2. The MFC version usually has additional 'surprise' costs. 3. The template version has limited ability to make changes. If you decide to go for the MFC version, make sure that: a. you have a development contract with the developers; b. you choose developers that you can trust (have recommendations); c. you don't ever pay the entire amount upfront: always pay only after each milestone has been done, and leave at least 25% of the payment for at least 2 months after they have signed over the project (in case you discover bugs); d. after every stage they send you the code. I've successfully helped over 350 entrepreneurs, startups and businesses, and I would be happy to help you. After scheduling a call, please send me some background information so that I can prepare in advance, thus giving you maximum value for your money. Take a look at the great reviews I’ve received: https://clarity.fm/assafben-david
s3://commoncrawl/crawl-data/CC-MAIN-2020-34/segments/1596439737225.57/warc/CC-MAIN-20200807202502-20200807232502-00189.warc.gz
CC-MAIN-2020-34
1,617
13
https://www.nextpit.com/forum/703719/transactions-with-in-app-billing-version-2-format-in-google-analytics
code
- Forum posts: 1 Apr 9, 2016, 9:39:51 AM via Website In my Google Analytics dashboard, I just observed that about 75% of transaction IDs follow the old format of Google Play transactions (19digits.16digits), which was used in In-App Billing Version 2 and is deprecated. These transaction IDs are not recorded in the Google Play finance report, and I don't know where those transactions come from or whether I am losing any revenue.
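To segment an Analytics export by the deprecated format described above (19 digits, a dot, then 16 digits), a simple pattern match is enough. A minimal sketch; the function name and sample IDs are made up for illustration:

```python
import re

# Deprecated In-App Billing v2 transaction IDs look like
# "<19 digits>.<16 digits>"; anything else is treated as new-format.
OLD_IAB_V2 = re.compile(r"\d{19}\.\d{16}")

def is_old_format(transaction_id: str) -> bool:
    """True if the ID matches the deprecated 19digits.16digits shape."""
    return OLD_IAB_V2.fullmatch(transaction_id) is not None
```

Splitting exported transaction IDs with this check would at least separate the two populations before reconciling them against the Play finance report.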
s3://commoncrawl/crawl-data/CC-MAIN-2020-50/segments/1606141195198.31/warc/CC-MAIN-20201128070431-20201128100431-00474.warc.gz
CC-MAIN-2020-50
476
4
https://support.modenatechnologies.com/support/solutions/articles/36000290484-uninstall-using-the-microsoft-installation-troubleshooter
code
You may have trouble reinstalling software under the following circumstances: - Some components of the previous install were not removed when you uninstalled using the Windows Control Panel. - A corrupted product installation left "residue" that prevents reinstallation. These problems can be due to drive letter changes, removal of the original install image, or other drive changes. Microsoft has a troubleshooting tool (formerly called Fix It) that can automatically solve uninstall issues and works with Windows 10, Windows 8.1, Windows 8, and Windows 7. - Visit the Microsoft Support article, Fix problems that block programs from being installed or removed. - Click the Download button. - Click Run or Open, and then follow the steps in the Program Install and Uninstall Troubleshooter. What It Fixes - Corrupted registry keys on 64-bit operating systems - Corrupted registry keys that control the update data - Problems that prevent new programs from being installed - Problems that prevent existing programs from being completely uninstalled or updated - Problems that block you from uninstalling a program through Add or Remove Programs (or Programs and Features) in Control Panel
s3://commoncrawl/crawl-data/CC-MAIN-2022-27/segments/1656103920118.49/warc/CC-MAIN-20220701034437-20220701064437-00624.warc.gz
CC-MAIN-2022-27
1,192
14
https://decisionstats.com/2011/08/16/using-two-operating-systems-for-rattle-rstats-data-mining-gui/
code
Using a virtual partition is slightly better than using a dual-boot system. That is because you can keep the specialized operating system (usually Linux) within the main operating system (usually Windows), browse and alternate between the two operating systems using a simple command, and utilize the advantages of both. Also you can create project-specific discs for enhanced security. In my (limited) Mac experience, the comparisons of each operating system are: 1) Mac - a robust and aesthetically designed OS, but the higher price and hardware lock-in for Mac remain a disadvantage. Also many stats and analytics software packages just won't work on the Mac. 2) Windows - cheaper than Mac and easier to use than Linux. Also has the most compatibility with applications (usually when not crashing). 3) Linux - the lightest and most customizable software in the OS class, free to use, and has many lite versions for newbies. Not compatible with mainstream corporate IT infrastructure as of 2011. I personally use VMware Player for creating the virtual disk (much more convenient than the wubi.exe method) from http://www.vmware.com/support/product-support/player/ (and downloadable from http://downloads.vmware.com/d/info/desktop_downloads/vmware_player/3_0). That enables me to use Ubuntu as the alternative OS, keeping my Windows 7 for some Windows-specific applications. For software like Rattle, the R data mining GUI, it helps to use two operating systems, in view of difficulties with GTK+. Installing Rattle on Windows 7 is a major pain thanks to backward compatibility issues and version issues of GTK, but it installs on Ubuntu like a breeze, and it is very convenient to switch between the two operating systems. Download Rattle from http://rattle.togaware.com/ and test it on the dual-OS arrangement to see for yourself. 
3 thoughts on “Using Two Operating Systems for RATTLE, #Rstats Data Mining GUI” Thanks for the commentary. Also note that the new versions of R/Rattle/RGtk2 make it much simpler now to install on Windows: the newest RGtk2 takes care of installing the required GTK package, so the user no longer needs to do that step themselves. I would be interested in your feedback on running Rattle on MS/Windows now. OK, let me test Rattle again. I think Rattle works OK on a new install, but removal of legacy GTK was the issue, so let me test it. Thanks for writing on this. You can run Windows on a recent Mac either as a dual-boot system or in a virtual environment. I use Parallels to run Windows so as to use Tableau and occasionally RapidMiner (I never got on with the Mac version) and it works absolutely fine. Of course what you can only do on a Mac is run OS X. As for price, there is no low-end offering, but like for like they certainly aren’t as far off now as when I got my first one 20 years ago.
s3://commoncrawl/crawl-data/CC-MAIN-2023-23/segments/1685224647895.20/warc/CC-MAIN-20230601143134-20230601173134-00599.warc.gz
CC-MAIN-2023-23
2,858
16
http://www.gsp.com/cgi-bin/man.cgi?section=1&topic=mktextfm
code
This manual page is not meant to be exhaustive. The complete documentation for this version of TeX can be found in the info file Kpathsea: A library for path searching. mktextfm is used to generate a tfm file from the METAFONT source files for font, if possible. If destdir is given, the generated file will be installed there; otherwise a (rather complicated) heuristic is used. The name of the generated file is printed on standard output. mktextfm is typically called by other programs, rather than from the command line.
s3://commoncrawl/crawl-data/CC-MAIN-2018-26/segments/1529267865081.23/warc/CC-MAIN-20180623132619-20180623152619-00171.warc.gz
CC-MAIN-2018-26
552
10
http://informationworker.ru/msnev3.en/money_general_changemoneyfromustocanadiansettings.htm
code
Canadian settings will take effect the next time you start Money. For the settings to take effect, you must have Windows Regional Options set to English (Canada). For more information on Regional Options, see Help. It is possible that Money will display some United States content (such as information on U.S. taxes) when the Canadian dollar is set as the base currency. If this happens, try changing the Country/Region setting to United States, and then back to Canada again. Follow the instructions above, but in step 4, click United States (instead of Canada). Then repeat the instructions above as written (in step 4, click Canada). You'll need to restart Money each time you go through these steps.
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500044.66/warc/CC-MAIN-20230203091020-20230203121020-00781.warc.gz
CC-MAIN-2023-06
703
4
https://stylo.alfamedia.com/en/formatting.html
code
There is no special template editor for Stylo. The interface is the same as for normal document creation, so you can create a template from an existing document in no time at all; the template is then given the appropriate administrative information in the advertisement system. In a template you can assign rights for text as well as for graphic objects. This allows you to restrict which fonts and formatting are used for text. For graphic objects, you can specify whether they may be changed in size and position or whether they can be deleted, for example. In addition, templates are usually prepared in such a way that they can easily be reformatted into other templates or used as alternative patterns.
s3://commoncrawl/crawl-data/CC-MAIN-2023-50/segments/1700679100989.75/warc/CC-MAIN-20231209233632-20231210023632-00327.warc.gz
CC-MAIN-2023-50
719
3
https://superuser.com/questions/965060/is-there-a-way-to-switch-between-tabs-in-firefox/965216
code
Is there a shortcut key to switch back and forth between two tabs in Firefox? For example, you can press CTRL + 1 to select the first tab and CTRL + 2 to select the second. What I want to do is switch between the current tab and the last selected tab. TV remotes have a similar previous or back button that does the same thing: if you only press this one button, it will switch solely between two channels each time it is pressed. I see this is called a "last" button on TV remotes. I know you can move two tabs right next to each other as tab 1 and tab 2 and then use CTRL + 1 and CTRL + 2, but I'm talking about a switch shortcut. I'm using a Mac.
s3://commoncrawl/crawl-data/CC-MAIN-2020-34/segments/1596439739177.25/warc/CC-MAIN-20200814040920-20200814070920-00515.warc.gz
CC-MAIN-2020-34
648
4
http://cogsci.stackexchange.com/questions/tagged/empathy
code
Over the years I've seen numerous articles about introverts that mention that they dislike large social interactions (meetings, parties, etc) because they drain their energy. These articles mention ... The Internet is of course full of memes from Sherlock Holmes show, based on one of the episodes having Sherlock self-diagnose: I'm not a psychopath, I'm a highly functioning sociopath But what ... Altruistic behavior can have different motivations: from the hope that the help you give will ultimately benefit yourself (social exchange theory) to a selfless wish to alleviate someone's suffering. ...
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416931010402.68/warc/CC-MAIN-20141125155650-00047-ip-10-235-23-156.ec2.internal.warc.gz
CC-MAIN-2014-49
602
3
https://ss64.com/nt/cleanmgr.html
code
Automated cleanup of Temp files, Internet files, downloaded files, recycle bin.

Syntax: CLEANMGR option

Options:
/d driveletter: Select the drive that you want Disk Cleanup to clean.
/sageset:n Display the Disk Cleanup Settings dialog box and create a registry key to store the settings you select. The n value is stored in the registry and allows you to specify different tasks for Disk Cleanup to run. n can be any integer from 0 to 65535. Specify the %systemroot% drive to see all the available options.
/sagerun:n Run task 'n'. All drives in the computer will be enumerated, and the selected profile will be run against each drive. Only one of the 3 options above can be run at a time.
/TUNEUP:n Run sageset + sagerun for the same n
/LOWDISK Run with the default settings.
/VERYLOWDISK Run with the default settings, no user prompts.
/SETUP Undocumented
/Help Undocumented
/Usage Undocumented
/? Display help

To enable Cleanmgr on Windows 2008, open Server Manager and choose Add Feature, then select "Desktop Experience". After running cleanmgr on a server you will probably want to disable "Desktop Experience" again (or use this script). When necessary cleanmgr can take ownership of the files before deleting them. Registry settings for CLEANMGR are held in:

Options that can be chosen for cleanup: Temporary Internet Files, Temporary Setup Files, Downloaded Program Files, Old Chkdsk Files, Temporary Offline Files, Compress Old Files, Catalog Files for the Content Indexer, System files: Debug/chkdsk/Installer/Memory dump/Windows update/error reporting logs. Items in bold can appear in more than one drive, i.e. not just in %SystemRoot%.

Many files in Application Data hold system data that should not be deleted; however, some applications do leave files which you can delete from a roaming profile. These can be selectively removed with a VBScript like this. The 'User Profile/Recent' folder (for Start, Documents) can contain many more shortcuts than are set to display in the GUI. 
A very large number of these can affect logon/logoff times. To clear out the shortcuts:
Close all applications.
Open a command prompt.
taskkill /im explorer
This should kill explorer and bring up a logout/shutdown dialogue box. Simultaneously press CTRL+SHIFT+ALT. While you keep these keys pressed, click [Cancel] in the Shut Down Windows dialog box.
In the command prompt window, you should now be able to delete the locked files: echo y| del *.*
At the command prompt, type explorer, and press ENTER to restart Windows Explorer.

"Mrs. Joe was a very clean housekeeper, but had an exquisite art of making her cleanliness more uncomfortable and unacceptable than dirt itself" ~ Charles Dickens

DELPROF - Delete user profiles and/or User Profile cache
DEFRAG - Defragment hard drive (XP)
Q253597 - Automating Disk Cleanup in Windows
Equivalent bash command (Linux): watch - Execute/display a program periodically
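The /sageset:n and /sagerun:n pairing lends itself to scripting. As a hedged illustration (the helper names are mine, and actually launching cleanmgr of course only works on Windows), a small Python sketch that validates the task number before building the command line:

```python
import subprocess

def cleanmgr_args(mode, n):
    """Build a cleanmgr command line for /sageset:n, /sagerun:n or /TUNEUP:n.

    The n value is stored in the registry by /sageset and replayed by
    /sagerun, so both must use the same integer in the range 0..65535.
    """
    if mode not in ("sageset", "sagerun", "TUNEUP"):
        raise ValueError("unknown cleanmgr mode: %r" % mode)
    if not 0 <= n <= 65535:
        raise ValueError("task number must be 0..65535, got %r" % n)
    return ["cleanmgr", "/%s:%d" % (mode, n)]

def run_cleanup_profile(n):
    # Run cleanmgr_args("sageset", n) interactively once to pick the
    # cleanup options; afterwards this replays them unattended.
    subprocess.run(cleanmgr_args("sagerun", n), check=True)
```

The range check mirrors the documented limit on n; anything outside it would silently fail or create a bogus registry profile.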
s3://commoncrawl/crawl-data/CC-MAIN-2018-22/segments/1526794867311.83/warc/CC-MAIN-20180526053929-20180526073929-00591.warc.gz
CC-MAIN-2018-22
2,895
33
https://www.roguides.co.in/2010/06/days-of-star-gladiator-taekwon-master.html
code
The Star Gladiator days are the days of the Sun, Moon, and Star according to the calendar. On these days a Star Gladiator gets bonuses, as described in its skills. So, for example, the Star bonus applies only on days of the Star. The days of the Sun, Moon, and Star are as follows:
Days of the Sun: 2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 24, 26, 28, 30
Days of the Moon: 1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29, 31
Days of the Star: 5, 10, 15, 20, 25, 30
Note that the day follows the time on the RO server that you are playing on. For example, if your server is located in a different time zone, it may be the 5th day for you but still the 4th day for the server; the Star Gladiator days only work according to the server's day. To check the server's time/date, use the @time command. If your server does not have such a command, check your server's website/forum or ask a GM of that server. Thanks For Reading ^_^
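The three day lists above reduce to a simple rule: even days are Sun days, odd days are Moon days, and every day divisible by 5 is also a Star day. A small Python sketch of that rule (the function name is mine):

```python
def sg_bonus_days(day):
    """Return the Star Gladiator bonuses active on a given day of the month.

    Derived from the day lists above: even days are Sun days, odd days
    are Moon days, and every day divisible by 5 is also a Star day
    (so day 5 is Moon + Star, day 10 is Sun + Star, and so on).
    """
    bonuses = ["sun"] if day % 2 == 0 else ["moon"]
    if day % 5 == 0:
        bonuses.append("star")
    return bonuses
```

Remember to feed it the server's day of the month, not your local one.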
s3://commoncrawl/crawl-data/CC-MAIN-2019-13/segments/1552912206677.94/warc/CC-MAIN-20190326220507-20190327002507-00078.warc.gz
CC-MAIN-2019-13
891
9
https://www.veritas.com/content/support/en_US/doc/77080687-140726362-0/v81106029-140726362
code
Veritas NetBackup™ Appliance Release Notes
- About NetBackup appliance 220.127.116.11
- NetBackup appliance 18.104.22.168 features, enhancements, and changes
- NetBackup Appliance Hardware features
- NetBackup Appliance compatibility
- Operational notes
- About NetBackup support utilities
- Appendix A. Release content
- Appendix B. Related documents

General NetBackup Appliance notes
The following list contains the notes and the known issues that relate to the general workings of the NetBackup Appliance:
- The order in which the 52xx storage shelves are displayed on the NetBackup Appliance Web Console or the NetBackup Appliance Shell Menu may be different than the actual order and layout in your environment. The storage shelf order is displayed in the Monitor > Hardware tab on the left pane where the storage shelves are displayed as NetBackup StorageShelf 1, NetBackup StorageShelf 2, etc. Similarly, the order of the storage shelves is displayed on the NetBackup Appliance Shell Menu when you run the Monitor > Hardware > ShowHealth command.
- If the mainboard RAID controller is removed from a 52xx appliance, the NetBackup Appliance Web Console or the NetBackup Appliance Shell Menu may still display stale data when you click on the Monitor > Hardware > Adapter tab or run the Monitor > Hardware ShowHealth Appliance Adapter command.
- If the disks on the RAID 1 volume of the 5240 appliance are missing and you run the Monitor > Hardware ShowHealth Appliance RAID command, the location of the missing disks is displayed incorrectly as slot 0. The same behavior is observed in the NetBackup Appliance Web Console when you navigate to Monitor > Hardware > RAID for the 5240 appliance. This issue applies to 5240 appliances and is observed when the disks on the RAID 1 volume of the appliance are missing.
- If you connect a Veritas Storage Shelf to a NetBackup 52xx Appliance, an AutoSupport alert with UMI code V-475-100-1004 is generated for each storage disk when the storage shelf is turned on. The following message displays: "You can either import the foreign configuration or clear the disk." You can safely ignore these alerts. If you connect the storage shelf during initial configuration, the alerts are all cleared when initial configuration is complete. If you connect the storage shelf after initial configuration, the alerts are cleared when you run the storage scan as part of installation. In this case, the NetBackup Appliance Web Console and the NetBackup Appliance Shell Menu may show incorrect data for the storage shelf for approximately five minutes after installation is complete. See the NetBackup Appliance Hardware Installation Guides for more information on installing a storage shelf.
- An appliance self-test fails if the login banner heading or a single line in the login banner message contains only the following text:
- On a NetBackup 5330 Appliance, a preferred path failure can occur when the LUN ownership fails over from one controller to another controller. In some cases, one controller can reset the other controller, which then causes a preferred path failure. When this failure happens, the Storage Status for appliance hardware monitoring displays as Not Optimal. This failure can persist for weeks at a time until cleared. If the failure is not cleared, all paths fail, and the affected controller is taken offline, resulting in loss of redundancy and performance degradation. If you encounter this issue, contact Veritas Support and have your representative reference TECH225558.
- Windows 7/8.1 clients cannot automatically access the appliance CIFS shares.
To work around this issue, run the following command from a Windows command prompt on the client: net use /user:admin \\appliance-name *, where appliance-name is the fully qualified domain name (FQDN) of the appliance. Enter your appliance administrator password at the prompt. Once you have run this command, the client is able to access the CIFS shares.
- For this release of NetBackup Appliance, Replication Director (RD) restores do not support dynamic multi-pathing (DMP) when the appliance is used as a backup or a recovery host.
- During a factory reset, the following message appears: RESET STORAGE CONFIGURATION and BACKUP DATA [Optional]. If you select no, it indicates that you want to keep the storage-related configurations. After the factory reset, when you perform the initial configuration, make sure that the sizes of the AdvancedDisk and MSDP partitions are not set to 0, otherwise the role configuration fails. If the sizes of the AdvancedDisk and MSDP partitions were set to 0 before the factory reset, they can remain at 0 when you perform the initial configuration.
- Starting with version 2.6.1, if you perform a factory reset and select to keep the network and the storage configuration settings, the error message "Cannot rollback volume" may appear. This message indicates that during the factory reset process, the appliance network configuration could not be saved. Although the factory reset has completed successfully, you must now reconfigure the network parameters. For more information, see the following tech note on the Veritas Support website:
- If you upgrade to software version 3.1.2 or later, to continue to support SAN Client backups on appliances, you must make sure that there is only one initiator and one target per zone. Each client host can only be zoned with one FT Media Server. See the NetBackup Appliance Fibre Channel Guide for more information.
- If you want to configure 16Gb Fibre Channel cards on the appliance with software version 3.1.2 or later, do not directly connect two 16Gb Fibre Channel cards. A direct connection between two 16Gb Fibre Channel cards brings the HBA link down on the target port. To avoid this issue, always build an HBA link between two 16Gb Fibre Channel HBA cards through a switch.
- After you replace 8Gb FC cards with 16Gb FC cards on an appliance that has been updated to software version 3.1.2, the "Fibre Transport Deduplication state" may not show correctly when you run the command Manage > FibreChannel > Show in the NetBackup Appliance Shell Menu. To restart the infrastructure services and refresh the "Fibre Transport Deduplication state", run the following:
Support > InfraServices Stop
Support > InfraServices Start
Manage > FibreChannel > Show
- Starting with release 3.1.1, if IPSec functionality is configured on any appliance that you plan to upgrade, the IPsec certificates may not be retained after the upgrade has completed. To avoid this issue, you must export the IPsec certificates before upgrading those appliances. For complete details, see the Veritas NetBackup Appliance Upgrade Guide.
- If your appliance has shares configured and you roll back to a user-created checkpoint, some tuning parameters for the share may be lost after the rollback has completed. This issue can affect appliance performance once you start using the share again. To resolve this issue, contact Veritas Support and refer them to article 100047636.
s3://commoncrawl/crawl-data/CC-MAIN-2021-49/segments/1637964359082.78/warc/CC-MAIN-20211201022332-20211201052332-00259.warc.gz
CC-MAIN-2021-49
6,994
43
https://board.phpbuilder.com/d/10402342-sql-select-join-then-selecting-top-row
code
I have two tables: one with People, the other with Contacts. I can join the tables via the People (PersonID) and Contacts (PersonID) columns. In the Contacts table, I have ContactDate. I want to select all People with no contact within 7 days. How? I'm getting Contacts rows which aren't the most recent rows when I join the tables and add a where clause saying ContactDate > 7 days. I guess I need the date of the most recent contact. But ultimately I only want People returned who have no contact date within 7 days.
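A common way to express "people with no contact within 7 days" is an anti-join (NOT EXISTS), which sidesteps the most-recent-row problem entirely: instead of finding each person's latest contact, you exclude anyone who has any contact in the window. A sketch using Python's sqlite3 with made-up sample data (the table and column names follow the question; everything else is illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE People (PersonID INTEGER PRIMARY KEY, Name TEXT);
    CREATE TABLE Contacts (PersonID INTEGER, ContactDate TEXT);
""")
conn.executemany("INSERT INTO People VALUES (?, ?)",
                 [(1, "Alice"), (2, "Bob"), (3, "Carol")])
# Alice was contacted today, Bob a month ago, Carol never.
conn.execute("INSERT INTO Contacts VALUES (1, date('now'))")
conn.execute("INSERT INTO Contacts VALUES (2, date('now', '-30 days'))")

# Anti-join: keep a person only if NO contact row in the last 7 days exists.
stale = [name for (name,) in conn.execute("""
    SELECT p.Name
    FROM People p
    WHERE NOT EXISTS (
        SELECT 1 FROM Contacts c
        WHERE c.PersonID = p.PersonID
          AND c.ContactDate >= date('now', '-7 days'))
    ORDER BY p.Name
""")]
print(stale)  # ['Bob', 'Carol']
```

The same NOT EXISTS shape works in most SQL dialects; only the date arithmetic (date('now', '-7 days') here) varies by database.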
s3://commoncrawl/crawl-data/CC-MAIN-2020-24/segments/1590347410745.37/warc/CC-MAIN-20200531023023-20200531053023-00187.warc.gz
CC-MAIN-2020-24
517
6
https://sciences.ucf.edu/physics/person/patrick-schelling/
code
Dr. Schelling received his PhD in Physics from the University of Minnesota, Minneapolis, in 1999. He spent two years (1999-2001) as a Postdoctoral Researcher and two years (2001-2003) as a Visiting Scientist at Argonne National Laboratory, Argonne, IL before joining UCF in 2003, where he is an Associate Professor. He works in computational modeling of electron, mass, and heat transport in materials. He also uses computation to elucidate interactions and dissipation at interfaces with applications in materials science and planetary science.
- B. D. Doan and P. K. Schelling, "Dissipation and adhesion hysteresis between (010) forsterite surfaces using molecular-dynamics simulation and the Jarzynski equality," Comp. Mat. Sci. 206, 111259 (2022) doi.org/10.1016/j.commatsci.2022.111259
- W. E. Richardson, E. R. Mucciolo, and P. K. Schelling, "Resistivity size effect due to surface steps on ruthenium thin films computed with a realistic tight-binding model," Journal of Applied Physics 130, 195108 (2021) doi.org/10.1063/5.0069046
- B. Doan, A. R. Dove, and P. K. Schelling, "Dissipation and adhesion between amorphous FeO nanoparticles," J. Aero. Sci. 155, 105742 (2021) doi.org/10.1016/j.jaerosci.2020.105742
- K. Barmak, S. Ezzat, R. Gusley, A. Jog, S. Kerdsongpanya, A. Khanya, E. Milosevic, W. Richardson, K. Sentosun, A. Zangiabadi, D. Gall, W. E. Kaden, E. R. Mucciolo, P. K. Schelling, A. C. West, and K. R. Coffey, "Epitaxial metals for interconnects beyond Cu", J. Vac. Sci. Tech. A 38, 033406 (2020) doi.org/10.1116/6.0000018
- K. Fernando and P. K. Schelling, "Non-local linear-response functions for thermal transport computed with equilibrium molecular-dynamics simulation," J. Appl. Phys. 128, 215105 (2020) doi.org/10.1063/5.0032014
Oversees the following graduate students: Antonio Martinez Margolles
s3://commoncrawl/crawl-data/CC-MAIN-2023-40/segments/1695233506676.95/warc/CC-MAIN-20230925015430-20230925045430-00529.warc.gz
CC-MAIN-2023-40
1,834
8
https://80.lv/articles/granite-sdk-3-0-is-out/
code
Graphics middleware company Graphine today announced the launch of Granite SDK 3.0, the new release of its texture streaming middleware for high-quality real-time visualization and games. It includes a major redesign of the layer and caching systems, performance optimizations, more efficient video memory usage, and a smaller size on disk. You can find more detailed information here. Granite SDK 3.0 is a major update from Granite SDK 2.6 and a significant redesign that prepares Granite for the future. The upgrade gives artists total material freedom. It also contains many optimizations that make Granite – already the most advanced and optimized streaming system – even faster. Granite SDK 3.0 is an efficient texture streaming system for any texture data in video game scenes with high pixel density. It perfectly streams 2K, 4K, 8K, 16K textures up to 256K, minimizing the video memory needed to less than 1GB. It removes loading times while allowing hundreds of PBR materials, huge material masks or massive lightmaps. Granite SDK 3.0 is available as a stand-alone SDK that can integrate into any game engine and as a plugin for Unreal Engine 4 and Unity 3D. Source: official press-release
s3://commoncrawl/crawl-data/CC-MAIN-2018-47/segments/1542039750800.95/warc/CC-MAIN-20181121193727-20181121215727-00236.warc.gz
CC-MAIN-2018-47
1,618
8
http://tnt64.blogspot.com/2007/02/rose-by-any-other-name.html
code
William Shakespeare, genius that he was, penned the immortal lines. quite unfortunately, the converse also holds: a nettle by any other name is still going to sting just as well. this is the weird beginning to today's post on the many wonders of developing on Windows, specifically DLL hell - or at least a problem much like it. for the uninformed, DLL hell comes about from the process of installing and uninstalling programs, usually resulting in an important shared library (DLL) being replaced with a version that causes other programs to fail. see this Wikipedia article for more information. my problem isn't quite DLL hell, but it is related somehow. now, for unmanaged C++ (that is, C++ native code, or code not managed by the .NET Framework, hence the name), you'll most likely require some sort of runtime, as defined by your compiler vendor/settings. with Visual C++ 6.0, you needed msvcrt.dll and msvcp60.dll for the non-debug C and C++ runtimes respectively, and for Visual C++ 8.0 (including VC++ 2005 Express, which i use), you have msvcp80.dll - or something like that. where DLL hell comes in is that if you have the required DLLs for a program to run installed, some program may replace/remove such when you install/uninstall it, breaking other programs needing it. in my case, i've been writing code that had a dependency on msvcp80.dll, and didn't have a problem until i uninstalled something. suddenly previously perfectly working programs i wrote stopped doing so. now Microsoft has these redistributable packages that allow you to run programs with that particular runtime dependency. so i got a copy of the Visual C++ 2005 Redistributable package (download page here) and installed it. twice (two different copies, that is). no dice. i finally had to rebuild the application framework library (wxWidgets) so i could statically link in the C/C++ runtime for each application. ah well.
i did at least know enough to get round my problem without reinstalling my compiler and its service pack. sayonara!
s3://commoncrawl/crawl-data/CC-MAIN-2018-30/segments/1531676589573.27/warc/CC-MAIN-20180717051134-20180717071134-00292.warc.gz
CC-MAIN-2018-30
2,015
6
https://medium.com/@arlindaliu.dev?source=post_internal_links---------5----------------------------
code
SwiftUI is a modern, simple and powerful framework that allows us to build user interfaces across all Apple platforms. By using it we can create complex views and animations in a declarative way. In this article, we are going to build a completely custom iOS App that will help you to learn more about SwiftUI. The App that we are going to build is an iOS version of the popular breathe app from apple watch. The design concept was created by the wonderful designer Daniel Korpai. … Apps nowadays deal with an abundance of real-time events that enable a highly interactive experience for the user. We need tools for properly dealing with these events. Is Apple's latest framework an answer for that? The Combine framework is the newest iOS declarative framework that helps with processing values over time. Learning Combine, Swift and SwiftUI is a must if you want to stay up to date with the world of iOS Development. Combine raises the level of abstraction of your code so you can focus on the interdependence of events that define the business logic, rather than having to… Split testing is a method of determining which variation of an application performs better for a given goal. Multiple variants or behaviors of an application are distributed in a random manner. After statistics gathering and analysis, we determine which version performs better. The goal of this article is to provide a simple way of structuring and organizing your application in order to achieve clean and scalable iOS code when using split testing. Practical tips and examples are provided in order for this article to stay as a guideline for real-world app scenarios. Using split testing (also known as an A/B… In the late 1990s, while developing Extreme Programming, famous software developer Kent Beck came up with a list of rules for simple software design.
According to Kent Beck, a good software design: In this article, we will discuss how these rules can be applied to the iOS development world by giving practical iOS examples and discussing how we can benefit from them. Software design helps us create a system that acts as intended. But how can we verify that a… There is no model-independent test of reality. It follows that a well-constructed model creates a reality of its own. An example that can help us think about issues of reality and creation is the Game of Life, invented in 1970 by a young mathematician at Cambridge named John Conway. — The Grand Design, by Stephen Hawking Game of Life is a popular implementation of the automaton theory. It is not really a game where one player can play against the computer or where two or more players can play against each other. Game of Life is an automaton that plays by… It is always better to be investing our time on creative tasks and avoid doing repetitive work as much as we can. This article is written to provide you with seven iOS development tips towards writing better and easily extendable code that will contribute to saving a lot of time. Even if your app does not support multiple themes it is always a good idea to bring together all the application style related logic into a manageable object that gets shared across the app. In order to provide high-quality software and avoid regression, implementing unit testing is a must for every iOS application. Mocking objects is a technique in unit testing that creates fake objects by using the same APIs as the real ones. This article is written to provide you with the best practices on how to use fake data and write unit tests for the most common scenarios in iOS apps. When writing unit-tests we should always avoid altering real data of the application target and instead use fake data just for the testing purposes.
The following parts will discuss how to… Application security is one of the most important aspects of software development. Users of our apps expect that their pieces of information are being kept private. Our sensitive application data should not be simply given away. Fortunately, in this article, we will discuss mistakes that developers make towards app security and how to easily fix them. I have researched multiple apps from the AppStore and a lot of them are doing the same mistake, storing sensitive data where they do not belong. If you are storing sensitive data in UserDefaults, then you are risking your application's information. UserDefaults get stored… Design Patterns are a crucial part of Software design, they provide a solution to commonly occurring problems. Apple uses these patterns all over iOS frameworks. In this article, we will be discussing how design patterns are used in the internal Apple APIs and how you can benefit from these implementations. We use iOS frameworks like CoreBluetooth and many others in our day to day development. Components like UIStackView all are written using a great reusable design from apple. There are many Design patterns that apple used throughout the development of their kits. … In software development, a design pattern is a general reusable solution to a problem. A design pattern is a description of how to solve a problem that can be used in different situations. In this Article we will be implementing three design patterns by using swift, that will help us to make our code much cleaner and provide a reusable template for solving commonly occurring problems on iOS In this article, we will be discussing better solutions using Design Patterns for a simple application for publishing a form. You can find complete code for the example application here.
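One of the excerpts above quotes Stephen Hawking on Conway's Game of Life. The update rule it alludes to is compact enough to sketch in a few lines (Python here rather than Swift, purely for brevity; the function name is mine):

```python
from collections import Counter

def life_step(live):
    """One generation of Conway's Game of Life.

    `live` is a set of (x, y) coordinates of live cells. A cell is alive
    in the next generation if it has exactly 3 live neighbours, or if it
    is currently alive and has exactly 2.
    """
    # Count, for every cell adjacent to a live cell, how many live
    # neighbours it has; cells with zero live neighbours never appear.
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker" oscillates between a horizontal and a vertical bar.
blinker = {(0, 0), (1, 0), (2, 0)}
```

Representing the board as a set of live coordinates keeps the grid unbounded and makes the two birth/survival rules a one-line set comprehension.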
s3://commoncrawl/crawl-data/CC-MAIN-2021-39/segments/1631780057424.99/warc/CC-MAIN-20210923135058-20210923165058-00304.warc.gz
CC-MAIN-2021-39
5,570
43
http://proxyhit.com/
code
This website provides a directory of currently active web proxies on the internet. Anyone can submit their own proxies to this list for free. With almost 1500 hits daily, this is one of the most popular web proxy lists on the Internet. A web proxy is a type of proxy that is used within a web browser and doesn't require making any changes to your system's proxy settings. When using a web proxy, visitor requests are sent to the proxy first, which then makes the request from its own servers on your behalf and returns the result back to you. This way, the owner of the website that the request was made to is not aware of you, only of the proxy.
s3://commoncrawl/crawl-data/CC-MAIN-2017-04/segments/1484560280899.42/warc/CC-MAIN-20170116095120-00306-ip-10-171-10-70.ec2.internal.warc.gz
CC-MAIN-2017-04
651
2
https://peppe8o.com/personal-cloud-with-raspberry-pi-and-nextcloud-on-docker/
code
This guide will provide a very simple way to have a personal cloud with Raspberry Pi. We'll use a Raspberry Pi 3 model B+, with an external USB drive that will store all our data. The USB disk will be formatted during this procedure to ensure that it will work, so be sure to use an empty USB disk to avoid loss of data at format time. On the software side, we'll use Docker, to get the enhancements that come with containers, and Nextcloud.

What We Need
As usual, I suggest adding all needed hardware to your favourite e-commerce shopping cart now, so that at the end you will be able to evaluate the overall cost and decide whether to continue with the project or remove the items from the shopping cart. So, the hardware will be only:
- Raspberry PI (including proper power supply or using a smartphone micro usb charger with at least 3A)
- high speed micro SD card (at least 16 GB, at least class 10)
Check hardware prices with following links:

Step By Step Guide

Prepare OS environment
Our private cloud will be installed on the official lite operating system. Use the install Raspberry PI OS Lite guide to accomplish this task. Once done, remember to update from the terminal:
sudo apt update -y && sudo apt upgrade -y
We are now ready to install Docker on the Raspberry Pi.

Enable USB Mount at boot
We'll use a USB drive to store data, so we need to be sure that at every boot the USB disk will be ready, without struggling to mount it from the terminal. For this purpose, I'll use USBmount. Type in the terminal:
sudo apt install usbmount
To be sure it will work, we need to change the line PrivateMounts=yes to PrivateMounts=no in "/lib/systemd/system/systemd-udevd.service". Type sudo nano /lib/systemd/system/systemd-udevd.service and change the matching line. Reboot the Raspberry PI.
Now, the simple command "mount" should list the following line within the other mounted devices:
/dev/sda on /media/usb0 type vfat (rw,nodev,noexec,noatime,nodiratime,sync,fmask=0022,dmask=0022,codepage=437,iocharset=ascii,shortname=mixed,errors=remount-ro)
If we want to make the USB drive writable by users (not only by root), we need to edit usbmount.conf:
sudo nano /etc/usbmount/usbmount.conf
Identify "FS_MOUNTOPTIONS" and edit it as follows:
- vfat is the filesystem
- umask=0000 is the permission of the files and folders. 0000 means rwx-rwx-rwx

Prepare Your USB Device
I experienced some issues formatting my USB disk in FAT32. The only way I found to have a working installation was to format the flash disk in ext4. Be aware: this operation will erase ALL the data on your USB disk. Following an askUbuntu guide (https://askubuntu.com/questions/22381/how-to-format-a-usb-flash-drive), to identify the USB drive among all storage partitions and volumes on your computer use:
You can also use:
Suppose it is /dev/sda. Unmount it with:
sudo umount /dev/sda
To format the drive with the ext4 file system:
sudo mkfs.ext4 /dev/sda
Reboot to have it mounted and ready to be used for our docker volumes. NOTE: in the next section I'll assume that your USB disk is the only disk plugged into the RPI. This should ensure that at boot it will be mounted on "/media/usb0".

Install Nextcloud Container
It's time to use Docker. As usual, it allows us to install and prepare the container with one row (copy and paste the entire following command):
docker run -d -p 8080:80 --name nextcloud --restart unless-stopped \
-v /media/usb0/nextcloud:/var/www/html \
-v /media/usb0/apps:/var/www/html/custom_apps \
-v /media/usb0/config:/var/www/html/config \
-v /media/usb0/data:/var/www/html/data \
-v /media/usb0/theme:/var/www/html/themes \
nextcloud
This simple command will map all main volumes on your USB key and initialize the Nextcloud container.
Be patient, because slow USB disks will require a while to prepare the container. You will be able to monitor the installation process by typing:
docker logs nextcloud
or by simply monitoring the increasing USB disk space used by Nextcloud (on /media/usb0):
watch df -H
At the end of the process, about 370MB of space should have been used.

Login to Nextcloud and last settings
Once the initialization has been completed, open the address http://<<YOUR_RPI_IP_ADDRESS>>:8080 with your browser. You will see the following homepage (in your language, depending on your browser settings): If you want to use an external database, you have to set it in the database section (link below the password field). If you want to use the built-in SQLite DB, just insert the username and password which you want to use for your cloud page and click on finish configuration to access (again, after a while for database initialization and recommended apps installation, if selected) the Nextcloud presentation pages and, after a few Next clicks, the Nextcloud home page.
s3://commoncrawl/crawl-data/CC-MAIN-2021-43/segments/1634323584913.24/warc/CC-MAIN-20211016170013-20211016200013-00188.warc.gz
CC-MAIN-2021-43
4,890
49
https://www.wattsatelier.com/painting-with-the-instructors/
code
Painting with the Instructors! Thank you to everyone who joined us at the Atelier on Sunday! We hope you enjoyed watching the instructors do their thing! If you missed the video you can find it here: https://www.youtube.com/watch?v=QWJTKNJYsf0 A lot of the viewers mentioned that they would like to see some images of the work, so I thought I would snap a few pictures for everyone. Here is Jeff's setup and painting. Erik (left) and Tom (right) getting some painting in as well. Robert Watts putting in some work on his personal project. If you want to see more of his process check out our previous video at https://www.youtube.com/watch?v=MODig8PmTYM&t=1335s Or his Instagram: https://www.instagram.com/robertwatts3637 Finally some killer drawing by Jim Hahn (left) and Brian Knox (right). There were a few books and other things mentioned in the stream so here is a list for you guys. Thanks again everyone for joining in! Keep an eye out for the coupon code that's coming out soon.
s3://commoncrawl/crawl-data/CC-MAIN-2020-50/segments/1606141727627.70/warc/CC-MAIN-20201203094119-20201203124119-00661.warc.gz
CC-MAIN-2020-50
979
12
http://gravitytales.com/post/age-of-adepts/age-of-adepts-chapter-592
code
Chapter 592 is up! We're doing decently in the GT Rankings. However, we've fallen from #1 to #2. Versatile Mage is giving us a run for our money this month. I know we don't have any bonus chapters to offer for votes, and our place on the chart doesn't translate to any tangible benefit for us, but please consider tossing a vote our way if you find the time. I like to maintain good visibility for new people coming to GT...and, vainly, selfishly, I want AoA to reign as king of GT for a bit longer. I had someone who had trouble seeing the updated Patreon chapter yesterday- it seemed the chapter didn't update for them until a few hours after the GT chapter went live. I'm trying to take measures to prevent that by updating the Patreon folders even earlier before the GT post (in case it's some kind of update lag in the folders that I can't see as the owner). If anyone else has trouble with the new chapters not showing up promptly, please ping me in the AoA Discord's error-channel so that Eris or I can check the folder.
s3://commoncrawl/crawl-data/CC-MAIN-2019-04/segments/1547583867214.54/warc/CC-MAIN-20190122182019-20190122204019-00569.warc.gz
CC-MAIN-2019-04
1,027
3
http://www.onlineshoes.com/mens-laredo-paris-black-leather-foot-r-toe-p_id315050
code
- DON'T PAY FOR 14 DAYS Product & Brand Information Customer Ratings & Reviews 96% would recommend to a friend. Great pair of boots !!! These boots are for everyday or a night out. The soles are non-slip, letting me wear them more months out of the year. They are comfortable while walking or driving. I've only had them for a month but feel "it's a good day" when I'm wearing them. Always get the best price. We guarantee you'll always get the lowest price. If you find a better price within 10 days of purchase, call us at 1.800.786.3141 and we'll not only match that price but give you an additional 10% of the difference! Learn more
s3://commoncrawl/crawl-data/CC-MAIN-2017-04/segments/1484560280364.67/warc/CC-MAIN-20170116095120-00066-ip-10-171-10-70.ec2.internal.warc.gz
CC-MAIN-2017-04
656
9
https://www.researchwithrutgers.com/en/publications/designing-and-examining-pc-to-palm-collaboration
code
One trend in day-to-day computing involves moving seamlessly from large powerful workstations to small hand-held devices. A second trend is continuous collaboration with colleagues. Combining these trends requires solutions to both the problem of transferring large complex displays to smaller, less capable devices and of ensuring that a viable collaboration takes place even when the collaborators are using vastly different tools and viewing screen environments that differ significantly in their display richness. We briefly describe an architecture for managing displays across multiple platforms, which we call the Manifold framework. This architecture is incorporated into applications using our DISCIPLE collaboration system. We explore the use of Manifold by creating a 3D layout task that communicates with a 2D version of this task running on a Palm Pilot that is wirelessly connected to the Internet. In order to get measurable data on the collaboration problems and successes that users might encounter in this diverse communication tool arrangement, we ran two separate studies that captured the performance time, user errors and transcripts of the communication exchanges between the two users. We found that interface problems with each environment affected the task performance and that the different capabilities of the 3D and 2D environments created collaborative advantages rather than negatively affecting the collaboration.
s3://commoncrawl/crawl-data/CC-MAIN-2021-31/segments/1627046153491.18/warc/CC-MAIN-20210727202227-20210727232227-00566.warc.gz
CC-MAIN-2021-31
1,445
1
https://academia.stackexchange.com/questions/123254/are-usa-math-graduate-students-ready-for-graduate-school-in-math/123265
code
I have seen some answers on this site which say that a bachelor of science in mathematics in the USA is a general degree and a student needs to study other subjects as well. It also seems to me that many students who did their undergraduate in the USA do not have any research experience in mathematics. I did my undergraduate at an Australian university and it took me 4 years to complete. Each year, I had to do 8 subjects, all of which were mathematics subjects except one subject in the first year, which was a programming course. I also did summer research at the end of each of the first three years, and I had to write a research thesis in the honours year (fourth year). This makes me wonder whether those people who did their undergraduate in the USA are ready for graduate school, because it seems that they may not have done enough mathematics subjects. Even though I did all mathematics subjects in my undergraduate, I feel that there is some gap in my mathematics education for graduate school. How do those people who did their undergraduate in the USA feel about this? Do they also see this gap and feel that they need to work hard? What do I mean by ready for graduate school? By being ready for graduate school, I mean that, for instance, a graduate student at the University of Pennsylvania has seen all the material in the graduate preliminary exam in his undergraduate. Or, if a graduate student takes any first-year graduate courses, the prerequisites for doing that course are satisfied and he does not need to learn assumed knowledge in a particular first-year graduate course.
s3://commoncrawl/crawl-data/CC-MAIN-2020-24/segments/1590347396300.22/warc/CC-MAIN-20200527235451-20200528025451-00156.warc.gz
CC-MAIN-2020-24
1,565
7
https://www.pixtalgia.com/faq/
code
At this development stage Pixtalgia is far from being a native mobile app. However, Pixtalgia is an HTML5 app; this means it can be played in a Chromium-based browser using a Bluetooth gamepad. Due to performance issues and bugs, mobile browsers are not the optimal environment for playing. Also, no tech support will be provided on mobile issues. CTRL, like SHIFT or ALT, is a browser/OS dedicated button, so key combos with CTRL and other keys can trigger unexpected behaviours. For instance, the CTRL + W key combo closes the browser page. These behaviours cannot be controlled or avoided, so be careful when customizing your game keys. Wallets supported are WAX CLOUD WALLET, ANCHOR, POLYGON. Yes you can! Telegram support here: more support channels are coming
s3://commoncrawl/crawl-data/CC-MAIN-2024-10/segments/1707947474581.68/warc/CC-MAIN-20240225035809-20240225065809-00837.warc.gz
CC-MAIN-2024-10
759
5
https://wiki.eprints.org/w/index.php?title=Workflows/Workflow_Format&oldid=2388
code
The EPrints workflow file is written in XML, and follows the following structure:

<workflow xmlns="http://eprints.org/ep3/workflow" xmlns:ep="http://eprints.org/ep3/">
  <flow>
    <stage ref="first_stage" />
    <stage ref="second_stage" />
  </flow>
  <stage name="first_stage">
    <component>...</component>
    ...
  </stage>
  ...
</workflow>

The <flow> structure defines the path taken through the workflow - in this case, from first_stage to second_stage. It is straightforward to embed EPScript in the flow block (as well as throughout the workflow) to provide more complex flows. For example, to only include a certain stage for a specific EPrint type:

<flow>
  <ep:if test="type = 'thesis'">
    <stage ref="thesis_stage" />
  </ep:if>
  ...
</flow>

The ref attribute of the <stage> element corresponds directly to the <stage>'s name attribute later in the workflow. While the flow element contains a list of stages to handle, the stage contains a list of components to render. Again, the stage may contain EPScript to customize the components rendered. For example, the default EPrint workflow contains the following to ensure the 'presentation type' field is only visible for conference items:

<stage name="example_stage">
  <ep:if test="type = 'conference_item'">
    <component><field ref="pres_type" required="yes" /></component>
  </ep:if>
  ...
</stage>

Each component within a stage may use any or none of the following attributes:
- type (defaults to Field): The type of component to use (see the next section for components bundled with EPrints 3).
- collapse (defaults to no): If yes, the component is initially collapsed. This hides the fields from the user unless they expand the component.
- surround (defaults to Default): The surround is responsible for the 'look and feel' of the component. It renders the help, title, and the container in which the component's content is placed. It may be useful to create a custom surround based on the Default surround class, or to use the None surround, which renders nothing except the component content.

The content of the <component> element is handled by the component itself, so may vary depending on its function. Again, the <component> content may contain EPScript to allow for dynamic configuration.
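To tie the attribute descriptions above together, a component using all three might look like the following sketch. The stage name "extra_stage" and the field "note" are made-up names for illustration, not taken from the default workflow:

```xml
<!-- A collapsed Field component rendered with the None surround, so only
     the field input itself is shown, hidden until the user expands it.
     "extra_stage" and "note" are hypothetical names for this example. -->
<stage name="extra_stage">
  <component type="Field" collapse="yes" surround="None">
    <field ref="note" />
  </component>
</stage>
```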
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500456.61/warc/CC-MAIN-20230207102930-20230207132930-00450.warc.gz
CC-MAIN-2023-06
2,265
13
https://unix.stackexchange.com/questions/390480/nice-and-ionice-which-one-should-come-first
code
I need to run some long and heavy commands but at the same time I'd like to keep my desktop system responsive. Examples: btrfs deduplication, btrfs balance, etc. I don't mind if such commands take longer to finish if I give them a lower priority, but my system should be responsive at all times. Using nice -n 19 and ionice -c 3 should solve my problem, but I'm not sure which command should come first for maximum benefit. # nice -n 19 ionice -c 3 btrfs balance start --full-balance / # ionice -c 3 nice -n 19 btrfs balance start --full-balance / Is there some subtle difference between options A and B? Are they equivalent perhaps?
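One way to convince yourself the two orderings behave the same is to inspect the attributes from inside the wrapped command: both nice and ionice just set a scheduling attribute on the current process and then exec the rest of the command line, so the child ends up identical either way. A quick sketch (assumes util-linux ionice and GNU coreutils nice, which prints the current niceness when run with no command):

```shell
# Wrap a child shell both ways and print what it actually inherited.
# `nice` with no arguments prints the current CPU niceness;
# `ionice -p PID` prints the I/O scheduling class of PID.
nice -n 19 ionice -c 3 sh -c 'echo "cpu nice: $(nice)"; echo "io class: $(ionice -p $$)"'
ionice -c 3 nice -n 19 sh -c 'echo "cpu nice: $(nice)"; echo "io class: $(ionice -p $$)"'
```

On a typical Linux box both orderings report the same niceness (19) and I/O class (idle), which supports the view that options A and B are equivalent in effect.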
s3://commoncrawl/crawl-data/CC-MAIN-2021-25/segments/1623487648194.49/warc/CC-MAIN-20210619111846-20210619141846-00226.warc.gz
CC-MAIN-2021-25
633
6
http://forum.hand2note.com/topic/2530/session-cash-missing-hands
code
Session cash missing hands For about a month I've noticed that during my cash sessions on PokerStars, H2N seems not to be importing all hands; every session basically half of the hands I've played get reported, and thus all the data about net and EV net results are wrong. I have version 188.8.131.52 of the software. Please check auto import settings: Everything is ok, otherwise it won't record any hand. It has been working alright for like 10 months without problems and then suddenly it started not working well, after some H2N updates and PokerStars software updates. So I suppose the problem came after the updates of one of the programs. Try to roll back the Hand2Note module https://gifyu.com/image/RnDA
s3://commoncrawl/crawl-data/CC-MAIN-2021-31/segments/1627046153223.30/warc/CC-MAIN-20210727072531-20210727102531-00516.warc.gz
CC-MAIN-2021-31
709
5
https://flylib.com/books/en/4.234.1.85/1/
code
AirSnort is an encryption-cracking program. By exploiting the weaknesses discussed in Chapter 5, "Cracking WEP," AirSnort is able to capture encrypted radio data and extract the secret key, byte by byte. After capturing roughly 3,000,000-5,000,000 packets, AirSnort can crack the password used by client and host in a few seconds. Although this program was not the first available to demonstrate the weaknesses of WEP, it quickly became one of the most popular, because it can both capture and crack encrypted data. (Its forefather, WEPCrack, was the first publicly released code to crack archived data.) In addition, the newer releases of AirSnort provide a GUI, which is more appealing to most users than the previously used command-line interface. This program, although solely *nix-based, is simple to use. It ties right into the installed WNIC, places it in promiscuous mode, and starts to capture data. Every version of AirSnort includes some form of visual monitoring. The command-line version uses a simple text screen, while the GUI version lists summary information in its window. Each version supports the capability to crack the captured data on the fly while it is also capturing information. Version 2 performs the cracking function automatically, while version 1 and prerelease versions require manual execution of the cracking script. Both versions also support the capability to increase the speed factor of the cracker. These decrease capture time, but increase the chance of a faulty key. There is one major advantage to using AirSnort over other capture/crack tools: AirSnort supports both ORiNOCO (firmware 7.52) and Prism II cards. Because of the AirSnort authors' preference for ORiNOCO cards, they have imported the code required to make this program function for almost any WNIC. This extra feature can be a bit buggy, and requires additional steps and troubleshooting to become operational.
However, the authors are constantly updating their software and posting patches and new editions to make AirSnort more stable and functional. Installing AirSnort can be difficult for the Linux newbie. Because of the many system configurations possible, getting this program running might require the installation of drivers, patches, updates, and more. Because this is Linux, be prepared for anything, but do not be surprised if everything works the first time. The first step is to get all the required files. This will vary depending on the current system status, installed WNIC, and operational preference. The following is the list of packages and programs you should download. If you have all the code on hand, you will at least be prepared if you need something. You will note that there are two modes of operation available in AirSnort: PF_NETLINK and PF_PACKET. The original capture programs used PF_NETLINK, which required conversion if the data was to be used in another program. PF_PACKET is the preferred method of data capture because it can be dumped right into another program, such as Ethereal. Although PF_PACKET is the optimal capture type, it is only possible through the use of the most current version of AirSnort in conjunction with an ORiNOCO WNIC.

Using the ORiNOCO Card

Once you have collected the necessary files, it is time to install them. Please note that this tool is constantly being updated; thus, the following instructions may not be 100% accurate for the version of AirSnort that you download. Please verify the correct procedure before attempting to install AirSnort.
Using PCMCIA-cs-3.1.31 drivers (PF_NETLINK):
- pcmcia-cs-3.1.31
- orinoco_cs-0.08
- orinoco-08-1.diff
- orinocoSniff.diff
- linux-wlan-ng-0.1.13+ package
- airsnort-0.2.0.tar.gz or airsnort-0.1.0.tar.gz

Using PCMCIA-cs-3.1.33 drivers:
- pcmcia-cs-3.1.33
- orinoco_cs-0.09b
- orinoco-09b-2.diff (PF_NETLINK)
- orinoco-09b-packet-1.diff (PF_PACKET)
- airsnort-0.2.0.tar.gz or airsnort-0.1.0.tar.gz

Using a Prism II card:
- Kernel Source
- pcmcia-cs-3.1.31 or pcmcia-cs-3.1.33 drivers
- linux-wlan-ng-0.1.13+ package
- airsnort-0.2.0.tar.gz or airsnort-0.1.0.tar.gz

After you have all these parts, it is time to start installing. The first step is to install an updated version of the PCMCIA-cs drivers. Depending on your preferences and hardware, you will either be installing 3.1.31 or 3.1.33+. These drivers are required for the AirSnort program to interface correctly with the WNIC. The drivers are available from http://pcmcia-cs.sourceforge.net in the file pcmcia-cs-3.1.33(31).tar.gz. We recommend you download this file to your /usr/src directory. This is the source tree for your operating system, and is where you will find other source code directories. Once downloaded, untar the file (tar -zxvf pcmcia-cs-3.1.33(31).tar.gz). If you plan on using an ORiNOCO card with these drivers, you will then need to apply the orinoco_cs-0.09b patch to the source tree using the following command: patch -p0 < orinoco-cs-0.09b. This will insert and update some required code into the driver files that allow the ORiNOCO card to enter promiscuous mode. Next you will need to configure, make, and install the drivers. Before you do this, be sure you have your kernel source code installed. Typically, this will also be under the /usr/src directory. You will need to tell the configure script where to find the source code. In addition, you will need to be sure where your modules are located. We suggest using the /lib/modules/2.4.x-x directory for the configuration script.
Once the configure script is set up, simply make the files by using the make all command, and finally, install the new drivers using the make install command. You must be sure you do not have two copies of the same file located in the lib directory. If you do, remove both sets and re-install the new pcmcia-cs drivers. Otherwise, the operating system will use the wrong set of drivers, thus ensuring AirSnort will not work. The next step is to install the wlan-ng drivers. This is required unless you are using the PCMCIA-cs-3.1.33 drivers with an ORiNOCO card. To install this package, simply download the file to your chosen download directory, unzip/untar it and perform the same configure, make, and install commands used to install the PCMCIA drivers. This should install several new files to your system, and set it up to use the wlan-ng package to control your WNIC. In addition, it will install several scripts that enable you to quickly put your card in promiscuous mode, so AirSnort can use it. If you choose, you can avoid the whole wlan-ng installation with an ORiNOCO card by using the orinocoSniff.c program instead. However, you must first compile this program before you can execute it. To do this, you can use any C compiler, such as gcc. You compile the program using a command such as gcc orinocoSniff.c, which will create an a.out file you can execute by typing ./a.out. If this doesn't result in a success, you might need to use the wlan-ng package, or perform some troubleshooting to figure out why it did not work. The final step, without getting into every possible patch or scenario that might arise, is to install the AirSnort program. Again, you will need to download and unzip/untar the program to your chosen location. Once complete, you will need to enter the airsnort directory and run the autogen.sh script. This will configure the program, after which you will need to run the make all command to compile the program.
Once the program is properly installed with no errors and a full reboot for the fun of it, you are ready to use the program. We will cover the two main versions of AirSnort. We prefer version 1's simplicity, but also like version 2's added features. You might want to play with both of the programs to see which you like. If you are using version 1, you will find two folders - scripts and src. The scripts folder holds a script file with the command wlanctl-ng wlan0 lnxreq_wlansniff channel=6 enable=true. This command is used to place the card into promiscuous mode so AirSnort can detect and monitor the packets. This command can be entered manually. Once the WNIC is in promiscuous mode, you will find the capture script in the src directory. To start this program, type ./capture -c captureFile1.txt. This will start the capture, show you the results, and dump the data into a file named captureFile1.txt. Once you have the program running, you will see a screen similar to Figure 9.21. Figure 9.21. AirSnort capture. Note that although this is only text, you can still see several things. For example, you can see the number of total packets. Because the typical WLAN sends data in 1500-byte packets, this number will get quite high. You will need several million packets to crack WEP, so be prepared to see this number climb. In addition, AirSnort shows the last IV. As you learned in Chapter 5, this is the key to cracking WEP. If you see a key in the form B+3, 255, x (33-47, ff, xx), you should also see the Interesting Packets field increase. Another valuable indicator is the Timeout field. If this field continues to increase and the packets stay the same, you might have lost your connection. This is very useful if you are moving around while capturing data. When you have a sufficient amount of data, you can start cracking the password. Using the "crack" script, you can test the capture file periodically to see whether you have enough keys to extract the password.
In addition, you can adjust the crack program to test a wider range of possible passwords using the -b switch; however, this might result in false positives. It is recommended that you not adjust the breadth to greater than 4. However, in testing we successfully cracked a password in a shorter time using the maximum of 10. In addition, you can shorten the crack time by specifying the key length. This is done using the -l switch, but obviously this is only useful if cracking a known secret key for educational purposes. If used in a real situation, limiting yourself to one length or another might result in missed keys. The following is the command used to crack our capture file, and Figure 9.22 is a screenshot of what it looks like. Figure 9.22. Capture file. "./crack -b 10 -l 40 test.3" As you can see, AirSnort version 1 is not a difficult program to use. Setting it up might be challenging, but once that hurdle is overcome, you can capture and crack quite easily. Now, let's move on to AirSnort version 2, which includes extra features. AirSnort 2 is a more comprehensive WEP-cracking tool. It not only incorporates the cracking tools of the previous version, but also includes SSID detection and access point MAC listing, and provides the user with the capability to sniff either PF_NETLINK or PF_PACKET. However, as version 2 is further developed, the capability to sniff using PF_NETLINK will cease to exist. As of version 2.1, this feature is no longer used. To use version 2, you only need to download and install the necessary patches. Once this is accomplished, you need to place the WNIC in promiscuous mode, which is accomplished using the following command, with alternative options.
iwpriv eth0 monitor <m> <c>
  m - one of the following:
    0 - disable monitor mode
    1 - enable monitor mode with Prism2 header info prepended to packet (ARPHRD_IEEE80211_PRISM)
    2 - enable monitor mode with no Prism2 info (ARPHRD_IEEE80211)
  c - channel to monitor

After you successfully place the card in promiscuous mode, you are ready to execute AirSnort. Figure 9.23 illustrates AirSnort 2 in action. Figure 9.23. AirSnort version 2. As you can see, this version will scan for channels, monitor the last IV, and keep a numerical listing of the packets captured and interesting IVs captured, as well as the Name and ID of the access point. In addition, this program will perform the cracking routine while sniffing. Once enough data has been collected, you will be shown the password in ASCII and hex by scrolling right in the program. There are several options that need to be set up under the Settings menu. You will need to designate the name of the WNIC. Typically this will either be wlan0 or eth0, depending on the WNIC you are using. Depending on the version, you will also need to select the type of packet capture you are attempting (PF_NETLINK or PF_PACKET). Finally, you will need to check a box that determines whether the WNIC is in promiscuous mode, which it should be at this point. This program will even allow a user to pause the sniffing operation, take out the existing WNIC, and swap it with the other flavor of WNIC, and then resume sniffing. In addition, you can pause and resume sniffing any number of times during the cracking process. Although these are the current options, this program is picking up momentum and is undergoing semi-major updates every few weeks. Therefore, be prepared for a more user-friendly tool with more options. AirSnort's patches include code that allows Kismet to use ORiNOCO cards. This facilitates the capability of Kismet to capture and store data in the AirSnort format, with Prism- or Hermes-based cards.
From this segment, you should realize that AirSnort is your best WEP-cracking tool. Although it is a bit skimpy on additional features that can be found in other sniffing tools, AirSnort is the best sniffer/cracker tool online. If you want to get to cracking, AirSnort will get you there the fastest.
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573908.30/warc/CC-MAIN-20220820043108-20220820073108-00343.warc.gz
CC-MAIN-2022-33
13,382
48
https://lists.bisq.network/pipermail/bisq-github/2019-October/023180.html
code
[bisq-network/bisq] NTP: Fix a couple of UI issues in the New Trade Protocol (#3410) notifications at github.com Tue Oct 15 22:07:10 UTC 2019 sqrrm commented on this pull request.

> @@ -42,6 +43,11 @@
>      private final BooleanProperty showOpenMediationTicketsNotification = new SimpleBooleanProperty();
> +    private final StringProperty numOpenRefundTickets = new SimpleStringProperty();
> +    private final BooleanProperty showOpenRefundTicketsNotification = new SimpleBooleanProperty();

Not that I can see, just like the other properties.
s3://commoncrawl/crawl-data/CC-MAIN-2020-34/segments/1596439737645.2/warc/CC-MAIN-20200808110257-20200808140257-00276.warc.gz
CC-MAIN-2020-34
760
14
http://brianedwards.blogs.com/my_weblog/2006/03/microsoft_relea.html
code
Microsoft has released its Office System 2007 Licensing. Some interesting things to note:

a) Current Office Professional users will migrate to Office Professional Plus under SA's and EA's.

b) Forms Server (InfoPath forms via the web) will be released as a separate server offering, but also included in the new Enterprise CAL introduced by Microsoft. This will have a separate licensing structure for external website use.

c) The new Enterprise CAL (which will be an upgrade to the Core CAL many companies already own) will include:
- Business data exposure via the Business Data Catalog
- Forms Server
- Excel Services (Not sure what this is going to be called)
- Rights Management Services
- MOM licensing
- Security licensing (don't know exactly what this is yet)
- Communicator licensing

d) A new desktop suite called Microsoft Office Enterprise 2007 will include the laptop friendly tools Groove & OneNote.

e) A product acquired from a company called UMT (www.umt.com) is being re-branded and included in the 2007 server lineup as: Microsoft Project Portfolio Server 2007.

Microsoft.com: Client Suite Licensing
Microsoft.com: Pricing Overview (Including Servers)
Microsoft.com (Word Doc): Pricing FAQ Sheet - Includes Great details about each individual server.
s3://commoncrawl/crawl-data/CC-MAIN-2018-43/segments/1539583517495.99/warc/CC-MAIN-20181023220444-20181024001944-00200.warc.gz
CC-MAIN-2018-43
1,266
16
https://www.ninjatb.com/howtofindwords/how-to-find-competitors-adwords-keywords-how-to-find-what-keywords-a-website-is-ranking-for-2019.html
code
3. Finally, there's just good old research through trends and news. Google Trends, keeping up on industry news of the business, and even newsjacking (if there are relevant topics). These all require different resources depending on the business, but once you find the leaders in their news you can not only leverage them for keyword research but also glean insights into how you can become an industry leader yourself (and dominate SEO). One of the most important aspects of an effective SEO strategy is the ability to research, analyze, and ultimately select the keywords that are most likely to result in success for your clients. There are a variety of free tools available on the web specifically designed to help online marketers do just this. Each tool has its own unique methodology for collecting and presenting this data. Comparing any of the tools’ results without knowing the subtle differences can lead to incorrect inferences and an SEO strategy based on misinformation. Hi – I’ve read your post with great interest. Not only am I happy for you and your success, it does provide a glimmer of hope to those of us who do have “other” ideas. I, too, have had an idea for a very long time now, but I have no idea how to go about “making it a reality.” I am not a programmer either… though I do know some html… I also remember the old days of having even a web developer holding domains hostage, never mind the site’s entire code. And, yes, that is one of the major things that stops me from even discussing things with a developer. I simply don’t trust them… Jaaxy uses a combination of search engine data from the major search engines (Google, Bing and Yahoo) and Long Tail Pro get its search data from Google alone (via the Google Keyword Planner tool). While this seems to be an advantage for Jaaxy, you might only be interested in getting information from the number one search engine in the World: Google. 
TIP: A really good strategy for increasing your search engine rankings (and maybe even getting a featured snippet), is to pick a number of popular questions, and answer them in your content. You can do this in the form of a ‘Question & Answer’ section or maybe ‘FAQs’. Just pick half a dozen or so questions, and list them, together with a short answer. The ubersuggest keyword suggestion tool is ideal for those people who would like to perform some quick and easy keyword research for their blogs. It is one of the simple and user-friendly keyword research tools that you can find in the market out there. Even though the tool is simple, it is very powerful as it has helped several users to optimize their websites. It is an absolutely free tool that produces real and reliable data, comprehensive keyword research results, and is incredibly easy to use. How much is a keyword worth to your website? If you own an online shoe store, do you make more sales from visitors searching for "brown shoes" or "black boots"? The keywords visitors type into search engines are often available to webmasters, and keyword research tools allow us to find this information. However, those tools cannot show us directly how valuable it is to receive traffic from those searches. To understand the value of a keyword, we need to understand our own websites, make some hypotheses, test, and repeat—the classic web marketing formula. In a nutshell, Long Tail Pro helps you quickly find keywords in bulk based on a seed keyword that you input. In addition to returning hundreds of related keywords, the tool also shows search volume (monthly search volume), advertiser bid, number of words, rank value, and my favorite Keyword Competitiveness (helps you judge the competition and keyword difficulty).
s3://commoncrawl/crawl-data/CC-MAIN-2019-47/segments/1573496665573.50/warc/CC-MAIN-20191112124615-20191112152615-00148.warc.gz
CC-MAIN-2019-47
3,738
8
https://lists.boost.org/Archives/boost/2004/02/60233.php
code
From: David Abrahams (dave_at_[hidden])
Date: 2004-02-01 12:14:02

Robert, please make a little more effort with your posting to indicate who's writing, which text you are quoting from another message, and which you are writing yourself. If your mailer won't add the leading ">"s manually you can do it with your text editor's search/replace. That'll save me the trouble of doing it for you and make sure that everyone can understand your post.

Robert Ramey <ramey_at_[hidden]> writes:

> Dave Abrahams wrote:
>> Robert Ramey <ramey_at_[hidden]> writes:
>>> I have a problem with the latest version of the new iterators.
>>> This is illustrated in the following example.
>>>
>>> #include <strstream>
>>> #include <iterator>
>>> #include <boost/iterator/counting_iterator.hpp>
>>>
>>> std::istrstream is("abcdefg");
>>> typedef boost::counting_iterator<std::istream_iterator<char> > ci;
>>> const ci end = std::istream_iterator<char>(); // note the const !!!
>>
>> I don't see why the const should be relevant.
>
> From looking at the code for counting iterator

This is your first mistake: don't try to understand what the class is meant to do by looking at the code. The documentation is the specification.

> it seems to me that it is implemented for non-random-access
> iterators by incrementing one iterator until it equals the other and
> returning the number of times the increment occurred.

Yeah, it's a mistake. It shouldn't use distance, but operator-. That way neither one of your subtractions would compile.

> (I concede that I've had difficulty following the implementation so
> I could be wrong). So if one of the iterators is const - that one
> can't be the one that's incremented.

Hah, well it increments a copy of the iterator. You don't really want us modifying your iterator out from under you just because you measured a distance with it, do you? In that case the distance would be zero after the call.

>>> ci begin = std::istream_iterator<char>(is);
>>
>> Hmm, the implicit conversions from istream_iterator to the
>> counting_iterator above are troubling. I think those constructors
>> should be explicit.
>
> Hmmm - From looking at the documentation and the code, I get
> absolutely no indication that the counting iterator requires a
> random access iterator.

The docs say:

    Specializations of counting_iterator model Readable Lvalue Iterator.
    In addition, they model the concepts corresponding to the iterator
    tags to which their iterator_category is convertible.

And then you go back and look at the synopsis and find:

    iterator_category is determined according to the following algorithm:

        if (CategoryOrTraversal is not use_default)
            return CategoryOrTraversal
        else if (numeric_limits<Incrementable>::is_specialized)
            return iterator-category(
                random_access_traversal_tag, Incrementable, const Incrementable&)
        else
            return iterator-category(
                iterator_traversal<Incrementable>::type,
                Incrementable, const Incrementable&)

In your case, you didn't pass a CategoryOrTraversal explicitly, and since the Incrementable type is not numeric, the last clause gets used. iterator-category (follow the link) just preserves the traversal properties of an iterator, so if the Incrementable type is a single pass traversal iterator (which yours is), the iterator_category of the counting iterator will reflect that.

> In fact, looking at the code suggests that there is code especially
> included to handle any forward traversal iterators. And the first
> invocation below DOES work - I can't believe that's an accident.

It's an accident.

>>> unsigned int size;
>>> // the following should fail at compilation ?
>>
>> Why? OK, maybe it would be better to make it fail because the
>> iterator is not in fact a random-access iterator.
>
>>> // in fact it compiles and returns a value of -7 !!!
>>> size = begin - end;
>>> // the following should compile and return a value of 7
>>> // in fact, it compiles but goes into an infinite loop
>>> size = end - begin;
>
> My mistake - I wrongly presumed that only the second iterator would be
> incremented and since it's const it should fail. It looks like the code
> intends to select which iterator to increment depending on const.
> So this first case looks ok to me now.
>
>> I don't see why you think the first should compile but the second
>> should not. If they're not random-access iterators, it seems to me
>> that there's no reason you should be able to subtract them in any
>> order. If they were random-access iterators, it would be unreasonable
>> to expect the compiler to detect which one had an earlier position
>> and disallow one subtraction.
>
>> So my position is that both compile (as they should). The first works
>> as one would expect while the second results in an infinite loop. I'm
>> sorry but I don't see how this can be correct.

Ideally, neither one would compile. Single pass traversal iterators don't provide random access operations like operator-.

--
Dave Abrahams
Boost Consulting
www.boost-consulting.com

Boost list run by bdawes at acm.org, gregod at cs.rpi.edu, cpdaniel at pacbell.net, john at johnmaddock.co.uk
s3://commoncrawl/crawl-data/CC-MAIN-2021-25/segments/1623487608702.10/warc/CC-MAIN-20210613100830-20210613130830-00023.warc.gz
CC-MAIN-2021-25
4,923
88
https://thedungeonmastersdojo.podbean.com/2021/04/
code
This week we wrap up our storytelling series with part 2 of the Harmon Circle. DM Scott shows you how to apply this technique to your game.

Tell us what you think about this episode and gaming in general. If you’re interested in a certain topic let us know, we’ll do an episode on it!

Our Home page: >>https://www.thedungeonmastersdojo.com/<<
On Facebook: >>https://www.facebook.com/TheDungeonMastersDojo<<
On Twitter: >>https://twitter.com/DungeonDojo<<
Or by Email: >>email@example.com<<

Interested in Supporting the Podcast? Grab a +1 Charisma Bonus by wearing The DMD’s Logo Wear: >>https://www.thedungeonmastersdojo.com/shop<<

Podcasting is a thirsty business, whet our whistles and buy us a Saki or three, or five. Don’t forget to say something nice, or mean, we don’t care, you’re buying us Saki! >>https://www.buymeacoffee.com/TheDMD<<
https://www.eur.nl/en/essb/information/admitted-students/master/programme-specific-information/msc-soc
After offer of admission: Master in Sociology

In order to plan the beginning of your studies, it may be useful to know more about the planning as well as the content of the programme. Some topics may not have received the attention in your prior education that you would have hoped for, considering your choice of track in this master programme. In that case you may feel the need to catch up before you start the programme. Below you will find a list of suggested readings that provide an introduction to the major themes relevant to the master programme and that may help you in preparation for it.

On social theory
- Collins, R. (1994). Four sociological traditions. New York and Oxford: Oxford University Press.
- Heerikhuizen, B. (2016). Massive Open Online Course (MOOC) on classical sociological theory, available through https://www.coursera.org/learn/classical-sociological-theory [free after registration].

On developing and working with theory
- Elster, J. (2007). Explaining social behavior: More nuts and bolts for the social sciences. Cambridge: Cambridge University Press.
- Swedberg, R. (2014). The art of social theory. Princeton: Princeton University Press.

On quantitative research methods
- Aneshensel, C. S. (2013). Theory-based data analysis for the social sciences (2nd edition). London: Sage. (Chapters 1, 2, 3, 4, 5, 7, 8, 9, 11, 13, glossary)
- Field, A. (2013). Discovering statistics using IBM SPSS Statistics (4th edition). London: Sage. (Chapters 1, 2, 3, 4, 5, 7, 8, 9, 10, 17, 19)

Catch-up readings for Politics & Society

On political sociology
- Nash, K. (2009). Contemporary political sociology: Globalization, politics and power (2nd edition). John Wiley & Sons.

On political science
- Hague, R., Harrop, M., & McCormick, J. (2016). Comparative government and politics: An introduction (10th edition). New York: Palgrave Macmillan. (Chapters 1, 2, 8, 9, 12-18)
https://types.cs.washington.edu/list-archives/checkers/2008-September/000495.html
[Checkers] Q about new JSR 308 proposal
mernst at csail.mit.edu
Thu Sep 4 15:24:51 EDT 2008

> __ My previous understanding __
> I thought that you would read a string array (even before using
> annotations) as:
>   declaration:      @English String @NonNull @Length(10)
>   order of reading: 3--------------> 2---------> 1------------->
> So it would be "length-10 array of non-null array of English String."
> This is how I described it in the manual previously.

No, this is incorrect. The outermost array type is the leftmost
("inside") one in the written syntax. If you execute

then the result is a length-10 array, and each of its elements is a
length-20 array. It is *not* a length-20 array of length-10 arrays.
Even Josh Bloch got this wrong when I gave it to him as a Java Puzzler
of my own, so you don't have to feel bad about your confusion.

> The new proposal now has the same problem it tries to solve. Namely,
> given the following declaration:
>   @English String @NonNull arr1, arr2 @Length(10) ;
> In the proposed syntax, the types would be:
>   arr1: nonnull array of English string
>   arr2: nonnull array of length-10 array of English string

No, arr2 is a length-10 array of nonnull array of English string, for
a different reason than you thought: in each component you read
left-to-right, like this:

  @English String @NonNull arr1, arr2 @Length(10) ;
  3-------------> 2--------------->   1------------------>

Thanks for pointing out that I need considerably more explanation in
the document. The Java array syntax is a mess to understand even
before annotations; my hope is that annotations don't make things any
worse (or any different), but unfortunately they do make people notice
the existing problems in the syntax.

Let me know whether this makes sense, and what I can do to explain it.
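The dimension-ordering rule discussed above can be checked directly in plain Java (the class name here is just for illustration): the leftmost dimension of a creation expression is the outermost array, and brackets written after a declarator add an extra outer dimension to that variable only.

```java
public class ArrayOrder {
    public static void main(String[] args) {
        // The leftmost dimension is the outermost array: this is a
        // length-10 array whose elements are length-20 arrays,
        // not the other way around.
        String[][] a = new String[10][20];
        System.out.println(a.length);    // outer array: 10 elements
        System.out.println(a[0].length); // each inner array: 20 elements

        // Brackets after the variable name attach an outer dimension
        // to that variable only:
        int x[], y[][];
        x = new int[3];    // x has type int[]
        y = new int[3][4]; // y has type int[][]
        System.out.println(y.length + " " + y[0].length);
    }
}
```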
https://ez.analog.com/thread/92646-no-output-for-ad9910
Dear Sir/Madam,

I encountered some issues when using a DDS chip (AD9910). The DDS circuit is shown below; an FPGA controls the CSB, CLK, SDIO and IO-update pins. The timing diagram is shown in the following figure, probed with an oscilloscope: ref-clk = 40 MHz with 1.8 V amplitude, ref-clk-out shows only a DC level of 1.8 V, and there is no output on sync-CLK. There is no signal on the output pin.

I would like to know the following:

1. Does the 40 MHz, 1.8 V input on ref-CLK meet the AD9910's requirements? Does the lack of output on ref-CLK-out mean the chip isn't working, or is it only the missing sync-CLK output?
2. Could the FPGA timing sequence be the problem (it has small timing differences from the PDF)? Why is there no output signal - are the registers not being written?