| language | src_encoding | length_bytes | score | int_score | detected_licenses | license_type | text |
|---|---|---|---|---|---|---|---|
Java | UTF-8 | 1,016 | 3.21875 | 3 | [] | no_license | package com.wolf.concurrenttest.bfbczm.foundation;
import java.util.concurrent.TimeUnit;
/**
* Description: Tests interrupting a thread that is blocked in wait()
* Created on 2021/9/2 8:43 AM
*
* @author 李超
* @version 0.0.1
*/
public class WaitNotifyInterupt {
    static Object obj = new Object();

    public static void main(String[] args) throws InterruptedException {
        Thread threadA = new Thread(() -> {
            try {
                System.out.println("---begin---");
                synchronized (obj) {
                    obj.wait(); // if another thread calls interrupt(), this thread throws InterruptedException
                }
                System.out.println("---end---");
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        });
        threadA.start();
        TimeUnit.SECONDS.sleep(1);
        System.out.println("---begin interrupt threadA---");
        threadA.interrupt();
        System.out.println("---end interrupt threadA---");
    }
}
|
Python | UTF-8 | 316 | 2.8125 | 3 | [] | no_license | #!/usr/bin/env python
'''
This client should connect to the server (set to localhost:1234)
and print b'Hello from client\n' to the server
'''
import socket
ADDRESS = '127.0.0.1'
PORT = 1234
my_socket = socket.socket()
my_socket.connect((ADDRESS, PORT))
my_socket.sendall(b'Hello from client\n')
my_socket.close()
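For local testing it helps to have a matching server on the other end. The sketch below is a minimal throwaway server (the single-connection design and the ephemeral port are my own assumptions; the real server is not shown here) that accepts one connection and records what the client sends:

```python
# Minimal throwaway server for exercising the client above.
import socket
import threading

received = []

def serve(server_socket):
    # Accept a single connection and record whatever the client sends.
    conn, _ = server_socket.accept()
    received.append(conn.recv(1024))
    conn.close()

server = socket.socket()
server.bind(('127.0.0.1', 0))  # port 0: let the OS pick a free port
server.listen(1)
thread = threading.Thread(target=serve, args=(server,))
thread.start()

# The client side, same as above, pointed at the ephemeral port.
client = socket.socket()
client.connect(server.getsockname())
client.sendall(b'Hello from client\n')
client.close()
thread.join()
server.close()
print(received[0])  # b'Hello from client\n'
```

Binding to port 0 avoids hard-coding 1234 and colliding with anything already listening there.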
|
Markdown | UTF-8 | 11,402 | 3.0625 | 3 | [
"MIT"
] | permissive | ---
title: Proposal
---
## Who am I
My name is Dustin Schau, and I’ve been a front-end developer for the bulk of my professional career, about five or so years now, but much longer as an amateur. I started off my career designing and developing an internal component library for a Fortune 200 company used by 95% of internal applications. From there, I started working on hybrid mobile applications built with Cordova, and released a number of applications to both the Google Play store as well as the iOS App Store. Lately, I’ve found myself becoming more and more involved in the local community and I’ve given talks on a number of frontend topics, ranging from Webpack, Angular, React, to of course, CSS in JS.
My talk on CSS in JS was actually picked up by a front-end focused web conference in Oklahoma City called ThunderPlains. In preparation for the presentation, I created a web application ([css-in-js-playground.com][cssinjsplayground]) which has been extremely well received by the community (recently featured by CSS Tricks as one of the best frontend tools of the year!). The application lets the user switch through a number of the most common CSS in JS libraries to get a better feel for what it feels like authoring CSS in JS with that library, as well as lets the user change out content which will be instantly reflected in the preview.
As far as my qualifications as a teacher, I haven’t as of yet done any video teaching, but I have done a number of in-person workshops (as well as presentations) so I believe myself to be someone who is more than able to communicate effectively. Just as importantly, with my background in delivering workshops, I have oftentimes found it can be challenging to strike a balance between too easy and too challenging, and I feel confident that I will be able to strike that balance and truly deliver something engaging for audiences of all skill levels with this course.
## About the course
This course is intended to alleviate any misinformation around the concept of CSS in JS, while also laying bare the foundation upon which CSS in JS was laid, why it matters, and the incredible things these techniques can do for an application.
The course will begin by first defining what CSS in JS _is_ and what it _is not_. From this foundational perspective, we can then construct the argument for why CSS in JS exists and what problems it’s intending to solve. From this point forward, we’ll deep dive into how it solves these problems and specifically some “real world” examples of some solutions to common problems that CSS in JS can solve cleanly and clearly. We’ll finish with advanced features like server side rendering, theming support, and even some brief demonstrations of native CSS in JS (e.g. React Native).
The course is intended to be consumed in "bite size" videos, each of which is approximately 5 minutes. In its entirety, the course should contain approximately 3 - 4 hours of content.
## Who is the learner? What do they know? What don’t they know?
The “learner” for this course is someone who wants to grow in their knowledge of alternative CSS techniques (like CSS in JS). I’d like to think of the ideal learner as someone in one of the following categories:
**Beginner**
Someone who has developed an application or two, and is familiar with the syntax and practices of frontend development, as a whole, but also in authoring CSS. This person probably has a single “style.css,” and no significant build tooling in place.
**Intermediate**
Someone who has developed a number of applications, and has used tools and techniques like SASS, LESS, or CSS Modules (itself CSS in JS!) to better structure their styling. He/she likely has used build tools like Gulp, Grunt, Webpack, etc. to structure and modularize their application styling.
**Advanced**
Someone who has developed a great deal of applications, and has likely used tools like SASS, LESS, CSS Modules, etc. _and_ has also at least experimented with CSS in JS techniques like styled-components, glamorous, etc.
**Requirements**
- Basic familiarity with CSS
- Rules, selectors, and class names
- Media queries
- Pseudo styles (`:hover`, `:focus`, etc.)
- Flexbox
- Basic familiarity with React
- CSS in JS does not necessarily _require_ React, but to really teach the most interesting libraries and concepts, most of that is happening in the React ecosystem
- Through this training, we will primarily be using the library styled-components, which is a product of, and primarily used in, the React ecosystem
- Basic understanding of ES2015+, specifically:
- Template literals
- Arrow functions
- Willingness to learn and try something new!
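Since styled-components leans so heavily on tagged template literals, a small library-free sketch may help set expectations (the `tag` function below is hypothetical, purely for illustration): a tag function receives the literal string chunks and the interpolated values separately, which is what lets a CSS in JS library treat interpolations as functions of props.

```javascript
// A tagged template function receives the literal chunks and the interpolated
// values as separate arguments -- the mechanism styled-components builds on.
function tag(strings, ...values) {
  return strings.reduce(
    (out, chunk, i) => out + chunk + (i < values.length ? String(values[i]) : ''),
    ''
  );
}

const color = 'palevioletred';
const rule = tag`color: ${color};`;
console.log(rule); // "color: palevioletred;"
```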
## What will the learner learn to _do_?
At a high level, the course will start as more informational (i.e. drawbacks of CSS, how CSS in JS solves them, etc.) and then quickly get into real development to keep it interesting and applicable. We’ll build a form component, complete with branding and validation styling, as well as theming support (e.g. a dark mode!).
- Differentiate CSS in JS from inline styles, and recognize the benefits of CSS in JS over a solution like inline styles
- Understand the problems (that they have likely faced without realizing it!) present in traditional CSS architectures
- Understand the benefit(s) that CSS in JS provides, e.g.
- Style encapsulation
- Bringing the “component era” to styles/CSS
- Prop injection
- Theming support
- Linking of view and style
- How CSS in JS actually works
- i.e. that it creates a real stylesheet under the hood, and the performance impact of this
- Common libraries for instrumenting CSS in JS
- As well as libraries that don’t require React
- Best practices for authoring with CSS in JS
- The drawbacks of CSS in JS, and when it might make sense to use other techniques
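The “real stylesheet under the hood” point can be sketched without any library. Everything below is a toy illustration (not how styled-components is actually implemented): hash the declarations into a class name and collect the rule, just as a CSS in JS runtime does before injecting a `<style>` tag.

```javascript
// Toy CSS in JS core: derive a class name from the rule text and collect the
// rule so it could later be injected as a real stylesheet.
const sheet = [];

function css(declarations) {
  // Tiny string hash (djb2) -- stands in for the hashing a real library does.
  let hash = 5381;
  for (const ch of declarations) hash = ((hash * 33) ^ ch.charCodeAt(0)) >>> 0;
  const className = 'css-' + hash.toString(36);
  sheet.push(`.${className} { ${declarations} }`);
  return className;
}

const button = css('color: white; background: palevioletred;');
console.log(button);   // a deterministic class name for this rule text
console.log(sheet[0]); // the full rule, ready to be injected
```

Because the class name is derived from the declarations, identical rules always map to the same class, which is also what makes server-side rendering of styles tractable.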
## The project
Our project is simple: we are going to iteratively build up a form--or perhaps something more interesting--like the one seen on [css-in-js-playground.com][cssinjsplayground], but definitely something that will be real-world applicable _and_ demonstrate the benefits of CSS in JS. Through this, we can build up the foundational concepts of CSS in JS (e.g. style encapsulation, props injection, theming, etc.) while also illustrating familiar concepts like media queries, hover styles, etc.
The “labs” will utilize some type of interactive development environment, either an augmented version of [css-in-js-playground.com][cssinjsplayground] tailored to teaching, or perhaps something like CodeSandbox or something built in-house. We will first introduce CSS in JS--what it is, and is _not_--and then immediately get to building something!
## What can the learner do with these skills
As React tends to be non-opinionated about much else outside of the **v**iew in MVC, we previously saw a great many competing libraries attempting to solve things like state management, routing, and other expected app-like utilities.
With the advent of CSS in JS and its explosive growth, we haven’t yet standardized on a definitive library. This course outlines the argument for why CSS in JS techniques are so persuasive and how it can offer tangible value to a developer or development team. From this basis, the developer actually gains experience with using a CSS in JS library (styled-components), and gets a real feel for the development experience of CSS in JS writ large, as well as some of the real-world usage examples that we will go through.
With this increased understanding of CSS in JS, a developer will feel empowered to decide whether CSS in JS offers tangible benefit for his/her application, or whether there are other techniques (e.g. CSS Modules) that may be more appropriate. Regardless of where we end up, it’s the ability to make an informed decision that is the real benefit of this course, and it is the goal above all else that each and every developer will be able to make an informed, measured, and reasonable decision as to their next styling solution.
## Competitive Analysis
As best I can tell, there aren’t any CSS in JS courses available via video learning. There have been a number of recorded presentations (mine included!), screencasts, as well as a great number of blog posts on the topic, but zero training-focused courses.
This gives us a great competitive advantage, as we can deliver something to a market that hasn’t quite been tapped yet but for which I believe there is great demand!
## Schedule
Collectively, I’d like around four to six months to get the course content put together, and I’d like to break it up into about two week chunks and possibly have milestones and status updates during these two week cycles.
### First Month
- Formalization of course content and getting familiar with video recording style
- Planning and additional work re: the product we’re building (e.g. formalization on design, functionality, etc.)
- Further refinement of course content
- By the end of this month, I'd aim to have a fairly firm idea of each 5-minute video I'll be recording
- This section is intentionally light, so as to get more familiar with video recording, content, stylistic/tone concerns, etc.
### Second Month
- Writing the scripts/agenda for the first few sections
- Starting to record at least the first few sections, and getting more familiar with video training
### Third Month
- Writing scripts/agenda for next 1-2 sections
- Authoring with CSS in JS video recording
- Advanced CSS in JS video recording
### Fourth Month
- Writing scripts/agenda for last few sections
- Recording and polishing videos for last few sections
### Fifth and Sixth Months (as needed)
If needed, I'll use the last 1-2 months to polish any content, or go back and tweak material that may have changed or has since been improved. Some trainings may be worth revisiting or redoing on second glance, and some extra padded time to revisit and _perfect_ them will be useful here.
## Course Content
1. Introduction of CSS in JS
- What CSS in JS is
- What CSS in JS is not
- Specifically, how CSS in JS is not “inline styles”
1. Authoring with CSS in JS (styled-components + React)
- Basic set up (CodeSandbox and/or CSS in JS Playground)
- Your first styled component
- Pseudo styles, media queries, and hover effects, oh my!
- Injecting Props
- Inheritance
- Animation
- Extensibility (e.g. allowing components to be customized)
- Using CSS in JS with existing libraries (e.g. Bootstrap, Foundation, etc.)
1. Advanced CSS in JS
- Theming
- Server side and/or static rendering
- Global injection
- When does it _not_ make sense to use CSS in JS libraries?
1. Roll your own CSS in JS (e.g. CSS in JS in Vue or Angular)
- CSS Modules
- Automated tooling
- Scoped styles (a la Vue scoped styles, Angular scoped styles, etc.)
- Libraries that do not require React (JSS, Fela, inline styles, etc.)
1. Wrap Up
- Other libraries (e.g. Glamorous, Emotion, etc.)
- This will sort of necessarily have a React focus since we've already covered other libraries
- Drawbacks of CSS in JS
- Recommended next steps/call to action
[cssinjsplayground]: https://css-in-js-playground.com
|
Java | UTF-8 | 224 | 2.1875 | 2 | [] | no_license | package me.luliru.parctice.algorithm.bit;
/**
* Test
* Created by luliru on 2020/8/27.
*/
public class Test {
    public static void main(String[] args) {
        int a = -8;
        // Arithmetic right shift preserves the sign bit: -8 >> 3 prints -1.
        System.out.println(a >> 3);
    }
}
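Since the snippet above pokes at the arithmetic right shift, a small companion sketch (the `ShiftDemo` class name is mine, purely illustrative) contrasts `>>` with the unsigned shift `>>>`:

```java
public class ShiftDemo {
    public static void main(String[] args) {
        int a = -8;
        // >> is arithmetic: copies of the sign bit are shifted in, so -8 >> 3 == -1.
        System.out.println(a >> 3);
        // >>> is logical: zero bits are shifted in, so -8 >>> 3 == 536870911.
        System.out.println(a >>> 3);
    }
}
```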
|
Go | UTF-8 | 1,508 | 3.140625 | 3 | [
"MIT"
] | permissive | package main
import (
	flag "github.com/ogier/pflag"

	"fmt"
	"os"
	"sort"
	"strings"
)

type helpCmd struct {
	showAll bool
}

func (c *helpCmd) Usage() string {
	return `Usage: auth-server help [--all]
Help page for the auth server.`
}

func (c *helpCmd) Flags() {
	flag.BoolVar(&c.showAll, "all", false, "Show all commands (not just interesting ones)")
}

func (c *helpCmd) Run(args []string) error {
	if len(args) > 0 {
		for _, name := range args {
			command, ok := commands[name]
			if !ok {
				fmt.Fprintf(os.Stderr, "error: unknown command: %s"+"\n", name)
			} else {
				fmt.Fprintf(os.Stdout, command.Usage()+"\n")
			}
		}
		return nil
	}
	fmt.Println("Usage: auth-server <command> [options]")
	fmt.Println()
	fmt.Println(`This is the auth server.
All of auth server's features can be driven through the various commands below.
For help with any of those, simply call them with --help.`)
	fmt.Println()
	fmt.Println("Commands:")
	var names []string
	for name := range commands {
		names = append(names, name)
	}
	sort.Strings(names)
	for _, name := range names {
		if name == "help" {
			continue
		}
		command := commands[name]
		fmt.Printf(" %-16s %s\n", name, summaryLine(command.Usage()))
	}
	fmt.Println()
	return nil
}

func summaryLine(usage string) string {
	for _, line := range strings.Split(usage, "\n") {
		if strings.HasPrefix(line, "Usage:") {
			continue
		}
		if len(line) == 0 {
			continue
		}
		return strings.TrimSuffix(line, ".")
	}
	return "Missing summary."
}
|
Java | UTF-8 | 757 | 2.234375 | 2 | [] | no_license | package com.thd;
import com.thd.dao.UserMapper;
import com.thd.entity.User;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;
import javax.annotation.Resource;
import java.time.LocalDateTime;
@SpringBootTest
@RunWith(SpringRunner.class)
public class InsertTest {
    @Resource
    private UserMapper userMapper;

    @Test
    public void insert() {
        User build = User.builder().name("Kakalote")
                .age(25)
                .email("test5@baomidou.com")
                .createTime(LocalDateTime.now())
                .build();
        int insert = userMapper.insert(build);
        System.out.println(insert);
    }
}
|
Markdown | UTF-8 | 3,614 | 3.578125 | 4 | [
"MIT"
] | permissive | ---
title: "Know Well More than Two Programming Languages"
---
# Know Well More than Two Programming Languages
Researchers in the psychology of programming have known for a long time that programming expertise is directly related to the number of different programming paradigms a programmer is comfortable with. That is not just knowing about them, or knowing a bit of them, but genuinely being able to program with them.
Every programmer starts with one programming language. That language has a dominating effect on the way that programmer thinks about software. No matter how many years of experience the programmer gets using that language, if they stay with that language, they will only know that language. A *one language* programmer is constrained in their thinking by that language.
A programmer who learns a second language will be challenged, especially if that language has a different computational model than the first. C, Pascal, Fortran, all have the same fundamental computational model. Switching from Fortran to C introduces a few, but not many, challenges. Moving from C or Fortran to C++ or Ada introduces fundamental challenges in the way programs behave. Moving from C++ to Haskell is a significant change and hence a significant challenge. Moving from C to Prolog is a very definite challenge.
We can enumerate a number of paradigms of computation: procedural, object-oriented, functional, logic, dataflow, etc. Moving between these paradigms creates the greatest challenges.
Why are these challenges good? It is to do with the way we think about the implementation of algorithms and the idioms and patterns of implementation that apply. In particular, cross-fertilization is at the core of expertise. Idioms for problem solutions that apply in one language may not be possible in another language. Trying to port the idioms from one language to another teaches us about both languages and about the problem being solved.
Cross-fertilization in the use of programming languages has huge effects. Perhaps the most obvious is the increased and increasing use of declarative modes of expression in systems implemented in imperative languages. Anyone versed in functional programming can easily apply a declarative approach even when using a language such as C. Using declarative approaches generally leads to shorter and more comprehensible programs. C++, for instance, certainly takes this on board with its wholehearted support for generic programming, which almost necessitates a declarative mode of expression.
The consequence of all this is that it behooves every programmer to be well skilled in programming in at least two different paradigms, and ideally at least the five mentioned above. Programmers should always be interested in learning new languages, preferably from an unfamiliar paradigm. Even if the day job always uses the same programming language, the increased sophistication of use of that language when a person can cross-fertilize from other paradigms should not be underestimated. Employers should take this on board and allow in their training budget for employees to learn languages that are not currently being used as a way of increasing the sophistication of use of the languages that are used.
Although it's a start, a one-week training course is not sufficient to learn a new language: It generally takes a good few months of use, even if part-time, to gain a proper working knowledge of a language. It is the idioms of use, not just the syntax and computational model, that are the important factors.
By [Russel Winder](http://programmer.97things.oreilly.com/wiki/index.php/Russel_Winder)
|
Markdown | UTF-8 | 11,621 | 3.53125 | 4 | [] | no_license | ---
layout: post
title: "Typeclasses in Scala"
date: 2019-03-18 19:52:02 -0400
categories: jekyll update
---
I was reading the School of Haskell's [Type Families and
Pokemon](https://www.schoolofhaskell.com/school/to-infinity-and-beyond/pick-of-the-week/type-families-and-pokemon)
post, and thought this would be a nice example to attempt to translate into
Scala. I'm fairly certain that the *type families* portion won't be something I
can implement, but this is good practice for Type Classes.
The general goal of this post is to:
* understand how to write a Type Class
* investigate their strengths over other implementations
### Basic Design
So `Pokemon` should have a *type* and a *tier*. For instance `Charmander` is a
`Fire` type with `Tier1`. Also, the Pokemon have strengths vs some other types,
but weaknesses against others.
We also want to give users of this library some easy requirements that will
leverage typeclasses. The basic requirements of a consumer should be something
like:
1. Define a Pokemon Type (eg "air")
2. Put pokemon in that type, with `Tier`
3. Provide a typeclass instance of something that tells how they stack up against other types
One smart way we could implement this is to model the `Ord` type that Haskell
uses. Scala happens to have pretty much the same thing, in the `Ordering` typeclass.
```scala
trait Hierarchy {
  val order: Int
}

case object Tier1 extends Hierarchy {
  override val order = 1
}

case object Tier2 extends Hierarchy {
  override val order = 2
}

case object Tier3 extends Hierarchy {
  override val order = 3
}

implicit val TierOrdering: Ordering[Hierarchy] = new Ordering[Hierarchy] {
  override def compare(x: Hierarchy, y: Hierarchy): Int = x.order - y.order
}
```
Nice, now let's test that this works.
```scala
it("should correctly compare Tiers") {
  Tier1 > Tier2 shouldBe false
  Tier2 > Tier1 shouldBe true
  Tier1 < Tier2 shouldBe true
  Tier3 == Tier3 shouldBe true
}
```
```
[info] Hierarchy
[info] - should correctly compare Tiers
[info] Run completed in 334 milliseconds.
[info] Total number of tests run: 1
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 1, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
```
### Creating the Typeclass
The first thing to do is create a basic outline of how we want to fight (taking
advantage of the typeclass model). We know we need to leave room for "users" to
add their own new types of `Pokemon`, where all they need to do is supply a
typeclass instance.
```scala
def fight[A <: Pokemon, B <: Pokemon](a: A, b: B)(implicit rulesA: Rules[A], rulesB: Rules[B]) = ???
```
Ok so what should `Rules` look like? This is our typeclass that we're going to
ask users to implement for their new types. While we're at it, we should
probably make a simple ADT that expresses the "*decision*" of the fights.
```scala
trait Decision
case class Winner(a: Pokemon) extends Decision
case object Draw extends Decision
case object NotHandled extends Decision

trait Rules[A] {
  def compare(a1: A, a2: Pokemon): Decision
}
```
We need `NotHandled` here so that our implementations leave room for others to
add new `Rules` and `Pokemon` types in the future. Speaking of pokemon, we need
to make a bunch.
```scala
sealed trait Pokemon {
  val tier: Hierarchy
}
case class FirePokemon(override val tier: Hierarchy) extends Pokemon
val charmander = FirePokemon(Tier1)
val charmeleon = FirePokemon(Tier2)
val charizard = FirePokemon(Tier3)
case class WaterPokemon(override val tier: Hierarchy) extends Pokemon
val squirtle = WaterPokemon(Tier1)
val warTortle = WaterPokemon(Tier2)
val blastoise = WaterPokemon(Tier3)
case class GrassPokemon(override val tier: Hierarchy) extends Pokemon
val bulbasaur = GrassPokemon(Tier1)
val ivysaur = GrassPokemon(Tier2)
val venusaur = GrassPokemon(Tier3)
```
Now we have enough information to implement `Rules`. Here is a first pass at
`FirePokemon`.
```scala
implicit val FireRules: Rules[FirePokemon] = new Rules[FirePokemon] {
  def compare(fire: FirePokemon, opponent: Pokemon): Decision = opponent match {
    case water: WaterPokemon => Winner(water)
    case grass: GrassPokemon => Winner(fire)
    case anotherFire: FirePokemon => Draw
    case _ => NotHandled
  }
}
```
As you can see, we're using the `NotHandled` result for the case where we're
fighting a type we didn't know about when we implemented `FireRules`. However,
there is a problem with this approach: we aren't taking into account the tier
of the pokemon. A `FirePokemon` is supposed to beat a `GrassPokemon`, but what
if the grass pokemon is a higher tier? We need to take that into account.
While we're at it, we should implement this same logic for the `Draw` case
(when both are the same `Pokemon` type but potentially different tiers).
```scala
implicit class RichTieredPokemon(a: Pokemon) {
  import TierOrdering._

  def considerTiers(b: Pokemon): Pokemon = if (a.tier < b.tier) b else a
}

private def handleSameType(same1: Pokemon, same2: Pokemon) =
  if (same1.tier == same2.tier)
    Draw
  else
    Winner(same2.considerTiers(same1))
```
The reason this is in an `implicit class` is so that we can use nice dot
notation. It's also nice because we finally get to use the `Ordering`
typeclass we implemented at the beginning.
Taking this new function, we can improve our `FireRules`. While we're at it, we
can implement the others as well.
```scala
implicit val FireRules: Rules[FirePokemon] = new Rules[FirePokemon] {
  def compare(fire: FirePokemon, opponent: Pokemon): Decision = opponent match {
    case water: WaterPokemon => Winner(water.considerTiers(fire))
    case grass: GrassPokemon => Winner(fire.considerTiers(grass))
    case anotherFire: FirePokemon => handleSameType(fire, anotherFire)
    case _ => NotHandled
  }
}

implicit val WaterRules: Rules[WaterPokemon] = new Rules[WaterPokemon] {
  def compare(water: WaterPokemon, opponent: Pokemon): Decision = opponent match {
    case fire: FirePokemon => Winner(water.considerTiers(fire))
    case grass: GrassPokemon => Winner(grass.considerTiers(water))
    case anotherWater: WaterPokemon => handleSameType(water, anotherWater)
    case _ => NotHandled
  }
}

implicit val GrassRules: Rules[GrassPokemon] = new Rules[GrassPokemon] {
  def compare(grass: GrassPokemon, opponent: Pokemon): Decision = opponent match {
    case fire: FirePokemon => Winner(fire.considerTiers(grass))
    case water: WaterPokemon => Winner(grass.considerTiers(water))
    case anotherGrass: GrassPokemon => handleSameType(grass, anotherGrass)
    case _ => NotHandled
  }
}
```
Awesome. Now we have all the things we need to finally implement `fight`.
```scala
def fight[A <: Pokemon, B <: Pokemon](a: A, b: B)(implicit rulesA: Rules[A], rulesB: Rules[B]): Decision = {
  val originalResult = rulesA.compare(a, b)
  originalResult match {
    case NotHandled => rulesB.compare(b, a)
    case _ => originalResult
  }
}
```
First we check whether the first pokemon has all of the information it needs
in its `Rules`, which is true for the original fire, grass, and water types.
If a new type has been added since, we fall back to the rules for that new
type to see whether it has correctly fit its element into the strange pokemon
taxonomy.
Let's be sure that this works correctly.
```scala
describe("PokemonBattle") {
  describe("Fire") {
    it("is a draw with same tiered Fire") {
      val newFireTier1 = FirePokemon(Tier1)
      PokemonBattle.fight(Pokemodels.charmander, newFireTier1) shouldBe Draw
    }
    it("loses to water") {
      val fight = PokemonBattle.fight(Pokemodels.charizard, Pokemodels.blastoise)
      val fightReverse = PokemonBattle.fight(Pokemodels.blastoise, Pokemodels.charizard)
      fight shouldBe Winner(Pokemodels.blastoise)
      fightReverse shouldBe Winner(Pokemodels.blastoise)
    }
    it("wins vs grass") {
      val fight = PokemonBattle.fight(Pokemodels.charizard, Pokemodels.venusaur)
      val fightReverse = PokemonBattle.fight(Pokemodels.venusaur, Pokemodels.charizard)
      fight shouldBe Winner(Pokemodels.charizard)
      fightReverse shouldBe Winner(Pokemodels.charizard)
    }
    it("considers tier correctly") {
      val grassTier3 = GrassPokemon(Tier3)
      val fight = PokemonBattle.fight(Pokemodels.charmeleon, grassTier3)
      val fightReverse = PokemonBattle.fight(grassTier3, Pokemodels.charmeleon)
      fight shouldBe Winner(grassTier3)
      fightReverse shouldBe Winner(grassTier3)
    }
  }
}
```
Running the tests with `sbt`. We see they all pass.
```
[info] PokemonTypeClassesTest:
[info] Hierarchy
[info] - should correctly compare Tiers
[info] PokemonBattle
[info] Fire
[info] - is a draw with same tiered Fire
[info] - loses to water
[info] - wins vs grass
[info] - considers tier correctly
[info] Run completed in 129 milliseconds.
[info] Total number of tests run: 5
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 5, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
```
### Adding a new Pokemon
If we've done everything as planned, we should be able to add a new pokemon
type and fight it against the original 3 types. Let's go ahead and implement
`Air` type Pokemon. Since I have no idea how pokemon actually work, I'm just
going to make stuff up for the taxonomy. Let's just say it only wins against
water, because why not?
```scala
case class AirPokemon(override val tier: Hierarchy) extends Pokemon
val thinger = AirPokemon(Tier1)
val bigChungus = AirPokemon(Tier2)
val oleFarty = AirPokemon(Tier3)

implicit val AirRules: Rules[AirPokemon] = new Rules[AirPokemon] {
  def compare(air: AirPokemon, opponent: Pokemon): Decision = opponent match {
    case fire: FirePokemon => Winner(fire.considerTiers(air))
    case water: WaterPokemon => Winner(air.considerTiers(water))
    case grass: GrassPokemon => Winner(grass.considerTiers(air))
    case anotherAir: AirPokemon => handleSameType(air, anotherAir)
    case _ => NotHandled
  }
}

// test it!
describe("Air") {
  it("wins against water") {
    val fight = PokemonBattle.fight(Pokemodels.oleFarty, Pokemodels.blastoise)
    val fightReverse = PokemonBattle.fight(Pokemodels.blastoise, Pokemodels.oleFarty)
    fight shouldBe Winner(Pokemodels.oleFarty)
    fightReverse shouldBe Winner(Pokemodels.oleFarty)
  }
  it("loses to fire") {
    val fight = PokemonBattle.fight(Pokemodels.bigChungus, Pokemodels.charmeleon)
    val fightReverse = PokemonBattle.fight(Pokemodels.charmeleon, Pokemodels.bigChungus)
    fight shouldBe Winner(Pokemodels.charmeleon)
    fightReverse shouldBe Winner(Pokemodels.charmeleon)
  }
}
```
They pass!
```
[info] PokemonTypeClassesTest:
[info] Hierarchy
[info] - should correctly compare Tiers
[info] PokemonBattle
[info] Fire
[info] - is a draw with same tiered Fire
[info] - loses to water
[info] - wins vs grass
[info] - considers tier correctly
[info] Air
[info] - wins against water
[info] - loses to fire
[info] Run completed in 132 milliseconds.
[info] Total number of tests run: 7
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 7, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
```
### Take aways
This is a really nice illustration of why typeclasses work well for providing
*extensible* libraries. We could give the original grass/water/fire
implementation of the library to anyone. As long as they provide an instance of
the `Rules` typeclass, everything works just like it did with the original.
There is no need to change the library at all!
|
Swift | UTF-8 | 5,600 | 3.109375 | 3 | [] | no_license | //
// DiningHall.swift
// WellesleyFresh
//
// Represents a dining halls' operating hours and times.
//
// Created by Audrey Seo on 22/10/2016.
// Copyright © 2016 Audrey Seo. All rights reserved.
//
import UIKit
class DiningHall {
    var hours: [[HourRange]]
    var days: DayOptions
    var in_between: HourRange
    var whenClosed: DayHourRange

    let nextDay: [String: String] = [
        "Su": "Mo", "Mo": "Tu", "Tu": "We", "We": "Th", "Th": "Fr", "Fr": "Sa", "Sa": "Su"
    ]
    let yesterday: [String: String] = [
        "Su": "Sa", "Sa": "Fr", "Fr": "Th", "Th": "We", "We": "Tu", "Tu": "Mo", "Mo": "Su"
    ]

    init(newDays: [String], newHours: [[[Double]]], meals: [[String]]) {
        hours = [[HourRange]]()
        for i in 0...newHours.count - 1 {
            hours.append([HourRange]())
            for j in 0...newHours[i].count - 1 {
                hours[i].append(HourRange(low: newHours[i][j][0], high: newHours[i][j][1], name: meals[i][j]))
            }
        }
        days = DayOptions(initialDays: newDays)
        in_between = HourRange(low: 0, high: 0, name: "Closed")
        whenClosed = DayHourRange(lowHour: 0.0, highHour: 0.0, name: "Closed", lowDay: 0, highDay: 0)
    }

    func openToday() -> Bool {
        return days.hasToday()
    }

    func nextOpenIndex() -> Int {
        let today = days.todaysWeekDate()
        var count: Int = 0
        //if openToday() && hours[0][0].hours() < 12 {
        //    return days.getOptionForDay(day: today)
        //}
        var next = nextDay[today]
        while count < 7 {
            if days.dayInDayOptions(day: next!) {
                return days.getOptionForDay(day: next!)
            } else {
                next = nextDay[next!]
                count += 1
            }
        }
        return -1
    }

    func whenNextOpen() -> String {
        let today = days.todaysWeekDate()
        var count: Int = 0
        var next = nextDay[today]
        count += 1
        while count < 7 {
            if days.dayInDayOptions(day: next!) {
                return next!
            } else {
                next = nextDay[next!]
                count += 1
            }
        }
        return ""
    }

    func whenLastOpen() -> String {
        let today = days.todaysWeekDate()
        if openToday() && hours[0][0].hours() >= 12 {
            return today
        }
        var count: Int = 0
        var yester = yesterday[today]
        count += 1
        while count < 7 {
            if days.dayInDayOptions(day: yester!) {
                return yester!
            } else {
                yester = yesterday[yester!]
                count += 1
            }
        }
        return ""
    }

    func nextOpenInterval() {
        let lastOpenIndex = days.getOption()
        let nextOpenIndex = self.nextOpenIndex()
        var endHour: Double = -1.0
        var beginHour: Double = -1.0
        if nextOpenIndex >= 0 {
            endHour = hours[nextOpenIndex][0].lowHour
            for i in 0...hours[nextOpenIndex].count - 1 {
                print("#", i, " : ", hours[nextOpenIndex][i].lowHour)
            }
        }
        if lastOpenIndex >= 0 {
            beginHour = hours[lastOpenIndex][hours[lastOpenIndex].count - 1].highHour
        }
        let lastDay: String = whenNextOpen()
        let firstDay: String = whenLastOpen()
        if lastDay != "" && firstDay != "" {
            whenClosed = DayHourRange(lowHour: beginHour, highHour: endHour, name: "Closed", lowWeekDay: firstDay, highWeekDay: lastDay)
        }
    }

    func closed() -> Bool {
        return !days.hasToday()
    }

    func isClosed() -> Bool {
        return currentHours().name().contains("Closed")
    }

    func currentMeal() -> String {
        return currentHours().name()
    }

    func inBetween(_ a: HourRange, b: HourRange) {
        let currentTime = a.currentHour()
        // print("In-Between?: \(currentTime), (\(a.lowHour), \(a.highHour)), (\(b.lowHour), \(b.highHour))")
        if currentTime >= a.highHour && currentTime <= b.lowHour {
            in_between = HourRange(low: a.highHour, high: b.lowHour, name: "Next: \(b.name())")
        }
    }

    func findRange(option: Int) -> HourRange {
        for i in 0...hours[option].count - 1 {
            if hours[option][i].withinRange() {
                return hours[option][i]
            }
            if i < hours[option].count - 1 {
                inBetween(hours[option][i], b: hours[option][i + 1])
                if in_between.withinRange() {
                    return in_between
                }
            }
        }
        return HourRange(low: 0, high: 0, name: "Closed")
    }

    func currentHours() -> HourRange {
        if (self.openToday()) {
            let option = self.days.getOption()
            if (in_between.hasARange()) {
                if (!in_between.withinRange()) {
                    return findRange(option: option)
                } else {
                    return in_between
                }
            } else {
                return findRange(option: option)
            }
        }
        return HourRange(low: 0, high: 0, name: "Closed")
    }

    func closedHours() -> DayHourRange {
if !whenClosed.hasARange() && !whenClosed.withinRange() {
nextOpenInterval()
}
return whenClosed
}
func percentLeft() -> Double {
let current = currentHours()
if (current.name() != "Closed") {
return currentHours().percentTimeElapsed()
}
if !whenClosed.hasARange() && !whenClosed.withinRange() {
nextOpenInterval()
print("When Closed Percent Time Elapsed: ", whenClosed.percentTimeElapsed())
print("Current Hours: ", whenClosed.currentHour())
print("Hours to go: ", whenClosed.totalChange() - whenClosed.currentHour())
}
//print("WHen Closed Beginnings: ", whenClosed.lowDay, ", ", whenClosed.lowHour)
//print("When Closed Endings: ", whenClosed.highDay, ", ", whenClosed.highHour)
return whenClosed.percentTimeElapsed()
// print("Are we open today?: ", self.openToday())
// if self.openToday() {
// print("Hi.")
// let option = self.days.getOption()
// if !in_between.hasARange() {
// for i in 0...hours[option].count - 1 {
// if hours[option][i].withinRange() {
// print("If happened")
// return hours[option][i].percentTimeElapsed()
// }
// }
// } else {
// print("Else happened");
// return in_between.percentTimeElapsed()
// }
// } else {
// print("Other hi")
// return 0
// }
}
}
package se.chalmers.dat255.ircsex.irc;
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
/**
* Created by oed on 10/9/13.
*/
public interface Flavor {
public BufferedReader getInput() throws IOException;
public BufferedWriter getOutput() throws IOException;
public void close() throws IOException;
}
#!/usr/bin/env python
import os
import subprocess
import shutil
from nrgpy.utilities import check_platform, windows_folder_path, linux_folder_path, affirm_directory
class local(object):
"""
nrgpy.convert_rwd.local - use local installation of Symphonie Data Retriever (SDR)
to convert *.RWD files to *.TXT
parameters -
filename : '', if populated, a single file is exported
encryption_pin : '', four digit pin, only used for encrypted files
sdr_path : r'"C:/NRG/SymDR/SDR.exe"', may be any path
site_filter : '', filters files on text in filename
rwd_dir : '', folder to check for RWD files
out_dir : '', folder to save exported TXT files into
wine_folder : '~/.wine/drive_c/', for linux installations
functions -
check_sdr : checks operating system for SDR installation
convert : process files
"""
def __init__(self, rwd_dir='', out_dir='', filename='', encryption_pin='',
sdr_path=r'C:/NRG/SymDR/SDR.exe',
convert_type='meas', site_filter='',
wine_folder='~/.wine/drive_c/', **kwargs):
if encryption_pin != '':
self.command_switch = '/z'
else:
self.command_switch = '/q'
self.encryption_pin = encryption_pin
self.sdr_path = windows_folder_path(sdr_path)[:-1]
self.root_folder = "\\".join(self.sdr_path.split('\\')[:-2])
self.RawData = self.root_folder + '\\RawData\\'
self.ScaledData = self.root_folder + '\\ScaledData\\'
self.site_filter = site_filter
self.rwd_dir = windows_folder_path(rwd_dir) # rwd_dir must be in Windows format, even if using Wine
self.platform = check_platform()
if self.platform == 'win32':
self.out_dir = windows_folder_path(out_dir)
self.file_path_joiner = '\\'
pass
else:
self.out_dir = linux_folder_path(out_dir)
self.check_sdr()
self.file_path_joiner = '/'
if filename != '':
self.filename = filename
self.single_file()
self.wine_folder = wine_folder
def check_sdr(self):
"""
determine if SDR is installed
"""
if self.platform == 'win32':
# do the windows check
try:
os.path.exists(self.sdr_path)
self.sdr_ok = True
except:
self.sdr_ok = False
print('SDR not installed. Please install SDR or check path.\nhttps://www.nrgsystems.com/support/product-support/software/symphonie-data-retriever-software')
else:
# do the linux check
try:
subprocess.check_output(['wine','--version'])
            except (FileNotFoundError, subprocess.CalledProcessError):
print('System not configured for running SDR.\n Please follow instructions in SDR_Linux_README.md to enable.')
try:
subprocess.check_output(['wine',self.sdr_path,'/s','test.rwd'])
print('SDR test OK!')
self.sdr_ok = True
except:
self.sdr_ok = False
print('SDR unable to start')
def convert(self):
"""
process rwd files
1 - create list of RWD files that match filtering
2 - copy RWD files to RawData directory
3 - create out_dir if necessary
4 - iterate through files
"""
self.list_files()
self.copy_rwd_files()
affirm_directory(self.out_dir)
for f in self.rwd_file_list:
site_num = f[:4]
try:
self._filename = "\\".join([self.RawData+site_num,f])
self.single_file()
except:
print('file conversion failed on {}'.format(self._filename))
def list_files(self):
"""
get list of files in rwd_dir
"""
self.dir_paths = []
self.rwd_file_list = []
if self.platform == 'win32':
walk_path = self.rwd_dir
else:
walk_path = linux_folder_path(self.rwd_dir)
for dirpath, subdirs, files in os.walk(walk_path):
self.dir_paths.append(dirpath)
for x in files:
if x.startswith(self.site_filter) and x.lower().endswith('.rwd'):
self.rwd_file_list.append(x)
def single_file(self):
"""
process for converting a single file
"""
_f = self._filename
if self.platform == 'linux':
self.sdr_path = windows_folder_path(self.sdr_path)[:-1]
_f = windows_folder_path(_f)[:-1]
wine = 'wine'
else:
wine = ''
cmd = [wine, '"'+self.sdr_path+'"', self.command_switch, self.encryption_pin, '"'+_f+'"']
try:
print("Converting {}\t\t".format(_f), end="", flush=True)
subprocess.check_output(" ".join(cmd), shell=True)
print("[DONE]")
except:
            print("[FAILED]")
print('unable to convert {}. check ScaledData folder for log file'.format(_f))
try:
print("\tCopying {}\t\t".format(_f[:-3]+'txt'), end="", flush=True)
self.copy_txt_file()
print("[DONE]")
except:
print("[FAILED]")
print('unable to copy {} to text folder'.format(_f))
def copy_rwd_files(self):
"""
copy RWD files from self.RawData to self.rwd_dir
"""
for f in self.rwd_file_list:
if self.site_filter in f:
site_num = f[:4]
site_folder = "\\".join([self.RawData,site_num])
copy_cmd = 'copy'
if self.platform == 'linux':
site_folder = ''.join([self.wine_folder,'NRG/RawData/',site_num])
copy_cmd = 'cp'
try:
affirm_directory(site_folder)
except:
print("couldn't create {}".format(site_folder))
pass
try:
cmd = [copy_cmd,"".join([self.dir_paths[0], f]),self.file_path_joiner.join([site_folder, f])]
subprocess.check_output(" ".join(cmd), shell=True)
except:
print('unable to copy file to RawData folder: {}'.format(f))
def copy_txt_file(self):
"""
copy TXT file from self.ScaledData to self.out_dir
"""
try:
txt_file_name = self._filename.split('\\')[-1][:-4] + '.txt'
txt_file_path = "\\".join([self.ScaledData,txt_file_name])
out_path = self.file_path_joiner.join([self.out_dir,txt_file_name])
except:
            print("unable to build output paths for {}".format(self._filename))
if self.platform == 'linux':
out_path = linux_folder_path(self.out_dir) + txt_file_name
txt_file_path = ''.join([self.wine_folder, 'NRG/ScaledData/',txt_file_name])
try:
shutil.copy(txt_file_path, out_path)
try:
os.remove(txt_file_path)
except:
print("{0} remains in {1}".format(txt_file_name, self.ScaledData))
except:
print("Unable to copy {0} to {1}".format(txt_file_name,self.out_dir))
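# Example usage (illustrative only): the directories and site filter below are
# assumptions, and SDR must already be installed at the default path above.
#
#     converter = local(rwd_dir='C:/Data/RWD', out_dir='C:/Data/TXT',
#                       site_filter='1234')
#     converter.convert()   # copies RWDs in, runs SDR, collects the TXT exports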
const { join } = require('path');
const { readFile } = require('fs');
const queryString = require('querystring');
// eslint-disable-next-line no-unused-vars
const getDataFromApi = require('../getBooksFromApi');
// uncomment it to fill books.json with books data
// getDataFromApi();
const getBooks = (req, res) => {
const filePath = join(__dirname, '..', 'books.json');
readFile(filePath, (err, file) => {
if (err) {
res.writeHead(500, { 'Content-Type': 'text/html' });
res.write('<h1>server error</h1>');
return res.end();
}
res.writeHead(200, { 'Content-Type': 'application/json' });
return res.end(file);
});
};
const getSearchBook = (req, res) => {
const filePath = join(__dirname, '..', 'books.json');
readFile(filePath, (err, file) => {
if (err) {
res.writeHead(500, { 'Content-Type': 'text/html' });
res.write('<h1>server error</h1>');
return res.end();
}
    const searchValue = queryString.parse(req.url.split('?')[1]).q || '';
const parseData = JSON.parse(file);
const filteredData = parseData.filter((el) => el.volumeInfo.title.toLocaleLowerCase()
.includes(searchValue.toLocaleLowerCase()));
res.writeHead(200, { 'Content-Type': 'application/json' });
return res.end(JSON.stringify(filteredData));
});
};
module.exports = { getBooks, getSearchBook };
# Travelling Salesman Problem using Simulated Annealing
It is a Java program with a GUI that solves the **Travelling Salesman Problem** using **Simulated Annealing**, a probabilistic metaheuristic for optimization.
##### Input file specification:
1. First line contains number of nodes (n)
2. Next n lines contain the coordinates of ith node separated by space (x y)
3. Last line contains a fixed number 9999999999
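Following that specification, a minimal input file for a 3-node instance might look like this (coordinates are illustrative):

```
3
0 0
10 0
5 8
9999999999
```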
## Screenshots
---
layout: page
title: About
permalink: /About/
---
<img src="http://people.duke.edu/~nl149/dataplus2018/Alexis.jpg" alt="Alexis Malone" width="200px" class="photo"/>
__Alexis Malone__ is a senior at Duke University majoring in statistical
science. <br> She is interested in African American Studies and Gender Studies, and
hopes to use <br> statistics and computer science to better understand social and
educational inequality.
<img src="https://media.licdn.com/dms/image/C4E03AQFLDjYgiN_Lvg/profile-displayphoto-shrink_800_800/0?e=1538006400&v=beta&t=w9FnYa_FHPhxLMXv7-Vlv8vwdf2YQbn9TmN6cqqoXpg" alt="Sandra Luksic" width="200px" class="photo"/>
__Sandra Luksic__ is a junior at Duke University studying philosophy,
political science, and economics. She enjoys exploring feminist
questions with data science, literature, and philosophy. One day,
she hopes to practice contract law or teach as a philosophy professor.
<img src="https://media.licdn.com/dms/image/C4E03AQHGR2yexS4oag/profile-displayphoto-shrink_200_200/0?e=1538006400&v=beta&t=TMSv0W-3cLByctG6B7h6ZwKSQE4USifQ_K5-mEm-NTw" alt="Nathan Liang" width="200px" class="photo"/>
__Nathan Liang__ is a sophomore at Duke University double-majoring in psychology
and neuroscience. He is interested in exploring the applications of
data science methods to the social sciences and hopes to eventually
become a cognitive psychology research professor.
package com.company.Views;
import com.company.Model.Contacts;
import java.util.ArrayList;
/**
 * Builder that assembles the index page.
*/
public interface IIndexPageBuilder {
public IIndexPageBuilder addUsersList(ArrayList<Contacts> list);
public Page build();
}
var expansions = [];
function Expansion (name, picture, description, price) {
this.packname = name;
this.picture = picture;
this.description = description;
this.price = price;
populateHTML(this);
pushToArray(expansions, this);
}
function pushToArray(array, item) {
array.push(item);
}
function populateHTML(object) {
var myDiv = document.createElement("div");
var myH2 = document.createElement("h2");
var myImg = document.createElement("img");
var myH6 = document.createElement("h6");
var anotherH4 = document.createElement("h4");
var myBtn = document.createElement("button");
myDiv.appendChild(myH2);
myDiv.appendChild(myImg);
myDiv.appendChild(myH6);
myDiv.appendChild(anotherH4);
myDiv.appendChild(myBtn);
  myDiv.className = "bg-light text-center col-12 mx-auto my-4 p-5";
  myH6.className = "px-lg-5 mx-lg- pb-3";
  anotherH4.className = "m-3";
  myImg.className = "col-lg-7";
  myBtn.className = "btn btn-primary btn-lg";
  document.body.style.backgroundColor = "green";
myH2.innerText = object.packname;
myImg.src = object.picture;
myH6.innerText = "Description: " + object.description;
anotherH4.innerText = "Price: $" + object.price;
myBtn.innerText = "Buy Now";
document.getElementsByClassName("row")[0].appendChild(myDiv);
}
new Expansion("The Boomsday Project", "img/boomsday.png", "Can you feel it? That electric tension on the wind, the reek of ozone and Khorium, the distant rumble of explosions; there’s science in the air! Word around the inn is that Dr. Boom has returned to his secret laboratory in the Netherstorm. No one knows exactly what the mad genius will unleash, but a lot of very, VERY strange cards have found their way into decks around here lately: mechanical amalgamations, eerie crystals, revolting, lab-grown monstrosities… the good doctor and his esteemed colleagues are working on something big! Time to venture into the Netherstorm, find the lab, and uncover what these maniacs are up to.", 49.99);
new Expansion("The Witchwood", "img/witchwood.png", "Hush, brave heroes, and take heed; you tread on dangerous ground. See how the light dims all around, and moving shadows creep? The Witchwood calls, but I implore: do not its treasures seek! Stay where it’s safe, pull up a chair, let’s play another round! I see, your minds are quite made up; please hear me importune: Keep your decks close, and your wits sharp…", 49.99);
new Expansion("Kobolds & Catacombs", "img/kobolds.png", "Drawn by tales of untold riches, you have gathered a party of bold adventurers to find fortune in the catacombs. Now, shadows dance as your torches flicker and you descend into the mines. Just then, a gust of wind snuffs out your light. In the sudden dark, you make out a faint glow ahead, coming closer. An all-too familiar cackle echoes through the gloom.", 49.99);
new Expansion("Knights of the Frozen Throne", "img/knights.png", "An icy wind howls outside the inn's walls, but the chill you feel crawling up your spine has little to do with the bitter cold. No deck is safe from the Lich King's evil influence; even the most stalwart champions of the Light have been turned into wicked Death Knights. As the agents of the undead Scourge plague the land, it falls to you to gather your cards, face these vile abominations, and turn their dark powers against them.", 49.99);
new Expansion("Journey to Un'Goro", "img/ungoro.png", "Travel to the forbidding jungles of Un’Goro Crater, where primordial creatures from the dawn of time roam. Here, giant dinosaurs infused with raw magical power mercilessly hunt their prey, prehistoric fauna and flora abound, and unchained elementals run wild with primitive magic. Prepare your decks for the expedition of a lifetime!", 49.99);
new Expansion("Mean Streets of Gadgetzan", "img/gadgetzan.png", "Gadgetzan, the wondrous jewel of Tanaris, reports explosive economic growth and a boom in trade across the board, according to census data released from mayor Noggenfogger’s office. News of the city’s rocket-like ascent comes as no surprise; the recent flux of traders, merchants, businessmen both respectable and otherwise, as well as the daily arrival of more fortune-seekers speaks for itself. Of course, this uptick in activity hasn’t been without its downsides. Many residents complain of increased tension among the movers and shakers of Gadgetzan, which could spill into the streets at any moment… and if it does, you’ll read it here first!", 49.99);
new Expansion("The Grand Tournament", "img/tournament.png", "When the Lich King and his undead Scourge threatened the world, the Argent Crusade called upon Azeroth’s mightiest heroes to prove their mettle in a magnificent tournament. Knights of all races flocked to Northrend, vying for glory in epic battles against fearsome monsters. Though the Lich King’s evil has been vanquished, the Grand Tournament continues… the competitive atmosphere’s just a bit more playful than it used to be.", 49.99);
new Expansion("Goblins vs Gnomes", "img/goblins.png", "Welcome to Goblins vs Gnomes! It’s a Hearthstone expansion, which means you can expect stacks of fresh cards, oodles of unique minions, and countless new ways to play. Use gnomish ingenuity or goblin craftiness to create explosive new strategies, challenge your friends, and defeat your foes.", 49.99);
new Expansion("Whispers of the Old Gods", "img/whispers.png", "For countless millennia, the Old Gods have slept. Now, the time of their awakening draws near; their evil influence has crawled even into the tavern! Can you feel your cards trembling in their decks as the corruption spreads? Some of your old friends have already grown icky masses of tentacles and a downright frightening number of eyeballs!", 49.99);
def compact(lst):
"""Return a copy of lst with non-true elements removed.
>>> compact([0, 1, 2, '', [], False, (), None, 'All done'])
[1, 2, 'All done']
"""
tru = []
for item in lst:
if item:
tru.append(item)
return tru
print(compact([0, 1, 2, '', [], False, (), None, 'All done']))
package me.carda.awesome_notifications.core.managers;
import android.app.NotificationChannelGroup;
import android.app.NotificationManager;
import android.content.Context;
import android.os.Build;
import androidx.annotation.RequiresApi;
import java.util.List;
import me.carda.awesome_notifications.core.Definitions;
import me.carda.awesome_notifications.core.exceptions.AwesomeNotificationsException;
import me.carda.awesome_notifications.core.models.NotificationChannelGroupModel;
import me.carda.awesome_notifications.core.utils.StringUtils;
public class ChannelGroupManager {
private static final SharedManager<NotificationChannelGroupModel> shared
= new SharedManager<>(
StringUtils.getInstance(),
"ChannelGroupManager",
NotificationChannelGroupModel.class,
"NotificationChannelGroup");
public static Boolean removeChannelGroup(Context context, String channelGroupKey) throws AwesomeNotificationsException {
return shared.remove(context, Definitions.SHARED_CHANNEL_GROUP, channelGroupKey);
}
public static List<NotificationChannelGroupModel> listChannelGroup(Context context) throws AwesomeNotificationsException {
return shared.getAllObjects(context, Definitions.SHARED_CHANNEL_GROUP);
}
public static void saveChannelGroup(Context context, NotificationChannelGroupModel channelGroupModel) {
try {
channelGroupModel.validate(context);
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O)
setAndroidChannelGroup(context, channelGroupModel);
shared.set(context, Definitions.SHARED_CHANNEL_GROUP, channelGroupModel.channelGroupKey, channelGroupModel);
} catch (AwesomeNotificationsException e) {
e.printStackTrace();
}
}
public static NotificationChannelGroupModel getChannelGroupByKey(Context context, String channelGroupKey) throws AwesomeNotificationsException {
return shared.get(context, Definitions.SHARED_CHANNEL_GROUP, channelGroupKey);
}
public static void cancelAllChannelGroup(Context context) throws AwesomeNotificationsException {
List<NotificationChannelGroupModel> channelGroupList = shared.getAllObjects(context, Definitions.SHARED_CHANNEL_GROUP);
if (channelGroupList != null){
for (NotificationChannelGroupModel channelGroup : channelGroupList) {
cancelChannelGroup(context, channelGroup.channelGroupKey);
}
}
}
public static void cancelChannelGroup(Context context, String channelGroupKey) throws AwesomeNotificationsException {
NotificationChannelGroupModel channelGroup = getChannelGroupByKey(context, channelGroupKey);
if(channelGroup !=null)
removeChannelGroup(context, channelGroupKey);
}
public static void commitChanges(Context context) throws AwesomeNotificationsException {
shared.commit(context);
}
@RequiresApi(api = Build.VERSION_CODES.O /*Android 8*/)
public static void setAndroidChannelGroup(Context context, NotificationChannelGroupModel newChannelGroup) {
NotificationManager notificationManager = (NotificationManager) context.getSystemService(Context.NOTIFICATION_SERVICE);
notificationManager.createNotificationChannelGroup(new NotificationChannelGroup(newChannelGroup.channelGroupKey, newChannelGroup.channelGroupName));
}
}
__author__ = 'ando'
import logging as log
from functools import partial
from torch.autograd import Variable
import torch as t
import torch.nn as nn
log.basicConfig(format='%(asctime).19s %(levelname)s %(filename)s: %(lineno)s %(message)s', level=log.DEBUG)
class Node2Emb(nn.Module):
'''
    Class that trains the context embedding
'''
def __init__(self, model, negative=5):
'''
:param alpha: learning rate
:param window: windows size used to compute the context embeddings
:param workers: number of thread
:param negative: number of negative samples
:return:
'''
super(Node2Emb, self).__init__()
self.negative = negative
self.node_embedding = model.node_embedding
self.context_embedding = model.node_embedding
def forward(self, input_labels, out_labels, negative_sampling_fn):
"""
        :param input_labels: Tensor with shape of [batch_size] of Long type
:param out_labels: Tensor with shape of [batch_size, window_size] of Long type
:param negative_sampling_fn: Function that sample negative nodes based on the weights
:return: Loss estimation with shape of [1]
"""
use_cuda = self.context_embedding.weight.is_cuda
[batch_size] = out_labels.size()
input = self.node_embedding(Variable(input_labels.view(-1)))
output = self.context_embedding(Variable(out_labels.view(-1)))
log_target = (input * output).sum(1).squeeze().sigmoid().log() # loss of the positive example
        # equivalent to t.bmm(t.transpose(t.unsqueeze(input, 2),1, 2), t.unsqueeze(output, 2)).squeeze()
# SUBSAMPLE
noise_sample_count = batch_size * self.negative
draw = negative_sampling_fn(noise_sample_count)
draw.resize((batch_size, self.negative))
noise = Variable(t.from_numpy(draw))
if use_cuda:
noise = noise.cuda()
noise = self.context_embedding(noise).neg()
sum_log_sampled = t.bmm(noise, input.unsqueeze(2)).sigmoid().log().sum(1).squeeze()
loss = log_target + sum_log_sampled
return -loss.sum() / batch_size
def input_embeddings(self):
return self.node_embedding.weight.data.cpu().numpy()
def transfer_fn(self, model_dic):
'''
        Transfer function from edge=[node_id_1 node_id_b] to edge=[node_idx_1 node_idx_b]
        :param model_dic: dictionary from node_id to node_data. Used for index substitution and sampling
'''
return partial(lambda x, vocab: vocab[x].index, vocab=model_dic)
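As a sanity check on the objective above: forward() computes the skip-gram negative-sampling loss -(log s(u.v) + sum_n log s(-u.n)) averaged over the batch, where s is the sigmoid. The torch-free sketch below mirrors that arithmetic for a single (input, context) pair; the function name is illustrative.

```python
import math

def sgns_loss_reference(u, v, negs):
    """Skip-gram negative-sampling loss for one (input, context) pair,
    on plain Python lists: -(log s(u.v) + sum over negatives of log s(-u.n))."""
    sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    positive = math.log(sigmoid(dot(u, v)))                        # log s(u.v)
    negatives = sum(math.log(sigmoid(-dot(n, u))) for n in negs)   # sum log s(-u.n)
    return -(positive + negatives)
```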
# vpl4vscode README
The vpl4vscode extension lets you interact with the Moodle VPL plugin's webservice directly from VS Code.
## Modification on Moodle VPL plugin
In [externallib.php](https://github.com/jcrodriguez-dis/moodle-mod_vpl/blob/v3.3.8/externallib.php), add the following code at the end of the file to expose run and debug as VPL webservices:
```php
/*
* run function. run the student's submitted files
*/
public static function run_parameters() {
return new external_function_parameters( array (
'id' => new external_value( PARAM_INT, 'Activity id (course_module)', VALUE_REQUIRED ),
'password' => new external_value( PARAM_RAW, 'Activity password', VALUE_DEFAULT, '' )
) );
}
public static function run($id, $password) {
global $USER;
    $params = self::validate_parameters( self::run_parameters(), array (
'id' => $id,
'password' => $password
) );
$vpl = self::initial_checks( $id, $password );
$vpl->require_capability( VPL_SUBMIT_CAPABILITY );
$instance = $vpl->get_instance();
if (! $vpl->is_submit_able()) {
throw new Exception( get_string( 'notavailable' ) );
}
if ($instance->example or ! $instance->run) {
throw new Exception( get_string( 'notavailable' ) );
}
$res = mod_vpl_edit::execute( $vpl, $USER->id, 'run' );
if ( empty($res->monitorPath) ) {
throw new Exception( get_string( 'notavailable' ) );
}
$monitorurl = 'ws://' . $res->server . ':' . $res->port . '/' . $res->monitorPath;
$smonitorurl = 'wss://' . $res->server . ':' . $res->securePort . '/' . $res->monitorPath;
$executeurl = 'ws://' . $res->server . ':' . $res->port . '/' . $res->executionPath;
$sexecuteurl = 'wss://' . $res->server . ':' . $res->securePort . '/' . $res->executionPath;
return array ( 'monitorURL' => $monitorurl, 'smonitorURL' => $smonitorurl,'executeURL' => $executeurl, 'sexecuteURL' => $sexecuteurl );
}
public static function run_returns() {
return new external_single_structure( array (
'monitorURL' => new external_value( PARAM_RAW, 'URL to the service that monitor the evaluation in the jail server' ),
'smonitorURL' => new external_value( PARAM_RAW, 'URL to the service that monitor the evaluation in the jail server' ),
'executeURL' => new external_value( PARAM_RAW, 'URL to the service that execute the evaluation in the jail server'),
'sexecuteURL' => new external_value( PARAM_RAW, 'URL to the service that execute the evaluation in the jail server')
) );
}
/*
* debug function. debug the student's submitted files
*/
public static function debug_parameters() {
return new external_function_parameters( array (
'id' => new external_value( PARAM_INT, 'Activity id (course_module)', VALUE_REQUIRED ),
'password' => new external_value( PARAM_RAW, 'Activity password', VALUE_DEFAULT, '' )
) );
}
public static function debug($id, $password) {
global $USER;
    $params = self::validate_parameters( self::debug_parameters(), array (
'id' => $id,
'password' => $password
) );
$vpl = self::initial_checks( $id, $password );
$vpl->require_capability( VPL_SUBMIT_CAPABILITY );
$instance = $vpl->get_instance();
if (! $vpl->is_submit_able()) {
throw new Exception( get_string( 'notavailable' ) );
}
if ($instance->example or ! $instance->debug) {
throw new Exception( get_string( 'notavailable' ) );
}
$res = mod_vpl_edit::execute( $vpl, $USER->id, 'debug' );
if ( empty($res->monitorPath) ) {
throw new Exception( get_string( 'notavailable' ) );
}
$monitorurl = 'ws://' . $res->server . ':' . $res->port . '/' . $res->monitorPath;
$smonitorurl = 'wss://' . $res->server . ':' . $res->securePort . '/' . $res->monitorPath;
$executeurl = 'ws://' . $res->server . ':' . $res->port . '/' . $res->executionPath;
$sexecuteurl = 'wss://' . $res->server . ':' . $res->securePort . '/' . $res->executionPath;
return array ( 'monitorURL' => $monitorurl, 'smonitorURL' => $smonitorurl, 'executeURL' => $executeurl, 'sexecuteURL' => $sexecuteurl );
}
public static function debug_returns() {
return new external_single_structure( array (
'monitorURL' => new external_value( PARAM_RAW, 'URL to the service that monitor the evaluation in the jail server' ),
'smonitorURL' => new external_value( PARAM_RAW, 'URL to the service that monitor the evaluation in the jail server' ),
'executeURL' => new external_value( PARAM_RAW, 'URL to the service that execute the evaluation in the jail server'),
'sexecuteURL' => new external_value( PARAM_RAW, 'URL to the service that execute the evaluation in the jail server')
) );
}
```
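Assuming these functions are then declared in the plugin's webservice definition (not shown above), they become callable through Moodle's standard REST endpoint. The snippet below is only a sketch: the host, token, activity id, and the registered function name `mod_vpl_run` are all placeholders.

```shell
# All values below are placeholders -- substitute your own site's details.
MOODLE="https://moodle.example.edu"
TOKEN="YOUR_WS_TOKEN"
CMD="curl $MOODLE/webservice/rest/server.php -d wstoken=$TOKEN -d wsfunction=mod_vpl_run -d moodlewsrestformat=json -d id=42"
echo "$CMD"   # inspect the request first; execute it with: eval "$CMD"
```

On success the call returns the JSON structure from `run_returns()`, i.e. the websocket monitor and execution URLs.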
In [view.php](https://github.com/jcrodriguez-dis/moodle-mod_vpl/blob/v3.3.8/view.php), replace
```php
if (vpl_get_webservice_available()) {
echo "<a href='views/show_webservice.php?id=$id'>";
echo get_string( 'webservice', 'core_webservice' ) . '</a><br>';
}
```
by
```php
if (vpl_get_webservice_available()) {
$service_url = vpl_get_webservice_urlbase($vpl);
    echo '<a href="vscode://GuillaumeBlin.vpl4vscode/open?' . $service_url . '" class="btn btn-primary">Import in VS Code</a>';
}
```
## Known Issues
The extension is still in beta; expect incomplete features and rough edges.
<?php
namespace App\Domains\Repository;
class RepositoryProviderFactory
{
static function make($type) {
if ($type && is_string($type)) {
$type = trim(strtolower($type));
}
switch ($type) {
case "github":
$repository = new GitHubProvider();
break;
case "local":
$repository = new LocalProvider();
break;
default:
throw new InvalidRepositoryProviderException();
}
return $repository;
}
}
# Cyrillic ---> Latin transliteration (work in progress)
def next_let_func_1(text, route, index_letter, value_let, capital_letters, lower_case_letters):  # check whether a nearby letter (next or previous) is capital or lower-case
meter_help = 0
for index_new_letter in range(1, 5):
try:
check_letter = text[index_letter + index_new_letter * route] #end of string
except IndexError:
break
if (text[index_letter + index_new_letter * route] in capital_letters) and (index_new_letter == 1): #next letter is capital
value_let = 0
break
elif text[index_letter + index_new_letter * route] in lower_case_letters: #letter is lower-case
value_let = 1
break
elif (text[index_letter + index_new_letter * route] in capital_letters) and (index_new_letter != 1): #letter is capital
meter_help += 1
elif not(text[index_letter + index_new_letter * route] in capital_letters) and not(text[index_letter + index_new_letter * route] in lower_case_letters): #this is not letter
meter_help -= 1
elif meter_help == 2:
value_let = 0
break
return value_let
def next_let_func_2(text, index_letter, check_list, route):
try:
check_letter = text[index_letter + route]
except IndexError:
return 3
if check_letter in check_list:
return 2
else:
return 3
def text_tran(text):
alphabet = {'А': 'A', 'О': 'O', 'І': 'I', 'Е': 'E', 'У': 'U', 'И': 'Y', 'Ї': 'Ї', 'Є': 'Je', 'Я': 'Ja', 'Ю': 'Ju', 'а': 'a', 'о': 'o',
'і': 'i', 'е' : 'e', 'у': 'u', 'и': 'y', 'ї': 'ї', 'є': 'je', 'я': 'ja', 'ю': 'ju', 'Б': 'B', 'В': 'V', 'Г': 'G', 'Ґ': 'Ğ', 'Д': 'D',
'Ж': 'Ž', 'З': 'Z', 'Й': 'J', 'К': 'K', 'Л': 'L', 'М': 'M', 'Н': 'N', 'П': 'P', 'Р': 'R', 'С': 'S', 'Т': 'T', 'Ф': 'F', 'Х': 'H',
'Ц': 'C', 'Ч': 'Č', 'Ш': 'Š', 'Щ': 'Šč', 'б': 'b', 'в': 'v', 'г': 'g', 'ґ': 'ğ', 'д': 'd', 'ж': 'ž', 'з': 'z', 'й': 'j', 'к': 'k',
'л': 'l', 'м': 'm', 'н': 'n', 'п': 'p', 'р': 'r', 'с': 's', 'т': 't', 'ф': 'f', 'х': 'h', 'ц': 'c', 'ч': 'č', 'ш': 'š', 'щ': 'šč'}
alphabet_set = ('а', 'б', 'в', 'г', 'ґ', 'д', 'е', 'є', 'ж', 'з', 'и', 'і', 'ї', 'й', 'к', 'л', 'м', 'н', 'о', 'п', 'р', 'с', 'т', 'у',
'ф', 'х', 'ц', 'ч', 'ш', 'щ', 'ь', 'ю', 'я', '’', "'", 'А', 'Б', 'В', 'Г', 'Ґ', 'Д', 'Е', 'Є', 'Ж', 'З', 'И', 'І', 'Ї', 'Й', 'К', 'Л',
'М', 'Н', 'О', 'П', 'Р', 'С', 'Т', 'У', 'Ф', 'Х', 'Ц', 'Ч', 'Ш', 'Щ', 'Ь', 'Ю', 'Я')
other_letters = {'Д': 'Ď', 'Т': 'Ť', 'З': 'Ź', 'С': 'Ś', 'Ц': 'Ć', 'Л': 'Ľ', 'Н': 'Ń', 'д': 'ď', 'т': 'ť', 'з': 'ź', 'с': 'ś', 'ц': 'ć',
'л': 'ľ', 'н': 'ń', 'Щ': 'ŠČ', 'Є': 'JE', 'Ю': 'JU', 'Я': 'JA'}
capital_letters = ('А', 'О', 'І', 'Е', 'У', 'И', 'Ї', 'Б', 'В', 'Г', 'Ґ', 'Ж', 'Й', 'К', 'М', 'П', 'Р', 'Ф', 'Х', 'Ч', 'Ш', 'Д', 'Т', 'З', 'С',
'Ц', 'Л', 'Н', 'Щ', 'Я', 'Ю', 'Є', 'Ь')
lower_case_letters = ('а', 'о', 'і', 'е', 'у', 'и', 'ї', 'б', 'в', 'г', 'ґ', 'ж', 'й', 'к', 'м', 'п', 'р', 'ф', 'х', 'ч', 'ш', 'д', 'т', 'з', 'с',
'ц', 'л', 'н', 'щ', 'я', 'ю', 'є', 'ь')
simple_letters = ('А', 'О', 'І', 'Е', 'У', 'И', 'Ї', 'а', 'о', 'і', 'е', 'у', 'и', 'ї', 'Б', 'В', 'Г', 'Ґ', 'Ж', 'Й', 'К', 'М', 'П', 'Р',
'Ф', 'Х', 'Ч', 'Ш', 'б', 'в', 'г', 'ґ', 'ж', 'й', 'к', 'м', 'п', 'р', 'ф', 'х', 'ч', 'ш', 'щ', 'є', 'ю', 'я')
letters_with = ('Д', 'Т', 'З', 'С', 'Ц', 'Л', 'Н', 'д', 'т', 'з', 'с', 'ц', 'л', 'н')
b_double_letters = ('Щ', 'Є', 'Ю', 'Я')
text_out, index_letter, text_status = '', -1, True
# print(len(text))
# print(text[0]+text[1])
# print(ord(text[0]), len("П"))
for letter in text:
index_letter, value_let = index_letter + 1, -1
if letter == '*':
value_let = next_let_func_2(text, index_letter, ('*'), 1) + next_let_func_2(text, index_letter, ('*'), 2)
text_status = not(text_status) if value_let == 4 else text_status
if text_status == False:
text_out += letter
continue
if letter in simple_letters:
text_out += alphabet[letter]
elif letter in letters_with:
value_let = next_let_func_2(text, index_letter, ('Ь', 'ь'), 1)
text_out = text_out + other_letters[letter] if value_let == 2 else text_out + alphabet[letter]
elif letter in b_double_letters:
value_let = next_let_func_1(text, 1, index_letter, value_let, capital_letters, lower_case_letters)
if value_let == -1:
value_let = next_let_func_1(text, -1, index_letter, value_let, capital_letters, lower_case_letters)
text_out = text_out + alphabet[letter] if value_let == 1 else text_out + other_letters[letter]
elif letter in ('Ь', 'ь'):
value_let = next_let_func_2(text,index_letter, alphabet_set, -1)
text_out = text_out + '' if value_let == 2 else text_out + letter
else:
text_out += letter
return text_out |
Python | UTF-8 | 4,812 | 3.421875 | 3 | [] | no_license | import sys
import socket
import string
import datetime
# IP Address for the Virtual Machine:
HOST = "10.0.42.17"
# default Connection Port
PORT = 6667
# nickname of the Bot
NICK = "ProBot"
# username of the Bot
USERNAME = "ProBotUserName"
# real Name of the bot
REALNAME = "RealBot"
# buffer for the messages sent to the channel that the bot is connected to
readbuffer = ""
# channel #test that the bot will automatically connect to once it starts running
CHANNEL = "#test"
# socket that the bot will use to communicate to the server
botSocket = socket.socket()
# connection to the server
botSocket.connect((HOST, PORT))
# bot sends automatically the username and the real name of the Bot to the server
botSocket.send(bytes("USER " + USERNAME + " " + USERNAME + " " + USERNAME + " :" + REALNAME + "\r\n", "UTF-8"))
# bot sends automatically the nickname of the Bot to the server
botSocket.send(bytes("NICK " + NICK + "\r\n", "UTF-8"))
# bot automatically joins the channel set in advance, on the server
botSocket.send(bytes("JOIN %s" % (CHANNEL) + "\r\n", "UTF-8"))
# while loop used to run the bot indefinitely until the bot is stopped
while 1:
# buffer that stores the messages from the channel
readbuffer = readbuffer + botSocket.recv(1024).decode("UTF-8")
# variable to store and split the messages from the buffer
temp = str.split(readbuffer, "\n")
# popping from the temp variable
readbuffer = temp.pop()
# for loop to iterate through the different lines in temp
for line in temp:
# striping the line
line = str.rstrip(line)
# splitting the line
line = str.split(line)
        # respond to server PINGs so that the connection is not dropped
        if (len(line) > 1 and line[0] == "PING"):
            botSocket.send(bytes("PONG " + line[1] + "\r\n", "UTF-8"))
            continue
        # checking if a private message is being sent to the bot
        if (len(line) > 3 and line[1] == "PRIVMSG"):
# variable to store the sender of the private message
sender = ""
# iterating through every character in the first element of the line array
for char in line[0]:
# break if an exclamation sign is found in the first element of the line array
if (char == "!"):
break
# checking if there is any colon, set the sender to the character
if (char != ":"):
sender += char
# if any user inputs !day the bot will send the date in a day/month/year format back to the channel and #
# on a private message to the user that asked for it
if line[3] == ":!day":
date = datetime.datetime.now()
message = date.strftime("%d/%m/%Y")
botSocket.send(bytes("PRIVMSG %s %s" % (CHANNEL, message) + "\r\n", "UTF-8"))
botSocket.send(bytes("PRIVMSG %s %s" % (sender, message) + "\r\n", "UTF-8"))
# if any user inputs !time the bot will send the time in a hour/minute/second format back to the channel #
# and on a private message to the user that asked for it
elif line[3] == ":!time":
time = datetime.datetime.now()
message = time.strftime("%H:%M:%S")
botSocket.send(bytes("PRIVMSG %s %s" % (CHANNEL, message) + "\r\n", "UTF-8"))
botSocket.send(bytes("PRIVMSG %s %s" % (sender, message) + "\r\n", "UTF-8"))
# if the user sends anything else to the bot that is not !day or !time, the user will receive a #
# personalised private message
else:
import random
# random number used to choose which random fact it will be displayed
number = random.randint(1, 6)
# function that displays a random fact depending on a number i, it works using a switcher
def fact(i):
# switcher used to choose the random fact
switcher = {
1: 'Did you know that most toilets flush in E flat?',
                        2: 'Did you know that sloths can hold their breath longer than dolphins can?',
3: 'Did you know that Scotland has 421 words for "snow"?',
4: 'Did you know octopuses have three hearts?',
5: 'Did you know that Empire State Building has its own post code?',
6: 'Did you know that the Eiffel Tower was originally intended for Barcelona?'
}
return switcher.get(i, "Not valid number")
                # Random fact that will be displayed to the user
                fact_text = fact(number)
                # Displaying the random fact to the user
                botSocket.send(bytes("PRIVMSG %s :%s" % (sender, fact_text) + "\r\n", "UTF-8"))
|
JavaScript | UTF-8 | 374 | 2.8125 | 3 | [] | no_license | export function getBase64(img, callback) {
const reader = new FileReader();
reader.addEventListener('load', () => callback(reader.result));
reader.readAsDataURL(img);
}
/**
 * Check whether a file name refers to an image file
 */
export function isImageFile(fileName) {
  return /\.(jpg|jpeg|png|gif|bmp)$/.test(fileName.toLowerCase());
}
|
C++ | UTF-8 | 8,541 | 3.40625 | 3 | [] | no_license | Part 1: DP algorithm for the longest common subsequence (LCS) of string S (length m) and string T (length n):
1. Create a 2D array dp[m+1][n+1]: dp[i][j] is the length of the LCS of S[0...i-1] and T[0...j-1]
2. Initialize dp[][] to 0; dp[0][0], dp[i][0] and dp[0][j] always keep their initial value 0
3. Loop over i: 1 to m and j: 1 to n (the state-transition equations are as follows:)
   (1) if (S[i-1]==T[j-1]): dp[i][j] = dp[i-1][j-1] + 1
   (2) if (S[i-1]!=T[j-1]): option 1 — give up S[i-1] this time; then dp[i][j] = LCS length of S[0...i-2] and T[0...j-1], i.e. dp[i-1][j]
       option 2 — give up T[j-1] this time and let S[i] try to match T[0...j-2] to see whether a longer common subsequence can be formed; then dp[i][j] = LCS length of S[0...i-1] and T[0...j-2], i.e. dp[i][j-1]
       summary: when S[i-1] != T[j-1], dp[i][j] = max(dp[i-1][j], dp[i][j-1])
4. After the loops finish, the value in dp[m][n] is the length of the longest common subsequence
Part 2: recover the longest common subsequence str from the dp array (run Part 1 first)
1. i: m down to 1, j: n down to 1
2. if (S[i-1] == T[j-1]), then S[i-1]/T[j-1] were chosen as one character of the common subsequence: prepend the character S[i-1] to str, i--, j--
3. if (S[i-1] != T[j-1]), compare dp[i-1][j] with dp[i][j-1].
   (1) If dp[i-1][j] is larger, then when matching S[i] and T[j] the DP chose to give up S[i-1] and directly took the subsequence length of S[0...i-2] and T[0...j-1],
   so i--;
   (2) If dp[i][j-1] is larger, then when matching S[i] and T[j] the DP chose to give up T[j-1] and directly took the subsequence length of S[0...i-1] and T[0...j-2],
   so j--;
   (3) If dp[i-1][j] == dp[i][j-1], then when matching S[i] and T[j] there were two choices — give up S[i] or give up T[j] — and both choices lead to a longest common subsequence
PS: the code below computes a wrong result (presumably because the direction[i][j]==4 case is ill-defined); there is a much simpler approach for counting the number of subsequences
/*
Part 3: count the possible number of common subsequences str (run Part 1 first; following the algorithm of Part 2, count the ways of walking the "maze" from the bottom-right corner to the top-left corner; each time the top-left corner is reached, count++)
1. Recursion base case: if [i==0 || j==0], count++, return to the previous level
2. if (S[i-1] == T[j-1]), then S[i-1]/T[j-1] were chosen as one character of the common subsequence: recurse on (i-1, j-1)
3. if (S[i-1] != T[j-1]), compare dp[i-1][j] with dp[i][j-1].
   (1) If dp[i-1][j] == dp[i][j-1], then when matching S[i-1] and T[j-1] there were two choices — give up S[i-1] or give up T[j-1]: recurse first on (i-1, j), then on (i, j-1)
   (2) If dp[i-1][j] is larger, then the DP chose to give up S[i-1]: recurse on (i-1, j)
   (3) If dp[i][j-1] is larger, then the DP chose to give up T[j-1]: recurse on (i, j-1)
PS: to avoid too many numeric comparisons, use a direction array direction[i][j]: value 1 means S[i-1] == T[j-1]; value 2 means dp[i-1][j] is larger; value 3 means dp[i][j-1] is larger; value 4 means dp[i-1][j] == dp[i][j-1]
*/
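The length computation of Part 1 and the walk-back of Part 2 can be sketched as one small standalone function (separate from the Solution class below; the function name is illustrative):

```cpp
#include <algorithm>
#include <string>
#include <utility>
#include <vector>
using namespace std;

// Part 1: fill dp[i][j] = LCS length of s[0..i-1] and t[0..j-1].
// Part 2: walk the table back from dp[m][n] to recover one LCS.
pair<int, string> longestCommonSubsequence(const string& s, const string& t) {
    int m = s.size(), n = t.size();
    vector<vector<int>> dp(m + 1, vector<int>(n + 1, 0));
    for (int i = 1; i <= m; i++)
        for (int j = 1; j <= n; j++)
            dp[i][j] = (s[i - 1] == t[j - 1])
                           ? dp[i - 1][j - 1] + 1
                           : max(dp[i - 1][j], dp[i][j - 1]);
    string lcs;
    int i = m, j = n;
    while (i > 0 && j > 0) {
        if (s[i - 1] == t[j - 1]) {            // chosen as a character of the LCS
            lcs.insert(lcs.begin(), s[i - 1]);
            i--; j--;
        } else if (dp[i - 1][j] >= dp[i][j - 1]) {
            i--;                               // the DP gave up s[i-1] here
        } else {
            j--;                               // the DP gave up t[j-1] here
        }
    }
    return {dp[m][n], lcs};
}
```

For s = "ABCBDAB" and t = "BDCABA" the returned length is 4, matching the dp[m][n] value described above.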
class Solution {
public:
int count=0;
int numDistinct(string s, string t) {
int i,j,m,n;
m = s.length();
n = t.length();
if(m < n || m==0 || n==0){
return 0;
}
int dp[m+1][n+1];
vector<int> temp(n+1,0);
        vector<vector<int>> direction(m+1,temp);//direction[i][j]: 1: [i,j]->[i-1,j-1], 2: [i,j]->[i-1,j], 3: [i,j]->[i,j-1], 4: both [i-1,j] and [i,j-1] must be traversed
// int direction[m+1][n+1];
for(i=0;i<=m;i++){
for(j=0;j<=n;j++){
dp[i][j]=0;
// direction[i][j]=0;
}
}
        //dynamic programming: compute the length of the longest common subsequence
for(i=1;i<=m;i++){
for(j=1;j<=n;j++){
if(s[i-1] == t[j-1]){
dp[i][j] = dp[i-1][j-1] + 1;
direction[i][j] = 1;
}
else{
if(dp[i-1][j] > dp[i][j-1]){
dp[i][j] = dp[i-1][j];
direction[i][j] = 2;
}
else{
if(dp[i-1][j] < dp[i][j-1]){
dp[i][j] = dp[i][j-1];
direction[i][j] = 3;
}
else{
                            dp[i][j] = dp[i][j-1];//dp[i-1][j] would work too
direction[i][j] = 4;
}
}
}
}
}
// for(i = 1; i <= m; i++)
// {
// for(j = 1; j <= n; j++)
// {
// cout << dp[i][j] << " ";
// }
// cout << endl;
// }
if(dp[m][n] < n){
return 0;
}
cal_seqnum(m,n,direction);
return count;
}
    void cal_seqnum(int i,int j,vector<vector<int>> direction){
        if(i==0 || j==0){
            count++;
            return;
        }
        if(direction[i][j] == 1){
            cal_seqnum(i-1,j-1,direction);
        }
        else{
            if(direction[i][j] == 2){
                cal_seqnum(i-1,j,direction);
            }
            else{
                if(direction[i][j] == 3){
                    cal_seqnum(i,j-1,direction);
                }
                else{
                    cal_seqnum(i-1,j,direction);
                    cal_seqnum(i,j-1,direction);
                }
            }
        }
    }
};
Given strings S and T: if T can be obtained from S by deleting the characters at certain positions, that counts as one subsequence
The recursive code is easy to understand, but it will exceed the time limit
class Solution {
public:
int numDistinct(string S, string T)
{
if (S.size() < T.size()) return 0;
        if (S.size() == T.size() && S == T) return 1; // what should be returned when both are empty???
        if (S[0] != T[0]){//if S[0] and T[0] of this recursion level do not match, we can only delete the current S[0] and then match S[1...m-1] against T
return numDistinct(S.substr(1), T);
}
        else{//if S[0]==T[0] at this level, there are two options: 1. use S[0] to match away T[0], consuming both; the next level matches S[1..m-1] and T[1...n-1]; 2. give up matching T[0] with S[0] this time, as if S[0] did not exist (i.e. consume only S[0]); the next level matches S[1...m-1] and T[0...n-1]
return numDistinct(S.substr(1), T.substr(1)) + numDistinct(S.substr(1), T);
}
}
};
Use dp[i][j] to record the number of occurrences of T[0...j-1] as a subsequence of S[0...i-1]; the final target is then dp[S.size()][T.size()];
Initialization: when j = 0, dp[i][0] = 1, because every prefix can be turned into the empty string by deleting all of its characters, and in exactly one way.
The recurrence is as follows:
1. i and j both start from 1, and j must not exceed i, because the matched length starts at 1 and j greater than i is meaningless
2. If i == j, then dp[i][j] = (S.substr(0, i) == T.substr(0, j)); // i.e. whether S[0..i-1] and T[0...j-1] match completely
3. If i != j, there are two cases
   (1) when S[i-1] != T[j-1], we can only look for T[0...j-1] as a subsequence inside S[0...i-2], so dp[i][j] = dp[i - 1][j];
   (2) when S[i-1] == T[j-1], there are two options for the character T[j-1]: option 1 — T[j-1] is matched away by S[i-1]; the count of option 1 = number of occurrences of T[0...j-2] as a subsequence of S[0...i-2] = dp[i-1][j-1]
       option 2 — T[j-1] is not matched against S[i-1] this time, as if S[i-1] did not exist (i.e. only S[i-1] is consumed); the count of option 2 = number of occurrences of T[0...j-1] as a subsequence of S[0...i-2] = dp[i-1][j]
   so dp[i][j] = dp[i - 1][j - 1] + dp[i - 1][j];
class Solution {
public:
int numDistinct(string s, string t){
int m = s.length();
int n = t.length();
if(m < n){
return 0;
}
        if(n == 0){//if t is empty, by convention t is counted as occurring once
return 1;
}
int dp[m+1][n+1];
for(int i=0;i<=m;i++){
dp[i][0] = 1;
}
for(int i=1;i<=m;i++){
for(int j=1;j<=i && j<=n;j++){
if(i == j){
dp[i][j] = (s.substr(0,i) == t.substr(0,j));
}
                else{//the bounds on j include j<=i && j<=n, so here i > j
if(s[i-1] == t[j-1]){
dp[i][j] = dp[i-1][j-1] + dp[i-1][j];
}
else{
dp[i][j] = dp[i-1][j];
}
}
}
}
return dp[m][n];
}
}; |
Markdown | UTF-8 | 1,246 | 3.296875 | 3 | [] | no_license | ## LoadingCards
### Android library to create cards on your screen. Before its data is loaded, each card shows an animation that makes the user believe the data is being loaded quickly.
### Usage
#### A card consists of a Bitmap, a title and a subtitle. It may contain some extras, but these are the basics. To create a list of cards, use LoadingCardList. To add a card, implement the loading logic in fetchBitmap, fetchTitle and fetchSubTitle.
#### Creating LoadingCardList in Activity
```
LoadingCardList loadingCardList = new LoadingCardList(this);
```
#### Adding a loading card along with its processor
```
loadingCardList.addLoadingCard(new CardProcessor(){
public Bitmap fetchBitmap() {
//Your blocking logic
return null;
}
public String fetchTitle() {
//Your blocking logic
return null;
}
public String fetchSubTitle() {
//Your blocking logic
return null;
}
});
```
#### Showing loading card
```
loadingCardList.show();
```
#### Demo
<img src="https://github.com/Anwesh43/LoadingCards/blob/master/demo/loadingcards.gif" width="300px" height="500px">
|
Go | UTF-8 | 4,945 | 3.21875 | 3 | [
"MIT"
] | permissive | package data
import (
"encoding/json"
"testing"
)
func TestAccess(t *testing.T) {
t.Parallel()
a := NewAccess(0)
if a == nil {
t.Fatal("Failed to Create")
}
if a.Level != 0 || a.Flags != 0 {
t.Error("Bad init")
}
a = NewAccess(100, "aBC", "d")
if a.Level != 100 {
t.Error("Level was not set")
}
for _, v := range "aBCd" {
if !a.HasFlag(v) {
t.Errorf("Flag %c was not found.", v)
}
}
}
func TestAccess_HasLevel(t *testing.T) {
t.Parallel()
a := NewAccess(50)
var table = map[uint8]bool{
50: true,
49: true,
51: false,
}
for level, result := range table {
if res := a.HasLevel(level); res != result {
t.Errorf("HasLevel %v resulted in: %v", level, res)
}
}
}
func TestAccess_HasFlag(t *testing.T) {
t.Parallel()
a := NewAccess(0, "aBC", "d")
for _, v := range "aBCd" {
if !a.HasFlag(v) {
t.Errorf("Flag %c was not found.", v)
}
}
}
func TestAccess_HasFlags(t *testing.T) {
t.Parallel()
a := NewAccess(0, "aBC", "d")
if !a.HasFlags("aBCd") {
t.Error("Flags were not all found.")
}
if !a.HasFlags("aZ") || !a.HasFlags("zB") {
t.Error("Flags should or together for access.")
}
}
func TestAccess_SetFlag(t *testing.T) {
t.Parallel()
a := NewAccess(0)
if a.HasFlag('a') {
t.Error("Really bad init.")
}
a.SetFlag('a')
if !a.HasFlag('a') {
t.Error("Set flag failed")
}
a.SetFlag('A')
if !a.HasFlag('A') {
t.Error("Set flag failed")
}
a.SetFlag('!')
if !a.HasFlag('a') || !a.HasFlag('A') || a.HasFlag('!') {
t.Error("Set flag failed")
}
}
func TestAccess_MultiEffects(t *testing.T) {
t.Parallel()
a := NewAccess(0)
a.SetFlags("ab", "A")
if !a.HasFlags("ab", "A") {
t.Error("Set flags failed")
}
a.ClearFlags("ab", "A")
if a.HasFlags("a") || a.HasFlags("b") || a.HasFlags("A") {
t.Error("Clear flags failed")
}
}
func TestAccess_ClearFlag(t *testing.T) {
t.Parallel()
a := NewAccess(0, "aBCd")
for _, v := range "aBCd" {
if !a.HasFlag(v) {
t.Errorf("Flag %c was not found.", v)
}
}
a.ClearFlag('a')
a.ClearFlag('C')
for _, v := range "Bd" {
if !a.HasFlag(v) {
t.Errorf("Flag %c was not found.", v)
}
}
for _, v := range "aC" {
if a.HasFlag(v) {
t.Errorf("Flag %c was found.", v)
}
}
}
func TestAccess_ClearAllFlags(t *testing.T) {
t.Parallel()
a := NewAccess(0, "aBCd")
a.ClearAllFlags()
for _, v := range "aBCd" {
if a.HasFlag(v) {
t.Errorf("Flag %c was found.", v)
}
}
}
func TestAccess_IsZero(t *testing.T) {
t.Parallel()
a := Access{}
if !a.IsZero() {
t.Error("Should be zero.")
}
a.SetAccess(1, "a")
if a.IsZero() {
t.Error("Should not be zero.")
}
}
func TestAccess_String(t *testing.T) {
t.Parallel()
var table = []struct {
Level uint8
Flags string
Expect string
}{
{100, "aBCd", "100 BCad"},
{0, wholeAlphabet, allFlags},
{0, "BCad", "BCad"},
{100, "", "100"},
{0, "", none},
}
for _, test := range table {
a := NewAccess(test.Level, test.Flags)
if was := a.String(); was != test.Expect {
t.Errorf("Expected: %s, was: %s", test.Expect, was)
}
}
}
func Test_getFlagBits(t *testing.T) {
t.Parallel()
bits := getFlagBits("Aab")
aFlag, bFlag, AFlag := getFlagBit('a'), getFlagBit('b'), getFlagBit('A')
if aFlag != aFlag&bits {
t.Error("The correct bit was not set.")
}
if bFlag != bFlag&bits {
t.Error("The correct bit was not set.")
}
if AFlag != AFlag&bits {
t.Error("The correct bit was not set.")
}
}
func Test_getFlagBit(t *testing.T) {
t.Parallel()
var table = map[rune]uint64{
'A': 0x1,
'Z': 0x1 << (nAlphabet - 1),
'a': 0x1 << nAlphabet,
'z': 0x1 << (nAlphabet*2 - 1),
'!': 0x0, '_': 0x0, '|': 0x0,
}
for flag, expect := range table {
if bit := getFlagBit(flag); bit != expect {
t.Errorf("Flag did not match: %c, %X (%X)",
flag, expect, bit)
}
}
}
func Test_getFlagString(t *testing.T) {
t.Parallel()
var table = map[uint64]string{
0x1: "A",
0x1 << (nAlphabet - 1): "Z",
0x1 << nAlphabet: "a",
0x1 << (nAlphabet*2 - 1): "z",
}
for bit, expect := range table {
if flag := getFlagString(bit); flag != expect {
t.Errorf("Flag did not match: %X, %s (%s)",
bit, expect, flag)
}
}
bits := getFlagBit('a') | getFlagBit('b') | getFlagBit('A') |
1<<(nAlphabet*2+1)
should := "Aab"
if was := getFlagString(bits); was != should {
t.Errorf("Flag string should be: (%s) was: (%s)", should, was)
}
}
func TestAccess_JSONifying(t *testing.T) {
t.Parallel()
a := NewAccess(23, "DEFabc")
var b Access
str, err := json.Marshal(a)
if err != nil {
t.Error(err)
}
if string(str) != `{"level":23,"flags":469762104}` {
t.Errorf("Wrong JSON: %s", str)
}
if err = json.Unmarshal(str, &b); err != nil {
t.Error(err)
}
if *a != b {
t.Error("A and B differ:", a, b)
}
}
func TestAccess_Protofy(t *testing.T) {
t.Parallel()
a := NewAccess(23, "DEFabc")
var b Access
b.FromProto(a.ToProto())
if *a != b {
t.Error("A and B differ:", a, b)
}
}
|
Java | UTF-8 | 257 | 2.359375 | 2 | [] | no_license | package com.pi.common;
/**
* Interface representing any disposable object.
*
* @author Westin
*
*/
public interface Disposable {
/**
* Dispose any resources used by this class, and stops any threads it uses.
*/
void dispose();
}
|
Markdown | UTF-8 | 5,491 | 2.65625 | 3 | [] | no_license | # Kafka notes
## Slides
You can download the PDF of the slides [here](https://drive.google.com/file/d/1gk9go5UErV9dW2g1167nEgZlh9YYHPHi/view?usp=sharing)
## Creating the Kafka cluster
For creating the Kafka cluster in this demo it is very convenient to use Docker and, in particular, to use docker compose to orchestrate the creation and management of the three Kafka nodes, the ZooKeeper node, and the node running KafDrop and Portainer.
You can find the [docker-compose.yml](https://github.com/danilopaissan/kafka-example/blob/master/docker-compose.yml) file that was used here.
Proceed as follows:
* install Docker
* install docker compose
* edit the docker-compose.yml file with the correct IPs
* run `docker-compose up -d`
* to check the status of the individual containers, run `docker container ls`
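For orientation, a minimal single-broker version of such a file could look like the sketch below (image names, ports and environment variables are assumptions for illustration only; the linked docker-compose.yml remains the authoritative three-broker setup):

```yaml
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest   # assumed image
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:latest       # assumed image
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # replace with the host IP, as in the real three-node file
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://XXX.XXX.XXX.XXX:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```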
## Demo Producer - Consumer
To create a topic `nometopic` with 1 partition and a replication factor of 1
```
docker exec <dockerid> /usr/bin/kafka-topics --zookeeper XXX.XXX.XXX.XXX:2181 --create --topic nometopic --partitions 1 --replication-factor 1
```
Verify the topic
```
docker exec <dockerid> /usr/bin/kafka-topics --zookeeper XXX.XXX.XXX.XXX:2181 --list
```
Let's test writing to the topic
```
docker exec -it <dockerid> /usr/bin/kafka-console-producer --broker-list XXX.XXX.XXX.XXX:9092 --topic nometopic
```
If you want to send more complex messages, for example JSON messages, you can proceed as follows
```
docker exec <dockerid> /usr/bin/kafka-console-producer --broker-list XXX.XXX.XXX.XXX:9092 --topic nometopic < file_con_il_mio_messaggio.json
```
where `file_con_il_mio_messaggio.json` is the file in which we previously saved our message
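As an illustration, such a message file could contain a line like the following (the field names and values are invented for this example; any JSON your consumers accept will do):

```json
{"id": 1, "name": "example-article", "price": 9.99}
```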
In parallel, read what has been written, starting from the beginning of the topic with the `--from-beginning` parameter
```
docker exec -it <dockerid> /usr/bin/kafka-console-consumer --bootstrap-server XXX.XXX.XXX.XXX:9092 --from-beginning --topic nometopic
```
Once the test is finished, delete the topic
```
docker exec <dockerid> /usr/bin/kafka-topics --zookeeper XXX.XXX.XXX.XXX:2181 --delete --topic nometopic
```
## Setup for trying out the project
To monitor the topics more conveniently we can connect to KafDrop at http://XXX.XXX.XXX.XXX:10000/
First we create the `catalog` topic with 2 partitions and a replication factor of 1
```
docker exec <dockerid> /usr/bin/kafka-topics --create --zookeeper XXX.XXX.XXX.XXX:2181 --topic catalog --partitions 2 --replication-factor 1
```
Then alter the topic we just created, setting the retention to one hour
```
docker exec <dockerid> /usr/bin/kafka-configs --zookeeper XXX.XXX.XXX.XXX:2181 --alter --entity-type topics --entity-name catalog --add-config retention.ms=3600000
```
The retention configuration can always be removed as follows
```
docker exec <dockerid> /usr/bin/kafka-configs --zookeeper XXX.XXX.XXX.XXX:2181 --alter --entity-type topics --entity-name catalog --delete-config retention.ms
```
When finished, we can delete the topic
```
docker exec <dockerid> /usr/bin/kafka-topics --zookeeper XXX.XXX.XXX.XXX:2181 --delete --topic catalog
```
## Running the project
The [project](https://github.com/danilopaissan/kafka-example) simulates, with the producer, a simple system for adding articles to a catalog, while the two consumers simulate two different *views* of the catalog.
To build and run the project you will need a [JDK 8](https://docs.aws.amazon.com/corretto/latest/corretto-8-ug/downloads-list.html) and [Maven](https://maven.apache.org/download.cgi)
For a hint on how the project works, once it is built, simply run it without arguments
```
$ java -jar jug-kafka.jar
usage: java -jar jug-kafka.jar [-c <arg>] [-p]
-c,--consumer <arg> run as a consumer [general|limited_offer]
-p,--producer run as a producer
```
The project needs one producer and two consumers; the producer must be launched as follows
```
java -jar jug-kafka.jar -p
```
the two consumers must be instantiated separately
the first one must be created like this
```
java -jar jug-kafka.jar -c general
```
this consumer displays all the articles added to the catalog since the catalog was created
the second one, instead, must be created like this
```
java -jar jug-kafka.jar -c limited_offer
```
this consumer displays all the articles added and tagged with a limited offer, again since the catalog was created
To be sure that our consumers work correctly, it is interesting to stop them and, after restarting them, verify that both have re-read all the messages of interest to them.
## Interesting links
* [Publishing with Apache Kafka at The New York Times](https://www.confluent.io/blog/publishing-apache-kafka-new-york-times/)
* [7 mistakes when using Apache Kafka](https://blog.softwaremill.com/7-mistakes-when-using-apache-kafka-44358cd9cd6)
* [Configuring Apache Kafka for Performance and Resource Management](https://docs.cloudera.com/documentation/kafka/latest/topics/kafka_performance.html)
* [Optimizing Your Apache Kafka Deployment](https://www.confluent.io/blog/optimizing-apache-kafka-deployment/)
* [What is Apache Kafka?](https://www.youtube.com/watch?v=06iRM1Ghr1k)
|
Markdown | UTF-8 | 1,760 | 2.59375 | 3 | [] | no_license | # उदयपुर/
- [Udayapurgadhi Rural Municipality - Udayapur - Province No. 1](उदयपुरगढी गाउँपालिका-उदयपुर-प्रदेश नं_ १.csv)
- [Katari Municipality - Udayapur - Province No. 1](कटारी नगरपालिका-उदयपुर-प्रदेश नं_ १.csv)
- [Chaudandigadhi Municipality - Udayapur - Province No. 1](चौदण्डीगढी नगरपालिका-उदयपुर-प्रदेश नं_ १.csv)
- [Tapli Rural Municipality - Udayapur - Province No. 1](ताप्ली गाउँपालिका-उदयपुर-प्रदेश नं_ १.csv)
- [Triyuga Municipality - Udayapur - Province No. 1](त्रियुगा नगरपालिका-उदयपुर-प्रदेश नं_ १.csv)
- [Rautamai Rural Municipality - Udayapur - Province No. 1](रौतामाई गाउँपालिका-उदयपुर-प्रदेश नं_ १.csv)
- [Belaka Municipality - Udayapur - Province No. 1](वेलका नगरपालिका-उदयपुर-प्रदेश नं_ १.csv)
- [Sunkoshi Rural Municipality - Udayapur - Province No. 1](सुनकोशी गाउँपालिका-उदयपुर-प्रदेश नं_ १.csv)
|
Java | UTF-8 | 1,271 | 4.125 | 4 | [] | no_license | package by.project.java_fundamentals.main;
/**
 * This class performs the following task:
 * Read a number between 1 and 12 inclusive and print the name of the corresponding month to the console.
 * All input numbers are validated.
*/
public class Task5 {
private static String[] LIST_OF_MONTH = {"January", "February","March", "April", "May","June", "July","August",
"September","October", "November", "December"};
public static void main(String[] args) {
int number = enterNumberAndCheck(args);
try {
System.out.println(LIST_OF_MONTH[number - 1]);
}catch (ArrayIndexOutOfBoundsException e){
            System.out.println("There is no such month number. Try again!");
}
}
/**
     * @param args command-line arguments, checked for the expected count and format
     * @return the month number, or 0 if exactly one argument was not supplied
*/
private static int enterNumberAndCheck(String[] args) throws NumberFormatException {
try {
if(args.length == 1) {
return Integer.parseInt(args[0]);
}
}catch (NumberFormatException e) {
            throw new NumberFormatException("Invalid number format or argument count. Try again.");
}
return 0;
}
}
|
Java | UTF-8 | 801 | 2.109375 | 2 | [] | no_license | package com.itheima.service.Impl;
import com.itheima.dao.ItemsDao;
import com.itheima.poji.Items;
import com.itheima.service.ItemsService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.List;
@Service
//transaction-enabled service class
@Transactional
public class ItemsServiceImpl implements ItemsService {
@Autowired
private ItemsDao dao;
/**
     * Find all items
* @return
*/
public List<Items> findAll(){
List<Items> itemsList = dao.findAll();
return itemsList;
}
/**
     * Save an item
* @return
*/
public int save(Items items){
int rows = dao.save(items);
return rows;
}
}
|
Python | UTF-8 | 1,755 | 3.859375 | 4 | [] | no_license | # Uses python3
""" Implement binary search
Input format:
First line of input contains an integer n and a sequence a0 < a1 <.. < an-1
of n pairwise distinct positive integers in increasing order. The next line
contains an integer k and k positive integers b0, b1, ... bk-1
Constraints: 1<=n,k<=10^5, 1<= ai <= 10^9 for all 0<= i < n, 1 <=bj <=10^9
Output format:
For all i from 0 to k - 1 output an index 0<= j <= n-1 such that aj = bi or -1 if
there is no such index.
"""
input_list = list(map(int, input().split()))
n, search_list = input_list[0], input_list[1:]
input_list = list(map(int, input().split()))
k, to_search = input_list[0], input_list[1:]
def solve(search_list, to_search):
search_results = {}
for element in to_search:
if search_results.get(element, None) is not None:
continue
search_results[element] = binary_search_iterative(search_list, element)
# print the results
return [search_results.get(element) for element in to_search]
def binary_search(search_list, element, offset):
if not search_list:
return -1
    m = len(search_list) // 2
    if search_list[m] > element:
        # the left slice keeps its local indices, so the accumulated offset is unchanged
        return binary_search(search_list[:m], element, offset)
    elif search_list[m] < element:
        # the right slice starts at absolute index offset + m + 1
        return binary_search(search_list[m + 1:], element, offset + m + 1)
    # if it's equal
    return m + offset
def binary_search_iterative(search_list, element):
low, high = 0, len(search_list) - 1
while low <= high:
mid = low + int((high - low) / 2)
if element < search_list[mid]:
high = mid - 1
elif element > search_list[mid]:
low = mid + 1
else:
return mid
return -1
print(' '.join(map(str, solve(search_list, to_search)))) |
C# | UTF-8 | 8,633 | 2.515625 | 3 | [
"MIT"
] | permissive | // The MIT License (MIT)
//
// Copyright (c) 2021 Henk-Jan Lebbink
//
// Permission is hereby granted, free of charge, to any person obtaining a copy
// of this software and associated documentation files (the "Software"), to deal
// in the Software without restriction, including without limitation the rights
// to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
// copies of the Software, and to permit persons to whom the Software is
// furnished to do so, subject to the following conditions:
// The above copyright notice and this permission notice shall be included in all
// copies or substantial portions of the Software.
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
// SOFTWARE.
namespace AsmTools
{
using System;
using System.Diagnostics.Contracts;
using System.Globalization;
using System.Linq;
using Microsoft.CodeAnalysis.CSharp.Scripting;
public static class ExpressionEvaluator
{
/// <summary> Check if the provided string is a constant. Does not evaluate arithmetic in the string </summary>
public static (bool valid, ulong value, int nBits) Parse_Constant(string str, bool isCapitals = false)
{
Contract.Requires(str != null);
string token2;
bool isHex = false;
bool isBinary = false;
bool isDecimal = false;
bool isOctal = false;
bool isNegative = false;
// Console.WriteLine("AsmSourceTools:ToConstant token=" + token);
str = str.Replace("_", string.Empty);
if (!isCapitals)
{
str = str.ToUpperInvariant();
}
str = str.Trim();
if (str.StartsWith("-", StringComparison.Ordinal))
{
token2 = str;
isDecimal = true;
isNegative = true;
}
// note the special case of token 0h (zero hex) should not be confused with the prefix 0h;
else if (str.EndsWith("H", StringComparison.Ordinal))
{
token2 = str.Substring(0, str.Length - 1);
isHex = true;
}
else if (str.StartsWith("0H", StringComparison.Ordinal) || str.StartsWith("0X", StringComparison.Ordinal) || str.StartsWith("$0", StringComparison.Ordinal))
{
token2 = str.Substring(2);
isHex = true;
}
else if (str.StartsWith("0O", StringComparison.Ordinal) || str.StartsWith("0Q", StringComparison.Ordinal))
{
token2 = str.Substring(2);
isOctal = true;
}
else if (str.EndsWith("Q", StringComparison.Ordinal) || str.EndsWith("O", StringComparison.Ordinal))
{
token2 = str.Substring(0, str.Length - 1);
isOctal = true;
}
else if (str.StartsWith("0D", StringComparison.Ordinal))
{
token2 = str.Substring(2);
isDecimal = true;
}
else if (str.EndsWith("D", StringComparison.Ordinal))
{
token2 = str;
isDecimal = true;
}
else if (str.StartsWith("0B", StringComparison.Ordinal) || str.StartsWith("0Y", StringComparison.Ordinal))
{
token2 = str.Substring(2);
isBinary = true;
}
else if (str.EndsWith("Y", StringComparison.Ordinal))
{
token2 = str.Substring(0, str.Length - 1);
isBinary = true;
}
else
{
// special case with trailing B: either this B is from a hex number of the Binary
if (str.EndsWith("B", StringComparison.Ordinal))
{
bool parsedSuccessfully_tmp = ulong.TryParse(str, NumberStyles.HexNumber, CultureInfo.CurrentCulture, out ulong dummy);
if (parsedSuccessfully_tmp)
{
isHex = true;
token2 = str;
}
else
{
token2 = str.Substring(0, str.Length - 1);
isBinary = true;
}
}
else
{ // assume decimal
token2 = str;
isDecimal = true;
}
}
ulong value = 0;
bool parsedSuccessfully;
if (isHex)
{
parsedSuccessfully = ulong.TryParse(token2, NumberStyles.HexNumber, CultureInfo.CurrentCulture, out value);
}
else if (isOctal)
{
try
{
value = Convert.ToUInt64(token2, 8);
parsedSuccessfully = true;
}
catch
{
parsedSuccessfully = false;
}
}
else if (isBinary)
{
try
{
value = Convert.ToUInt64(token2, 2);
parsedSuccessfully = true;
}
catch
{
parsedSuccessfully = false;
}
}
else if (isDecimal)
{
if (isNegative)
{
parsedSuccessfully = long.TryParse(token2, NumberStyles.Integer, CultureInfo.CurrentCulture, out long signedValue);
value = (ulong)signedValue;
// Console.WriteLine("AsmSourceTools:ToConstant token2=" + token2 + "; signed value = " + Convert.ToString(signedValue, 16) + "; unsigned value = " + string.Format(AsmDudeToolsStatic.CultureUI, "{0:X}", value));
}
else
{
parsedSuccessfully = ulong.TryParse(token2, NumberStyles.Integer, CultureInfo.CurrentCulture, out value);
if (!parsedSuccessfully)
{
parsedSuccessfully = ulong.TryParse(token2, NumberStyles.HexNumber, CultureInfo.CurrentCulture, out value);
}
}
}
else
{
// unreachable
parsedSuccessfully = false;
}
int nBits = parsedSuccessfully ? AsmSourceTools.NBitsStorageNeeded(value, isNegative) : -1;
return (valid: parsedSuccessfully, value: value, nBits: nBits);
}
public static (bool valid, ulong value, int nBits) Evaluate_Constant(string str, bool isCapitals = false)
{
Contract.Requires(str != null);
// 1] test whether str has digits, if it has none it is not a constant
if (!str.Any(char.IsDigit))
{
return (valid: false, value: 0, nBits: -1);
}
// 2] test whether str is a constant
(bool valid, ulong value, int nBits) v = Parse_Constant(str, isCapitals);
if (v.valid)
{
return v;
}
// 3] check if str contains operators
if (str.Contains('+') || str.Contains('-') || str.Contains('*') || str.Contains('/') ||
str.Contains("<<") || str.Contains(">>"))
{
// second: if str is not a constant, test whether evaluating it yields a ulong
try
{
System.Threading.Tasks.Task<ulong> t = CSharpScript.EvaluateAsync<ulong>(str);
ulong value = t.Result;
bool isNegative = false;
return (valid: true, value: value, nBits: AsmSourceTools.NBitsStorageNeeded(value, isNegative));
}
catch (Exception)
{
// Do nothing
}
}
// 4] don't know what it is but it is not likely to be a constant.
return (valid: false, value: 0, nBits: -1);
}
}
}
|
JavaScript | UTF-8 | 4,452 | 2.5625 | 3 | [] | no_license | import React, { useState } from 'react'
import { LineChart, Line, Tooltip, YAxis, XAxis, Label } from 'recharts'
import UserService from '../auth/user-service'
const MovingAvgGraph = ({ user, type, game, title, xLabel, yLabel }) => {
const [graphData, setGraphData] = useState([])
const [windowSize, setWindowSize] = useState(0)
const [currentGame, setCurrentGame] = useState(game)
  const getData = async (winSize, gameType = currentGame) => {
    const res = await UserService.getUserData(type, gameType, winSize)
    return res
  }
  const reformatData = data =>
    data.map((item, index) => ({
      trial: index,
      value: item.value,
    }))
  const refreshData = (winSize, gameType = currentGame) => {
    getData(winSize, gameType).then(res => {
      // Check if there's any data
      if (res.data.length > 0) {
        setGraphData(reformatData(res.data))
      } else {
        setGraphData([])
      }
    })
  }
return (
<div className="card">
<div className="card-body">
<h5 className="card-title">{ title }</h5>
<h6 className="card-title">Window Size: {windowSize}</h6>
<LineChart
width={600}
height={300}
data={graphData}
margin={{
top: 10, right: 10, bottom: 20, left: 10,
}}
>
<XAxis type="number" dataKey="trial" name={xLabel} allowDecimals={false} domain={['dataMin', 'dataMax']}>
<Label value={xLabel} position="bottom" />
</XAxis>
<YAxis type="number" dataKey="value" name={yLabel} domain={[0, dataMax => Math.ceil(dataMax / 1000) * 1000]}>
<Label value={yLabel} angle={-90} position="left" />
</YAxis>
<Line name={title} dataKey="value" stroke="#66a8ff" strokeWidth={1} dot={false} />
</LineChart>
</div>
<div className="btn-group text-center">
<button
type="button"
className="btn btn-secondary"
onClick={() => {
refreshData(5)
setWindowSize(5)
}}
>
5
</button>
<button
type="button"
className="btn btn-secondary"
onClick={() => {
refreshData(10)
setWindowSize(10)
}}
>
10
</button>
<button
type="button"
className="btn btn-secondary"
onClick={() => {
refreshData(50)
setWindowSize(50)
}}
>
50
</button>
<button
type="button"
className="btn btn-secondary"
onClick={() => {
refreshData(100)
setWindowSize(100)
}}
>
100
</button>
<div className="btn-group">
<button className="btn btn-secondary dropdown-toggle" type="button" data-toggle="dropdown">
Filter by Game
</button>
<ul className="dropdown-menu" aria-labelledby="dropdownMenuButton1">
<li>
<a
className="dropdown-item"
href="#/"
onClick={() => {
                setCurrentGame('all')
                refreshData(windowSize, 'all')
}}
>
All
</a>
</li>
<li>
<a
className="dropdown-item"
href="#/"
onClick={() => {
                setCurrentGame('moba')
                refreshData(windowSize, 'moba')
}}
>
MOBA
</a>
</li>
<li>
<a
className="dropdown-item"
href="#/"
onClick={() => {
                setCurrentGame('fps')
                refreshData(windowSize, 'fps')
}}
>
FPS
</a>
</li>
<li>
<a
className="dropdown-item"
href="#/"
onClick={() => {
                setCurrentGame('3d')
                refreshData(windowSize, '3d')
}}
>
3D Sports
</a>
</li>
</ul>
</div>
</div>
</div>
)
}
export default MovingAvgGraph
|
C++ | UTF-8 | 767 | 2.90625 | 3 | [] | no_license | #pragma once
#include "plan.h"
#include <fstream>
class ExercisePlan : public Plan
{
private:
int stepGoal;
public:
//constructor
ExercisePlan();
ExercisePlan(int newStepGoal, string newName, Date newDate);
ExercisePlan(string newStepGoal, string newName, string newDate);
//copy constructor
ExercisePlan(const ExercisePlan &plan);
//destructor
~ExercisePlan();
friend void operator<<(ostream & os, const ExercisePlan & exercisePlan);
friend void operator<<(ofstream& os, const ExercisePlan& exercisePlan);
friend ExercisePlan& operator>>(ifstream& os, ExercisePlan& exercisePlan);
void setStepGoal(int newStepGoal);
void print();
int getStepGoal() const;
string getName() const;
string getDate() const;
string returnFormattedData();
}; |
Python | UTF-8 | 1,040 | 4.78125 | 5 | [] | no_license | #how to write your own functions
#functions help with grouping code that is run often or multiple times
#you can just call a function instead of writing the code each time
#always avoid duplicating code
#you can just change the function code and it changes everywhere the function is called
def hello(): #def statement defines the new function
#the code below is called the body of the function
print('Howdy!')
print('Howdy partner!!!')
print('Hello there.')
#this calls the hello function 3 times
hello()
hello()
hello()
#this is a new function
def hello(name): #this function has a variable or parameter called 'name'
print('Hello ' + name)
hello('Alice')
hello('Bob')
#can also get the parameters from user input
#can't get the input inside of the function???
def helloAgain(firstName):
#firstName = input('What is your name?') WHY DOESN'T THIS WORK?
print('Hello ' + firstName)
#you have to get the input when as a parameter when you call the function?
helloAgain(firstName=input('What is your name?'))
|
Java | UTF-8 | 2,699 | 2.296875 | 2 | [] | no_license | package sg.edu.rp.c346.id20032316.l09_ndpsongs;
import androidx.appcompat.app.AppCompatActivity;
import android.content.Intent;
import android.os.Bundle;
import android.view.View;
import android.widget.ArrayAdapter;
import android.widget.Button;
import android.widget.EditText;
import android.widget.RadioGroup;
import android.widget.RatingBar;
import android.widget.Toast;
import java.util.ArrayList;
public class MainActivity extends AppCompatActivity {
EditText etTitle, etSinger, etYear;
// RadioGroup rgStar;
RatingBar rbStar;
Button btnInsert, btnShow;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
etTitle = findViewById(R.id.etSongTitle);
etSinger = findViewById(R.id.etSongSinger);
etYear = findViewById(R.id.etYear);
// rgStar = findViewById(R.id.rgStar);
rbStar = findViewById(R.id.rbStar);
btnInsert = findViewById(R.id.btnAdd);
btnShow = findViewById(R.id.btnShowList);
btnInsert.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
String title = etTitle.getText().toString();
String singer = etSinger.getText().toString();
int year = Integer.parseInt(etYear.getText().toString());
// int stars = 0;
// if (rgStar.getCheckedRadioButtonId() == R.id.rb1) {
// stars = 1;
// } else if (rgStar.getCheckedRadioButtonId() == R.id.rb2) {
// stars = 2;
// } else if (rgStar.getCheckedRadioButtonId() == R.id.rb3) {
// stars = 3;
// }else if (rgStar.getCheckedRadioButtonId() == R.id.rb4) {
// stars = 4;
// }else if (rgStar.getCheckedRadioButtonId() == R.id.rb5) {
// stars = 5;
// }
float stars = rbStar.getRating();
DBHelper dbh = new DBHelper(MainActivity.this);
long inserted_id = dbh.insertSong(title, singer,year, stars);
if (inserted_id != -1){
Toast.makeText(MainActivity.this, "Insert successful",
Toast.LENGTH_SHORT).show();
}
}
});
btnShow.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
Intent intent = new Intent(MainActivity.this, ShowSongActivity.class);
startActivity(intent);
}
});
}
} |
C++ | UTF-8 | 1,588 | 3.0625 | 3 | [] | no_license | //
// Created by timking.nf@foxmail.com on 2020/7/14.
//
#include "gtest/gtest.h"
#include "header.h"
#include "141.linked-list-cycle.h"
namespace LeetCode141 {
typedef tuple<vector<int>, int> ArgumentType;
typedef bool ResultType;
typedef tuple<ArgumentType, ResultType> ParamType;
ListNode* getNthNode(ListNode* head, int n) {
if (n < 0) return nullptr;
while (n-- > 0) {
head = head->next;
}
return head;
}
class LeetCode141Test : public ::testing::TestWithParam<ParamType> {
// You can implement all the usual fixture class members here.
// To access the test parameter, call GetParam() from class
// TestWithParam<T>.
};
TEST_P(LeetCode141Test, main) {
// Inside a test, access the test parameter with the GetParam() method
// of the TestWithParam<T> class:
auto solution = new LeetCode141::Solution();
ArgumentType arguments;
ResultType ret;
tie(arguments, ret) = GetParam();
vector<int> inputs;
int n;
tie(inputs, n) = arguments;
ListNode* head = construct_link_list<ListNode, int>(inputs);
ListNode* tail = getNthNode(head, inputs.size()-1);
ListNode* cycleEnterNode = getNthNode(head, n);
tail->next = cycleEnterNode; // 构造环
ASSERT_EQ(solution->hasCycle(head), ret);
// link_list_print<ListNode>(head);
}
auto values = ::testing::Values(
ParamType(ArgumentType({3,2,0,-4}, 1), true),
ParamType(ArgumentType({1,2}, 0), true),
ParamType(ArgumentType({1}, -1), false)
);
//第一个参数是前缀;第二个是类名;第三个是参数生成器
INSTANTIATE_TEST_SUITE_P(LeetCode141ParamTest, LeetCode141Test, values);
} |
TypeScript | UTF-8 | 4,552 | 3.109375 | 3 | [] | no_license | import BN from 'bn.js';
export function trxToSun(trxCount: number) {
if (trxCount >= 0) {
return trxCount * Math.pow(10, 6);
}
return 0;
}
export function sunToTrx(sunCount: number) {
if (sunCount >= 0) {
return sunCount / Math.pow(10, 6);
}
return 0;
}
export enum DivMode {
Ceil,
Round,
Floor,
}
const DecimalsMap: Record<string, string> = {
0: '1',
1: '10',
2: '100',
3: '1000',
4: '10000',
5: '100000',
6: '1000000',
7: '10000000',
8: '100000000',
9: '1000000000',
10: '10000000000',
11: '100000000000',
12: '1000000000000',
13: '10000000000000',
14: '100000000000000',
15: '1000000000000000',
16: '10000000000000000',
17: '100000000000000000',
18: '1000000000000000000',
};
const negative1 = new BN(-1);
function numberToString(arg: string | number | any): string {
if (typeof arg === 'string') {
if (!arg.match(/^-?[0-9.]+$/)) {
throw new Error(
`while converting number to string, invalid number value '${arg}', should be a number matching (^-?[0-9.]+).`
);
}
return arg;
} else if (typeof arg === 'number') {
return String(arg);
} else if (typeof arg === 'object' && arg.toString && (arg.toTwos || arg.dividedToIntegerBy)) {
if (arg.toPrecision) {
return String(arg.toPrecision());
} else {
return arg.toString(10);
}
}
throw new Error(
`while converting number to string, invalid number value '${arg}' type ${typeof arg}.`
);
}
function validateDecimals(tokenDecimals: number) {
if (tokenDecimals > 18 || tokenDecimals < 0) {
throw new Error('Invalid decimal number: ' + tokenDecimals);
}
}
export function getDecimalBase(tokenDecimals: number): BN {
validateDecimals(tokenDecimals);
return new BN(DecimalsMap[tokenDecimals]);
}
export function toTokenDecimals(value: string | number | BN, tokenDecimals: number): BN {
validateDecimals(tokenDecimals);
let valueStr = numberToString(value);
const base = new BN(DecimalsMap[tokenDecimals]);
const baseLength = DecimalsMap[tokenDecimals].length - 1 || 1;
// Is it negative?
const negative = valueStr.substring(0, 1) === '-';
if (negative) {
valueStr = valueStr.substring(1);
}
if (valueStr === '.') {
throw new Error(`[ethjs-unit] while converting number ${value} to wei, invalid value`);
}
// Split it into a whole and fractional part
const comps = valueStr.split('.'); // eslint-disable-line
if (comps.length > 2) {
throw new Error(
`[ethjs-unit] while converting number ${value} to wei, too many decimal points`
);
}
const whole = comps[0] ? comps[0] : '0';
let fraction = comps[1] ? comps[1] : '0';
if (fraction.length > baseLength) {
throw new Error(
`[ethjs-unit] while converting number ${valueStr} to wei, too many decimal places`
);
}
while (fraction.length < baseLength) {
fraction += '0';
}
let wei = new BN(whole).mul(base).add(new BN(fraction));
if (negative) {
wei = wei.mul(negative1);
}
// TODO why is this?? we should return wei!
return new BN(wei.toString(10), 10);
}
export function fromTokenDecimals(value: BN, tokenDecimals: number): string {
validateDecimals(tokenDecimals);
const negative = value.isNeg();
value = value.abs();
const base = new BN(DecimalsMap[tokenDecimals]);
const baseLength = DecimalsMap[tokenDecimals].length - 1 || 1;
let fraction = value.mod(base).toString(10);
// Add prefix 0 to complete decimals length
while (fraction.length < baseLength) {
fraction = `0${fraction}`;
}
const whole = value.div(base).toString(10);
return `${negative ? '-' : ''}${whole}${fraction === '0' ? '' : `.${fraction}`}`;
}
export function changeDecimals(
value: BN,
inputDecimals: number,
outputDecimals: number,
divmode = DivMode.Round
): string {
if (outputDecimals >= inputDecimals) {
return fromTokenDecimals(value, inputDecimals);
}
validateDecimals(inputDecimals);
const decimalToRemove = inputDecimals - outputDecimals;
const baseToRemove = new BN(DecimalsMap[decimalToRemove]);
switch (divmode) {
case DivMode.Round:
return fromTokenDecimals(value.divRound(baseToRemove), outputDecimals);
case DivMode.Floor:
return fromTokenDecimals(value.div(baseToRemove), outputDecimals);
case DivMode.Ceil:
const rest = value.mod(baseToRemove);
value = value.div(baseToRemove);
if (rest.gt(new BN(0))) {
value = value.add(new BN(1));
}
return fromTokenDecimals(value, outputDecimals);
}
}
|
PHP | UTF-8 | 595 | 2.859375 | 3 | [
"MIT"
] | permissive | <?php
namespace BrainExe\Core\Websockets;
use BrainExe\Core\EventDispatcher\AbstractEvent;
/**
* @api
*/
class WebSocketEvent extends AbstractEvent
{
const PUSH = 'websocket.push';
/**
* @var AbstractEvent
*/
private $payload;
/**
* @param AbstractEvent $payload
*/
public function __construct(AbstractEvent $payload)
{
parent::__construct(self::PUSH);
$this->payload = $payload;
}
/**
* @return AbstractEvent
*/
public function getPayload() : AbstractEvent
{
return $this->payload;
}
}
|
Java | UTF-8 | 4,763 | 2.0625 | 2 | [] | no_license | package com.xdbigdata.user_manage_admin.web.controller;
import com.xdbigdata.framework.web.annotation.PreventRepeatSubmit;
import com.xdbigdata.framework.web.model.JsonResponse;
import com.xdbigdata.user_manage_admin.bean.ResultBean;
import com.xdbigdata.user_manage_admin.model.UserModel;
import com.xdbigdata.user_manage_admin.model.qo.student.AddAndUpdateStudentQo;
import com.xdbigdata.user_manage_admin.model.qo.student.ListClazzStudentQo;
import com.xdbigdata.user_manage_admin.model.qo.student.MoveClazzStudentQo;
import com.xdbigdata.user_manage_admin.model.qo.student.TKQuery;
import com.xdbigdata.user_manage_admin.model.vo.student.ListClazzStudentVo;
import com.xdbigdata.user_manage_admin.model.vo.student.StudentVo;
import com.xdbigdata.user_manage_admin.service.StudentService;
import io.swagger.annotations.Api;
import io.swagger.annotations.ApiImplicitParam;
import io.swagger.annotations.ApiOperation;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.*;
import org.springframework.web.multipart.MultipartFile;
import javax.validation.Valid;
import java.util.List;
/**
 * @author huangjianfeng
 * Brief description: student-related management endpoints
 * Created on 2018/10/25 10:49
 */
@Api(value = "学生用户", tags = "人员管理 - 学生用户")
@RestController
@RequestMapping("api/student/")
public class StudentController {
@Autowired
private StudentService studentService;
@CrossOrigin
@PostMapping("listStu")
public JsonResponse findByClassId(@RequestBody TKQuery tkQuery) {
return JsonResponse.success(studentService.findSimpStuInf2TalkManage(tkQuery));
}
@ApiOperation(value = "获取班级下的学生列表", notes = "班级的学生列表接口")
@PostMapping(value = "findByClassId", produces = "application/json;charset=UTF-8")
public ResultBean<ListClazzStudentVo> findByClassId(@Valid @RequestBody ListClazzStudentQo listClazzStudentQo) {
ListClazzStudentVo listClazzStudentVo = studentService.findByClassId(listClazzStudentQo);
return new ResultBean<>(listClazzStudentVo);
}
@ApiOperation(value = "查看学生详细信息", notes = "查看学生详细信息")
@GetMapping(value = "findById/{id}", produces = "application/json;charset=UTF-8")
public ResultBean<StudentVo> findById(@PathVariable Long id) {
StudentVo studentVo = studentService.findById(id);
return new ResultBean<>(studentVo);
}
@ApiOperation(value = "在选中班级下新增/修改学生", notes = "班级下新增/修改学生接口")
@PostMapping(value = "addOrUpdate", produces = "application/json;charset=UTF-8")
public ResultBean addOrUpdate(@Valid @RequestBody AddAndUpdateStudentQo addAndUpdateStudentQo) {
studentService.addOrUpdate(addAndUpdateStudentQo);
return ResultBean.createResultBean("操作成功");
}
@PreventRepeatSubmit
@ApiOperation(value = "批量新增学生")
@RequestMapping(value = "/uploadStudentExcel/{classId}", method = RequestMethod.POST)
public ResultBean uploadStudentExcel(@PathVariable("classId") Long classId, @RequestParam("file") MultipartFile file) throws Exception {
List<String> studentSns = studentService.uploadStudentExcel(file, classId);
return new ResultBean(studentSns);
}
@PreventRepeatSubmit
@ApiOperation(value = "学生移动班级", notes = "学生移动班级接口")
@PostMapping(value = "move", produces = "application/json;charset=UTF-8")
public ResultBean editClazzStudent(@Valid @RequestBody MoveClazzStudentQo moveClazzStudentQo) {
String message = studentService.moveClazzStudent(moveClazzStudentQo);
return ResultBean.createResultBean(message);
}
@PreventRepeatSubmit
@ApiOperation(value = "删除班级学生", notes = "删除班级学生接口")
@GetMapping(value = "delete/{studentId}", produces = "application/json;charset=UTF-8")
@ApiImplicitParam(name = "studentId", value = "学生id", dataType = "long", paramType = "path", required = true)
public ResultBean removeClazzStudent(@PathVariable("studentId") Long studentId) {
String message = studentService.removeClazzStudent(studentId);
return ResultBean.createResultBean(message);
}
@ApiOperation(value = "授权学生中根据学号筛选学生", notes = "授权学生中根据学号筛选学生")
@GetMapping(value = "findBySnAndOnlyStudentRole/{sn}", produces = "application/json;charset=UTF-8")
public ResultBean findBySnAndOnlyStudentRole(@PathVariable String sn) {
UserModel userModel = studentService.findBySnAndOnlyStudentRole(sn);
return new ResultBean<>(userModel);
}
} |
Python | UTF-8 | 3,267 | 3.390625 | 3 | [] | no_license | # pendulum project
# Amma Agyei and Tina Guo
#Import Statements
import math
import numpy as np
import matplotlib.pyplot as plt
import scipy.signal as sig
#Global Variables
L = 0.56 #m
g = -9.8 #m/s^2
m = 0.5 #kg
#Functions
def system_update(omega_initial, theta_initial, acc_initial, time1, time2):
#return updated values of period, acceleration, and velocity, angular position
dt = time2 - time1
theta = theta_initial + omega_initial * dt
omega = omega_initial + (acc_initial * dt)
    acc_t = (g / L) * math.sin(theta) #restoring angular acceleration; g is negative, so this opposes the displacement
return theta, acc_t, omega
def print_system(omega_initial, acc_initial, theta_initial):
print("POSITION: ", theta_initial)
print("ACCELERATION: ", acc_initial)
print("VELOCITY: ", omega_initial, "\n")
print((omega_initial, acc_initial, theta_initial))
#initial conditions
theta_initial = [math.pi / 6]
omega_initial = [0]
acc_initial = [(g / L) * math.sin(theta_initial[0])] #angular acceleration at the initial angle, about -8.75 rad/s^2
time = np.linspace(0, 20, 100000)
print_system(omega_initial[0], acc_initial[0], theta_initial[0])
i = 1
while i < len(time):
theta, acc_t, omega = system_update(omega_initial[i-1], theta_initial[i-1], acc_initial[i-1], time[i-1], time[i])
omega_initial.append(omega)
theta_initial.append(theta)
acc_initial.append(acc_t)
i += 1
#Plots acceleration vs time on graph
plt.figure(1)
plt.subplot(3,1,2)
plt.plot(time,acc_initial, 'k-')
plt.xlabel('Time(seconds)')
plt.ylabel('acceleration(m/s**2)')
plt.title('Acceleration vs Time')
plt.xlim((0,20))
plt.grid()
#Plots velocity vs time on graph
plt.figure(2)
plt.subplot(3,1,2)
plt.plot(time,omega_initial, 'k-')
plt.xlabel('Time(seconds)')
plt.ylabel('velocity(m/s)')
plt.title('Velocity vs Time')
plt.xlim((0,20))
plt.grid()
#Plots position vs time on graph
plt.figure(3)
plt.subplot(3,1,2)
plt.plot(time,theta_initial, 'k-')
plt.xlabel('Time(seconds)')
plt.ylabel('position(m)')
plt.title('Position vs Time')
plt.xlim((0,20))
plt.grid()
#finding peaks
theta_initial = np.radians(theta_initial) #converts theta into radians
acc_initial = np.array(acc_initial) #creates an array for acceleration
t = np.array(time) #creates an array for time
t = t
acc_pks,_ = sig.find_peaks(acc_initial,2,3,10) #finds peaks within the window provided
acc_filt = sig.medfilt(acc_initial,5) #filters noisy data
acc_filt_pks, _ = sig.find_peaks(acc_filt,1,None,7)
#Plotting of Noisy Data
plt.figure()
plt.plot(t,acc_initial, 'r-', t[acc_pks],acc_initial[acc_pks], 'b.')
plt.xlabel('Time(seconds)')
plt.ylabel('acceleration(m/s**2)')
plt.title('Noisy Data')
plt.show()
#Plotting of Filtered Data
plt.figure()
plt.plot(t, acc_filt, 'r-',t[acc_filt_pks],acc_filt[acc_filt_pks], 'b.')
plt.xlabel('Time(seconds)')
plt.ylabel('acceleration(m/s**2)')
plt.title('Filtered Data')
plt.show()
#Plotting peaks over time in order to calculate the period
newt = t[acc_filt_pks]
newacc = acc_initial[acc_filt_pks]
t.resize((17,),refcheck = False) #resizes the array so that they match
plt.figure()
plt.plot(newt, newacc, 'ro-')
plt.xlabel('Time(seconds)')
plt.ylabel('acceleration(m/s**2)')
plt.title('Peaks Vs Time')
plt.show()
#Finding the mean period to calculate the period
mean_Period = newt.mean()
print(mean_Period) |
PHP | UTF-8 | 585 | 3.078125 | 3 | [] | no_license | <?php
class DBController {
private $host = "localhost";
private $username = "root";
private $dbname = "conference";
private $conn;
function __construct() {
$this->conn = $this->connectDB();
}
function connectDB() {
$conn = new PDO("mysql:host=$this->host;dbname=$this->dbname", $this->username);
return $conn;
}
function exec($query){
$this->conn->exec($query);
}
function runQuery($query) {
$statement = $this->conn->prepare($query);
$statement->execute();
return $statement->fetchAll(PDO::FETCH_ASSOC);
}
}
?> |
C++ | UTF-8 | 398 | 2.734375 | 3 | [] | no_license | #pragma once
class LevelStrategy
{
public:
virtual int get_base() = 0;
};
class P1LStrategy : public LevelStrategy{
public:
virtual int get_base() { return 2000; }
};
class P2LStrategy : public LevelStrategy
{
public:
virtual int get_base() { return 5000; }
};
class P3LStrategy : public LevelStrategy
{
public:
virtual int get_base() { return 10000; }
}; |
Java | UTF-8 | 3,768 | 2.484375 | 2 | [] | no_license | package com.github.franckysolo.am2032;
import android.bluetooth.BluetoothDevice;
import android.graphics.Color;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.BaseAdapter;
import android.widget.ImageView;
import android.widget.TextView;
import java.util.ArrayList;
/**
* Created by franckysolo on 24/12/16.
*
* Listes des appareils Bluetooth détecté via BLE
*/
public class DeviceListAdapter extends BaseAdapter {
private ArrayList<BluetoothDevice> mDeviceList;
private ArrayList<Integer> mSignalList;
private LayoutInflater mInflater;
public DeviceListAdapter(MainActivity mainActivity) {
super();
mDeviceList = new ArrayList<BluetoothDevice>();
mSignalList = new ArrayList<Integer>();
mInflater = mainActivity.getLayoutInflater();
}
public void clear() {
mDeviceList.clear();
}
public void addDevice(BluetoothDevice device, int rssi) {
if (!mDeviceList.contains(device)) {
mDeviceList.add(device);
mSignalList.add(rssi);
}
}
public BluetoothDevice getDevice(int position) {
return mDeviceList.get(position);
}
@Override
public int getCount() {
return mDeviceList.size();
}
@Override
public Object getItem(int position) {
return mDeviceList.get(position);
}
@Override
public long getItemId(int position) {
return position;
}
@Override
public View getView(int position, View convertView, ViewGroup parent) {
ViewHolder viewHolder;
if (convertView == null) {
convertView = mInflater.inflate(R.layout.list_item_device, null);
viewHolder = new ViewHolder();
viewHolder.name = (TextView) convertView.findViewById(R.id.name);
viewHolder.address = (TextView) convertView.findViewById(R.id.address);
viewHolder.rssi = (TextView) convertView.findViewById(R.id.rssi);
viewHolder.icon = (ImageView) convertView.findViewById(R.id.icon);
} else {
viewHolder = (ViewHolder) convertView.getTag();
}
BluetoothDevice device = mDeviceList.get(position);
final int signal = mSignalList.get(position);
final String deviceSignal = String.valueOf(signal);
if (viewHolder != null) {
if (device.getName() == null) {
viewHolder.name.setText(R.string.unknown_peripheral);
} else {
viewHolder.name.setText(device.getName());
}
if (device.getAddress() == null) {
viewHolder.address.setText(R.string.unknown_address);
} else {
viewHolder.address.setText(device.getAddress());
}
if (deviceSignal == null) {
viewHolder.rssi.setText(R.string.no_signal);
} else {
viewHolder.rssi.setText(deviceSignal);
}
if (signal >= -50) { // High quality
viewHolder.icon.setColorFilter(Color.parseColor("#009900"));
} else if (signal < -51 && signal >= -75) { // Medium quality
viewHolder.icon.setColorFilter(Color.parseColor("#ff9900"));
} else {
if (signal < -75 && signal >= -85) { // Low quality
viewHolder.icon.setColorFilter(Color.parseColor("#990000"));
} else {
viewHolder.icon.setColorFilter(Color.LTGRAY);
}
}
}
return convertView;
}
static class ViewHolder {
TextView name;
TextView address;
TextView rssi;
ImageView icon;
}
}
|
Markdown | UTF-8 | 784 | 2.515625 | 3 | [] | no_license | Title: Self-loathing and the Love of God
Lead: Some thoughts on the limitations of love
Published: 5/18/2021
Tags: [limits, faith]
---
Why is it that our most intimate and intense emotions are felt in conjunction with things that don't exist? Does the existential doubt that underlies these imaginary constructions bring the mind into such conflict with itself that it overdoses on the feedback loop?
Or perhaps it is precisely that the feeling is toward something against which it cannot be checked with the reality principle, that it is able to grow and grow. My love for other (real) people is constantly checked by their insufferability and their constant complaining.
God and the self suffer no concrete contradictions to defuse their torment or ecstasy.
|
Markdown | UTF-8 | 11,639 | 2.734375 | 3 | [] | no_license | #### yum overview
rpm greatly simplifies the process of installing software, but it has one shortcoming: it cannot resolve dependencies between packages. yum was created to make up for this deficiency. yum is not really a brand-new package management tool; it is only a front-end manager for rpm. We say this because yum's install, upgrade, erase, and query operations all rely on rpm to complete. To use yum, rpm must be installed on the system; the yum package itself depends on the rpm package.
yum is a front-end package manager built on top of the rpm package manager; its full name is Yellowdog Updater, Modified. It can automatically download packages from a specified server, automatically analyze package metadata, and automatically handle dependencies between packages, installing everything a package depends on in a single pass, without tediously installing each dependency one by one.
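The automatic dependency handling described above boils down to computing the closure of the "requires" relation recorded in the repository metadata. A minimal sketch of the idea in Python — the package names and the `deps` map are invented for illustration, not real metadata:

```python
def resolve(targets, deps, installed):
    """Return every package that must be downloaded, dependencies included."""
    needed, stack = set(), list(targets)
    while stack:
        pkg = stack.pop()
        if pkg in installed or pkg in needed:
            continue  # already on the system, or already scheduled
        needed.add(pkg)
        stack.extend(deps.get(pkg, []))  # follow this package's requirements
    return needed

# hypothetical metadata: each package lists what it requires
deps = {'vim': ['vim-common', 'libtinfo'], 'vim-common': ['glibc'], 'libtinfo': ['glibc']}
to_fetch = resolve(['vim'], deps, installed={'glibc'})
```

A single request for vim yields vim, vim-common, and libtinfo, while the already-installed glibc is skipped — the same filtering of "needed but not yet installed" packages that yum performs.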
#### How yum works
yum follows a client-server (C/S) model: users download packages from a server and install them. To support this, we build a repository on the server to store rpm packages and serve requests from clients. So that a client can fetch detailed information about every rpm package at once and analyze dependencies, the server extracts the details of all packages in the repository (including dependency information) into the repository's metadata files. The client then only needs to fetch the metadata to know about every rpm package in the repository and perform dependency analysis.
**How yum works the first time it is used:**
>Step 1: fetch the repository metadata and cache it locally under /var/cache/yum.
>Step 2: look up the packages the user chose to install in the metadata, analyze their dependencies locally against the metadata, combine that with the packages already installed on the system, and produce a list of packages that are needed but not yet installed.
>Step 3: send the list of packages to be downloaded to the yum repository.
>Step 4: download the packages, cache them locally, and install them.
>Step 5: after installation finishes, the cached package files are deleted, but the cached repository metadata is kept for the next installation.

Keeping the cached repository metadata saves bandwidth, but it also introduces a problem: if the repository is updated, its metadata changes as well, so the client's cached metadata and the server's metadata no longer match. To solve this consistency problem, a checksum of the metadata is used: the repository generates a checksum for its metadata file, and the client requests that checksum from the server and compares it against its cached copy. If they match, the cache is consistent with the server and can be used; if not, the metadata must be downloaded again.
**Workflow for subsequent yum installations:**
>Step 1: fetch the repository's metadata checksum file.
>Step 2: verify the checksum; if it matches, use the cached metadata, otherwise download the metadata again.
>Step 3: from here on, the steps are the same as the first-time use described above.

**Two common front-end tools on CentOS: yum, dnf**
**yum repository: yum repo**
Stores a large number of rpm packages, along with the packages' related metadata files (placed in a specific directory: repodata);
#### yum clients:
##### Configuration files:
**/etc/yum.conf: provides common configuration for all repositories**
```
[main]
cachedir=/var/cache/yum/$basearch/$releasever #cache directory
keepcache=0 #whether to keep downloaded packages in the cache after installation
debuglevel=2 #debug message output level
logfile=/var/log/yum.log #log file path
exactarch=1 #whether to require an exact architecture match
obsoletes=1
gpgcheck=1 #whether to verify package origin and data integrity
plugins=1 #whether to enable the plugin mechanism
installonly_limit=5 #maximum number of install-only package versions kept at the same time
bugtracker_url=http://bugs.centos.org/set_project.php?
project_id=23&ref=http://bugs.centos.org/bug_report_page.php?category=yum #bug tracker URL
distroverpkg=centos-release #package that determines the distribution release
```
**/etc/yum.repos.d/\*.repo: configures repository locations**
```
[repositoryID] #repository id
name=Some name for this repository #descriptive name for the repo
baseurl=url://path/to/repository/ #path(s) the repository points to; multiple paths may be listed
enabled={1|0} #whether this repository is enabled; defaults to 1 (enabled)
gpgcheck={1|0} #whether to verify package origin and data integrity
gpgkey=URL #access path of the GPG key file, which the repository may provide
enablegroups={1|0} #whether packages in this repo may be managed as groups;
failovermethod={roundrobin|priority} #when baseurl lists several repository paths, how to choose the url to access and how to pick another when one fails; roundrobin picks a path at random, priority tries them from top to bottom; defaults to roundrobin
cost= #cost; the lower the cost, the more preferred the repository's url; defaults to 1000
```
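The .repo files above are INI-style, so their fields can be read with any INI parser. A small sketch — the sample repo below is made up, and this is not how yum itself is implemented:

```python
import configparser

repo_text = """
[local]
name=this is a local repo.
baseurl=file:///media/cdrom
enabled=1
gpgcheck=0
"""

cfg = configparser.ConfigParser()
cfg.read_string(repo_text)
section = cfg['local']
enabled = section.getboolean('enabled')  # the string '1' parses as True
baseurl = section['baseurl']
```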
**Repository server URL schemes:**
>ftp://
>http://
>nfs://
>file:///
#### yum command usage:
`yum [options] [command] [package ...]`
command is one of:
>\* install package1 [package2] [...]
>\* update [package1] [package2] [...]
>\* update-to [package1] [package2] [...]
>\* check-update
>\* upgrade [package1] [package2] [...]
>\* upgrade-to [package1] [package2] [...]
>\* distribution-synchronization [package1] [package2] [...]
>\* remove | erase package1 [package2] [...]
>\* list [...]
>\* info [...]
>\* provides | whatprovides feature1 [feature2] [...]
>\* clean [ packages | metadata | expire-cache | rpmdb | plugins | all ]
>\* makecache
>\* groupinstall group1 [group2] [...]
>\* groupupdate group1 [group2] [...]
>\* grouplist [hidden] [groupwildcard] [...]
>\* groupremove group1 [group2] [...]
>\* groupinfo group1 [...]
>\* search string1 [string2] [...]
>\* shell [filename]
>\* resolvedep dep1 [dep2] [...]
>\* localinstall rpmfile1 [rpmfile2] [...]
>(maintained for legacy reasons only - use install)
>\* localupdate rpmfile1 [rpmfile2] [...]
>(maintained for legacy reasons only - use update)
>\* reinstall package1 [package2] [...]
>\* downgrade package1 [package2] [...]
>\* deplist package1 [package2] [...]
>\* repolist [all|enabled|disabled]
>\* version [ all | installed | available | group-* | nogroups* | grouplist | groupinfo ]
>\* history [info|list|packages-list|packages-info|summary|addon-info|redo|undo|rollback|new|sync|stats]
>\* check
>\* help [command]
**List repositories:**
`repolist [all|enabled|disabled]`
**List packages: list**
`yum list [all | glob_exp1] [glob_exp2] [...]`
`yum list {available|installed|updates} [glob_exp1] [...]`
**Install packages:**
`yum install package1 [package2] [...]`
`yum reinstall package1 [package2] [...] ` (reinstall)
**Upgrade packages:**
`yum update [package1] [package2] [...]`
`yum downgrade package1 [package2] [...] `(downgrade)
**Check for available updates:**
`yum check-update`
**Remove packages:**
`yum remove | erase package1 [package2] [...]`
**Show package information:**
` yum info package1 [package2] [...]`
**Find which package provides a given capability (which may be a file):**
`yum provides | whatprovides feature1 [feature2] [...]`
**Clean the local cache:**
`yum clean [ packages | metadata | expire-cache | rpmdb | plugins | all ]`
**Build the cache:**
`yum makecache`
**Search: search package names and summary information for the given keywords**
`yum search string1 [string2] [...]`
**List the capabilities a given package depends on:**
`yum deplist package1 [package2] [...]`
**View yum transaction history:**
`yum history [info|list|packages-list|packages-info|summary|addon-info|redo|undo|rollback|new|sync|stats]`
##### Install and upgrade local packages:
`yum localinstall rpmfile1 [rpmfile2] [...]` (maintained for legacy reasons only - use install)
`yum localupdate rpmfile1 [rpmfile2] [...]` (maintained for legacy reasons only - use update)
##### Package group management commands:
`yum groupinstall group1 [group2] [...]`
`yum groupupdate group1 [group2] [...]`
`yum grouplist [hidden] [groupwildcard] [...]`
`yum groupremove group1 [group2] [...]`
`yum groupinfo group1 [...]`
##### How to use the installation disc as a local yum repository:
(1) Mount the disc at some directory, e.g. /media/cdrom
`mount -r -t iso9660 /dev/cdrom /media/cdrom`
(2) Create the configuration file
```
[CentOS7]
name=this is a local repo.
baseurl=file:///media/cdrom
gpgcheck=0
enabled=1
```
##### yum command-line options:
>--nogpgcheck: skip the gpg check;
>-y: answer "yes" automatically;
>-q: quiet mode;
>--disablerepo=repoidglob: temporarily disable the specified repo(s);
>--enablerepo=repoidglob: temporarily enable the specified repo(s);
>--noplugins: disable all plugins;
##### Variables usable in yum repo configuration files:
>\$releasever: major version number of the current OS release;
>\$arch: platform;
>\$basearch: base platform;
>\$YUM0-\$YUM9
>[http://mirrors.magedu.com/centos/\$releasever/\$basearch/os](http://mirrors.magedu.com/centos/\$releasever/\$basearch/os)
>e.g. http://mirrors.magedu.com/centos/7/x86_64/os
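As an illustration, a repo entry can use these variables directly so one configuration file works across releases and architectures (a hypothetical example built on the mirror URL above):

```
[base]
name=CentOS-$releasever - Base
baseurl=http://mirrors.magedu.com/centos/$releasever/$basearch/os
gpgcheck=0
enabled=1
```

At run time, yum expands `$releasever` and `$basearch` from the installed system, e.g. to `7` and `x86_64`.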
##### Creating a yum repository:
`createrepo [options] <directory>`
```
[root@promote ~]# yum install createrepo
```
**Exercise: use a self-built yum repository to install the ftp, openssh, wget, and tcpdump packages**
Create a local repository using the packages from the disc image:
```
[root@promote ~]# mount -r -t iso9660 /dev/cdrom /media/cdrom/
[root@promote ~]# mkdir -p /yumrepo/Packages   ### create the local Packages directory
### copy the required packages to the local directory
cp /media/cdrom/Packages/ftp-0.17-67.el7.x86_64.rpm /yumrepo/Packages/
cp /media/cdrom/Packages/openssh-* /yumrepo/Packages/
cp /media/cdrom/Packages/curl-7.29.0-35.el7.centos.x86_64.rpm /yumrepo/Packages/
cp /media/cdrom/Packages/wget-1.14-13.el7.x86_64.rpm /yumrepo/Packages/
cp /media/cdrom/Packages/tcpdump-4.5.1-3.el7.x86_64.rpm /yumrepo/Packages/
### use createrepo to build the local yum repository (if the command is missing, install it with: yum -y install createrepo)
[root@promote yumrepo]# createrepo /yumrepo/
Spawning worker 0 with 15 pkgs
Workers Finished
Gathering worker results
Saving Primary metadata
Saving file lists metadata
Saving other metadata
Generating sqlite DBs
Sqlite DBs complete
```
Modify the yum configuration file to point the repository at the local source:
```
[root@www ~]# cd /etc/yum.repos.d/
[root@www yum.repos.d]# vim local.repo
[yumrepo]
name=yumrepo
baseurl=file:///yumrepo
enabled=1
gpgcheck=0
[root@www yum.repos.d]# yum repolist   ####### list the yum repositories
Loaded plugins: fastestmirror, security
Loading mirror speeds from cached hostfile
 * base: mirror.lzu.edu.cn
 * extras: mirror.lzu.edu.cn
 * updates: mirrors.cqu.edu.cn
repo id             repo name                  status
base                CentOS-6 - Base            6,713
extras              CentOS-6 - Extras          35
updates             CentOS-6 - Updates         251
yumrepo             yumrepo                    15
repolist: 7,014
```
Install the ftp, openssh, curl, wget, and tcpdump packages:
```
yum install -y ftp openssh curl wget tcpdump
``` |
Java | UTF-8 | 6,130 | 2.203125 | 2 | [] | no_license | package com.github.javaclub.jorm.proxy.pojo.javassist;
import java.io.Serializable;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import javassist.util.proxy.MethodFilter;
import javassist.util.proxy.MethodHandler;
import javassist.util.proxy.ProxyFactory;
import javassist.util.proxy.ProxyObject;
import org.apache.commons.logging.LogFactory;
import com.github.javaclub.jorm.JormException;
import com.github.javaclub.jorm.Session;
import com.github.javaclub.jorm.common.Reflections;
import com.github.javaclub.jorm.proxy.JormProxy;
import com.github.javaclub.jorm.proxy.pojo.BasicLazyInitializer;
/**
* A Javassist-based lazy initializer proxy.
*
* @author Muga Nishizawa
*/
@SuppressWarnings("unchecked")
public class JavassistLazyInitializer extends BasicLazyInitializer implements MethodHandler {
private static final MethodFilter FINALIZE_FILTER = new MethodFilter() {
public boolean isHandled(Method m) {
// skip finalize methods
return !( m.getParameterTypes().length == 0 && m.getName().equals( "finalize" ) );
}
};
private Class[] interfaces;
private boolean constructed = false;
private JavassistLazyInitializer(
final Class persistentClass,
final Class[] interfaces,
final Serializable id,
final Method getIdentifierMethod,
final Method setIdentifierMethod,
final Session session) {
super( persistentClass, id, getIdentifierMethod, setIdentifierMethod, session );
this.interfaces = interfaces;
}
public static JormProxy getProxy(
final Class persistentClass,
final Class[] interfaces,
final Method getIdentifierMethod,
final Method setIdentifierMethod,
final Serializable id,
final Session session) throws JormException {
// note: interface is assumed to already contain HibernateProxy.class
try {
final JavassistLazyInitializer instance = new JavassistLazyInitializer(
persistentClass,
interfaces,
id,
getIdentifierMethod,
setIdentifierMethod,
session
);
ProxyFactory factory = new ProxyFactory();
factory.setSuperclass( interfaces.length == 1 ? persistentClass : null );
factory.setInterfaces( interfaces );
factory.setFilter( FINALIZE_FILTER );
Class cl = factory.createClass();
final JormProxy proxy = ( JormProxy ) cl.newInstance();
( ( ProxyObject ) proxy ).setHandler( instance );
instance.constructed = true;
return proxy;
}
catch ( Throwable t ) {
LogFactory.getLog( BasicLazyInitializer.class ).error(
"Javassist Enhancement failed: " + persistentClass, t);
throw new JormException(
"Javassist Enhancement failed: "
+ persistentClass, t);
}
}
public static JormProxy getProxy(
final Class factory,
final Class persistentClass,
final Class[] interfaces,
final Method getIdentifierMethod,
final Method setIdentifierMethod,
final Serializable id,
final Session session) throws JormException {
final JavassistLazyInitializer instance = new JavassistLazyInitializer(
persistentClass,
interfaces, id,
getIdentifierMethod,
setIdentifierMethod,
session
);
final JormProxy proxy;
try {
proxy = ( JormProxy ) factory.newInstance();
if(null != proxy) {
Reflections.invokeMethod(proxy, setIdentifierMethod, id);
}
}
catch ( Exception e ) {
throw new JormException(
"Javassist Enhancement failed: "
+ persistentClass.getName(), e
);
}
( ( ProxyObject ) proxy ).setHandler( instance );
instance.constructed = true;
return proxy;
}
public static Class getProxyFactory(
Class persistentClass,
Class[] interfaces) throws JormException {
// note: interfaces is assumed to already contain HibernateProxy.class
try {
ProxyFactory factory = new ProxyFactory();
factory.setSuperclass( interfaces.length == 1 ? persistentClass : null );
factory.setInterfaces( interfaces );
factory.setFilter( FINALIZE_FILTER );
return factory.createClass();
}
catch ( Throwable t ) {
LogFactory.getLog( BasicLazyInitializer.class ).error(
"Javassist Enhancement failed: "
+ persistentClass.getName(), t
);
throw new JormException(
"Javassist Enhancement failed: "
+ persistentClass.getName(), t
);
}
}
public Object invoke(
final Object proxy,
final Method thisMethod,
final Method proceed,
final Object[] args) throws Throwable {
if ( this.constructed ) {
Object result;
try {
result = this.invoke( thisMethod, args, proxy );
}
catch ( Throwable t ) {
throw new Exception( t.getCause() );
}
if ( result == INVOKE_IMPLEMENTATION ) {
Object target = getImplementation();
if(null == target) {
// the record may have been deleted, or it may not exist in the database
return null;
// throw new IllegalStateException("The database record does not exist or it had been deleted.");
}
final Object returnValue;
try {
if ( Reflections.isPublic( persistentClass, thisMethod ) ) {
if ( !thisMethod.getDeclaringClass().isInstance( target ) ) {
throw new ClassCastException( target.getClass().getName() );
}
returnValue = thisMethod.invoke( target, args );
}
else {
if ( !thisMethod.isAccessible() ) {
thisMethod.setAccessible( true );
}
returnValue = thisMethod.invoke( target, args );
}
return returnValue == target ? proxy : returnValue;
}
catch ( InvocationTargetException ite ) {
throw ite.getTargetException();
}
}
else {
return result;
}
}
else {
// while constructor is running
if ( thisMethod.getName().equals( "getLazyInitializer" ) ) {
return this;
}
else {
return proceed.invoke( proxy, args );
}
}
}
protected Object serializableProxy() {
return new SerializableProxy(
persistentClass,
interfaces,
getIdentifier(),
getIdentifierMethod,
setIdentifierMethod);
}
}
|
JavaScript | UTF-8 | 1,412 | 2.53125 | 3 | [
"MIT"
] | permissive | let PROTO_PATH = __dirname + "/wkrpt401.proto"
let sizeof = require('object-sizeof')
let grpc = require('grpc');
let proto = grpc.load(PROTO_PATH).wkrpt401;
var requestCount = 0
function getBestPersonality(call, callback) {
console.log("Size of call = " + sizeof(call))
let userData = call.request;
let name = userData.name;
let level = userData.level;
var maxPersonalityName = "";
var maxPersonalityValue = -1;
for (var i in userData.personality) {
let personalityName = userData.personality[i].name;
let personalityValue = userData.personality[i].amount;
if (personalityValue > maxPersonalityValue) {
maxPersonalityName = personalityName;
maxPersonalityValue = personalityValue;
}
}
let response = "The best personality attribute for " + name
+ " (Lv. " + level + ") is "
+ maxPersonalityName + " @ "
+ maxPersonalityValue;
requestCount++
console.log("getBestPersonality called with response '" + response + "'" + " (request #" + requestCount + ")");
callback(null, {response: response});
}
function main() {
var server = new grpc.Server();
server.addProtoService(proto.UserManager.service, {getBestPersonality: getBestPersonality});
server.bind("localhost:50051", grpc.ServerCredentials.createInsecure());
server.start();
console.log("WKRPT401 gRPC server starting.");
}
main();
|
PHP | UTF-8 | 389 | 4.125 | 4 | [] | no_license | <?php
for ($i = 1; $i <=5; $i++) {
echo "Hello World!<br>";
}
// add all numbers from $start to $end
$sum = 0;
$start = 1;
$end = 10;
for ($i = $start; $i <= $end; $i++) {
    $sum += $i;
}
echo "The sum of numbers from {$start} to {$end} is {$sum}<br>";

// print the multiplication tables for $start through $end
for ($i = $start; $i <= $end; $i++) {
    for ($k = 1; $k <= 10; $k++) {
        $product = $i * $k;
        echo "{$i} x {$k} = {$product}<br>";
    }
}
Markdown | UTF-8 | 1,414 | 2.890625 | 3 | [] | no_license | # road-map
Implementation of a client-server architecture (C++) for querying a road map stored in a database system. The server stores the road map of a city (modeled as a graph) in disk pages, simulated as files in my program. Clients query this map through a simple get-neighbors query. The query takes the following inputs: (a) a source node, (b) a number of neighbors (k). I wrote BFS code which starts at the source node and returns the first k node ids found in the graph representation of the road map. Note that the BFS may span across different pages. I partitioned the graph using two methods: (1) uniform partition, (2) BFS partition.
THE FOLDER CONTAINS THE FOLLOWING FILES:

1 server.cpp

2 project.cpp

3 client.cpp

4 slides

####### FILE DESCRIPTION ##############

For partitioning:
in "project.cpp", uncomment the main function at lines 727-731.

%%%%%%%% Compile & Run: %%%%%%%%%%%

g++ project.cpp

./a.out

BFS_Partition() takes around 6-8 hours ("sorry, and thanks for your patience")

Uniform_Partition() takes around 2-3 minutes ("happy")

Partition done...............

######## RUNNING SERVER & CLIENT ##############

server.cpp: contains the server program,
client.cpp: contains the client program,

%%%%%%% Compile & Run %%%%%%%%%%%%%

g++ server.cpp -o server

./server

g++ client.cpp -o client_n ; where n={1,2........,N} is the number of clients you want to create.

./client_n
C++ | UTF-8 | 195 | 2.90625 | 3 | [] | no_license | #include <iostream>
using namespace std;
int main() {
string s;
cin >> s;
int result = 0;
for (char& c: s) {
if (c == '1') result++;
}
cout << result << endl;
} |
PHP | UTF-8 | 1,193 | 2.90625 | 3 | [
"LicenseRef-scancode-warranty-disclaimer",
"MIT"
] | permissive | <?php
namespace App\Twilio\Services;
use App\Services\Exceptions\MessageException;
use App\Services\Message as Service;
use Twilio\Exceptions\ConfigurationException;
use Twilio\Exceptions\TwilioException;
use Twilio\Rest\Client;
class Message implements Service
{
/**
* @var Client
*/
private $client;
/**
* Message constructor.
* @param Client|null $client
* @throws ConfigurationException
*/
public function __construct(Client $client = null)
{
if ($client !== null) {
$this->client = $client;
} else {
$this->client = new Client(env('TWILIO_ACCOUNT_SID'), env('TWILIO_AUTH_TOKEN'));
}
}
/**
* @param string $to
* @param string $body
* @return string
* @throws MessageException
*/
public function send(string $to, string $body)
{
try {
$message = $this->client->api->messages->create($to, [
'from' => env('TWILIO_PHONE_NUMBER'),
'body' => $body,
]);
} catch (TwilioException $e) {
throw new MessageException($e);
}
return $message->sid;
}
}
|
Java | UTF-8 | 951 | 3.015625 | 3 | [] | no_license | package SpaceShooter;
import java.awt.Point;
public class BoardDimensions {
private int left=-1, top=-1;
int width=-1, height=-1;
double x_scale=1.0, y_scale=1.0;
// note that width and height are the effective width and height
public void setParms(int top, int left, int width , int height)
{
this.top = top;
this.left = left;
this.width = width;
this.height = height;
x_scale = width /MyGame.fieldWidth;
y_scale = height/MyGame.fieldHeight;
}
DPoint toGenericPoint(Point p)
{
double dx = (p.x - left)/x_scale;
double dy = (p.y - top)/y_scale;
if (dx < 0.0 || dy < 0.0 || dx > MyGame.fieldWidth || dy > MyGame.fieldHeight)
return null;
else
return new DPoint(dx, dy);
}
double toPaintScaleX(double x)
{
return x*x_scale;
}
DPoint toPaintPoint(DPoint dp)
{
int x=left + (int)(dp.x*x_scale);
int y=top+ (int)(dp.y*y_scale);
return new DPoint(x,y);
}
}
|
Go | UTF-8 | 673 | 3.96875 | 4 | [] | no_license | package main
// import fmt and unsafe; unsafe provides the Sizeof function
import (
	"fmt"
	"unsafe"
)

/*
This function is used to practice the string type.
*/
func main() {
	// Basic usage:
	// - written with double quotes
	// - the string header takes 16 bytes by default (on 64-bit platforms)
	// - encoded as UTF-8
	// - strings are immutable
	var a string = "北京长城"
	fmt.Printf("a = %s, type is %T, size is %d\n", a, a, unsafe.Sizeof(a))

	// As mentioned earlier, strings are concatenated with +
	var b string = "abc"
	var c string = "def"
	var e = b + c
	fmt.Println(e)

	// To split a concatenation across lines, the + must be left at the end of the line
	f := "hello" + "hello" + "hello" + "hello" +
		"hello" + "hello" + "hello" + "hello" +
		"hello" + "hello"
	fmt.Println(f)
	// written this way, it is not an error
}
C# | UTF-8 | 758 | 2.578125 | 3 | [] | no_license | using System.Data;
namespace Ambiel.AdoNet
{
public class SqlParam
{
/// <summary>
/// Parameter name
/// </summary>
public string _param { get; set; }
/// <summary>
/// Parameter type
/// </summary>
public DbType _dbtype { get; set; }
/// <summary>
/// Parameter value
/// </summary>
public object _value { get; set; }
public SqlParam(string param, DbType dbtype, object value)
{
this._param = param;
this._dbtype = dbtype;
this._value = value;
}
public SqlParam(string param, object value)
{
this._param = param;
this._value = value;
}
}
} |
Java | UTF-8 | 1,468 | 2.203125 | 2 | [] | no_license | /*
* Copyright (C) 2015 HERU
*
* This program is free software: you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program. If not, see <http://www.gnu.org/licenses/>.
*/
package SystemClass;
/**
*
* @author HERU
*/
public class OccularRx {
private String OD,OS,PD,CASE,REMARKS;
public void setOD(String od){
this.OD = od;
}
public String getOD(){
return OD;
}
public void setOS(String os){
this.OS = os;
}
public String getOS(){
return OS;
}
public void setPD(String pd){
this.PD = pd;
}
public String getPD(){
return PD;
}
public void setCase(String c){
this.CASE = c;
}
public String getCase(){
return CASE;
}
public void setRemarks(String r){
this.REMARKS = r;
}
public String getRemarks(){
return REMARKS;
}
}
|
Java | UTF-8 | 1,486 | 2.015625 | 2 | [] | no_license | package com.buit.his.sys.openapi.response;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
/**
* @Author sunjx
* @Date 2020-12-15 15:47
* @Description
**/
@ApiModel("Query dictionary response")
public class QueryDictResponse {
    @ApiModelProperty(value = "code, used for data storage")
    private String dv;
    @ApiModelProperty(value = "name")
    private String dn;
    @ApiModelProperty(value = "whether disabled, sys_flag_data:11, 0: enabled, 1: disabled")
    private String stop;
    @ApiModelProperty(value = "sort order")
    private Integer showOrder;
    @ApiModelProperty(value = "pinyin code")
    private String py;
    @ApiModelProperty(value = "wubi code")
private String wb;
public String getDv() {
return dv;
}
public void setDv(String dv) {
this.dv = dv;
}
public String getDn() {
return dn;
}
public void setDn(String dn) {
this.dn = dn;
}
public String getStop() {
return stop;
}
public void setStop(String stop) {
this.stop = stop;
}
public Integer getShowOrder() {
return showOrder;
}
public void setShowOrder(Integer showOrder) {
this.showOrder = showOrder;
}
public String getPy() {
return py;
}
public void setPy(String py) {
this.py = py;
}
public String getWb() {
return wb;
}
public void setWb(String wb) {
this.wb = wb;
}
}
|
C++ | UTF-8 | 1,163 | 3.328125 | 3 | [] | no_license | #include <gtest/gtest.h>
#include <cstdlib>
#include <queue>
#include <unordered_map>
using namespace std;
class RecentCounter {
public:
RecentCounter() : interval_start_at_(1), num_calls_(0), calls_() {}
int ping(int t) {
int new_interval_start_at = t - 3000;
// outside of the interval
for (int i = interval_start_at_; i < new_interval_start_at; ++i) {
if (calls_.find(i) != calls_.end()) {
num_calls_--;
}
}
interval_start_at_ = new_interval_start_at;
num_calls_++;
calls_[t] = true;
return num_calls_;
}
private:
int interval_start_at_;
int num_calls_;
unordered_map<int, bool> calls_;
};
int main() {
{
RecentCounter counter;
EXPECT_EQ(1, counter.ping(1)); // 1
EXPECT_EQ(2, counter.ping(100)); // 1, 100
EXPECT_EQ(3, counter.ping(3001)); // 1, 100, 3001
EXPECT_EQ(3, counter.ping(3002)); // 100, 3001, 3002
}
{
RecentCounter counter;
EXPECT_EQ(1, counter.ping(642));
EXPECT_EQ(2, counter.ping(1849));
EXPECT_EQ(1, counter.ping(4921));
EXPECT_EQ(2, counter.ping(5936));
EXPECT_EQ(3, counter.ping(5957));
}
exit(EXIT_SUCCESS);
}
|
C++ | UTF-8 | 832 | 3.578125 | 4 | [
"MIT"
] | permissive | #pragma once
template <typename T>
class Stack
{
public:
bool empty()
{
return isEmpty;
}
void push(T data)
{
if (isEmpty)
{
top = new node;
top->data = data;
isEmpty = false;
}
else
{
node *temp = top;
top = new node;
top->data = data;
top->last = temp;
}
}
T GetTop()
{
if (isEmpty)
throw "Stack is Empty";
return top->data;
}
T pop()
{
if (isEmpty)
throw "Stack is Empty";
if (top->last == nullptr)
{
T data = top->data;
delete top;
isEmpty = true;
return data;
}
else
{
T data = top->data;
node *temp = top;
top = top->last;
delete temp;
return data;
}
}
~Stack()
{
while (true)
{
if (isEmpty)
break;
pop();
}
}
private:
struct node
{
T data;
node *last = nullptr;
};
node *top;
bool isEmpty = true;
}; |
Java | UTF-8 | 7,547 | 2.203125 | 2 | [
"Apache-2.0"
] | permissive | /*
* Copyright 2006-2012 Amazon Technologies, Inc. or its affiliates.
* Amazon, Amazon.com and Carbonado are trademarks or registered trademarks
* of Amazon Technologies, Inc. or its affiliates. All rights reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.amazon.carbonado.txn;
import com.amazon.carbonado.Cursor;
import junit.framework.TestCase;
import junit.framework.TestSuite;
import com.amazon.carbonado.cursor.EmptyCursorFactory;
/**
* Test case for TransactionManager.CursorList.
*
* @author Brian S O'Neill
*/
public class TestCursorList extends TestCase {
public static void main(String[] args) {
junit.textui.TestRunner.run(suite());
}
public static TestSuite suite() {
return new TestSuite(TestCursorList.class);
}
TransactionScope.CursorList mList;
public TestCursorList(String name) {
super(name);
}
protected void setUp() {
mList = new TransactionScope.CursorList();
}
public void testRegisterFew() {
assertEquals(0, mList.size());
{
Cursor cursor = EmptyCursorFactory.newEmptyCursor();
mList.register(cursor, null);
assertEquals(1, mList.size());
assertEquals(cursor, mList.getCursor(0));
assertEquals(null, mList.getValue(0));
Object value = mList.unregister(cursor);
assertEquals(0, mList.size());
assertEquals(null, value);
}
{
Cursor cursor_1 = EmptyCursorFactory.newEmptyCursor();
Cursor cursor_2 = EmptyCursorFactory.newEmptyCursor();
mList.register(cursor_1, null);
assertEquals(1, mList.size());
mList.register(cursor_2, null);
assertEquals(2, mList.size());
assertEquals(cursor_1, mList.getCursor(0));
assertEquals(cursor_2, mList.getCursor(1));
assertEquals(null, mList.getValue(0));
assertEquals(null, mList.getValue(1));
Object value = mList.unregister(cursor_2);
assertEquals(1, mList.size());
assertEquals(cursor_1, mList.getCursor(0));
assertEquals(null, value);
mList.unregister(cursor_2);
assertEquals(1, mList.size());
mList.unregister(cursor_1);
assertEquals(0, mList.size());
}
// unregister in reverse
{
Cursor cursor_1 = EmptyCursorFactory.newEmptyCursor();
Cursor cursor_2 = EmptyCursorFactory.newEmptyCursor();
mList.register(cursor_1, null);
mList.register(cursor_2, null);
mList.unregister(cursor_1);
assertEquals(1, mList.size());
assertEquals(cursor_2, mList.getCursor(0));
mList.unregister(cursor_1);
assertEquals(1, mList.size());
mList.unregister(cursor_2);
assertEquals(0, mList.size());
}
}
public void testRegisterFewValue() {
Cursor cursor_1 = EmptyCursorFactory.newEmptyCursor();
Cursor cursor_2 = EmptyCursorFactory.newEmptyCursor();
String value_1 = "1";
String value_2 = "2";
mList.register(cursor_1, value_1);
assertEquals(1, mList.size());
assertEquals(cursor_1, mList.getCursor(0));
assertEquals(value_1, mList.getValue(0));
mList.register(cursor_2, value_2);
assertEquals(2, mList.size());
assertEquals(cursor_1, mList.getCursor(0));
assertEquals(value_1, mList.getValue(0));
assertEquals(cursor_2, mList.getCursor(1));
assertEquals(value_2, mList.getValue(1));
Object value = mList.unregister(cursor_2);
assertEquals(1, mList.size());
assertEquals(cursor_1, mList.getCursor(0));
assertEquals(value_1, mList.getValue(0));
assertEquals(value_2, value);
value = mList.unregister(cursor_2);
assertEquals(1, mList.size());
assertEquals(null, value);
value = mList.unregister(cursor_1);
assertEquals(0, mList.size());
assertEquals(value_1, value);
}
// Tests that the array expands properly.
public void testRegisterMany() {
final int count = 50;
Cursor[] cursors = new Cursor[count];
for (int i=0; i<count; i++) {
cursors[i] = EmptyCursorFactory.newEmptyCursor();
mList.register(cursors[i], null);
assertEquals(i + 1, mList.size());
}
for (int i=0; i<count; i++) {
assertEquals(cursors[i], mList.getCursor(i));
assertEquals(null, mList.getValue(i));
}
for (int i=0; i<count; i++) {
mList.unregister(cursors[i]);
assertEquals(count - i - 1, mList.size());
}
}
// Tests that the arrays expand properly and store values.
public void testRegisterManyValues() {
final int count = 50;
Cursor[] cursors = new Cursor[count];
Integer[] values = new Integer[count];
for (int i=0; i<count; i++) {
cursors[i] = EmptyCursorFactory.newEmptyCursor();
values[i] = i;
mList.register(cursors[i], values[i]);
assertEquals(i + 1, mList.size());
}
for (int i=0; i<count; i++) {
assertEquals(cursors[i], mList.getCursor(i));
assertEquals(values[i], mList.getValue(i));
}
for (int i=0; i<count; i++) {
Object value = mList.unregister(cursors[i]);
assertEquals(count - i - 1, mList.size());
assertEquals(values[i], value);
}
}
public void testCloseCursors() throws Exception {
final int count = 50;
// Without values
{
Cursor[] cursors = new Cursor[count];
for (int i=0; i<count; i++) {
cursors[i] = EmptyCursorFactory.newEmptyCursor();
mList.register(cursors[i], null);
}
mList.closeCursors();
assertEquals(0, mList.size());
/*
for (int i=0; i<count; i++) {
assertEquals(true, cursors[i].isClosed());
}
*/
}
// With values
{
Cursor[] cursors = new Cursor[count];
Integer[] values = new Integer[count];
for (int i=0; i<count; i++) {
cursors[i] = EmptyCursorFactory.newEmptyCursor();
values[i] = i;
mList.register(cursors[i], values[i]);
}
mList.closeCursors();
assertEquals(0, mList.size());
/*
for (int i=0; i<count; i++) {
assertEquals(true, cursors[i].isClosed());
}
*/
}
}
}
|
Python | UTF-8 | 523 | 2.578125 | 3 | [
"Apache-2.0"
] | permissive | #!/usr/bin/env python
import requests
import re
import platform

distributionInfo = platform.platform()
matchObj = re.match(r'linux|darwin', distributionInfo, re.I)
if matchObj is None:
    raise RuntimeError('unsupported platform: %s' % distributionInfo)
OS = matchObj.group().lower()
def main():
pattern = re.compile(".*%s-amd64.tgz$" % OS)
r = requests.get('https://api.github.com/repos/minishift/minishift/releases/latest')
for asset in r.json()['assets']:
if pattern.match(asset['name']):
print(asset['browser_download_url'])
if __name__ == "__main__":
main()
|
Java | UTF-8 | 4,229 | 2.046875 | 2 | [] | no_license | package idv.kyle.practice.lucene;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.List;
import java.util.UUID;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.contrib.index.lucene.FileSystemDirectory;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.document.LongField;
import org.apache.lucene.document.StringField;
import org.apache.lucene.document.TextField;
import org.apache.lucene.index.DirectoryReader;
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.index.IndexWriterConfig.OpenMode;
import org.apache.lucene.index.IndexableField;
import org.apache.lucene.index.Term;
import org.apache.lucene.queryparser.classic.ParseException;
import org.apache.lucene.queryparser.classic.QueryParser;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.ScoreDoc;
import org.apache.lucene.search.TopDocs;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;
import org.apache.lucene.util.Version;
public class HelloLucene {
public void indexLocalFile(String indexPath) {
try {
Directory dir = FSDirectory.open(new File(indexPath));
Analyzer analyzer = new StandardAnalyzer(Version.LUCENE_47);
IndexWriterConfig iwc = new IndexWriterConfig(Version.LUCENE_47, analyzer);
iwc.setOpenMode(OpenMode.CREATE_OR_APPEND);
IndexWriter writer = new IndexWriter(dir, iwc);
indexDoc(writer);
writer.close();
} catch (IOException e) {
e.printStackTrace();
}
}
public void indexHDFS(String indexPath) {
try {
Configuration conf = new Configuration();
FileSystem fs = FileSystem.get(conf);
FileSystemDirectory dir = new FileSystemDirectory(fs, new Path(indexPath), false, conf);
Analyzer analyzer = new StandardAnalyzer(Version.LUCENE_47);
IndexWriterConfig iwc = new IndexWriterConfig(Version.LUCENE_47, analyzer);
iwc.setOpenMode(OpenMode.CREATE_OR_APPEND);
IndexWriter writer = new IndexWriter(dir, iwc);
indexDoc(writer);
writer.close();
} catch (IOException e) {
e.printStackTrace();
}
}
public void indexDoc(IndexWriter writer) throws IOException {
for (int i = 1; i <= 10000000; i++) {
Document doc = new Document();
doc.add(new StringField("id", "testid_" + i, Field.Store.YES));
doc.add(new StringField("name", "name_" + i, Field.Store.YES));
doc.add(new StringField("user", "user_" + i, Field.Store.YES));
doc.add(new StringField("location", "location_" + i, Field.Store.YES));
doc.add(new StringField("_version", "1", Field.Store.YES));
Field uid = new StringField("_uid", "test#" + i, Field.Store.YES);
//uid.setStringValue("uf#" + UUID.randomUUID().toString());
doc.add(uid);
writer.addDocument(doc);
}
}
public void searchAll(String indexPath){
try {
IndexReader reader = DirectoryReader.open(FSDirectory.open(new File(indexPath)));
IndexSearcher searcher = new IndexSearcher(reader);
Analyzer analyzer = new StandardAnalyzer(Version.LUCENE_47);
Query query = new QueryParser(Version.LUCENE_47, "id", analyzer).parse("testid_*");
TopDocs tds=searcher.search(query, 10);
ScoreDoc[] sds= tds.scoreDocs;
for (ScoreDoc sd:sds) {
Document document=searcher.doc(sd.doc);
List<IndexableField> fields = document.getFields();
System.out.print("score:" + sd.score);
for(IndexableField field : fields){
System.out.print(", " + field.name() + ":" + field.stringValue());
}
System.out.println("");
}
reader.close();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}catch (ParseException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
|
Java | UTF-8 | 2,587 | 2.5625 | 3 | [] | no_license | package com.test.moviesdb.activity;
import android.content.Intent;
import android.util.Log;
import android.widget.ImageView;
import android.widget.TextView;
import com.squareup.picasso.Picasso;
import com.test.moviesdb.R;
import com.test.moviesdb.model.Movie;
import com.test.moviesdb.utils.Constant;
/*
* Movie details activity to show data of the selected movie
*/
public class MovieDetails extends BaseActivity {
//TAG for debugging and logging purpose
private static final String TAG = MovieDetails.class.getSimpleName();
//Movie to be displayed on this screen
Movie mMovie;
//Variables to be mapped on views
TextView movieTitle;
TextView movieVoteCount;
TextView movieVoteAverage;
TextView movieOverview;
ImageView moviePoster;
@Override
protected int setActivityLayout() {
//To add the back navigation to home screen from top bar (action bar)
getSupportActionBar().setDisplayHomeAsUpEnabled(true);
return R.layout.activity_movie_details;
}
@Override
public boolean onSupportNavigateUp() {
finish();
return super.onSupportNavigateUp();
}
@Override
protected void initViews() {
movieTitle=findViewById(R.id.movie_title_text);
moviePoster = findViewById(R.id.movie_poster_big);
movieVoteAverage = findViewById(R.id.vote_average_text);
movieVoteCount = findViewById(R.id.vote_count_text);
movieOverview = findViewById(R.id.movie_overview_text);
}
@Override
protected void initValues() {
//Receiving the data sent by previous activity by using intent
Intent intent=getIntent();
//storing the movie object from received intent to mMovie object
mMovie=(Movie)intent.getSerializableExtra(Constant.MOVIE_OBJECT);
Log.i(TAG,mMovie.getMovieTitle());
}
@Override
protected void initValuesInViews() {
//Initializing all values into views
movieTitle.setText(mMovie.getMovieTitle());
movieVoteAverage.setText(mMovie.getVoteAverage()+"");
movieVoteCount.setText("("+mMovie.getVoteCount()+")");
movieOverview.setText(mMovie.getOverview());
if(mMovie.getMoviePoster()!=null)
{
//Display the high-quality (w500) image in the ImageView
String image_path= Constant.IMAGE_PATH_W500+mMovie.getMoviePoster();
Picasso.with(this).load(image_path).placeholder(R.drawable.default_image).into(moviePoster);
}
}
@Override
protected void setListenersOnViews() {
}
}
|
JavaScript | UTF-8 | 6,397 | 2.90625 | 3 | [
"MIT"
] | permissive | /*!
* js-logger - http://github.com/jonnyreeves/js-logger
* Jonny Reeves, http://jonnyreeves.co.uk/
* js-logger may be freely distributed under the MIT license.
*/
/*jshint sub:true*/
/*global console:true,define:true, module:true*/
(function (global) {
"use strict";
// Top level module for the global, static logger instance.
var Logger = { };
// For those that are at home that are keeping score.
Logger.VERSION = "0.9.13";
// Function which handles all incoming log messages.
var logHandler;
// Map of ContextualLogger instances by name; used by Logger.get() to return the same named instance.
var contextualLoggersByNameMap = {};
// Polyfill for ES5's Function.bind.
var bind = function(scope, func) {
return function() {
return func.apply(scope, arguments);
};
};
// Super exciting object merger-matron 9000 adding another 100 bytes to your download.
var merge = function () {
var args = arguments, target = args[0], key, i;
for (i = 1; i < args.length; i++) {
for (key in args[i]) {
if (!(key in target) && args[i].hasOwnProperty(key)) {
target[key] = args[i][key];
}
}
}
return target;
};
// Helper to define a logging level object; helps with optimisation.
var defineLogLevel = function(value, name) {
return { value: value, name: name };
};
// Predefined logging levels.
Logger.DEBUG = defineLogLevel(1, 'DEBUG');
Logger.INFO = defineLogLevel(2, 'INFO');
Logger.WARN = defineLogLevel(4, 'WARN');
Logger.ERROR = defineLogLevel(8, 'ERROR');
Logger.OFF = defineLogLevel(99, 'OFF');
// Inner class which performs the bulk of the work; ContextualLogger instances can be configured independently
// of each other.
var ContextualLogger = function(defaultContext) {
this.context = defaultContext;
this.setLevel(defaultContext.filterLevel);
this.log = this.info; // Convenience alias.
};
ContextualLogger.prototype = {
// Changes the current logging level for the logging instance.
setLevel: function (newLevel) {
// Ensure the supplied Level object looks valid.
if (newLevel && "value" in newLevel) {
this.context.filterLevel = newLevel;
}
},
// Is the logger configured to output messages at the supplied level?
enabledFor: function (lvl) {
var filterLevel = this.context.filterLevel;
return lvl.value >= filterLevel.value;
},
debug: function () {
this.invoke(Logger.DEBUG, arguments);
},
info: function () {
this.invoke(Logger.INFO, arguments);
},
warn: function () {
this.invoke(Logger.WARN, arguments);
},
error: function () {
this.invoke(Logger.ERROR, arguments);
},
// Invokes the logger callback if it's not being filtered.
invoke: function (level, msgArgs) {
if (logHandler && this.enabledFor(level)) {
logHandler(msgArgs, merge({ level: level }, this.context));
}
}
};
	// Protected instance which all calls to the top-level `Logger` module will be routed through.
var globalLogger = new ContextualLogger({ filterLevel: Logger.OFF });
// Configure the global Logger instance.
(function() {
// Shortcut for optimisers.
var L = Logger;
L.enabledFor = bind(globalLogger, globalLogger.enabledFor);
L.debug = bind(globalLogger, globalLogger.debug);
L.info = bind(globalLogger, globalLogger.info);
L.warn = bind(globalLogger, globalLogger.warn);
L.error = bind(globalLogger, globalLogger.error);
// Don't forget the convenience alias!
L.log = L.info;
}());
// Set the global logging handler. The supplied function should expect two arguments, the first being an arguments
// object with the supplied log messages and the second being a context object which contains a hash of stateful
// parameters which the logging function can consume.
Logger.setHandler = function (func) {
logHandler = func;
};
// Sets the global logging filter level which applies to *all* previously registered, and future Logger instances.
// (note that named loggers (retrieved via `Logger.get`) can be configured independently if required).
Logger.setLevel = function(level) {
// Set the globalLogger's level.
globalLogger.setLevel(level);
// Apply this level to all registered contextual loggers.
for (var key in contextualLoggersByNameMap) {
if (contextualLoggersByNameMap.hasOwnProperty(key)) {
contextualLoggersByNameMap[key].setLevel(level);
}
}
};
// Retrieve a ContextualLogger instance. Note that named loggers automatically inherit the global logger's level,
// default context and log handler.
Logger.get = function (name) {
// All logger instances are cached so they can be configured ahead of use.
return contextualLoggersByNameMap[name] ||
(contextualLoggersByNameMap[name] = new ContextualLogger(merge({ name: name }, globalLogger.context)));
};
	// Configure and enable a default implementation which writes to the `window.console` (if present).
Logger.useDefaults = function(defaultLevel) {
		// Check for the presence of a console object; bail out if none is available.
		if (typeof console === "undefined") {
return;
}
Logger.setLevel(defaultLevel || Logger.DEBUG);
Logger.setHandler(function(messages, context) {
var hdlr = console.log;
if (Logger.disabled || (Logger.filter && Logger.filter.indexOf(context.name) == -1)) {
return;
}
// Prepend the logger's name to the log message for easy identification.
if (context.name) {
messages[0] = "[" + context.level.name + "] [" + (new Date().toTimeString()).split(' ')[0] + "] [" + context.name + "] " + messages[0];
}
// Delegate through to custom warn/error loggers if present on the console.
if (context.level === Logger.WARN && console.warn) {
hdlr = console.warn;
} else if (context.level === Logger.ERROR && console.error) {
hdlr = console.error;
} else if (context.level === Logger.INFO && console.info) {
hdlr = console.info;
}
// Support for IE8+ (and other, slightly more sane environments)
Function.prototype.apply.call(hdlr, console, messages);
});
};
// Export to popular environments boilerplate.
if (typeof define === 'function' && define.amd) {
define(Logger);
}
else if (typeof module !== 'undefined' && module.exports) {
module.exports = Logger;
}
else {
Logger._prevLogger = global.Logger;
Logger.noConflict = function () {
global.Logger = Logger._prevLogger;
return Logger;
};
global.Logger = Logger;
}
}(this)); |
Java | UTF-8 | 424 | 3 | 3 | [] | no_license | package ua.course1.week3.MatrixMethods_HW1;
public class MatrixUtils {
    public static String toString(int[][] matrix) {
        StringBuilder result = new StringBuilder();
        for (int row = 0; row < matrix.length; row++) {
            for (int col = 0; col < matrix[row].length; col++) {
                result.append(matrix[row][col]).append(" ");
            }
            result.append("\n");
        }
        return result.toString();
    }
}
|
C | UTF-8 | 1,768 | 4.28125 | 4 | [
"MIT"
] | permissive | /*
============================================================================
Name : pthread_0.c
Author : govind
Version :
Copyright : Your copyright notice
Description : pthread in C, Ansi-style
============================================================================
*/
#include <stdio.h>
#include <stdlib.h>
#include <pthread.h>
#include <unistd.h>
// This is the thread function that will execute when the thread is created
// it passes and receives data by void pointers
void* threadFunction(void *value)
{
int *x = (int*) value; //cast the data passed to an int pointer
while (*x < 5)
{ //while the value of x is less than 5
usleep(1000); //sleep for 1ms - encourage main thread
(*x)++; //increment the value of x by 1
}
return x; //return the pointer x (as a void*)
}
int main()
{
int x = 0, y = 0;
pthread_t thread; //this is our handle to the pthread
// create the thread, pass the reference, address of the function and data
// pthread_create() returns 0 on the successful creation of a thread
if (pthread_create(&thread, NULL, &threadFunction, &x) != 0)
{
		printf("Failed to create the thread\n");
return 1;
}
// at this point the thread was created successfully
while (y < 5)
{ // loop and increment y, displaying values
printf("The value of x=%d and y=%d \n", x, y);
++y;
usleep(1000); // encourage the pthread to run
}
void *result; // OPTIONAL: receive data back from pthread
pthread_join(thread, &result); // allow the pthread to complete
int *z = (int*) result; // cast from void* to int* to get z
printf("Final: x=%d, y=%d and z=%d\n", x, y, *z);
return EXIT_SUCCESS;
}
|
Markdown | UTF-8 | 3,108 | 3.046875 | 3 | [
"MIT"
] | permissive | # EasySlug (easy creation of slugs for Laravel 5+)
[](http://choosealicense.com/licenses/mit/)
[](https://packagist.org/packages/daaner/easy-slug)
[](https://packagist.org/packages/daaner/easy-slug)
## Fix
- fix softDeleted trait
## About
Simple creation of a unique slug for Laravel 5+
## Installation
`composer require daaner/easy-slug`
- In the model header: `use EasySlug\EasySlug;`
- In the model body, use the trait: `use EasySlug;`
```
...
use EasySlug\EasySlug;
class BaseModel extends Model
{
use EasySlug;
...
```
## Usage
- Add a mutator to the desired model (or base model)
- The first parameter `$value` is our value
- The second parameter `slug` is the field in which our slug is stored
- The third parameter `custom_field` is the field from which the slug will be formed
- The slug is truncated to 100 characters

PS: If you want to form the slug from the `title` or `name` fields, the third parameter can be omitted (if `title` is found it is used first; otherwise `name` is used). If the third parameter is not specified or not found, the slug `'slug_'. date("Y-m-d-H-i-s")` is generated, checked for uniqueness and, if there are matches, the first unique number is appended with a hyphen.
```
public function setSlugAttribute($value) {
$this->EasySlugCheck($value, 'slug', 'custom_field');
}
```
## Contacts

https://t.me/neodaan (for any questions)
## License
EasySlug is open-sourced software licensed under the [MIT license](http://opensource.org/licenses/MIT).
|
C++ | UTF-8 | 2,750 | 2.5625 | 3 | [] | no_license | #include <iostream>
#include <stdlib.h>
#include <string.h>
#include <thread>
#include <time.h>
#include <queue>
#include <ppl.h>
#include <windows.h>
#include <mutex>
void haslo_init(int n, char* alfabet, char* haslo) {
for (int i = 0; i < n; i++)
{
haslo[i] = alfabet[0];
haslo[n] = 0;
}
}
int haslo_next(int n, int L, char* alfabet, char* haslo) {
for (int i = 0; i < L - 1; i++)
{
if (haslo[0] == alfabet[i])
{
haslo[0] = alfabet[i + 1];
return 1;
}
}
if (n == 0)
{
return 0;
}
haslo[0] = alfabet[0];
return haslo_next(n - 1, L, alfabet, &haslo[1]);
}
void Producent(int n, int L, char* alfabet, char* haslo, std::queue<char*> *bufor, std::mutex *mutex_, bool* stoped)
{
int bufor_size = 0;
haslo = (char*)malloc((n + 1) * sizeof(char));
char *swap_bufor = (char*)malloc((n + 1) * sizeof(char));
haslo_init(n, alfabet, haslo);
//std::cout << "producent " << haslo << std::endl;
int liczba_hasel=0;
do
{
mutex_->lock();
bufor->push(haslo);
bufor_size = bufor->size();
mutex_->unlock();
strcpy_s(swap_bufor, n + 1, haslo);
haslo = (char*)malloc((n + 1) * sizeof(char));
strcpy_s(haslo, n + 1, swap_bufor);
if (bufor_size > 100)
{
Sleep(1);
}
//liczba_hasel++;
if (haslo_next(n, L, alfabet, haslo) == 0)
{
//std::cout<<"wyprodukowano " << liczba_hasel << " hasel\n";
*stoped = true;
return;
}
//std::cout << "producent " << haslo << std::endl;
} while (!*stoped);
}
void Konsument(int n, std::queue<char*>* bufor,std::mutex *mutex_, bool* stoped)
{
char* haslo = (char*)"qwerty1234";
int liczba_hasel = 0;
while (1)
{
if (!bufor->empty())
{
mutex_->lock();
char *haslo2 = bufor->front();
bufor->pop();
mutex_->unlock();
if (!strcmp(haslo, haslo2))
{
				std::cout << "password guessed" << std::endl;
*stoped = true;
}
//std::cout << "konsument " << haslo2 << std::endl;
//liczba_hasel++;
free(haslo2);
}
else
{
if (*stoped) break;
}
}
//std::cout << "odebrano " << liczba_hasel << " hasel\n";
return;
}
int main()
{
for (int i = 1; i < 20; i++)
{
int dlugosc_hasla = i;
char* alfabet = (char*)"abcd";
int dlugosc_alfabetu = strlen(alfabet);
		char* haslo = nullptr; // initialised here; Producent allocates its own buffer
bool* stoped = new bool(false);
std::queue<char*>* bufor = new std::queue<char*>();
std::mutex* mutex_ = new std::mutex();
clock_t begin = clock();
std::thread producent(Producent, dlugosc_hasla, dlugosc_alfabetu, alfabet, haslo, bufor, mutex_, stoped);
std::thread konsument(Konsument, dlugosc_hasla, bufor, mutex_, stoped);
producent.join();
konsument.join();
clock_t end = clock();
int time_spent = (int)(1000 * (end - begin) / CLOCKS_PER_SEC);
std::cout << i << ". czas: " << time_spent << "ms\n";
}
}
|
Markdown | UTF-8 | 739 | 2.640625 | 3 | [
"MIT"
] | permissive | ---
layout: default
title: OpenHack - Fort Wayne, IN
---
## Fort Wayne, IN

### Info
[Squaremouth](http://www.squaremouth.com) is hosting OpenHack events in
our Fort Wayne office.
We'll keep our [events on Eventbrite](http://openhackfw.eventbrite.com)
and would love to have you join us! Be sure to RSVP for the event so we
know to get enough food.
If you'd like to get involved, ping
[@OpenHackFW](http://twitter.com/OpenHackFW) to let us know what
you're thinking!
### Next meetup
We usually hold our OpenHacks on the second Thursday of each month, but
you should really [check our Eventbrite page](http://openhackfw.eventbrite.com)
which will always have the next OpenHack event.
|
Go | UTF-8 | 1,831 | 3.21875 | 3 | [
"MIT"
] | permissive | package setlock
import (
"os"
"testing"
"time"
)
const (
lockfile = "setlockfile-test"
)
func TestLockWithBlocking(t *testing.T) {
defer os.Remove(lockfile)
go func() {
locker := NewLocker(lockfile, false)
locker.Lock()
defer locker.Unlock()
time.Sleep(5 * time.Second)
}()
time.Sleep(1 * time.Second)
begin := time.Now().Unix()
locker := NewLocker(lockfile, false)
locker.Lock()
defer locker.Unlock()
end := time.Now().Unix()
if end-begin < 3 { // XXX
t.Error("(Maybe) Lock is not blocking")
}
}
func TestLockWithNonBlocking(t *testing.T) {
defer os.Remove(lockfile)
go func() {
locker := NewLocker(lockfile, false)
locker.Lock()
defer locker.Unlock()
time.Sleep(5 * time.Second)
}()
time.Sleep(1 * time.Second)
defer func() {
err := recover()
// expected occurring panic
if err == nil {
t.Error("Not non-blocking")
return
}
}()
locker := NewLocker(lockfile, true)
locker.Lock()
defer locker.Unlock()
}
func TestLockWithErrWithBlocking(t *testing.T) {
defer os.Remove(lockfile)
go func() {
locker := NewLocker(lockfile, false)
locker.LockWithErr()
defer locker.Unlock()
time.Sleep(5 * time.Second)
}()
time.Sleep(1 * time.Second)
begin := time.Now().Unix()
locker := NewLocker(lockfile, false)
locker.LockWithErr()
defer locker.Unlock()
end := time.Now().Unix()
if end-begin < 3 { // XXX
t.Error("(Maybe) Lock is not blocking")
}
}
func TestLockWithErrWithNonBlocking(t *testing.T) {
defer os.Remove(lockfile)
go func() {
locker := NewLocker(lockfile, false)
locker.LockWithErr()
defer locker.Unlock()
time.Sleep(5 * time.Second)
}()
time.Sleep(1 * time.Second)
locker := NewLocker(lockfile, true)
// should raise error
if err := locker.LockWithErr(); err == nil {
t.Error("Not non-blocking")
return
}
}
|
Python | UTF-8 | 1,679 | 3 | 3 | [] | no_license | import torch
import numpy as np
# Loss Calculator
class LossMetric(object):
def __init__(self, loss_func):
self.metric = loss_func
def __call__(self, gt, pred):
loss = self.metric(gt, pred)
return torch.mean(loss)
# Logging Loss
class LossAccumulator(object):
def __init__(self, dataloader):
self.datasize = len(dataloader.dataset)
self.loss = 0
def __call__(self, loss):
self.loss += np.mean(loss.detach().cpu().numpy()) / self.datasize
    def clear(self):  # call this at the beginning of each epoch
self.loss = 0
class QtoPConverter(object):
def __init__(self, s=2):
self.s = s
def __call__(self, q):
# (Batch, Cluster)
batch_size, num_clusters = q.shape
f = torch.sum(q, dim=0)
numerator = q**self.s/torch.unsqueeze(f, 0).repeat(batch_size, 1)
denominator = torch.sum(numerator, dim=1)
        denominator = torch.unsqueeze(denominator, 1).repeat(1, num_clusters)
        p = numerator/denominator
return p
# https://pytorch.org/tutorials/beginner/fgsm_tutorial.html
def fgsm_attack(image, epsilon, data_grad):
# Collect the element-wise sign of the data gradient
sign_data_grad = data_grad.sign()
# Create the perturbed image by adjusting each pixel of the input image
perturbed_image = image + epsilon*sign_data_grad
# Adding clipping to maintain [0,1] range
perturbed_image = torch.clamp(perturbed_image, 0, 1).detach()#[WARNING] added .detach()
# Return the perturbed image
return perturbed_image
# KL Divergence
def KLDiv(p, q):
return p*torch.log(p/q) |
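The `QtoPConverter` above computes the sharpened auxiliary target distribution used in deep-embedded-clustering-style training. As a sanity check of the formula (not part of the original module), here is a minimal pure-Python sketch of the same computation with `s = 2`, avoiding the torch dependency; the function name `q_to_p` and the sample batch are illustrative:

```python
# Pure-Python sketch of the QtoPConverter formula above (s = 2),
# written without torch so the arithmetic is easy to follow.
def q_to_p(q, s=2):
    num_clusters = len(q[0])
    # f[j] = soft cluster frequency: sum of column j over the batch
    f = [sum(row[j] for row in q) for j in range(num_clusters)]
    p = []
    for row in q:
        # numerator_j = q_ij**s / f_j, then normalise each row to sum to 1
        numerator = [row[j] ** s / f[j] for j in range(num_clusters)]
        denom = sum(numerator)
        p.append([n / denom for n in numerator])
    return p

# Illustrative 3-sample, 2-cluster batch of soft assignments
q = [[0.9, 0.1], [0.6, 0.4], [0.2, 0.8]]
p = q_to_p(q)
```

Each row of `p` sums to 1 and is sharper than the corresponding row of `q`, which is the intended effect of the power `s` and the frequency normalisation.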
Markdown | UTF-8 | 3,359 | 2.78125 | 3 | [
"CC-BY-NC-4.0",
"MIT"
] | permissive | # Data Preparation
We describe the process of aligning long audio files with their transcripts and generating shorter audio segments below.
- Step 1: Download and install torchaudio using the nightly version. We have open sourced the CTC forced alignment algorithm described in our paper via [torchaudio](https://github.com/pytorch/audio/pull/3348).
```
pip install --pre torchaudio --index-url https://download.pytorch.org/whl/nightly/cu118
```
- Step 2: Download [uroman](https://github.com/isi-nlp/uroman) from Github. It is a universal romanizer which converts text in any script to the Latin alphabet. Use [this link](https://www.isi.edu/~ulf/uroman.html) to try their web interface.
```
git clone git@github.com:isi-nlp/uroman.git
```
- Step 3: Install a few other dependencies
```
pip install sox
pip install dataclasses
```
- Step 4: Create a text file containing the transcript for a (long) audio file. Each line in the text file will correspond to a separate audio segment that will be generated upon alignment.
Example content of the input text file :
```
Text of the desired first segment
Text of the desired second segment
Text of the desired third segment
```
- Step 5: Run forced alignment and segment the audio file into shorter segments.
```
python align_and_segment.py --audio /path/to/audio.wav --textfile /path/to/textfile --lang <iso> --outdir /path/to/output --uroman /path/to/uroman/bin
```
The above code will generate the audio segments under the output directory based on the content of each line in the input text file. The `manifest.json` file consists of the segmented audio filepaths and their corresponding transcripts.
```
> head /path/to/output/manifest.json
{"audio_start_sec": 0.0, "audio_filepath": "/path/to/output/segment1.flac", "duration": 6.8, "text": "she wondered afterwards how she could have spoken with that hard serenity how she could have", "normalized_text": "she wondered afterwards how she could have spoken with that hard serenity how she could have", "uroman_tokens": "s h e w o n d e r e d a f t e r w a r d s h o w s h e c o u l d h a v e s p o k e n w i t h t h a t h a r d s e r e n i t y h o w s h e c o u l d h a v e"}
{"audio_start_sec": 6.8, "audio_filepath": "/path/to/output/segment2.flac", "duration": 5.3, "text": "gone steadily on with story after story poem after poem till", "normalized_text": "gone steadily on with story after story poem after poem till", "uroman_tokens": "g o n e s t e a d i l y o n w i t h s t o r y a f t e r s t o r y p o e m a f t e r p o e m t i l l"}
{"audio_start_sec": 12.1, "audio_filepath": "/path/to/output/segment3.flac", "duration": 5.9, "text": "allan's grip on her hands relaxed and he fell into a heavy tired sleep", "normalized_text": "allan's grip on her hands relaxed and he fell into a heavy tired sleep", "uroman_tokens": "a l l a n ' s g r i p o n h e r h a n d s r e l a x e d a n d h e f e l l i n t o a h e a v y t i r e d s l e e p"}
```
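Since each line of `manifest.json` is an independent JSON object (JSON-lines style), it can be consumed line by line with the standard library. A small illustrative sketch (the helper name `read_manifest` is ours; field names are taken from the sample above):

```python
import json

def read_manifest(lines):
    """Parse JSON-lines manifest records and total up segment durations."""
    segments = [json.loads(line) for line in lines if line.strip()]
    total_duration = sum(seg["duration"] for seg in segments)
    return segments, total_duration

# Two sample records in the same shape as the manifest excerpt above
sample = (
    '{"audio_filepath": "segment1.flac", "duration": 6.8, "text": "first"}\n'
    '{"audio_filepath": "segment2.flac", "duration": 5.3, "text": "second"}\n'
)
segments, total_duration = read_manifest(sample.splitlines())
```

This is handy for quick checks such as counting segments or computing the total aligned duration before training.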
To visualize the segmented audio files, the [Speech Data Explorer](https://github.com/NVIDIA/NeMo/tree/main/tools/speech_data_explorer) tool from the NeMo toolkit can be used.

Because our alignment model outputs uroman tokens for input audio in any language, it also works with non-English audio and the corresponding transcripts.
|
SQL | UTF-8 | 44,845 | 3 | 3 | [] | no_license | -- --------------------------------------------------------
-- Host: 127.0.0.1
-- Server version: 5.6.37-log - MySQL Community Server (GPL)
-- Server OS: Win64
-- HeidiSQL Version: 9.4.0.5125
-- --------------------------------------------------------
/*!40101 SET @OLD_CHARACTER_SET_CLIENT=@@CHARACTER_SET_CLIENT */;
/*!40101 SET NAMES utf8 */;
/*!50503 SET NAMES utf8mb4 */;
/*!40014 SET @OLD_FOREIGN_KEY_CHECKS=@@FOREIGN_KEY_CHECKS, FOREIGN_KEY_CHECKS=0 */;
/*!40101 SET @OLD_SQL_MODE=@@SQL_MODE, SQL_MODE='NO_AUTO_VALUE_ON_ZERO' */;
-- Dumping database structure for saiapp_db
DROP DATABASE IF EXISTS `saiapp_db`;
CREATE DATABASE IF NOT EXISTS `saiapp_db` /*!40100 DEFAULT CHARACTER SET utf8 */;
USE `saiapp_db`;
-- Dumping structure for table saiapp_db.address
DROP TABLE IF EXISTS `address`;
CREATE TABLE IF NOT EXISTS `address` (
`address_id` int(11) NOT NULL AUTO_INCREMENT,
`address_line1` varchar(100) DEFAULT NULL,
`address_line2` varchar(100) DEFAULT NULL,
`city` varchar(100) DEFAULT NULL,
`zipcode` varchar(100) DEFAULT NULL,
`state` varchar(100) DEFAULT NULL,
`created_at` datetime DEFAULT NULL,
`version` int(11) DEFAULT NULL,
PRIMARY KEY (`address_id`)
) ENGINE=InnoDB AUTO_INCREMENT=262 DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.address: ~10 rows (approximately)
DELETE FROM `address`;
/*!40000 ALTER TABLE `address` DISABLE KEYS */;
INSERT INTO `address` (`address_id`, `address_line1`, `address_line2`, `city`, `zipcode`, `state`, `created_at`, `version`) VALUES
(203, 'Lingaraj Vihar', 'Pokhariput', NULL, '423423', NULL, '2017-03-30 18:10:19', 1),
(216, 'fsdfsdfsdf', 'fsdfsfs', NULL, NULL, NULL, '2017-03-31 10:15:15', 0),
(217, 'dasfsd', 'vsd', NULL, '312312', NULL, '2017-03-31 10:18:05', 0),
(218, 'dasda', 'dasda', NULL, '423543', NULL, '2017-03-31 19:41:40', 0),
(255, 'Pokhariput', 'BBSR', NULL, '751020', NULL, '2017-05-11 18:21:11', 0),
(257, 'dasd', 'das', NULL, NULL, NULL, '2017-05-16 16:53:13', 1),
(258, 'yyruuytuytu', 'dasd', NULL, NULL, NULL, '2017-05-16 14:55:20', 0),
(259, 'fsdf', 'fsdfs', NULL, NULL, NULL, '2017-05-16 15:39:05', 0),
(260, 'gdgf', 'gdfgd', NULL, NULL, NULL, '2017-05-16 15:55:34', 0),
(261, 'qeqweqewqeqwe', 'eqweqwqwe', NULL, NULL, NULL, '2017-05-16 16:53:28', 0);
/*!40000 ALTER TABLE `address` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.agent
DROP TABLE IF EXISTS `agent`;
CREATE TABLE IF NOT EXISTS `agent` (
`agent_id` int(11) NOT NULL AUTO_INCREMENT,
`name` varchar(100) DEFAULT NULL,
`mobile` varchar(45) DEFAULT NULL,
`enabled` int(11) DEFAULT NULL,
`created_at` datetime DEFAULT NULL,
`address_id` int(11) DEFAULT NULL,
`version` int(11) DEFAULT NULL,
`agent_type` varchar(45) DEFAULT NULL,
`del_flag` int(11) DEFAULT '0',
PRIMARY KEY (`agent_id`),
UNIQUE KEY `mobile` (`mobile`),
KEY `agent_address_fk_idx` (`address_id`),
CONSTRAINT `agent_address_fk` FOREIGN KEY (`address_id`) REFERENCES `address` (`address_id`) ON DELETE NO ACTION ON UPDATE NO ACTION
) ENGINE=InnoDB AUTO_INCREMENT=20 DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.agent: ~1 rows (approximately)
DELETE FROM `agent`;
/*!40000 ALTER TABLE `agent` DISABLE KEYS */;
INSERT INTO `agent` (`agent_id`, `name`, `mobile`, `enabled`, `created_at`, `address_id`, `version`, `agent_type`, `del_flag`) VALUES
(19, 'Tushar Das', '0987654321', NULL, '2017-05-11 18:21:11', 255, 0, 'ARCHITECT', 0);
/*!40000 ALTER TABLE `agent` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.application_users
DROP TABLE IF EXISTS `application_users`;
CREATE TABLE IF NOT EXISTS `application_users` (
`user_id` int(11) NOT NULL AUTO_INCREMENT,
`username` varchar(100) DEFAULT NULL,
`password` varchar(100) DEFAULT NULL,
`enabled` int(11) DEFAULT NULL,
`created_by` int(11) DEFAULT NULL,
`created_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
`version` int(11) DEFAULT NULL,
`password_changed` int(11) DEFAULT NULL,
`employee_id` int(11) DEFAULT NULL,
PRIMARY KEY (`user_id`),
UNIQUE KEY `username_uq_cons` (`username`),
UNIQUE KEY `idx_application_users` (`employee_id`),
CONSTRAINT `fk_application_users_employee` FOREIGN KEY (`employee_id`) REFERENCES `employee` (`employee_id`) ON DELETE NO ACTION ON UPDATE NO ACTION
) ENGINE=InnoDB AUTO_INCREMENT=10 DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.application_users: ~1 rows (approximately)
DELETE FROM `application_users`;
/*!40000 ALTER TABLE `application_users` DISABLE KEYS */;
INSERT INTO `application_users` (`user_id`, `username`, `password`, `enabled`, `created_by`, `created_at`, `version`, `password_changed`, `employee_id`) VALUES
(9, 'admin', 'admin', 1, 1, '2017-04-02 01:43:28', 0, NULL, 24);
/*!40000 ALTER TABLE `application_users` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.bank_accounts
DROP TABLE IF EXISTS `bank_accounts`;
CREATE TABLE IF NOT EXISTS `bank_accounts` (
`bank_accounts_id` int(10) NOT NULL AUTO_INCREMENT,
`bank_name` varchar(100) NOT NULL,
`account_number` bigint(16) NOT NULL,
`account_name` varchar(50) NOT NULL,
`address` varchar(100) NOT NULL,
`ifsc_code` varchar(10) NOT NULL,
`balance` bigint(20) DEFAULT '0',
`created_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
`created_by` int(10) NOT NULL,
`modified_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
`modified_by` int(10) NOT NULL,
`version` int(3) DEFAULT NULL,
PRIMARY KEY (`bank_accounts_id`),
UNIQUE KEY `account_number` (`account_number`),
KEY `FK_bank_accounts_createdBy_users` (`created_by`),
KEY `FK_bank_accounts_ModifiedBy_users` (`modified_by`),
CONSTRAINT `FK_bank_accounts_ModifiedBy_users` FOREIGN KEY (`modified_by`) REFERENCES `application_users` (`user_id`) ON DELETE NO ACTION ON UPDATE NO ACTION,
CONSTRAINT `FK_bank_accounts_createdBy_users` FOREIGN KEY (`created_by`) REFERENCES `application_users` (`user_id`) ON DELETE NO ACTION ON UPDATE NO ACTION
) ENGINE=InnoDB AUTO_INCREMENT=4 DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.bank_accounts: ~2 rows (approximately)
DELETE FROM `bank_accounts`;
/*!40000 ALTER TABLE `bank_accounts` DISABLE KEYS */;
INSERT INTO `bank_accounts` (`bank_accounts_id`, `bank_name`, `account_number`, `account_name`, `address`, `ifsc_code`, `balance`, `created_at`, `created_by`, `modified_at`, `modified_by`, `version`) VALUES
(1, 'Bank of Baroda', 24280500004332, 'Sai Plywood', 'Samantarapur', 'BOB2345', 9954, '2017-06-07 13:47:24', 9, '2017-06-07 13:47:24', 9, 3),
(3, 'Bank of Baroda', 24280500004333, 'Sai Plywood', 'Samantarapur', 'BOB2345', 0, '2017-06-07 13:47:24', 9, '2017-06-07 13:47:24', 9, 0);
/*!40000 ALTER TABLE `bank_accounts` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.bank_credits
DROP TABLE IF EXISTS `bank_credits`;
CREATE TABLE IF NOT EXISTS `bank_credits` (
`bank_credit_id` int(10) NOT NULL AUTO_INCREMENT,
`bank_name` varchar(100) NOT NULL,
`account_number` bigint(16) NOT NULL,
`deposit_date` datetime DEFAULT NULL,
`clearance_date` datetime DEFAULT NULL,
`credit_reason` varchar(500) DEFAULT NULL,
`payment_details_id` int(11) DEFAULT NULL,
`created_at` datetime NOT NULL DEFAULT CURRENT_TIMESTAMP,
`created_by` int(11) DEFAULT NULL,
`modified_at` datetime NOT NULL DEFAULT CURRENT_TIMESTAMP,
`modified_by` int(11) DEFAULT NULL,
`version` int(3) DEFAULT NULL,
PRIMARY KEY (`bank_credit_id`),
KEY `FK_bank_deposits_payment_details_id` (`payment_details_id`),
KEY `FK_bank_deposits_createdBy_users` (`created_by`),
KEY `FK_bank_deposits_ModifiedBy_users` (`modified_by`),
CONSTRAINT `FK_bank_deposits_ModifiedBy_users` FOREIGN KEY (`modified_by`) REFERENCES `application_users` (`user_id`) ON DELETE NO ACTION ON UPDATE NO ACTION,
CONSTRAINT `FK_bank_deposits_createdBy_users` FOREIGN KEY (`created_by`) REFERENCES `application_users` (`user_id`) ON DELETE NO ACTION ON UPDATE NO ACTION,
CONSTRAINT `FK_bank_deposits_payment_details_id` FOREIGN KEY (`payment_details_id`) REFERENCES `order_payment_details` (`payment_details_Id`) ON DELETE NO ACTION ON UPDATE NO ACTION
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.bank_credits: ~1 rows (approximately)
DELETE FROM `bank_credits`;
/*!40000 ALTER TABLE `bank_credits` DISABLE KEYS */;
/*!40000 ALTER TABLE `bank_credits` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.bank_debits
DROP TABLE IF EXISTS `bank_debits`;
CREATE TABLE IF NOT EXISTS `bank_debits` (
`bank_debit_id` int(10) NOT NULL AUTO_INCREMENT,
`bank_name` varchar(100) NOT NULL,
`account_number` bigint(16) NOT NULL,
`debit_date` datetime DEFAULT NULL,
`debit_reason` varchar(500) NOT NULL,
`payment_id` int(10) DEFAULT NULL,
`created_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
`created_by` int(10) NOT NULL,
`modified_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
`modified_by` int(10) NOT NULL,
`version` int(3) DEFAULT NULL,
PRIMARY KEY (`bank_debit_id`),
KEY `FK_bank_debits_payment_id` (`payment_id`),
KEY `FK_bank_debits_createdBy_users` (`created_by`),
KEY `FK_bank_debits_ModifiedBy_users` (`modified_by`),
CONSTRAINT `FK_bank_debits_ModifiedBy_users` FOREIGN KEY (`modified_by`) REFERENCES `application_users` (`user_id`) ON DELETE NO ACTION ON UPDATE NO ACTION,
CONSTRAINT `FK_bank_debits_createdBy_users` FOREIGN KEY (`created_by`) REFERENCES `application_users` (`user_id`) ON DELETE NO ACTION ON UPDATE NO ACTION,
CONSTRAINT `FK_bank_debits_payment_id` FOREIGN KEY (`payment_id`) REFERENCES `vendor_payment_details` (`payment_id`) ON DELETE NO ACTION ON UPDATE NO ACTION
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.bank_debits: ~0 rows (approximately)
DELETE FROM `bank_debits`;
/*!40000 ALTER TABLE `bank_debits` DISABLE KEYS */;
/*!40000 ALTER TABLE `bank_debits` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.challan
DROP TABLE IF EXISTS `challan`;
CREATE TABLE IF NOT EXISTS `challan` (
`challan_id` int(11) NOT NULL AUTO_INCREMENT,
`challan_number` varchar(45) DEFAULT NULL,
`order_id` int(11) DEFAULT NULL,
`status` varchar(45) DEFAULT NULL,
`created_at` varchar(45) DEFAULT NULL,
PRIMARY KEY (`challan_id`),
KEY `FK_challan_order_booking` (`order_id`),
CONSTRAINT `FK_challan_order_booking` FOREIGN KEY (`order_id`) REFERENCES `order_booking` (`order_id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.challan: ~0 rows (approximately)
DELETE FROM `challan`;
/*!40000 ALTER TABLE `challan` DISABLE KEYS */;
/*!40000 ALTER TABLE `challan` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.customer
DROP TABLE IF EXISTS `customer`;
CREATE TABLE IF NOT EXISTS `customer` (
`customer_id` int(11) NOT NULL AUTO_INCREMENT,
`first_name` varchar(100) DEFAULT NULL,
`last_name` varchar(100) DEFAULT NULL,
`customer_type` int(11) DEFAULT NULL,
`created_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
`address_id` int(11) DEFAULT NULL,
`email` varchar(45) DEFAULT NULL,
`mobile` varchar(45) DEFAULT NULL,
`phone` varchar(45) DEFAULT NULL,
`version` int(11) DEFAULT NULL,
PRIMARY KEY (`customer_id`),
UNIQUE KEY `mobile_UNIQUE` (`mobile`),
KEY `customer_address_fk_idx` (`address_id`),
CONSTRAINT `customer_address_fk` FOREIGN KEY (`address_id`) REFERENCES `address` (`address_id`) ON DELETE NO ACTION ON UPDATE NO ACTION
) ENGINE=InnoDB AUTO_INCREMENT=2 DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.customer: ~0 rows (approximately)
DELETE FROM `customer`;
/*!40000 ALTER TABLE `customer` DISABLE KEYS */;
INSERT INTO `customer` (`customer_id`, `first_name`, `last_name`, `customer_type`, `created_at`, `address_id`, `email`, `mobile`, `phone`, `version`) VALUES
(1, 'dasda', 'dasd', NULL, '2017-05-16 16:52:55', 257, NULL, '3123122313', NULL, 1);
/*!40000 ALTER TABLE `customer` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.designation
DROP TABLE IF EXISTS `designation`;
CREATE TABLE IF NOT EXISTS `designation` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`code` varchar(100) DEFAULT NULL,
`name` varchar(100) DEFAULT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.designation: ~0 rows (approximately)
DELETE FROM `designation`;
/*!40000 ALTER TABLE `designation` DISABLE KEYS */;
/*!40000 ALTER TABLE `designation` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.employee
DROP TABLE IF EXISTS `employee`;
CREATE TABLE IF NOT EXISTS `employee` (
`employee_id` int(11) NOT NULL AUTO_INCREMENT,
`first_name` varchar(100) DEFAULT NULL,
`middle_name` varchar(100) DEFAULT NULL,
`last_name` varchar(100) DEFAULT NULL,
`address_id` int(11) DEFAULT NULL,
`id_proof` varchar(100) DEFAULT NULL,
`id_type` varchar(100) DEFAULT NULL,
`email` varchar(100) DEFAULT NULL,
`created_at` datetime DEFAULT NULL,
`mobile` varchar(100) DEFAULT NULL,
`phone` varchar(100) DEFAULT NULL,
`del_flag` int(11) DEFAULT '0',
`version` int(11) DEFAULT NULL,
PRIMARY KEY (`employee_id`),
UNIQUE KEY `id_proof_UNIQUE` (`id_proof`),
KEY `FK_employee` (`address_id`),
KEY `idx_user_id` (`first_name`),
CONSTRAINT `fk_employee` FOREIGN KEY (`address_id`) REFERENCES `address` (`address_id`) ON DELETE NO ACTION ON UPDATE NO ACTION
) ENGINE=InnoDB AUTO_INCREMENT=25 DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.employee: ~1 rows (approximately)
DELETE FROM `employee`;
/*!40000 ALTER TABLE `employee` DISABLE KEYS */;
INSERT INTO `employee` (`employee_id`, `first_name`, `middle_name`, `last_name`, `address_id`, `id_proof`, `id_type`, `email`, `created_at`, `mobile`, `phone`, `del_flag`, `version`) VALUES
(24, 'Swadhin', 'Kumar', 'Mohanta', 203, 'BBWER6G', 'Pan Card', 'swadhin4@gmail.com', '2017-03-30 18:08:55', '9583291103', NULL, 0, 0);
/*!40000 ALTER TABLE `employee` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.grades
DROP TABLE IF EXISTS `grades`;
CREATE TABLE IF NOT EXISTS `grades` (
`grade_id` int(11) NOT NULL AUTO_INCREMENT,
`grade_name` varchar(100) DEFAULT NULL,
`thickness` decimal(10,2) DEFAULT NULL,
`thickness_unit` varchar(45) DEFAULT NULL,
`length` decimal(10,2) DEFAULT NULL,
`width` decimal(10,2) DEFAULT NULL,
`size_unit` varchar(10) DEFAULT NULL,
`price` decimal(10,2) DEFAULT NULL,
`unit_sqmtr_piece` decimal(10,2) DEFAULT NULL,
`vat` decimal(5,2) DEFAULT NULL,
`cgst` decimal(5,2) DEFAULT NULL,
`igst` decimal(5,2) DEFAULT NULL,
`sgst` decimal(5,2) DEFAULT NULL,
`vendor_id` int(11) DEFAULT NULL,
`created_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
`version` int(11) DEFAULT NULL,
`product_id` int(11) DEFAULT NULL,
`description` varchar(500) DEFAULT NULL,
PRIMARY KEY (`grade_id`),
KEY `FK_grades_vendor` (`vendor_id`),
KEY `FK_grades_product_idx` (`product_id`),
CONSTRAINT `FK_grades_product` FOREIGN KEY (`product_id`) REFERENCES `product` (`product_id`) ON DELETE NO ACTION ON UPDATE NO ACTION,
CONSTRAINT `FK_grades_vendor` FOREIGN KEY (`vendor_id`) REFERENCES `vendor` (`vendor_id`) ON DELETE NO ACTION ON UPDATE NO ACTION
) ENGINE=InnoDB AUTO_INCREMENT=6 DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.grades: ~4 rows (approximately)
DELETE FROM `grades`;
/*!40000 ALTER TABLE `grades` DISABLE KEYS */;
INSERT INTO `grades` (`grade_id`, `grade_name`, `thickness`, `thickness_unit`, `length`, `width`, `size_unit`, `price`, `unit_sqmtr_piece`, `vat`, `cgst`, `igst`, `sgst`, `vendor_id`, `created_at`, `version`, `product_id`, `description`) VALUES
(1, NULL, NULL, NULL, 10.00, 8.00, NULL, 100.00, 2.00, 4.00, NULL, NULL, NULL, 1, '2017-03-31 10:16:42', 4, 22, '19 MM PLY Century Club Plus 10 X 8'),
(2, '', NULL, '', 9.00, 8.00, 'MTR', 200.00, 2.00, 5.00, NULL, NULL, NULL, 1, '2017-04-03 19:59:45', 0, 22, '10MM Ply Century Club Plus 9 X 8'),
(4, NULL, NULL, NULL, 9.00, 8.00, NULL, 500.00, 2.00, 5.00, NULL, NULL, NULL, 1, '2017-04-03 21:39:03', 3, 22, '24MM Ply Century Club Plus 9 X 8'),
(5, NULL, NULL, NULL, NULL, NULL, NULL, 400.00, NULL, 4.00, NULL, NULL, NULL, 1, '2017-09-14 02:08:39', 5, 22, 'Test');
/*!40000 ALTER TABLE `grades` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.grades_backup
DROP TABLE IF EXISTS `grades_backup`;
CREATE TABLE IF NOT EXISTS `grades_backup` (
`grade_id` int(11) NOT NULL AUTO_INCREMENT,
`grade_name` varchar(100) DEFAULT NULL,
`thickness` decimal(10,2) DEFAULT NULL,
`thickness_unit` varchar(45) DEFAULT NULL,
`length` decimal(10,2) DEFAULT NULL,
`width` decimal(10,2) DEFAULT NULL,
`size_unit` varchar(10) DEFAULT NULL,
`price` decimal(10,2) DEFAULT NULL,
`vat` decimal(5,2) DEFAULT NULL,
`vendor_id` int(11) DEFAULT NULL,
`created_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
`version` int(11) DEFAULT NULL,
`product_id` int(11) DEFAULT NULL,
`description` varchar(500) DEFAULT NULL,
PRIMARY KEY (`grade_id`),
KEY `FK_grades_vendor` (`vendor_id`),
KEY `FK_grades_product_idx` (`product_id`),
CONSTRAINT `grades_backup_ibfk_1` FOREIGN KEY (`product_id`) REFERENCES `product` (`product_id`) ON DELETE NO ACTION ON UPDATE NO ACTION,
CONSTRAINT `grades_backup_ibfk_2` FOREIGN KEY (`vendor_id`) REFERENCES `vendor` (`vendor_id`) ON DELETE NO ACTION ON UPDATE NO ACTION
) ENGINE=InnoDB AUTO_INCREMENT=2 DEFAULT CHARSET=utf8 ROW_FORMAT=DYNAMIC;
-- Dumping data for table saiapp_db.grades_backup: ~1 rows (approximately)
DELETE FROM `grades_backup`;
/*!40000 ALTER TABLE `grades_backup` DISABLE KEYS */;
INSERT INTO `grades_backup` (`grade_id`, `grade_name`, `thickness`, `thickness_unit`, `length`, `width`, `size_unit`, `price`, `vat`, `vendor_id`, `created_at`, `version`, `product_id`, `description`) VALUES
(1, NULL, NULL, NULL, 10.00, 8.00, 'MTR', 100.00, 5.00, 1, '2017-03-31 10:16:42', 1, 22, 'fdsf');
/*!40000 ALTER TABLE `grades_backup` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.hardware
DROP TABLE IF EXISTS `hardware`;
CREATE TABLE IF NOT EXISTS `hardware` (
`hardware_stock_id` int(11) NOT NULL AUTO_INCREMENT,
`description` varchar(250) DEFAULT NULL,
`unit_price` decimal(10,2) DEFAULT NULL,
`cost_price` decimal(10,2) DEFAULT NULL,
`total_units` int(11) DEFAULT NULL,
`unit_type` varchar(50) DEFAULT NULL,
`vat` decimal(10,2) DEFAULT NULL,
`created_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
`version` int(11) DEFAULT NULL,
PRIMARY KEY (`hardware_stock_id`),
UNIQUE KEY `description` (`description`)
) ENGINE=InnoDB AUTO_INCREMENT=3 DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.hardware: ~2 rows (approximately)
DELETE FROM `hardware`;
/*!40000 ALTER TABLE `hardware` DISABLE KEYS */;
INSERT INTO `hardware` (`hardware_stock_id`, `description`, `unit_price`, `cost_price`, `total_units`, `unit_type`, `vat`, `created_at`, `version`) VALUES
(1, 'ADHESIVE FEVICOL MARINE 50KG', 8558.95, NULL, 10, 'Pieces', 14.50, '2017-04-04 19:44:59', 15),
(2, 'FEVICOL HEATX 2 LTRS', 698.68, NULL, 17, 'Litre', 14.50, '2017-04-04 19:46:15', 8);
/*!40000 ALTER TABLE `hardware` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.invoice
DROP TABLE IF EXISTS `invoice`;
CREATE TABLE IF NOT EXISTS `invoice` (
`invoice_id` int(11) NOT NULL AUTO_INCREMENT,
`invoice_number` varchar(50) DEFAULT NULL,
`order_id` int(11) DEFAULT NULL,
`invoice_type` varchar(50) DEFAULT NULL,
`invoice_status` varchar(50) DEFAULT NULL,
`created_at` timestamp NULL DEFAULT CURRENT_TIMESTAMP,
`created_by` int(11) DEFAULT NULL,
`version` int(11) DEFAULT NULL,
PRIMARY KEY (`invoice_id`),
UNIQUE KEY `invoice_number` (`invoice_number`),
KEY `FK_invoice_order_booking` (`order_id`),
CONSTRAINT `FK_invoice_order_booking` FOREIGN KEY (`order_id`) REFERENCES `order_booking` (`order_id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.invoice: ~0 rows (approximately)
DELETE FROM `invoice`;
/*!40000 ALTER TABLE `invoice` DISABLE KEYS */;
/*!40000 ALTER TABLE `invoice` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.order_booking
DROP TABLE IF EXISTS `order_booking`;
CREATE TABLE IF NOT EXISTS `order_booking` (
`order_id` int(11) NOT NULL AUTO_INCREMENT,
`order_number` varchar(20) DEFAULT NULL,
`customer_id` int(11) DEFAULT NULL,
`order_address_id` int(11) DEFAULT NULL,
`order_amount` decimal(10,2) DEFAULT NULL,
`created_at` datetime NOT NULL DEFAULT CURRENT_TIMESTAMP,
`last_updated_at` datetime DEFAULT NULL,
`order_status` varchar(50) DEFAULT NULL,
`payment_status` varchar(50) DEFAULT NULL,
`collection_agent` int(11) DEFAULT NULL,
`order_agent` int(11) DEFAULT NULL,
`last_modified_by` int(11) DEFAULT NULL,
`created_by` int(11) DEFAULT NULL,
`version` int(11) DEFAULT NULL,
PRIMARY KEY (`order_id`),
UNIQUE KEY `order_number` (`order_number`),
KEY `order_cust_fk_idx` (`customer_id`),
KEY `order_address_fk_idx` (`order_address_id`),
KEY `FK_order_agent` (`order_agent`),
KEY `FK_order_employee` (`collection_agent`),
KEY `FK_order_user` (`last_modified_by`),
KEY `FK_order_user_created_by` (`created_by`),
CONSTRAINT `FK_order_address` FOREIGN KEY (`order_address_id`) REFERENCES `address` (`address_id`) ON DELETE NO ACTION ON UPDATE NO ACTION,
CONSTRAINT `FK_order_agent` FOREIGN KEY (`order_agent`) REFERENCES `agent` (`agent_id`),
CONSTRAINT `FK_order_customer` FOREIGN KEY (`customer_id`) REFERENCES `customer` (`customer_id`) ON DELETE NO ACTION ON UPDATE NO ACTION,
CONSTRAINT `FK_order_employee` FOREIGN KEY (`collection_agent`) REFERENCES `application_users` (`user_id`),
CONSTRAINT `FK_order_user` FOREIGN KEY (`last_modified_by`) REFERENCES `application_users` (`user_id`),
CONSTRAINT `FK_order_user_created_by` FOREIGN KEY (`created_by`) REFERENCES `application_users` (`user_id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.order_booking: ~0 rows (approximately)
DELETE FROM `order_booking`;
/*!40000 ALTER TABLE `order_booking` DISABLE KEYS */;
/*!40000 ALTER TABLE `order_booking` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.order_item
DROP TABLE IF EXISTS `order_item`;
CREATE TABLE IF NOT EXISTS `order_item` (
`item_id` int(11) NOT NULL AUTO_INCREMENT,
`grade_id` int(11) DEFAULT NULL,
`hardware_stock_id` int(11) DEFAULT NULL,
`quantity` int(11) DEFAULT NULL,
`total_price` decimal(10,2) DEFAULT NULL,
`version` int(11) DEFAULT NULL,
`created_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
`order_id` int(11) DEFAULT NULL,
`total_square_meter` decimal(10,2) DEFAULT NULL,
PRIMARY KEY (`item_id`),
KEY `FK_order_items_order` (`order_id`),
KEY `FK_order_item_grade` (`grade_id`),
KEY `FK_order_item_hardware` (`hardware_stock_id`),
CONSTRAINT `FK_order_item_grade` FOREIGN KEY (`grade_id`) REFERENCES `grades` (`grade_id`),
CONSTRAINT `FK_order_item_hardware` FOREIGN KEY (`hardware_stock_id`) REFERENCES `hardware` (`hardware_stock_id`),
CONSTRAINT `FK_order_items_order` FOREIGN KEY (`order_id`) REFERENCES `order_booking` (`order_id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.order_item: ~0 rows (approximately)
DELETE FROM `order_item`;
/*!40000 ALTER TABLE `order_item` DISABLE KEYS */;
/*!40000 ALTER TABLE `order_item` ENABLE KEYS */;
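-- Illustrative example (commented out so the dump still restores unchanged):
-- a per-order total could be derived from this schema by joining order items
-- back to their orders. The grouping shown is an assumed usage, not part of
-- the original dump.
-- SELECT ob.order_number, SUM(oi.total_price) AS computed_total
-- FROM order_booking ob
-- JOIN order_item oi ON oi.order_id = ob.order_id
-- GROUP BY ob.order_number;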
-- Dumping structure for table saiapp_db.order_item_warehouse_breakup
DROP TABLE IF EXISTS `order_item_warehouse_breakup`;
CREATE TABLE IF NOT EXISTS `order_item_warehouse_breakup` (
`warehouse_breakup_id` int(11) NOT NULL AUTO_INCREMENT,
`warehouse_id` int(11) DEFAULT NULL,
`quantity` int(11) DEFAULT NULL,
`item_id` int(11) DEFAULT NULL,
PRIMARY KEY (`warehouse_breakup_id`),
KEY `warehouse_breakup_fk_idx` (`warehouse_id`),
KEY `item_breakup_fk_idx` (`item_id`),
CONSTRAINT `item_breakup_fk` FOREIGN KEY (`item_id`) REFERENCES `order_item` (`item_id`) ON DELETE NO ACTION ON UPDATE NO ACTION,
CONSTRAINT `warehouse_breakup_fk` FOREIGN KEY (`warehouse_id`) REFERENCES `warehouse` (`warehouse_id`) ON DELETE NO ACTION ON UPDATE NO ACTION
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.order_item_warehouse_breakup: ~0 rows (approximately)
DELETE FROM `order_item_warehouse_breakup`;
/*!40000 ALTER TABLE `order_item_warehouse_breakup` DISABLE KEYS */;
/*!40000 ALTER TABLE `order_item_warehouse_breakup` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.order_payment_details
DROP TABLE IF EXISTS `order_payment_details`;
CREATE TABLE IF NOT EXISTS `order_payment_details` (
`payment_details_Id` int(11) NOT NULL AUTO_INCREMENT,
`payment_amount` decimal(10,2) DEFAULT NULL,
`payment_type` varchar(50) DEFAULT NULL,
`bank_name` varchar(50) DEFAULT NULL,
`branch_name` varchar(50) DEFAULT NULL,
`amount_due` decimal(10,2) DEFAULT NULL,
`cheque_draft_no` varchar(20) DEFAULT NULL,
`cheque_draft_date` datetime DEFAULT NULL,
`payment_date` datetime DEFAULT NULL,
`next_pmt_due_date` date DEFAULT NULL,
`receipt` varchar(50) DEFAULT NULL,
`order_id` int(11) DEFAULT NULL,
`created_at` datetime DEFAULT NULL,
`remarks` varchar(200) DEFAULT NULL,
PRIMARY KEY (`payment_details_Id`),
KEY `Order_PMT_OrderId_FK_idx` (`order_id`),
CONSTRAINT `Order_PMT_OrderId_FK` FOREIGN KEY (`order_id`) REFERENCES `order_booking` (`order_id`) ON DELETE NO ACTION ON UPDATE NO ACTION
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COMMENT='Payment details of orders';
-- Dumping data for table saiapp_db.order_payment_details: ~0 rows (approximately)
DELETE FROM `order_payment_details`;
/*!40000 ALTER TABLE `order_payment_details` DISABLE KEYS */;
/*!40000 ALTER TABLE `order_payment_details` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.product
DROP TABLE IF EXISTS `product`;
CREATE TABLE IF NOT EXISTS `product` (
`product_id` int(11) NOT NULL AUTO_INCREMENT,
`product_name` varchar(100) DEFAULT NULL,
`created_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
`version` int(11) DEFAULT NULL,
`product_code` varchar(100) DEFAULT NULL,
`del_flag` int(11) DEFAULT NULL,
PRIMARY KEY (`product_id`)
) ENGINE=InnoDB AUTO_INCREMENT=29 DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.product: ~3 rows (approximately)
DELETE FROM `product`;
/*!40000 ALTER TABLE `product` DISABLE KEYS */;
INSERT INTO `product` (`product_id`, `product_name`, `created_at`, `version`, `product_code`, `del_flag`) VALUES
(22, 'Plywood', '2017-03-30 21:11:41', 5, NULL, 0),
(27, 'Laminates1', '2017-03-31 00:34:15', 1, NULL, 0),
(28, 'Hardware', '2017-04-04 19:42:57', 0, NULL, 0);
/*!40000 ALTER TABLE `product` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.product_grade_stock
DROP TABLE IF EXISTS `product_grade_stock`;
CREATE TABLE IF NOT EXISTS `product_grade_stock` (
`stock_id` int(11) NOT NULL AUTO_INCREMENT,
`grade_id` int(11) DEFAULT NULL,
`quantity` int(11) DEFAULT NULL,
`warehouse_id` int(11) DEFAULT NULL,
`created_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
`version` int(11) DEFAULT NULL,
PRIMARY KEY (`stock_id`),
KEY `FK_product_stock` (`grade_id`),
KEY `FK_product_stock_warehouse` (`warehouse_id`),
CONSTRAINT `FK_product_stock_grade` FOREIGN KEY (`grade_id`) REFERENCES `grades` (`grade_id`) ON DELETE NO ACTION ON UPDATE NO ACTION,
CONSTRAINT `FK_product_stock_warehouse` FOREIGN KEY (`warehouse_id`) REFERENCES `warehouse` (`warehouse_id`) ON DELETE NO ACTION ON UPDATE NO ACTION
) ENGINE=InnoDB AUTO_INCREMENT=21 DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.product_grade_stock: ~4 rows (approximately)
DELETE FROM `product_grade_stock`;
/*!40000 ALTER TABLE `product_grade_stock` DISABLE KEYS */;
INSERT INTO `product_grade_stock` (`stock_id`, `grade_id`, `quantity`, `warehouse_id`, `created_at`, `version`) VALUES
(17, 1, 171, 1, '2017-05-11 19:22:59', 21),
(18, 1, 170, 2, '2017-05-11 19:22:59', 23),
(19, 2, 200, 1, '2017-09-14 00:06:05', 0),
(20, 2, 100, 2, '2017-09-14 00:06:06', 0);
/*!40000 ALTER TABLE `product_grade_stock` ENABLE KEYS */;
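-- Illustrative example (commented out so the dump still restores unchanged):
-- per-warehouse stock for each grade, joining the tables dumped in this file.
-- The intended usage is an assumption, not stated in the dump itself.
-- SELECT g.description, w.warehouse_name, s.quantity
-- FROM product_grade_stock s
-- JOIN grades g ON g.grade_id = s.grade_id
-- JOIN warehouse w ON w.warehouse_id = s.warehouse_id
-- ORDER BY g.grade_id, w.warehouse_id;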
-- Dumping structure for table saiapp_db.purchase_order
DROP TABLE IF EXISTS `purchase_order`;
CREATE TABLE IF NOT EXISTS `purchase_order` (
`purchase_order_id` int(10) NOT NULL AUTO_INCREMENT,
`purchase_order_number` varchar(20) DEFAULT NULL,
`purchase_invoice_number` varchar(20) DEFAULT NULL,
`vendor_id` int(10) DEFAULT NULL,
`purchase_order_amount` decimal(10,2) DEFAULT NULL,
`order_status` varchar(50) DEFAULT NULL,
`payment_status` varchar(50) DEFAULT NULL,
`is_stock_entry` int(11) DEFAULT '0',
`created_by` int(10) DEFAULT NULL,
`created_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
`last_modified_by` int(10) DEFAULT NULL,
`last_updated_at` datetime DEFAULT NULL,
`version` int(3) DEFAULT NULL,
PRIMARY KEY (`purchase_order_id`),
UNIQUE KEY `purchage_order_number_unique` (`purchase_order_number`),
UNIQUE KEY `purchase_invoice_number` (`purchase_invoice_number`),
KEY `FK_purchage_order_vendor` (`vendor_id`),
KEY `FK_purchage_order_user_modified_by` (`last_modified_by`),
KEY `FK_purchage_order_user_created_by` (`created_by`),
CONSTRAINT `FK_purchage_order_user_created_by` FOREIGN KEY (`created_by`) REFERENCES `application_users` (`user_id`),
CONSTRAINT `FK_purchage_order_user_modified_by` FOREIGN KEY (`last_modified_by`) REFERENCES `application_users` (`user_id`),
CONSTRAINT `FK_purchage_order_vendor` FOREIGN KEY (`vendor_id`) REFERENCES `vendor` (`vendor_id`)
) ENGINE=InnoDB AUTO_INCREMENT=4 DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.purchase_order: ~1 rows (approximately)
DELETE FROM `purchase_order`;
/*!40000 ALTER TABLE `purchase_order` DISABLE KEYS */;
INSERT INTO `purchase_order` (`purchase_order_id`, `purchase_order_number`, `purchase_invoice_number`, `vendor_id`, `purchase_order_amount`, `order_status`, `payment_status`, `is_stock_entry`, `created_by`, `created_at`, `last_modified_by`, `last_updated_at`, `version`) VALUES
(3, 'dasdda', 'dasd', 1, 1.00, 'RECEIVED', 'PENDING', 1, 9, '2017-09-14 12:33:16', 9, '2017-09-14 12:34:15', 1);
/*!40000 ALTER TABLE `purchase_order` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.purchase_order_image
DROP TABLE IF EXISTS `purchase_order_image`;
CREATE TABLE IF NOT EXISTS `purchase_order_image` (
`img_id` int(11) NOT NULL AUTO_INCREMENT,
`image` blob,
`purchase_order_id` int(11) DEFAULT NULL,
PRIMARY KEY (`img_id`),
KEY `FK_purchase_order_image_purchase_order` (`purchase_order_id`),
CONSTRAINT `FK_purchase_order_image_purchase_order` FOREIGN KEY (`purchase_order_id`) REFERENCES `purchase_order` (`purchase_order_id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.purchase_order_image: ~0 rows (approximately)
DELETE FROM `purchase_order_image`;
/*!40000 ALTER TABLE `purchase_order_image` DISABLE KEYS */;
/*!40000 ALTER TABLE `purchase_order_image` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.purchase_order_item
DROP TABLE IF EXISTS `purchase_order_item`;
CREATE TABLE IF NOT EXISTS `purchase_order_item` (
`purchase_item_id` int(10) NOT NULL AUTO_INCREMENT,
`grade_id` int(10) DEFAULT NULL,
`description` varchar(50) DEFAULT NULL,
`total_sq_mtr` decimal(10,3) DEFAULT NULL,
`hardware_id` int(11) DEFAULT NULL,
`product_id` int(11) DEFAULT NULL,
`quantity` int(10) DEFAULT NULL,
`unit_cost_price` decimal(10,2) DEFAULT NULL,
`total_cost_price` decimal(10,2) DEFAULT NULL,
`is_stock_entered` int(11) DEFAULT '0',
`purchase_order_id` int(10) DEFAULT NULL,
`created_by` int(10) DEFAULT NULL,
`created_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
`last_modified_by` int(10) DEFAULT NULL,
`last_updated_at` datetime DEFAULT NULL,
`version` int(3) DEFAULT NULL,
PRIMARY KEY (`purchase_item_id`),
KEY `FK_purchage_items_grade` (`grade_id`),
KEY `FK_order_items_hardware` (`hardware_id`),
KEY `FK_purchage_items_order` (`purchase_order_id`),
KEY `FK_purchage_items_user_modified_by` (`last_modified_by`),
KEY `FK_purchage_items_user_created_by` (`created_by`),
KEY `FK_purchase_order_item_product` (`product_id`),
CONSTRAINT `FK_order_items_hardware` FOREIGN KEY (`hardware_id`) REFERENCES `hardware` (`hardware_stock_id`),
CONSTRAINT `FK_purchage_items_grade` FOREIGN KEY (`grade_id`) REFERENCES `grades` (`grade_id`),
CONSTRAINT `FK_purchage_items_order` FOREIGN KEY (`purchase_order_id`) REFERENCES `purchase_order` (`purchase_order_id`),
CONSTRAINT `FK_purchage_items_user_created_by` FOREIGN KEY (`created_by`) REFERENCES `application_users` (`user_id`),
CONSTRAINT `FK_purchage_items_user_modified_by` FOREIGN KEY (`last_modified_by`) REFERENCES `application_users` (`user_id`),
CONSTRAINT `FK_purchase_order_item_product` FOREIGN KEY (`product_id`) REFERENCES `product` (`product_id`)
) ENGINE=InnoDB AUTO_INCREMENT=2 DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.purchase_order_item: ~1 rows (approximately)
DELETE FROM `purchase_order_item`;
/*!40000 ALTER TABLE `purchase_order_item` DISABLE KEYS */;
INSERT INTO `purchase_order_item` (`purchase_item_id`, `grade_id`, `description`, `total_sq_mtr`, `hardware_id`, `product_id`, `quantity`, `unit_cost_price`, `total_cost_price`, `is_stock_entered`, `purchase_order_id`, `created_by`, `created_at`, `last_modified_by`, `last_updated_at`, `version`) VALUES
(1, 1, '19 MM PLY Century Club Plus 10 X 8', 1.000, NULL, 22, 1, 1.00, 1.00, 1, 3, 9, '2017-09-14 12:33:16', 9, NULL, 1);
/*!40000 ALTER TABLE `purchase_order_item` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.role
DROP TABLE IF EXISTS `role`;
CREATE TABLE IF NOT EXISTS `role` (
`role_id` int(11) NOT NULL AUTO_INCREMENT,
`role_name` varchar(100) DEFAULT NULL,
`description` varchar(100) DEFAULT NULL,
`version` int(11) DEFAULT NULL,
PRIMARY KEY (`role_id`)
) ENGINE=InnoDB AUTO_INCREMENT=4 DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.role: ~3 rows (approximately)
DELETE FROM `role`;
/*!40000 ALTER TABLE `role` DISABLE KEYS */;
INSERT INTO `role` (`role_id`, `role_name`, `description`, `version`) VALUES
(1, 'ROLE_ADMIN', 'Admin', 0),
(2, 'STOCK_MANAGER', 'Stock Manager', 0),
(3, 'COLLECTION_MANAGER', 'Collection Manager', 0);
/*!40000 ALTER TABLE `role` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.total_grade_stock
DROP TABLE IF EXISTS `total_grade_stock`;
CREATE TABLE IF NOT EXISTS `total_grade_stock` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`grade_id` int(11) DEFAULT NULL,
`total_quantity` int(11) DEFAULT NULL,
`version` int(11) DEFAULT NULL,
`created_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
PRIMARY KEY (`id`),
KEY `grade_total_fk_idx` (`grade_id`),
CONSTRAINT `grade_total_fk` FOREIGN KEY (`grade_id`) REFERENCES `grades` (`grade_id`) ON DELETE NO ACTION ON UPDATE NO ACTION
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.total_grade_stock: ~0 rows (approximately)
DELETE FROM `total_grade_stock`;
/*!40000 ALTER TABLE `total_grade_stock` DISABLE KEYS */;
/*!40000 ALTER TABLE `total_grade_stock` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.trader_account
DROP TABLE IF EXISTS `trader_account`;
CREATE TABLE IF NOT EXISTS `trader_account` (
`trader_account_id` int(11) NOT NULL AUTO_INCREMENT,
`bank_name` varchar(100) DEFAULT NULL,
`account_number` bigint(20) DEFAULT NULL,
`amount_balance` bigint(20) DEFAULT NULL,
`bank_address` varchar(100) DEFAULT NULL,
`ifsc_code` varchar(50) DEFAULT NULL,
PRIMARY KEY (`trader_account_id`),
UNIQUE KEY `account_number` (`account_number`)
) ENGINE=InnoDB AUTO_INCREMENT=7 DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.trader_account: ~4 rows (approximately)
DELETE FROM `trader_account`;
/*!40000 ALTER TABLE `trader_account` DISABLE KEYS */;
INSERT INTO `trader_account` (`trader_account_id`, `bank_name`, `account_number`, `amount_balance`, `bank_address`, `ifsc_code`) VALUES
(1, 'Bank of Baroda', 24280500004332, NULL, 'Samantarapur', 'BARB0SABHUB'),
(4, 'Bank of Baroda', 24280500004331, NULL, 'Samantarapur', 'BARB0SABHUB'),
(5, 'Bank of India', 12345678902345, NULL, 'Samantarapur', 'BOB0SABHUB'),
(6, 'Bank of India', 18899978902345, NULL, 'Samantarapur', 'BOB0SABHUB');
/*!40000 ALTER TABLE `trader_account` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.user_role
DROP TABLE IF EXISTS `user_role`;
CREATE TABLE IF NOT EXISTS `user_role` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`user_id` int(11) DEFAULT NULL,
`role_id` int(11) DEFAULT NULL,
PRIMARY KEY (`id`),
KEY `user_role_ibfk_1` (`role_id`),
KEY `user_role_ibfk_2` (`user_id`),
CONSTRAINT `user_role_ibfk_1` FOREIGN KEY (`role_id`) REFERENCES `role` (`role_id`) ON DELETE NO ACTION ON UPDATE NO ACTION,
CONSTRAINT `user_role_ibfk_2` FOREIGN KEY (`user_id`) REFERENCES `application_users` (`user_id`) ON DELETE NO ACTION ON UPDATE NO ACTION
) ENGINE=InnoDB AUTO_INCREMENT=33 DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.user_role: ~1 rows (approximately)
DELETE FROM `user_role`;
/*!40000 ALTER TABLE `user_role` DISABLE KEYS */;
INSERT INTO `user_role` (`id`, `user_id`, `role_id`) VALUES
(32, 9, 1);
/*!40000 ALTER TABLE `user_role` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.vendor
DROP TABLE IF EXISTS `vendor`;
CREATE TABLE IF NOT EXISTS `vendor` (
`vendor_id` int(11) NOT NULL AUTO_INCREMENT,
`name` varchar(100) DEFAULT NULL,
`address_id` int(11) DEFAULT NULL,
`enabled` char(1) DEFAULT NULL,
`created_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
`created_by` int(11) DEFAULT NULL,
`version` int(11) DEFAULT NULL,
`contact_person` varchar(45) DEFAULT NULL,
`phone` varchar(10) DEFAULT NULL,
`mobile` varchar(10) DEFAULT NULL,
`del_flag` int(11) DEFAULT '0',
PRIMARY KEY (`vendor_id`),
UNIQUE KEY `name_phone` (`name`,`phone`),
KEY `FK_vendor_address` (`address_id`),
CONSTRAINT `FK_vendor_address` FOREIGN KEY (`address_id`) REFERENCES `address` (`address_id`)
) ENGINE=InnoDB AUTO_INCREMENT=2 DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.vendor: ~1 rows (approximately)
DELETE FROM `vendor`;
/*!40000 ALTER TABLE `vendor` DISABLE KEYS */;
INSERT INTO `vendor` (`vendor_id`, `name`, `address_id`, `enabled`, `created_at`, `created_by`, `version`, `contact_person`, `phone`, `mobile`, `del_flag`) VALUES
(1, 'Century', 216, NULL, '2017-03-31 10:15:21', 0, 2, 'Test', '5345345345', '3424234234', 0);
/*!40000 ALTER TABLE `vendor` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.vendor_bank_details
DROP TABLE IF EXISTS `vendor_bank_details`;
CREATE TABLE IF NOT EXISTS `vendor_bank_details` (
`bank_id` int(11) NOT NULL AUTO_INCREMENT,
`bank_name` varchar(100) DEFAULT NULL,
`account_name` varchar(100) DEFAULT NULL,
`account_number` bigint(20) DEFAULT NULL,
`ifsc_code` varchar(100) DEFAULT NULL,
`branch` varchar(100) DEFAULT NULL,
`address` varchar(300) DEFAULT NULL,
`vendor_id` int(11) DEFAULT NULL,
`version` int(11) DEFAULT '0',
PRIMARY KEY (`bank_id`),
KEY `FK_vendor_bank_details_vendor` (`vendor_id`),
CONSTRAINT `FK_vendor_bank_details_vendor` FOREIGN KEY (`vendor_id`) REFERENCES `vendor` (`vendor_id`)
) ENGINE=InnoDB AUTO_INCREMENT=5 DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.vendor_bank_details: ~1 rows (approximately)
DELETE FROM `vendor_bank_details`;
/*!40000 ALTER TABLE `vendor_bank_details` DISABLE KEYS */;
INSERT INTO `vendor_bank_details` (`bank_id`, `bank_name`, `account_name`, `account_number`, `ifsc_code`, `branch`, `address`, `vendor_id`, `version`) VALUES
(4, 'ICIC', 'dfsf', 3213, 'fsdfsdf', 'fsd', 'fsdfsf', 1, 0);
/*!40000 ALTER TABLE `vendor_bank_details` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.vendor_payment_details
DROP TABLE IF EXISTS `vendor_payment_details`;
CREATE TABLE IF NOT EXISTS `vendor_payment_details` (
`payment_id` int(10) NOT NULL AUTO_INCREMENT,
`payment_amount` decimal(10,2) DEFAULT NULL,
`payment_type` varchar(50) DEFAULT NULL,
`cheque_draft_no` varchar(20) DEFAULT NULL,
`cheque_draft_date` datetime DEFAULT NULL,
`deposit_date` datetime DEFAULT NULL,
`bank_name` varchar(50) DEFAULT NULL,
`branch_name` varchar(50) DEFAULT NULL,
`vendor_bank_name` varchar(100) DEFAULT NULL,
`vendor_account_number` bigint(16) NOT NULL,
`transanction_no` varchar(20) DEFAULT NULL,
`receipt_no` varchar(20) DEFAULT NULL,
`amount_due` decimal(10,2) DEFAULT NULL,
`next_pmt_due_date` datetime DEFAULT NULL,
`purchage_order_id` int(11) DEFAULT NULL,
`remarks` varchar(500) DEFAULT NULL,
`created_by` int(10) DEFAULT NULL,
`created_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
`last_modified_by` int(10) DEFAULT NULL,
`last_updated_at` datetime DEFAULT NULL,
`version` int(3) DEFAULT NULL,
PRIMARY KEY (`payment_id`),
KEY `Order_PMT_purchage_OrderId_FK` (`purchage_order_id`),
KEY `FK_vendor_payment_user_modified_by` (`last_modified_by`),
KEY `FK_vendor_payment_user_created_by` (`created_by`),
CONSTRAINT `FK_vendor_payment_user_created_by` FOREIGN KEY (`created_by`) REFERENCES `application_users` (`user_id`),
CONSTRAINT `FK_vendor_payment_user_modified_by` FOREIGN KEY (`last_modified_by`) REFERENCES `application_users` (`user_id`),
CONSTRAINT `Order_PMT_purchage_OrderId_FK` FOREIGN KEY (`purchage_order_id`) REFERENCES `purchase_order` (`purchase_order_id`) ON DELETE NO ACTION ON UPDATE NO ACTION
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.vendor_payment_details: ~0 rows (approximately)
DELETE FROM `vendor_payment_details`;
/*!40000 ALTER TABLE `vendor_payment_details` DISABLE KEYS */;
/*!40000 ALTER TABLE `vendor_payment_details` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.vendor_product
DROP TABLE IF EXISTS `vendor_product`;
CREATE TABLE IF NOT EXISTS `vendor_product` (
`product_id` int(11) NOT NULL,
`VENDOR_ID` int(11) NOT NULL,
PRIMARY KEY (`product_id`,`VENDOR_ID`),
KEY `VENDOR_FK_idx` (`VENDOR_ID`),
CONSTRAINT `BU_FK` FOREIGN KEY (`product_id`) REFERENCES `product` (`product_id`) ON DELETE CASCADE ON UPDATE CASCADE,
CONSTRAINT `VENDOR_FK` FOREIGN KEY (`VENDOR_ID`) REFERENCES `vendor` (`vendor_id`) ON DELETE CASCADE ON UPDATE CASCADE
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.vendor_product: ~0 rows (approximately)
DELETE FROM `vendor_product`;
/*!40000 ALTER TABLE `vendor_product` DISABLE KEYS */;
/*!40000 ALTER TABLE `vendor_product` ENABLE KEYS */;
-- Dumping structure for table saiapp_db.warehouse
DROP TABLE IF EXISTS `warehouse`;
CREATE TABLE IF NOT EXISTS `warehouse` (
`warehouse_id` int(11) NOT NULL AUTO_INCREMENT,
`warehouse_name` varchar(100) DEFAULT NULL,
`address_id` int(11) DEFAULT NULL,
`incharge_id` int(11) DEFAULT NULL,
`created_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
`version` int(11) DEFAULT NULL,
`capacity` int(11) DEFAULT NULL,
`del_flag` int(11) DEFAULT '0',
PRIMARY KEY (`warehouse_id`),
KEY `FK_warehouse` (`address_id`),
KEY `FK_warehouse_employee` (`incharge_id`),
CONSTRAINT `FK_warehouse` FOREIGN KEY (`address_id`) REFERENCES `address` (`address_id`) ON DELETE NO ACTION ON UPDATE NO ACTION,
CONSTRAINT `FK_warehouse_employee` FOREIGN KEY (`incharge_id`) REFERENCES `employee` (`employee_id`) ON DELETE NO ACTION ON UPDATE NO ACTION
) ENGINE=InnoDB AUTO_INCREMENT=3 DEFAULT CHARSET=utf8;
-- Dumping data for table saiapp_db.warehouse: ~2 rows (approximately)
DELETE FROM `warehouse`;
/*!40000 ALTER TABLE `warehouse` DISABLE KEYS */;
INSERT INTO `warehouse` (`warehouse_id`, `warehouse_name`, `address_id`, `incharge_id`, `created_at`, `version`, `capacity`, `del_flag`) VALUES
(1, 'W1', 217, 24, '2017-03-31 10:18:05', 0, 1000, 0),
(2, 'W2', 218, 24, '2017-03-31 19:41:40', 0, 200, 0);
/*!40000 ALTER TABLE `warehouse` ENABLE KEYS */;
/*!40101 SET SQL_MODE=IFNULL(@OLD_SQL_MODE, '') */;
/*!40014 SET FOREIGN_KEY_CHECKS=IF(@OLD_FOREIGN_KEY_CHECKS IS NULL, 1, @OLD_FOREIGN_KEY_CHECKS) */;
/*!40101 SET CHARACTER_SET_CLIENT=@OLD_CHARACTER_SET_CLIENT */;
package com.sundeinfo.sypglass.service;
import com.sundeinfo.core.log.ActionLog;
import com.sundeinfo.core.log.LogInfo;
import com.sundeinfo.core.log.LogService;
import com.sundeinfo.core.mvc.service.AbstractService;
import com.sundeinfo.core.utility.ConvertUtils;
import com.sundeinfo.security.utility.AuthenticationGetter;
import com.sundeinfo.sypglass.mapper.ActionLogMapper;
import com.sundeinfo.sypglass.model.ActionLogExample;
import org.aspectj.lang.ProceedingJoinPoint;
import org.springframework.stereotype.Service;
import javax.servlet.http.HttpServletRequest;
import java.util.Enumeration;
import java.util.List;
@Service("ActionLogService")
public class ActionLogService extends AbstractService<ActionLogService,LogInfo> implements LogService<ActionLogService,LogInfo> {
private ActionLogMapper actionLogMapper;
private AuthenticationGetter authenticationGetter;
public void setActionLogMapper(ActionLogMapper actionLogMapper) {
this.actionLogMapper = actionLogMapper;
}
public void setAuthenticationGetter(AuthenticationGetter authenticationGetter) {
this.authenticationGetter = authenticationGetter;
}
    /**
     * Builds a LogInfo from the intercepted join point and the current HTTP
     * request; returns null if log construction fails.
     */
    @Override
public LogInfo createLog(ProceedingJoinPoint joinPoint, ActionLog log, HttpServletRequest request) {
StringBuilder builder = new StringBuilder();
        builder.append("Parameter list(");
for (Object element:joinPoint.getArgs()) {
builder.append("type:").append(element.getClass().getTypeName()).append(";");
builder.append("value:").append(element.toString()).append(";");
}
builder.append(")");
try {
LogInfo logInfo = create(log,joinPoint);
logInfo.setHttp_Method(request.getMethod());
logInfo.setClass_Method(joinPoint.getSignature().getDeclaringTypeName() + "." + joinPoint.getSignature().getName());
logInfo.setIp(request.getRemoteAddr());
logInfo.setUrl(request.getRequestURL().toString());
logInfo.setParameter(builder.toString());
if (authenticationGetter != null && authenticationGetter.getAnyUser() != null){
logInfo.setUserId(authenticationGetter.getAnyUser().getUser().getId());
logInfo.setUserName(authenticationGetter.getAnyUser().getUser().getName());
}
return logInfo;
}catch (Exception ex){
return null;
}
}
@Override
public void export(LogInfo logInfo) {
try {
if (logInfo != null && logInfo.isNeedful()){
actionLogMapper.insert(ConvertUtils.convert(logInfo, com.sundeinfo.sypglass.model.ActionLog.class));
}
}catch (Exception ex){
            logger.error("failed to save log, exception: " + ex.getMessage());
}
}
private LogInfo create(ActionLog log,ProceedingJoinPoint joinPoint) throws IllegalAccessException, InstantiationException {
LogInfo logInfo = null;
if (LogInfo.class.isAssignableFrom(log.execClass())){
logInfo = (LogInfo)log.execClass().newInstance();
}else{
logInfo = new LogInfo();
}
logInfo.setJoinPoint(joinPoint);
return logInfo;
}
}
|
Java | UTF-8 | 265 | 1.875 | 2 | [] | no_license | package p;
/*
* This test checks to make sure a Tool verifies that an
* indirectly-loaded source file contains the type that it is named
* after.
*/
// This class just serves to force indirectly loading the type Referenced:
class Master extends Referenced {}
|
Markdown | UTF-8 | 4,802 | 3.4375 | 3 | [
"Apache-2.0"
] | permissive | ---
seo:
title: Schema Evolution
description: Schema Evolution adds new information to, or restructures, an Event.
---
# Schema Evolution
Schema Evolution is an important aspect of data management.
Similar to how APIs evolve and need to be compatible with all applications that rely on older or newer API versions, schemas also evolve, and likewise need to be compatible with all applications that rely on older or newer schema versions.
## Problem
How can I restructure or add new information to an event, ideally in a way that ensures [Schema Compatibility](schema-compatibility.md)?
## Solution

One approach in evolving a schema is to evolve the schema "in place" (as illustrated above). In this approach, a stream can contain events that use new and previous schema versions. The schema compatibility checks then ensure that [Event Processing Applications](../event-processing/event-processing-application.md) and [Event Sinks](../event-sink/event-sink.md) can read schemas in both formats.

Another approach is to perform _dual schema upgrades_, also known as creating _versioned streams_ (illustrated above). This approach is especially useful when you need to introduce breaking changes into a stream's schema(s) and the new schema will be incompatible with the previous schema. Here, [Event Sources](../event-source/event-source.md) write to two streams:
1. One stream that uses the previous schema version, such as `payments-v1`.
2. Another stream that uses the new schema version, such as `payments-v2`.
Event Processing Applications and Event Sinks then each consume from the stream that uses the schema version with which they are compatible.
After upgrading all consumers to the new schema version, you can retire the stream that uses the old schema version.
## Implementation
For "in-place" schema evolution, an [Event Stream](../event-stream/event-stream.md) has multiple versions of the same schema, and the different versions are [compatible with each other](schema-compatibility.md).
For example, if a field is removed, then a consumer that was developed to process events without this field will be able to process events that were written with the old schema and contain the field; the consumer will simply ignore that field.
In this case, if the original schema was as follows:
```
{"namespace": "io.confluent.examples.client",
"type": "record",
"name": "Event",
"fields": [
{"name": "field1", "type": "boolean", "default": true},
{"name": "field2", "type": "string"}
]
}
```
Then we could modify the schema to omit `field2`, as shown below, and the event stream would then have a mix of both schema types.
If new consumers process events written with the old schema, these consumers would ignore `field2`.
```
{"namespace": "io.confluent.examples.client",
"type": "record",
"name": "Event",
"fields": [
{"name": "field1", "type": "boolean", "default": true}
]
}
```
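As a minimal, library-agnostic sketch (the field names come from the schemas above, but the reader logic itself is an illustrative assumption, not Confluent code), this is the effect a v2 consumer sees when it reads an event written with the v1 schema:

```python
# Fields declared by the new (v2) schema; field2 was removed.
V2_FIELDS = {"field1"}

def read_with_v2_schema(event):
    """Project an event onto the v2 schema: fields the reader no longer
    declares (here, field2) are simply ignored."""
    return {name: value for name, value in event.items() if name in V2_FIELDS}

# An event written with the old (v1) schema still carries field2 ...
old_event = {"field1": True, "field2": "some string"}
# ... but the v2 consumer just drops it:
assert read_with_v2_schema(old_event) == {"field1": True}
```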
_Dual schema upgrades_ involve breaking changes, so the processing flow needs to be more explicit: clients must only write to and read from the Event Stream that corresponds to the schema that they can process.
For example, an application that can process a v1 schema would read from one stream:
```
CREATE STREAM payments-v1 (transaction_id BIGINT, username VARCHAR)
WITH (kafka_topic='payments-v1');
```
And an application that can process a v2 schema would read from another stream:
```
CREATE STREAM payments-v2 (transaction_id BIGINT, store_id VARCHAR)
WITH (kafka_topic='payments-v2');
```
## Considerations
Follow [Schema Compatibility](../event-stream/schema-compatibility.md) rules to determine which schema changes are compatible and which are breaking changes.
## References
* See also the [Event Serializer](../event/event-serializer.md) pattern, in which we encode events so that they can be written to disk, transferred across the network, and generally preserved for future readers
* See also the [Schema-on-Read](../event/schema-on-read.md) pattern, which enables the reader of events to determine which schema to apply to the Event that the reader is processing.
* [Schema evolution and compatibility](https://docs.confluent.io/platform/current/schema-registry/avro.html) covers backward compatibility, forward compatibility, and full compatibility.
* [Working with schemas](https://docs.confluent.io/cloud/current/client-apps/schemas-manage.html) covers how to create schemas, edit them, and compare versions.
* The [Schema Registry Maven Plugin](https://docs.confluent.io/platform/current/schema-registry/develop/maven-plugin.html#schema-registry-test-compatibility) allows you to test for schema compatibility during the development cycle.
|
Shell | UTF-8 | 14,707 | 3.21875 | 3 | [
"MIT"
] | permissive | # This code corresponds to Fig. S7(a) in Alexander et al.
# It runs programs/scripts to summarize the genic regions covered
# by the ddRADseq sequencing and generates genomic cline analyses
# via bgc (and outputs Fig 4, Table S6, and Fig S11) in
# Alexander et al.
# 1. Getting bgc inputs together
# Making directory and copying necessary inputs into it
mkdir bgc
cd bgc
# Following files need to be copied into the bgc folder
# chickadee_ref.vcf
# *.snps.map or *.snpsmap from ipyrad outfiles
# S7b_generate_bgc_inputs.R
# S7c_stationarity_convergence_results.R
# Table_S1.txt
#2. Downloading annotations for black-capped chickadee genome
wget https://ftp.ncbi.nlm.nih.gov/genomes/all/GCA/011/421/415/GCA_011421415.1_CUB_Patr_1.0/GCA_011421415.1_CUB_Patr_1.0_genomic.gbff.gz
# Summazing genic content and chromosome position of RADseq loci
# Obtain chromosome labels for scaffolds from a gff file (when previously using P. major as reference)
# zgrep -E "RefSeq" GCF_001522545.3_Parus_major1.1_genomic.gff.gz | grep "region" | grep "chromosome" | sed 's/RefSeq.*Name=//g' | sed 's/;chromosome.*//g' > chromosome_scaffolds.txt
# Obtain chromosome labels for scaffolds from a gbff file (current code)
gunzip GCA_011421415.1_CUB_Patr_1.0_genomic.gbff.gz
grep VERSION GCA_011421415.1_CUB_Patr_1.0_genomic.gbff | awk '{ print $2 }' > scaffold_names.txt
grep DEFINITION GCA_011421415.1_CUB_Patr_1.0_genomic.gbff | sed 's/DEFINITION Poecile atricapillus chromosome //g' | sed 's/DEFINITION Poecile atricapillus scaffold[0-9]*-unlocalized-//g' | sed 's/DEFINITION Poecile atricapillus //g' | sed 's/, .*//g' | sed 's/ .*//g' > chromosomes.txt
paste scaffold_names.txt chromosomes.txt > chromosome_scaffolds.txt
# Example of what chromosome_scaffolds.txt looks like
CM022157.1 1
JAAMOC010000001.1 1
JAAMOC010000002.1 1
JAAMOC010000003.1 1
JAAMOC010000004.1 2
JAAMOC010000005.1 2
JAAMOC010000550.1 2
JAAMOC010000551.1 2
JAAMOC010000552.1 2
JAAMOC010000010.1 3
# Strip comment rows off the output vcf file and GFF file
grep -v "##" chickadee_ref.vcf > headerless.vcf
grep -F -e "##" chickadee_ref.vcf > header_rows.txt
# Using R code to summarize RADseq markers based on ref_guided.snps.map
# (in order to find which SNP positions correspond to each locus),
# headerless.vcf (to obtain chromosome, and position along chromosome),
# chromosome_scaffolds.txt (to find which scaffold corresponds to what chromosome)
# *.snps.map or *.snpsmap from ipyrad outfiles: in order to find which SNP
# positions correspond to each locus (in this case chickadee_ref.snpsmap )
module load R/3.6.2-gimkl-2020a
Rscript Fig_S7b_generate_bgc_inputs.R
# 3. Compiling the bgc software in bin directory
# Obtaining the bgc tar
wget https://sites.google.com/site/bgcsoftware/home/bgcdist1.03.tar.gz
tar -zvxf bgcdist1.03.tar.gz
cd
# Loading required modules
module load HDF5/1.10.5-gimkl-2018b
module load GSL/2.4-GCC-7.4.0
# Compiling based on instructions at https://popgencode.wordpress.com/2015/04/08/hello-world/
h5c++ -Wall -O2 -L/usr/include/gsl -I/usr/include/gsl -o bgc bgc_main.C bgc_func_readdata.C bgc_func_initialize.C bgc_func_mcmc.C bgc_func_write.C bgc_func_linkage.C bgc_func_ngs.C bgc_func_hdf5.C mvrandist.c -lgsl -lgslcblas
# Compiling estpost
h5c++ -Wall -O3 -o estpost estpost_h5.c -lgsl -lgslcblas
# 4a. Bgc input options explanation (guided by mix of bgc manual and Taylor et al. 2014)
# -a Infile with genetic data for parental population 0.
# -b Infile with genetic data for parental population 1.
# -h Infile with genetic data for admixed population(s).
# -M Infile with genetic map, only used for ICARrho model
# -O format to write MCMC samples (2=HDF5 and ascii text)
# -x Number of MCMC steps for the analysis
# -n Discard the first n MCMC samples as a burn-in
# -t Thin MCMC samples by recording every nth value
# -p Specifies which parameter samples to print: 1 = also print precision parameters
# -q Boolean, calculate and print cline parameter quantiles [default = 0]
# -i Boolean, calculate and print interspecific-heterozygosity [default = 0].
# -N Boolean, use genotype-uncertainty model
# -E Boolean, use sequence error model, only valid in conjunction with the genotype-
# uncertainty model [default = 0].
# -m Boolean, use ICARrho model for linked loci [default = 0].
# -d Boolean, all loci are diploid
# -s Boolean, sum-to-zero constraint on locus cline parameters
# -o Boolean, assume a constant population-level cline parameter variance for all loci
# -I Select algorithm to initialize MCMC [default = 1]. 0 = use information from the
# data to initialize ancestry and hybrid index [0 only works with read data not called genotypes]
# -D Maximum distance between loci, free recombination [default = 0.5]. [In contrast to manual
# set this to 1252.497 as this was the maximum size of scaffolds in kb mapping to LG in the
# black-capped chickadee genome: manual was in cM in comparison]
# -T If non-zero, use a truncated gamma prior for tau with this upper bound [default =
# 0]. Otherwise use a full gamma prior.
# -u MCMC tuning parameter, maximum deviate from uniform for proposed hybrid index
# hybrid index [default = 0.1].
# -g MCMC tuning parameter, standard deviation for Gaussian proposal of cline param-
# eter gamma [default = 0.05].
# -z MCMC tuning parameter, standard deviation for Gaussian proposal of cline parameter zeta [default = 0.05]
# -e MCMC tuning parameter, standard deviation for Gaussian proposal of cline parameters eta and kappa [default = 0.02]
# 4b. Sbatch script for running bgc
#!/bin/bash -e
#SBATCH -A uoo00105
#SBATCH -J bgc
#SBATCH --ntasks 1
#SBATCH -c 1
#SBATCH -t 72:00:00
#SBATCH --mem=3G
#SBATCH -D /nesi/nobackup/uoo00105/chickadees/bgc
#SBATCH --mail-type=ALL
#SBATCH --mail-user=alana.alexander@otago.ac.nz
#SBATCH -N 1
#SBATCH --hint=nomultithread
#SBATCH --partition=large
module load HDF5/1.10.5-gimkl-2018b
module load GSL/2.4-GCC-7.4.0
export PATH=/nesi/nobackup/uoo00105/chickadees/bin/bgcdist:$PATH
bgc -a black_capped.txt -b Carolina.txt -h admixed.txt -M genetic_map.txt -O 2 -x 50000 -n 25000 -t 5 -p 1 -q 1 -i 0 -N 1 -E 0.0001 -m 1 -d 1 -s 1 -o 0 -I 0 -D 1252.497 -T 0 -u 0.1 -g 0.08 -z 0.05 -e 0.2
# 5. After copying out outputfiles into run1_output, using same code as above,
# ran a second shorter chain (-x 25000 -n 12500) to check convergence and copied
# these files into run2_output
# Summary of output files
# LnL output.txt: Each line in this file contains the log likelihood for a single MCMC step.
# For this and all other files only post-burnin post-thinning MCMC samples are included.
# alpha output.txt: Each line begins with the locus number (in input order, but simply
# numbered 0 to N, where N is one less than the number of loci). This is followed by the
# cline parameter α for each population. Populations are separated by commas. Each line
# corresponds to a single MCMC step and additional lines give samples for subsequent MCMC
# steps.
# beta output.txt: This file contains MCMC samples for cline parameter β and follows the
# format described for alpha output.txt.
# hi output.txt: This file contains MCMC samples of hybrid index (h). Each line corresponds
# to a single MCMC step and contains MCMC samples of hybrid index for each individual.
# The parameter values for each individual are separated by commas and appear in the order
# that individuals were listed in the input file.
# gamma output.txt: This file contains the quantile of each γ cline parameter in the
# estimated genome-wide distribution. Each line corresponds to a single MCMC step and
# gives the quantiles for each locus in order. Values are comma separated.
# zeta output.txt: This file contains the quantile of each ζ cline parameter in the estimated
# genome-wide distribution and follows the format described for q gamma output.txt.
# 6. Summarizing bgc output
# Within the run1_output and run2_output folders:
# -i Infile, MCMC results from bgc in HDF5 format.
# -o Outfile [default = postout].
# -p Name of parameter to summarize, possibilities include: 'LnL', 'alpha', 'beta',
# 'eta', 'eta-quantile', 'gamma-quantile', 'gamma-quantile-local', 'hi', 'interspecific-
# het', 'kappa', 'kappa-quantile', 'rho', 'tau-alpha', 'tau-beta', 'zeta-quantile', and
# 'zeta-quantile-local'.
# -c Credible interval to calculate [default = 0.95].
# -b Number of additional (beyond the burn-in passed to bgc) MCMC samples to discard
# for burn-in [default = 0]. This burn-in is based on the number of thinned samples.
# -h Number of bins for posterior sample histogram [default = 20].
# -s Which summary to perform: 0 = posterior estimates and credible intervals, 1 =
# histogram of posterior samples, 2 = convert to plain text.
# -w Write parameter identification and headers to file, boolean [default = 1].
# Point estimates and CI (-s 0): This file contains an optional header row and parameter
# identification column (parameters are ordered and number as they were ordered in the input
# files but starting with 0). Each line gives the mean, median, and lower and upper bounds of
# the specified credible interval (this is an equal-tail probability interval).
# Posterior point estimates and 95% ETPIs for the α and β parameters and cline parameter quantiles
# With Bonferroni correction for 6,748 loci
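# The -c value passed to estpost below is the Bonferroni-corrected credible
# interval for 6,748 loci, i.e. 1 - 0.05/6748. A quick sanity check (assumes
# python3 is on the PATH, as for the R steps above):

```shell
python3 -c "print(1 - 0.05/6748)"  # ~0.99999259039..., matching the -c flag below
```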
#Alpha:
estpost -i mcmcout.hdf5 -o alphaest.txt -p alpha -s 0 -c 0.99999259039 -w 1
#Beta
estpost -i mcmcout.hdf5 -o betaest.txt -p beta -s 0 -c 0.99999259039 -w 1
#Gamma
estpost -i mcmcout.hdf5 -o gammaest.txt -p gamma-quantile -s 0 -c 0.99999259039 -w 1
#Zeta
estpost -i mcmcout.hdf5 -o zetaest.txt -p zeta-quantile -s 0 -c 0.99999259039 -w 1
#Hi
estpost -i mcmcout.hdf5 -o hi.txt -p hi -s 0 -c 0.99999259039 -w 1
# 7a. Checking for stationarity and convergence
Rscript Fig_S7c_stationarity_convergence_results.R
# Based on the stationarity/convergence results, it might be a good idea to remove additional burn-in
# In our case, 1500 states were removed for both run1 and run2, and these are the summarized files
# in the github repository
#Alpha:
estpost -i mcmcout.hdf5 -o alphaest.txt -p alpha -s 0 -c 0.99999259039 -w 1 -b 1500
#Beta
estpost -i mcmcout.hdf5 -o betaest.txt -p beta -s 0 -c 0.99999259039 -w 1 -b 1500
#Gamma
estpost -i mcmcout.hdf5 -o gammaest.txt -p gamma-quantile -s 0 -c 0.99999259039 -w 1 -b 1500
#Zeta
estpost -i mcmcout.hdf5 -o zetaest.txt -p zeta-quantile -s 0 -c 0.99999259039 -w 1 -b 1500
#Hi
estpost -i mcmcout.hdf5 -o hi.txt -p hi -s 0 -c 0.99999259039 -w 1 -b 1500
# 7b. Re-checking for stationarity and convergence and presenting results
Rscript Fig_S7c_stationarity_convergence_results.R
# 8. Loading necessary modules for pulling out genes of interest
module load seqtk/1.3-gimkl-2018b
# Copying across the annotations for the other black-capped chickadee genome (less contiguous than the one we
# used for reference mapping, but with annotations available)
wget https://ftp.ncbi.nlm.nih.gov/genomes/genbank/vertebrate_other/Poecile_atricapillus/latest_assembly_versions/GCA_013398625.1_ASM1339862v1/GCA_013398625.1_ASM1339862v1_cds_from_genomic.fna.gz
gunzip GCA_013398625.1_ASM1339862v1_cds_from_genomic.fna.gz
# downloading magic-blast (need splice-aware method, as the nucleotide CDS we've downloaded will exclude introns)
wget https://ftp.ncbi.nlm.nih.gov/blast/executables/magicblast/LATEST/ncbi-magicblast-1.5.0-x64-linux.tar.gz
tar -zvxf ncbi-magicblast-1.5.0-x64-linux.tar.gz
export PATH=/nesi/nobackup/uoo00105/chickadees/bin/ncbi-magicblast-1.5.0/bin:$PATH
number_of_matches=`wc -l Table_S6_outlying_marker_bed_format.bed | awk '{print $1}'`
# Outputting a list of genes found in the regions of interest (potential inversions)
for i in `seq 1 $number_of_matches`;
do bedline=`head -n $i Table_S6_outlying_marker_bed_format.bed | tail -n 1`;
echo $bedline > temp.bed;
seqname=`echo $bedline | awk '{print $1}'`;
seqtk subseq GCA_011421415.1_CUB_Patr_1.0_genomic.fna temp.bed > $seqname.$i.fa;
magicblast -query GCA_013398625.1_ASM1339862v1_cds_from_genomic.fna -subject $seqname.$i.fa -perc_identity 99 -outfmt tabular -no_unaligned > $seqname.$i.matches.txt;
grep -v "#" $seqname.$i.matches.txt | awk '{print $1}' > temp_sequence_name.txt;
seqtk subseq GCA_013398625.1_ASM1339862v1_cds_from_genomic.fna temp_sequence_name.txt | grep -F "[gene=" | sed 's/.*\[gene=//g' | sed 's/\].*//g' > $seqname.$i.gene_matches.txt;
rm temp.bed;
rm temp_sequence_name.txt;
done
# Combinations of these genes were then used in geneontology.org
# Chromosome Z
cat CM022174*gene_matches.txt | sort | uniq
# Region by region chromosome Z
cat CM022174.1.2.gene_matches.txt
cat CM022174.1.3.gene_matches.txt
cat CM022174.1.4.gene_matches.txt
cat CM022174.1.5.gene_matches.txt
cat CM022174.1.6.gene_matches.txt
cat CM022174.1.7.gene_matches.txt
cat CM022174.1.8.gene_matches.txt
cat CM022174.1.9.gene_matches.txt
cat CM022174.1.10.gene_matches.txt
cat CM022174.1.11.gene_matches.txt
#Chromosome 1A
cat JAAMOC010000547.1.1.gene_matches.txt
# All putative inversions
cat *gene_matches.txt
# Repeating a similar analysis on each of the SNPs. Creating a subfolder to hold all the data
mkdir positive_beta_SNPs
cd positive_beta_SNPs
# Place Table_S6_positive_beta_SNPs_bed_format.bed into this folder
number_of_matches=`wc -l Table_S6_positive_beta_SNPs_bed_format.bed | awk '{print $1}'`
# Outputting a list of genes found in the regions of interest
for i in `seq 1 $number_of_matches`;
do bedline=`head -n $i Table_S6_positive_beta_SNPs_bed_format.bed | tail -n 1`;
echo $bedline > temp.bed;
seqname=`echo $bedline | awk '{print $1}'`;
seqtk subseq ../GCA_011421415.1_CUB_Patr_1.0_genomic.fna temp.bed > $seqname.$i.fa;
magicblast -query ../GCA_013398625.1_ASM1339862v1_cds_from_genomic.fna -subject $seqname.$i.fa -perc_identity 99 -outfmt tabular -no_unaligned > $seqname.$i.matches.txt;
grep -v "#" $seqname.$i.matches.txt | awk '{print $1}' > temp_sequence_name.txt;
seqtk subseq ../GCA_013398625.1_ASM1339862v1_cds_from_genomic.fna temp_sequence_name.txt | grep -F "[gene=" | sed 's/.*\[gene=//g' | sed 's/\].*//g' > $seqname.$i.gene_matches.txt;
rm temp.bed;
rm temp_sequence_name.txt;
done
# Obtaining a list of uniq genes to run through Gene Ontology
cat *gene_matches.txt | sort | uniq
# Finding the genes that are within 5,000 bp of multiple outlying SNPs
cat *gene_matches.txt | sort | uniq -c | grep -v "1 "
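# Note: grep -v "1 " matches the substring "1 " anywhere in the line, so genes
# seen 11, 21, ... times (any count ending in 1) would be dropped too. A
# stricter variant tests the count field itself with awk; shown here on
# stand-in data (replace the printf with `cat *gene_matches.txt` to run it on
# the real per-SNP match files):

```shell
printf 'geneA\ngeneB\ngeneA\n' | sort | uniq -c | awk '$1 > 1'
```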
# Getting a summary of what genes were associated with what SNPs to add to Table S6
for i in `seq 1 $number_of_matches`;
do filename=`echo *.$i.gene_matches.txt`;
seqs=`cat $filename | tr '\n' '\t'`;
echo $filename $seqs >> ordered_gene_matches.txt;
done
|
SQL | UTF-8 | 1,107 | 3.359375 | 3 | [
"MIT"
] | permissive |
set line 200 trimspool on
set pagesize 60
col table_name format a15 head 'TABLE NAME'
col index_name format a20 head 'INDEX NAME'
col partition_name format a10 head 'PARTITION|NAME'
col subpartition_name format a15 head 'SUBPARTITION|NAME'
col num_rows format 99999999 head 'NUM|ROWS'
col stale_stats format a5 head 'STALE|STATS'
col global_stats format a6 head 'GLOBAL|STATS'
col last_analyzed format a21 head 'LAST ANALYZED'
alter session set nls_date_format = 'yyyy-mm-dd hh24:mi:ss';
select table_name
,partition_name
, subpartition_name
, global_stats
, last_analyzed
, num_rows
, stale_stats
from dba_tab_statistics
where owner = 'JKSTILL'
and table_name = 'HASH_SP_TEST'
and partition_name = 'NAME_P2'
order by table_name, partition_position, subpartition_position
/
select table_name
, index_name
, partition_name
, subpartition_name
, global_stats
, last_analyzed
, num_rows
, stale_stats
from dba_ind_statistics
where owner = 'JKSTILL'
and table_name = 'HASH_SP_TEST'
and partition_name = 'NAME_P2'
order by table_name, index_name, partition_position, subpartition_position
/
|
C# | UTF-8 | 856 | 3.25 | 3 | [] | no_license | using System;
using System.CodeDom;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace Class_Constructor
{
class Program
{
static void Main(string[] args)
{
using (var sw = new StreamWriter("codeblog.txt", false, Encoding.UTF8))
{
for (int i = 0; i < 10; i++)
{
sw.WriteLine($"Line {i}");
}
}
using (var sr = new StreamReader("codeblog.txt"))
{
int i = 0;
while(!sr.EndOfStream)
{
Console.WriteLine(sr.ReadLine() + " " + i);
i++;
}
}
Console.ReadKey();
}
}
}
|
TypeScript | UTF-8 | 1,661 | 3 | 3 | [
"MIT"
] | permissive | import { slugifyEntry } from "./utils";
describe("slugifyEntry", () => {
it("should slugify an entry", () => {
const entry = {
body: "",
date: new Date(2020, 0, 23),
slug: "",
title: "Foo Bar Baz"
};
const result = slugifyEntry(entry);
expect(result).toBe("2020-01-23-foo-bar-baz");
});
it("should remove quotes", () => {
const entry = {
body: "",
date: new Date(2020, 0, 23),
slug: "",
title: "'Foo' \"Bar\" Baz"
};
const result = slugifyEntry(entry);
expect(result).toBe("2020-01-23-foo-bar-baz");
});
});
// describe("delayedLoader", () => {
// it("should resolve if load fn is faster than delay", async () => {
// const promise = Promise.resolve("data!");
// const showLoaderFn = jest.fn();
// const result = await delayedLoader(promise, showLoaderFn, 10);
// expect(result).toBe("data!");
// expect(showLoaderFn).toHaveBeenCalledTimes(0);
// });
// it("should call showLoaderFn if load fn is slower than delay", async () => {
// const promise = new Promise(resolve => {
// setTimeout(resolve, 20, "data!");
// });
// const showLoaderFn = jest.fn();
// await delayedLoader(promise, showLoaderFn, 10);
// expect(showLoaderFn).toHaveBeenCalledTimes(1);
// });
// it("should reject if load fn rejects", async () => {
// expect.assertions(1);
// const promise = Promise.reject(new Error("oh no!"));
// const showLoaderFn = jest.fn();
// try {
// await delayedLoader(promise, showLoaderFn, 10);
// } catch (err) {
// expect(err.message).toMatch("oh no!");
// }
// });
// });
|
Markdown | UTF-8 | 2,888 | 3.25 | 3 | [] | no_license | ## User Anomalous Behavior Detection
This project takes a binary-classification approach: predicting which operations are normal user activity and which come from masquerading users or are anomalous. The data consists entirely of Linux bash commands.
The core code and validation are complete. Some thoughts on deployment: install real-time extraction of the bash operation log on an Ubuntu host, strip the arguments from each command, treat every 100 commands as one baseline unit, and force logout when an anomaly is detected. If you have a more robust, effective, and simpler deployment plan, feel free to submit an issue.
- ### About the Masquerading User Data dataset:
  Matthias Schonlau is a professor of statistics at the University of Waterloo in Ontario, Canada. By collecting bash operation logs from Linux servers and training on users' operating habits, he tried to determine whether an account on a host had been compromised or hijacked, and then to further identify the account's anomalous behavior. He published training data of Linux operations on his personal website. The training data contains the operation logs of 50 users, each log holding 15,000 commands; the first 5,000 are all normal operations, while the following 10,000 randomly include anomalous ones. For ease of analysis, every 100 commands in the dataset form one operation sequence, each of which is labelled: a sequence is considered anomalous as soon as it contains a single anomalous command.
- ### Feature extraction and data cleaning:
  As it happens, Dou Ge's book *Introduction to Machine Learning for Web Security* covers this dataset; his feature-extraction methods are borrowed here:
  1. KNN_50.py uses word-frequency statistics, distinguishing each user's 50 most-used commands from the 50 least-used ones, and trains with KNN
  2. NB_all.py uses a set-of-words model, taking all commands as features and vectorizing each operation sequence by whether each command occurs in it, and trains with Naive Bayes (NB)
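A minimal sketch of the set-of-words vectorization described above (the command names are made up for illustration; the real features come from the Schonlau logs, and the real scripts hand these vectors to scikit-learn classifiers):

```python
def vectorize_sequences(sequences, vocab=None):
    """Turn each sequence of shell commands into a binary hit-vector
    over the command vocabulary (set-of-words: presence, not counts)."""
    if vocab is None:
        vocab = sorted({cmd for seq in sequences for cmd in seq})
    index = {cmd: i for i, cmd in enumerate(vocab)}
    vectors = []
    for seq in sequences:
        hits = [0] * len(vocab)
        for cmd in seq:
            if cmd in index:  # commands outside the vocabulary are ignored
                hits[index[cmd]] = 1
        vectors.append(hits)
    return vocab, vectors

vocab, X = vectorize_sequences([["ls", "cd", "cat"], ["ls", "rm"]])
# vocab == ['cat', 'cd', 'ls', 'rm']
# X     == [[1, 1, 1, 0], [0, 0, 1, 1]]
```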
- ### Model training:
  Inspired by the accuracy Dou Ge achieved using SVM to detect XSS; SVM has a large advantage on vectorized binary-classification problems.
  1. NB_all.py uses Naive Bayes; the results are compared with and without cross-validation
  2. KNN_50.py tried SVM, NB, and KNN, finally choosing KNN together with ten-fold cross-validation
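The ten-fold scheme can be sketched in plain Python (the real scripts presumably use scikit-learn's cross-validation helpers; this just shows how the 150 labelled sequences per user — 15,000 commands in chunks of 100 — get partitioned):

```python
def k_fold_indices(n_samples, k=10):
    """Yield (train, test) index lists for k roughly equal folds."""
    sizes = [n_samples // k + (1 if i < n_samples % k else 0) for i in range(k)]
    start = 0
    for size in sizes:
        test = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n_samples))
        yield train, test
        start += size

# 150 operation sequences per user -> 10 folds of 15 test sequences each
folds = list(k_fold_indices(150, k=10))
assert len(folds) == 10
assert all(len(test) == 15 for _, test in folds)
```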
- ### Prediction accuracy:
  Note that accuracy varies with the test set. The default test set is user3; other users may fluctuate slightly (recommended for KNN: 24, 42, 26, 36, 9; recommended for NB: 24, 42, 26, 25, 15)
  1. NB_all.py User9 NB (98%), ten-fold CV NB (100%); User24 NB (58%), ten-fold CV (100%); User42 NB (60%), ten-fold CV (94%)
  2. KNN_50.py User9 KNN (98%), ten-fold CV KNN (100%); User24 KNN (58%), ten-fold CV (90%); User42 KNN (60%), ten-fold CV (64%)
  Typical case: 9; showcases of the improvement from ten-fold CV: 24, 26; showcases of the influence of feature extraction: 42, 15
- ### Shortcomings
  Some of these commands I had never even seen; if I can get hold of more suitable data, I will try again
- ### Some reflections
  In machine learning projects, the most troublesome and tedious parts are still data cleaning and feature extraction. Experiments show that **differences in data-cleaning and feature-extraction methods affect the final accuracy far more than differences between machine learning algorithms**! |
Markdown | UTF-8 | 7,717 | 3.953125 | 4 | [] | no_license | ### Problem:
Your task is to make two functions, `max` and `min` (`maximum` and `minimum` in PHP and Python) that take a(n) array/vector of integers `list` as input and output, respectively, the largest and lowest number in that array/vector.
# Examples
```cpp
max({4,6,2,1,9,63,-134,566}) returns 566
min({-52, 56, 30, 29, -54, 0, -110}) returns -110
max({5}) returns 5
min({42, 54, 65, 87, 0}) returns 0
```
<p>Notes</p>
<ul>
<li>You may consider that there will not be any empty arrays/vectors.</li>
</ul>
### Solution |
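The hidden examples above show the expected behaviour. One straightforward way to implement the two functions is a single linear scan, sketched here in Python for brevity (the names `max_value`/`min_value` are illustrative; the kata itself calls them `max`/`min`):

```python
def max_value(arr):
    # assumes a non-empty list, per the notes
    best = arr[0]
    for x in arr[1:]:
        if x > best:
            best = x
    return best

def min_value(arr):
    # same scan, with the comparison flipped
    best = arr[0]
    for x in arr[1:]:
        if x < best:
            best = x
    return best

print(max_value([4, 6, 2, 1, 9, 63, -134, 566]))   # 566
print(min_value([-52, 56, 30, 29, -54, 0, -110]))  # -110
```

In the listed languages the same shape applies: track one running best value and compare each element against it.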
JavaScript | UTF-8 | 6,505 | 3.03125 | 3 | [] | no_license | let button_home = document.getElementById('home-btn');
let button_about_us = document.getElementById('aboutus-btn');
let button_jobs = document.getElementById('jobs-btn');
//let button_clients = document.getElementById('clients-btn');
/**
 * State variable:
 * 0 = home on screen
 * 1 = about us on screen
 * 2 = jobs on screen
 * 3 = clients on screen
 */
let state = 0;
$('.about-us').hide();
$('.jobs').hide();
button_about_us.addEventListener('click', function () {
if (state == 0) {
        // home on screen
let main = $('.main');
let contactform = $('.contact-form');
let aboutus = $('.about-us');
aboutus.show();
main.animate({marginLeft: '-100%'}, 500);
contactform.animate({marginLeft: '-100%'}, 300);
setTimeout(function () {
aboutus.animate({marginLeft: '0', opacity: '1'}, 500);
}, 200);
setTimeout(function(){
contactform.hide();
},500);
    }else if (state == 1 ) {
        // about us on screen
    }else if (state == 2) {
        // our work on screen
let jobs = $('.jobs');
let aboutus = $('.about-us');
aboutus.show();
jobs.animate({marginLeft: '100%'}, 500);
setTimeout(function () {
aboutus.animate({marginLeft: '0', opacity: '1'}, 500);
}, 200);
setTimeout(function(){
jobs.hide();
},500);
}else if (state == 3) {
        // clients on screen
let main = $('.main');
let contactform = $('.contact-form');
let aboutus = $('.about-us');
aboutus.show();
main.animate({marginLeft: '-100%'}, 500);
contactform.animate({marginLeft: '-100%'}, 300);
setTimeout(function () {
aboutus.animate({marginLeft: '0', opacity: '1'}, 500);
}, 200);
}
state = 1;
});
button_home.addEventListener('click', function () {
if(state == 0){
}else if(state == 1){
let main = $('.main');
let contactform = $('.contact-form');
let aboutus = $('.about-us');
contactform.show();
aboutus.animate({marginLeft: '100%', opacity: '0'}, 300);
setTimeout(function () {
main.animate({marginLeft: '0'}, 300);
if(window.innerWidth > 750){
contactform.animate({marginLeft: '64%'}, 500);
}else{
contactform.animate({marginLeft: '0'}, 500);
}
}, 100);
setTimeout(function(){
aboutus.hide();
},500);
}else if (state == 2) {
let main = $('.main');
let contactform = $('.contact-form');
let jobs = $('.jobs');
contactform.show();
jobs.animate({marginLeft: '100%', opacity: '0'}, 300);
setTimeout(function () {
main.animate({marginLeft: '0'}, 300);
if(window.innerWidth > 800){
contactform.animate({marginLeft: '64%'}, 500);
}else{
contactform.animate({marginLeft: '0'}, 500);
}
}, 100);
setTimeout(function(){
jobs.hide();
},500);
    }else if (state == 3) {
let main = $('.main');
let contactform = $('.contact-form');
let clients = $('.clients');
clients.animate({marginLeft: '100%', opacity: '0'}, 300);
setTimeout(function () {
main.animate({marginLeft: '0'}, 300);
contactform.animate({marginLeft: '64%'}, 500);
}, 100);
setTimeout(function(){
clients.hide();
},500);
}
state = 0;
});
button_jobs.addEventListener('click', function () {
if (state == 0) {
        // home on screen
let main = $('.main');
let contactform = $('.contact-form');
let jobs = $('.jobs');
jobs.show();
main.animate({marginLeft: '-100%'}, 500);
contactform.animate({marginLeft: '-100%'}, 300);
setTimeout(function () {
jobs.animate({marginLeft: '0', opacity: '1'}, 500);
}, 200);
setTimeout(function(){
contactform.hide();
},500);
}else if (state == 1 ) {
        // about us on screen
let aboutus = $('.about-us');
let jobs = $('.jobs');
jobs.show();
aboutus.animate({marginLeft: '-100%'}, 500);
setTimeout(function () {
jobs.animate({marginLeft: '0', opacity: '1'}, 500);
}, 200);
setTimeout(function(){
aboutus.hide();
},500);
}else if (state == 2) {
        // our work on screen
    }else if (state == 3) {
        // clients on screen: slide in the jobs panel (the original copied the
        // about-us panel here by mistake)
        let main = $('.main');
        let contactform = $('.contact-form');
        let jobs = $('.jobs');
        jobs.show();
        main.animate({marginLeft: '-100%'}, 500);
        contactform.animate({marginLeft: '-100%'}, 300);
        setTimeout(function () {
            jobs.animate({marginLeft: '0', opacity: '1'}, 500);
        }, 200);
    }
state = 2;
});
/**
button_clients.addEventListener('click', function () {
});
*/
let form = document.getElementById('form');
let bgimage = $('.background-image');
let something = $('.something');
form.addEventListener('mouseover', function () {
bgimage.css({filter: 'blur(3px)'});
something.css({filter: 'blur(3px)'});
});
form.addEventListener('mouseleave', function () {
bgimage.css({filter: 'none'});
something.css({filter: 'none'});
});
//FORM VALIDATION
//button to send mail
let send = document.getElementById('send-btn');
let mensajeError = "";
let mensaje_error_div = document.getElementById('error-message');
send.addEventListener('click', function () {
console.log('click ok');
//variables
let asunto = document.getElementById('asunto').value;
let mail = document.getElementById('mail').value;
    let mensaje = document.getElementById('mensaje').value;
if(asunto == ""){
mensajeError = "Por favor completa todos los campos";
mensaje_error_div.innerText = mensajeError;
$('#asunto').css({border: '1px solid #f80063'});
}else if (mail == "") {
mensajeError = "Por favor completa todos los campos";
mensaje_error_div.innerText = mensajeError;
$('#mail').css({border: '1px solid #f80063'});
}else if (mensaje == "") {
mensajeError = "Por favor completa todos los campos";
mensaje_error_div.innerText = mensajeError;
$('#mensaje').css({border: '1px solid #f80063'});
}else{
$('.btn-none').click();
}
});
let btn_nav_menu = document.getElementById('btn-nav-menu');
btn_nav_menu.addEventListener('click', function () {
$('.nav-menu').slideToggle(200);
$('.nav-menu').css({display: 'flex', justifyContent: 'center'});
});
|
TypeScript | UTF-8 | 415 | 2.84375 | 3 | [] | no_license | export class User {
id: number;
fullname: string;
username: string;
photoUrl: string;
email: string;
type: string;
selected: boolean;
constructor(user: any = {}) {
this.id = user.id;
this.fullname = user.fullname;
this.username = user.username;
this.photoUrl = user.photoUrl;
this.email = user.email;
this.type = user.type;
this.selected = user.selected || false;
}
}
|
C++ | UTF-8 | 1,706 | 2.578125 | 3 | [] | no_license | #include "MergeBloomShaderProgram.h"
MergeBloomShaderProgram::MergeBloomShaderProgram()
{
createShaderData();
}
MergeBloomShaderProgram::~MergeBloomShaderProgram()
{
}
void MergeBloomShaderProgram::createShaderData()
{
ShaderProgram temp = ShaderProgram();
gShaderProgram = temp.CreateShaderData("MergeBloomVS.glsl", "", "", "", "MergeBloomFS.glsl", "");
//create vertices
QuadVertex vertices[] = {
// pos and normal and uv for each vertex
{ 2 * 0.5f, 2 * -0.5f, 1.0f, 0.0f },
{ 2 * 0.5f, 2 * 0.5f, 1.0f, 1.0f },
{ 2 * -0.5f, 2 * -0.5f, 0.0f, 0.0f },
{ 2 * -0.5f, 2 * 0.5f, 0.0f, 1.0f },
};
unsigned int indices[] = {
0,1,2,
2,1,3,
};
glGenVertexArrays(1, &vao);
glBindVertexArray(vao);
glEnableVertexAttribArray(0);
glEnableVertexAttribArray(1);
glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
unsigned int vertexPos = glGetAttribLocation(gShaderProgram, "vertex_position");
if (vertexPos == -1) {
		OutputDebugStringA("Error, cannot find vertex_position attribute in vertex shader\n");
return;
}
glVertexAttribPointer(
vertexPos,
2,
GL_FLOAT,
GL_FALSE,
sizeof(QuadVertex),
BUFFER_OFFSET(0)
);
unsigned int uvPos = glGetAttribLocation(gShaderProgram, "uvCoord");
if (uvPos == -1) {
		OutputDebugStringA("Error, cannot find uvCoord attribute in vertex shader\n");
return;
}
glVertexAttribPointer(
uvPos,
2,
GL_FLOAT,
GL_FALSE,
sizeof(QuadVertex),
BUFFER_OFFSET(sizeof(float) * 2)
);
//ebo
glGenBuffers(1, &ebo);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ebo);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices), indices, GL_STATIC_DRAW);
}
|
SQL | UTF-8 | 7,034 | 4.15625 | 4 | [] | no_license | CREATE DATABASE GeekTextDB;
USE GeekTextDB;
/** Users - Lists users created and
their account info **/
CREATE TABLE `Users` (
`UserID` int not null auto_increment,
`Username` varchar(25) not null,
`FirstName` varchar(25) not null,
`MiddleName` varchar(25) null,
`LastName` varchar(25) not null,
`Email` varchar(50) not null,
`Password` varchar(20) not null,
`Nickname` varchar(25) not null,
`PrivacySettings` boolean default False not null,
PRIMARY KEY (`UserID`)
);
/** Home Address - Stores each user Home Address**/
CREATE TABLE `HomeAddress` (
`UserID` int not null,
`Street` varchar(50) not null,
`Apt` varchar(5) null,
`City` varchar(25) not null,
`State` varchar(25) not null,
`ZipCode` varchar(10) not null,
`Country` varchar(25) not null,
PRIMARY KEY (`UserID`),
FOREIGN KEY (`UserID`) references GeekTextDB.Users(`UserID`)
);
/** Shipping Address - Stores various Shipping
Addresses per user **/
CREATE TABLE `ShippingAddress` (
`ShippingID` int not null auto_increment,
`UserID` int not null,
`Street` varchar(50) not null,
`Apt` varchar(5) null,
`City` varchar(25) not null,
`State` varchar(25) not null,
`ZipCode` varchar(10) not null,
`Country` varchar(25) not null,
PRIMARY KEY (`ShippingID`),
FOREIGN KEY (`UserID`) REFERENCES GeekTextDB.Users(`UserID`)
);
/** CreditCard - Stores various credit cards
per user as payment method **/
CREATE TABLE `CreditCards` (
`CreditCardID` int not null auto_increment,
`CreditCardNumber` varchar(19) not null,
`UserID` int not null,
`CardholderName` varchar(50) not null,
`ExpirationDate` date not null,
`CVV` varchar(4) not null,
PRIMARY KEY (`CreditCardID`),
FOREIGN KEY (`UserID`) REFERENCES GeekTextDB.Users(`UserID`)
);
/** Publishers - Lists possible book publishers **/
CREATE TABLE `Publishers` (
`PublisherID` int not null auto_increment,
`PublisherName` varchar(50) not null,
PRIMARY KEY (`PublisherID`)
);
/** Genres - Lists possible book genres **/
CREATE TABLE `Genres` (
`GenreID` int not null auto_increment,
`GenreName` varchar(50) not null,
PRIMARY KEY (`GenreID`)
);
/** Authors - Lists authors info **/
CREATE TABLE `Authors` (
`AuthorID` int not null auto_increment,
`Name` varchar(50) not null,
`Biography` varchar(500) null,
PRIMARY KEY (`AuthorID`)
);
/** Books - Lists all book details and records
the total sales **/
CREATE TABLE `Books` (
`BookID` int not null auto_increment,
`ISBN` varchar(13) not null,
`Title` varchar(100) not null,
`Cover` varchar(255) not null,
`GenreID` int not null,
`AuthorID` int not null,
`Price` decimal(6,2) not null,
`DatePublished` varchar(10) not null,
`PublisherID` int not null,
`Summary` varchar(500) not null,
`BookSells` int default 0 not null,
PRIMARY KEY (`BookID`),
FOREIGN KEY (`GenreID`) REFERENCES GeekTextDB.Genres(`GenreID`),
FOREIGN KEY (`AuthorID`) REFERENCES GeekTextDB.Authors(`AuthorID`),
FOREIGN KEY (`PublisherID`) REFERENCES GeekTextDB.Publishers(`PublisherID`)
);
/** WishList - Holds up to 3 Wishlists per user**/
CREATE TABLE `WishList` (
`WishListID` int not null auto_increment,
`UserID` int not null,
`WishListIndex` int not null,
`WishListName` varchar(25) null,
PRIMARY KEY (`WishListID`),
FOREIGN KEY (`UserID`) REFERENCES GeekTextDB.Users(`UserID`)
);
/** WishListItems - Holds which items and the quantity
added to the user Wishlist **/
CREATE TABLE `WishListItems` (
`WishListItemID` int not null auto_increment,
`WishListID` int not null,
`UserID` int not null,
`BookID` int not null,
PRIMARY KEY (`WishListItemID`),
FOREIGN KEY (`WishListID`) REFERENCES GeekTextDB.WishList(`WishListID`),
FOREIGN KEY (`UserID`) REFERENCES GeekTextDB.Users(`UserID`),
FOREIGN KEY (`BookID`) REFERENCES GeekTextDB.Books(`BookID`)
);
/** ShoppingCart - Each user has their unique
shopping cart with the current balance **/
CREATE TABLE `ShoppingCart` (
`ShoppingCartID` int not null auto_increment,
`UserID` int not null,
`SubTotal` decimal(12,2) not null,
PRIMARY KEY (`ShoppingCartID`),
FOREIGN KEY (`UserID`) REFERENCES GeekTextDB.Users(`UserID`)
);
/** ShoppingCartItems - Holds all the items on a Shopping Cart **/
CREATE TABLE `ShoppingCartItems` (
`ShoppingCartItemID` int not null auto_increment,
`ShoppingCartID` int not null,
`BookID` int not null,
`Quantity` decimal(3,0) not null default 1,
`SaveForLater` boolean default false not null,
PRIMARY KEY (`ShoppingCartItemID`),
FOREIGN KEY (`BookID`) references GeekTextDB.Books(`BookID`),
FOREIGN KEY (`ShoppingCartID`) references GeekTextDB.ShoppingCart(`ShoppingCartID`)
);
/** Order - Lists all the details of an Order **/
CREATE TABLE `Orders` (
`OrderID` int not null auto_increment,
`UserID` int not null,
`ShippingID` int not null,
`TotalPrice` decimal(12,2) not null,
`DateOrdered` timestamp not null,
`CreditCardID` int not null,
PRIMARY KEY (`OrderID`),
FOREIGN KEY (`CreditCardID`) REFERENCES GeekTextDB.CreditCards(`CreditCardID`),
FOREIGN KEY (`ShippingID`) REFERENCES GeekTextDB.ShippingAddress(`ShippingID`),
FOREIGN KEY (`UserID`) REFERENCES GeekTextDB.Users(`UserID`)
);
/** Reviews - Holds all the items on an Order **/
CREATE TABLE `OrderItems` (
`OrderItemID` int not null auto_increment,
`OrderID` int not null,
`BookID` int not null,
`Quantity` decimal(5,0) not null default 1,
PRIMARY KEY (`OrderItemID`),
FOREIGN KEY (`OrderID`) references GeekTextDB.Orders(`OrderID`),
FOREIGN KEY (`BookID`) references GeekTextDB.Books(`BookID`)
);
/** Reviews - Holds the reviews and ratings each user has
made on a book after an order has taken place**/
CREATE TABLE `Reviews` (
`ReviewID` int not null auto_increment,
`UserID` int not null,
`OrderID` int not null,
`BookID` int not null,
`Rating` int not null,
`Comments` varchar(255) null,
`DatePosted` timestamp not null,
PRIMARY KEY(`ReviewID`),
FOREIGN KEY (`UserID`) REFERENCES GeekTextDB.Users(`UserID`),
FOREIGN KEY (`OrderID`) REFERENCES GeekTextDB.Orders(`OrderID`),
FOREIGN KEY (`BookID`) REFERENCES GeekTextDB.Books(`BookID`)
);
/** WrittenBy View - Display the association of books with their respective authors**/
CREATE VIEW `WrittenBy` AS
SELECT `Name`, `Title`
FROM `Books`, `Authors`
WHERE `Books`.`AuthorID` = `Authors`.`AuthorID`;
/** TopSellers View - Display a list of books that meet certain criteria on book sales and rating **/
CREATE VIEW `TopSellers` AS
SELECT `Title`, `Cover`, `ISBN`, `Rating`
FROM `Books`
INNER JOIN `Reviews` ON `Books`.`BookID` = `Reviews`.`BookID`
WHERE `Rating` > 3.5 AND `BookSells` > 90;
/** BrowseAndSortBy View - Display a list of books with their respective information **/
CREATE VIEW `BrowseAndSortBy` AS
SELECT `Title`, `Cover`, `Name`,`GenreName`, `DatePublished`, `Price`, `Rating`
FROM `Books`, `Genres`, `Authors`, `Reviews`
WHERE `Books`.`GenreID` = `Genres`.`GenreID`
AND `Books`.`AuthorID` = `Authors`.`AuthorID`
AND `Books`.`BookID` = `Reviews`.`BookID`;
|
C++ | UTF-8 | 992 | 2.59375 | 3 | [] | no_license | #pragma once
#include <string>
// TODO: Create generic interface for file loader
// add support for different file formats
#include <dr_wav.h>
//
class IAudioFileLoader
{
public:
IAudioFileLoader(const std::string path);
~IAudioFileLoader();
inline const std::string& GetPath() const { return m_FilePath; }
inline int GetChannels() const { return m_Channels; }
inline int GetSampleRate() const { return m_SampleRate; }
inline size_t GetTotalFrames() const { return m_TotalFrames; }
inline size_t GetTotalSize() const { return m_TotalFrames * GetFrameSize(); }
inline size_t GetFrameSize() const { return m_Channels * sizeof(drwav_int16); }
inline operator bool() { return m_Loaded; }
// Returns the total frames read
//
size_t ReadFrames(size_t framesToRead, unsigned char* data);
void SeekToFrame(size_t frame);
private:
std::string m_FilePath;
drwav m_FileHandle;
int m_Channels = 0;
int m_SampleRate = 0;
size_t m_TotalFrames = 0;
bool m_Loaded = false;
}; |
C++ | UTF-8 | 685 | 3.609375 | 4 | [] | no_license | /******************************************************************************
Cyclic sort
Time complexity: O(N)
Can be applied iff elements in array range from 1 to N
*******************************************************************************/
#include <bits/stdc++.h>
using namespace std;
void cyclicSort(int arr[],int n){
    // sort the array in O(n): keep swapping arr[i] toward its home index
    // arr[i]-1, and only advance i once arr[i] == i+1 (a single for-loop
    // with one swap per index is not enough, e.g. {2,3,4,5,1})
    int i=0;
    while(i<n){
        if((arr[i]-1)!=i){
            int num = arr[i];
            swap(arr[i],arr[num-1]);
        }else{
            i++;
        }
    }
for(int i=0;i<n;i++)
cout<<arr[i]<<" ";
}
int main()
{
int arr[] = {2,3,1,5,4};
int n=5;
cyclicSort(arr,n);
return 0;
}
|
Markdown | UTF-8 | 2,106 | 2.5625 | 3 | [
"LicenseRef-scancode-warranty-disclaimer",
"Apache-2.0"
] | permissive | ## Heading 1 {FirstHeaderH1Rule} {FirstLineH1Rule}
#### Heading 2 {HeaderIncrementRule}
# Heading 3 {ConsistentHeaderStyleRule} {MD043} #
* list {BlanksAroundListsRule}
+ list {ConsistentUlStyleRule} {ListIndentRule} {UlStartLeftRule} {UlIndentRule} {ListMarkerSpaceRule} {BlanksAroundListsRule} {LineLengthRule}
* list
* list {UlIndentRule}
* list #{ListIndentRule}
{NoTrailingSpacesRule} {NoHardTabsRule}
(name)[link] {NoReversedLinksRule} {NoEmptyLinksRule}
{NoMultipleBlanksRule:18}
long line long line long line long line long line long line long line long line long line {LineLengthRule}
$ dollar {CommandsShowOutputRule}
#Heading 4 {NoMissingSpaceAtxRule}
# Heading 5 {NoMultipleSpaceAtxRule}
#Heading 6 {ConsistentHeaderStyleRule} {NoMissingSpaceClosedAtxRule} {BlanksAroundHeadersRule} {LineLengthRule} #
# Heading 7 {NoMultipleSpaceClosedAtxRule} {BlanksAroundHeadersRule} {HeaderStartLeftRule} {ConsistentHeaderStyleRule} {LineLengthRule} #
# Heading 8
# Heading 8
{NoDuplicateHeaderRule:34}
Note: Can not break SingleH1Rule and FirstHeaderH1Rule in the same file
# Heading 9 {NoTrailingPunctuationRule}.
> {NoMultipleSpaceBlockquoteRule}
> {NoBlanksBlockquoteRule:43}
1. list
3. list {OlPrefixRule}
```js
```
* list {BlanksAroundListsRule}
{BlanksAroundFencesRule:50}
<br/> {NoInlineHtmlRule}
http://example.com/page {NoBareUrlsRule}
---
***
{HrStyleRule:61}
_Section {NoEmphasisAsHeaderRule} Heading_
Emphasis *with * space {NoSpaceInEmphasisRule}
Code `with ` space {NoSpaceInCodeRule}
[link with space ](link) {NoSpaceInLinksRule} {ValidRelativeLinksRule}
[link without scheme](www.example.com) {MissingLinkSchemeRule}
```
code fence without language {FencedCodeLanguageRule:75}
```
[empty link]() {NoEmptyLinksRule}
markdownLint {ProperNamesRule}
 {MD045} {ValidRelativeImagesRule}
- [ ] First task item
- [x] Second item {ListMarkerSpaceRule}
- [X] Third inconsistent item {ConsistentTaskListMarkerStyleRule}
- [x] An item
- [x] Another item {TaskListMarkerSpaceRule}
- [x]Not enough space {TaskListMarkerSpaceRule}
|
Markdown | UTF-8 | 815 | 2.640625 | 3 | [
"Apache-2.0"
] | permissive | # GitBook常用操作
- 初始化书籍目录
```
gitbook init
```
会生成README.md和SUMMARY.md文件。SUMMARY.md 是书籍的目录结构,gitbook会根据SUMMARY.md中的配置生成对应目录结构。
- 编译书籍
```
gitbook serve
```
The ``gitbook serve`` command actually calls ``gitbook build`` first to compile the book, and once that finishes it starts a web server listening on local port 4000. After compiling, you can preview the result at ``http://127.0.0.1:4000/``.
- The plugin works in the local preview after installation, but has no effect at all after pushing to GitHub?
This is because plugins only take effect after the book is compiled again; you need to
References:
- [GitBook 简明教程](http://www.chengweiyang.cn/)
- [GitBook搭建](https://liangjunrong.github.io/other-library/Markdown-Websites/GitBook/GitBook-study.html)
|
Python | UTF-8 | 6,111 | 2.734375 | 3 | [] | no_license | import xmltodict
import json
import re
from device import Device
def checkroutes(sw):
#load command variable with xml from this command
command = sw.show('show ip route')
result = xmltodict.parse(command[1])
#define variables
i = 0
routeloop = False
t1 = 0
t2 = 0
#run through each route looking for route uptime
while i < len(result['ins_api']['outputs']['output']['body']['TABLE_vrf']['ROW_vrf']['TABLE_addrf']['ROW_addrf']['TABLE_prefix']['ROW_prefix']):
#assign uptime value to variable
        uptime = result['ins_api']['outputs']['output']['body']['TABLE_vrf']['ROW_vrf']['TABLE_addrf']['ROW_addrf']['TABLE_prefix']['ROW_prefix'][i]['TABLE_path']['ROW_path'][0]['uptime']
i += 1
#using regex assign number of days route has been up to a variable
        day = re.findall(r'P(.*)D', uptime)
for days in day:
t1=days
#using regesx assign number of hours route has been up to a variable
        hour = re.findall(r'T(.*)H', uptime)
for hours in hour:
t2=hours
if int(t1) < 1 and int(t2) < 1:
routeloop = True
#check if routes have been up less than an hour or not
if routeloop:
print "You have routes that are less than 1 hour old. Possible routes flapping(loop)"
else:
print "Routes look good."
return routeloop
def test_ospf(sw):
"""Test_ospf uses various show commands to determine if OSPF is running on the switch.
"""
    cmd = sw.show('show ip ospf')
resp = xmltodict.parse(cmd[1])['ins_api']['outputs']['output']
try:
if resp["code"] == "400":
#most likely feature ospf is not in the configuration.
return False
elif resp["code"] == "501" and resp["clierror"] == "Note: process currently not running\n":
#feature ospf is enabled but not configured.
return False
elif resp["code"] == "200":
#ospf appears to be configured
contexts = resp["body"]["TABLE_ctx"]["ROW_ctx"]
if len(contexts) > 0:
return True
except Exception as oops:
print type(oops)
print oops.args
print oops
return False
def test_eigrp(sw):
"""Test_eigrp uses various show commands to determine if OSPF is running on the switch.
"""
    cmd = sw.show('show ip eigrp')
resp = xmltodict.parse(cmd[1])['ins_api']['outputs']['output']
try:
if resp["code"] == "400":
#most likely feature eigrp is not in the configuration.
return False
elif resp["code"] == "501" and resp["clierror"] == "Note: process currently not running\n":
#feature eigrp is enabled but not configured.
return False
elif resp["code"] == "200":
#eigrp appears to be configured
contexts = resp["body"]["TABLE_asn"]["ROW_asn"]
if len(contexts) > 0:
return True
except Exception as oops:
print type(oops)
print oops.args
print oops
return False
def test_bgp(sw):
"""Test_bgp uses various show commands to determine if BGP is running on the switch.
"""
    cmd = sw.show('show ip bgp')
resp = xmltodict.parse(cmd[1])['ins_api']['outputs']['output']
try:
if resp["code"] == "400":
#most likely feature bgp is not in the configuration.
return False
elif resp["code"] == "501" and resp["clierror"] == "Note: process currently not running\n":
#feature bgp is enabled but not configured.
return False
elif resp["code"] == "501" and resp["msg"] == "Structured output unsupported":
#bgp appears to be configured
return True
except Exception as oops:
print type(oops)
print oops.args
print oops
return False
def get_ip_protocols(sw):
protocols = []
ospf = test_ospf(sw)
if ospf:
protocols.append("ospf")
eigrp = test_eigrp(sw)
if eigrp:
protocols.append("eigrp")
bgp = test_bgp(sw)
if bgp:
protocols.append("bgp")
return protocols
def stp_detail(switch):
debug = False
ifloop = False
stp_split = []
getdata = switch.conf('show spanning-tree detail')
show_stp = xmltodict.parse(getdata[1])
stp = show_stp ['ins_api']['outputs']['output']['body']
    tsn_change = re.findall(r'(?<=occurred\s).*(?=:)', stp)
for time in tsn_change:
stp_time = time
stp_split = stp_time.split(':')
if debug:
print stp_split
if debug:
print "Last topology change happened " + stp_time + " hours ago"
    tsn_number = re.findall(r'(?<=changes\s).*(?=last)', stp)
for number in tsn_number:
stp_number = number
if debug:
print "Number of topology changes = " + stp_number
if int(stp_split[0]) == 0 and int(stp_split[1]) <= 5:
ifloop = True
if ifloop:
print "Last topology change happened " + stp_time + " hours ago"
print "Number of topology changes = " + stp_number
else:
print "No STP topology changes."
def main():
switch = Device(ip='172.23.193.210', username='admin', password='P@ssw0rd')
switch.open()
checkroutes(switch)
get_ip_protocols(switch)
stp_detail(switch)
if __name__ == "__main__":
main()
|
Python | UTF-8 | 3,939 | 2.828125 | 3 | [] | no_license | from PIL import Image
import numpy as np
import pandas as pd
import tensorflow as tf
import sys
import cv2
from sklearn.linear_model import Perceptron
import matplotlib.pyplot as plt
import math
from sklearn import decomposition
from sklearn import datasets
from mpl_toolkits.mplot3d import Axes3D
from sklearn.neural_network import MLPClassifier
from sklearn import metrics
#(0=Angry, 1=Disgust, 2=Fear, 3=Happy, 4=Sad, 5=Surprise, 6=Neutral).
#https://www.kaggle.com/c/challenges-in-representation-learning-facial-expression-recognition-challenge/data
#size of images : 48 x 48
data = pd.read_csv('fer2013.csv')
training_df = data.groupby(['Usage']).get_group('Training')
test_df = data.groupby(['Usage']).get_group('PublicTest')
pixel_list_training = training_df['pixels'].tolist()
pixel_list_test = test_df['pixels'].tolist()
def pixelToList(pixel_list):
L = []
for row in pixel_list:
row = row.split(' ')
row = list(map(int,row))
L.append(row)
return L
def DFColum():
col_list = []
for x in range(2304):
col_list.append('pixel'+str(x))
return col_list
pixel_cols = DFColum();
pixel_list_training = pixelToList(pixel_list_training)
pixel_list_test = pixelToList(pixel_list_test)
train_data = pd.DataFrame(pixel_list_training, columns = pixel_cols)
#pixel_data.insert(loc = 0 , column = 'emotion', value = training_df['emotion'])
#pixel_data['emotion'] = training_df['emotion']
test_data = pd.DataFrame(pixel_list_test, columns = pixel_cols)
#pixel_data_test['emotion'] = test_df['emotion']
#pixel_data_test.insert(loc = 0 , column = 'emotion', value = test_df['emotion'])
"""
plt.hist(pixel_data['emotion'])
plt.title("frequency histohram of emotions in training data")
plt.xlabel("Emotion Value")
plt.ylabel("Frequency")
plt.show()
#Note to self: From this plot it appears that very few disgust images are available
"""
#print(pixel_data.shape) #size of training data is 28709 x 2305 - 2304 pixels for the image and 1 for the emotion label
#print(pixel_data.head())
#print(pixel_data_test.head())
def plotTrainingImages(train_data):
f,ax = plt.subplots(5,5)
for i in range(0,25):
data = train_data.iloc[i, 0:2304].values
nrows, ncols = 48, 48
grid = data.reshape((nrows,ncols))
n = math.ceil(i/5)-1
m=[0,1,2,3,4]*5
ax[m[i-1],n].imshow(grid)
plt.show()
##Normalizing the training data set ##
y_train = training_df['emotion']
train_data = train_data/255
test_data = test_data/255
#train_data['emotion'] = y_train
##PCA decomposition
pca = decomposition.PCA(n_components = 200) #find the first 200 PCs
pca.fit(train_data)
plt.plot(pca.explained_variance_ratio_)
plt.ylabel(' % of variance explained')
#plt.show()
pca = decomposition.PCA(n_components = 50)
pca.fit(train_data)
PCtrain = pd.DataFrame(pca.transform(train_data))
PCtrain['emotion'] = y_train
PCtest = pd.DataFrame(pca.transform(test_data))
fig = plt.figure()
ax = fig.add_subplot(111,projection = '3d')
x = PCtrain[0]
y = PCtrain[1]
z = PCtrain[2]
colors = [int(i%6) for i in PCtrain['emotion']]
ax.scatter(x, y, z, c = colors, marker = 'o', label = colors)
ax.set_xlabel('PC1')
ax.set_ylabel('PC2')
ax.set_zlabel('PC3')
#plt.show()
"""
Y = PCtrain['emotion'][0:22000]
X = PCtrain.drop('emotion', axis = 1)[0:22000]
clf = MLPClassifier(solver = 'lbfgs', alpha = 1e-5, hidden_layer_sizes = (3500,), random_state = 1)
clf.fit(X,Y)
print(clf)
predicted = clf.predict(PCtrain.drop('emotion', axis = 1)[22000:])
expected = PCtrain['emotion'][22000:]
print("Classification report fpr classifier %s: \n %s \n" % (clf, metrics.classification_report(expected, predicted)))
print("Confusion matrix: \n%s" % metrics.confusion_matrix(expected, predicted))
output = pd.DataFrame(clf.predict(PCtest), columns =['emotion'])
output.reset_index(inplace=True)
output.rename(columns={'index': 'ImageId'}, inplace=True)
output['ImageId']=output['ImageId']+1
output.to_csv('output.csv', index=False)
"""
print(test_df['emotion'])
|
PHP | UTF-8 | 813 | 2.921875 | 3 | [] | no_license | <?php
namespace app\core;
class Request
{
public function getPath()
{
return trim(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH), '/');
}
public function getMethod()
{
return strtolower($_SERVER['REQUEST_METHOD']);
}
public function getSanitizedData()
{
$data= [];
if($this->getMethod() == 'get')
{
foreach($_GET as $key => $value)
{
$data[$key] = filter_input(INPUT_GET, $key, FILTER_SANITIZE_SPECIAL_CHARS);
}
}
if($this->getMethod() == 'post')
{
foreach($_POST as $key => $value)
{
$data[$key] = filter_input(INPUT_POST, $key, FILTER_SANITIZE_SPECIAL_CHARS);
}
}
return $data;
}
} |
Python | UTF-8 | 967 | 3.828125 | 4 | [
"MIT"
] | permissive | # Technique 2: This can be used for fitting a gaussian when your data is not from a histogram but from a profile (i.e y axis is not counts but some parameter)
# We will use optimize module from scipy
# infact this technique can be used for fitting to any arbitary function
import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import curve_fit
def gaus(x,a,mu,sd):
return a*np.exp(-(x-mu)**2/(2*sd**2))
x = np.arange(-3,3,0.1)
y = gaus(x,10,0,1) # mean zero and sd 1, with amplitude 10
# popt is an array of parameters and pcov is the matrix of covariances
popt,pcov = curve_fit(gaus,x,y,p0=[1,1,1]) # the number within square brackets are the initial values.
plt.plot(x,y,'b+:',label='data')
plt.plot(x,gaus(x,*popt),'ro:',label='fit')
plt.legend()
plt.show()
# assignment
# generate a data set containing numbers from an exponential decay function
# write a function to fit the dataset to an exponential decay and find the fit parameters |
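One possible sketch of that assignment, using the same `curve_fit` approach as above (the amplitude 5.0 and decay rate 0.5 are arbitrary values chosen only to generate the synthetic data):

```python
import numpy as np
from scipy.optimize import curve_fit

def exp_decay(t, a, k):
    # model: a * exp(-k * t)
    return a * np.exp(-k * t)

# generate a noiseless synthetic data set from an exponential decay
t = np.linspace(0, 10, 100)
y = exp_decay(t, 5.0, 0.5)

# fit the data back to the model; p0 holds the initial parameter guesses
popt, pcov = curve_fit(exp_decay, t, y, p0=[1.0, 1.0])
print(popt)  # approximately [5.0, 0.5]
```

With noisy data the recovered parameters will only be close to the true values, and the diagonal of `pcov` gives their variances.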
Java | UTF-8 | 1,043 | 2.484375 | 2 | [] | no_license | package com.moonsworth.client.nethandler.server;
import java.io.*;
import com.moonsworth.client.nethandler.*;
import com.moonsworth.client.nethandler.client.*;
import java.beans.*;
public class LCPacketStaffModState extends LCPacket
{
private String mod;
private boolean state;
@Override
public void write(ByteBufWrapper b) {
b.writeString(this.mod);
b.buf().writeBoolean(this.state);
}
@Override
public void read(ByteBufWrapper b) {
this.mod = b.readString();
this.state = b.buf().readBoolean();
}
@Override
public void process(ILCNetHandler handler) {
((ILCNetHandlerClient)handler).handleStaffModState(this);
}
public String getMod() {
return this.mod;
}
public boolean isState() {
return this.state;
}
@ConstructorProperties({ "mod", "state" })
public LCPacketStaffModState(String mod, boolean state) {
this.mod = mod;
this.state = state;
}
public LCPacketStaffModState() {
}
}
|
Java | UTF-8 | 1,454 | 3.296875 | 3 | [] | no_license | import java.util.*;
public class GreedyPerfectMatch {
public HashSet<Edge> PerfectMatching(EdgeWeightedGraph G) {
HashSet<Edge> M = new HashSet<Edge>(G.E());
ArrayList<Edge> L = new ArrayList<Edge>();
HashSet<Integer> V = new HashSet<Integer>(G.V());
Edge e;
int u, v;
for (Edge edge : G.edges())
L.add(edge);
for (int i=0; i<G.V(); i++)
V.add(i);
Collections.sort(L, new Comparator<Edge>() {
public int compare(Edge e1, Edge e2) {
return Double.compare(e1.weight(), e2.weight());
}
});
System.out.println("L = " + L);
System.out.println("V = " + V);
while (!V.isEmpty() && !L.isEmpty()) {
// System.out.println("One iteration");
e = L.get(L.size() - 1);
L.remove(L.size() - 1);
u = e.either();
v = e.other(u);
if (V.contains(u) && V.contains(v)) {
M.add(e);
V.remove(u);
V.remove(v);
}
}
return M;
}
public static void main(String[] args) {
In in = new In(args[0]);
EdgeWeightedGraph G = new EdgeWeightedGraph(in);
GreedyPerfectMatch gpm = new GreedyPerfectMatch();
System.out.println(gpm.PerfectMatching(G));
}
} |
function checkCanSend() {
var sendButton = document.querySelector('.team-estimate__send-button');
var form = document.getElementById('team-estimate-form');
var radioGroups = document.querySelectorAll('.team-estimate-criteria__marks');
var checkedRadiosCount = calcRadioCheckedCount(form);
var textAreas = document.querySelectorAll('.team-estimate-criteria__area__text');
var filledAreasCount = calcFilledAreasCount(form);
var isValidForm = radioGroups.length === checkedRadiosCount && textAreas.length === filledAreasCount;
isValidForm ? sendButton.classList.remove('disabled') : sendButton.classList.add('disabled');
}
function calcRadioCheckedCount(form) {
var count = 0;
var inputs = form.getElementsByTagName('input');
for (var i = 0; i < inputs.length; i++) {
if (inputs[i].checked){
count ++;
}
}
return count;
}
function calcFilledAreasCount(form) {
var count = 0;
var areas = form.getElementsByTagName('textarea');
for (var i = 0; i < areas.length; i++) {
if (areas[i].value){
count ++;
}
}
return count;
}
|
#include <vector>
#include "ralloc/ralloc.h" // RDMA malloc
namespace nocc {
/**
* The buffer used by rpc should be carefully allocated (from registered memory).
* This class help this
*/
class RPCBufAllocator {
public:
RPCBufAllocator(int req_buf_num,int reply_buf_num,int req_msg_size,int reply_msg_size,int coroutines) {
assert(req_buf_num > 0 && reply_buf_num > 0);
RThreadLocalInit();
    // Pre-allocate per-coroutine request buffer pools (one slot per
    // coroutine, plus one extra).
    for(int i = 0;i < coroutines + 1;++i) {
      req_buf_slots_.push_back(0);
      req_buf_pool_.push_back(std::vector<char *> ());
      for(int j = 0;j < req_buf_num;++j) {
        char *buf = (char *)Rmalloc(req_msg_size);
        req_buf_pool_[i].push_back(buf);
      }
    }
    // Pre-allocate a shared pool of reply buffers.
    for(int i = 0;i < reply_buf_num;++i) {
      char *buf = (char *)Rmalloc(reply_msg_size);
      reply_buf_pool_.push_back(buf);
    }
}
// Get a reply buffer for the RPC handler
inline char *get_reply_buf() {
auto res = reply_buf_pool_[(reply_buf_slot_++) % reply_buf_pool_.size()];
return res;
}
  // Get a buffer for sending an RPC request; buffers are recycled
  // round-robin per coroutine, so each one can be re-used.
  inline char *get_fly_buf(int cid) {
char *res = req_buf_pool_[cid][(req_buf_slots_[cid]++) % req_buf_pool_[cid].size()];
return res;
}
private:
// buffer pools
std::vector<char *> reply_buf_pool_;
uint16_t reply_buf_slot_ = 0;
std::vector<std::vector<char *> >req_buf_pool_;
std::vector<uint8_t> req_buf_slots_;
};
};
|