text stringlengths 184 4.48M |
|---|
import pytest
import numpy as np
from itertools import product
from .. import states
QUBITS = []
for i in range(1, 4):
qubits_i = [s for s in product(states.QBIT_MATRICES.keys(), repeat=i)]
QUBITS.extend(["|" + "".join(s) + ">" for s in qubits_i])
QUBITS.extend(["-|" + "".join(s) + ">" for s in qubits_i])
@pytest.mark.parametrize("qubit", QUBITS)
def test_decode_state(qubit):
decoded_state = states.States.decode_state(qubit)
possibilities = np.square(decoded_state)
assert abs(1 - np.sum(possibilities)) < states.EPSILON
def test_encode_state_positive():
assert (
states.States.encode_state(
np.array(
[
0.0,
0.0,
-0.0,
-0.0,
0.0,
0.0,
-0.0,
-0.0,
0.0,
0.0,
-0.0,
-0.0,
0.0,
0.0,
-0.0,
-0.0,
0.0,
0.0,
-0.0,
-0.0,
0.5,
0.5,
-0.5,
-0.5,
0.0,
0.0,
-0.0,
-0.0,
0.0,
0.0,
-0.0,
-0.0,
]
)
)
== "|101-+>"
)
def test_encode_state_negative():
assert states.States.encode_state(np.array([0.0, -1.0])) == "-|1>"
@pytest.mark.parametrize("qubit", QUBITS)
def test_cycle_decode_encode(qubit):
assert states.States.encode_state(states.States.decode_state(qubit)) == qubit |
Use Case: Use MVAPICH2 (message passing interface for InfiniBand version 2) for running parallel programming applications in a cluster.
Code details and examples:
Code:
To illustrate the usage of this software let's take a simple MPI code which finds the maximum number in a list.(max.c)
```C
#include <stdio.h>
#include "mpi.h"
#define MAX_LIST 10000
int main(int argc, char **argv){
int rank, total_ranks;
int global_max = 0;
int local_max = 0;
int i;
int locals[MAX_LIST] = { 0 };
MPI_Init(&argc, &argv);
MPI_Comm_rank(MPI_COMM_WORLD, &rank);
MPI_Comm_size(MPI_COMM_WORLD, &total_ranks);
for(i=0;i<MAX_LIST;i++)
{
locals[i] = rank * MAX_LIST + i;
}
for(i=0;i<MAX_LIST;i++)
{
if(locals[i] > local_max)
local_max = locals[i];
}
MPI_Reduce(&local_max, &global_max, 1, MPI_INT, MPI_MAX, 0,MPI_COMM_WORLD);
if(rank==0)
{
printf("Maximum value = %d\n", global_max);
}
MPI_Finalize();
return 0;
}
```
To compile this code you would use:
`mpicc -o max max.c`
After this, to run the Max program (compiled above) on your cluster via MVAPICH2, the general format for the command is:
`mpirun -np 4 ./max`
Here, `-np` denotes the number of processes and `4` implies we wish to run this program using 4 processes. You may want to adjust this number according to the architecture of the cluster you are working on.
Files required: Input files are not necessary for this example as we are generating an array of integers based on the rank of each process.
Specifications and details in input files: N/A.
There will not be any input files provided as we are generating data based on the rank of each process.
It is important to mention that MVAPICH2 is designed to work with the Message Passing Interface (MPI) standard which is widely used for writing parallel programs. This standard provides a library of functions that allow communication between distributed processes, essential for running parallel programs in a cluster, but also useful for multiprocessor systems. The MPI library is available in C, C++, and Fortran. |
<h1 align="center">
🦅 Dramatiq Header Middleware for RabbitMQ
</h1>
# 🛠 Installation
```sh
pip install dramatiq-header
```
# ⬆️ Upgrade version
```sh
pip install dramatiq-header --upgrade
```
# ✏️ Usage
## Worker code:
```py
import dramatiq
from dramatiq.brokers.rabbitmq import RabbitmqBroker
from dramatiq_header import HeadersMessage # Import Middleware
rabbitmq_broker = RabbitmqBroker()
dramatiq.set_broker(rabbitmq_broker)
rabbitmq_broker.add_middleware(HeadersMessage()) # Add Middleware
@dramatiq.actor(queue_name='example')
def my_task(message):
print(f'Message Received: {message}')
print(HeadersMessage.get_headers()) # Get headers
```
> [!TIP]
> You can add middleware specifically to monitor a header key. For example:
```py
rabbitmq_broker.add_middleware(HeadersMessage('x-test-header'))
```
## Sender example
```py
import dramatiq
from dramatiq import Message
from dramatiq.brokers.rabbitmq import RabbitmqBroker
rabbitmq_broker = RabbitmqBroker()
dramatiq.set_broker(rabbitmq_broker)
def send_message(msg: str):
message = Message(
queue_name='example',
actor_name='my_task',
args=(msg, ),
kwargs={},
options={'x-test-header': 'test-header'}, # Send your entire header here
)
rabbitmq_broker.enqueue(message)
if __name__ == '__main__':
send_message('test message')
```
## Output

> [!IMPORTANT]
> This library does NOT transmit the header using the RabbitMQ header property; rather, it sends the header as metadata within the message that Dramatiq already dispatches.
---
<p align="center">
<a href="https://ko-fi.com/guedesfelipe" target="_blank">
<img src="https://user-images.githubusercontent.com/25853920/175832199-6c75d866-31b8-4209-bd1a-db116a6dd032.png" width=300 />
</a>
</p> |
#https://leetcode.com/problems/high-access-employees/description/
"""
2933. High-Access Employees
Medium
You are given a 2D 0-indexed array of strings, access_times, with size n. For each i where 0 <= i <= n - 1, access_times[i][0] represents the name of an employee, and access_times[i][1] represents the access time of that employee. All entries in access_times are within the same day.
The access time is represented as four digits using a 24-hour time format, for example, "0800" or "2250".
An employee is said to be high-access if he has accessed the system three or more times within a one-hour period.
Times with exactly one hour of difference are not considered part of the same one-hour period. For example, "0815" and "0915" are not part of the same one-hour period.
Access times at the start and end of the day are not counted within the same one-hour period. For example, "0005" and "2350" are not part of the same one-hour period.
Return a list that contains the names of high-access employees with any order you want.
Example 1:
Input: access_times = [["a","0549"],["b","0457"],["a","0532"],["a","0621"],["b","0540"]]
Output: ["a"]
Explanation: "a" has three access times in the one-hour period of [05:32, 06:31] which are 05:32, 05:49, and 06:21.
But "b" does not have more than two access times at all.
So the answer is ["a"].
Example 2:
Input: access_times = [["d","0002"],["c","0808"],["c","0829"],["e","0215"],["d","1508"],["d","1444"],["d","1410"],["c","0809"]]
Output: ["c","d"]
Explanation: "c" has three access times in the one-hour period of [08:08, 09:07] which are 08:08, 08:09, and 08:29.
"d" has also three access times in the one-hour period of [14:10, 15:09] which are 14:10, 14:44, and 15:08.
However, "e" has just one access time, so it can not be in the answer and the final answer is ["c","d"].
Example 3:
Input: access_times = [["cd","1025"],["ab","1025"],["cd","1046"],["cd","1055"],["ab","1124"],["ab","1120"]]
Output: ["ab","cd"]
Explanation: "ab" has three access times in the one-hour period of [10:25, 11:24] which are 10:25, 11:20, and 11:24.
"cd" has also three access times in the one-hour period of [10:25, 11:24] which are 10:25, 10:46, and 10:55.
So the answer is ["ab","cd"].
Constraints:
1 <= access_times.length <= 100
access_times[i].length == 2
1 <= access_times[i][0].length <= 10
access_times[i][0] consists only of English small letters.
access_times[i][1].length == 4
access_times[i][1] is in 24-hour time format.
access_times[i][1] consists only of '0' to '9'.
"""
from collections import defaultdict
from typing import List
class Solution:
def findHighAccessEmployees(self, access_times: List[List[str]]) -> List[str]:
# Group the times for each employee
employee_times = defaultdict(list)
for employee, time in access_times:
employee_times[employee].append(int(time))
# Sort the times in ascending order for each employee
for times in employee_times.values():
times.sort()
high_access_employees = []
for employee, times in employee_times.items():
for i in range(len(times)):
if i + 2 < len(times) and times[i+2] - times[i] < 100:
high_access_employees.append(employee)
break
return high_access_employees
if __name__ == "__main__":
solution = Solution()
print(solution.findHighAccessEmployees(access_times= [["cd","1025"],["ab","1025"],["cd","1046"],["cd","1055"],["ab","1124"],["ab","1120"]])) |
#!/usr/bin/env node
//this is nodejs shebang syntax
let inputArr=process.argv.slice(2);
let fs=require("fs");
const path = require("path");
console.log(inputArr);
//node main.js tree "directoryPath"
//node main.js organize "directoryPath"
//node main.js help
let command=inputArr[0];
switch(command) {
case "tree":
treeFn(inputArr[1])
break;
case "organize":
organizeFn(inputArr[1])
break;
case "help":
helpFn();
break;
default:
console.log("😥 Please INput right command") ;
break;
}
function treeFn(dirPath) {
let destPath;
if(dirPath==undefined) {
treeHelper(process.cwd(),"");
return;
}
else {
//if the file path is valid or not
doesExist=fs.existsSync(dirPath);
if(doesExist) {
treeHelper(dirPath,"");
}
else {
console.log("Kindly enter the path");
return;
}
}
}
function treeHelper(dirPath, indent) {
// is file or folder
let isFile = fs.lstatSync(dirPath).isFile();
if (isFile == true) {
let fileName = path.basename(dirPath);
console.log(indent + "├──" + fileName);
} else {
let dirName = path.basename(dirPath)
console.log(indent + "└──" + dirName);
let childrens = fs.readdirSync(dirPath);
for (let i = 0; i < childrens.length; i++) {
let childPath = path.join(dirPath, childrens[i]);
treeHelper(childPath, indent + "\t");
}
}
}
function organizeFn(dirPath) {
// console.log("Organize command implemented for ",dirPath);
//1. input ->dirctory path
let destPath;
if(dirPath==undefined) {
destPath=process.cwd();
return;
}
else {
//if the file path is valid or not
doesExist=fs.existsSync(dirPath);
if(doesExist) {
//2. create->organized_files->directory
destPath=path.join(dirPath,"organized_files");
if(fs.existsSync(destPath)==false) {
//agar vo folder pehle se nahi h tab hi banao
fs.mkdirSync(destPath);
}
}
else {
console.log("Kindly enter the path");
return;
}
}
//dirPath->jisko organize karna h
//destPath->jisme organize karna h
organizeHelper(dirPath,destPath);
}
function organizeHelper(src, dest) {
// 3. identify categories of all the files present in that input directory ->
let childNames = fs.readdirSync(src);
// console.log(childNames);
for (let i = 0; i < childNames.length; i++) {
let childAddress = path.join(src, childNames[i]);
let isFile = fs.lstatSync(childAddress).isFile();
if (isFile) {
// console.log(childNames[i]);
let category = getCategory(childNames[i]);
console.log(childNames[i], "belongs to --> ", category);
// 4. copy / cut files to that organized directory inside of any of category folder
sendFiles(childAddress, dest, category);
}
}
}
function sendFiles(srcFilePath, dest, category) {
//
let categoryPath = path.join(dest, category);
if (fs.existsSync(categoryPath) == false) {
fs.mkdirSync(categoryPath);
}
let fileName = path.basename(srcFilePath);
let destFilePath = path.join(categoryPath, fileName);
fs.copyFileSync(srcFilePath, destFilePath);
fs.unlinkSync(srcFilePath);
//isse original file remove ho jayegi
console.log(fileName, "copied to ", category);
}
function helpFn(dirPath) {
console.log(`
List of All commands:
//node main.js tree "directoryPath"
//node main.js organize "directoryPath"
//node main.js help
`);
}
function getCategory(name) {
let types={
media:["mp4","mkv"],
archives:['zip','7z','rar','tar','gz','ar','iso','xz'],
documents:['docx','doc','pdf','xlsx','xls','odt','ods','odp','odg','odf','txt','ps','tex'],
app:['exe','dmg','pkg','deb']
}
let ext=path.extname(name);
ext=ext.slice(1);
//shuru ka . hata diya
console.log(ext);
for(let type in types) {
let cTypeArray=types[type];
for(let i=0;i<cTypeArray.length;i++) {
if(ext==cTypeArray[i]) {
return type;
}
}
}
return "others";
} |
package com.ssafy.soltravel.v2.controller;
import com.ssafy.soltravel.v2.dto.ResponseDto;
import com.ssafy.soltravel.v2.dto.group.GroupDto;
import com.ssafy.soltravel.v2.dto.group.ParticipantDto;
import com.ssafy.soltravel.v2.dto.group.request.CreateGroupRequestDto;
import com.ssafy.soltravel.v2.dto.group.request.CreateParticipantRequestDto;
import com.ssafy.soltravel.v2.dto.group.request.GroupCodeGenerateRequestDto;
import com.ssafy.soltravel.v2.dto.group.request.GroupUpdateRequestDto;
import com.ssafy.soltravel.v2.dto.group.response.GroupCodeGenerateResponseDto;
import com.ssafy.soltravel.v2.dto.group.response.GroupSummaryDto;
import com.ssafy.soltravel.v2.service.group.GroupService;
import com.ssafy.soltravel.v2.util.LogUtil;
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.media.Content;
import io.swagger.v3.oas.annotations.media.Schema;
import io.swagger.v3.oas.annotations.responses.ApiResponse;
import io.swagger.v3.oas.annotations.responses.ApiResponses;
import io.swagger.v3.oas.annotations.tags.Tag;
import java.util.List;
import lombok.RequiredArgsConstructor;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.DeleteMapping;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.PutMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
@Tag(name = "Group API", description = "모임 & 참여자 관련 API")
@RestController
@RequiredArgsConstructor
@RequestMapping("/groups")
public class GroupController {
private final GroupService groupService;
// === 모임 관련 메서드 ===
@Operation(summary = "새로운 모임 생성", description = "새로운 모임을 생성하는 API.")
@ApiResponses(value = {
@ApiResponse(responseCode = "201", description = "모임 생성 성공", content = @Content(schema = @Schema(implementation = GroupDto.class))),
@ApiResponse(responseCode = "400", description = "잘못된 요청 데이터", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@PostMapping("/createGroup")
public ResponseEntity<GroupDto> createNewGroup(
@RequestBody CreateGroupRequestDto requestDto
) {
GroupDto accountDto = groupService.createNewGroup(requestDto);
return ResponseEntity.status(HttpStatus.CREATED).body(accountDto);
}
// 모임 조회
@Operation(summary = "모임 정보 조회", description = "특정 모임의 정보를 조회하는 API.")
@ApiResponses(value = {
@ApiResponse(responseCode = "200", description = "모임 정보 조회 성공", content = @Content(schema = @Schema(implementation = GroupDto.class))),
@ApiResponse(responseCode = "404", description = "모임을 찾을 수 없음", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping("/{groupId}")
public ResponseEntity<GroupDto> getGroupInfo(
@PathVariable Long groupId
) {
GroupDto accountDto = groupService.getGroupInfo(groupId);
return ResponseEntity.status(HttpStatus.OK).body(accountDto);
}
@Operation(summary = "가입한 모든 모임 조회 (생성한거는 조회 X)", description = "사용자가 가입한 모든 모임의 요약 정보를 반환하는 API.")
@ApiResponses(value = {
@ApiResponse(responseCode = "200", description = "모임 요약 정보 조회 성공",
content = @Content(schema = @Schema(implementation = GroupSummaryDto.class))),
@ApiResponse(responseCode = "400", description = "잘못된 요청", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping("/joined")
public ResponseEntity<List<GroupSummaryDto>> getAllJoinedGroup() {
List<GroupSummaryDto> groupSummaryDtoList = groupService.getAllJoinedGroup(false);
return ResponseEntity.status(HttpStatus.OK).body(groupSummaryDtoList);
}
@Operation(summary = "생성한 모든 모임 조회 (가입 한거는 X 생성만)", description = "사용자가 생성한 모든 모임의 요약 정보를 반환하는 API.")
@ApiResponses(value = {
@ApiResponse(responseCode = "200", description = "모임 요약 정보 조회 성공",
content = @Content(schema = @Schema(implementation = GroupSummaryDto.class))),
@ApiResponse(responseCode = "400", description = "잘못된 요청", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping("/created")
public ResponseEntity<List<GroupSummaryDto>> getAllCreatedGroup() {
List<GroupSummaryDto> groupSummaryDtoList = groupService.getAllJoinedGroup(true);
return ResponseEntity.status(HttpStatus.OK).body(groupSummaryDtoList);
}
// 모임 정보 업데이트
@Operation(summary = "그룹 정보 업데이트", description = "모임주가 특정 모임 정보를 업데이트 하는 API")
@ApiResponses(value = {
@ApiResponse(responseCode = "200", description = "모임 요약 업데이트 성공",
content = @Content(schema = @Schema(implementation = GroupSummaryDto.class))),
@ApiResponse(responseCode = "400", description = "잘못된 요청", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@PutMapping("/{groupId}")
public ResponseEntity<ResponseDto> updateGroupinfo(
@PathVariable Long groupId,
@RequestBody GroupUpdateRequestDto requestDto
) {
ResponseDto responseDto = groupService.updateGroupInfo(groupId, requestDto);
return ResponseEntity.status(HttpStatus.OK).body(responseDto);
}
// 모임 탈퇴
@Operation(summary = "모임 삭제", description = "특정 모임의 삭제하는 API.")
@ApiResponses(value = {
@ApiResponse(responseCode = "200", description = "모임 삭제 성공", content = @Content(schema = @Schema(implementation = ResponseDto.class))),
@ApiResponse(responseCode = "404", description = "모임을 찾을 수 없음", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@DeleteMapping("/{groupId}")
public ResponseEntity<ResponseDto> deleteGroup(
@PathVariable Long groupId
) {
ResponseDto responseDto = groupService.deleteGroup(groupId);
return ResponseEntity.status(HttpStatus.OK).body(responseDto);
}
// === 참여자 관련 메서드 ===
@Operation(summary = "새로운 참여자 생성", description = "새로운 참여자를 모임에 추가하는 API.")
@ApiResponses(value = {
@ApiResponse(responseCode = "201", description = "참여자 생성 성공", content = @Content(schema = @Schema(implementation = ParticipantDto.class))),
@ApiResponse(responseCode = "400", description = "잘못된 요청 데이터", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@PostMapping("/participants/{groupId}")
public ResponseEntity<ParticipantDto> createNewParticipant(
@RequestBody CreateParticipantRequestDto requestDto
) {
ParticipantDto participantDto = groupService.createNewParticipant(requestDto, false);
return ResponseEntity.status(HttpStatus.CREATED).body(participantDto);
}
@Operation(summary = "참여자 모임 탈퇴", description = "기존 모임 참여자를 모임에서 탈퇴시키는 API.")
@ApiResponses(value = {
@ApiResponse(responseCode = "201", description = "참여자 탈퇴 성공", content = @Content(schema = @Schema(implementation = ResponseDto.class))),
@ApiResponse(responseCode = "400", description = "잘못된 요청 데이터", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@DeleteMapping("/participants/{participantId}")
public ResponseEntity<ResponseDto> deleteParticipant(
@PathVariable Long participantId
) {
ResponseDto responseDto = groupService.deleteParticipant(participantId);
return ResponseEntity.status(HttpStatus.CREATED).body(responseDto);
}
// === 초대 url 생성 ===
@Operation(summary = "참여코드 생성", description = "그룹 ID로 모임 참여 코드를 생성합니다.")
@ApiResponses(value = {
@ApiResponse(responseCode = "201", description = "참여 코드 생성 성공", content = @Content(schema = @Schema(implementation = GroupCodeGenerateResponseDto.class))),
@ApiResponse(responseCode = "400", description = "잘못된 요청 데이터", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@PostMapping(value = "/create/groupCode")
public ResponseEntity generateGroupCode(@RequestBody GroupCodeGenerateRequestDto request) {
LogUtil.info("모임 코드 생성", request);
GroupCodeGenerateResponseDto response = groupService.generateGroupCode(request);
return ResponseEntity.status(HttpStatus.CREATED).body(response);
}
// === 초대 url 확인 ===
@Operation(summary = "참여코드 조회", description = "어떤 모임의 참여코드인지 확인합니다.")
@ApiResponses(value = {
@ApiResponse(responseCode = "200", description = "참여 코드 조회 성공", content = @Content(schema = @Schema(implementation = GroupDto.class))),
@ApiResponse(responseCode = "400", description = "잘못된 요청 데이터", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping("/code/{code}")
public ResponseEntity findGroupByCode(@PathVariable String code) {
LogUtil.info("모임 코드 조회", code);
GroupDto response = groupService.findGroupByCode(code);
return ResponseEntity.status(HttpStatus.CREATED).body(response);
}
// === 초대 url 유효성 확인 ===
@Operation(summary = "참여코드 유효성 조회", description = "유효한 모임 초대코드인지 확인합니다(로그인X), 유효할 시 응답으로 만료까지 남은 시간을 초로 응답합니다.")
@ApiResponses(value = {
@ApiResponse(responseCode = "200", description = "참여 코드 조회 성공", content = @Content(schema = @Schema(implementation = GroupDto.class))),
@ApiResponse(responseCode = "400", description = "잘못된 요청 데이터", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping("/code/validation/{code}")
public ResponseEntity validGroupCode(@PathVariable String code) {
LogUtil.info("모임 코드 유효성 조회", code);
ResponseDto response = groupService.validGroupCode(code);
return ResponseEntity.status(HttpStatus.CREATED).body(response);
}
// @GetMapping()
// public ResponseEntity<ParticipantDto> getAllGroupsByUserId() {
//
// groupService.getAllGroupInfosByUserId();
//
// return ResponseEntity.status(HttpStatus.CREATED).body(participantDto);
// }
} |
import styled from "styled-components";
import Heading from "../../components/layout/Heading";
import PostFeatureItem from "../post/PostFeatureItem";
import React from "react";
import {
collection,
limit,
onSnapshot,
query,
where,
} from "firebase/firestore";
import { db } from "../../firebase-app/firebase-config";
const HomeFeatureStyles = styled.div``;
const HomeFeature = () => {
const [posts, setPosts] = React.useState([]);
React.useEffect(() => {
const colRef = collection(db, "posts");
const queries = query(
colRef,
where("status", "==", 1),
where("hot", "==", true),
limit(3)
);
onSnapshot(queries, (snapshot) => {
const results = [];
snapshot.forEach((doc) => {
results.push({
id: doc.id,
...doc.data(),
});
});
setPosts(results);
});
}, []);
if (posts.length <= 0) return null;
return (
<HomeFeatureStyles className="home-block">
<div className="container">
<Heading>Feature</Heading>
<div className="grid-layout">
{posts.map((post) => (
<PostFeatureItem
key={post.id}
data={post}
></PostFeatureItem>
))}
</div>
</div>
</HomeFeatureStyles>
);
};
export default HomeFeature; |
"use client"
import React, { useState } from 'react'
import Navbar from '@/components/Navbar'
import Hero from "@/components/hero/index"
import Company from "@/components/partner/index"
import Courses from "@/components/courses/index"
import HowWork from "@/components/howtowork/index"
import Feature from "@/components/features/index"
import Mentor from "@/components/mentor/index"
import Other from "@/components/other/index"
import Question from "@/components/question/index"
import Footer from "@/components/footer"
import Modal from "@/components/modal/index"
function page() {
const [modalstate ,setModalState] = useState(false)
return (
<div>
{
modalstate&&<Modal setModalState={setModalState}></Modal>
}
<div className={modalstate?"blur-sm fixed top-0" :"blur-0 relative"}>
<Navbar setModalState={setModalState}/>
<Hero/>
<Company/>
<Courses/>
<HowWork/>
<Feature/>
<Mentor/>
<Other/>
<Question/>
<Footer/>
</div>
</div>
)
}
export default page |
<?php
namespace App\Http\Controllers;
use App\Models\Categoria;
use Illuminate\Http\Request;
class CategoriaController extends Controller
{
/**
* Muestra la lista de categorías.
*/
public function index()
{
$categorias = Categoria::all();
return view('categorias.index', compact('categorias'));
}
/**
* Muestra el formulario para crear una nueva categoría.
*/
public function create()
{
return view('categorias.create');
}
/**
* Almacena una nueva categoría en la base de datos.
*/
public function store(Request $request)
{
$request->validate([
'nombre' => 'required|string|max:255|unique:categorias,nombre',
]);
Categoria::create($request->all());
return redirect()->route('categorias.index')->with('success', 'Categoría creada exitosamente.');
}
/**
* Muestra los detalles de una categoría específica.
*/
public function show(Categoria $categoria)
{
return view('categorias.show', compact('categoria'));
}
/**
* Muestra el formulario para editar una categoría existente.
*/
public function edit(Categoria $categoria)
{
return view('categorias.edit', compact('categoria'));
}
/**
* Actualiza una categoría existente en la base de datos.
*/
public function update(Request $request, Categoria $categoria)
{
$request->validate([
'nombre' => 'required|string|max:255|unique:categorias,nombre,' . $categoria->id,
]);
$categoria->update($request->all());
return redirect()->route('categorias.index')->with('success', 'Categoría actualizada exitosamente.');
}
/**
* Elimina una categoría de la base de datos.
*/
public function destroy(Categoria $categoria)
{
$categoria->delete();
return redirect()->route('categorias.index')->with('success', 'Categoría eliminada exitosamente.');
}
} |
import { Injectable } from '@nestjs/common';
import { InjectRepository } from '@nestjs/typeorm';
import { BookEntity } from 'src/entities/book.entity';
import { Repository } from 'typeorm';
import { BookItemDto } from './dto/book.item.dto';
import { BookCreateDto } from './dto/book.create.dto';
import { BookUpdateDto } from './dto/book.update.dto';
@Injectable()
export class BookService {
constructor(
@InjectRepository(BookEntity)
private readonly bookRepository: Repository<BookEntity>,
) {}
async findAll(): Promise<BookItemDto[]> {
return this.bookRepository.find();
}
async findById(id: number): Promise<BookItemDto | undefined> {
return this.bookRepository.findOneBy({ id: id });
}
async create(book: BookCreateDto): Promise<BookItemDto> {
return this.bookRepository.save(book);
}
async update(id: number, book: BookUpdateDto): Promise<BookItemDto | undefined> {
await this.bookRepository.update(id, book);
return this.bookRepository.findOneBy({ id: id });
}
async delete(id: number): Promise<void> {
await this.bookRepository.delete(id);
}
} |
package com.vobi.bank.service;
import java.util.List;
import java.util.Optional;
import java.util.Set;
import javax.validation.ConstraintViolation;
import javax.validation.ConstraintViolationException;
import javax.validation.Validator;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;
import com.vobi.bank.domain.Customer;
import com.vobi.bank.repository.CustomerRepository;
@Service
public class CustomerServiceImpl implements CustomerService {
@Autowired
CustomerRepository customerRepository;
@Autowired
Validator validator;
@Override
@Transactional(readOnly = true)
public List<Customer> findAll() {
return customerRepository.findAll();
}
@Override
@Transactional(readOnly = true)
public Optional<Customer> findById(Integer id) {
return customerRepository.findById(id);
}
@Override
@Transactional(readOnly = true)
public Long count() {
return customerRepository.count();
}
@Override
@Transactional(readOnly = false, propagation = Propagation.REQUIRED, rollbackFor = Exception.class)
public Customer save(Customer entity) throws Exception {
if (entity == null) {
throw new Exception("El customer es nulo");
}
validate(entity);
if (customerRepository.existsById(entity.getCustId())) {
throw new Exception("El customer ya existe");
}
return customerRepository.save(entity);
}
@Override
@Transactional(readOnly = false, propagation = Propagation.REQUIRED, rollbackFor = Exception.class)
public Customer update(Customer entity) throws Exception {
if (entity == null) {
throw new Exception("El customer es nulo");
}
validate(entity);
if (customerRepository.existsById(entity.getCustId()) == false) {
throw new Exception("El customer no existe");
}
return customerRepository.save(entity);
}
@Override
@Transactional(readOnly = false, propagation = Propagation.REQUIRED, rollbackFor = Exception.class)
public void delete(Customer entity) throws Exception {
if (entity == null) {
throw new Exception("El customer es nulo");
}
if (entity.getCustId() == null) {
throw new Exception("El customer id es nulo");
}
if (customerRepository.existsById(entity.getCustId()) == false) {
throw new Exception("El customer no existe");
}
findById(entity.getCustId()).ifPresent(customer -> {
if (customer.getAccounts() != null && customer.getAccounts().isEmpty() == false) {
throw new RuntimeException("El customer tiene cuentas asociadas");
}
if (customer.getRegisteredAccounts() != null && customer.getRegisteredAccounts().isEmpty() == false) {
throw new RuntimeException("El customer tiene cuentas registradas asociadas");
}
});
customerRepository.deleteById(entity.getCustId());
}
@Override
@Transactional(readOnly = false, propagation = Propagation.REQUIRED, rollbackFor = Exception.class)
public void deleteById(Integer id) throws Exception {
if (id == null)
throw new Exception("El id es nulo");
if (customerRepository.existsById(id) == false) {
throw new Exception("El customer no existe");
}
delete(customerRepository.findById(id).get());
}
@Override
public void validate(Customer entity) throws Exception {
Set<ConstraintViolation<Customer>> constraintViolations = validator.validate(entity);
if (constraintViolations.isEmpty() == false) {
throw new ConstraintViolationException(constraintViolations);
}
}
} |
const express = require('express');
const SocketServer = require('ws').Server;
const WebSocket = require('ws');
const uuidv4 = require('uuid/v4');
// Set the port to 3001
const PORT = 3001;
// Create a new express server
const server = express()
// Make the express server serve static assets (html, javascript, css) from the /public folder
.use(express.static('public'))
.listen(PORT, '0.0.0.0', 'localhost', () => console.log(`Listening on ${ PORT }`));
// Create the WebSockets server
const wss = new SocketServer({ server });
// Set up a callback that will run when a client connects to the server
// When a client connects they are assigned a socket, represented by
// the ws parameter in the callback.
//BROADCAST TO EVERYONE
wss.broadcast = function broadcast(data) {
wss.clients.forEach(function each(client) {
if (client.readyState === WebSocket.OPEN) {
client.send(data);
}
});
};
wss.on('connection', (ws) => {
console.log('Client connected');
//Count number of users
let userCount = { numUsers: wss.clients.size };
let stringifiedCount = JSON.stringify(userCount);
wss.broadcast(stringifiedCount);
ws.on('message', (data) => {
//the data must be sent over the wire as a string. If passing JSON as a string, use JSON.parse() to convert the JSON string into an object the server can use it
const receivedMessage = JSON.parse(data);
receivedMessage.id = uuidv4();
switch(receivedMessage.type) {
case "postMessage":
receivedMessage.type = "incomingMessage";
break;
case "postNotification":
receivedMessage.type = "incomingNotification";
break;
}
let stringifiedMessage = JSON.stringify(receivedMessage);
wss.broadcast(stringifiedMessage);
});
// Set up a callback for when a client closes the socket. This usually means they closed their browser.
ws.on('close', () => {
console.log('Client disconnected')
let userCount = { numUsers: wss.clients.size };
let stringifiedCount = JSON.stringify(userCount);
wss.broadcast(stringifiedCount);
});
}); |
import pandas as pd
def new_col(stat: pd.Series, n_matches: pd.Series) -> float:
"""
Calculate the weighted average of a statistic.
Parameters
----------
stat : pandas.Series
The statistic to calculate the weighted average for.
n_matches : pandas.Series
The number of matches for each row in the statistic series.
Returns
-------
float
The weighted average of the statistic.
Notes
-----
The weighted average is calculated as the sum of the product of the
statistic and the number of matches, divided by the sum of the number of
matches.
"""
return (n_matches * stat).sum() / n_matches.sum()
def proccess_df_avg(df: pd.DataFrame) -> pd.DataFrame:
"""
Process a DataFrame to calculate the average statistics per season.
Parameters
----------
df : pandas.DataFrame
The DataFrame to process. It should have a 'SEASON' column and various
statistic columns (e.g. 'POINTS', 'TWOS_IN', etc.).
Returns
-------
pandas.DataFrame
A new DataFrame with the average statistics per season.
Notes
-----
The function groups the input DataFrame by the 'SEASON' column and
calculates the weighted average of each statistic using the 'N_MATCHES'
column as the weight.
"""
dff = df.groupby('SEASON')\
.apply(lambda x:
pd.Series(
{'N_MATCHES': x['N_MATCHES'].sum(),
'POINTS': new_col(x['POINTS'], x['N_MATCHES']),
'TWOS_IN': new_col(x['TWOS_IN'],
x['N_MATCHES']),
'TWOS_TRIED': new_col(x['TWOS_TRIED'],
x['N_MATCHES']),
'TWOS_PERC': new_col(x['TWOS_PERC'],
x['N_MATCHES']),
'THREES_IN': new_col(x['THREES_IN'],
x['N_MATCHES']),
'THREES_TRIED': new_col(x['THREES_TRIED'],
x['N_MATCHES']),
'THREES_PERC': new_col(x['THREES_PERC'],
x['N_MATCHES']),
'FIELD_GOALS_IN': new_col(x['FIELD_GOALS_IN'],
x['N_MATCHES']),
'FIELD_GOALS_TRIED': new_col(x['FIELD_GOALS_TRIED'],
x['N_MATCHES']),
'FIELD_GOALS_PERC': new_col(x['FIELD_GOALS_PERC'],
x['N_MATCHES']),
'FREE_THROWS_IN': new_col(x['FREE_THROWS_IN'],
x['N_MATCHES']),
'FREE_THROWS_TRIED': new_col(x['FREE_THROWS_TRIED'],
x['N_MATCHES']),
'FREE_THROWS_PERC': new_col(x['FREE_THROWS_PERC'],
x['N_MATCHES']),
'OFFENSIVE_REBOUNDS': new_col(x['OFFENSIVE_REBOUNDS'],
x['N_MATCHES']),
'DEFFENSIVE_REBOUNDS': new_col(x['DEFFENSIVE_REBOUNDS'],
x['N_MATCHES']),
'TOTAL_REBOUNDS': new_col(x['TOTAL_REBOUNDS'],
x['N_MATCHES']),
'ASSISTS': new_col(x['ASSISTS'],
x['N_MATCHES']),
'TURNOVERS': new_col(x['TURNOVERS'],
x['N_MATCHES']),
'BLOCKS_FAVOR': new_col(x['BLOCKS_FAVOR'],
x['N_MATCHES']),
'BLOCKS_AGAINST': new_col(x['BLOCKS_AGAINST'],
x['N_MATCHES']),
'DUNKS': new_col(x['DUNKS'], x['N_MATCHES']),
'PERSONAL_FOULS': new_col(x['PERSONAL_FOULS'],
x['N_MATCHES']),
'FOULS_RECEIVED': new_col(x['FOULS_RECEIVED'],
x['N_MATCHES']),
'EFFICIENCY': new_col(x['EFFICIENCY'],
x['N_MATCHES'])
}), include_groups=False)
return dff
def proccess_df_total(df: pd.DataFrame) -> pd.DataFrame:
"""
Process a DataFrame to calculate the total statistics per season.
Parameters
----------
df : pandas.DataFrame
The DataFrame to process. It should have a 'SEASON' column and various
statistic columns (e.g. 'POINTS', 'TWOS_IN', etc.).
Returns
-------
pandas.DataFrame
A new DataFrame with the total statistics per season.
Notes
-----
The function groups the input DataFrame by the 'SEASON' column and
calculates the sum of each statistic. For percentage statistics (e.g.
'TWOS_PERC'), it calculates the weighted average using the 'N_MATCHES'
column as the weight.
"""
dff = df.groupby('SEASON')\
.apply(lambda x:
pd.Series(
{'N_MATCHES': x['N_MATCHES'].sum(),
'POINTS': x['POINTS'].sum(),
'TWOS_IN': x['TWOS_IN'].sum(),
'TWOS_TRIED': x['TWOS_TRIED'].sum(),
'TWOS_PERC': new_col(x['TWOS_PERC'],
x['N_MATCHES']),
'THREES_IN': x['THREES_IN'].sum(),
'THREES_TRIED': x['THREES_TRIED'].sum(),
'THREES_PERC': new_col(x['THREES_PERC'],
x['N_MATCHES']),
'FIELD_GOALS_IN': x['FIELD_GOALS_IN'].sum(),
'FIELD_GOALS_TRIED': x['FIELD_GOALS_TRIED'].sum(),
'FIELD_GOALS_PERC': new_col(x['FIELD_GOALS_PERC'],
x['N_MATCHES']),
'FREE_THROWS_IN': x['FREE_THROWS_IN'].sum(),
'FREE_THROWS_TRIED': x['FREE_THROWS_TRIED'].sum(),
'FREE_THROWS_PERC': new_col(x['FREE_THROWS_PERC'],
x['N_MATCHES']),
'OFFENSIVE_REBOUNDS': x['OFFENSIVE_REBOUNDS'].sum(),
'DEFFENSIVE_REBOUNDS': x['DEFFENSIVE_REBOUNDS'].sum(),
'TOTAL_REBOUNDS': x['TOTAL_REBOUNDS'].sum(),
'ASSISTS': x['ASSISTS'].sum(),
'TURNOVERS': x['TURNOVERS'].sum(),
'BLOCKS_FAVOR': x['BLOCKS_FAVOR'].sum(),
'BLOCKS_AGAINST': x['BLOCKS_AGAINST'].sum(),
'DUNKS': x['DUNKS'].sum(),
'PERSONAL_FOULS': x['PERSONAL_FOULS'].sum(),
'FOULS_RECEIVED': x['FOULS_RECEIVED'].sum(),
'EFFICIENCY': x['EFFICIENCY'].sum(),
}), include_groups=False)
return dff |
import SortField from './SortField';
import SortOrder from './SortOrder';
export default function getProducts(
productsFromServer,
categoriesFromServer,
usersFromServer,
selectedUserId,
selectedCategoriesId,
query,
sortField,
sortOrder,
) {
let result = [...productsFromServer];
result = result.filter(product => {
const productCategory = categoriesFromServer.find(
category => product.categoryId === category.id,
);
const productUser = usersFromServer.find(
user => user.id === productCategory.ownerId,
);
if (selectedUserId && productUser.id !== selectedUserId) {
return false;
}
if (
selectedCategoriesId.length > 0 &&
!selectedCategoriesId.includes(product.categoryId)
) {
return false;
}
if (query && !product.name.toLowerCase().includes(query.toLowerCase())) {
return false;
}
return true;
});
// sorting
if (sortField) {
switch (sortField) {
case SortField.ID:
result.sort((product1, product2) => product1.id - product2.id);
break;
case SortField.PRODUCT:
result.sort((product1, product2) =>
product1.name
.toLowerCase()
.localeCompare(product2.name.toLowerCase()),
);
break;
case SortField.CATEGORY:
result.sort((product1, product2) => {
const productCategory1 = categoriesFromServer
.find(category => product1.categoryId === category.id)
.title.toLowerCase();
const productCategory2 = categoriesFromServer
.find(category => product2.categoryId === category.id)
.title.toLowerCase();
return productCategory1.localeCompare(productCategory2);
});
break;
case SortField.USER:
result.sort((product1, product2) => {
const productCategory1 = categoriesFromServer.find(
category => product1.categoryId === category.id,
);
const productCategory2 = categoriesFromServer.find(
category => product2.categoryId === category.id,
);
const productUser1 = usersFromServer
.find(user => user.id === productCategory1.ownerId)
.name.toLowerCase();
const productUser2 = usersFromServer
.find(user => user.id === productCategory2.ownerId)
.name.toLowerCase();
return productUser1.localeCompare(productUser2);
});
break;
default:
break;
}
}
if (sortOrder === SortOrder.DESC) {
result.reverse();
}
return result;
} |
<template>
<h1>Here are our Professionals</h1>
<div>
<filter-caretakers @change-filter="applyFilters"> </filter-caretakers>
</div>
<div>
<button @click="loadCaretakers()">Refresh</button>
<button v-if="isCaretaker">
<router-link to="requests"> Received</router-link>
</button>
<div class="try">
<base-card
v-for="people in availableCaretakers"
:key="people.profileId"
class="caretakerDetails"
>
<caretaker-details
:key="people.profileId"
:id="people.profileId"
:imageUrl="people.profileImageUrl"
:Name="people.name"
:speciality="people.speciality"
:rate="people.rate"
:location="people.location"
:bio="people.bio"
></caretaker-details>
</base-card>
<div class="spacer"></div>
</div>
</div>
</template>
<script>
import CaretakerDetails from "../functionalities/CaretakerDetails.vue";
import FilterCaretakers from "../functionalities/FilterCaretakers.vue";
export default {
data() {
return {
activeFilters: {
dog: true,
cat: true,
fish: true,
},
};
},
components: {
CaretakerDetails,
FilterCaretakers,
},
computed: {
availableCaretakers() {
const caretakers = this.$store.getters["caretakers/caretakers"];
return caretakers.filter((caretaker) => {
if (this.activeFilters.dog && caretaker.speciality.includes("dog")) {
return true;
}
if (this.activeFilters.cat && caretaker.speciality.includes("cat")) {
return true;
}
if (this.activeFilters.fish && caretaker.speciality.includes("fish")) {
return true;
}
return false;
});
},
hasCaretakers() {
return this.$store.getters["caretakers/hasCaretakers"];
},
isCaretaker() {
return this.$store.getters["caretakers/isCaretaker"];
},
},
created() {
this.loadCaretakers();
},
methods: {
applyFilters(updatedFilters) {
this.activeFilters = updatedFilters;
},
loadCaretakers() {
this.$store.dispatch("caretakers/loadCaretakers");
},
},
};
</script>
<style lang="scss" scoped>
h1 {
text-align: center;
font-size: 25px;
color: rgb(24, 24, 24);
}
.try {
display: flex;
flex-wrap: wrap;
}
div.caretakerDetails {
flex: 40%;
margin: 10px;
margin-top: 50px;
max-width: 100%;
box-shadow: rgba(0, 0, 0, 0.25) 0px 54px 55px,
rgba(0, 0, 0, 0.12) 0px -12px 30px, rgba(0, 0, 0, 0.12) 0px 4px 6px,
rgba(0, 0, 0, 0.17) 0px 12px 13px, rgba(0, 0, 0, 0.09) 0px -3px 5px;
transition: transform 0.5s;
img {
height: 100%;
width: 100%;
object-fit: contain;
}
}
div.caretakerDetails:hover {
transform: scale(1.05);
}
div.spacer {
margin: 0 10px;
padding: 0 10px;
flex: 40%;
}
</style> |
library(ggplot2)
library(ggsci)
library(rstan)
library(bayesplot)
library(reshape2)
library(dplyr)
library(tidyr)
library(loo)
library(deSolve)
theme_set(theme_classic())
compare_parameters = function(data1, data2, name1, name2, params_in_both = c("sigma","alpha","b_A","b_P","b_F","c_A","c_P","c_F","b","a_FP","a_FA","b","Gamma","delta")){
params1 = as.data.frame(data1, pars = c("alpha","b_P","b_F","c_P","c_F","b","a_FP","b","gamma","delta")) %>%
gather(factor_key = TRUE) %>%
mutate(key= str_replace(key, "gamma","Gamma"))
params2 = as.data.frame(data2, pars = c("alpha","b_P","b_F","c_P","c_F","b","a_FP","b","Gamma","delta")) %>%
gather(factor_key = TRUE) %>%
mutate(key= str_replace(key, "gamma","Gamma"))
compare_params = bind_rows(list(model1 = params1, model2 = params2), .id = "Model")
pdf(paste("from_cluster_include_longterm_data/plots/compare_", name2,"_vs_",name1,".pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(compare_params, aes(x = key, y = value, color = Model)) + geom_boxplot() + theme_bw())
dev.off()
}
model_qc = function(data, name, params_to_plot = c("sigma", "alpha", "b_P", "b_F", "c_P","c_F","a_FP","alpha_max"),
pars_acf1 = c("sigma[1]","sigma[2]","sigma[3]","sigma[4]","alpha","a_FP"),
pars_acf2 = c("b_P","b_F","c_P","c_F")){
print("Plot Model QC")
pdf(paste("model-qc_",name,"_plots.pdf", sep = "", collapse = ""), useDingbats = F)
print(rstan::traceplot(data, pars = params_to_plot))
print(mcmc_rhat(rhat(data, pars = params_to_plot)) + yaxis_text(hjust = 1))
print(rstan::traceplot(data, pars = c("lp__")) + theme(axis.text.x = element_text(size=6, angle=45)))
ratios <- neff_ratio(data,pars = params_to_plot)
color_scheme_set("brightblue")
print(mcmc_neff(ratios) + yaxis_ticks(on = TRUE) + yaxis_text(on = TRUE))
print(mcmc_acf(data, pars_acf1))
print(mcmc_acf(data, pars_acf2))
print(mcmc_parcoord(as.array(data, pars = pars_acf1), np = nuts_params(data)))
print(mcmc_parcoord(as.array(data, pars = pars_acf2), np = nuts_params(data)))
print(mcmc_parcoord(as.array(data, pars = pars_acf1), np = nuts_params(data)) + theme_bw())
print(mcmc_parcoord(as.array(data, pars = pars_acf2), np = nuts_params(data)) + theme_bw())
dev.off()
}
posterior_prediction_plots_differentRates_palbociclib = function(model_data, stan_data, name, dox = "Dox"){
predictions_cc = as.data.frame(model_data, pars = c('y_pred_cc_p') )%>%
gather(factor_key = TRUE) %>%
group_by(key) %>%
summarize(lowerBound = quantile(value, probs = 0.025),
median = quantile(value, probs = 0.5),
upperBound = quantile(value, probs = 0.975)) %>%
mutate(data = c(stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$y_obs$`G0/G1`,
stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$y_obs$S,
stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$y_obs$`G2/M`),
Stage = rep(c("G0/G1","S","G2/M"), each = stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$T_obs),
Day = rep(stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$t_obs,3),
drug_dose = rep(stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$experiment_obs, 3),
Experiment = rep("Palbociclib", 3*stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$T_obs)
) %>% separate(drug_dose, c("Fulvestrant","Palbociclib"), "_") %>%
mutate( Fulvestrant = as.numeric(Fulvestrant), Palbociclib = as.numeric(Palbociclib)) %>%
mutate(Palbociclib = case_when(Palbociclib == 0 ~ 0,
Palbociclib == 1 ~ 1.25e-08,
Palbociclib == 2 ~ 2.5e-08,
Palbociclib == 3 ~ 5e-08,
Palbociclib == 4 ~ 1e-07,
Palbociclib == 5 ~ 2e-07,
Palbociclib == 6 ~ 4e-07),
Fulvestrant = case_when(Fulvestrant == 0 ~ 0,
Fulvestrant == 1 ~ 6.5e-10,
Fulvestrant == 2 ~ 1.3e-09,
Fulvestrant == 3 ~ 2.6e-09,
Fulvestrant == 4 ~ 5.2e-09,
Fulvestrant == 5 ~ 1.04e-08,
Fulvestrant == 6 ~ 2.08e-08))
predictions_ct <- as.data.frame(model_data, pars = c('y_pred_ct_p')) %>%
gather(factor_key = TRUE) %>%
group_by(key) %>%
summarize(lowerBound = quantile(value, probs = 0.025),
median = quantile(value, probs = 0.5),
upperBound = quantile(value, probs = 0.975)) %>%
mutate(data = stan_data[[paste("palbo_ct_",dox,"_input", sep = "", collapse = "")]]$y_obs,
Day = stan_data[[paste("palbo_ct_",dox,"_input", sep = "", collapse = "")]]$t_obs,
drug_dose = stan_data[[paste("palbo_ct_",dox,"_input", sep = "", collapse = "")]]$experiment_obs,
Experiment = rep("Palbociclib", stan_data[[paste("palbo_ct_",dox,"_input", sep = "", collapse = "")]]$T_obs)
) %>% separate(drug_dose, c("Fulvestrant","Palbociclib"), "_") %>%
mutate(Palbociclib = as.numeric(Palbociclib),
Fulvestrant = as.numeric(Fulvestrant)) %>%
mutate(Palbociclib = case_when(Palbociclib == 0 ~ 0,
Palbociclib == 1 ~ 1.25e-08,
Palbociclib == 2 ~ 2.5e-08,
Palbociclib == 3 ~ 5e-08,
Palbociclib == 4 ~ 1e-07,
Palbociclib == 5 ~ 2e-07,
Palbociclib == 6 ~ 4e-07),
Fulvestrant = case_when(Fulvestrant == 0 ~ 0,
Fulvestrant == 1 ~ 6.5e-10,
Fulvestrant == 2 ~ 1.3e-09,
Fulvestrant == 3 ~ 2.6e-09,
Fulvestrant == 4 ~ 5.2e-09,
Fulvestrant == 5 ~ 1.04e-08,
Fulvestrant == 6 ~ 2.08e-08))
pdf(paste("posterior_predictions_",name,"_plots.pdf", sep = "", collapse = ""), useDingbats = F)
## Cell total (Palbociclib)
print(ggplot(predictions_ct %>% filter( Experiment == "Palbociclib"),
aes(x = Day, y = data)) + geom_point() +
geom_line(aes(x= Day, y = median)) +
geom_ribbon(aes(ymin = lowerBound, ymax = upperBound, color = NULL), alpha = 0.25) +
ylab("Cell Count") +
theme(legend.position = "bottom") +
facet_grid(Palbociclib ~ Fulvestrant, labeller = label_both, scales = "free_y") +
theme(axis.text.x = element_text(size=6), strip.text.x = element_text(size = 6)) + scale_y_log10()+
ggtitle(" Palbociclib experiment") + theme_bw() +
theme(panel.grid.major = element_blank(), panel.grid.minor = element_blank(), axis.line = element_line(colour = "black")))
## Cell cycle (Fulvestrant alone - Palbo exp)
print(ggplot(subset(predictions_cc, Palbociclib == 0 & Experiment == "Palbociclib"),
aes(x = Day, y = data, color = Stage)) + geom_point() +
geom_line(aes(x= Day, y = median)) +
geom_ribbon(aes(ymin = lowerBound, ymax = upperBound, fill = Stage, color = NULL), alpha = 0.25) +
ylab("Cell Count") +
theme(legend.position = "bottom") +
facet_grid(Stage~ Fulvestrant, labeller = label_both, scales = "free_y") + theme_bw() +
theme(panel.grid.major = element_blank(), panel.grid.minor = element_blank(), axis.line = element_line(colour = "black")) + scale_y_log10()+
ggtitle("Fulvestrant alone -- Palbociclib experiment"))
print(ggplot(subset(predictions_cc, Fulvestrant == 0 & Experiment == "Palbociclib"),
aes(x = Day, y = data, color = Stage)) + geom_point() +
geom_line(aes(x= Day, y = median)) +
geom_ribbon(aes(ymin = lowerBound, ymax = upperBound, fill = Stage, color = NULL), alpha = 0.25) +
ylab("Cell Count") +
theme(legend.position = "bottom") +
facet_grid(Stage~ Palbociclib, labeller = label_both, scales = "free_y") + theme_bw() +
theme(panel.grid.major = element_blank(), panel.grid.minor = element_blank(), axis.line = element_line(colour = "black")) + scale_y_log10()+
ggtitle("Palbociclib alone"))
print(ggplot(subset(predictions_cc, Fulvestrant >0 & Palbociclib > 0 & Experiment == "Palbociclib"),
aes(x = Day, y = data, color = Stage)) + geom_point() +
geom_line(aes(x= Day, y = median)) +
geom_ribbon(aes(ymin = lowerBound, ymax = upperBound, fill = Stage, color = NULL), alpha = 0.25) +
ylab("Cell Count") +
theme(legend.position = "bottom") +
facet_grid(Stage ~ Palbociclib + Fulvestrant, labeller = label_both, scales = "free_y") + theme_bw() +
theme(panel.grid.major = element_blank(), panel.grid.minor = element_blank(), axis.line = element_line(colour = "black")) + scale_y_log10() +
ggtitle("Fulvestrant + Palbociclib"))
dev.off()
}
posterior_prediction_plots = function(model_data, stan_data, name, dox = "Dox", skip_pass = FALSE){
predictions_cc = as.data.frame(model_data, pars = c('y_pred_cc_p', 'y_pred_cc_a') )%>%
gather(factor_key = TRUE) %>%
group_by(key) %>%
summarize(lowerBound = quantile(value, probs = 0.025),
median = quantile(value, probs = 0.5),
upperBound = quantile(value, probs = 0.975)) %>%
mutate(data = c(stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$y_obs$`G0/G1`,
stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$y_obs$S,
stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$y_obs$`G2/M`,
stan_data[[paste("abema_cc_",dox,"_input", sep = "", collapse = "")]]$y_obs$`G0/G1`,
stan_data[[paste("abema_cc_",dox,"_input", sep = "", collapse = "")]]$y_obs$S,
stan_data[[paste("abema_cc_",dox,"_input", sep = "", collapse = "")]]$y_obs$`G2/M`),
Stage = c(rep(c("G0/G1","S","G2/M"), each = stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$T_obs),
rep(c("G0/G1","S","G2/M"), each = stan_data[[paste("abema_cc_",dox,"_input", sep = "", collapse = "")]]$T_obs)),
Day = c(rep(stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$t_obs,3),
rep(stan_data[[paste("abema_cc_",dox,"_input", sep = "", collapse = "")]]$t_obs,3)),
drug_dose = c(rep(stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$experiment_obs, 3),
rep(stan_data[[paste("abema_cc_",dox,"_input", sep = "", collapse = "")]]$experiment_obs, 3)),
Experiment = c(rep("Palbociclib", 3*stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$T_obs),
rep("Abemaciclib", 3*stan_data[[paste("abema_cc_",dox,"_input", sep = "", collapse = "")]]$T_obs))
) %>% separate(drug_dose, c("Fulvestrant","Drug2"), "_") %>%
mutate(Palbociclib = as.numeric(case_when(Experiment == "Palbociclib" ~ Drug2,
TRUE ~ "0")),
Abemaciclib = as.numeric(case_when(Experiment == "Abemaciclib" ~ Drug2,
TRUE ~ "0")),
Fulvestrant = as.numeric(Fulvestrant)) %>% select(-Drug2) %>%
mutate(Palbociclib = case_when(Palbociclib == 0 ~ 0,
Palbociclib == 1 ~ 1.25e-08,
Palbociclib == 2 ~ 2.5e-08,
Palbociclib == 3 ~ 5e-08,
Palbociclib == 4 ~ 1e-07,
Palbociclib == 5 ~ 2e-07,
Palbociclib == 6 ~ 4e-07),
Abemaciclib = case_when(Abemaciclib == 0 ~ 0,
Abemaciclib == 1 ~ 8.02e-10,
Abemaciclib == 2 ~ 1.6e-09,
Abemaciclib == 3 ~ 3.21e-09,
Abemaciclib == 4 ~ 6.41e-09,
Abemaciclib == 5 ~ 1.28e-08,
Abemaciclib == 6 ~ 2.57e-08),
Fulvestrant = case_when(Fulvestrant == 0 ~ 0,
Fulvestrant == 1 ~ 6.5e-10,
Fulvestrant == 2 ~ 1.3e-09,
Fulvestrant == 3 ~ 2.6e-09,
Fulvestrant == 4 ~ 5.2e-09,
Fulvestrant == 5 ~ 1.04e-08,
Fulvestrant == 6 ~ 2.08e-08))
predictions_ct <- as.data.frame(model_data, pars = c('y_pred_ct_p','y_pred_ct_a')) %>%
gather(factor_key = TRUE) %>%
group_by(key) %>%
summarize(lowerBound = quantile(value, probs = 0.025),
median = quantile(value, probs = 0.5),
upperBound = quantile(value, probs = 0.975)) %>%
mutate(data = c(stan_data[[paste("palbo_ct_",dox,"_input", sep = "", collapse = "")]]$y_obs,
stan_data[[paste("abema_ct_",dox,"_input", sep = "", collapse = "")]]$y_obs),
Day = c(stan_data[[paste("palbo_ct_",dox,"_input", sep = "", collapse = "")]]$t_obs,
stan_data[[paste("abema_ct_",dox,"_input", sep = "", collapse = "")]]$t_obs),
drug_dose = c(stan_data[[paste("palbo_ct_",dox,"_input", sep = "", collapse = "")]]$experiment_obs,
stan_data[[paste("abema_ct_",dox,"_input", sep = "", collapse = "")]]$experiment_obs),
Experiment = c(rep("Palbociclib", stan_data[[paste("palbo_ct_",dox,"_input", sep = "", collapse = "")]]$T_obs),
rep("Abemaciclib", stan_data[[paste("abema_ct_",dox,"_input", sep = "", collapse = "")]]$T_obs))
) %>% separate(drug_dose, c("Fulvestrant","Drug2"), "_") %>%
mutate(Palbociclib = as.numeric(case_when(Experiment == "Palbociclib" ~ Drug2,
TRUE ~ "0")),
Abemaciclib = as.numeric(case_when(Experiment == "Abemaciclib" ~ Drug2,
TRUE ~ "0")),
Fulvestrant = as.numeric(Fulvestrant)) %>% select(-Drug2) %>%
mutate(Palbociclib = case_when(Palbociclib == 0 ~ 0,
Palbociclib == 1 ~ 1.25e-08,
Palbociclib == 2 ~ 2.5e-08,
Palbociclib == 3 ~ 5e-08,
Palbociclib == 4 ~ 1e-07,
Palbociclib == 5 ~ 2e-07,
Palbociclib == 6 ~ 4e-07),
Abemaciclib = case_when(Abemaciclib == 0 ~ 0,
Abemaciclib == 1 ~ 8.02e-10,
Abemaciclib == 2 ~ 1.6e-09,
Abemaciclib == 3 ~ 3.21e-09,
Abemaciclib == 4 ~ 6.41e-09,
Abemaciclib == 5 ~ 1.28e-08,
Abemaciclib == 6 ~ 2.57e-08),
Fulvestrant = case_when(Fulvestrant == 0 ~ 0,
Fulvestrant == 1 ~ 6.5e-10,
Fulvestrant == 2 ~ 1.3e-09,
Fulvestrant == 3 ~ 2.6e-09,
Fulvestrant == 4 ~ 5.2e-09,
Fulvestrant == 5 ~ 1.04e-08,
Fulvestrant == 6 ~ 2.08e-08)
)
if(!skip_pass){
predictions_pass <- as.data.frame(model_data, pars = c('y_pred_pass_p')) %>%
gather(factor_key = TRUE) %>%
group_by(key) %>%
summarize(lowerBound = quantile(value, probs = 0.025),
median = quantile(value, probs = 0.5),
upperBound = quantile(value, probs = 0.975)) %>%
mutate(data = stan_data[[paste("validation_data_1_",dox,"_input", sep = "", collapse = "")]]$y_obs,
Day = stan_data[[paste("validation_data_1_",dox,"_input", sep = "", collapse = "")]]$t_obs,
Experiment = stan_data[[paste("validation_data_1_",dox,"_input", sep = "", collapse = "")]]$experiment_obs)
}
pdf(paste("posterior_predictions_",name,"_plots.pdf", sep = "", collapse = ""), useDingbats = F)
## Cell total (Palbociclib)
print(ggplot(predictions_ct %>% filter( Experiment == "Palbociclib"),
aes(x = Day, y = data)) + geom_point() +
geom_line(aes(x= Day, y = median)) +
geom_ribbon(aes(ymin = lowerBound, ymax = upperBound, color = NULL), alpha = 0.25) +
ylab("Cell Count") +
theme(legend.position = "bottom") +
facet_grid(Palbociclib ~ Fulvestrant, labeller = label_both, scales = "free_y") +
theme(axis.text.x = element_text(size=6), strip.text.x = element_text(size = 6)) + scale_y_log10()+
ggtitle(" Palbociclib experiment") + theme_bw() +
theme(panel.grid.major = element_blank(), panel.grid.minor = element_blank(), axis.line = element_line(colour = "black")))
## Cell total (Abemaciclib)
print(ggplot(predictions_ct %>% filter( Experiment == "Abemaciclib"),
aes(x = Day, y = data)) + geom_point() +
geom_line(aes(x= Day, y = median)) +
geom_ribbon(aes(ymin = lowerBound, ymax = upperBound, color = NULL), alpha = 0.25) +
ylab("Cell Count") +
theme(legend.position = "bottom") +
facet_grid(Abemaciclib ~ Fulvestrant, labeller = label_both, scales = "free_y") +
theme(axis.text.x = element_text(size=6), strip.text.x = element_text(size = 6)) + scale_y_log10()+
ggtitle(" Abemaciclib experiment") + theme_bw() +
theme(panel.grid.major = element_blank(), panel.grid.minor = element_blank(), axis.line = element_line(colour = "black")))
## Cell cycle (Fulvestrant alone - Palbo exp)
print(ggplot(subset(predictions_cc, Palbociclib == 0 & Abemaciclib == 0 & Experiment == "Palbociclib"),
aes(x = Day, y = data, color = Stage)) + geom_point() +
geom_line(aes(x= Day, y = median)) +
geom_ribbon(aes(ymin = lowerBound, ymax = upperBound, fill = Stage, color = NULL), alpha = 0.25) +
ylab("Cell Count") +
theme(legend.position = "bottom") +
facet_grid(Stage~ Fulvestrant, labeller = label_both, scales = "free_y") + theme_bw() +
theme(panel.grid.major = element_blank(), panel.grid.minor = element_blank(), axis.line = element_line(colour = "black")) + scale_y_log10()+
ggtitle("Fulvestrant alone -- Palbociclib experiment"))
## cell cycle (Fulvestrant alone - abema exp)
print(ggplot(subset(predictions_cc, Palbociclib == 0 & Abemaciclib == 0 & Experiment == "Abemaciclib"),
aes(x = Day, y = data, color = Stage)) + geom_point() +
geom_line(aes(x= Day, y = median)) +
geom_ribbon(aes(ymin = lowerBound, ymax = upperBound, fill = Stage, color = NULL), alpha = 0.25) +
ylab("Cell Count") +
theme(legend.position = "bottom") +
facet_grid(Stage~ Fulvestrant, labeller = label_both, scales = "free_y") + theme_bw() +
theme(panel.grid.major = element_blank(), panel.grid.minor = element_blank(), axis.line = element_line(colour = "black")) + scale_y_log10() +
ggtitle("Fulvestrant alone -- Abemaciclib experiment"))
print(ggplot(subset(predictions_cc, Fulvestrant == 0 & Abemaciclib == 0 & Experiment == "Palbociclib"),
aes(x = Day, y = data, color = Stage)) + geom_point() +
geom_line(aes(x= Day, y = median)) +
geom_ribbon(aes(ymin = lowerBound, ymax = upperBound, fill = Stage, color = NULL), alpha = 0.25) +
ylab("Cell Count") +
theme(legend.position = "bottom") +
facet_grid(Stage~ Palbociclib, labeller = label_both, scales = "free_y") + theme_bw() +
theme(panel.grid.major = element_blank(), panel.grid.minor = element_blank(), axis.line = element_line(colour = "black")) + scale_y_log10()+
ggtitle("Palbociclib alone"))
print(ggplot(subset(predictions_cc, Fulvestrant == 0 & Palbociclib == 0 & Experiment == "Abemaciclib"),
aes(x = Day, y = data, color = Stage)) + geom_point() +
geom_line(aes(x= Day, y = median)) +
geom_ribbon(aes(ymin = lowerBound, ymax = upperBound, fill = Stage, color = NULL), alpha = 0.25) +
ylab("Cell Count") +
theme(legend.position = "bottom") +
facet_grid(Stage~ Abemaciclib, labeller = label_both, scales = "free_y") + theme_bw() +
theme(panel.grid.major = element_blank(), panel.grid.minor = element_blank(), axis.line = element_line(colour = "black")) + scale_y_log10()+
ggtitle("Abemaciclib alone"))
print(ggplot(subset(predictions_cc, Fulvestrant >0 & Palbociclib > 0 & Experiment == "Palbociclib"),
aes(x = Day, y = data, color = Stage)) + geom_point() +
geom_line(aes(x= Day, y = median)) +
geom_ribbon(aes(ymin = lowerBound, ymax = upperBound, fill = Stage, color = NULL), alpha = 0.25) +
ylab("Cell Count") +
theme(legend.position = "bottom") +
facet_grid(Stage ~ Palbociclib + Fulvestrant, labeller = label_both, scales = "free_y") + theme_bw() +
theme(panel.grid.major = element_blank(), panel.grid.minor = element_blank(), axis.line = element_line(colour = "black")) + scale_y_log10() +
ggtitle("Fulvestrant + Palbociclib"))
print(ggplot(subset(predictions_cc, Fulvestrant >0 & Abemaciclib > 0 & Experiment == "Abemaciclib"),
aes(x = Day, y = data, color = Stage)) + geom_point() +
geom_line(aes(x= Day, y = median)) +
geom_ribbon(aes(ymin = lowerBound, ymax = upperBound, fill = Stage, color = NULL), alpha = 0.25) +
ylab("Cell Count") +
theme(legend.position = "bottom") +
facet_grid(Stage ~ Abemaciclib + Fulvestrant, labeller = label_both, scales = "free_y") + theme_bw() +
theme(panel.grid.major = element_blank(), panel.grid.minor = element_blank(), axis.line = element_line(colour = "black")) +
scale_y_log10() +
ggtitle("Fulvestrant + Abemaciclib"))
if(!skip_pass){
print(ggplot(predictions_pass, aes(x = Day, y = data)) + geom_point() +
geom_line(aes(x = Day, y = median)) +
geom_ribbon(aes(ymin=lowerBound, ymax = upperBound, color = NULL, alpha = 0.25)) +
ylab("Cell Count") +
theme(legend.position = "bottom") +
facet_wrap(~Experiment, labeller = label_both, scales = "free_y") + theme_bw() +
theme(panel.grid.major = element_blank(), panel.grid.minor = element_blank(), axis.line = element_line(colour = "black")) + scale_y_log10() +
ggtitle("Longterm data") + theme(position = NULL))
print(ggplot(predictions_pass %>% mutate(Day = round(Day)), aes(x = Day, y = data)) + geom_point(size=2) +
geom_errorbar(data = predictions_pass %>% mutate(Day = round(Day)) %>% group_by(Day, Experiment) %>% mutate(lowerBound = min(lowerBound), upperBound = max(upperBound), median = median(median)),
aes(x = Day, ymin = lowerBound, ymax = upperBound)) +
facet_wrap(~Experiment, labeller = label_both, scales = "free_y") + theme_bw() +
theme(panel.grid.major = element_blank(), panel.grid.minor = element_blank(), axis.line = element_line(colour = "black")) +
ggtitle("Longterm data") )
}
dev.off()
}
predictions_cc = as.data.frame(test_singleparameter_6ODES_even_1000iters_3rates, pars = c('y_pred_cc_p') )%>%
gather(factor_key = TRUE) %>%
group_by(key) %>%
summarize(lowerBound = quantile(value, probs = 0.025),
median = quantile(value, probs = 0.5),
upperBound = quantile(value, probs = 0.975))
p_cc_mean <- mean(predictions_cc$median)
p_cc_lower <- mean(predictions_cc$lowerBound)
p_cc_upper <- mean(predictions_cc$upperBound)
p_cc_upper - p_cc_lower
predictions_ct = as.data.frame(test_singleparameter_6ODES_even_1000iters_3rates, pars = c('y_pred_ct_p') )%>%
gather(factor_key = TRUE) %>%
group_by(key) %>%
summarize(lowerBound = quantile(value, probs = 0.025),
median = quantile(value, probs = 0.5),
upperBound = quantile(value, probs = 0.975))
p_ct_mean <- mean(predictions_ct$median)
p_ct_lower <- mean(predictions_ct$lowerBound)
p_ct_upper <- mean(predictions_ct$upperBound)
p_ct_upper - p_ct_lower
posterior_prediction_plots_no_abema = function(model_data, stan_data, name, dox = "Dox", skip_pass = FALSE){
predictions_cc = as.data.frame(model_data, pars = c('y_pred_cc_p') )%>%
gather(factor_key = TRUE) %>%
group_by(key) %>%
summarize(lowerBound = quantile(value, probs = 0.025),
median = quantile(value, probs = 0.5),
upperBound = quantile(value, probs = 0.975)) %>%
mutate(data = c(stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$y_obs$`G0/G1`,
stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$y_obs$S,
stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$y_obs$`G2/M`),
Stage = c(rep(c("G0/G1","S","G2/M"), each = stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$T_obs)),
Day = c(rep(stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$t_obs,3)),
drug_dose = c(rep(stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$experiment_obs, 3)),
Experiment = c(rep("Palbociclib", 3*stan_data[[paste("palbo_cc_",dox,"_input", sep = "", collapse = "")]]$T_obs))
) %>% separate(drug_dose, c("Fulvestrant","Palbociclib"), "_") %>%
mutate(Fulvestrant = as.numeric(Fulvestrant),
Palbociclib = as.numeric(Palbociclib)) %>%
mutate(Palbociclib = case_when(Palbociclib == 0 ~ 0,
Palbociclib == 1 ~ 12.50,
Palbociclib == 2 ~ 25.00,
Palbociclib == 3 ~ 50.00,
Palbociclib == 4 ~ 100.00,
Palbociclib == 5 ~ 200.00,
Palbociclib == 6 ~ 400.00),
Fulvestrant = case_when(Fulvestrant == 0 ~ 0,
Fulvestrant == 1 ~ 1.25,
Fulvestrant == 2 ~ 2.50,
Fulvestrant == 3 ~ 5.00,
Fulvestrant == 4 ~ 10.00,
Fulvestrant == 5 ~ 20.00,
Fulvestrant == 6 ~ 40.00))
predictions_ct <- as.data.frame(model_data, pars = c('y_pred_ct_p')) %>%
gather(factor_key = TRUE) %>%
group_by(key) %>%
summarize(lowerBound = quantile(value, probs = 0.025),
median = quantile(value, probs = 0.5),
upperBound = quantile(value, probs = 0.975)) %>%
mutate(data = c(stan_data[[paste("palbo_ct_",dox,"_input", sep = "", collapse = "")]]$y_obs),
Day = c(stan_data[[paste("palbo_ct_",dox,"_input", sep = "", collapse = "")]]$t_obs),
drug_dose = c(stan_data[[paste("palbo_ct_",dox,"_input", sep = "", collapse = "")]]$experiment_obs),
Experiment = c(rep("Palbociclib", stan_data[[paste("palbo_ct_",dox,"_input", sep = "", collapse = "")]]$T_obs))
) %>% separate(drug_dose, c("Fulvestrant","Palbociclib"), "_") %>%
mutate(Fulvestrant = as.numeric(Fulvestrant),
Palbociclib = as.numeric(Palbociclib)) %>%
mutate(Palbociclib = case_when(Palbociclib == 0 ~ 0,
Palbociclib == 1 ~ 12.50,
Palbociclib == 2 ~ 25.50,
Palbociclib == 3 ~ 50.10,
Palbociclib == 4 ~ 100.00,
Palbociclib == 5 ~ 200.00,
Palbociclib == 6 ~ 401.00),
Fulvestrant = case_when(Fulvestrant == 0 ~ 0,
Fulvestrant == 1 ~ 0.65,
Fulvestrant == 2 ~ 1.30,
Fulvestrant == 3 ~ 2.60,
Fulvestrant == 4 ~ 5.20,
Fulvestrant == 5 ~ 10.40,
Fulvestrant == 6 ~ 20.80)
)
if(!skip_pass){
predictions_pass <- as.data.frame(model_data, pars = c('y_pred_pass_p')) %>%
gather(factor_key = TRUE) %>%
group_by(key) %>%
summarize(lowerBound = quantile(value, probs = 0.025),
median = quantile(value, probs = 0.5),
upperBound = quantile(value, probs = 0.975)) %>%
mutate(data = stan_data[[paste("validation_data_1_",dox,"_input", sep = "", collapse = "")]]$y_obs,
Day = stan_data[[paste("validation_data_1_",dox,"_input", sep = "", collapse = "")]]$t_obs,
Experiment = stan_data[[paste("validation_data_1_",dox,"_input", sep = "", collapse = "")]]$experiment_obs)
}
pdf(paste("posterior_predictions_",name,"_plots.pdf", sep = "", collapse = ""), useDingbats = F)
## Cell total (Palbociclib)
print(ggplot(predictions_ct %>% filter( Experiment == "Palbociclib"),
aes(x = Day, y = data)) + geom_point() +
geom_line(aes(x= Day, y = median)) +
geom_ribbon(aes(ymin = lowerBound, ymax = upperBound, color = NULL), alpha = 0.25) +
ylab("Cell Count") +
theme(legend.position = "bottom") +
facet_grid(Palbociclib ~ Fulvestrant, labeller = label_both, scales = "free_y") +
theme(axis.text.x = element_text(size=6), strip.text.x = element_text(size = 6)) + scale_y_log10()+
ggtitle(" Palbociclib experiment") + theme_bw() +
theme(panel.grid.major = element_blank(), panel.grid.minor = element_blank(), axis.line = element_line(colour = "black")))
## Cell cycle (Fulvestrant alone - Palbo exp)
print(ggplot(subset(predictions_cc, Palbociclib == 0 & Experiment == "Palbociclib"),
aes(x = Day, y = data, color = Stage)) + geom_point() +
geom_line(aes(x= Day, y = median)) +
geom_ribbon(aes(ymin = lowerBound, ymax = upperBound, fill = Stage, color = NULL), alpha = 0.25) +
ylab("Cell Count") +
theme(legend.position = "bottom") +
facet_grid(Stage~ Fulvestrant, labeller = label_both, scales = "free_y") + theme_bw() +
theme(panel.grid.major = element_blank(), panel.grid.minor = element_blank(), axis.line = element_line(colour = "black")) + scale_y_log10()+
ggtitle("Fulvestrant alone -- Palbociclib experiment"))
print(ggplot(subset(predictions_cc, Fulvestrant == 0 & Experiment == "Palbociclib"),
aes(x = Day, y = data, color = Stage)) + geom_point() +
geom_line(aes(x= Day, y = median)) +
geom_ribbon(aes(ymin = lowerBound, ymax = upperBound, fill = Stage, color = NULL), alpha = 0.25) +
ylab("Cell Count") +
theme(legend.position = "bottom") +
facet_grid(Stage~ Palbociclib, labeller = label_both, scales = "free_y") + theme_bw() +
theme(panel.grid.major = element_blank(), panel.grid.minor = element_blank(), axis.line = element_line(colour = "black")) + scale_y_log10()+
ggtitle("Palbociclib alone"))
print(ggplot(subset(predictions_cc, Fulvestrant >0 & Palbociclib > 0 & Experiment == "Palbociclib"),
aes(x = Day, y = data, color = Stage)) + geom_point() +
geom_line(aes(x= Day, y = median)) +
geom_ribbon(aes(ymin = lowerBound, ymax = upperBound, fill = Stage, color = NULL), alpha = 0.25) +
ylab("Cell Count") +
theme(legend.position = "bottom") +
facet_grid(Stage ~ Palbociclib + Fulvestrant, labeller = label_both, scales = "free_y") + theme_bw() +
theme(panel.grid.major = element_blank(), panel.grid.minor = element_blank(), axis.line = element_line(colour = "black")) + scale_y_log10() +
ggtitle("Fulvestrant + Palbociclib"))
if(!skip_pass){
print(ggplot(predictions_pass, aes(x = Day, y = data)) + geom_point() +
geom_line(aes(x = Day, y = median)) +
geom_ribbon(aes(ymin=lowerBound, ymax = upperBound, color = NULL, alpha = 0.25)) +
ylab("Cell Count") +
theme(legend.position = "bottom") +
facet_wrap(~Experiment, labeller = label_both, scales = "free_y") + theme_bw() +
theme(panel.grid.major = element_blank(), panel.grid.minor = element_blank(), axis.line = element_line(colour = "black")) + scale_y_log10() +
ggtitle("Longterm data") + theme(position = NULL))
print(ggplot(predictions_pass %>% mutate(Day = round(Day)), aes(x = Day, y = data)) + geom_point(size=2) +
geom_errorbar(data = predictions_pass %>% mutate(Day = round(Day)) %>% group_by(Day, Experiment) %>% mutate(lowerBound = min(lowerBound), upperBound = max(upperBound), median = median(median)),
aes(x = Day, ymin = lowerBound, ymax = upperBound)) +
facet_wrap(~Experiment, labeller = label_both, scales = "free_y") + theme_bw() +
theme(panel.grid.major = element_blank(), panel.grid.minor = element_blank(), axis.line = element_line(colour = "black")) +
ggtitle("Longterm data") )
}
dev.off()
}
validation_schedules_96well = function(validation_fn, model_data, name,
params= c("alpha","b_P","b_F","b_A","c_P","c_F","c_A","a_FP","a_FA","b","Gamma","delta","K"),
num_iters = 2250, dox = "Dox"){
valid2 = read.csv(validation_fn)
valid2_tx = valid2 %>% filter(Measurement == "Treatment", Dox == dox) %>% arrange(Experiment, Day) %>%
mutate(Fulvestrant = Fulvestrant * 1e6, Palbociclib = Palbociclib * 1e6, Abemaciclib = Abemaciclib*1e6)
valid2_obs = valid2 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
params_for_prediction = as.data.frame(model_data, pars = params)
init_conditions = as.data.frame(model_data, pars = c("y0_pass_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$AA = 0
init_conditions$PP = 0
init_conditions$FF = 0
## with passaging
prediction_results = data.frame()
for (exp in unique(valid2_tx$Experiment)){
pred = lapply(1:num_iters,
function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_validation_wrapper(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 28,
dose_times = (valid2_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid2_tx %>% filter(Experiment == exp) %>%
select (Fulvestrant,Palbociclib,Abemaciclib,Passage)) })
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1 + S + G2M) %>%
summarize(total_mean = mean(total), total_low = quantile(total, 0.025, na.rm = T),
total_high = quantile(total, 0.975, na.rm = T), total_median = median(total, na.rm = T),
AA = mean(AA), PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("from_cluster_include_longterm_data/plots/posterior_predictions_",name,"_posterior-predictions-passaging.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid2_obs, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) + facet_wrap(Experiment ~. , ncol = 5))
dev.off()
prediction_results_no_passaging = data.frame()
for (exp in unique(valid2_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_validation_wrapper_no_passaging(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 28,
dose_times = (valid2_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid2_tx %>% filter(Experiment == exp) %>%
select (Fulvestrant,Palbociclib,Abemaciclib,Passage))})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1 + S + G2M) %>%
summarize(total_mean = mean(total, na.rm = T), total_low = quantile(total, 0.025, na.rm = T),
total_high = quantile(total, 0.975, na.rm = T), total_median = median(total, na.rm = T),
AA = mean(AA), PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results_no_passaging = bind_rows(prediction_results_no_passaging, pred_summary)
}
pdf(paste("from_cluster_include_longterm_data/plots/posterior_predictions_", name, "_posterior-predictions-noPassaging.pdf",sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results_no_passaging, aes(x = time, y = total_median)) + geom_line() + geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) + facet_wrap(Experiment ~. , ncol = 5) +scale_y_log10())
print(ggplot(prediction_results_no_passaging, aes(x = time, y = total_median, color = Experiment)) + geom_line() + scale_y_log10())
dev.off()
write.csv(prediction_results_no_passaging %>% filter(time ==28) %>% arrange(-total_median),
paste("from_cluster_include_longterm_data/plots/posterior_predictions_", name, "_posterior-predictions_noPassaging_ranks.csv", sep = "", collapse = ""),
quote = FALSE, row.names = FALSE)
}
validation_schedules_24well = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","b_A","c_P","c_F","c_A","a_FP","a_FA","b","Gamma","delta","K"),
num_iters = 2250, dox = "Dox"){
valid1_24wells = read.csv(validation_fn)
valid1_24wells_tx = valid1_24wells %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e6, Palbociclib = Palbociclib * 1e6, Abemaciclib = Abemaciclib*1e6)
valid1_24wells_obs = valid1_24wells %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$AA = 0
init_conditions$PP = 0
init_conditions$FF = 0
init_conditions_24wells = init_conditions %>%
mutate(total = G0G1 + S + G2M, G0G1 = G0G1/total, S = S/total, G2M = G2M/total) %>%
mutate(total = mean((valid1_24wells_obs %>% filter(Day == 0, Experiment == "Control"))$total_count)) %>%
mutate(G0G1 = G0G1 * total,
S = S * total,
G2M = G2M * total) %>%
select(-total)
params_for_prediction = as.data.frame(model_data, pars = params)
names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
params_for_prediction_24wells = params_for_prediction %>% mutate(K = 6)
prediction_results_24wells = data.frame()
for (exp in unique(valid1_24wells_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_validation_wrapper(params = params_for_prediction_24wells[i,],
init = init_conditions_24wells[i,],
tmax = 28,
dose_times = (valid1_24wells_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_24wells_tx %>% filter(Experiment == exp) %>%
select (Fulvestrant,Palbociclib,Abemaciclib,Passage))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1 + S + G2M) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
AA = mean(AA), PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results_24wells = bind_rows(prediction_results_24wells, pred_summary)
}
pdf(paste("from_cluster_include_longterm_data/plots/posterior_predictions_",name, "_posterior-predictions-passaging-24wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results_24wells, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_24wells_obs, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 3))
print(ggplot(prediction_results_24wells %>% filter(Experiment == "Control"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_24wells_obs %>% filter(Experiment == "Control"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_24well_palbo_only = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F","a_FP","b_min","Gamma","delta","K"),
num_iters = 1500, dox = "Dox"){
valid1_24wells = read.csv(validation_fn)
valid1_24wells_tx = valid1_24wells %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e10, Palbociclib = Palbociclib * 1e8, Abemaciclib = Abemaciclib*1e6)
valid1_24wells_obs = valid1_24wells %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$AA = 0
init_conditions$PP = 0
init_conditions$FF = 0
init_conditions_24wells = init_conditions %>%
mutate(total = G0G1 + S + G2M, G0G1 = G0G1/total, S = S/total, G2M = G2M/total) %>%
mutate(total = mean((valid1_24wells_obs %>% filter(Day == 0, Experiment == "Control"))$total_count)) %>%
mutate(G0G1 = G0G1 * total,
S = S * total,
G2M = G2M * total) %>%
select(-total)
params_for_prediction = as.data.frame(model_data, pars = params)
names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
params_for_prediction_24wells = params_for_prediction %>% mutate(K = 6)
params_for_prediction_24wells = params_for_prediction_24wells %>% mutate(b = b_min, c_A = 1, b_A = 1, a_FA = 0)
prediction_results_24wells = data.frame()
for (exp in unique(valid1_24wells_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_validation_wrapper(params = params_for_prediction_24wells[i,],
init = init_conditions_24wells[i,],
tmax = 28,
dose_times = (valid1_24wells_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_24wells_tx %>% filter(Experiment == exp) %>%
select (Fulvestrant,Palbociclib,Abemaciclib,Passage))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1 + S + G2M) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results_24wells = bind_rows(prediction_results_24wells, pred_summary)
}
pdf(paste("from_cluster_include_longterm_data/plots/posterior_predictions_",name, "_posterior-predictions-passaging-24wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results_24wells, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_24wells_obs, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 3))
print(ggplot(prediction_results_24wells %>% filter(Experiment == "Control"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_24wells_obs %>% filter(Experiment == "Control"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only_mod = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F","a_FP","alpha_max"),
num_iters = 1500, dox = "Dox", passage_day = 5.1, passage_number = 5000){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e9, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
init_conditions = init_conditions/5
colnames(init_conditions) = c("G0G1_1","S_1","G2M_1")
init_conditions$G0G1_2 = init_conditions$G0G1_1
init_conditions$G0G1_3 = init_conditions$G0G1_1
init_conditions$G0G1_4 = init_conditions$G0G1_1
init_conditions$G0G1_5 = init_conditions$G0G1_1
init_conditions$S_2 = init_conditions$S_1
init_conditions$S_3 = init_conditions$S_1
init_conditions$S_4 = init_conditions$S_1
init_conditions$S_5 = init_conditions$S_1
init_conditions$G2M_2 = init_conditions$G2M_1
init_conditions$G2M_3 = init_conditions$G2M_1
init_conditions$G2M_4 = init_conditions$G2M_1
init_conditions$G2M_5 = init_conditions$G2M_1
init_conditions$PP = 0
init_conditions$FF = 0
params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction = params_for_prediction %>% mutate(b = alpha_max)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_validation_wrapper_mod(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
passage_day = passage_day,
passage_number = passage_number,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1_1 + G0G1_2 + G0G1_3 + G0G1_4 + G0G1_5 + S_1 + S_2 + S_3 + S_4 + S_5 + G2M_1 + G2M_2 + G2M_3 + G2M_4 + G2M_5) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("posterior_predictions_",name, "_posterior-predictions-passaging-10day-96wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_obs_long, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 4))
print(ggplot(prediction_results %>% filter(Experiment == "DMSO"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_obs_long %>% filter(Experiment == "DMSO"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only_multiple = function(validation_fn, model_data, name,
params = c("alpha", "Beta", "Gamma", "b_P","b_F","c_P","c_F","a_FP","alpha_max"),
num_iters = 1500, dox = "Dox", passage_day = 5.1, passage_number = 5000){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e9, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
init_conditions = init_conditions/5
colnames(init_conditions) = c("G0G1_1","S_1","G2M_1")
init_conditions$G0G1_2 = init_conditions$G0G1_1
init_conditions$G0G1_3 = init_conditions$G0G1_1
init_conditions$G0G1_4 = init_conditions$G0G1_1
init_conditions$G0G1_5 = init_conditions$G0G1_1
init_conditions$S_2 = init_conditions$S_1
init_conditions$S_3 = init_conditions$S_1
init_conditions$S_4 = init_conditions$S_1
init_conditions$S_5 = init_conditions$S_1
init_conditions$G2M_2 = init_conditions$G2M_1
init_conditions$G2M_3 = init_conditions$G2M_1
init_conditions$G2M_4 = init_conditions$G2M_1
init_conditions$G2M_5 = init_conditions$G2M_1
init_conditions$PP = 0
init_conditions$FF = 0
params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction = params_for_prediction %>% mutate(b = alpha_max)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_validation_wrapper_multiple(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
passage_day = passage_day,
passage_number = passage_number,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1_1 + G0G1_2 + G0G1_3 + G0G1_4 + G0G1_5 + S_1 + S_2 + S_3 + S_4 + S_5 + G2M_1 + G2M_2 + G2M_3 + G2M_4 + G2M_5) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("posterior_predictions_",name, "_posterior-predictions-passaging-10day-96wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_obs_long, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 4))
print(ggplot(prediction_results %>% filter(Experiment == "DMSO"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_obs_long %>% filter(Experiment == "DMSO"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only_IC = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F","a_FP","alpha_max"),
num_iters = 1500, dox = "Dox", passage_day = 5.1, passage_number = 5000){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e9, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_pp"))
colnames(init_conditions) = c("G0G1_1", "G0G1_2", "G0G1_3", "G0G1_4", "G0G1_5", "S_1", "S_2", "S_3", "S_4", "S_5", "G2M_1", "G2M_2", "G2M_3", "G2M_4", "G2M_5")
init_conditions$PP = 0
init_conditions$FF = 0
params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction = params_for_prediction %>% mutate(b = alpha_max)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_validation_wrapper_IC(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
passage_day = passage_day,
passage_number = passage_number,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1_1 + G0G1_2 + G0G1_3 + G0G1_4 + G0G1_5 + S_1 + S_2 + S_3 + S_4 + S_5 + G2M_1 + G2M_2 + G2M_3 + G2M_4 + G2M_5) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("posterior_predictions_",name, "_posterior-predictions-passaging-10day-96wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_obs_long, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 4))
print(ggplot(prediction_results %>% filter(Experiment == "DMSO"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_obs_long %>% filter(Experiment == "DMSO"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only_15_even = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F","a_FP","alpha_max"),
num_iters = 1500, dox = "Dox", passage_day = 5.1, passage_number = 5000){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e9, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
init_conditions = init_conditions/5
colnames(init_conditions) = c("G0G1_1","S_1","G2M_1")
init_conditions$G0G1_2 = init_conditions$G0G1_1
init_conditions$G0G1_3 = init_conditions$G0G1_1
init_conditions$G0G1_4 = init_conditions$G0G1_1
init_conditions$G0G1_5 = init_conditions$G0G1_1
init_conditions$S_2 = init_conditions$S_1
init_conditions$S_3 = init_conditions$S_1
init_conditions$S_4 = init_conditions$S_1
init_conditions$S_5 = init_conditions$S_1
init_conditions$G2M_2 = init_conditions$G2M_1
init_conditions$G2M_3 = init_conditions$G2M_1
init_conditions$G2M_4 = init_conditions$G2M_1
init_conditions$G2M_5 = init_conditions$G2M_1
init_conditions$PP = 0
init_conditions$FF = 0
params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction = params_for_prediction %>% mutate(b = alpha_max)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_validation_wrapper_15_even(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
passage_day = passage_day,
passage_number = passage_number,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1_1 + G0G1_2 + G0G1_3 + G0G1_4 + G0G1_5 + S_1 + S_2 + S_3 + S_4 + S_5 + G2M_1 + G2M_2 + G2M_3 + G2M_4 + G2M_5) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("posterior_predictions_",name, "_posterior-predictions-passaging-10day-96wells_15_even.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_obs_long, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 4))
print(ggplot(prediction_results %>% filter(Experiment == "DMSO"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_obs_long %>% filter(Experiment == "DMSO"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only_15_even_ND = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F","a_FP","alpha_max"),
num_iters = 1500, dox = "Dox", passage_day = 5.1, passage_number = 5000){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e14, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
init_conditions = init_conditions/5
colnames(init_conditions) = c("G0G1_1","S_1","G2M_1")
init_conditions$G0G1_2 = init_conditions$G0G1_1
init_conditions$G0G1_3 = init_conditions$G0G1_1
init_conditions$G0G1_4 = init_conditions$G0G1_1
init_conditions$G0G1_5 = init_conditions$G0G1_1
init_conditions$S_2 = init_conditions$S_1
init_conditions$S_3 = init_conditions$S_1
init_conditions$S_4 = init_conditions$S_1
init_conditions$S_5 = init_conditions$S_1
init_conditions$G2M_2 = init_conditions$G2M_1
init_conditions$G2M_3 = init_conditions$G2M_1
init_conditions$G2M_4 = init_conditions$G2M_1
init_conditions$G2M_5 = init_conditions$G2M_1
init_conditions$PP = 0
init_conditions$FF = 0
params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction = params_for_prediction %>% mutate(b = alpha_max)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_validation_wrapper_15_even_ND(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
passage_day = passage_day,
passage_number = passage_number,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1_1 + G0G1_2 + G0G1_3 + G0G1_4 + G0G1_5 + S_1 + S_2 + S_3 + S_4 + S_5 + G2M_1 + G2M_2 + G2M_3 + G2M_4 + G2M_5) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("posterior_predictions_",name, "_posterior-predictions-passaging-10day-96wells_15_even.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_obs_long, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 4))
print(ggplot(prediction_results %>% filter(Experiment == "DMSO"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_obs_long %>% filter(Experiment == "DMSO"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only_15_even_ful_G2 = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F","a_FP","alpha_max", "b_F_2", "c_F_2", "alpha_max_2"),
num_iters = 1500, dox = "Dox", passage_day = 5.1, passage_number = 5000){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e9, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
init_conditions = init_conditions/5
colnames(init_conditions) = c("G0G1_1","S_1","G2M_1")
init_conditions$G0G1_2 = init_conditions$G0G1_1
init_conditions$G0G1_3 = init_conditions$G0G1_1
init_conditions$G0G1_4 = init_conditions$G0G1_1
init_conditions$G0G1_5 = init_conditions$G0G1_1
init_conditions$S_2 = init_conditions$S_1
init_conditions$S_3 = init_conditions$S_1
init_conditions$S_4 = init_conditions$S_1
init_conditions$S_5 = init_conditions$S_1
init_conditions$G2M_2 = init_conditions$G2M_1
init_conditions$G2M_3 = init_conditions$G2M_1
init_conditions$G2M_4 = init_conditions$G2M_1
init_conditions$G2M_5 = init_conditions$G2M_1
init_conditions$PP = 0
init_conditions$FF = 0
params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction = params_for_prediction %>% mutate(b = alpha_max)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_validation_wrapper_15_even_ful_G2(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
passage_day = passage_day,
passage_number = passage_number,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1_1 + G0G1_2 + G0G1_3 + G0G1_4 + G0G1_5 + S_1 + S_2 + S_3 + S_4 + S_5 + G2M_1 + G2M_2 + G2M_3 + G2M_4 + G2M_5) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("posterior_predictions_",name, "_posterior-predictions-passaging-10day-96wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_obs_long, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 4))
print(ggplot(prediction_results %>% filter(Experiment == "DMSO"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_obs_long %>% filter(Experiment == "DMSO"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only_15_even_ful_S = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F","a_FP","alpha_max", "b_F_2", "c_F_2", "alpha_max_2"),
num_iters = 1500, dox = "Dox", passage_day = 5.1, passage_number = 5000){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e9, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
init_conditions = init_conditions/5
colnames(init_conditions) = c("G0G1_1","S_1","G2M_1")
init_conditions$G0G1_2 = init_conditions$G0G1_1
init_conditions$G0G1_3 = init_conditions$G0G1_1
init_conditions$G0G1_4 = init_conditions$G0G1_1
init_conditions$G0G1_5 = init_conditions$G0G1_1
init_conditions$S_2 = init_conditions$S_1
init_conditions$S_3 = init_conditions$S_1
init_conditions$S_4 = init_conditions$S_1
init_conditions$S_5 = init_conditions$S_1
init_conditions$G2M_2 = init_conditions$G2M_1
init_conditions$G2M_3 = init_conditions$G2M_1
init_conditions$G2M_4 = init_conditions$G2M_1
init_conditions$G2M_5 = init_conditions$G2M_1
init_conditions$PP = 0
init_conditions$FF = 0
params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction = params_for_prediction %>% mutate(b = alpha_max)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_validation_wrapper_15_even_ful_S(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
passage_day = passage_day,
passage_number = passage_number,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1_1 + G0G1_2 + G0G1_3 + G0G1_4 + G0G1_5 + S_1 + S_2 + S_3 + S_4 + S_5 + G2M_1 + G2M_2 + G2M_3 + G2M_4 + G2M_5) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("posterior_predictions_",name, "_posterior-predictions-passaging-10day-96wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_obs_long, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 4))
print(ggplot(prediction_results %>% filter(Experiment == "DMSO"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_obs_long %>% filter(Experiment == "DMSO"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only_14_G2 = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F","a_FP","alpha_max"),
num_iters = 1500, dox = "Dox", passage_day = 5.1, passage_number = 5000){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e9, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$G0G1_1 = init_conditions$G0G1/6
init_conditions$G0G1_2 = init_conditions$G0G1/6
init_conditions$G0G1_3 = init_conditions$G0G1/6
init_conditions$G0G1_4 = init_conditions$G0G1/6
init_conditions$G0G1_5 = init_conditions$G0G1/6
init_conditions$G0G1_6 = init_conditions$G0G1/6
init_conditions$S_1 = init_conditions$S/6
init_conditions$S_2 = init_conditions$S/6
init_conditions$S_3 = init_conditions$S/6
init_conditions$S_4 = init_conditions$S/6
init_conditions$S_5 = init_conditions$S/6
init_conditions$S_6 = init_conditions$S/6
init_conditions$G2M_1 = init_conditions$G2M/2
init_conditions$G2M_2 = init_conditions$G2M/2
init_conditions$PP = 0
init_conditions$FF = 0
init_conditions = subset(init_conditions, select = -c(1,2,3) )
params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction = params_for_prediction %>% mutate(b = alpha_max)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_validation_wrapper_14_G2(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
passage_day = passage_day,
passage_number = passage_number,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1_1 + G0G1_2 + G0G1_3 + G0G1_4 + G0G1_5 + G0G1_6 + S_1 + S_2 + S_3 + S_4 + S_5 + S_6 + G2M_1 + G2M_2) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("posterior_predictions_",name, "_posterior-predictions-passaging-10day-96wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_obs_long, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 4))
print(ggplot(prediction_results %>% filter(Experiment == "DMSO"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_obs_long %>% filter(Experiment == "DMSO"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only_14_G2_palbo = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F","a_FP","alpha_max", "b_P_2", "c_P_2", "alpha_max_2"),
num_iters = 1500, dox = "Dox", passage_day = 5.1, passage_number = 5000){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e9, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$G0G1_1 = init_conditions$G0G1/6
init_conditions$G0G1_2 = init_conditions$G0G1/6
init_conditions$G0G1_3 = init_conditions$G0G1/6
init_conditions$G0G1_4 = init_conditions$G0G1/6
init_conditions$G0G1_5 = init_conditions$G0G1/6
init_conditions$G0G1_6 = init_conditions$G0G1/6
init_conditions$S_1 = init_conditions$S/6
init_conditions$S_2 = init_conditions$S/6
init_conditions$S_3 = init_conditions$S/6
init_conditions$S_4 = init_conditions$S/6
init_conditions$S_5 = init_conditions$S/6
init_conditions$S_6 = init_conditions$S/6
init_conditions$G2M_1 = init_conditions$G2M/2
init_conditions$G2M_2 = init_conditions$G2M/2
init_conditions$PP = 0
init_conditions$FF = 0
init_conditions = subset(init_conditions, select = -c(1,2,3) )
params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction = params_for_prediction %>% mutate(b = alpha_max)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_validation_wrapper_14_G2_palbo(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
passage_day = passage_day,
passage_number = passage_number,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1_1 + G0G1_2 + G0G1_3 + G0G1_4 + G0G1_5 + G0G1_6 + S_1 + S_2 + S_3 + S_4 + S_5 + S_6 + G2M_1 + G2M_2 ) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("posterior_predictions_",name, "_posterior-predictions-passaging-10day-96wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_obs_long, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 4))
print(ggplot(prediction_results %>% filter(Experiment == "DMSO"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_obs_long %>% filter(Experiment == "DMSO"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only_14_G2_ful = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F","a_FP","alpha_max", "b_F_2", "c_F_2", "alpha_max_2"),
num_iters = 1500, dox = "Dox", passage_day = 5.1, passage_number = 5000){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e9, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$G0G1_1 = init_conditions$G0G1/6
init_conditions$G0G1_2 = init_conditions$G0G1/6
init_conditions$G0G1_3 = init_conditions$G0G1/6
init_conditions$G0G1_4 = init_conditions$G0G1/6
init_conditions$G0G1_5 = init_conditions$G0G1/6
init_conditions$G0G1_6 = init_conditions$G0G1/6
init_conditions$S_1 = init_conditions$S/6
init_conditions$S_2 = init_conditions$S/6
init_conditions$S_3 = init_conditions$S/6
init_conditions$S_4 = init_conditions$S/6
init_conditions$S_5 = init_conditions$S/6
init_conditions$S_6 = init_conditions$S/6
init_conditions$G2M_1 = init_conditions$G2M/2
init_conditions$G2M_2 = init_conditions$G2M/2
init_conditions$PP = 0
init_conditions$FF = 0
init_conditions = subset(init_conditions, select = -c(1,2,3) )
params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction = params_for_prediction %>% mutate(b = alpha_max)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_validation_wrapper_14_G2_ful(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
passage_day = passage_day,
passage_number = passage_number,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1_1 + G0G1_2 + G0G1_3 + G0G1_4 + G0G1_5 + G0G1_6 + S_1 + S_2 + S_3 + S_4 + S_5 + S_6 + G2M_1 + G2M_2 ) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("posterior_predictions_",name, "_posterior-predictions-passaging-10day-96wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_obs_long, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 4))
print(ggplot(prediction_results %>% filter(Experiment == "DMSO"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_obs_long %>% filter(Experiment == "DMSO"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only_16_G2_ND = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F","a_FP"),
num_iters = 1500, dox = "Dox", passage_day = 5.1, passage_number = 5000){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e14, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$G0G1_1 = init_conditions$G0G1/7
init_conditions$G0G1_2 = init_conditions$G0G1/7
init_conditions$G0G1_3 = init_conditions$G0G1/7
init_conditions$G0G1_4 = init_conditions$G0G1/7
init_conditions$G0G1_5 = init_conditions$G0G1/7
init_conditions$G0G1_6 = init_conditions$G0G1/7
init_conditions$G0G1_7 = init_conditions$G0G1/7
init_conditions$S_1 = init_conditions$S/7
init_conditions$S_2 = init_conditions$S/7
init_conditions$S_3 = init_conditions$S/7
init_conditions$S_4 = init_conditions$S/7
init_conditions$S_5 = init_conditions$S/7
init_conditions$S_6 = init_conditions$S/7
init_conditions$S_7 = init_conditions$S/7
init_conditions$G2M_1 = init_conditions$G2M/2
init_conditions$G2M_2 = init_conditions$G2M/2
init_conditions$PP = 0
init_conditions$FF = 0
init_conditions = subset(init_conditions, select = -c(1,2,3) )
params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction = params_for_prediction %>% mutate(b = alpha_max)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_validation_wrapper_16_G2_ND(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
passage_day = passage_day,
passage_number = passage_number,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1_1 + G0G1_2 + G0G1_3 + G0G1_4 + G0G1_5 + G0G1_6 + G0G1_7 + S_1 + S_2 + S_3 + S_4 + S_5 + S_6 + S_7 + G2M_1 + G2M_2 ) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("posterior_predictions_",name, "_posterior-predictions-passaging-10day-96wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_obs_long, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 4))
print(ggplot(prediction_results %>% filter(Experiment == "DMSO"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_obs_long %>% filter(Experiment == "DMSO"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only_16_G2_ful = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F","a_FP","alpha_max", "b_F_2", "c_F_2", "alpha_max_2"),
num_iters = 1500, dox = "Dox", passage_day = 5.1, passage_number = 5000){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e9, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$G0G1_1 = init_conditions$G0G1/7
init_conditions$G0G1_2 = init_conditions$G0G1/7
init_conditions$G0G1_3 = init_conditions$G0G1/7
init_conditions$G0G1_4 = init_conditions$G0G1/7
init_conditions$G0G1_5 = init_conditions$G0G1/7
init_conditions$G0G1_6 = init_conditions$G0G1/7
init_conditions$G0G1_7 = init_conditions$G0G1/7
init_conditions$S_1 = init_conditions$S/7
init_conditions$S_2 = init_conditions$S/7
init_conditions$S_3 = init_conditions$S/7
init_conditions$S_4 = init_conditions$S/7
init_conditions$S_5 = init_conditions$S/7
init_conditions$S_6 = init_conditions$S/7
init_conditions$S_7 = init_conditions$S/7
init_conditions$G2M_1 = init_conditions$G2M/2
init_conditions$G2M_2 = init_conditions$G2M/2
init_conditions$PP = 0
init_conditions$FF = 0
init_conditions = subset(init_conditions, select = -c(1,2,3) )
params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction = params_for_prediction %>% mutate(b = alpha_max)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_validation_wrapper_16_G2_ful(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
passage_day = passage_day,
passage_number = passage_number,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1_1 + G0G1_2 + G0G1_3 + G0G1_4 + G0G1_5 + G0G1_6 + G0G1_7 + S_1 + S_2 + S_3 + S_4 + S_5 + S_6 + S_7 + G2M_1 + G2M_2 ) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("posterior_predictions_",name, "_posterior-predictions-passaging-10day-96wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_obs_long, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 4))
print(ggplot(prediction_results %>% filter(Experiment == "DMSO"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_obs_long %>% filter(Experiment == "DMSO"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only_14_G2_both_G2_nonsyn = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F", "alpha_max", "b_P_2", "b_F_2", "c_P_2", "c_F_2", "alpha_max_2"),
num_iters = 1500, dox = "Dox", passage_day = 5.1, passage_number = 5000){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e9, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$G0G1_1 = init_conditions$G0G1/6
init_conditions$G0G1_2 = init_conditions$G0G1/6
init_conditions$G0G1_3 = init_conditions$G0G1/6
init_conditions$G0G1_4 = init_conditions$G0G1/6
init_conditions$G0G1_5 = init_conditions$G0G1/6
init_conditions$G0G1_6 = init_conditions$G0G1/6
init_conditions$S_1 = init_conditions$S/6
init_conditions$S_2 = init_conditions$S/6
init_conditions$S_3 = init_conditions$S/6
init_conditions$S_4 = init_conditions$S/6
init_conditions$S_5 = init_conditions$S/6
init_conditions$S_6 = init_conditions$S/6
init_conditions$G2M_1 = init_conditions$G2M/2
init_conditions$G2M_2 = init_conditions$G2M/2
init_conditions$PP = 0
init_conditions$FF = 0
init_conditions = subset(init_conditions, select = -c(1,2,3) )
params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction = params_for_prediction %>% mutate(b = alpha_max)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_validation_wrapper_14_G2_both_G2_nonsyn(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
passage_day = passage_day,
passage_number = passage_number,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1_1 + G0G1_2 + G0G1_3 + G0G1_4 + G0G1_5 + G0G1_6 + S_1 + S_2 + S_3 + S_4 + S_5 + S_6 + G2M_1 + G2M_2 ) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("posterior_predictions_",name, "_posterior-predictions-passaging-10day-96wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_obs_long, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 4))
print(ggplot(prediction_results %>% filter(Experiment == "DMSO"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_obs_long %>% filter(Experiment == "DMSO"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only_16_G2_both_G2_nonsyn = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F", "alpha_max", "b_P_2", "b_F_2", "c_P_2", "c_F_2", "alpha_max_2"),
num_iters = 1500, dox = "Dox", passage_day = 5.1, passage_number = 5000){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e9, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$G0G1_1 = init_conditions$G0G1/7
init_conditions$G0G1_2 = init_conditions$G0G1/7
init_conditions$G0G1_3 = init_conditions$G0G1/7
init_conditions$G0G1_4 = init_conditions$G0G1/7
init_conditions$G0G1_5 = init_conditions$G0G1/7
init_conditions$G0G1_6 = init_conditions$G0G1/7
init_conditions$G0G1_7 = init_conditions$G0G1/7
init_conditions$S_1 = init_conditions$S/7
init_conditions$S_2 = init_conditions$S/7
init_conditions$S_3 = init_conditions$S/7
init_conditions$S_4 = init_conditions$S/7
init_conditions$S_5 = init_conditions$S/7
init_conditions$S_6 = init_conditions$S/7
init_conditions$S_7 = init_conditions$S/7
init_conditions$G2M_1 = init_conditions$G2M/2
init_conditions$G2M_2 = init_conditions$G2M/2
init_conditions$PP = 0
init_conditions$FF = 0
init_conditions = subset(init_conditions, select = -c(1,2,3) )
params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction = params_for_prediction %>% mutate(b = alpha_max)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_validation_wrapper_16_G2_both_G2_nonsyn(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
passage_day = passage_day,
passage_number = passage_number,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1_1 + G0G1_2 + G0G1_3 + G0G1_4 + G0G1_5 + G0G1_6 + G0G1_7 + S_1 + S_2 + S_3 + S_4 + S_5 + S_6 + S_7 + G2M_1 + G2M_2 ) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("posterior_predictions_",name, "_posterior-predictions-passaging-10day-96wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_obs_long, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 4))
print(ggplot(prediction_results %>% filter(Experiment == "DMSO"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_obs_long %>% filter(Experiment == "DMSO"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only_16_G2_both_G2_nonsyn_ND = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F", "alpha_max", "b_P_2", "b_F_2", "c_P_2", "c_F_2", "alpha_max_2"),
num_iters = 1500, dox = "Dox", passage_day = 5.1, passage_number = 5000){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e14, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$G0G1_1 = init_conditions$G0G1/7
init_conditions$G0G1_2 = init_conditions$G0G1/7
init_conditions$G0G1_3 = init_conditions$G0G1/7
init_conditions$G0G1_4 = init_conditions$G0G1/7
init_conditions$G0G1_5 = init_conditions$G0G1/7
init_conditions$G0G1_6 = init_conditions$G0G1/7
init_conditions$G0G1_7 = init_conditions$G0G1/7
init_conditions$S_1 = init_conditions$S/7
init_conditions$S_2 = init_conditions$S/7
init_conditions$S_3 = init_conditions$S/7
init_conditions$S_4 = init_conditions$S/7
init_conditions$S_5 = init_conditions$S/7
init_conditions$S_6 = init_conditions$S/7
init_conditions$S_7 = init_conditions$S/7
init_conditions$G2M_1 = init_conditions$G2M/2
init_conditions$G2M_2 = init_conditions$G2M/2
init_conditions$PP = 0
init_conditions$FF = 0
init_conditions = subset(init_conditions, select = -c(1,2,3) )
params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction = params_for_prediction %>% mutate(b = alpha_max)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_validation_wrapper_16_G2_both_G2_nonsyn_ND(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
passage_day = passage_day,
passage_number = passage_number,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1_1 + G0G1_2 + G0G1_3 + G0G1_4 + G0G1_5 + G0G1_6 + G0G1_7 + S_1 + S_2 + S_3 + S_4 + S_5 + S_6 + S_7 + G2M_1 + G2M_2 ) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("posterior_predictions_",name, "_posterior-predictions-passaging-10day-96wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_obs_long, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 4))
print(ggplot(prediction_results %>% filter(Experiment == "DMSO"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_obs_long %>% filter(Experiment == "DMSO"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only_14_G2_both_G2_syn = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F", "a_FP", "a_FP_2", "alpha_max", "b_P_2", "b_F_2", "c_P_2", "c_F_2", "alpha_max_2"),
num_iters = 1500, dox = "Dox", passage_day = 5.1, passage_number = 5000){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e9, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$G0G1_1 = init_conditions$G0G1/6
init_conditions$G0G1_2 = init_conditions$G0G1/6
init_conditions$G0G1_3 = init_conditions$G0G1/6
init_conditions$G0G1_4 = init_conditions$G0G1/6
init_conditions$G0G1_5 = init_conditions$G0G1/6
init_conditions$G0G1_6 = init_conditions$G0G1/6
init_conditions$S_1 = init_conditions$S/6
init_conditions$S_2 = init_conditions$S/6
init_conditions$S_3 = init_conditions$S/6
init_conditions$S_4 = init_conditions$S/6
init_conditions$S_5 = init_conditions$S/6
init_conditions$S_6 = init_conditions$S/6
init_conditions$G2M_1 = init_conditions$G2M/2
init_conditions$G2M_2 = init_conditions$G2M/2
init_conditions$PP = 0
init_conditions$FF = 0
init_conditions = subset(init_conditions, select = -c(1,2,3) )
params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction = params_for_prediction %>% mutate(b = alpha_max)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_validation_wrapper_14_G2_both_G2_syn(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
passage_day = passage_day,
passage_number = passage_number,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1_1 + G0G1_2 + G0G1_3 + G0G1_4 + G0G1_5 + G0G1_6 + S_1 + S_2 + S_3 + S_4 + S_5 + S_6 + G2M_1 + G2M_2 ) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("posterior_predictions_",name, "_posterior-predictions-passaging-10day-96wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_obs_long, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 4))
print(ggplot(prediction_results %>% filter(Experiment == "DMSO"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_obs_long %>% filter(Experiment == "DMSO"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F","a_FP","alpha_max"),
num_iters = 1500, dox = "Dox", passage_day = 5.1, passage_number = 5000){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e9, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
init_conditions = init_conditions/5
colnames(init_conditions) = c("G0G1_1","S_1","G2M_1")
init_conditions$G0G1_2 = init_conditions$G0G1_1
init_conditions$G0G1_3 = init_conditions$G0G1_1
init_conditions$G0G1_4 = init_conditions$G0G1_1
init_conditions$G0G1_5 = init_conditions$G0G1_1
init_conditions$S_2 = init_conditions$S_1
init_conditions$S_3 = init_conditions$S_1
init_conditions$S_4 = init_conditions$S_1
init_conditions$S_5 = init_conditions$S_1
init_conditions$G2M_2 = init_conditions$G2M_1
init_conditions$G2M_3 = init_conditions$G2M_1
init_conditions$G2M_4 = init_conditions$G2M_1
init_conditions$G2M_5 = init_conditions$G2M_1
init_conditions$PP = 0
init_conditions$FF = 0
params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction = params_for_prediction %>% mutate(b = alpha_max)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_validation_wrapper(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
passage_day = passage_day,
passage_number = passage_number,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1_1 + G0G1_2 + G0G1_3 + G0G1_4 + G0G1_5 + S_1 + S_2 + S_3 + S_4 + S_5 + G2M_1 + G2M_2 + G2M_3 + G2M_4 + G2M_5) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("posterior_predictions_",name, "_posterior-predictions-passaging-10day-96wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_obs_long, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 4))
print(ggplot(prediction_results %>% filter(Experiment == "DMSO"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_obs_long %>% filter(Experiment == "DMSO"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only_ful = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F","a_FP","alpha_max", "b_F_2", "c_F_2", "alpha_max_2"),
num_iters = 1500, dox = "Dox", passage_day = 5.1, passage_number = 5000){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e9, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
init_conditions = init_conditions/5
colnames(init_conditions) = c("G0G1_1","S_1","G2M_1")
init_conditions$G0G1_2 = init_conditions$G0G1_1
init_conditions$G0G1_3 = init_conditions$G0G1_1
init_conditions$G0G1_4 = init_conditions$G0G1_1
init_conditions$G0G1_5 = init_conditions$G0G1_1
init_conditions$S_2 = init_conditions$S_1
init_conditions$S_3 = init_conditions$S_1
init_conditions$S_4 = init_conditions$S_1
init_conditions$S_5 = init_conditions$S_1
init_conditions$G2M_2 = init_conditions$G2M_1
init_conditions$G2M_3 = init_conditions$G2M_1
init_conditions$G2M_4 = init_conditions$G2M_1
init_conditions$G2M_5 = init_conditions$G2M_1
init_conditions$PP = 0
init_conditions$FF = 0
params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction = params_for_prediction %>% mutate(b = alpha_max)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_validation_wrapper_ful(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
passage_day = passage_day,
passage_number = passage_number,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1_1 + G0G1_2 + G0G1_3 + G0G1_4 + G0G1_5 + S_1 + S_2 + S_3 + S_4 + S_5 + G2M_1 + G2M_2 + G2M_3 + G2M_4 + G2M_5) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("posterior_predictions_",name, "_posterior-predictions-passaging-10day-96wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_obs_long, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 4))
print(ggplot(prediction_results %>% filter(Experiment == "DMSO"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_obs_long %>% filter(Experiment == "DMSO"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only_20_G2 = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F","a_FP","alpha_max"),
num_iters = 1500, dox = "Dox", passage_day = 5.1, passage_number = 5000){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e9, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$G0G1_1 = init_conditions$G0G1/8
init_conditions$G0G1_2 = init_conditions$G0G1/8
init_conditions$G0G1_3 = init_conditions$G0G1/8
init_conditions$G0G1_4 = init_conditions$G0G1/8
init_conditions$G0G1_5 = init_conditions$G0G1/8
init_conditions$G0G1_6 = init_conditions$G0G1/8
init_conditions$G0G1_7 = init_conditions$G0G1/8
init_conditions$G0G1_8 = init_conditions$G0G1/8
init_conditions$S_1 = init_conditions$S/8
init_conditions$S_2 = init_conditions$S/8
init_conditions$S_3 = init_conditions$S/8
init_conditions$S_4 = init_conditions$S/8
init_conditions$S_5 = init_conditions$S/8
init_conditions$S_6 = init_conditions$S/8
init_conditions$S_7 = init_conditions$S/8
init_conditions$S_8 = init_conditions$S/8
init_conditions$G2M_1 = init_conditions$G2M/4
init_conditions$G2M_2 = init_conditions$G2M/4
init_conditions$G2M_3 = init_conditions$G2M/4
init_conditions$G2M_4 = init_conditions$G2M/4
init_conditions$PP = 0
init_conditions$FF = 0
init_conditions = subset(init_conditions, select = -c(1,2,3) )
params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction = params_for_prediction %>% mutate(b = alpha_max)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_validation_wrapper_20_G2(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
passage_day = passage_day,
passage_number = passage_number,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1_1 + G0G1_2 + G0G1_3 + G0G1_4 + G0G1_5 + G0G1_6 + G0G1_7 + G0G1_8 + S_1 + S_2 + S_3 + S_4 + S_5 + S_6 + S_7 + S_8 + G2M_1 + G2M_2 + G2M_3 + G2M_4) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("posterior_predictions_",name, "_posterior-predictions-passaging-10day-96wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_obs_long, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 4))
print(ggplot(prediction_results %>% filter(Experiment == "DMSO"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_obs_long %>% filter(Experiment == "DMSO"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only_20_G2_ful = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F","a_FP","alpha_max", "b_F_2", "c_F_2", "alpha_max_2"),
num_iters = 1500, dox = "Dox", passage_day = 5.1, passage_number = 5000){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e9, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$G0G1_1 = init_conditions$G0G1/8
init_conditions$G0G1_2 = init_conditions$G0G1/8
init_conditions$G0G1_3 = init_conditions$G0G1/8
init_conditions$G0G1_4 = init_conditions$G0G1/8
init_conditions$G0G1_5 = init_conditions$G0G1/8
init_conditions$G0G1_6 = init_conditions$G0G1/8
init_conditions$G0G1_7 = init_conditions$G0G1/8
init_conditions$G0G1_8 = init_conditions$G0G1/8
init_conditions$S_1 = init_conditions$S/8
init_conditions$S_2 = init_conditions$S/8
init_conditions$S_3 = init_conditions$S/8
init_conditions$S_4 = init_conditions$S/8
init_conditions$S_5 = init_conditions$S/8
init_conditions$S_6 = init_conditions$S/8
init_conditions$S_7 = init_conditions$S/8
init_conditions$S_8 = init_conditions$S/8
init_conditions$G2M_1 = init_conditions$G2M/4
init_conditions$G2M_2 = init_conditions$G2M/4
init_conditions$G2M_3 = init_conditions$G2M/4
init_conditions$G2M_4 = init_conditions$G2M/4
init_conditions$PP = 0
init_conditions$FF = 0
init_conditions = subset(init_conditions, select = -c(1,2,3) )
params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction = params_for_prediction %>% mutate(b = alpha_max)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_validation_wrapper_20_G2_ful(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
passage_day = passage_day,
passage_number = passage_number,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1_1 + G0G1_2 + G0G1_3 + G0G1_4 + G0G1_5 + G0G1_6 + G0G1_7 + G0G1_8 + S_1 + S_2 + S_3 + S_4 + S_5 + S_6 + S_7 + S_8 + G2M_1 + G2M_2 + G2M_3 + G2M_4) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("posterior_predictions_",name, "_posterior-predictions-passaging-10day-96wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_obs_long, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 4))
print(ggplot(prediction_results %>% filter(Experiment == "DMSO"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_obs_long %>% filter(Experiment == "DMSO"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
validation_schedules_10day_96well_palbo_only_nopassaging = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F","a_FP","b_min","Gamma","delta","K"),
num_iters = 1500, dox = "Dox"){
valid1 = read.csv(validation_fn)
valid1_tx = valid1 %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e10, Palbociclib = Palbociclib * 1e8)
valid1_obs = valid1 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
valid1_obs_long = valid1_obs %>% pivot_longer(cols = contains("count"), names_to = "replicate", values_to = "total_count")
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$PP = 0
init_conditions$FF = 0
params_for_prediction = as.data.frame(model_data, pars = params)
names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
params_for_prediction = params_for_prediction %>% mutate(b = b_min, K = 1e10)
prediction_results = data.frame()
for (exp in unique(valid1_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_10day_no_passaging_validation_wrapper(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 10,
dose_times = (valid1_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_tx %>% filter(Experiment == exp) %>%
select(Fulvestrant,Palbociclib))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1 + S + G2M) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("from_cluster_include_longterm_data/plots/posterior_predictions_",name, "_posterior-predictions-no-passaging-10day-96wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median, color = Experiment)) + geom_line() +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL, fill = Experiment), alpha = 0.25))
dev.off()
return(prediction_results)
}
fourweek_schedule_prediction_no_passaging_24well_palbo_only = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F","a_FP","b_min","Gamma","delta","K"),
num_iters = 1500, dox = "Dox"){
valid1_24wells = read.csv(validation_fn)
valid1_24wells_tx = valid1_24wells %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e10, Palbociclib = Palbociclib * 1e8, Abemaciclib = Abemaciclib*1e6)
valid1_24wells_obs = valid1_24wells %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$AA = 0
init_conditions$PP = 0
init_conditions$FF = 0
init_conditions_24wells = init_conditions %>%
mutate(total = G0G1 + S + G2M, G0G1 = G0G1/total, S = S/total, G2M = G2M/total) %>%
mutate(total = mean((valid1_24wells_obs %>% filter(Day == 0, Experiment == "Control"))$total_count)) %>%
mutate(G0G1 = G0G1 * total,
S = S * total,
G2M = G2M * total) %>%
select(-total)
params_for_prediction = as.data.frame(model_data, pars = params)
names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
params_for_prediction_24wells = params_for_prediction %>% mutate(K = 6)
params_for_prediction_24wells = params_for_prediction_24wells %>% mutate(b = b_min, c_A = 1, b_A = 1, a_FA = 0)
prediction_results_24wells = data.frame()
for (exp in unique(valid1_24wells_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_validation_wrapper_no_passaging(params = params_for_prediction_24wells[i,],
init = init_conditions_24wells[i,],
tmax = 28,
dose_times = (valid1_24wells_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_24wells_tx %>% filter(Experiment == exp) %>%
select (Fulvestrant,Palbociclib,Abemaciclib,Passage))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1 + S + G2M) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results_24wells = bind_rows(prediction_results_24wells, pred_summary)
}
pdf(paste("from_cluster_include_longterm_data/plots/posterior_predictions_",name, "_posterior-predictions-no_passaging-24wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results_24wells, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_24wells_obs, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 3))
print(ggplot(prediction_results_24wells %>% filter(Experiment == "Control"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_24wells_obs %>% filter(Experiment == "Control"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
return(prediction_results_24wells)
}
validation_schedules_loglog_bmin_24well_palbo_only = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","c_P","c_F","a_FP","b_max","b_min","Gamma","delta","K"),
num_iters = 1500, dox = "Dox"){
valid1_24wells = read.csv(validation_fn)
valid1_24wells_tx = valid1_24wells %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e10, Palbociclib = Palbociclib * 1e8)
valid1_24wells_obs = valid1_24wells %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
#init_conditions$AA = 0
init_conditions$PP = 0
init_conditions$FF = 0
init_conditions_24wells = init_conditions %>%
mutate(total = G0G1 + S + G2M, G0G1 = G0G1/total, S = S/total, G2M = G2M/total) %>%
mutate(total = mean((valid1_24wells_obs %>% filter(Day == 0, Experiment == "Control"))$total_count)) %>%
mutate(G0G1 = G0G1 * total,
S = S * total,
G2M = G2M * total) %>%
select(-total)
params_for_prediction = as.data.frame(model_data, pars = params)
names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
params_for_prediction_24wells = params_for_prediction %>% mutate(K = 6)
prediction_results_24wells = data.frame()
for (exp in unique(valid1_24wells_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_bmin_palboOnly_validation_wrapper(params = params_for_prediction_24wells[i,],
init = init_conditions_24wells[i,],
tmax = 28,
dose_times = (valid1_24wells_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_24wells_tx %>% filter(Experiment == exp) %>%
select (Fulvestrant,Palbociclib,Passage))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1 + S + G2M) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results_24wells = bind_rows(prediction_results_24wells, pred_summary)
}
pdf(paste("from_cluster_include_longterm_data/plots/posterior_predictions_",name, "_posterior-predictions-passaging-24wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results_24wells, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_24wells_obs, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 3))
print(ggplot(prediction_results_24wells %>% filter(Experiment == "Control"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_24wells_obs %>% filter(Experiment == "Control"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
simulate_validation_schedules_24well = function(validation_fn, model_data, name,
params,
num_iters = 2250, dox = "Dox"){
valid1_24wells = read.csv(validation_fn)
valid1_24wells_tx = valid1_24wells %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e6, Palbociclib = Palbociclib * 1e6, Abemaciclib = Abemaciclib*1e6)
valid1_24wells_obs = valid1_24wells %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$AA = 0
init_conditions$PP = 0
init_conditions$FF = 0
init_conditions_24wells = init_conditions %>%
mutate(total = G0G1 + S + G2M, G0G1 = G0G1/total, S = S/total, G2M = G2M/total) %>%
mutate(total = mean((valid1_24wells_obs %>% filter(Day == 0, Experiment == "Control"))$total_count)) %>%
mutate(G0G1 = G0G1 * total,
S = S * total,
G2M = G2M * total) %>%
select(-total)
#params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction_24wells = params_for_prediction %>% mutate(K = 6)
prediction_results_24wells = data.frame()
for (exp in unique(valid1_24wells_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_validation_wrapper(params = params[i,],
init = init_conditions_24wells[i,],
tmax = 28,
dose_times = (valid1_24wells_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_24wells_tx %>% filter(Experiment == exp) %>%
select (Fulvestrant,Palbociclib,Abemaciclib,Passage))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1 + S + G2M) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
AA = mean(AA), PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results_24wells = bind_rows(prediction_results_24wells, pred_summary)
}
pdf(paste("from_cluster_include_longterm_data/plots/posterior_predictions_",name, "_posterior-predictions-passaging-24wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results_24wells, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_24wells_obs, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 3))
print(ggplot(prediction_results_24wells %>% filter(Experiment == "Control"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_24wells_obs %>% filter(Experiment == "Control"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
return(prediction_results_24wells)
}
simulate_validation_schedules_24well_oneSample = function(validation_fn, model_data,
params,
dox = "Dox"){
valid1_24wells = read.csv(validation_fn)
valid1_24wells_tx = valid1_24wells %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e6, Palbociclib = Palbociclib * 1e6, Abemaciclib = Abemaciclib*1e6)
valid1_24wells_obs = valid1_24wells %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$AA = 0
init_conditions$PP = 0
init_conditions$FF = 0
init_conditions_24wells = init_conditions %>%
mutate(total = G0G1 + S + G2M, G0G1 = G0G1/total, S = S/total, G2M = G2M/total) %>%
mutate(total = mean((valid1_24wells_obs %>% filter(Day == 0, Experiment == "Control"))$total_count)) %>%
mutate(G0G1 = G0G1 * total,
S = S * total,
G2M = G2M * total) %>%
select(-total)
#params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction_24wells = params_for_prediction %>% mutate(K = 6)
prediction_results_24wells = data.frame()
for (exp in unique(valid1_24wells_tx$Experiment)){
pred = ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_validation_wrapper(params = params,
init = init_conditions_24wells[1,],
tmax = 28,
dose_times = (valid1_24wells_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_24wells_tx %>% filter(Experiment == exp) %>%
select (Fulvestrant,Palbociclib,Abemaciclib,Passage))
pred_summary = bind_rows(pred, .id = "rep") %>%
mutate(total = G0G1 + S + G2M) %>% mutate(Experiment = exp)
prediction_results_24wells = bind_rows(prediction_results_24wells, pred_summary)
}
print(ggplot(prediction_results_24wells, aes(x = time, y = total)) + geom_line() +
geom_point(data = valid1_24wells_obs, aes(x = Day, y = total_count)) +
facet_wrap(Experiment ~. , ncol = 3))
#return(prediction_results_24wells)
}
simulate_validation_schedules_96well_test_prior_means = function(validation_fn, model_data,
params,
dox = "Dox",
drug2 = "Palbociclib"){
valid1_24wells = read.csv(validation_fn)
if(!("Palbociclib" %in% names(valid1_24wells))){
valid1_24wells = valid1_24wells %>% mutate(Palbociclib = 0)
} else{
valid1_24wells = valid1_24wells %>% mutate(Abemaciclib = 0)
}
valid1_24wells_tx = valid1_24wells %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e6, Palbociclib = Palbociclib * 1e6, Abemaciclib = Abemaciclib*1e6)
valid1_24wells_obs = valid1_24wells %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$AA = 0
init_conditions$PP = 0
init_conditions$FF = 0
prediction_results_24wells = data.frame()
for (exp in unique(valid1_24wells_tx$Experiment)){
pred = ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_validation_wrapper(params = params,
init = init_conditions[1,],
tmax = 6,
dose_times = (valid1_24wells_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_24wells_tx %>% filter(Experiment == exp) %>%
select (Fulvestrant,Palbociclib,Abemaciclib,Passage))
pred_summary = bind_rows(pred, .id = "rep") %>%
mutate(total = G0G1 + S + G2M) %>% mutate(Experiment = exp)
prediction_results_24wells = bind_rows(prediction_results_24wells, pred_summary)
}
prediction_results_24wells = prediction_results_24wells %>% separate(Experiment, into = c("Fulvestrant", drug2))
valid1_24wells_obs = valid1_24wells_obs %>% select(-Fulvestrant, -Abemaciclib, -Palbociclib) %>% separate(Experiment, into = c("Fulvestrant", drug2))
print(ggplot(prediction_results_24wells, aes(x = time, y = total)) + geom_line() +
geom_point(data = valid1_24wells_obs, aes(x = Day, y = total_count)) +
facet_grid(Palbociclib ~ Fulvestrant , labeller = label_both, scales = "free_y" ))
#return(prediction_results_24wells)
}
validation_schedules_96well_differentIC50 = function(validation_fn, model_data, name,
params= c("alpha","b_P","b_F","b_A","c_P","c_F","c_A","c_P_eff","c_A_eff","a_FP","a_FA","b","Gamma","delta","K"),
num_iters = 2250, dox = "Dox"){
valid2 = read.csv(validation_fn)
valid2_tx = valid2 %>% filter(Measurement == "Treatment", Dox == dox) %>% arrange(Experiment, Day) %>%
mutate(Fulvestrant = Fulvestrant * 1e6, Palbociclib = Palbociclib * 1e6, Abemaciclib = Abemaciclib*1e6)
valid2_obs = valid2 %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
params_for_prediction = as.data.frame(model_data, pars = params)
init_conditions = as.data.frame(model_data, pars = c("y0_pass_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$AA = 0
init_conditions$PP = 0
init_conditions$FF = 0
## with passaging
prediction_results = data.frame()
for (exp in unique(valid2_tx$Experiment)){
pred = lapply(1:num_iters,
function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_differentIC50_validation_wrapper(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 28,
dose_times = (valid2_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid2_tx %>% filter(Experiment == exp) %>%
select (Fulvestrant,Palbociclib,Abemaciclib,Passage)) })
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1 + S + G2M) %>%
summarize(total_mean = mean(total), total_low = quantile(total, 0.025, na.rm = T),
total_high = quantile(total, 0.975, na.rm = T), total_median = median(total, na.rm = T),
AA = mean(AA), PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results = bind_rows(prediction_results, pred_summary)
}
pdf(paste("from_cluster_include_longterm_data/plots/posterior_predictions_",name,"_posterior-predictions-passaging.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid2_obs, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) + facet_wrap(Experiment ~. , ncol = 5))
dev.off()
prediction_results_no_passaging = data.frame()
for (exp in unique(valid2_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_differentIC50_validation_wrapper(params = params_for_prediction[i,],
init = init_conditions[i,],
tmax = 28,
dose_times = (valid2_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid2_tx %>% filter(Experiment == exp) %>%
select (Fulvestrant,Palbociclib,Abemaciclib,Passage))})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1 + S + G2M) %>%
summarize(total_mean = mean(total, na.rm = T), total_low = quantile(total, 0.025, na.rm = T),
total_high = quantile(total, 0.975, na.rm = T), total_median = median(total, na.rm = T),
AA = mean(AA), PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results_no_passaging = bind_rows(prediction_results_no_passaging, pred_summary)
}
pdf(paste("from_cluster_include_longterm_data/plots/posterior_predictions_", name, "_posterior-predictions-noPassaging.pdf",sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results_no_passaging, aes(x = time, y = total_median)) + geom_line() + geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) + facet_wrap(Experiment ~. , ncol = 5) +scale_y_log10())
print(ggplot(prediction_results_no_passaging, aes(x = time, y = total_median, color = Experiment)) + geom_line() + scale_y_log10())
dev.off()
write.csv(prediction_results_no_passaging %>% filter(time ==28) %>% arrange(-total_median),
paste("from_cluster_include_longterm_data/plots/posterior_predictions_", name, "_posterior-predictions_noPassaging_ranks.csv", sep = "", collapse = ""),
quote = FALSE, row.names = FALSE)
}
validation_schedules_24well_differentIC50 = function(validation_fn, model_data, name,
params = c("alpha","b_P","b_F","b_A","c_P","c_F","c_A","c_P_eff","c_A_eff","a_FP","a_FA","b","Gamma","delta","K"),
num_iters = 2250, dox = "Dox"){
valid1_24wells = read.csv(validation_fn)
valid1_24wells_tx = valid1_24wells %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e6, Palbociclib = Palbociclib * 1e6, Abemaciclib = Abemaciclib*1e6)
valid1_24wells_obs = valid1_24wells %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$AA = 0
init_conditions$PP = 0
init_conditions$FF = 0
init_conditions_24wells = init_conditions %>%
mutate(total = G0G1 + S + G2M, G0G1 = G0G1/total, S = S/total, G2M = G2M/total) %>%
mutate(total = mean((valid1_24wells_obs %>% filter(Day == 0, Experiment == "Control"))$total_count)) %>%
mutate(G0G1 = G0G1 * total,
S = S * total,
G2M = G2M * total) %>%
select(-total)
params_for_prediction = as.data.frame(model_data, pars = params)
names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
params_for_prediction_24wells = params_for_prediction %>% mutate(K = 6)
prediction_results_24wells = data.frame()
for (exp in unique(valid1_24wells_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_differentIC50_validation_wrapper(params = params_for_prediction_24wells[i,],
init = init_conditions_24wells[i,],
tmax = 28,
dose_times = (valid1_24wells_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_24wells_tx %>% filter(Experiment == exp) %>%
select (Fulvestrant,Palbociclib,Abemaciclib,Passage))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1 + S + G2M) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
AA = mean(AA), PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results_24wells = bind_rows(prediction_results_24wells, pred_summary)
}
pdf(paste("from_cluster_include_longterm_data/plots/posterior_predictions_",name, "_posterior-predictions-passaging-24wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results_24wells, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_24wells_obs, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 3))
print(ggplot(prediction_results_24wells %>% filter(Experiment == "Control"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_24wells_obs %>% filter(Experiment == "Control"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
}
simulate_validation_schedules_24well_differentIC50 = function(validation_fn, model_data, name,
params,
num_iters = 2250, dox = "Dox"){
valid1_24wells = read.csv(validation_fn)
valid1_24wells_tx = valid1_24wells %>% filter(Measurement == "Treatment", Dox == dox) %>%
arrange(Experiment, Day) %>% mutate(Fulvestrant = Fulvestrant * 1e6, Palbociclib = Palbociclib * 1e6, Abemaciclib = Abemaciclib*1e6)
valid1_24wells_obs = valid1_24wells %>% filter(Measurement == "Observation", Dox == dox) %>% arrange(Experiment, Day)
init_conditions = as.data.frame(model_data, pars = c("y0_ct_p"))
colnames(init_conditions) = c("G0G1","S","G2M")
init_conditions$AA = 0
init_conditions$PP = 0
init_conditions$FF = 0
init_conditions_24wells = init_conditions %>%
mutate(total = G0G1 + S + G2M, G0G1 = G0G1/total, S = S/total, G2M = G2M/total) %>%
mutate(total = mean((valid1_24wells_obs %>% filter(Day == 0, Experiment == "Control"))$total_count)) %>%
mutate(G0G1 = G0G1 * total,
S = S * total,
G2M = G2M * total) %>%
select(-total)
#params_for_prediction = as.data.frame(model_data, pars = params)
#names(params_for_prediction) = str_replace(names(params_for_prediction), "gamma","Gamma")
#params_for_prediction_24wells = params_for_prediction %>% mutate(K = 6)
prediction_results_24wells = data.frame()
for (exp in unique(valid1_24wells_tx$Experiment)){
pred = lapply(1:num_iters, function(i){
ode_sigmoid_effectiveDose_g1s_transition_1_carrying_capacity_validation_wrapper(params = params[i,],
init = init_conditions_24wells[i,],
tmax = 28,
dose_times = (valid1_24wells_tx %>% filter(Experiment == exp))$Day,
drug_amount = valid1_24wells_tx %>% filter(Experiment == exp) %>%
select (Fulvestrant,Palbociclib,Abemaciclib,Passage))
})
pred_summary = bind_rows(pred, .id = "rep") %>% group_by(time) %>%
mutate(total = G0G1 + S + G2M) %>% summarize(total_mean = mean(total, na.rm = T),
total_low = quantile(total, 0.025, na.rm = T), total_high = quantile(total, 0.975, na.rm = T),
total_median = median(total, na.rm = T),
AA = mean(AA), PP = mean(PP), FF = mean(FF)) %>% mutate(Experiment = exp)
prediction_results_24wells = bind_rows(prediction_results_24wells, pred_summary)
}
pdf(paste("from_cluster_include_longterm_data/plots/posterior_predictions_",name, "_posterior-predictions-passaging-24wells.pdf", sep = "", collapse = ""), useDingbats = F)
print(ggplot(prediction_results_24wells, aes(x = time, y = total_median)) + geom_line() +
geom_point(data = valid1_24wells_obs, aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25) +
facet_wrap(Experiment ~. , ncol = 3))
print(ggplot(prediction_results_24wells %>% filter(Experiment == "Control"), aes(x = time, y = total_median)) +
geom_line() +
geom_point(data = valid1_24wells_obs %>% filter(Experiment == "Control"), aes(x = Day, y = total_count)) +
geom_ribbon(aes(ymin = total_low, ymax = total_high, linetype = NULL), alpha = 0.25))
dev.off()
return(prediction_results_24wells)
}
# ## plot beta as a function of palbocilib
# params_for_prediction_ND %>% summarize(b = median(b), c_P = median(c_P), b_P = median(b_P))
#
# f_palbo <- function(x) 2.544512*(1/(1 + (x/0.02920118)^0.9529028))
#
# curve(f_palbo, 0 , 1)
#
# params_for_prediction_ND %>% summarize(b = median(b), c_F = median(c_F), b_F = median(b_F))
# f_fulv <- function(x) 2.544512*(1/(1 + (x/0.005342141)^0.753689))
# curve(f_fulv, 0, 1) |
// src/__tests__/FAQSection.test.jsx
import { render, screen, fireEvent } from '@testing-library/react';
import FAQSection from '../components/FAQSection';
describe('FAQ Section', () => {
test('renders the FAQ title', () => {
render(<FAQSection />);
const titleElement = screen.getByText(/Frequently Asked Questions/i);
expect(titleElement).toBeInTheDocument();
});
test('displays FAQ questions', () => {
render(<FAQSection />);
const questionElement = screen.getByText(/What is BEES Foundation?/i);
expect(questionElement).toBeInTheDocument();
});
test('toggles answer visibility when a question is clicked', () => {
render(<FAQSection />);
const questionElement = screen.getByText(/What is BEES Foundation?/i);
fireEvent.click(questionElement);
const answerElement = screen.getByText(/BEES Foundation is an NGO focused on reforming society/i);
expect(answerElement).toBeInTheDocument();
});
}); |
man7.org > Linux > man-pages
Linux/UNIX system programming training
* * *
# fold(1p) -- Linux manual page
PROLOG | NAME | SYNOPSIS | DESCRIPTION | OPTIONS | OPERANDS | STDIN | INPUT
FILES | ENVIRONMENT VARIABLES | ASYNCHRONOUS EVENTS | STDOUT | STDERR | OUTPUT
FILES | EXTENDED DESCRIPTION | EXIT STATUS | CONSEQUENCES OF ERRORS |
APPLICATION USAGE | EXAMPLES | RATIONALE | FUTURE DIRECTIONS | SEE ALSO |
COPYRIGHT
FOLD(1P) POSIX Programmer's Manual FOLD(1P)
## PROLOG top
This manual page is part of the POSIX Programmer's Manual. The
Linux implementation of this interface may differ (consult the
corresponding Linux manual page for details of Linux behavior),
or the interface may not be implemented on Linux.
## NAME top
fold — filter for folding lines
## SYNOPSIS top
fold [-bs] [-w width] [file...]
## DESCRIPTION top
The fold utility is a filter that shall fold lines from its input
files, breaking the lines to have a maximum of width column
positions (or bytes, if the -b option is specified). Lines shall
be broken by the insertion of a <newline> such that each output
line (referred to later in this section as a segment) is the
maximum width possible that does not exceed the specified number
of column positions (or bytes). A line shall not be broken in the
middle of a character. The behavior is undefined if width is less
than the number of columns any single character in the input
would occupy.
If the <carriage-return>, <backspace>, or <tab> characters are
encountered in the input, and the -b option is not specified,
they shall be treated specially:
<backspace>
The current count of line width shall be decremented by
one, although the count never shall become negative.
The fold utility shall not insert a <newline>
immediately before or after any <backspace>, unless the
following character has a width greater than 1 and
would cause the line width to exceed width.
<carriage-return>
The current count of line width shall be set to zero.
The fold utility shall not insert a <newline>
immediately before or after any <carriage-return>.
<tab> Each <tab> encountered shall advance the column
position pointer to the next tab stop. Tab stops shall
be at each column position n such that n modulo 8
equals 1.
## OPTIONS top
The fold utility shall conform to the Base Definitions volume of
POSIX.1‐2017, Section 12.2, Utility Syntax Guidelines.
The following options shall be supported:
-b Count width in bytes rather than column positions.
-s If a segment of a line contains a <blank> within the
first width column positions (or bytes), break the line
after the last such <blank> meeting the width
constraints. If there is no <blank> meeting the
requirements, the -s option shall have no effect for
that output segment of the input line.
-w width Specify the maximum line length, in column positions
(or bytes if -b is specified). The results are
unspecified if width is not a positive decimal number.
The default value shall be 80.
## OPERANDS top
The following operand shall be supported:
file A pathname of a text file to be folded. If no file
operands are specified, the standard input shall be
used.
## STDIN top
The standard input shall be used if no file operands are
specified, and shall be used if a file operand is '-' and the
implementation treats the '-' as meaning standard input.
Otherwise, the standard input shall not be used. See the INPUT
FILES section.
## INPUT FILES top
If the -b option is specified, the input files shall be text
files except that the lines are not limited to {LINE_MAX} bytes
in length. If the -b option is not specified, the input files
shall be text files.
## ENVIRONMENT VARIABLES top
The following environment variables shall affect the execution of
fold:
LANG Provide a default value for the internationalization
variables that are unset or null. (See the Base
Definitions volume of POSIX.1‐2017, Section 8.2,
Internationalization Variables for the precedence of
internationalization variables used to determine the
values of locale categories.)
LC_ALL If set to a non-empty string value, override the values
of all the other internationalization variables.
LC_CTYPE Determine the locale for the interpretation of
sequences of bytes of text data as characters (for
example, single-byte as opposed to multi-byte
characters in arguments and input files), and for the
determination of the width in column positions each
character would occupy on a constant-width font output
device.
LC_MESSAGES
Determine the locale that should be used to affect the
format and contents of diagnostic messages written to
standard error.
NLSPATH Determine the location of message catalogs for the
processing of LC_MESSAGES.
## ASYNCHRONOUS EVENTS top
Default.
## STDOUT top
The standard output shall be a file containing a sequence of
characters whose order shall be preserved from the input files,
possibly with inserted <newline> characters.
## STDERR top
The standard error shall be used only for diagnostic messages.
## OUTPUT FILES top
None.
## EXTENDED DESCRIPTION top
None.
## EXIT STATUS top
The following exit values shall be returned:
0 All input files were processed successfully.
>0 An error occurred.
## CONSEQUENCES OF ERRORS top
Default.
The following sections are informative.
## APPLICATION USAGE top
The cut and fold utilities can be used to create text files out
of files with arbitrary line lengths. The cut utility should be
used when the number of lines (or records) needs to remain
constant. The fold utility should be used when the contents of
long lines need to be kept contiguous.
The fold utility is frequently used to send text files to
printers that truncate, rather than fold, lines wider than the
printer is able to print (usually 80 or 132 column positions).
## EXAMPLES top
An example invocation that submits a file of possibly long lines
to the printer (under the assumption that the user knows the line
width of the printer to be assigned by lp):
fold -w 132 bigfile | lp
## RATIONALE top
Although terminal input in canonical processing mode requires the
erase character (frequently set to <backspace>) to erase the
previous character (not byte or column position), terminal output
is not buffered and is extremely difficult, if not impossible, to
parse correctly; the interpretation depends entirely on the
physical device that actually displays/prints/stores the output.
In all known internationalized implementations, the utilities
producing output for mixed column-width output assume that a
<backspace> character backs up one column position and outputs
enough <backspace> characters to return to the start of the
character when <backspace> is used to provide local line motions
to support underlining and emboldening operations. Since fold
without the -b option is dealing with these same constraints,
<backspace> is always treated as backing up one column position
rather than backing up one character.
Historical versions of the fold utility assumed 1 byte was one
character and occupied one column position when written out. This
is no longer always true. Since the most common usage of fold is
believed to be folding long lines for output to limited-length
output devices, this capability was preserved as the default
case. The -b option was added so that applications could fold
files with arbitrary length lines into text files that could then
be processed by the standard utilities. Note that although the
width for the -b option is in bytes, a line is never split in the
middle of a character. (It is unspecified what happens if a
width is specified that is too small to hold a single character
found in the input followed by a <newline>.)
The tab stops are hardcoded to be every eighth column to meet
historical practice. No new method of specifying other tab stops
was invented.
## FUTURE DIRECTIONS top
None.
## SEE ALSO top
cut(1p)
The Base Definitions volume of POSIX.1‐2017, Chapter 8,
Environment Variables, Section 12.2, Utility Syntax Guidelines
## COPYRIGHT top
Portions of this text are reprinted and reproduced in electronic
form from IEEE Std 1003.1-2017, Standard for Information
Technology -- Portable Operating System Interface (POSIX), The
Open Group Base Specifications Issue 7, 2018 Edition, Copyright
(C) 2018 by the Institute of Electrical and Electronics
Engineers, Inc and The Open Group. In the event of any
discrepancy between this version and the original IEEE and The
Open Group Standard, the original IEEE and The Open Group
Standard is the referee document. The original Standard can be
obtained online at http://www.opengroup.org/unix/online.html .
Any typographical or formatting errors that appear in this page
are most likely to have been introduced during the conversion of
the source files to man page format. To report such errors, see
https://www.kernel.org/doc/man-pages/reporting_bugs.html .
IEEE/The Open Group 2017 FOLD(1P)
* * *
Pages that refer to this page: cut(1p), id(1p)
* * *
* * *
HTML rendering created 2023-06-24 by Michael Kerrisk, author of The Linux
Programming Interface.
For details of in-depth Linux/UNIX system programming training courses that I
teach, look here.
Hosting by jambit GmbH.
* * * |
<!DOCTYPE HTML>
<html xmlns:th="http://www.thymeleaf.org" lang="zh">
<head>
<title th:text="${sectionName}+'-嘟嘟社区'"></title>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
<link rel="stylesheet" href="/css/bootstrap.min.css">
<link rel="stylesheet" href="/css/bootstrap-theme.css">
<link rel="stylesheet" href="/css/community.css">
<script src="/js/jquery-3.5.0.min.js" type="application/javascript"></script>
<script src="/js/bootstrap.min.js" type="application/javascript"></script>
</head>
<body>
<div th:insert="~{navigation :: nav}"></div>
<div class="container-fluid main">
<div class="row">
<div class="col-lg-9 col-md-12 col-sm-12 col-xs-12">
<h2><span th:text="${sectionName}"></span></h2>
<hr>
<div class="media" th:each="question:${paginationDTO.data}">
<div class="media-left">
<img class="media-object img_avatar_url"
th:src="${question.user.avatarUrl}" alt="网络头像" src=""/>
</div>
<div class="media-body">
<a th:href="@{'/question/'+${question.id}}"><h4 class="media-heading"
th:text="${question.title}"></h4>
</a>
<span th:text="${question.description}"></span><br>
<span class="text-desc">
<span th:text="${question.commentCount==null?0:question.commentCount}"> </span>个评论 •
<span th:text="${question.viewCount==null?0:question.viewCount}"> </span>次浏览 •
<span th:text="${question.likeCount==null?0:question.likeCount}"></span>个点赞 •
<span th:text="${#dates.format(question.gmtModified, 'yyyy年MM月dd HH:mm:ss')}"></span>
<a th:href="@{'/publish/'+${question.id}}" class="community-menu"
th:if="${session.gitHubUser!=null && session.gitHubUser.id==question.user.id}">
<span class="glyphicon glyphicon-pencil"
aria-hidden="true"> 编辑</span>
</a>
</span>
</div>
</div>
<nav aria-label="Page navigation">
<ul class="pagination">
<li th:if="${paginationDTO.showFirstPage}">
<a th:href="@{${section}(page=1)}" aria-label="Previous">
<span aria-hidden="true"><<</span>
</a>
</li>
<li th:if="${paginationDTO.showPrevious}">
<a th:href="@{${section}(page=${(paginationDTO.page)-1})}"
aria-label="Previous">
<span aria-hidden="true"><</span>
</a>
</li>
<li th:each="page:${paginationDTO.pages}"
th:class="${paginationDTO.page==page}?'active':''"><a
th:href="@{${section}(page=${page})}"
th:text="${page}"></a>
</li>
<li th:if="${paginationDTO.showNextPage}">
<a th:href="@{${section}(page=${paginationDTO.page+1})}"
aria-label="Next">
<span aria-hidden="true">></span>
</a>
</li>
<li th:if="${paginationDTO.showLastPage}">
<a th:href="@{${section}(page=${paginationDTO.totalPage})}"
aria-label="Next">
<span aria-hidden="true">>></span>
</a>
</li>
</ul>
</nav>
</div>
<div class="col-lg-3 col-md-12 col-sm-12 col-xs-12">
<div class="list-group">
<a href="/profile/questions"
th:class="${section=='questions'}?'list-group-item active':'list-group-item'">我的问题
</a>
<a href="/profile/replies"
th:class="${section=='replies'}?'list-group-item active':'list-group-item'">最新回复
<span class="badge">14</span></a>
</div>
</div>
</div>
</div>
</body>
</html> |
<?php
namespace Tests\Unit;
use App\Models\Categoria;
use App\Models\Producto;
use Illuminate\Foundation\Testing\RefreshDatabase;
use Tests\TestCase;
class ProductoTest extends TestCase
{
use RefreshDatabase;
protected $categoria;
protected $otraCategoria;
protected $producto;
public function setUp(): void
{
parent::setUp();
$this->categoria = Categoria::create([
'nombre' => 'Categoria 1',
'descripcion' => 'Descripción de la categoría 1',
]);
$this->otraCategoria = Categoria::create([
'nombre' => 'Categoria 2',
'descripcion' => 'Descripción de la categoría 2',
]);
$this->producto = Producto::create([
'nombre' => 'Producto de prueba',
'descripcion' => 'Descripción del producto de prueba',
'precio' => 10.99,
'categoria_id' => $this->categoria->id,
]);
}
/** @test */
public function crear_un_producto()
{
$this->assertDatabaseHas('productos', [
'nombre' => 'Producto de prueba',
'descripcion' => 'Descripción del producto de prueba',
'precio' => 10.99,
'categoria_id' => $this->categoria->id,
]);
}
/** @test */
public function actualizar_un_producto()
{
$this->producto->update([
'nombre' => 'Nuevo nombre de producto',
'descripcion' => 'Nueva descripción de producto',
'precio' => 20.99,
'categoria_id' => $this->otraCategoria->id,
]);
$this->assertEquals('Nuevo nombre de producto', $this->producto->fresh()->nombre);
$this->assertEquals('Nueva descripción de producto', $this->producto->fresh()->descripcion);
$this->assertEquals(20.99, $this->producto->fresh()->precio);
}
/** @test */
public function eliminar_un_producto()
{
$this->producto->delete();
$this->assertDatabaseMissing('productos', ['id' => $this->producto->id]);
}
/** @test */
public function validacion_precio()
{
$response = $this->post('/productos', [
'cantidad' => 10,
'precio' => 1234567, // Más de 6 dígitos
]);
$response->assertSessionHasErrors('precio');
}
} |
import React from "react";
import clsx from "clsx";
import LoadingSpinner from "./LoadingSpinner";
import styles from "./LoadingOverlay.module.scss";
export default function LoadingOverlay({
text = "Loading...",
relativePosition = false,
}: {
text?: string;
relativePosition?: boolean;
}): React.ReactElement {
const overlayClasses = clsx(styles.overlay, {
relativePositionOverlay: relativePosition,
});
const overlayTextClasses = clsx(styles.text, {
relativePositionOverlayText: relativePosition,
});
return (
<div className={overlayClasses}>
<div className={overlayTextClasses}>
<LoadingSpinner /> {text}
</div>
</div>
);
} |
package Project2;
import java.util.Scanner;
class BankAccout{
String name;
String userName;
String password;
String accountNo;
float balance = 1000000f;
int transactions = 0;
String transactionHistory = "";
public void register() {
Scanner sc = new Scanner(System.in);
System.out.println("\nEnter your Name - ");
this.name= sc.nextLine();
System.out.println("\nEnter Your Username - ");
this.userName=sc.nextLine();
System.out.print("\nEnter Your Password - ");
this.password = sc.nextLine();
System.out.print("\nEnter Your Account Number - ");
this.accountNo = sc.nextLine();
System.out.println("\nRegistration completed..kindly login");
}
public boolean login() {
boolean isLogin = false;
Scanner sc = new Scanner(System.in);
while(!isLogin) {
System.out.print("\nEnter Your Username - ");
String Username=sc.nextLine();
if(Username.equals(userName))
{
while(!isLogin)
{
System.out.print("\nEnter Your Password - ");
String Password = sc.nextLine();
if(Password.equals(password))
{
System.out.print("\nLogin Succesful!!");
isLogin = true;
}
else
{
System.out.println("\nIncorrect Password");
}
}
}
else
{
System.out.println("\nUsername not found");
}
}
return isLogin;
}
public void withdraw()
{
System.out.println("\nEnter amount to withdraw - ");
Scanner sc= new Scanner(System.in);
float amount = sc.nextFloat();
try {
if ( balance >= amount ) {
transactions++;
balance -= amount;
System.out.println("\nWithdraw Successfully : " +amount);
String str = amount + " Rs Withdrawed\n";
transactionHistory = transactionHistory.concat(str);
System.out.println();
System.out.println("******Thank You !! Visit Again******");
}
else {
System.out.println("\nInsufficient Balance");
}
}
catch ( Exception e) {
}
}
public void deposite()
{
System.out.println("Enter amount to deposite - ");
Scanner sc = new Scanner(System.in);
float amount = sc.nextFloat();
try {
if(amount <= 1000000f) {
transactions++;
balance += amount;
System.out.println("\nSuccessfully Deposited : " +amount);
String str = amount + " Rs deposited\n";
transactionHistory = transactionHistory.concat(str);
System.out.println();
System.out.println("******Thank You !! Visit Again******");
}
else {
System.out.println("\nSorry...Limit is 100000.00");
}
}
catch ( Exception e) {
}
}
public void transfer()
{
Scanner sc = new Scanner(System.in);
System.out.print("\nEnter Receipent's Name - ");
String receipent = sc.nextLine();
System.out.print("\nEnter amount to Transfer - ");
float amount= sc.nextFloat();
try {
if(balance >=amount) {
if(amount<= 50000f) {
transactions++;
balance -= amount;
System.out.println("\nSuccessfully Transfered " +amount+ " to " + receipent);
String str = amount + " Rs transfered to " + receipent + "\n";
transactionHistory = transactionHistory.concat(str);
System.out.println();
System.out.println("******Thank You !! Visit Again******");
}
else {
System.out.println("\nSorry...Limit is 50000.00");
}
}
else {
System.out.println("\nInsufficient Balance");
}
}
catch(Exception e) {
}
}
public void checkBalance()
{
System.out.println("\nYour Account Balance is : "+ balance + " Rs");
}
public void transHistory()
{
if ( transactions == 0 ) {
System.out.println("\nEmpty");
}
else {
System.out.println("\n" + transactionHistory);
}
}
}
public class AtmInterface {
public static int takeIntegerInput(int limit) {
int input = 0;
boolean flag = false;
while ( !flag ) {
try {
Scanner sc = new Scanner(System.in);
input = sc.nextInt();
flag = true;
if ( flag && input > limit || input < 1 ) {
System.out.println("Choose the number between 1 to " + limit);
flag = false;
}
}
catch ( Exception e ) {
System.out.println("Enter only integer value");
flag = false;
}
};
return input;
}
public static void main(String[] args) {
System.out.println("\n**********WELCOME TO SBI ATM MACHINE **********\n");
System.out.println("1.Register \n2.Exit");
System.out.print("Enter Your Choice - ");
int choice = takeIntegerInput(2);
if ( choice == 1 ) {
BankAccout b = new BankAccout();
b.register();
while(true) {
System.out.println("\n1.Login \n2.Exit");
System.out.print("Enter Your Choice - ");
int ch = takeIntegerInput(2);
if ( ch == 1 ) {
if (b.login()) {
System.out.println("\n\n**********WELCOME BACK " + b.name + " **********\n");
boolean isFinished = false;
while (!isFinished) {
System.out.println("\n1.Withdraw \n2.Deposit \n3.Transfer \n4.Check Balance \n5.Transaction History \n6.Exit");
System.out.print("\nEnter Your Choice - ");
int c = takeIntegerInput(6);
switch(c) {
case 1:
b.withdraw();
break;
case 2:
b.deposite();
break;
case 3:
b.transfer();
break;
case 4:
b.checkBalance();
break;
case 5:
b.transHistory();
break;
case 6:
isFinished = true;
break;
}
}
}
}
else {
System.exit(0);
}
}
}
else {
System.exit(0);
}
}
} |
const { NotAllowedError } = require("../../errors");
const logger = require("../../logger");
const { encrypt62 } = require("../../utils/encrypt.util");
const {
StudyroomChat,
User,
StudyroomMember,
Sequelize,
} = require("../../models");
const getChatByCursor = async ({ studyroom_id, cursor = null, user_id }) => {
// LIMIT
const limit = 5;
// cursor 유무에 따라 where clause 생성
const whereClause = {
studyroom_id: studyroom_id,
};
if (cursor) {
// 커서를 기반으로 studyroom_chat_id가 cursor보다 작은(오래된) 채팅을 조회
whereClause.studyroom_chat_id = { [Sequelize.Op.lt]: cursor };
}
const chats = await StudyroomChat.findAll({
include: [
{
model: User,
as: "sender", // StudyroomChat에 정의된 별칭
attributes: ["name", "nickname", "profile_image"],
required: false,
},
],
attributes: [
"studyroom_chat_id",
"studyroom_id",
"sender_id",
"type",
"content",
"created_at",
],
where: whereClause,
limit: limit + 1, // 다음 페이지 존재 여부 확인을 위해 하나 더 조회
order: [["studyroom_chat_id", "DESC"]], // 오래된게 앞에 표시되도록 ASC로 변경
});
let hasNextPage = false;
logger.debug(`[getChatByCursor] chats: ${chats.length}`);
if (chats.length > limit) {
hasNextPage = true;
chats.pop(); // limit + 1로 조회한 초과 레코드는 제거
}
chats.reverse(); // 오래된 채팅이 앞에 표시되도록 순서 변경
const nextCursor = hasNextPage ? chats[0].studyroom_chat_id : null;
let ret = [];
for (const chat of chats) {
ret.push({
is_my_chat: chat.sender_id === user_id,
studyroom_chat_id: chat.studyroom_chat_id,
studyroom_id: chat.studyroom_id,
sender_id: encrypt62(chat.sender_id),
sender_name: chat.sender ? chat.sender.name : null,
sender_nickname: chat.sender ? chat.sender.nickname : null,
sender_profile_image: chat.sender ? chat.sender.profile_image : null,
type: chat.type,
content: chat.content,
created_at: chat.created_at,
});
}
logger.debug(
`[getChatByCursor] nextCursor: ${nextCursor} hasNextPage: ${hasNextPage}`
);
return {
chats: ret,
nextCursor,
hasNextPage,
};
};
const isUserInStudyroom = async ({ studyroom_id, user_id }) => {
const result = await StudyroomMember.findOne({
attributes: ["studyroom_id"],
where: {
studyroom_id: studyroom_id,
user_id: user_id,
status: "active",
},
});
if (!result) {
logger.error(
`[isUserInStudyroom] ${user_id} 사용자가 ${studyroom_id} 스터디룸에 접근을 시도했습니다.`
);
throw new NotAllowedError("해당 스터디룸에 속한 사용자가 아닙니다.");
}
logger.debug(
`[isUserInStudyroom] ${user_id} 사용자가 ${studyroom_id} 스터디룸에 접근했습니다.`
);
return result;
};
const saveChat = async (studyroomId, senderId, type, content) => {
const result = await StudyroomChat.create({
studyroom_id: studyroomId,
sender_id: senderId,
type,
content,
});
if (!result) {
throw new InvalidInputError("채팅 저장에 실패했습니다.");
}
return result;
};
const getNameAndNicknameById = async (userId) => {
const user = await User.findByPk(userId, {
attributes: ["name", "nickname"],
});
logger.info(
`[getNameAndNicknameById] ${userId} 사용자 정보 조회: ${user.name}`
);
if (!user) {
throw new NotAllowedError("사용자 정보를 찾을 수 없습니다.");
}
return {
name: user.name,
nickname: user.nickname,
};
};
module.exports = {
isUserInStudyroom,
saveChat,
getChatByCursor,
getNameAndNicknameById,
}; |
<!DOCTYPE html>
<html>
<head>
</head>
<body>
<h2>Intent Aware Recommender Systems</h2>
<p align="center">
<img src="intentAware.webp" width="300", title="Intent Aware Recommender Systems">
</p>
<h3>Introduction</h3>
<p align="justify">This reproducibility package was prepared for the paper titled "Performance Comparison of Intent Aware and Non-Intent Aware Recommender Systems" and submitted to the ABC. The results reported in this paper were achieved with the help of the codes, which were shared by the original authors of the selected articles. For the implementation of baseline models, we utilized the session-rec and RecSys2019_DeepLearning_Evaluation frameworks. These frameworks include the state-of-the-art baseline models for session based and top-n recommender systems. More information about the session-rec and RecSys2019_DeepLearning_Evaluation frameworks can be found by following the given links. </p>
<ul>
<li><a href="https://rn5l.github.io/session-rec/index.html" target="_blank">Session rec framework</a></li>
<li><a href="https://github.com/MaurizioFD/RecSys2019_DeepLearning_Evaluation.git" target="_blank"> RecSys2019_DeepLearning_Evaluation framework </a></li>
</ul>
<h5>Selected articles</h5>
<ul>
<li> Modeling Multi-Purpose Sessions for Next-Item Recommendations via Mixture-Channel Purpose Routing Networks (JCAI'19)</li>
<li>Disentangled Graph Collaborative Filtering (SIGIR'2020)</li>
<li>Learning Intents behind Interactions with Knowledge Graph for Recommendation (WWW'2021) </li>
<li> Intent Contrastive Learning for Sequential Recommendation (WWW'22)</li>
<li>Enhancing Hypergraph Neural Networks with Intent Disentanglement for Session-based Recommendation (SIGIR'2022)</li>
<li>Dynamic Intent Aware Iterative Denoising Network for Session-based Recommendation (Journal: Information Processing & Management'2022 - IF: 7.4)</li>
<li>Intent Disentanglement and Feature Self-Supervision for Novel Recommendation (Journal: IEEE Transactions on Knowledge and Data Engineering'2022 - IF: 8.9) </li>
<li>Efficiently Leveraging Multi-level User Intent for Session-based Recommendation via Atten-Mixer Network (WSDM'23)</li>
<li>Sparse-Interest Network for Sequential Recommendation (WSDM'23)</li>
</ul>
<h5>Required libraries to run the framework</h5>
<ul>
<li>Anaconda 4.X (Python 3.8 or higher)</li>
<li>numpy</li>
<li>pandas</li>
<li>torch</li>
<li>torchvision</li>
<li>torch_geometric</li>
<li>pyg_lib</li>
<li>torch-scatter</li>
<li>torch-sparse</li>
<li>torch-cluster</li>
<li>torch-spline-conv</li>
<li>prettytable</li>
<li>python-dateutil</li>
<li>nltk</li>
<li>scipy</li>
<li>pytz</li>
<li>certifi</li>
<li>pyyaml</li>
<li>scikit-learn</li>
<li>six</li>
<li>psutil</li>
<li>pympler</li>
<li>Scikit-optimize</li>
<li>tables</li>
<li>scikit-optimize</li>
<li>tqdm</li>
<li>dill</li>
<li>numba</li>
</ul>
<h2>Installation guide</h2>
<p>This is how the framework can be downloaded and configured to run the experiments</p>
<h5>Using Docker</h5>
<ul>
<li>Download and install Docker from <a href="https://www.docker.com/">https://www.docker.com/</a></li>
<li>Run the following command to "pull Docker Image" from Docker Hub: <code>docker pull shefai/intent_aware_recomm_systems</code>
<li>Clone the GitHub repository by using the link: <code>https://github.com/Faisalse/IntentAwareRS.git</code>
<li>Move into the <b>IntentAwareRS</b> directory</li>
<li>Run the command to mount the current directory <i>IntentAwareRS</i> to the docker container named as <i>IntentAwareRS_container</i>: <code>docker run --name IntentAwareRS_container -it -v "$(pwd):/IntentAwareRS" -it shefai/IntentAwareRS</code>. If you have the support of CUDA-capable GPUs then run the following command to attach GPUs with the container: <code>docker run --name IntentAwareRS_container -it --gpus all -v "$(pwd):/IntentAwareRS" -it shefai/IntentAwareRS</code></li>
<li>If you are already inside the runing container, run the command to navigate to the mounted directory <i>IntentAwareRS</i>: <code>cd /IntentAwareRS</code> otherwise starts the "IntentAwareRS_container"</li>
</ul>
<h5>Using Anaconda</h5>
<ul>
<li>Download Anaconda from <a href="https://www.anaconda.com/">https://www.anaconda.com/</a> and install it</li>
<li>Clone the GitHub repository by using this link: <code>https://github.com/Faisalse/IntentAwareRS.git</code></li>
<li>Open the Anaconda command prompt</li>
<li>Move into the <b>IntentAwareRS</b> directory</li>
<li>Run this command to create virtual environment: <code>conda create --name IntentAwareRS_env python=3.8</code></li>
<li>Run this command to activate the virtual environment: <code>conda activate IntentAwareRS_env</code></li>
<li>Run this command to install the required libraries for CPU: <code>pip install -r requirements_cpu.txt</code>. However, if you have support of CUDA-capable GPUs,
then run this command to install the required libraries to run the experiments on GPU: <code>pip install -r requirements_gpu.txt</code></li>
</ul>
</p>
<h2>Instructions to Run Experiments for Intent Aware and Non-intent Aware Recommender Systems</h2>
<h5> Dynamic Intent-aware Iterative Denoising Network for Session-based Recommendation (DIDN)</h5>
<ul>
<li>Download <a href="https://drive.google.com/drive/folders/1GocLZfbuwtxUjdRVEKq9xONyDbOjoNm4?usp=sharing" target="_blank">Yoochoose</a> dataset, unzip it and put the "yoochoose-clicks.dat" file into the "data" directory/folder </li>
<li>Run this command to reproduce the experiments for the DIDN and baseline models on the shorter version of the Yoochoose dataset: <code>python run_experiments_for_DIDN_baseline_models.py --dataset yoochoose1_64</code> and run the following command to create the experiments for the larger version of the Yoochoose dataset <code>python run_experiments_for_DIDN_baseline_models.py --dataset yoochoose1_4</code> </li>
<li>Download <a href="https://drive.google.com/drive/folders/1GocLZfbuwtxUjdRVEKq9xONyDbOjoNm4?usp=sharing" target="_blank">Diginetica</a> dataset, unzip it and put the "train-item-views.csv" file into the "data" directory/folder </li>
<li>Run this command to reproduce the experiments for the DIDN and baseline models on the Diginetica dataset: <code>python run_experiments_for_DIDN_baseline_models.py --dataset diginetica</code></li>
</ul>
<h5>Enhancing Hypergraph Neural Networks with Intent Disentanglement for Session-based Recommendation (HIDE)</h5>
<ul>
<li>Download <a href="https://drive.google.com/drive/folders/1GocLZfbuwtxUjdRVEKq9xONyDbOjoNm4?usp=sharing" target="_blank">Tmall</a> dataset, unzip it and put the "dataset15.csv" file into the "data" directory/folder </li>
<li>Run this command to reproduce the experiments for the HIDE and baseline models on the Tmall dataset: <code>python run_experiments_HIDE_baseline_models.py --dataset Tmall</code></li>
</ul>
<h5>Modeling Multi-Purpose Sessions for Next-Item Recommendations via Mixture-Channel Purpose Routing Networks (Atten-Mixer)</h5>
<ul>
<li>Download <a href="https://drive.google.com/drive/folders/1GocLZfbuwtxUjdRVEKq9xONyDbOjoNm4?usp=sharing" target="_blank">Diginetica</a> dataset, unzip it and put the "train-item-views.csv" file into the "data" directory/folder </li>
<li>Run this command to reproduce the experiments for the Atten-Mixer and baseline models on the Diginetica dataset: <code>python run_experiments_AttenMixer_baseline_models.py --dataset diginetica</code></li>
<li>Download <a href="https://drive.google.com/drive/folders/1GocLZfbuwtxUjdRVEKq9xONyDbOjoNm4?usp=sharing" target="_blank">Gowalla</a> dataset, unzip it and put the "loc-gowalla_totalCheckins.txt.gz" file into the "data" directory/folder </li>
<li>Run this command to reproduce the experiments for the Atten-Mixer and baseline models on the Gowalla dataset: <code>python run_experiments_AttenMixer_baseline_models.py --dataset gowalla</code></li>
<li>Download <a href="https://drive.google.com/drive/folders/1GocLZfbuwtxUjdRVEKq9xONyDbOjoNm4?usp=sharing" target="_blank">Yoochoose</a> dataset, unzip it and put the "yoochoose-clicks.dat" file into the "data" directory/folder </li>
<li>Run this command to reproduce the experiments for the Atten-Mixer and baseline models on the shorter version of the Yoochoose dataset: <code>python python run_experiments_AttenMixer_baseline_models.py --dataset yoochoose1_64</code> and run the following command to create the experiments for the larger version of the Yoochoose dataset <code>python python run_experiments_AttenMixer_baseline_models.py --dataset yoochoose1_4</code> </li>
<li>Download <a href="https://drive.google.com/drive/folders/1GocLZfbuwtxUjdRVEKq9xONyDbOjoNm4?usp=sharing" target="_blank">Retailrocket</a> dataset, unzip it and put the "events.csv" file into the "data" directory/folder </li>
<li>Run this command to reproduce the experiments for the Atten-Mixer and baseline models on the Diginetica dataset: <code>python run_experiments_AttenMixer_baseline_models.py --dataset retailrocket</code></li>
</ul>
<h5>Efficiently Leveraging Multi-level User Intent for Session-based Recommendation via Atten-Mixer Network (MCPRN)</h5>
<ul>
<li>Download <a href="https://drive.google.com/drive/folders/1GocLZfbuwtxUjdRVEKq9xONyDbOjoNm4?usp=sharing" target="_blank">Yoochoose</a> dataset, unzip it and put the "yoochoose-buys.csv" file into the "data" directory/folder </li>
<li>Run this command to reproduce the experiments for the MCPRN and baseline models on the Yoochoose-buys dataset: <code>python run_experiments_for_MCRPN_baseline_models.py --dataset yoochoose</code></li>
</ul>
<h5>Learning Intents behind Interactions with Knowledge Graph for Recommendation (KIGN)</h5>
<ul>
<li>Run this command to reproduce the experiments for the KGIN and baseline models on the lastFm dataset: <code>python run_experiments_for_KGIN_baselines_algorithms.py --dataset lastFm</code> </li>
<li>Run this command to reproduce the experiments for the KGIN and baseline models on the alibabaFashion dataset: <code>python run_experiments_for_KGIN_baselines_algorithms.py --dataset alibabaFashion</code> </li>
<li>Run this command to reproduce the experiments for the KGIN and baseline models on the amazonBook dataset: <code>python run_experiments_for_KGIN_baselines_algorithms.py --dataset amazonBook</code> </li>
</ul>
<h5>Intent Disentanglement and Feature Self-supervision for Novel Recommendation (IDSNR)</h5>
<ul>
<li>Run this command to reproduce the experiments for the IDS4NR_NCF and baseline models on the MovieLens dataset: <code>python run_experiments_IDS4NR_baselines_algorithms.py --dataset MovieLens --model NCF</code> </li>
<li>Run this command to reproduce the experiments for the IDS4NR_NCF and baseline models on the Beauty dataset: <code>python run_experiments_IDS4NR_baselines_algorithms.py --dataset Beauty --model NCF</code> </li>
<li>Run this command to reproduce the experiments for the IDS4NR_NCF and baseline models on the Music dataset: <code>python run_experiments_IDS4NR_baselines_algorithms.py --dataset Music --model NCF</code> </li>
<li>Run this command to reproduce the experiments for the IDS4NR_LFM and baseline models on the MovieLens dataset: <code>python run_experiments_IDS4NR_baselines_algorithms.py --dataset MovieLens --model LFM</code> </li>
<li>Run this command to reproduce the experiments for the IDS4NR_LFM and baseline models on the Beauty dataset: <code>python run_experiments_IDS4NR_baselines_algorithms.py --dataset Beauty --model LFM</code> </li>
<li>Run this command to reproduce the experiments for the IDS4NR_LFM and baseline models on the Music dataset: <code>python run_experiments_IDS4NR_baselines_algorithms.py --dataset Music --model LFM</code> </li>
</ul>
<h5>Disentangled Graph Collaborative Filtering (DGCF)</h5>
<ul>
<li>Run this command to reproduce the experiments for the DGCF on the Yelp2018 dataset: <code>python run_experiments_for_DGCF_algorithm.py --dataset yelp2018</code> </li>
<li>Run this command to reproduce the experiments for the baseline models on the Yelp2018 dataset: <code>python run_experiments_DGCF_baseline_algorithms.py --dataset yelp2018</code> </li>
<li>Run this command to reproduce the experiments for the DGCF on the Gowalla dataset: <code>python run_experiments_for_DGCF_algorithm.py --dataset gowalla</code> </li>
<li>Run this command to reproduce the experiments for the baseline models on the Gowalla dataset: <code>python run_experiments_DGCF_baseline_algorithms.py --dataset gowalla</code> </li>
<li>Run this command to reproduce the experiments for the DGCF on the Amazon-book dataset: <code>python run_experiments_for_DGCF_algorithm.py --dataset amazonbook</code> </li>
<li>Run this command to reproduce the experiments for the baseline models on the Amazon-book dataset: <code>python run_experiments_DGCF_baseline_algorithms.py --dataset amazonbook</code> </li>
<h5>Note: The DGCF was implemented using TensorFlow 1.14, which does not support current versions of Python. Therefore, we provide a separate installation guide to run the experiments for DGCF model</h5>
<h5>Using Anaconda</h5>
<ul>
<li>Download Anaconda from <a href="https://www.anaconda.com/">https://www.anaconda.com/</a> and install it</li>
<li>Clone the GitHub repository by using this link: <code>https://github.com/Faisalse/IntentAwareRS.git</code></li>
<li>Open the Anaconda command prompt</li>
<li>Move into the <b>IntentAwareRS</b> directory</li>
<li>Run this command to create virtual environment: <code>conda create --name DGCF_env python=3.6</code></li>
<li>Run this command to activate the virtual environment: <code>conda activate DGCF_env</code></li>
<li>Run this command to install the required libraries for CPU: <code>pip install -r requirements_dgcf.txt</code></li>
</ul>
</body>
</html> |
import { useCallback, useRef, useState } from "react";
import { searchMovies } from "../services/getMovies";
import Movies from "../interfaces/movie";
import { searchMovieByID } from "../services/getMoviesByID";
export default function useMovies ({search, id}:{search: string, id: string}){
const [movies, setMovies] = useState<Movies[] | Movies
| null>([]);
const genres = Array.isArray(movies) ? [...new Set(movies.flatMap(movie => movie.genre))] : [];
const previousSearch = useRef(search)
const isFirstRender = useRef(true)
const getMovies = useCallback( async()=>{
if(!search.match(/^(?!.*\s)[\s\S]*$/)) return
if(!isFirstRender.current){
if(search.length <= 3) return
}
if(isFirstRender.current){
isFirstRender.current = false
} else{
if(search === previousSearch.current){
return
}
}
try {
const movies = await searchMovies({search})
setMovies(movies)
previousSearch.current = search
} catch (error) {
setMovies(null)
throw new Error('Hubo un error en la búsqueda');
}
}, [search])
const getMovieByID = useCallback( async()=>{
if(isFirstRender.current){
isFirstRender.current = false
} else{
if(search === previousSearch.current){
return
}
}
try {
const movie = await searchMovieByID({id})
setMovies(movie!)
previousSearch.current = search
} catch (error) {
throw new Error('No se encontro la pelicula con el ID solicitado.')
}
}, [search, id])
return {movies, genres, getMovies, getMovieByID}
} |
/*
* Copyright 2022-2023 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.instancio.internal.assignment;
import org.instancio.Select;
import org.instancio.exception.InstancioApiException;
import org.junit.jupiter.api.Nested;
import org.junit.jupiter.api.Test;
import static org.assertj.core.api.Assertions.assertThat;
import static org.assertj.core.api.Assertions.assertThatThrownBy;
class InternalAssignmentTest {
@Nested
class ToStringTest {
@Test
void verifyEmptyToString() {
assertThat(InternalAssignment.builder().build())
.hasToString("InternalAssignment[origin=null, destination=null]");
}
@Test
void verifyToString() {
assertThat(InternalAssignment.builder()
.origin(Select.field("foo"))
.destination(Select.field("bar"))
.build())
.hasToString("InternalAssignment[origin=field(\"foo\"), destination=field(\"bar\")]");
}
}
@Nested
class ValidationTest {
private final InternalAssignment.Builder builder = InternalAssignment.builder();
@Test
void origin() {
assertThatThrownBy(() -> builder.origin(null))
.isExactlyInstanceOf(InstancioApiException.class)
.hasMessageContaining("origin selector must not be null");
}
@Test
void destination() {
assertThatThrownBy(() -> builder.destination(null))
.isExactlyInstanceOf(InstancioApiException.class)
.hasMessageContaining("destination selector must not be null");
}
}
} |
import 'package:get/get.dart';
import 'package:kesehatan/app/modules/login/views/login_view.dart';
import 'package:kesehatan/app/modules/profile/views/profile_view.dart';
import '../modules/home/bindings/home_binding.dart';
import '../modules/home/views/home_view.dart';
import '../modules/login/bindings/login_binding.dart';
import '../modules/profile/bindings/profile_binding.dart';
import '../modules/splash_screen/bindings/splashscreen_binding.dart';
import '../modules/splash_screen/views/splashscreen_view.dart';
part 'app_routes.dart';
class AppPages {
AppPages._();
// Rute awal diubah menjadi SplashScreen agar user pertama kali melihat splash screen
static const INITIAL = Routes.SPLASH_SCREEN;
static final routes = [
GetPage(
name: _Paths.HOME,
page: () => const HomeView(),
binding: HomeBinding(),
),
GetPage(
name: _Paths.SPLASH_SCREEN,
page: () => const SplashScreenView(),
binding: SplashscreenBinding(),
),
GetPage(
name: _Paths.LOGIN,
page: () => const LoginView(),
binding: LoginBinding(),
),
GetPage(
name: _Paths.PROFILE,
page: () => const ProfileView(),
binding: ProfileBinding(),
),
];
} |
<!DOCTYPE html>
<html lang="ko">
<head>
<meta charset="UTF-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>JS 14-03 QuerySelector</title>
<style>
div {
margin: 10px;
}
.btn {
width: 100px;
height: 30px;
background-color: lightgray;
text-align: center;
line-height: 30px;
color: black;
border-radius: 15px;
cursor: pointer;
font-weight: 700;
transition: 500ms;
}
.btn--style1 {
background-color: red;
color: white;
}
.btn--style2 {
background-color: orange;
}
.btn--style3 {
background-color: yellow;
}
.btn--style4 {
background-color: green;
color: white;
}
.btn--style5 {
background-color: blue;
color: white;
}
.btn--style6 {
background-color: navy;
color: white;
}
.btn--style7 {
background-color: purple;
color: white;
}
.fs-xl{
font-size: 1.5rem;
font-family: 메이플스토리;
}
</style>
</head>
<body>
<div class="btn">BUTTON</div>
<div class="btn">BUTTON</div>
<div class="btn">BUTTON</div>
<div class="btn">BUTTON</div>
<div class="btn">BUTTON</div>
<div class="btn">BUTTON</div>
<div class="btn">BUTTON</div>
<script>
const btnEls = document.querySelectorAll('.btn');
btnEls.forEach((btnEl, idx) => {
btnEl.addEventListener('click', () => {
// btn--style이 안 들어가 있다면
if (!btnEl.classList.contains(`btn--style${idx+1}`)) {
btnEl.classList.add(`btn--style${idx+1}`);
btnEl.classList.add('fs-xl');
}
// btn-style이 들어가 있다면
else {
btnEl.classList.remove(`btn--style${idx+1}`);
btnEl.classList.remove('fs-xl');
}
}); //btnEl 이벤트 끝
}); //btnEls forEach 끝
</script>
</body>
</html> |
package util
import exception.NullArgumentException
/**
* Miscellaneous utility functions.
*
* @author haokangkang
* @see Precision
*/
object MathUtils {
/**
* Returns an integer hash code representing the given double value.
*
* @param value the value to be hashed
* @return the hash code
*/
fun hash(value: Double): Int {
return value.hashCode()
}
/**
* Returns `true` if the values are equal according to semantics of
* [Double.equals].
*
* @param x Value
* @param y Value
* @return `new Double(x).equals(new Double(y))`
*/
fun equals(x: Double, y: Double): Boolean {
return x == y
}
/**
* Returns an integer hash code representing the given double array.
*
* @param value the value to be hashed (may be null)
* @return the hash code
* @since 1.2
*/
fun hash(value: DoubleArray?): Int {
return value.hashCode()
}
/**
* Checks that an object is not null.
*
* @param o Object to be checked.
* @throws NullArgumentException if `o` is `null`.
*/
@Throws(NullArgumentException::class)
fun checkNotNull(o: Any?) {
if (o == null) {
throw NullArgumentException()
}
}
} |
import React, { useState, useEffect } from "react";
import { toast, ToastContainer } from "react-toastify";
import "react-toastify/dist/ReactToastify.css";
const GradeForm = ({ cohort }) => {
const [modules, setModules] = useState([]);
const [selectedModule, setSelectedModule] = useState("");
const [grade, setGrade] = useState("");
useEffect(() => {
if (cohort) {
fetch(`${cohort}`)
.then((response) => response.json())
.then((data) => setModules(data))
.catch((error) => {
console.error("Error fetching modules:", error);
toast.error("Failed to fetch modules");
});
}
}, [cohort]);
const handleSubmit = async (event) => {
event.preventDefault();
try {
const response = await fetch("/api/grade/", {
method: "POST",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify({
cohort: cohort,
module: selectedModule,
grade: grade,
}),
});
if (response.ok) {
console.log("Grade updated successfully");
// Optionally reset form fields or show success message
} else {
console.error("Error updating grade:", response.statusText);
toast.error("Failed to update grade");
}
} catch (error) {
console.error("Error updating grade:", error);
toast.error("An error occurred while updating grade");
}
};
return (
<div className="flex items-center justify-center bg-gray-50 py-12 px-4 sm:px-6 lg:px-8">
<div className="max-w-md w-full space-y-8">
<div>
<h2 className="mt-6 text-center text-3xl font-extrabold text-gray-900">
Update Student Grades
</h2>
</div>
<form className="mt-8 space-y-6" onSubmit={handleSubmit}>
<div>
<label
htmlFor="module"
className="block text-sm font-medium text-gray-700"
>
Module
</label>
<select
id="module"
name="module"
className="mt-1 block w-full pl-3 pr-10 py-2 text-base border-gray-300 focus:outline-none focus:ring-blue-500 focus:border-blue-500 sm:text-sm rounded-md"
value={selectedModule}
onChange={(e) => setSelectedModule(e.target.value)}
>
<option value="">Select a module</option>
{modules.map((module) => (
<option key={module.id} value={module.id}>
{module.name}
</option>
))}
</select>
</div>
<div>
<button
type="submit"
className="w-full flex justify-center py-2 px-4 border border-transparent rounded-md shadow-sm text-sm font-medium text-white bg-blue-600 hover:bg-blue-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-blue-500"
>
Update Grade
</button>
</div>
</form>
</div>
<ToastContainer />
</div>
);
};
export default GradeForm; |
'use strict';
const express = require('express');
const graphqlHTTP = require('express-graphql');
const {
GraphQLSchema,
GraphQLObjectType,
GraphQLID,
GraphQLString,
GraphQLInt,
GraphQLBoolean
} = require('graphql');
const { log, warn } = require('../utils');
const PORT = process.env.PORT || 3000;
const server = express();
const videoType = new GraphQLObjectType({
name: 'VideoType',
description: 'A video on Egghead.io',
fields: {
id: {
type: GraphQLID,
description: 'The id of the video.'
},
title: {
type: GraphQLString,
description: 'The title of the video.'
},
duration: {
type: GraphQLInt,
description: 'The duration of the video (in seconds).'
},
watched: {
type: GraphQLBoolean,
description: 'Whether or not the viewer has watched the video.'
},
}
});
const queryType = new GraphQLObjectType({
name: 'QueryType',
description: 'The root query type.',
fields: {
video: {
type: videoType,
resolve: () => new Promise(resolve => {
resolve({
id: 'a',
title: 'GraphQL',
duration: 180,
watched: false
})
})
}
}
});
const schema = new GraphQLSchema({
query: queryType
});
const videoA = {
id: 'A',
title: 'video A',
duration: 120,
watched: true
};
const videoB = {
id: 'B',
title: 'video B',
duration: 160,
watched: false
};
const videos = [videoA, videoB];
server.use('/graphql', graphqlHTTP({
schema,
graphiql: true
}));
server.listen(PORT, () => console.log(`${(new Date()).toLocaleTimeString()} Listening on http://localhost:${PORT}`)); |
import { z } from 'zod';
import { createZodDto } from '../utils/createZodDto';
// Define the Health schema using Zod
export const HealthSchema = z.object({
status: z.string(),
timestamp: z.string().datetime(),
version: z.string().optional(),
uptime: z.number().optional(),
memory: z
.object({
used: z.number(),
total: z.number(),
})
.optional(),
});
// Create a type from the schema
export type Health = z.infer<typeof HealthSchema>;
// Create a DTO class that can be used with routing-controllers
export class HealthDto extends createZodDto(HealthSchema) {} |
import React, { useState, useEffect } from 'react'
import { useSelector } from 'react-redux'
import AutoSizer from 'react-virtualized-auto-sizer'
import { connectedInstanceSelector, connectedInstanceOverviewSelector } from 'uiSrc/slices/instances/instances'
import { pubSubSelector } from 'uiSrc/slices/pubsub/pubsub'
import { isVersionHigherOrEquals } from 'uiSrc/utils'
import { CommandsVersions } from 'uiSrc/constants/commandsVersions'
import EmptyMessagesList from './EmptyMessagesList'
import MessagesList from './MessagesList'
import styles from './MessagesList/styles.module.scss'
const MessagesListWrapper = () => {
const { messages = [], isSubscribed } = useSelector(pubSubSelector)
const { connectionType } = useSelector(connectedInstanceSelector)
const { version } = useSelector(connectedInstanceOverviewSelector)
const [isSpublishNotSupported, setIsSpublishNotSupported] = useState<boolean>(true)
useEffect(() => {
setIsSpublishNotSupported(
isVersionHigherOrEquals(
version,
CommandsVersions.SPUBLISH_NOT_SUPPORTED.since
)
)
}, [version])
return (
<>
{(messages.length > 0 || isSubscribed) && (
<div className={styles.wrapperContainer}>
<div className={styles.header} data-testid="messages-list">
<div className={styles.time}>Timestamp</div>
<div className={styles.channel}>Channel</div>
<div className={styles.message}>Message</div>
</div>
<div className={styles.listContainer}>
<AutoSizer>
{({ width, height }) => (
<MessagesList
items={messages}
width={width}
height={height}
/>
)}
</AutoSizer>
</div>
</div>
)}
{messages.length === 0 && !isSubscribed && (
<EmptyMessagesList isSpublishNotSupported={isSpublishNotSupported} connectionType={connectionType} />
)}
</>
)
}
export default MessagesListWrapper |
'use server';
import { getServerAuthSession } from '@repo/auth/server';
import { prisma } from '@repo/db';
import type { CommentRoot } from '@repo/db/types';
const PAGESIZE = 10;
const sortKeys = ['createdAt', 'vote', 'replies'] as const;
const sortOrders = ['asc', 'desc'] as const;
export type SortKey = (typeof sortKeys)[number];
export type SortOrder = (typeof sortOrders)[number];
function orderBy(sortKey: SortKey, sortOrder: SortOrder) {
switch (sortKey) {
case 'vote':
return {
vote: {
_count: sortOrder,
},
};
case 'replies':
return {
replies: {
_count: sortOrder,
},
};
case 'createdAt':
return {
[sortKey]: sortOrder,
};
}
}
export type PaginatedComments = NonNullable<Awaited<ReturnType<typeof getPaginatedComments>>>;
export type PreselectedCommentMetadata =
| NonNullable<Awaited<ReturnType<typeof getPreselectedCommentMetadata>>>
| NonNullable<Awaited<ReturnType<typeof getPreselectedSolutionCommentMetadata>>>;
export async function getPreselectedCommentMetadata(challengeId: number, commentId: number) {
const challengeComments = await prisma.comment.findMany({
where: {
rootType: 'CHALLENGE',
rootChallengeId: challengeId,
visible: true,
parentId: null,
},
orderBy: {
createdAt: 'desc',
},
select: {
id: true,
parentId: true,
},
});
const index = challengeComments.findIndex((comment) => comment.id === commentId);
const selectedComment = challengeComments[index];
const page = Math.ceil((index + 1) / PAGESIZE);
return {
page,
selectedComment,
index,
challengeComments: challengeComments.map((comment) => comment.id),
};
}
export async function getPreselectedSolutionCommentMetadata(
solutionId: number,
commentId: number,
challengeId: number,
) {
const solution = await prisma.sharedSolution.findFirst({
where: {
id: solutionId,
challengeId,
},
orderBy: {
createdAt: 'desc',
},
select: {
id: true,
solutionComment: true,
},
});
if (!solution || !solution.solutionComment) return;
const comments = solution.solutionComment;
const index = comments.findIndex((comment) => comment.id === commentId);
const selectedComment = comments[index];
const page = Math.ceil((index + 1) / PAGESIZE);
return {
page,
selectedComment,
index,
challengeComments: comments.map((comment) => comment.id),
};
}
export async function getPaginatedComments({
page,
rootId,
rootType,
parentId = null,
sortKey = 'createdAt',
sortOrder = 'desc',
}: {
page: number;
rootId: number;
rootType: CommentRoot;
parentId?: number | null;
sortKey?: SortKey;
sortOrder?: SortOrder;
}) {
const session = await getServerAuthSession();
const totalComments = await prisma.comment.count({
where: {
rootType,
parentId,
visible: true,
...(rootType === 'CHALLENGE' ? { rootChallengeId: rootId } : { rootSolutionId: rootId }),
},
});
const totalReplies = await prisma.comment.count({
where: {
rootType,
parentId: {
not: null,
},
visible: true,
...(rootType === 'CHALLENGE' ? { rootChallengeId: rootId } : { rootSolutionId: rootId }),
},
});
const comments = await prisma.comment.findMany({
skip: (page - 1) * PAGESIZE,
take: PAGESIZE,
where: {
rootType,
parentId,
...(rootType === 'CHALLENGE' ? { rootChallengeId: rootId } : { rootSolutionId: rootId }),
visible: true,
},
orderBy: orderBy(sortKey, sortOrder),
include: {
user: {
select: {
id: true,
name: true,
image: true,
},
},
_count: {
select: {
replies: true,
vote: true,
},
},
vote: {
select: {
userId: true,
},
where: {
userId: session?.user.id || '',
},
},
rootChallenge: {
select: {
name: true,
},
},
rootSolution: {
select: {
title: true,
},
},
},
});
const totalPages = Math.ceil(totalComments / PAGESIZE);
return {
totalComments: totalReplies + totalComments,
totalPages,
hasMore: page < totalPages,
comments,
};
}
export async function getAllComments({
rootId,
rootType,
parentId = null,
sortKey = 'createdAt',
sortOrder = 'desc',
}: {
rootId: number;
rootType: CommentRoot;
parentId?: number | null;
sortKey?: SortKey;
sortOrder?: SortOrder;
}) {
const session = await getServerAuthSession();
const comments = await prisma.comment.findMany({
where: {
rootType,
parentId,
...(rootType === 'CHALLENGE' ? { rootChallengeId: rootId } : { rootSolutionId: rootId }),
visible: true,
},
orderBy: orderBy(sortKey, sortOrder),
include: {
user: {
select: {
id: true,
name: true,
image: true,
},
},
_count: {
select: {
replies: true,
vote: true,
},
},
vote: {
select: {
userId: true,
},
where: {
userId: session?.user.id || '',
},
},
rootChallenge: {
select: {
name: true,
},
},
rootSolution: {
select: {
title: true,
},
},
},
});
return comments;
} |
# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
# # Writing a for loop and methods for avoiding loops # #
# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
# Just for Colin, do not run:
# server = livecode::serve_file()
# Let's take a look at `iris` again:
# Compute the average sepal length for each species:
# - using a for loop
for (spec in unique(iris$Species)) {
print(mean(iris$Sepal.Length[iris$Species == spec]))
}
# - using tapply()
tapply(iris$Sepal.Length, iris$Species, mean)
microbenchmark::microbenchmark(
for_loop = {
species <- unique(iris$Species)
for (spec in species) {
mean(iris$Sepal.Length[iris$Species == spec])
}
},
tapply = {
tapply(iris$Sepal.Length, iris$Species, mean)
},
times = 100
)
# tapply() is about 40 times faster than the for loop
# # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
x <- seq(-pi,pi,length=100)
d <- data.frame(x,y=sin(x), z=cos(x), w=sin((x+3)/2), u=cos(x)*sin(x))
# Plot the columns of `d` in different colors against `x` using a for loop
plot(NULL, xlim=range(x), ylim=range(d), xlab="x", ylab="function")
for(i in seq_along(d)){
lines(x, d[,i], col=i, lwd=3, lty=i)
}
legend("topleft",
legend=colnames(d),
col=seq_along(d),
lty=seq_along(d),
lwd=3,
bty='n')
# # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
# Reading many files at once
# # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
# Using list.files(), find all the files in the "Data" folder that have the "sampleX_tempK_time-UNIT.csv" pattern.
# Hint: look at the pattern argument of list.files()
flist <- list.files(path="Data", pattern="sample", full.names = TRUE)
# Using a for loop, read all these files into a list of data.frames.
# Name the elements of this list with the file names without the extension.
# Hint: look at the basename() function
mylist <- list()
for (i in seq_along(flist)) {
mylist[[i]] <- read.csv(flist[i])
names(mylist)[i] <- flist[i] |> basename() |> sub("\\.csv", "", x=_)
}
mylist
# Using a for loop, read all these files into a single tidy data.frame containing all the data from the list stacked on top of each other.
# Add a column with the file name.
# Hint: look at the rbind() function
mylist2 <- data.frame()
for (f in flist) {
temp <- read.csv(f)
temp$file <- f |> basename() |> sub("\\.csv", "", x=_)
mylist2 <- rbind(mylist2, temp)
}
mylist2
# Using lapply() and read.csv(), read all these files into a list of data.frames.
# Name the elements of this list with the file names without the extension.
# Hint: look at the basename() function
mylist3 <- lapply(flist, read.csv)
names(mylist3) <- flist |> basename() |> gsub("\\.csv", "", x=_)
mylist3$sample1_800K_20min
# Do the same using Map() and read.csv()
mylist4 <- Map(read.csv, flist)
mylist4[[1]]
# Do the same using purrr::map() and read.csv()
flist |> purrr::map(read.csv)
# Using do.call() and rbind(), create a single tidy data.frame containing all the data from the list stacked on top of each other.
# Hint: look at the do.call() function
d <- do.call(rbind, mylist4)
# Add a column with the file name.
# Hint2: look at the row.name() function
d$file <- row.names(d)
row.names(d) <- NULL # delete the row names
d$file <- sub(".csv\\..*", "", d$file) # remove file extension and number from the file column
head(d)
# Advanced: using the tidyverse (see URFIST R class #2 in 3 weeks)
mylist6 <- tibble(flist) |>
mutate(data = map(flist, read.csv)) |>
unnest(data) |
<template>
<div>
<div v-if="getCategory != null">
<div class="banner banner-cat" style="background-image: url('assets/images/banners/banner-top.jpg');">
<div class="banner-content container">
<h2 class="banner-subtitle">check out over <span>200+</span></h2>
<h1 class="banner-title">
{{ getCategory.title }}
</h1>
<a href="#" class="btn btn-dark">Alışverişe Başla</a>
</div><!-- End .banner-content -->
</div><!-- End .banner -->
<nav aria-label="breadcrumb" class="breadcrumb-nav">
<div class="container">
<ol class="breadcrumb">
<li class="breadcrumb-item"><a href="#"><i class="icon-home"></i></a></li>
<li class="breadcrumb-item"><a href="#">Kategoriler</a></li>
<li class="breadcrumb-item active" aria-current="page">{{ getCategory.title }}</li>
</ol>
</div><!-- End .container -->
</nav>
<div class="container">
<div class="row">
<div class="col-lg-9">
<nav class="toolbox">
<div class="toolbox-left">
<div class="toolbox-item toolbox-sort">
<div class="select-custom">
<select name="orderby" class="form-control">
<option value="menu_order" selected="selected">Default sorting</option>
<option value="popularity">Sort by popularity</option>
<option value="rating">Sort by average rating</option>
<option value="date">Sort by newness</option>
<option value="price">Sort by price: low to high</option>
<option value="price-desc">Sort by price: high to low</option>
</select>
</div><!-- End .select-custom -->
<a href="#" class="sorter-btn" title="Set Ascending Direction"><span class="sr-only">Set Ascending Direction</span></a>
</div><!-- End .toolbox-item -->
</div><!-- End .toolbox-left -->
<div class="toolbox-item toolbox-show">
<label>Showing 1–9 of 60 results</label>
</div><!-- End .toolbox-item -->
<div class="layout-modes">
<a href="category.html" class="layout-btn btn-grid active" title="Grid">
<i class="icon-mode-grid"></i>
</a>
<a href="category-list.html" class="layout-btn btn-list" title="List">
<i class="icon-mode-list"></i>
</a>
</div><!-- End .layout-modes -->
</nav>
<div class="row row-sm">
<div class="col-6 col-md-4" v-for="product in products">
<div class="product-default">
<figure>
<router-link tag="a" :to="{ name : 'products.detail',params : {slug : product.slug}}">
<img :src="$root.productImage(product.image)">
</router-link>
</figure>
<div class="product-details">
<div class="ratings-container">
<div class="product-ratings">
<span class="ratings" style="width:100%"></span><!-- End .ratings -->
<span class="tooltiptext tooltip-top"></span>
</div><!-- End .product-ratings -->
</div><!-- End .product-container -->
<h2 class="product-title">
<a href="product.html">{{ product.title }}</a>
</h2>
<div class="price-box">
<span class="product-price">₺ {{ product.price }}</span>
</div><!-- End .price-box -->
<div class="product-action">
<a href="#" class="btn-icon-wish"><i class="icon-heart"></i></a>
<button class="btn-icon btn-add-cart" data-toggle="modal" data-target="#addCartModal"><i class="icon-bag"></i>ADD TO CART</button>
<a href="ajax/product-quick-view.html" class="btn-quickview" title="Quick View"><i class="fas fa-external-link-alt"></i></a>
</div>
</div><!-- End .product-details -->
</div>
</div>
</div>
<nav class="toolbox toolbox-pagination">
<div class="toolbox-item toolbox-show">
<label>Showing 1–9 of 60 results</label>
</div><!-- End .toolbox-item -->
<ul class="pagination">
<li class="page-item disabled">
<a class="page-link page-link-btn" href="#"><i class="icon-angle-left"></i></a>
</li>
<li class="page-item active">
<a class="page-link" href="#">1 <span class="sr-only">(current)</span></a>
</li>
<li class="page-item"><a class="page-link" href="#">2</a></li>
<li class="page-item"><a class="page-link" href="#">3</a></li>
<li class="page-item"><a class="page-link" href="#">4</a></li>
<li class="page-item"><span>...</span></li>
<li class="page-item"><a class="page-link" href="#">15</a></li>
<li class="page-item">
<a class="page-link page-link-btn" href="#"><i class="icon-angle-right"></i></a>
</li>
</ul>
</nav>
</div><!-- End .col-lg-9 -->
<aside class="sidebar-shop col-lg-3 order-lg-first">
<div class="sidebar-wrapper">
<div class="widget">
<h3 class="widget-title">
<a data-toggle="collapse" href="#widget-body-1" role="button" aria-expanded="true" aria-controls="widget-body-1">Kategoriler</a>
</h3>
<div class="collapse show" id="widget-body-1">
<div class="widget-body">
<ul class="cat-list">
<li v-for="category in getCategories">
<router-link tag="a" :to="{name : 'categories.show',params : {slug : category.slug}}">
{{ category.title }}
</router-link>
</li>
</ul>
</div><!-- End .widget-body -->
</div><!-- End .collapse -->
</div><!-- End .widget -->
<div class="widget">
<h3 class="widget-title">
<a data-toggle="collapse" href="#widget-body-2" role="button" aria-expanded="true" aria-controls="widget-body-2">Price</a>
</h3>
<div class="collapse show" id="widget-body-2">
<div class="widget-body">
<form action="#">
<div class="price-slider-wrapper">
<div id="price-slider"></div><!-- End #price-slider -->
</div><!-- End .price-slider-wrapper -->
<div class="filter-price-action">
<button type="submit" class="btn btn-primary">Filter</button>
<div class="filter-price-text">
<span id="filter-price-range"></span>
</div><!-- End .filter-price-text -->
</div><!-- End .filter-price-action -->
</form>
</div><!-- End .widget-body -->
</div><!-- End .collapse -->
</div><!-- End .widget -->
<div class="widget">
<h3 class="widget-title">
<a data-toggle="collapse" href="#widget-body-3" role="button" aria-expanded="true" aria-controls="widget-body-3">Size</a>
</h3>
<div class="collapse show" id="widget-body-3">
<div class="widget-body">
<ul class="config-size-list">
<li><a href="#">S</a></li>
<li class="active"><a href="#">M</a></li>
<li><a href="#">L</a></li>
<li><a href="#">XL</a></li>
<li><a href="#">2XL</a></li>
<li><a href="#">3XL</a></li>
</ul>
</div><!-- End .widget-body -->
</div><!-- End .collapse -->
</div><!-- End .widget -->
<div class="widget">
<h3 class="widget-title">
<a data-toggle="collapse" href="#widget-body-4" role="button" aria-expanded="true" aria-controls="widget-body-4">Brands</a>
</h3>
<div class="collapse show" id="widget-body-4">
<div class="widget-body">
<ul class="cat-list">
<li><a href="#">Adidas <span>18</span></a></li>
<li><a href="#">Camel <span>22</span></a></li>
<li><a href="#">Seiko <span>05</span></a></li>
<li><a href="#">Samsung Galaxy <span>68</span></a></li>
<li><a href="#">Sony <span>03</span></a></li>
</ul>
</div><!-- End .widget-body -->
</div><!-- End .collapse -->
</div><!-- End .widget -->
<div class="widget">
<h3 class="widget-title">
<a data-toggle="collapse" href="#widget-body-6" role="button" aria-expanded="true" aria-controls="widget-body-6">Color</a>
</h3>
<div class="collapse show" id="widget-body-6">
<div class="widget-body">
<ul class="config-swatch-list">
<li>
<a href="#" style="background-color: #4090d5;"></a>
</li>
<li class="active">
<a href="#" style="background-color: #f5494a;"></a>
</li>
<li>
<a href="#" style="background-color: #fca309;"></a>
</li>
<li>
<a href="#" style="background-color: #11426b;"></a>
</li>
<li>
<a href="#" style="background-color: #f0f0f0;"></a>
</li>
<li>
<a href="#" style="background-color: #3fd5c9;"></a>
</li>
<li>
<a href="#" style="background-color: #979c1c;"></a>
</li>
<li>
<a href="#" style="background-color: #7d5a3c;"></a>
</li>
</ul>
</div><!-- End .widget-body -->
</div><!-- End .collapse -->
</div><!-- End .widget -->
<div class="widget widget-featured">
<h3 class="widget-title">Featured Products</h3>
<div class="widget-body">
<div class="owl-carousel widget-featured-products">
<div class="featured-col">
<div class="product-default left-details product-widget">
<figure>
<a href="product.html">
<img src="assets/images/products/product-7.jpg">
</a>
</figure>
<div class="product-details">
<h2 class="product-title">
<a href="product.html">Product Short Name</a>
</h2>
<div class="ratings-container">
<div class="product-ratings">
<span class="ratings" style="width:100%"></span><!-- End .ratings -->
<span class="tooltiptext tooltip-top"></span>
</div><!-- End .product-ratings -->
</div><!-- End .product-container -->
<div class="price-box">
<span class="product-price">$49.00</span>
</div><!-- End .price-box -->
</div><!-- End .product-details -->
</div>
<div class="product-default left-details product-widget">
<figure>
<a href="product.html">
<img src="assets/images/products/product-8.jpg">
</a>
</figure>
<div class="product-details">
<h2 class="product-title">
<a href="product.html">Product Short Name</a>
</h2>
<div class="ratings-container">
<div class="product-ratings">
<span class="ratings" style="width:100%"></span><!-- End .ratings -->
<span class="tooltiptext tooltip-top"></span>
</div><!-- End .product-ratings -->
</div><!-- End .product-container -->
<div class="price-box">
<span class="product-price">$49.00</span>
</div><!-- End .price-box -->
</div><!-- End .product-details -->
</div>
<div class="product-default left-details product-widget">
<figure>
<a href="product.html">
<img src="assets/images/products/product-9.jpg">
</a>
</figure>
<div class="product-details">
<h2 class="product-title">
<a href="product.html">Product Short Name</a>
</h2>
<div class="ratings-container">
<div class="product-ratings">
<span class="ratings" style="width:100%"></span><!-- End .ratings -->
<span class="tooltiptext tooltip-top"></span>
</div><!-- End .product-ratings -->
</div><!-- End .product-container -->
<div class="price-box">
<span class="product-price">$49.00</span>
</div><!-- End .price-box -->
</div><!-- End .product-details -->
</div>
</div><!-- End .featured-col -->
<div class="featured-col">
<div class="product-default left-details product-widget">
<figure>
<a href="product.html">
<img src="assets/images/products/product-10.jpg">
</a>
</figure>
<div class="product-details">
<h2 class="product-title">
<a href="product.html">Product Short Name</a>
</h2>
<div class="ratings-container">
<div class="product-ratings">
<span class="ratings" style="width:100%"></span><!-- End .ratings -->
<span class="tooltiptext tooltip-top"></span>
</div><!-- End .product-ratings -->
</div><!-- End .product-container -->
<div class="price-box">
<span class="product-price">$49.00</span>
</div><!-- End .price-box -->
</div><!-- End .product-details -->
</div>
<div class="product-default left-details product-widget">
<figure>
<a href="product.html">
<img src="assets/images/products/product-11.jpg">
</a>
</figure>
<div class="product-details">
<h2 class="product-title">
<a href="product.html">Product Short Name</a>
</h2>
<div class="ratings-container">
<div class="product-ratings">
<span class="ratings" style="width:100%"></span><!-- End .ratings -->
<span class="tooltiptext tooltip-top"></span>
</div><!-- End .product-ratings -->
</div><!-- End .product-container -->
<div class="price-box">
<span class="product-price">$49.00</span>
</div><!-- End .price-box -->
</div><!-- End .product-details -->
</div>
<div class="product-default left-details product-widget">
<figure>
<a href="product.html">
<img src="assets/images/products/product-12.jpg">
</a>
</figure>
<div class="product-details">
<h2 class="product-title">
<a href="product.html">Product Short Name</a>
</h2>
<div class="ratings-container">
<div class="product-ratings">
<span class="ratings" style="width:100%"></span><!-- End .ratings -->
<span class="tooltiptext tooltip-top"></span>
</div><!-- End .product-ratings -->
</div><!-- End .product-container -->
<div class="price-box">
<span class="product-price">$49.00</span>
</div><!-- End .price-box -->
</div><!-- End .product-details -->
</div>
</div><!-- End .featured-col -->
</div><!-- End .widget-featured-slider -->
</div><!-- End .widget-body -->
</div><!-- End .widget -->
<div class="widget widget-block">
<h3 class="widget-title">{{ getCategory.title }}</h3>
<h5>This is a custom sub-title.</h5>
<p>{{ getCategory.short_description }}</p>
</div><!-- End .widget -->
</div><!-- End .sidebar-wrapper -->
</aside><!-- End .col-lg-3 -->
</div><!-- End .row -->
</div><!-- End .container -->
<div class="mb-5"></div><!-- margin -->
</div>
</div>
</template>
<script>
import {mapGetters} from "vuex";
export default {
name: "CategoryDetail",
data() {
return {
products: []
}
},
methods: {
async getProductsByCategorySlug(slug) {
const {data} = await this.$store.dispatch('product/getProductsByCategorySlug', slug)
this.products = data.products.data
}
},
computed: {
...mapGetters('category', ['getCategory', 'getCategories']),
},
beforeRouteUpdate(to, from, next) {
this.$store.dispatch('category/getCategoryBySlug', to.params.slug)
this.getProductsByCategorySlug(to.params.slug)
next()
},
mounted() {
this.$store.dispatch('category/getCategoryBySlug', this.$route.params.slug)
this.getProductsByCategorySlug(this.$route.params.slug)
},
}
</script> |
import SmallCard from "./SmallCard";
import css from "./SmallCardGrid.module.css";
interface CardData {
title: string;
icon: string;
href?: string;
onClick?: () => void;
}
interface SmallCardGridProps {
cards: Array<CardData>;
}
const SmallCardGrid: React.FC<SmallCardGridProps> = (
props: SmallCardGridProps
) => {
return (
<div className={css.gridList}>
{props.cards.map((card, index) => {
return (
<SmallCard
key={index}
title={card.title}
icon={card.icon}
href={card.href}
onClick={card.onClick}
/>
);
})}
</div>
);
};
export default SmallCardGrid; |
package org.tpjava.emsbackend.model;
import jakarta.persistence.*;
import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.Setter;
@NoArgsConstructor
@AllArgsConstructor
@Getter
@Setter
@Entity
@Table(name = "products")
public class Product {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
private Long id;
@Column(name = "name", nullable = false)
private String name;
@Column(name = "description", nullable = true)
private String description;
@Column(name = "price", nullable = false)
private Double price;
@Column(name = "ImageLink")
private String ImageLink;
@Column(name = "quantity", nullable = false)
private Integer quantity;
@Column(name = "category", nullable = true)
private String category;
@Column(name = "sku", nullable = false, unique = true)
private String sku;
} |
import React, { useState } from "react";
import Dialog from "@mui/material/Dialog";
import DialogActions from "@mui/material/DialogActions";
import DialogContent from "@mui/material/DialogContent";
import DialogContentText from "@mui/material/DialogContentText";
import DialogTitle from "@mui/material/DialogTitle";
import Button from "@mui/material/Button";
import TextField from "@mui/material/TextField";
export const SaveAsModal = ({ open, onClose, onSaveAs }) => {
const [newFileName, setNewFileName] = useState("");
const handleSave = () => {
onSaveAs(newFileName);
onClose();
};
return (
<Dialog open={open} onClose={onClose}>
<DialogTitle>다른 이름으로 저장</DialogTitle>
<DialogContent>
<DialogContentText>새로운 파일 이름을 입력해주세요.</DialogContentText>
<TextField
autoFocus
margin="dense"
label="파일 이름"
type="text"
fullWidth
value={newFileName}
onChange={(e) => setNewFileName(e.target.value)}
/>
</DialogContent>
<DialogActions>
<Button onClick={onClose} color="primary">
취소
</Button>
<Button onClick={handleSave} color="primary">
저장
</Button>
</DialogActions>
</Dialog>
);
}; |
package app.commands.executables;
import app.commands.Executable;
import app.io.nodes.Node;
import app.io.nodes.input.InputNode;
import com.fasterxml.jackson.annotation.JsonPropertyOrder;
import library.Library;
import library.entities.audio.audioFiles.Song;
import library.users.User;
import lombok.Getter;
import lombok.Setter;
import static app.Constants.DOESNT_EXIST;
import static app.Constants.THE_USERNAME;
public final class AdBreak implements Executable {
@Override
public Node execute(final InputNode command) {
User user = Library.getInstance().getUserByName(command.getUsername());
if (user == null) {
return new AdBreakOutputNode(command, THE_USERNAME + command.getUsername() + DOESNT_EXIST);
}
if (!user.getAudioPlayer().hasLoadedMusic()) {
return new AdBreakOutputNode(command, command.getUsername() + " is not playing any music.");
}
user.getAudioPlayer().getAd().setShouldAdBePlayed(true);
user.getAudioPlayer().getAd().setPrice(command.getPrice());
if (user.getAudioPlayer().hasLoadedMusic()) {
Song currentSong =
(Song)
user.getAudioPlayer().getTimeManager().getPlayingAudioEntity(user.getAudioPlayer());
long remainingSongTime =
user.getAudioPlayer().getTimeManager().getRemainingTime(user.getAudioPlayer());
if (currentSong != null && currentSong.getDuration() == remainingSongTime) {
user.getAudioPlayer().startAd(command.getTimestamp());
}
}
return new AdBreakOutputNode(command, "Ad inserted successfully.");
}
@Getter
@Setter
@JsonPropertyOrder({"command", "user", "timestamp", "message"})
private final class AdBreakOutputNode extends Node {
private String user;
private String message;
AdBreakOutputNode(final InputNode command, final String message) {
this.setCommand(command.getCommand());
this.setTimestamp(command.getTimestamp());
this.setUser(command.getUsername());
this.setMessage(message);
}
}
} |
package assign03;
import java.io.File;
import java.io.FileNotFoundException;
import java.util.Scanner;
/**
* CS 1420 Assignment 3
*
* @author Zifan Zuo
* @version Sep 9, 2024
*/
public class GradeCalculator {
public static void main(String[] args) {
// to get a valid input file
Scanner file;
do {
System.out.println("Please type the file name: ");
Scanner input = new Scanner(System.in);
try {
File filename = new File(input.nextLine());
file = new Scanner(filename);
input.close();
break;
} catch (FileNotFoundException e) {
System.out.println("File not found, please check your typing.");
}
} while (true);
// read first three line as average exam, lab, and quiz score
double examScore, labScore, quizScore;
examScore = file.nextFloat();
labScore = file.nextFloat();
quizScore = file.nextFloat();
// read through the file and store all assignments score into an array
int length = file.nextInt();
int[] assignScores = new int[length];
for (int i = 0; i < length; i++) {
assignScores[i] = file.nextInt();
}
// analysis assignment scores
int highestAssignScore = 0;
int lowestAssignScore = 101;
int zeroPointTimes = 0;
int totalScore = 0;
double avarageAssignScore;
for (int i = 0; i < length; i++) {
if (assignScores[i] > highestAssignScore)
highestAssignScore = assignScores[i];
if (assignScores[i] < lowestAssignScore)
lowestAssignScore = assignScores[i];
if (assignScores[i] == 0)
zeroPointTimes++;
totalScore += assignScores[i];
}
avarageAssignScore = (double) totalScore / length;
// deal with zero point assignments
int[] zeroPointIndex = new int[zeroPointTimes];
int zeroCount = 0;
String zeroString = "";
for (int i = 0; i < length; i++) {
if (assignScores[i] == 0) {
zeroPointIndex[zeroCount] = i;
zeroString += i + " ";
zeroCount++;
}
}
// calculate the course grade
double finalScore = examScore * 0.45 + avarageAssignScore * 0.35 + labScore * 0.1 + quizScore * 0.1;
String letterGrade;
if (finalScore >= 93)
letterGrade = "A";
else if (finalScore >= 90)
letterGrade = "A-";
else if (finalScore >= 87)
letterGrade = "B+";
else if (finalScore >= 83)
letterGrade = "B";
else if (finalScore >= 80)
letterGrade = "B-";
else if (finalScore >= 77)
letterGrade = "C+";
else if (finalScore >= 73)
letterGrade = "C";
else if (finalScore >= 70)
letterGrade = "C-";
else if (finalScore >= 67)
letterGrade = "D+";
else if (finalScore >= 63)
letterGrade = "D";
else if (finalScore >= 60)
letterGrade = "D-";
else
letterGrade = "E";
// print out the summary
System.out.printf("Average assignment score: %.2f \n", avarageAssignScore);
System.out.println("" + "0 points on assignment(s): " + zeroString + "\n"
+ "Highest assignment score: " + highestAssignScore + "\n"
+ "Lowest assignment score: " + lowestAssignScore);
System.out.printf("Course grade (numeric): %.2f \n", finalScore);
System.out.println("Course grade (letter): " + letterGrade);
}
} |
<!doctype html>
<html lang="en">
<head>
<!-- Required meta tags -->
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<!-- Bootstrap CSS -->
<link rel="stylesheet" href="https://stackpath.bootstrapcdn.com/bootstrap/4.3.1/css/bootstrap.min.css"
integrity="sha384-ggOyR0iXCbMQv3Xipma34MD+dH/1fQ784/j6cY/iJTQUOhcWr7x9JvoRxT2MZw1T" crossorigin="anonymous">
<link rel="stylesheet" href="https://fonts.googleapis.com/icon?family=Material+Icons">
<title>Search</title>
<!-- Style for Google map -->
<style>
#map {
height: 336px;
/* The height is 400 pixels */
width: 100%;
/* The width is the width of the web page */
}
</style>
<!--Style for search results-->
</head>
<body style="background-color:tomato">
<!--Top Navigation-->
<div style="max-width:100%; text-align: center;">
<div class="mt-4" style="max-width: 336px; text-align: right; margin: auto;">
<button type="button" class="btn btn-outline-light">Sign Up</button>
<button type="button" class="btn btn-outline-light">Login</button>
</div>
</div>
<!--Google Map-->
<div id="outerdiv" class="mt-1" style="max-width: 336px; margin: auto; text-align: center;">
<div id="map" class="text-center"></div>
</div>
<!--Filters-->
<form class="input-group mb-3 mt-4" style="max-width: 100%; text-align: center;" action="search.html" method="GET">
<div class="input-group" style="max-width: 336px; max-height: 47px; width: 100%; margin: auto; text-align: center;">
<button type="button" class="btn btn-outline-secondary dropdown-toggle dropdown-toggle-split bg-light"
data-toggle="dropdown" aria-haspopup="true" aria-expanded="false">
<span class="sr-only">Toggle Dropdown</span>
Filters</button>
<div class="dropdown-menu">
<optgroup label="Diet">
<option class="dropdown-item" href="#">Low-Calorie</a>
<option class="dropdown-item" href="#">Low-Fat</a>
<option class="dropdown-item" href="#">High-Protein</a>
<option class="dropdown-item" href="#">Dash</a>
<option class="dropdown-item" href="#">Flexitarian</a>
</optgroup>
<optgroup label="Cuisine">
<option class="dropdown-item" href="#">Mediterranean</a>
<option class="dropdown-item" href="#">American</a>
<option class="dropdown-item" href="#">Italian</a>
<option class="dropdown-item" href="#">Chinese</a>
<option class="dropdown-item" href="#">Mexican</a>
</optgroup>
</div>
<input type="text" class="form-control" placeholder="Search a diet or food here..."
aria-label="Search a diet or food here..." aria-describedby="button-addon2">
<div class="input-group-append">
<button class="btn btn-outline-secondary bg-light" type="submit" id="button-addon2">Search</button>
</div>
</div>
</form>
<!-- Search results -->
<div class="container">
<div class="card flex-row flex-wrap" style="max-width: 336px; margin: auto; text-align: center;">
<div class="card-header border-0">
<img src="../Foodie/Pictures/foodPic1.jpg" alt="" style="width: 100px; height: 100px;left: 0px;">
</div>
<div class="card-block px-2">
<h4 class="card-title">Title</h4>
<p class="card-text">Description</p>
<a href="#" class="btn btn-primary">More Information</a>
</div>
</div>
<div class="card flex-row flex-wrap" style="max-width: 336px; margin: auto; text-align: center;">
<div class="card-header border-0">
<img src="../Foodie/Pictures/foodPic2.jpg" alt="" style="width: 100px; height: 100px;left: 0px;">
</div>
<div class="card-block px-2">
<h4 class="card-title">Title</h4>
<p class="card-text">Description</p>
<a href="#" class="btn btn-primary">More Inforomation</a>
</div>
</div>
<div class="card flex-row flex-wrap" style="max-width: 336px; margin: auto; text-align: center;">
<div class="card-header border-0">
<img src="../Foodie/Pictures/foodPic1.jpg" alt="" style="width: 100px; height: 100px;left: 0px;">
</div>
<div class="card-block px-2">
<h4 class="card-title">Title</h4>
<p class="card-text">Description</p>
<a href="#" class="btn btn-primary">More Information</a>
</div>
</div>
</div>
<!-- Optional JavaScript -->
<!-- jQuery first, then Popper.js, then Bootstrap JS -->
<script src="https://code.jquery.com/jquery-3.3.1.slim.min.js"
integrity="sha384-q8i/X+965DzO0rT7abK41JStQIAqVgRVzpbzo5smXKp4YfRvH+8abtTE1Pi6jizo"
crossorigin="anonymous"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/popper.js/1.14.7/umd/popper.min.js"
integrity="sha384-UO2eT0CpHqdSJQ6hJty5KVphtPhzWj9WO1clHTMGa3JDZwrnQq4sF86dIHNDz0W1"
crossorigin="anonymous"></script>
<script src="https://stackpath.bootstrapcdn.com/bootstrap/4.3.1/js/bootstrap.min.js"
integrity="sha384-JjSmVgyd0p3pXB1rRibZUAYoIIy6OrQ6VrjIEaFf/nJGzIxFDsf4x0xIM+B07jRM"
crossorigin="anonymous"></script>
<!--Script and Api for google maps-->
<script>
var map;
function initMap() {
map = new google.maps.Map(document.getElementById('map'), {
center: { lat: 37.7749, lng: -122.4194 },
zoom: 4
});
}
</script>
<script src="https://maps.googleapis.com/maps/api/js?key=AIzaSyBzuRuJxXZFZYBtwQshhslql4haoR1PdbE&callback=initMap"
async defer>
</script>
</body>
<footer>
<!--Bottom Navigation Bar-->
<nav class="navbar fixed-bottom navbar-light bg-light" style="max-width: 336px; width: 100%; margin: auto;">
<a class="navbar-brand" href="../Foodie/index.html">
<img src="..\Foodie\Pictures\baseline_home_black_18dp.png" class="img-fluid rounded" alt="Responsive image">
</a>
<a class="navbar-brand" href="../Foodie/search.html">
<img src="..\Foodie\Pictures\baseline_near_me_black_18dp.png" class="img-fluid rounded"
alt="Responsive image">
</a>
<a class="navbar-brand" href="../Foodie/profile.html">
<img src="..\Foodie\Pictures\baseline_person_black_18dp.png" class="img-fluid rounded"
alt="Responsive image">
</a>
<a class="navbar-brand" href="#">
<img src="..\Foodie\Pictures\baseline_favorite_black_18dp.png" class="img-fluid rounded"
alt="Responsive image">
</a>
</nav>
</footer>
</html> |
Semantic Policy Difference Tool for Security Enhanced Linux
Overview:
---------
The sediff and sediffx programs are policy analysis tools that take
two policies and compare them, showing a list of the differences. The
former is a command-line only program while the latter is a GTK+
application. They can compare source, monolithic binary, and modular
binary policies; they can also compare different versions of policy.
The two programs support source policies versions 12 and higher,
monolithic binary versions 15 and higher, and modular binary versions
5 and higher.
Limitations:
------------
The programs currently compare the following policy elements:
- commons and object classes
- levels and categories
- types and attributes
- roles
- users
- booleans
- access vector rules (allow, neverallow, etc.)
- type rules (type_transition, type_member, etc.)
- role allow rules
- role transition rules
- range transition rules
What is a Semantic Diff?
------------------------
The challenge with comparing two policies is that a straightforward
textual comparison is of little value. What one needs is the ability
to determine semantically how two policies differ. For example, one
could not simply grep for allow rules with a given type, and then
compare them to a similar list from another policy. Many factors
affect the semantic meaning of the rules. For example, multiple rules
can allow differing sets of permissions. Attributes can allow
permissions to or from a type. What was a type in one policy could
become an alias in another.
What sediff and sediffx do are analyze each policy semantically. We
define "semantically" as how the kernel security server uses a policy
to make enforcement decisions. This approach also allows binary and
source policies to be compared, as well as different versions of
policies.
NOTE: The one semantic assumption sediff and sediffx make is that when
an identifier (e.g., a type name) has the same string value in each
policy, then it represents the same semantic meaning in both policies.
sediff and sediffx Commands:
----------------------------
Policies may be differentiated upon the command line (see "sediff
--help") or in a graphical environment (see "sediffx --help"). The
sediffx tool is recommended because it gives additional details about
policy differences and affords type remappings. The remainder of this
document focuses on sediffx.
Understanding sediffx's Results:
--------------------------------
After calculating differences between two policies, the GUI shows the
compared policy components in the top-left frame. Besides each policy
component is a number representing the total number of differences for
that component. Select a policy component to show detailed results in
the right-hand window.
NOTE: All differences are shown from the perspective of the first
policy given (i.e., "original policy") to the second ("modified
policy"). There are five types of differences shown:
- Added (+): A policy component was added by the second policy (in
modified policy but not original policy).
- Removed (-): A policy component was removed by the second policy
(in original policy but not modified policy).
- Modified (*): A policy component was present in both policies, but
is different in the modified policy.
Where appropriate, two other differences are possible:
- Added because of new type (+): This policy component could not
exist in the original policy because that policy does not declare
a necessary type.
- Removed because of missing type (-): This policy component could
not exist in the modified policy because that policy no longer
declares a necessary type.
Supported Policy Areas Differences:
-----------------------------------
Below is an explanation of the difference for each supported policy
area:
Commons:
--------
Classes can be added, removed, or modified. Modified means that
the list of permissions associated with a common is different.
Classes:
--------
Classes are compared much like commons. They too may be added,
removed, or modified.
Levels:
-------
If either policy is MLS then levels will be compared. Levels can be
added or removed; a modified level means that the categories
assigned to its corresponding sensitivity has changed. Be aware
that levels' aliases are ignored by the diff algorithm.
Categories:
-----------
If either policy is MLS then categories will be compared. They can
be added or removed; there is no such thing as a "modified"
category. Be aware that categories' aliases are ignored by the
diff algorithm.
Types:
------
Types can be added, removed, or modified. Modified means that the
attributes associated with a type are different between the two
policies.
Attributes:
-----------
Attributes are compared like types. They can be added, removed,
or modified. Modified means that the types associated with the
attributes are different (types can be added or removed from the
attribute).
Roles:
------
Roles can be added, removed, or modified. Modified means that the
types associated with a role are different between the two policies.
Types can be added or removed from a role.
Users:
------
Users can be added, removed, or modified. Modified means that the
roles associated with a user are different between the two policies.
Roles can be added or removed from a user. In addition, if either
policy is MLS then the users' ranges and default levels are also
compared.
Booleans:
---------
Booleans can be added, removed, or modified. If comparing a version
15 or earlier policy with a version 16 or later policy, all the
booleans will be added or removed, for booleans were introduced in
version 16. Modified means that the default value is different
between the two policies.
AV Rules:
---------
Finding differences in access vector rules (allow, neverallow, etc.)
consumes the majority of time when diffing two policies. The rule
comparison is truly semantic. All issues of redundancy and
duplication, as well as indirect access through attributes are
resolved. All rules are keyed by the "source-target-class" (STC)
triple. In addition, conditional rules are distinguished from
non-conditional rules. Thus, for example, two rules with the same
STC will not be compared if one is non-conditional and the other is
conditional, or if both are conditional but conditioned on two
different conditional expressions. For conditional rules, the
conditional expression is compared to ensure that conditional rules
are meaningfully compared. In the results pane, conditional rules
are displayed with their associated conditional expression and if
the rule was in conditional's TRUE or FALSE branch.
NOTE: For conditional rules, the default and current values of
the booleans are ignored. Conditional expressions are compared
as if all booleans were in the same state.
Rules can be added, removed, or modified. Added means the STC
triple for that rule is not present in the original policy but in
the modified one. Removed means the STC triple is present in the
original but not modified policy. Modified means that the
permissions for the rule are different between the policies.
When source policies are compared, hyperlinked line numbers are
shown that takes the user to the policy's source where the rule was
defined. If there were more than one source rules that contributed
to a STC triple for a given rule, then all relevant source rules are
linked. Furthermore, the user may click upon an individual
permission to obtain a list of lines that contributed just that
permission.
Type Rules:
-----------
Type rules are type_transition, type_member, and type_change. They
are differentiated much like AV rules in that their STCs are used as
keys. For type rules, modified means that the default type is
different between the policies.
Role Allows:
------------
Role allow rules determine if a role is allowed to transition to
another role. Diffing a role allow involves taking the source role
and checking to see if there are corresponding rules in the other
policy with the same source role. A modified role allow means the
same source exist in both policies but target roles differ.
Role Transitions:
-----------------
Role transitions are keyed against both the source role and target
type. If a role transition exists in both policies but has a
different default role then it is marked as modified.
Range Transitions:
------------------
Range transitions have a STC much like AV rules. A modified range
transition indicates a difference in the rules' target ranges. This
could be either a difference in level or in minimal category set.
Policy Tabs:
------------
Each policy has a tab on the main window labeled Policy #: followed by
the policy file name. Under these tabs are a policy statistics tab
and a source tab.
Policy Statistics Tab:
----------------------
The policy statistics tab displays a summary of that policy's
contents.
Source Tab:
-----------
If the policy is a source policy, this tab displays the source of
that policy.
Remapping Types:
----------------
The diff algorithm implicitly treats a type with the same name across
both policies as the same semantic item. This includes a name that
was a type in one policy but became an alias in the other. There may
be instances where the operator has special knowledge of the remaining
unmapped types. From sediffx's main interface, select "Remap Types"
from the Tools menu to open a dialog box. Add additional remappings
between types as necessary.
There are times when a one-to-one mapping is not sufficient for
analysis purposes. Occasionally a type is "split" into two or more
types; conversely multiple types are "joined" into a single type. For
example, a policy has the type "games_t" for a number of programs. At
some point in the future the policy writer decides to give NetHack its
own type, "nethack_t". To represent this type split in sediffx, go to
the Remap Types dialog. There should already be an inferred mapping
from the original policy's games_t to the modified policy. Add a new
mapping from games_t to nethack_t. The mappings list will be updated
to show that games_t now maps to both games_t and nethack_t. |
import copy
import functools
import gc
import inspect
import logging
import os
import warnings
import numpy as np
from astropy.convolution import convolve_fft
from astropy.io import fits
from astropy.nddata.bitmask import interpret_bit_flags, bitfield_to_boolean_mask
from astropy.stats import sigma_clipped_stats, SigmaClip
from astropy.table import Table
from astropy.wcs import WCS
from astropy.wcs.wcsapi import SlicedLowLevelWCS
from photutils.segmentation import detect_threshold, detect_sources
from reproject import reproject_interp, reproject_adaptive, reproject_exact
from reproject.mosaicking import find_optimal_celestial_wcs, reproject_and_coadd
from reproject.mosaicking.subset_array import ReprojectedArraySubset
from scipy.interpolate import RegularGridInterpolator
from stdatamodels import util
from stdatamodels.jwst import datamodels
from stdatamodels.jwst.datamodels.dqflags import pixel
from .. import __version__
try:
import tomllib
except ModuleNotFoundError:
import tomli as tomllib
ALLOWED_REPROJECT_FUNCS = [
"interp",
"adaptive",
"exact",
]
# Useful values
PIXEL_SCALE_NAMES = ["XPIXSIZE", "CDELT1", "CD1_1", "PIXELSCL"]
# Pixel scales
jwst_pixel_scales = {
"miri": 0.11,
"nircam_long": 0.063,
"nircam_short": 0.031,
}
# All NIRCAM bands
nircam_bands = [
"F070W",
"F090W",
"F115W",
"F140M",
"F150W",
"F162M",
"F164N",
"F150W2",
"F182M",
"F187N",
"F200W",
"F210M",
"F212N",
"F250M",
"F277W",
"F300M",
"F322W2",
"F323N",
"F335M",
"F356W",
"F360M",
"F405N",
"F410M",
"F430M",
"F444W",
"F460M",
"F466N",
"F470N",
"F480M",
]
# All MIRI bands
miri_bands = [
"F560W",
"F770W",
"F1000W",
"F1130W",
"F1280W",
"F1500W",
"F1800W",
"F2100W",
"F2550W",
]
# FWHM of bands in pixels
fwhms_pix = {
# NIRCAM
"F070W": 0.987,
"F090W": 1.103,
"F115W": 1.298,
"F140M": 1.553,
"F150W": 1.628,
"F162M": 1.770,
"F164N": 1.801,
"F150W2": 1.494,
"F182M": 1.990,
"F187N": 2.060,
"F200W": 2.141,
"F210M": 2.304,
"F212N": 2.341,
"F250M": 1.340,
"F277W": 1.444,
"F300M": 1.585,
"F322W2": 1.547,
"F323N": 1.711,
"F335M": 1.760,
"F356W": 1.830,
"F360M": 1.901,
"F405N": 2.165,
"F410M": 2.179,
"F430M": 2.300,
"F444W": 2.302,
"F460M": 2.459,
"F466N": 2.507,
"F470N": 2.535,
"F480M": 2.574,
# MIRI
"F560W": 1.882,
"F770W": 2.445,
"F1000W": 2.982,
"F1130W": 3.409,
"F1280W": 3.818,
"F1500W": 4.436,
"F1800W": 5.373,
"F2100W": 6.127,
"F2550W": 7.300,
}
band_exts = {
"nircam": "nrc*",
"miri": "mirimage",
}
log = logging.getLogger("stpipe")
log.addHandler(logging.NullHandler())
def get_pixscale(hdu):
"""Get pixel scale from header.
Checks HDU header and returns a pixel scale
Args:
hdu: hdu to get pixel scale for
"""
for pixel_keyword in PIXEL_SCALE_NAMES:
try:
try:
pix_scale = np.abs(float(hdu.header[pixel_keyword]))
except ValueError:
continue
if pixel_keyword in ["CDELT1", "CD1_1"]:
pix_scale = WCS(hdu.header).proj_plane_pixel_scales()[0].value * 3600
# pix_scale *= 3600
return pix_scale
except KeyError:
pass
raise Warning("No pixel scale found")
def load_toml(filename):
"""Open a .toml file
Args:
filename (str): Path to toml file
"""
with open(filename, "rb") as f:
toml_dict = tomllib.load(f)
return toml_dict
def get_band_type(
band,
short_long_nircam=False,
):
"""Get the instrument type from the band name
Args:
band (str): Name of band
short_long_nircam (bool): Whether to distinguish between short/long
NIRCam bands. Defaults to False
"""
if band in miri_bands:
band_type = "miri"
elif band in nircam_bands:
band_type = "nircam"
else:
raise ValueError(f"band {band} unknown")
if not short_long_nircam:
return band_type
else:
if band_type in ["nircam"]:
if int(band[1:4]) <= 212:
short_long = "nircam_short"
else:
short_long = "nircam_long"
band_type = "nircam"
else:
short_long = copy.deepcopy(band_type)
return band_type, short_long
def get_band_ext(band):
"""Get the specific extension (e.g. mirimage) for a band"""
band_type = get_band_type(band)
band_ext = band_exts[band_type]
return band_ext
def get_default_args(func):
"""Pull the default arguments from a function"""
signature = inspect.signature(func)
return {
k: v.default
for k, v in signature.parameters.items()
if v.default is not inspect.Parameter.empty
}
def get_kws(
parameters,
func,
band,
target,
max_level=None,
):
"""Set up kwarg dict for a function, looping over band and target
Args:
parameters: Dictionary of parameters
func: Function to set the parameters for
band: Band to pull band-specific parameters for
target: Target to pull target-specific parameters for
max_level: How far to recurse down the dictionary. Defaults
to None, which will recurse all the way down
"""
args = get_default_args(func)
func_kws = {}
for arg in args:
if arg in parameters:
arg_val = parse_parameter_dict(
parameters=parameters,
key=arg,
band=band,
target=target,
max_level=max_level,
)
if arg_val == "VAL_NOT_FOUND":
arg_val = args[arg]
else:
arg_val = args[arg]
func_kws[arg] = arg_val
return func_kws
def parse_parameter_dict(
parameters,
key,
band,
target,
max_level=None,
):
"""Pull values out of a parameter dictionary
Args:
parameters (dict): Dictionary of parameters and associated values
key (str): Particular key in parameter_dict to consider
band (str): JWST band, to parse out band type and potentially per-band
values
target (str): JWST target, for very specific values
max_level: Maximum level to recurse down. Defaults to None, which will
go until it finds something that's not a dictionary
"""
if max_level is None:
max_level = np.inf
value = parameters[key]
band_type, short_long = get_band_type(
band,
short_long_nircam=True,
)
pixel_scale = jwst_pixel_scales[short_long]
found_value = False
level = 0
while level < max_level and not found_value:
if isinstance(value, dict):
# Define a priority here. It goes:
# * target
# * band
# * nircam_short/nircam_long
# * nircam/miri
if target in value:
value = value[target]
elif band in value:
value = value[band]
elif band_type == "nircam" and short_long in value:
value = value[short_long]
elif band_type in value:
value = value[band_type]
else:
value = "VAL_NOT_FOUND"
level += 1
if not isinstance(value, dict):
found_value = True
# Finally, if we have a string with a 'pix' in there, we need to convert to arcsec. Unless it's not a number!
# Then just return the value
if isinstance(value, str):
if "pix" in value:
try:
value = float(value.strip("pix")) * pixel_scale
except ValueError:
pass
return value
def attribute_setter(
pipeobj,
parameters,
band,
target,
):
"""Set attributes for a function
Args:
pipeobj: Function/class to set parameters for
parameters: Dictionary of parameters to set
band: Band to pull band-specific parameters for
target: Target to pull target-specific parameters for
"""
for key in parameters.keys():
if type(parameters[key]) is dict:
for subkey in parameters[key]:
value = parse_parameter_dict(
parameters=parameters[key],
key=subkey,
band=band,
target=target,
)
if value == "VAL_NOT_FOUND":
continue
recursive_setattr(
pipeobj,
".".join([key, subkey]),
value,
)
else:
value = parse_parameter_dict(
parameters=parameters,
key=key,
band=band,
target=target,
)
if value == "VAL_NOT_FOUND":
continue
recursive_setattr(
pipeobj,
key,
value,
)
return pipeobj
def recursive_setattr(
f,
attribute,
value,
protected=False,
):
"""Set potentially recursive function attributes.
This is needed for the JWST pipeline steps, which have levels to them
Args:
f: Function to consider
attribute: Attribute to consider
value: Value to set
protected: If a function is protected, this won't strip out the leading underscore
"""
pre, _, post = attribute.rpartition(".")
if pre:
pre_exists = True
else:
pre_exists = False
if protected:
post = "_" + post
return setattr(recursive_getattr(f, pre) if pre_exists else f, post, value)
def recursive_getattr(
f,
attribute,
*args,
):
"""Get potentially recursive function attributes.
This is needed for the JWST pipeline steps, which have levels to them
Args:
f: Function to consider
attribute: Attribute to consider
args: Named arguments
"""
def _getattr(f, attribute):
return getattr(f, attribute, *args)
return functools.reduce(_getattr, [f] + attribute.split("."))
def get_obs_table(
files,
check_bgr=False,
check_type="parallel_off",
background_name="off",
):
"""Pull necessary info out of fits headers"""
tab = Table(
names=[
"File",
"Type",
"Obs_ID",
"Filter",
"Start",
"Exptime",
"Objname",
"Program",
"Array",
],
dtype=[
str,
str,
str,
str,
str,
float,
str,
str,
str,
],
)
for f in files:
tab.add_row(
parse_fits_to_table(
f,
check_bgr=check_bgr,
check_type=check_type,
background_name=background_name,
)
)
return tab
def parse_fits_to_table(
file,
check_bgr=False,
check_type="parallel_off",
background_name="off",
):
"""Pull necessary info out of fits headers
Args:
file (str): File to get info for
check_bgr (bool): Whether to check if this is a science or background observation (in the MIRI case)
check_type (str): How to check if background observation. Options are
- 'parallel_off', which will use the filename to see if it's a parallel observation with NIRCAM
- 'check_in_name', which will use the observation name to check, matching against 'background_name'.
- 'filename', which will use the filename
Defaults to 'parallel_off'
background_name (str): Name to indicate background observation. Defaults to 'off'.
"""
# Figure out if we're a background observation or not
f_type = "sci"
if check_bgr:
# If it's a parallel observation (PHANGS-style)
if check_type == "parallel_off":
file_split = os.path.split(file)[-1]
if file_split.split("_")[1][2] == "2":
f_type = "bgr"
# If the backgrounds are labelled differently in the target name
elif check_type == "check_in_name":
with datamodels.open(file) as im:
if background_name in im.meta.target.proposer_name.lower():
f_type = "bgr"
# If we want to use some specific files within the science as observations
elif check_type == "filename":
if isinstance(background_name, str):
background_name = [background_name]
with datamodels.open(file) as im:
for bg_name in background_name:
if bg_name in im.meta.filename.lower():
f_type = "bgr"
else:
raise Warning(f"check_type {check_type} not known")
# Pull out data we need from header
with datamodels.open(file) as im:
obs_n = im.meta.observation.observation_number
obs_filter = im.meta.instrument.filter
obs_date = im.meta.observation.date_beg
obs_duration = im.meta.exposure.duration
# Sometimes the observation label is not defined, so have a fallback here
obs_label = im.meta.observation.observation_label
if obs_label is not None:
obs_label = obs_label.lower()
else:
obs_label = ""
obs_program = im.meta.observation.program_number
array_name = im.meta.subarray.name.lower().strip()
return (
file,
f_type,
obs_n,
obs_filter,
obs_date,
obs_duration,
obs_label,
obs_program,
array_name,
)
def get_dq_bit_mask(
dq,
bit_flags="~DO_NOT_USE+NON_SCIENCE",
):
"""Get a DQ bit mask from an input image
Args:
dq: DQ array
bit_flags: Bit flags to get mask for. Defaults to only get science pixels
"""
dq_bits = interpret_bit_flags(bit_flags=bit_flags, flag_name_map=pixel)
dq_bit_mask = bitfield_to_boolean_mask(
dq.astype(np.uint8), dq_bits, good_mask_value=0, dtype=np.uint8
)
return dq_bit_mask
def make_source_mask(
data,
mask=None,
nsigma=3,
npixels=3,
dilate_size=11,
sigclip_iters=5,
):
"""Make a source mask from segmentation image"""
sc = SigmaClip(
sigma=nsigma,
maxiters=sigclip_iters,
)
threshold = detect_threshold(
data,
mask=mask,
nsigma=nsigma,
sigma_clip=sc,
)
segment_map = detect_sources(
data,
threshold,
npixels=npixels,
)
# If sources are detected, we can make a segmentation mask, else fall back to 0 array
try:
mask = segment_map.make_source_mask(size=dilate_size)
except AttributeError:
mask = np.zeros(data.shape, dtype=bool)
return mask
def sigma_clip(
data,
dq_mask=None,
sigma=1.5,
n_pixels=5,
max_iterations=20,
):
"""Get sigma-clipped statistics for data"""
with warnings.catch_warnings():
warnings.simplefilter("ignore")
mask = make_source_mask(data, mask=dq_mask, nsigma=sigma, npixels=n_pixels)
if dq_mask is not None:
mask = np.logical_or(mask, dq_mask)
mean, median, std_dev = sigma_clipped_stats(
data, mask=mask, sigma=sigma, maxiters=max_iterations
)
return mean, median, std_dev
def reproject_image(
file,
optimal_wcs,
optimal_shape,
hdu_type="data",
do_sigma_clip=False,
stacked_image=False,
do_level_data=False,
reproject_func="interp",
):
"""Reproject an image to an optimal WCS
Args:
file: File to reproject
optimal_wcs: Optimal WCS for input image stack
optimal_shape: Optimal shape for input image stack
hdu_type: Type of HDU. Can either be 'data', 'err', or 'var_rnoise'
do_sigma_clip: Whether to perform sigma-clipping or not.
Defaults to False
stacked_image: Stacked image or not? Defaults to False
do_level_data: Whether to level between amplifiers or not.
Defaults to False
reproject_func: Which reproject function to use. Defaults to 'interp',
but can also be 'exact' or 'adaptive'
"""
if reproject_func == "interp":
r_func = reproject_interp
elif reproject_func == "exact":
r_func = reproject_exact
elif reproject_func == "adaptive":
r_func = reproject_adaptive
else:
raise ValueError(f"reproject_func should be one of {ALLOWED_REPROJECT_FUNCS}")
hdu_mapping = {
"data": "SCI",
"err": "ERR",
"var_rnoise": "VAR_RNOISE",
}
if not stacked_image:
with datamodels.open(file) as hdu:
dq_bit_mask = get_dq_bit_mask(hdu.dq)
wcs = hdu.meta.wcs.to_fits_sip()
w_in = WCS(wcs)
# Level data (but not in subarray mode)
if "sub" not in hdu.meta.subarray.name.lower() and do_level_data and hdu_type == "data":
hdu.data = level_data(hdu)
if hdu_type == "data":
data = copy.deepcopy(hdu.data)
elif hdu_type == "err":
data = copy.deepcopy(hdu.err)
elif hdu_type == "var_rnoise":
data = copy.deepcopy(hdu.var_rnoise)
else:
raise Warning(f"Unsure how to deal with hdu_type {hdu_type}")
else:
hdu_name = hdu_mapping[hdu_type]
with fits.open(file) as hdu:
sci = copy.deepcopy(hdu["SCI"].data)
data = copy.deepcopy(hdu[hdu_name].data)
wcs = hdu["SCI"].header
w_in = WCS(wcs)
dq_bit_mask = None
sig_mask = None
if do_sigma_clip:
sig_mask = make_source_mask(
sci,
mask=dq_bit_mask,
dilate_size=7,
)
sig_mask = sig_mask.astype(int)
data[data == 0] = np.nan
# This comes from the astropy reproject routines
edges = sample_array_edges(data.shape, n_samples=11)[::-1]
edges_out = optimal_wcs.world_to_pixel(w_in.pixel_to_world(*edges))[::-1]
# Determine the cutout parameters
# In some cases, images might not have valid coordinates in the corners,
# such as all-sky images or full solar disk views. In this case we skip
# this step and just use the full output WCS for reprojection.
ndim_out = len(optimal_shape)
skip_data = False
if np.any(np.isnan(edges_out)):
bounds = list(zip([0] * ndim_out, optimal_shape))
else:
bounds = []
for idim in range(ndim_out):
imin = max(0, int(np.floor(edges_out[idim].min() + 0.5)))
imax = min(optimal_shape[idim], int(np.ceil(edges_out[idim].max() + 0.5)))
bounds.append((imin, imax))
if imax < imin:
skip_data = True
break
if skip_data:
return
slice_out = tuple([slice(imin, imax) for (imin, imax) in bounds])
if isinstance(optimal_wcs, WCS):
wcs_out_indiv = optimal_wcs[slice_out]
else:
wcs_out_indiv = SlicedLowLevelWCS(optimal_wcs.low_level_wcs, slice_out)
shape_out_indiv = [imax - imin for (imin, imax) in bounds]
data_reproj_small = r_func(
(data, wcs),
output_projection=wcs_out_indiv,
shape_out=shape_out_indiv,
return_footprint=False,
)
# Mask out bad DQ, but only for unstacked images. This needs to use
# reproject_interp, so we can keep whole numbers
if not stacked_image:
dq_reproj_small = reproject_interp(
(dq_bit_mask, wcs),
output_projection=wcs_out_indiv,
shape_out=shape_out_indiv,
return_footprint=False,
order="nearest-neighbor",
)
data_reproj_small[dq_reproj_small == 1] = np.nan
# If we're sigma-clipping, reproject the mask. This needs to use
# reproject_interp, so we can keep whole numbers
if do_sigma_clip:
sig_mask_reproj_small = reproject_interp(
(sig_mask, wcs),
output_projection=wcs_out_indiv,
shape_out=shape_out_indiv,
return_footprint=False,
order="nearest-neighbor",
)
data_reproj_small[sig_mask_reproj_small == 1] = np.nan
footprint = np.ones_like(data_reproj_small)
footprint[
np.logical_or(data_reproj_small == 0, ~np.isfinite(data_reproj_small))
] = 0
data_array = ReprojectedArraySubset(
data_reproj_small, footprint, bounds[1][0], bounds[1][1], bounds[0][0], bounds[0][1],
)
del hdu
gc.collect()
return data_array
def sample_array_edges(shape, *, n_samples):
# Given an N-dimensional array shape, sample each edge of the array using
# the requested number of samples (which will include vertices). To do this
# we iterate through the dimensions and for each one we sample the points
# in that dimension and iterate over the combination of other vertices.
# Returns an array with dimensions (N, n_samples)
all_positions = []
ndim = len(shape)
shape = np.array(shape)
for idim in range(ndim):
for vertex in range(2 ** ndim):
positions = -0.5 + shape * ((vertex & (2 ** np.arange(ndim))) > 0).astype(int)
positions = np.broadcast_to(positions, (n_samples, ndim)).copy()
positions[:, idim] = np.linspace(-0.5, shape[idim] - 0.5, n_samples)
all_positions.append(positions)
positions = np.unique(np.vstack(all_positions), axis=0).T
return positions
def do_jwst_convolution(
file_in,
file_out,
file_kernel,
blank_zeros=True,
output_grid=None,
reproject_func="interp",
):
"""
Convolves input image with an input kernel, and writes to disk.
Will also process errors and do reprojection, if specified
Args:
file_in: Path to image file
file_out: Path to output file
file_kernel: Path to kernel for convolution
blank_zeros: If True, then all zero values will be set to NaNs. Defaults to True
output_grid: None (no reprojection to be done) or tuple (wcs, shape) defining the grid for reprojection.
Defaults to None
reproject_func: Which reproject function to use. Defaults to 'interp',
but can also be 'exact' or 'adaptive'
"""
if reproject_func == "interp":
r_func = reproject_interp
elif reproject_func == "exact":
r_func = reproject_exact
elif reproject_func == "adaptive":
r_func = reproject_adaptive
else:
raise ValueError(f"reproject_func should be one of {ALLOWED_REPROJECT_FUNCS}")
with fits.open(file_kernel) as kernel_hdu:
kernel_pix_scale = get_pixscale(kernel_hdu[0])
# Note the shape and grid of the kernel as input
kernel_data = kernel_hdu[0].data
kernel_hdu_length = kernel_hdu[0].data.shape[0]
original_central_pixel = (kernel_hdu_length - 1) / 2
original_grid = (
np.arange(kernel_hdu_length) - original_central_pixel
) * kernel_pix_scale
with fits.open(file_in) as image_hdu:
if blank_zeros:
# make sure that all zero values were set to NaNs, which
# astropy convolution handles with interpolation
image_hdu["ERR"].data[(image_hdu["SCI"].data == 0)] = np.nan
image_hdu["SCI"].data[(image_hdu["SCI"].data == 0)] = np.nan
image_pix_scale = get_pixscale(image_hdu["SCI"])
# Calculate kernel size after interpolating to the image pixel
# scale. Because sometimes there's a little pixel scale rounding
# error, subtract a little bit off the optimum size (Tom
# Williams).
interpolate_kernel_size = (
np.floor(kernel_hdu_length * kernel_pix_scale / image_pix_scale) - 2
)
# Ensure the kernel has a central pixel
if interpolate_kernel_size % 2 == 0:
interpolate_kernel_size -= 1
# Define a new coordinate grid onto which to project the kernel
# but using the pixel scale of the image
new_central_pixel = (interpolate_kernel_size - 1) / 2
new_grid = (
np.arange(interpolate_kernel_size) - new_central_pixel
) * image_pix_scale
x_coords_new, y_coords_new = np.meshgrid(new_grid, new_grid)
# Do the reprojection from the original kernel grid onto the new
# grid with pixel scale matched to the image
grid_interpolated = RegularGridInterpolator(
(original_grid, original_grid),
kernel_data,
bounds_error=False,
fill_value=0.0,
)
kernel_interp = grid_interpolated(
(x_coords_new.flatten(), y_coords_new.flatten())
)
kernel_interp = kernel_interp.reshape(x_coords_new.shape)
# Ensure the interpolated kernel is normalized to 1
kernel_interp = kernel_interp / np.nansum(kernel_interp)
# Now with the kernel centered and matched in pixel scale to the
# input image use the FFT convolution routine from astropy to
# convolve.
conv_im = convolve_fft(
image_hdu["SCI"].data,
kernel_interp,
allow_huge=True,
preserve_nan=True,
fill_value=np.nan,
)
# Convolve errors (with kernel**2, do not normalize it).
# This, however, doesn't account for covariance between pixels
conv_err = np.sqrt(
convolve_fft(
image_hdu["ERR"].data ** 2,
kernel_interp ** 2,
preserve_nan=True,
allow_huge=True,
normalize_kernel=False,
)
)
image_hdu["SCI"].data = conv_im
image_hdu["ERR"].data = conv_err
if output_grid is None:
image_hdu.writeto(file_out, overwrite=True)
else:
# Reprojection to target wcs grid define in output_grid
target_wcs, target_shape = output_grid
hdulist_out = fits.HDUList([fits.PrimaryHDU(header=image_hdu[0].header)])
repr_data, fp = r_func(
(conv_im, image_hdu["SCI"].header),
output_projection=target_wcs,
shape_out=target_shape,
)
fp = fp.astype(bool)
repr_data[~fp] = np.nan
header = image_hdu["SCI"].header
header.update(target_wcs.to_header())
hdulist_out.append(fits.ImageHDU(data=repr_data, header=header, name="SCI"))
# Note - this ignores the errors of interpolation and thus the resulting errors might be underestimated
repr_err = r_func(
(conv_err, image_hdu["SCI"].header),
output_projection=target_wcs,
shape_out=target_shape,
return_footprint=False,
)
repr_err[~fp] = np.nan
header = image_hdu["ERR"].header
hdulist_out.append(fits.ImageHDU(data=repr_err, header=header, name="ERR"))
hdulist_out.writeto(file_out, overwrite=True)
def level_data(
im,
):
"""Level overlaps in NIRCAM amplifiers
Args:
im: Input datamodel
"""
data = copy.deepcopy(im.data)
quadrant_size = data.shape[1] // 4
dq_mask = get_dq_bit_mask(dq=im.dq)
dq_mask = dq_mask | ~np.isfinite(im.data) | ~np.isfinite(im.err) | (im.data == 0)
for i in range(3):
quad_1 = data[:, i * quadrant_size: (i + 1) * quadrant_size][
:, quadrant_size - 20:
]
dq_1 = dq_mask[:, i * quadrant_size: (i + 1) * quadrant_size][
:, quadrant_size - 20:
]
quad_2 = data[:, (i + 1) * quadrant_size: (i + 2) * quadrant_size][:, :20]
dq_2 = dq_mask[:, (i + 1) * quadrant_size: (i + 2) * quadrant_size][:, :20]
quad_1[dq_1] = np.nan
quad_2[dq_2] = np.nan
with warnings.catch_warnings():
warnings.simplefilter("ignore")
med_1 = np.nanmedian(
quad_1,
axis=1,
)
med_2 = np.nanmedian(
quad_2,
axis=1,
)
diff = med_1 - med_2
delta = sigma_clipped_stats(diff, maxiters=None)[1]
data[:, (i + 1) * quadrant_size: (i + 2) * quadrant_size] += delta
return data
def save_file(im,
out_name,
dr_version,
):
"""Save out an image, adding in useful metadata
Args:
im: Input JWST datamodel
out_name: File to save output to
dr_version: Data processing version
"""
# Save versions both in the metadata, and in fits history
im.meta.pjpipe_version = __version__
im.meta.pjpipe_dr_version = dr_version
entry = util.create_history_entry(f"PJPIPE VER: {__version__}")
im.history.append(entry)
entry = util.create_history_entry(f"DATA PROCESSING VER: {dr_version}")
im.history.append(entry)
im.save(out_name)
return True
def make_stacked_image(
files,
out_name,
additional_hdus=None,
auto_rotate=True,
reproject_func="interp",
match_background=False,
):
"""Create a quick stacked image from a series of input images
Args:
files: List of input files
out_name: Output stacked file
additional_hdus: Can also append some additional data beyond the science
extension by specifying the fits extension here. Defaults to None,
which will not add anything extra
auto_rotate: Whether to rotate the WCS to make a minimum sized image.
Defaults to True
reproject_func: Which reproject function to use. Defaults to 'interp',
but can also be 'exact' or 'adaptive'
match_background: Whether to match backgrounds when making the stack.
Defaults to False
"""
# HDUs we need to square before they go into the reprojection
sq_hdus = [
"ERR",
]
# HDUs we'll need to take the square root of the stacked image to be meaningful
sqrt_hdus = [
"ERR",
"VAR_RNOISE",
]
combine_functions = {
"SCI": "mean",
"ERR": "sum",
"VAR_RNOISE": "sum",
}
if reproject_func == "interp":
r_func = reproject_interp
elif reproject_func == "exact":
r_func = reproject_exact
elif reproject_func == "adaptive":
r_func = reproject_adaptive
else:
raise ValueError(f"reproject_func should be one of {ALLOWED_REPROJECT_FUNCS}")
if additional_hdus is None:
additional_hdus = []
if isinstance(additional_hdus, str):
additional_hdus = [additional_hdus]
with warnings.catch_warnings():
warnings.simplefilter("ignore")
hdus = []
for file in files:
hdu = fits.open(file)
dq_bit_mask = get_dq_bit_mask(hdu["DQ"].data)
hdu["SCI"].data[dq_bit_mask != 0] = np.nan
for additional_hdu in additional_hdus:
hdu[additional_hdu].data[dq_bit_mask != 0] = np.nan
# Make sure the full WCS is in there by copying over the header
hdr = copy.deepcopy(hdu["SCI"].header)
hdr["EXTNAME"] = additional_hdu
hdu[additional_hdu].header = copy.deepcopy(hdr)
if additional_hdu in sq_hdus:
hdu[additional_hdu].data = hdu[additional_hdu].data ** 2
hdus.append(hdu)
output_projection, shape_out = find_optimal_celestial_wcs(hdus,
hdu_in="SCI",
auto_rotate=auto_rotate,
)
hdr = output_projection.to_header()
# Loop over the various HDUs we want to reproject
stacked_images = {}
stacked_image, stacked_footprint = reproject_and_coadd(
hdus,
output_projection=output_projection,
shape_out=shape_out,
hdu_in="SCI",
combine_function=combine_functions["SCI"],
reproject_function=r_func,
match_background=match_background,
)
stacked_image[stacked_footprint == 0] = np.nan
stacked_images["SCI"] = copy.deepcopy(stacked_image)
for additional_hdu in additional_hdus:
stacked_image, stacked_footprint = reproject_and_coadd(
hdus,
output_projection=output_projection,
shape_out=shape_out,
hdu_in=additional_hdu,
combine_function=combine_functions[additional_hdu],
reproject_function=r_func,
match_background=match_background,
)
stacked_image[stacked_footprint == 0] = np.nan
if additional_hdu in sqrt_hdus:
stacked_image = np.sqrt(stacked_image)
stacked_images[additional_hdu] = copy.deepcopy(stacked_image)
# Create an HDU list here
hdu = fits.HDUList()
hdu.append(fits.PrimaryHDU(header=hdus[0][0].header))
for key in stacked_images:
hdu.append(fits.ImageHDU(data=stacked_images[key], header=hdr, name=key))
hdu.writeto(
out_name,
overwrite=True,
)
del hdus
gc.collect()
return True |
using Google.Apis.Auth.OAuth2;
using Google.Apis.Drive.v3;
using Google.Apis.Services;
using Google.Apis.Util.Store;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using System.Windows.Controls;
using System.Windows;
using Newtonsoft.Json.Linq;
namespace FileManager
{
internal class GoogleDriveService
{
private DriveService _service;
public DriveService GetDriveService()
{
return _service;
}
public GoogleDriveService()
{
string[] Scopes = { DriveService.Scope.Drive, DriveService.Scope.DriveFile };
string ApplicationName = "FileManager";
using (var stream = new FileStream("credentials.json", FileMode.Open, FileAccess.Read))
{
var credential = GoogleWebAuthorizationBroker.AuthorizeAsync(
GoogleClientSecrets.FromStream(stream).Secrets,
Scopes,
"user",
CancellationToken.None,
new FileDataStore("TokenStore", true)).Result;
_service = new DriveService(new BaseClientService.Initializer
{
HttpClientInitializer = credential,
ApplicationName = ApplicationName,
});
}
}
// Отримання файлів з Google Drive
public IList<Google.Apis.Drive.v3.Data.File> GetFiles(string folderId = "root")
{
var request = _service.Files.List();
// Фільтр: не видалені файли
request.Q = $"'{folderId}' in parents and trashed=false";
request.Fields = "files(id, name, mimeType, modifiedTime, size)";
return request.Execute().Files;
}
public void FillGoogleDriveTreeView(TreeView treeView, string folderId = "root")
{
try
{
GoogleDriveService googleDriveService = new GoogleDriveService();
var items = googleDriveService.GetFiles(folderId);
treeView.Items.Clear();
foreach (var item in items)
{
// Створюємо об'єкт FileItem для зберігання даних про файл/папку
var fileItem = new FileItem
{
Name = item.Name,
Type = item.MimeType == "application/vnd.google-apps.folder" ? "Папка" : "Файл",
Id = item.Id,
DateModified = item.ModifiedTimeDateTimeOffset?.ToString("dd.MM.yyyy HH:mm") ?? "Unknown",
Size = item.Size.HasValue ? $"{item.Size.Value / 1024} KB" : "Unknown"
};
// Створюємо TreeViewItem з прив'язкою до FileItem через Tag
TreeViewItem treeItem = new TreeViewItem
{
Header = fileItem.Name,
Tag = fileItem, // Прив'язуємо FileItem
IsExpanded = false
};
// Якщо це папка, додаємо "заглушку" для відображення вкладених елементів
if (fileItem.Type == "Папка")
{
treeItem.Items.Add(null); // Заглушка
treeItem.Expanded += (s, e) =>
{
if (treeItem.Items.Count == 1 && treeItem.Items[0] == null)
{
treeItem.Items.Clear();
FillGoogleDriveTreeViewItem(treeItem, fileItem.Id);
}
};
}
treeView.Items.Add(treeItem);
}
}
catch (Exception ex)
{
MessageBox.Show($"Помилка: {ex.Message}", "Помилка", MessageBoxButton.OK, MessageBoxImage.Error);
}
}
private void FillGoogleDriveTreeViewItem(TreeViewItem parentItem, string folderId)
{
try
{
GoogleDriveService googleDriveService = new GoogleDriveService();
var items = googleDriveService.GetFiles(folderId);
foreach (var item in items)
{
// Створюємо об'єкт FileItem
var fileItem = new FileItem
{
Name = item.Name,
Type = item.MimeType == "application/vnd.google-apps.folder" ? "Папка" : "Файл",
Id = item.Id,
DateModified = item.ModifiedTimeDateTimeOffset?.ToString("dd.MM.yyyy HH:mm") ?? "Unknown",
Size = item.Size.HasValue ? $"{item.Size.Value / 1024} KB" : "Unknown"
};
// Створюємо TreeViewItem із прив'язкою до FileItem
TreeViewItem treeItem = new TreeViewItem
{
Header = fileItem.Name,
Tag = fileItem,
IsExpanded = false
};
// Якщо це папка, додаємо "заглушку" для вкладених елементів
if (fileItem.Type == "Папка")
{
treeItem.Items.Add(null);
treeItem.Expanded += (s, e) =>
{
if (treeItem.Items.Count == 1 && treeItem.Items[0] == null)
{
treeItem.Items.Clear();
FillGoogleDriveTreeViewItem(treeItem, fileItem.Id);
}
};
}
parentItem.Items.Add(treeItem);
}
}
catch (Exception ex)
{
MessageBox.Show($"Помилка: {ex.Message}", "Помилка", MessageBoxButton.OK, MessageBoxImage.Error);
}
}
public async Task FillGoogleDriveListView(ListView listView, string folderId = "root")
{
try
{
// Отримуємо файли з папки
var request = _service.Files.List();
request.Fields = "files(id, name, mimeType, modifiedTime, size,iconLink, thumbnailLink)";
request.Q = $"'{folderId}' in parents and trashed = false";
var response = await request.ExecuteAsync();
// Очищаємо вміст ListView
listView.Items.Clear();
// Додаємо файли та папки до ListView
foreach (var item in response.Files)
{
var fileItem = new FileItem
{
Name = item.Name,
Type = item.MimeType.Contains("folder") ? "Папка" : "Файл",
Id = item.Id,
DateModified = item.ModifiedTimeDateTimeOffset?.ToString("dd.MM.yyyy HH:mm") ?? "Unknown",
//Size = item.Size.HasValue ? $"{item.Size.Value / 1024} KB" : "",
Size = item.Size.HasValue? MainWindow.GetReadableFileSize(item.Size.Value) : "",
Icon = item.MimeType.Contains("folder") ? "folder.png" : item.ThumbnailLink //Встановлюємо посилання на іконку
};
//Якщо немає великої іконки ThumbnailLink, то встановлюємо iconLink
if (fileItem.Icon == null)
{
fileItem.Icon = item.IconLink;
}
listView.Items.Add(fileItem);
}
}
catch (Exception ex)
{
MessageBox.Show($"Помилка: {ex.Message}", "Помилка", MessageBoxButton.OK, MessageBoxImage.Error);
}
}
public async Task SearchGoogleDriveRecursive(string searchText, string currentFolderId = "root", Action<FileItem> onItemFound = null, CancellationToken cts = default)
{
try
{
// Черга для обходу папок
Queue<string> foldersToSearch = new Queue<string>();
foldersToSearch.Enqueue(currentFolderId);
while (foldersToSearch.Count > 0)
{
cts.ThrowIfCancellationRequested();
string folderId = foldersToSearch.Dequeue();
var request = _service.Files.List();
request.Fields = "files(id, name, mimeType, modifiedTime, size,iconLink, thumbnailLink, parents)";
request.Q = $"trashed = false and '{folderId}' in parents";
var response = await request.ExecuteAsync(cts);
foreach (var file in response.Files)
{
cts.ThrowIfCancellationRequested();
// Перевіряємо, чи відповідає ім'я умовам пошуку
if (file.Name.Contains(searchText))
{
var fileItem = new FileItem
{
Name = file.Name,
Type = file.MimeType.Contains("folder") ? "Папка" : "Файл",
DateModified = file.ModifiedTimeDateTimeOffset?.ToString("g") ?? "-",
Id = file.Id,
Size = file.Size.HasValue ? MainWindow.GetReadableFileSize(file.Size.Value) : "",
Icon = file.MimeType.Contains("folder") ? "folder.png" : file.ThumbnailLink
};
if (fileItem.Icon == null)
{
fileItem.Icon = file.IconLink;
}
onItemFound?.Invoke(fileItem);
}
// Якщо поточний файл - папка, додаємо її до черги
if (file.MimeType.Contains("folder"))
{
foldersToSearch.Enqueue(file.Id);
}
}
}
}
catch (OperationCanceledException)
{
}
catch (Exception ex)
{
MessageBox.Show($"Помилка рекурсивного пошуку на Google Диску: {ex.Message}");
}
}
public async Task<List<BreadcrumbItem>> GetGoogleDriveBreadcrumbs(string currentFolderId)
{
List<BreadcrumbItem> breadcrumbs = new List<BreadcrumbItem>();
while (!string.IsNullOrEmpty(currentFolderId))
{
// Отримуємо інформацію про поточну папку
var folderInfo = await GetFolderInfo(currentFolderId);
if (folderInfo == null) break;
// Додаємо на початок списку
breadcrumbs.Insert(0, new BreadcrumbItem
{
Name = folderInfo.Name,
FullPath = folderInfo.Id,
IsGoogleDrive = true
});
// Переходимо до батьківської папки
currentFolderId = folderInfo.FullPath;
}
return breadcrumbs;
}
public async Task<FileItem> GetFolderInfo(string folderId)
{
try
{
var request = _service.Files.Get(folderId);
request.Fields = "id, name, parents";
var file = await request.ExecuteAsync();
return new FileItem
{
Id = file.Id,
Name = file.Name,
FullPath = file.Parents?.FirstOrDefault() // Google Drive може мати кілька батьків
};
}
catch (Exception ex)
{
// Обробка помилок
MessageBox.Show($"Помилка отримання інформації про папку: {ex.Message}");
return null;
}
}
}
} |
### 第十五关挑战:海量场景问题该如何解决
* 原题目:给定一个输入文件,包含40亿个非负整数,请设计一个算法,产生一个不存在该文件中的整数,假设你有1GB的内存来完成这项任务。
* 现在请你思考:让你将这40亿中不存在的整数,全部找出来,该如何做?
## 解题过程
看到`40亿`数据就知道是`海量数据类型`题目,考虑使用`位存储`、`分块处理`、`堆`等方式。
* __明确限制条件__:1GB内存
* __明确大致做法__:遍历数据,统计并存储每个数是否存在的情况,最后遍历输出不存在的值。
* __考虑使用位存储__:对于每个整数,只有存在和不存在两个状态,正好可以使用二进制位的值1、0来表示。所以每个整数的状态存储仅需占用1个位,1个字节可以存储8个整数的状态。
* 计算位图存储40亿数据状态 需要多少内存:
```txt
4000000000/8/1024/1024/1024 = 0.47G ≈ 0.5G //这里只需要0.5G 此时还满足题意
```
* __考虑极端情况__:40亿数据都是重复的,此时不存在的整数有40亿-1个,则返回的数据列表占用内存为:
```txt
(4000000000-1)*4/1024/1024/1024 ≈ 15G //此时已经超过题目限制了,需要考虑分块处理
```
* __考虑根据内存限制,需要分成多少块更合适__
- 计算 1G内存 能存储多少个int数据
```txt
1*1024*1024*1024/4 = 268435456(个)
4000000000/268435456 ≈ 15(块)
```
- 所以 __40亿__ 数据,至少需要分割成 __15块__。
- 而一般情况下,我们推荐分割为 __2的整数倍__,所以这里分割成 __16块__ 更合适,每块处理 __268435456个__ 数据。
- 确保相同数据进入同一块中:分块(这里使用取余的方式)来保证相同数据进入同一块中。
* __然后依次对每块数据,使用位存储状态,找出当前块中不存在的整数__
* __最后汇总所有块的不存在的整数,返回即可__
### 代码实现
```go
func FindNoExistNumsBy1G(arr []int) (res []int) {
N := 16 //分成16块
for i := 0; i < N; i++ {
res = append(res, FindNoExistNumsByPiece(arr, i)...)
}
return
}
// 单独统计每块中不存在的整数
func FindNoExistNumsByPiece(arr []int, targetIndex int) (res []int) {
N := 268435456 //分成16块,每块处理的平均数
bitmap := make([]int, N/32+1) //最多占用8MB
// 统计当前块中数据的存在情况
for _, num := range arr {
if num%N == targetIndex {
index := num / 32
offset := num % 32
mark := 1 << offset
bitmap[index] |= mark
}
}
// 找出不存在的数据
for index, val := range bitmap {
for i := 0; i < 32; i++ {
mark := 1 << i
if val&mark == 0 {
// 计算不存在的整数值并添加到结果中
num := i + index*32 + targetIndex*N
res = append(res, num)
}
}
}
return
}
``` |
import React from "react";
import Home from "../Pages/Home";
import { Routes, Route } from "react-router-dom";
import AllProducts from "../Pages/AllProducts";
import Men from "../Pages/Men";
import Women from "../Pages/Women";
import DescriptionPage from "../components/Description/DescriptionPage";
import AllshoesD from "../Pages/Shoes";
import Cart from "../Pages/Cart";
import WishList from "../Pages/WishList";
import Login from "../Pages/Login";
import Register from "../Pages/SignUp";
import Checkout from "../Pages/Checkout";
import Authentication from "../PrivateRoute/Authentication";
import MyAccount from "../Pages/MyAccount";
import AdminPage from "../Admin/AdminPage";
const AllRoutes = () => {
return (
<div>
<Routes>
<Route path="/" element={<Home />} />
<Route path="/allproducts" element={<AllProducts />} />
<Route path="/men" element={<Men />} />
<Route path="/women" element={<Women />} />
<Route path="/shoes" element={<AllshoesD />} />
<Route path="/description/:id" element={<DescriptionPage />} />
<Route path="/cart" element={<Cart />} />
<Route
path="/wishlist"
element={
<Authentication>
<WishList />
</Authentication>
}
/>
<Route path="/login" element={<Login />} />
<Route path="/myaccount" element={<MyAccount />} />
<Route path="/register" element={<Register />} />
<Route
path="/checkout"
element={
<Authentication>
<Checkout />
</Authentication>
}
/>
<Route
path="/admin"
element={
<Authentication>
<AdminPage />
</Authentication>
}
/>
</Routes>
</div>
);
};
export default AllRoutes; |
import { useMutation, useQueryClient } from "@tanstack/react-query";
import { InferRequestType, InferResponseType } from "hono";
import { toast } from "sonner";
import { client } from "@/lib/hono";
type ResponseType = InferResponseType<typeof client.api.objectivemembers["bulk-create"]["$post"]>;
type RequestType = InferRequestType<typeof client.api.objectivemembers["bulk-create"]["$post"]>["json"];
export const useBulkCreateobjectivemembers = () => {
const queryClient = useQueryClient();
const mutation = useMutation<
ResponseType,
Error,
RequestType
>({
mutationFn: async (json) => {
console.log('Request Payload:', json);
const formattedPayload = json.map(obj => ({
...obj,
coefficient: Number(obj.coefficient),
customConstraintId: Number(obj.customConstraintId),
addToObjective: Boolean(obj.addToObjective)
}));
console.log('Formatted Payload:', formattedPayload);
const response = await client.api.objectivemembers["bulk-create"]["$post"]({ json: formattedPayload.map(obj => ({
...obj,
coefficient: obj.coefficient.toString(),
})) });
return await response.json();
},
onSuccess: () => {
toast.success("objectivemembers created");
queryClient.invalidateQueries({ queryKey: ["objectivemembers"] });
},
onError: () => {
toast.error("Failed to create objectivemembers");
},
});
return mutation;
}; |
package iat.alumni.controller;
import java.text.DecimalFormat;
import java.text.SimpleDateFormat;
import java.time.LocalDate;
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.validation.BindingResult;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.ModelAttribute;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.servlet.ModelAndView;
import org.springframework.web.servlet.mvc.support.RedirectAttributes;
import iat.alumni.dto.CommentRequest;
import iat.alumni.model.Article;
import iat.alumni.model.Comment;
import iat.alumni.model.Rating;
import iat.alumni.model.User;
import iat.alumni.model.UserSession;
import iat.alumni.service.ArticleService;
import iat.alumni.service.CommentService;
import jakarta.servlet.http.HttpSession;
import jakarta.validation.Valid;
@Controller
@RequestMapping(value="/alumni")
public class ArticleDetailController {
@Autowired
private ArticleService articleService;
@Autowired
private CommentService commentService;
@GetMapping(value = "/articleDetail/{articleId}")
public ModelAndView showArticleDetailForm (@PathVariable(name = "articleId") Integer articleId, Model model, HttpSession session) {
UserSession userSession = (UserSession) session.getAttribute("userSession");
User currentUser = userSession.getUser();
ModelAndView mav = new ModelAndView ("article-details");
Comment comment = new Comment();
model.addAttribute("comment", comment);
List<Comment> listComment=commentService.getCommentsByArticleId(articleId);
model.addAttribute("listComment",listComment);
model.addAttribute("currentArticleId", articleId);
List<Article> listArticle=articleService.getAllArticle();
model.addAttribute("listArticle",listArticle);
Article article = articleService.getArticleById(articleId);
Double averageRating = article.getAverageRating();
DecimalFormat df = new DecimalFormat("0.0");
String formattedRating = averageRating != null ? df.format(averageRating) : "N/A";
model.addAttribute("article", article);
model.addAttribute("currentUser", currentUser);
model.addAttribute("average", formattedRating);
return mav;
}
@PostMapping("/savedcomment/{articleId}")
public String saveComment(@PathVariable("articleId") Integer articleId, @RequestParam("commentText") String commentText,
HttpSession session, @Valid @ModelAttribute("commentRequest") CommentRequest commentRequest, BindingResult result, RedirectAttributes redirectAttributes) {
UserSession userSession = (UserSession) session.getAttribute("userSession");
User user = userSession.getUser();
Article article = articleService.getArticleById(articleId);
if (user == null || article == null ) {
return "redirect:/login";
}
if (result.hasErrors()) {
redirectAttributes.addFlashAttribute("org.springframework.validation.BindingResult.commentRequest", result);
redirectAttributes.addFlashAttribute("commentRequest", commentRequest);
return "redirect:/alumni/articleDetail/{articleId}";
}
// SimpleDateFormat df = new SimpleDateFormat("MM/dd/yyyy");
// LocalDate today = LocalDate.now();
// comment.setDate(today);
// comment.setUser(userSession.getUser());
// comment.setArticle(articleService.getArticleById(articleId));
//
// articleService.createComment(comment);
commentService.saveCommentForArticle(commentText, user, article);
return "redirect:/alumni/articleDetail/" + articleId;
}
@GetMapping(value="/rating/{articleId}")
public String articleRatingandReviewForm(@PathVariable(name = "articleId") Integer articleId, Model model) {
Rating rating = new Rating();
model.addAttribute("rating", rating);
return "reviewandrating";
}
@PostMapping("/saveRating/{articleId}")
public String saveRating( @ModelAttribute @Valid Rating rating,@PathVariable Integer articleId, RedirectAttributes redirectAttributes,BindingResult result, HttpSession session,Model model) {
UserSession userSession = (UserSession) session.getAttribute("userSession");
if (userSession == null || userSession.getUser() == null ) {
return "redirect:/login";
}
if(result.hasErrors()) {
}
rating.setUser(userSession.getUser());
rating.setArticle(articleService.getArticleById(articleId));
articleService.createRating(rating);
return"redirect:/alumni/articleDetail/{articleId}";
}
} |
import {
Body,
Controller,
Delete,
HttpStatus,
InternalServerErrorException,
Param,
Post,
Req,
UseGuards,
} from '@nestjs/common';
import { ObjectId } from 'mongodb';
import { AuthenticationGuard } from 'src/common/guards/authentication.guard';
import {
AuthorizationGuard,
Roles,
} from 'src/common/guards/authorization.guard';
import { ErrorResponse, SuccessResponse } from 'src/common/helpers/response';
import {
convertObjectId,
hashPassword,
} from 'src/common/helpers/utilityFunctions';
import { JoiValidationPipe } from 'src/common/pipes/joi.validation.pipe';
import { ParseObjectIdPipe } from 'src/common/pipes/objectId.validation.pipe';
import { TrimBodyPipe } from 'src/common/pipes/trimBody.pipe';
import { ICreateProductLine } from '../product/product.interfaces';
import { createProductLineSchema } from '../product/product.validators';
import { ProductService } from '../product/services/product.service';
import { StorageService } from '../storage/services/storage.service';
import { ICreateStorage } from '../storage/storage.interfaces';
import { storageMessage } from '../storage/storage.messages';
import { createStorageSchema } from '../storage/storage.validators';
import { UserService } from '../user/services/user.service';
import { UserRole } from '../user/user.constants';
import { ICreateUser } from '../user/user.interfaces';
import { userMessages } from '../user/user.messages';
import { createUserSchema } from '../user/user.validators';
import adminMessages from './admin.messages';
@Controller('/admin')
@UseGuards(AuthenticationGuard, AuthorizationGuard)
@Roles(UserRole.ADMIN)
export class AdminController {
constructor(
private readonly userService: UserService,
private readonly storageService: StorageService,
private readonly productService: ProductService,
) {}
@Post('/user')
async createUser(
@Req() req,
@Body(new TrimBodyPipe(), new JoiValidationPipe(createUserSchema))
body: ICreateUser,
) {
try {
const user = await this.userService.getUserByField(
{
key: 'email',
value: body.email,
},
['_id'],
);
if (user) {
return new ErrorResponse(HttpStatus.BAD_REQUEST, [
{
code: HttpStatus.CONFLICT,
message: userMessages.errors.userExists,
key: 'email',
},
]);
}
body.password = hashPassword(body.password);
body.createdBy = new ObjectId(req.loggedUser._id);
const newUser = await this.userService.createUser(body);
return new SuccessResponse(newUser);
} catch (error) {
throw new InternalServerErrorException(error);
}
}
@Delete('/user/:id')
async deleteUser(
@Req() req,
@Param('id', new ParseObjectIdPipe()) id: ObjectId,
) {
try {
const user = await this.userService.getUserByField(
{
key: '_id',
value: id,
},
['_id'],
);
if (!user) {
return new ErrorResponse(HttpStatus.BAD_REQUEST, [
{
code: HttpStatus.NOT_FOUND,
message: userMessages.errors.userNotFound,
key: 'id',
},
]);
}
const products = await this.productService.getProductsByField(
{ key: 'userId', value: id },
['_id'],
);
if (products.length) {
return new ErrorResponse(HttpStatus.BAD_REQUEST, [
{
code: HttpStatus.UNPROCESSABLE_ENTITY,
message: adminMessages.errors.stillHasProduct,
key: 'id',
},
]);
}
const transitions =
await this.productService.getProductStatusTransitionsByField(
{ key: 'nextUserId', value: id },
['_id'],
);
if (transitions.length) {
return new ErrorResponse(HttpStatus.BAD_REQUEST, [
{
code: HttpStatus.UNPROCESSABLE_ENTITY,
message: adminMessages.errors.stillHasTransitionTo,
key: 'id',
},
]);
}
const deletedUser = await this.userService.deleteUser(
id,
new ObjectId(req.loggedUser._id),
);
return new SuccessResponse(deletedUser);
} catch (error) {
throw new InternalServerErrorException(error);
}
}
@Post('/storage')
async createStorage(
@Req() req,
@Body(new TrimBodyPipe(), new JoiValidationPipe(createStorageSchema))
body: ICreateStorage,
) {
try {
convertObjectId(body, ['userId']);
const user = await this.userService.getUserByField(
{ key: '_id', value: body.userId },
['_id', 'role'],
);
if (!user) {
return new ErrorResponse(HttpStatus.BAD_REQUEST, [
{
code: HttpStatus.NOT_FOUND,
message: userMessages.errors.userNotFound,
key: 'userId',
},
]);
}
if (
[
UserRole.ADMIN,
UserRole.WARRANTY_CENTER,
UserRole.CONSUMER,
].includes(user.role)
) {
return new ErrorResponse(HttpStatus.BAD_REQUEST, [
{
code: HttpStatus.UNPROCESSABLE_ENTITY,
message: storageMessage.errors.createForbidden,
key: 'userId',
},
]);
}
body.createdBy = new ObjectId(req.loggedUser._id);
const storage = await this.storageService.createStorage(body);
return new SuccessResponse(storage);
} catch (error) {
throw new InternalServerErrorException(error);
}
}
@Post('/product-line')
async createProductLine(
@Req() req,
@Body(
new TrimBodyPipe(),
new JoiValidationPipe(createProductLineSchema),
)
body: ICreateProductLine,
) {
try {
body.createdBy = new ObjectId(req.loggedUser._id);
const newProductLine =
await this.productService.createNewProductLine(body);
return new SuccessResponse(newProductLine);
} catch (error) {
throw new InternalServerErrorException(error);
}
}
} |
import fetch from 'node-fetch';
import { writeFileSync } from 'fs';
async function fetchAndWriteGoogleBooksData() {
try {
const googleBooksUrl = 'https://www.googleapis.com/books/v1/volumes';
const apiKey = "AIzaSyAOrXNWOcB5bNoZTgrlMiZR9lBl6OOJQ4Y"
const queryParams = {
q: 'les coulisses du football',
maxResults: 20,
langRestrict: 'fr',
orderBy: 'newest',
subject: "Football",
key: apiKey,
};
const queryString = new URLSearchParams(queryParams).toString();
const fullUrl = `${googleBooksUrl}?${queryString}`;
const googleBooksResponse = await fetch(fullUrl);
if (!googleBooksResponse.ok) {
throw new Error(`Erreur de requête Google Books API: ${googleBooksResponse.status} - ${googleBooksResponse.statusText}`);
}
const googleBooksData = await googleBooksResponse.json();
const formatted = JSON.stringify(googleBooksData, null, 2);
const file = 'google-books-api-response.json';
writeFileSync(file, formatted, 'utf8');
console.log(`Données de Google Books API écrites avec succès dans ${file}`);
} catch (error) {
console.error(`Une erreur s'est produite: ${error.message}`);
}
}
fetchAndWriteGoogleBooksData(); |
/*
* Copyright (C) 2020 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package android.autofillservice.cts.inline;
import static android.autofillservice.cts.testcore.CannedFillResponse.NO_RESPONSE;
import static android.autofillservice.cts.testcore.Helper.ID_PASSWORD;
import static android.autofillservice.cts.testcore.Helper.ID_USERNAME;
import static android.autofillservice.cts.testcore.Helper.assertTextIsSanitized;
import static android.autofillservice.cts.testcore.Helper.findAutofillIdByResourceId;
import static android.autofillservice.cts.testcore.Helper.findNodeByResourceId;
import static android.autofillservice.cts.testcore.Helper.getContext;
import static android.autofillservice.cts.testcore.InstrumentedAutoFillServiceInlineEnabled.SERVICE_NAME;
import static android.autofillservice.cts.testcore.Timeouts.MOCK_IME_TIMEOUT_MS;
import static com.android.cts.mockime.ImeEventStreamTestUtils.expectEvent;
import static com.google.common.truth.Truth.assertThat;
import static com.google.common.truth.Truth.assertWithMessage;
import static org.junit.Assume.assumeTrue;
import android.accessibilityservice.AccessibilityServiceInfo;
import android.app.PendingIntent;
import android.app.UiAutomation;
import android.autofillservice.cts.activities.DummyActivity;
import android.autofillservice.cts.activities.NonAutofillableActivity;
import android.autofillservice.cts.activities.UsernameOnlyActivity;
import android.autofillservice.cts.commontests.LoginActivityCommonTestCase;
import android.autofillservice.cts.testcore.CannedFillResponse;
import android.autofillservice.cts.testcore.Helper;
import android.autofillservice.cts.testcore.InlineUiBot;
import android.autofillservice.cts.testcore.InstrumentedAutoFillService;
import android.content.Intent;
import android.os.Binder;
import android.os.Bundle;
import android.os.SystemClock;
import android.platform.test.annotations.AppModeFull;
import android.platform.test.annotations.Presubmit;
import android.service.autofill.FillContext;
import android.support.test.uiautomator.Direction;
import android.view.accessibility.AccessibilityManager;
import androidx.test.platform.app.InstrumentationRegistry;
import com.android.cts.mockime.ImeEventStream;
import com.android.cts.mockime.MockImeSession;
import org.junit.Test;
import org.junit.rules.TestRule;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
@Presubmit
public class InlineLoginActivityTest extends LoginActivityCommonTestCase {
private static final String TAG = "InlineLoginActivityTest";
@Override
protected void enableService() {
Helper.enableAutofillService(getContext(), SERVICE_NAME);
}
public InlineLoginActivityTest() {
super(getInlineUiBot());
}
@Override
protected boolean isInlineMode() {
return true;
}
@Override
public TestRule getMainTestRule() {
return InlineUiBot.annotateRule(super.getMainTestRule());
}
@Test
public void testAutofill_disjointDatasets() throws Exception {
// Set service.
enableService();
final CannedFillResponse.Builder builder = new CannedFillResponse.Builder()
.addDataset(new CannedFillResponse.CannedDataset.Builder()
.setField(ID_USERNAME, "dude")
.setPresentation(createPresentation("The Username"))
.setInlinePresentation(createInlinePresentation("The Username"))
.build())
.addDataset(new CannedFillResponse.CannedDataset.Builder()
.setField(ID_PASSWORD, "sweet")
.setPresentation(createPresentation("The Password"))
.setInlinePresentation(createInlinePresentation("The Password"))
.build())
.addDataset(new CannedFillResponse.CannedDataset.Builder()
.setField(ID_PASSWORD, "lollipop")
.setPresentation(createPresentation("The Password2"))
.setInlinePresentation(createInlinePresentation("The Password2"))
.build());
sReplier.addResponse(builder.build());
mActivity.expectAutoFill("dude");
// Trigger auto-fill.
mUiBot.selectByRelativeId(ID_USERNAME);
mUiBot.waitForIdleSync();
mUiBot.assertDatasets("The Username");
// Switch focus to password
mUiBot.selectByRelativeId(ID_PASSWORD);
mUiBot.waitForIdleSync();
mUiBot.assertDatasets("The Password", "The Password2");
// Switch focus back to username
mUiBot.selectByRelativeId(ID_USERNAME);
mUiBot.waitForIdleSync();
mUiBot.assertDatasets("The Username");
mUiBot.selectDataset("The Username");
mUiBot.waitForIdleSync();
// Check the results.
mActivity.assertAutoFilled();
// Make sure input was sanitized.
final InstrumentedAutoFillService.FillRequest request = sReplier.getNextFillRequest();
assertWithMessage("CancelationSignal is null").that(request.cancellationSignal).isNotNull();
assertTextIsSanitized(request.structure, ID_PASSWORD);
final FillContext fillContext = request.contexts.get(request.contexts.size() - 1);
assertThat(fillContext.getFocusedId())
.isEqualTo(findAutofillIdByResourceId(fillContext, ID_USERNAME));
// Make sure initial focus was properly set.
assertWithMessage("Username node is not focused").that(
findNodeByResourceId(request.structure, ID_USERNAME).isFocused()).isTrue();
assertWithMessage("Password node is focused").that(
findNodeByResourceId(request.structure, ID_PASSWORD).isFocused()).isFalse();
}
@Test
public void testAutofill_SwitchToAutofillableActivity() throws Exception {
assertAutofill_SwitchActivity(UsernameOnlyActivity.class, /* autofillable */ true);
}
@Test
public void testAutofill_SwitchToNonAutofillableActivity() throws Exception {
assertAutofill_SwitchActivity(NonAutofillableActivity.class, /* autofillable */ false);
}
private void assertAutofill_SwitchActivity(Class<?> clazz, boolean autofillable)
throws Exception {
// Set service.
enableService();
// Set expectations.
final CannedFillResponse.Builder builder = new CannedFillResponse.Builder()
.addDataset(new CannedFillResponse.CannedDataset.Builder()
.setField(ID_USERNAME, "dude")
.setField(ID_PASSWORD, "password")
.setPresentation(createPresentation("The Username"))
.setInlinePresentation(createInlinePresentation("The Username"))
.build());
sReplier.addResponse(builder.build());
// Trigger auto-fill.
mUiBot.selectByRelativeId(ID_USERNAME);
mUiBot.waitForIdleSync();
sReplier.getNextFillRequest();
// Make sure the suggestion is shown.
mUiBot.assertDatasets("The Username");
mUiBot.pressHome();
mUiBot.waitForIdle();
// Switch to another Activity
startActivity(clazz);
mUiBot.waitForIdle();
// Trigger input method show.
mUiBot.selectByRelativeId(ID_USERNAME);
mUiBot.waitForIdleSync();
if (autofillable) {
sReplier.addResponse(NO_RESPONSE);
sReplier.getNextFillRequest();
}
// Make sure suggestion is not shown.
mUiBot.assertNoDatasets();
}
protected final void startActivity(Class<?> clazz) {
final Intent intent = new Intent(mContext, clazz);
intent.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
mContext.startActivity(intent);
}
@Test
public void testAutofill_selectDatasetThenHideInlineSuggestion() throws Exception {
// Set service.
enableService();
final CannedFillResponse.Builder builder = new CannedFillResponse.Builder()
.addDataset(new CannedFillResponse.CannedDataset.Builder()
.setField(ID_USERNAME, "dude")
.setPresentation(createPresentation("The Username"))
.setInlinePresentation(createInlinePresentation("The Username"))
.build());
sReplier.addResponse(builder.build());
mActivity.expectAutoFill("dude");
// Trigger auto-fill.
mUiBot.selectByRelativeId(ID_USERNAME);
mUiBot.waitForIdleSync();
mUiBot.assertDatasets("The Username");
mUiBot.selectDataset("The Username");
mUiBot.waitForIdleSync();
mUiBot.assertNoDatasets();
// Make sure input was sanitized.
final InstrumentedAutoFillService.FillRequest request = sReplier.getNextFillRequest();
assertWithMessage("CancelationSignal is null").that(request.cancellationSignal).isNotNull();
assertTextIsSanitized(request.structure, ID_PASSWORD);
final FillContext fillContext = request.contexts.get(request.contexts.size() - 1);
assertThat(fillContext.getFocusedId())
.isEqualTo(findAutofillIdByResourceId(fillContext, ID_USERNAME));
// Make sure initial focus was properly set.
assertWithMessage("Username node is not focused").that(
findNodeByResourceId(request.structure, ID_USERNAME).isFocused()).isTrue();
assertWithMessage("Password node is focused").that(
findNodeByResourceId(request.structure, ID_PASSWORD).isFocused()).isFalse();
}
@Test
public void testLongClickAttribution() throws Exception {
// Set service.
enableService();
Intent intent = new Intent(mContext, DummyActivity.class);
PendingIntent pendingIntent =
PendingIntent.getActivity(mContext, 0, intent, PendingIntent.FLAG_IMMUTABLE);
final CannedFillResponse.Builder builder = new CannedFillResponse.Builder()
.addDataset(new CannedFillResponse.CannedDataset.Builder()
.setField(ID_USERNAME, "dude")
.setPresentation(createPresentation("The Username"))
.setInlinePresentation(
createInlinePresentation("The Username", pendingIntent))
.build());
sReplier.addResponse(builder.build());
mActivity.expectAutoFill("dude");
// Trigger auto-fill.
mUiBot.selectByRelativeId(ID_USERNAME);
mUiBot.waitForIdleSync();
mUiBot.assertDatasets("The Username");
// Long click on suggestion
mUiBot.longPressSuggestion("The Username");
mUiBot.waitForIdleSync();
// Make sure the attribution showed worked
mUiBot.selectByText("foo");
// Go back to the filled app.
mUiBot.pressBack();
sReplier.getNextFillRequest();
mUiBot.waitForIdleSync();
}
@Test
@AppModeFull(reason = "BROADCAST_STICKY permission cannot be granted to instant apps")
public void testAutofill_noInvalid() throws Exception {
final String keyInvalid = "invalid";
final String keyValid = "valid";
final String message = "Passes valid message to the remote service";
final Bundle bundle = new Bundle();
bundle.putBinder(keyInvalid, new Binder());
bundle.putString(keyValid, message);
// Set service.
enableService();
final MockImeSession mockImeSession = sMockImeSessionRule.getMockImeSession();
assumeTrue("MockIME not available", mockImeSession != null);
mockImeSession.callSetInlineSuggestionsExtras(bundle);
final CannedFillResponse.Builder builder = new CannedFillResponse.Builder()
.addDataset(new CannedFillResponse.CannedDataset.Builder()
.setField(ID_USERNAME, "dude")
.setPresentation(createPresentation("The Username"))
.setInlinePresentation(createInlinePresentation("The Username"))
.build());
sReplier.addResponse(builder.build());
// Trigger auto-fill.
mUiBot.selectByRelativeId(ID_USERNAME);
mUiBot.waitForIdleSync();
mUiBot.assertDatasets("The Username");
final InstrumentedAutoFillService.FillRequest request = sReplier.getNextFillRequest();
final Bundle extras = request.inlineRequest.getExtras();
assertThat(extras.get(keyInvalid)).isNull();
assertThat(extras.getString(keyValid)).isEqualTo(message);
final Bundle style = request.inlineRequest.getInlinePresentationSpecs().get(0).getStyle();
assertThat(style.get(keyInvalid)).isNull();
assertThat(style.getString(keyValid)).isEqualTo(message);
final Bundle style2 = request.inlineRequest.getInlinePresentationSpecs().get(1).getStyle();
assertThat(style2.get(keyInvalid)).isNull();
assertThat(style2.getString(keyValid)).isEqualTo(message);
}
@Test
@AppModeFull(reason = "WRITE_SECURE_SETTING permission can't be grant to instant apps")
public void testSwitchInputMethod() throws Exception {
// Set service
enableService();
final CannedFillResponse.Builder builder = new CannedFillResponse.Builder()
.addDataset(new CannedFillResponse.CannedDataset.Builder()
.setField(ID_USERNAME, "dude")
.setPresentation(createPresentation("The Username"))
.setInlinePresentation(createInlinePresentation("The Username"))
.build());
sReplier.addResponse(builder.build());
// Trigger auto-fill
mUiBot.selectByRelativeId(ID_USERNAME);
mUiBot.waitForIdleSync();
mUiBot.assertDatasets("The Username");
sReplier.getNextFillRequest();
// Trigger IME switch event
Helper.mockSwitchInputMethod(sContext);
mUiBot.waitForIdleSync();
final CannedFillResponse.Builder builder2 = new CannedFillResponse.Builder()
.addDataset(new CannedFillResponse.CannedDataset.Builder()
.setField(ID_USERNAME, "dude2")
.setPresentation(createPresentation("The Username 2"))
.setInlinePresentation(createInlinePresentation("The Username 2"))
.build());
sReplier.addResponse(builder2.build());
// Trigger auto-fill
mUiBot.selectByRelativeId(ID_USERNAME);
mUiBot.waitForIdleSync();
// Confirm new suggestion
mUiBot.assertDatasets("The Username 2");
// Confirm new fill request
sReplier.getNextFillRequest();
}
@Test
@AppModeFull(reason = "BROADCAST_STICKY permission cannot be granted to instant apps")
public void testImeDisableInlineSuggestions_fallbackDropdownUi() throws Exception {
// Set service.
enableService();
final MockImeSession mockImeSession = sMockImeSessionRule.getMockImeSession();
assumeTrue("MockIME not available", mockImeSession != null);
// Disable inline suggestions for the default service.
final Bundle bundle = new Bundle();
bundle.putBoolean("InlineSuggestions", false);
mockImeSession.callSetInlineSuggestionsExtras(bundle);
final CannedFillResponse.Builder builder = new CannedFillResponse.Builder()
.addDataset(new CannedFillResponse.CannedDataset.Builder()
.setField(ID_USERNAME, "dude")
.setPresentation(createPresentation("The Username"))
.setInlinePresentation(createInlinePresentation("The Username"))
.build());
sReplier.addResponse(builder.build());
// Trigger auto-fill.
mUiBot.selectByRelativeId(ID_USERNAME);
mUiBot.waitForIdleSync();
// Check that no inline requests are sent to the service.
final InstrumentedAutoFillService.FillRequest request = sReplier.getNextFillRequest();
assertThat(request.inlineRequest).isNull();
// Check dropdown UI shown.
getDropdownUiBot().assertDatasets("The Username");
}
@Test
public void testTouchExplorationEnabledImeSupportInline_inlineShown() throws Exception {
enableTouchExploration();
enableService();
final CannedFillResponse.Builder builder = new CannedFillResponse.Builder()
.addDataset(new CannedFillResponse.CannedDataset.Builder()
.setField(ID_USERNAME, "dude")
.setPresentation(createPresentation("The Username"))
.setInlinePresentation(createInlinePresentation("The Username"))
.build());
sReplier.addResponse(builder.build());
try {
// Trigger auto-fill.
mUiBot.selectByRelativeId(ID_USERNAME);
mUiBot.waitForIdleSync();
final InstrumentedAutoFillService.FillRequest request = sReplier.getNextFillRequest();
assertThat(request.inlineRequest).isNotNull();
// Check datasets shown.
mUiBot.assertDatasets("The Username");
} catch (Exception e) {
throw e;
} finally {
resetTouchExploration();
}
}
@Test
public void testScrollSuggestionView() throws Exception {
// Set service.
enableService();
final int firstDataset = 1;
final CannedFillResponse.Builder builder = new CannedFillResponse.Builder();
for (int i = firstDataset; i <= 20; i++) {
builder.addDataset(new CannedFillResponse.CannedDataset.Builder()
.setField(ID_USERNAME, "dude" + i)
.setPresentation(createPresentation("Username" + i))
.setInlinePresentation(createInlinePresentation("Username" + i))
.build());
}
sReplier.addResponse(builder.build());
// Trigger auto-fill.
mUiBot.selectByRelativeId(ID_USERNAME);
mUiBot.waitForIdleSync();
mUiBot.assertSuggestion("Username" + firstDataset);
// Scroll the suggestion view
mUiBot.scrollSuggestionView(Direction.RIGHT, /* speed */ 3000);
mUiBot.waitForIdleSync();
mUiBot.assertNoSuggestion("Username" + firstDataset);
sReplier.getNextFillRequest();
mUiBot.waitForIdleSync();
}
@Test
public void testClickEventPassToIme() throws Exception {
testTouchEventPassToIme(/* longPress */ false);
}
@Test
public void testLongClickEventPassToIme() throws Exception {
testTouchEventPassToIme(/* longPress */ true);
}
private void testTouchEventPassToIme(boolean longPress) throws Exception {
final MockImeSession mockImeSession = sMockImeSessionRule.getMockImeSession();
assumeTrue("MockIME not available", mockImeSession != null);
// Set service.
enableService();
Intent intent = new Intent(mContext, DummyActivity.class);
PendingIntent pendingIntent =
PendingIntent.getActivity(mContext, 0, intent, PendingIntent.FLAG_IMMUTABLE);
final CannedFillResponse.Builder builder = new CannedFillResponse.Builder()
.addDataset(new CannedFillResponse.CannedDataset.Builder()
.setField(ID_USERNAME, "dude")
.setPresentation(createPresentation("The Username"))
.setInlinePresentation(longPress
? createInlinePresentation("The Username", pendingIntent)
: createInlinePresentation("The Username"))
.build());
sReplier.addResponse(builder.build());
final ImeEventStream stream = mockImeSession.openEventStream();
// Trigger auto-fill.
mUiBot.selectByRelativeId(ID_USERNAME);
mUiBot.waitForIdleSync();
sReplier.getNextFillRequest();
mUiBot.assertDatasets("The Username");
if (longPress) {
// Long click on suggestion
mUiBot.longPressSuggestion("The Username");
expectEvent(stream,
event -> "onInlineSuggestionLongClickedEvent".equals(event.getEventName()),
MOCK_IME_TIMEOUT_MS);
} else {
// Click on suggestion
mUiBot.selectDataset("The Username");
expectEvent(stream,
event -> "onInlineSuggestionClickedEvent".equals(event.getEventName()),
MOCK_IME_TIMEOUT_MS);
}
}
@Test
public void testInlineSuggestionViewReleased() throws Exception {
// Set service
enableService();
// Prepare the autofill response
final CannedFillResponse.Builder builder = new CannedFillResponse.Builder()
.addDataset(new CannedFillResponse.CannedDataset.Builder()
.setField(ID_USERNAME, "dude")
.setPresentation(createPresentation("The Username"))
.setInlinePresentation(createInlinePresentation("The Username"))
.build())
.addDataset(new CannedFillResponse.CannedDataset.Builder()
.setField(ID_PASSWORD, "sweet")
.setPresentation(createPresentation("The Password"))
.setInlinePresentation(createInlinePresentation("The Password"))
.build())
.addDataset(new CannedFillResponse.CannedDataset.Builder()
.setField(ID_PASSWORD, "lollipop")
.setPresentation(createPresentation("The Password2"))
.setInlinePresentation(createInlinePresentation("The Password2"))
.build());
sReplier.addResponse(builder.build());
// Trigger auto-fill on username field
mUiBot.selectByRelativeId(ID_USERNAME);
mUiBot.waitForIdleSync();
mUiBot.assertDatasets("The Username");
Helper.assertActiveViewCountFromInlineSuggestionRenderService(1);
// Switch focus to password
mUiBot.selectByRelativeId(ID_PASSWORD);
mUiBot.waitForIdleSync();
mUiBot.assertDatasets("The Password", "The Password2");
Helper.assertActiveViewCountFromInlineSuggestionRenderService(2);
// Switch focus back to username
mUiBot.selectByRelativeId(ID_USERNAME);
mUiBot.waitForIdleSync();
mUiBot.assertDatasets("The Username");
Helper.assertActiveViewCountFromInlineSuggestionRenderService(1);
// Select the autofill suggestion on username, then check the results
mActivity.expectAutoFill("dude");
mUiBot.selectDataset("The Username");
mUiBot.waitForIdleSync();
mActivity.assertAutoFilled();
sReplier.getNextFillRequest();
// Sleep for a while for the wait in {@link com.android.server.autofill.ui
// .RemoteInlineSuggestionUi} to timeout.
SystemClock.sleep(500);
Helper.assertActiveViewCountFromInlineSuggestionRenderService(0);
}
private void enableTouchExploration() throws InterruptedException {
toggleTouchExploration(/*enable=*/ true);
}
private void resetTouchExploration() throws InterruptedException {
toggleTouchExploration(/*enable=*/ false);
}
private void toggleTouchExploration(boolean enable)
throws InterruptedException {
final AccessibilityManager manager =
getContext().getSystemService(AccessibilityManager.class);
if (isTouchExplorationEnabled(manager) == enable) {
return;
}
final CountDownLatch latch = new CountDownLatch(1);
AccessibilityManager.TouchExplorationStateChangeListener serviceListener =
(boolean newState) -> {
if (newState == enable) {
latch.countDown();
}
};
manager.addTouchExplorationStateChangeListener(serviceListener);
final UiAutomation uiAutomation =
InstrumentationRegistry.getInstrumentation().getUiAutomation();
final AccessibilityServiceInfo info = uiAutomation.getServiceInfo();
assert info != null;
if (enable) {
info.flags |= AccessibilityServiceInfo.FLAG_REQUEST_TOUCH_EXPLORATION_MODE;
} else {
info.flags &= ~AccessibilityServiceInfo.FLAG_REQUEST_TOUCH_EXPLORATION_MODE;
}
uiAutomation.setServiceInfo(info);
// Wait for touch exploration state to be toggled
assertThat(latch.await(10, TimeUnit.SECONDS)).isTrue();
if (enable) {
assertThat(isTouchExplorationEnabled(manager)).isTrue();
} else {
assertThat(isTouchExplorationEnabled(manager)).isFalse();
}
manager.removeTouchExplorationStateChangeListener(serviceListener);
}
private static boolean isTouchExplorationEnabled(AccessibilityManager manager) {
return manager.isEnabled() && manager.isTouchExplorationEnabled();
}
} |
export class Binatang {
constructor(public name: string, public isCarnivore: boolean) {}
makan(): void {
console.log("Binatang makan");
}
}
export class Katak extends Binatang {
constructor(name: string, isSwimming: boolean, public color: string) {
super(name, isSwimming)
}
makan(): void {
console.log(`${this.name} makan`);
}
}
const katak = new Katak("aldo", true, "white")
console.log(katak);
katak.makan() |
import { Route,Routes } from "react-router-dom";
import {useEffect} from 'react';
import {useDispatch} from 'react-redux';
import {getallProperties, getCities,getCitiesA} from './redux/actions/index';
import Landing from "./pages/landing/Landing.jsx";
import Home from './pages/home/Home.jsx';
import Detail from "./pages/detail/Detail.jsx";
import Form from "./pages/createProperty/form.jsx";
import Nav from "./components/nav-bar/Nav.jsx"
import LogIn from "./pages/logIn/LogIn";
import SignUp from "./pages/signup/SignUp";
function App() {
const dispatch = useDispatch()
useEffect(() => {
dispatch(getallProperties())
dispatch(getCities())
dispatch(getCitiesA())
}, [])
return (
<div className="flex flex-col ">
<Routes>
<Route path="/" element={<><Landing/></>}/>
<Route path="/home" element={<><Nav/><Home/></>}/>
<Route path="/detail/:id" element={<><Nav/><Detail/></>}/>
<Route path="/createProperty" element={<><Nav/><Form/></>}/>
<Route path="/login" element={<LogIn/>}/>
<Route path="/signup" element={<SignUp/>}/>
</Routes>
</div>
);
}
export default App; |
# Data Warehouse, Introduzione
Sono basi dati usate per il supporto alle decisioni, e sono mantenute separate dalle basi dati operative dell'azienda.
I dati contenuti all'interno di un data warehouse sono **orientati ai soggetti di interesse**, **consistenti** e **durevoli nel tempo** e sono d'aiuto per le decisioni aziendali.
I dati sono tenuti in basi separati per diversi motivi: il primo riguarda le **prestazioni**, infatti interrogare un data warehouse potrebbe compromettere le transazioni sulla base dati operativa.
Inoltre esistono diversi mezzi di accesso a livello fisico.
Il secondo motivo è legato alla gestione dei dati: **storico**, **consolidamento dei dati** e **qualità dei dati**.
### Rappresentazione dei dati
I dati in una warehouse sono rappresentati come un **iper cubo** ad n dimensioni: i valori numerici (detti **misure**) sono individuati dall'intersezione delle n dimensioni:
![[Pasted image 20231005115906.png]]
Le tre dimensioni **descrittive** dell'esempio rappresentano ad esempio le vendite di un supermercato.
Una rappresentazione alternativa è quella **a stella**: le 3 dimensioni del cubo diventano **entità del modello ER**, le relazioni legano le entità alle misure (centro stella):
![[Pasted image 20231005120305.png]]
### Architetture per data warehouse
Trattiamo separatamente l'elaborazione OLTP e elaborazione OLAP, quindi dobbiamo necessariamente avere architetture su due livelli (quelle a un livello sono complessi e non presi in considerazione nel corso).
![[Pasted image 20231005121240.png]]
I dati in arrivo dalle sorgenti sono estratti tramite strumenti **ETL** (extraction transform load) e portati nella **data warehouse**:
- E: si acquisiscono i dati dalla sorgente
- T: prima di Puliscono i dati (correttezza e consistenza), poi si convertono nel formato del warehouse (integrazione)
- C: Infine si propagano le informazioni nel warehouse
Sopra al data warehouse sono presenti gli **OLAP server** che servono per rispondere meglio alle interrogazioni di analisi. Sono di 3 tipi:
- ROLAP (Relational OLAP): sono DBMS relazioni estesi con operazioni efficienti per i dati, SQL esteso e potenziato.
- MOLAP (Multidimensional OLAP): in questo tipo di tecnologie i dati sono rappresentati in forma matriciale, perciò sparsi, infatti sono presenti tecniche di compressione e decompressione. Queste soluzioni sono spesso proprietarie.
- HOLAP (Hybrid OLAP) costituiscono soluzioni ibride.
I **Data marts** sono porzioni che poi crescono in maniera omogena per generare il warehouse (sono 'pezzi' che compongono il warehouse): normalmente l'approccio è bottom up, cioè prima si progettano i marts e poi crescendo si passa la warehouse.
Normalmente i marts possono essere:
- alimentati dal data warehouse primario
- alimentati direttamente dalle sorgenti
Gli **strumenti di analisi** prendono i dati e analizzandoli li servono al cliente finale.
I **Metadati** sono dati sui dati, e descrivono tutti gli oggetti e le strutture che fanno parte della base dati.
Servono per trasformazione e caricamento (da dove provengono i dati... ), per la gestione dei dati (formato dei dati ecc...) e per la gestione delle query (codice SQL, CPU usage ecc...)
Concettualmente distinguiamo **due livelli: livello delle sorgenti e livello della warehouse**.
Nelle architetture a 3 livello è presente una **staging area** fra i due livelli di prima: funziona da buffer su cui svolgiamo la fase di ET, prima di essere caricati nel warehouse con gli strumenti L.
# Progettazione concettuale di Data warehouse
Quando si comincia un progetto le aspettative degli utenti sono alte, e ci sono una serie di problematiche come la qualità dei dati OLTP di partenza (dati inaffidabili) e la gestione 'politica' del progetto (collaborazione con i 'detentori' delle informazioni)
### Analisi dei requisiti
Questa fase raccoglie i dati che servono per realizzare le esigenze delle aziende, si individuano dei **vincoli**.
Il data mart che si andrà a progettare è strategico per una determinata azienda, e sarà alimentato da poche (affidabili) sorgenti.
```ad-note
title: Riconciliazione
La riconcilazione dei dati da parte di diverse fonti è un processo che richiede tempo e molta fatica
```
I **requisiti applicativi** descrivono gli eventi di interesse (ovvero i fatti) che interessano all'azienda (gli scontrini per un supermercato).
Questi fatti sono caratterizzate da dimensione descrittive.
Altro importante parametro è il **carico di lavoro**: per determinare quest'ultimo si ricorre a report aziendali o a specifiche domande in linguaggio naturale (quanti scontrini si fanno al mese).
I **requisiti strutturali** dipendono da:
- **Periodicità** dell'alimentazione, se voglio inserire nuovi dati giornalmente, mensilmente ecc...
- **Spazio disponibile** per dati e strutture accessorie
- **Tipologia di struttura**, ovvero il numero di livelli necessari, se i data mart sono indipendenti o no...
- **Pianificazione del deployment**: come si deciderà di avviare il data mart, e inoltre bisogna considerare un periodo di formazione per i nuovi utenti.
### Progettazione DFM
Si utilizza un modello detto **Dimensional Fact Model**, che per uno specifico fatto definisci **schemi di fatto** che modellano dimensioni gerarchie e misure.
E' un modello grafico e può essere anche usato come documentazione di progetto utile a priori e posteriori.
Il modello si rifà alla rappresentazione dei dati tramite iper cubo, con fatti, dimensioni e misure.
Il **Fatto** che prendiamo in esempio è la vendita di un certo prodotto, che definiamo graficamente come:
Le **dimensioni** aggiuntive sono **data e negozio**.
Ogni fatto è descritto da misure numeriche indicate all'interno del fatto.
![[Pasted image 20231006203829.png]]
Le **Gerarchie** rappresentano una relazione di generalizzazione tra un sottoinsieme di attributi di una dimensione: nella pratica è una relazione funzionale 1:n.
![[Pasted image 20231006204016.png]]
Prendendo in esempio la dimensione NEGOZIO notiamo che un negozio si trova in una sola città (il viceversa è 1:n), data una città c'è una sola regione e così via.
Questi attributi sono necessari per confrontare tramite group by e generare report significativi (esempio, confronto vendite Piemonte e Lombardia).
Per la gerarchia del tempo, oltre alla scomposizione della data possiamo usare un attributo **vacanza** oppure **evento speciale** per verificare se (ad esempio) le abitudini dei consumatori sono influenzate da giorni particolari (come natale...).
Altri componenti del modello DFM sono riportati nell'immagine di sotto:
![[Pasted image 20231006204850.png]]
Il primo componente è l'**arco di opzionalità** (0,1) come nel caso del prodotto dietetico o no.
Gli **attributi descrittivi** non servono per raggruppare ma solo a fornire maggiori dettagli (ad esempio il numero di telefono).
Un concetto importante è al **convergenza** delle gerarchie: ad esempio dei punti vendite sono legate a una certa area geografica (spesso non coincidente con la regione); questa caratteristica geografica nell'esempio è il distretto di vendita che converge poi nello stato.
La **non additività** pone un limite alle interrogazioni che posso fare al modello.
L'**aggregazione** è il processo di calcolo del valore di misure a granularità meno fine di quella presente nello schema di fatto originale.
Le misure possono essere **additive, non additive e non aggregabili**.
```ad-tip
title: Schemi di fatto vuoti
E' possibile inserire eventi che non sono caratterizzati da misure, ma semplicimente utili al calcolo di alcuni aggregati (come COUNT).
Si pensi ad esempio al numero di frequenze di uno studente a lezione: possiamo avere come evento la **frequenza**, con dimensioni matricola, data e corso. Procedere facendo una COUNT raggruppando per corso e studente.
```
### Misure
**Misure di flusso**: misure cumulative su un periodo di tempo, sono aggregabili con tutti gli operatori standard.
Ad esempio la quantità di prodotti venduti.
**Misure di livello**: misurano quantità in specifici istanti (snapshot), e non sono additive lungo la dimensione tempo.
Ad esempio il saldo del conto corrente
**Misure unitarie**: valutate in specifici istanti di tempo ed espressi in termini relativi, inoltre non sono additive lunga nessuna dimensione.
Ad esempio il prezzo di vendita di un prodotto.
### Operatori
**Operatori distributivi**: è sempre possibile il calcolo di aggregati da dati a livello di dettaglio maggiore.
Esempio: sum, min, max
**Algebrici**: il calcolo di aggregati da dati a livello di dettaglio maggiore è possibile in presenza di misure aggiuntive di supporto (ad esempio avg richiede count)
**Olistici**: Non è possibile il calcolo degli aggregati da dati a livello di gerarchia superiore. (moda, mediana)
### Altri costrutti
![[Pasted image 20231006212107.png]]
La soluzione sotto non va bene, meglio unire le gerarchie e definire i **ruoli** delle due dimensioni.
Le **relazioni molti a molti** nel DFM vengono sostituite con un **arco multiplo**:
![[Pasted image 20231011084135.png]]
Se una certa dimensione contiene molti attributi che possono essere o veri o falsi, allora si parla di **attributo configurazione**: ogni dettaglio può essere Y/N e si usa una stringa binaria per identificare la configurazione:
![[Pasted image 20231011084430.png]]
### Rappresentazione del tempo
Normalmente la variazione dei dati nel tempo è rappresentata dal versificarsi degli eventi.
Ma in alcuni casi possono variare nel tempo anche le dimensioni.
Una prima tecnica di rappresentazione del tempo è quella '**istantanea**': non ci interessa studiare la variazione di un certo dato nel data warehouse.
Pertanto se una persona cambia stato civile, tutti i suoi acquisti sono registrati come 'sposato', e di fatto non si tiene traccia del passato
Tecnica 2: ogni volta che un attributo si modifica si crea una nuova istanza del fatto precedente.
Se una persona cambia stato civile allora ci saranno due istanze della stessa persona che ne tiene traccia.
Tecnica 3: Oltre a istanziare una nuova tupla quando si cambia un attributo, si aggiunge anche due **timestamp** che indicano il periodo di validità di una certa tupla. Inoltre si definisce anche un attributo che permette di ricostruire lo storico della variazione (**master**) a partire dalla prima istanza originale.
### Carico, volume, sparsità
Il **Carico** è stimato e definito in fase di analisi con gli utenti: tuttavia il carico è difficilmente stimabile durante la progettazione.
Dopo l'avviamente del sistema la fase di **tuning** è necessaria per monitorare il carico reale.
Il **Volume** si stima per lo spazio occupato dai dati e dalle strutture accessorie: inoltre dipende dal numero di fatti, di gerarchie, dalla lunghezza degli attributi e dalla gestione del tempo nel data mart.
Questo problema è condizionato dalla **sparsità dei dati**.
- La sparsità si riduce al crescere del livello di aggregazione dei dati
- Può ridurre l'affidabilità della stima della cardinalità dei dati
![[Pasted image 20231011091957.png]]
# Progettazione logica di Data Warehouse
L'obiettivo di questa fase è quella di **produrre in output delle tabelle**.
La seguente tecnica è **lo schema a stella.
Per **ogni fatto ci sarà una tabella**:
- le misure diventano attributi
- le dimensioni del fatto diventano chiavi
Per **ogni dimensione ci sarà una tabella**:
- la chiave primaria è detta surrogata (generata artificialmente)
- contiene tutti gli attributi della dimensione
- gerarchie non rappresentate esplicitamente
- ridondanza dei dati, quindi non c'è forma normale
Si fa un trade off tra spazio (molto spazio occupato) e efficienza (non ci sono join da fare ad esempio con la città): la normalizzazione si **mantiene solo sul fatto**, mentre sulle dimensioni non c'è normalizzazione.
![[Pasted image 20231011093009.png]]
```ad-seealso
title: Schema snowflakes
Stesso concetto di base della stella, ma ottimizza la dimensione occupata dei dati: le gerarchie diventano tabelle esterne con ID come chiave esterna.
Normalmente il tradeoff è poco vantaggioso e lo schema è poco usato nella progettazione di data mart.
![[Pasted image 20231011093541.png]]
```
### Rappresentazione molti a molti
Per quanto riguarda la traduzione di **archi multipli** abbiamo due possibilità:
- **Bridge table** che specifica la relazione molti a molti ed introduce un nuovo attributo 'peso' che indica la partecipazione delle tuple nella relazione
- **Push down**: in questa strategia l'arco multiplo è integrato nella tabella dei fatti con un'apposita chiave
![[Pasted image 20231011094013.png]]
Dagli archi multipli si creano due tipi di interrogazione distinti: il primo tipo è l'interrogazione **pesata** che tiene conto della partecipazione di un certo attributo in una relazione; ad esempio se voglio calcolare l'incasso delle vendite di libri per autore
```
SELECT ID_Autore, SUM(Incasso*Peso)
FROM...
...
GROUP BY ID_Autore
```
Al contrario **le interrogazioni d'impatto** non considerano il peso: ad esempio come la somma dei libri venduti per autore:
```
SELECT ID_Autore, SUM(Quantità)
FROM...
...
GROUP BY ID_Autore
```
I pro e contro della bridge table e del push down sono i seguenti:
| Bridge Table | Push Down |
| --------------------------------------------- | ------------------------------------------- |
| semplice | E' difficile fare interrogazioni di impatto |
| Minore ridondanza | Maggior ridondanza dei dati |
| Costo di esecuzione maggiore dovuto alle join | Meno join, quindi più efficienza di esecuzione |
### Dimensioni degeneri
Sono **dimensioni rappresentate da un solo attributo**.
Per integrare queste dimensioni (spesso indipendenti fra di loro) posso o facendo un **push** della tabella dei fatti (solo per piccole dimensioni).
In alternativa posso creare una **junk dimension** ovvero una tabella che accumula tutte queste dimensioni.
Nell'esempio qui sotto la dimensione MCS contiene solo dimensioni degeneri:
![[Pasted image 20231011095715.png]]
# Analisi di Data Warehouse
Per analizzare i dati presenti è necessario calcolare **aggregati**, anche di complessità maggiore (top ten, media mobile), effettuare **confronti** ed utilizzare le tecniche di **data mining**.
Abbiamo diversi strumenti per l'analisi:
- **Ambiente controllato da query**: comprende ricerche complesse con **struttura prefissata** (cruscotto), rapporti con struttura prefissata. Offre una vista a colpo d'occhio per monitorare i dati, e richiede la scrittura del codice ad hoc.
- **Ambiente di query ad hoc**: in questo contesto è possibile definire interrogazioni sul momento tramite interfacce simili a fogli di calcolo (pensato per non informatici)
- Specifici strumenti di query
- Data mining
### OLAP (Online Analytical processing)
Una prima tecnica di analisi è il **ROLL UP** che consiste nel **ridurre il livello di dettaglio**, con l'aumento di un livello nella gerarchia (ad esempio, partendo da group by negozio, mese passo a group by città, mese).
Alternativamente anche **eliminando una dimensione** ottengo lo stesso risultato.
![[Pasted image 20231012103121.png]]
La tecnica inversa è detta **DRILL DOWN**, consiste nell'**aumentare il livello di dettaglio**, quindi procederò o aggiungendo una dimensione oppure scendendo in profondità nella gerarchia.
Spesso il drill down opera su un sottoinsieme dei dati di partenza: considerando le vendita raggruppate per regioni, procediamo ad analizzarne SOLO UNA:
![[Pasted image 20231012104019.png]]
Terza tecnica: **SLICE AND DICE** consiste nella selezione di un sottoinsieme dei dati tramite **predicati**: nel caso di un predicato su una dimensione allora estrarrò una 'fetta' del cubo, nel caso di più predicati su più dimensioni selezionerò 'cubetti'.
![[Pasted image 20231012104245.png]]
Quarta tecnica: **PIVOT** consiste nel **riorganizzare i dati senza variare il livello di dettaglio** in modo da permettere una migliore visualizzazione delle informazioni.
![[Pasted image 20231012105047.png]]
### Estensioni del linguaggio SQL
Nuova clausola **WINDOW** che definisce il partizionamento delle tuple senza modificarne la cardinalità.
Consideriamo un esempio:
![[Pasted image 20231012110512.png]]
Visualizzare, per ogni città e mese:
- l’importo delle vendite
- la media rispetto al mese corrente e ai due mesi precedenti, separatamente per ogni città
```SQL
SELECT Città, Mese, Importo,
AVG(Importo), OVER (PARTITION BY Città
ORDER BY Mese
ROWS 2 PRECIDING)
FROM Vendite
```
La **finestra mobile** è definibile in due modi:
- Livello **fisico**, andando ad indicare un conteggio delle righe su cui operare
- Livello **logico**: formando un gruppo in base alla definizione di un intervallo intorno alla chiave di ordinamento.
La keyword **ROWS** definisce l'intervallo fisico:
- ROWS BETWEEN 1 PRECIDING AND 1 FOLLOWING indica la riga corrente, precedente e successiva
- ROWS BETWEEN 3 PRECIDING AND 1 PRECEDING non include la riga corrente ma include le righe -3,-2,-1
- ROWS UNBOUNDED FOLLOWING indica tutte le righe dopo quella corrente (fino alla fine della partizione)
E' anche possibile **combinare** group by e over per, prima semplificare i dati e successivamente andare a fare aggregati particolari.
C'è la possibilità di utilizzare la funzione **RANK()** o **DENSERANK()** per fare la classifica sulla base di una OVER con un ORDER BY interno.
Una ulteriore estensione del GROUP BY serve per calcolare più combinazioni di aggregati: in questo modo si evitano lettere multiple dei dati e query ridondanti.
Ad esempio con la sintassi GROUPU BY ROLLUP(Città, Mese, Anno) calcoliamo la query per tutte le possibili combinazioni di quei 3 attributi.
L'ordine con cui si vanno a togliere gli attributi e quello specificato: nell'esempio avremo CittaMeseAnno, MeseAnno, Anno, -
Con la sintassi GROUP BY CUBE(...) andiamo ad eseguire la query per tutte le possibili combinazioni delle dimensioni specificate.
Infine GROUPING SET(...) permette di specificare le singole combinazioni per le quali eseguire la query.
### Viste materializzate
Le **viste materializzate** sono memorizzate esplicitamente nel data warehouse, questo consente di aumentare l'efficienza della interrogazioni che richiedono aggregazioni.
# Progettazione fisica del Data Warehouse
Questo processo dipende da numerosi fattori:
| Parametro | Note |
| ----------------------------- | ------------------------------------------------------------------------------------------------------------------------------------- |
| Carico di lavoro | In un data warehouse ci sono interrogazioni complesse che richiedono accesso (in sola lettura tipicamente) a grandi quantità di dati. |
| Strutture fisiche | Necessarie strutture non tradizionali: trattati più avanti. Sono usate spesso viste materializzate. |
| Ottimizzatore |Deve essere evoluto, serve capacità di riconoscere ed utilizzare viste materializzate |
| Procedimento di progettazione | Progetto specifico volto ad ottimizzare esigenze specifiche di progetto: ho vincoli di spazio e tempo |
| Tuning | Posso variare a posteriori le strutture fisiche di supporto, senza modificare le applicazioni |
| Parallelismo |Se le strutture lo permettono, parallelizzazione di operazioni complesse |
### Scelta degli indici
- Indicizzazione delle **dimensioni**
- Indici per i **join**
- Indici per i **group by**
# Alimentazione
Processo base dell'alimentazione è costituito dalla fase **ETL**, extract, transform e load.
L'**estrazione** è l'acquisizione dei dati dalle sorgenti, avviene in due modi:
- **statico**: è una fotografia dei dati operazionali
- **incrementale**: selezione degli aggiornamenti avvenuti dopo l'ultima registrazione
In ogni caso scelgo di estrarre i dati in base alla loro qualità.
### Estrazione statica
L'estrazione avviene in base alle informazioni temporali che possiedo riguardo ai dati:
- **Storicizzati**:ho già tutte le modifiche memorizzate nel sistema OLTP
- **Semi storicizzati**: Solo un numero limitato di stati sono memorizzati nel sistema OLTP
- **Transitori**: Il sistema OLTP mantiene solo lo stato più recente del dato
### Estrazione incrementale
Normalmente questo processo è assistito da un'applicazione: **modifico opportunamente l'applicazione OLTP** per permettermi di salvare dati.
Questo richiede di aumentare il carico, costoso, complesso.
Un secondo metodo consiste nell'uso di **log**, in formato proprietario, per tenere traccia di ogni transizione nel sistema. Questo metodo è efficiente in quanto non impatta sul carico applicativo.
Terzo modo: **uso dei trigger** per tenere traccia delle modifiche di interesse. Non modifico le applicazioni già esistenti ma aumento il carico applicativo
L'ultima soluzione è basato su **timestamp**: modifico opportunamente gli schemi logici per inserire un nuovo attributo per il tempo. Ogni transazione è marcata dal tempo e quindi indirettamente posso tener traccia dei dati. E' efficiente ma richiede la modifica di schemi e applicazioni (problema: se i dati sono transitori si possono perdere alcuni stati).
![[Pasted image 20231013091603.png]]
### Pulitura
Operazioni volte al miglioramento della qualità dei dati (correttezza e consistenza):
- dati duplicati
- mancanti
- uso non previsto di campo
- valori impossibili o errati
- inconsistenze
I problemi sono dovuti a errore umano, differenze di formato nei campi e evoluzione di paradigmi in azienda.
Ogni problema richiede una tecnica specifica di soluzione:
basate su dizionari per gli errori di battitura, tecniche di fusione approssimata. In ogni caso la **prevenzione** da errori è la migliore soluzione.
Join approssimato:
![[Pasted image 20231013093153.png]]
Problema Purge/Merge:
![[Pasted image 20231013093201.png]]
### Trasformazione e caricamento
I dati acquisiti vanno **normalizzati, standardizzati e poi corretti**. Per fare ciò si prevede un'integrazione dei dati per convertirli nel formato della base dati: questo prevede una **schema riconciliato** uniforme dei dati.
Altre metodologie di questa fase sono il filtraggio dei dati significativi, l'aggregazione, la generazione di chiavi surrogate e di valori aggregati
Dopo questa fase c'è il **caricamento** che prevede di aggiornare **in ordine** (per mantenere integrità):
1. Dimensioni
2. Tabelle dei fatti
3. Viste materializzate ed indici
![[Pasted image 20231013094442.png]]
![[Pasted image 20231013094525.png]]
[[Esercizio viste materializzate + trigger]]
[[Ditta Elettrodomestici]]
[[Eccellenze Made in Italy]] |
import 'package:flutter/material.dart';
import 'package:flutter_screenutil/flutter_screenutil.dart';
import 'package:google_nav_bar/google_nav_bar.dart';
import '../../../../core/constants/colors.dart';
class NavBar extends StatelessWidget {
const NavBar({super.key});
@override
Widget build(BuildContext context) {
return GNav(
mainAxisAlignment: MainAxisAlignment.spaceAround,
haptic: true, // haptic feedback
tabBorderRadius: 15,
backgroundColor: Colors.white,
tabMargin: EdgeInsets.symmetric(vertical: 10.sp),
tabActiveBorder:
Border.all(color: Colors.black, width: 1), // tab button border
tabBorder: Border.all(
color: Colors.grey, width: 1), // tab button border/ tab button shadow
gap: 8, // the tab button gap between icon and text
color: Colors.black, // unselected icon color
activeColor: Colors.black, // selected icon and text color
iconSize: 24, // tab button icon size
tabBackgroundColor:
AppColors.appBarColor, // selected tab background color
padding: EdgeInsets.symmetric(horizontal: 20.sp, vertical: 5.sp),
onTabChange: (value) {},
tabs: [
const GButton(
backgroundColor: AppColors.appBarColor,
icon: Icons.home,
text: "Home",
),
GButton(
backgroundColor: Colors.blue.withOpacity(0.6),
icon: Icons.person,
text: "Profile",
),
],
);
}
} |
import React from 'react'
import Image from 'next/image';
import ArrowUpRightIcon from '@heroicons/react/20/solid/ArrowUpRightIcon';
interface Props {
learnWeb3NFTs: any;
buildSpaceNFTs: any
}
function NFTCard({ learnWeb3NFTs, buildSpaceNFTs }: Props) {
return (
<div className="mx-auto max-w-2xl py-16 px-8 lg:max-w-5xl lg:px-2 sm:px-4">
<div className="group grid grid-cols-1 gap-x-4 gap-y-10 sm:gap-x-6 md:grid-cols-3 md:gap-y-0 lg:gap-x-8">
{
learnWeb3NFTs && learnWeb3NFTs.map((nft: any, i: any) =>
(
<div key={i + 1}>
<div className='relative h-80 w-80 '>
<Image
src={nft.media[0].gateway}
alt="learnWeb3NFTs"
layout='fill'
className="rounded-xl"
/>
</div>
<p className="mt-1 text-base font-semibold text-black">Id: {nft.id.tokenId.substr(nft.id.tokenId.length -4)}</p>
<p className="mt-1 text-base font-semibold text-black">{nft.title}</p>
<p className="mt-1 text-base font-semibold text-black">{nft.description?.substr(0, 150)}</p>
<div className='flex mt-1 items-center'>
<a className="text-base font-semibold text-gray-500" target={"_blank"} rel="noreferrer" href={`https://polygonscan.com/address/${nft.contract.address}`}>
View on polygon scan
</a>
<ArrowUpRightIcon className='h-5 w-5 text-gray-500 cursor-pointer' />
</div>
</div>
)
)
}
</div>
<div className="group grid grid-cols-1 gap-x-4 gap-y-10 sm:gap-x-6 md:grid-cols-3 md:gap-y-14 lg:gap-x-8 mt-14">
{
buildSpaceNFTs && buildSpaceNFTs.map((nft: any, i: any) =>
(
<div key={i + 1}>
<div className='relative h-80 w-80'>
{
nft.media[0].raw.includes("mp4") ?
<video
src={nft.media[0].raw}
className="rounded-xl h-80 w-80"
autoPlay
controls
/>
:
<Image
src={nft.media[0].gateway}
alt="buildSpaceNFTs"
layout='fill'
className="rounded-xl "
/>
}
</div>
<p className="mt-1 text-base font-semibold text-black">Id: {nft.id.tokenId.substr(nft.id.tokenId.length -4)}</p>
<p className="mt-1 text-base font-semibold text-black">{nft.description?.substr(0, 150)}</p>
<div className='flex mt-1 items-center'>
<a className="text-base font-semibold text-gray-500" target={"_blank"} rel="noreferrer" href={`https://polygonscan.com/address/${nft.contract.address}`}>
View on polygon scan
</a>
<ArrowUpRightIcon className='h-5 w-5 text-gray-500 cursor-pointer' />
</div>
</div>
)
)
}
</div>
</div>
)
}
export default NFTCard; |
import 'package:flutter/material.dart';
import 'package:toikhoe/MainScreen/bac_si_detail_screen.dart';
import 'package:toikhoe/MainScreen/bs_info_screen.dart';
import 'package:toikhoe/database/fetch_user_doctor.dart';
class FavoriteDoctorsScreen extends StatefulWidget {
@override
_FavoriteDoctorsScreenState createState() => _FavoriteDoctorsScreenState();
}
class _FavoriteDoctorsScreenState extends State<FavoriteDoctorsScreen> {
List<Map<String, dynamic>> favoriteDoctors = [];
bool isLoading = true;
@override
void initState() {
super.initState();
_loadFavoriteDoctors();
}
Future<void> _loadFavoriteDoctors() async {
final data = await fetchFavouriteDoctors();
setState(() {
favoriteDoctors = data;
isLoading = false;
});
}
void removeDoctor(int index) {
setState(() {
favoriteDoctors.removeAt(index);
});
}
@override
Widget build(BuildContext context) {
return Scaffold(
appBar: AppBar(
backgroundColor: Colors.white,
elevation: 1,
centerTitle: true,
iconTheme: const IconThemeData(color: Colors.black),
title: const Text(
'Bác sĩ yêu thích',
style: TextStyle(
color: Colors.black,
fontWeight: FontWeight.bold,
fontSize: 20,
),
),
),
body: isLoading
? const Center(child: CircularProgressIndicator())
: favoriteDoctors.isEmpty
? const Center(
child: Text(
'Không có bác sĩ yêu thích nào.',
style: TextStyle(fontSize: 16, color: Colors.grey),
),
)
: ListView.builder(
itemCount: favoriteDoctors.length,
itemBuilder: (context, index) {
final doctor = favoriteDoctors[index];
return Card(
margin: const EdgeInsets.symmetric(
horizontal: 16, vertical: 8),
elevation: 3,
shape: RoundedRectangleBorder(
borderRadius: BorderRadius.circular(12),
),
child: Padding(
padding: const EdgeInsets.all(12.0),
child: Row(
crossAxisAlignment: CrossAxisAlignment.center,
children: [
const CircleAvatar(
backgroundImage:
AssetImage('assets/ZaloLogin.jpg'),
radius: 30,
),
const SizedBox(width: 12),
Expanded(
child: Column(
crossAxisAlignment: CrossAxisAlignment.start,
children: [
Text(
doctor['name'] ?? 'N/A',
style: const TextStyle(
fontWeight: FontWeight.bold,
fontSize: 16,
),
overflow: TextOverflow.ellipsis,
),
const SizedBox(height: 4),
Text(
doctor['specialization'] ?? '',
style: TextStyle(
fontSize: 14, color: Colors.grey[600]),
overflow: TextOverflow.ellipsis,
),
const SizedBox(height: 8),
Row(
children: [
const Icon(Icons.star,
color: Colors.orange, size: 16),
const SizedBox(width: 4),
Text(
'${doctor['experience']} năm',
style: const TextStyle(fontSize: 14),
),
],
),
],
),
),
Column(
children: [
ElevatedButton.icon(
onPressed: () {
Navigator.push(
context,
MaterialPageRoute(
builder: (context) => BacSiDetailScreen(
doctorData: doctor,
),
),
);
},
style: ElevatedButton.styleFrom(
backgroundColor: Colors.white,
padding: const EdgeInsets.symmetric(
horizontal: 8, vertical: 6),
shape: RoundedRectangleBorder(
borderRadius: BorderRadius.circular(8),
),
),
label: const Text(
'Xem hồ sơ',
style: TextStyle(fontSize: 12),
),
),
const SizedBox(height: 8),
GestureDetector(
onTap: () {
removeDoctor(index);
},
child: const Icon(
Icons.favorite,
color: Colors.red,
size: 28,
),
),
],
),
],
),
),
);
},
),
);
}
} |
#include <stdio.h>
#include <stdlib.h>
#include <assert.h>
#include "list.h"
typedef struct Node Node;
struct Node {
void * data;
Node * next;
Node * prev;
};
struct List {
Node * head;
Node * tail;
Node * current;
};
typedef List List;
Node * createNode(void * data) {
Node * new = (Node *)malloc(sizeof(Node));
assert(new != NULL);
new->data = data;
new->prev = NULL;
new->next = NULL;
return new;
}
List * createList() {
List * list = (List *) malloc(sizeof(List));
list->head = NULL;
list->tail = NULL;
list->current = NULL;
return list;
}
void * firstList(List * list) {
if(!list->head) return NULL;
list->current = list->head;
return list->head->data;
}
void * nextList(List * list) {
if(!list->current || !list->current->next) return NULL;
list->current = list->current->next;
return list->current->data;
}
void * lastList(List * list) {
if(!list->current || !list->tail) return NULL;
list->current = list->tail;
return list->tail->data;
}
void * prevList(List * list) {
if(!list->current || !list->current->prev) return NULL;
list->current = list->current->prev;
return list->current->data;
}
void pushFront(List * list, void * data) {
Node * newNode = createNode(data);
if(list->head) {
list->head->prev = newNode;
newNode->next = list->head;
} else {
list->tail = newNode;
}
list->head = newNode;
}
void pushBack(List * list, void * data) {
list->current = list->tail;
pushCurrent(list, data);
}
void pushCurrent(List * list, void * data) {
Node * new = createNode(data);
Node * current = list->current;
if(current == list->tail) {
new->prev = current;
list->tail = new;
} else {
new->prev = current;
new->next = current->next;
}
current->next = new;
}
void * popFront(List * list) {
list->current = list->head;
return popCurrent(list);
}
void * popBack(List * list) {
list->current = list->tail;
return popCurrent(list);
}
void * popCurrent(List * list) {
void * data = list->current->data;
if(!list->current) return NULL;
if(list->current == list->head) {
list->current->next->prev = NULL;
list->head = list->current->next;
} else if(list->current == list->tail) {
list->current->prev->next = NULL;
list->tail = list->current->prev;
} else {
list->current->prev->next = list->current->next;
list->current->next->prev = list->current->prev;
}
//current = current->next;
free(list->current);
return data;
}
void cleanList(List * list) {
while (list->head != NULL) {
popFront(list);
}
} |
import { defaultAfterAll, defaultAfterEach, defaultBeforeAll, defaultBeforeEach, haveNoAdditionalKeys } from "../utilities/setup";
import { Db, MongoClient } from "mongodb";
import { BaseSchema } from "@uems/uemscommlib";
import { EquipmentDatabase } from "../../src/database/EquipmentDatabase";
import Intentions = BaseSchema.Intentions;
const empty = <T extends Intentions>(intention: T): { msg_intention: T, msg_id: 0, status: 0, userID: string } => ({
msg_intention: intention,
msg_id: 0,
status: 0,
userID: 'user',
})
describe('create messages of states', () => {
let client!: MongoClient;
let db!: Db;
beforeAll(async () => {
const { client: newClient, db: newDb } = await defaultBeforeAll();
client = newClient;
db = newDb;
equipmentDB = new EquipmentDatabase(db, { details: 'details', changelog: 'changelog' });
});
afterAll(() => defaultAfterAll(client, db));
beforeEach(() => defaultBeforeEach([], client, db));
afterEach(() => defaultAfterEach(client, db));
let equipmentDB: EquipmentDatabase;
it('basic create inserts into the database', async () => {
const result = await equipmentDB.create({
...empty('CREATE'),
name: 'name',
manufacturer: 'manufacturer',
model: 'model',
amount: 1,
locationID: 'venue',
category: 'any',
});
expect(result).toHaveLength(1);
expect(typeof (result[0]) === 'string').toBeTruthy();
const query = await equipmentDB.query({ ...empty('READ') });
expect(query).toHaveLength(1);
expect(query[0].name).toEqual('name');
expect(haveNoAdditionalKeys(query[0], ['id', 'assetID', 'name', 'manufacturer', 'model', 'miscIdentifier', 'amount', 'location', 'locationSpecifier', 'manager', 'date', 'category']));
});
it('should not include additional properties in creating records', async () => {
const result = await equipmentDB.create({
...empty('CREATE'),
name: 'name',
manufacturer: 'manufacturer',
model: 'model',
amount: 1,
locationID: 'venue',
category: 'any',
// @ts-ignore
addProp: 'one',
something: 'else',
});
expect(result).toHaveLength(1);
expect(typeof (result[0]) === 'string').toBeTruthy();
const query = await equipmentDB.query({ ...empty('READ') });
expect(query).toHaveLength(1);
expect(query[0].name).toEqual('name');
expect(haveNoAdditionalKeys(query[0], ['id', 'assetID', 'name', 'manufacturer', 'model', 'miscIdentifier', 'amount', 'location', 'locationSpecifier', 'manager', 'date', 'category']));
});
it('should reject creation of duplicate state names', async () => {
const result = await equipmentDB.create({
...empty('CREATE'),
name: 'name',
manufacturer: 'manufacturer',
model: 'model',
amount: 1,
locationID: 'venue',
category: 'any',
assetID: 'abc1',
});
expect(result).toHaveLength(1);
expect(typeof (result[0]) === 'string').toBeTruthy();
await expect(equipmentDB.create({
...empty('CREATE'),
name: 'name',
manufacturer: 'manufacturer',
model: 'model',
amount: 1,
locationID: 'venue',
category: 'any',
assetID: 'abc1',
})).rejects.toThrowError('duplicate asset id');
});
}); |
import { Fragment, useContext, useEffect, useRef, useState } from "react"
import { useRouter } from "next/router"
import { Event, getAllLocalStorageItems, getRefValue, getRefValues, isTrue, preventDefault, refs, set_val, spreadArraysOrObjects, uploadFiles, useEventLoop } from "/utils/state"
import { EventLoopContext, initialEvents, StateContext } from "/utils/context.js"
import "focus-visible/dist/focus-visible"
import { Avatar, Box, Breadcrumb, BreadcrumbItem, Button, Drawer, DrawerBody, DrawerContent, DrawerHeader, DrawerOverlay, FormControl, Heading, HStack, Image, Input, Link, Menu, MenuButton, MenuDivider, MenuItem, MenuList, Modal, ModalBody, ModalContent, ModalFooter, ModalHeader, ModalOverlay, Text, useColorMode, VStack } from "@chakra-ui/react"
import { CloseIcon, DeleteIcon, HamburgerIcon } from "@chakra-ui/icons"
import NextLink from "next/link"
import { SpinningCircles } from "react-loading-icons"
import NextHead from "next/head"
export default function Component() {
const state = useContext(StateContext)
const router = useRouter()
const { colorMode, toggleColorMode } = useColorMode()
const focusRef = useRef();
// Main event loop.
const [addEvents, connectError] = useContext(EventLoopContext)
// Set focus to the specified element.
useEffect(() => {
if (focusRef.current) {
focusRef.current.focus();
}
})
// Route after the initial page hydration.
useEffect(() => {
const change_complete = () => addEvents(initialEvents.map((e) => ({...e})))
router.events.on('routeChangeComplete', change_complete)
return () => {
router.events.off('routeChangeComplete', change_complete)
}
}, [router])
const ref_question = useRef(null); refs['ref_question'] = ref_question;
return (
<Fragment>
<Fragment>
{isTrue(connectError !== null) ? (
<Fragment>
<Modal isOpen={connectError !== null}>
<ModalOverlay>
<ModalContent>
<ModalHeader>
{`Connection Error`}
</ModalHeader>
<ModalBody>
<Text>
{`Cannot connect to server: `}
{(connectError !== null) ? connectError.message : ''}
{`. Check if server is reachable at `}
{`http://localhost:8000`}
</Text>
</ModalBody>
</ModalContent>
</ModalOverlay>
</Modal>
</Fragment>
) : (
<Fragment/>
)}
</Fragment>
<VStack alignItems={`stretch`} spacing={`0`} sx={{"bg": "#111", "color": "#fff", "minH": "100vh", "alignItems": "stretch", "justifyContent": "space-between"}}>
<Box sx={{"bg": "#111", "backdropFilter": "auto", "backdropBlur": "lg", "p": "4", "borderBottom": "1px solid #fff3", "position": "sticky", "top": "0", "zIndex": "100"}}>
<HStack justify={`space-between`} sx={{"alignItems": "center", "justifyContent": "space-between"}}>
<HStack sx={{"alignItems": "center", "justifyContent": "space-between"}}>
<HamburgerIcon onClick={(_e) => addEvents([Event("state.toggle_drawer", {})], (_e))} sx={{"mr": 4, "cursor": "pointer"}}/>
<Link as={NextLink} href={`/`}>
<Box sx={{"p": "1", "borderRadius": "6", "bg": "#F0F0F0", "mr": "2"}}>
<Image src={`favicon.ico`} sx={{"width": 30, "height": "auto"}}/>
</Box>
</Link>
<Breadcrumb>
<BreadcrumbItem>
<Heading size={`sm`}>
{`ReflexGPT`}
</Heading>
</BreadcrumbItem>
<BreadcrumbItem>
<Text sx={{"size": "sm", "fontWeight": "normal"}}>
{state.current_chat}
</Text>
</BreadcrumbItem>
</Breadcrumb>
</HStack>
<HStack spacing={`8`} sx={{"alignItems": "center", "justifyContent": "space-between"}}>
<Button onClick={(_e) => addEvents([Event("state.toggle_modal", {})], (_e))} sx={{"bg": "#5535d4", "px": "4", "py": "2", "h": "auto", "shadow": "rgba(50, 50, 93, 0.25) 0px 50px 100px -20px, rgba(0, 0, 0, 0.3) 0px 30px 60px -30px, rgba(10, 37, 64, 0.35) 0px -2px 6px 0px inset;", "color": "#fff", "_hover": {"bg": "#4c2db3"}}}>
{`+ New chat`}
</Button>
<Menu sx={{"bg": "#111", "border": "red"}}>
<MenuButton>
<Avatar name={`User`} size={`md`} sx={{"shadow": "rgba(50, 50, 93, 0.25) 0px 50px 100px -20px, rgba(0, 0, 0, 0.3) 0px 30px 60px -30px, rgba(10, 37, 64, 0.35) 0px -2px 6px 0px inset;", "color": "#fff", "bg": "#fff3"}}/>
<Box/>
</MenuButton>
<MenuList sx={{"bg": "#111", "border": "1.5px solid #222"}}>
<MenuItem sx={{"bg": "#111", "color": "#fff"}}>
{`Help`}
</MenuItem>
<MenuDivider sx={{"border": "1px solid #222"}}/>
<MenuItem sx={{"bg": "#111", "color": "#fff"}}>
{`Settings`}
</MenuItem>
</MenuList>
</Menu>
</HStack>
</HStack>
</Box>
<VStack sx={{"py": "8", "flex": "1", "width": "100%", "maxW": "3xl", "paddingX": "4", "alignSelf": "center", "overflow": "hidden", "paddingBottom": "5em", "alignItems": "stretch", "justifyContent": "space-between"}}>
<Box>
{state.chats[state.current_chat].map((dctysvco, i) => (
<Box key={i} sx={{"width": "100%"}}>
<Box sx={{"textAlign": "right", "marginTop": "1em"}}>
<Text sx={{"bg": "#fff3", "shadow": "rgba(17, 12, 46, 0.15) 0px 48px 100px 0px;", "display": "inline-block", "p": "4", "borderRadius": "xl", "maxW": "30em"}}>
{dctysvco.question}
</Text>
</Box>
<Box sx={{"textAlign": "left", "paddingTop": "1em"}}>
<Text sx={{"bg": "#5535d4", "shadow": "rgba(17, 12, 46, 0.15) 0px 48px 100px 0px;", "display": "inline-block", "p": "4", "borderRadius": "xl", "maxW": "30em"}}>
{dctysvco.answer}
</Text>
</Box>
</Box>
))}
</Box>
</VStack>
<Box sx={{"position": "sticky", "bottom": "0", "left": "0", "py": "4", "backdropFilter": "auto", "backdropBlur": "lg", "borderTop": "1px solid #fff3", "alignItems": "stretch", "width": "100%"}}>
<VStack sx={{"width": "100%", "maxW": "3xl", "mx": "auto", "alignItems": "stretch", "justifyContent": "space-between"}}>
<Box as={`form`} onSubmit={(_e0) => addEvents([Event("state.process_question", {form_data:{"question": getRefValue(ref_question)}}),Event("_set_value", {ref:ref_question,value:""})], (_e0))} sx={{"width": "100%"}}>
<FormControl isDisabled={state.processing}>
<HStack sx={{"alignItems": "center", "justifyContent": "space-between"}}>
<Input id={`question`} placeholder={`Type something...`} ref={ref_question} sx={{"bg": "#222", "borderColor": "#fff3", "borderWidth": "1px", "p": "4", "_placeholder": {"color": "#fffa"}, "_hover": {"borderColor": "#5535d4"}}} type={`text`}/>
<Button sx={{"bg": "#222", "borderColor": "#fff3", "borderWidth": "1px", "p": "4", "_hover": {"bg": "#5535d4"}, "shadow": "rgba(50, 50, 93, 0.25) 0px 50px 100px -20px, rgba(0, 0, 0, 0.3) 0px 30px 60px -30px, rgba(10, 37, 64, 0.35) 0px -2px 6px 0px inset;", "color": "#fff"}} type={`submit`}>
<Fragment>
{isTrue(state.processing) ? (
<Fragment>
<SpinningCircles height={`1em`}/>
</Fragment>
) : (
<Fragment>
<Text>
{`Send`}
</Text>
</Fragment>
)}
</Fragment>
</Button>
</HStack>
</FormControl>
</Box>
<Text sx={{"fontSize": "xs", "color": "#fff6", "textAlign": "center"}}>
{`ReflexGPT may return factually incorrect or misleading responses. Use discretion.`}
</Text>
</VStack>
</Box>
<Drawer isOpen={state.drawer_open} placement={`left`}>
<DrawerOverlay>
<DrawerContent sx={{"bg": "#111", "color": "#fff", "opacity": "0.9"}}>
<DrawerHeader>
<HStack sx={{"alignItems": "center", "justifyContent": "space-between"}}>
<Text>
{`Chats`}
</Text>
<CloseIcon onClick={(_e) => addEvents([Event("state.toggle_drawer", {})], (_e))} sx={{"fontSize": "md", "color": "#fff8", "_hover": {"color": "#fff"}, "cursor": "pointer", "w": "8"}}/>
</HStack>
</DrawerHeader>
<DrawerBody>
<VStack alignItems={`stretch`} sx={{"alignItems": "stretch", "justifyContent": "space-between"}}>
{state.chat_titles.map((ofxmflzf, i) => (
<HStack key={i} sx={{"color": "#fff", "cursor": "pointer"}}>
<Box onClick={(_e) => addEvents([Event("state.set_chat", {chat_name:ofxmflzf})], (_e))} sx={{"border": "double 1px transparent;", "borderRadius": "10px;", "backgroundImage": "linear-gradient(#111, #111), radial-gradient(circle at top left, #5535d4,#4c2db3);", "backgroundOrigin": "border-box;", "backgroundClip": "padding-box, border-box;", "p": "2", "_hover": {"backgroundImage": "linear-gradient(#111, #111), radial-gradient(circle at top left, #5535d4,#6649D8);"}, "color": "#fff8", "flex": "1"}}>
{ofxmflzf}
</Box>
<Box sx={{"border": "double 1px transparent;", "borderRadius": "10px;", "backgroundImage": "linear-gradient(#111, #111), radial-gradient(circle at top left, #5535d4,#4c2db3);", "backgroundOrigin": "border-box;", "backgroundClip": "padding-box, border-box;", "p": "2", "_hover": {"backgroundImage": "linear-gradient(#111, #111), radial-gradient(circle at top left, #5535d4,#6649D8);"}}}>
<DeleteIcon onClick={(_e) => addEvents([Event("state.delete_chat", {})], (_e))} sx={{"fontSize": "md", "color": "#fff8", "_hover": {"color": "#fff"}, "cursor": "pointer", "w": "8"}}/>
</Box>
</HStack>
))}
</VStack>
</DrawerBody>
</DrawerContent>
</DrawerOverlay>
</Drawer>
<Modal isOpen={state.modal_open}>
<ModalOverlay>
<ModalContent sx={{"bg": "#222", "color": "#fff"}}>
<ModalHeader>
<HStack alignItems={`center`} justifyContent={`space-between`} sx={{"alignItems": "center", "justifyContent": "space-between"}}>
<Text>
{`Create new chat`}
</Text>
<CloseIcon onClick={(_e) => addEvents([Event("state.toggle_modal", {})], (_e))} sx={{"fontSize": "sm", "color": "#fff8", "_hover": {"color": "#fff"}, "cursor": "pointer"}}/>
</HStack>
</ModalHeader>
<ModalBody>
<Input onBlur={(_e0) => addEvents([Event("state.set_new_chat_name", {value:_e0.target.value})], (_e0))} placeholder={`Type something...`} sx={{"bg": "#222", "borderColor": "#fff3", "_placeholder": {"color": "#fffa"}}} type={`text`}/>
</ModalBody>
<ModalFooter>
<Button onClick={(_e) => addEvents([Event("state.create_chat", {}),Event("state.toggle_modal", {})], (_e))} sx={{"bg": "#5535d4", "boxShadow": "md", "px": "4", "py": "2", "h": "auto", "_hover": {"bg": "#4c2db3"}, "shadow": "rgba(50, 50, 93, 0.25) 0px 50px 100px -20px, rgba(0, 0, 0, 0.3) 0px 30px 60px -30px, rgba(10, 37, 64, 0.35) 0px -2px 6px 0px inset;", "color": "#fff"}}>
{`Create`}
</Button>
</ModalFooter>
</ModalContent>
</ModalOverlay>
</Modal>
</VStack>
<NextHead>
<title>
{`Reflex App`}
</title>
<meta content={`A Reflex app.`} name={`description`}/>
<meta content={`favicon.ico`} property={`og:image`}/>
</NextHead>
</Fragment>
)
} |
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>About Me Section</title>
<style>
body {
font-family: Arial, sans-serif;
margin: 0;
padding: 0;
background-color: #f4f4f9;
display: flex;
flex-wrap: wrap;
}
.about-photo {
width: 40%; /* Photo takes up 40% of the screen on large screens */
height: 100vh;
background: url('p1.jpg') no-repeat center center/cover;
background-size: cover;
object-fit: cover;
}
.about-content {
width: 60%; /* Content takes up 60% of the screen on large screens */
padding: 40px;
background: #fff;
box-sizing: border-box;
}
.about-content h2 {
font-size: 24px;
color: #333;
margin-bottom: 10px;
}
.about-content p {
font-size: 16px;
color: #666;
line-height: 1.6;
margin-bottom: 20px;
}
.skill-bar {
margin-bottom: 15px;
}
.skill-bar .label {
display: flex;
justify-content: space-between;
font-size: 14px;
color: #333;
margin-bottom: 5px;
}
.skill-bar .progress {
height: 10px;
background: #ddd;
border-radius: 5px;
overflow: hidden;
position: relative;
}
.skill-bar .progress .fill {
height: 100%;
background: #007bff;
width: 0;
border-radius: 5px;
animation: fill-animation 1.5s ease-in-out forwards;
}
@keyframes fill-animation {
from {
width: 0;
}
to {
width: var(--skill-level);
}
}
/* Mobile devices */
@media screen and (max-width: 768px) {
body {
display: block;
}
.about-photo {
width: 100%;
height: 200px; /* Adjust height for smaller screens */
object-fit: cover; /* Ensure the image fills the area without distortion */
}
.about-content {
width: 100%;
padding: 20px;
box-sizing: border-box;
}
}
/* Extra small mobile devices */
@media screen and (max-width: 480px) {
.about-photo {
height: 150px; /* Reduce the height further for very small screens */
}
.about-content h2 {
font-size: 20px;
}
.about-content p {
font-size: 14px;
}
}
</style>
</head>
<body>
<div class="about-photo"></div>
<div class="about-content">
<h2>About Me</h2>
<p>
Hello! I'm [Your Name], a passionate professional with expertise in [Your Profession/Field].
I love solving challenges and continuously improving my skills.
</p>
<div class="skill-bar">
<div class="label">
<span>HTML</span>
<span>90%</span>
</div>
<div class="progress">
<div class="fill" style="--skill-level: 90%;"></div>
</div>
</div>
<div class="skill-bar">
<div class="label">
<span>CSS</span>
<span>85%</span>
</div>
<div class="progress">
<div class="fill" style="--skill-level: 85%;"></div>
</div>
</div>
<div class="skill-bar">
<div class="label">
<span>JavaScript</span>
<span>75%</span>
</div>
<div class="progress">
<div class="fill" style="--skill-level: 75%;"></div>
</div>
</div>
</div>
</body>
</html> |
// Copyright 2020-2021 Signal Messenger, LLC
// SPDX-License-Identifier: AGPL-3.0-only
import * as React from 'react';
import 'react-quill/dist/quill.core.css';
import { boolean, select } from '@storybook/addon-knobs';
import { storiesOf } from '@storybook/react';
import { action } from '@storybook/addon-actions';
import { getDefaultConversation } from '../test-both/helpers/getDefaultConversation';
import type { Props } from './CompositionInput';
import { CompositionInput } from './CompositionInput';
import { setupI18n } from '../util/setupI18n';
import enMessages from '../../_locales/en/messages.json';
import { StorybookThemeContext } from '../../.storybook/StorybookThemeContext';
const i18n = setupI18n('en', enMessages);
const story = storiesOf('Components/CompositionInput', module);
const useProps = (overrideProps: Partial<Props> = {}): Props => ({
i18n,
disabled: boolean('disabled', overrideProps.disabled || false),
onSubmit: action('onSubmit'),
onEditorStateChange: action('onEditorStateChange'),
onTextTooLong: action('onTextTooLong'),
draftText: overrideProps.draftText || undefined,
draftBodyRanges: overrideProps.draftBodyRanges || [],
clearQuotedMessage: action('clearQuotedMessage'),
getPreferredBadge: () => undefined,
getQuotedMessage: action('getQuotedMessage'),
onPickEmoji: action('onPickEmoji'),
large: boolean('large', overrideProps.large || false),
sortedGroupMembers: overrideProps.sortedGroupMembers || [],
skinTone: select(
'skinTone',
{
skinTone0: 0,
skinTone1: 1,
skinTone2: 2,
skinTone3: 3,
skinTone4: 4,
skinTone5: 5,
},
overrideProps.skinTone || undefined
),
theme: React.useContext(StorybookThemeContext),
});
story.add('Default', () => {
const props = useProps();
return <CompositionInput {...props} />;
});
story.add('Large', () => {
const props = useProps({
large: true,
});
return <CompositionInput {...props} />;
});
story.add('Disabled', () => {
const props = useProps({
disabled: true,
});
return <CompositionInput {...props} />;
});
story.add('Starting Text', () => {
const props = useProps({
draftText: "here's some starting text",
});
return <CompositionInput {...props} />;
});
story.add('Multiline Text', () => {
const props = useProps({
draftText: `here's some starting text
and more on another line
and yet another line
and yet another line
and yet another line
and yet another line
and yet another line
and yet another line
and we're done`,
});
return <CompositionInput {...props} />;
});
story.add('Emojis', () => {
const props = useProps({
draftText: `😐😐😐😐😐😐😐
😐😐😐😐😐😐😐
😐😐😐😂😐😐😐
😐😐😐😐😐😐😐
😐😐😐😐😐😐😐`,
});
return <CompositionInput {...props} />;
});
story.add('Mentions', () => {
const props = useProps({
sortedGroupMembers: [
getDefaultConversation({
title: 'Kate Beaton',
}),
getDefaultConversation({
title: 'Parry Gripp',
}),
],
draftText: 'send _ a message',
draftBodyRanges: [
{
start: 5,
length: 1,
mentionUuid: '0',
replacementText: 'Kate Beaton',
},
],
});
return <CompositionInput {...props} />;
}); |
import numpy as np
import pandas as pd
import eq_parameters
import state_variables
def cauchy_function(t: float, y: np.array, constants: dict) -> np.array:
"""The derivative y'(t) = [V'(t), m'(t), h'(t), n'(t)] of the Cauchy's initial problem.
Args:
t (float): instante of time t
y (float): function y in the instant t
Returns:
float: the derivative of y'(t) = [V'(t), m'(t), h'(t), n'(t)] in the instante t
"""
# unpacking the parameters from the dictionary "constants"
I, C = constants['current'], constants['capacitance']
g_Na, g_K, g_L = constants['g_Na'], constants['g_K'], constants['g_L']
E_Na, E_K, E_L = constants['E_Na'], constants['E_K'], constants['E_L']
# Getting the values of V, m, h, n in the vector of y
V, m, h, n = y[0], y[1], y[2], y[3]
# computing the alpha and beta parameters of the model
alpha_m, beta_m = eq_parameters.alpha_m(V), eq_parameters.beta_m(V)
alpha_h, beta_h = eq_parameters.alpha_h(V), eq_parameters.beta_h(V)
alpha_n, beta_n = eq_parameters.alpha_n(V), eq_parameters.beta_n(V)
# print(V, I, C, m, h, n, g_Na, g_K, g_L, E_Na, E_K, E_L)
# evaluating the derivatives of each component
der_m = state_variables.der_m(alpha_m, beta_m, m)
der_h = state_variables.der_h(alpha_h, beta_h, h)
der_n = state_variables.der_n(alpha_n, beta_n, n)
der_v = state_variables.der_Voltage(V, I, C, m, h, n, constants)
der_y = np.array([der_v, der_m, der_h, der_n])
return der_y
def discretize_interval(interval: (float, float), n_steps: int) -> np.array:
"""Discretizes the interval [a, b], in to n points [t_0, t_1, t_2, t_3, ..., t_{n-1}],
evenly spaced. t_{k+1} = t_k + delta_t, where delta_t = (b-a)/n.
Args:
interval (float, float):
n_steps (int): number os steps
Returns:
np.array: the vector of the discretized interval [t_0, t_1, t_2, t_3, ..., t_{n-1}]
float: step size of the interval
"""
start, stop = interval[0], interval[1]
step_size = (stop - start)/n_steps
discrete_domain = np.linspace(start, stop, n_steps + 1) # +1 to include the stop point
return discrete_domain, step_size |
<template>
<v-row>
<v-col md="8" offset-md="2">
<admin-store></admin-store>
</v-col>
<v-col md="10" offset-md="1">
<v-data-table :headers="headers" :loading="loading" :items="admins" :items-per-page="10">
<template v-slot:item.state="{ item }">
{{ item.state.name }}
</template>
<template v-slot:item.role="{ item }">
<span v-if="item.role==1">Admin</span>
<span v-else-if="item.role==2">Editor</span>
<span v-else-if="item.role==3">Main Admin</span>
</template>
<template v-slot:item.actions="{ item }">
<v-icon color="blue darken-2" class="mr-2" @click="$router.push(`/admin/edit/${item.id}`)">mdi-pencil-outline</v-icon>
<v-icon color="red darken-2" class="mr-2" @click="deleteAdmin(item)">mdi-delete-outline</v-icon>
</template>
</v-data-table>
</v-col>
</v-row>
</template>
<script>
import AdminStore from "./components/AdminStore.vue";
import { mapState } from "vuex";
export default {
components: {
AdminStore
},
created() {
this.$store.dispatch("Admin/get");
},
computed: {
...mapState("Admin", ["admins"])
},
methods: {
deleteAdmin(item) {
this.$store.dispatch("Admin/delete", item.id);
}
},
data() {
return {
headers: [
{
text: "Name",
value: "name",
align: "center"
},
{
text: "Role",
value: "role",
align: "center"
},
{
text: "State",
value: "state",
align: "center"
},
{
text: "actions",
value: "actions",
align: "center"
}
]
};
}
};
</script>
<style>
</style> |
from NavigoPlatform.CommonBase.BasePage import BasePage
from selenium.webdriver.common.by import By
class CreateAirCraftPage(BasePage):
LOCAircraftTab = (By.XPATH, "//*[@id='root']/div/div[2]/div/div/div[1]/div[3]")
LOCAvailableAircraftTab = (By.XPATH, "//*[@id='root']/div/div[2]/div/div/div[2]/div/div/div[1]/div[2]")
LOCCreateNewAircraftBtn = (
By.XPATH, "//*[@id='root']/div/div[2]/div/div/div[2]/div/div/div[2]/div/div[2]/div[2]/button")
LOCAircraftMake = (By.ID, "make")
LOCAircraftModel = (By.ID, "model")
LOCAircraftTitle = (By.ID, "title")
LOCAircraftTail = (By.ID, "tail_number")
LOCAircraftWeight = (By.ID, "weight")
LOCAircraftLength = (By.ID, "length")
LOCAircraftSeatsTotal = (By.ID, "seats_total")
LOCAircraftNumberOfEngines = (By.ID, "number_of_engines")
LOCAircraftEngineType = (By.ID, "engine_type")
LOCAircraftMaxSpeed = (By.ID, "max_speed")
LOCAircraftMaxPayload = (By.ID, "maximum_payload")
LOCAirSchematicsBrowseFiles = (By.ID, "file-input")
LOCSeatAvailabiltyTab = (By.XPATH, "//*[@id='navigo.modal']/div/div[2]/div/div[1]/div[2]")
LOCBrowseSeatSelection = (By.ID, "file-input")
LOCSaveAircraftBtn = (By.XPATH, "//*[@id='navigo.modal']/div/div[3]/div[2]/button/span")
LOCVerifyAircraftName = (By.XPATH, "//*[@id='root']/div[1]/div[2]/div/div/div[2]/div/div/div[2]/div/div[1]/div[2]/div/table/tbody/tr[1]/td[1]/span")
#LOCVerifyAircraftName = (By.XPATH, "(//td[contains(@class,'px-3 py-3')]//span)[1]")
#LOCVerifyAircraftName = (By.XPATH, "(//span[contains(@class,'text-font font-roboto')])[2]")
Aircraft_title_name = " "
def __init__(self, driver):
super().__init__(driver)
def Click_on_Aircraft_Tab(self):
self.ClickElement(self.LOCAircraftTab)
def Click_on_Available_Aircraft_Tab(self):
self.ClickElement(self.LOCAvailableAircraftTab)
def Click_on_Create_New_Aircraft_Btn(self):
self.ClickElement(self.LOCCreateNewAircraftBtn)
def Fill_aircraft_details(self):
selected_aircraft_make = self.SelectFromDropDown(self.LOCAircraftMake)
selected_aircraft_model = self.SelectFromDropDown(self.LOCAircraftModel)
CreateAirCraftPage.Aircraft_title_name = f'{selected_aircraft_make}+{selected_aircraft_model}+automode'
print(f'{CreateAirCraftPage.Aircraft_title_name}')
self.InputElement(self.LOCAircraftTitle, f'{CreateAirCraftPage.Aircraft_title_name}')
tail_number = self.GenerateRandomNumber()
self.InputElement(self.LOCAircraftTail, tail_number)
self.InputElement(self.LOCAircraftWeight, "88888")
self.InputElement(self.LOCAircraftLength, "66666")
self.InputElement(self.LOCAircraftSeatsTotal, "450")
self.InputElement(self.LOCAircraftNumberOfEngines, "4")
self.InputElement(self.LOCAircraftEngineType, "automode dual type")
self.InputElement(self.LOCAircraftMaxSpeed, "555")
#self.InputElement(self.LOCAircraftMaxPayload, "999999")
def Upload_seat_schematics(self):
path_to_file = '/Users/harshithkumar/navigo-automate/NavigoPlatform/TestData/Seat_schematics.png'
self.UploadFile(self.LOCAirSchematicsBrowseFiles, path_to_file)
def Upload_seats(self):
path_to_file = '/Users/harshithkumar/navigo-automate/NavigoPlatform/TestData/75_seats.txt'
self.ClickElement(self.LOCSeatAvailabiltyTab)
self.UploadFile(self.LOCBrowseSeatSelection, path_to_file)
def Click_on_Save_Aircraft(self):
self.ClickElement(self.LOCSaveAircraftBtn)
def Verify_Created_aircraft(self):
if self.GetText(self.LOCVerifyAircraftName, CreateAirCraftPage.Aircraft_title_name):
expected_aircraft_name = self.GetElementText(self.LOCVerifyAircraftName)
print("Created this aircraft: ", CreateAirCraftPage.Aircraft_title_name)
print("Verifying this aircraft: ", expected_aircraft_name)
assert expected_aircraft_name == CreateAirCraftPage.Aircraft_title_name
else:
print(f'Created {CreateAirCraftPage.Aircraft_title_name} is not present') |
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Rendering;
using Microsoft.EntityFrameworkCore;
using MidStateShuttleService.Models;
using MidStateShuttleService.Service;
namespace MidStateShuttleService.Controllers
{
public class CheckInController : Controller
{
private readonly ApplicationDbContext _context;
private readonly ILogger<CheckInController> _logger;
// Inject ApplicationDbContext into the controller constructor
public CheckInController(ApplicationDbContext context, ILogger<CheckInController> logger)
{
_context = context; // Assign the injected ApplicationDbContext to the _context field
_logger = logger;
}
// GET: CheckInController/Create
[AllowAnonymous]
public ActionResult CheckIn()
{
LocationServices ls = new LocationServices(_context);
ViewBag.Locations = ls.GetAllEntities().Select(x => new SelectListItem { Text = x.Name, Value = x.LocationId.ToString() });
return View();
}
public ActionResult EditCheckIn(int id)
{
CheckInServices cs = new CheckInServices(_context);
CheckIn model = cs.GetEntityById(id);
LocationServices ls = new LocationServices(_context);
ViewBag.Locations = ls.GetAllEntities().Select(x => new SelectListItem { Text = x.Name, Value = x.LocationId.ToString() });
if (model == null)
return FailedCheckIn("Check In Not Found");
return View(model);
}
// POST: CheckInController/Create
[HttpPost]
[ValidateAntiForgeryToken]
[AllowAnonymous]
public ActionResult CheckIn(CheckIn checkIn)
{
//date
checkIn.Date = DateTime.Now;
CheckInServices cs = new CheckInServices(_context);
checkIn.IsActive = true;
cs.AddEntity(checkIn);
// Increment the check-in count in the session
int checkInCount = HttpContext.Session.GetInt32("CheckInCount") ?? 0;
checkInCount++;
HttpContext.Session.SetInt32("CheckInCount", checkInCount);
// The temp data which is used to display the modal after sending a form
HttpContext.Session.SetString("CheckInSuccess", "true");
TempData["CheckInSuccess"] = true;
return RedirectToAction("CheckIn");
}
[HttpPost]
[ValidateAntiForgeryToken]
public ActionResult EditCheckIn(CheckIn checkIn)
{
CheckInServices cs = new CheckInServices(_context);
if (checkIn == null)
return FailedCheckIn("Updates to check in could not be applied");
//not all values comming over from form
checkIn.IsActive = true;
cs.UpdateEntity(checkIn);
return RedirectToAction("Index", "Dashboard");
}
public ActionResult DeleteCheckIn(int id)
{
try
{
CheckInServices cs = new CheckInServices(_context);
CheckIn model = cs.GetEntityById(id);
if (model == null)
return FailedCheckIn("Check In could not be found");
model.IsActive = !model.IsActive; // Toggle IsActive value
cs.UpdateEntity(model); // Update the entity in the database
return RedirectToAction("Index", "Dashboard");
}
catch (Exception ex)
{
// Log the exception
LogEvents.LogSqlException(ex, (IWebHostEnvironment)_context);
_logger.LogError(ex, "An error occurred while toggling IsActive of the check in.");
// Optionally add a model error for displaying an error message to the user
ModelState.AddModelError("", "An unexpected error occurred while toggling IsActive of the check in, please try again.");
// Return the view with an error message or handle the error as required
return View();
}
}
public ActionResult Delete(int id)
{
return View();
}
[AllowAnonymous]
public ActionResult FailedCheckIn(string errorMessage)
{
ViewBag.ErrorMessage = errorMessage;
return View("FailedCheckIn");
}
}
} |
import * as React from 'react';
import {ChangeEvent, useCallback, useState} from 'react';
import {createRoot} from 'react-dom/client';
import {Provider} from 'react-redux';
import {Route, HashRouter, Link, Routes} from 'react-router-dom';
import Chooser from './components/Chooser';
import Footer from './components/Footer';
import store from './store';
import {initializeApp} from 'firebase/app';
import {getStorage, ref, uploadString} from 'firebase/storage';
// TODO: Add SDKs for Firebase products that you want to use
// https://firebase.google.com/docs/web/setup#available-libraries
// Your web app's Firebase configuration
const firebaseConfig = {
apiKey: 'AIzaSyCCon3Wo4SMiWssqt085aWufdgWL-PmS9A',
authDomain: 'lisc-90068.firebaseapp.com',
projectId: 'lisc-90068',
storageBucket: 'lisc-90068.appspot.com',
messagingSenderId: '718080118269',
appId: '1:718080118269:web:f79898d0f949add5c3dd94',
};
// Initialize Firebase
const app = initializeApp(firebaseConfig);
const storage = getStorage(app);
console.log(storage.app.name);
const storageRef = ref(storage, 'test.txt');
console.log('LiSc is open source! https://github.com/mpunkenhofer/lisc');
const New = () => <Link to="/new">New</Link>;
const Test = () => <h1>Test</h1>;
const NotFound = () => <p>Route not found.</p>;
const myRequest = new Request('flowers.jpg');
const Post = () => {
const [value, setValue] = useState('Hi');
const [title, setTitle] = useState('Title');
const onSubmit = useCallback(async () => {
console.log(title, value);
uploadString(storageRef, value).then(() => {
console.log('Uploaded a raw string!');
});
}, [title, value]);
return (
<>
<h1>Pastebin Test</h1>
<form onSubmit={onSubmit}>
<input
value={title}
onChange={(ev: ChangeEvent<HTMLInputElement>) =>
setTitle(ev.target.value)
}
/>
<textarea
value={value}
onChange={(ev: ChangeEvent<HTMLTextAreaElement>) =>
setValue(ev.target.value)
}
/>
<input type="submit" value="Submit" />
</form>
</>
);
};
const App = () => (
<HashRouter>
<Routes>
<Route path="/" element={<New />} />
<Route path="/test" element={<Test />} />
<Route path="/new" element={<Chooser />} />
<Route path="/post" element={<Post />} />
<Route path="*" element={<NotFound />} />
</Routes>
<Footer />
</HashRouter>
);
const container = document.getElementById('root') as HTMLElement;
const root = createRoot(container);
root.render(
<React.StrictMode>
<Provider store={store}>
<App />
</Provider>
</React.StrictMode>
);
// Demo: How to post to pastebin using their API
// fetch('https://pastebin.com/api/api_post.php', {
// method: 'POST',
// //mode: 'no-cors',
// headers: {
// 'Content-Type': 'application/x-www-form-urlencoded',
// //'Access-Control-Allow-Origin': '*',
// },
// body: new URLSearchParams({
// api_dev_key: '<pastebin api key>',
// api_option: 'paste',
// api_paste_name: title,
// api_paste_code: value,
// api_paste_expire_date: '10M',
// api_paste_private: '1',
// }),
// }).then(response => {
// response.text().then(json => console.log(json));
// }); |

# Fnord Discord Bot
GURPS[^1] Fnord Bot is a stand-alone, interactive, discord bot to work as game aid to [GURPS 4<sup>th</sup> Edition](http://www.sjgames.com/gurps) roleplaying game system.
## How to Use
If you're not familiar with running Node.js applications, this guide may not be suitable for you. However, it's not overly complicated, and you can find various tutorials on the internet to help you learn. Here's what you'll need:
A machine running 24/7 to keep your bot running. There are several free services that offer virtual machines, and I recommend using Oracle Cloud for this purpose.
An account with MongoDB.
A development environment with VSCode and Node.js installed on your machine.
If you have these prerequisites in place, you can proceed with the tutorial.
## Creating your bot and getting your token
1. Go to [Discord Developers Portal](https://discord.com/developers/docs/intro) and click on Application on Left site
2. Create a new application
3. Give a name, change the aplication image (this is not your bot yet!)
4. Now you need to get a token, if is not visible, click on reset token and copy it on some place to use it later and do not share this key! This key is something like this
```bash
MTE1MTI4Mjg0ODMwMTU4NDQ2Ng.GT8d6O.sQKmqYhXIrjNL07bDa3MiF6uv4mcOdkk0_OzF4
```
5. In settings go to Bot
- Here you need to choose yout bot name and avatar.
- If you want your bot public check PUBLIC BOT
- In Privileged Gateway Intents turn on: PRESENCE INTENT, SERVER MEMBER INTENT AND MESSAGE CONTENT INTENT.
- Save de changes
6. Now go to left side again in OAUTH2>URL Generator
- On Scopes select Bot
- On Bot permissions select Administrator.
- Scroll down the page and copy the url and paste it on your web browser. This is the url you need to invite your bot to your channel! Now you can see your bot running on your server!
Once you have your bot running we need to put some code on it!
## Installation
Make sure you have Node.js and npm installed on your machine.
1. Clone this repository:
```bash
git clone https://github.com/yourusername/yourbot.git
```
Navigate to the project folder:
2. Install the dependencies:
```bash
npm install
```
4. Create a mongodb account on
```bash
https://www.mongodb.com/
```
5. Create your dotenv file
Create a file ```.env``` under root folder follow this example:
```bash
DISCORD_TOKEN = discord token
GUILD_ID = discord channel id
CLIENT_ID = bot id
MONGODB_SRV = monngo url with login and passwd
```
6. Deploying Commands
Before running the bot, you need to deploy the commands to your Discord server. Execute the following command. This command will register all the commands defined for your bot on the Discord server.
```bash
node ./deploy_commands
```
7. Running the Bot
Now you're ready to start the bot:
```bash
node .
```
The bot is now up and running on your Discord server.
## Commands
Here are some of the commands available for use with this bot:
> Under Construction. I'll be listing it soon.
------------
[^1]: GURPS is a trademark of Steve Jackson Games, and its rules and art are copyrighted by Steve Jackson Games. All
rights are reserved by Steve Jackson Games. This game aid is the original creation of Richard A. Wilkes and is
released for free distribution, and not for resale, under the permissions granted in the
<a href="http://www.sjgames.com/general/online_policy.html">Steve Jackson Games Online Policy</a>. |
<template>
<nav class="app-topnav">
<div class="container">
<ul>
<!-- template标签可以看成一个元素,但是不解析成标签 -->
<template v-if="profile.token">
<li>
<RouterLink to="/member"><i class="iconfont icon-user"></i>{{ profile.account }}</RouterLink>
</li>
<li><a @click="logout" href="javascript:;">退出登录</a></li>
</template>
<template v-else>
<li><RouterLink to="/login">请先登录</RouterLink></li>
<li><a href="javascript:;">免费注册</a></li>
</template>
<li><RouterLink to="/member/order">我的订单</RouterLink></li>
<li><RouterLink to="/member">会员中心</RouterLink></li>
<li><a href="javascript:;">帮助中心</a></li>
<li><a href="javascript:;">关于我们</a></li>
<li>
<a href="javascript:;"><i class="iconfont icon-phone"></i>手机版</a>
</li>
</ul>
</div>
</nav>
</template>
<script>
import { computed } from 'vue'
import { useStore } from 'vuex'
import { useRouter } from 'vue-router'
export default {
name: 'AppTopnav',
setup() {
//获取用户的登录信息才能控制切换导航菜单
const store = useStore()
//使用state要用计算属性,否则不是响应式
const profile = computed(() => {
return store.state.user.profile
})
const router = useRouter()
const logout = () => {
// 清空用户信息
store.commit('user/setUser', {})
// 清空购物车
store.commit('user/setCartList', [])
router.push('/login')
}
return { profile, logout }
}
}
</script>
<style scoped lang="less">
.app-topnav {
background: #333;
ul {
display: flex;
height: 53px;
justify-content: flex-end;
align-items: center;
li {
a {
padding: 0 15px;
color: #cdcdcd;
line-height: 1;
display: inline-block;
i {
font-size: 14px;
margin-right: 2px;
}
&:hover {
color: @xtxColor;
}
}
//~选择器作用:选择当前选择器后面的所有元素
~ li {
a {
border-left: 2px solid #666;
}
}
}
}
}
</style> |
<template>
<!-- Navigation-->
<header className="header">
<nav class="navbar navbar-expand-lg navbar-light bg-light shadow fixed-top">
<div class="container px-4 px-lg-5">
<a class="navbar-brand .mr-3" href="#">UShop</a>
<button class="navbar-toggler" type="button" data-bs-toggle="collapse" data-bs-target="#navbarSupportedContent" aria-controls="navbarSupportedContent" aria-expanded="false" aria-label="Toggle navigation"><span class="navbar-toggler-icon"></span></button>
<div class="collapse navbar-collapse" id="navbarSupportedContent">
<ul class="navbar-nav ms-auto">
<li class="nav-item pl-5">
<router-link class="nav-link active" aria-current="page" to="/">Home</router-link>
</li>
<li class="nav-item">
<router-link class="nav-link" :to="{name: 'about'}">About</router-link>
</li>
<li class="nav-item">
<router-link class="nav-link" to="/new-product">New Product</router-link>
</li>
</ul>
<form class="d-flex">
<button class="btn btn-outline-dark" type="button" @click="toggleSideBar">
<i class="bi-cart-fill me-1"></i>
Cart
<span class="badge bg-dark text-white ms-1 rounded-pill"> {{ totalQuantity }}</span>
</button>
</form>
</div>
</div>
</nav>
<!-- Header-->
<header class="navbar navbar-light bg-light navbar-expand-md header-bg py-5">
<div class="container px-4 my-2">
<div class="text-center text-white">
<h1 class="display-4 text-center fw-bolder py-5">The best deals</h1>
</div>
</div>
</header>
<Router-view
:inventory = "inventory"
:addTo = "addToCart"
:addInv = "addInventory"
:removeInv = "removeInventory"
:remItem = "removeItem"
:updateInv = "updateInventory"
/>
<Sidebar
v-if="showSideBar"
:toggle = "toggleSideBar"
:cart = "cart"
:inventory = "inventory"
:remove = "removeItem"
/>
</header>
</template>
<script>
import Sidebar from '@/components/SideBar.vue'
// import inventory from '@/product.json'
import ProductDataService from '@/services/ProductDataService'
export default {
components: {
Sidebar
},
data: () => {
return {
showSideBar: false,
inventory: [],
cart: {}
}
},
methods: {
toggleSideBar () {
this.showSideBar = !this.showSideBar
},
addToCart (product, index) {
if (!this.cart[product]) this.cart[product] = 0
this.cart[product] += this.inventory[index].quantity
this.inventory[index].quantity = 0
},
removeItem (name) {
delete this.cart[name]
},
addInventory (product) {
this.inventory.push(product)
},
removeInventory (index) {
this.inventory.splice(index, 1)
},
updateInventory (index, data) {
this.inventory[index].name = data.name
this.inventory[index].photo = data.photo
this.inventory[index].price = data.price
this.inventory[index].description = data.description
this.inventory[index].type = data.type
}
},
computed: {
totalQuantity () {
return Object.values(this.cart).reduce((acc, curr) => {
return acc + curr
}, 0)
}
},
mounted () {
ProductDataService.getAll()
.then(response => {
this.inventory = response.data
console.log(response.data)
})
}
}
</script> |
import {
Controller,
Get,
HttpException,
HttpStatus,
Param,
} from '@nestjs/common';
import { CollectionService } from './collection.service';
@Controller('collection')
export class CollectionController {
constructor(private readonly collectionService: CollectionService) {}
@Get()
async getAllCollections() {
const collections = await this.collectionService.getAllCollections();
if (!collections) {
return new HttpException('No collections found', HttpStatus.NOT_FOUND);
}
return collections;
}
@Get('/creator/:id')
async getAllCollectionsForCreator(@Param() params) {
const collections =
await this.collectionService.getAllCollectionsForCreator(params.id);
if (!collections) {
return new HttpException('No collections found', HttpStatus.NOT_FOUND);
}
return collections;
}
} |
import { useState } from "react";
function RecipeForm() {
const [newRecipeName, setNewRecipeName] = useState("");
async function addRecipe() {
if (newRecipeName.trim() !== "") {
try {
const response = await fetch("http://localhost:8080/api/v1/recipes", {
method: "POST",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify({ title: newRecipeName }),
});
if (response.ok) {
setNewRecipeName(""); // Reset het invoerveld
// Voeg hier de logica toe om de lijst van recepten bij te werken
} else {
throw new Error("Network response was not ok");
}
} catch (error) {
console.error("There was a problem with the fetch operation:", error);
}
}
}
return (
<div className="mt-4">
<input
type="text"
className="py-2 px-4 rounded border focus:outline-none focus:border-blue-500"
placeholder="Enter recipe name"
value={newRecipeName}
onChange={(e) => setNewRecipeName(e.target.value)}
/>
<button
className="ml-2 py-2 px-4 bg-blue-500 text-white rounded"
onClick={addRecipe}
>
Add Recipe
</button>
</div>
);
}
export default RecipeForm; |
package org.cdlib.xtf.dynaXML;
/**
* Copyright (c) 2004, Regents of the University of California
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions are met:
*
* - Redistributions of source code must retain the above copyright notice,
* this list of conditions and the following disclaimer.
* - Redistributions in binary form must reproduce the above copyright notice,
* this list of conditions and the following disclaimer in the documentation
* and/or other materials provided with the distribution.
* - Neither the name of the University of California nor the names of its
* contributors may be used to endorse or promote products derived from this
* software without specific prior written permission.
*
* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
* AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
* LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
* CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
* SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
* INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
* CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
* ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
* POSSIBILITY OF SUCH DAMAGE.
*/
/**
* This exception is thrown when a requestor fails authentication (ie has
* the wrong password, IP address, etc).
*/
class NoPermissionException extends DynaXMLException
{
/**
* Constructor taking an IP address
*
* @param ipAddr The IP address of the requestor
*/
public NoPermissionException(String ipAddr) {
super("Permission denied");
set("ipAddr", ipAddr);
}
/**
* Constructor that only takes a 'cause'. This is used, for example, when
* an LDAP authentication attempt fails due to a communication error.
*
* @param cause The exception that caused this exception.
*/
public NoPermissionException(Throwable cause) {
super("Permission denied", cause);
}
/** Default constructor */
public NoPermissionException() {
super("Permission denied");
}
/** This particular exception isn't really severe enough to log */
public boolean isSevere() {
return false;
}
} // class NoPermissionException |
'use client';
import { Tooltip } from 'react-tooltip'
import { IconType } from "react-icons";
// import { randomUUID } from 'crypto';
import { v4 as uuidv4 } from 'uuid';
import { toast } from 'react-hot-toast';
interface ButtonProps {
label?: string;
onClick: (e: React.MouseEvent<HTMLButtonElement>) => void;
disabled?: boolean;
outline?: boolean;
small?: boolean;
icon?: IconType;
border?: boolean;
IsActive?: boolean;
size?: number;
IsBan?: boolean;
labelsize?: number;
responsive?: boolean;
type?: boolean;
showLabeL?: boolean
}
const Button: React.FC<ButtonProps> = ({
label,
labelsize,
onClick,
disabled,
outline,
small,
icon: Icon,
border,
IsActive,
type,
size,
IsBan,
responsive,
showLabeL
}) => {
const _labelsise: string | undefined = labelsize ? 'text-' + labelsize.toString() : undefined
const _id = uuidv4();
return (
<button
type={type ? 'submit' : 'button'}
disabled={disabled}
onClick={onClick}
data-tooltip-id={_id}
data-tooltip-content={label}
className={`
relative
flex
gap-2
disabled:opacity-70
disabled:cursor-not-allowed
rounded-lg
hover:opacity-80
capitalize
transition
items-center
w-max
px-2 z-10
${outline ? ' bg-transparent' : 'bg-rose-500'}
${border ? outline ? 'border-black' : 'border-rose-500' : ''}
${IsActive ? ' text-secondary' : IsBan ? ' text-isban' : 'text-white'}
${small ? 'text-sm' : 'text-md text-[#FFFFFF]'}
${small ? 'py-1' : 'py-3'}
${small ? 'font-light' : 'font-semibold'}
${border ? small ? 'border-[1px]' : 'border-2' : ''}
`}
>
{Icon && (<Icon size={size ? size : 24} />)}
{showLabeL && <span className={` ${responsive ? 'hidden sm:flex' : ''} ${_labelsise && _labelsise} `}> {label && label}</span>}
{label && responsive&&<Tooltip opacity={1} className='z-[37] bg-black' id={_id} place="left" />}
</button>
);
}
export default Button; |
# application support center - iOS Demo App

This is a simple but complete iOS app which includes the asc SDK to demonstrate the integration between the app and server. If configured correctly, the app will display the help pages, app contacts announcements and release notes from the admin panel.
## Components
Component included in this repo:
- <a href='https://github.com/SAP/application-support-center'>asc Core</a>
- asc iOS Demo App & SDK
## Installation
- Download/Fork this repo, unzip and open the project
## Usage
1. Import the SDK into your project and create a instance of the InfoCenter
```
let infoCenter = ASCInfoCenter()
```
2. Initialize the InfoCenter. This will perform a new content comparison and fetch data from the server. The appId and accessToken are displayed in the Admin UI.
```
override func viewDidLoad() {
super.viewDidLoad()
infoCenter.initHelp(appId: "789", serverURL: "http://localhost:5001/api/v1", accessToken: "xyz123")
}
```
3. Add a IBAction to your existing application to show the ASC InfoCenter popup.
```
@IBAction func showHelpCenter(_ sender: Any) {
infoCenter.showHelp(in: self)
}
```
4. Optional: Run a manual check for content changes. This method will query the ASC Server to check and download any changes. This could be called on a lifecycle event such as ```applicationWillEnterForeground(_:)```
```
infoCenter.checkContentVersion()
```
5. Optional: The InfoCenter can be opened directly in one of the views if you do not want to display the main page.
```
infoCenter.showHelp(View: "Release Notes", in: self) //"Help" || "Support" || "Announcements"
``` |
import pandas as pd
import numpy as np
import os
from keras.preprocessing.text import Tokenizer
from keras.preprocessing.sequence import pad_sequences
from keras.models import Sequential
from keras.layers.embeddings import Embedding
from keras.layers.core import Dense, initializers, Dropout, Masking
from keras.layers.recurrent import GRU
from keras.optimizers import SGD, Adagrad, Adam
from sklearn.model_selection import train_test_split
from keras import backend as K
import tensorflow as tf
from keras.backend.tensorflow_backend import set_session
config = tf.ConfigProto()
config.gpu_options.per_process_gpu_memory_fraction = 0.8
set_session(tf.Session(config=config))
# import os
# os.environ["CUDA_DEVICE_ORDER"] = "PCI_BUS_ID" # see issue #152
# os.environ["CUDA_VISIBLE_DEVICES"] = ""
def multiply_loss(y_true, y_pred):
return -K.mean(y_true * K.log(y_pred) + (1 - y_true) * K.log(1 - y_pred), axis=-1)
DIR = 'D:/github/Kaggle/Toxic_Comment_Classification_Challenge'
def gru_drop(input_dim, input_length=100, output_dim=200, label_n=6):
'''
:param input_dim: 字典长度,即onehot的长度
:param input_length: 文本长度
:param output_dim: 词向量长度
:return:
'''
model = Sequential()
model.add(Embedding(input_dim=input_dim + 1,
input_length=input_length,
output_dim=output_dim,
mask_zero=0))
model.add(Masking(mask_value=0))
model.add(GRU(units=64,
activation='tanh',
recurrent_activation='hard_sigmoid',
return_sequences=False))
model.add(Dense(units=128,
activation='relu'))
model.add(Dropout(0.20))
model.add(Dense(units=label_n,
activation='sigmoid'))
optimizer = Adagrad(lr=0.01)
model.compile(optimizer=optimizer, loss=multiply_loss, metrics=['accuracy'])
return model
train_data = pd.read_csv('./data/train.csv')
train_label = np.array(train_data.iloc[:, 2:])
tokenizer = Tokenizer()
tokenizer.fit_on_texts(texts=train_data.iloc[:, 1])
train_data_seq = tokenizer.texts_to_sequences(texts=train_data.iloc[:, 1])
train_data_new = pad_sequences(train_data_seq, maxlen=500, padding='post', value=0, dtype='float32')
# np.save('./data_seq/train_data_new.npy',train_data_seq)
test_data = pd.read_csv('./data/test.csv')
test_data_seq = tokenizer.texts_to_sequences(texts=test_data.iloc[:, 1])
test_data_new = pad_sequences(test_data_seq, maxlen=500, padding='post', value=0, dtype='float32')
# np.save('./data_seq/test_data_new.npy',train_data_seq)
input_dim = len(tokenizer.word_index)
model_gru_drop=gru_drop(input_dim=input_dim, input_length=500, output_dim=100, label_n=6)
samples=159571
batch_size=500
epochs=1
train_train_x, train_test_x, train_train_y, train_test_y = train_test_split(train_data_new[0:samples],
train_label[0:samples],
test_size=0.2)
np.save('./data_seq/train_train_x.npy',train_train_x)
np.save('./data_seq/train_test_x.npy',train_test_x)
np.save('./data_seq/train_train_y.npy',train_train_y)
np.save('./data_seq/train_test_y.npy',train_test_y)
model_gru_drop.fit(x=train_train_x, y=train_train_y,
validation_data=[train_test_x,train_test_y],
batch_size=batch_size, epochs=epochs,verbose=1)
model_gru_drop.save('./models/gru_drop_500_%d_%d_%d.h5' % (samples, batch_size, 2))
test_label=model_gru_drop.predict(test_data_new)
test_predict=pd.DataFrame(test_label,columns=train_data.columns[2:])
test_result=pd.concat([test_data.iloc[:,0:1],test_predict],axis=1)
test_result.to_csv('./result/gru_drop_500_%d_%d_%d.csv'%(samples,batch_size,epochs),index=False) |
/* -*- Mode: C; tab-width: 4; indent-tabs-mode: nil; c-basic-offset: 4 -*- */
/*
* This program is free software; you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation; either version 2 of the License, or
* (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details:
*
* Copyright (C) 2012 Google, Inc.
*/
#include <stdio.h>
#include <stdlib.h>
#include <libqmi-glib.h>
#include <ModemManager.h>
#include <mm-errors-types.h>
#include "mm-port-qmi.h"
#include "mm-log.h"
G_DEFINE_TYPE (MMPortQmi, mm_port_qmi, MM_TYPE_PORT)
typedef struct {
QmiService service;
QmiClient *client;
MMPortQmiFlag flag;
} ServiceInfo;
struct _MMPortQmiPrivate {
gboolean opening;
QmiDevice *qmi_device;
GList *services;
gboolean llp_is_raw_ip;
};
/*****************************************************************************/
QmiClient *
mm_port_qmi_peek_client (MMPortQmi *self,
QmiService service,
MMPortQmiFlag flag)
{
GList *l;
for (l = self->priv->services; l; l = g_list_next (l)) {
ServiceInfo *info = l->data;
if (info->service == service &&
info->flag == flag)
return info->client;
}
return NULL;
}
QmiClient *
mm_port_qmi_get_client (MMPortQmi *self,
QmiService service,
MMPortQmiFlag flag)
{
QmiClient *client;
client = mm_port_qmi_peek_client (self, service, flag);
return (client ? g_object_ref (client) : NULL);
}
/*****************************************************************************/
typedef struct {
MMPortQmi *self;
GSimpleAsyncResult *result;
ServiceInfo *info;
} AllocateClientContext;
static void
allocate_client_context_complete_and_free (AllocateClientContext *ctx)
{
g_simple_async_result_complete (ctx->result);
if (ctx->info) {
g_assert (ctx->info->client == NULL);
g_free (ctx->info);
}
g_object_unref (ctx->result);
g_object_unref (ctx->self);
g_free (ctx);
}
gboolean
mm_port_qmi_allocate_client_finish (MMPortQmi *self,
GAsyncResult *res,
GError **error)
{
return !g_simple_async_result_propagate_error (G_SIMPLE_ASYNC_RESULT (res), error);
}
static void
allocate_client_ready (QmiDevice *qmi_device,
GAsyncResult *res,
AllocateClientContext *ctx)
{
GError *error = NULL;
ctx->info->client = qmi_device_allocate_client_finish (qmi_device, res, &error);
if (!ctx->info->client) {
g_prefix_error (&error,
"Couldn't create client for service '%s': ",
qmi_service_get_string (ctx->info->service));
g_simple_async_result_take_error (ctx->result, error);
} else {
g_simple_async_result_set_op_res_gboolean (ctx->result, TRUE);
/* Move the service info to our internal list */
ctx->self->priv->services = g_list_prepend (ctx->self->priv->services, ctx->info);
ctx->info = NULL;
}
allocate_client_context_complete_and_free (ctx);
}
void
mm_port_qmi_allocate_client (MMPortQmi *self,
QmiService service,
MMPortQmiFlag flag,
GCancellable *cancellable,
GAsyncReadyCallback callback,
gpointer user_data)
{
AllocateClientContext *ctx;
if (!!mm_port_qmi_peek_client (self, service, flag)) {
g_simple_async_report_error_in_idle (G_OBJECT (self),
callback,
user_data,
MM_CORE_ERROR,
MM_CORE_ERROR_EXISTS,
"Client for service '%s' already allocated",
qmi_service_get_string (service));
return;
}
ctx = g_new0 (AllocateClientContext, 1);
ctx->self = g_object_ref (self);
ctx->result = g_simple_async_result_new (G_OBJECT (self),
callback,
user_data,
mm_port_qmi_allocate_client);
ctx->info = g_new0 (ServiceInfo, 1);
ctx->info->service = service;
ctx->info->flag = flag;
qmi_device_allocate_client (self->priv->qmi_device,
service,
QMI_CID_NONE,
10,
cancellable,
(GAsyncReadyCallback)allocate_client_ready,
ctx);
}
/*****************************************************************************/
gboolean
mm_port_qmi_llp_is_raw_ip (MMPortQmi *self)
{
return self->priv->llp_is_raw_ip;
}
/*****************************************************************************/
typedef enum {
PORT_OPEN_STEP_FIRST,
PORT_OPEN_STEP_CHECK_OPENING,
PORT_OPEN_STEP_CHECK_ALREADY_OPEN,
PORT_OPEN_STEP_DEVICE_NEW,
PORT_OPEN_STEP_OPEN_WITHOUT_DATA_FORMAT,
PORT_OPEN_STEP_GET_KERNEL_DATA_FORMAT,
PORT_OPEN_STEP_ALLOCATE_WDA_CLIENT,
PORT_OPEN_STEP_GET_WDA_DATA_FORMAT,
PORT_OPEN_STEP_CHECK_DATA_FORMAT,
PORT_OPEN_STEP_SET_KERNEL_DATA_FORMAT,
PORT_OPEN_STEP_OPEN_WITH_DATA_FORMAT,
PORT_OPEN_STEP_LAST
} PortOpenStep;
typedef struct {
MMPortQmi *self;
GSimpleAsyncResult *result;
GCancellable *cancellable;
QmiDevice *device;
QmiClient *wda;
GError *error;
PortOpenStep step;
gboolean set_data_format;
QmiDeviceExpectedDataFormat kernel_data_format;
QmiWdaLinkLayerProtocol llp;
} PortOpenContext;
static void
port_open_context_complete_and_free (PortOpenContext *ctx)
{
g_simple_async_result_complete_in_idle (ctx->result);
if (ctx->wda) {
g_assert (ctx->device);
qmi_device_release_client (ctx->device,
ctx->wda,
QMI_DEVICE_RELEASE_CLIENT_FLAGS_RELEASE_CID,
3, NULL, NULL, NULL);
g_object_unref (ctx->wda);
}
if (ctx->device)
g_object_unref (ctx->device);
if (ctx->cancellable)
g_object_unref (ctx->cancellable);
g_object_unref (ctx->result);
g_object_unref (ctx->self);
g_slice_free (PortOpenContext, ctx);
}
gboolean
mm_port_qmi_open_finish (MMPortQmi *self,
GAsyncResult *res,
GError **error)
{
return !g_simple_async_result_propagate_error (G_SIMPLE_ASYNC_RESULT (res), error);
}
static void port_open_context_step (PortOpenContext *ctx);
static void
qmi_device_open_second_ready (QmiDevice *qmi_device,
GAsyncResult *res,
PortOpenContext *ctx)
{
qmi_device_open_finish (qmi_device, res, &ctx->error);
/* In both error and success, we go to last step */
ctx->step = PORT_OPEN_STEP_LAST;
port_open_context_step (ctx);
}
static void
get_data_format_ready (QmiClientWda *client,
GAsyncResult *res,
PortOpenContext *ctx)
{
QmiMessageWdaGetDataFormatOutput *output;
output = qmi_client_wda_get_data_format_finish (client, res, NULL);
if (!output ||
!qmi_message_wda_get_data_format_output_get_result (output, NULL) ||
!qmi_message_wda_get_data_format_output_get_link_layer_protocol (output, &ctx->llp, NULL))
/* If loading WDA data format fails, fallback to 802.3 requested via CTL */
ctx->step = PORT_OPEN_STEP_OPEN_WITH_DATA_FORMAT;
else
/* Go on to next step */
ctx->step++;
if (output)
qmi_message_wda_get_data_format_output_unref (output);
port_open_context_step (ctx);
}
static void
allocate_client_wda_ready (QmiDevice *device,
GAsyncResult *res,
PortOpenContext *ctx)
{
ctx->wda = qmi_device_allocate_client_finish (device, res, NULL);
if (!ctx->wda) {
/* If no WDA supported, then we just fallback to reopening explicitly
* requesting 802.3 in the CTL service. */
ctx->step = PORT_OPEN_STEP_OPEN_WITH_DATA_FORMAT;
port_open_context_step (ctx);
return;
}
/* Go on to next step */
ctx->step++;
port_open_context_step (ctx);
}
static void
qmi_device_open_first_ready (QmiDevice *qmi_device,
GAsyncResult *res,
PortOpenContext *ctx)
{
if (!qmi_device_open_finish (qmi_device, res, &ctx->error))
/* Error opening the device */
ctx->step = PORT_OPEN_STEP_LAST;
else if (!ctx->set_data_format)
/* If not setting data format, we're done */
ctx->step = PORT_OPEN_STEP_LAST;
else
/* Go on to next step */
ctx->step++;
port_open_context_step (ctx);
}
static void
qmi_device_new_ready (GObject *unused,
GAsyncResult *res,
PortOpenContext *ctx)
{
/* Store the device in the context until the operation is fully done,
* so that we return IN_PROGRESS errors until we finish this async
* operation. */
ctx->device = qmi_device_new_finish (res, &ctx->error);
if (!ctx->device)
/* Error creating the device */
ctx->step = PORT_OPEN_STEP_LAST;
else
/* Go on to next step */
ctx->step++;
port_open_context_step (ctx);
}
static void
port_open_context_step (PortOpenContext *ctx)
{
switch (ctx->step) {
case PORT_OPEN_STEP_FIRST:
mm_dbg ("Opening QMI device...");
ctx->step++;
/* Fall down to next step */
case PORT_OPEN_STEP_CHECK_OPENING:
mm_dbg ("Checking if QMI device already opening...");
if (ctx->self->priv->opening) {
g_simple_async_result_set_error (ctx->result,
MM_CORE_ERROR,
MM_CORE_ERROR_IN_PROGRESS,
"QMI device already being opened");
port_open_context_complete_and_free (ctx);
return;
}
ctx->step++;
/* Fall down to next step */
case PORT_OPEN_STEP_CHECK_ALREADY_OPEN:
mm_dbg ("Checking if QMI device already open...");
if (ctx->self->priv->qmi_device) {
g_simple_async_result_set_op_res_gboolean (ctx->result, TRUE);
port_open_context_complete_and_free (ctx);
return;
}
ctx->step++;
/* Fall down to next step */
case PORT_OPEN_STEP_DEVICE_NEW: {
GFile *file;
gchar *fullpath;
fullpath = g_strdup_printf ("/dev/%s", mm_port_get_device (MM_PORT (ctx->self)));
file = g_file_new_for_path (fullpath);
/* We flag in this point that we're opening. From now on, if we stop
* for whatever reason, we should clear this flag. We do this by ensuring
* that all callbacks go through the LAST step for completing. */
ctx->self->priv->opening = TRUE;
mm_dbg ("Creating QMI device...");
qmi_device_new (file,
ctx->cancellable,
(GAsyncReadyCallback) qmi_device_new_ready,
ctx);
g_free (fullpath);
g_object_unref (file);
return;
}
case PORT_OPEN_STEP_OPEN_WITHOUT_DATA_FORMAT:
/* Now open the QMI device without any data format CTL flag */
mm_dbg ("Opening device without data format update...");
qmi_device_open (ctx->device,
(QMI_DEVICE_OPEN_FLAGS_VERSION_INFO |
QMI_DEVICE_OPEN_FLAGS_PROXY),
10,
ctx->cancellable,
(GAsyncReadyCallback) qmi_device_open_first_ready,
ctx);
return;
case PORT_OPEN_STEP_GET_KERNEL_DATA_FORMAT:
mm_dbg ("Querying kernel data format...");
/* Try to gather expected data format from the sysfs file */
ctx->kernel_data_format = qmi_device_get_expected_data_format (ctx->device, NULL);
/* If data format cannot be retrieved, we fallback to 802.3 via CTL */
if (ctx->kernel_data_format == QMI_DEVICE_EXPECTED_DATA_FORMAT_UNKNOWN) {
ctx->step = PORT_OPEN_STEP_OPEN_WITH_DATA_FORMAT;
port_open_context_step (ctx);
return;
}
ctx->step++;
/* Fall down to next step */
case PORT_OPEN_STEP_ALLOCATE_WDA_CLIENT:
/* Allocate WDA client */
mm_dbg ("Allocating WDA client...");
qmi_device_allocate_client (ctx->device,
QMI_SERVICE_WDA,
QMI_CID_NONE,
10,
ctx->cancellable,
(GAsyncReadyCallback) allocate_client_wda_ready,
ctx);
return;
case PORT_OPEN_STEP_GET_WDA_DATA_FORMAT:
/* If we have WDA client, query current data format */
g_assert (ctx->wda);
mm_dbg ("Querying device data format...");
qmi_client_wda_get_data_format (QMI_CLIENT_WDA (ctx->wda),
NULL,
10,
ctx->cancellable,
(GAsyncReadyCallback) get_data_format_ready,
ctx);
return;
case PORT_OPEN_STEP_CHECK_DATA_FORMAT:
/* We now have the WDA data format and the kernel data format, if they're
* equal, we're done */
mm_dbg ("Checking data format: kernel %s, device %s",
qmi_device_expected_data_format_get_string (ctx->kernel_data_format),
qmi_wda_link_layer_protocol_get_string (ctx->llp));
if (ctx->kernel_data_format == QMI_DEVICE_EXPECTED_DATA_FORMAT_802_3 &&
ctx->llp == QMI_WDA_LINK_LAYER_PROTOCOL_802_3) {
ctx->self->priv->llp_is_raw_ip = FALSE;
ctx->step = PORT_OPEN_STEP_LAST;
port_open_context_step (ctx);
return;
}
if (ctx->kernel_data_format == QMI_DEVICE_EXPECTED_DATA_FORMAT_RAW_IP &&
ctx->llp == QMI_WDA_LINK_LAYER_PROTOCOL_RAW_IP) {
ctx->self->priv->llp_is_raw_ip = TRUE;
ctx->step = PORT_OPEN_STEP_LAST;
port_open_context_step (ctx);
return;
}
ctx->step++;
/* Fall down to next step */
case PORT_OPEN_STEP_SET_KERNEL_DATA_FORMAT:
/* Update the data format to be expected by the kernel */
mm_dbg ("Updating kernel data format: %s", qmi_wda_link_layer_protocol_get_string (ctx->llp));
if (ctx->llp == QMI_WDA_LINK_LAYER_PROTOCOL_802_3) {
ctx->kernel_data_format = QMI_DEVICE_EXPECTED_DATA_FORMAT_802_3;
ctx->self->priv->llp_is_raw_ip = FALSE;
} else if (ctx->llp == QMI_WDA_LINK_LAYER_PROTOCOL_RAW_IP) {
ctx->kernel_data_format = QMI_DEVICE_EXPECTED_DATA_FORMAT_RAW_IP;
ctx->self->priv->llp_is_raw_ip = TRUE;
} else
g_assert_not_reached ();
/* Regardless of the output, we're done after this action */
qmi_device_set_expected_data_format (ctx->device,
ctx->kernel_data_format,
&ctx->error);
ctx->step = PORT_OPEN_STEP_LAST;
port_open_context_step (ctx);
return;
case PORT_OPEN_STEP_OPEN_WITH_DATA_FORMAT:
/* Need to reopen setting 802.3 using CTL */
mm_dbg ("Closing device to reopen it right away...");
if (!qmi_device_close (ctx->device, &ctx->error)) {
mm_warn ("Couldn't close QMI device to reopen it");
ctx->step = PORT_OPEN_STEP_LAST;
port_open_context_step (ctx);
return;
}
mm_dbg ("Reopening device with data format...");
qmi_device_open (ctx->device,
(QMI_DEVICE_OPEN_FLAGS_VERSION_INFO |
QMI_DEVICE_OPEN_FLAGS_PROXY |
QMI_DEVICE_OPEN_FLAGS_NET_802_3 |
QMI_DEVICE_OPEN_FLAGS_NET_NO_QOS_HEADER),
10,
ctx->cancellable,
(GAsyncReadyCallback) qmi_device_open_second_ready,
ctx);
return;
case PORT_OPEN_STEP_LAST:
mm_dbg ("QMI port open operation finished");
/* Reset opening flag */
ctx->self->priv->opening = FALSE;
if (ctx->error) {
/* Propagate error */
if (ctx->device)
qmi_device_close (ctx->device, NULL);
g_simple_async_result_take_error (ctx->result, ctx->error);
ctx->error = NULL;
} else {
/* Store device in private info */
g_assert (ctx->device);
g_assert (!ctx->self->priv->qmi_device);
ctx->self->priv->qmi_device = g_object_ref (ctx->device);
g_simple_async_result_set_op_res_gboolean (ctx->result, TRUE);
}
port_open_context_complete_and_free (ctx);
return;
}
}
void
mm_port_qmi_open (MMPortQmi *self,
gboolean set_data_format,
GCancellable *cancellable,
GAsyncReadyCallback callback,
gpointer user_data)
{
PortOpenContext *ctx;
g_return_if_fail (MM_IS_PORT_QMI (self));
ctx = g_slice_new0 (PortOpenContext);
ctx->self = g_object_ref (self);
ctx->step = PORT_OPEN_STEP_FIRST;
ctx->set_data_format = set_data_format;
ctx->kernel_data_format = QMI_DEVICE_EXPECTED_DATA_FORMAT_UNKNOWN;
ctx->llp = QMI_WDA_LINK_LAYER_PROTOCOL_UNKNOWN;
ctx->result = g_simple_async_result_new (G_OBJECT (self),
callback,
user_data,
mm_port_qmi_open);
ctx->cancellable = cancellable ? g_object_ref (cancellable) : NULL;
port_open_context_step (ctx);
}
gboolean
mm_port_qmi_is_open (MMPortQmi *self)
{
g_return_val_if_fail (MM_IS_PORT_QMI (self), FALSE);
return !!self->priv->qmi_device;
}
void
mm_port_qmi_close (MMPortQmi *self)
{
GList *l;
GError *error = NULL;
g_return_if_fail (MM_IS_PORT_QMI (self));
if (!self->priv->qmi_device)
return;
/* Release all allocated clients */
for (l = self->priv->services; l; l = g_list_next (l)) {
ServiceInfo *info = l->data;
mm_dbg ("Releasing client for service '%s'...", qmi_service_get_string (info->service));
qmi_device_release_client (self->priv->qmi_device,
info->client,
QMI_DEVICE_RELEASE_CLIENT_FLAGS_RELEASE_CID,
3, NULL, NULL, NULL);
g_clear_object (&info->client);
}
g_list_free_full (self->priv->services, (GDestroyNotify)g_free);
self->priv->services = NULL;
/* Close and release the device */
if (!qmi_device_close (self->priv->qmi_device, &error)) {
mm_warn ("Couldn't properly close QMI device: %s",
error->message);
g_error_free (error);
}
g_clear_object (&self->priv->qmi_device);
}
/*****************************************************************************/
MMPortQmi *
mm_port_qmi_new (const gchar *name)
{
return MM_PORT_QMI (g_object_new (MM_TYPE_PORT_QMI,
MM_PORT_DEVICE, name,
MM_PORT_SUBSYS, MM_PORT_SUBSYS_USB,
MM_PORT_TYPE, MM_PORT_TYPE_QMI,
NULL));
}
static void
mm_port_qmi_init (MMPortQmi *self)
{
self->priv = G_TYPE_INSTANCE_GET_PRIVATE (self, MM_TYPE_PORT_QMI, MMPortQmiPrivate);
}
static void
dispose (GObject *object)
{
MMPortQmi *self = MM_PORT_QMI (object);
GList *l;
/* Deallocate all clients */
for (l = self->priv->services; l; l = g_list_next (l)) {
ServiceInfo *info = l->data;
if (info->client)
g_object_unref (info->client);
}
g_list_free_full (self->priv->services, (GDestroyNotify)g_free);
self->priv->services = NULL;
/* Clear device object */
g_clear_object (&self->priv->qmi_device);
G_OBJECT_CLASS (mm_port_qmi_parent_class)->dispose (object);
}
static void
mm_port_qmi_class_init (MMPortQmiClass *klass)
{
GObjectClass *object_class = G_OBJECT_CLASS (klass);
g_type_class_add_private (object_class, sizeof (MMPortQmiPrivate));
/* Virtual methods */
object_class->dispose = dispose;
} |
package synth;
import java.io.InputStream;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
public class Sample {
private final int sampleRate;
private final int baseMidiNote;
private final int loopStart;
private final int loopLength;
private final int loopEnd;
private final boolean useLoop;
private final int fineTuning;
public byte[] data;
public Sample(String resource, int baseMidiNote, int loopStart, int loopEnd, int fineTuning) throws Exception {
int sampleRateTmp = 0;
try (
InputStream is = getClass().getResourceAsStream(resource);
AudioInputStream audioInputStream = AudioSystem.getAudioInputStream(is);
) {
AudioFormat format = audioInputStream.getFormat();
System.out.println("format: " + format + " encoding: " + format.getEncoding());
if (format.getChannels() != 1 || format.getSampleSizeInBits() != 8 || format.getEncoding() != AudioFormat.Encoding.PCM_UNSIGNED) {
throw new Exception("invalid audio format " + format + " !");
}
sampleRateTmp = (int) format.getSampleRate();
int frameSize = format.getFrameSize();
//data = new byte[(int) (frameSize * audioInputStream.getFrameLength())];
//audioInputStream.read(data);
data = audioInputStream.readAllBytes();
/*
int maxValue = 0;
for (int i = 0; i < data.length; i++) {
int sampleValue = Math.abs((data[i] & 0xff) - 128);
if (sampleValue > maxValue) {
maxValue = sampleValue;
}
}
for (int i = 0; i < data.length; i++) {
int sampleValue = ((data[i] & 0xff) - 128);
sampleValue = (int) (127.0 * (sampleValue / (double) (1.0 + maxValue)));
data[i] = (byte) (sampleValue + 128);
}
*/
}
this.sampleRate = sampleRateTmp;
this.baseMidiNote = baseMidiNote;
this.loopStart = loopStart;
if (loopEnd > data.length - 1) {
loopEnd = data.length - 1;
}
this.loopEnd = loopEnd;
this.loopLength = loopEnd - loopStart + 1;
this.useLoop = loopLength > 2;
this.fineTuning = fineTuning;
}
public int getSampleRate() {
return sampleRate;
}
public int getBaseMidiNote() {
return baseMidiNote;
}
public int getLoopStart() {
return loopStart;
}
public int getLoopLength() {
return loopLength;
}
public int getLoopEnd() {
return loopEnd;
}
public boolean isUseLoop() {
return useLoop;
}
public byte[] getData() {
return data;
}
public int getFineTuning() {
return fineTuning;
}
public byte getNextSample(double sampleIndex) {
int nextSample = 0;
int sampleIndexIntA = (int) sampleIndex;
int sampleIndexIntB = sampleIndexIntA + 1;
int nextSampleA = 0;
int nextSampleB = 0;
if (useLoop) {
if (sampleIndexIntA <= loopEnd) { // before loop
nextSampleA = data[sampleIndexIntA] & 0xff;
}
else { // after loop
nextSampleA = data[loopStart + ((sampleIndexIntA - loopEnd) % loopLength)] & 0xff;
}
if (sampleIndexIntB <= loopEnd) { // before loop
nextSampleB = data[sampleIndexIntB] & 0xff;
}
else { // after loop
nextSampleB = data[loopStart + ((sampleIndexIntB - loopEnd) % loopLength)] & 0xff;
}
}
else {
if (sampleIndexIntA < data.length) {
nextSampleA = data[sampleIndexIntA] & 0xff;
}
if (sampleIndexIntB < data.length) {
nextSampleB = data[sampleIndexIntB] & 0xff;
}
}
nextSampleA = nextSampleA - 128;
nextSampleB = nextSampleB - 128;
double lerp = sampleIndex - sampleIndexIntA;
nextSample = (int) (nextSampleA + lerp * (nextSampleB - nextSampleA));
return (byte) (nextSample + 128);
}
public void setData(byte[] data) {
this.data = data;
}
} |
var express = require('express');
var router = express.Router();
const { PrismaClient } = require('@prisma/client');
const { parseGoal } = require('../utils/parseGoal');
const prisma = new PrismaClient();
/* GET users listing. */
router.get('/', async (req, res) => {
try {
const goals = await prisma.goal.findMany({
orderBy: {
createdAt: 'desc'
}
});
res.json(goals);
} catch (error) {
res.status(500).json({ error: error.message });
}
})
router.get('/:id', async (req, res) => {
const { id } = req.params;
try {
const goal = await prisma.goal.findUnique({
where: {
id: parseInt(id),
},
});
if (goal) {
res.json(goal);
} else {
res.status(404).json({ message: 'Goal not found' });
}
} catch (error) {
res.status(500).json({ error: error.message });
}
})
router.post('/', async (req, res) => {
try {
const { description } = req.body;
const parsedGoal = parseGoal(description);
if (!parsedGoal) {
return res.status(400).json({ error: 'Invalid goal format. Please use the format "I want to [action] every [number] [time units]".' });
}
const newGoal = await prisma.goal.create({
data: {
title: parsedGoal.title,
frequency: parsedGoal.frequency,
description,
completed: false
},
});
res.status(201).json(newGoal);
} catch (error) {
res.status(500).json({ error: error.message });
}
});
router.put('/:id', async (req, res) => {
const { id } = req.params;
const { title, description, frequency, completed } = req.body;
try {
const updatedGoal = await prisma.goal.update({
where: { id: parseInt(id) },
data: { title, description, frequency, completed },
});
res.json(updatedGoal);
} catch (error) {
res.status(500).json({ error: error.message });
}
})
router.delete('/:id', async (req, res) => {
const { id } = req.params;
try {
await prisma.goal.delete({
where: { id: parseInt(id) },
});
res.status(204).send();
} catch (error) {
res.status(500).json({ error: error.message });
}
})
module.exports = router; |
package com.food.recipes.services;
import com.food.recipes.model.User;
import com.food.recipes.model.dto.security.AuthenticationRequest;
import com.food.recipes.model.dto.security.AuthenticationResponse;
import com.food.recipes.model.dto.security.RegisterRequest;
import com.food.recipes.model.enums.Role;
import com.food.recipes.repository.UserRepository;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.security.authentication.AuthenticationManager;
import org.springframework.security.authentication.UsernamePasswordAuthenticationToken;
import org.springframework.security.core.userdetails.UsernameNotFoundException;
import org.springframework.security.crypto.password.PasswordEncoder;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
@Service
@Slf4j
@RequiredArgsConstructor
public class AuthenticationService {
private final UserRepository repository;
private final PasswordEncoder passwordEncoder;
private final JwtService jwtService;
private final AuthenticationManager authenticationManager;
@Transactional
public AuthenticationResponse register(RegisterRequest request) {
User user = User.builder()
.name(request.getName())
.username(request.getEmail())
.password(passwordEncoder.encode(request.getPassword()))
.role(Role.USER)
.build();
repository.save(user);
var jwtToken = jwtService.generateToken(user);
log.info("Saving an user");
return AuthenticationResponse.builder()
.name(user.getName())
.token(jwtToken)
.build();
}
@Transactional
public AuthenticationResponse login(AuthenticationRequest request) {
authenticationManager.authenticate(
new UsernamePasswordAuthenticationToken(
request.getEmail(),
request.getPassword()
)
);
User user = repository.findByUsername(request.getEmail()).orElseThrow(
() -> new UsernameNotFoundException("Username not found!")
);
var jwtToken = jwtService.generateToken(user);
log.info("An user is authenticated");
return AuthenticationResponse.builder()
.name(user.getName())
.token(jwtToken)
.build();
}
} |
package main
import (
"fmt"
"sync"
"testing"
)
/*
Pool -> implementasi design pattern bernama object pool pattern
sederhananya, design pattern ini digunakan untuk menyimpan data, selanjutnya untuk menggunakan datanya kita bisa mengambil dari pool, dan selesai menggunakan kita bisa menyimpan kembali ke Poolnya
implementasi Pool di golang ini sudah aman dari problem race condition
*/
func TestPool(t *testing.T) {
group := sync.WaitGroup{}
pool := sync.Pool{}
pool.Put("Reo")
pool.Put("Sahobby")
pool.Put("Reooo")
for i := 0; i < 10; i++ {
group.Wait()
go func() {
defer group.Done()
group.Add(1)
data := pool.Get() // -> mengambil data dari pool
fmt.Println(data)
pool.Put(data) // -> ditaruh ke pool lagi, karena setelah diGet() data akan hilang dari pool
}()
}
group.Wait()
fmt.Println("test complete!!!")
} |
// In an online shopping application, customers can add items to their shopping cart. Implement a
// class ShoppingCart with a method addItem that adds items to the cart. If the customer attempts
// to add an item with a negative quantity or an invalid product code, throw appropriate exceptions
// (NegativeQuantityException or InvalidProductCodeExceptionz). Handle these exceptions and
// display error messages.
#include <iostream>
#include <stdexcept>
#include <string>
using namespace std;
class NegativeQuantityException : public std::exception
{
public:
const char *what() const noexcept override
{
return "Error: Cannot add item. Negative quantity not allowed.";
}
};
class InvalidProductCodeException : public std::exception
{
public:
const char *what() const noexcept override
{
return "Error: Invalid product code. Please enter a valid product code.";
}
};
class ShoppingCart
{
public:
void addItem(const std::string &productCode, int quantity)
{
if (quantity < 0)
{
throw NegativeQuantityException();
}
if (productCode.empty())
{
throw InvalidProductCodeException();
}
// Here, you can implement the logic to add the item to the shopping cart
std::cout << "Added item: Product Code - " << productCode << ", Quantity - " << quantity << std::endl;
}
};
int main()
{
ShoppingCart cart;
try
{
// Test Case 1: Adding an Item with Negative Quantity
cart.addItem("P001", -2);
}
catch (const NegativeQuantityException &e)
{
std::cout << e.what() << std::endl;
}
try
{
// Test Case 2: Adding an Item with Invalid Product Code
cart.addItem("", 3);
}
catch (const InvalidProductCodeException &e)
{
std::cout << e.what() << std::endl;
}
return 0;
} |
use crate::common::card::{AuctionType, Card, CardColor};
use leptos::ev::DragEvent;
use leptos::*;
pub const CARD_ID_FORMAT: &'static str = "mart/card";
#[component]
pub(crate) fn CardView(
card: Card,
#[prop(optional)] selectable: bool,
#[prop(optional)] display_only: bool,
) -> impl IntoView {
let selected_card: RwSignal<Option<Card>> = expect_context();
let selected =
Signal::derive(move || selected_card().is_some_and(|current| current.id == card.id));
let dragging: RwSignal<bool> = expect_context();
let wrapper_class = move || {
format!(
"{} {}
w-40 h-50 rd-2 border-3 border-solid transition-all relative select-none
overflow-hidden shadow-xl scale-100
hover:shadow-2xl active:shadow-2xl hover:scale-110 active:scale-110
animation-fall",
card.color.comp_bg(),
card.color.main_bd()
)
};
let ty_bg_class = move || {
format!(
"{}
h-10 flex flex-justify-center flex-items-center shadow-lg",
card.color.main_bg()
)
};
let ty_fg_class = move || {
format!(
"{}
novcento text-md font-bold",
card.color.comp_fg()
)
};
let on_dragstart = move |ev: DragEvent| {
ev.data_transfer()
.unwrap()
.set_data(CARD_ID_FORMAT, card.id.to_string().as_str())
.ok();
dragging.set(true);
};
let on_dragend = move |_| {
dragging.set(false);
};
let on_click = move |_| {
if selectable {
if selected() {
selected_card.set(None);
} else {
selected_card.set(Some(card));
}
}
};
view! {
<div
class=("cursor-grabbing", move || selectable && !selected())
class=("cursor-context-menu", selected)
class=("glow", move || selected() && !display_only)
class=wrapper_class
prop:draggable=selectable
on:dragstart=on_dragstart
on:dragend=on_dragend
on:click=on_click
>
<div class="flex flex-justify-center">
<img src="abstract.jpg" class="aspect-square w-40 h-40 pointer-events-none"/>
</div>
<div class=ty_bg_class>
<span class=ty_fg_class>{card.ty.text()}</span>
</div>
</div>
}
}
impl AuctionType {
pub(crate) fn text(&self) -> &'static str {
match self {
Self::Free => "Free",
Self::Circle => "Circle",
Self::Fist => "Fist",
Self::Marked => "Marked",
Self::Double => "Double",
}
}
}
impl CardColor {
pub(crate) fn comp_fg(&self) -> &'static str {
match self {
Self::Red | Self::Blue | Self::Purple => "c-white",
Self::Green | Self::Yellow => "c-black",
}
}
pub(crate) fn comp_bg(&self) -> &'static str {
match self {
Self::Red | Self::Blue | Self::Purple => "bg-white",
Self::Green | Self::Yellow => "bg-black",
}
}
pub(crate) fn main_fg(&self) -> &'static str {
match self {
Self::Red => "c-red-700",
Self::Green => "c-green",
Self::Blue => "c-blue",
Self::Purple => "c-purple",
Self::Yellow => "c-yellow",
}
}
pub(crate) fn main_bg(&self) -> &'static str {
match self {
Self::Red => "bg-red-600",
Self::Green => "bg-green",
Self::Blue => "bg-blue",
Self::Purple => "bg-purple",
Self::Yellow => "bg-yellow",
}
}
pub(crate) fn main_bd(&self) -> &'static str {
match self {
Self::Red => "border-red-600",
Self::Green => "border-green",
Self::Blue => "border-blue",
Self::Purple => "border-purple",
Self::Yellow => "border-yellow",
}
}
} |
import os
import openai
from openai import OpenAI
class ChatGPT:
def __init__(self, api_key_path: str):
with open(api_key_path, 'r') as arquivo:
# Lê o conteúdo do arquivo
conteudo = arquivo.read()
openai.api_key = conteudo
self.messages = []
def add_message(self, role: str, prompt: str):
self.messages.append({'role': role, 'content': prompt})
def reset_messages(self, maintain_context=True):
if maintain_context:
self.messages = [self.messages[0]]
else:
self.messages = []
def add_context(self, context: str):
# inserindo o contexto no começo da lista de mensagens
context = {'role': 'system', 'content': context}
self.messages = [context] + self.messages
def get_completion(self, model="gpt-3.5-turbo-0613", temperature=0):
response = openai.chat.completions.create(
model=model,
messages=self.messages,
temperature=temperature, # this is the degree of randomness of the model's output
)
return response.choices[0].message.content, response |
// 04_GoBananas
// Author: https://github.com/Mark-MDO47/
//
// The core algorithm is from https://playground.arduino.cc/Code/CapacitiveSensor/
// Here is the code history from that page
// * Original code by Mario Becker, Fraunhofer IGD, 2007 http://www.igd.fhg.de/igd-a4
// * Updated by: Alan Chatham http://unojoy.tumblr.com
// * Updated by Paul Stoffregen: Replaced '328 specific code with portOutputRegister, etc for compatibility with Arduino Mega, Teensy, Sanguino and other boards
// * Gratuitous optimization to improve sensitivity by Casey Rodarmor.
// * Updated by Martin Renold: disable interrupts while measuring. This fixes the occasional too-low results.
// * Updated by InvScribe for Arduino Due.
// * Updated non-Due code by Gabriel Staples (www.ElectricRCAircraftGuy.com), 15 Nov 2014: replaced "interrupts();" with "SREG = SREG_old;" in order to prevent nested interrupts in case you use this function inside an Interrupt Service Routine (ISR).
// I broke the original readCapacitivePin() into two parts
// * readCapacitivePinSetup() - get the bitmasks and do any initialization that can be done
// * readCapacitivePin() - similar to before but different calling sequence
//
// TODO - maybe have it process multiple pins at once?
// PROs - modified code - everything done in one call - maybe faster
// CONs - original code - is fast as possible for one pin - maybe faster
// PLAN - maybe fast enough as is. If not must try it both ways and evaluate.
#include "pins_arduino.h"
#include "Arduino.h"
typedef struct {
volatile uint8_t* port; // AVR PORT
volatile uint8_t* ddr; // AVR PIN
volatile uint8_t* pin; // AVR DDR
uint8_t bitmask; // which bit we are using
uint8_t calibration; // max number when not touching
} pinsetup_t; // return from readCapacitivePinSetup
pinsetup_t* readCapacitivePinSetup(int pinToMeasure);
uint8_t readCapacitivePin(pinsetup_t* pinSetupInfo); |
Flying PhotoBooth
The source code for the Android applications **Flying PhotoBooth** and **Party PhotoBooth**.
## Flying PhotoBooth <a href="https://play.google.com/store/apps/details?id=com.groundupworks.flyingphotobooth&utm_source=global_co&utm_medium=prtnr&utm_content=Mar2515&utm_campaign=PartBadge&pcampaignid=MKT-AC-global-none-all-co-pr-py-PartBadges-Oct1515-1"><img alt="Get it on Google Play" src="https://play.google.com/intl/en_us/badges/images/apps/en-play-badge.png" height="24px" /></a>
Create, review, save and share photo strips all in less than 30 seconds.
#### Create Photo Strip
* Selection of photo strip arrangements and photo filters
* Shoot up to 4 photos manually or use the photo booth style timer
* Unique swipe-to-retake feature to quickly review each photo
* Support for triggering with Muku Shuttr, other Bluetooth remotes and keyboards
#### Basic Save and Share
* Photo strips are auto-saved and added to the Gallery
* Beam to compatible devices using Android Beam
* Share through Facebook, Twitter, WhatsApp, Email, etc.
#### Share with [Wings](http://www.groundupworks.com/wings/)
* Link your Facebook, Dropbox, and Google Cloud Print account to enable one-click or auto share/print
* Automatically schedule retries in the background if sharing fails
* Share to any of your Facebook Albums or Pages, with privacy level control
## Party PhotoBooth <a href="https://play.google.com/store/apps/details?id=com.groundupworks.partyphotobooth&utm_source=global_co&utm_medium=prtnr&utm_content=Mar2515&utm_campaign=PartBadge&pcampaignid=MKT-AC-global-none-all-co-pr-py-PartBadges-Oct1515-1"><img alt="Get it on Google Play" src="https://play.google.com/intl/en_us/badges/images/apps/en-play-badge.png" height="24px" /></a>
Need a photo booth at your event? There is an app for that! What's a better way to explain how it works than to follow this [step-by-step instructable](http://www.instructables.com/id/5-minute-Photo-Booth/)?
#### One-minute Set Up
* Pick a photo strip template with optional event name, date and logo
* Link Facebook and Dropbox accounts for auto sharing
* Link Google Cloud Print for printing
* Keep the app in foreground with passcode-protected Kiosk mode
#### Guest Experience
* Take pictures as in a photo booth, with count down timer and animated review panel
* Trigger with on-screen button or Bluetooth remote
* Find your photo strip on Facebook, Dropbox, or printed via Google Cloud Print
* Automatically return to the Capture screen for the next guest
### Build
To compile the applications you must have the [Android SDK](http://developer.android.com/sdk/index.html) set up. With your device connected, build and install **Flying PhotoBooth** with:
```
./gradlew :flying-photo-booth:installDebug
```
Or **Party PhotoBooth** with:
```
./gradlew :party-photo-booth:installDebug
```
Some Wings Sharing endpoints may not work on your custom build as API keys from the service providers may be pinned to the release signing keys. You should find **donottranslate.xml** in each application and replace all API keys.
If you plan on distributing a fork of these applications, you must replace the following:
* Package names **com.groundupworks.flyingphotobooth** and **com.groundupworks.partyphotobooth**
* Application names **Flying PhotoBooth** and **Party PhotoBooth**
* Application icons and all branded graphics
* API keys
### Contact
Please use the [issue tracker](https://github.com/benhylau/flying-photo-booth/issues) for feature requests and reporting of bugs. Pull requests are welcome, but for custom features please fork instead. One main reason for open-sourcing this project is to allow for use-case-specific customizations without feature-bloating the mainline products, so fork away!
You can also contact me through the [Ground Up Works](http://www.groundupworks.com) channels.
### License
Copyright (c) 2012-2016 Benedict Lau
Source code licensed under the [GPLv3](http://www.gnu.org/licenses/gpl-3.0.html)
Application names, icons and branded graphics are properties of [Ground Up Works](http://www.groundupworks.com) |
# Creation of EU-HYDI database structure according to guidelines EUHYDI_v1.1.pdf and example file EUHYDI_v1.1_example.xls
#
# Author: M. Weynants
# Date created: 2012/11/09
# Last update: 2012/11/22
######################################################################
# load packages
# load functions
source('general_checks.r')
source('basic_checks.r')
source('chemical_checks.r')
source('psize_checks.r')
source('ret_checks.r')
source('cond_checks.r')
source('meth_checks.r')
source('tsermeta_checks.r')
source('tserdata_checks.r')
# Print warnings as they appear
# Create the tables with the right structure
# GENERAL
GENERAL <- as.data.frame(matrix(nrow=1,ncol=65,dimnames=list(NULL,c("PROFILE_ID","LOC_COOR_X","LOC_COOR_Y", "LOC_COOR_SYST","X_WGS84","Y_WGS84","ELEV","ISO_COUNTRY","RC_L1","RC_L2","LC_L1","LC_L2","LC_L3","LU_L1","LU_L2","SITE_LANDFORM","SITE_SLOP_POS","SITE_SLOP_FORM","SITE_SLOP_GRAD","SRF_ROCK_COV","SRF_ROCK_DIS","SRF_COAR_COV","SRF_COAR_SIZ","SRF_ERO_CAT","SRF_ERO_COV","SRF_ERO_DEG","SRF_ERO_ACT","SRF_SEAL_THIC","SRF_SEAL_CON","SRF_CRAC_WID","SRF_CRAC_DEP","SRF_CRAC_DIS","SRF_SAL_COV","SRF_SAL_THIC","PAR_MAT","AGE","WRB2006_RSG","WRB2006_PQ1","WRB2006_PQ2","WRB2006_PQ3","WRB2006_SQ1","WRB2006_SQ2","WRB2006_SQ3","WRB1998_RSG","WRB1998_ADJSPE1","WRB1998_ADJSPE2","WRB1998_ADJSPE3","WRB1998_ADJSPE4","WRB1998_ADJSPE5","WRB1998_ADJSPE6","NAT_CLAS","NAT_CLAS_REF","YEAR","MONTH","DAY","SURVEYOR_P","PUBL_REF","CONTACT_P","CONTACT_A","EMAIL","REL_ID","REL_T_SER","COMMENTS1","COMMENTS2","COMMENTS3"))))
# set field classes
GENERAL[,c(1,7)]<-as.integer(GENERAL[,c(1,7)])
GENERAL[,c(2,3,5,6)]<-as.numeric(GENERAL[,c(2,3,5,6)])
GENERAL[,c(4)]<-as.character(GENERAL[,c(4)])
# BASIC
BASIC <- as.data.frame(matrix(nrow=1,ncol=23,dimnames=list(NULL,c("PROFILE_ID","SAMPLE_ID","SAMPLE_POS","SAMPLE_DEP_TOP","SAMPLE_DEP_BOT","HOR1_NAME","HOR1_TOP","HOR1_BOT","HOR2_NAME","HOR2_TOP","HOR2_BOT","HOR3_NAME","HOR3_TOP","HOR3_BOT","STRUCTURE1","STR_COMB","STRUCTURE2","POR","POR_M","BD","BD_M","COARSE","COARSE_M"))))
# set field classes
BASIC[,c(1:5,7:8,10:11,13:14,19,21,23)] <- as.integer(BASIC[,c(1:5,7:8,10:11,13:14,19,21,23)])
BASIC[,c(18,20,22)] <- as.numeric(BASIC[,c(18,20,22)])
BASIC[,c(6,9,12,15:17)] <- as.character(BASIC[,c(6,9,12,15:17)])
# CHEMICAL
CHEMICAL <- as.data.frame(matrix(nrow=1,ncol=29,dimnames=list(NULL,c("PROFILE_ID","SAMPLE_ID","OC","OC_M","CACO3","CACO3_M","PH_H2O","PH_H2O_M","PH_KCL","PH_KCL_M","EC","EC_M","SALT","SALT_M","CEC","CEC_M","EX_NA","EX_NA_M","EX_MG","EX_MG_M","EX_K","EX_K_M","EX_CA","EX_CA_M","BASE_CATIONS","ACIDITY_NA4O","ACIDITY_NA4O_M","ACIDITY_KCL","ACIDITY_KCL_M"))))
# set field classes
CHEMICAL[,c(1,2,4,6,8,10,12,14,16,18,20,22,24,27,29)] <- as.integer(CHEMICAL[,c(1,2,4,6,8,10,12,14,16,18,20,22,24,27,29)])
CHEMICAL[,c(3,5,7,9,11,13,15,17,19,21,23,25,26,28)] <- as.numeric(CHEMICAL[,c(3,5,7,9,11,13,15,17,19,21,23,25,26,28)])
# PSIZE
# RET
# COND
# METHOD
# TSERMETA
# TSERDATA
# loop on directories of contributors
contr <- c('Anaya','Goncalves','Iovino','Katterer','Mako','Patyka','Schindler','Shein','Javaux','Daroussin','Romano','Kvaerno','Lamorski','Lilly','Houskova','Strauss','Matula','Cornelis') # etc.
dirs <- contr
DB <- vector("list",length(dirs)); names(DB)<-dirs
for (i in 1:18){
print('-----------------')
print(dirs[i])
print('-----------------')
# warnings off
options(warn=1)
if (exists('general')) rm(general,basic,chemical,psize,ret,cond,meth,tsermeta,tserdata)
if (exists('general1')) rm(general1,basic1,chemical1,psize1,ret1,cond1,meth1,tsermeta1,tserdata1)
# read csv files
# GENERAL
# ---- 2023/04/05 fix: files were imported without specifying encoding and special characters were scrambled
if (file.exists(paste('../data/',dirs[i],'/general_utf8.csv',sep=''))) {
general <- read.csv(paste('../data/',dirs[i],'/general_utf8.csv',sep=''),header=TRUE,as.is=TRUE,blank.lines.skip = TRUE)
} else {
general <- read.csv(paste('../data/',dirs[i],'/GENERAL.csv',sep=''),header=TRUE,as.is=TRUE,blank.lines.skip = TRUE)
}
# ------
# remove empty lines
general<- general[!is.na(general[[1]]),]
# METHOD
# 2023/04/18 exported in utf8 because impossible to import Straus as is invalid multibyte string at '<d6>NOR<4d> L 1068'
if (grepl("Strauss",dirs[i])) {
path <- paste('../data/',dirs[i],'/METHOD_utf8.csv',sep='')
} else {path <- paste('../data/',dirs[i],'/METHOD.csv',sep='')}
meth <- read.csv(path,header=TRUE,as.is=TRUE,blank.lines.skip=TRUE)
meth <- meth[!is.na(meth[[1]]),]
# BASIC
basic <- read.csv(paste('../data/',dirs[i],'/BASIC.csv',sep=''),header=TRUE,as.is=TRUE,blank.lines.skip=TRUE)
basic <- basic[!is.na(basic[[1]]),]
# CHEMICAL
chemical <- read.csv(paste('../data/',dirs[i],'/CHEMICAL.csv',sep=''),header=TRUE,as.is=TRUE,blank.lines.skip=TRUE)
chemical <- chemical[!is.na(chemical[[1]]),]
names(chemical) <- toupper(names(chemical))
# PSIZE
psize <- read.csv(paste('../data/',dirs[i],'/PSIZE.csv',sep=''),header=TRUE,as.is=TRUE,blank.lines.skip=TRUE)
psize <- psize[!is.na(psize[[1]]),]
# RET
ret <- read.csv(paste('../data/',dirs[i],'/RET.csv',sep=''),header=TRUE,as.is=TRUE,blank.lines.skip=TRUE)
ret <- ret[!is.na(ret[[1]]),]
# COND
cond <- read.csv(paste('../data/',dirs[i],'/COND.csv',sep=''),header=TRUE,as.is=TRUE,blank.lines.skip=TRUE)
cond <- cond[!is.na(cond[[1]]),]
# clean specific contributions to match guidelines
if (grepl("Anaya",dirs[i])) source("cleanAnaya.r")
if (grepl("Iovino",dirs[i])) source("cleanIovino.r")
if (grepl("Katterer",dirs[i])) source('cleanKatterer.r')
if (grepl("Mako", dirs[i])) source("cleanMako.r")
if (grepl("Patyka", dirs[i])) source("cleanPatyka.r")
if (grepl("Schindler" ,dirs[i])) source('cleanSchindler.r')
if (grepl("Shein",dirs[i])) source('cleanShein.r')
if (grepl("Javaux",dirs[i])) source('cleanJavaux.r')
if (grepl("Daroussin",dirs[i])) source('cleanDaroussin.r')
if (grepl("Romano",dirs[i])) source('cleanRomano.r')
if (grepl("Kvaerno",dirs[i])) source('cleanKvaerno.r')
if (grepl("Lamorski",dirs[i])) source('cleanLamorski.r')
if (grepl("Lilly",dirs[i])) source('cleanLilly.r')
if (grepl("Houskova",dirs[i])) source('cleanHouskova.r')
if (grepl("Strauss",dirs[i])) source('cleanStrauss.r')
if (grepl("Matula",dirs[i])) source('cleanMatula.r')
if (grepl("Cornelis",dirs[i])) source('cleanCornelis.r')
# do the checks
print('checking tables')
general1 <- general.checks(general)
meth1 <- meth.checks(meth)
basic1 <- basic.checks(basic,general1[[1]],meth1[[1]])
chemical1 <- chemical.checks(chemical,basic1[[2]],meth1[[1]])
psize1 <- psize.checks(psize,basic1[[2]],meth1[[1]])
ret1 <- ret.checks(ret,basic1[[2]],meth1[[1]])
cond1 <- cond.checks(cond,basic1[[2]],meth1[[1]])
if (!(all(general1$REL_T_SER == -999) | all(general1$REL_T_SER == 'ND'))){
# TSERMETA
tsermeta <- read.csv(paste('../data/',dirs[i],'/TSERMETA.csv',sep=''),header=TRUE,as.is=TRUE,blank.lines.skip=TRUE)
tsermeta <- tsermeta[!is.na(tsermeta[[1]]),]
tsermeta1 <- tsermeta.checks(tsermeta)
# TSERDATA
tserdata <- read.csv(paste('../data/',dirs[i],'/TSERDATA.csv',sep=''),header=TRUE,as.is=TRUE,blank.lines.skip=TRUE)
tserdata <- tserdata[!is.na(tserdata[[1]]),]
tserdata1 <- tserdata.checks(meth)
} else {tsermeta1<-NULL;tserdata1<-NULL}
# warnings on
options(warn=1)
if (grepl('Anaya',dirs[i])) source('anaya_corr.r')
if (grepl('Goncalves',dirs[i])) source('goncalves_corr.r')
if (grepl('Iovino',dirs[i])) source('iovino_corr.r')
if (grepl('Katterer',dirs[i])) source('katterer_corr.r')
if (grepl('Mako',dirs[i])) source('mako_corr.r')
if (grepl('Patyka',dirs[i])) source('patyka_corr.r')
if (grepl('Schindler',dirs[i])) source('schindler_corr.r')
if (grepl('Shein',dirs[i])) source('shein_corr.r')
if (grepl('Javaux',dirs[i])) source('javaux_corr.r')
if (grepl("Daroussin",dirs[i])) source('daroussin_corr.r')
if (grepl("Romano",dirs[i])) source('romano_corr.r')
if (grepl("Kvaerno",dirs[i])) source('kvaerno_corr.r')
if (grepl("Lamorski",dirs[i])) source('lamorski_corr.r')
if (grepl("Lilly",dirs[i])) source('lilly_corr.r')
if (grepl("Houskova",dirs[i])) source('houskova_corr.r')
if (grepl("Strauss",dirs[i])) source('strauss_corr.r')
if (grepl("Matula",dirs[i])) source('matula_corr.r')
if (grepl("Cornelis",dirs[i])) source('cornelis_corr.r')
# names(general): surveyor_p and not surveyer_p:
if (any(names(general1) != names(GENERAL))){names(general1) <- names(GENERAL); print(paste(dirs[i],": names(general) changed",sep=''))}
DB[[i]] <- list(general1,basic1,chemical1,psize1,ret1,cond1,meth1,tsermeta1,tserdata1)
names(DB[[i]]) <- c('general','basic','chemical','psize','ret','cond','meth','tsermeta','tserdata')
}
# for (i in 1:18){
# print("---------")
# print(dirs[i])
# print(table(nchar(DB[[i]]$basic$HOR1_NAME)))
# }
# add greek
# load("E:/weyname/Documents/Documents/MyWATER/Nestos-GR/Rcode/GRhydi.Rdata")
load("../data/Bilas/Nestos-GR/Rcode/GRhydi.Rdata")
DB[[19]] <- GRhydi
names(DB)[19] <- "Bilas"
save('DB', file='../output/HYDI_single.Rdata')
# import hypres data
#source('hypres_hydi.R')
load('../output/hypres_hydi.RData') #hypres_hydi
# checks
gen <- general.checks(hypres_hydi$general)
# Some EMAIL are missing
meth <- meth.checks(hypres_hydi$meth)
basic <- basic.checks(hypres_hydi$basic,hypres_hydi$general[[1]],hypres_hydi$meth[[1]])
# sample depths are missing; some BD are missing; coarse are missing
chem <- chemical.checks(hypres_hydi$chemical,hypres_hydi$basic[[2]],hypres_hydi$meth[[1]])
# some OC are missing
psize <- psize.checks(hypres_hydi$psize,hypres_hydi$basic[[2]],hypres_hydi$meth[[1]])
# CAUTION: replicates
ret <- ret.checks(hypres_hydi$ret,hypres_hydi$basic[[2]],hypres_hydi$meth[[1]])
# some negative heads
cond <- cond.checks(hypres_hydi$cond,hypres_hydi$basic[[2]],hypres_hydi$meth[[1]])
# data from Wosten, Hennings, Schindler and Slovakia
# select:
hypres.contr <- list(Wosten_HYPRES=hypres_hydi$general$PROFILE_ID[(grepl("Wosten",hypres_hydi$general$CONTACT_P) | grepl("Booltink",hypres_hydi$general$CONTACT_P) | grepl("Stolte",hypres_hydi$general$CONTACT_P) | grepl("Stricker",hypres_hydi$general$CONTACT_P)) & !grepl("UNSODA",hypres_hydi$general$COMMENTS2) & hypres_hydi$general$ISO_COUNTRY=="NL" ],
Hennings_HYPRES = hypres_hydi$general$PROFILE_ID[grepl("BGR",hypres_hydi$general$CONTACT_A)],
Schindler_HYPRES = hypres_hydi$general$PROFILE_ID[grepl("Schindler",hypres_hydi$general$CONTACT_P)],
Houskova_HYPRES = hypres_hydi$general$PROFILE_ID[hypres_hydi$general$ISO_COUNTRY=="SK"],
Romano_HYPRES = hypres_hydi$general$PROFILE_ID[grepl("Romano",hypres_hydi$general$CONTACT_P)])
for (i in 20:24) {
DB[[i]] <- list(hypres_hydi$general[hypres_hydi$general$PROFILE_ID %in% hypres.contr[[i-19]],1:65],
hypres_hydi$basic[hypres_hydi$basic$PROFILE_ID %in% hypres.contr[[i-19]],],
hypres_hydi$chemical[hypres_hydi$chemical$PROFILE_ID %in% hypres.contr[[i-19]],],
hypres_hydi$psize[hypres_hydi$psize$PROFILE_ID %in% hypres.contr[[i-19]],],
hypres_hydi$ret[hypres_hydi$ret$PROFILE_ID %in% hypres.contr[[i-19]],],
hypres_hydi$cond[hypres_hydi$cond$PROFILE_ID %in% hypres.contr[[i-19]],],
hypres_hydi$meth[is.na(hypres_hydi$meth[,1]),],
hypres_hydi$tsermeta,
hypres_hydi$tserdata)
names(DB)[i] <- names(hypres.contr)[i-19]
}
# add meth
DB$HYPRES[[7]] <- hypres_hydi$meth
save('DB', file='../output/HYDI_single_hyp.Rdata')
# Next step: Harmonize methods codes!!! and check identifiers for countries where more than one dataset (Germany, Italy, Belgium)
# BUNDLE with SOURCE
load("../output/HYDI_single_hyp.Rdata")
tnames <- c("GENERAL","BASIC","CHEMICAL","PSIZE","RET","COND","METHOD")
hydi <- list()
for (j in 1:length(DB)){
#print("---------")0
#print(names(DB)[j])
if (j != 25){
names(DB[[j]][[1]]) <- gsub("SLOPE","SLOP",names(DB[[j]][[1]]))
names(DB[[j]][[2]]) <- gsub("COURSE","COARSE",names(DB[[j]][[2]]))
names(DB[[j]][[3]]) <- toupper(names(DB[[j]][[3]]))
names(DB[[j]][[4]]) <- gsub("PSIZE","P_SIZE",names(DB[[j]][[4]]))}
for (k in 1:7){
if (j == 25 & k!=7){next}
#print("---------")
#print(tnames[k])
if (nrow(DB[[j]][[k]])>0){
# add ID to table COND
if (k %in% 4:6){
if (!any(names(DB[[j]][[k]])=="ID"))
{DB[[j]][[k]]$ID <- rep("ND",nrow(DB[[j]][[k]]))}
DB[[j]][[k]]$ID <- as.character(DB[[j]][[k]]$ID)}
#{DB[[j]][[k]] <- cbind(DB[[j]][[k]][,1:14],-999,DB[[j]][[k]][,15]);names(DB[[j]][[k]])[15:16] <- c("K_INV_P9","K_INV_MOD")}
tbl <- cbind(DB[[j]][[k]],SOURCE=names(DB)[j],stringsAsFactors=FALSE)
if (j==1){hydi[[k]] <- tbl} else {hydi[[k]] <- rbind(hydi[[k]],tbl)}
}}}
names(hydi) <- tnames
save('hydi',file="../output/HYDI_SOURCE.Rdata")
load("../output/HYDI_SOURCE.Rdata")
# harmonize ID's
unique(hydi$GENERAL$SOURCE[hydi$GENERAL$PROFILE_ID %in% hydi$GENERAL$PROFILE_ID[duplicated(hydi$GENERAL$PROFILE_ID)]])
# BE
M<-max(hydi$GENERAL$PROFILE_ID[hydi$GENERAL$SOURCE=="Cornelis"])
hydi$GENERAL$PROFILE_ID[hydi$GENERAL$SOURCE=="Javaux"] <- hydi$GENERAL$PROFILE_ID[hydi$GENERAL$SOURCE=="Javaux"] + M - 5600000
for (itbl in 2:7){
hydi[[itbl]]$PROFILE_ID[hydi[[itbl]]$SOURCE=="Javaux"] <- hydi[[itbl]]$PROFILE_ID[hydi[[itbl]]$SOURCE=="Javaux"] + M - 5600000
hydi[[itbl]]$SAMPLE_ID[hydi[[itbl]]$SOURCE=="Javaux"] <- hydi[[itbl]]$SAMPLE_ID[hydi[[itbl]]$SOURCE=="Javaux"] + (M-5600000)*100
}
# IT: Iovino
M<-max(hydi$GENERAL$PROFILE_ID[hydi$GENERAL$SOURCE=="Romano" & hydi$GENERAL$ISO_COUNTRY=="IT"]) - 38000000
hydi$GENERAL$PROFILE_ID[hydi$GENERAL$SOURCE=="Iovino"] <- hydi$GENERAL$PROFILE_ID[hydi$GENERAL$SOURCE=="Iovino"] + M
for (itbl in 2:7){
hydi[[itbl]]$PROFILE_ID[hydi[[itbl]]$SOURCE=="Iovino"] <- hydi[[itbl]]$PROFILE_ID[hydi[[itbl]]$SOURCE=="Iovino"] + M
hydi[[itbl]]$SAMPLE_ID[hydi[[itbl]]$SOURCE=="Iovino"] <- hydi[[itbl]]$SAMPLE_ID[hydi[[itbl]]$SOURCE=="Iovino"] + M*100
}
# IT: Romano_HYPRES
M<-max(hydi$GENERAL$PROFILE_ID[hydi$GENERAL$SOURCE=="Iovino" & hydi$GENERAL$ISO_COUNTRY=="IT"]) - 38000000
hydi$GENERAL$PROFILE_ID[hydi$GENERAL$SOURCE=="Romano_HYPRES"] <- hydi$GENERAL$PROFILE_ID[hydi$GENERAL$SOURCE=="Romano_HYPRES"] + M
for (itbl in 2:7){
hydi[[itbl]]$PROFILE_ID[hydi[[itbl]]$SOURCE=="Romano_HYPRES"] <- hydi[[itbl]]$PROFILE_ID[hydi[[itbl]]$SOURCE=="Romano_HYPRES"] + M
hydi[[itbl]]$SAMPLE_ID[hydi[[itbl]]$SOURCE=="Romano_HYPRES"] <- hydi[[itbl]]$SAMPLE_ID[hydi[[itbl]]$SOURCE=="Romano_HYPRES"] + M*100
}
# DE: 4 data sources (Schindler_HYPRES:52, Schindler:33, Hennings: 518, Romano: 14)
# give new IDs to Hennings
pid <- hydi$GENERAL$PROFILE_ID[hydi$GENERAL$SOURCE=="Hennings_HYPRES"]
npid <- 27600000 + 1:length(pid)
hydi$GENERAL$PROFILE_ID[hydi$GENERAL$SOURCE=="Hennings_HYPRES"] <- npid
for (itbl in 2:7){
hydi[[itbl]]$PROFILE_ID[hydi[[itbl]]$SOURCE=="Hennings_HYPRES"] <- npid[match(hydi[[itbl]]$PROFILE_ID[hydi[[itbl]]$SOURCE=="Hennings_HYPRES"],pid)]
sid <- as.numeric(substr(as.character(hydi[[itbl]]$SAMPLE_ID[hydi[[itbl]]$SOURCE=="Hennings_HYPRES"]),9,10))
hydi[[itbl]]$SAMPLE_ID[hydi[[itbl]]$SOURCE=="Hennings_HYPRES"] <- hydi[[itbl]]$PROFILE_ID[hydi[[itbl]]$SOURCE=="Hennings_HYPRES"]*100 + sid
}
# give new IDs to Schindler_HYPRES
M <- max(hydi$GENERAL$PROFILE_ID[hydi$GENERAL$SOURCE=="Hennings_HYPRES"])
pid <- hydi$GENERAL$PROFILE_ID[hydi$GENERAL$SOURCE=="Schindler_HYPRES"]
npid <- M + 1:length(pid)
hydi$GENERAL$PROFILE_ID[hydi$GENERAL$SOURCE=="Schindler_HYPRES"] <- npid
for (itbl in 2:7){
hydi[[itbl]]$PROFILE_ID[hydi[[itbl]]$SOURCE=="Schindler_HYPRES"] <- npid[match(hydi[[itbl]]$PROFILE_ID[hydi[[itbl]]$SOURCE=="Schindler_HYPRES"],pid)]
sid <- as.numeric(substr(as.character(hydi[[itbl]]$SAMPLE_ID[hydi[[itbl]]$SOURCE=="Schindler_HYPRES"]),9,10))
hydi[[itbl]]$SAMPLE_ID[hydi[[itbl]]$SOURCE=="Schindler_HYPRES"] <- hydi[[itbl]]$PROFILE_ID[hydi[[itbl]]$SOURCE=="Schindler_HYPRES"]*100 + sid
}
# give new IDs to Schindler
M <- max(hydi$GENERAL$PROFILE_ID[hydi$GENERAL$SOURCE=="Schindler_HYPRES"])
pid <- hydi$GENERAL$PROFILE_ID[hydi$GENERAL$SOURCE=="Schindler"]
npid <- M + 1:length(pid)
hydi$GENERAL$PROFILE_ID[hydi$GENERAL$SOURCE=="Schindler"] <- npid
for (itbl in 2:7){
hydi[[itbl]]$PROFILE_ID[hydi[[itbl]]$SOURCE=="Schindler"] <- npid[match(hydi[[itbl]]$PROFILE_ID[hydi[[itbl]]$SOURCE=="Schindler"],pid)]
sid <- as.numeric(substr(as.character(hydi[[itbl]]$SAMPLE_ID[hydi[[itbl]]$SOURCE=="Schindler"]),9,10))
hydi[[itbl]]$SAMPLE_ID[hydi[[itbl]]$SOURCE=="Schindler"] <- hydi[[itbl]]$PROFILE_ID[hydi[[itbl]]$SOURCE=="Schindler"]*100 + sid
}
# give new IDs to Romano
M <- max(hydi$GENERAL$PROFILE_ID[hydi$GENERAL$SOURCE=="Schindler"])
pid <- hydi$GENERAL$PROFILE_ID[hydi$GENERAL$SOURCE=="Romano" & hydi$GENERAL$ISO_COUNTRY=="DE"]
npid <- M + 1:length(pid)
hydi$GENERAL$PROFILE_ID[hydi$GENERAL$SOURCE=="Romano" & hydi$GENERAL$ISO_COUNTRY=="DE"] <- npid
for (itbl in 2:7){
hydi[[itbl]]$PROFILE_ID[hydi[[itbl]]$SOURCE=="Romano" & hydi[[itbl]]$PROFILE_ID<27700000] <- npid[match(hydi[[itbl]]$PROFILE_ID[hydi[[itbl]]$SOURCE=="Romano" & hydi[[itbl]]$PROFILE_ID<27700000],pid)]
sid <- as.numeric(substr(as.character(hydi[[itbl]]$SAMPLE_ID[hydi[[itbl]]$SOURCE=="Romano" & hydi[[itbl]]$PROFILE_ID<27700000]),9,10))
hydi[[itbl]]$SAMPLE_ID[hydi[[itbl]]$SOURCE=="Romano" & hydi[[itbl]]$PROFILE_ID<27700000] <- hydi[[itbl]]$PROFILE_ID[hydi[[itbl]]$SOURCE=="Romano" & hydi[[itbl]]$PROFILE_ID<27700000]*100 + sid
}
# control absence of duplicates:
any(duplicated(hydi$GENERAL$PROFILE_ID))
unique(hydi$GENERAL$SOURCE[hydi$GENERAL$PROFILE_ID %in% hydi$GENERAL$PROFILE_ID[duplicated(hydi$GENERAL$PROFILE_ID)]])
# character(0)
# remove missing data in RET and COND
hydi$RET <- hydi$RET[hydi$RET$HEAD != -999,]
hydi$RET <- hydi$RET[hydi$RET$THETA != -999,]
hydi$COND <- hydi$COND[hydi$COND$VALUE != -999,]
hydi$COND <- hydi$COND[hydi$COND$COND != -999,]
# METH_PAR in uppercase
hydi$METHOD$METH_PAR <- toupper(hydi$METHOD$METH_PAR)
save('hydi',file="../output/HYDI_SOURCE_nd.Rdata")
# run checks on hydi
general_hydi <- general.checks(hydi$GENERAL) # remaining problems: LC and LU,, WRB2006_PQ1 and SQ1
meth_hydi <- meth.checks(hydi$METHOD) # need to be harmonized
basic_hydi <- basic.checks(hydi$BASIC,hydi$GENERAL$PROFILE_ID,hydi$METHOD$CODE_M) # profile and sample ID's don't match (Anaya)+ duplicates (anaya); HOR1_NAME; 0<POR<100; BD
unique(hydi$BASIC$SOURCE[!substr(hydi$BASIC$HOR1_NAME,2,2) %in% c('H', 'O', 'A', 'E', 'B', 'C', 'R', 'I', 'L', 'W', ' ','','N')])
chemical_hydi <-chemical.checks(hydi$CHEMICAL,hydi$BASIC$SAMPLE_ID,hydi$METHOD$CODE_M) # missing CODE_M; EX_CA, BASE_CATIONS, ACIDITY_NA4O out of range
psize_hydi <- psize.checks(hydi$PSIZE,hydi$BASIC$SAMPLE_ID,hydi$METHOD$CODE_M) # profile and sample ID's don't match (anaya); missing P_M
unique(hydi$PSIZE$SOURCE[duplicated(hydi$PSIZE[,c("SAMPLE_ID","P_SIZE")])]) # Patyka, Romano, Cornelis
ret_hydi <- ret.checks(hydi$RET,hydi$BASIC$SAMPLE_ID,hydi$METHOD$CODE_M)# HEAD, THETA (>1 Schindler), missing THETA_M
cond_hydi <- cond.checks(hydi$COND,hydi$BASIC$SAMPLE_ID,hydi$METHOD$CODE_M)# COND<0; missing COND_M in method
save('hydi',file="../output/HYDI_SOURCE_nd.Rdata")
# # export to mdb
# # example *.mdb
# # CAUTION: need to make de varchar(n) more efficient
# # integer for id's
# g=c("int","varchar(30)","varchar(30)","varchar(50)","float","float","int","varchar(2)","varchar(25)","varchar(25)","varchar(3)","varchar(3)","varchar(3)","varchar(4)","varchar(4)","varchar(2)","varchar(2)","varchar(2)","varchar(2)","varchar(2)","int","varchar(2)","varchar(2)","varchar(2)","int","varchar(2)","varchar(2)","varchar(2)","varchar(2)","varchar(2)","varchar(2)","varchar(2)","int","varchar(2)","int","varchar(3)","varchar(2)","varchar(2)","varchar(2)","varchar(2)","varchar(2)","varchar(2)","varchar(2)","varchar(2)","varchar(3)","varchar(3)","varchar(3)","varchar(3)","varchar(3)","varchar(3)","varchar(120)","varchar(255)","int","int","int","varchar(150)","varchar(255)","varchar(50)","varchar(255)","varchar(50)","varchar(100)","varchar(20)","varchar(255)","varchar(255)","varchar(255)","varchar(20)")
# attr(g,"names") <- names(hydi[[1]])
# b <- c("int","float","int","float","float","varchar(7)","float","float","varchar(7)","float","float","varchar(7)","float","float","varchar(6)","varchar(2)","varchar(6)","float","int","float","int","float","int","varchar(20)")
# attr(b,"names") <- names(hydi[[2]])
# c <- c("int","float","float","int","float","int","float","int","float","int","float","int","float","int","float","int","float","int","float","int","float","int","float","int","float","float","int","float","int","varchar(20)")
# attr(c,"names") <- names(hydi[[3]])
# p <- c("int","float","float","float","int","varchar(30)","varchar(20)")
# attr(p,"names") <- names(hydi[[4]])
# r <- c("int","float","float","float","int","float","float","float","float","float","float","float","float","int","varchar(30)","varchar(20)")
# attr(r,"names") <- names(hydi[[5]])
# k <- c("int","float","varchar(5)","float","float","int","float","float","float","float","float","float","float","float","float","int","varchar(30)","varchar(20)")
# attr(k,"names") <- names(hydi[[6]])
# m <- c("int","varchar(255)","varchar(255)","varchar(20)","varchar(20)")
# attr(m,"names") <- names(hydi[[7]])
# vT=list(general=g,basic=b,chemical=c,psize=p,ret=r,cond=k,meth=m)
# # open connection with database
# require("RODBC")
# ch <- odbcConnectAccess2007('../output/HYDI-v1_BETA.accdb')
# # for (j in 1:8){
# # print("---------")
# # print(names(DB)[j])
# # #tnames <- toupper(paste(names(DB[[j]]),names(DB)[j],sep='_'))
# for (k in 1:7){
# print("---------")
# print(tnames[k])
# # if (nrow(DB[[j]][[k]])>0){
# # tbl <- cbind(DB[[j]][[k]],SOURCE=names(DB)[j],stringsAsFactors=FALSE)
# # if (j==1){
# sqlDrop(ch, tnames[k], errors = FALSE)
# sqlSave(ch, hydi[[k]], tablename = tnames[k], append = FALSE,
# rownames = FALSE, colnames = FALSE, verbose = FALSE,
# safer = TRUE, addPK = FALSE,varTypes=vT[[k]])
# # } else {
# # tryCatch(sqlSave(ch, tbl, tablename = tnames[k], append = TRUE,
# # rownames = FALSE, colnames = FALSE, verbose = FALSE,
# # safer = TRUE, addPK = FALSE,varTypes=vT[[k]]),
# # error=function(e){print(paste("Error when writing",tnames[k],"from",names(DB)[j],"to Access database"))})}
# }
# #}}
# odbcClose(ch)
# # # |
require 'rails_helper'
RSpec.describe ResponseSerializer do
context 'attributes' do
describe 'votes' do
it 'should only return the votes from current game' do
prompt = create(:prompt)
correct_response = create(:response, prompt: prompt, game: nil, correct: true)
game = create(:game, room_code: 'VOTEY', started_at: Time.now, round: 1)
game_prompt = create(:game_prompt, game: game, prompt: prompt)
player = create(:player, game: game)
vote = create(:vote, response: correct_response, game: game, player: player)
other_game = create(:game, room_code: 'NOTVOTEY', started_at: Time.now, round: 1)
other_player = create(:player, game: other_game)
other_game_prompt = create(:game_prompt, game: other_game, prompt: prompt)
other_vote = create(:vote, response: correct_response, game: other_game, player: other_player)
response = ResponseSerializer.new(correct_response, params: { game_id: game.id }).serializable_hash
votes = response[:data][:attributes][:votes][:data]
expect(votes.count).to eq(1)
expect(votes.first[:id]).to eq(vote.id.to_s)
response = ResponseSerializer.new(correct_response, params: { game_id: other_game.id }).serializable_hash
votes = response[:data][:attributes][:votes][:data]
expect(votes.count).to eq(1)
expect(votes.first[:id]).to eq(other_vote.id.to_s)
end
end
end
end |
#include <iostream>
#include <fstream>
#include <algorithm>
#include <vector>
using namespace std;
const int ESTIMATE_INPUT_SIZE = 16;
int left_child_of(int node_index) {
return 2 * node_index + 1;
}
int parent_of(int node_index) {
return (node_index - 1) / 2;
}
// Repair the heap whose root element is at index `start`, assuming the
// heaps rooted at its children are valid
void sift_down(vector<int> &array, int root, int end) {
// While the root has at least one child
while (left_child_of(root) < end) {
// Get the left child of root
int child = left_child_of(root);
// If there is a right child and that child is greater
if (child + 1 < end && array[child] < array[child + 1]) {
child = child + 1;
}
if (array[root] < array[child]) {
swap(array[root], array[child]);
// Repeat to continue sifting down the child
root = child;
} else {
// The root holds the largest element, done
return;
}
}
}
// Put elements of array in heap order, in-place operation
void heapify(vector<int> &array, int length) {
// `start` is initialized to the first leaf node
// The last element in a 0-base array is at index length - 1; find
// the parent of that element
int start = parent_of(length + 1);
while (start > 0) {
// Go to the last non-heap node
--start;
// sift down the node at index `start` to the proper place such
// that all nodes below the `start` index are in heap order
sift_down(array, start, length);
}
// After sifting down the root all nodes/elements are in heap order
}
void heapsort(vector<int> &array, int length) {
// Build the heap in array so that largest value is at the root
heapify(array, length);
// The following loop maintains the invariants that array[0: end - 1]
// is a heap, and every element array[end: length - 1] beyond end is
// greater than everything before it
// i.e. array[end: length - 1] is in sorted order
int end = length;
while (end > 1) {
// Reduce the heap size
--end;
// array[0] si the root and largest value. Move it in front of
// the sorted elements
swap(array[0], array[end]);
// The heap property damaged, restore it
sift_down(array, 0, end);
}
}
int main() {
// Construct the file stream
ifstream input_file_stream;
// Connect the file stream to the actual file
input_file_stream.open("../input");
// Construct a vector of int to hold the input
vector<int> input = vector<int>();
// Set the capacity of this vector to avoid reallocation
// during the pushing process
input.reserve(ESTIMATE_INPUT_SIZE);
int element;
while (input_file_stream >> element) {
input.emplace_back(element);
}
// Process the input
heapsort(input, input.size());
for (int e: input) {
cout << e << endl;
}
// Print the result
return 0;
} |
import React from "react";
import { Link } from "react-scroll";
import { useState } from "react";
import "./Navbar1.scss";
import logo from "../../assets/Logo/black-logo.png";
import { IoMenu } from "react-icons/io5";
import { IoClose } from "react-icons/io5";
import { motion, AnimatePresence } from "framer-motion";
const navData = [
{
id: 1,
link: "Home",
},
{
id: 2,
link: "About",
},
{
id: 3,
link: "Services",
},
{
id: 4,
link: "Our Works",
},
{
id: 5,
link: "Contact",
},
];
const variants = {
hidden: {
x: "100%",
opacity: 0,
transition: {
duration: 2,
},
},
visible: {
x: 0,
opacity: 1,
transition: {
duration: 2,
},
},
};
function Navbar() {
const [nav, setNav] = useState(false);
const toggleNav = () => {
setNav(!nav);
};
const closeNav = () => {
setNav(false);
};
return (
<section className="navbar1">
<div className="navContainer">
<div className="logo">
<Link to="Home" smooth={true} duration={1300} onClick={closeNav}>
<img src={logo} alt="" />
</Link>
</div>
<ul>
{navData.map(({ id, link }) => (
<li key={id}>
<Link to={link} smooth={true} duration={1300} onClick={closeNav}>
{link}
</Link>
</li>
))}
</ul>
<div className="hamBurger" onClick={toggleNav}>
{nav ? <IoClose /> : <IoMenu />}
<AnimatePresence>
{nav && (
<motion.ul
className="mobileNav"
variants={variants}
initial="hidden"
animate="visible"
exit="hidden"
>
{navData.map(({ id, link }) => (
<li key={id}>
<Link
to={link}
smooth={true}
duration={1300}
onClick={closeNav}
>
{link}
</Link>
</li>
))}
</motion.ul>
)}
</AnimatePresence>
</div>
</div>
</section>
);
}
export default Navbar; |
#include "variadic_functions.h"
#include <stdarg.h>
/**
* sum_them_all - Returns the sum of all its paramters.
* @n: The number of argument passed to the function.
* @...: A variable number of paramters to calculate the sum of.
*
* Return: If neutral
*/
int sum_them_all(const unsigned int n, ...)
{
va_list vr;
unsigned int i, sum = 0;
va_start(vr, n);
for (i = 0; i < n; i++)
sum += va_arg(vr, int);
va_end(vr);
return (sum);
} |
<!DOCTYPE html>
<html
lang="en"
xmlns:th="http://www.thymeleaf.org"
xmlns:sec="https://www.thymeleaf.org/thymeleaf-extras-springsecurity5"
>
<head>
<meta charset="utf-8" />
<title>Lg Issue Report</title>
<link rel="stylesheet" href="https://stackpath.bootstrapcdn.com/bootstrap/4.5.2/css/bootstrap.min.css" integrity="sha384-JcKb8q3iqJ61gNV9KGb8thSsNjpSL0n8PARn9HuZOnIxN0hoP+VmmDGMN5t9UJ0Z" crossorigin="anonymous">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/6.2.0/css/all.min.css" integrity="sha512-xh6O/CkQoPOWDdYTDqeRdPCVd1SpvCA9XXcUnZS2FmJNp1coAFzvtCN9BmamE+4aHK8yyUHUSCcJHgXloTyT2A==" crossorigin="anonymous" referrerpolicy="no-referrer" />
<link th:rel="stylesheet" th:href="@{/css/datepicker/bootstrap-datepicker.css}"/>
<link th:rel="stylesheet" th:href="@{css/datepicker/bootstrap-datepicker.standalone.css}"/>
<link href="/css/common.css" rel="stylesheet" />
<link href="/css/main.css" rel="stylesheet" />
</head>
<body>
<div th:replace="fragments/header :: header"></div>
<div class="container">
<div>
<div class="container-fluid" style="max-width: 1200px; margin: 0 auto">
<h2 class="title">Issues</h2>
<div class="card my-3">
<div class="card-body p-2 my-2">
<form th:action="@{/issues}" method="get" id="searchForm">
<div class="form-group row">
<div class="col-md-3">
<label for="lgReference">LG Reference#</label>
<div>
<input
id="lgReference"
type="text"
name="lgReference"
th:value="${lgReference}"
class="form-control"
/>
</div>
</div>
<div class="col-md-3">
<label for="lgType">LG Type</label>
<div>
<input
id="lgType"
type="text"
name="lgType"
th:value="${lgType}"
class="form-control"
/>
</div>
</div>
<div class="col-md-3">
<label for="iban">IBAN#</label>
<div>
<input
id="iban"
type="text"
name="iban"
th:value="${iban}"
class="form-control"
/>
</div>
</div>
<div class="col-md-3">
<label for="lgType">Applicant CIF#</label>
<div>
<input
id="applicantCif"
type="text"
name="applicantCif"
th:value="${applicantCif}"
class="form-control"
/>
</div>
</div>
</div>
<div class="form-group row">
<div class="col-md-3">
<label for="issueDate">LG Issue Date</label>
<div class="input-group date">
<input
type="text"
id="issueDate"
name="issueDate"
autocomplete="off"
th:value="${#dates.format(issueDate,'yyyy-MM-dd')}"
class="form-control"
/>
<div class="input-group-append">
<span class="input-group-text"
><i class="far fa-calendar-alt"></i
></span>
</div>
</div>
</div>
<div class="col-md-3">
<label for="startDate">Start Date</label>
<div class="input-group date">
<input
type="text"
id="startDate"
name="startDate"
autocomplete="off"
th:value="${#dates.format(startDate,'yyyy-MM-dd')}"
class="form-control"
/>
<div class="input-group-append">
<span class="input-group-text"
><i class="far fa-calendar-alt"></i
></span>
</div>
</div>
</div>
<div class="col-md-3">
<label for="endDate">End Date</label>
<div class="input-group date">
<input
type="text"
id="endDate"
name="endDate"
autocomplete="off"
th:value="${#dates.format(endDate,'yyyy-MM-dd')}"
class="form-control"
/>
<div class="input-group-append">
<span class="input-group-text"
><i class="far fa-calendar-alt"></i
></span>
</div>
</div>
</div>
<div class="col-md-3">
<label for="filter" style="visibility: hidden">btn</label>
<div class="input-group">
<button
type="button"
id="filter"
name="action"
onclick="searchIssue()"
value="searchIssue"
class="btn btn-primary btn-sm"
>
Search
</button>
</div>
</div>
</div>
</form>
</div>
</div>
<div class="card mb-2">
<div class="card-body">
<table class="table table-responsive-xl m-0">
<thead class="thead-light">
<tr>
<th scope="col">Id</th>
<th scope="col">LG Reference#</th>
<th scope="col">LG Type</th>
<th scope="col">LG Issue Date</th>
<th scope="col">Amount & Ccy</th>
<th scope="col">Applicant Name</th>
<th scope="col">IBAN#</th>
<th scope="col">Status</th>
<th scope="col"></th>
</tr>
</thead>
<tbody>
<tr th:unless="${issues.size() > 0}">
<td colspan="8" class="text-center">No issues found!</td>
</tr>
<tr
th:if="${issues.size() > 0}"
th:each="issue : ${issues}"
>
<th scope="row">[[${issue.id}]]</th>
<td>[[${issue.lgNumber}]]</td>
<td>[[${issue.lgType}]]</td>
<td>
<span
th:text="${#dates.format(issue.issueDate,'yyyy-MM-dd')}"
></span>
</td>
<td>[[${issue.amount}]] [[${issue.currency}]]</td>
<td>[[${issue.entityName}]]</td>
<td>[[${issue.appIban}]]</td>
<td>[[${issue.status}]]</td>
<td>
<a
th:href="@{'/issues/' + ${issue.id}}"
title="View this issue"
class="fa-solid fa-eye icon-dark"
></a>
</td>
</tr>
</tbody>
</table>
</div>
</div>
<nav aria-label="Pagination" th:if="${totalPages > 0}">
<ul class="pagination justify-content-center">
<li
class="page-item"
th:classappend="${currentPage == 1} ? 'disabled'"
>
<a
th:replace="fragments/paging :: paging(1, '<<', 'First Page')"
></a>
</li>
<li
class="page-item font-weight-bold"
th:classappend="${currentPage == 1} ? 'disabled'"
>
<a
th:replace="fragments/paging :: paging(${currentPage - 1}, '<', 'Previous Page')"
></a>
</li>
<li class="page-item disabled" th:if="${currentPage - 2 > 1}">
<a class="page-link" href="#">...</a>
</li>
<li
class="page-item"
th:classappend="${page == currentPage} ? 'active'"
th:each="page : ${#numbers.sequence(currentPage > 2 ? currentPage - 2 : 1, currentPage + 2 < totalPages ? currentPage + 2 : totalPages)}"
>
<a
th:replace="fragments/paging :: paging(${page}, ${page}, 'Page ' + ${page})"
></a>
</li>
<li
class="page-item disabled"
th:if="${currentPage + 2 < totalPages}"
>
<a class="page-link" href="#">...</a>
</li>
<li
class="page-item font-weight-bold"
th:classappend="${currentPage == totalPages} ? 'disabled'"
>
<a
th:replace="fragments/paging :: paging(${currentPage + 1},'>', 'Next Page')"
></a>
</li>
<li
class="page-item"
th:classappend="${currentPage == totalPages} ? 'disabled'"
>
<a
th:replace="fragments/paging :: paging(${totalPages}, '>>', 'Last Page')"
></a>
</li>
</ul>
</nav>
</div>
</div>
</div>
<script src="https://code.jquery.com/jquery-3.5.1.slim.min.js" integrity="sha384-DfXdz2htPH0lsSSs5nCTpuj/zy4C+OGpamoFVy38MVBnE+IbbVYUew+OrCXaRkfj" crossorigin="anonymous"></script>
<script src="https://cdn.jsdelivr.net/npm/popper.js@1.16.1/dist/umd/popper.min.js" integrity="sha384-9/reFTGAW83EW2RDu2S0VKaIzap3H66lZH81PoYlFhbGU+6BZp6G7niu735Sk7lN" crossorigin="anonymous"></script>
<script src="https://stackpath.bootstrapcdn.com/bootstrap/4.5.2/js/bootstrap.min.js" integrity="sha384-B4gt1jrGC7Jh4AgTPSdUtOBvfO8shuf57BaghqFfPlYxofvL8/KUEfYiJOMMV+rV" crossorigin="anonymous"></script>
<script th:src="@{/js/datepicker/bootstrap-datepicker.js}"></script>
<script src="/js/main.js"></script>
<script>
$(".input-group.date").datepicker({
autoclose: true,
todayHighlight: true,
format: "yyyy-mm-dd",
});
function searchIssue() {
$("#searchForm").submit( ); // Submit the form
}
</script>
</body>
</html> |
<!DOCTYPE html>
<html lang="en">
<head>
<link href="https://cdnjs.cloudflare.com/ajax/libs/extjs/6.0.0/classic/theme-classic/resources/theme-classic-all.css"
rel="stylesheet"/>
<meta charset="UTF-8">
<title>Title</title>
<script src="../static/js/ext-all.js"></script>
<script type="text/javascript">
Ext.define('StudentModel',{
extend:'Ext.data.Model' ,
fields: [
{name:'name',mapping:'name'},
{name:'age',mapping:'age'},
{name:'marks',mapping:'marks'}]
});
Ext.onReady(function () {
var myData=[
{name:'Ash',age:"20",marks:"91"},
{name:'Ash',age:"20",marks:"91"},
{name:'Ash',age:"20",marks:"91"},
{name:'Ash',age:"20",marks:"91"},
{name:'Ash',age:"20",marks:"91"}] ;
var gridStore=Ext.create('Ext.data.Store',{
model:'StudentModel',
data:myData
});
Ext.create('Ext.grid.Panel',{
id:'gridId',
store:gridStore,
stripeRows:true,
title:'Student Grid',
renderTo:'gridDiv',
width:600,
collapsible:true,
enableColumnMove :true,
enableColumnResize:true,
columns:[{
header:"Student Name",
dataIndex:'name',
id:'name',
flex:1,
sortable:true,
hideable:true
},{
header:"Age",
dataIndex:'age',
flex:.5,
sortable:true,
hideable:false
},{
sortable:true,
hideable:true,
flex:.5,
dataIndex:'marks',
header:"Marks"
}]
})
})
</script>
</head>
<body>
<div id = "gridDiv"></div>
</body>
</html> |
'use strict';
var __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {
return new (P || (P = Promise))(function (resolve, reject) {
function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }
function rejected(value) { try { step(generator["throw"](value)); } catch (e) { reject(e); } }
function step(result) { result.done ? resolve(result.value) : new P(function (resolve) { resolve(result.value); }).then(fulfilled, rejected); }
step((generator = generator.apply(thisArg, _arguments || [])).next());
});
};
Object.defineProperty(exports, "__esModule", { value: true });
const vscode_1 = require("vscode");
const cp = require("child_process");
const path = require("path");
const pathExists = require('path-exists');
const expandHomeDir = require('expand-home-dir');
const findJavaHome = require('find-java-home');
const isWindows = process.platform.indexOf('win') === 0;
const JAVAC_FILENAME = 'javac' + (isWindows ? '.exe' : '');
/**
* Resolves the requirements needed to run the extension.
* Returns a promise that will resolve to a RequirementsData if
* all requirements are resolved, it will reject with ErrorData if
* if any of the requirements fails to resolve.
*
*/
function resolveRequirements() {
return __awaiter(this, void 0, void 0, function* () {
let java_home = yield checkJavaRuntime();
let javaVersion = yield checkJavaVersion(java_home);
return Promise.resolve({ 'java_home': java_home, 'java_version': javaVersion });
});
}
exports.resolveRequirements = resolveRequirements;
function checkJavaRuntime() {
return new Promise((resolve, reject) => {
let source;
let javaHome = readJavaConfig();
if (javaHome) {
source = 'The java.home variable defined in VS Code settings';
}
else {
javaHome = process.env['JDK_HOME'];
if (javaHome) {
source = 'The JDK_HOME environment variable';
}
else {
javaHome = process.env['JAVA_HOME'];
source = 'The JAVA_HOME environment variable';
}
}
if (javaHome) {
javaHome = expandHomeDir(javaHome);
if (!pathExists.sync(javaHome)) {
openJDKDownload(reject, source + ' points to a missing folder');
}
if (!pathExists.sync(path.resolve(javaHome, 'bin', JAVAC_FILENAME))) {
openJDKDownload(reject, source + ' does not point to a JDK.');
}
return resolve(javaHome);
}
//No settings, let's try to detect as last resort.
findJavaHome(function (err, home) {
if (err) {
openJDKDownload(reject, 'Java runtime could not be located');
}
else {
resolve(home);
}
});
});
}
function readJavaConfig() {
const config = vscode_1.workspace.getConfiguration();
return config.get('java.home', null);
}
function checkJavaVersion(java_home) {
return new Promise((resolve, reject) => {
cp.execFile(java_home + '/bin/java', ['-version'], {}, (error, stdout, stderr) => {
if (stderr.indexOf('version "9') > -1) {
resolve(9);
}
if (stderr.indexOf('1.8') < 0) {
openJDKDownload(reject, 'Java 8 is required to run. Please download and install a JDK 8.');
}
else {
resolve(8);
}
});
});
}
function openJDKDownload(reject, cause) {
let jdkUrl = 'http://developers.redhat.com/products/openjdk/overview/';
if (process.platform === 'darwin') {
jdkUrl = 'http://www.oracle.com/technetwork/java/javase/downloads/index.html';
}
reject({
message: cause,
label: 'Get Java Development Kit',
openUrl: vscode_1.Uri.parse(jdkUrl),
replaceClose: false
});
}
//# sourceMappingURL=requirements.js.map |
<div align="center">
<a href="https://git.io/typing-svg">
<img src="https://readme-typing-svg.demolab.com?font=Silkscreen&size=20&duration=1500&pause=1000¢er=true&vCenter=true&multiline=true&repeat=false&random=false&width=700&height=110&lines=API+MEDICAL"
alt="Typing SVG" />
</a>
<h5 align="center">
<b> Completo ✅ </b> | <a href="https://www.figma.com/design/vSaxzTG4lAqfYbJICvnf8Q/API_Medical?node-id=0-1&t=f3L0ZsehXjotUr6u-1">FIGMA📱</a>
</h5>
</div>
# Medical API Controller
Este projeto é um exemplo de uma API para gerenciamento de médicos utilizando Spring Boot. Ele inclui funcionalidades para cadastrar, listar, atualizar e excluir médicos no sistema. O código foi desenvolvido seguindo boas práticas de programação e utilizando recursos modernos do Spring Framework.
## Tecnologias Utilizadas
- **Java** ☕: Linguagem de programação principal.
- **Spring Boot** 🚀: Framework para criar APIs de forma rápida e eficiente.
- **Jakarta Validation** ✅: Para validação de dados nas requisições.
- **Jakarta Transactional** 🔄: Para gerenciamento de transações.
## Estrutura do Projeto
### Pacotes Principais
- `client.medical.api.controller`: Contém os controladores REST.
- `client.medical.api.medico`: Contém as classes relacionadas à entidade Medico, como DTOs e repositórios.
### Controlador Principal
A classe `MedicoController` fornece os seguintes endpoints:
#### 1. **Cadastrar Médico**
- **Endpoint**: `POST /medicos`
- **Descrição**: Cadastra um novo médico no sistema.
- **Payload**:
```json
{
"nome": "string",
"email": "string",
"telefone": "string",
"crm": "string",
"especialidade": "string",
"endereco": {
"logradouro": "string",
"numero": "string",
"bairro": "string",
"cidade": "string",
"uf": "string"
}
}
```
#### 2. **Listar Médicos**
- **Endpoint**: `GET /medicos`
- **Descrição**: Retorna uma lista paginada de médicos ativos.
- **Parâmetros de Consulta**:
- `size`: Tamanho da página (padrão: 10).
- `sort`: Campo para ordenação (padrão: nome).
#### 3. **Atualizar Médico**
- **Endpoint**: `PUT /medicos`
- **Descrição**: Atualiza as informações de um médico existente.
- **Payload**:
```json
{
"id": "long",
"nome": "string",
"email": "string",
"telefone": "string",
"crm": "string",
"especialidade": "string",
"endereco": {
"logradouro": "string",
"numero": "string",
"bairro": "string",
"cidade": "string",
"uf": "string"
}
}
```
#### 4. **Excluir Médico**
- **Endpoint**: `DELETE /medicos/{id}`
- **Descrição**: Marca um médico como inativo no sistema.
## Configuração e Execução
### Pré-requisitos
1. **Java 17** ou superior.
2. **Maven** para gerenciamento de dependências.
3. Banco de dados configurado com suporte ao Spring Data JPA.
---
<div align="center">
## 👩🏻💻 Autor <br>
<table>
<tr>
<td align="center">
<a href="https://lucasmessias.vercel.app">
<img src="https://avatars.githubusercontent.com/u/e?email=robsonlmds@hotmail.com&s=500" width="100px;" title="Autor Robson Lucas Messias" alt="Foto de Perfil do GitHub - Robson Lucas Messias"/><br>
<sub>
<b>Robson Lucas Messias</b>
</sub>
</a>
</td>
</tr>
</table>
</div>
<h4 align="center">
Made by: Robson Lucas Messias | <a href="mailto:robsonlmds@hotmail.com">Contato</a>
</h4>
<p align="center">
<a href="https://www.linkedin.com/in/r-lucas-messias/">
<img alt="Robson Lucas Messias" src="https://img.shields.io/badge/LinkedIn-R.Lucas_Messias-0e76a8?style=flat&logoColor=white&logo=linkedin">
</a>
</p>
<h1 align="center">
<img src="https://readme-typing-svg.herokuapp.com/?font=Silkscreen&size=35¢er=true&vCenter=true&width=700&height=70&duration=5000&lines=Obrigado+pela+atenção!;" />
</h1> |
/*****************************************************************************
* *
* UNURAN -- Universal Non-Uniform Random number generator *
* *
*****************************************************************************
* *
* FILE: unur_specfunct_source.h *
* *
* PURPOSE: *
* prototypes and macros for using special functions like erf(), *
* gamma(), beta(), etc., which are imported from other packages. *
* *
*****************************************************************************
* *
* Copyright (c) 2000-2010 Wolfgang Hoermann and Josef Leydold *
* Department of Statistics and Mathematics, WU Wien, Austria *
* *
* This program is free software; you can redistribute it and/or modify *
* it under the terms of the GNU General Public License as published by *
* the Free Software Foundation; either version 2 of the License, or *
* (at your option) any later version. *
* *
* This program is distributed in the hope that it will be useful, *
* but WITHOUT ANY WARRANTY; without even the implied warranty of *
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the *
* GNU General Public License for more details. *
* *
* You should have received a copy of the GNU General Public License *
* along with this program; if not, write to the *
* Free Software Foundation, Inc., *
* 59 Temple Place, Suite 330, Boston, MA 02111-1307, USA *
* *
*****************************************************************************/
/*---------------------------------------------------------------------------*/
#ifndef UNUR_SPECFUNCT_SOURCE_H_SEEN
#define UNUR_SPECFUNCT_SOURCE_H_SEEN
/*---------------------------------------------------------------------------*/
/*****************************************************************************
* *
* Prototypes for special functions like erf(), gamma(), beta(), etc. *
* which are imported from other packages. *
* *
* We use the package CEPHES/DOUBLE for computing these functions *
* (available from NETLIB, http://www.netlib.org/cephes/ *
* Copyright 1984 - 1994 by Stephen L. Moshier *
* *
* Alternatively, we also can use the functions from the Rmath library *
* from the R project for statistical computing, http://www.R-project.org/ *
* *
*****************************************************************************/
/* We define macros for special functions.
*
* The following macros must be defined:
*
* _unur_SF_incomplete_beta ... incomplete beta integral
* _unur_SF_ln_gamma ... logarithm of gamma function
* _unur_SF_ln_factorial ... logarithm of factorial
* _unur_SF_incomplete_gamma ... incomplete gamma function
* _unur_SF_cdf_normal ... CDF of normal distribution
* _unur_SF_invcdf_normal ... inverse CDF of normal distribution
*
*---------------------------------------------------------------------------*/
#ifdef HAVE_LIBRMATH
/*---------------------------------------------------------------------------*/
/* Routines from the Rmath library (R project). */
/*---------------------------------------------------------------------------*/
/* we have to distinguish between two cases: */
# ifdef R_UNURAN
/* Rmath for 'Runuran': nothing special to do. */
# else
/* Rmath standalone library. */
# define MATHLIB_STANDALONE
# endif
/* include Rmath header file */
# include <Rmath.h>
/* we have to #undef some macros from Rmath.h */
#ifdef trunc
#undef trunc
#endif
#ifdef beta
#undef beta
#endif
/* ......................................................................... */
/* incomplete beta integral */
#define _unur_SF_incomplete_beta(x,a,b) pbeta((x),(a),(b),TRUE,FALSE)
/* logarithm of gamma function */
#define _unur_SF_ln_gamma(x) lgammafn(x)
/* logarithm of factorial */
#define _unur_SF_ln_factorial(x) lgammafn((x)+1.)
/* incomplete gamma function */
#define _unur_SF_incomplete_gamma(x,a) pgamma(x,a,1.,TRUE,FALSE)
/* modified Bessel function K_nu of second kind (AKA third kind) */
#define _unur_SF_bessel_k(x,nu) bessel_k((x),(nu),1)
/* Normal distribution */
#define _unur_SF_cdf_normal(x) pnorm((x),0.,1.,TRUE,FALSE)
#define _unur_SF_invcdf_normal(u) qnorm((u),0.,1.,TRUE,FALSE)
/* ..........................................................................*/
/* Beta Distribution */
#define _unur_SF_invcdf_beta(u,p,q) qbeta((u),(p),(q),TRUE,FALSE)
/* F Distribution */
#define _unur_SF_cdf_F(x,nua,nub) pf((x),(nua),(nub),TRUE,FALSE)
#define _unur_SF_invcdf_F(u,nua,nub) qf((u),(nua),(nub),TRUE,FALSE)
/* Gamma Distribution */
#define _unur_SF_invcdf_gamma(u,shape,scale) qgamma((u),(shape),(scale),TRUE,FALSE)
/* Student t Distribution */
#define _unur_SF_cdf_student(x,nu) pt((x),(nu),TRUE,FALSE)
#define _unur_SF_invcdf_student(u,nu) qt((u),(nu),TRUE,FALSE)
/* Binomial Distribution */
#define _unur_SF_invcdf_binomial(u,n,p) qbinom((u),(n),(p),TRUE,FALSE)
/* Hypergeometric Distribution */
#define _unur_SF_cdf_hypergeometric(x,N,M,n) phyper((x),(M),(N)-(M),(n),TRUE,FALSE)
#define _unur_SF_invcdf_hypergeometric(u,N,M,n) qhyper((u),(M),(N)-(M),(n),TRUE,FALSE)
/* Negative Binomial Distribution */
#define _unur_SF_cdf_negativebinomial(x,n,p) pnbinom((x),(n),(p),TRUE,FALSE)
#define _unur_SF_invcdf_negativebinomial(u,n,p) qnbinom((u),(n),(p),TRUE,FALSE)
/* Poisson Distribution */
#define _unur_SF_invcdf_poisson(u,theta) qpois((u),(theta),TRUE,FALSE)
/*---------------------------------------------------------------------------*/
/* end: Rmath library (R project) */
/*---------------------------------------------------------------------------*/
#else
/*---------------------------------------------------------------------------*/
/* Routines from the CEPHES library. */
/*---------------------------------------------------------------------------*/
/* incomplete beta integral */
double _unur_cephes_incbet(double a, double b, double x);
#define _unur_SF_incomplete_beta(x,a,b) _unur_cephes_incbet((a),(b),(x))
/* logarithm of gamma function */
double _unur_cephes_lgam(double x);
#define _unur_SF_ln_gamma(x) _unur_cephes_lgam(x)
/* logarithm of factorial */
#define _unur_SF_ln_factorial(x) _unur_cephes_lgam((x)+1.)
/* incomplete gamma function */
double _unur_cephes_igam(double a, double x);
#define _unur_SF_incomplete_gamma(x,a) _unur_cephes_igam((a),(x))
/* normal distribution function */
double _unur_cephes_ndtr(double x);
#define _unur_SF_cdf_normal(x) _unur_cephes_ndtr(x)
/* inverse of normal distribution function */
double _unur_cephes_ndtri(double x);
#define _unur_SF_invcdf_normal(x) _unur_cephes_ndtri(x)
/*---------------------------------------------------------------------------*/
/* end: CEPHES library */
/*---------------------------------------------------------------------------*/
#endif
/*****************************************************************************
* *
* Replacement for missing (system) functions *
* *
*****************************************************************************/
#if !HAVE_DECL_LOG1P
/* log(1+x) */
/* (replacement for missing C99 function log1p) */
double _unur_log1p(double x);
#endif
/*---------------------------------------------------------------------------*/
#endif /* UNUR_SPECFUNCT_SOURCE_H_SEEN */
/*---------------------------------------------------------------------------*/ |
#include <OneWire.h>
#include <DallasTemperature.h>
#include <LiquidCrystal.h>
#include <ESP8266WiFi.h>
#include <WiFiUdp.h>
//#define DEBUG (1)
#define ESP8266
#define USEWIFI
#define TEMP_HIGH_LIM 40.0f
#define TEMP_LOW_LIM 38.0f
#define FILTER_SZ 50
// initialize the library by associating any needed LCD interface pin
// with the arduino pin number it is connected to
#ifndef ESP8266
#define ONE_WIRE_BUS 3
#define RELAY_PIN 4
const int rs = 10, en = 9, d4 = 8, d5 = 7, d6 = 6, d7 = 5;
#else
#define ONE_WIRE_BUS 4
#define RELAY_PIN 5
//const int rs = D3, en = D4, d4 = D6, d5 = D5, d6 = D8, d7 = D7;
const int rs = 0, en = 2, d4 = 12, d5 = 14, d6 = 15, d7 = 13;
#endif
#ifndef USEWIFI
LiquidCrystal lcd(rs, en, d4, d5, d6, d7);
#else
int port = 8888;
WiFiUDP udp;
char incomingPacket[256];
const char *ssid = "DongXia_AP";
const char *password = "mywififoralexa";
const char *ipaddr = "192.168.4.2";
char tempHead[] = "temp:";
char rlyHead[] = "rly:";
char maxMinHead[] = "maxmin:";
char rlyCntHead[] = "rcnt:";
bool relayState;
#endif
OneWire onewire(ONE_WIRE_BUS);
DallasTemperature tempSensor(&onewire);
//filter buffer
static float buffer[FILTER_SZ];
void setup() {
#ifdef DEBUG
Serial.begin(9600);
#endif
pinMode(LED_BUILTIN, OUTPUT);
pinMode(RELAY_PIN, OUTPUT);
relayState = false;
tempSensor.begin();
#ifdef USEWIFI
WiFi.softAP(ssid, password);
udp.begin(port);
sndUdpPacket(rlyHead, 0);
#else
// set up the LCD's number of columns and rows:
lcd.begin(16, 2);
lcd.setCursor(0,0);
lcd.print("T: N:");
lcd.setCursor(0,1);
lcd.print("RLY:OFF[ - ]*C");
#endif
for(int i = 0; i < FILTER_SZ; i++) buffer[i] = 0.0f;
}
void loop() {
static float sum = 0.0f;
static int idx = 0;
static float maxTemp = 0, minTemp = 999.0;
static bool isNotReady = 1;
static int rlyCnter = 0;
tempSensor.requestTemperatures(); // Send the command to get temperatures
float temp = tempSensor.getTempCByIndex(0);
#ifdef DEBUG
Serial.println(temp);
#endif
//Moving average the temperature
sum += temp;
sum -= buffer[idx];
buffer[idx] = temp;
idx = (idx < FILTER_SZ - 1)?(idx+1):0;
temp = sum/FILTER_SZ;
#ifdef DEBUG
Serial.print("averaged:");
Serial.println(temp);
#endif
if(isNotReady && (idx == FILTER_SZ -1)) isNotReady = 0;
// Print a message to the LCD.
if(isNotReady) return;
#ifndef USEWIFI
DisplayTemp(temp);
#else
//send the temperature via udp
sndUdpPacket(tempHead, (int) temp);
#endif
maxTemp = (maxTemp<temp)?temp:maxTemp;
minTemp = (minTemp>temp)?temp:minTemp;
#ifndef USEWIFI
DisplayMaxMinTemp(maxTemp,minTemp);
#else
int maxmin = (int(maxTemp+0.5) << 8) + (int(minTemp+0.5));
sndUdpPacket(maxMinHead, maxmin);
#endif
ReactWithTemp(temp, &rlyCnter);
#ifndef USEWIFI
DisplayRelayCounter(rlyCnter);
#else
sndUdpPacket(rlyCntHead, rlyCnter);
sndUdpPacket(rlyHead, relayState?1:0);
#endif
}
#ifndef USEWIFI
void DisplayTemp(float temp)
{
lcd.setCursor(2,0);
if(temp > 99.9f) lcd.print("N/A");
else if(temp < 0.0f) lcd.print("ERR!");
else {
int inttemp = temp * 1000 + 5;
lcd.print((float)inttemp/1000.0f);
lcd.print("*C");
}
}
void DisplayMaxMinTemp(float maxtemp, float mintemp)
{
lcd.setCursor(8,1);
lcd.print(int(mintemp+0.5));
lcd.print("-");
lcd.print(int(maxtemp+0.5));
}
void DisplayRelayCounter(int no)
{
lcd.setCursor(13,0);
lcd.print(no);
}
#endif
uint8_t ReactWithTemp(float temp, int * pcnter)
{
static uint8_t statusRelay = 0;
if((temp <= TEMP_LOW_LIM) && (statusRelay == 0))
{
statusRelay = 1;
RelayOn();
(*pcnter)++;
}
if((temp >= TEMP_HIGH_LIM) && (statusRelay == 1))
{
statusRelay = 0;
RelayOff();
(*pcnter)++;
}
return statusRelay;
}
void RelayOn()
{
#ifndef ESP8266
digitalWrite(LED_BUILTIN, HIGH);
#endif
digitalWrite(RELAY_PIN, HIGH);
#ifdef DEBUG
Serial.println("RELAY ON");
#endif
#ifndef USEWIFI
lcd.setCursor(4,1);
lcd.print("ON ");
#else
//sndUdpPacket(rlyHead, 1);
relayState = true;
#endif
}
void RelayOff()
{
#ifndef ESP8266
digitalWrite(LED_BUILTIN, LOW);
#endif
digitalWrite(RELAY_PIN, LOW);
#ifdef DEBUG
Serial.println("RELAY OFF");
#endif
#ifndef USEWIFI
lcd.setCursor(4,1);
lcd.print("OFF");
#else
//sndUdpPacket(rlyHead, 0);
relayState = false;
#endif
}
#ifdef USEWIFI
void sndUdpPacket(char * preStr, int val)
{
String str = String(val);
udp.beginPacket(ipaddr, port);
udp.write(preStr);
udp.write(str.c_str());
udp.write(";");
udp.endPacket();
}
#endif |
package me.fengyj.springdemo.web.configs;
import me.fengyj.springdemo.utils.exceptions.ResourceNotFoundException;
import me.fengyj.springdemo.utils.exceptions.UserInvalidInputException;
import org.springframework.http.HttpStatusCode;
import org.springframework.web.bind.annotation.ControllerAdvice;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.servlet.ModelAndView;
@ControllerAdvice
public class ExceptionHandlerAdvice {
@ExceptionHandler(UserInvalidInputException.class)
public ModelAndView handleUserInvalidInputException(UserInvalidInputException ex) {
ModelAndView mav = new ModelAndView("errors/400");
mav.setStatus(HttpStatusCode.valueOf(400));
mav.addObject("message", ex.getMessage());
mav.addObject("rawInput", ex.getRawInput());
return mav;
}
@ExceptionHandler(ResourceNotFoundException.class)
public ModelAndView handleResourceNotFoundException(ResourceNotFoundException ex) {
ModelAndView mav = new ModelAndView("errors/404");
mav.setStatus(HttpStatusCode.valueOf(404));
mav.addObject("message", ex.getMessage());
mav.addObject("resourceId", ex.getResourceId());
mav.addObject("resourceType", ex.getResourceType());
return mav;
}
@ExceptionHandler(Exception.class)
public ModelAndView handleUnknownException(Exception ex) {
ModelAndView mav = new ModelAndView("errors/500");
mav.setStatus(HttpStatusCode.valueOf(500));
mav.addObject("message", ex.getMessage());
return mav;
}
} |
/* eslint-disable consistent-return */
import React from 'react';
import { Link, useForm, usePage } from '@inertiajs/react';
import { Camera, Loader2 } from 'lucide-react';
import { Transition } from '@headlessui/react';
import { PageProps } from '@/types';
import { UpdateUser } from '@/types/user';
import { cn, getInitial } from '@/lib/utils';
import { Label } from '@/components/ui/label';
import { Input } from '@/components/ui/input';
import InputError from '@/components/input-error';
import { Button, buttonVariants } from '@/components/ui/button';
import { Avatar, AvatarFallback } from '@/components/ui/avatar';
interface ProfileInformationProps {
mustVerifyEmail: boolean;
}
function ProfileInformationForm({ mustVerifyEmail }: ProfileInformationProps) {
// hooks
const { user, status } = usePage<PageProps>().props;
const { data, setData, post, errors, processing, recentlySuccessful } =
useForm<UpdateUser>({
name: user.name,
email: user.email,
_method: 'PATCH',
});
// states
const falbackAvatar = getInitial(user.name);
const [avatarUrl, setAvatarUrl] = React.useState<string | null>(
user.avatar || null
);
// events
const onChangeAvatar = (e: React.ChangeEvent<HTMLInputElement>) => {
const { files } = e.target;
if (files && files.length > 0) {
setData('avatar', files[0]);
const imageUrl = window.URL.createObjectURL(files[0]);
setAvatarUrl(imageUrl);
return () => window.URL.revokeObjectURL(imageUrl);
}
};
const onSubmit: React.FormEventHandler = (e) => {
e.preventDefault();
post(route('profile.update'));
};
// render
React.useEffect(() => {
setAvatarUrl(user.avatar || null);
}, [user]);
return (
<section id="profile-information-form">
<header>
<h2 className="text-lg font-medium">Profile Information</h2>
<p className="mt-1 text-sm text-muted-foreground">
Update your account's profile information and email address.
</p>
</header>
<form
onSubmit={onSubmit}
className="mt-5 space-y-5"
encType="multipart/form-data"
>
<div className="relative">
{avatarUrl !== null ? (
<img
src={avatarUrl}
alt={`@${user.name}`}
className="mx-auto h-36 w-36 rounded-full border border-border object-cover"
/>
) : (
<Avatar
className={cn(
'mx-auto h-36 w-36 rounded-full border border-border'
)}
>
<AvatarFallback>{falbackAvatar}</AvatarFallback>
</Avatar>
)}
<Label
htmlFor="avatar"
className={buttonVariants({
size: 'icon',
variant: 'ghost',
className:
'absolute bottom-1 left-1/2 -translate-x-1/2 transform rounded-full hover:cursor-pointer hover:bg-black/10',
})}
tabIndex={0}
>
<Camera className="h-5 w-5 text-white mix-blend-difference" />
<input
type="file"
id="avatar"
name="avatar"
onChange={onChangeAvatar}
className="hidden"
accept="image/*"
/>
</Label>
</div>
<div>
<Label htmlFor="name">Name</Label>
<Input
id="name"
type="text"
value={data.name}
onChange={(e) => setData('name', e.target.value)}
autoComplete="name"
autoFocus
/>
<InputError message={errors.name} className="mt-2" />
</div>
<div>
<Label htmlFor="email">Email</Label>
<Input
id="email"
type="email"
value={data.email}
onChange={(e) => setData('email', e.target.value)}
autoComplete="email"
/>
<InputError message={errors.email} className="mt-2" />
</div>
{mustVerifyEmail && user.email_verified_at === null && (
<div>
<p className="mt-3 text-sm">
Your email address is unverified.
<Link
href={route('verification.send')}
method="post"
as="button"
className={buttonVariants({ variant: 'link', size: 'sm' })}
>
Click here to re-send email.
</Link>
</p>
{status === 'verification-link-sent' && (
<Transition
show={status === 'verification-link-sent'}
enter="transition ease-in-out"
enterFrom="opacity-0"
leave="transition ease-in-out"
leaveTo="opacity-0"
>
<div className="mt-3 text-sm font-medium text-emerald-500">
A new verification link has been sent to your email address.
</div>
</Transition>
)}
</div>
)}
<div className="flex items-center gap-5">
<Button
type="submit"
disabled={processing}
className="inline-flex items-center justify-center gap-2"
>
{processing && <Loader2 className="h-4 w-4 animate-spin" />}
Save
</Button>
<Transition
show={recentlySuccessful}
enter="transition ease-in-out"
enterFrom="opacity-0"
leave="transition ease-in-out"
leaveTo="opacity-0"
>
<p className="text-sm text-primary">Saved.</p>
</Transition>
</div>
</form>
</section>
);
}
export default ProfileInformationForm; |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.