-- Source: mrphlip/aoc, 2020/05.hs
{-# OPTIONS_GHC -Wno-tabs #-}
import Data.List
import Control.Exception
import Utils

getInput :: IO [Integer]
getInput = do
    dat <- readFile "05.txt"
    return $ map readPass $ lines dat

-- Interpret a pass as a binary number: B/R are 1 bits, F/L are 0 bits.
-- fromBaseN comes from the repo's Utils module.
readPass :: String -> Integer
readPass = fromBaseN 2 . map charToBinary
  where
    charToBinary 'F' = 0
    charToBinary 'B' = 1
    charToBinary 'L' = 0
    charToBinary 'R' = 1

tests :: IO ()
tests = do
    check $ readPass "BFFFBBFRRR" == 567
    check $ readPass "FFFBBBFRRR" == 119
    check $ readPass "BBFFBBFRLL" == 820
  where
    check True = return ()
    check False = throwIO $ AssertionFailed "test failed"

main :: IO ()
main = do
    passes <- getInput
    let sorted = sort passes
    print $ last sorted
    print [x + 1 | (x, y) <- zip sorted (tail sorted), x /= y - 1]
-- https://raw.githubusercontent.com/mrphlip/aoc/06395681eb6b50b838cd4561b2e0aa772aca570a/2020/05.hs
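The Haskell solution above reads each boarding pass as a 10-bit binary number and reports the highest seat ID plus the single gap in the sorted IDs. A rough Python port of the same idea (the names `read_pass` and `missing_seats` are mine, not from the repo; the gap test assumes the IDs are distinct, which matches the Haskell condition `x /= y - 1` on this input):

```python
def read_pass(s):
    """Decode a boarding pass: B/R count as 1 bits, F/L as 0 bits."""
    bits = {'F': '0', 'B': '1', 'L': '0', 'R': '1'}
    return int(''.join(bits[c] for c in s), 2)

def missing_seats(seat_ids):
    """IDs absent from the sorted list but with occupied neighbours."""
    s = sorted(seat_ids)
    return [x + 1 for x, y in zip(s, s[1:]) if y - x > 1]

# The three test vectors from the file above:
assert read_pass("BFFFBBFRRR") == 567
assert read_pass("FFFBBBFRRR") == 119
assert read_pass("BBFFBBFRLL") == 820
```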
;;; -*- package: WILBUR; Syntax: Common-lisp; Base: 10 -*-
;;; Source: lisp/de.setf.wilbur, src/useful.lisp
;;;
;;;   useful.lisp
;;;
;;;
;;; --------------------------------------------------------------------------------------
;;;
;;;   The Original Software is
;;;   WILBUR2: Nokia Semantic Web Toolkit for CLOS
;;;
;;;   Copyright (c) 2001-2009 Nokia Corp. and/or its subsidiaries. All Rights Reserved.
;;;   Portions Copyright (c) 1989-1992. All Rights Reserved.
;;;
;;;   Contributor(s): (mailto:)
;;;
;;;   This program is licensed under the terms of the GNU Lesser General Public License
;;;   as published by the Free Software Foundation, version 2.1 of the License. Note
;;;   however that a preamble attached below also applies to this program.
;;;
;;;
;;; --------------------------------------------------------------------------------------
;;;
;;;   Preamble to the Gnu Lesser General Public License
;;;
;;;   Copyright (c) 2000 Franz Incorporated, Berkeley, CA 94704
;;;
;;;   The concept of the GNU Lesser General Public License version 2.1 ("LGPL") has been
;;;   adopted to govern the use and distribution of above-mentioned application. However,
;;;   the LGPL uses terminology that is more appropriate for a program written in C than
;;;   one written in Lisp. Nevertheless, the LGPL can still be applied to a Lisp program
;;;   if certain clarifications are made. This document details those clarifications.
;;;   Accordingly, the license for the open-source Lisp applications consists of this
;;;   document plus the LGPL. Wherever there is a conflict between this document and the
;;;   LGPL, this document takes precedence over the LGPL.
;;;
;;;   A "Library" in Lisp is a collection of Lisp functions, data and foreign modules.
;;;   The form of the Library can be Lisp source code (for processing by an interpreter)
;;;   or object code (usually the result of compilation of source code or built with some
;;;   other mechanisms). Foreign modules are object code in a form that can be linked
;;;   into a Lisp executable. When we speak of functions we do so in the most general way
;;;   to include, in addition, methods and unnamed functions. Lisp "data" is also a
;;;   general term that includes the data structures resulting from defining Lisp classes.
;;;   A Lisp application may include the same set of Lisp objects as does a Library, but
;;;   this does not mean that the application is necessarily a "work based on the Library"
;;;   it contains.
;;;
;;;   The Library consists of everything in the distribution file set before any
;;;   modifications are made to the files. If any of the functions or classes in the
;;;   Library are redefined in other files, then those redefinitions ARE considered a
;;;   work based on the Library. If additional methods are added to generic functions in
;;;   the Library, those additional methods are NOT considered a work based on the
;;;   Library. If Library classes are subclassed, these subclasses are NOT considered a
;;;   work based on the Library. If the Library is modified to explicitly call other
;;;   functions that are neither part of Lisp itself nor an available add-on module to
;;;   Lisp, then the functions called by the modified Library ARE considered a work based
;;;   on the Library. The goal is to ensure that the Library will compile and run without
;;;   getting undefined function errors.
;;;
;;;   It is permitted to add proprietary source code to the Library, but it must be done
;;;   in a way such that the Library will still run without that proprietary code present.
;;;   Section 5 of the LGPL distinguishes between the case of a library being dynamically
;;;   linked at runtime and one being statically linked at build time. Section 5 of the
;;;   LGPL states that the former results in an executable that is a "work that uses the
;;;   Library." Section 5 of the LGPL states that the latter results in one that is a
;;;   "derivative of the Library", which is therefore covered by the LGPL. Since Lisp only
;;;   offers one choice, which is to link the Library into an executable at build time, we
;;;   declare that, for the purpose of applying the LGPL to the Library, an executable that
;;;   results from linking a "work that uses the Library" with the Library is considered a
;;;   "work that uses the Library" and is therefore NOT covered by the LGPL.
;;;
;;;   Because of this declaration, section 6 of LGPL is not applicable to the Library.
;;;   However, in connection with each distribution of this executable, you must also
;;;   deliver, in accordance with the terms and conditions of the LGPL, the source code
;;;   of Library (or your derivative thereof) that is incorporated into this executable.
;;;
;;; --------------------------------------------------------------------------------------
;;;
;;;
;;;   Purpose: Useful functions and macros
;;;
(in-package "WILBUR")
;;; --------------------------------------------------------------------------------------
;;;
;;; GENERALLY USEFUL STUFF
;;;
(defmacro with-temps ((&rest variables) &body body)
`(let (,@(mapcar #'(lambda (variable)
`(,variable (gentemp)))
variables))
,@body))
(defmacro dolist+ ((pattern list &optional (value nil value-supplied-p)) &body body)
(if (symbolp pattern)
`(dolist (,pattern ,list ,@(and value-supplied-p (list value)))
,@body)
(let ((i (gentemp)))
`(dolist (,i ,list ,@(and value-supplied-p (list value)))
(destructuring-bind ,pattern ,i
,@body)))))
(defmacro dsb (pattern form &body body)
`(destructuring-bind ,pattern ,form ,@body))
(defun remove-weird (sequence item &rest options)
(declare (dynamic-extent options))
(apply #'remove item sequence options))
(defun delete-weird (sequence item &rest options)
(declare (dynamic-extent options))
(apply #'delete item sequence options))
(define-modify-macro removef (items &rest options) remove-weird)
(define-modify-macro deletef (items &rest options) delete-weird)
(define-modify-macro unionf (items) union)
(defun eq~ (x y)
(or (null x)
(null y)
(eq x y)))
(declaim (inline eq~))
(defun string->keyword (string &optional (package :keyword))
(if package (intern (string-upcase string) package) string))
(defmacro defequal (name value &optional documentation)
(let ((name-var (gensym)))
`(defconstant ,name
(let ((,name-var ,value))
(if (boundp ',name)
(progn (assert (equalp (symbol-value ',name) ,name-var) ()
"Previous value for constant ~a not equal to new binding: ~s."
',name ,name-var)
(symbol-value ',name))
,name-var))
,@(when documentation (list documentation)))))
;;; --------------------------------------------------------------------------------------
;;;
;;; STRING DICTIONARY
;;;
;;; Some care must be taken when using this, since (in the interest of making the
;;; implementation not cons so much) we have used destructive operations.
;;;
(defun string-dict-get (keys&values key)
(cdr (assoc key keys&values :test #'string=)))
(defun string-dict-get-by-value (keys&values value)
(car (rassoc value keys&values :test #'string=)))
(defun string-dict-add (keys&values key value)
(acons key value keys&values))
(defun string-dict-del (keys&values key)
(delete key keys&values :key #'car :test #'string=))
(defmacro do-string-dict ((key value dict) &body body)
`(loop for (,key . ,value) in ,dict do (progn ,@body)))
;;; --------------------------------------------------------------------------------------
;;;
;;; LIST MANIPULATION
;;;
(defun split-list (head tail n &optional (no-split-p nil))
(if no-split-p
(values tail nil)
(if (and tail (plusp n))
(split-list (cons (first tail) head) (rest tail) (1- n) no-split-p)
(values (nreverse head) tail))))
(defun prioritize-list (list possible-priority-items
&key (test #'eql) (key #'identity))
(prioritize list :prefer possible-priority-items :test test :key key))
(defun prioritize (list
&key (prefer nil)
(exclude nil)
(test #'eql)
(key #'identity)
(splitp nil))
(let* ((items (remove-if #'(lambda (item)
(find-if #'(lambda (e)
(funcall test e (funcall key item)))
exclude))
list))
(priority-items (mapcan #'(lambda (p)
(let ((item (find p items :test test :key key)))
(and item (list item))))
prefer))
(other-items (remove-if #'(lambda (item)
(find-if #'(lambda (p)
(funcall test
(funcall key p)
(funcall key item)))
priority-items))
items)))
(if splitp
(values priority-items other-items)
(append priority-items other-items))))
;;; https://raw.githubusercontent.com/lisp/de.setf.wilbur/c5c1321e6a05cead8b90e54116f14c3810d520e2/src/useful.lisp
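The most involved definition in useful.lisp is PRIORITIZE, which drops excluded items, pulls preferred items to the front in preference order, and can return the two groups separately. A Python sketch of that behaviour (parameter names are mine; the Lisp version also accepts an arbitrary equality TEST, which this sketch fixes to `==`):

```python
def prioritize(items, prefer=(), exclude=(), key=lambda x: x, split=False):
    """Reorder items so preferred keys come first; drop excluded keys."""
    kept = [i for i in items if key(i) not in exclude]
    # First matching item per preferred key, in preference order.
    priority = []
    for p in prefer:
        match = next((i for i in kept if key(i) == p), None)
        if match is not None:
            priority.append(match)
    chosen = {key(p) for p in priority}
    others = [i for i in kept if key(i) not in chosen]
    return (priority, others) if split else priority + others
```

With `split=False` this mirrors the `(append priority-items other-items)` branch; with `split=True` it mirrors the two-value return.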
(* Source: batsh-dev-team/Batsh, lib/winbat_functions.ml *)
open Core_kernel
open Winbat_ast
let rec expand_command (name : varstrings) (args : parameters) =
match name with
| [`Str "bash"] ->
`Empty
| [`Str "batch"] -> (
match args with
| [[`Str raw]] ->
`Raw raw
| _ ->
failwith "batch raw command must have exactly one string-literal argument."
)
| [`Str "println"] -> (
match args with
| [] ->
`Call ([`Str "echo:"], [])
| _ ->
`Call ([`Str "echo"], args)
)
| [`Str "print"] ->
`Call ([`Str "echo | set /p ="], [] :: args)
| [`Str "call"] -> (
match args with
| cmd :: real_args ->
expand_command cmd real_args
| [] ->
failwith "call must have at least 1 argument."
)
| [`Str "readdir"] ->
`Call ([`Str "dir /w"], args)
| _ ->
`Call (name, args)
let rec expand_statement (stmt : statement) : statement =
match stmt with
| `Call (name, exprs) ->
expand_command name exprs
  | `Output (lvalue, name, exprs) ->
    let expanded = expand_command name exprs in (
      match expanded with
      | `Call (name, exprs) -> `Output (lvalue, name, exprs)
      | _ -> failwith "command does not have a return value."
    )
| `If (condition, stmts) ->
`If (condition, expand_statements stmts)
| `IfElse (condition, then_stmts, else_stmts) ->
`IfElse (condition,
expand_statements then_stmts,
expand_statements else_stmts)
| `Assignment _
| `ArithAssign _
| `Comment _ | `Raw _ | `Label _ | `Goto _ | `Empty -> stmt
and expand_statements (stmts: statements) : statements =
List.map stmts ~f: expand_statement
let expand (ast : t) : t =
expand_statements ast
(* https://raw.githubusercontent.com/batsh-dev-team/Batsh/5c8ae421e0eea5dcb3da01643152ad96af941f07/lib/winbat_functions.ml *)
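winbat_functions.ml rewrites Batsh builtins (println, print, call, readdir, ...) into Windows batch constructs before code generation. An illustrative Python rendering of the dispatch in expand_command (tuples stand in for the OCaml polymorphic variants; the real code works on a typed AST, not strings):

```python
def expand_command(name, args):
    """Map a builtin call to a Winbat AST node, mirroring expand_command."""
    if name == "bash":
        return ("empty",)                         # bash-only code is dropped
    if name == "batch":
        if len(args) != 1:
            raise ValueError("batch raw command must have exactly one argument")
        return ("raw", args[0])                   # emit the argument verbatim
    if name == "println":
        return ("call", "echo:", []) if not args else ("call", "echo", args)
    if name == "print":
        return ("call", "echo | set /p =", [""] + list(args))
    if name == "call":
        if not args:
            raise ValueError("call must have at least 1 argument")
        return expand_command(args[0], args[1:])  # recurse on the real command
    if name == "readdir":
        return ("call", "dir /w", args)
    return ("call", name, args)                   # anything else passes through
```

Note how `call` recurses, so `call readdir` still picks up the `dir /w` rewriting, exactly as in the OCaml pattern match.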
open Winbat_ast
let rec expand_command (name : varstrings) (args : parameters) =
match name with
| [`Str "bash"] ->
`Empty
| [`Str "batch"] -> (
match args with
| [[`Str raw]] ->
`Raw raw
| _ ->
failwith "batch raw command must have 1 argument of string literal."
)
| [`Str "println"] -> (
match args with
| [] ->
`Call ([`Str "echo:"], [])
| _ ->
`Call ([`Str "echo"], args)
)
| [`Str "print"] ->
`Call ([`Str "echo | set /p ="], [] :: args)
| [`Str "call"] -> (
match args with
| cmd :: real_args ->
expand_command cmd real_args
| [] ->
failwith "call must have at least 1 argument."
)
| [`Str "readdir"] ->
`Call ([`Str "dir /w"], args)
| _ ->
`Call (name, args)
let rec expand_statement (stmt : statement) : statement =
match stmt with
| `Call (name, exprs) ->
expand_command name exprs
| `Output (lvalue, name, exprs) ->
let expaned = expand_command name exprs in (
match expaned with
| `Call (name, exprs) -> `Output (lvalue, name, exprs)
| _ -> failwith (sprintf "command do not have a return value.")
)
| `If (condition, stmts) ->
`If (condition, expand_statements stmts)
| `IfElse (condition, then_stmts, else_stmts) ->
`IfElse (condition,
expand_statements then_stmts,
expand_statements else_stmts)
| `Assignment _
| `ArithAssign _
| `Comment _ | `Raw _ | `Label _ | `Goto _ | `Empty -> stmt
and expand_statements (stmts: statements) : statements =
List.map stmts ~f: expand_statement
let expand (ast : t) : t =
expand_statements ast
| |
%%% Source: erlangonrails/devdb, ejabberd-2.1.4, src/mod_pubsub/nodetree_tree_odbc.erl
%%% ====================================================================
%%% ``The contents of this file are subject to the Erlang Public License,
%%% Version 1.1, (the "License"); you may not use this file except in
%%% compliance with the License. You should have received a copy of the
%%% Erlang Public License along with this software. If not, it can be
%%% retrieved via the world wide web at /.
%%%
%%% Software distributed under the License is distributed on an "AS IS"
%%% basis, WITHOUT WARRANTY OF ANY KIND, either express or implied. See
%%% the License for the specific language governing rights and limitations
%%% under the License.
%%%
%%% The Initial Developer of the Original Code is ProcessOne.
%%% Portions created by ProcessOne are Copyright 2006-2010, ProcessOne
%%% All Rights Reserved.''
%%% This software is copyright 2006-2010, ProcessOne.
%%%
%%%
%%% @copyright 2006-2010 ProcessOne
%%% @author <>
%%%   [-one.net/]
%%% @version {@vsn}, {@date} {@time}
%%% @end
%%% ====================================================================
%%% @doc The module <strong>{@module}</strong> is the default PubSub node tree plugin.
%%% <p>It is used as a default for all unknown PubSub node types. It can serve
%%% as a developer basis and reference to build its own custom pubsub node tree
%%% types.</p>
%%% <p>PubSub node tree plugins are using the {@link gen_nodetree} behaviour.</p>
%%% <p><strong>The API isn't stabilized yet</strong>. The pubsub plugin
%%% development is still a work in progress. However, the system is already
%%% useable and useful as is. Please, send us comments, feedback and
%%% improvements.</p>
-module(nodetree_tree_odbc).
-author('').
-include("pubsub.hrl").
-include("jlib.hrl").
-define(PUBSUB, mod_pubsub_odbc).
-define(PLUGIN_PREFIX, "node_").
-behaviour(gen_pubsub_nodetree).
-export([init/3,
terminate/2,
options/0,
set_node/1,
get_node/3,
get_node/2,
get_node/1,
get_nodes/2,
get_nodes/1,
get_parentnodes/3,
get_parentnodes_tree/3,
get_subnodes/3,
get_subnodes_tree/3,
create_node/6,
delete_node/2
]).
-export([raw_to_node/2]).
%% ================
%% API definition
%% ================
%% @spec (Host, ServerHost, Opts) -> any()
%%     Host = mod_pubsub:host()
%%     ServerHost = host()
%%     Opts = list()
%% @doc <p>Called during pubsub modules initialisation. Any pubsub plugin must
%% implement this function. It can return anything.</p>
%% <p>This function is mainly used to trigger the setup task necessary for the
%% plugin. It can be used for example by the developer to create the specific
%% module database schema if it does not exist.</p>
init(_Host, _ServerHost, _Opts) ->
ok.
terminate(_Host, _ServerHost) ->
ok.
%% @spec () -> [Option]
%%     Option = mod_pubsub:()
%% @doc Returns the default pubsub node tree options.
options() ->
[{virtual_tree, false},
{odbc, true}].
%% @spec (Host, Node, From) -> pubsubNode() | {error, Reason}
%% Host = mod_pubsub:host()
%% Node = mod_pubsub:pubsubNode()
get_node(Host, Node, _From) ->
get_node(Host, Node).
get_node(Host, Node) ->
H = ?PUBSUB:escape(Host),
N = ?PUBSUB:escape(?PUBSUB:node_to_string(Node)),
case catch ejabberd_odbc:sql_query_t(
["select node, parent, type, nodeid "
"from pubsub_node "
"where host='", H, "' and node='", N, "';"])
of
{selected, ["node", "parent", "type", "nodeid"], [RItem]} ->
raw_to_node(Host, RItem);
{'EXIT', _Reason} ->
{error, ?ERR_INTERNAL_SERVER_ERROR};
_ ->
{error, ?ERR_ITEM_NOT_FOUND}
end.
get_node(NodeId) ->
case catch ejabberd_odbc:sql_query_t(
["select host, node, parent, type "
"from pubsub_node "
"where nodeid='", NodeId, "';"])
of
{selected, ["host", "node", "parent", "type"], [{Host, Node, Parent, Type}]} ->
raw_to_node(Host, {Node, Parent, Type, NodeId});
{'EXIT', _Reason} ->
{error, ?ERR_INTERNAL_SERVER_ERROR};
_ ->
{error, ?ERR_ITEM_NOT_FOUND}
end.
%% @spec (Host, From) -> [pubsubNode()] | {error, Reason}
%% Host = mod_pubsub:host() | mod_pubsub:jid()
get_nodes(Host, _From) ->
get_nodes(Host).
get_nodes(Host) ->
H = ?PUBSUB:escape(Host),
case catch ejabberd_odbc:sql_query_t(
["select node, parent, type, nodeid "
"from pubsub_node "
"where host='", H, "';"])
of
{selected, ["node", "parent", "type", "nodeid"], RItems} ->
lists:map(fun(Item) -> raw_to_node(Host, Item) end, RItems);
_ ->
[]
end.
%% @spec (Host, Node, From) -> [{Depth, Record}] | {error, Reason}
%% Host = mod_pubsub:host() | mod_pubsub:jid()
%% Node = mod_pubsub:pubsubNode()
%% From = mod_pubsub:jid()
%% Depth = integer()
%% Record = pubsubNode()
%% @doc <p>Default node tree does not handle parents, return empty list.</p>
get_parentnodes(_Host, _Node, _From) ->
[].
%% @spec (Host, Node, From) -> [{Depth, Record}] | {error, Reason}
%% Host = mod_pubsub:host() | mod_pubsub:jid()
%% Node = mod_pubsub:pubsubNode()
%% From = mod_pubsub:jid()
%% Depth = integer()
%% Record = pubsubNode()
%% @doc <p>Default node tree does not handle parents, return a list
%% containing just this node.</p>
get_parentnodes_tree(Host, Node, From) ->
case get_node(Host, Node, From) of
N when is_record(N, pubsub_node) -> [{0, [N]}];
_Error -> []
end.
get_subnodes(Host, Node, _From) ->
get_subnodes(Host, Node).
%% @spec (Host, Index) -> [pubsubNode()] | {error, Reason}
%% Host = mod_pubsub:host()
%% Node = mod_pubsub:pubsubNode()
get_subnodes(Host, Node) ->
H = ?PUBSUB:escape(Host),
N = ?PUBSUB:escape(?PUBSUB:node_to_string(Node)),
case catch ejabberd_odbc:sql_query_t(
["select node, parent, type, nodeid "
"from pubsub_node "
"where host='", H, "' and parent='", N, "';"])
of
{selected, ["node", "parent", "type", "nodeid"], RItems} ->
lists:map(fun(Item) -> raw_to_node(Host, Item) end, RItems);
_ ->
[]
end.
get_subnodes_tree(Host, Node, _From) ->
get_subnodes_tree(Host, Node).
%% @spec (Host, Index) -> [pubsubNode()] | {error, Reason}
%% Host = mod_pubsub:host()
%% Node = mod_pubsub:pubsubNode()
get_subnodes_tree(Host, Node) ->
H = ?PUBSUB:escape(Host),
N = ?PUBSUB:escape(?PUBSUB:node_to_string(Node)),
case catch ejabberd_odbc:sql_query_t(
["select node, parent, type, nodeid "
"from pubsub_node "
"where host='", H, "' and node like '", N, "%';"])
of
{selected, ["node", "parent", "type", "nodeid"], RItems} ->
lists:map(fun(Item) -> raw_to_node(Host, Item) end, RItems);
_ ->
[]
end.
%% @spec (Host, Node, Type, Owner, Options, Parents) -> ok | {error, Reason}
%% Host = mod_pubsub:host() | mod_pubsub:jid()
%% Node = mod_pubsub:pubsubNode()
%% NodeType = mod_pubsub:nodeType()
%% Owner = mod_pubsub:jid()
%% Options = list()
%% Parents = list()
create_node(Host, Node, Type, Owner, Options, Parents) ->
BJID = jlib:jid_tolower(jlib:jid_remove_resource(Owner)),
case nodeid(Host, Node) of
{error, ?ERR_ITEM_NOT_FOUND} ->
ParentExists =
case Host of
{_U, _S, _R} ->
%% This is a special case for PEP handling:
%% PEP does not use hierarchy
true;
_ ->
case Parents of
[] -> true;
[Parent|_] ->
case nodeid(Host, Parent) of
{result, PNodeId} ->
case nodeowners(PNodeId) of
[{[], Host, []}] -> true;
Owners -> lists:member(BJID, Owners)
end;
_ ->
false
end;
_ ->
false
end
end,
case ParentExists of
true ->
case set_node(#pubsub_node{
nodeid={Host, Node},
parents=Parents,
type=Type,
options=Options}) of
{result, NodeId} -> {ok, NodeId};
Other -> Other
end;
false ->
%% Requesting entity is prohibited from creating nodes
{error, ?ERR_FORBIDDEN}
end;
{result, _} ->
%% NodeID already exists
{error, ?ERR_CONFLICT};
Error ->
Error
end.
%% @spec (Host, Node) -> [mod_pubsub:node()]
%% Host = mod_pubsub:host() | mod_pubsub:jid()
%% Node = mod_pubsub:pubsubNode()
delete_node(Host, Node) ->
H = ?PUBSUB:escape(Host),
N = ?PUBSUB:escape(?PUBSUB:node_to_string(Node)),
Removed = get_subnodes_tree(Host, Node),
catch ejabberd_odbc:sql_query_t(
["delete from pubsub_node "
"where host='", H, "' and node like '", N, "%';"]),
Removed.
%% helpers
raw_to_node(Host, {Node, Parent, Type, NodeId}) ->
Options = case catch ejabberd_odbc:sql_query_t(
["select name,val "
"from pubsub_node_option "
"where nodeid='", NodeId, "';"])
of
{selected, ["name", "val"], ROptions} ->
DbOpts = lists:map(fun({Key, Value}) ->
RKey = list_to_atom(Key),
Tokens = element(2, erl_scan:string(Value++".")),
RValue = element(2, erl_parse:parse_term(Tokens)),
{RKey, RValue}
end, ROptions),
Module = list_to_atom(?PLUGIN_PREFIX++Type),
StdOpts = Module:options(),
lists:foldl(fun({Key, Value}, Acc)->
lists:keyreplace(Key, 1, Acc, {Key, Value})
end, StdOpts, DbOpts);
_ ->
[]
end,
#pubsub_node{
nodeid = {Host, ?PUBSUB:string_to_node(Node)},
parents = [?PUBSUB:string_to_node(Parent)],
id = NodeId,
type = Type,
options = Options}.
%% @spec (NodeRecord) -> ok | {error, Reason}
%% Record = mod_pubsub:pubsub_node()
set_node(Record) ->
{Host, Node} = Record#pubsub_node.nodeid,
Parent = case Record#pubsub_node.parents of
[] -> <<>>;
[First|_] -> First
end,
Type = Record#pubsub_node.type,
H = ?PUBSUB:escape(Host),
N = ?PUBSUB:escape(?PUBSUB:node_to_string(Node)),
P = ?PUBSUB:escape(?PUBSUB:node_to_string(Parent)),
NodeId = case nodeid(Host, Node) of
{result, OldNodeId} ->
catch ejabberd_odbc:sql_query_t(
["delete from pubsub_node_option "
"where nodeid='", OldNodeId, "';"]),
catch ejabberd_odbc:sql_query_t(
["update pubsub_node "
 "set host='", H, "', "
 "node='", N, "', "
 "parent='", P, "', "
 "type='", Type, "' "
 "where nodeid='", OldNodeId, "';"]),
OldNodeId;
_ ->
catch ejabberd_odbc:sql_query_t(
["insert into pubsub_node(host, node, parent, type) "
"values('", H, "', '", N, "', '", P, "', '", Type, "');"]),
case nodeid(Host, Node) of
{result, NewNodeId} -> NewNodeId;
_ -> none % this should not happen
end
end,
case NodeId of
none ->
{error, ?ERR_INTERNAL_SERVER_ERROR};
_ ->
lists:foreach(fun({Key, Value}) ->
SKey = atom_to_list(Key),
SValue = ?PUBSUB:escape(lists:flatten(io_lib:fwrite("~p",[Value]))),
catch ejabberd_odbc:sql_query_t(
["insert into pubsub_node_option(nodeid, name, val) "
"values('", NodeId, "', '", SKey, "', '", SValue, "');"])
end, Record#pubsub_node.options),
{result, NodeId}
end.
nodeid(Host, Node) ->
H = ?PUBSUB:escape(Host),
N = ?PUBSUB:escape(?PUBSUB:node_to_string(Node)),
case catch ejabberd_odbc:sql_query_t(
["select nodeid "
"from pubsub_node "
"where host='", H, "' and node='", N, "';"])
of
{selected, ["nodeid"], [{NodeId}]} ->
{result, NodeId};
{'EXIT', _Reason} ->
{error, ?ERR_INTERNAL_SERVER_ERROR};
_ ->
{error, ?ERR_ITEM_NOT_FOUND}
end.
nodeowners(NodeId) ->
{result, Res} = node_hometree_odbc:get_node_affiliations(NodeId),
lists:foldl(fun({LJID, owner}, Acc) -> [LJID|Acc];
(_, Acc) -> Acc
end, [], Res).
%%% https://raw.githubusercontent.com/erlangonrails/devdb/0e7eaa6bd810ec3892bfc3d933439560620d0941/dev/ejabberd-2.1.4/src/mod_pubsub/nodetree_tree_odbc.erl
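In raw_to_node above, a node's options are rebuilt by folding lists:keyreplace over the plugin's default options, so stored rows override default values and stored keys the plugin does not define are silently dropped. A small Python sketch of that overlay (the function name is mine):

```python
def merge_options(defaults, stored):
    """Overlay stored (key, value) rows onto plugin defaults.

    Keeps the defaults' order and ignores stored keys with no default,
    matching lists:keyreplace/4 folded over the default list.
    """
    overrides = dict(stored)
    return [(key, overrides.get(key, value)) for key, value in defaults]
```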
compliance with the License. You should have received a copy of the
Erlang Public License along with this software. If not, it can be
retrieved via the world wide web at /.
basis, WITHOUT WARRANTY OF ANY KIND, either express or implied. See
the License for the specific language governing rights and limitations
under the License.
[-one.net/]
@version {@vsn}, {@date} {@time}
@end
====================================================================
as a developer basis and reference to build its own custom pubsub node tree
types.</p>
<p><strong>The API isn't stabilized yet</strong>. The pubsub plugin
development is still a work in progress. However, the system is already
useable and useful as is. Please, send us comments, feedback and
improvements.</p>
================
API definition
================
Host = mod_pubsub:host()
Opts = list()
@doc <p>Called during pubsub modules initialisation. Any pubsub plugin must
implement this function. It can return anything.</p>
<p>This function is mainly used to trigger the setup task necessary for the
plugin. It can be used for example by the developer to create the specific
@doc Returns the default pubsub node tree options.
@spec (Host, Node, From) -> pubsubNode() | {error, Reason}
Host = mod_pubsub:host()
Node = mod_pubsub:pubsubNode()
@spec (Host, From) -> [pubsubNode()] | {error, Reason}
Host = mod_pubsub:host() | mod_pubsub:jid()
@spec (Host, Node, From) -> [{Depth, Record}] | {error, Reason}
Host = mod_pubsub:host() | mod_pubsub:jid()
Node = mod_pubsub:pubsubNode()
From = mod_pubsub:jid()
Depth = integer()
Record = pubsubNode()
@spec (Host, Node, From) -> [{Depth, Record}] | {error, Reason}
Host = mod_pubsub:host() | mod_pubsub:jid()
Node = mod_pubsub:pubsubNode()
From = mod_pubsub:jid()
Depth = integer()
Record = pubsubNode()
@doc <p>Default node tree does not handle parents, return a list
containing just this node.</p>
@spec (Host, Index) -> [pubsubNode()] | {error, Reason}
Host = mod_pubsub:host()
Node = mod_pubsub:pubsubNode()
@spec (Host, Index) -> [pubsubNode()] | {error, Reason}
Host = mod_pubsub:host()
Node = mod_pubsub:pubsubNode()
@spec (Host, Node, Type, Owner, Options, Parents) -> ok | {error, Reason}
Host = mod_pubsub:host() | mod_pubsub:jid()
Node = mod_pubsub:pubsubNode()
Owner = mod_pubsub:jid()
Options = list()
Parents = list()
This is a special case for PEP handling
PEP does not use hierarchy
Requesting entity is prohibited from creating nodes
NodeID already exists
@spec (Host, Node) -> [mod_pubsub:node()]
Host = mod_pubsub:host() | mod_pubsub:jid()
Node = mod_pubsub:pubsubNode()
helpers
@spec (NodeRecord) -> ok | {error, Reason}
Record = mod_pubsub:pubsub_node()
this should not happen | ``The contents of this file are subject to the Erlang Public License,
%% Version 1.1, (the "License"); you may not use this file except in
%% Software distributed under the License is distributed on an "AS IS"
%% The Initial Developer of the Original Code is ProcessOne.
%% Portions created by ProcessOne are Copyright 2006-2010, ProcessOne
%% All Rights Reserved.''
%% This software is copyright 2006-2010, ProcessOne.
%% 2006-2010 ProcessOne
%% @author < >
%% @doc The module <strong>{@module}</strong> is the default PubSub node tree plugin.
%% <p>It is used as a default for all unknown PubSub node type. It can serve
%% <p>PubSub node tree plugins are using the {@link gen_nodetree} behaviour.</p>
-module(nodetree_tree_odbc).
-author('').
-include("pubsub.hrl").
-include("jlib.hrl").
-define(PUBSUB, mod_pubsub_odbc).
-define(PLUGIN_PREFIX, "node_").
-behaviour(gen_pubsub_nodetree).
-export([init/3,
terminate/2,
options/0,
set_node/1,
get_node/3,
get_node/2,
get_node/1,
get_nodes/2,
get_nodes/1,
get_parentnodes/3,
get_parentnodes_tree/3,
get_subnodes/3,
get_subnodes_tree/3,
create_node/6,
delete_node/2
]).
-export([raw_to_node/2]).
%% @spec (Host, ServerHost, Opts) -> any()
%% ServerHost = host()
%% module database schema if it does not exist.</p>
init(_Host, _ServerHost, _Opts) ->
ok.
terminate(_Host, _ServerHost) ->
ok.
%% @spec () -> [Option]
%% Option = mod_pubsub:()
options() ->
[{virtual_tree, false},
{odbc, true}].
get_node(Host, Node, _From) ->
get_node(Host, Node).
get_node(Host, Node) ->
H = ?PUBSUB:escape(Host),
N = ?PUBSUB:escape(?PUBSUB:node_to_string(Node)),
case catch ejabberd_odbc:sql_query_t(
["select node, parent, type, nodeid "
"from pubsub_node "
"where host='", H, "' and node='", N, "';"])
of
{selected, ["node", "parent", "type", "nodeid"], [RItem]} ->
raw_to_node(Host, RItem);
{'EXIT', _Reason} ->
{error, ?ERR_INTERNAL_SERVER_ERROR};
_ ->
{error, ?ERR_ITEM_NOT_FOUND}
end.
get_node(NodeId) ->
case catch ejabberd_odbc:sql_query_t(
["select host, node, parent, type "
"from pubsub_node "
"where nodeid='", NodeId, "';"])
of
{selected, ["host", "node", "parent", "type"], [{Host, Node, Parent, Type}]} ->
raw_to_node(Host, {Node, Parent, Type, NodeId});
{'EXIT', _Reason} ->
{error, ?ERR_INTERNAL_SERVER_ERROR};
_ ->
{error, ?ERR_ITEM_NOT_FOUND}
end.
get_nodes(Host, _From) ->
get_nodes(Host).
get_nodes(Host) ->
H = ?PUBSUB:escape(Host),
case catch ejabberd_odbc:sql_query_t(
["select node, parent, type, nodeid "
"from pubsub_node "
"where host='", H, "';"])
of
{selected, ["node", "parent", "type", "nodeid"], RItems} ->
lists:map(fun(Item) -> raw_to_node(Host, Item) end, RItems);
_ ->
[]
end.
%% @doc <p>Default node tree does not handle parents, return empty list.</p>
get_parentnodes(_Host, _Node, _From) ->
[].
get_parentnodes_tree(Host, Node, From) ->
case get_node(Host, Node, From) of
N when is_record(N, pubsub_node) -> [{0, [N]}];
_Error -> []
end.
get_subnodes(Host, Node, _From) ->
get_subnodes(Host, Node).
get_subnodes(Host, Node) ->
H = ?PUBSUB:escape(Host),
N = ?PUBSUB:escape(?PUBSUB:node_to_string(Node)),
case catch ejabberd_odbc:sql_query_t(
["select node, parent, type, nodeid "
"from pubsub_node "
"where host='", H, "' and parent='", N, "';"])
of
{selected, ["node", "parent", "type", "nodeid"], RItems} ->
lists:map(fun(Item) -> raw_to_node(Host, Item) end, RItems);
_ ->
[]
end.
get_subnodes_tree(Host, Node, _From) ->
get_subnodes_tree(Host, Node).
get_subnodes_tree(Host, Node) ->
H = ?PUBSUB:escape(Host),
N = ?PUBSUB:escape(?PUBSUB:node_to_string(Node)),
case catch ejabberd_odbc:sql_query_t(
["select node, parent, type, nodeid "
"from pubsub_node "
"where host='", H, "' and node like '", N, "%';"])
of
{selected, ["node", "parent", "type", "nodeid"], RItems} ->
lists:map(fun(Item) -> raw_to_node(Host, Item) end, RItems);
_ ->
[]
end.
%% NodeType = mod_pubsub:nodeType()
create_node(Host, Node, Type, Owner, Options, Parents) ->
BJID = jlib:jid_tolower(jlib:jid_remove_resource(Owner)),
case nodeid(Host, Node) of
{error, ?ERR_ITEM_NOT_FOUND} ->
ParentExists =
case Host of
{_U, _S, _R} ->
true;
_ ->
case Parents of
[] -> true;
[Parent|_] ->
case nodeid(Host, Parent) of
{result, PNodeId} ->
case nodeowners(PNodeId) of
[{[], Host, []}] -> true;
Owners -> lists:member(BJID, Owners)
end;
_ ->
false
end;
_ ->
false
end
end,
case ParentExists of
true ->
case set_node(#pubsub_node{
nodeid={Host, Node},
parents=Parents,
type=Type,
options=Options}) of
{result, NodeId} -> {ok, NodeId};
Other -> Other
end;
false ->
{error, ?ERR_FORBIDDEN}
end;
{result, _} ->
{error, ?ERR_CONFLICT};
Error ->
Error
end.
delete_node(Host, Node) ->
H = ?PUBSUB:escape(Host),
N = ?PUBSUB:escape(?PUBSUB:node_to_string(Node)),
Removed = get_subnodes_tree(Host, Node),
catch ejabberd_odbc:sql_query_t(
["delete from pubsub_node "
"where host='", H, "' and node like '", N, "%';"]),
Removed.
raw_to_node(Host, {Node, Parent, Type, NodeId}) ->
Options = case catch ejabberd_odbc:sql_query_t(
["select name,val "
"from pubsub_node_option "
"where nodeid='", NodeId, "';"])
of
{selected, ["name", "val"], ROptions} ->
DbOpts = lists:map(fun({Key, Value}) ->
RKey = list_to_atom(Key),
Tokens = element(2, erl_scan:string(Value++".")),
RValue = element(2, erl_parse:parse_term(Tokens)),
{RKey, RValue}
end, ROptions),
Module = list_to_atom(?PLUGIN_PREFIX++Type),
StdOpts = Module:options(),
lists:foldl(fun({Key, Value}, Acc)->
lists:keyreplace(Key, 1, Acc, {Key, Value})
end, StdOpts, DbOpts);
_ ->
[]
end,
#pubsub_node{
nodeid = {Host, ?PUBSUB:string_to_node(Node)},
parents = [?PUBSUB:string_to_node(Parent)],
id = NodeId,
type = Type,
options = Options}.
set_node(Record) ->
{Host, Node} = Record#pubsub_node.nodeid,
Parent = case Record#pubsub_node.parents of
[] -> <<>>;
[First|_] -> First
end,
Type = Record#pubsub_node.type,
H = ?PUBSUB:escape(Host),
N = ?PUBSUB:escape(?PUBSUB:node_to_string(Node)),
P = ?PUBSUB:escape(?PUBSUB:node_to_string(Parent)),
NodeId = case nodeid(Host, Node) of
{result, OldNodeId} ->
catch ejabberd_odbc:sql_query_t(
["delete from pubsub_node_option "
"where nodeid='", OldNodeId, "';"]),
catch ejabberd_odbc:sql_query_t(
["update pubsub_node "
"set host='", H, "', "
"node='", N, "', "
"parent='", P, "', "
"type='", Type, "' "
"where nodeid='", OldNodeId, "';"]),
OldNodeId;
_ ->
catch ejabberd_odbc:sql_query_t(
["insert into pubsub_node(host, node, parent, type) "
"values('", H, "', '", N, "', '", P, "', '", Type, "');"]),
case nodeid(Host, Node) of
{result, NewNodeId} -> NewNodeId;
_ -> none % this should not happen
end
end,
case NodeId of
none ->
{error, ?ERR_INTERNAL_SERVER_ERROR};
_ ->
lists:foreach(fun({Key, Value}) ->
SKey = atom_to_list(Key),
SValue = ?PUBSUB:escape(lists:flatten(io_lib:fwrite("~p",[Value]))),
catch ejabberd_odbc:sql_query_t(
["insert into pubsub_node_option(nodeid, name, val) "
"values('", NodeId, "', '", SKey, "', '", SValue, "');"])
end, Record#pubsub_node.options),
{result, NodeId}
end.
nodeid(Host, Node) ->
H = ?PUBSUB:escape(Host),
N = ?PUBSUB:escape(?PUBSUB:node_to_string(Node)),
case catch ejabberd_odbc:sql_query_t(
["select nodeid "
"from pubsub_node "
"where host='", H, "' and node='", N, "';"])
of
{selected, ["nodeid"], [{NodeId}]} ->
{result, NodeId};
{'EXIT', _Reason} ->
{error, ?ERR_INTERNAL_SERVER_ERROR};
_ ->
{error, ?ERR_ITEM_NOT_FOUND}
end.
nodeowners(NodeId) ->
{result, Res} = node_hometree_odbc:get_node_affiliations(NodeId),
lists:foldl(fun({LJID, owner}, Acc) -> [LJID|Acc];
(_, Acc) -> Acc
end, [], Res).
|
d12e49860ae34e6d1093e70cbcd99cfe9f5986702e3d6e3af148955691abf014 | brendanhay/gogol | List.hs | {-# LANGUAGE DataKinds #-}
{-# LANGUAGE DeriveGeneric #-}
{-# LANGUAGE DerivingStrategies #-}
{-# LANGUAGE DuplicateRecordFields #-}
{-# LANGUAGE FlexibleInstances #-}
{-# LANGUAGE GeneralizedNewtypeDeriving #-}
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE PatternSynonyms #-}
{-# LANGUAGE RecordWildCards #-}
{-# LANGUAGE StrictData #-}
{-# LANGUAGE TypeFamilies #-}
{-# LANGUAGE TypeOperators #-}
{-# LANGUAGE NoImplicitPrelude #-}
{-# OPTIONS_GHC -fno-warn-duplicate-exports #-}
{-# OPTIONS_GHC -fno-warn-name-shadowing #-}
{-# OPTIONS_GHC -fno-warn-unused-binds #-}
{-# OPTIONS_GHC -fno-warn-unused-imports #-}
{-# OPTIONS_GHC -fno-warn-unused-matches #-}
-- |
-- Module      : Gogol.Script.Processes.List
-- Copyright   : (c) 2015-2022
-- License     : Mozilla Public License, v. 2.0.
-- Maintainer  : <brendan.g.hay+ >
-- Stability   : auto-generated
-- Portability : non-portable (GHC extensions)
--
-- List information about processes made by or on behalf of a user, such as process type and current status.
--
-- /See:/ <-script/api/ Apps Script API Reference> for @script.processes.list@.
module Gogol.Script.Processes.List
( -- * Resource
ScriptProcessesListResource,
-- ** Constructing a Request
ScriptProcessesList (..),
newScriptProcessesList,
)
where
import qualified Gogol.Prelude as Core
import Gogol.Script.Types
-- | A resource alias for @script.processes.list@ method which the
-- 'ScriptProcessesList' request conforms to.
type ScriptProcessesListResource =
"v1"
Core.:> "processes"
Core.:> Core.QueryParam "$.xgafv" Xgafv
Core.:> Core.QueryParam "access_token" Core.Text
Core.:> Core.QueryParam "callback" Core.Text
Core.:> Core.QueryParam "pageSize" Core.Int32
Core.:> Core.QueryParam "pageToken" Core.Text
Core.:> Core.QueryParam "uploadType" Core.Text
Core.:> Core.QueryParam "upload_protocol" Core.Text
Core.:> Core.QueryParam
"userProcessFilter.deploymentId"
Core.Text
Core.:> Core.QueryParam
"userProcessFilter.endTime"
Core.DateTime
Core.:> Core.QueryParam
"userProcessFilter.functionName"
Core.Text
Core.:> Core.QueryParam
"userProcessFilter.projectName"
Core.Text
Core.:> Core.QueryParam
"userProcessFilter.scriptId"
Core.Text
Core.:> Core.QueryParam
"userProcessFilter.startTime"
Core.DateTime
Core.:> Core.QueryParams
"userProcessFilter.statuses"
ProcessesListUserProcessFilterStatuses
Core.:> Core.QueryParams
"userProcessFilter.types"
ProcessesListUserProcessFilterTypes
Core.:> Core.QueryParams
"userProcessFilter.userAccessLevels"
ProcessesListUserProcessFilterUserAccessLevels
Core.:> Core.QueryParam "alt" Core.AltJSON
Core.:> Core.Get
'[Core.JSON]
ListUserProcessesResponse
-- | List information about processes made by or on behalf of a user, such as process type and current status.
--
-- /See:/ 'newScriptProcessesList' smart constructor.
data ScriptProcessesList = ScriptProcessesList
{ -- | V1 error format.
xgafv :: (Core.Maybe Xgafv),
-- | OAuth access token.
accessToken :: (Core.Maybe Core.Text),
-- | JSONP
callback :: (Core.Maybe Core.Text),
-- | The maximum number of returned processes per page of results. Defaults to 50.
pageSize :: (Core.Maybe Core.Int32),
-- | The token for continuing a previous list request on the next page. This should be set to the value of @nextPageToken@ from a previous response.
pageToken :: (Core.Maybe Core.Text),
-- | Legacy upload protocol for media (e.g. \"media\", \"multipart\").
uploadType :: (Core.Maybe Core.Text),
-- | Upload protocol for media (e.g. \"raw\", \"multipart\").
uploadProtocol :: (Core.Maybe Core.Text),
-- | Optional field used to limit returned processes to those originating from projects with a specific deployment ID.
userProcessFilterDeploymentId :: (Core.Maybe Core.Text),
-- | Optional field used to limit returned processes to those that completed on or before the given timestamp.
userProcessFilterEndTime :: (Core.Maybe Core.DateTime),
-- | Optional field used to limit returned processes to those originating from a script function with the given function name.
userProcessFilterFunctionName :: (Core.Maybe Core.Text),
-- | Optional field used to limit returned processes to those originating from projects with project names containing a specific string.
userProcessFilterProjectName :: (Core.Maybe Core.Text),
-- | Optional field used to limit returned processes to those originating from projects with a specific script ID.
userProcessFilterScriptId :: (Core.Maybe Core.Text),
-- | Optional field used to limit returned processes to those that were started on or after the given timestamp.
userProcessFilterStartTime :: (Core.Maybe Core.DateTime),
-- | Optional field used to limit returned processes to those having one of the specified process statuses.
userProcessFilterStatuses :: (Core.Maybe [ProcessesListUserProcessFilterStatuses]),
-- | Optional field used to limit returned processes to those having one of the specified process types.
userProcessFilterTypes :: (Core.Maybe [ProcessesListUserProcessFilterTypes]),
-- | Optional field used to limit returned processes to those having one of the specified user access levels.
userProcessFilterUserAccessLevels :: (Core.Maybe [ProcessesListUserProcessFilterUserAccessLevels])
}
deriving (Core.Eq, Core.Show, Core.Generic)
-- | Creates a value of 'ScriptProcessesList' with the minimum fields required to make a request.
newScriptProcessesList ::
ScriptProcessesList
newScriptProcessesList =
ScriptProcessesList
{ xgafv = Core.Nothing,
accessToken = Core.Nothing,
callback = Core.Nothing,
pageSize = Core.Nothing,
pageToken = Core.Nothing,
uploadType = Core.Nothing,
uploadProtocol = Core.Nothing,
userProcessFilterDeploymentId = Core.Nothing,
userProcessFilterEndTime = Core.Nothing,
userProcessFilterFunctionName = Core.Nothing,
userProcessFilterProjectName = Core.Nothing,
userProcessFilterScriptId = Core.Nothing,
userProcessFilterStartTime = Core.Nothing,
userProcessFilterStatuses = Core.Nothing,
userProcessFilterTypes = Core.Nothing,
userProcessFilterUserAccessLevels = Core.Nothing
}
instance Core.GoogleRequest ScriptProcessesList where
type
Rs ScriptProcessesList =
ListUserProcessesResponse
type Scopes ScriptProcessesList = '[Script'Processes]
requestClient ScriptProcessesList {..} =
go
xgafv
accessToken
callback
pageSize
pageToken
uploadType
uploadProtocol
userProcessFilterDeploymentId
userProcessFilterEndTime
userProcessFilterFunctionName
userProcessFilterProjectName
userProcessFilterScriptId
userProcessFilterStartTime
(userProcessFilterStatuses Core.^. Core._Default)
(userProcessFilterTypes Core.^. Core._Default)
( userProcessFilterUserAccessLevels
Core.^. Core._Default
)
(Core.Just Core.AltJSON)
scriptService
where
go =
Core.buildClient
( Core.Proxy ::
Core.Proxy ScriptProcessesListResource
)
Core.mempty
| null | https://raw.githubusercontent.com/brendanhay/gogol/77394c4e0f5bd729e6fe27119701c45f9d5e1e9a/lib/services/gogol-script/gen/Gogol/Script/Processes/List.hs | haskell | # LANGUAGE OverloadedStrings #
# LANGUAGE StrictData #
|
Stability : auto-generated
List information about processes made by or on behalf of a user, such as process type and current status.
/See:/ <-script/api/ Apps Script API Reference> for @script.processes.list@.
* Resource
** Constructing a Request
| A resource alias for @script.processes.list@ method which the
'ScriptProcessesList' request conforms to.
| List information about processes made by or on behalf of a user, such as process type and current status.
/See:/ 'newScriptProcessesList' smart constructor.
| V1 error format.
| OAuth access token.
| Upload protocol for media (e.g. \"raw\", \"multipart\").
| Optional field used to limit returned processes to those originating from projects with a specific deployment ID.
| Optional field used to limit returned processes to those that completed on or before the given timestamp.
| Optional field used to limit returned processes to those originating from a script function with the given function name.
| Optional field used to limit returned processes to those originating from projects with project names containing a specific string.
| Optional field used to limit returned processes to those originating from projects with a specific script ID.
| Optional field used to limit returned processes to those that were started on or after the given timestamp.
| Optional field used to limit returned processes to those having one of the specified process statuses.
| Optional field used to limit returned processes to those having one of the specified process types.
| Optional field used to limit returned processes to those having one of the specified user access levels.
| Creates a value of 'ScriptProcessesList' with the minimum fields required to make a request. | # LANGUAGE DataKinds #
# LANGUAGE DeriveGeneric #
# LANGUAGE DerivingStrategies #
# LANGUAGE DuplicateRecordFields #
# LANGUAGE FlexibleInstances #
# LANGUAGE GeneralizedNewtypeDeriving #
# LANGUAGE LambdaCase #
# LANGUAGE PatternSynonyms #
# LANGUAGE RecordWildCards #
# LANGUAGE TypeFamilies #
# LANGUAGE TypeOperators #
# LANGUAGE NoImplicitPrelude #
# OPTIONS_GHC -fno - warn - duplicate - exports #
# OPTIONS_GHC -fno - warn - name - shadowing #
# OPTIONS_GHC -fno - warn - unused - binds #
# OPTIONS_GHC -fno - warn - unused - imports #
# OPTIONS_GHC -fno - warn - unused - matches #
Module : . Script . Processes . List
Copyright : ( c ) 2015 - 2022
License : Mozilla Public License , v. 2.0 .
Maintainer : < brendan.g.hay+ >
Portability : non - portable ( GHC extensions )
module Gogol.Script.Processes.List
ScriptProcessesListResource,
ScriptProcessesList (..),
newScriptProcessesList,
)
where
import qualified Gogol.Prelude as Core
import Gogol.Script.Types
type ScriptProcessesListResource =
"v1"
Core.:> "processes"
Core.:> Core.QueryParam "$.xgafv" Xgafv
Core.:> Core.QueryParam "access_token" Core.Text
Core.:> Core.QueryParam "callback" Core.Text
Core.:> Core.QueryParam "pageSize" Core.Int32
Core.:> Core.QueryParam "pageToken" Core.Text
Core.:> Core.QueryParam "uploadType" Core.Text
Core.:> Core.QueryParam "upload_protocol" Core.Text
Core.:> Core.QueryParam
"userProcessFilter.deploymentId"
Core.Text
Core.:> Core.QueryParam
"userProcessFilter.endTime"
Core.DateTime
Core.:> Core.QueryParam
"userProcessFilter.functionName"
Core.Text
Core.:> Core.QueryParam
"userProcessFilter.projectName"
Core.Text
Core.:> Core.QueryParam
"userProcessFilter.scriptId"
Core.Text
Core.:> Core.QueryParam
"userProcessFilter.startTime"
Core.DateTime
Core.:> Core.QueryParams
"userProcessFilter.statuses"
ProcessesListUserProcessFilterStatuses
Core.:> Core.QueryParams
"userProcessFilter.types"
ProcessesListUserProcessFilterTypes
Core.:> Core.QueryParams
"userProcessFilter.userAccessLevels"
ProcessesListUserProcessFilterUserAccessLevels
Core.:> Core.QueryParam "alt" Core.AltJSON
Core.:> Core.Get
'[Core.JSON]
ListUserProcessesResponse
data ScriptProcessesList = ScriptProcessesList
{ xgafv :: (Core.Maybe Xgafv),
accessToken :: (Core.Maybe Core.Text),
-- | JSONP
callback :: (Core.Maybe Core.Text),
-- | The maximum number of returned processes per page of results. Defaults to 50.
pageSize :: (Core.Maybe Core.Int32),
-- | The token for continuing a previous list request on the next page. This should be set to the value of @nextPageToken@ from a previous response.
pageToken :: (Core.Maybe Core.Text),
-- | Legacy upload protocol for media (e.g. \"media\", \"multipart\").
uploadType :: (Core.Maybe Core.Text),
uploadProtocol :: (Core.Maybe Core.Text),
userProcessFilterDeploymentId :: (Core.Maybe Core.Text),
userProcessFilterEndTime :: (Core.Maybe Core.DateTime),
userProcessFilterFunctionName :: (Core.Maybe Core.Text),
userProcessFilterProjectName :: (Core.Maybe Core.Text),
userProcessFilterScriptId :: (Core.Maybe Core.Text),
userProcessFilterStartTime :: (Core.Maybe Core.DateTime),
userProcessFilterStatuses :: (Core.Maybe [ProcessesListUserProcessFilterStatuses]),
userProcessFilterTypes :: (Core.Maybe [ProcessesListUserProcessFilterTypes]),
userProcessFilterUserAccessLevels :: (Core.Maybe [ProcessesListUserProcessFilterUserAccessLevels])
}
deriving (Core.Eq, Core.Show, Core.Generic)
newScriptProcessesList ::
ScriptProcessesList
newScriptProcessesList =
ScriptProcessesList
{ xgafv = Core.Nothing,
accessToken = Core.Nothing,
callback = Core.Nothing,
pageSize = Core.Nothing,
pageToken = Core.Nothing,
uploadType = Core.Nothing,
uploadProtocol = Core.Nothing,
userProcessFilterDeploymentId = Core.Nothing,
userProcessFilterEndTime = Core.Nothing,
userProcessFilterFunctionName = Core.Nothing,
userProcessFilterProjectName = Core.Nothing,
userProcessFilterScriptId = Core.Nothing,
userProcessFilterStartTime = Core.Nothing,
userProcessFilterStatuses = Core.Nothing,
userProcessFilterTypes = Core.Nothing,
userProcessFilterUserAccessLevels = Core.Nothing
}
instance Core.GoogleRequest ScriptProcessesList where
type
Rs ScriptProcessesList =
ListUserProcessesResponse
type Scopes ScriptProcessesList = '[Script'Processes]
requestClient ScriptProcessesList {..} =
go
xgafv
accessToken
callback
pageSize
pageToken
uploadType
uploadProtocol
userProcessFilterDeploymentId
userProcessFilterEndTime
userProcessFilterFunctionName
userProcessFilterProjectName
userProcessFilterScriptId
userProcessFilterStartTime
(userProcessFilterStatuses Core.^. Core._Default)
(userProcessFilterTypes Core.^. Core._Default)
( userProcessFilterUserAccessLevels
Core.^. Core._Default
)
(Core.Just Core.AltJSON)
scriptService
where
go =
Core.buildClient
( Core.Proxy ::
Core.Proxy ScriptProcessesListResource
)
Core.mempty
|
2caf6764e13beed904331ae826de9fd94db4e27b510fe32e6c829fcd32c45398 | johnelse/mkaudio | mkaudio.ml | open Cmdliner
(* man page definition. *)
let help man_format cmds topic =
match topic with
| None -> `Help (`Pager, None)
| Some topic ->
let topics = "topics" :: cmds in
let conv, _ = Arg.enum (List.rev_map (fun s -> (s, s)) topics) in
match conv topic with
| `Error e -> `Error (false, e)
| `Ok t when t = "topics" ->
List.iter print_endline topics;
`Ok (Result.Ok ())
| `Ok t when List.mem t cmds -> `Help (man_format, Some t)
| `Ok _ ->
let page = (topic, 7, "", "", ""), [`S topic; `P "Say something"] in
Manpage.print man_format Format.std_formatter page;
`Ok (Result.Ok ())
let help_secs = [
`S "MORE HELP";
`P "Use `$(mname) $(i,command) --help' for help on a single command.";
`Noblank;
]
let help_cmd =
let topic =
let doc = "The topic to get help on. `topics' lists the topics." in
Arg.(value & pos 0 (some string) None & info [] ~docv:"TOPIC" ~doc)
in
let doc = "display help about mkaudio" in
let man = [
`S "DESCRIPTION";
`P "Prints help about mkaudio commands."
] @ help_secs in
Term.(ret (pure help $ Term.man_format $ Term.choice_names $ topic)),
Term.info "help" ~doc ~man
(* Argument definitions. *)
let channels =
let doc = "The number of channels to use when creating the audio file." in
Arg.(value & opt int 2 & info ["channels"] ~docv:"CHANNELS" ~doc)
let sample_rate =
let doc = "The sample rate to use when creating the audio file." in
Arg.(value & opt int 44100 & info ["sample-rate"] ~docv:"SAMPLERATE" ~doc)
let gain =
let doc = "The gain to apply to the audio output." in
Arg.(value & opt float 1.0 & info ["gain"] ~docv:"GAIN" ~doc)
let duration =
let doc = "The duration of the created file. Expected format is a number
followed by 's', 'm' or 'h', specifying seconds, minutes or hours
respectively. For example, 5s is 5 seconds, 2.5m is 2.5 minutes"
in
Arg.(value & opt (some string) None & info ["duration"] ~docv:"DURATION" ~doc)
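(* Illustrative sketch, not part of the original mkaudio source: one way the
   "<number><s|m|h>" duration format described above could be turned into
   seconds. The real parsing presumably lives in the Commands module. *)
let _example_parse_duration str =
  let len = String.length str in
  if len < 2 then None
  else
    match float_of_string_opt (String.sub str 0 (len - 1)), str.[len - 1] with
    | Some v, 's' -> Some v
    | Some v, 'm' -> Some (v *. 60.)
    | Some v, 'h' -> Some (v *. 3600.)
    | _ -> None
(* e.g. _example_parse_duration "2.5m" = Some 150.
   and _example_parse_duration "oops" = None *)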
let tempo =
let doc = "The tempo of the created file in beats per minute." in
Arg.(value & opt (some float) None & info ["tempo"] ~docv:"TEMPO" ~doc)
let beats =
let doc = "The duration of the created file in beats." in
Arg.(value & opt (some int) None & info ["beats"] ~docv:"BEATS" ~doc)
let frequency =
let doc = "The frequency of the generated sound" in
Arg.(value & opt float 440. & info ["frequency"] ~docv:"FREQUENCY" ~doc)
let kick =
let doc = "String describing the kick drum pattern." in
Arg.(value & opt (some string) None & info ["kick"] ~docv:"KICK" ~doc)
let snare =
let doc = "String describing the snare drum pattern." in
Arg.(value & opt (some string) None & info ["snare"] ~docv:"SNARE" ~doc)
let hihat =
let doc = "String describing the hihat drum pattern." in
Arg.(value & opt (some string) None & info ["hihat"] ~docv:"HIHAT" ~doc)
let repeats =
let doc = "Number of times the beat will be repeated" in
Arg.(value & opt int 1 & info ["repeats"] ~docv:"REPEATS" ~doc)
let output_file =
let doc = "The file to write." in
Arg.(required & pos 0 (some string) None & info [] ~docv:"OUTPUTFILE" ~doc)
(* Command definitions. *)
let saw_cmd =
let doc = "write an audio file containing a saw wave." in
let man = [
`S "DESCRIPTION";
`P "Write an audio file containing a saw wave.";
] @ help_secs in
Term.(pure Commands.saw
$ channels
$ sample_rate
$ gain
$ duration
$ tempo
$ beats
$ frequency
$ output_file),
Term.info "saw" ~doc ~man
let sine_cmd =
let doc = "write an audio file containing a sine wave." in
let man = [
`S "DESCRIPTION";
`P "Write an audio file containing a sine wave.";
] @ help_secs in
Term.(pure Commands.sine
$ channels
$ sample_rate
$ gain
$ duration
$ tempo
$ beats
$ frequency
$ output_file),
Term.info "sine" ~doc ~man
let square_cmd =
let doc = "write an audio file containing a square wave." in
let man = [
`S "DESCRIPTION";
`P "Write an audio file containing a square wave.";
] @ help_secs in
Term.(pure Commands.square
$ channels
$ sample_rate
$ gain
$ duration
$ tempo
$ beats
$ frequency
$ output_file),
Term.info "square" ~doc ~man
let triangle_cmd =
let doc = "write an audio file containing a triangle wave." in
let man = [
`S "DESCRIPTION";
`P "Write an audio file containing a triangle wave.";
] @ help_secs in
Term.(pure Commands.triangle
$ channels
$ sample_rate
$ gain
$ duration
$ tempo
$ beats
$ frequency
$ output_file),
Term.info "triangle" ~doc ~man
let white_noise_cmd =
let doc = "write an audio file containing white noise." in
let man = [
`S "DESCRIPTION";
`P "Write an audio file containing white noise.";
] @ help_secs in
Term.(pure Commands.white_noise
$ channels
$ sample_rate
$ gain
$ duration
$ tempo
$ beats
$ output_file),
Term.info "white-noise" ~doc ~man
let beat_cmd =
let doc = "write an audio file containing a beat." in
let man = [
`S "DESCRIPTION";
`P "Write an audio file containing a beat.";
`P "";
`P "Kick, snare and hihat patterns should be specified as equal-length
strings, where each character which is '1' or 'x' corresponds to a
drum hit, and any other character corresponds to a lack of drum hit.";
] @ help_secs in
Term.(pure Commands.beat
$ channels
$ sample_rate
$ gain
$ tempo
$ kick
$ snare
$ hihat
$ repeats
$ output_file),
Term.info "beat" ~doc ~man
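(* Illustrative sketch, not part of the original source: the pattern-string
   convention explained in the man text above ('1' or 'x' marks a drum hit,
   any other character is a rest) could be decoded as follows. *)
let _example_parse_pattern pattern =
  List.init (String.length pattern)
    (fun i -> match pattern.[i] with '1' | 'x' -> true | _ -> false)
(* e.g. _example_parse_pattern "x..x" = [true; false; false; true] *)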
let default_command =
let doc = "mkaudio" in
let man = help_secs in
Term.(ret (pure (fun _ -> `Help (`Pager, None)) $ pure ())),
Term.info "mkaudio" ~version:"1.1.0" ~doc ~man
let commands = [
help_cmd;
saw_cmd;
sine_cmd;
square_cmd;
triangle_cmd;
white_noise_cmd;
beat_cmd;
]
let () =
Printexc.record_backtrace true;
match Term.eval_choice default_command commands with
| `Error _ -> exit 1
| `Ok (Result.Ok ()) | `Version | `Help -> ()
| `Ok (Result.Error msg) ->
print_endline msg;
exit 1
| null | https://raw.githubusercontent.com/johnelse/mkaudio/172fc14443feafae2f0542948e53c1621f42bed2/bin/mkaudio.ml | ocaml | man page definition.
Argument definitions.
Command definitions. | open Cmdliner
let help man_format cmds topic =
match topic with
| None -> `Help (`Pager, None)
| Some topic ->
let topics = "topics" :: cmds in
let conv, _ = Arg.enum (List.rev_map (fun s -> (s, s)) topics) in
match conv topic with
| `Error e -> `Error (false, e)
| `Ok t when t = "topics" ->
List.iter print_endline topics;
`Ok (Result.Ok ())
| `Ok t when List.mem t cmds -> `Help (man_format, Some t)
| `Ok _ ->
let page = (topic, 7, "", "", ""), [`S topic; `P "Say something"] in
Manpage.print man_format Format.std_formatter page;
`Ok (Result.Ok ())
let help_secs = [
`S "MORE HELP";
`P "Use `$(mname) $(i,command) --help' for help on a single command.";
`Noblank;
]
let help_cmd =
let topic =
let doc = "The topic to get help on. `topics' lists the topics." in
Arg.(value & pos 0 (some string) None & info [] ~docv:"TOPIC" ~doc)
in
let doc = "display help about mkaudio" in
let man = [
`S "DESCRIPTION";
`P "Prints help about mkaudio commands."
] @ help_secs in
Term.(ret (pure help $ Term.man_format $ Term.choice_names $ topic)),
Term.info "help" ~doc ~man
let channels =
let doc = "The number of channels to use when creating the audio file." in
Arg.(value & opt int 2 & info ["channels"] ~docv:"CHANNELS" ~doc)
let sample_rate =
let doc = "The sample rate to use when creating the audio file." in
Arg.(value & opt int 44100 & info ["sample-rate"] ~docv:"SAMPLERATE" ~doc)
let gain =
let doc = "The gain to apply to the audio output." in
Arg.(value & opt float 1.0 & info ["gain"] ~docv:"GAIN" ~doc)
let duration =
let doc = "The duration of the created file. Expected format is a number
followed by 's', 'm' or 'h', specifying seconds, minutes or hours
respectively. For example, 5s is 5 seconds, 2.5m is 2.5 minutes"
in
Arg.(value & opt (some string) None & info ["duration"] ~docv:"DURATION" ~doc)
let tempo =
let doc = "The tempo of the created file in beats per minute." in
Arg.(value & opt (some float) None & info ["tempo"] ~docv:"TEMPO" ~doc)
let beats =
let doc = "The duration of the created file in beats." in
Arg.(value & opt (some int) None & info ["beats"] ~docv:"BEATS" ~doc)
let frequency =
let doc = "The frequency of the generated sound" in
Arg.(value & opt float 440. & info ["frequency"] ~docv:"FREQUENCY" ~doc)
let kick =
let doc = "String describing the kick drum pattern." in
Arg.(value & opt (some string) None & info ["kick"] ~docv:"KICK" ~doc)
let snare =
let doc = "String describing the snare drum pattern." in
Arg.(value & opt (some string) None & info ["snare"] ~docv:"SNARE" ~doc)
let hihat =
let doc = "String describing the hihat drum pattern." in
Arg.(value & opt (some string) None & info ["hihat"] ~docv:"HIHAT" ~doc)
let repeats =
let doc = "Number of times the beat will be repeated" in
Arg.(value & opt int 1 & info ["repeats"] ~docv:"REPEATS" ~doc)
let output_file =
let doc = "The file to write." in
Arg.(required & pos 0 (some string) None & info [] ~docv:"OUTPUTFILE" ~doc)
let saw_cmd =
let doc = "write an audio file containing a saw wave." in
let man = [
`S "DESCRIPTION";
`P "Write an audio file containing a saw wave.";
] @ help_secs in
Term.(pure Commands.saw
$ channels
$ sample_rate
$ gain
$ duration
$ tempo
$ beats
$ frequency
$ output_file),
Term.info "saw" ~doc ~man
let sine_cmd =
let doc = "write an audio file containing a sine wave." in
let man = [
`S "DESCRIPTION";
`P "Write an audio file containing a sine wave.";
] @ help_secs in
Term.(pure Commands.sine
$ channels
$ sample_rate
$ gain
$ duration
$ tempo
$ beats
$ frequency
$ output_file),
Term.info "sine" ~doc ~man
let square_cmd =
let doc = "write an audio file containing a square wave." in
let man = [
`S "DESCRIPTION";
`P "Write an audio file containing a square wave.";
] @ help_secs in
Term.(pure Commands.square
$ channels
$ sample_rate
$ gain
$ duration
$ tempo
$ beats
$ frequency
$ output_file),
Term.info "square" ~doc ~man
let triangle_cmd =
let doc = "write an audio file containing a triangle wave." in
let man = [
`S "DESCRIPTION";
`P "Write an audio file containing a triangle wave.";
] @ help_secs in
Term.(pure Commands.triangle
$ channels
$ sample_rate
$ gain
$ duration
$ tempo
$ beats
$ frequency
$ output_file),
Term.info "triangle" ~doc ~man
let white_noise_cmd =
let doc = "write an audio file containing white noise." in
let man = [
`S "DESCRIPTION";
`P "Write an audio file containing white noise.";
] @ help_secs in
Term.(pure Commands.white_noise
$ channels
$ sample_rate
$ gain
$ duration
$ tempo
$ beats
$ output_file),
Term.info "white-noise" ~doc ~man
let beat_cmd =
let doc = "write an audio file containing a beat." in
let man = [
`S "DESCRIPTION";
`P "Write an audio file containing a beat.";
`P "";
`P "Kick, snare and hihat patterns should be specified as equal-length
strings, where each character that is '1' or 'x' corresponds to a
drum hit, and any other character corresponds to a lack of drum hit.";
] @ help_secs in
Term.(pure Commands.beat
$ channels
$ sample_rate
$ gain
$ tempo
$ kick
$ snare
$ hihat
$ repeats
$ output_file),
Term.info "beat" ~doc ~man
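The DESCRIPTION above defines the pattern format: equal-length strings in which '1' or 'x' marks a drum hit. A minimal sketch of a parser for that format; `hits_of_pattern` is a hypothetical helper for illustration, not part of mkaudio.

```ocaml
(* true = drum hit, false = no hit, following the man-page convention that
   '1' or 'x' marks a hit and any other character marks a rest. *)
let hits_of_pattern (s : string) : bool list =
  List.init (String.length s) (fun i -> s.[i] = '1' || s.[i] = 'x')

let () =
  assert (hits_of_pattern "x..x" = [true; false; false; true]);
  assert (hits_of_pattern "1010" = [true; false; true; false])
```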
let default_command =
let doc = "mkaudio" in
let man = help_secs in
Term.(ret (pure (fun _ -> `Help (`Pager, None)) $ pure ())),
Term.info "mkaudio" ~version:"1.1.0" ~doc ~man
let commands = [
help_cmd;
saw_cmd;
sine_cmd;
square_cmd;
triangle_cmd;
white_noise_cmd;
beat_cmd;
]
let () =
Printexc.record_backtrace true;
match Term.eval_choice default_command commands with
| `Error _ -> exit 1
| `Ok (Result.Ok ()) | `Version | `Help -> ()
| `Ok (Result.Error msg) ->
print_endline msg;
exit 1
|
7b1c24f16751395b64e70d529d0002041cfdcb00f4ccdee1c69032691dfc2311 | dancrossnyc/multics | pe550.ctl.lisp | ;;; ***********************************************************
;;; *                                                         *
;;; * Copyright, (C) Honeywell Information Systems Inc., 1982 *
;;; *                                                         *
;;; * Copyright (c) 1978 by Massachusetts Institute of        *
;;; * Technology and Honeywell Information Systems, Inc.      *
;;; *                                                         *
;;; ***********************************************************
;;;
;;;
;;; pe550ctl - BSG 7/19/79 -- from
;;; FOX-1100 control package
;;; GMP on 08/17/78
;;;
(declare (special X Y screenheight screenlinelen ospeed))
(declare (special tty-type))
(declare (special idel-lines-availablep idel-chars-availablep))
; Initialize terminal and terminal control package.
(defun DCTL-init ()
(setq idel-lines-availablep nil
idel-chars-availablep nil
screenheight 24.
screenlinelen 79.
tty-type 'pe550)
(setq X -1 Y -1)
(DCTL-position-cursor 0 0)
(DCTL-clear-rest-of-screen))
; Move terminal's cursor to desired position.
(defun DCTL-position-cursor (x y)
(cond ((and (= x X) (= y Y))
nil)
((and (= x 0) (= y 0))
(Rtyo 33) (Rprinc "H")
(setq X 0 Y 0))
(t (or (= x X)
(cond ((= x 0)
(Rtyo 15))
((< (abs (- x X)) 2)
(cond ((< X x)
(do ex X (1+ ex) (= ex x)
(Rtyo 33) (Rprinc "C")))
((< x X)
(do ex x (1+ ex) (= ex X) (Rtyo 010)))))
(t (Rtyo 33) (Rprinc "Y") (Rtyo (+ 40 x)))))
(or (= y Y)
(cond ((= y (1+ Y))
(Rtyo 12))
((< (abs (- y Y)) 2)
(cond ((< Y y)
(do wy Y (1+ wy) (= wy y)
(Rtyo 33) (Rprinc "B")))
((< y Y)
(do wy y (1+ wy) (= wy Y)
(Rtyo 33) (Rprinc "A")))))
(t (Rtyo 33) (Rprinc "X") (Rtyo (+ 40 y)))))
(setq X x Y y))))
; Output string.
(defun DCTL-display-char-string (string)
(setq X (+ X (stringlength string)))
(Rprinc string))
; Clear to end of screen.
(defun DCTL-clear-rest-of-screen () ;Really clear whole screen
(Rtyo 33) (Rprinc "K")(DCTL-pad 20000.)
(setq X 0 Y 0))
; Clear to end of line.
(defun DCTL-kill-line ()
(Rtyo 33) (Rprinc "I")(DCTL-pad 20000.))
; Send pad characters to wait specified no. of microseconds.
(defun DCTL-pad (n)
(do i (// (* n ospeed) 1000000.) (1- i) (= i 0)
(Rtyo 0)))
| null | https://raw.githubusercontent.com/dancrossnyc/multics/dc291689edf955c660e57236da694630e2217151/library_dir_dir/system_library_unbundled/source/bound_emacs_ctls_.s.archive/pe550.ctl.lisp | lisp |
|
afa332d35b97ef0cb17e4789f3bbda06479942600dec6c904f0e4c035b581866 | acieroid/scala-am | church-6.scm | (letrec ((zero (lambda (f x) x))
(inc (lambda (n)
(lambda (f x)
(f (n f x)))))
(plus (lambda (m n)
(lambda (f x)
(m f (n f x))))))
((plus (inc (inc (inc zero))) (plus (inc (inc zero)) (inc zero))) (lambda (x) (+ x 1)) 0))
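The Scheme expression above encodes 3 + (2 + 1) with Church numerals and applies the result to a successor function and 0. The same construction transcribed into OCaml, for reference (names mirror the Scheme bindings):

```ocaml
(* Church numerals: a number n is the n-fold application of f. *)
let zero _f x = x
let inc n f x = f (n f x)
let plus m n f x = m f (n f x)

let () =
  (* 3 + (2 + 1) = 6, evaluated with successor and 0, as in the Scheme test *)
  let six = plus (inc (inc (inc zero))) (plus (inc (inc zero)) (inc zero)) in
  assert (six (fun x -> x + 1) 0 = 6)
```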
| null | https://raw.githubusercontent.com/acieroid/scala-am/13ef3befbfc664b77f31f56847c30d60f4ee7dfe/test/R5RS/church-6.scm | scheme |
| |
7562776f8948c70bc6251eba4054a5c25411ef58a1fd28033e5b9eecc6a6c5b9 | c-cube/qcheck | QCheck_tests.ml | (** QCheck(1) tests **)
(* Please add any additional tests to both [QCheck_tests.ml] and [QCheck2_tests.ml].
This ensures that both generator approaches continue to work as expected
and furthermore allows us to compare their behaviour with
[diff -y test/core/QCheck_expect_test.expected test/core/QCheck2_expect_test.expected] *)
(** Module representing a tree data structure, used in tests *)
module IntTree = struct
open QCheck
type tree = Leaf of int | Node of tree * tree
let leaf x = Leaf x
let node x y = Node (x,y)
let rec depth = function
| Leaf _ -> 1
| Node (x, y) -> 1 + max (depth x) (depth y)
let rec print_tree = function
| Leaf x -> Printf.sprintf "Leaf %d" x
| Node (x, y) -> Printf.sprintf "Node (%s, %s)" (print_tree x) (print_tree y)
let gen_tree = Gen.(sized @@ fix
(fun self n -> match n with
| 0 -> map leaf nat
| n ->
frequency
[1, map leaf nat;
2, map2 node (self (n/2)) (self (n/2))]
))
let rec shrink_tree t = match t with
| Leaf l -> Iter.map (fun l' -> Leaf l') (Shrink.int l)
| Node (x,y) ->
let open Iter in
of_list [x;y]
<+> map (fun x' -> Node (x',y)) (shrink_tree x)
<+> map (fun y' -> Node (x,y')) (shrink_tree y)
let rec rev_tree = function
| Node (x, y) -> Node (rev_tree y, rev_tree x)
| Leaf x -> Leaf x
let rec contains_only_n tree n = match tree with
| Leaf n' -> n = n'
| Node (x, y) -> contains_only_n x n && contains_only_n y n
end
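A quick sketch of how `IntTree.gen_tree` can be sampled outside of a `Test.make` run; `Gen.generate1` draws a single value from a generator. Assumes the `qcheck` library is available and the `IntTree` module above is in scope.

```ocaml
(* Draw one random tree and inspect it; depth of any generated tree is >= 1
   because the base case of the sized fixpoint is a Leaf. *)
let () =
  let t = QCheck.Gen.generate1 IntTree.gen_tree in
  print_endline (IntTree.print_tree t);
  assert (IntTree.depth t >= 1)
```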
(* tests of overall functionality *)
module Overall = struct
open QCheck
let passing =
Test.make ~name:"list_rev_is_involutive" ~count:100 ~long_factor:100
(list small_int) (fun l -> List.rev (List.rev l) = l)
let failing =
Test.make ~name:"should_fail_sort_id" ~count:10
(small_list small_int) (fun l -> l = List.sort compare l)
exception Error
let error =
Test.make ~name:"should_error_raise_exn" ~count:10
int (fun _ -> raise Error)
let collect =
Test.make ~name:"collect_results" ~count:100 ~long_factor:100
(make ~collect:string_of_int (Gen.int_bound 4))
(fun _ -> true)
let stats =
Test.make ~name:"with_stats" ~count:100 ~long_factor:100
(make (Gen.int_bound 120)
~stats:[
"mod4", (fun i->i mod 4);
"num", (fun i->i);
])
(fun _ -> true)
let retries =
Test.make ~name:"with shrinking retries" ~retries:10
small_nat (fun i -> Printf.printf "%i %!" i; i mod 3 <> 1)
let bad_assume_warn =
Test.make ~name:"WARN_unlikely_precond" ~count:2_000
int
(fun x ->
QCheck.assume (x mod 100 = 1);
true)
let bad_assume_fail =
Test.make ~name:"FAIL_unlikely_precond" ~count:2_000
~if_assumptions_fail:(`Fatal, 0.1)
int
(fun x ->
QCheck.assume (x mod 100 = 1);
true)
let bad_gen_fail =
Test.make ~name:"FAIL_bad_gen"
(make Gen.(int >>= fun j -> int_bound j >>= fun i -> return (i,j)))
(fun (_i,_j) -> true) (* i may be negative, causing int_bound to fail *)
let bad_shrinker_fail =
Test.make ~name:"FAIL_bad_shrinker"
(make
~shrink:(fun _i -> raise Error)
Gen.int)
(fun _i -> false)
let neg_test_fail_as_expected =
Test.make_neg ~name:"all ints are even" small_int (fun i -> i mod 2 = 0)
let neg_test_unexpected_success =
Test.make_neg ~name:"int double" small_int (fun i -> i + i = i * 2)
let neg_test_fail_with_shrinking =
Test.make_neg ~name:"list rev concat"
(pair (list small_int) (list small_int)) (fun (is,js) -> (List.rev is)@(List.rev js) = List.rev (is@js))
let pos_test_fails_with_error =
Test.make ~name:"pos fail with error" small_int (fun _i -> raise Error)
let neg_test_fail_with_error =
Test.make_neg ~name:"neg fail with error" small_int (fun _i -> raise Error)
(* [apply_n f x n] computes f(f(...f(x))) with n applications of f *)
let rec apply_n f x n =
if n=0
then x
else apply_n f (f x) (pred n)
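A sanity sketch for the comment above: `apply_n f x n` computes the n-fold application f(f(...f(x))). The function is redefined here so the snippet stands alone.

```ocaml
(* n-fold application of f to x, as documented above *)
let rec apply_n f x n =
  if n = 0 then x else apply_n f (f x) (pred n)

let () =
  assert (apply_n (fun x -> x + 1) 0 5 = 5);     (* successor applied 5 times *)
  assert (apply_n (fun x -> x * 2) 1 10 = 1024)  (* doubling 10 times: 2^10 *)
```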
(* test from #236 *)
let bad_fun_repro =
let sleep_time = 0.175 in
let count = ref 0 in
Test.make ~count:10 ~name:"bad function reproducability"
(set_shrink Shrink.nil (triple small_int (fun1 Observable.int small_int) small_int))
(fun (i,f,j) ->
incr count;
Printf.printf "(%i,fun,%i)%s%!" i j (if !count mod 10 = 0 then "\n" else " ");
Unix.sleepf sleep_time;
if 1 = Float.to_int (Unix.time ()) mod 2
then
(ignore(apply_n (Fn.apply f) i j > 0); true)
else
(ignore(apply_n (Fn.apply f) i i > 0); true))
let tests = [
passing;
failing;
error;
collect;
stats;
retries;
bad_assume_warn;
bad_assume_fail;
bad_gen_fail;
bad_shrinker_fail ;
neg_test_fail_as_expected;
neg_test_unexpected_success;
neg_test_fail_with_shrinking;
pos_test_fails_with_error;
neg_test_fail_with_error;
(* we repeat the following multiple times to check the expected output for duplicate lines *)
bad_fun_repro;
bad_fun_repro;
bad_fun_repro;
bad_fun_repro;
bad_fun_repro;
bad_fun_repro;
bad_fun_repro;
bad_fun_repro;
]
end
(* positive tests of the various generators *)
module Generator = struct
open QCheck
(* example from issue #23 *)
let char_dist_issue_23 =
Test.make ~name:"char never produces '\\255'" ~count:1_000_000 char (fun c -> c <> '\255')
let char_test =
Test.make ~name:"char has right range" ~count:1000
char (fun c -> '\000' <= c && c <= '\255')
let printable_test =
Test.make ~name:"printable has right range" ~count:1000
printable_char (fun c -> c = '\n' || 32 <= Char.code c && Char.code c <= 126)
let numeral_test =
Test.make ~name:"numeral has right range" ~count:1000
numeral_char (fun c -> '0' <= c && c <= '9')
let nat_test =
Test.make ~name:"nat has right range" ~count:1000
(make ~print:Print.int Gen.nat) (fun n -> 0 <= n && n < 10000)
let bytes_test =
Test.make ~name:"bytes has right length and content" ~count:1000
bytes
(fun b ->
let len = Bytes.length b in
0 <= len && len < 10000
&& Bytes.to_seq b |>
Seq.fold_left (fun acc c -> acc && '\000' <= c && c <= '\255') true)
let string_test =
Test.make ~name:"string has right length and content" ~count:1000
string
(fun s ->
let len = String.length s in
0 <= len && len < 10000
&& String.to_seq s |>
Seq.fold_left (fun acc c -> acc && '\000' <= c && c <= '\255') true)
let pair_test =
Test.make ~name:"int pairs - commute over +" ~count:1000
(pair small_nat small_nat) (fun (i,j) -> i+j = j+i)
let triple_test =
Test.make ~name:"int triples - associative over +" ~count:1000
(triple small_nat small_nat small_nat) (fun (i,j,k) -> i+(j+k) = (i+j)+k)
let quad_test =
Test.make ~name:"int quadruples - product of sums" ~count:1000
(quad small_nat small_nat small_nat small_nat)
(fun (h,i,j,k) -> (h+i)*(j+k) = h*j + h*k + i*j + i*k)
let test_tup2 =
Test.make ~count:10
~name:"forall x in (0, 1): x = (0, 1)"
(tup2 (always 0) (always 1))
(fun x -> x = (0, 1))
let test_tup3 =
Test.make ~count:10
~name:"forall x in (0, 1, 2): x = (0, 1, 2)"
(tup3 (always 0) (always 1) (always 2))
(fun x -> x = (0, 1, 2))
let test_tup4 =
Test.make ~count:10
~name:"forall x in (0, 1, 2, 3): x = (0, 1, 2, 3)"
(tup4 (always 0) (always 1) (always 2) (always 3))
(fun x -> x = (0, 1, 2, 3))
let test_tup5 =
Test.make ~count:10
~name:"forall x in (0, 1, 2, 3, 4): x = (0, 1, 2, 3, 4)"
(tup5 (always 0) (always 1) (always 2) (always 3) (always 4))
(fun x -> x = (0, 1, 2, 3, 4))
let test_tup6 =
Test.make ~count:10
~name:"forall x in (0, 1, 2, 3, 4, 5): x = (0, 1, 2, 3, 4, 5)"
(tup6 (always 0) (always 1) (always 2) (always 3) (always 4) (always 5))
(fun x -> x = (0, 1, 2, 3, 4, 5))
let test_tup7 =
Test.make ~count:10
~name:"forall x in (0, 1, 2, 3, 4, 5, 6): x = (0, 1, 2, 3, 4, 5, 6)"
(tup7
(always 0) (always 1) (always 2) (always 3) (always 4)
(always 5) (always 6))
(fun x -> x = (0, 1, 2, 3, 4, 5, 6))
let test_tup8 =
Test.make ~count:10
~name:"forall x in (0, 1, 2, 3, 4, 5, 6, 7): x = (0, 1, 2, 3, 4, 5, 6, 7)"
(tup8
(always 0) (always 1) (always 2) (always 3) (always 4)
(always 5) (always 6) (always 7))
(fun x -> x = (0, 1, 2, 3, 4, 5, 6, 7))
let test_tup9 =
Test.make ~count:10
~name:"forall x in (0, 1, 2, 3, 4, 5, 6, 7, 8): x = (0, 1, 2, 3, 4, 5, 6, 7, 8)"
(tup9
(always 0) (always 1) (always 2) (always 3) (always 4)
(always 5) (always 6) (always 7) (always 8))
(fun x -> x = (0, 1, 2, 3, 4, 5, 6, 7, 8))
let bind_test =
Test.make ~name:"bind test for ordered pairs" ~count:1000
(make Gen.(small_nat >>= fun j -> int_bound j >>= fun i -> return (i,j)))
(fun (i,j) -> i<=j)
let bind_pair_list_length =
Test.make ~name:"bind list length" ~count:1000
(make Gen.(int_bound 1000 >>= fun len ->
list_size (return len) (int_bound 1000) >>= fun xs -> return (len,xs)))
(fun (len,xs) -> len = List.length xs)
let list_test =
Test.make ~name:"list has right length" ~count:1000
(list unit) (fun l -> let len = List.length l in 0 <= len && len < 10_000)
let list_repeat_test =
let gen = Gen.(small_nat >>= fun i -> list_repeat i unit >>= fun l -> return (i,l)) in
Test.make ~name:"list_repeat has constant length" ~count:1000
(make ~print:Print.(pair int (list unit)) gen) (fun (i,l) -> List.length l = i)
let array_repeat_test =
let gen = Gen.(small_nat >>= fun i -> array_repeat i unit >>= fun l -> return (i,l)) in
Test.make ~name:"array_repeat has constant length" ~count:1000
(make ~print:Print.(pair int (array unit)) gen) (fun (i,l) -> Array.length l = i)
let passing_tree_rev =
Test.make ~name:"tree_rev_is_involutive" ~count:1000
(make IntTree.gen_tree)
(fun tree -> IntTree.(rev_tree (rev_tree tree)) = tree)
let nat_split2_spec =
Test.make ~name:"nat_split2 spec"
(make
~print:Print.(pair int (pair int int))
Gen.(small_nat >>= fun n ->
pair (return n) (nat_split2 n)))
(fun (n, (a, b)) ->
0 <= a && 0 <= b && a + b = n)
let pos_split2_spec =
Test.make ~name:"pos_split2 spec"
(make
~print:Print.(pair int (pair int int))
Gen.(small_nat >>= fun n ->
(* we need n > 2 *)
let n = n + 2 in
pair (return n) (pos_split2 n)))
(fun (n, (a, b)) ->
(0 < a && 0 < b && a + b = n))
let range_subset_spec =
Test.make ~name:"range_subset_spec"
(make
~print:Print.(quad int int int (array int))
Gen.(pair small_nat small_nat >>= fun (m, n) ->
(* we must guarantee [low <= high]
and [size <= high - low + 1] *)
let low = m and high = m + n in
int_range 0 (high - low + 1) >>= fun size ->
quad (return size) (return low) (return high)
(range_subset ~size low high)))
(fun (size, low, high, arr) ->
if size = 0 then arr = [||]
else
Array.length arr = size
&& low <= arr.(0)
&& Array.for_all (fun (a, b) -> a < b)
(Array.init (size - 1) (fun k -> arr.(k), arr.(k+1)))
&& arr.(size - 1) <= high)
let nat_split_n_way =
Test.make ~name:"nat_split n-way"
(make
~print:Print.(pair int (array int))
Gen.(small_nat >>= fun n ->
pair (return n) (nat_split ~size:n n)))
(fun (n, arr) ->
Array.length arr = n
&& Array.for_all (fun k -> 0 <= k) arr
&& Array.fold_left (+) 0 arr = n)
let nat_split_smaller =
Test.make ~name:"nat_split smaller"
(make
~print:Print.(triple int int (array int))
Gen.(small_nat >>= fun size ->
int_bound size >>= fun n ->
triple (return size) (return n) (nat_split ~size n)))
(fun (m, n, arr) ->
Array.length arr = m
&& Array.for_all (fun k -> 0 <= k) arr
&& Array.fold_left (+) 0 arr = n)
let pos_split =
Test.make ~name:"pos_split"
(make
~print:Print.(triple int int (array int))
Gen.(pair small_nat small_nat >>= fun (m, n) ->
(* we need both size>0 and n>0 and size <= n *)
let size = 1 + min m n and n = 1 + max m n in
triple (return size) (return n) (pos_split ~size n)))
(fun (m, n, arr) ->
Array.length arr = m
&& Array.for_all (fun k -> 0 < k) arr
&& Array.fold_left (+) 0 arr = n)
let tests = [
char_dist_issue_23;
char_test;
printable_test;
numeral_test;
nat_test;
bytes_test;
string_test;
pair_test;
triple_test;
quad_test;
test_tup2;
test_tup3;
test_tup4;
test_tup5;
test_tup6;
test_tup7;
test_tup8;
test_tup9;
bind_test;
bind_pair_list_length;
list_test;
list_repeat_test;
array_repeat_test;
passing_tree_rev;
nat_split2_spec;
pos_split2_spec;
range_subset_spec;
nat_split_n_way;
nat_split_smaller;
pos_split;
]
end
(* negative tests that exercise shrinking behaviour *)
module Shrink = struct
open QCheck
let rec fac n = match n with
| 0 -> 1
| n -> n * fac (n - 1)
(* example from issue #59 *)
let test_fac_issue59 =
Test.make ~name:"test fac issue59"
(set_shrink Shrink.nil (small_int_corners ()))
(fun n -> try (fac n) mod n = 0
with
(*| Stack_overflow -> false*)
| Division_by_zero -> (n=0))
let big_bound_issue59 =
Test.make ~name:"big bound issue59"
(small_int_corners()) (fun i -> i < 209609)
let long_shrink =
let listgen = list_of_size (Gen.int_range 1000 10000) int in
Test.make ~name:"long_shrink" (pair listgen listgen)
(fun (xs,ys) -> List.rev (xs@ys) = (List.rev xs)@(List.rev ys))
(* test from issue #36 *)
let ints_arent_0_mod_3 =
Test.make ~name:"ints arent 0 mod 3" ~count:1000
int (fun i -> i mod 3 <> 0)
let ints_are_0 =
Test.make ~name:"ints are 0" ~count:1000
int (fun i -> Printf.printf "%i\n" i; i = 0)
(* test from issue #59 *)
let ints_smaller_209609 =
Test.make ~name:"ints < 209609"
(small_int_corners()) (fun i -> i < 209609)
let nats_smaller_5001 =
Test.make ~name:"nat < 5001" ~count:1000
(make ~print:Print.int ~shrink:Shrink.int Gen.nat) (fun n -> n < 5001)
let char_is_never_abcdef =
Test.make ~name:"char never produces 'abcdef'" ~count:1000
char (fun c -> not (List.mem c ['a';'b';'c';'d';'e';'f']))
let printable_is_never_sign = (* should shrink towards 'a', hence produce '&' with highest ascii code 38 *)
Test.make ~name:"printable never produces '!\"#$%&'" ~count:1000
printable_char (fun c -> not (List.mem c ['!';'"';'#';'$';'%';'&']))
let numeral_is_never_less_5 =
Test.make ~name:"printable never produces less than '5" ~count:1000
numeral_char (fun c -> c >= '5')
let bytes_are_empty =
Test.make ~name:"bytes are empty" ~count:1000
bytes (fun b -> b = Bytes.empty)
let bytes_never_has_000_char =
Test.make ~name:"bytes never has a \\000 char" ~count:1000
bytes
(fun b -> Bytes.to_seq b |> Seq.fold_left (fun acc c -> acc && c <> '\000') true)
let bytes_never_has_255_char =
Test.make ~name:"bytes never has a \\255 char" ~count:1000
bytes
(fun s -> Bytes.to_seq s |> Seq.fold_left (fun acc c -> acc && c <> '\255') true)
let bytes_unique_chars =
Test.make ~name:"bytes have unique chars" ~count:1000
bytes
(fun s ->
let ch_list = Bytes.to_seq s |> List.of_seq in
List.length ch_list = List.length (List.sort_uniq Char.compare ch_list))
let strings_are_empty =
Test.make ~name:"strings are empty" ~count:1000
string (fun s -> s = "")
let string_never_has_000_char =
Test.make ~name:"string never has a \\000 char" ~count:1000
string
(fun s -> String.to_seq s |> Seq.fold_left (fun acc c -> acc && c <> '\000') true)
let string_never_has_255_char =
Test.make ~name:"string never has a \\255 char" ~count:1000
string
(fun s -> String.to_seq s |> Seq.fold_left (fun acc c -> acc && c <> '\255') true)
let string_unique_chars =
Test.make ~name:"strings have unique chars" ~count:1000
string
(fun s ->
let ch_list = String.to_seq s |> List.of_seq in
List.length ch_list = List.length (List.sort_uniq Char.compare ch_list))
(* test from issue #167 *)
let pair_diff_issue_64 =
Test.make ~name:"pairs have different components"
(pair small_int small_int) (fun (i,j) -> i<>j)
let pair_same =
Test.make ~name:"pairs have same components" (pair int int) (fun (i,j) -> i=j)
let pair_one_zero =
Test.make ~name:"pairs have a zero component" (pair int int) (fun (i,j) -> i=0 || j=0)
let pair_all_zero =
Test.make ~name:"pairs are (0,0)" (pair int int) (fun (i,j) -> i=0 && j=0)
let pair_ordered =
Test.make ~name:"pairs are ordered" (pair pos_int pos_int) (fun (i,j) -> i<=j)
let pair_ordered_rev =
Test.make ~name:"pairs are ordered reversely" (pair pos_int pos_int) (fun (i,j) -> i>=j)
let pair_sum_lt_128 =
Test.make ~name:"pairs sum to less than 128" (pair pos_int pos_int) (fun (i,j) -> i+j<128)
let pair_lists_rev_concat =
Test.make ~name:"pairs lists rev concat"
(pair (list pos_int) (list pos_int))
(fun (xs,ys) -> List.rev (xs@ys) = (List.rev xs)@(List.rev ys))
let pair_lists_no_overlap =
Test.make ~name:"pairs lists no overlap"
(pair (list small_nat) (list small_nat))
(fun (xs,ys) -> List.for_all (fun x -> not (List.mem x ys)) xs)
let triple_diff =
Test.make ~name:"triples have pair-wise different components"
(triple small_int small_int small_int) (fun (i,j,k) -> i<>j && j<>k)
let triple_same =
Test.make ~name:"triples have same components"
(triple int int int) (fun (i,j,k) -> i=j || j=k)
let triple_ordered =
Test.make ~name:"triples are ordered"
(triple int int int) (fun (i,j,k) -> i<=j && j<=k)
let triple_ordered_rev =
Test.make ~name:"triples are ordered reversely"
(triple int int int) (fun (i,j,k) -> i>=j && j>=k)
let quad_diff =
Test.make ~name:"quadruples have pair-wise different components"
(quad small_int small_int small_int small_int) (fun (h,i,j,k) -> h<>i && i<>j && j<>k)
let quad_same =
Test.make ~name:"quadruples have same components"
(quad int int int int) (fun (h,i,j,k) -> h=i || i=j || j=k)
let quad_ordered =
Test.make ~name:"quadruples are ordered"
(quad int int int int) (fun (h,i,j,k) -> h <= i && i <= j && j <= k)
let quad_ordered_rev =
Test.make ~name:"quadruples are ordered reversely"
(quad int int int int) (fun (h,i,j,k) -> h >= i && i >= j && j >= k)
let test_tup2 =
Test.make
~name:"forall (a, b) in nat: a < b"
(tup2 small_int small_int)
(fun (a, b) -> a < b)
let test_tup3 =
Test.make
~name:"forall (a, b, c) in nat: a < b < c"
(tup3 small_int small_int small_int)
(fun (a, b, c) -> a < b && b < c)
let test_tup4 =
Test.make
~name:"forall (a, b, c, d) in nat: a < b < c < d"
(tup4 small_int small_int small_int small_int)
(fun (a, b, c, d) -> a < b && b < c && c < d)
let test_tup5 =
Test.make
~name:"forall (a, b, c, d, e) in nat: a < b < c < d < e"
(tup5 small_int small_int small_int small_int small_int)
(fun (a, b, c, d, e) -> a < b && b < c && c < d && d < e)
let test_tup6 =
Test.make
~name:"forall (a, b, c, d, e, f) in nat: a < b < c < d < e < f"
(tup6 small_int small_int small_int small_int small_int small_int)
(fun (a, b, c, d, e, f) -> a < b && b < c && c < d && d < e && e < f)
let test_tup7 =
Test.make
~name:"forall (a, b, c, d, e, f, g) in nat: a < b < c < d < e < f < g"
(tup7 small_int small_int small_int small_int small_int small_int small_int)
(fun (a, b, c, d, e, f, g) -> a < b && b < c && c < d && d < e && e < f && f < g)
let test_tup8 =
Test.make
~name:"forall (a, b, c, d, e, f, g, h) in nat: a < b < c < d < e < f < g < h"
(tup8 small_int small_int small_int small_int small_int small_int small_int small_int)
(fun (a, b, c, d, e, f, g, h) -> a < b && b < c && c < d && d < e && e < f && f < g && g < h)
let test_tup9 =
Test.make
~name:"forall (a, b, c, d, e, f, g, h, i) in nat: a < b < c < d < e < f < g < h < i"
(tup9 small_int small_int small_int small_int small_int small_int small_int small_int small_int)
(fun (a, b, c, d, e, f, g, h, i) -> a < b && b < c && c < d && d < e && e < f && f < g && g < h && h < i)
let bind_pair_ordered =
Test.make ~name:"bind ordered pairs"
(make ~print:Print.(pair int int)
~shrink:Shrink.(filter (fun (i,j) -> i<=j) (pair int int))
Gen.(pint >>= fun j -> int_bound j >>= fun i -> return (i,j)))
(fun (_i,_j) -> false)
let bind_pair_list_size =
let shrink (_l,xs) =
Iter.map (fun xs' -> (List.length xs',xs')) Shrink.(list ~shrink:int xs) in
Test.make ~name:"bind list_size constant"
(make ~print:Print.(pair int (list int)) ~shrink
Gen.(int_bound 1000 >>= fun len ->
list_size (return len) (int_bound 1000) >>= fun xs -> return (len,xs)))
(fun (len,xs) -> let len' = List.length xs in len=len' && len' < 4)
let print_list xs = print_endline Print.(list int xs)
(* test from issue #64 *)
let lists_are_empty_issue_64 =
Test.make ~name:"lists are empty"
(list small_int) (fun xs -> print_list xs; xs = [])
let list_shorter_10 =
Test.make ~name:"lists shorter than 10"
(list small_int) (fun xs -> List.length xs < 10)
let length_printer xs =
Printf.sprintf "[...] list length: %i" (List.length xs)
let size_gen = Gen.(oneof [small_nat; int_bound 750_000])
let list_shorter_432 =
Test.make ~name:"lists shorter than 432"
(set_print length_printer (list_of_size size_gen small_int))
(fun xs -> List.length xs < 432)
let list_shorter_4332 =
Test.make ~name:"lists shorter than 4332"
(set_shrink Shrink.list_spine (set_print length_printer (list_of_size size_gen small_int)))
(fun xs -> List.length xs < 4332)
let list_equal_dupl =
Test.make ~name:"lists equal to duplication"
(list_of_size size_gen small_int)
(fun xs -> try xs = xs @ xs
with Stack_overflow -> false)
let list_unique_elems =
Test.make ~name:"lists have unique elems"
(list small_int)
(fun xs -> let ys = List.sort_uniq Int.compare xs in
print_list xs; List.length xs = List.length ys)
let tree_contains_only_42 =
Test.make ~name:"tree contains only 42"
IntTree.(make ~print:print_tree ~shrink:shrink_tree gen_tree)
(fun tree -> IntTree.contains_only_n tree 42)
let tests = [
(*test_fac_issue59;*)
big_bound_issue59;
long_shrink;
ints_arent_0_mod_3;
ints_are_0;
ints_smaller_209609;
nats_smaller_5001;
char_is_never_abcdef;
printable_is_never_sign;
numeral_is_never_less_5;
bytes_are_empty;
bytes_never_has_000_char;
bytes_never_has_255_char;
bytes_unique_chars;
strings_are_empty;
string_never_has_000_char;
string_never_has_255_char;
string_unique_chars;
pair_diff_issue_64;
pair_same;
pair_one_zero;
pair_all_zero;
pair_ordered;
pair_ordered_rev;
pair_sum_lt_128;
pair_lists_rev_concat;
pair_lists_no_overlap;
triple_diff;
triple_same;
triple_ordered;
triple_ordered_rev;
quad_diff;
quad_same;
quad_ordered;
quad_ordered_rev;
test_tup2;
test_tup3;
test_tup4;
test_tup5;
test_tup6;
test_tup7;
test_tup8;
test_tup9;
bind_pair_ordered;
bind_pair_list_size;
lists_are_empty_issue_64;
list_shorter_10;
list_shorter_432;
list_shorter_4332;
(*list_equal_dupl;*)
list_unique_elems;
tree_contains_only_42;
]
end
(* tests function generator and shrinker *)
module Function = struct
open QCheck
let fail_pred_map_commute =
Test.make ~name:"fail_pred_map_commute" ~count:100 ~long_factor:100
(triple
(small_list small_int)
(fun1 Observable.int int)
(fun1 Observable.int bool))
(fun (l,Fun (_,f),Fun (_,p)) ->
List.filter p (List.map f l) = List.map f (List.filter p l))
let fail_pred_strings =
Test.make ~name:"fail_pred_strings" ~count:100
(fun1 Observable.string bool)
(fun (Fun (_,p)) -> not (p "some random string") || p "some other string")
let int_gen = small_nat (* int *)
(* Another example (false) property *)
let prop_foldleft_foldright =
Test.make ~name:"fold_left fold_right" ~count:1000 ~long_factor:20
(triple
int_gen
(list int_gen)
(fun2 Observable.int Observable.int int_gen))
(fun (z,xs,f) ->
let l1 = List.fold_right (Fn.apply f) xs z in
let l2 = List.fold_left (Fn.apply f) z xs in
if l1=l2 then true
else Test.fail_reportf "l=%s, fold_left=%s, fold_right=%s@."
(Print.(list int) xs)
(Print.int l1)
(Print.int l2)
)
(* Another example (false) property *)
let prop_foldleft_foldright_uncurry =
Test.make ~name:"fold_left fold_right uncurried" ~count:1000 ~long_factor:20
(triple
(fun1 Observable.(pair int int) int_gen)
int_gen
(list int_gen))
(fun (f,z,xs) ->
List.fold_right (fun x y -> Fn.apply f (x,y)) xs z =
List.fold_left (fun x y -> Fn.apply f (x,y)) z xs)
(* Same as the above (false) property, but generating+shrinking functions last *)
let prop_foldleft_foldright_uncurry_funlast =
Test.make ~name:"fold_left fold_right uncurried fun last" ~count:1000 ~long_factor:20
(triple
int_gen
(list int_gen)
(fun1 Observable.(pair int int) int_gen))
(fun (z,xs,f) ->
List.fold_right (fun x y -> Fn.apply f (x,y)) xs z =
List.fold_left (fun x y -> Fn.apply f (x,y)) z xs)
(* test from issue #64 *)
let fold_left_test =
Test.make ~name:"fold_left test, fun first"
(quad (* string -> int -> string *)
(fun2 Observable.string Observable.int small_string)
small_string
(list small_int)
(list small_int))
(fun (f,acc,is,js) ->
let f = Fn.apply f in
List.fold_left f acc (is @ js)
= List.fold_left f (List.fold_left f acc is) is) (*Typo*)
let tests = [
fail_pred_map_commute;
fail_pred_strings;
prop_foldleft_foldright;
prop_foldleft_foldright_uncurry;
prop_foldleft_foldright_uncurry_funlast;
fold_left_test;
]
end
(* tests of (inner) find_example(_gen) behaviour *)
module FindExample = struct
open QCheck
let find_ex =
Test.make ~name:"find_example" (2--50)
(fun n ->
let st = Random.State.make [| 0 |] in
let f m = n < m && m < 2 * n in
try
let m = find_example_gen ~rand:st ~count:100_000 ~f Gen.(0 -- 1000) in
f m
with No_example_found _ -> false)
let find_ex_uncaught_issue_99_1_fail =
let rs = make (find_example ~count:10 ~f:(fun _ -> false) Gen.int) in
Test.make ~name:"FAIL_#99_1" rs (fun _ -> true)
let find_ex_uncaught_issue_99_2_succeed =
Test.make ~name:"should_succeed_#99_2" ~count:10
int (fun i -> i <= max_int)
let tests = [
find_ex;
find_ex_uncaught_issue_99_1_fail;
find_ex_uncaught_issue_99_2_succeed;
]
end
(* tests of statistics and histogram display *)
module Stats = struct
open QCheck
let bool_dist =
Test.make ~name:"bool dist" ~count:500_000 (set_collect Bool.to_string bool) (fun _ -> true)
let char_dist_tests =
[
Test.make ~name:"char code dist" ~count:500_000 (add_stat ("char code", Char.code) char) (fun _ -> true);
Test.make ~name:"printable char code dist" ~count:500_000 (add_stat ("char code", Char.code) printable_char) (fun _ -> true);
Test.make ~name:"numeral char code dist" ~count:500_000 (add_stat ("char code", Char.code) numeral_char) (fun _ -> true);
]
let bytes_len_tests =
let len = ("len",Bytes.length) in
[
Test.make ~name:"bytes_size len dist" ~count:5_000 (add_stat len (bytes_of_size (Gen.int_range 5 10))) (fun _ -> true);
Test.make ~name:"bytes len dist" ~count:5_000 (add_stat len bytes) (fun _ -> true);
Test.make ~name:"bytes_of len dist" ~count:5_000 (add_stat len (bytes_of (Gen.return 'a'))) (fun _ -> true);
Test.make ~name:"bytes_small len dist" ~count:5_000 (add_stat len bytes_small) (fun _ -> true);
]
let string_len_tests =
let len = ("len",String.length) in
[
Test.make ~name:"string_size len dist" ~count:5_000 (add_stat len (string_of_size (Gen.int_range 5 10))) (fun _ -> true);
Test.make ~name:"string len dist" ~count:5_000 (add_stat len string) (fun _ -> true);
Test.make ~name:"string_of len dist" ~count:5_000 (add_stat len (string_of (Gen.return 'a'))) (fun _ -> true);
Test.make ~name:"printable_string len dist" ~count:5_000 (add_stat len printable_string) (fun _ -> true);
Test.make ~name:"small_string len dist" ~count:5_000 (add_stat len small_string) (fun _ -> true);
]
let pair_dist =
Test.make ~name:"pair dist" ~count:500_000
(add_stat ("pair sum", (fun (i,j) -> i+j))
(pair (int_bound 100) (int_bound 100))) (fun _ -> true)
let triple_dist =
Test.make ~name:"triple dist" ~count:500_000
(add_stat ("triple sum", (fun (i,j,k) -> i+j+k))
(triple (int_bound 100) (int_bound 100) (int_bound 100))) (fun _ -> true)
let quad_dist =
Test.make ~name:"quad dist" ~count:500_000
(add_stat ("quad sum", (fun (h,i,j,k) -> h+i+j+k))
(quad (int_bound 100) (int_bound 100) (int_bound 100) (int_bound 100))) (fun _ -> true)
let bind_dist =
Test.make ~name:"bind dist" ~count:1_000_000
(make ~stats:[("ordered pair difference", (fun (i,j) -> j-i));("ordered pair sum", (fun (i,j) -> i+j))]
Gen.(int_bound 100 >>= fun j -> int_bound j >>= fun i -> return (i,j))) (fun _ -> true)
let list_len_tests =
let len = ("len",List.length) in
    [ (* test from issue #30 *)
Test.make ~name:"list len dist" ~count:5_000 (add_stat len (list int)) (fun _ -> true);
Test.make ~name:"small_list len dist" ~count:5_000 (add_stat len (small_list int)) (fun _ -> true);
Test.make ~name:"list_of_size len dist" ~count:5_000 (add_stat len (list_of_size (Gen.int_range 5 10) int)) (fun _ -> true);
Test.make ~name:"list_repeat len dist" ~count:5_000 (add_stat len (make Gen.(list_repeat 42 int))) (fun _ -> true);
]
let array_len_tests =
let len = ("len",Array.length) in
[
Test.make ~name:"array len dist" ~count:5_000 (add_stat len (array int)) (fun _ -> true);
Test.make ~name:"small_array len dist" ~count:5_000 (add_stat len (make Gen.(small_array int))) (fun _ -> true);
Test.make ~name:"array_of_size len dist" ~count:5_000 (add_stat len (array_of_size (Gen.int_range 5 10) int)) (fun _ -> true);
Test.make ~name:"array_repeat len dist" ~count:5_000 (add_stat len (make Gen.(array_repeat 42 int))) (fun _ -> true);
]
let int_dist_tests =
let dist = ("dist",fun x -> x) in
    [ (* test from issue #40 *)
Test.make ~name:"int_stats_neg" ~count:5000 (add_stat dist small_signed_int) (fun _ -> true);
      (* distribution tests from PR #45 *)
Test.make ~name:"small_signed_int dist" ~count:1000 (add_stat dist small_signed_int) (fun _ -> true);
Test.make ~name:"small_nat dist" ~count:1000 (add_stat dist small_nat) (fun _ -> true);
Test.make ~name:"nat dist" ~count:1000 (add_stat dist (make Gen.nat)) (fun _ -> true);
Test.make ~name:"int_range (-43643) 435434 dist" ~count:1000 (add_stat dist (int_range (-43643) 435434)) (fun _ -> true);
Test.make ~name:"int_range (-40000) 40000 dist" ~count:1000 (add_stat dist (int_range (-40000) 40000)) (fun _ -> true);
Test.make ~name:"int_range (-4) 4 dist" ~count:1000 (add_stat dist (int_range (-4) 4)) (fun _ -> true);
Test.make ~name:"int_range (-4) 17 dist" ~count:1000 (add_stat dist (int_range (-4) 17)) (fun _ -> true);
Test.make ~name:"int dist" ~count:100000 (add_stat dist int) (fun _ -> true);
Test.make ~name:"oneof int dist" ~count:1000 (add_stat dist (oneofl[min_int;-1;0;1;max_int])) (fun _ -> true);
]
let tree_depth_test =
let depth = ("depth", IntTree.depth) in
Test.make ~name:"tree's depth" ~count:1000 (add_stat depth (make IntTree.gen_tree)) (fun _ -> true)
let range_subset_test =
Test.make ~name:"range_subset_spec" ~count:5_000
(add_stat ("dist", fun a -> a.(0)) (make (Gen.range_subset ~size:1 0 20)))
(fun a -> Array.length a = 1)
let int_dist_empty_bucket =
Test.make ~name:"int_dist_empty_bucket" ~count:1_000
(add_stat ("dist",fun x -> x) (oneof [small_int_corners ();int])) (fun _ -> true)
let tests =
[ bool_dist; ]
@ char_dist_tests
@ [tree_depth_test;
range_subset_test;]
@ bytes_len_tests
@ string_len_tests
@ [pair_dist;
triple_dist;
quad_dist;
bind_dist;]
@ list_len_tests
@ array_len_tests
@ int_dist_tests
end
module IntTree = struct
open QCheck
type tree = Leaf of int | Node of tree * tree
let leaf x = Leaf x
let node x y = Node (x,y)
let rec depth = function
| Leaf _ -> 1
| Node (x, y) -> 1 + max (depth x) (depth y)
let rec print_tree = function
| Leaf x -> Printf.sprintf "Leaf %d" x
| Node (x, y) -> Printf.sprintf "Node (%s, %s)" (print_tree x) (print_tree y)
let gen_tree = Gen.(sized @@ fix
(fun self n -> match n with
| 0 -> map leaf nat
| n ->
frequency
[1, map leaf nat;
2, map2 node (self (n/2)) (self (n/2))]
))
let rec shrink_tree t = match t with
| Leaf l -> Iter.map (fun l' -> Leaf l') (Shrink.int l)
| Node (x,y) ->
let open Iter in
of_list [x;y]
<+> map (fun x' -> Node (x',y)) (shrink_tree x)
<+> map (fun y' -> Node (x,y')) (shrink_tree y)
let rec rev_tree = function
| Node (x, y) -> Node (rev_tree y, rev_tree x)
| Leaf x -> Leaf x
let rec contains_only_n tree n = match tree with
| Leaf n' -> n = n'
| Node (x, y) -> contains_only_n x n && contains_only_n y n
end
module Overall = struct
open QCheck
let passing =
Test.make ~name:"list_rev_is_involutive" ~count:100 ~long_factor:100
(list small_int) (fun l -> List.rev (List.rev l) = l)
let failing =
Test.make ~name:"should_fail_sort_id" ~count:10
(small_list small_int) (fun l -> l = List.sort compare l)
exception Error
let error =
Test.make ~name:"should_error_raise_exn" ~count:10
int (fun _ -> raise Error)
let collect =
Test.make ~name:"collect_results" ~count:100 ~long_factor:100
(make ~collect:string_of_int (Gen.int_bound 4))
(fun _ -> true)
let stats =
Test.make ~name:"with_stats" ~count:100 ~long_factor:100
(make (Gen.int_bound 120)
~stats:[
"mod4", (fun i->i mod 4);
"num", (fun i->i);
])
(fun _ -> true)
let retries =
Test.make ~name:"with shrinking retries" ~retries:10
small_nat (fun i -> Printf.printf "%i %!" i; i mod 3 <> 1)
let bad_assume_warn =
Test.make ~name:"WARN_unlikely_precond" ~count:2_000
int
(fun x ->
QCheck.assume (x mod 100 = 1);
true)
let bad_assume_fail =
Test.make ~name:"FAIL_unlikely_precond" ~count:2_000
~if_assumptions_fail:(`Fatal, 0.1)
int
(fun x ->
QCheck.assume (x mod 100 = 1);
true)
let bad_gen_fail =
Test.make ~name:"FAIL_bad_gen"
(make Gen.(int >>= fun j -> int_bound j >>= fun i -> return (i,j)))
let bad_shrinker_fail =
Test.make ~name:"FAIL_bad_shrinker"
(make
~shrink:(fun _i -> raise Error)
Gen.int)
(fun _i -> false)
let neg_test_fail_as_expected =
Test.make_neg ~name:"all ints are even" small_int (fun i -> i mod 2 = 0)
let neg_test_unexpected_success =
Test.make_neg ~name:"int double" small_int (fun i -> i + i = i * 2)
let neg_test_fail_with_shrinking =
Test.make_neg ~name:"list rev concat"
(pair (list small_int) (list small_int)) (fun (is,js) -> (List.rev is)@(List.rev js) = List.rev (is@js))
let pos_test_fails_with_error =
Test.make ~name:"pos fail with error" small_int (fun _i -> raise Error)
let neg_test_fail_with_error =
Test.make_neg ~name:"neg fail with error" small_int (fun _i -> raise Error)
let rec apply_n f x n =
if n=0
then x
else apply_n f (f x) (pred n)
test from # 236
let bad_fun_repro =
let sleep_time = 0.175 in
let count = ref 0 in
Test.make ~count:10 ~name:"bad function reproducability"
(set_shrink Shrink.nil (triple small_int (fun1 Observable.int small_int) small_int))
(fun (i,f,j) ->
incr count;
Printf.printf "(%i,fun,%i)%s%!" i j (if !count mod 10 = 0 then "\n" else " ");
Unix.sleepf sleep_time;
if 1 = Float.to_int (Unix.time ()) mod 2
then
(ignore(apply_n (Fn.apply f) i j > 0); true)
else
(ignore(apply_n (Fn.apply f) i i > 0); true))
let tests = [
passing;
failing;
error;
collect;
stats;
retries;
bad_assume_warn;
bad_assume_fail;
bad_gen_fail;
bad_shrinker_fail ;
neg_test_fail_as_expected;
neg_test_unexpected_success;
neg_test_fail_with_shrinking;
pos_test_fails_with_error;
neg_test_fail_with_error;
bad_fun_repro;
bad_fun_repro;
bad_fun_repro;
bad_fun_repro;
bad_fun_repro;
bad_fun_repro;
bad_fun_repro;
bad_fun_repro;
]
end
module Generator = struct
open QCheck
example from issue # 23
let char_dist_issue_23 =
Test.make ~name:"char never produces '\\255'" ~count:1_000_000 char (fun c -> c <> '\255')
let char_test =
Test.make ~name:"char has right range" ~count:1000
char (fun c -> '\000' <= c && c <= '\255')
let printable_test =
Test.make ~name:"printable has right range" ~count:1000
printable_char (fun c -> c = '\n' || 32 <= Char.code c && Char.code c <= 126)
let numeral_test =
Test.make ~name:"numeral has right range" ~count:1000
numeral_char (fun c -> '0' <= c && c <= '9')
let nat_test =
Test.make ~name:"nat has right range" ~count:1000
(make ~print:Print.int Gen.nat) (fun n -> 0 <= n && n < 10000)
let bytes_test =
Test.make ~name:"bytes has right length and content" ~count:1000
bytes
(fun b ->
let len = Bytes.length b in
0 <= len && len < 10000
&& Bytes.to_seq b |>
Seq.fold_left (fun acc c -> acc && '\000' <= c && c <= '\255') true)
let string_test =
Test.make ~name:"string has right length and content" ~count:1000
string
(fun s ->
let len = String.length s in
0 <= len && len < 10000
&& String.to_seq s |>
Seq.fold_left (fun acc c -> acc && '\000' <= c && c <= '\255') true)
let pair_test =
Test.make ~name:"int pairs - commute over +" ~count:1000
(pair small_nat small_nat) (fun (i,j) -> i+j = j+i)
let triple_test =
Test.make ~name:"int triples - associative over +" ~count:1000
(triple small_nat small_nat small_nat) (fun (i,j,k) -> i+(j+k) = (i+j)+k)
let quad_test =
Test.make ~name:"int quadruples - product of sums" ~count:1000
(quad small_nat small_nat small_nat small_nat)
(fun (h,i,j,k) -> (h+i)*(j+k) = h*j + h*k + i*j + i*k)
let test_tup2 =
Test.make ~count:10
~name:"forall x in (0, 1): x = (0, 1)"
(tup2 (always 0) (always 1))
(fun x -> x = (0, 1))
let test_tup3 =
Test.make ~count:10
~name:"forall x in (0, 1, 2): x = (0, 1, 2)"
(tup3 (always 0) (always 1) (always 2))
(fun x -> x = (0, 1, 2))
let test_tup4 =
Test.make ~count:10
~name:"forall x in (0, 1, 2, 3): x = (0, 1, 2, 3)"
(tup4 (always 0) (always 1) (always 2) (always 3))
(fun x -> x = (0, 1, 2, 3))
let test_tup5 =
Test.make ~count:10
~name:"forall x in (0, 1, 2, 3, 4): x = (0, 1, 2, 3, 4)"
(tup5 (always 0) (always 1) (always 2) (always 3) (always 4))
(fun x -> x = (0, 1, 2, 3, 4))
let test_tup6 =
Test.make ~count:10
~name:"forall x in (0, 1, 2, 3, 4, 5): x = (0, 1, 2, 3, 4, 5)"
(tup6 (always 0) (always 1) (always 2) (always 3) (always 4) (always 5))
(fun x -> x = (0, 1, 2, 3, 4, 5))
let test_tup7 =
Test.make ~count:10
~name:"forall x in (0, 1, 2, 3, 4, 5, 6): x = (0, 1, 2, 3, 4, 5, 6)"
(tup7
(always 0) (always 1) (always 2) (always 3) (always 4)
(always 5) (always 6))
(fun x -> x = (0, 1, 2, 3, 4, 5, 6))
let test_tup8 =
Test.make ~count:10
~name:"forall x in (0, 1, 2, 3, 4, 5, 6, 7): x = (0, 1, 2, 3, 4, 5, 6, 7)"
(tup8
(always 0) (always 1) (always 2) (always 3) (always 4)
(always 5) (always 6) (always 7))
(fun x -> x = (0, 1, 2, 3, 4, 5, 6, 7))
let test_tup9 =
Test.make ~count:10
~name:"forall x in (0, 1, 2, 3, 4, 5, 6, 7, 8): x = (0, 1, 2, 3, 4, 5, 6, 7, 8)"
(tup9
(always 0) (always 1) (always 2) (always 3) (always 4)
(always 5) (always 6) (always 7) (always 8))
(fun x -> x = (0, 1, 2, 3, 4, 5, 6, 7, 8))
let bind_test =
Test.make ~name:"bind test for ordered pairs" ~count:1000
(make Gen.(small_nat >>= fun j -> int_bound j >>= fun i -> return (i,j)))
(fun (i,j) -> i<=j)
let bind_pair_list_length =
Test.make ~name:"bind list length" ~count:1000
(make Gen.(int_bound 1000 >>= fun len ->
list_size (return len) (int_bound 1000) >>= fun xs -> return (len,xs)))
(fun (len,xs) -> len = List.length xs)
let list_test =
Test.make ~name:"list has right length" ~count:1000
(list unit) (fun l -> let len = List.length l in 0 <= len && len < 10_000)
let list_repeat_test =
let gen = Gen.(small_nat >>= fun i -> list_repeat i unit >>= fun l -> return (i,l)) in
Test.make ~name:"list_repeat has constant length" ~count:1000
(make ~print:Print.(pair int (list unit)) gen) (fun (i,l) -> List.length l = i)
let array_repeat_test =
let gen = Gen.(small_nat >>= fun i -> array_repeat i unit >>= fun l -> return (i,l)) in
Test.make ~name:"array_repeat has constant length" ~count:1000
(make ~print:Print.(pair int (array unit)) gen) (fun (i,l) -> Array.length l = i)
let passing_tree_rev =
Test.make ~name:"tree_rev_is_involutive" ~count:1000
(make IntTree.gen_tree)
(fun tree -> IntTree.(rev_tree (rev_tree tree)) = tree)
let nat_split2_spec =
Test.make ~name:"nat_split2 spec"
(make
~print:Print.(pair int (pair int int))
Gen.(small_nat >>= fun n ->
pair (return n) (nat_split2 n)))
(fun (n, (a, b)) ->
0 <= a && 0 <= b && a + b = n)
let pos_split2_spec =
Test.make ~name:"pos_split2 spec"
(make
~print:Print.(pair int (pair int int))
Gen.(small_nat >>= fun n ->
we need n > 2
let n = n + 2 in
pair (return n) (pos_split2 n)))
(fun (n, (a, b)) ->
(0 < a && 0 < b && a + b = n))
let range_subset_spec =
Test.make ~name:"range_subset_spec"
(make
~print:Print.(quad int int int (array int))
Gen.(pair small_nat small_nat >>= fun (m, n) ->
let low = m and high = m + n in
int_range 0 (high - low + 1) >>= fun size ->
quad (return size) (return low) (return high)
(range_subset ~size low high)))
(fun (size, low, high, arr) ->
if size = 0 then arr = [||]
else
Array.length arr = size
&& low <= arr.(0)
&& Array.for_all (fun (a, b) -> a < b)
(Array.init (size - 1) (fun k -> arr.(k), arr.(k+1)))
&& arr.(size - 1) <= high)
let nat_split_n_way =
Test.make ~name:"nat_split n-way"
(make
~print:Print.(pair int (array int))
Gen.(small_nat >>= fun n ->
pair (return n) (nat_split ~size:n n)))
(fun (n, arr) ->
Array.length arr = n
&& Array.for_all (fun k -> 0 <= k) arr
&& Array.fold_left (+) 0 arr = n)
let nat_split_smaller =
Test.make ~name:"nat_split smaller"
(make
~print:Print.(triple int int (array int))
Gen.(small_nat >>= fun size ->
int_bound size >>= fun n ->
triple (return size) (return n) (nat_split ~size n)))
(fun (m, n, arr) ->
Array.length arr = m
&& Array.for_all (fun k -> 0 <= k) arr
&& Array.fold_left (+) 0 arr = n)
let pos_split =
Test.make ~name:"pos_split"
(make
~print:Print.(triple int int (array int))
Gen.(pair small_nat small_nat >>= fun (m, n) ->
we need both size>0 and n>0 and size < = n
let size = 1 + min m n and n = 1 + max m n in
triple (return size) (return n) (pos_split ~size n)))
(fun (m, n, arr) ->
Array.length arr = m
&& Array.for_all (fun k -> 0 < k) arr
&& Array.fold_left (+) 0 arr = n)
let tests = [
char_dist_issue_23;
char_test;
printable_test;
numeral_test;
nat_test;
bytes_test;
string_test;
pair_test;
triple_test;
quad_test;
test_tup2;
test_tup3;
test_tup4;
test_tup5;
test_tup6;
test_tup7;
test_tup8;
test_tup9;
bind_test;
bind_pair_list_length;
list_test;
list_repeat_test;
array_repeat_test;
passing_tree_rev;
nat_split2_spec;
pos_split2_spec;
range_subset_spec;
nat_split_n_way;
nat_split_smaller;
pos_split;
]
end
module Shrink = struct
open QCheck
let rec fac n = match n with
| 0 -> 1
| n -> n * fac (n - 1)
example from issue # 59
let test_fac_issue59 =
Test.make ~name:"test fac issue59"
(set_shrink Shrink.nil (small_int_corners ()))
(fun n -> try (fac n) mod n = 0
with
| Division_by_zero -> (n=0))
let big_bound_issue59 =
Test.make ~name:"big bound issue59"
(small_int_corners()) (fun i -> i < 209609)
let long_shrink =
let listgen = list_of_size (Gen.int_range 1000 10000) int in
Test.make ~name:"long_shrink" (pair listgen listgen)
(fun (xs,ys) -> List.rev (xs@ys) = (List.rev xs)@(List.rev ys))
test from issue # 36
let ints_arent_0_mod_3 =
Test.make ~name:"ints arent 0 mod 3" ~count:1000
int (fun i -> i mod 3 <> 0)
let ints_are_0 =
Test.make ~name:"ints are 0" ~count:1000
int (fun i -> Printf.printf "%i\n" i; i = 0)
test from issue # 59
let ints_smaller_209609 =
Test.make ~name:"ints < 209609"
(small_int_corners()) (fun i -> i < 209609)
let nats_smaller_5001 =
Test.make ~name:"nat < 5001" ~count:1000
(make ~print:Print.int ~shrink:Shrink.int Gen.nat) (fun n -> n < 5001)
let char_is_never_abcdef =
Test.make ~name:"char never produces 'abcdef'" ~count:1000
char (fun c -> not (List.mem c ['a';'b';'c';'d';'e';'f']))
should shrink towards ' a ' , hence produce ' & ' with highest ascii code 38
Test.make ~name:"printable never produces '!\"#$%&'" ~count:1000
printable_char (fun c -> not (List.mem c ['!';'"';'#';'$';'%';'&']))
let numeral_is_never_less_5 =
Test.make ~name:"printable never produces less than '5" ~count:1000
numeral_char (fun c -> c >= '5')
let bytes_are_empty =
Test.make ~name:"bytes are empty" ~count:1000
bytes (fun b -> b = Bytes.empty)
let bytes_never_has_000_char =
Test.make ~name:"bytes never has a \\000 char" ~count:1000
bytes
(fun b -> Bytes.to_seq b |> Seq.fold_left (fun acc c -> acc && c <> '\000') true)
let bytes_never_has_255_char =
Test.make ~name:"bytes never has a \\255 char" ~count:1000
bytes
(fun s -> Bytes.to_seq s |> Seq.fold_left (fun acc c -> acc && c <> '\255') true)
let bytes_unique_chars =
Test.make ~name:"bytes have unique chars" ~count:1000
bytes
(fun s ->
let ch_list = Bytes.to_seq s |> List.of_seq in
List.length ch_list = List.length (List.sort_uniq Char.compare ch_list))
let strings_are_empty =
Test.make ~name:"strings are empty" ~count:1000
string (fun s -> s = "")
let string_never_has_000_char =
Test.make ~name:"string never has a \\000 char" ~count:1000
string
(fun s -> String.to_seq s |> Seq.fold_left (fun acc c -> acc && c <> '\000') true)
let string_never_has_255_char =
Test.make ~name:"string never has a \\255 char" ~count:1000
string
(fun s -> String.to_seq s |> Seq.fold_left (fun acc c -> acc && c <> '\255') true)
let string_unique_chars =
Test.make ~name:"strings have unique chars" ~count:1000
string
(fun s ->
let ch_list = String.to_seq s |> List.of_seq in
List.length ch_list = List.length (List.sort_uniq Char.compare ch_list))
test from issue # 167
let pair_diff_issue_64 =
Test.make ~name:"pairs have different components"
(pair small_int small_int) (fun (i,j) -> i<>j)
let pair_same =
Test.make ~name:"pairs have same components" (pair int int) (fun (i,j) -> i=j)
let pair_one_zero =
Test.make ~name:"pairs have a zero component" (pair int int) (fun (i,j) -> i=0 || j=0)
let pair_all_zero =
Test.make ~name:"pairs are (0,0)" (pair int int) (fun (i,j) -> i=0 && j=0)
let pair_ordered =
Test.make ~name:"pairs are ordered" (pair pos_int pos_int) (fun (i,j) -> i<=j)
let pair_ordered_rev =
Test.make ~name:"pairs are ordered reversely" (pair pos_int pos_int) (fun (i,j) -> i>=j)
let pair_sum_lt_128 =
Test.make ~name:"pairs sum to less than 128" (pair pos_int pos_int) (fun (i,j) -> i+j<128)
let pair_lists_rev_concat =
Test.make ~name:"pairs lists rev concat"
(pair (list pos_int) (list pos_int))
(fun (xs,ys) -> List.rev (xs@ys) = (List.rev xs)@(List.rev ys))
let pair_lists_no_overlap =
Test.make ~name:"pairs lists no overlap"
(pair (list small_nat) (list small_nat))
(fun (xs,ys) -> List.for_all (fun x -> not (List.mem x ys)) xs)
let triple_diff =
Test.make ~name:"triples have pair-wise different components"
(triple small_int small_int small_int) (fun (i,j,k) -> i<>j && j<>k)
let triple_same =
Test.make ~name:"triples have same components"
(triple int int int) (fun (i,j,k) -> i=j || j=k)
let triple_ordered =
Test.make ~name:"triples are ordered"
(triple int int int) (fun (i,j,k) -> i<=j && j<=k)
let triple_ordered_rev =
Test.make ~name:"triples are ordered reversely"
(triple int int int) (fun (i,j,k) -> i>=j && j>=k)
let quad_diff =
Test.make ~name:"quadruples have pair-wise different components"
(quad small_int small_int small_int small_int) (fun (h,i,j,k) -> h<>i && i<>j && j<>k)
let quad_same =
Test.make ~name:"quadruples have same components"
(quad int int int int) (fun (h,i,j,k) -> h=i || i=j || j=k)
let quad_ordered =
Test.make ~name:"quadruples are ordered"
(quad int int int int) (fun (h,i,j,k) -> h <= i && i <= j && j <= k)
let quad_ordered_rev =
Test.make ~name:"quadruples are ordered reversely"
(quad int int int int) (fun (h,i,j,k) -> h >= i && i >= j && j >= k)
let test_tup2 =
Test.make
~name:"forall (a, b) in nat: a < b"
(tup2 small_int small_int)
(fun (a, b) -> a < b)
let test_tup3 =
Test.make
~name:"forall (a, b, c) in nat: a < b < c"
(tup3 small_int small_int small_int)
(fun (a, b, c) -> a < b && b < c)
let test_tup4 =
Test.make
~name:"forall (a, b, c, d) in nat: a < b < c < d"
(tup4 small_int small_int small_int small_int)
(fun (a, b, c, d) -> a < b && b < c && c < d)
let test_tup5 =
Test.make
~name:"forall (a, b, c, d, e) in nat: a < b < c < d < e"
(tup5 small_int small_int small_int small_int small_int)
(fun (a, b, c, d, e) -> a < b && b < c && c < d && d < e)
let test_tup6 =
Test.make
~name:"forall (a, b, c, d, e, f) in nat: a < b < c < d < e < f"
(tup6 small_int small_int small_int small_int small_int small_int)
(fun (a, b, c, d, e, f) -> a < b && b < c && c < d && d < e && e < f)
let test_tup7 =
Test.make
~name:"forall (a, b, c, d, e, f, g) in nat: a < b < c < d < e < f < g"
(tup7 small_int small_int small_int small_int small_int small_int small_int)
(fun (a, b, c, d, e, f, g) -> a < b && b < c && c < d && d < e && e < f && f < g)
let test_tup8 =
Test.make
~name:"forall (a, b, c, d, e, f, g, h) in nat: a < b < c < d < e < f < g < h"
(tup8 small_int small_int small_int small_int small_int small_int small_int small_int)
(fun (a, b, c, d, e, f, g, h) -> a < b && b < c && c < d && d < e && e < f && f < g && g < h)
let test_tup9 =
Test.make
~name:"forall (a, b, c, d, e, f, g, h, i) in nat: a < b < c < d < e < f < g < h < i"
(tup9 small_int small_int small_int small_int small_int small_int small_int small_int small_int)
(fun (a, b, c, d, e, f, g, h, i) -> a < b && b < c && c < d && d < e && e < f && f < g && g < h && h < i)
let bind_pair_ordered =
Test.make ~name:"bind ordered pairs"
(make ~print:Print.(pair int int)
~shrink:Shrink.(filter (fun (i,j) -> i<=j) (pair int int))
Gen.(pint >>= fun j -> int_bound j >>= fun i -> return (i,j)))
(fun (_i,_j) -> false)
let bind_pair_list_size =
let shrink (_l,xs) =
Iter.map (fun xs' -> (List.length xs',xs')) Shrink.(list ~shrink:int xs) in
Test.make ~name:"bind list_size constant"
(make ~print:Print.(pair int (list int)) ~shrink
Gen.(int_bound 1000 >>= fun len ->
list_size (return len) (int_bound 1000) >>= fun xs -> return (len,xs)))
(fun (len,xs) -> let len' = List.length xs in len=len' && len' < 4)
let print_list xs = print_endline Print.(list int xs)
test from issue # 64
let lists_are_empty_issue_64 =
Test.make ~name:"lists are empty"
(list small_int) (fun xs -> print_list xs; xs = [])
let list_shorter_10 =
Test.make ~name:"lists shorter than 10"
(list small_int) (fun xs -> List.length xs < 10)
let length_printer xs =
Printf.sprintf "[...] list length: %i" (List.length xs)
let size_gen = Gen.(oneof [small_nat; int_bound 750_000])
let list_shorter_432 =
Test.make ~name:"lists shorter than 432"
(set_print length_printer (list_of_size size_gen small_int))
(fun xs -> List.length xs < 432)
let list_shorter_4332 =
Test.make ~name:"lists shorter than 4332"
(set_shrink Shrink.list_spine (set_print length_printer (list_of_size size_gen small_int)))
(fun xs -> List.length xs < 4332)
let list_equal_dupl =
Test.make ~name:"lists equal to duplication"
(list_of_size size_gen small_int)
(fun xs -> try xs = xs @ xs
with Stack_overflow -> false)
let list_unique_elems =
Test.make ~name:"lists have unique elems"
(list small_int)
(fun xs -> let ys = List.sort_uniq Int.compare xs in
print_list xs; List.length xs = List.length ys)
let tree_contains_only_42 =
Test.make ~name:"tree contains only 42"
IntTree.(make ~print:print_tree ~shrink:shrink_tree gen_tree)
(fun tree -> IntTree.contains_only_n tree 42)
let tests = [
big_bound_issue59;
long_shrink;
ints_arent_0_mod_3;
ints_are_0;
ints_smaller_209609;
nats_smaller_5001;
char_is_never_abcdef;
printable_is_never_sign;
numeral_is_never_less_5;
bytes_are_empty;
bytes_never_has_000_char;
bytes_never_has_255_char;
bytes_unique_chars;
strings_are_empty;
string_never_has_000_char;
string_never_has_255_char;
string_unique_chars;
pair_diff_issue_64;
pair_same;
pair_one_zero;
pair_all_zero;
pair_ordered;
pair_ordered_rev;
pair_sum_lt_128;
pair_lists_rev_concat;
pair_lists_no_overlap;
triple_diff;
triple_same;
triple_ordered;
triple_ordered_rev;
quad_diff;
quad_same;
quad_ordered;
quad_ordered_rev;
test_tup2;
test_tup3;
test_tup4;
test_tup5;
test_tup6;
test_tup7;
test_tup8;
test_tup9;
bind_pair_ordered;
bind_pair_list_size;
lists_are_empty_issue_64;
list_shorter_10;
list_shorter_432;
list_shorter_4332;
list_unique_elems;
tree_contains_only_42;
]
end
module Function = struct
open QCheck
let fail_pred_map_commute =
Test.make ~name:"fail_pred_map_commute" ~count:100 ~long_factor:100
(triple
(small_list small_int)
(fun1 Observable.int int)
(fun1 Observable.int bool))
(fun (l,Fun (_,f),Fun (_,p)) ->
List.filter p (List.map f l) = List.map f (List.filter p l))
let fail_pred_strings =
Test.make ~name:"fail_pred_strings" ~count:100
(fun1 Observable.string bool)
(fun (Fun (_,p)) -> not (p "some random string") || p "some other string")
let prop_foldleft_foldright =
Test.make ~name:"fold_left fold_right" ~count:1000 ~long_factor:20
(triple
int_gen
(list int_gen)
(fun2 Observable.int Observable.int int_gen))
(fun (z,xs,f) ->
let l1 = List.fold_right (Fn.apply f) xs z in
let l2 = List.fold_left (Fn.apply f) z xs in
if l1=l2 then true
else Test.fail_reportf "l=%s, fold_left=%s, fold_right=%s@."
(Print.(list int) xs)
(Print.int l1)
(Print.int l2)
)
let prop_foldleft_foldright_uncurry =
Test.make ~name:"fold_left fold_right uncurried" ~count:1000 ~long_factor:20
(triple
(fun1 Observable.(pair int int) int_gen)
int_gen
(list int_gen))
(fun (f,z,xs) ->
List.fold_right (fun x y -> Fn.apply f (x,y)) xs z =
List.fold_left (fun x y -> Fn.apply f (x,y)) z xs)
let prop_foldleft_foldright_uncurry_funlast =
Test.make ~name:"fold_left fold_right uncurried fun last" ~count:1000 ~long_factor:20
(triple
int_gen
(list int_gen)
(fun1 Observable.(pair int int) int_gen))
(fun (z,xs,f) ->
List.fold_right (fun x y -> Fn.apply f (x,y)) xs z =
List.fold_left (fun x y -> Fn.apply f (x,y)) z xs)
test from issue # 64
let fold_left_test =
Test.make ~name:"fold_left test, fun first"
(fun2 Observable.string Observable.int small_string)
small_string
(list small_int)
(list small_int))
(fun (f,acc,is,js) ->
let f = Fn.apply f in
List.fold_left f acc (is @ js)
let tests = [
fail_pred_map_commute;
fail_pred_strings;
prop_foldleft_foldright;
prop_foldleft_foldright_uncurry;
prop_foldleft_foldright_uncurry_funlast;
fold_left_test;
]
end
module FindExample = struct
open QCheck
let find_ex =
Test.make ~name:"find_example" (2--50)
(fun n ->
let st = Random.State.make [| 0 |] in
let f m = n < m && m < 2 * n in
try
let m = find_example_gen ~rand:st ~count:100_000 ~f Gen.(0 -- 1000) in
f m
with No_example_found _ -> false)
let find_ex_uncaught_issue_99_1_fail =
let rs = make (find_example ~count:10 ~f:(fun _ -> false) Gen.int) in
Test.make ~name:"FAIL_#99_1" rs (fun _ -> true)
let find_ex_uncaught_issue_99_2_succeed =
Test.make ~name:"should_succeed_#99_2" ~count:10
int (fun i -> i <= max_int)
let tests = [
find_ex;
find_ex_uncaught_issue_99_1_fail;
find_ex_uncaught_issue_99_2_succeed;
]
end
module Stats = struct
open QCheck
let bool_dist =
Test.make ~name:"bool dist" ~count:500_000 (set_collect Bool.to_string bool) (fun _ -> true)
let char_dist_tests =
[
Test.make ~name:"char code dist" ~count:500_000 (add_stat ("char code", Char.code) char) (fun _ -> true);
Test.make ~name:"printable char code dist" ~count:500_000 (add_stat ("char code", Char.code) printable_char) (fun _ -> true);
Test.make ~name:"numeral char code dist" ~count:500_000 (add_stat ("char code", Char.code) numeral_char) (fun _ -> true);
]
let bytes_len_tests =
let len = ("len",Bytes.length) in
[
Test.make ~name:"bytes_size len dist" ~count:5_000 (add_stat len (bytes_of_size (Gen.int_range 5 10))) (fun _ -> true);
Test.make ~name:"bytes len dist" ~count:5_000 (add_stat len bytes) (fun _ -> true);
Test.make ~name:"bytes_of len dist" ~count:5_000 (add_stat len (bytes_of (Gen.return 'a'))) (fun _ -> true);
Test.make ~name:"bytes_small len dist" ~count:5_000 (add_stat len bytes_small) (fun _ -> true);
]
let string_len_tests =
let len = ("len",String.length) in
[
Test.make ~name:"string_size len dist" ~count:5_000 (add_stat len (string_of_size (Gen.int_range 5 10))) (fun _ -> true);
Test.make ~name:"string len dist" ~count:5_000 (add_stat len string) (fun _ -> true);
Test.make ~name:"string_of len dist" ~count:5_000 (add_stat len (string_of (Gen.return 'a'))) (fun _ -> true);
Test.make ~name:"printable_string len dist" ~count:5_000 (add_stat len printable_string) (fun _ -> true);
Test.make ~name:"small_string len dist" ~count:5_000 (add_stat len small_string) (fun _ -> true);
]
let pair_dist =
Test.make ~name:"pair dist" ~count:500_000
(add_stat ("pair sum", (fun (i,j) -> i+j))
(pair (int_bound 100) (int_bound 100))) (fun _ -> true)
let triple_dist =
Test.make ~name:"triple dist" ~count:500_000
(add_stat ("triple sum", (fun (i,j,k) -> i+j+k))
(triple (int_bound 100) (int_bound 100) (int_bound 100))) (fun _ -> true)
let quad_dist =
Test.make ~name:"quad dist" ~count:500_000
(add_stat ("quad sum", (fun (h,i,j,k) -> h+i+j+k))
(quad (int_bound 100) (int_bound 100) (int_bound 100) (int_bound 100))) (fun _ -> true)
let bind_dist =
Test.make ~name:"bind dist" ~count:1_000_000
(make ~stats:[("ordered pair difference", (fun (i,j) -> j-i));("ordered pair sum", (fun (i,j) -> i+j))]
Gen.(int_bound 100 >>= fun j -> int_bound j >>= fun i -> return (i,j))) (fun _ -> true)
let list_len_tests =
let len = ("len",List.length) in
[ (* test from issue #30 *)
Test.make ~name:"list len dist" ~count:5_000 (add_stat len (list int)) (fun _ -> true);
Test.make ~name:"small_list len dist" ~count:5_000 (add_stat len (small_list int)) (fun _ -> true);
Test.make ~name:"list_of_size len dist" ~count:5_000 (add_stat len (list_of_size (Gen.int_range 5 10) int)) (fun _ -> true);
Test.make ~name:"list_repeat len dist" ~count:5_000 (add_stat len (make Gen.(list_repeat 42 int))) (fun _ -> true);
]
let array_len_tests =
let len = ("len",Array.length) in
[
Test.make ~name:"array len dist" ~count:5_000 (add_stat len (array int)) (fun _ -> true);
Test.make ~name:"small_array len dist" ~count:5_000 (add_stat len (make Gen.(small_array int))) (fun _ -> true);
Test.make ~name:"array_of_size len dist" ~count:5_000 (add_stat len (array_of_size (Gen.int_range 5 10) int)) (fun _ -> true);
Test.make ~name:"array_repeat len dist" ~count:5_000 (add_stat len (make Gen.(array_repeat 42 int))) (fun _ -> true);
]
let int_dist_tests =
let dist = ("dist",fun x -> x) in
[ (* test from issue #40 *)
Test.make ~name:"int_stats_neg" ~count:5000 (add_stat dist small_signed_int) (fun _ -> true);
(* distribution tests from PR #45 *)
Test.make ~name:"small_signed_int dist" ~count:1000 (add_stat dist small_signed_int) (fun _ -> true);
Test.make ~name:"small_nat dist" ~count:1000 (add_stat dist small_nat) (fun _ -> true);
Test.make ~name:"nat dist" ~count:1000 (add_stat dist (make Gen.nat)) (fun _ -> true);
Test.make ~name:"int_range (-43643) 435434 dist" ~count:1000 (add_stat dist (int_range (-43643) 435434)) (fun _ -> true);
Test.make ~name:"int_range (-40000) 40000 dist" ~count:1000 (add_stat dist (int_range (-40000) 40000)) (fun _ -> true);
Test.make ~name:"int_range (-4) 4 dist" ~count:1000 (add_stat dist (int_range (-4) 4)) (fun _ -> true);
Test.make ~name:"int_range (-4) 17 dist" ~count:1000 (add_stat dist (int_range (-4) 17)) (fun _ -> true);
Test.make ~name:"int dist" ~count:100000 (add_stat dist int) (fun _ -> true);
Test.make ~name:"oneof int dist" ~count:1000 (add_stat dist (oneofl[min_int;-1;0;1;max_int])) (fun _ -> true);
]
let tree_depth_test =
let depth = ("depth", IntTree.depth) in
Test.make ~name:"tree's depth" ~count:1000 (add_stat depth (make IntTree.gen_tree)) (fun _ -> true)
let range_subset_test =
Test.make ~name:"range_subset_spec" ~count:5_000
(add_stat ("dist", fun a -> a.(0)) (make (Gen.range_subset ~size:1 0 20)))
(fun a -> Array.length a = 1)
let int_dist_empty_bucket =
Test.make ~name:"int_dist_empty_bucket" ~count:1_000
(add_stat ("dist",fun x -> x) (oneof [small_int_corners ();int])) (fun _ -> true)
let tests =
[ bool_dist; ]
@ char_dist_tests
@ [tree_depth_test;
range_subset_test;]
@ bytes_len_tests
@ string_len_tests
@ [pair_dist;
triple_dist;
quad_dist;
bind_dist;]
@ list_len_tests
@ array_len_tests
@ int_dist_tests
end
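(* These distribution tests can be run with QCheck's text runner. A
   minimal entry point sketch (illustrative: it assumes this module is
   bound as [Stats] and that the qcheck-core runner library is
   available):

     let () = QCheck_base_runner.run_tests_main Stats.tests
*)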
|
be3091abc32d05ffbd11fb342f41797ac503d5300b12fcab06f5678e0c0c5b52 | dbuenzli/brr | brr_io.mli | ---------------------------------------------------------------------------
Copyright (c) 2020 The brr programmers. All rights reserved.
Distributed under the ISC license, see terms at the end of the file.
---------------------------------------------------------------------------*)
(** Clipboard, Form, Fetch, Geolocation, Media and Storage APIs. *)
open Brr
(** Clipboard access
See the {{:-US/docs/Web/API/Clipboard}
Clipboard API}. *)
module Clipboard : sig
(** Clipboard items. *)
module Item : sig
(** Presentation style enum. *)
module Presentation_style : sig
type t = Jstr.t
(** The type for
{{:-apis/#enumdef-presentationstyle}
presentation} style values. *)
val unspecified : t
val inline : t
val attachment : t
end
type opts
(** The type for
{{:-apis/#dictdef-clipboarditemoptions}
[ClipboardItemOptions]}. *)
val opts : ?presentation_style:Presentation_style.t -> unit -> opts
(** [opts ~presentation_style ()] are options for clipboard item
objects. *)
type t
(** The type for {{:-US/docs/Web/API/ClipboardItem}[ClipboardItem]} objects. *)
val create : ?opts:opts -> (Jstr.t * Blob.t) list -> t
(** [create ~opts data] is a {{:-US/docs/Web/API/ClipboardItem/ClipboardItem}clipboard item} with MIME types and associated
values [data] and options [opts]. *)
val presentation_style : t -> Presentation_style.t
(** [presentation_style i] is the {{:-apis/#dom-clipboarditem-presentationstyle}presentation style} of [i]. *)
val last_modified_ms : t -> int
(** [last_modified_ms i] is the
{{:-apis/#dom-clipboarditem-lastmodified}
last modified time} in ms from the epoch of [i]. *)
val delayed : t -> bool
(** [delayed i] is the
{{:-apis/#dom-clipboarditem-delayed}delayed} property of [i]. *)
val types : t -> Jstr.t list
(** [types i] is the array of MIME types {{:-US/docs/Web/API/ClipboardItem/types}available} for [i]. *)
val get_type : t -> Jstr.t -> Brr.Blob.t Fut.or_error
(** [get_type i t] is the object with MIME type [t] for item [i]. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
type t
(** The type for {{:-US/docs/Web/API/Clipboard}[Clipboard]} objects. *)
val of_navigator : Navigator.t -> t
(** [of_navigator n] is a clipboard object for
{{:-US/docs/Web/API/Navigator/clipboard}navigator} [n]. *)
val as_target : t -> Ev.target
(** [as_target c] is [c] as an event target. *)
(** {1:rw Reading and writing} *)
val read : t -> Item.t list Fut.or_error
(** [read c] is the {{:-US/docs/Web/API/Clipboard/read}content} of [c]. *)
val read_text : t -> Jstr.t Fut.or_error
(** [read_text c] is the clipboard {{:-US/docs/Web/API/Clipboard/readText}textual content} of [c]. *)
val write : t -> Item.t list -> unit Fut.or_error
(** [write c is]
{{:-US/docs/Web/API/Clipboard/write}
writes} the items [is] to [c]. *)
val write_text : t -> Jstr.t -> unit Fut.or_error
(** [write_text c s]
{{:-US/docs/Web/API/Clipboard/writeText}
writes} the string [s] to [c]. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
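(* Usage sketch (illustrative, not part of the interface): copy a string
   to the clipboard and log it back; error handling is minimal and the
   payload is arbitrary.

     open Brr
     open Brr_io
     let roundtrip () =
       let c = Clipboard.of_navigator G.navigator in
       Fut.await (Clipboard.write_text c (Jstr.v "hello")) @@ fun _ ->
       Fut.await (Clipboard.read_text c) @@ function
       | Ok txt -> Console.(log [txt])
       | Error e -> Console.(error [e])
*)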
(** Form elements and form data. *)
module Form : sig
(** {1:element Element} *)
type t
(** The type for
{{:-US/docs/Web/API/HTMLFormElement}
[HTMLFormElement]} objects. *)
val of_el : El.t -> t
(** [of_el e] is a form from element [e]. This throws a JavaScript
error if [e] is not a form element. *)
val to_el : t -> El.t
(** [to_el f] is [f] as an an element. *)
val name : t -> Jstr.t
(** [name f] is the {{:-US/docs/Web/API/HTMLFormElement/name}name} of [f]. *)
val method' : t -> Jstr.t
(** [method' f] is the {{:-US/docs/Web/API/HTMLFormElement/method}method} of [f]. *)
val target : t -> Jstr.t
(** [target f] is the {{:-US/docs/Web/API/HTMLFormElement/target}target} of [f]. *)
val action : t -> Jstr.t
(** [action f] is the {{:-US/docs/Web/API/HTMLFormElement/action}action} of [f]. *)
val enctype : t -> Jstr.t
(** [enctype f] is the {{:-US/docs/Web/API/HTMLFormElement/enctype}enctype} of [f]. *)
val accept_charset : t -> Jstr.t
(** [accept_charset f] is the {{:-US/docs/Web/API/HTMLFormElement/acceptCharset}charset accepted} by [f]. *)
val autocomplete : t -> Jstr.t
(** [autocomplete f] reflects the value of the {{:-US/docs/Web/HTML/Element/form#attr-autocomplete}autocomplete} attribute
of [f]. *)
val no_validate : t -> bool
(** [no_validate f] reflects the value of the {{:-US/docs/Web/HTML/Element/form#attr-novalidate}novalidate} attribute
of [f]. *)
val check_validity : t -> bool
(** [check_validity f] is [true] if the form's children controls
all satisfy their
{{:-US/docs/Web/Guide/HTML/HTML5/Constraint_validation}validation constraints}. *)
val report_validity : t -> bool
(** [report_validity f] is like {!check_validity} but also
{{:-US/docs/Web/API/HTMLFormElement/reportValidity}reports} problems to the user. *)
val request_submit : t -> El.t option -> unit
(** [request_submit f el] requests the form to be
{{:-US/docs/Web/API/HTMLFormElement/requestSubmit}submitted} using button [el] or the form itself if unspecified. *)
val reset : t -> unit
(** [reset f]
{{:-US/docs/Web/API/HTMLFormElement/reset}
resets} the form. *)
val submit : t -> unit
(** [submit f] {{:-US/docs/Web/API/HTMLFormElement/submit}submits} the form. *)
(** {1:data Data} *)
(** Form data. *)
module Data : sig
type form = t
(** See {!Brr_io.Form.t}. *)
type entry_value = [ `String of Jstr.t | `File of File.t ]
(** The type for form data entry values. *)
type t
(** The type for
{{:-US/docs/Web/API/FormData}FormData}
objects. *)
val create : unit -> t
(** [create ()] is new, empty, form data. *)
val of_form : form -> t
(** [of_form f] is a form data from the
{{:-US/docs/Web/API/FormData/FormData#Parameters}current key-values} of form [f]. *)
val is_empty : t -> bool
(** [is_empty d] is [true] if [d] has no entries. *)
val has_file_entry : t -> bool
(** [has_file_entry d] is [true] iff [d] has a file entry. *)
val mem : t -> Jstr.t -> bool
(** [mem d k] is [true] if [d]
{{:-US/docs/Web/API/FormData/has}has}
key [k]. *)
val find : t -> Jstr.t -> entry_value option
(** [find d k] is the first value associated to [k] in [d] (if any). *)
val find_all : t -> Jstr.t -> entry_value list
(** [find_all d k] are all the values associated to [k] in [d]. *)
val fold : (Jstr.t -> entry_value -> 'a -> 'a) -> t -> 'a -> 'a
(** [fold f d acc] folds over all key/value entries in [d] with [f]
starting with [acc]. *)
val set : t -> Jstr.t -> Jstr.t -> unit
(** [set d k v]
{{:-US/docs/Web/API/FormData/set}
sets} the value of [k] to [v] in [d]. *)
val set_blob : ?filename:Jstr.t -> t -> Jstr.t -> Blob.t -> unit
(** [set d k b ~filename]
{{:-US/docs/Web/API/FormData/set}
sets} the value of [k] to [b] in [d]. [filename] can
specify the filename of [b]. *)
val append : t -> Jstr.t -> Jstr.t -> unit
(** [append d k v]
{{:-US/docs/Web/API/FormData/append}
appends} value [v] to the value of [k] in [d]. *)
val append_blob : ?filename:Jstr.t -> t -> Jstr.t -> Blob.t -> unit
(** [append d k b ~filename]
{{:-US/docs/Web/API/FormData/append}
appends} blob [b] to the value of [k] in [d]. [filename] can
specify the filename of [b]. *)
val delete : t -> Jstr.t -> unit
(** [delete d k]
{{:-US/docs/Web/API/FormData/delete}
deletes} the values of key [k] in [d]. *)
(** {1:convert Converting} *)
val of_assoc : (Jstr.t * entry_value) list -> t
(** [of_assoc l] is form data from assoc [l]. *)
val to_assoc : t -> (Jstr.t * entry_value) list
(** [to_assoc l] is the form data as an association list. *)
val of_uri_params : Uri.Params.t -> t
(** [of_uri_params p] is a form data for [p]. *)
val to_uri_params : t -> Uri.Params.t
(** [to_uri_params t] is the form data as URI query parameters.
{b Note.} If your form has file inputs this will map their keys
to something like ["[Object File]"], {!has_file_entry} indicates
whether the form data has a file entry. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(** {1:events Events} *)
(** Form events *)
module Ev : sig
(** Form data events *)
module Data : sig
type t
(** The type for
{{:-US/docs/Web/API/FormDataEvent}
[FormDataEvent]} objects. *)
val form_data : t -> Data.t
(** [form_data e] is the
{{:-US/docs/Web/API/FormDataEvent/formData}form data} when the event was fired. *)
end
val formdata : Data.t Ev.type'
(** [formdata] is the type for {{:-US/docs/Web/API/HTMLFormElement/formdata_event}[formdata]} events. *)
(** Submit events *)
module Submit : sig
type t
(** The type for
{{:-US/docs/Web/API/SubmitEvent}
[SubmitEvent]} objects. *)
val submitter : t -> El.t option
(** [submitter e] is
{{:-US/docs/Web/API/SubmitEvent}
the element} which triggered the submission. *)
end
val submit : Submit.t Ev.type'
(** [submit] is the type for
{{:-US/docs/Web/API/HTMLFormElement/submit_event}submit} events. *)
end
(**/**)
include Jv.CONV with type t := t
(**/**)
end
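(* Usage sketch (illustrative, not part of the interface): read a form's
   current key/value entries via Form.Data and log the keys.

     let log_keys form =
       let data = Form.Data.of_form form in
       Form.Data.fold (fun k _v () -> Brr.Console.(log [k])) data ()
*)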
(** Fetching resources.
See the {{:-US/docs/Web/API/Fetch_API}
Fetch API}. *)
module Fetch : sig
(** Body specification and interface. *)
module Body : sig
(** {1:init Specification} *)
type init
(** The type for specifying bodies. *)
val of_jstr : Jstr.t -> init
(** [of_jstr s] is a body from string [s]. *)
val of_uri_params : Brr.Uri.Params.t -> init
(** [of_uri_params p] is a body from URI params [p]. *)
val of_form_data : Form.Data.t -> init
(** [of_form_data d] is a body from form data [d]. *)
val of_blob : Brr.Blob.t -> init
(** [of_blob b] is a body from blob [b]. *)
val of_array_buffer : Brr.Tarray.Buffer.t -> init
(** [of_array_buffer b] is a body from array buffer [b]. *)
(** {1:interface Interface} *)
type t
(** The type for objects implementing the
{{:-US/docs/Web/API/Body}[Body]}
interface. *)
val body_used : t -> bool
(** [body_used b] indicates
{{:-US/docs/Web/API/Body/bodyUsed}
indicates} if [b] was used. *)
val body : t -> Jv.t option
(** [body b] is [b] as a
{{:-US/docs/Web/API/Body/body}
stream}. *)
val array_buffer : t -> Tarray.Buffer.t Fut.or_error
(** [array_buffer b]
{{:-US/docs/Web/API/Body/arrayBuffer}
reads} [b] into an array buffer. *)
val blob : t -> Blob.t Fut.or_error
(** [blob b]
{{:-US/docs/Web/API/Body/blob}
reads} [b] as a blob. *)
val form_data : t -> Form.Data.t Fut.or_error
(** [form_data b]
{{:-US/docs/Web/API/Body/formData}
reads} [b] as form data. *)
val json : t -> Json.t Fut.or_error
(** [json b]
{{:-US/docs/Web/API/Body/json}
reads} [b] and parses it as JSON data. *)
val text : t -> Jstr.t Fut.or_error
(** [text b]
{{:-US/docs/Web/API/Body/text}reads}
[b] and UTF-8 decodes it to a string. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(** Request and response headers.
{b Warning.} We left mutable operations out of the interface
but remember these objects may mutate under your feet. *)
module Headers : sig
(** {1:headers Headers} *)
type t
(** The type for
{{:-US/docs/Web/API/Headers}[Headers]}
objects. *)
val mem : Jstr.t -> t -> bool
(** [mem h hs] is [true] iff header [h] has a value in [hs].
The lookup is case insensitive. *)
val find : Jstr.t -> t -> Jstr.t option
(** [find h hs] is the value of header [h] in [hs] (if any).
The lookup is case insensitive. *)
val fold : (Jstr.t -> Jstr.t -> 'a -> 'a) -> t -> 'a -> 'a
(** [fold f hs acc] folds the headers [h] of [hs] and their value
[v] with [f h v] starting with [acc]. It's unclear but
header names are likely lowercased. *)
(** {1:convert Converting} *)
val of_obj : Jv.t -> t
(** [of_obj o] uses the keys and values of object [o] to define
headers and their value. *)
val of_assoc : ?init:t -> (Jstr.t * Jstr.t) list -> t
(** [of_assoc ~init assoc] are the headers from [init] (default is
empty) to which the header value pairs of [assoc] are
appended. If a header is defined more than once this either
overwrites the previous definition, or appends to the value
if the value can be multi-valued. *)
val to_assoc : t -> (Jstr.t * Jstr.t) list
(** [to_assoc hs] are the headers [hs] as an assoc list.
It's unclear but header names are likely lowercased. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(** Resource requests. *)
module Request : sig
(** {1:enums Enumerations} *)
(** Request cache mode enum. *)
module Cache : sig
type t = Jstr.t
(** The type for {{:-US/docs/Web/API/Request/cache#Value}[RequestCache]} values. *)
val default : t
val force_cache : t
val no_cache : t
val no_store : t
val only_if_cached : t
val reload : t
end
(** Request credentials mode enum. *)
module Credentials : sig
type t = Jstr.t
(** The type for
{{:-US/docs/Web/API/Request/credentials#Value}[RequestCredentials]} values. *)
val include' : t
val omit : t
val same_origin : t
end
(** Request destination enum. *)
module Destination : sig
type t = Jstr.t
(** The type for
{{:-US/docs/Web/API/RequestDestination}[RequestDestination]} values. *)
val audio : t
val audioworklet : t
val document : t
val embed : t
val font : t
val frame : t
val iframe : t
val image : t
val manifest : t
val object' : t
val paintworklet : t
val report : t
val script : t
val sharedworker : t
val style : t
val track : t
val video : t
val worker : t
val xslt : t
end
(** Request mode enum. *)
module Mode : sig
type t = Jstr.t
(** The type for
{{:-US/docs/Web/API/Request/mode#Value}
[RequestMode]} values. *)
val cors : t
val navigate : t
val no_cors : t
val same_origin : t
end
(** Request redirect enum. *)
module Redirect : sig
type t = Jstr.t
(** The type for
{{:-US/docs/Web/API/Request/redirect#Value}
[RequestRedirect]} values. *)
val error : t
val follow : t
val manual : t
end
(** {1:req Requests} *)
type init
(** The type for request initialisation objects. *)
val init :
?body:Body.init -> ?cache:Cache.t -> ?credentials:Credentials.t ->
?headers:Headers.t -> ?integrity:Jstr.t -> ?keepalive:bool ->
?method':Jstr.t -> ?mode:Mode.t -> ?redirect:Redirect.t ->
?referrer:Jstr.t -> ?referrer_policy:Jstr.t ->
?signal:Abort.Signal.t -> unit -> init
(** [init ()] is a request initialisation object with given
{{:-US/docs/Web/API/Request/Request#Parameters}parameters}. *)
type t
(** The type for {{:-US/docs/Web/API/Request}[Request]} objects. *)
val v : ?init:init -> Jstr.t -> t
(** [v ~init uri] is a request on [uri] with parameters [init]. *)
val of_request : ?init:init -> t -> t
(** [of_request ~init r] is a copy of [r] updated by [init]. *)
external as_body : t -> Body.t = "%identity"
(** [as_body r] is the {!Body} interface of [r]. *)
(** {1:props Properties} *)
val cache : t -> Cache.t
(** [cache r] is the
{{:-US/docs/Web/API/Request/cache}
cache} behaviour of [r]. *)
val credentials : t -> Credentials.t
(** [credentials r] are the
{{:-US/docs/Web/API/Request/credentials}
credentials} of [r]. *)
val destination : t -> Destination.t
(** [destination r] is the
{{:-US/docs/Web/API/Request/destination}
destination} of [r]. *)
val headers : t -> Headers.t
(** [headers r] are the
{{:-US/docs/Web/API/Request/headers}
headers} of [r]. *)
val integrity : t -> Jstr.t
(** [integrity r] is the
{{:-US/docs/Web/API/Request/integrity}
integrity} of [r]. *)
val is_history_navigation : t -> bool
(** [is_history_navigation r] is the
{{:-US/docs/Web/API/Request/isHistoryNavigation}
[isHistoryNavigation]} property of [r]. *)
val is_reload_navigation : t -> bool
(** [is_reload_navigation r] is the
{{:-US/docs/Web/API/Request/isReloadNavigation}
[isReloadNavigation]} property of [r]. *)
val keepalive : t -> bool
(** [keepalive r] is the
{{:-US/docs/Web/API/Request/keepalive}
keepalive} behaviour of [r]. *)
val method' : t -> Jstr.t
(** [method' r] is the
{{:-US/docs/Web/API/Request/method}
method} of [r]. *)
val mode : t -> Mode.t
(** [mode r] is the
{{:-US/docs/Web/API/Request/mode}
mode} of [r]. *)
val redirect : t -> Redirect.t
(** [redirect r] is the
{{:-US/docs/Web/API/Request/redirect}
redirect} behaviour of [r]. *)
val referrer : t -> Jstr.t
(** [referrer r] is the
{{:-US/docs/Web/API/Request/referrer}
referrer} of [r]. *)
val referrer_policy : t -> Jstr.t
(** [referrer_policy r] is the
{{:-US/docs/Web/API/Request/referrerPolicy}
referrer policy} of [r]. *)
val signal : t -> Abort.Signal.t option
(** [signal r] is the
{{:-US/docs/Web/API/Request/signal}
abort signal} of [r]. *)
val url : t -> Jstr.t
(** [url r] is the
{{:-US/docs/Web/API/Request/url}
url} of [r]. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
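(* Usage sketch (illustrative, not part of the interface): build a POST
   request carrying a string body; the URL "/echo" and the payload are
   placeholders.

     let post_hello () =
       let body = Fetch.Body.of_jstr (Jstr.v "hello") in
       let init = Fetch.Request.init ~method':(Jstr.v "POST") ~body () in
       Fetch.Request.v ~init (Jstr.v "/echo")
*)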
(** Request responses. *)
module Response : sig
(** {1:enums Enumerations} *)
(** Response type enum. *)
module Type : sig
type t = Jstr.t
(** The type for {{:-US/docs/Web/API/Response/type#Value}[ResponseType]} values. *)
val basic : t
val cors : t
val default : t
val error : t
val opaque : t
val opaqueredirect : t
end
(** {1:resp Responses} *)
type init
(** The type for response initialisation objects. *)
val init :
?headers:Headers.t -> ?status:int -> ?status_text:Jstr.t -> unit -> init
(** [init ()] is a response initialisation object with given
{{:-US/docs/Web/API/Response/Response#Parameters}parameters}. *)
type t
(** The type for {{:-US/docs/Web/API/Response}[Response]} objects. *)
val v : ?init:init -> ?body:Body.init -> unit -> t
(** [v ~init ~body] is a response with parameters [init] and body
[body]. *)
val of_response : t -> t
(** [of_response r] is a copy of [r]. *)
val error : unit -> t
(** [error] is a
{{:-US/docs/Web/API/Response/error}
network error response}. *)
val redirect : ?status:int -> Jstr.t -> t
(** [redirect ~status url] is a
{{:-US/docs/Web/API/Response/redirect}
redirect response} to [url] with status [status]. *)
external as_body : t -> Body.t = "%identity"
(** [as_body r] is the {{!Body}body interface} of [r]. *)
(** {1:props Properties} *)
val headers : t -> Headers.t
(** [headers r] are the
{{:-US/docs/Web/API/Response/headers}
headers} of [r]. *)
val ok : t -> bool
(** [ok r] is [true] if the response [r] is
{{:-US/docs/Web/API/Response/ok}
successful}. *)
val redirected : t -> bool
(** [redirected r] is [true] if the response is the result of a
{{:-US/docs/Web/API/Response/redirected}
redirection}. *)
val status : t -> int
(** [status r] is the
{{:-US/docs/Web/API/Response/status}
status} of [r]. *)
val status_text : t -> Jstr.t
(** [status_text r] is the
{{:-US/docs/Web/API/Response/statusText}
status text} of [r]. *)
val url : t -> Jstr.t
(** [url r] is the
{{:-US/docs/Web/API/Response/url}
[url]} of [r]. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(** Fetch caches. *)
module Cache : sig
type query_opts
(** The type for query options. *)
val query_opts :
?ignore_search:bool -> ?ignore_method:bool -> ?ignore_vary:bool ->
?cache_name:Jstr.t -> unit -> query_opts
(** [query_opts ~ignore_search ~ignore_method ~ignore_vary ~cache_name ()]
are query options with given {{:-US/docs/Web/API/CacheStorage/match#Parameters}parameters}. *)
type t
(** The type for
{{:-US/docs/Web/API/Cache}Cache}
objects. *)
val match' :
?query_opts:query_opts -> t -> Request.t -> Response.t option Fut.or_error
(** [match' c req] is a {{:-US/docs/Web/API/Cache/match}stored response} for [req] in [c] (if any). *)
val match_all :
?query_opts:query_opts -> t -> Request.t -> Response.t list Fut.or_error
(** [match_all c req] is the list of {{:-US/docs/Web/API/Cache/matchAll}stored responses} for [req] in [c]. *)
val add : t -> Request.t -> unit Fut.or_error
(** [add c req] fetches [req] and
{{:-US/docs/Web/API/Cache/add}adds}
the response to [c]. *)
val add_all : t -> Request.t list -> unit Fut.or_error
(** [add_all c reqs] fetches [reqs] and
{{:-US/docs/Web/API/Cache/addAll}adds}
their responses to [c]. *)
val put : t -> Request.t -> Response.t -> unit Fut.or_error
(** [put c req resp]
{{:-US/docs/Web/API/Cache/put}puts}
the [req]/[resp] pair to the cache. *)
val delete : ?query_opts:query_opts -> t -> Request.t -> bool Fut.or_error
(** [delete c req] {{:-US/docs/Web/API/Cache/delete}deletes} the response to [req] from the cache. [false]
is returned if [req] was not in the cache. *)
val keys :
?query_opts:query_opts -> ?req:Request.t -> t ->
Request.t list Fut.or_error
(** [keys c] are the {{:-US/docs/Web/API/Cache/keys}requests} cached by [c]. *)
(** {1:cache_storage Cache storage} *)
(** Cache storage objects. *)
module Storage : sig
type cache = t
(** See {!t}. *)
type t
(** The type for
{{:-US/docs/Web/API/CacheStorage}
CacheStorage} objects. See {!Brr_io.Fetch.caches} to get one. *)
val match' :
?query_opts:query_opts -> t -> Request.t ->
Response.t option Fut.or_error
(** [match' s req] is a {{:-US/docs/Web/API/CacheStorage/match}stored response} for [req] in [s] (if any). *)
val has : t -> Jstr.t -> bool Fut.or_error
(** [has s n] is [true] if [n] matches a {{:-US/docs/Web/API/CacheStorage/has}cache name} in [s]. *)
val open' : t -> Jstr.t -> cache Fut.or_error
(** [open' s n] {{:-US/docs/Web/API/CacheStorage/open}opens} the cache named [n] of [s]. *)
val delete : t -> Jstr.t -> bool Fut.or_error
(** [delete s n] {{:-US/docs/Web/API/CacheStorage/delete}deletes} the cache named [n] from [s]. [false] is returned
if [n] did not exist. *)
val keys : t -> Jstr.t list Fut.or_error
(** [keys s] are the {{:-US/docs/Web/API/CacheStorage/keys}cache names} in [s]. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(**/**)
include Jv.CONV with type t := t
(**/**)
end
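(* Usage sketch (illustrative, not part of the interface): open a cache
   by name and store the response to a request in it; the cache name
   "v1" is arbitrary and error handling is minimal.

     let cache_req req =
       Fut.await (Fetch.Cache.Storage.open' Fetch.caches (Jstr.v "v1")) @@
       function
       | Error e -> Brr.Console.(error [e])
       | Ok c -> Fut.await (Fetch.Cache.add c req) @@ fun _ -> ()
*)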
(** Fetch events. *)
module Ev : sig
type t
(** The type for
{{:-US/docs/Web/API/FetchEvent}
[FetchEvent]} objects. *)
val fetch : t Ev.type'
(** [fetch] is the [fetch] event type. *)
val as_extendable : t -> Ev.Extendable.t Ev.t
(** [as_extendable e] is [e] as an extendable event. *)
val request : t -> Request.t
(** [request e] is the
{{:-US/docs/Web/API/FetchEvent/request}
request} being fetched. *)
val preload_response : t -> Response.t option Fut.or_error
(** [preload_response e] is a navigation response {{:-US/docs/Web/API/FetchEvent/preloadResponse}preload} (if any). *)
val client_id : t -> Jstr.t
(** [client_id e] is the {{:-US/docs/Web/API/FetchEvent/clientId}client id} of [e]. *)
val resulting_client_id : t -> Jstr.t
(** [resulting_client_id e] is the {{:-US/docs/Web/API/FetchEvent/resultingClientId}resulting} client id. *)
val replaces_client_id : t -> Jstr.t
(** [replaces_client_id e] is the client id being
{{:-US/docs/Web/API/FetchEvent/replacesClientId}replaced}. *)
val handled : t -> unit Fut.or_error
(** [handled e] is obscure. *)
val respond_with : t -> Response.t Fut.or_error -> unit
(** [respond_with e resp] replaces the browser's default fetch handling
with the {{:-US/docs/Web/API/FetchEvent/respondWith}
response} [resp]. *)
end
val url : ?init:Request.init -> Jstr.t -> Response.t Fut.or_error
(** [url ~init u] {{:-US/docs/Web/API/WindowOrWorkerGlobalScope/fetch}fetches} URL [u] with the [init] request object. *)
val request : Request.t -> Response.t Fut.or_error
(** [request r] {{:-US/docs/Web/API/WindowOrWorkerGlobalScope/fetch}fetches} request [r]. *)
val caches : Cache.Storage.t
(** [caches] is the global
{{:-US/docs/Web/API/WindowOrWorkerGlobalScope/caches}[caches]} object. *)
end
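(* Usage sketch (illustrative, not part of the interface): fetch a URL
   and read the response body as text; the continuation [k] is a
   placeholder for whatever consumes the text.

     let get_text url k =
       Fut.await (Fetch.url url) @@ function
       | Error e -> Brr.Console.(error [e])
       | Ok resp ->
           Fut.await (Fetch.Body.text (Fetch.Response.as_body resp)) @@
           function
           | Ok txt -> k txt
           | Error e -> Brr.Console.(error [e])
*)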
(** Access to device location.
See {{:-US/docs/Web/API/Geolocation_API}
Geolocation API}. *)
module Geolocation : sig
(** Position errors. *)
module Error : sig
(** {1:codes Codes} *)
type code = int
(** The type for {{:-US/docs/Web/API/GeolocationPositionError/code#Value}error code} values. *)
val permission_denied : code
val position_unavailable : code
val timeout : code
(** {1:errors Errors} *)
type t
(** The type for {{:-US/docs/Web/API/GeolocationPositionError}[GeolocationPositionError]} objects. *)
val code : t -> code
(** [code e] is the
{{:-US/docs/Web/API/GeolocationPositionError/code}error code} of [e]. *)
val message : t -> Jstr.t
(** [message e] is a
{{:-US/docs/Web/API/GeolocationPositionError/message}human readable} error message. For programmers, not for end
users. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(** Positions. *)
module Pos : sig
type t
(** The type for {{:-US/docs/Web/API/GeolocationPosition}[GeolocationPosition]} objects (and their
{{:-US/docs/Web/API/GeolocationCoordinates}[GeolocationCoordinates]} member). *)
val latitude : t -> float
(** [latitude p] is the {{:-US/docs/Web/API/GeolocationCoordinates/latitude}latitude} in decimal degrees. *)
val longitude : t -> float
(** [longitude p] is the {{:-US/docs/Web/API/GeolocationCoordinates/longitude}longitude} in decimal degrees. *)
val accuracy : t -> float
(** [accuracy p] is the {{:-US/docs/Web/API/GeolocationCoordinates/accuracy}accuracy}, in meters, of the {!latitude}
and {!longitude}. *)
val altitude : t -> float option
(** [altitude p] is the {{:-US/docs/Web/API/GeolocationCoordinates/altitude}altitude} in meters relative to sea level. *)
val altitude_accuracy : t -> float option
(** [altitude_accuracy p] is the {{:-US/docs/Web/API/GeolocationCoordinates/altitudeAccuracy}altitude accuracy}, in meters,
of the {!altitude}. *)
val heading : t -> float option
(** [heading p] is the {{:-US/docs/Web/API/GeolocationCoordinates/heading}direction} in degrees with respect to true north
(90° is east). If {!speed} is [0], this is [nan]. *)
val speed : t -> float option
(** [speed p] is the device {{:-US/docs/Web/API/GeolocationCoordinates/speed}velocity} in meters per seconds. *)
val timestamp_ms : t -> float
(** [timestamp_ms p] is the
{{:-US/docs/Web/API/GeolocationPosition/timestamp}time} of measurement in [ms] since
the epoch. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
type opts
(** The type for geolocalisation options. *)
val opts :
?high_accuracy:bool -> ?timeout_ms:int -> ?maximum_age_ms:int -> unit ->
opts
(** [opts ~high_accuracy ~maximum_age_ms ~timeout_ms ()] are geolocalisation
{{:-US/docs/Web/API/PositionOptions#Properties}options}. *)
(** {1:geoloc Geolocalizing} *)
type t
(** The type for device {{:-US/docs/Web/API/Geolocation}[Geolocation]} objects. *)
val of_navigator : Navigator.t -> t
(** [of_navigator n] is a device geolocalisation object for
{{:-US/docs/Web/API/Navigator/geolocation}navigator} [n]. *)
val get : ?opts:opts -> t -> (Pos.t, Error.t) Fut.result
(** [get l ~opts] is the position of [l]
{{:-US/docs/Web/API/Geolocation/getCurrentPosition}determined}
with options [opts]. *)
type watch_id = int
(** The type for watcher identifiers. *)
val watch : ?opts:opts -> t -> ((Pos.t, Error.t) result -> unit) -> watch_id
(** [watch l ~opts f] {{:-US/docs/Web/API/Geolocation/watchPosition}monitors} the position of [l] determined with [opts] by
periodically calling [f]. Stop watching by calling {!unwatch} with
the returned identifier. *)
val unwatch : t -> watch_id -> unit
(** [unwatch l id] {{:-US/docs/Web/API/Geolocation/clearWatch}unwatches} [id] as returned by a previous call to {!watch}. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
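(* Usage sketch (illustrative, not part of the interface): log the
   device's current position once.

     let show_pos () =
       let l = Geolocation.of_navigator Brr.G.navigator in
       Fut.await (Geolocation.get l) @@ function
       | Ok p ->
           Brr.Console.(log [Geolocation.Pos.latitude p;
                             Geolocation.Pos.longitude p])
       | Error e -> Brr.Console.(log [Geolocation.Error.message e])
*)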
(** Access to media devices, streams and elements.
Access to the {{:-main}Media
Capture and Streams} API, the
{{:-record/} MediaStream
Recording} API and the
{{:-US/docs/Web/API/HTMLMediaElement}
[HTMLMediaElement]} interface. *)
module Media : sig
(** {1:constrainable Constrainable pattern}
The following little bureaucracy tries to expose
the {{:-main/#constrainable-interface}
constrainable pattern} in a lean way.
{{:-US/docs/Web/API/Media_Streams_API/Constraints}This introduction} on MDN may also be useful. *)
(** Media objects properties, capabilities and constraints. *)
module Prop : sig
(** {1:range_constraits Ranges and constraints} *)
(** [bool] constraints. *)
module Bool : sig
module Constraint : sig
type t
(** The type for [bool] constraints. *)
val v : ?exact:bool -> ?ideal:bool -> unit -> t
end
end
(** [int] ranges and constraints. *)
module Int : sig
module Range : sig
type t
(** The type for integer ranges. *)
val v : ?min:int -> ?max:int -> unit -> t
val min : t -> int option
val max : t -> int option
(**/**)
include Jv.CONV with type t := t
(**/**)
end
module Constraint : sig
type t
(** The type for integer range constraints. *)
val v : ?min:int -> ?max:int -> ?exact:int -> ?ideal:int -> unit -> t
(**/**)
include Jv.CONV with type t := t
(**/**)
end
end
(** [float] ranges and constraints. *)
module Float : sig
module Range : sig
type t
(** The type for float ranges. *)
val v : ?min:float -> ?max:float -> unit -> t
val min : t -> float option
val max : t -> float option
(**/**)
include Jv.CONV with type t := t
(**/**)
end
module Constraint : sig
type t
(** The type for float range constraints. *)
val v : ?min:float -> ?max:float -> ?exact:float -> ?ideal:float ->
unit -> t
(**/**)
include Jv.CONV with type t := t
(**/**)
end
end
(** [Jstr] constraints. *)
module Jstr : sig
type t = Jstr.t
module Constraint : sig
type t
(** The type for [Jstr.t] constraints. *)
val v : ?exact:Jstr.t list -> ?ideal:Jstr.t list -> unit -> t
(**/**)
include Jv.CONV with type t := t
(**/**)
end
end
(** {1:props Properties} *)
type ('a, 'b, 'c) t
(** The type for properties of type ['a] whose capabilities
are described by ['b] and which are constrained by ['c]. *)
type bool_t = (bool, bool list, Bool.Constraint.t) t
(** The type for boolean properties. *)
val bool : Jstr.t -> bool_t
(** [bool n] is a bool property named [n]. *)
type int_t = (int, Int.Range.t, Int.Constraint.t) t
(** The type for integer properties. *)
val int : Jstr.t -> int_t
(** [int n] is an integer property named [n]. *)
type float_t = (float, Float.Range.t, Float.Constraint.t) t
(** The type for floating point properties. *)
val float : Jstr.t -> float_t
(** [float n] is a float property named [n]. *)
type jstr_t = (Jstr.t, Jstr.t, Jstr.Constraint.t) t
(** The type for string properties. *)
val jstr : Jstr.t -> jstr_t
(** [jstr n] is a string property named [n]. *)
type jstr_enum_t = (Jstr.t, Jstr.t list, Jstr.Constraint.t) t
(** The type for string enumeration properties. *)
val jstr_enum : Jstr.t -> jstr_enum_t
(** [jstr_enum n] is a string enumeration property named [n]. *)
(** {1:low Low-level interface} *)
type 'a conv = ('a -> Jv.t) * (Jv.t -> 'a)
(** ['a conv] specifies encoding and decoding functions for JavaScript. *)
val v : Jstr.t -> 'a conv -> 'b conv -> 'c conv -> ('a, 'b, 'c) t
(** [v v_conv cap_conv constr_conv n] is a new property named [n] whose
    values are converted with [v_conv], capabilities with [cap_conv] and
    constraints with [constr_conv]. *)
val name : ('a, 'b, 'c) t -> Jstr.t
(** [name p] is the name of the property. *)
val value_of_jv : ('a, 'b, 'c) t -> Jv.t -> 'a
(** [value_of_jv p jv] is the property value of [p] from [jv]. *)
val value_to_jv : ('a, 'b, 'c) t -> 'a -> Jv.t
(** [value_to_jv p v] is the JavaScript value of [p] for [v]. *)
val cap_of_jv : ('a, 'b, 'c) t -> Jv.t -> 'b
(** [cap_of_jv p jv] is the property capability of [p] from [jv]. *)
val cap_to_jv : ('a, 'b, 'c) t -> 'b -> Jv.t
(** [cap_to_jv p v] is the capability value of [p] for [v]. *)
val constr_of_jv : ('a, 'b, 'c) t -> Jv.t -> 'c
(** [constr_of_jv p jv] is the property constraint of [p] from [jv]. *)
val constr_to_jv : ('a, 'b, 'c) t -> 'c -> Jv.t
(** [constr_to_jv p v] is the constraint value of [p] for [v]. *)
end
(** Supported property constraints.
Indicates the media properties constraints the user agent
understands. *)
module Supported_constraints : sig
type t
(** The type for supported constraints. *)
val mem : ('a, 'b, 'c) Prop.t -> t -> bool
(** [mem p s] is [true] if property [p] can be constrained according to [s]. *)
val names : t -> Jstr.t list
(** [names s] is the list of names of the supported constraints. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(** Property constraints specifications. *)
module Constraints : sig
type t
(** The type for constraints. *)
val empty : unit -> t
(** [empty ()] is an empty set of constraints. *)
val find : ('a, 'b, 'c) Prop.t -> t -> 'c option
(** [find p c] is the constraint for [p] in [c] (if any). *)
val set : ('a, 'b, 'c) Prop.t -> 'c -> t -> unit
(** [set p v c] sets the constraint for [p] to [v] in [c]. *)
val delete : ('a, 'b, 'c) Prop.t -> t -> unit
(** [delete p c] deletes the constraint for [p] from [c]. *)
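(* Example — a client-side sketch of building a constraint set (names
   qualified as seen from outside this signature; [Track.Prop.width] is
   defined further below in this interface):
   {[
     let ideal_width_constraints () =
       let c = Constraints.empty () in
       Constraints.set Track.Prop.width (Prop.Int.Constraint.v ~ideal:1280 ()) c;
       c (* constrains the ideal video width to 1280 pixels *)
   ]} *)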
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(** Property capability specifications. *)
module Capabilities : sig
type t
(** The type for capabilities. *)
val find : ('a, 'b, 'c) Prop.t -> t -> 'b option
(** [find p c] is the capability of [p] in [c] (if any). *)
val set : ('a, 'b, 'c) Prop.t -> 'b -> t -> unit
(** [set p v c] sets the capability of [p] to [v] in [c]. *)
val delete : ('a, 'b, 'c) Prop.t -> t -> unit
(** [delete p c] deletes the capability of [p] from [c]. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(** Property values. *)
module Settings : sig
type t
(** The type for settings. *)
val get : ('a, 'b, 'c) Prop.t -> t -> 'a
(** [get p s] is the value of [p] in [s]. *)
val find : ('a, 'b, 'c) Prop.t -> t -> 'a option
(** [find p s] is the value of [p] in [s] (if any). *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(** {1:media Media devices, streams and tracks} *)
(** Media stream tracks. *)
module Track : sig
(** {1:enum Enumerations and properties} *)
(** Track state enumeration. *)
module State : sig
type t = Jstr.t
(** The type for
        {{:-US/docs/Web/API/MediaStreamTrack/readyState#Value}track state} values. *)
val live : t
val ended : t
end
(** Track kind enumeration. *)
module Kind : sig
type t = Jstr.t
(** The type for
{{:-US/docs/Web/API/MediaStreamTrack/kind#Value}track kind} values. *)
val audio : t
val video : t
end
(** Track properties. *)
module Prop : sig
val aspect_ratio : Prop.float_t
(** The {{:-US/docs/Web/API/MediaTrackSettings/aspectRatio}[aspectRatio]} property. *)
val auto_gain_control : Prop.bool_t
(** The {{:-US/docs/Web/API/MediaTrackSettings/autoGainControl}[autoGainControl]} property. *)
val channel_count : Prop.int_t
(** The {{:-US/docs/Web/API/MediaTrackSettings/channelCount}[channelCount]} property. *)
val cursor : Prop.jstr_enum_t
(** The {{:-US/docs/Web/API/MediaTrackSettings/cursor}[cursor]} property. *)
val device_id : Prop.jstr_t
(** The {{:-US/docs/Web/API/MediaTrackSettings/deviceId}[deviceId]} property. *)
val display_surface : Prop.jstr_enum_t
(** The {{:-US/docs/Web/API/MediaTrackSettings/displaySurface}[displaySurface]} property. *)
val echo_cancellation : Prop.bool_t
(** The {{:-US/docs/Web/API/MediaTrackSettings/echoCancellation}[echoCancellation]} property. *)
val facing_mode : Prop.jstr_enum_t
(** The {{:-US/docs/Web/API/MediaTrackSettings/facingMode}[facingMode]} property. *)
val frame_rate : Prop.float_t
(** The {{:-US/docs/Web/API/MediaTrackSettings/frameRate}[frameRate]} property. *)
val group_id : Prop.jstr_t
(** The {{:-US/docs/Web/API/MediaTrackSettings/groupId}[groupId]} property. *)
val height : Prop.int_t
(** The {{:-US/docs/Web/API/MediaTrackSettings/height}[height]} property. *)
val latency : Prop.float_t
(** The {{:-US/docs/Web/API/MediaTrackSettings/latency}[latency]} property. *)
val logical_surface : Prop.bool_t
(** The {{:-US/docs/Web/API/MediaTrackSettings/logicalSurface}[logicalSurface]} property. *)
val noise_suppresion : Prop.bool_t
(** The {{:-US/docs/Web/API/MediaTrackSettings/noiseSuppression}[noiseSuppression]} property. *)
val resize_mode : Prop.jstr_enum_t
(** The {{:-US/docs/Web/API/MediaTrackSettings/resizeMode}[resizeMode]} property. *)
val sample_rate : Prop.int_t
(** The {{:-US/docs/Web/API/MediaTrackSettings/sampleRate}[sampleRate]} property. *)
val sample_size : Prop.int_t
(** The {{:-US/docs/Web/API/MediaTrackSettings/sampleSize}[sampleSize]} property. *)
val width : Prop.int_t
(** The {{:-US/docs/Web/API/MediaTrackSettings/width}[width]} property. *)
end
(** {1:tracks Tracks} *)
type t
(** The type for {{:-US/docs/Web/API/MediaStreamTrack}[MediaStreamTrack]} objects. *)
external as_target : t -> Ev.target = "%identity"
(** [as_target t] is [t] as an event target. *)
val id : t -> Jstr.t
(** [id t] is the {{:-US/docs/Web/API/MediaStreamTrack/id}unique identifier} of [t]. *)
val isolated : t -> bool
(** [isolated t] is the
{{:-identity/#dfn-isolated}isolation status}
of [t]. *)
val kind : t -> Kind.t
(** [kind t] is the
    {{:-US/docs/Web/API/MediaStreamTrack/kind}kind} of [t]. *)
val label : t -> Jstr.t
(** [label t] is the
{{:-US/docs/Web/API/MediaStreamTrack/label}label} of [t]. *)
val muted : t -> bool
(** [muted t] is [true] if [t] is
{{:-US/docs/Web/API/MediaStreamTrack/muted}muted}. Use {!set_enabled} to manually mute and unmute a track. Use events
{!Ev.mute} and {!Ev.unmute} to monitor mute status. *)
val ready_state : t -> State.t
(** [ready_state t] is the
{{:-US/docs/Web/API/MediaStreamTrack/readyState}status} of the track. Use event {!Ev.ended} to monitor ready state. *)
val enabled : t -> bool
(** [enabled t] is [true] if the track is {{:-US/docs/Web/API/MediaStreamTrack/enabled}allowed} to render the source
and [false] if it's not. Use {!set_enabled} to control this. *)
val set_enabled : t -> bool -> unit
(** [set_enabled t b] sets the track {!enabled} status to [b].
If the track has been disconnected this has no effect. *)
val get_capabilities : t -> Capabilities.t
(** [get_capabilities t] are the {{:-US/docs/Web/API/MediaStreamTrack/getCapabilities}capabilities} of [t]. *)
val get_constraints : t -> Constraints.t
(** [get_constraints t] are the {{:-US/docs/Web/API/MediaTrackConstraints}constraints} of [t]. *)
val apply_constraints : t -> Constraints.t option -> unit Fut.or_error
(** [apply_constraints t cs]
    {{:-US/docs/Web/API/MediaStreamTrack/applyConstraints}applies}
    the given constraints [cs]. Unspecified constraints are restored to
    their default value. If no constraints are given, all
    constraints are restored to their defaults. *)
val get_settings : t -> Settings.t
(** [get_settings t] are the {{:-US/docs/Web/API/MediaTrackSettings}settings} of [t]. *)
val stop : t -> unit
(** [stop t] {{:-US/docs/Web/API/MediaStreamTrack/stop}stops} the track. *)
val clone : t -> t
(** [clone t] creates a {{:-US/docs/Web/API/MediaStreamTrack/clone}copy} of [t] equal to it except for its {!id}. *)
(** {1:events Events} *)
(** Track events. *)
module Ev : sig
(** {1:obj Track event object} *)
type track = t
type t
(** The type for
{{:-US/docs/Web/API/MediaStreamTrackEvent} [MediaStreamTrackEvent]} objects. *)
val track : t -> track
(** [track e] is the track object associated to the event. *)
(** {1:track_event Track events} *)
val ended : Ev.void
(** [ended] is the {{:-US/docs/Web/API/MediaStreamTrack/ended_event}ended} event. *)
val isolationchange : Ev.void
(** [isolationchange] is the
    {{:-identity/#event-isolationchange}
    isolationchange} event. *)
val mute : Ev.void
(** [mute] is the
{{:-US/docs/Web/API/MediaStreamTrack/mute_event}[mute]} event. *)
val unmute : Ev.void
(** [unmute] is the {{:-US/docs/Web/API/MediaStreamTrack/unmute_event}unmute} event. *)
end
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(** Media streams. *)
module Stream : sig
(** Media stream constraints. *)
module Constraints : sig
type t
(** The type for
{{:-US/docs/Web/API/MediaStreamConstraints}[MediaStreamConstraints]}
objects. *)
type track = [ `No | `Yes of Constraints.t option ]
(** The type for specifying track constraints. *)
val v : ?audio:track -> ?video:track -> unit -> t
(** [v ~audio ~video ()] are stream constraints with
given arguments. If unspecified they default to [`No]. *)
val av : unit -> t
(** [av ()] says [`Yes None] to both audio and video. *)
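(* Example — a sketch of requesting audio and video with an additional
   ideal width on the video tracks ([Track.Prop.width] and the track-level
   [Constraints] are defined earlier in this interface):
   {[
     let av_constraints () =
       let video = Constraints.empty () in
       Constraints.set Track.Prop.width (Prop.Int.Constraint.v ~ideal:1280 ()) video;
       Stream.Constraints.v ~audio:(`Yes None) ~video:(`Yes (Some video)) ()
   ]} *)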
(**/**)
include Jv.CONV with type t := t
(**/**)
end
type t
(** The type for
{{:-US/docs/Web/API/MediaStream}[MediaStream]} objects. *)
val create : unit -> t
(** [create ()] is a stream without tracks. *)
val of_stream : t -> t
(** [of_stream s] is a new stream which shares its tracks with [s]. *)
val of_tracks : Track.t list -> t
(** [of_tracks ts] is a stream with tracks [ts]. *)
external as_target : t -> Ev.target = "%identity"
(** [as_target s] is [s] as an event target. *)
val id : t -> Jstr.t
(** [id s] is a {{:-US/docs/Web/API/MediaStream/id}unique identifier} for [s]. *)
val active : t -> bool
(** [active s] is [true] if [s] is {{:-US/docs/Web/API/MediaStream/active}active}. *)
val get_audio_tracks : t -> Track.t list
(** [get_audio_tracks s] is the list of
{{:-US/docs/Web/API/MediaStream/getAudioTracks}audio tracks} of [s]. *)
val get_video_tracks : t -> Track.t list
(** [get_video_tracks s] is the list of
{{:-US/docs/Web/API/MediaStream/getVideoTracks}video tracks} of [s]. *)
val get_tracks : t -> Track.t list
(** [get_tracks s] is the list of
{{:-US/docs/Web/API/MediaStream/getTracks}tracks} of [s]. *)
val get_track_by_id : t -> Jstr.t -> Track.t option
(** [get_track_by_id s id]
{{:-US/docs/Web/API/MediaStream/getTrackById}finds} the track identified by [id] (if any). *)
val add_track : t -> Track.t -> unit
(** [add_track s t] {{:-US/docs/Web/API/MediaStream/addTrack}adds} track [t] to [s]. If [t] was already in [s]
nothing happens. *)
val remove_track : t -> Track.t -> unit
(** [remove_track s t] {{:-US/docs/Web/API/MediaStream/removeTrack}removes} track [t] from [s]. If [t] was not in [s]
nothing happens. *)
val clone : t -> t
(** [clone s] {{:-US/docs/Web/API/MediaStream/clone}clones} the tracks of [s] and [s] itself. It has the same
parameters except for [id]. *)
(** {1:events Events} *)
(** Stream events. *)
module Ev : sig
val addtrack : Track.Ev.t Ev.type'
(** [addtrack] is the {{:-US/docs/Web/API/MediaStream/onaddtrack}[addtrack]} event. *)
val removetrack : Track.Ev.t Ev.type'
(** [removetrack] is the {{:-US/docs/Web/API/MediaStream/onremovetrack}[removetrack]} event. *)
end
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(** Media recorder.
See the {{:-record/}
MediaStream Recording} API. *)
module Recorder : sig
(** {1:enums Enumerations} *)
(** Bitrate mode enumeration. *)
module Bitrate_mode : sig
type t = Jstr.t
(** The type for {{:-record/#bitratemode}[BitrateMode]} values. *)
val cbr : t
val vbr : t
end
(** Recording state enumeration. *)
module Recording_state : sig
type t = Jstr.t
(** The type for {{:-record/#recordingstate}[RecordingState]} values. *)
val inactive : t
val recording : t
val paused : t
end
(** {1:recorders Recorder} *)
val is_type_supported : Jstr.t -> bool
(** [is_type_supported t] is [true] if recording to MIME type
[t] is {{:-US/docs/Web/API/MediaRecorder/isTypeSupported}supported}. *)
type init
(** The type for initialisation objects. *)
val init :
?type':Jstr.t -> ?audio_bps:int -> ?video_bps:int -> ?bps:int ->
?audio_bitrate_mode:Bitrate_mode.t -> unit -> init
(** [init ()] is a media recorder initialisation object with given
{{:-US/docs/Web/API/MediaRecorder/MediaRecorder#Parameters}parameters}. *)
type t
(** The type for
{{:-US/docs/Web/API/MediaRecorder}[MediaRecorder]} objects. *)
val create : ?init:init -> Stream.t -> t
(** [create ~init s] is a
    {{:-US/docs/Web/API/MediaRecorder/MediaRecorder}recorder} for [s]. The function
    raises if the [type'] of the [init] object
    is not {{!is_type_supported}supported}. *)
val stream : t -> Stream.t
(** [stream r] is the stream being recorded by [r]. *)
val type' : t -> Jstr.t
(** [type' r] is the stream's MIME type. *)
val state : t -> Recording_state.t
(** [state r] is the
{{:-US/docs/Web/API/MediaRecorder/state}recording state} of [r]. *)
val video_bps : t -> int
(** [video_bps r] is the {{:-US/docs/Web/API/MediaRecorder/videoBitsPerSecond}video encoding bit rate} of [r]. *)
val audio_bps : t -> int
(** [audio_bps r] is the {{:-US/docs/Web/API/MediaRecorder/audioBitsPerSecond}audio encoding bit rate} of [r]. *)
val audio_bitrate_mode : t -> Bitrate_mode.t
(** [audio_bitrate_mode r] is the {{:-record/#dom-mediarecorder-audiobitratemode}audio encoding mode} of [r]. *)
val start : t -> timeslice_ms:int option -> (unit, Jv.Error.t) result
(** [start r ~timeslice_ms]
{{:-US/docs/Web/API/MediaRecorder/start}starts} [r]. [timeslice_ms] indicates the number of milliseconds to record in
each blob. If not specified the whole duration is in a single blob,
unless {!request_data} is invoked to drive the process. *)
val stop : t -> unit
(** [stop r] {{:-US/docs/Web/API/MediaRecorder/stop}stops} [r]. *)
val pause : t -> unit
(** [pause r] {{:-US/docs/Web/API/MediaRecorder/pause}pauses} [r]. *)
val resume : t -> unit
(** [resume r] {{:-US/docs/Web/API/MediaRecorder/resume}resumes} [r]. *)
val request_data : t -> unit
(** [request_data r] {{:-US/docs/Web/API/MediaRecorder/requestData}requests} the data of [r]. *)
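(* Example — a sketch of driving a recorder on a stream [s]. Blobs are
   delivered via the [Ev.dataavailable] events defined below; here we only
   show the control flow ([Jstr.v] is from the Brr core library; guard the
   MIME type with [is_type_supported] first):
   {[
     let record s =
       let r = create ~init:(init ~type':(Jstr.v "video/webm") ()) s in
       match start r ~timeslice_ms:(Some 1000) with
       | Error _e -> () (* e.g. invalid state: report it *)
       | Ok () -> (* … later: *) stop r
   ]} *)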
(** {1:events Events} *)
module Ev : sig
(** {1:obj Event objects} *)
(** Blob events. *)
module Blob : sig
type t
(** The type for
{{:-record/#blobevent-section}
[BlobEvent]} objects. *)
val data : t -> Blob.t
(** [data e] is the requested data as a blob object. *)
val timecode : t -> float
(** [timecode e] is the difference between the timestamp of the first
    chunk in {!data} and the one produced by the first chunk in the
    first blob event produced by the recorder (that one may not
    be zero). *)
end
(** Recorder errors. *)
module Error : sig
type t
(** The type for
{{:-US/docs/Web/API/MediaRecorderErrorEvent} [MediaRecorderErrorEvent]} objects. *)
val error : t -> Jv.Error.t
(** [error e] is the event's {{:-US/docs/Web/API/MediaRecorderErrorEvent/error}error}. *)
end
(** {1:events Recorder events} *)
val start : Ev.void
(** [start] is the recorder {{:-US/docs/Web/API/MediaRecorder/onstart}[start]} event. *)
val stop : Ev.void
(** [stop] is the recorder {{:-US/docs/Web/API/MediaRecorder/onstop}[stop]} event. *)
val dataavailable : Blob.t Ev.type'
(** [dataavailable] is the recorder {{:-US/docs/Web/API/MediaRecorder/ondataavailable}[dataavailable]} event. *)
val pause : Ev.void
(** [pause] is the recorder {{:-US/docs/Web/API/MediaRecorder/onpause}[pause]} event. *)
val resume : Ev.void
(** [resume] is the recorder {{:-US/docs/Web/API/MediaRecorder/onresume}[resume]} event. *)
val error : Error.t Ev.type'
(** [error] is the recorder {{:-US/docs/Web/API/MediaRecorder/onerror}[error]} event. *)
end
end
(** Device kinds and information. *)
module Device : sig
(** Device kind enumeration. *)
module Kind : sig
type t = Jstr.t
(** The type for
{{:-main/#dom-mediadevicekind}
[MediaDeviceKind]} values. *)
val audioinput : t
val audiooutput : t
val videoinput : t
end
(** Device information. *)
module Info : sig
type t
(** The type for {{:-US/docs/Web/API/MediaDevices}[MediaDeviceInfo]} objects. *)
val device_id : t -> Jstr.t
(** [device_id d] is the identifier of the device. *)
val kind : t -> Kind.t
(** [kind d] is the kind of device. *)
val label : t -> Jstr.t
(** [label d] is a label describing the device. *)
val group_id : t -> Jstr.t
(** [group_id d] is the group identifier of the device. Two devices
    have the same group identifier if they belong to the same physical
    device. *)
val to_json : t -> Json.t
(** [to_json d] is [d] as JSON data. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
end
(** Media device enumeration. *)
module Devices : sig
type t
(** The type for {{:-US/docs/Web/API/MediaDevices}[MediaDevices]} objects. *)
val of_navigator : Navigator.t -> t
(** [of_navigator n] provides access to media devices of [n]. *)
external as_target : t -> Ev.target = "%identity"
(** [as_target m] is [m] as an event target. *)
val enumerate : t -> Device.Info.t list Fut.or_error
(** [enumerate m]
    {{:-US/docs/Web/API/MediaDevices/enumerateDevices}determines}
    a list of connected media devices. Monitor changes by listening to
    {!Ev.devicechange} on [m]. *)
val get_supported_constraints : t -> Supported_constraints.t
(** [get_supported_constraints m]
{{:-US/docs/Web/API/MediaDevices/getSupportedConstraints}determines}
the media constraints the user agent understands. *)
val get_user_media : t -> Stream.Constraints.t -> Stream.t Fut.or_error
(** [get_user_media m c]
    {{:-US/docs/Web/API/MediaDevices/getUserMedia}prompts}
    the user to use a media input which can produce a media stream
    constrained by [c].
    {{:-US/docs/Web/API/MediaDevices/getUserMedia#Exceptions}These
    errors} can occur. In particular [Jv.Error.Not_allowed] and
    [Jv.Error.Not_found] should be reported to the user in a
    friendly way. In some browsers this call has to be done
    in a user interface event handler. *)
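(* Example — a sketch of prompting for a camera and microphone stream and
   handling refusal ([Fut.await] and [Brr.G.navigator] are assumed from the
   Brr core libraries):
   {[
     let get_av () =
       let m = Devices.of_navigator Brr.G.navigator in
       Fut.await (Devices.get_user_media m (Stream.Constraints.av ())) @@ function
       | Ok stream -> ignore stream (* e.g. feed [stream] to a media element *)
       | Error e -> ignore e (* e.g. permission denied: tell the user *)
   ]} *)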
val get_display_media : t -> Stream.Constraints.t -> Stream.t Fut.or_error
(** [get_display_media m c]
{{:-US/docs/Web/API/MediaDevices/getDisplayMedia}prompts} the user to select and grant permission to capture the
contents of a display as a media stream. A video
track is unconditionally returned even if [c] says otherwise.
In some browsers this call has to be done in a user interface event
handler.
See this
{{:-US/docs/Web/API/Screen_Capture_API/Using_Screen_Capture}MDN article} for more details. *)
(** {1:events Events} *)
(** Device events. *)
module Ev : sig
val devicechange : Ev.void
(** [devicechange] is the {{:-US/docs/Web/API/MediaDevices/devicechange_event}[devicechange]} event. Monitors
media device additions and removals on [MediaDevice] objects. *)
end
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(** {1:el Media element interface} *)
(** The HTML {{:-US/docs/Web/API/HTMLMediaElement}media element interface}.
{b Warning.} This binding is incomplete, the modules
{!El.Audio_track}, {!El.Video_track}, {!El.Text_track} are mostly
empty. *)
module El : sig
(** {1:prelim Preliminaries} *)
(** Media errors. *)
module Error : sig
(** {1:codes Error codes} *)
type code = int
(** The type for
{{:-US/docs/Web/API/MediaError/code#Value}error code} values. *)
val aborted : code
val network : code
val decode : code
val src_not_supported : code
(** {1:obj Error objects} *)
type t
(** The type for
        {{:-US/docs/Web/API/MediaError}
        [MediaError]} objects. *)
val code : t -> code
(** [code e] is the error
{{:-US/docs/Web/API/MediaError/code}
code}. *)
val message : t -> Jstr.t
(** [message e] is the error {{:-US/docs/Web/API/MediaError/message}message}. *)
end
(** Can play enum. *)
module Can_play : sig
type t = Jstr.t
(** The type for
{{:-US/docs/Web/API/HTMLMediaElement/canPlayType#Return_value}can play} values. *)
val maybe : t
val probably : t
end
(** Ready state codes. *)
module Have : sig
type t = int
(** The type for
        {{:-US/docs/Web/API/HTMLMediaElement/readyState#Value}ready state} values. *)
val nothing : t
val metadata : t
val current_data : t
val future_data : t
val enought_data : t
end
(** Network state codes. *)
module Network : sig
type t = int
(** The type for
{{:-US/docs/Web/API/HTMLMediaElement/networkState#Value}network state} values. *)
val empty : t
val idle : t
val loading : t
val no_source : t
end
(** CORS settings. *)
module Cors : sig
type t = Jstr.t
(** The type for
{{:-US/docs/Web/HTML/Attributes/crossorigin}CORS} values. *)
val anonymous : t
val use_credentials : t
end
(** Media providers. *)
module Provider : sig
type t
(** The type for
{{:#mediaprovider}
[MediaProvider]} objects. *)
val of_media_stream : Stream.t -> t
val of_blob : Blob.t -> t
val of_media_source : Jv.t -> t
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(** Audio tracks (incomplete). *)
module Audio_track : sig
type t
(** The type for
{{:-US/docs/Web/API/AudioTrack}
[AudioTrack]} objects. *)
(** Audio track lists. *)
module List : sig
type t
(** The type for
{{:-US/docs/Web/API/AudioTrackList}
[AudioTrackList]} objects. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(** Video tracks (incomplete). *)
module Video_track : sig
type t
(** The type for
{{:-US/docs/Web/API/VideoTrack}
VideoTrack} objects. *)
module List : sig
type t
(** The type for
{{:-US/docs/Web/API/VideoTrackList}
[VideoTrackList]} objects. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(** Text tracks (incomplete). *)
module Text_track : sig
module Kind : sig
type t = Jstr.t
end
type t
(** The type for
{{:-US/docs/Web/API/TextTrack}
TextTrack} objects. *)
(** Text track lists. *)
module List : sig
type t
(** The type for
{{:-US/docs/Web/API/TextTrackList}
[TextTrackList]} objects. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(** Time ranges. *)
module Time_ranges : sig
type t
(** The type for
{{:-US/docs/Web/API/TimeRanges}
[TimeRange]} objects. *)
val length : t -> int
(** [length r] is the {{:-US/docs/Web/API/TimeRanges/length}number of ranges} of [r]. *)
val start : t -> int -> float
(** [start r i] is the {{:-US/docs/Web/API/TimeRanges/start}start} time of range [i] in [r]. *)
val end' : t -> int -> float
(** [end' r i] is the {{:-US/docs/Web/API/TimeRanges/end}end} time of range [i] in [r]. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(** {1:iface Media interface} *)
type t
(** The type for elements satisfying the
{{:-US/docs/Web/API/HTMLMediaElement}
[HTMLMediaElement]} interface. *)
val of_el : El.t -> t
(** [of_el e] is the media interface of [e]. This throws a JavaScript
    error if [e] is not a {!Brr.El.audio} or {!Brr.El.video} element. *)
val to_el : t -> El.t
(** [to_el m] is [m] as an element. *)
(** {1:error_state Error state} *)
val error : t -> Error.t option
(** [error m] is the most recent
{{:-US/docs/Web/API/HTMLMediaElement/error}error} of [m]. *)
(** {1:network_state Network state} *)
val src : t -> Jstr.t
(** [src m] is the
{{:-US/docs/Web/API/HTMLMediaElement/src}URI source} of the played media. *)
val set_src : t -> Jstr.t -> unit
(** [set_src m s] sets the {!src} of [m] to [s]. *)
val src_object : t -> Provider.t option
(** [src_object m] is the
    {{:-US/docs/Web/API/HTMLMediaElement/srcObject}source object} of [m]. *)
val set_src_object : t -> Provider.t option -> unit
(** [set_src_object m o] sets the {!src_object} of [m] to [o]. *)
val current_src : t -> Jstr.t
(** [current_src m] is the
{{:-US/docs/Web/API/HTMLMediaElement/currentSrc}current source} of [m]. *)
val cross_origin : t -> Cors.t
(** [cross_origin m] is the
{{:-US/docs/Web/API/HTMLMediaElement/crossOrigin}CORS setting} of [m]. *)
val set_cross_origin : t -> Cors.t -> unit
(** [set_cross_origin m c] sets the {!cross_origin} of [m] to [c]. *)
val network_state : t -> Network.t
(** [network_state m] is the
{{:-US/docs/Web/API/HTMLMediaElement/networkState}network state} of [m]. *)
val preload : t -> Jstr.t
(** [preload m] is the preload state of [m]. *)
val set_preload : t -> Jstr.t -> unit
(** [set_preload m p] sets the preload of [m] to [p]. *)
val buffered : t -> Time_ranges.t
(** [buffered m] are the ranges of media that
are {{:-US/docs/Web/API/HTMLMediaElement/buffered}buffered}. *)
val load : t -> unit
(** [load m] restarts
{{:-US/docs/Web/API/HTMLMediaElement/load}loading} [m]. *)
val can_play_type : t -> Jstr.t -> Can_play.t
(** [can_play_type m t] indicates if [m]
{{:-US/docs/Web/API/HTMLMediaElement/canPlayType}can play} [t]. *)
(** {1:ready_state Ready state} *)
val ready_state : t -> Have.t
(** [ready_state m] indicates the
{{:-US/docs/Web/API/HTMLMediaElement/readyState}readiness} of [m]. *)
val seeking : t -> bool
(** [seeking m] indicates [m] is seeking a new position. *)
(** {1:playback_state Playback state} *)
val current_time_s : t -> float
(** [current_time m] is the {{:-US/docs/Web/API/HTMLMediaElement/currentTime}current time} of [m]. *)
val set_current_time_s : t -> float -> unit
(** [set_current_time_s m t] sets the {!current_time_s} of [m] to [t]. *)
val fast_seek_s : t -> float -> unit
(** [fast_seek_s m t]
    {{:-US/docs/Web/API/HTMLMediaElement/fastSeek}seeks} [m] to [t]. *)
val duration_s : t -> float
(** [duration_s m] is the
{{:-US/docs/Web/API/HTMLMediaElement/duration}duration} of [m]. *)
val paused : t -> bool
(** [paused m] indicates whether [m] is
{{:-US/docs/Web/API/HTMLMediaElement/paused}paused}. *)
val default_playback_rate : t -> float
(** [default_playback_rate m] is the
{{:-US/docs/Web/API/HTMLMediaElement/defaultPlaybackRate}default playback rate} of [m]. *)
val set_default_playback_rate : t -> float -> unit
(** [set_default_playback_rate m] sets the {!default_playback_rate}
of [m]. *)
val playback_rate : t -> float
(** [playback_rate m] is the
{{:-US/docs/Web/API/HTMLMediaElement/playbackRate}playback rate} of [m]. *)
val set_playback_rate : t -> float -> unit
(** [set_playback_rate m] sets the {!playback_rate}
of [m]. *)
val played : t -> Time_ranges.t
(** [played m] are the ranges that have been played. *)
val seekable : t -> Time_ranges.t
(** [seekable m] indicates the time ranges that are
{{:-US/docs/Web/API/HTMLMediaElement/seekable}seekable}. *)
val ended : t -> bool
(** [ended m] is [true] if the media has
{{:-US/docs/Web/API/HTMLMediaElement/ended}finished} playing. *)
val autoplay : t -> bool
(** [autoplay m] is the {{:-US/docs/Web/API/HTMLMediaElement/autoplay}autoplay} behaviour of [m]. *)
val set_auto_play : t -> bool -> unit
(** [set_auto_play m b] sets {!autoplay} of [m] to [b]. *)
val loop : t -> bool
(** [loop m] indicates if [m] is set to {{:-US/docs/Web/API/HTMLMediaElement/loop}loop}. *)
val set_loop : t -> bool -> unit
(** [set_loop m b] sets the {!loop} of [m] to [b]. *)
val play : t -> unit Fut.or_error
(** [play m]
{{:-US/docs/Web/API/HTMLMediaElement/play}plays} [m]. *)
val pause : t -> unit
(** [pause m]
{{:-US/docs/Web/API/HTMLMediaElement/pause}pauses} [m]. *)
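(* Example — a sketch of starting playback from client code. [play] returns
   a future because the browser may block unsolicited playback; [Fut.await]
   is assumed from the Brr core libraries:
   {[
     let start_playback m =
       Fut.await (play m) @@ function
       | Ok () -> ()
       | Error _e -> (* playback was blocked: fall back to controls *)
           set_controls m true
   ]} *)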
(** {1:ctrls Controls} *)
val controls : t -> bool
(** [controls m] indicates if media controls are
{{:-US/docs/Web/API/HTMLMediaElement/controls}shown}. *)
val set_controls : t -> bool -> unit
(** [set_controls m b] sets the {!controls} of [m] to [b]. *)
val volume : t -> float
(** [volume m] is the
{{:-US/docs/Web/API/HTMLMediaElement/volume}volume} of [m]. *)
val set_volume : t -> float -> unit
(** [set_volume m b] sets the {!volume} of [m] to [b]. *)
val muted : t -> bool
(** [muted m] indicates whether audio is {{:-US/docs/Web/API/HTMLMediaElement/muted}muted}. *)
val set_muted : t -> bool -> unit
(** [set_muted m b] sets the {!muted} of [m] to [b]. *)
val default_muted : t -> bool
(** [default_muted m] is the {{:-US/docs/Web/API/HTMLMediaElement/defaultMuted}default muted} state. *)
val set_default_muted : t -> bool -> unit
(** [set_default_muted m b] sets the {!default_muted} of [m] to [b]. *)
(** {1:tracks Tracks} *)
val audio_track_list : t -> Audio_track.List.t
(** [audio_track_list m] are the
{{:-US/docs/Web/API/HTMLMediaElement/audioTracks}audio tracks} of [m]. *)
val video_track_list : t -> Video_track.List.t
(** [video_track_list m] are the
{{:-US/docs/Web/API/HTMLMediaElement/videoTracks}video tracks} of [m]. *)
val text_track_list : t -> Text_track.List.t
(** [text_track_list m] are the
{{:-US/docs/Web/API/HTMLMediaElement/textTracks}text tracks} of [m]. *)
val capture_stream : t -> Stream.t
(** [capture_stream m] is a
{{:-US/docs/Web/API/HTMLMediaElement/captureStream}media stream} for [m]. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
end
(** Message events, ports, channels and broadcast channels. *)
module Message : sig
type transfer
(** The type for objects to transfer. *)
val transfer : 'a -> transfer
(** [transfer v] indicates value [v] should be transferred, not just
cloned, meaning it is no longer usable on the sending side. *)
type opts
(** The type for messaging options. *)
val opts : ?target_origin:Jstr.t -> ?transfer:transfer list -> unit -> opts
(** [opts ~target_origin ~transfer ()] are messaging options.
See {{:-US/docs/Web/API/Window/postMessage#Syntax}here} for the semantics of [target_origin] and [transfer]. *)
(** Message ports. *)
module Port : sig
type t
(** The type for
{{:-US/docs/Web/API/MessagePort}
[MessagePort]} objects. *)
external as_target : t -> Ev.target = "%identity"
(** [as_target p] is [p] as an event target. *)
val start : t -> unit
(** [start p]
{{:-US/docs/Web/API/MessagePort/start}
starts} [p]. *)
val close : t -> unit
(** [close p]
{{:-US/docs/Web/API/MessagePort/close}
closes} [p]. *)
val post : ?opts:opts -> t -> 'a -> unit
(** [post ?opts p v]
{{:-US/docs/Web/API/MessagePort/postMessage} posts} value [v] on port [p] with options [opts] (the [target_origin]
option is meaningless in this case). *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(** Message channels.
See the {{:-US/docs/Web/API/Channel_Messaging_API}Channel Messaging API}. *)
module Channel : sig
type t
(** The type for
{{:-US/docs/Web/API/MessageChannel}
[MessageChannel]} objects. *)
val create : unit -> t
(** [create ()] is a new channel. *)
val port1 : t -> Port.t
(** [port1 c] is the {{:-US/docs/Web/API/MessageChannel/port1}first port} of [c]. The port attached to the context
that created the channel. *)
val port2 : t -> Port.t
(** [port2 c] is the {{:-US/docs/Web/API/MessageChannel/port2}second port} of [c]. The port attached to the context
at the other end of the channel. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
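(* Usage sketch (editorial example, not part of the original interface):
   wire the two ports of a channel together. [Brr.Ev.listen] and
   [Brr.Ev.as_type] are assumed from [Brr.Ev]; their exact signatures may
   vary across brr versions.
   {[
     let c = Channel.create () in
     let p1 = Channel.port1 c and p2 = Channel.port2 c in
     let on_msg e =
       Brr.Console.(log [(Ev.data (Brr.Ev.as_type e) : Jstr.t)])
     in
     ignore (Brr.Ev.listen Ev.message on_msg (Port.as_target p2));
     Port.start p2;
     Port.post p1 (Jstr.v "ping")
   ]} *)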
(** Broadcast channels.
    See the
    {{:-US/docs/Web/API/Broadcast_Channel_API}Broadcast Channel API}. *)
module Broadcast_channel : sig
type t
(** The type for
{{:-US/docs/Web/API/BroadcastChannel}
[BroadcastChannel]} objects. *)
val create : Jstr.t -> t
(** [create n] {{:-US/docs/Web/API/BroadcastChannel/BroadcastChannel}creates} a channel named [n]. *)
external as_target : t -> Ev.target = "%identity"
(** [as_target b] is [b] as an event target. *)
val name : t -> Jstr.t
(** [name b] is the {{:-US/docs/Web/API/BroadcastChannel/name}name} of [b]. *)
val close : t -> unit
(** [close b] {{:-US/docs/Web/API/BroadcastChannel/close}closes} [b]. *)
val post : t -> 'a -> unit
(** [post b v]
{{:-US/docs/Web/API/BroadcastChannel/postMessage}sends} [v] to all listeners of {!Brr_io.Message.Ev.message}
on [b]. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
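(* Usage sketch (editorial example, not part of the original interface):
   two channels created with the same name deliver each other's posts.
   [Brr.Ev.listen]'s exact signature may vary across brr versions.
   {[
     let tx = Broadcast_channel.create (Jstr.v "app") in
     let rx = Broadcast_channel.create (Jstr.v "app") in
     let on_msg e =
       Brr.Console.(log [(Ev.data (Brr.Ev.as_type e) : Jstr.t)])
     in
     ignore
       (Brr.Ev.listen Ev.message on_msg (Broadcast_channel.as_target rx));
     Broadcast_channel.post tx (Jstr.v "hello")
   ]} *)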
val window_post : ?opts:opts -> Window.t -> 'a -> unit
(** [window_post w v ~opts]
{{:-US/docs/Web/API/Window/postMessage}
posts} value [v] to window [w] with options [opts]. *)
(** {1:events Events} *)
(** Message events. *)
module Ev : sig
(** {1:obj Message event object} *)
type t
(** The type for
{{:-US/docs/Web/API/MessageEvent}
[MessageEvent]} and
{{:-US/docs/Web/API/ExtendableMessageEvent}[ExtendableMessageEvent]}
objects. *)
val as_extendable : t -> Ev.Extendable.t Ev.t
(** [as_extendable e] is [e] as an extendable event. {b Warning.}
Only for [ExtendableMessageEvent] objects. *)
val data : t -> 'a
(** [data e] is the
{{:-US/docs/Web/API/MessageEvent/data}
data sent} by the emitter. {b Warning.} Unsafe,
make sure to constrain the result value to the right type. *)
val origin : t -> Jstr.t
(** [origin e] is the
{{:-US/docs/Web/API/MessageEvent/origin}
origin} of the message emitter. *)
val last_event_id : t -> Jstr.t
(** [last_event_id e] is a
{{:-US/docs/Web/API/MessageEvent/lastEventId}unique id} for the event. *)
val source : t -> Jv.t option
(** [source e] is the {{:-US/docs/Web/API/MessageEvent/source}message emitter}. *)
val ports : t -> Port.t list
(** [ports e] is a list of {{:-US/docs/Web/API/MessageEvent/ports}ports} associated with the channel the message is being
sent through (if applicable). *)
(** {1:events Events} *)
val message : t Ev.type'
(** [message] is the {{:-US/docs/Web/API/BroadcastChannel/message_event}[message]} event. *)
val messageerror : t Ev.type'
(** [messageerror] is the {{:-US/docs/Web/API/BroadcastChannel/messageerror_event}[messageerror]} event. *)
end
end
(** Notifying users.
See the {{:-US/docs/Web/API/Notifications_API}Notification API}. *)
module Notification : sig
(** {1:perm Permission} *)
(** Permission enum. *)
module Permission : sig
type t = Jstr.t
(** The type for
{{:-US/docs/Web/API/Notification/permission#Return_Value}notification permission} values. *)
val default : t
val denied : t
val granted : t
end
val permission : unit -> Permission.t
(** [permission ()] is the {{:-US/docs/Web/API/Notification/permission}permission} granted by the user. *)
val request_permission : unit -> Permission.t Fut.or_error
(** [request_permission ()] {{:-US/docs/Web/API/Notification/requestPermission}requests} permission to display
notifications. *)
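(* Usage sketch (editorial example, not part of the original interface):
   request permission and show a notification if it was granted. [create]
   and [opts] are defined below in this module.
   {[
     Fut.await (request_permission ()) @@ function
     | Error e -> Brr.Console.(error [e])
     | Ok p when Jstr.equal p Permission.granted ->
         ignore (create ~opts:(opts ~body:(Jstr.v "Hello from Brr") ())
                   (Jstr.v "Greetings"))
     | Ok _ -> ()
   ]} *)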
(** {1:notifications Notifications} *)
(** Direction enum. *)
module Direction : sig
type t = Jstr.t
(** The type for
{{:-US/docs/Web/API/Notification/dir#Value}notification direction} values. *)
val auto : t
val ltr : t
val rtl : t
end
(** Actions. *)
module Action : sig
val max : unit -> int
(** [max ()] is the {{:-US/docs/Web/API/Notification/maxActions}maximum number} of actions supported. *)
type t
(** The type for {{:-US/docs/Web/API/NotificationAction}[NotificationAction]} objects. *)
val v : ?icon:Jstr.t -> action:Jstr.t -> title:Jstr.t -> unit -> t
(** [v ~action ~title ()] is an action with given
{{:-US/docs/Web/API/NotificationAction#Properties}properties}. *)
val action : t -> Jstr.t
(** [action a] is the {{:-US/docs/Web/API/NotificationAction#Properties}action name} of [a]. *)
val title : t -> Jstr.t
(** [title a] is the {{:-US/docs/Web/API/NotificationAction#Properties}title} of [a]. *)
val icon : t -> Jstr.t option
(** [icon a] is the {{:-US/docs/Web/API/NotificationAction#Properties}icon} of [a]. *)
(**/**)
include Jv.CONV with type t := t
(**/**)
end
type opts
(** The type for notification options. *)
val opts :
?dir:Direction.t -> ?lang:Jstr.t -> ?body:Jstr.t -> ?tag:Jstr.t ->
?image:Jstr.t -> ?icon:Jstr.t -> ?badge:Jstr.t -> ?timestamp_ms:int ->
?renotify:bool -> ?silent:bool -> ?require_interaction:bool -> ?data:'a ->
?actions:Action.t list -> unit -> opts
type t
(** The type for
{{:-US/docs/Web/API/Notification}
[Notification]} objects. *)
type notification = t
(** See {!t}. *)
val create : ?opts:opts -> Jstr.t -> t
(** [create title ~opts] is a
{{:-US/docs/Web/API/Notification/Notification}notification}
with title [title] and options [opts]. *)
val close : t -> unit
(** [close n] {{:-US/docs/Web/API/Notification/close}closes} [n]. *)
external as_target : t -> Ev.target = "%identity"
(** [as_target n] is [n] as an event target. *)
(** {1:props Properties} *)
val actions : t -> Action.t list
(** [actions n] are the
{{:-US/docs/Web/API/Notification/actions}
actions} of [n]. *)
val badge : t -> Jstr.t
(** [badge n] is the
{{:-US/docs/Web/API/Notification/badge}
badge} of [n]. *)
val body : t -> Jstr.t
(** [body n] is the
{{:-US/docs/Web/API/Notification/body}
body} of [n]. *)
val data : t -> 'a
(** [data n] is the
{{:-US/docs/Web/API/Notification/data}
data} of [n]. {b Warning.} This is unsafe, constrain the result type. *)
val dir : t -> Direction.t
(** [dir n] is the
{{:-US/docs/Web/API/Notification/dir}
dir} of [n]. *)
val lang : t -> Jstr.t
(** [lang n] is the
{{:-US/docs/Web/API/Notification/lang}
lang} of [n]. *)
val tag : t -> Jstr.t
(** [tag n] is the
{{:-US/docs/Web/API/Notification/tag}
tag} of [n]. *)
val icon : t -> Jstr.t
(** [icon n] is the
{{:-US/docs/Web/API/Notification/icon}
icon} of [n]. *)
val image : t -> Jstr.t
(** [image n] is the
{{:-US/docs/Web/API/Notification/image}
image} of [n]. *)
val renotify : t -> bool
(** [renotify n]
{{:-US/docs/Web/API/Notification/renotify}
indicates} [n] replaces an old notification. *)
val require_interaction : t -> bool
(** [require_interaction n]
{{:-US/docs/Web/API/Notification/requireInteraction} indicates} [n] requires interaction. *)
val silent : t -> bool
(** [silent n]
{{:-US/docs/Web/API/Notification/silent} indicates} [n] should be silent. *)
val timestamp_ms : t -> int
(** [timestamp_ms n] is the {{:-US/docs/Web/API/Notification/timestamp}timestamp} of [n]. *)
val title : t -> Jstr.t
(** [title n] is the
{{:-US/docs/Web/API/Notification/title}
title} of [n]. *)
(** {1:events Events} *)
(** Notification events. *)
module Ev : sig
(** {1:obj Notification event object} *)
type t
(** The type for {{:-US/docs/Web/API/NotificationEvent}[NotificationEvent]} objects. *)
val as_extendable : t -> Ev.Extendable.t Ev.t
(** [as_extendable e] is [e] as an extendable event. *)
val notification : t -> notification
(** [notification e] is the
{{:-US/docs/Web/API/NotificationEvent/notification}notification} of [e]. *)
val action : t -> Jstr.t
(** [action e] is the notification {{:-US/docs/Web/API/NotificationEvent/action}action} clicked. *)
(** {1:evs Notification events} *)
val notificationclick : t Ev.type'
(** [notificationclick] is the
{{:-US/docs/Web/API/ServiceWorkerGlobalScope/notificationclick_event}[notificationclick]} event. *)
val notificationclose : t Ev.type'
(** [notificationclose] is the
{{:-US/docs/Web/API/ServiceWorkerGlobalScope/notificationclose_event}[notificationclose]} event. *)
end
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(** [Storage] objects.
    See the {{:-US/docs/Web/API/Web_Storage_API}
    Web Storage API}. *)
module Storage : sig
type t
(** The type for
{{:-US/docs/Web/API/Storage}[Storage]}
objects. *)
val local : Window.t -> t
(** [local w] is the storage {{:-US/docs/Web/API/Window/localStorage}saved across page sessions} for the
window's {{:#concept-origin}origin}. *)
val session : Window.t -> t
(** [session w] is the storage {{:-US/docs/Web/API/Window/sessionStorage}cleared when the page session} ends for the window's {{:#concept-origin}origin}. *)
val length : t -> int
(** [length s] is the
{{:-US/docs/Web/API/Storage/length}
number of items} in [s]. *)
val key : t -> int -> Jstr.t option
(** [key s i] is the
{{:-US/docs/Web/API/Storage/key}name}
of the [i]th key. (N.B. local storage can race with other tabs) *)
val get_item : t -> Jstr.t -> Jstr.t option
(** [get_item s k] is the
{{:-US/docs/Web/API/Storage/getItem}
value} of [k] in [s]. *)
val set_item : t -> Jstr.t -> Jstr.t -> (unit, Jv.Error.t) result
(** [set_item s k v]
{{:-US/docs/Web/API/Storage/setItem}sets}
the value of [k] to [v] in [s]. An error is returned if the value could
not be set (no permission or quota exceeded). *)
val remove_item : t -> Jstr.t -> unit
(** [remove_item s k]
{{:-US/docs/Web/API/Storage/removeItem}
removes} the value of [k] from [s]. If [k] has no
value this does nothing. *)
val clear : t -> unit
(** [clear s]
{{:-US/docs/Web/API/Storage/clear}
removes} all keys from [s]. *)
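(* Usage sketch (editorial example, not part of the original interface):
   persist and read back a key in the window's local storage. Assumes the
   global window is available as [Brr.G.window] (name may differ across brr
   versions).
   {[
     let s = local Brr.G.window in
     match set_item s (Jstr.v "visits") (Jstr.v "1") with
     | Error e -> Brr.Console.(error [e])
     | Ok () ->
         (match get_item s (Jstr.v "visits") with
         | Some v -> Brr.Console.(log [v])
         | None -> ())
   ]} *)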
(** {1:events Events} *)
(** Storage event. *)
module Ev : sig
(** {1:obj Storage event object} *)
type storage_area = t
(** See {!Brr_io.Storage.t}. *)
type t
(** The type for
{{:-US/docs/Web/API/StorageEvent}
[StorageEvent]} objects. *)
val key : t -> Jstr.t option
(** [key e] is the key of the item being changed. *)
val old_value : t -> Jstr.t option
(** [old_value e] is the old value of the key. *)
val new_value : t -> Jstr.t option
(** [new_value e] is the new value of the key. *)
val url : t -> Jstr.t
(** [url e] is the URL of the document whose storage item changed. *)
val storage_area : t -> storage_area option
(** [storage_area e] is the storage object. *)
(** {1:events Storage event} *)
val storage : t Ev.type'
(** [storage] is the type for [storage] events fired on windows
on storage changes. *)
end
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(** [Websocket] objects.
See {{:-US/docs/Web/API/WebSockets_API}
Web Sockets API}.
{b XXX} Add a bit of future convenience. *)
module Websocket : sig
(** Binary type enum. *)
module Binary_type : sig
type t = Jstr.t
(** The type for
{{:-US/docs/Web/API/WebSocket/binaryType#Value}binary type} values. *)
val blob : t
val arraybuffer : t
end
(** Ready state enum. *)
module Ready_state : sig
type t = int
(** The type for
{{:-US/docs/Web/API/WebSocket/readyState#Value}ready state} values. *)
val connecting : t
val open' : t
val closing : t
val closed : t
end
type t
(** The type for
{{:-US/docs/Web/API/WebSocket}[WebSocket]}
objects. *)
val create : ?protocols:Jstr.t list -> Jstr.t -> t
(** [create ~protocols url]
{{:-US/docs/Web/API/WebSocket/WebSocket}
creates} a new socket connected to [url]. *)
external as_target : t -> Brr.Ev.target = "%identity"
(** [as_target s] is [s] as an event target. *)
val binary_type : t -> Binary_type.t
(** [binary_type s] is the
{{:-US/docs/Web/API/WebSocket/binaryType}
type} of binary data received. *)
val set_binary_type : t -> Binary_type.t -> unit
(** [set_binary_type s t] sets the {!binary_type} of [s] to [t]. *)
val close : ?code:int -> ?reason:Jstr.t -> t -> unit
(** [close s]
{{:-US/docs/Web/API/WebSocket/close}
closes} [s]. *)
(** {1:props Properties} *)
val url : t -> Jstr.t
(** [url s] is the {{:-US/docs/Web/API/WebSocket/url}url} of [s]. *)
val ready_state : t -> Ready_state.t
(** [ready_state s] is the {{:-US/docs/Web/API/WebSocket/readyState}state} of the connection. *)
val buffered_amount : t -> int
(** [buffered_amount s] is the sent {{:-US/docs/Web/API/WebSocket/bufferedAmount}buffered amount} of [s]. *)
val extensions : t -> Jstr.t
(** [extensions s] are the
{{:-US/docs/Web/API/WebSocket/extensions}
extensions} selected by the server. *)
val protocol : t -> Jstr.t
(** [protocol s] is the
{{:-US/docs/Web/API/WebSocket/protocol}
protocol} selected by the server. *)
(** {1:send Sending} *)
val send_string : t -> Jstr.t -> unit
(** [send_string s d]
{{:-US/docs/Web/API/WebSocket/send}
sends} the UTF-8 encoding of [d] on [s]. *)
val send_blob : t -> Blob.t -> unit
(** [send_blob s d]
{{:-US/docs/Web/API/WebSocket/send}
sends} the binary content of [d] on [s]. *)
val send_array_buffer : t -> Tarray.Buffer.t -> unit
(** [send_array_buffer s d]
{{:-US/docs/Web/API/WebSocket/send}
sends} the binary content of [d] on [s]. *)
val send_tarray : t -> ('a, 'b) Tarray.t -> unit
(** [send_tarray s d]
{{:-US/docs/Web/API/WebSocket/send}
sends} the binary content of [d] on [s]. *)
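(* Usage sketch (editorial example, not part of the original interface):
   connect, send a string once the socket is open and log incoming
   messages. The ["open"] event type is created locally with
   [Brr.Ev.Type.void]; [wss://example.org/chat] is a placeholder URL and
   [Brr.Ev.listen]'s exact signature may vary across brr versions.
   {[
     let ws = create (Jstr.v "wss://example.org/chat") in
     let t = as_target ws in
     let open_ev = Brr.Ev.Type.void (Jstr.v "open") in
     ignore (Brr.Ev.listen open_ev
               (fun _ -> send_string ws (Jstr.v "hello")) t);
     ignore (Brr.Ev.listen Message.Ev.message
               (fun e ->
                  Brr.Console.(log
                    [(Message.Ev.data (Brr.Ev.as_type e) : Jstr.t)]))
               t)
   ]} *)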
(** {1:events Events} *)
(** Websocket events. *)
module Ev : sig
(** Close events. *)
module Close : sig
type t
(** The type for
{{:-US/docs/Web/API/CloseEvent}
[CloseEvent]} objects. *)
val was_clean : t -> bool
(** [was_clean e] is [true] if closure was {{:-US/docs/Web/API/CloseEvent#Properties}clean}. *)
val code : t -> int
(** [code e] is the {{:-US/docs/Web/API/CloseEvent#Properties}close} code sent by the server. *)
val reason : t -> Jstr.t
(** [reason e] is the closure {{:-US/docs/Web/API/CloseEvent#Properties}reason}. *)
end
val close : Close.t Ev.type'
(** [close] is the {{:-US/docs/Web/API/WebSocket/close_event}[close]} event. *)
end
(**/**)
include Jv.CONV with type t := t
(**/**)
end
(*---------------------------------------------------------------------------
   Copyright (c) 2020 The brr programmers

   Permission to use, copy, modify, and/or distribute this software for any
   purpose with or without fee is hereby granted, provided that the above
   copyright notice and this permission notice appear in all copies.

   THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
   WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
   MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
   ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
   WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
   ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
   OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
  ---------------------------------------------------------------------------*)
| null | https://raw.githubusercontent.com/dbuenzli/brr/b074832ae363ff9f68d52a2db31b3a83a94ee445/src/brr_io.mli | ocaml | * Clipboard, Form, Fetch, Geolocation, Media and Storage APIs.
* Clipboard access
See the {{:-US/docs/Web/API/Clipboard}
Clipboard API}.
* Clipboard items.
* Presentation style enum.
* The type for
{{:-apis/#enumdef-presentationstyle}
presentation} style values.
* The type for
{{:-apis/#dictdef-clipboarditemoptions}
[ClipboardItemOptions]}.
* [opts ~presentation_style ()] are options for clipboard item
objects.
* The type for {{:-US/docs/Web/API/ClipboardItem}[ClipboardItem]} objects.
* [last_modified_ms i] is the
{{:-apis/#dom-clipboarditem-lastmodified}
last modified time} in ms from the epoch of [i].
* [delayed i] is the
{{:-apis/#dom-clipboarditem-delayed}delayed} property of [i].
* [types i] is the array of MIME types {{:-US/docs/Web/API/ClipboardItem/types}available} for [i].
*/*
*/*
* The type for {{:-US/docs/Web/API/Clipboard}[Clipboard]} objects.
* [of_navigator n] is a clipboard object for
{{:-US/docs/Web/API/Navigator/clipboard}navigator} [n].
* [as_target c] is [c] as an event target.
* [read c] is the {{:-US/docs/Web/API/Clipboard/read}content} of [c].
* [read_text c] is the clipboard {{:-US/docs/Web/API/Clipboard/readText}textual content} of [c].
* [write c is]
{{:-US/docs/Web/API/Clipboard/write}
writes} the items [is] to [c].
* [write_text c s]
{{:-US/docs/Web/API/Clipboard/writeText}
writes} the string [s] to [c].
*/*
*/*
* Form elements and form data.
* [to_el f] is [f] as an an element.
* [name f] is the {{:-US/docs/Web/API/HTMLFormElement/name}name} of [f].
* [method' f] is the {{:-US/docs/Web/API/HTMLFormElement/method}method} of [f].
* [target f] is the {{:-US/docs/Web/API/HTMLFormElement/target}target} of [f].
* [action f] is the {{:-US/docs/Web/API/HTMLFormElement/action}action} of [f].
* [enctype f] is the {{:-US/docs/Web/API/HTMLFormElement/enctype}enctype} of [f].
* [accept_charset f] is the {{:-US/docs/Web/API/HTMLFormElement/acceptCharset}charset accepted} by [f].
* [autocomplete f] refelects the value of the {{:-US/docs/Web/HTML/Element/form#attr-autocomplete}autocomplete} attribute
of [f].
* [check_validity f] is [true] if the form's children controls
all satisfy their
{{:-US/docs/Web/Guide/HTML/HTML5/Constraint_validation}validation constraints}.
* [report_validity f] is like {!check_validity} but also
{{:-US/docs/Web/API/HTMLFormElement/reportValidity}reports} problems to the user.
* [request_submist f el] requests the form to be
{{:-US/docs/Web/API/HTMLFormElement/requestSubmit}submited} using button [el] or the form itself if unspecified.
* [reset f]
{{:-US/docs/Web/API/HTMLFormElement/reset}
resets} the form.
* [submit f] {{:-US/docs/Web/API/HTMLFormElement/submit}submits} the form.
* Form data.
* See {!Brr_io.Form.t}.
* The type for form data entry values.
* The type for
{{:-US/docs/Web/API/FormData}FormData}
objects.
* [create ()] is new, empty, form data.
* [of_form f] is a form data from the
{{:-US/docs/Web/API/FormData/FormData#Parameters}current key-values} of form [f].
* [is_empty d] is [true] if [d] has no entries.
* [has_file_entry d] is [true] iff [d] has a file entry.
* [mem d k] is [true] if [d]
{{:-US/docs/Web/API/FormData/has}has}
key [k].
* [find_all d k] are all the values associated to [k] in [d].
* [fold f d acc] folds over all key/value entries in [d] with [f]
starting with [k].
* [set d k v]
{{:-US/docs/Web/API/FormData/set}
sets} the value of [k] to [v] in [d].
* [set d k b ~filename]
{{:-US/docs/Web/API/FormData/set}
sets} the value of [k] to [b] in [d]. [filename] can
specify the filename of [b].
* [append d k v]
{{:-US/docs/Web/API/FormData/append}
appends} value [v] to the value of [k] in [d].
* [append d k b ~filename]
{{:-US/docs/Web/API/FormData/append}
appends} blob [b] to the value of [k] in [d]. [filename] can
specify the filename of [b].
* [to_assoc l] is the form data as an association list.
* [of_uri_params p] is a form data for [p].
* [to_uri_params t] is the form data as URI query parameters.
{b Note.} If your form has file inputs this will map their keys
to something like ["[Object File]"], {!has_file_entry} indicates
whether the form data has a file entry.
*/*
*/*
* Form events
* Form data events
* The type for
{{:-US/docs/Web/API/FormDataEvent}
[FormDataEvent]} objects.
* [form_data e] is the
{{:-US/docs/Web/API/FormDataEvent/formData}form data} when the event was fired.
* [formadata] is the type for {{:-US/docs/Web/API/HTMLFormElement/formdata_event}[formdata]} event.
* Submit events
* [submitter e] is
{{:-US/docs/Web/API/SubmitEvent}
the element} which triggered the submission.
* [submit] is the type for
{{:-US/docs/Web/API/HTMLFormElement/submit_event}submit} events.
*/*
*/*
* Fetching resources.
See the {{:-US/docs/Web/API/Fetch_API}
Fetch API}.
* Body specification and interface.
* The type for specifying bodies.
* [of_jstr s] is a body from string [s].
* [of_form_data d] is a body from form data [d].
* [of_blob b] is a body from blob [b].
* [of_array_buffer b] is a body from array buffer [b].
* The type for objects implementing the
{{:-US/docs/Web/API/Body}[Body]}
interface.
* [body_used b] indicates
{{:-US/docs/Web/API/Body/bodyUsed}
indicates} if [b] was used.
* [body b] is [b] as a
{{:-US/docs/Web/API/Body/body}
stream}.
* [array_buffer b]
{{:-US/docs/Web/API/Body/arrayBuffer}
reads} [b] into an array buffer.
* [form_data b]
{{:-US/docs/Web/API/Body/formData}
reads} [b] as form data.
* [json b]
{{:-US/docs/Web/API/Body/json}
reads} [b] and parses it as JSON data.
*/*
*/*
* Request and response headers.
{b Warning.} We left out mutable operations out of the interface
but remember these objects may mutate under your feet.
* The type for
{{:-US/docs/Web/API/Headers}[Headers]}
objects.
* [mem h hs] is [true] iff header [h] has a value in [hs].
The lookup is case insensitive.
* [find h hs] is the value of header [h] in [hs] (if any).
The lookup is case insensitive.
* [fold f hs acc] folds the headers [h] of [hs] and their value
[v] with [f h v] starting with [acc]. It's unclear but
header names are likely lowercased.
* [of_obj o] uses the keys and values of object [o] to define
headers and their value.
* [of_assoc ~init assoc] are the headers from [init] (default si
empty) to which the header value pairs of [assoc] are
appended. If a header is defined more than once this either
overwrites the previous definition, or appends to the value if
if the value can be multi-valued.
* [to_assoc hs] are the headres [hs] as an assoc list.
It's unclear but header names are likely lowercased.
*/*
*/*
* Resource requests.
* Request cache mode enum.
* The type for {{:-US/docs/Web/API/Request/cache#Value}[RequestCache]} values.
* Request credentials mode enum.
* The type for
{{:-US/docs/Web/API/Request/credentials#Value}[RequestCredentials]} values.
* Request destination enum.
* The type for
{{:-US/docs/Web/API/RequestDestination}[RequestDestination]} values.
* Request mode enum.
* The type for
{{:-US/docs/Web/API/Request/mode#Value}
[RequestMode]} values.
* Request redirect enum.
* The type for request initialisation objects.
* [init ()] is a request initialisation object with given
{{:-US/docs/Web/API/Request/Request#Parameters}parameters}.
* The type for {{:-US/docs/Web/API/Request}[Request]} objects.
* [v ~init uri] is a request on [uri] with parameters [init].
* [of_request ~init r] is a copy of [r] updated by [init].
* [as_body r] is the {!Body} interface of [r].
* [credentials r] are the
{{:-US/docs/Web/API/Request/credentials}
credentials} of [r].
* [destination r] is the
{{:-US/docs/Web/API/Request/destination}
destination} of [r].
* [headers r] are the
{{:-US/docs/Web/API/Request/headers}
headers} of [r].
* [integrity r] is the
{{:-US/docs/Web/API/Request/integrity}
integrity} of [r].
* [is_history_navigation r] is the
{{:-US/docs/Web/API/Request/isHistoryNavigation}
[isHistoryNavigation]} property of [r].
* [is_reload_navigation r] is the
{{:-US/docs/Web/API/Request/isReloadNavigation}
[isReloadNavigation]} property of [r].
* [method' r] is the
{{:-US/docs/Web/API/Request/method}
method} of [r].
* [mode r] is the
{{:-US/docs/Web/API/Request/mode}
mode} of [r].
* [redirect r] is the
{{:-US/docs/Web/API/Request/redirect}
redirect} behaviour of [r].
* [referrer r] is the
{{:-US/docs/Web/API/Request/referrer}
referrer} of [r].
* [signal r] is the
{{:-US/docs/Web/API/Request/signal}
abort signal} of [r].
* [url r] is the
{{:-US/docs/Web/API/Request/url}
url} of [r].
*/*
*/*
* Request responses.
* Response type enum.
* The type for {{:-US/docs/Web/API/Response/type#Value}[ResponseType]} values.
* The type for response initialisation objects.
* [init ()] is a response initialisation object with given
{{:-US/docs/Web/API/Response/Response#Parameters}parameters}.
* The type for {{:-US/docs/Web/API/Response}[Response]} objects.
* [v ~init ~body] is a response with parameters [init] and body
[body].
* [of_response r] is a copy of [r].
* [error] is a
{{:-US/docs/Web/API/Response/error}
network error response}.
* [redirect ~status url] is a
{{:-US/docs/Web/API/Response/redirect}
redirect response} to [url] with status [status].
* [as_body r] is the {{!Body}body interface} of [r].
* [headers r] are the
{{:-US/docs/Web/API/Response/headers}
headers} of [r].
* [redirected r] is [true] if the reponse is the result of a
{{:-US/docs/Web/API/Response/redirected}
redirection}.
* [status r] is the
{{:-US/docs/Web/API/Response/status}
status} of [r].
* [url r] is the
{{:-US/docs/Web/API/Response/url}
[url]} of [r].
*/*
*/*
* Fetch caches.
* The type for query options.
* [query_opts ~ignore_search ~ignore_method ~ignore_vary ~cache_name ()]
are query options with given {{:-US/docs/Web/API/CacheStorage/match#Parameters}parameters}.
* The type for
{{:-US/docs/Web/API/Cache}Cache}
objects.
* [match' c req] is a {{:-US/docs/Web/API/Cache/match}stored response} for [req] in [c] (if any).
* [match_all c req] is a list {{:-US/docs/Web/API/Cache/matchAll}stored response} for [req] in [c].
* [add c req] fetches [req] and
{{:-US/docs/Web/API/Cache/add}adds}
the response to [c].
* [add_all c reqs] fetches [reqs] and
{{:-US/docs/Web/API/Cache/addAll}adds}
their reponses to [c].
* [put c req resp]
{{:-US/docs/Web/API/Cache/put}puts}
the [req]/[resp] pair to the cache.
* [delete c req] {{:-US/docs/Web/API/Cache/delete}deletes} response to [req] from the cache. [false]
is returned if [req] was not in the cache.
* [keys c] are the {{:-US/docs/Web/API/Cache/keys}requests} cached by [c].
* Cache storage objects.
* See {!t}.
* [match' s req] is a {{:-US/docs/Web/API/CacheStorage/match}stored response} for [req] in [s] (if any).
* [has s n] is [true] if [n] matches a {{:-US/docs/Web/API/CacheStorage/has}cache name} in [s].
* [open' s n] {{:-US/docs/Web/API/CacheStorage/open}opens} the cache named [n] of [s].
* [delete s n] {{:-US/docs/Web/API/CacheStorage/delete}deletes} the cache named [n] from [s]. [false] is returned
if [n] did not exist.
* [keys s] are the {{:-US/docs/Web/API/CacheStorage/keys}cache names} in [s].
*/*
*/*
*/*
*/*
* Fetch events.
* The type for
{{:-US/docs/Web/API/FetchEvent}
[FetchEvent]} objects.
* [fetch] is the [fetch] event type.
* [as_extendable e] is [e] as an extendable event.
* [request e] is the
{{:-US/docs/Web/API/FetchEvent/request}
request} being fetched.
* [preload_response e] is a navigation response {{:-US/docs/Web/API/FetchEvent/preloadResponse}preload} (if any).
* [client_id e] is the {{:-US/docs/Web/API/FetchEvent/clientId}client id} of [e].
* [resulting_client_id e] is the {{:-US/docs/Web/API/FetchEvent/resultingClientId}resulting} client id.
* [replaces_client_id e] is the client id being
{{:-US/docs/Web/API/FetchEvent/replacesClientId}replaced}.
* [handled e] is obscure.
* [respond_with e resp] replace the browser's default fetch handling
with the {{:-US/docs/Web/API/FetchEvent/respondWith}
response} [resp].
* [url ~init u] {{:-US/docs/Web/API/WindowOrWorkerGlobalScope/fetch}fetches} URL [u] with the [init] request object.
* [request r] {{:-US/docs/Web/API/WindowOrWorkerGlobalScope/fetch}fetches} request [r].
* [caches] is the global
{{:-US/docs/Web/API/WindowOrWorkerGlobalScope/caches}[caches]} object.
* Access to device location.
See {{:-US/docs/Web/API/Geolocation_API}
Geolocation API}.
* Position errors.
* The type for {{:-US/docs/Web/API/GeolocationPositionError/code#Value}error code} values.
* [code e] is the
{{:-US/docs/Web/API/GeolocationPositionError/code}error code} of [e].
* [message e] is a
{{:-US/docs/Web/API/GeolocationPositionError/message}human readable} error message. For programmers, not for end
users.
*/*
*/*
* Positions.
* The type for {{:-US/docs/Web/API/GeolocationPosition}[GeolocationPosition]} objects (and their
{{:-US/docs/Web/API/GeolocationCoordinates}[GeolocationCoordinates]} member).
* [latitude p] is the {{:-US/docs/Web/API/GeolocationCoordinates/latitude}latitude} in decimal degrees.
* [longitude p] is the {{:-US/docs/Web/API/GeolocationCoordinates/longitude}longitude} in decimal degrees.
* [accuracy p] is the {{:-US/docs/Web/API/GeolocationCoordinates/accuracy}accuracy}, in meters, of the {!latitude}
and {!longitude} in meters.
* [altitude p] is the {{:-US/docs/Web/API/GeolocationCoordinates/altitude}altitude} in meters relative to sea level.
* [altitude_accuracy p] is the {{:-US/docs/Web/API/GeolocationCoordinates/altitudeAccuracy}altitude accuracy}, in meters,
of the {!altitude}.
* [speed p] is the device {{:-US/docs/Web/API/GeolocationCoordinates/speed}velocity} in meters per seconds.
* [timestamp_ms p] is the
{{:-US/docs/Web/API/GeolocationPosition/timestamp}time} of measurement in [ms] since
the epoch.
*/*
*/*
* The type for geolocalisation options.
* [opts ~high_accuracy ~maximum_age_ms ~timeout_ms ()] are geolocalisation
{{:-US/docs/Web/API/PositionOptions#Properties}options}.
* The type for device {{:-US/docs/Web/API/Geolocation}[Geolocation]} objects.
* [of_navigator n] is a device geolocalisation object for
{{:-US/docs/Web/API/Navigator/geolocation}navigator} [n].
* [get l ~opts] is the position of [l]
{{:-US/docs/Web/API/Geolocation/getCurrentPosition}determined}
with options [opts].
* The type for watcher identifiers.
* [watch l ~opts f] {{:-US/docs/Web/API/Geolocation/watchPosition}monitors} the position of [l] determined with [opts] by
periodically calling [f]. Stop watching by calling {!unwatch} with
the returned identifier.
* [unwatch l id] {{:-US/docs/Web/API/Geolocation/clearWatch}unwatches} [id] as returned by a previous call to {!watch}.
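A one-shot position lookup can look like this; the [Pos] and [Error] submodule paths are assumptions based on the accessors listed above, and [get] is called with default options:

```ocaml
(* Log the device position once. *)
open Brr
open Brr_io

let log_position () =
  let l = Geolocation.of_navigator G.navigator in
  Fut.await (Geolocation.get l) @@ function
  | Error e -> Console.(error [str (Geolocation.Error.message e)])
  | Ok p ->
      Console.(log [str "lat:"; str (Geolocation.Pos.latitude p);
                    str "lon:"; str (Geolocation.Pos.longitude p)])
```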
* Media objects properties, capabilities and constraints.
* [bool] constraints.
* The type for [bool] constraints.
* [int] ranges and constraints.
* The type for integer ranges.
* The type for integer range constraints.
* [float] ranges and constraints.
* The type for float ranges.
* The type for float range constraints.
* The type for [bool] constraints.
* The type for properties of type ['a] whose capabilities
are described by ['b] and which are constrained by ['c].
* The type for boolean properties.
* [bool n] is a bool property named [n].
* The type for integer properties.
* [int n] is an integer property named [n].
* The type for floating point properties.
* [float n] is a float property named [n].
* The type for string properties.
* [jstr n] is a string property named [n].
* The type for string enumeration properties.
* [jstr_enum n] is a string enumeration property named [n].
* [name p] is the name of the property.
* [cap_of_jv p jv] is the property capability of [p] from [jv].
* [cap_jv p v] is the capability value of [p] for [v].
* [constr_of_jv p jv] is the property constraint of [p] from [jv].
* [constr_jv p v] is the constraint value of [p] for [v].
* Supported property constraints.
Indicates the media properties constraints the user agent
understands.
* The type for supported constraints.
* [supports p n] is true if property [p] can be constrained.
* [supported s] is the list of supported constraints.
* Property constraints specifications.
* The type for constraints.
* [empty ()] is an empty set of constraints.
* [set p v c] sets the constraint for [p] to [v] in [c].
* [delete p c] deletes the constraint for [p] from [c].
* Property capability specifications.
* The type for capabilities.
* [set p v c] sets the capability of [p] to [v] in [c].
* [delete p c] deletes the capability of [p] from [c].
* Property values.
* The type for settings.
* [get p s] is the value of [p] in [s].
* Media stream tracks.
* Track state enumeration.
* Track kind enumeration.
* The type for
{{:-US/docs/Web/API/MediaStreamTrack/kind#Value}track kind} values.
* Track properties
* The {{:-US/docs/Web/API/MediaTrackSettings/aspectRatio}[aspectRatio]} property.
* The {{:-US/docs/Web/API/MediaTrackSettings/autoGainControl}[autoGainControl]} property.
* The {{:-US/docs/Web/API/MediaTrackSettings/channelCount}[channelCount]} property.
* The {{:-US/docs/Web/API/MediaTrackSettings/cursor}[cursor]} property.
* The {{:-US/docs/Web/API/MediaTrackSettings/deviceId}[deviceId]} property.
* The {{:-US/docs/Web/API/MediaTrackSettings/displaySurface}[displaySurface]} property.
* The {{:-US/docs/Web/API/MediaTrackSettings/echoCancellation}[echoCancellation]} property.
* The {{:-US/docs/Web/API/MediaTrackSettings/facingMode}[facingMode]} property.
* The {{:-US/docs/Web/API/MediaTrackSettings/frameRate}[frameRate]} property.
* The {{:-US/docs/Web/API/MediaTrackSettings/groupId}[groupId]} property.
* The {{:-US/docs/Web/API/MediaTrackSettings/height}[height]} property.
* The {{:-US/docs/Web/API/MediaTrackSettings/latency}[latency]} property.
* The {{:-US/docs/Web/API/MediaTrackSettings/logicalSurface}[logicalSurface]} property.
* The {{:-US/docs/Web/API/MediaTrackSettings/noiseSuppression}[noiseSuppression]} property.
* The {{:-US/docs/Web/API/MediaTrackSettings/resizeMode}[resizeMode]} property.
* The {{:-US/docs/Web/API/MediaTrackSettings/sampleRate}[sampleRate]} property.
* The {{:-US/docs/Web/API/MediaTrackSettings/sampleSize}[sampleSize]} property.
* The {{:-US/docs/Web/API/MediaTrackSettings/width}[width]} property.
* The type for {{:-US/docs/Web/API/MediaStreamTrack}[MediaStreamTrack]} objects.
* [as_target t] is [t] as an event target.
* [id t] is the {{:-US/docs/Web/API/MediaStreamTrack/id}unique identifier} of [t].
* [isolated t] is the
{{:-identity/#dfn-isolated}isolation status}
of [t].
* [label t] is the
{{:-US/docs/Web/API/MediaStreamTrack/label}label} of [t].
* [muted t] is [true] if [t] is
{{:-US/docs/Web/API/MediaStreamTrack/muted}muted}. Use {!set_enabled} to manually mute and unmute a track. Use events
{!Ev.mute} and {!Ev.unmute} to monitor mute status.
* [ready_state t] is the
{{:-US/docs/Web/API/MediaStreamTrack/readyState}status} of the track. Use event {!Ev.ended} to monitor ready state.
* [enabled t] is [true] if the track is {{:-US/docs/Web/API/MediaStreamTrack/enabled}allowed} to render the source
and [false] if it's not. Use {!set_enabled} to control this.
* [set_enabled t b] sets the track {!enabled} status to [b].
If the track has been disconnected this has no effect.
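[set_enabled] is typically what backs a mute button; a sketch over a stream's audio tracks (the [Media.Track] and [Media.Stream] module paths are assumptions):

```ocaml
(* Enable or disable every audio track of stream [s]. *)
open Brr_io

let set_audio_enabled s b =
  List.iter (fun t -> Media.Track.set_enabled t b)
    (Media.Stream.get_audio_tracks s)
```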
* [get_capabilities t] are the {{:-US/docs/Web/API/MediaStreamTrack/getCapabilities}capabilities} of [t].
* [get_constraints t] are the {{:-US/docs/Web/API/MediaTrackConstraints}constraints} of [t].
* [apply_constraints t]
{{:-US/docs/Web/API/MediaStreamTrack/applyConstraints}applies}
the given constraints. Constraints left unspecified are restored to
their default value. If no constraints are given all
constraints are restored to their defaults.
* [get_settings t] are the {{:-US/docs/Web/API/MediaTrackSettings}settings} of [t].
* [stop t] {{:-US/docs/Web/API/MediaStreamTrack/stop}stops} the track.
* [clone t] creates a {{:-US/docs/Web/API/MediaStreamTrack/clone}copy} of [t] equal to it except for its {!id}.
* Track events.
* The type for
{{:-US/docs/Web/API/MediaStreamTrackEvent} [MediaStreamTrackEvent]} objects.
* [track e] is the track object associated to the event.
* [ended] is the {{:-US/docs/Web/API/MediaStreamTrack/ended_event}ended} event.
* [mute] is the
{{:-US/docs/Web/API/MediaStreamTrack/mute_event}[mute]} event.
* [unmute] is the {{:-US/docs/Web/API/MediaStreamTrack/unmute_event}unmute} event.
* Media streams.
* Media stream constraints.
* The type for
{{:-US/docs/Web/API/MediaStreamConstraints}[MediaStreamConstraints]}
objects.
* The type for specifying track constraints.
* [v ~audio ~video ()] are stream constraints with
given arguments. If unspecified they default to [`No].
* [av] says [`Yes None] to audio and video.
* The type for
{{:-US/docs/Web/API/MediaStream}[MediaStream]} objects.
* [create ()] is a stream without tracks.
* [of_stream s] is a new stream which shares its tracks with [s].
* [of_tracks ts] is a stream with tracks [ts].
* [as_target s] is [s] as an event target.
* [id s] is a {{:-US/docs/Web/API/MediaStream/id}unique identifier} for [s].
* [active s] is [true] if [s] is {{:-US/docs/Web/API/MediaStream/active}active}.
* [get_audio_tracks s] is the list of
{{:-US/docs/Web/API/MediaStream/getAudioTracks}audio tracks} of [s].
* [get_video_tracks s] is the list of
{{:-US/docs/Web/API/MediaStream/getVideoTracks}video tracks} of [s].
* [get_tracks s] is the list of
{{:-US/docs/Web/API/MediaStream/getTracks}tracks} of [s].
* [get_track_by_id s id]
{{:-US/docs/Web/API/MediaStream/getTrackById}finds} the track identified by [id] (if any).
* [add_track s t] {{:-US/docs/Web/API/MediaStream/addTrack}adds} track [t] to [s]. If [t] was already in [s]
nothing happens.
* [remove_track s t] {{:-US/docs/Web/API/MediaStream/removeTrack}removes} track [t] from [s]. If [t] was not in [s]
nothing happens.
* [clone s] {{:-US/docs/Web/API/MediaStream/clone}clones} the tracks of [s] and [s] itself. It has the same
parameters except for [id].
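These accessors make it easy to recombine tracks, for instance pairing the video of one stream with the audio of another (module paths are assumptions):

```ocaml
(* A stream with [v]'s video tracks and [a]'s audio tracks. *)
open Brr_io

let mix ~v ~a =
  Media.Stream.of_tracks
    (Media.Stream.get_video_tracks v @ Media.Stream.get_audio_tracks a)
```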
* Stream events
* [addtrack] is the {{:-US/docs/Web/API/MediaStream/onaddtrack}[addtrack]} event.
* [removetrack] is the {{:-US/docs/Web/API/MediaStream/onremovetrack}[removetrack]} event.
* Media recorder.
See the {{:-record/}
MediaStream Recording} API.
* Bitrate mode enumeration.
* The type for {{:-record/#bitratemode}[BitrateMode]} values.
* Recording state enumeration.
* The type for {{:-record/#recordingstate}[RecordingState]} values.
* [is_type_supported t] is [true] if recording to MIME type
[t] is {{:-US/docs/Web/API/MediaRecorder/isTypeSupported}supported}.
* The type for initialisation objects.
* [init ()] is a media recorder initialisation object with given
{{:-US/docs/Web/API/MediaRecorder/MediaRecorder#Parameters}parameters}.
* The type for
{{:-US/docs/Web/API/MediaRecorder}[MediaRecorder]} objects.
* [stream r] is the stream being recorded.
* [type' r] is the stream's MIME type.
* [state r] is the
{{:-US/docs/Web/API/MediaRecorder/state}recording state} of [r].
* [video_bps r] is the {{:-US/docs/Web/API/MediaRecorder/videoBitsPerSecond}video encoding bit rate} of [r].
* [audio_bps r] is the {{:-US/docs/Web/API/MediaRecorder/audioBitsPerSecond}audio encoding bit rate} of [r].
* [audio_bitrate_mode r] is the {{:-record/#dom-mediarecorder-audiobitratemode}audio encoding mode} of [r].
* [start r ~timeslice_ms]
{{:-US/docs/Web/API/MediaRecorder/start}starts} [r]. [timeslice_ms] indicates the number of milliseconds to record in
each blob. If not specified the whole duration is in a single blob,
unless {!request_data} is invoked to drive the process.
* [stop r] {{:-US/docs/Web/API/MediaRecorder/stop}stops} [r].
* [resume r] {{:-US/docs/Web/API/MediaRecorder/resume}resumes} [r].
* [request_data r] {{:-US/docs/Web/API/MediaRecorder/requestData}requests} the data of [r].
* The type for
{{:-record/#blobevent-section}
[BlobEvent]} objects.
* [data e] is the requested data as a blob object.
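A recording loop collects such blobs from [dataavailable] events. In this sketch the constructor name [create], the [Blob_ev] path and [as_target] are assumptions; [start ~timeslice_ms] follows the description above:

```ocaml
(* Record stream [s] into one-second blob chunks. *)
open Brr
open Brr_io

let record s =
  let r = Media.Recorder.create s in
  let chunks = ref [] in
  let on_data e =
    chunks := Media.Recorder.Blob_ev.data (Ev.as_type e) :: !chunks
  in
  ignore (Ev.listen Media.Recorder.Ev.dataavailable on_data
            (Media.Recorder.as_target r));
  Media.Recorder.start r ~timeslice_ms:1000;
  (r, chunks)
```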
* Recorder errors.
* The type for
{{:-US/docs/Web/API/MediaRecorderErrorEvent} [MediaRecorderErrorEvent]} objects.
* [error e] is the event's {{:-US/docs/Web/API/MediaRecorderErrorEvent/error}error}.
* [start] is the recorder {{:-US/docs/Web/API/MediaRecorder/onstart}[start]} event.
* [stop] is the recorder {{:-US/docs/Web/API/MediaRecorder/onstop}[stop]} event.
* [dataavailable] is the recorder {{:-US/docs/Web/API/MediaRecorder/ondataavailable}[dataavailable]} event.
* [pause] is the recorder {{:-US/docs/Web/API/MediaRecorder/onpause}[pause]} event.
* [resume] is the recorder {{:-US/docs/Web/API/MediaRecorder/onresume}[resume]} event.
* [error] is the recorder {{:-US/docs/Web/API/MediaRecorder/onerror}[error]} event.
* Device kinds and information.
* Device kind enumeration.
* The type for
{{:-main/#dom-mediadevicekind}
[MediaDeviceKind]} values.
* Device information.
* The type for {{:-US/docs/Web/API/MediaDevices}[MediaDeviceInfo]} objects.
* [device_id d] is the identifier of the device.
* [kind d] is the kind of device.
* [label d] is a label describing the device.
* [to_json d] is [d] as JSON data.
* Media device enumeration.
* The type for {{:-US/docs/Web/API/MediaDevices}[MediaDevices]} objects.
* [of_navigator n] provides access to media devices of [n].
* [as_target m] is [m] as an event target.
* [get_supported_constraints m]
{{:-US/docs/Web/API/MediaDevices/getSupportedConstraints}determines}
the media constraints the user agent understands.
* [get_display_media m c]
{{:-US/docs/Web/API/MediaDevices/getDisplayMedia}prompts} the user to select and grant permission to capture the
contents of a display as a media stream. A video
track is unconditionally returned even if [c] says otherwise.
In some browsers this call has to be done in a user interface
event handler.
See this
{{:-US/docs/Web/API/Screen_Capture_API/Using_Screen_Capture}MDN article} for more details.
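A sketch of a screen-capture request; the [Media.Devices] and [Media.Stream.Constraints] paths are assumptions based on the signatures described here:

```ocaml
(* Ask the user for a display surface and log the stream id. *)
open Brr
open Brr_io

let capture_display () =
  let md = Media.Devices.of_navigator G.navigator in
  Fut.await (Media.Devices.get_display_media md Media.Stream.Constraints.av) @@
  function
  | Ok s -> Console.(log [str (Media.Stream.id s)])
  | Error e -> Console.(error [str (Jv.Error.message e)])
```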
* Device events.
* [devicechange] is the {{:-US/docs/Web/API/MediaDevices/devicechange_event}[devicechange]} event. Monitors
media device additions and removals on [MediaDevice] objects.
* The HTML {{:-US/docs/Web/API/HTMLMediaElement}media element interface}.
{b Warning.} This binding is incomplete, the modules
{!El.Audio_track}, {!El.Video_track}, {!El.Text_track} are mostly
empty.
* Media errors
* The type for
{{:-US/docs/Web/API/MediaError/code#Value}error code} values.
* [code e] is the error
{{:-US/docs/Web/API/MediaError/code}
code}.
* [message e] is the error {{:-US/docs/Web/API/MediaError/message}message}.
* Can play enum.
* The type for
{{:-US/docs/Web/API/HTMLMediaElement/canPlayType#Return_value}can play} values.
* Ready state codes.
* Network state codes.
* The type for
{{:-US/docs/Web/API/HTMLMediaElement/networkState#Value}network state} values.
* CORS settings
* The type for
{{:-US/docs/Web/HTML/Attributes/crossorigin}CORS} values.
* Media providers.
* The type for
{{:#mediaprovider}
[MediaProvider]} objects.
* Audio tracks (incomplete).
* The type for
{{:-US/docs/Web/API/AudioTrack}
[AudioTrack]} objects.
* Audio track lists.
* The type for
{{:-US/docs/Web/API/AudioTrackList}
[AudioTrackList]} objects.
* Video tracks (incomplete).
* The type for
{{:-US/docs/Web/API/VideoTrack}
VideoTrack} objects.
* The type for
{{:-US/docs/Web/API/VideoTrackList}
[VideoTrackList]} objects.
* Text tracks (incomplete).
* The type for
{{:-US/docs/Web/API/TextTrack}
TextTrack} objects.
* Text track lists.
* The type for
{{:-US/docs/Web/API/TextTrackList}
[TextTrackList]} objects.
* Time ranges.
* The type for
{{:-US/docs/Web/API/TimeRanges}
[TimeRange]} objects.
* [start r i] is the {{:-US/docs/Web/API/TimeRanges/start}start} time of range [i] in [r].
* [end' r i] is the {{:-US/docs/Web/API/TimeRanges/end}end} time of range [i] in [r].
* The type for elements satisfying the
{{:-US/docs/Web/API/HTMLMediaElement}
[HTMLMediaElement]} interface.
* [to_el m] is [m] as an element.
* [error m] is the most recent
{{:-US/docs/Web/API/HTMLMediaElement/error}error} of [m].
* [src m] is the
{{:-US/docs/Web/API/HTMLMediaElement/src}URI source} of the played media.
* [set_src m s] sets the {!src} of [m] to [s].
* [current_src m] is the
{{:-US/docs/Web/API/HTMLMediaElement/currentSrc}current source} of [m].
* [cross_origin m] is the
{{:-US/docs/Web/API/HTMLMediaElement/crossOrigin}CORS setting} of [m].
* [set_cross_origin m c] sets the {!cross_origin} of [m] to [c].
* [network_state m] is the
{{:-US/docs/Web/API/HTMLMediaElement/networkState}network state} of [m].
* [preload m] is the preload state of [m].
* [set_preload m p] sets the preload of [m] to [p].
* [buffered m] are the ranges of media that
are {{:-US/docs/Web/API/HTMLMediaElement/buffered}buffered}.
* [load m] restarts
{{:-US/docs/Web/API/HTMLMediaElement/load}loading} [m].
* [can_play_type m t] indicates if [m]
{{:-US/docs/Web/API/HTMLMediaElement/canPlayType}can play} [t].
* [ready_state m] indicates the
{{:-US/docs/Web/API/HTMLMediaElement/readyState}readiness} of [m].
* [seeking m] indicates [m] is seeking a new position.
* [current_time_s m] is the {{:-US/docs/Web/API/HTMLMediaElement/currentTime}current time} of [m].
* [set_current_time_s m t] sets the {!current_time_s} of [m] to [t].
* [duration_s m] is the
{{:-US/docs/Web/API/HTMLMediaElement/duration}duration} of [m].
* [paused m] indicates whether [m] is
{{:-US/docs/Web/API/HTMLMediaElement/paused}paused}.
* [default_playback_rate m] is the
{{:-US/docs/Web/API/HTMLMediaElement/defaultPlaybackRate}default playback rate} of [m].
* [set_default_playback_rate m] sets the {!default_playback_rate}
of [m].
* [playback_rate m] is the
{{:-US/docs/Web/API/HTMLMediaElement/playbackRate}playback rate} of [m].
* [set_playback_rate m] sets the {!playback_rate}
of [m].
* [played m] are the ranges that have been played.
* [seekable m] indicates the time ranges that are
{{:-US/docs/Web/API/HTMLMediaElement/seekable}seekable}.
* [ended m] is [true] if the media has
{{:-US/docs/Web/API/HTMLMediaElement/ended}finished} playing.
* [autoplay m] is the {{:-US/docs/Web/API/HTMLMediaElement/autoplay}autoplay} behaviour of [m].
* [set_auto_play m b] sets {!autoplay} of [m] to [b].
* [loop m] indicates if [m] is set to {{:-US/docs/Web/API/HTMLMediaElement/loop}loop}.
* [set_loop m b] sets the {!loop} of [m] to [b].
* [play m]
{{:-US/docs/Web/API/HTMLMediaElement/play}plays} [m].
* [pause m]
{{:-US/docs/Web/API/HTMLMediaElement/pause}pauses} [m].
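Starting playback is asynchronous and can be refused (e.g. by autoplay policies); a sketch, assuming [play] returns a future result:

```ocaml
(* Point media element [m] at [uri] and start playback. *)
open Brr
open Brr_io

let play_uri m uri =
  Media.El.set_src m uri;
  Fut.await (Media.El.play m) @@ function
  | Ok () -> ()
  | Error e -> Console.(error [str (Jv.Error.message e)])
```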
* [controls m] indicates if media controls are
{{:-US/docs/Web/API/HTMLMediaElement/controls}shown}.
* [set_controls m b] sets the {!controls} of [m] to [b].
* [volume m] is the
{{:-US/docs/Web/API/HTMLMediaElement/volume}volume} of [m].
* [set_volume m b] sets the {!volume} of [m] to [b].
* [muted m] indicates whether audio is {{:-US/docs/Web/API/HTMLMediaElement/muted}muted}.
* [set_muted m b] sets the {!muted} of [m] to [b].
* [default_muted m] is the {{:-US/docs/Web/API/HTMLMediaElement/defaultMuted}default muted} state.
* [set_default_muted m b] sets the {!default_muted} of [m] to [b].
* [audio_track_list m] are the
{{:-US/docs/Web/API/HTMLMediaElement/audioTracks}audio tracks} of [m].
* [video_track_list m] are the
{{:-US/docs/Web/API/HTMLMediaElement/videoTracks}video tracks} of [m].
* [text_track_list m] are the
{{:-US/docs/Web/API/HTMLMediaElement/textTracks}text tracks} of [m].
* [capture_stream m] is a
{{:-US/docs/Web/API/HTMLMediaElement/captureStream}media stream} for [m].
* Message events, ports, channels and broadcast channels.
* The type for objects to transfer.
* [transfer v] indicates value [v] should be transferred, not just
cloned, meaning it is no longer usable on the sending side.
* The type for messaging options.
* [opts ~target_origin ~transfer ()] are messaging options.
See {{:-US/docs/Web/API/Window/postMessage#Syntax}here} for the semantics of [target_origin] and [transfer].
* Message ports.
* The type for
{{:-US/docs/Web/API/MessagePort}
[MessagePort]} objects.
* [start p]
{{:-US/docs/Web/API/MessagePort/start}
starts} [p].
* [close p]
{{:-US/docs/Web/API/MessagePort/close}
closes} [p].
* Message channels.
See the {{:-US/docs/Web/API/Channel_Messaging_API}Channel Messaging API}.
* The type for
{{:-US/docs/Web/API/MessageChannel}
[MessageChannel]} objects.
* [port c] is the {{:-US/docs/Web/API/MessageChannel/port1}first port} of [c]. The port attached to the context
that created the channel.
* [create n] {{:-US/docs/Web/API/BroadcastChannel/BroadcastChannel}creates} a channel named [n].
* [as_target b] is [b] as an event target.
* [name b] is the {{:-US/docs/Web/API/BroadcastChannel/name}name} of [b].
* [close b] {{:-US/docs/Web/API/BroadcastChannel/close}closes} [b].
* [window_post w v ~opts]
{{:-US/docs/Web/API/Window/postMessage}
posts} value [v] to window [w] with options [opts].
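For instance a window can listen to its own [message] events and post to itself; the [data] access is unsafe and must be constrained to the right type:

```ocaml
(* Listen for message events on the window, then post one. *)
open Brr
open Brr_io

let setup () =
  let on_msg e =
    let data : Jstr.t = Message.Ev.data (Ev.as_type e) in
    Console.(log [str data])
  in
  ignore (Ev.listen Message.Ev.message on_msg (Window.as_target G.window));
  Message.window_post G.window (Jstr.v "hello")
```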
* Message events.
* The type for
{{:-US/docs/Web/API/MessageEvent}
[MessageEvent]} and
{{:-US/docs/Web/API/ExtendableMessageEvent}[ExtendableMessageEvent]}
objects.
* [as_extendable e] is [e] as an extendable event. {b Warning.}
Only for [ExtendableMessageEvent] objects.
* [data e] is the
{{:-US/docs/Web/API/MessageEvent/data}
data sent} by the emitter. {b Warning.} Unsafe,
make sure to constrain the result value to the right type.
* [origin e] is the
{{:-US/docs/Web/API/MessageEvent/origin}
origin} of the message emitter.
* [last_event_id e] is a
{{:-US/docs/Web/API/MessageEvent/lastEventId}unique id} for the event.
* [source e] is the {{:-US/docs/Web/API/MessageEvent/source}message emitter}.
* [ports e] is a list of {{:-US/docs/Web/API/MessageEvent/ports}ports} associated with the channel the message is being
sent through (if applicable).
* [message] is the {{:-US/docs/Web/API/BroadcastChannel/message_event}[message]} event.
* [messageerror] is the {{:-US/docs/Web/API/BroadcastChannel/messageerror_event}[messageerror]} event.
* Notifying users.
See the {{:-US/docs/Web/API/Notifications_API}Notification API}.
* Permission enum.
* The type for
{{:-US/docs/Web/API/Notification/permission#Return_Value}notification permission} values.
* [permission ()] is the {{:-US/docs/Web/API/Notification/permission}permission} granted by the user.
* [request_permission ()] {{:-US/docs/Web/API/Notification/requestPermission}requests} permission to display
notifications.
* Direction enum.
* The type for
{{:-US/docs/Web/API/Notification/dir#Value}notification direction} values.
* Actions.
* [max] is the {{:-US/docs/Web/API/Notification/maxActions}maximum number} of actions supported.
* The type for {{:-US/docs/Web/API/NotificationAction}[NotificationAction]} objects.
* [action a] is the {{:-US/docs/Web/API/NotificationAction#Properties}action name} of [a].
* [title a] is the {{:-US/docs/Web/API/NotificationAction#Properties}title} of [a].
* [icon a] is the {{:-US/docs/Web/API/NotificationAction#Properties}icon} of [a].
* The type for notification options.
* The type for
{{:-US/docs/Web/API/Notification}
[Notification]} objects.
* See {!t}.
* [create title ~opts] is a
{{:-US/docs/Web/API/Notification/Notification}notification}
with title [title] and options [opts].
* [close n] {{:-US/docs/Web/API/Notification/close}closes} [n].
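A sketch that asks for permission before notifying; the [Permission.granted] value and the shape of [request_permission]'s result are assumptions:

```ocaml
(* Show a notification titled [title] if the user allows it. *)
open Brr
open Brr_io

let notify title =
  Fut.await (Notification.request_permission ()) @@ function
  | Ok p when Jstr.equal p Notification.Permission.granted ->
      ignore (Notification.create title)
  | _ -> ()
```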
* [as_target n] is [n] as an event target.
* [actions n] are the
{{:-US/docs/Web/API/Notification/actions}
actions} of [n].
* [badge n] is the
{{:-US/docs/Web/API/Notification/badge}
badge} of [n].
* [body n] is the
{{:-US/docs/Web/API/Notification/body}
body} of [n].
* [data n] is the
{{:-US/docs/Web/API/Notification/data}
data} of [n]. {b Warning.} This is unsafe, constrain the result type.
* [dir n] is the
{{:-US/docs/Web/API/Notification/dir}
dir} of [n].
* [lang n] is the
{{:-US/docs/Web/API/Notification/lang}
lang} of [n].
* [tag n] is the
{{:-US/docs/Web/API/Notification/tag}
tag} of [n].
* [icon n] is the
{{:-US/docs/Web/API/Notification/icon}
icon} of [n].
* [image n] is the
{{:-US/docs/Web/API/Notification/image}
image} of [n].
* [renotify n]
{{:-US/docs/Web/API/Notification/renotify}
indicates} [n] replaces an old notification.
* [require_interaction n]
{{:-US/docs/Web/API/Notification/requireInteraction} indicates} [n] requires interaction.
* [silent n]
{{:-US/docs/Web/API/Notification/silent} indicates} [n] should be silent.
* [timestamp_ms n] is the {{:-US/docs/Web/API/Notification/timestamp}timestamp} of [n].
* [title n] is the
{{:-US/docs/Web/API/Notification/title}
title} of [n].
* Notification events.
* The type for {{:-US/docs/Web/API/NotificationEvent}[NotificationEvent]} objects.
* [as_extendable e] is [e] as an extendable event.
* [notification e] is the
{{:-US/docs/Web/API/NotificationEvent/notification}notification} of [e].
* [action e] is the notification {{:-US/docs/Web/API/NotificationEvent/action}action} clicked.
* [notificationclick] is the
{{:-US/docs/Web/API/ServiceWorkerGlobalScope/notificationclick_event}[notificationclick]} event.
* The type for
{{:-US/docs/Web/API/Storage}[Storage]}
objects.
* [local w] is the storage {{:-US/docs/Web/API/Window/localStorage}saved across page sessions} for the
window's {{:#concept-origin}origin}.
* [session w] is the storage {{:-US/docs/Web/API/Window/sessionStorage}cleared when the page session} ends for the window's {{:#concept-origin}origin}.
* [length s] is the
{{:-US/docs/Web/API/Storage/length}
number of items} in [s].
* [key s i] is the
{{:-US/docs/Web/API/Storage/key}name}
of the [i]th key. (N.B. local storage can race with other tabs)
* [clear s]
{{:-US/docs/Web/API/Storage/clear}
removes} all keys from [s].
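A sketch enumerating the keys of local storage with [length] and [key]; the exact result type of [key] follows the signature above:

```ocaml
(* The keys currently stored in the window's local storage. *)
open Brr
open Brr_io

let local_keys () =
  let s = Storage.local G.window in
  List.init (Storage.length s) (Storage.key s)
```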
* Storage event.
* See {!Brr_io.Storage.t}.
* The type for
{{:-US/docs/Web/API/StorageEvent}
[StorageEvent]} objects.
* [key e] is the key of the item being changed.
* [old_value e] is the old value of the key.
* [new_value e] is the new value of the key.
* [url e] is the URL of the document whose storage item changed.
* [storage_area e] is the storage object.
* [Websocket] objects.
See {{:-US/docs/Web/API/WebSockets_API}
Web Sockets API}.
{b XXX} Add a bit of future convenience.
* Binary type enum.
* The type for
{{:-US/docs/Web/API/WebSocket/binaryType#Value}binary type} values.
* Ready state enum.
* The type for
{{:-US/docs/Web/API/WebSocket/readyState#Value}ready state} values.
* The type for
{{:-US/docs/Web/API/WebSocket}[WebSocket]}
objects.
* [as_target s] is [s] as an event target.
* [set_binary_type s t] sets the {!binary_type} of [s] to [t].
* [close s]
{{:-US/docs/Web/API/WebSocket/close}
closes} [s].
* [url s] is the {{:-US/docs/Web/API/WebSocket/url}url} of [s].
* [ready_state s] is the {{:-US/docs/Web/API/WebSocket/readyState}state} of the connection.
* [buffered_amount s] is the sent {{:-US/docs/Web/API/WebSocket/bufferedAmount}buffered amount} of [s].
* [extensions s] are the
{{:-US/docs/Web/API/WebSocket/extensions}
extensions} selected by the server.
* [protocol s] is the
{{:-US/docs/Web/API/WebSocket/protocol}
protocol} selected by the server.
* [send_blob s d]
{{:-US/docs/Web/API/WebSocket/send}
sends} the binary content of [d] on [s].
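A sketch that sends a greeting once the connection opens; the constructor name [create] and the [Ev.open'] event type are assumptions:

```ocaml
(* Open a socket on [uri] and send a blob when it connects. *)
open Brr
open Brr_io

let greet uri =
  let s = Websocket.create uri in
  let on_open _ = Websocket.send_blob s (Blob.of_jstr (Jstr.v "hello")) in
  ignore (Ev.listen Ev.open' on_open (Websocket.as_target s))
```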
* Websocket events.
* Close events.
* The type for
{{:-US/docs/Web/API/CloseEvent}
[CloseEvent]} objects.
* [code e] is the {{:-US/docs/Web/API/CloseEvent#Properties}close} code sent by the server.
* [close] is the {{:-US/docs/Web/API/WebSocket/close_event}[close]} event.
(*---------------------------------------------------------------------------
   Copyright (c) 2020 The brr programmers. All rights reserved.
   Distributed under the ISC license, see terms at the end of the file.
  ---------------------------------------------------------------------------*)
open Brr
module Clipboard : sig
module Item : sig
module Presentation_style : sig
type t = Jstr.t
val unspecified : t
val inline : t
val attachment : t
end
type opts
val opts : ?presentation_style:Presentation_style.t -> unit -> opts
type t
val create : ?opts:opts -> (Jstr.t * Blob.t) list -> t
(** [create ~opts data] is a {{:-US/docs/Web/API/ClipboardItem/ClipboardItem}clipboard item} with MIME types and associated
    values [data] and options [opts]. *)
val presentation_style : t -> Presentation_style.t
(** [presentation_style i] is the {{:-apis/#dom-clipboarditem-presentationstyle}presentation style} of [i]. *)
val last_modified_ms : t -> int
val delayed : t -> bool
val types : t -> Jstr.t list
val get_type : t -> Jstr.t -> Brr.Blob.t Fut.or_error
(** [get_type i t] is the object with MIME type [t] for item [i]. *)
include Jv.CONV with type t := t
end
type t
val of_navigator : Navigator.t -> t
val as_target : t -> Ev.target
(** {1:rw Reading and writing} *)
val read : t -> Item.t list Fut.or_error
val read_text : t -> Jstr.t Fut.or_error
val write : t -> Item.t list -> unit Fut.or_error
val write_text : t -> Jstr.t -> unit Fut.or_error
include Jv.CONV with type t := t
end
module Form : sig
(** {1:element Element} *)
type t
(** The type for
    {{:-US/docs/Web/API/HTMLFormElement}
    [HTMLFormElement]} objects. *)
val of_el : El.t -> t
(** [of_el e] is a form from element [e]. This throws a JavaScript
    error if [e] is not a form element. *)
val to_el : t -> El.t
val name : t -> Jstr.t
val method' : t -> Jstr.t
val target : t -> Jstr.t
val action : t -> Jstr.t
val enctype : t -> Jstr.t
val accept_charset : t -> Jstr.t
val autocomplete : t -> Jstr.t
val no_validate : t -> bool
(** [no_validate f] reflects the value of the {{:-US/docs/Web/HTML/Element/form#attr-novalidate}novalidate} attribute
    of [f]. *)
val check_validity : t -> bool
val report_validity : t -> bool
val request_submit : t -> El.t option -> unit
val reset : t -> unit
val submit : t -> unit
(** {1:data Data} *)
module Data : sig
type form = t
type entry_value = [ `String of Jstr.t | `File of File.t ]
type t
val create : unit -> t
val of_form : form -> t
val is_empty : t -> bool
val has_file_entry : t -> bool
val mem : t -> Jstr.t -> bool
val find : t -> Jstr.t -> entry_value option
(** [find d k] is the first value associated to [k] in [d] (if any). *)
val find_all : t -> Jstr.t -> entry_value list
val fold : (Jstr.t -> entry_value -> 'a -> 'a) -> t -> 'a -> 'a
val set : t -> Jstr.t -> Jstr.t -> unit
val set_blob : ?filename:Jstr.t -> t -> Jstr.t -> Blob.t -> unit
val append : t -> Jstr.t -> Jstr.t -> unit
val append_blob : ?filename:Jstr.t -> t -> Jstr.t -> Blob.t -> unit
val delete : t -> Jstr.t -> unit
(** [delete d k]
    {{:-US/docs/Web/API/FormData/delete}
    deletes} the values of key [k] in [d]. *)
(** {1:convert Converting} *)
val of_assoc : (Jstr.t * entry_value) list -> t
(** [of_assoc l] is form data from assoc [l], data is added with {!append}. *)
val to_assoc : t -> (Jstr.t * entry_value) list
val of_uri_params : Uri.Params.t -> t
val to_uri_params : t -> Uri.Params.t
include Jv.CONV with type t := t
end
(** {1:events Events} *)
module Ev : sig
module Data : sig
type t
val form_data : t -> Data.t
end
val formdata : Data.t Ev.type'
module Submit : sig
type t
(** The type for
    {{:-US/docs/Web/API/SubmitEvent}
    [SubmitEvent]} objects. *)
val submitter : t -> El.t option
end
val submit : Submit.t Ev.type'
end
include Jv.CONV with type t := t
end
module Fetch : sig
module Body : sig
(** {1:init Specification} *)
type init
val of_jstr : Jstr.t -> init
val of_uri_params : Brr.Uri.Params.t -> init
(** [of_uri_params p] is a body from URI params [p]. *)
val of_form_data : Form.Data.t -> init
val of_blob : Brr.Blob.t -> init
val of_array_buffer : Brr.Tarray.Buffer.t -> init
(** {1:interface Interface} *)
type t
val body_used : t -> bool
val body : t -> Jv.t option
val array_buffer : t -> Tarray.Buffer.t Fut.or_error
val blob : t -> Blob.t Fut.or_error
(** [blob b]
    {{:-US/docs/Web/API/Body/blob}
    reads} [b] as a blob. *)
val form_data : t -> Form.Data.t Fut.or_error
val json : t -> Json.t Fut.or_error
val text : t -> Jstr.t Fut.or_error
(** [text b]
    {{:-US/docs/Web/API/Body/text}reads}
    [b] and UTF-8 decodes it to a string. *)
include Jv.CONV with type t := t
end
module Headers : sig
(** {1:headers Headers} *)
type t
val mem : Jstr.t -> t -> bool
val find : Jstr.t -> t -> Jstr.t option
val fold : (Jstr.t -> Jstr.t -> 'a -> 'a) -> t -> 'a -> 'a
(** {1:convert Converting} *)
val of_obj : Jv.t -> t
val of_assoc : ?init:t -> (Jstr.t * Jstr.t) list -> t
val to_assoc : t -> (Jstr.t * Jstr.t) list
include Jv.CONV with type t := t
end
module Request : sig
(** {1:enums Enumerations} *)
module Cache : sig
type t = Jstr.t
val default : t
val force_cache : t
val no_cache : t
val no_store : t
val only_if_cached : t
val reload : t
end
module Credentials : sig
type t = Jstr.t
val include' : t
val omit : t
val same_origin : t
end
module Destination : sig
type t = Jstr.t
val audio : t
val audioworklet : t
val document : t
val embed : t
val font : t
val frame : t
val iframe : t
val image : t
val manifest : t
val object' : t
val paintworklet : t
val report : t
val script : t
val sharedworker : t
val style : t
val track : t
val video : t
val worker : t
val xslt : t
end
module Mode : sig
type t = Jstr.t
val cors : t
val navigate : t
val no_cors : t
val same_origin : t
end
module Redirect : sig
type t = Jstr.t
(** The type for
    {{:-US/docs/Web/API/Request/redirect#Value}
    [RequestRedirect]} values. *)
val error : t
val follow : t
val manual : t
end
(** {1:req Requests} *)
type init
val init :
?body:Body.init -> ?cache:Cache.t -> ?credentials:Credentials.t ->
?headers:Headers.t -> ?integrity:Jstr.t -> ?keepalive:bool ->
?method':Jstr.t -> ?mode:Mode.t -> ?redirect:Redirect.t ->
?referrer:Jstr.t -> ?referrer_policy:Jstr.t ->
?signal:Abort.Signal.t -> unit -> init
type t
val v : ?init:init -> Jstr.t -> t
val of_request : ?init:init -> t -> t
external as_body : t -> Body.t = "%identity"
* { 1 : props Properties }
val cache : t -> Cache.t
(** [cache r] is the
{{:-US/docs/Web/API/Request/cache}
cache} behaviour of [r]. *)
val credentials : t -> Credentials.t
val destination : t -> Destination.t
val headers : t -> Headers.t
val integrity : t -> Jstr.t
val is_history_navigation : t -> bool
val is_reload_navigation : t -> bool
val keepalive : t -> bool
(** [keepalive r] is the
{{:-US/docs/Web/API/Request/keepalive}
keepalive} behaviour of [r]. *)
val method' : t -> Jstr.t
val mode : t -> Mode.t
val redirect : t -> Redirect.t
val referrer : t -> Jstr.t
val referrer_policy : t -> Jstr.t
(** [referrer_policy r] is the
{{:-US/docs/Web/API/Request/referrerPolicy}
referrer policy} of [r]. *)
val signal : t -> Abort.Signal.t option
val url : t -> Jstr.t
include Jv.CONV with type t := t
end
module Response : sig
* { 1 : enums Enumerations }
module Type : sig
type t = Jstr.t
val basic : t
val cors : t
val default : t
val error : t
val opaque : t
val opaqueredirect : t
end
* { 1 : resp Responses }
type init
val init :
?headers:Headers.t -> ?status:int -> ?status_text:Jstr.t -> unit -> init
type t
val v : ?init:init -> ?body:Body.init -> unit -> t
val of_response : t -> t
val error : unit -> t
val redirect : ?status:int -> Jstr.t -> t
external as_body : t -> Body.t = "%identity"
* { 1 : props Properties }
val headers : t -> Headers.t
val ok : t -> bool
(** [ok r] is [true] if the response [r] is
{{:-US/docs/Web/API/Response/ok}
successful}. *)
val redirected : t -> bool
val status : t -> int
val status_text : t -> Jstr.t
(** [status_text r] is the
{{:-US/docs/Web/API/Response/statusText}
status text} of [r]. *)
val url : t -> Jstr.t
include Jv.CONV with type t := t
end
module Cache : sig
type query_opts
val query_opts :
?ignore_search:bool -> ?ignore_method:bool -> ?ignore_vary:bool ->
?cache_name:Jstr.t -> unit -> query_opts
type t
val match' :
?query_opts:query_opts -> t -> Request.t -> Response.t option Fut.or_error
val match_all :
?query_opts:query_opts -> t -> Request.t -> Response.t list Fut.or_error
val add : t -> Request.t -> unit Fut.or_error
val add_all : t -> Request.t list -> unit Fut.or_error
val put : t -> Request.t -> Response.t -> unit Fut.or_error
val delete : ?query_opts:query_opts -> t -> Request.t -> bool Fut.or_error
val keys :
?query_opts:query_opts -> ?req:Request.t -> t ->
Request.t list Fut.or_error
* { 1 : cache_storage Cache storage }
module Storage : sig
type cache = t
type t
(** The type for
{{:-US/docs/Web/API/CacheStorage}
CacheStorage} objects. See {!Brr_io.Fetch.caches} to get one. *)
val match' :
?query_opts:query_opts -> t -> Request.t ->
Response.t option Fut.or_error
val has : t -> Jstr.t -> bool Fut.or_error
val open' : t -> Jstr.t -> cache Fut.or_error
val delete : t -> Jstr.t -> bool Fut.or_error
val keys : t -> Jstr.t list Fut.or_error
include Jv.CONV with type t := t
end
include Jv.CONV with type t := t
end
module Ev : sig
type t
val fetch : t Ev.type'
val as_extendable : t -> Ev.Extendable.t Ev.t
val request : t -> Request.t
val preload_response : t -> Response.t option Fut.or_error
val client_id : t -> Jstr.t
val resulting_client_id : t -> Jstr.t
val replaces_client_id : t -> Jstr.t
val handled : t -> unit Fut.or_error
val respond_with : t -> Response.t Fut.or_error -> unit
end
val url : ?init:Request.init -> Jstr.t -> Response.t Fut.or_error
val request : Request.t -> Response.t Fut.or_error
val caches : Cache.Storage.t
end
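The Fetch signatures above compose directly. As a hedged illustration (user code, not part of this interface; it assumes brr's [Console] module and the [Fut.Result_syntax] binding operators as found in the brr library), fetching a URL and decoding its body as text could look like:

```ocaml
(* Sketch: fetch a URL and log its body as UTF-8 text.
   Assumes brr's Brr.Console and Fut.Result_syntax. *)
open Brr
open Brr_io

let log_body url =
  Fut.await
    (let open Fut.Result_syntax in
     let* resp = Fetch.url url in
     (* Response body is consumed once, here as text. *)
     Fetch.Body.text (Fetch.Response.as_body resp))
    (function
    | Ok text -> Console.(log [text])
    | Error e -> Console.(error [e]))
```

A [Request.init] value could be passed via [?init] to control cache, credentials or an abort signal.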
module Geolocation : sig
module Error : sig
* { 1 : codes Codes }
type code = int
val permission_denied : code
val position_unavailable : code
val timeout : code
* { 1 : errors Errors }
type t
* The type for { { : ] } objects .
val code : t -> code
val message : t -> Jstr.t
include Jv.CONV with type t := t
end
module Pos : sig
type t
val latitude : t -> float
val longitude : t -> float
val accuracy : t -> float
val altitude : t -> float option
val altitude_accuracy : t -> float option
val heading : t -> float option
(** [heading p] is the {{:-US/docs/Web/API/GeolocationCoordinates/heading}direction} in degrees with respect to true north
(90° is east). If {!speed} is [0], this is [nan]. *)
val speed : t -> float option
val timestamp_ms : t -> float
include Jv.CONV with type t := t
end
type opts
val opts :
?high_accuracy:bool -> ?timeout_ms:int -> ?maximum_age_ms:int -> unit ->
opts
* { 1 : geoloc Geolocalizing }
type t
val of_navigator : Navigator.t -> t
val get : ?opts:opts -> t -> (Pos.t, Error.t) Fut.result
type watch_id = int
val watch : ?opts:opts -> t -> ((Pos.t, Error.t) result -> unit) -> watch_id
val unwatch : t -> watch_id -> unit
include Jv.CONV with type t := t
end
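A minimal usage sketch for the interface above (hypothetical user code; it assumes brr's [G.navigator] and [Console] as provided by the brr library):

```ocaml
(* Sketch: log the current position once.
   Geolocation.get returns a future of a (Pos.t, Error.t) result. *)
open Brr
open Brr_io

let show_pos () =
  let geo = Geolocation.of_navigator G.navigator in
  Fut.await (Geolocation.get geo) @@ function
  | Ok p ->
      Console.(log [Geolocation.Pos.latitude p; Geolocation.Pos.longitude p])
  | Error e -> Console.(error [Geolocation.Error.message e])
```

For continuous updates, {!watch} with the same options would be used instead, keeping the returned [watch_id] for {!unwatch}.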
(** Access to media devices, streams and elements.

Access to the {{:-main}Media
Capture and Streams} API, the
{{:-record/} MediaStream
Recording} API and the
{{:-US/docs/Web/API/HTMLMediaElement}
[HTMLMediaElement]} interface. *)
module Media : sig
(** {1:constrainable Constrainable pattern}

The following little bureaucracy tries to expose
the {{:-main/#constrainable-interface}
constrainable pattern} in a lean way.
{{:-US/docs/Web/API/Media_Streams_API/Constraints}This introduction} on MDN may also be useful. *)
module Prop : sig
* { 1 : range_constraits Ranges and constraints }
module Bool : sig
module Constraint : sig
type t
val v : ?exact:bool -> ?ideal:bool -> unit -> t
end
end
module Int : sig
module Range : sig
type t
val v : ?min:int -> ?max:int -> unit -> t
val min : t -> int option
val max : t -> int option
include Jv.CONV with type t := t
end
module Constraint : sig
type t
val v : ?min:int -> ?max:int -> ?exact:int -> ?ideal:int -> unit -> t
include Jv.CONV with type t := t
end
end
module Float : sig
module Range : sig
type t
val v : ?min:float -> ?max:float -> unit -> t
val min : t -> float option
val max : t -> float option
include Jv.CONV with type t := t
end
module Constraint : sig
type t
val v : ?min:float -> ?max:float -> ?exact:float -> ?ideal:float ->
unit -> t
include Jv.CONV with type t := t
end
end
* [ ] constraints .
module Jstr : sig
type t = Jstr.t
module Constraint : sig
type t
val v : ?exact:Jstr.t list -> ?ideal:Jstr.t list -> unit -> t
include Jv.CONV with type t := t
end
end
* { 1 : props Properties }
type ('a, 'b, 'c) t
type bool_t = (bool, bool list, Bool.Constraint.t) t
val bool : Jstr.t -> bool_t
type int_t = (int, Int.Range.t, Int.Constraint.t) t
val int : Jstr.t -> int_t
type float_t = (float, Float.Range.t, Float.Constraint.t) t
val float : Jstr.t -> float_t
type jstr_t = (Jstr.t, Jstr.t, Jstr.Constraint.t) t
val jstr : Jstr.t -> jstr_t
type jstr_enum_t = (Jstr.t, Jstr.t list, Jstr.Constraint.t) t
val jstr_enum : Jstr.t -> jstr_enum_t
* { 1 : low Low - level interface }
type 'a conv = ('a -> Jv.t) * (Jv.t -> 'a)
(** ['a conv] specifies encoding and decoding functions for JavaScript. *)
val v : Jstr.t -> 'a conv -> 'b conv -> 'c conv -> ('a, 'b, 'c) t
(** [v v_conv cap_conv constr_conv n] is a new property named [n] whose
values are converted with [v_conv], capabilities with [cap_conv] and
constraints with [constr_conv]. *)
val name : ('a, 'b, 'c) t -> Jstr.t
val value_of_jv : ('a, 'b, 'c) t -> Jv.t -> 'a
(** [value_of_jv p jv] is the property value of [p] from [jv]. *)
val value_to_jv : ('a, 'b, 'c) t -> 'a -> Jv.t
(** [value_to_jv p v] is the JavaScript value of [p] for [v]. *)
val cap_of_jv : ('a, 'b, 'c) t -> Jv.t -> 'b
val cap_to_jv : ('a, 'b, 'c) t -> 'b -> Jv.t
val constr_of_jv : ('a, 'b, 'c) t -> Jv.t -> 'c
val constr_to_jv : ('a, 'b, 'c) t -> 'c -> Jv.t
end
module Supported_constraints : sig
type t
val mem : ('a, 'b, 'c) Prop.t -> t -> bool
val names : t -> Jstr.t list
include Jv.CONV with type t := t
end
module Constraints : sig
type t
val empty : unit -> t
val find : ('a, 'b, 'c) Prop.t -> t -> 'c option
(** [find p s] is the constraint for [p] in [s] (if any). *)
val set : ('a, 'b, 'c) Prop.t -> 'c -> t -> unit
val delete : ('a, 'b, 'c) Prop.t -> t -> unit
include Jv.CONV with type t := t
end
module Capabilities : sig
type t
val find : ('a, 'b, 'c) Prop.t -> t -> 'b option
(** [find p s] is the capability of [p] in [s] (if any). *)
val set : ('a, 'b, 'c) Prop.t -> 'b -> t -> unit
val delete : ('a, 'b, 'c) Prop.t -> t -> unit
include Jv.CONV with type t := t
end
module Settings : sig
type t
val get : ('a, 'b, 'c) Prop.t -> t -> 'a
val find : ('a, 'b, 'c) Prop.t -> t -> 'a option
(** [find p s] is the value of [p] in [s] (if any). *)
include Jv.CONV with type t := t
end
* { 1 : media Media devices , streams and tracks }
module Track : sig
* { 1 : enum and properties }
module State : sig
type t = Jstr.t
* The type for { { : } values .
val live : t
val ended : t
end
module Kind : sig
type t = Jstr.t
val audio : t
val video : t
end
module Prop : sig
val aspect_ratio : Prop.float_t
val auto_gain_control : Prop.bool_t
val channel_count : Prop.int_t
val cursor : Prop.jstr_enum_t
val device_id : Prop.jstr_t
val display_surface : Prop.jstr_enum_t
val echo_cancellation : Prop.bool_t
val facing_mode : Prop.jstr_enum_t
val frame_rate : Prop.float_t
val group_id : Prop.jstr_t
val height : Prop.int_t
val latency : Prop.float_t
val logical_surface : Prop.bool_t
val noise_suppresion : Prop.bool_t
val resize_mode : Prop.jstr_enum_t
val sample_rate : Prop.int_t
val sample_size : Prop.int_t
val width : Prop.int_t
end
* { 1 : tracks Tracks }
type t
external as_target : t -> Ev.target = "%identity"
val id : t -> Jstr.t
val isolated : t -> bool
val kind : t -> Kind.t
(** [kind] is the
{{:-US/docs/Web/API/MediaStreamTrack/kind}kind} of [t]. *)
val label : t -> Jstr.t
val muted : t -> bool
val ready_state : t -> State.t
val enabled : t -> bool
val set_enabled : t -> bool -> unit
val get_capabilities : t -> Capabilities.t
val get_constraints : t -> Constraints.t
val apply_constraints : t -> Constraints.t option -> unit Fut.or_error
val get_settings : t -> Settings.t
val stop : t -> unit
val clone : t -> t
* { 1 : events Events }
module Ev : sig
* { 1 : obj Track event object }
type track = t
type t
val track : t -> track
* { 1 : track_event Track events }
val ended : Ev.void
val isolationchange : Ev.void
(** [isolationchange] is the
{{:-identity/#event-isolationchange}
isolationchange} event. *)
val mute : Ev.void
val unmute : Ev.void
end
include Jv.CONV with type t := t
end
module Stream : sig
module Constraints : sig
type t
type track = [ `No | `Yes of Constraints.t option ]
val v : ?audio:track -> ?video:track -> unit -> t
val av : unit -> t
include Jv.CONV with type t := t
end
type t
val create : unit -> t
val of_stream : t -> t
val of_tracks : Track.t list -> t
external as_target : t -> Ev.target = "%identity"
val id : t -> Jstr.t
val active : t -> bool
val get_audio_tracks : t -> Track.t list
val get_video_tracks : t -> Track.t list
val get_tracks : t -> Track.t list
val get_track_by_id : t -> Jstr.t -> Track.t option
val add_track : t -> Track.t -> unit
val remove_track : t -> Track.t -> unit
val clone : t -> t
* { 1 : events Events }
module Ev : sig
val addtrack : Track.Ev.t Ev.type'
val removetrack : Track.Ev.t Ev.type'
end
include Jv.CONV with type t := t
end
module Recorder : sig
* { 1 : enums Enumerations }
module Bitrate_mode : sig
type t = Jstr.t
val cbr : t
val vbr : t
end
module Recording_state : sig
type t = Jstr.t
val inactive : t
val recording : t
val paused : t
end
* { 1 : recorders Recorder }
val is_type_supported : Jstr.t -> bool
type init
val init :
?type':Jstr.t -> ?audio_bps:int -> ?video_bps:int -> ?bps:int ->
?audio_bitrate_mode:Bitrate_mode.t -> unit -> init
type t
val create : ?init:init -> Stream.t -> t
(** [create ~init s] is a
{{:-US/docs/Web/API/MediaRecorder/MediaRecorder}recorder} for [s]. The function
raises if the [type'] of the [init] object
is not {{!is_type_supported}supported}. *)
val stream : t -> Stream.t
val type' : t -> Jstr.t
val state : t -> Recording_state.t
val video_bps : t -> int
val audio_bps : t -> int
val audio_bitrate_mode : t -> Bitrate_mode.t
val start : t -> timeslice_ms:int option -> (unit, Jv.Error.t) result
val stop : t -> unit
val pause : t -> unit
(** [pause r] pauses [r]. *)
val resume : t -> unit
val request_data : t -> unit
* { 1 : events Events }
module Ev : sig
* { 1 : obj Event objects }
* events .
module Blob : sig
type t
val data : t -> Blob.t
val timecode : t -> float
(** [timecode e] is the difference between timestamp of the first
chunk in {!data} and the one produced by the first chunk in the
first blob event produced by the recorder (that one may not
be zero). *)
end
module Error : sig
type t
val error : t -> Jv.Error.t
end
* { 1 : events Recorder events }
val start : Ev.void
val stop : Ev.void
val dataavailable : Blob.t Ev.type'
val pause : Ev.void
val resume : Ev.void
val error : Error.t Ev.type'
end
end
module Device : sig
module Kind : sig
type t = Jstr.t
val audioinput : t
val audiooutput : t
val videoinput : t
end
module Info : sig
type t
val device_id : t -> Jstr.t
val kind : t -> Kind.t
val label : t -> Jstr.t
val group_id : t -> Jstr.t
(** [group_id d] is the group identifier of the device. Two devices
have the same group identifier if they belong to the same physical
device. *)
val to_json : t -> Json.t
include Jv.CONV with type t := t
end
end
module Devices : sig
type t
val of_navigator : Navigator.t -> t
external as_target : t -> Ev.target = "%identity"
val enumerate : t -> Device.Info.t list Fut.or_error
(** [enumerate m]
{{:-US/docs/Web/API/MediaDevices/enumerateDevices}determines}
a list of connected media devices. Monitor changes by listening
{!Ev.devicechange} on [m]. *)
val get_supported_constraints : t -> Supported_constraints.t
val get_user_media : t -> Stream.Constraints.t -> Stream.t Fut.or_error
(** [get_user_media m c]
{{:-US/docs/Web/API/MediaDevices/getUserMedia}prompts}
the user to use a media input which can produce a media stream
constrained by [c].
{{:-US/docs/Web/API/MediaDevices/getUserMedia#Exceptions}These
errors} can occur. In particular [Jv.Error.Not_allowed] and
[Jv.Error.Not_found] should be reported to the user in a
friendly way. In some browsers this call has to be done
in a user interface event handler. *)
val get_display_media : t -> Stream.Constraints.t -> Stream.t Fut.or_error
* { 1 : events Events }
module Ev : sig
val devicechange : Ev.void
end
include Jv.CONV with type t := t
end
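The [get_user_media] documentation above translates to code along these lines (a hedged sketch of user code against this interface; [G.navigator] and [Console] are assumed from the brr library):

```ocaml
(* Sketch: prompt the user for camera and microphone access.
   Constraints.av () requests both audio and video with defaults. *)
open Brr
open Brr_io

let open_av () =
  let m = Media.Devices.of_navigator G.navigator in
  Fut.await
    (Media.Devices.get_user_media m (Media.Stream.Constraints.av ()))
  @@ function
  | Ok s -> Console.(log [Media.Stream.id s])
  | Error e ->
      (* Not_allowed / Not_found should be surfaced in a user-friendly way. *)
      Console.(error [e])
```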
* { 1 : el Media element interface }
module El : sig
* { 1 : prelim Preliminaries }
module Error : sig
* { 1 : codes Error codes }
type code = int
val aborted : code
val network : code
val decode : code
val src_not_supported : code
* { 1 : obj Error objects }
type t
(** The type for
{{:-US/docs/Web/API/MediaError}
[MediaError]} objects. *)
val code : t -> code
val message : t -> Jstr.t
end
module Can_play : sig
type t = Jstr.t
val maybe : t
val probably : t
end
module Have : sig
type t = int
(** The type for
{{:-US/docs/Web/API/HTMLMediaElement/readyState#Value}read state} values. *)
val nothing : t
val metadata : t
val current_data : t
val future_data : t
val enought_data : t
end
module Network : sig
type t = int
val empty : t
val idle : t
val loading : t
val no_source : t
end
module Cors : sig
type t = Jstr.t
val anonymous : t
val use_credentials : t
end
module Provider : sig
type t
val of_media_stream : Stream.t -> t
val of_blob : Blob.t -> t
val of_media_source : Jv.t -> t
include Jv.CONV with type t := t
end
module Audio_track : sig
type t
module List : sig
type t
include Jv.CONV with type t := t
end
include Jv.CONV with type t := t
end
module Video_track : sig
type t
module List : sig
type t
include Jv.CONV with type t := t
end
include Jv.CONV with type t := t
end
module Text_track : sig
module Kind : sig
type t = Jstr.t
end
type t
module List : sig
type t
include Jv.CONV with type t := t
end
include Jv.CONV with type t := t
end
module Time_ranges : sig
type t
val length : t -> int
(** [length r] is the number of ranges in [r]. *)
val start : t -> int -> float
val end' : t -> int -> float
include Jv.CONV with type t := t
end
* { 1 : iface Media interface }
type t
val of_el : El.t -> t
(** [of_el e] is the media interface of [e]. This throws a JavaScript
error if [e] is not a {!Brr.El.audio} or {!Brr.El.video} element. *)
val to_el : t -> El.t
* { 1 : error_state Error state }
val error : t -> Error.t option
* { 1 : network_state Network state }
val src : t -> Jstr.t
val set_src : t -> Jstr.t -> unit
val src_object : t -> Provider.t option
(** [src_object m] is the
{{:-US/docs/Web/API/HTMLMediaElement/srcObject}source object} of [m]. *)
val set_src_object : t -> Provider.t option -> unit
(** [set_src_object m o] sets the {!src_object} of [m] to [o]. *)
val current_src : t -> Jstr.t
val cross_origin : t -> Cors.t
val set_cross_origin : t -> Cors.t -> unit
val network_state : t -> Network.t
val preload : t -> Jstr.t
val set_preload : t -> Jstr.t -> unit
val buffered : t -> Time_ranges.t
val load : t -> unit
val can_play_type : t -> Jstr.t -> Can_play.t
* { 1 : ready_state Ready state }
val ready_state : t -> Have.t
val seeking : t -> bool
* { 1 : playback_state Playback state }
val current_time_s : t -> float
val set_current_time_s : t -> float -> unit
val fast_seek_s : t -> float -> unit
(** [fast_seek_s m t]
{{:-US/docs/Web/API/HTMLMediaElement/fastSeek}seeks} [m] to [t]. *)
val duration_s : t -> float
val paused : t -> bool
val default_playback_rate : t -> float
val set_default_playback_rate : t -> float -> unit
val playback_rate : t -> float
val set_playback_rate : t -> float -> unit
val played : t -> Time_ranges.t
val seekable : t -> Time_ranges.t
val ended : t -> bool
val autoplay : t -> bool
val set_auto_play : t -> bool -> unit
val loop : t -> bool
val set_loop : t -> bool -> unit
val play : t -> unit Fut.or_error
val pause : t -> unit
* { 1 : ctrls Controls }
val controls : t -> bool
val set_controls : t -> bool -> unit
val volume : t -> float
val set_volume : t -> float -> unit
val muted : t -> bool
val set_muted : t -> bool -> unit
val default_muted : t -> bool
val set_default_muted : t -> bool -> unit
* { 1 : tracks Tracks }
val audio_track_list : t -> Audio_track.List.t
val video_track_list : t -> Video_track.List.t
val text_track_list : t -> Text_track.List.t
val capture_stream : t -> Stream.t
include Jv.CONV with type t := t
end
end
module Message : sig
type transfer
val transfer : 'a -> transfer
type opts
val opts : ?target_origin:Jstr.t -> ?transfer:transfer list -> unit -> opts
module Port : sig
type t
external as_target : t -> Ev.target = "%identity"
(** [as_target p] is [p] as an event target. *)
val start : t -> unit
val close : t -> unit
val post : ?opts:opts -> t -> 'a -> unit
(** [post ~opts p v]
{{:-US/docs/Web/API/MessagePort/postMessage} posts} value [v] on port [p] with options [opts] (the [target_origin]
option is meaningless in this case). *)
include Jv.CONV with type t := t
end
module Channel : sig
type t
val create : unit -> t
(** [create ()] is a new message channel. *)
val port1 : t -> Port.t
val port2 : t -> Port.t
(** [port2 c] is the {{:-US/docs/Web/API/MessageChannel/port2}second port} of [c]. The port attached to the context
at the other end of the channel. *)
include Jv.CONV with type t := t
end
(** Broadcast channels.

See the
{{:-US/docs/Web/API/Broadcast_Channel_API}Broadcast Channel API}. *)
module Broadcast_channel : sig
type t
(** The type for
{{:-US/docs/Web/API/BroadcastChannel}
[BroadcastChannel]} objects. *)
val create : Jstr.t -> t
external as_target : t -> Ev.target = "%identity"
val name : t -> Jstr.t
val close : t -> unit
val post : t -> 'a -> unit
(** [post b v]
{{:-US/docs/Web/API/BroadcastChannel/postMessage}sends} [v] to all listeners of {!Brr_io.Message.Ev.message}
on [b]. *)
include Jv.CONV with type t := t
end
val window_post : ?opts:opts -> Window.t -> 'a -> unit
* { 1 : events Events }
module Ev : sig
* { 1 : obj Message event object }
type t
val as_extendable : t -> Ev.Extendable.t
val data : t -> 'a
val origin : t -> Jstr.t
val last_event_id : t -> Jstr.t
val source : t -> Jv.t option
val ports : t -> Port.t list
* { 1 : events Events }
val message : t Ev.type'
val messageerror : t Ev.type'
end
end
module Notification : sig
* { 1 : perm Permission }
module Permission : sig
type t = Jstr.t
val default : t
val denied : t
val granted : t
end
val permission : unit -> Permission.t
val request_permission : unit -> Permission.t Fut.or_error
* { 1 : notifications Notifications }
module Direction : sig
type t = Jstr.t
val auto : t
val ltr : t
val rtl : t
end
module Action : sig
val max : unit -> int
type t
val v : ?icon:Jstr.t -> action:Jstr.t -> title:Jstr.t -> unit -> t
(** [v ~action ()] is an action with given
{{:-US/docs/Web/API/NotificationAction#Properties}properties}. *)
val action : t -> Jstr.t
val title : t -> Jstr.t
val icon : t -> Jstr.t option
include Jv.CONV with type t := t
end
type opts
val opts :
?dir:Direction.t -> ?lang:Jstr.t -> ?body:Jstr.t -> ?tag:Jstr.t ->
?image:Jstr.t -> ?icon:Jstr.t -> ?badge:Jstr.t -> ?timestamp_ms:int ->
?renotify:bool -> ?silent:bool -> ?require_interaction:bool -> ?data:'a ->
?actions:Action.t list -> unit -> opts
type t
type notification = t
val create : ?opts:opts -> Jstr.t -> t
val close : t -> unit
external as_target : t -> Ev.target = "%identity"
* { 1 : props Properties }
val actions : t -> Action.t list
val badge : t -> Jstr.t
val body : t -> Jstr.t
val data : t -> 'a
val dir : t -> Direction.t
val lang : t -> Jstr.t
val tag : t -> Jstr.t
val icon : t -> Jstr.t
val image : t -> Jstr.t
val renotify : t -> bool
val require_interaction : t -> bool
val silent : t -> bool
val timestamp_ms : t -> int
val title : t -> Jstr.t
* { 1 : events Events }
module Ev : sig
* { 1 : obj Notification event object }
type t
val as_extendable : t -> Ev.Extendable.t Ev.t
val notification : t -> notification
val action : t -> Jstr.t
* { 1 : evs Notification events }
val notificationclick : t Ev.type'
val notificationclose : t Ev.type'
(** [notificationclose] is the
{{:-US/docs/Web/API/ServiceWorkerGlobalScope/notificationclose_event}[notificationclose]} event. *)
end
include Jv.CONV with type t := t
end
(** [Storage] objects.

See {{:-US/docs/Web/API/Web_Storage_API}
Web Storage API} *)
module Storage : sig
type t
val local : Window.t -> t
val session : Window.t -> t
val length : t -> int
val key : t -> int -> Jstr.t option
val get_item : t -> Jstr.t -> Jstr.t option
(** [get_item s k] is the
{{:-US/docs/Web/API/Storage/getItem}
value} of [k] in [s]. *)
val set_item : t -> Jstr.t -> Jstr.t -> (unit, Jv.Error.t) result
(** [set_item s k v]
{{:-US/docs/Web/API/Storage/setItem}sets}
the value of [k] to [v] in [s]. An error is returned if the value could
not be set (no permission or quota exceeded). *)
val remove_item : t -> Jstr.t -> unit
(** [remove_item s k]
{{:-US/docs/Web/API/Storage/removeItem}
removes} the value of [k] from [s]. If [k] has no
value this does nothing. *)
val clear : t -> unit
* { 1 : events Events }
module Ev : sig
* { 1 : obj Storage event object }
type storage_area = t
type t
val key : t -> Jstr.t option
val old_value : t -> Jstr.t option
val new_value : t -> Jstr.t option
val url : t -> Jstr.t
val storage_area : t -> storage_area option
* { 1 : events Storage event }
val storage : t Ev.type'
(** [storage] is the type for [storage] events fired
on storage changes. *)
end
include Jv.CONV with type t := t
end
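A hedged sketch of how the interface above composes (user code; [G.window] is assumed from brr, and [Jstr.of_int]/[Jstr.to_int] are assumed conversions from brr's [Jstr] module):

```ocaml
(* Sketch: persist a visit counter in window.localStorage.
   set_item can fail (no permission or quota exceeded), so handle it. *)
open Brr
open Brr_io

let bump_visits () =
  let s = Storage.local G.window in
  let key = Jstr.v "visits" in
  let n = match Storage.get_item s key with
  | Some v -> (match Jstr.to_int v with Some n -> n | None -> 0)
  | None -> 0
  in
  match Storage.set_item s key (Jstr.of_int (n + 1)) with
  | Ok () -> n + 1
  | Error _ -> n  (* keep the old count on failure *)
```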
module Websocket : sig
module Binary_type : sig
type t = Jstr.t
val blob : t
val arraybuffer : t
end
module Ready_state : sig
type t = int
val connecting : t
val open' : t
val closing : t
val closed : t
end
type t
val create : ?protocols:Jv.Jstr.t list -> Jstr.t -> t
(** [create ~protocols url]
{{:-US/docs/Web/API/WebSocket/WebSocket}
creates} a new socket connected to [url]. *)
external as_target : t -> Brr.Ev.target = "%identity"
val binary_type : t -> Binary_type.t
(** [binary_type s] is the
{{:-US/docs/Web/API/WebSocket/binaryType}
type} of binary data received. *)
val set_binary_type : t -> Binary_type.t -> unit
val close : ?code:int -> ?reason:Jstr.t -> t -> unit
* { 1 : props Properties }
val url : t -> Jstr.t
val ready_state : t -> Ready_state.t
val buffered_amount : t -> int
val extensions : t -> Jstr.t
val protocol : t -> Jstr.t
* { 1 : send Sending }
val send_string : t -> Jstr.t -> unit
(** [send_string s d]
{{:-US/docs/Web/API/WebSocket/send}
sends} the UTF-8 encoding of [d] on [s]. *)
val send_blob : t -> Blob.t -> unit
val send_array_buffer : t -> Tarray.Buffer.t -> unit
val send_tarray : t -> ('a, 'b) Tarray.t -> unit
* { 1 : events Events }
module Ev : sig
module Close : sig
type t
val was_clean : t -> bool
(** [was_clean e] is [true] if the closure was clean. *)
val code : t -> int
val reason : t -> Jstr.t
(** [reason e] is the closure reason. *)
end
val close : Close.t Ev.type'
end
include Jv.CONV with type t := t
end
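Incoming data arrives through the generic message event; a hedged sketch of user code against this interface (it assumes brr's [Ev.listen]/[Ev.as_type] and that text payloads decode as [Jstr.t] via [Message.Ev.data]):

```ocaml
(* Sketch: open a socket and log incoming text messages. *)
open Brr
open Brr_io

let watch url =
  let ws = Websocket.create url in
  ignore
    (Ev.listen Message.Ev.message
       (fun e ->
         let data : Jstr.t = Message.Ev.data (Ev.as_type e) in
         Console.(log [data]))
       (Websocket.as_target ws));
  ws
```

Sending with {!send_string} before the socket reaches [Ready_state.open'] would raise; listening for the open event first avoids that.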
---------------------------------------------------------------------------
Copyright ( c ) 2020 The brr programmers
Permission to use , copy , modify , and/or distribute this software for any
purpose with or without fee is hereby granted , provided that the above
copyright notice and this permission notice appear in all copies .
THE SOFTWARE IS PROVIDED " AS IS " AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS . IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL , DIRECT , INDIRECT , OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
RESULTING FROM LOSS OF USE , DATA OR PROFITS , WHETHER IN AN
ACTION OF CONTRACT , NEGLIGENCE OR OTHER TORTIOUS ACTION , ARISING OUT OF
OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE .
---------------------------------------------------------------------------
Copyright (c) 2020 The brr programmers
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
---------------------------------------------------------------------------*)
|
dc91418fb5ed22ba13f66436ca6be13f62ef50b87a417f4ae714674c312afb3c | phronmophobic/clong | libz.clj | (ns com.phronemophobic.libz
(:require [clojure.java.io :as io]
[clojure.string :as str]
[clojure.pprint :refer [pprint]]
[clojure.edn :as edn]
[com.phronemophobic.clong.clang :as clong]
[com.phronemophobic.clong.gen.jna :as gen])
(:import
java.io.PushbackReader
com.sun.jna.Memory
com.sun.jna.Pointer
com.sun.jna.ptr.PointerByReference
com.sun.jna.ptr.LongByReference
com.sun.jna.Structure)
(:gen-class))
(def libz
(com.sun.jna.NativeLibrary/getInstance "z"))
(def api (clong/easy-api "/opt/local/include/zlib.h"))
(gen/def-api libz api)
" 1.2.11 "
(def source "clong!")
(def dest (byte-array 255))
(def dest-size* (doto (LongByReference.)
(.setValue (alength dest))))
0
14
(def dest2 (byte-array (count source)))
(def dest2-size* (doto (LongByReference.)
(.setValue (alength dest2))))
0
" ! "
| null | https://raw.githubusercontent.com/phronmophobic/clong/bb4cb74c14f787b1c1dc8a4c9178d897e25557bb/examples/libz/src/clong/libz.clj | clojure | (ns com.phronemophobic.libz
(:require [clojure.java.io :as io]
[clojure.string :as str]
[clojure.pprint :refer [pprint]]
[clojure.edn :as edn]
[com.phronemophobic.clong.clang :as clong]
[com.phronemophobic.clong.gen.jna :as gen])
(:import
java.io.PushbackReader
com.sun.jna.Memory
com.sun.jna.Pointer
com.sun.jna.ptr.PointerByReference
com.sun.jna.ptr.LongByReference
com.sun.jna.Structure)
(:gen-class))
(def libz
(com.sun.jna.NativeLibrary/getInstance "z"))
(def api (clong/easy-api "/opt/local/include/zlib.h"))
(gen/def-api libz api)
" 1.2.11 "
(def source "clong!")
(def dest (byte-array 255))
(def dest-size* (doto (LongByReference.)
(.setValue (alength dest))))
0
14
(def dest2 (byte-array (count source)))
(def dest2-size* (doto (LongByReference.)
(.setValue (alength dest2))))
0
" ! "
| |
3fad5aa0948a50cacbec1b8a3f0bbb86984c48a5646859d3ea743ab3d96c0ed8 | ocsigen/ocsigenserver | ocsigen_server.mli | Ocsigen
*
* Copyright ( C ) 2005
* , , ,
*
* This program is free software ; you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation , with linking exception ;
* either version 2.1 of the License , or ( at your option ) any later version .
*
* This program is distributed in the hope that it will be useful ,
* but WITHOUT ANY WARRANTY ; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE . See the
* GNU Lesser General Public License for more details .
*
* You should have received a copy of the GNU Lesser General Public License
* along with this program ; if not , write to the Free Software
* Foundation , Inc. , 59 Temple Place - Suite 330 , Boston , MA 02111 - 1307 , USA .
*
* Copyright (C) 2005
* Vincent Balat, Denis Berthod, Nataliya Guts, Jérôme Vouillon
*
* This program is free software; you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, with linking exception;
* either version 2.1 of the License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with this program; if not, write to the Free Software
* Foundation, Inc., 59 Temple Place - Suite 330, Boston, MA 02111-1307, USA.
*)
val section : Lwt_log_core.section
(** use Lwt_log.Section.set_level in order to debug *)
val reload : ?file:string -> unit -> unit
(** Reload the configuration of the server. The optional parameter
[?file] may be used to read the configuration from another
file. *)
val start : ?config:Xml.xml list list -> unit -> unit
(** Start the server. Never returns. *)
module type Config_nested = sig
type t
type 'a key
val key : ?preprocess:('a -> 'a) -> unit -> 'a key
val find : t -> 'a key -> 'a option
val set : t -> 'a key -> 'a -> unit
val unset : t -> 'a key -> unit
type accessor = {accessor : 'a. 'a key -> 'a option}
end
module Site : sig
type t
val create
: ?config_info:Ocsigen_extensions.config_info
-> ?id:[`Attach of t * Ocsigen_lib.Url.path | `Host of string * int option]
-> ?charset:Ocsigen_charset_mime.charset
-> ?auto_load_extensions:bool
-> unit
-> t
module Config : Config_nested with type t := t
type extension
val create_extension
: (Config.accessor -> Ocsigen_extensions.extension)
-> extension
val register : t -> extension -> unit
(**/**)
val create_extension_intrusive
: (Ocsigen_extensions.virtual_hosts
-> Ocsigen_extensions.config_info
-> Ocsigen_lib.Url.path
-> Config.accessor
-> Ocsigen_extensions.extension)
-> extension
(** Lower-level interface for creating extensions that gives the
    extension more info. To be avoided. Currently used by Eliom. *)
end
| null | https://raw.githubusercontent.com/ocsigen/ocsigenserver/d468cf464dcc9f05f820c35f346ffdbe6b9c7931/src/server/ocsigen_server.mli | ocaml | * Reload the configuration of the server. The optional parameter
[?file] may be used to read the configuration from another
file.
* Start the server. Never returns.
*/* | Ocsigen
*
* Copyright ( C ) 2005
* , , ,
*
* This program is free software ; you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation , with linking exception ;
* either version 2.1 of the License , or ( at your option ) any later version .
*
* This program is distributed in the hope that it will be useful ,
* but WITHOUT ANY WARRANTY ; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE . See the
* GNU Lesser General Public License for more details .
*
* You should have received a copy of the GNU Lesser General Public License
* along with this program ; if not , write to the Free Software
* Foundation , Inc. , 59 Temple Place - Suite 330 , Boston , MA 02111 - 1307 , USA .
*
* Copyright (C) 2005
* Vincent Balat, Denis Berthod, Nataliya Guts, Jérôme Vouillon
*
* This program is free software; you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, with linking exception;
* either version 2.1 of the License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with this program; if not, write to the Free Software
* Foundation, Inc., 59 Temple Place - Suite 330, Boston, MA 02111-1307, USA.
*)
val section : Lwt_log_core.section
val reload : ?file:string -> unit -> unit
val start : ?config:Xml.xml list list -> unit -> unit
module type Config_nested = sig
type t
type 'a key
val key : ?preprocess:('a -> 'a) -> unit -> 'a key
val find : t -> 'a key -> 'a option
val set : t -> 'a key -> 'a -> unit
val unset : t -> 'a key -> unit
type accessor = {accessor : 'a. 'a key -> 'a option}
end
module Site : sig
type t
val create
: ?config_info:Ocsigen_extensions.config_info
-> ?id:[`Attach of t * Ocsigen_lib.Url.path | `Host of string * int option]
-> ?charset:Ocsigen_charset_mime.charset
-> ?auto_load_extensions:bool
-> unit
-> t
module Config : Config_nested with type t := t
type extension
val create_extension
: (Config.accessor -> Ocsigen_extensions.extension)
-> extension
val register : t -> extension -> unit
val create_extension_intrusive
: (Ocsigen_extensions.virtual_hosts
-> Ocsigen_extensions.config_info
-> Ocsigen_lib.Url.path
-> Config.accessor
-> Ocsigen_extensions.extension)
-> extension
* Lower - level interface for creating extensions that gives the
extension more info . To be avoided . Currently used by Eliom .
extension more info. To be avoided. Currently used by Eliom. *)
end
|
98fca0e98dc194ab27fffb5e1aa7af36fb4467cf77c53da12fdb2ec11dcbc6b3 | NSO-developer/genet | genet_logging.erl | %%% @doc Logging group definitions, supposed to be used in compile
%%% time by aop.
%%%
%%% This module needs to export function `log_groups/0' which returns
%%% a tuple consisting of log groups and assignment of groups to log
%%% levels. The value is used in compile time to instrument functions
%%% with "advices".
%%% @end
-module(genet_logging).
-export([log_groups/0]).
-include("log_groups.hrl").
log_groups() ->
{?LOG_GROUPS, ?LOG_LEVELS}.
| null | https://raw.githubusercontent.com/NSO-developer/genet/bc22f28c56b9cc9373b8580536e6d99cb724da4e/src/erlang-lib/ec_genet/src/logging/genet_logging.erl | erlang | @doc Logging group definitions, supposed to be used in compile
time by aop.
This module needs to export function `log_groups/0' which returns
a tuple consisting of log groups and assignment of groups to log
levels. The value is used in compile time to instrument functions
with "advices".
@end |
-module(genet_logging).
-export([log_groups/0]).
-include("log_groups.hrl").
log_groups() ->
{?LOG_GROUPS, ?LOG_LEVELS}.
|
23693e97d8d959d417efe7d62e4b7e87e79fd52dd0abbb5f28ebe264ab05e569 | Zulu-Inuoe/clution | ldap.lisp | (in-package :cl-user)
(defpackage quri.uri.ldap
(:use :cl)
(:import-from :quri.uri
:uri
:scheme
:port
:uri-path
:uri-query)
(:import-from :quri.port
:scheme-default-port)
(:import-from :split-sequence
:split-sequence)
(:import-from :alexandria
:when-let)
(:export :uri-ldap
:make-uri-ldap
:uri-ldap-p
:uri-ldaps
:make-uri-ldaps
:uri-ldaps-p
:uri-ldap-dn
:uri-ldap-attributes
:uri-ldap-scope
:uri-ldap-filter
:uri-ldap-extensions))
(in-package :quri.uri.ldap)
(defstruct (uri-ldap (:include uri (scheme "ldap") (port #.(scheme-default-port "ldap")))))
(defstruct (uri-ldaps (:include uri-ldap (scheme "ldaps") (port #.(scheme-default-port "ldaps")))))
(defun uri-ldap-dn (ldap)
(let ((path (uri-path ldap)))
(when (and path
(/= 0 (length path)))
(if (char= (aref path 0) #\/)
(subseq path 1)
path))))
(defun (setf uri-ldap-dn) (new ldap)
(setf (uri-path ldap)
(concatenate 'string "/" new))
new)
(defun nth-uri-ldap-lists (ldap n)
(check-type ldap uri-ldap)
(when-let (query (uri-query ldap))
(car (last (split-sequence #\? query :count n)))))
(defun (setf nth-uri-ldap-lists) (new ldap n)
(check-type ldap uri-ldap)
(check-type new string)
(let ((query (uri-query ldap)))
(setf (uri-query ldap)
(if query
(let ((parts (split-sequence #\? query)))
(with-output-to-string (s)
(dotimes (i n)
(princ (or (pop parts) "") s)
(write-char #\? s))
(princ new s)
(pop parts) ;; ignore
(dolist (part parts)
(write-char #\? s)
(princ part s))))
new))))
(defun uri-ldap-attributes (ldap)
(nth-uri-ldap-lists ldap 1))
(defun (setf uri-ldap-attributes) (new ldap)
(setf (nth-uri-ldap-lists ldap 0) new))
(defun uri-ldap-scope (ldap)
(nth-uri-ldap-lists ldap 2))
(defun (setf uri-ldap-scope) (new ldap)
(setf (nth-uri-ldap-lists ldap 1) new))
(defun uri-ldap-filter (ldap)
(nth-uri-ldap-lists ldap 3))
(defun (setf uri-ldap-filter) (new ldap)
(setf (nth-uri-ldap-lists ldap 2) new))
(defun uri-ldap-extensions (ldap)
(nth-uri-ldap-lists ldap 4))
(defun (setf uri-ldap-extensions) (new ldap)
(setf (nth-uri-ldap-lists ldap 3) new))
| null | https://raw.githubusercontent.com/Zulu-Inuoe/clution/b72f7afe5f770ff68a066184a389c23551863f7f/cl-clution/qlfile-libs/quri-20161204-git/src/uri/ldap.lisp | lisp | ignore | (in-package :cl-user)
(defpackage quri.uri.ldap
(:use :cl)
(:import-from :quri.uri
:uri
:scheme
:port
:uri-path
:uri-query)
(:import-from :quri.port
:scheme-default-port)
(:import-from :split-sequence
:split-sequence)
(:import-from :alexandria
:when-let)
(:export :uri-ldap
:make-uri-ldap
:uri-ldap-p
:uri-ldaps
:make-uri-ldaps
:uri-ldaps-p
:uri-ldap-dn
:uri-ldap-attributes
:uri-ldap-scope
:uri-ldap-filter
:uri-ldap-extensions))
(in-package :quri.uri.ldap)
(defstruct (uri-ldap (:include uri (scheme "ldap") (port #.(scheme-default-port "ldap")))))
(defstruct (uri-ldaps (:include uri-ldap (scheme "ldaps") (port #.(scheme-default-port "ldaps")))))
(defun uri-ldap-dn (ldap)
(let ((path (uri-path ldap)))
(when (and path
(/= 0 (length path)))
(if (char= (aref path 0) #\/)
(subseq path 1)
path))))
(defun (setf uri-ldap-dn) (new ldap)
(setf (uri-path ldap)
(concatenate 'string "/" new))
new)
(defun nth-uri-ldap-lists (ldap n)
(check-type ldap uri-ldap)
(when-let (query (uri-query ldap))
(car (last (split-sequence #\? query :count n)))))
(defun (setf nth-uri-ldap-lists) (new ldap n)
(check-type ldap uri-ldap)
(check-type new string)
(let ((query (uri-query ldap)))
(setf (uri-query ldap)
(if query
(let ((parts (split-sequence #\? query)))
(with-output-to-string (s)
(dotimes (i n)
(princ (or (pop parts) "") s)
(write-char #\? s))
(princ new s)
(dolist (part parts)
(write-char #\? s)
(princ part s))))
new))))
(defun uri-ldap-attributes (ldap)
(nth-uri-ldap-lists ldap 1))
(defun (setf uri-ldap-attributes) (new ldap)
(setf (nth-uri-ldap-lists ldap 0) new))
(defun uri-ldap-scope (ldap)
(nth-uri-ldap-lists ldap 2))
(defun (setf uri-ldap-scope) (new ldap)
(setf (nth-uri-ldap-lists ldap 1) new))
(defun uri-ldap-filter (ldap)
(nth-uri-ldap-lists ldap 3))
(defun (setf uri-ldap-filter) (new ldap)
(setf (nth-uri-ldap-lists ldap 2) new))
(defun uri-ldap-extensions (ldap)
(nth-uri-ldap-lists ldap 4))
(defun (setf uri-ldap-extensions) (new ldap)
(setf (nth-uri-ldap-lists ldap 3) new))
|
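The quri helpers in the record above index into an LDAP URL's query, whose `?`-separated fields are, in order, attributes, scope, filter, and extensions (per the LDAP URL format). A rough Python analogue of `nth-uri-ldap-lists` (a simplified sketch — the Lisp version splits incrementally and rebuilds the query on update):

```python
def ldap_query_field(query, n):
    """Return the n-th '?'-separated field of an LDAP URL query, or None.

    Field indices: attributes=0, scope=1, filter=2, extensions=3.
    """
    if query is None:
        return None
    parts = query.split("?")
    return parts[n] if n < len(parts) else None

query = "cn,mail?sub?(objectClass=person)?x-extension"
assert ldap_query_field(query, 0) == "cn,mail"
assert ldap_query_field(query, 1) == "sub"
assert ldap_query_field(query, 2) == "(objectClass=person)"
```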
dd295e8fcf34d0d858fdf053fa0d62dae04f226c729d225a835d3c32cc2b9347 | jumarko/web-development-with-clojure | doo_runner.cljs | (ns picture-gallery.doo-runner
(:require [doo.runner :refer-macros [doo-tests]]
[picture-gallery.core-test]))
(doo-tests 'picture-gallery.core-test)
| null | https://raw.githubusercontent.com/jumarko/web-development-with-clojure/dfff6e40c76b64e9fcd440d80c7aa29809601b6b/examples/picture-gallery/test/cljs/picture_gallery/doo_runner.cljs | clojure | (ns picture-gallery.doo-runner
(:require [doo.runner :refer-macros [doo-tests]]
[picture-gallery.core-test]))
(doo-tests 'picture-gallery.core-test)
| |
9d2d6f14b1e59b05cf8a63a331c111af4557677caae39089aa9f3ab384d7d3a1 | realworldocaml/book | renaming.mli | (* A renaming is a mapping from type variable name to type variable name.
In definitions such as:
type 'a t =
| A : <type> -> 'b t
| B of 'a
we generate a function that takes an sexp_of parameter named after 'a, but 'a is not in
scope in <type> when handling the constructor A (because A is a gadt constructor).
Instead the type variables in scope are the ones defined in the return type of A,
namely 'b. There could be less or more type variable in cases such as:
type _ less = Less : int less
type _ more = More : ('a * 'a) more
If for instance, <type> is ['b * 'c], when we find 'b, we will look for ['b] in the
renaming and find ['a] (only in that gadt branch, it could be something else in other
branches), at which point we can call the previously bound sexp_of parameter named
after 'a.
If we can't find a resulting name, like when looking up ['c] in the renaming, then we
assume the variable is existentially quantified and treat it as [_] (which is ok,
assuming there are no constraints). *)
open! Base
open! Ppxlib
type t
(** Renaming for contexts outside a type declaration, such as expression extensions. *)
val without_type : unit -> t
(** Renaming for a type declaration. Adds [prefix] to bindings for type parameters. *)
val of_type_declaration : type_declaration -> prefix:string -> t
(** Adds a new name with the given [prefix] for a universally bound type variable. *)
val add_universally_bound : t -> string loc -> prefix:string -> t
module Binding_kind : sig
type t =
| Universally_bound of Fresh_name.t
| Existentially_bound
end
(** Looks up the binding for a type variable. *)
val binding_kind : t -> string -> loc:location -> Binding_kind.t
(** Extends the renaming of a type declaration with GADT context for a constructor
    declaration, if any. *)
val with_constructor_declaration
: t
-> constructor_declaration
-> type_parameters:string list
-> t
| null | https://raw.githubusercontent.com/realworldocaml/book/d822fd065f19dbb6324bf83e0143bc73fd77dbf9/duniverse/ppx_sexp_conv/expander/renaming.mli | ocaml | A renaming is a mapping from type variable name to type variable name.
In definitions such as:
type 'a t =
| A : <type> -> 'b t
| B of 'a
we generate a function that takes an sexp_of parameter named after 'a, but 'a is not in
scope in <type> when handling the constructor A (because A is a gadt constructor).
Instead the type variables in scope are the ones defined in the return type of A,
namely 'b. There could be less or more type variable in cases such as:
type _ less = Less : int less
type _ more = More : ('a * 'a) more
If for instance, <type> is ['b * 'c], when we find 'b, we will look for ['b] in the
renaming and find ['a] (only in that gadt branch, it could be something else in other
branches), at which point we can call the previously bound sexp_of parameter named
after 'a.
If we can't find a resulting name, like when looking up ['c] in the renaming, then we
assume the variable is existentially quantified and treat it as [_] (which is ok,
assuming there are no constraints).
* Renaming for contexts outside a type declaration, such as expression extensions.
* Renaming for a type declaration. Adds [prefix] to bindings for type parameters.
* Adds a new name with the given [prefix] for a universally bound type variable.
* Looks up the binding for a type variable. | open! Base
open! Ppxlib
type t
val without_type : unit -> t
val of_type_declaration : type_declaration -> prefix:string -> t
val add_universally_bound : t -> string loc -> prefix:string -> t
module Binding_kind : sig
type t =
| Universally_bound of Fresh_name.t
| Existentially_bound
end
val binding_kind : t -> string -> loc:location -> Binding_kind.t
val with_constructor_declaration
: t
-> constructor_declaration
-> type_parameters:string list
-> t
|
0bf6939e52631228aa50d30be594ca2a72b3edddf81c68b76fded31bc70c9875 | typedclojure/typedclojure | destructure.clj | (ns typed-test.clj.ext.clojure.core.succeed.destructure
(:import (clojure.lang APersistentVector APersistentMap))
(:require [typed.clojure :as t :refer [ann-form]]))
;; map destructuring
(let [{:keys [b] :or {b 3}} {}]
(ann-form b Number))
;; clojure <= 1.10 kv destructuring
(let* [map__65083 {}
map__65083 (if (seq? map__65083)
(clojure.lang.PersistentHashMap/create (clojure.core/seq map__65083))
map__65083)
b (get map__65083 :b 3)]
(ann-form b Number))
;; clojure >= 1.11 kv destructuring
(let* [map__65083 {}
map__65083 (if (seq? map__65083)
(if (next ^clojure.lang.Seq map__65083)
(clojure.lang.PersistentArrayMap/createAsIfByAssoc (to-array ^clojure.lang.Seq map__65083))
(if (seq ^clojure.lang.Seq map__65083)
(first ^clojure.lang.Seq map__65083)
clojure.lang.PersistentArrayMap/EMPTY))
map__65083)
b (get map__65083 :b 3)]
(ann-form b Number))
(let [{:as c} {}]
(ann-form c '{}))
(let [{:as c} nil]
(ann-form c nil))
(let [{:strs [str] :syms [symb]} (ann-form {} (Extends [(APersistentMap t/Any String)] :without [(clojure.lang.ISeq t/Any)]))]
(ann-form symb (t/U nil String))
(ann-form str (t/U nil String)))
;; vector destructuring
(let [[a b & c :as d] (ann-form [] (APersistentVector Number))]
(ann-form a (t/U nil Number))
(ann-form b (t/U nil Number))
(ann-form c (t/U nil (t/Seqable Number)))
(ann-form d (t/U nil (APersistentVector Number))))
(let [[[x1 y1]
[x2 y2]] [[1 2] [3 4]]]
(ann-form [x1 y1 x2 y2]
(t/Seqable Number)))
(let [[a b & c :as str] "asdjhhfdas"]
;could do a bit better there
(ann-form [a b] (t/Seqable (t/U nil Character)))
(ann-form c (t/U nil (t/Seqable Character)))
(ann-form str String))
;vectors
(let [[a b c & d :as e] [1 2 3 4 5 6 7]]
(ann-form a Number)
(ann-form b Number)
(ann-form c Number)
(ann-form [a b c] (t/Seqable Number))
(ann-form d (t/U nil (t/Seqable Number)))
(ann-form e (t/Seqable Number)))
;lists
(let [[a b c & d :as e] '(1 2 3 4 5 6 7)]
(ann-form [a b c] (t/Seqable Number))
(ann-form d (t/U nil (t/Seqable Number)))
(ann-form e (t/Seqable Number)))
| null | https://raw.githubusercontent.com/typedclojure/typedclojure/5fd7cdf7941c6e7d1dd5df88bf44474fa35e1fca/typed/lib.clojure/test/typed_test/clj/ext/clojure/core/succeed/destructure.clj | clojure | map destructuring
vector destructuring
could do a bit better there
vectors
lists | (ns typed-test.clj.ext.clojure.core.succeed.destructure
(:import (clojure.lang APersistentVector APersistentMap))
(:require [typed.clojure :as t :refer [ann-form]]))
(let [{:keys [b] :or {b 3}} {}]
(ann-form b Number))
clojure < = 1.10 kv destructuring
(let* [map__65083 {}
map__65083 (if (seq? map__65083)
(clojure.lang.PersistentHashMap/create (clojure.core/seq map__65083))
map__65083)
b (get map__65083 :b 3)]
(ann-form b Number))
clojure > = 1.11 kv destructuring
(let* [map__65083 {}
map__65083 (if (seq? map__65083)
(if (next ^clojure.lang.Seq map__65083)
(clojure.lang.PersistentArrayMap/createAsIfByAssoc (to-array ^clojure.lang.Seq map__65083))
(if (seq ^clojure.lang.Seq map__65083)
(first ^clojure.lang.Seq map__65083)
clojure.lang.PersistentArrayMap/EMPTY))
map__65083)
b (get map__65083 :b 3)]
(ann-form b Number))
(let [{:as c} {}]
(ann-form c '{}))
(let [{:as c} nil]
(ann-form c nil))
(let [{:strs [str] :syms [symb]} (ann-form {} (Extends [(APersistentMap t/Any String)] :without [(clojure.lang.ISeq t/Any)]))]
(ann-form symb (t/U nil String))
(ann-form str (t/U nil String)))
(let [[a b & c :as d] (ann-form [] (APersistentVector Number))]
(ann-form a (t/U nil Number))
(ann-form b (t/U nil Number))
(ann-form c (t/U nil (t/Seqable Number)))
(ann-form d (t/U nil (APersistentVector Number))))
(let [[[x1 y1]
[x2 y2]] [[1 2] [3 4]]]
(ann-form [x1 y1 x2 y2]
(t/Seqable Number)))
(let [[a b & c :as str] "asdjhhfdas"]
(ann-form [a b] (t/Seqable (t/U nil Character)))
(ann-form c (t/U nil (t/Seqable Character)))
(ann-form str String))
(let [[a b c & d :as e] [1 2 3 4 5 6 7]]
(ann-form a Number)
(ann-form b Number)
(ann-form c Number)
(ann-form [a b c] (t/Seqable Number))
(ann-form d (t/U nil (t/Seqable Number)))
(ann-form e (t/Seqable Number)))
(let [[a b c & d :as e] '(1 2 3 4 5 6 7)]
(ann-form [a b c] (t/Seqable Number))
(ann-form d (t/U nil (t/Seqable Number)))
(ann-form e (t/Seqable Number)))
|
e345d2fe8596000997e78d169931decdab116278d14da0b98fd1d4c1bae2aba3 | monadbobo/ocaml-core | fqueue.ml | (** Simple implementation of a polymorphic functional queue *)
(** Invariants:
    - iff queue is not empty, outlist is not empty
    - iff queue has more than 1 element, then inlist is not empty
    - queue.length = List.length queue.outlist + List.length queue.inlist
*)
open Std_internal
exception Empty with sexp
type 'a t = { inlist : 'a list; outlist : 'a list; length : int } with bin_io, sexp
let test_invariants queue =
let n_out = List.length queue.outlist in
let n_in = List.length queue.inlist in
assert (queue.length = n_out + n_in);
assert (queue.length = 0 || n_out <> 0);
assert (queue.length <= 1 || n_in <> 0)
let empty = { inlist = []; outlist = []; length = 0 }
let enqueue queue el =
let inlist, outlist =
if queue.length = 0 then [], [el]
else el :: queue.inlist, queue.outlist
in
{ inlist = inlist; outlist = outlist; length = queue.length + 1 }
(** enqueue el on the top of the queue, effectively making it
the least recently enqueued element *)
let enqueue_top queue el =
let inlist, outlist =
if queue.inlist = [] then List.rev queue.outlist, [el]
else queue.inlist, el :: queue.outlist
in
{ inlist = inlist; outlist = outlist; length = queue.length + 1 }
(** returns bottom (most-recently enqueued) item *)
let bot_exn queue =
match queue.inlist, queue.outlist with
| [], [x] | x :: _, _ -> x
| [], [] -> raise Empty
| [], _ :: _ :: _ ->
raise (Bug "Fqueue.bot_exn: empty inlist and outlist with len > 1")
let bot queue = try Some (bot_exn queue) with Empty -> None
(** returns top (least-recently enqueued) item *)
let top_exn queue =
match queue.outlist with
| x :: _ -> x
| [] -> raise Empty
let top queue = try Some (top_exn queue) with Empty -> None
(** returns top of queue and queue with top removed *)
let dequeue_exn queue =
let x, inlist, outlist =
match queue.inlist, queue.outlist with
| [_] as inlist, [x] -> x, [], inlist
| y :: ytl, [x] -> x, [y], List.rev ytl
| inlist, x :: xtl -> x, inlist, xtl
| [], [] -> raise Empty
| _ :: _, [] -> raise (Bug "Fqueue.dequeue_exn: outlist empty, inlist not")
in
x, { inlist = inlist; outlist = outlist; length = queue.length - 1 }
let dequeue queue = try Some (dequeue_exn queue) with Empty -> None
(** returns queue with top removed *)
let discard_exn queue = snd (dequeue_exn queue)
let to_list queue = List.append queue.outlist (List.rev queue.inlist)
let sexp_of_t sexp_of_a q = Sexplib.Conv.sexp_of_list sexp_of_a (to_list q)
let length queue = queue.length
let is_empty queue = queue.length = 0
| null | https://raw.githubusercontent.com/monadbobo/ocaml-core/9c1c06e7a1af7e15b6019a325d7dbdbd4cdb4020/base/core/lib/fqueue.ml | ocaml | * Simple implementation of a polymorphic functional queue
* enqueue el on the top of the queue, effectively making it
the least recently enqueued element
* returns bottom (most-recently enqueued) item
* returns top (least-recently enqueued) item
* returns top of queue and queue with top removed
* returns queue with top removed |
open Std_internal
exception Empty with sexp
type 'a t = { inlist : 'a list; outlist : 'a list; length : int } with bin_io, sexp
let test_invariants queue =
let n_out = List.length queue.outlist in
let n_in = List.length queue.inlist in
assert (queue.length = n_out + n_in);
assert (queue.length = 0 || n_out <> 0);
assert (queue.length <= 1 || n_in <> 0)
let empty = { inlist = []; outlist = []; length = 0 }
let enqueue queue el =
let inlist, outlist =
if queue.length = 0 then [], [el]
else el :: queue.inlist, queue.outlist
in
{ inlist = inlist; outlist = outlist; length = queue.length + 1 }
let enqueue_top queue el =
let inlist, outlist =
if queue.inlist = [] then List.rev queue.outlist, [el]
else queue.inlist, el :: queue.outlist
in
{ inlist = inlist; outlist = outlist; length = queue.length + 1 }
let bot_exn queue =
match queue.inlist, queue.outlist with
| [], [x] | x :: _, _ -> x
| [], [] -> raise Empty
| [], _ :: _ :: _ ->
raise (Bug "Fqueue.bot_exn: empty inlist and outlist with len > 1")
let bot queue = try Some (bot_exn queue) with Empty -> None
let top_exn queue =
match queue.outlist with
| x :: _ -> x
| [] -> raise Empty
let top queue = try Some (top_exn queue) with Empty -> None
let dequeue_exn queue =
let x, inlist, outlist =
match queue.inlist, queue.outlist with
| [_] as inlist, [x] -> x, [], inlist
| y :: ytl, [x] -> x, [y], List.rev ytl
| inlist, x :: xtl -> x, inlist, xtl
| [], [] -> raise Empty
| _ :: _, [] -> raise (Bug "Fqueue.dequeue_exn: outlist empty, inlist not")
in
x, { inlist = inlist; outlist = outlist; length = queue.length - 1 }
let dequeue queue = try Some (dequeue_exn queue) with Empty -> None
let discard_exn queue = snd (dequeue_exn queue)
let to_list queue = List.append queue.outlist (List.rev queue.inlist)
let sexp_of_t sexp_of_a q = Sexplib.Conv.sexp_of_list sexp_of_a (to_list q)
let length queue = queue.length
let is_empty queue = queue.length = 0
|
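The fqueue record above is the classic two-list purely functional queue: `enqueue` conses onto `inlist`, `dequeue` pops from `outlist`, and when `outlist` drains it is refilled from the reversed `inlist`. A minimal Python sketch of the same idea (illustrative helper names; the OCaml version maintains a stronger invariant on `inlist`):

```python
def empty():
    # (inlist, outlist) as tuples; inlist holds newest-first elements.
    return ((), ())

def enqueue(q, x):
    inlist, outlist = q
    if not inlist and not outlist:
        # Invariant: a non-empty queue has a non-empty outlist.
        return ((), (x,))
    return ((x,) + inlist, outlist)

def dequeue(q):
    inlist, outlist = q
    if not outlist:
        raise IndexError("empty queue")
    x, rest = outlist[0], outlist[1:]
    if not rest:
        # Outlist drained: refill it from the reversed inlist.
        return x, ((), tuple(reversed(inlist)))
    return x, (inlist, rest)

q = empty()
for i in (1, 2, 3):
    q = enqueue(q, i)
x, q = dequeue(q)
assert x == 1
```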
1658a0b92110bc01162b5db9270b083cb25665a8ef0617a1f07c72be50591dbb | Simre1/haskell-game | CollisionHandler.hs | module Scene.Level.Initialize.CollisionHandler where
import Control.Monad.IO.Class (MonadIO)
import ECS.Apecs (SystemT, newEntity, Proxy(..), destroy)
import ECS.Physics (mkBeginCB, CollisionHandler(..), Collision(..), Shape, Body, CollisionSource(Between), addPostStepCallback)
import Scene.Level.World (World, Bullet, Player, Enemy)
initializeCollisionHandlers :: MonadIO m => SystemT World m ()
initializeCollisionHandlers = do
createPlayerCollisionHandler >>= newEntity
createEnemiesCollisionHandler >>= newEntity
createBulletCollisionHandler >>= newEntity
pure ()
createEnemiesCollisionHandler :: MonadIO m => SystemT World m CollisionHandler
createEnemiesCollisionHandler = do
begin <- mkBeginCB $ \(Collision _ enemy bullet enemyShape bulletShape) -> do
addPostStepCallback 1 $ do
destroy enemy (Proxy :: Proxy (Enemy, Body))
destroy enemyShape (Proxy :: Proxy Shape)
destroy bullet (Proxy :: Proxy (Bullet, Body))
destroy bulletShape (Proxy :: Proxy Shape)
pure False
pure $ CollisionHandler (Between 2 3) (Just begin) Nothing Nothing Nothing
createPlayerCollisionHandler :: MonadIO m => SystemT World m CollisionHandler
createPlayerCollisionHandler = do
beginCB <- mkBeginCB $ \(Collision _ player bullet playerShape bulletShape) -> do
addPostStepCallback 2 $ do
destroy player (Proxy :: Proxy (Player, Body))
destroy playerShape (Proxy :: Proxy Shape)
destroy bullet (Proxy :: Proxy (Bullet, Body))
destroy bulletShape (Proxy :: Proxy Shape)
pure False
pure $ CollisionHandler (Between 1 4) (Just beginCB) Nothing Nothing Nothing
createBulletCollisionHandler :: MonadIO m => SystemT World m CollisionHandler
createBulletCollisionHandler = do
beginCB <- mkBeginCB $ \(Collision _ bullet1 bullet2 bullet1Shape bullet2Shape) -> do
addPostStepCallback 3 $ do
destroy bullet1 (Proxy :: Proxy (Bullet, Body))
destroy bullet1Shape (Proxy :: Proxy Shape)
destroy bullet2 (Proxy :: Proxy (Bullet, Body))
destroy bullet2Shape (Proxy :: Proxy Shape)
pure False
pure $ CollisionHandler (Between 3 4) (Just beginCB) Nothing Nothing Nothing
| null | https://raw.githubusercontent.com/Simre1/haskell-game/272a0674157aedc7b0e0ee00da8d3a464903dc67/app/Scene/Level/Initialize/CollisionHandler.hs | haskell | module Scene.Level.Initialize.CollisionHandler where
import Control.Monad.IO.Class (MonadIO)
import ECS.Apecs (SystemT, newEntity, Proxy(..), destroy)
import ECS.Physics (mkBeginCB, CollisionHandler(..), Collision(..), Shape, Body, CollisionSource(Between), addPostStepCallback)
import Scene.Level.World (World, Bullet, Player, Enemy)
initializeCollisionHandlers :: MonadIO m => SystemT World m ()
initializeCollisionHandlers = do
createPlayerCollisionHandler >>= newEntity
createEnemiesCollisionHandler >>= newEntity
createBulletCollisionHandler >>= newEntity
pure ()
createEnemiesCollisionHandler :: MonadIO m => SystemT World m CollisionHandler
createEnemiesCollisionHandler = do
begin <- mkBeginCB $ \(Collision _ enemy bullet enemyShape bulletShape) -> do
addPostStepCallback 1 $ do
destroy enemy (Proxy :: Proxy (Enemy, Body))
destroy enemyShape (Proxy :: Proxy Shape)
destroy bullet (Proxy :: Proxy (Bullet, Body))
destroy bulletShape (Proxy :: Proxy Shape)
pure False
pure $ CollisionHandler (Between 2 3) (Just begin) Nothing Nothing Nothing
createPlayerCollisionHandler :: MonadIO m => SystemT World m CollisionHandler
createPlayerCollisionHandler = do
beginCB <- mkBeginCB $ \(Collision _ player bullet playerShape bulletShape) -> do
addPostStepCallback 2 $ do
destroy player (Proxy :: Proxy (Player, Body))
destroy playerShape (Proxy :: Proxy Shape)
destroy bullet (Proxy :: Proxy (Bullet, Body))
destroy bulletShape (Proxy :: Proxy Shape)
pure False
pure $ CollisionHandler (Between 1 4) (Just beginCB) Nothing Nothing Nothing
createBulletCollisionHandler :: MonadIO m => SystemT World m CollisionHandler
createBulletCollisionHandler = do
beginCB <- mkBeginCB $ \(Collision _ bullet1 bullet2 bullet1Shape bullet2Shape) -> do
addPostStepCallback 3 $ do
destroy bullet1 (Proxy :: Proxy (Bullet, Body))
destroy bullet1Shape (Proxy :: Proxy Shape)
destroy bullet2 (Proxy :: Proxy (Bullet, Body))
destroy bullet2Shape (Proxy :: Proxy Shape)
pure False
pure $ CollisionHandler (Between 3 4) (Just beginCB) Nothing Nothing Nothing
| |
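The Haskell record above registers collision begin-callbacks that defer entity destruction to keyed post-step callbacks, since entities cannot safely be removed while the physics solver is iterating. The deferral pattern itself can be sketched in Python (an illustrative sketch with hypothetical names, no game engine assumed):

```python
class World:
    def __init__(self):
        self.entities = {1, 2, 3}
        self._post_step = {}  # key -> callback; keyed so duplicates collapse

    def add_post_step_callback(self, key, fn):
        # First registration for a key wins, mirroring keyed post-step queues.
        self._post_step.setdefault(key, fn)

    def on_collision(self, a, b):
        # Defer destruction instead of mutating the world mid-step.
        self.add_post_step_callback(
            (a, b), lambda: self.entities.difference_update({a, b}))

    def step(self):
        callbacks, self._post_step = list(self._post_step.values()), {}
        for fn in callbacks:
            fn()

w = World()
w.on_collision(1, 2)
w.step()
assert w.entities == {3}
```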
1364d032a6b15af3abaeda45d9725e4fa92ba90538c4e60fd7b701c797703a87 | vbmithr/ocaml-thrift-lib | TThreadedServer.ml |
(*
 Licensed to the Apache Software Foundation (ASF) under one
 or more contributor license agreements. See the NOTICE file
 distributed with this work for additional information
 regarding copyright ownership. The ASF licenses this file
 to you under the Apache License, Version 2.0 (the
 "License"); you may not use this file except in compliance
 with the License. You may obtain a copy of the License at

   http://www.apache.org/licenses/LICENSE-2.0

 Unless required by applicable law or agreed to in writing,
 software distributed under the License is distributed on an
 "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 KIND, either express or implied. See the License for the
 specific language governing permissions and limitations
 under the License.
*)
open Thrift
class t
(pf : Processor.t)
(st : Transport.server_t)
(tf : Transport.factory)
(ipf : Protocol.factory)
(opf : Protocol.factory)=
object
inherit TServer.t pf st tf ipf opf
method serve =
st#listen;
while true do
let tr = tf#getTransport (st#accept) in
ignore (Thread.create
(fun _ ->
let ip = ipf#getProtocol tr in
let op = opf#getProtocol tr in
try
while pf#process ip op do
()
done
with _ -> ()) ())
done
end
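The `serve` method above is the classic thread-per-connection loop: listen, accept forever, and hand each transport to a thread that calls `process` until it returns false, discarding any exception. A rough in-process Python sketch of that loop (names and the transport-list stand-in for `st#accept` are illustrative):

```python
import threading

def serve(transports, process):
    """Illustrative thread-per-connection loop mirroring TThreadedServer:
    each accepted transport gets a worker that calls `process` until it
    returns False, swallowing exceptions as `try ... with _ -> ()` does."""
    workers = []
    for tr in transports:              # stands in for the accept loop
        def worker(tr=tr):
            try:
                while process(tr):
                    pass
            except Exception:
                pass                   # mirrors `with _ -> ()`
        t = threading.Thread(target=worker)
        t.start()
        workers.append(t)
    for t in workers:
        t.join()
    return len(workers)
```
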
| null | https://raw.githubusercontent.com/vbmithr/ocaml-thrift-lib/70669ce410e99389975384b46492f9a55281ca10/src/TThreadedServer.ml | ocaml |
036140b79c33cf3ae0ad5ac2d22090360d44da97265cb566843a0f48bd507c84 | mbutterick/aoc-racket | star1.rkt | 360603
2
2
1
2
-2
2
1
1
-4
-2
1
-3
-9
2
0
1
-8
-9
-12
2
-5
-8
0
-11
-3
-1
2
-20
-21
-18
-25
-25
-16
-18
-27
-25
-17
-13
-9
-35
-25
-1
-20
-22
-11
-3
-26
-32
-33
-21
-3
-42
-4
-48
-36
-15
-20
-29
-32
-12
-53
-32
0
-4
-2
-32
-55
-49
-23
-18
-34
-39
-70
-4
-64
-58
-66
-59
1
-36
-19
-38
-55
-25
-15
-65
-66
-65
-58
-53
-54
-18
-24
-79
-10
-73
-90
-11
-94
-79
-64
-74
-41
-85
-66
-38
-84
-46
-101
-38
-33
-107
-38
-70
-101
-7
-37
-94
1
-69
-42
-71
-15
-36
-70
-76
-42
-118
-39
-97
-27
-30
-6
-70
-23
-113
-103
-89
-35
-43
-11
-97
-137
-19
-6
-21
-107
-93
-50
-107
-46
-90
-51
-115
-134
-122
-88
-122
-74
-30
-10
-53
-12
-152
-68
-110
-123
-111
-40
-164
-10
-168
-38
-71
-67
-171
-146
1
-33
-13
-168
-106
-135
-145
-163
-125
-12
-156
-172
-69
0
-15
-137
-1
-21
0
-76
-48
-31
-86
-140
-9
-179
-45
-106
-85
-131
-180
-190
-3
-7
-89
-44
-106
-40
-171
-59
-214
-158
-160
-89
-59
-13
0
-215
-110
-204
-39
-171
-44
-173
-112
-153
-155
-85
-113
-226
-74
-104
-152
-5
-187
-171
-12
-165
-190
-152
-49
-177
-179
-123
-158
-131
-244
-143
-127
-124
-125
-170
-211
-201
-97
-33
-47
-243
-136
-243
-89
-192
-217
-105
-116
-268
-40
-154
-262
-223
-96
-20
-66
-239
-42
-24
-112
-162
-227
-52
-81
-80
-3
-33
-152
-33
-168
-48
-245
-87
-74
-139
-295
-17
-73
-180
-48
-182
-162
-59
-296
-37
-241
-67
-83
-218
-124
-136
-303
-38
-85
-205
-137
-14
-248
-13
-212
-266
-304
-294
-54
-145
-22
-312
-298
-23
-5
-126
-307
-190
-294
-43
-293
-339
-129
-223
-231
-59
-104
-60
-239
-11
-291
-242
-115
-117
-89
-7
-57
-280
-70
-70
-6
-35
-124
-253
-120
-322
-45
-187
-288
-237
-76
-192
-179
-269
-14
-269
-227
-374
-84
-15
-144
-266
-256
-162
-336
-266
-31
-367
-172
-156
-157
-220
-189
-152
-336
-253
-36
-324
-222
-175
-313
-289
-65
-123
-229
-204
-44
-72
-20
-78
-43
-226
-93
-399
-10
-150
-153
-221
-7
-302
-237
-288
-172
-233
-119
-247
-33
-170
-46
-424
-403
-40
-65
-116
-372
-148
-336
-287
-431
-134
-354
-14
-162
-228
-152
-119
-240
-356
-305
-15
-77
-303
-189
-205
-380
-175
-316
-396
-44
-42
-142
-363
-418
-313
-322
-307
-363
-454
-157
-308
-361
-398
-347
-249
-57
-453
-167
-194
-29
-443
-45
-359
-445
-89
-136
-60
-446
-97
-282
-298
-245
-91
-130
-423
-387
-492
-282
-415
-64
-294
-468
-251
-323
-210
-490
-142
-121
-393
-251
-265
-268
-135
-224
-116
-7
-27
-165
-265
-352
-226
-25
-327
-490
-493
-80
-501
-522
-436
-36
-464
-47
-349
-91
-460
-167
-436
-207
-41
-386
-461
-430
-412
-161
-460
-488
-207
-221
-17
-265
-115
-36
-499
-111
-518
-390
-526
-280
-150
-405
-458
-503
-527
-303
-383
-154
-317
-64
-379
-564
-385
-490
-524
-440
-69
-317
-377
-53
-503
-136
-71
-190
-19
-238
-572
-281
-294
-538
-146
-130
-406
-236
-223
-234
-16
-436
-391
-249
-508
-153
-41
-380
-229
-489
-466
-94
-596
-96
-544
-532
-106
-349
-34
-162
-6
-495
-287
-82
-307
-466
-358
-272
-282
-220
-84
-168
-124
-271
-307
-164
-627
-442
-348
-368
-548
-574
-248
-144
-516
-369
-319
-215
-476
-191
-326
-462
-286
-565
-395
-190
-353
-180
-88
-403
-520
-203
-268
-198
-374
-636
-565
-76
-447
-118
-658
-311
-468
-214
-215
-68
-370
-179
-286
-394
-532
-493
-339
-471
-344
-274
-512
-90
-306
-222
-348
-331
-670
-73
-136
-358
-518
-521
-41
-204
-476
-582
-633
-44
-133
-410
-658
-443
-158
-76
-162
-509
-150
-304
-407
-210
-667
-279
-109
-469
-680
-702
-101
-159
-197
-134
-154
-270
-90
-568
-237
-95
-328
-23
-493
-310
-649
-283
-31
-234
-126
-244
-312
-409
-84
-522
-72
-458
-190
-707
-338
-433
-709
-81
-693
-367
-302
-260
-322
-282
-379
-401
-60
-126
-645
-573
-91
-685
-312
-165
-217
-594
-386
-357
-72
-426
-354
-246
-278
-451
-37
-484
-232
-638
-569
-376
-278
-554
-666
-191
-513
-564
-502
-640
-531
-1
-667
-146
-20
-375
-771
-263
-754
-182
-190
-626
-249
-418
-278
-722
-676
-161
-666
-462
-488
-70
-569
-779
-642
-121
-556
-146
-531
-323
-116
-100
-76
-260
-339
-743
-44
-811
-137
-456
-462
-469
-500
-650
-681
-424
-336
-445
-172
-604
-776
-133
-829
-72
-88
-219
-247
-269
-205
-532
-222
-232
-201
-762
-105
-478
-263
-177
-475
-584
-439
-316
-722
-371
-35
-132
-417
-385
-695
-76
-452
-201
-848
-359
-731
-721
-211
-698
-502
-113
-271
-31
-66
-588
-794
-189
-417
-715
-96
-95
-691
-587
-193
-788
-445
-761
-347
-696
-34
-14
-358
-633
-152
-140
-371
-795
-500
-477
-617
-83
-462
-338
-167
-345
-99
-806
-798
-447
-338
-263
-76
-382
-913
-577
-657
-69
-344
-853
-679
-204
-560
-238
-455
1
-1
-163
-536
-344
-386
-600
-915
-245
-717
-914
-192
-931
-31
-230
-42
-531
-122
-66
-347
-476
-3
-506
-396
-839
-365
-12
-453
-247
-448
-369
-661
-451
-175
-64
-805
-63
-597
-451
-350
-77
-958
-525
-194
-116
-398
-100
-687
-758
-162
-424
-920
-162
-577
-549
-250
-594
-853
-44
-34
-882
-656
-358
-425
-592
-257
-963
-295
-561
-970
-885
-968
-112
-111
-734
-375
-825
-462
-333
-154
-490
-1004
-391
-925
-175
-299
-985
-432
-165
-721
-289
-817
-393
-831
-697
-599
-145
-977
-550
-577
-249
-743
-711
-4
-442
-252
-897
-130
-528
-906
-809
-228
-548
-695
-912
-676
-936
-209
-312
-951
-671
-898
-205
-730
-873
-798
-943 | null | https://raw.githubusercontent.com/mbutterick/aoc-racket/2c6cb2f3ad876a91a82f33ce12844f7758b969d6/2017/d05/star1.rkt | racket |
a40447f7b00149082851e62d1550273b902a1c86adabfddca10ddf7d3e2bf0be | prowdsponsor/fb | Graph.hs | {-# LANGUAGE ConstraintKinds, CPP, DeriveDataTypeable, FlexibleContexts, OverloadedStrings #-}
module Facebook.Graph
( getObject
, postObject
, deleteObject
, searchObjects
, (#=)
, SimpleType(..)
, Place(..)
, Location(..)
, GeoCoordinates(..)
, Tag(..)
) where
import Control.Applicative
import Control.Monad (mzero)
import Control.Monad.Trans.Control (MonadBaseControl)
import Data.ByteString.Char8 (ByteString)
import Data.Int (Int8, Int16, Int32, Int64)
import Data.List (intersperse)
import Data.Text (Text)
import Data.Typeable (Typeable)
import Data.Word (Word, Word8, Word16, Word32, Word64)
#if MIN_VERSION_time(1,5,0)
import Data.Time (defaultTimeLocale)
#else
import System.Locale (defaultTimeLocale)
#endif
import qualified Control.Monad.Trans.Resource as R
import qualified Data.Aeson as A
import qualified Data.Aeson.Encode as AE (fromValue)
import qualified Data.ByteString.Char8 as B
import qualified Data.Text.Encoding as TE
import qualified Data.Text.Lazy as TL
import qualified Data.Text.Lazy.Builder as TLB
import qualified Data.Time as TI
import qualified Network.HTTP.Conduit as H
import qualified Network.HTTP.Types as HT
import Facebook.Auth
import Facebook.Base
import Facebook.Monad
import Facebook.Types
import Facebook.Pager
-- | Make a raw @GET@ request to Facebook's Graph API.
getObject :: (R.MonadResource m, MonadBaseControl IO m, A.FromJSON a) =>
Text -- ^ Path (should begin with a slash @\/@)
-> [Argument] -- ^ Arguments to be passed to Facebook
-> Maybe (AccessToken anyKind) -- ^ Optional access token
-> FacebookT anyAuth m a
getObject path query mtoken =
runResourceInFb $
asJson =<< fbhttp =<< fbreq path mtoken query
-- | Make a raw @POST@ request to Facebook's Graph API.
postObject :: (R.MonadResource m, MonadBaseControl IO m, A.FromJSON a) =>
Text -- ^ Path (should begin with a slash @\/@)
-> [Argument] -- ^ Arguments to be passed to Facebook
-> AccessToken anyKind -- ^ Access token
-> FacebookT Auth m a
postObject = methodObject HT.methodPost
-- | Make a raw @DELETE@ request to Facebook's Graph API.
deleteObject :: (R.MonadResource m, MonadBaseControl IO m, A.FromJSON a) =>
Text -- ^ Path (should begin with a slash @\/@)
-> [Argument] -- ^ Arguments to be passed to Facebook
-> AccessToken anyKind -- ^ Access token
-> FacebookT Auth m a
deleteObject = methodObject HT.methodDelete
-- | Helper function used by 'postObject' and 'deleteObject'.
methodObject :: (R.MonadResource m, MonadBaseControl IO m, A.FromJSON a) =>
HT.Method
-> Text -- ^ Path (should begin with a slash @\/@)
-> [Argument] -- ^ Arguments to be passed to Facebook
-> AccessToken anyKind -- ^ Access token
-> FacebookT Auth m a
methodObject method path query token =
runResourceInFb $ do
req <- fbreq path (Just token) query
asJson =<< fbhttp req { H.method = method }
-- | Make a raw @GET@ request to the /search endpoint of Facebook’s
-- Graph API. Returns a raw JSON 'A.Value'.
searchObjects :: (R.MonadResource m, MonadBaseControl IO m, A.FromJSON a) =>
Text -- ^ A Facebook object type to search for
-> Text -- ^ The keyword to search for
-> [Argument] -- ^ Additional arguments to pass
-> Maybe UserAccessToken -- ^ Optional access token
-> FacebookT anyAuth m (Pager a)
searchObjects objectType keyword query = getObject "/search" query'
where query' = ("q" #= keyword) : ("type" #= objectType) : query
----------------------------------------------------------------------
-- | Create an 'Argument' with a 'SimpleType'. See the docs on
-- 'createAction' for an example.
(#=) :: SimpleType a => ByteString -> a -> Argument
p #= v = (p, encodeFbParam v)
-- | Class for data types that may be represented as a Facebook
-- simple type. (see
-- </>).
class SimpleType a where
encodeFbParam :: a -> B.ByteString
-- | Facebook's simple type @Boolean@.
instance SimpleType Bool where
encodeFbParam b = if b then "1" else "0"
-- | Facebook's simple type @DateTime@ with only the date.
instance SimpleType TI.Day where
encodeFbParam = B.pack . TI.formatTime defaultTimeLocale "%Y-%m-%d"
-- | Facebook's simple type @DateTime@.
instance SimpleType TI.UTCTime where
encodeFbParam = B.pack . TI.formatTime defaultTimeLocale "%Y%m%dT%H%MZ"
-- | Facebook's simple type @DateTime@.
instance SimpleType TI.ZonedTime where
encodeFbParam = encodeFbParam . TI.zonedTimeToUTC
-- @Enum@ doesn't make sense to support as a data type.
-- | Facebook's simple type @Float@ with less precision than supported.
instance SimpleType Float where
encodeFbParam = showBS
-- | Facebook's simple type @Float@.
instance SimpleType Double where
encodeFbParam = showBS
-- | Facebook's simple type @Integer@.
instance SimpleType Int where
encodeFbParam = showBS
-- | Facebook's simple type @Integer@.
instance SimpleType Word where
encodeFbParam = showBS
-- | Facebook's simple type @Integer@.
instance SimpleType Int8 where
encodeFbParam = showBS
-- | Facebook's simple type @Integer@.
instance SimpleType Word8 where
encodeFbParam = showBS
-- | Facebook's simple type @Integer@.
instance SimpleType Int16 where
encodeFbParam = showBS
-- | Facebook's simple type @Integer@.
instance SimpleType Word16 where
encodeFbParam = showBS
-- | Facebook's simple type @Integer@.
instance SimpleType Int32 where
encodeFbParam = showBS
-- | Facebook's simple type @Integer@.
instance SimpleType Word32 where
encodeFbParam = showBS
-- | Facebook's simple type @Integer@.
instance SimpleType Int64 where
encodeFbParam = showBS
-- | Facebook's simple type @Integer@.
instance SimpleType Word64 where
encodeFbParam = showBS
-- | Facebook's simple type @String@.
instance SimpleType Text where
encodeFbParam = TE.encodeUtf8
-- | Facebook's simple type @String@.
instance SimpleType ByteString where
encodeFbParam = id
-- | An object's 'Id' code.
instance SimpleType Id where
encodeFbParam = TE.encodeUtf8 . idCode
-- | 'Permission' is a @newtype@ of 'Text'
instance SimpleType Permission where
encodeFbParam = encodeFbParam . unPermission
-- | A comma-separated list of simple types. This definition
-- doesn't work everywhere, just for a few combinations that
-- Facebook uses (e.g. @[Int]@). Also, encoding a list of lists
-- is the same as encoding the concatenation of all lists. In
-- other words, this instance is here more for your convenience
-- than to make sure your code is correct.
instance SimpleType a => SimpleType [a] where
encodeFbParam = B.concat . intersperse "," . map encodeFbParam
showBS :: Show a => a -> B.ByteString
showBS = B.pack . show
----------------------------------------------------------------------
-- | Information about a place. This is not a Graph Object,
-- instead it's just a field of a Object. (Not to be confused
-- with the object.)
data Place =
Place { placeId :: Id -- ^ Place's ID.
, placeName :: Maybe Text -- ^ Place's name.
, placeLocation :: Maybe Location
}
deriving (Eq, Ord, Show, Read, Typeable)
instance A.FromJSON Place where
parseJSON (A.Object v) =
Place <$> v A..: "id"
<*> v A..:? "name"
<*> v A..:? "location"
parseJSON _ = mzero
-- | A geographical location.
data Location =
Location { locationStreet :: Maybe Text
, locationCity :: Maybe Text
, locationState :: Maybe Text
, locationCountry :: Maybe Text
, locationZip :: Maybe Text
, locationCoords :: Maybe GeoCoordinates
}
deriving (Eq, Ord, Show, Read, Typeable)
instance A.FromJSON Location where
parseJSON obj@(A.Object v) =
Location <$> v A..:? "street"
<*> v A..:? "city"
<*> v A..:? "state"
<*> v A..:? "country"
<*> v A..:? "zip"
<*> A.parseJSON obj
parseJSON _ = mzero
-- | Geographical coordinates.
data GeoCoordinates =
GeoCoordinates { latitude :: !Double
, longitude :: !Double
}
deriving (Eq, Ord, Show, Read, Typeable)
instance A.FromJSON GeoCoordinates where
parseJSON (A.Object v) =
GeoCoordinates <$> v A..: "latitude"
<*> v A..: "longitude"
parseJSON _ = mzero
instance SimpleType GeoCoordinates where
encodeFbParam c =
let obj = A.object [ "latitude" A..= latitude c
, "longitude" A..= longitude c]
toBS = TE.encodeUtf8 . TL.toStrict . TLB.toLazyText . AE.fromValue
in toBS obj
-- | A tag (i.e. \"I'll /tag/ you on my post\").
data Tag =
Tag { tagId :: Id -- ^ Who is tagged.
, tagName :: Text -- ^ Name of the tagged person.
}
deriving (Eq, Ord, Show, Read, Typeable)
instance A.FromJSON Tag where
parseJSON (A.Object v) =
Tag <$> v A..: "id"
<*> v A..: "name"
parseJSON _ = mzero
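The list instance above documents a caveat: lists are comma-joined, so encoding a list of lists is indistinguishable from encoding their concatenation. A small Python sketch of that behaviour (illustrative only; `encode_fb_param` is not part of the fb package):

```python
def encode_fb_param(value):
    """Sketch of the SimpleType comma-join semantics described above."""
    if isinstance(value, bool):
        return "1" if value else "0"      # Facebook's Boolean simple type
    if isinstance(value, list):
        # A list is comma-separated; nested lists flatten, so
        # [[1, 2], [3]] encodes exactly like [1, 2, 3] -- the caveat
        # the Haskell instance documents.
        return ",".join(encode_fb_param(v) for v in value)
    return str(value)
```
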
| null | https://raw.githubusercontent.com/prowdsponsor/fb/596f2633e017e1765c58a913949b4c4948a5455e/src/Facebook/Graph.hs | haskell |
26fd876323313c6c9e3ba9e0086c8938900beda61cdaf55ef8aa2bfcd3ba758e | mdaley/charlatan | core_test.clj | (ns charlatan.core-test
(:require [cheshire.core :as json]
[clj-http.client :as http]
[clojure.test :refer :all]
[charlatan.core :refer :all]
[charlatan.mountebank :as mb]))
(def mb-port 2525)
(def port 8080)
(deftest simple-response-to-ping
(testing "mountebank simple response stub"
(let [m (mb/start {:port mb-port})]
(try
(mb/create-imposter mb-port port
{:stubs [{:responses [{:is {:statusCode 202
:body "pong"}}]
:predicates [{:equals {:method "GET"
:path "/ping"}}]}]})
(let [response (http/get (str "http://localhost:" port "/ping")
{:throw-exceptions false})]
(is (= (:status response) 202))
(is (= (:body response) "pong"))
)
(finally
(mb/stop m))))))
(deftest test-with-approach
(testing "mountebank simple response stub inside with-mb macro"
(mb/with-mb {:port 2567}
(mb/create-imposter port {:stubs [{:responses [{:is {:statusCode 202
:body "pong"}}]
:predicates [{:equals {:method "GET"
:path "/ping"}}]}]})
(let [{status :status body :body} (http/get (str "http://localhost:" port "/ping")
{:throw-exceptions false})]
(is (= status 202))
(is (= body "pong"))))))
(deftest get-config
(testing "get config works"
(mb/with-mb {:port 2525}
(let [{status :status body :body} (mb/get-config)]
(is (= status 200))
(is (= (-> body :options :port) "2525"))))))
(deftest get-imposter
(testing "get imposter works"
(mb/with-mb {:port 2525 :debug true}
(mb/create-imposter port {:stubs [{:responses [{:is {:statusCode 202
:body "pong"}}]
:predicates [{:equals {:method "GET"
:path "/ping"}}]}]})
(let [{status :status body :body} (http/get (str "http://localhost:" port "/ping")
{:throw-exceptions false})]
(is (= status 202))
(is (= body "pong")))
(let [{status :status body :body} (mb/get-imposter port)
match (-> body :stubs (first) :matches (first) :request)]
(is (= (select-keys match [:method :path]) {:method "GET" :path "/ping"}))))))
(deftest service-healthcheck
(testing "Healthcheck returns failure response when remote service ping fails"
(mb/with-mb {:port 2525 :debug true}
(mb/create-imposter 8081 {:stubs [{:responses [{:is {:statusCode 500}}]
:predicates [{:equals {:method "GET"
:path "/ping"}}]}]})
(let [{status :status body :body} (http/get "http://localhost:8080/healthcheck"
{:throw-exceptions false})]
(is (= status 200))
(is (= body "Something useful that says remote service is broken"))))))
| null | https://raw.githubusercontent.com/mdaley/charlatan/8115f957187aa015428b686a099767b727cc8930/test/charlatan/core_test.clj | clojure | (ns charlatan.core-test
(:require [cheshire.core :as json]
[clj-http.client :as http]
[clojure.test :refer :all]
[charlatan.core :refer :all]
[charlatan.mountebank :as mb]))
(def mb-port 2525)
(def port 8080)
(deftest simple-response-to-ping
(testing "mountebank simple response stub"
(let [m (mb/start {:port mb-port})]
(try
(mb/create-imposter mb-port port
{:stubs [{:responses [{:is {:statusCode 202
:body "pong"}}]
:predicates [{:equals {:method "GET"
:path "/ping"}}]}]})
(let [response (http/get (str ":" port "/ping")
{:throw-exceptions false})]
(is (= (:status response) 202))
(is (= (:body response) "pong"))
)
(finally
(mb/stop m))))))
(deftest test-with-approach
(testing "mountebank simple response stub inside with-mb macro"
(mb/with-mb {:port 2567}
(mb/create-imposter port {:stubs [{:responses [{:is {:statusCode 202
:body "pong"}}]
:predicates [{:equals {:method "GET"
:path "/ping"}}]}]})
(let [{status :status body :body} (http/get (str ":" port "/ping")
{:throw-exceptions false})]
(is (= status 202))
(is (= body "pong"))))))
(deftest get-config
(testing "get config works"
(mb/with-mb {:port 2525}
(let [{status :status body :body} (mb/get-config)]
(is (= status 200))
(is (= (-> body :options :port) "2525"))))))
(deftest get-imposter
(testing "get imposter works"
(mb/with-mb {:port 2525 :debug true}
(mb/create-imposter port {:stubs [{:responses [{:is {:statusCode 200
:body "pong"}}]
:predicates [{:equals {:method "GET"
:path "/ping"}}]}]})
(let [{status :status body :body} (http/get (str ":" port "/ping")
{:throw-exceptions false})]
(is (= status 202))
(is (= body "pong")))
(let [{status :status body :body} (mb/get-imposter port)
match (-> body :stubs (first) :matches (first) :request)]
(is (= (select-keys match [:method :path]) {:method "GET" :path "/ping"}))))))
(deftest service-healthcheck
(testing "Healthcheck returns failure response when remote service ping fails"
(mb/with-mb {:port 2525 :debug true}
(mb/create-imposter 8081 {:stubs [{:responses [{:is {:statusCode 500}}]
:predicates [{:equals {:method "GET"
:path "/ping"}}]}]})
(let [{status :status body :body} (http/get ":8080/healthcheck"
{:throw-exceptions false})]
(is (= status 200))
(is (= body "Something useful that says remote service is broken"))))))
| |
954946ef5d67d4d67723dd3c6e8213790b77a46b14f48866a65b622c550abe17 | blockchain-unica/defi-workbench | wallet.mli | (* type of the wallet *)
type t
(* type of the users' balance *)
type bt
val make : Address.t -> bt -> t
val empty : Address.t -> t
val balance : Token.t -> t -> int
val update : Token.t -> int -> t -> t
val get_address : t -> Address.t
val get_balance : t -> bt
val set_balance : bt -> t -> t
val balance_of_list : (Token.t * int) list -> bt
val list_of_balance : bt -> (Token.t * int) list
val to_string : t -> string
| null | https://raw.githubusercontent.com/blockchain-unica/defi-workbench/a3606bf425cef8e81d1f12b13dc1410a48f3e860/lib/wallet.mli | ocaml | type of the wallet
type of the users' balance | type t
type bt
val make : Address.t -> bt -> t
val empty : Address.t -> t
val balance : Token.t -> t -> int
val update : Token.t -> int -> t -> t
val get_address : t -> Address.t
val get_balance : t -> bt
val set_balance : bt -> t -> t
val balance_of_list : (Token.t * int) list -> bt
val list_of_balance : bt -> (Token.t * int) list
val to_string : t -> string
|
a1ab8dae0bf4234b2d3ccd075970f1b629898da737f80507e17f6e65fdb6797e | johnyob/dromedary | structure.mli | (*****************************************************************************)
(* *)
Dromedary
(* *)
, University of Cambridge
(* *)
Copyright 2021 .
(* *)
All rights reserved . This file is distributed under the terms of the MIT
(* license, as described in the file LICENSE. *)
(* *)
(*****************************************************************************)
(* This module implements signatured used for unification structures. *)
open! Import
module type Identifiable = sig
type 'a t
val id : 'a t -> int
end
module Rigid_var : sig
type t = private int [@@deriving sexp_of, compare]
val make : unit -> t
val hash : t -> int
end
module type S = Unifier.Structure.S
module Of_former (Former : Type_former.S) : sig
include S with type 'a t = 'a Former.t and type 'a ctx = unit
end
module First_order (S : S) : sig
type 'a t =
| Var
| Structure of 'a S.t
include S with type 'a t := 'a t and type 'a ctx = 'a S.ctx
end
module Ambivalent (S : S) : sig
module Rigid_type : sig
type t [@@deriving sexp_of]
val make_var : unit -> t
val make_rigid_var : Rigid_var.t -> t
val make_structure : t S.t -> t
end
module Equations : sig
module Scope : sig
(** [t] represents the "scope" of the equation. It is used to track
consistency in level-based generalization *)
type t = int
val outermost_scope : t
val max : t -> t -> t
end
module Ctx : sig
* [ t ] represents the equational scope used for Ambivalence
type t
(** [empty] is the empty equational context. *)
val empty : t
exception Inconsistent
* [ add t type1 type2 scope ] adds the equation [ type1 = type2 ]
in the scope [ scope ] .
in the scope [scope]. *)
val add
: ctx:Rigid_type.t S.ctx
-> t
-> Rigid_type.t
-> Rigid_type.t
-> Scope.t
-> t
end
end
(** ['a t] represents an ambivalent structure. *)
type 'a t
(** ['a repr] is the representation of ['a t]. *)
type 'a repr =
| Rigid_var of Rigid_var.t
| Structure of 'a S.t
(** [make repr] creates an ambivalent structure with representation [repr]. *)
val make : 'a repr -> 'a t
(** [repr t] returns the representation of [t]. *)
val repr : 'a t -> 'a repr
(** [scope t] returns the equational scope of [t]. *)
val scope : 'a t -> Equations.Scope.t
* [ update_scope t scope ] updates the scope of [ t ] .
val update_scope : 'a t -> Equations.Scope.t -> unit
type 'a ctx =
{ equations_ctx : Equations.Ctx.t
; make : 'a t -> 'a
; super_ : 'a S.ctx
}
include S with type 'a t := 'a t and type 'a ctx := 'a ctx
end
module Abbreviations (S : S) (Id : Identifiable with type 'a t := 'a S.t) : sig
module Abbrev : sig
module Type : sig
type t [@@deriving sexp_of, compare]
val make_var : unit -> t
val make_structure : t S.t -> t
end
type t
val make : Type.t S.t -> Type.t -> t
module Ctx : sig
type abbrev := t
type t
val empty : t
val add : t -> abbrev:abbrev -> t
end
end
type 'a t
val make : 'a S.t -> 'a t
val repr : 'a t -> 'a S.t
type 'a ctx =
{ abbrev_ctx : Abbrev.Ctx.t
; make_structure : 'a S.t -> 'a
; make_var : unit -> 'a
; super_ : 'a S.ctx
}
include S with type 'a t := 'a t and type 'a ctx := 'a ctx
end
module Rows (Label : Comparable.S) (S : S) : sig
type 'a t =
| Structure of 'a S.t
| Row_cons of Label.t * 'a * 'a
| Row_uniform of 'a
type 'a ctx =
{ make_var : unit -> 'a
; make_structure : 'a t -> 'a
; super_ : 'a S.ctx
}
include S with type 'a t := 'a t and type 'a ctx := 'a ctx
end
| null | https://raw.githubusercontent.com/johnyob/dromedary/721e6ac3df867c54af669d5a68cd9536341a1910/lib/constraints/structure.mli | ocaml | ***************************************************************************
license, as described in the file LICENSE.
***************************************************************************
This module implements signatured used for unification structures.
* [t] represents the "scope" of the equation. It is used to track
consistency in level-based generalization
* [empty] is the empty equational context.
* ['a t] represents an ambivalent structure.
* ['a repr] is the representation of ['a t].
* [make repr] creates an ambivalent structure with representation [repr].
* [repr t] returns the representation of [t].
* [scope t] returns the equational scope of [t]. | Dromedary
, University of Cambridge
Copyright 2021 .
All rights reserved . This file is distributed under the terms of the MIT
open! Import
module type Identifiable = sig
type 'a t
val id : 'a t -> int
end
module Rigid_var : sig
type t = private int [@@deriving sexp_of, compare]
val make : unit -> t
val hash : t -> int
end
module type S = Unifier.Structure.S
module Of_former (Former : Type_former.S) : sig
include S with type 'a t = 'a Former.t and type 'a ctx = unit
end
module First_order (S : S) : sig
type 'a t =
| Var
| Structure of 'a S.t
include S with type 'a t := 'a t and type 'a ctx = 'a S.ctx
end
module Ambivalent (S : S) : sig
module Rigid_type : sig
type t [@@deriving sexp_of]
val make_var : unit -> t
val make_rigid_var : Rigid_var.t -> t
val make_structure : t S.t -> t
end
module Equations : sig
module Scope : sig
type t = int
val outermost_scope : t
val max : t -> t -> t
end
module Ctx : sig
* [ t ] represents the equational scope used for Ambivalence
type t
val empty : t
exception Inconsistent
* [ add t type1 type2 scope ] adds the equation [ type1 = type2 ]
in the scope [ scope ] .
in the scope [scope]. *)
val add
: ctx:Rigid_type.t S.ctx
-> t
-> Rigid_type.t
-> Rigid_type.t
-> Scope.t
-> t
end
end
type 'a t
type 'a repr =
| Rigid_var of Rigid_var.t
| Structure of 'a S.t
val make : 'a repr -> 'a t
val repr : 'a t -> 'a repr
val scope : 'a t -> Equations.Scope.t
* [ update_scope t scope ] updates the scope of [ t ] .
val update_scope : 'a t -> Equations.Scope.t -> unit
type 'a ctx =
{ equations_ctx : Equations.Ctx.t
; make : 'a t -> 'a
; super_ : 'a S.ctx
}
include S with type 'a t := 'a t and type 'a ctx := 'a ctx
end
module Abbreviations (S : S) (Id : Identifiable with type 'a t := 'a S.t) : sig
module Abbrev : sig
module Type : sig
type t [@@deriving sexp_of, compare]
val make_var : unit -> t
val make_structure : t S.t -> t
end
type t
val make : Type.t S.t -> Type.t -> t
module Ctx : sig
type abbrev := t
type t
val empty : t
val add : t -> abbrev:abbrev -> t
end
end
type 'a t
val make : 'a S.t -> 'a t
val repr : 'a t -> 'a S.t
type 'a ctx =
{ abbrev_ctx : Abbrev.Ctx.t
; make_structure : 'a S.t -> 'a
; make_var : unit -> 'a
; super_ : 'a S.ctx
}
include S with type 'a t := 'a t and type 'a ctx := 'a ctx
end
module Rows (Label : Comparable.S) (S : S) : sig
type 'a t =
| Structure of 'a S.t
| Row_cons of Label.t * 'a * 'a
| Row_uniform of 'a
type 'a ctx =
{ make_var : unit -> 'a
; make_structure : 'a t -> 'a
; super_ : 'a S.ctx
}
include S with type 'a t := 'a t and type 'a ctx := 'a ctx
end
|
d4fc33815a181cebd03993316ebf0cdba89e65df4e15a835e7eac8ed43c3823d | Nick-Chapman/niz | numbers.ml | open Core
16 bit unsigned : 0 .. 0xFF
type t [@@deriving sexp]
val of_char : char -> t
val to_char : t -> char
val of_int_exn : int -> t
val to_int : t -> int
val zero : t
val is_zero : t -> bool
val bitN : int -> t -> bool
val set_bitN : int -> t -> t
val clear_bitN : int -> t -> t
val to_hexstring : t -> string
val to_bitstring : t -> string
end = struct
type t = int [@@deriving sexp]
let in_range i = (i >= 0 && i <= 255)
let of_char x = int_of_char x
let to_char t = assert(in_range t); Char.of_int_exn t
let of_int_exn i =
if not (in_range i) then failwith "Byte.of_int_exn"
else i
let to_int t = t
let zero = 0
let is_zero t = (t = 0)
let bitN n x = ((x lsr n) land 0x1) = 1
let set_bitN n x = x lor (0x1 lsl n)
let clear_bitN n x = x land (lnot (0x1 lsl n))
let to_hexstring t = sprintf "%2x" t
let to_bitstring x =
String.concat (List.rev (List.map (List.range 0 (7+1)) ~f:(fun n ->
if bitN n x then "1" else "0")))
end
module Zversion : sig
type t = Z1 | Z2 | Z3 | Z4 | Z5 [@@deriving sexp_of]
val of_byte : Byte.t -> t
val to_byte : t -> Byte.t
val to_file_extension : t -> string
end = struct
type t = Z1 | Z2 | Z3 | Z4 | Z5 [@@deriving sexp_of]
let of_byte b =
match Byte.to_int b with
| 1 -> Z1
| 2 -> Z2
| 3 -> Z3
| 4 -> Z4
| 5 -> Z5
| n -> failwithf "unsupported z-machine version: %d" n ()
let to_int = function
| Z1 -> 1
| Z2 -> 2
| Z3 -> 3
| Z4 -> 4
| Z5 -> 5
let to_byte t = Byte.of_int_exn (to_int t)
let to_file_extension t = sprintf "z%d" (to_int t)
end
16 bit unsigned : 0 .. 0xFFFF
type t [@@deriving sexp]
val zero : t
val of_byte : Byte.t -> t
val of_high_low : Byte.t * Byte.t -> t
val of_int_exn : int -> t
val is_zero : t -> bool
val to_int : t -> int
val to_high_low : t -> Byte.t * Byte.t
val to_low_byte : t -> Byte.t
val to_byte_exn : t -> Byte.t
end = struct
type t = int [@@deriving sexp]
let in_range i = (i >= 0 && i <= 0xFFFF)
let zero = 0
let of_byte = Byte.to_int
let of_int_exn i = if not (in_range i) then failwith "Word.of_int_exn" else i
let of_high_low (high,low) =
Byte.to_int high * 256 + Byte.to_int low
let is_zero t = (t=0)
let to_high_byte t = Byte.of_int_exn (t / 256)
let to_low_byte t = Byte.of_int_exn (t % 256)
let to_high_low t = to_high_byte t, to_low_byte t
let to_int t = t
let to_byte_exn = Byte.of_int_exn
end
module Var = struct
type t = Sp | Local of int | Global of int
[@@deriving sexp]
end
module Target : sig
type t [@@deriving sexp]
val create : Byte.t -> t
val var : t -> Var.t
end = struct
type t = Var.t [@@deriving sexp]
let var x = x
let create b = match Byte.to_int b with
| 0 -> Var.Sp
| n ->
if n<16 then Var.Local n
else Var.Global (n-16)
end
module Loc : sig
type t [@@deriving sexp]
include Comparable with type t := t
include Hashable with type t := t
val of_address : Word.t -> t
val of_packed_address : Zversion.t -> Word.t -> t
val of_int : int -> t
val zero : t
val is_zero : t -> bool
val to_word : t -> Word.t
val to_int : t -> int
val (+) : t -> int -> t
val compare : t -> t -> int
val align_packed_address : Zversion.t -> t -> t
val to_packed_address : Zversion.t -> t -> Word.t
end = struct
module T = struct
type t = int [@@deriving sexp,compare]
let hash x = x
end
include T
include Comparable.Make(T)
include Hashable.Make(T)
let create i =
TODO : get version here & keep smaller limit for per Z4
assert ( i>=0 & & i < 0x20000 ) ; ( * 128k
256k
failwithf ! "Loc.create:%d" i ();
i
let zero = 0
let is_zero t = (t=0)
let of_int i = create i
let of_address w =
create (Word.to_int w)
let packed_address_pointer_size zversion =
let open Zversion in
match zversion with
| Z1|Z2|Z3 -> 2
| Z4|Z5 -> 4
let of_packed_address zversion =
let pointer_size = packed_address_pointer_size zversion in
fun w -> create (pointer_size * Word.to_int w)
let to_word t = Word.of_int_exn t
let to_int t = t
let (+) loc offset = create (Int.(+) loc offset)
let compare = Int.compare
let assert_aligned_2 i = assert (i % 2 = 0)
let assert_aligned_4 i = assert (i % 4 = 0)
let align2 i = ((i-1)/2+1)*2
let align4 i = ((i-1)/4+1)*4
let i = assert_aligned_2 i ; i ( * why assume already aligned ?
let align4 i = assert_aligned_2 i; align4 i*)
let align_packed_address zversion =
let open Zversion in
match zversion with
| Z1|Z2|Z3 -> align2
| Z4|Z5 -> align4
let to_packed_address zversion i =
let open Zversion in
match zversion with
| Z1|Z2|Z3 -> (assert_aligned_2 i; Word.of_int_exn (i/2))
| Z4|Z5 -> (assert_aligned_4 i; Word.of_int_exn (i/4))
end
module Obj : sig
type t [@@deriving sexp]
val zero : t
val of_byte : Byte.t -> t
val of_word : Word.t -> t
val of_int_exn : int -> t
val is_zero : t -> bool
val to_int : t -> int
val to_byte_exn : t -> Byte.t
val to_word : t -> Word.t
end = struct
type t = Word.t [@@deriving sexp]
let zero = Word.zero
let of_byte = Word.of_byte
let of_word w = w
let of_int_exn = Word.of_int_exn
let is_zero = Word.is_zero
let to_int = Word.to_int
let to_byte_exn = Word.to_byte_exn
let to_word w = w
end
| null | https://raw.githubusercontent.com/Nick-Chapman/niz/603a437ace7c6babcb08648a98c6ace099e99216/lib/numbers.ml | ocaml | open Core
16 bit unsigned : 0 .. 0xFF
type t [@@deriving sexp]
val of_char : char -> t
val to_char : t -> char
val of_int_exn : int -> t
val to_int : t -> int
val zero : t
val is_zero : t -> bool
val bitN : int -> t -> bool
val set_bitN : int -> t -> t
val clear_bitN : int -> t -> t
val to_hexstring : t -> string
val to_bitstring : t -> string
end = struct
type t = int [@@deriving sexp]
let in_range i = (i >= 0 && i <= 255)
let of_char x = int_of_char x
let to_char t = assert(in_range t); Char.of_int_exn t
let of_int_exn i =
if not (in_range i) then failwith "Byte.of_int_exn"
else i
let to_int t = t
let zero = 0
let is_zero t = (t = 0)
let bitN n x = ((x lsr n) land 0x1) = 1
let set_bitN n x = x lor (0x1 lsl n)
let clear_bitN n x = x land (lnot (0x1 lsl n))
let to_hexstring t = sprintf "%2x" t
let to_bitstring x =
String.concat (List.rev (List.map (List.range 0 (7+1)) ~f:(fun n ->
if bitN n x then "1" else "0")))
end
module Zversion : sig
type t = Z1 | Z2 | Z3 | Z4 | Z5 [@@deriving sexp_of]
val of_byte : Byte.t -> t
val to_byte : t -> Byte.t
val to_file_extension : t -> string
end = struct
type t = Z1 | Z2 | Z3 | Z4 | Z5 [@@deriving sexp_of]
let of_byte b =
match Byte.to_int b with
| 1 -> Z1
| 2 -> Z2
| 3 -> Z3
| 4 -> Z4
| 5 -> Z5
| n -> failwithf "unsupported z-machine version: %d" n ()
let to_int = function
| Z1 -> 1
| Z2 -> 2
| Z3 -> 3
| Z4 -> 4
| Z5 -> 5
let to_byte t = Byte.of_int_exn (to_int t)
let to_file_extension t = sprintf "z%d" (to_int t)
end
16 bit unsigned : 0 .. 0xFFFF
type t [@@deriving sexp]
val zero : t
val of_byte : Byte.t -> t
val of_high_low : Byte.t * Byte.t -> t
val of_int_exn : int -> t
val is_zero : t -> bool
val to_int : t -> int
val to_high_low : t -> Byte.t * Byte.t
val to_low_byte : t -> Byte.t
val to_byte_exn : t -> Byte.t
end = struct
type t = int [@@deriving sexp]
let in_range i = (i >= 0 && i <= 0xFFFF)
let zero = 0
let of_byte = Byte.to_int
let of_int_exn i = if not (in_range i) then failwith "Word.of_int_exn" else i
let of_high_low (high,low) =
Byte.to_int high * 256 + Byte.to_int low
let is_zero t = (t=0)
let to_high_byte t = Byte.of_int_exn (t / 256)
let to_low_byte t = Byte.of_int_exn (t % 256)
let to_high_low t = to_high_byte t, to_low_byte t
let to_int t = t
let to_byte_exn = Byte.of_int_exn
end
module Var = struct
type t = Sp | Local of int | Global of int
[@@deriving sexp]
end
module Target : sig
type t [@@deriving sexp]
val create : Byte.t -> t
val var : t -> Var.t
end = struct
type t = Var.t [@@deriving sexp]
let var x = x
let create b = match Byte.to_int b with
| 0 -> Var.Sp
| n ->
if n<16 then Var.Local n
else Var.Global (n-16)
end
module Loc : sig
type t [@@deriving sexp]
include Comparable with type t := t
include Hashable with type t := t
val of_address : Word.t -> t
val of_packed_address : Zversion.t -> Word.t -> t
val of_int : int -> t
val zero : t
val is_zero : t -> bool
val to_word : t -> Word.t
val to_int : t -> int
val (+) : t -> int -> t
val compare : t -> t -> int
val align_packed_address : Zversion.t -> t -> t
val to_packed_address : Zversion.t -> t -> Word.t
end = struct
module T = struct
type t = int [@@deriving sexp,compare]
let hash x = x
end
include T
include Comparable.Make(T)
include Hashable.Make(T)
let create i =
TODO : get version here & keep smaller limit for per Z4
assert ( i>=0 & & i < 0x20000 ) ; ( * 128k
256k
failwithf ! "Loc.create:%d" i ();
i
let zero = 0
let is_zero t = (t=0)
let of_int i = create i
let of_address w =
create (Word.to_int w)
let packed_address_pointer_size zversion =
let open Zversion in
match zversion with
| Z1|Z2|Z3 -> 2
| Z4|Z5 -> 4
let of_packed_address zversion =
let pointer_size = packed_address_pointer_size zversion in
fun w -> create (pointer_size * Word.to_int w)
let to_word t = Word.of_int_exn t
let to_int t = t
let (+) loc offset = create (Int.(+) loc offset)
let compare = Int.compare
let assert_aligned_2 i = assert (i % 2 = 0)
let assert_aligned_4 i = assert (i % 4 = 0)
let align2 i = ((i-1)/2+1)*2
let align4 i = ((i-1)/4+1)*4
let i = assert_aligned_2 i ; i ( * why assume already aligned ?
let align4 i = assert_aligned_2 i; align4 i*)
let align_packed_address zversion =
let open Zversion in
match zversion with
| Z1|Z2|Z3 -> align2
| Z4|Z5 -> align4
let to_packed_address zversion i =
let open Zversion in
match zversion with
| Z1|Z2|Z3 -> (assert_aligned_2 i; Word.of_int_exn (i/2))
| Z4|Z5 -> (assert_aligned_4 i; Word.of_int_exn (i/4))
end
module Obj : sig
type t [@@deriving sexp]
val zero : t
val of_byte : Byte.t -> t
val of_word : Word.t -> t
val of_int_exn : int -> t
val is_zero : t -> bool
val to_int : t -> int
val to_byte_exn : t -> Byte.t
val to_word : t -> Word.t
end = struct
type t = Word.t [@@deriving sexp]
let zero = Word.zero
let of_byte = Word.of_byte
let of_word w = w
let of_int_exn = Word.of_int_exn
let is_zero = Word.is_zero
let to_int = Word.to_int
let to_byte_exn = Word.to_byte_exn
let to_word w = w
end
| |
12fac4a38bb4a3a829b21f819a00f99456be89271155d3c880a7a01ea5c6c302 | joearms/elib1 | elib1_doc.erl | Copyright ( c ) 2006 - 2009
See MIT - LICENSE for licensing information .
-module(elib1_doc).
%% -compile(export_all).
-export([batch/1, file/1, file/2, setup/1]).
%% takes a file like this <html> ... </html>
strips the header and replaces with a valid xhml header
%% expands <e>...</e>
-import(lists, [map/2, reverse/1, reverse/2]).
batch([X]) ->
File = filename:rootname(atom_to_list(X)),
file(File).
file converts to F.html in the same directory
file(F) ->
io:format("elib1_doc::~s~n",[F]),
file(F ++ ".ehtml", F ++ ".html").
file(InFile, OutFile) ->
case file:read_file(InFile) of
{ok, Bin} ->
Str1 = binary_to_list(Bin),
Str2 = remove_top_level_markup(Str1),
Str3 = elib1_expand:expand_string(Str2),
Str4 = add_xhtml_markup(InFile, Str3),
file:write_file(OutFile, Str4);
_ ->
cannot_read_file
end.
remove_top_level_markup("<html>" ++ T) -> remove_top_level_markup(T, []).
remove_top_level_markup("</html>" ++ _, L) -> reverse(L);
remove_top_level_markup([H|T], L) -> remove_top_level_markup(T, [H|L]).
add_xhtml_markup(File, L) ->
Root = filename:rootname(filename:basename(File)),
[<<"<!DOCTYPE html PUBLIC '-//W3C//DTD XHTML 1.0 Strict//EN'
'-strict.dtd'>
<html xmlns=''>\n">>,
setup(Root),
L,
<<"</body></html>\n">>].
setup(File) ->
["<head>
<title>", File, "</title>
<link href='../include/elib1.css' type='text/css' rel='stylesheet'/>
</head>
<body>
<a href='/cgi?mod=elib1_content_edit&func=edit&file=",
File,"'>edit</a>
"].
| null | https://raw.githubusercontent.com/joearms/elib1/d617d0ec70a058ef102749eadf51c024444c28d9/lib/src/elib1_doc.erl | erlang | -compile(export_all).
takes a file like this <html> ... </html>
expands <e>...</e> | Copyright ( c ) 2006 - 2009
See MIT - LICENSE for licensing information .
-module(elib1_doc).
-export([batch/1, file/1, file/2, setup/1]).
strips the header and replaces with a valid xhml header
-import(lists, [map/2, reverse/1, reverse/2]).
batch([X]) ->
File = filename:rootname(atom_to_list(X)),
file(File).
file converts to F.html in the same directory
file(F) ->
io:format("elib1_doc::~s~n",[F]),
file(F ++ ".ehtml", F ++ ".html").
file(InFile, OutFile) ->
case file:read_file(InFile) of
{ok, Bin} ->
Str1 = binary_to_list(Bin),
Str2 = remove_top_level_markup(Str1),
Str3 = elib1_expand:expand_string(Str2),
Str4 = add_xhtml_markup(InFile, Str3),
file:write_file(OutFile, Str4);
_ ->
cannot_read_file
end.
remove_top_level_markup("<html>" ++ T) -> remove_top_level_markup(T, []).
remove_top_level_markup("</html>" ++ _, L) -> reverse(L);
remove_top_level_markup([H|T], L) -> remove_top_level_markup(T, [H|L]).
add_xhtml_markup(File, L) ->
Root = filename:rootname(filename:basename(File)),
[<<"<!DOCTYPE html PUBLIC '-//W3C//DTD XHTML 1.0 Strict//EN'
'-strict.dtd'>
<html xmlns=''>\n">>,
setup(Root),
L,
<<"</body></html>\n">>].
setup(File) ->
["<head>
<title>", File, "</title>
<link href='../include/elib1.css' type='text/css' rel='stylesheet'/>
</head>
<body>
<a href='/cgi?mod=elib1_content_edit&func=edit&file=",
File,"'>edit</a>
"].
|
5f32d8c7cd84b54ddd9c602c09b8a013b92f3f41c3e2178a746ba5106f36fff7 | erlangonrails/devdb | authmod_gssapi.erl | %%%-------------------------------------------------------------------
%%% File : authmod_gssapi.erl
Author : < >
Description : Negotiate authentication module supporting GSSAPI
%%% and SPNEGO
%%%
Created : 17 May 2007 by < >
%%%-------------------------------------------------------------------
%%%
Copyright ( c ) 2007
%%% All rights reserved.
%%%
%%% Redistribution and use in source and binary forms, with or without
%%% modification, are permitted provided that the following conditions
%%% are met:
%%%
1 . Redistributions of source code must retain the above copyright
%%% notice, this list of conditions and the following disclaimer.
%%%
2 . Redistributions in binary form must reproduce the above copyright
%%% notice, this list of conditions and the following disclaimer in the
%%% documentation and/or other materials provided with the distribution.
%%%
3 . Neither the name of the copyright owner nor the names of its
%%% contributors may be used to endorse or promote products derived from
%%% this software without specific prior written permission.
%%%
THIS SOFTWARE IS PROVIDED BY THE INSTITUTE AND CONTRIBUTORS ` ` AS IS '' AND
%%% ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED . IN NO EVENT SHALL THE INSTITUTE OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT , INDIRECT , INCIDENTAL , SPECIAL , EXEMPLARY , OR CONSEQUENTIAL
%%% DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
%%% OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY , WHETHER IN CONTRACT , STRICT
%%% LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
%%% OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
%%% SUCH DAMAGE.
this code adds support for SPNEGO and GSSAPI negotiation to yaws .
It 's compatible with both Linux / Unix and Windows .
Supporting both Kerberos for windows ( kfw ) and SSPI on Windows .
It 's implemented as an authmod called authmod_gssapi .
Adding it to start_mod in < server > and authmod in an < auth > tag
activates the module . It expects a Kerberos keytab in < opaque > .
The keytab should contain ) for " HTTP/<fqdn>@<REALM > " ,
%% where <fqdn> is the fully qualified domain name of the host and <REALM>
the realm .
%% For example:
%% <server fqdn>
port = 80
%% listen = 0.0.0.0
%% docroot = /usr/share/yaws
start_mod = authmod_gssapi
%% <auth> authmod = authmod_gssapi
%% dir = /
%% </auth>
%% <opaque>
keytab = /etc / yaws / http.keytab
%% </opaque>
%% </server>
The authmod_gssapi module depends on egssapi
from : /
-module(authmod_gssapi).
-export([
start/1,
stop/0,
auth/2,
get_header/0,
out/1
]).
-include("yaws.hrl").
-include("yaws_api.hrl").
-define(SERVER, ?MODULE).
-define(SUPERVISOR, yaws_sup).
%%-define(ENABLE_DEBUG, yes).
-ifdef(ENABLE_DEBUG).
-define(INFO, io:format).
-define(DEBUG, io:format).
-else.
-define(INFO, ignore).
-define(DEBUG, ignore).
-endif.
-define(WARNING, io:format).
-define(ERROR, io:format).
start(Sconf) when is_record(Sconf, sconf) ->
Opaque = Sconf#sconf.opaque,
start_opaque(Opaque);
start(Keytab) when is_list(Keytab) ->
ChildSpec =
{?SERVER,
{egssapi, start_link, [{local, ?SERVER}, Keytab]},
permanent,
1000,
worker,
[egssapi, spnego]},
supervisor:start_child(?SUPERVISOR, ChildSpec).
stop() ->
egssapi:stop(?SERVER),
supervisor:terminate_child(?SUPERVISOR, ?SERVER),
supervisor:delete_child(?SUPERVISOR, ?SERVER).
out(Arg) ->
yaws_outmod:out(Arg).
auth(Arg, Auth) when is_record(Arg, arg),
is_record(Auth, auth) ->
H = Arg#arg.headers,
?INFO("~p~n", [?MODULE]),
case H#headers.authorization of
{_, _, "Negotiate " ++ Data} ->
?INFO("Negotiate~n", []),
Bin = base64:decode(Data),
case catch spnego:accept_sec_context(?SERVER, Bin) of
{'EXIT', Reason} ->
?ERROR("spnego failed EXIT:~p~n", [Reason]),
throw(Reason);
{error, Reason} ->
?ERROR("spnego failed error:~p~n", [Reason]),
throw(Reason);
{ok, {Context, User, Ccname, Resp}} ->
?DEBUG("spnego user ok ~p~n", [User]),
spnego:delete_sec_context(Context),
{true, {User, Ccname, base64:encode(Resp)}};
E ->
?ERROR("spnego error ~p~n", [E]),
throw(error)
end;
_ ->
?INFO("Request auth~n"),
{appmod, ?MODULE}
end.
%% The header that is set when authentication fails
get_header() ->
yaws:make_www_authenticate_header("Negotiate").
start_opaque(Opaque) when is_list(Opaque) ->
if
is_list(Opaque) ->
Keytab = get_option("keytab", Opaque),
start(Keytab);
true ->
throw(keytab_not_found)
end.
get_option(Name, Options) when is_list(Options) ->
case lists:keysearch(Name, 1, Options) of
{value, {Name, Value}} ->
Value;
false ->
throw(not_found)
end.
-ifndef(ENABLE_DEBUG).
ignore(_) -> ok.
ignore(_,_) -> ok.
-endif.
| null | https://raw.githubusercontent.com/erlangonrails/devdb/0e7eaa6bd810ec3892bfc3d933439560620d0941/dev/scalaris/contrib/yaws/src/authmod_gssapi.erl | erlang | -------------------------------------------------------------------
File : authmod_gssapi.erl
and SPNEGO
-------------------------------------------------------------------
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:
notice, this list of conditions and the following disclaimer.
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
contributors may be used to endorse or promote products derived from
this software without specific prior written permission.
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
SUCH DAMAGE.
where <fqdn> is the fully qualified domain name of the host and <REALM>
For example:
<server fqdn>
listen = 0.0.0.0
docroot = /usr/share/yaws
<auth> authmod = authmod_gssapi
dir = /
</auth>
<opaque>
</opaque>
</server>
-define(ENABLE_DEBUG, yes).
The header that is set when authentication fails | Author : < >
Description : Negotiate authentication module supporting GSSAPI
Created : 17 May 2007 by < >
Copyright ( c ) 2007
1 . Redistributions of source code must retain the above copyright
2 . Redistributions in binary form must reproduce the above copyright
3 . Neither the name of the copyright owner nor the names of its
THIS SOFTWARE IS PROVIDED BY THE INSTITUTE AND CONTRIBUTORS ` ` AS IS '' AND
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED . IN NO EVENT SHALL THE INSTITUTE OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT , INDIRECT , INCIDENTAL , SPECIAL , EXEMPLARY , OR CONSEQUENTIAL
HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY , WHETHER IN CONTRACT , STRICT
this code adds support for SPNEGO and GSSAPI negotiation to yaws .
It 's compatible with both Linux / Unix and Windows .
Supporting both Kerberos for windows ( kfw ) and SSPI on Windows .
It 's implemented as an authmod called authmod_gssapi .
Adding it to start_mod in < server > and authmod in an < auth > tag
activates the module . It expects a Kerberos keytab in < opaque > .
The keytab should contain ) for " HTTP/<fqdn>@<REALM > " ,
the realm .
port = 80
start_mod = authmod_gssapi
keytab = /etc / yaws / http.keytab
The authmod_gssapi module depends on egssapi
from : /
-module(authmod_gssapi).
-export([
start/1,
stop/0,
auth/2,
get_header/0,
out/1
]).
-include("yaws.hrl").
-include("yaws_api.hrl").
-define(SERVER, ?MODULE).
-define(SUPERVISOR, yaws_sup).
-ifdef(ENABLE_DEBUG).
-define(INFO, io:format).
-define(DEBUG, io:format).
-else.
-define(INFO, ignore).
-define(DEBUG, ignore).
-endif.
-define(WARNING, io:format).
-define(ERROR, io:format).
start(Sconf) when is_record(Sconf, sconf) ->
Opaque = Sconf#sconf.opaque,
start_opaque(Opaque);
start(Keytab) when is_list(Keytab) ->
ChildSpec =
{?SERVER,
{egssapi, start_link, [{local, ?SERVER}, Keytab]},
permanent,
1000,
worker,
[egssapi, spnego]},
supervisor:start_child(?SUPERVISOR, ChildSpec).
stop() ->
egssapi:stop(?SERVER),
supervisor:terminate_child(?SUPERVISOR, ?SERVER),
supervisor:delete_child(?SUPERVISOR, ?SERVER).
out(Arg) ->
yaws_outmod:out(Arg).
auth(Arg, Auth) when is_record(Arg, arg),
is_record(Auth, auth) ->
H = Arg#arg.headers,
?INFO("~p~n", [?MODULE]),
case H#headers.authorization of
{_, _, "Negotiate " ++ Data} ->
?INFO("Negotiate~n", []),
Bin = base64:decode(Data),
case catch spnego:accept_sec_context(?SERVER, Bin) of
{'EXIT', Reason} ->
?ERROR("spnego failed EXIT:~p~n", [Reason]),
throw(Reason);
{error, Reason} ->
?ERROR("spnego failed error:~p~n", [Reason]),
throw(Reason);
{ok, {Context, User, Ccname, Resp}} ->
?DEBUG("spnego user ok ~p~n", [User]),
spnego:delete_sec_context(Context),
{true, {User, Ccname, base64:encode(Resp)}};
E ->
?ERROR("spnego error ~p~n", [E]),
throw(error)
end;
_ ->
?INFO("Request auth~n"),
{appmod, ?MODULE}
end.
get_header() ->
yaws:make_www_authenticate_header("Negotiate").
start_opaque(Opaque) when is_list(Opaque) ->
if
is_list(Opaque) ->
Keytab = get_option("keytab", Opaque),
start(Keytab);
true ->
throw(keytab_not_found)
end.
get_option(Name, Options) when is_list(Options) ->
case lists:keysearch(Name, 1, Options) of
{value, {Name, Value}} ->
Value;
false ->
throw(not_found)
end.
-ifndef(ENABLE_DEBUG).
ignore(_) -> ok.
ignore(_,_) -> ok.
-endif.
|
b741f2685600019cab19884e9fafc5daf88d8d85bdba2efa837406521ee11486 | kazu-yamamoto/http2 | Worker.hs | {-# LANGUAGE NamedFieldPuns #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE PatternGuards #-}
{-# LANGUAGE RecordWildCards #-}
module Network.HTTP2.Server.Worker (
worker
, WorkerConf(..)
, fromContext
) where
import Control.Exception (AsyncException(..))
import Data.IORef
import qualified Network.HTTP.Types as H
import qualified System.TimeManager as T
import UnliftIO.Exception (SomeException(..))
import qualified UnliftIO.Exception as E
import UnliftIO.STM
import Imports hiding (insert)
import Network.HPACK
import Network.HPACK.Token
import Network.HTTP2.Arch
import Network.HTTP2.Frame
import Network.HTTP2.Server.Types
----------------------------------------------------------------
data WorkerConf a = WorkerConf {
readInputQ :: IO (Input a)
, writeOutputQ :: Output a -> IO ()
, workerCleanup :: a -> IO ()
, isPushable :: IO Bool
, insertStream :: StreamId -> a -> IO ()
, makePushStream :: a -> PushPromise -> IO (StreamId, StreamId, a)
}
fromContext :: Context -> WorkerConf Stream
fromContext ctx@Context{..} = WorkerConf {
readInputQ = atomically $ readTQueue $ inputQ $ toServerInfo roleInfo
, writeOutputQ = enqueueOutput outputQ
, workerCleanup = \strm -> do
closed ctx strm Killed
let frame = resetFrame InternalError (streamNumber strm)
enqueueControl controlQ $ CFrame frame
, isPushable = enablePush <$> readIORef http2settings
, insertStream = insert streamTable
, makePushStream = \pstrm _ -> do
ws <- initialWindowSize <$> readIORef http2settings
sid <- getMyNewStreamId ctx
newstrm <- newPushStream sid ws
let pid = streamNumber pstrm
return (pid, sid, newstrm)
}
----------------------------------------------------------------
pushStream :: WorkerConf a
-> a -- parent stream
-> ValueTable -- request
-> [PushPromise]
-> IO OutputType
pushStream _ _ _ [] = return OObj
pushStream WorkerConf{..} pstrm reqvt pps0
| len == 0 = return OObj
| otherwise = do
pushable <- isPushable
if pushable then do
tvar <- newTVarIO 0
lim <- push tvar pps0 0
if lim == 0 then
return OObj
else
return $ OWait (waiter lim tvar)
else
return OObj
where
len = length pps0
increment tvar = atomically $ modifyTVar' tvar (+1)
waiter lim tvar = atomically $ do
n <- readTVar tvar
checkSTM (n >= lim)
push _ [] n = return (n :: Int)
push tvar (pp:pps) n = do
(pid, sid, newstrm) <- makePushStream pstrm pp
insertStream sid newstrm
let scheme = fromJust $ getHeaderValue tokenScheme reqvt
-- fixme: this value can be Nothing
auth = fromJust (getHeaderValue tokenAuthority reqvt
<|> getHeaderValue tokenHost reqvt)
path = promiseRequestPath pp
promiseRequest = [(tokenMethod, H.methodGet)
,(tokenScheme, scheme)
,(tokenAuthority, auth)
,(tokenPath, path)]
ot = OPush promiseRequest pid
Response rsp = promiseResponse pp
out = Output newstrm rsp ot Nothing $ increment tvar
writeOutputQ out
push tvar pps (n + 1)
-- | This function is passed to workers.
-- They also pass 'Response's from a server to this function.
-- This function enqueues commands for the HTTP/2 sender.
response :: WorkerConf a -> Manager -> T.Handle -> ThreadContinue -> a -> Request -> Response -> [PushPromise] -> IO ()
response wc@WorkerConf{..} mgr th tconf strm (Request req) (Response rsp) pps = case outObjBody rsp of
OutBodyNone -> do
setThreadContinue tconf True
writeOutputQ $ Output strm rsp OObj Nothing (return ())
OutBodyBuilder _ -> do
otyp <- pushStream wc strm reqvt pps
setThreadContinue tconf True
writeOutputQ $ Output strm rsp otyp Nothing (return ())
OutBodyFile _ -> do
otyp <- pushStream wc strm reqvt pps
setThreadContinue tconf True
writeOutputQ $ Output strm rsp otyp Nothing (return ())
OutBodyStreaming strmbdy -> do
otyp <- pushStream wc strm reqvt pps
-- We must not exit this server application.
-- If the application exits, streaming would also be closed.
-- So, this work occupies this thread.
--
-- We need to increase the number of workers.
spawnAction mgr
-- After this work, this thread stops to decrease
-- the number of workers.
setThreadContinue tconf False
-- Since the streaming body is a loop, we cannot control it.
-- So, let's serialize 'Builder' with a designated queue.
tbq <- newTBQueueIO 10 -- fixme: hard coding: 10
writeOutputQ $ Output strm rsp otyp (Just tbq) (return ())
let push b = do
T.pause th
atomically $ writeTBQueue tbq (StreamingBuilder b)
T.resume th
flush = atomically $ writeTBQueue tbq StreamingFlush
strmbdy push flush
atomically $ writeTBQueue tbq StreamingFinished
deleteMyId mgr
where
(_,reqvt) = inpObjHeaders req
-- | Worker for server applications.
worker :: WorkerConf a -> Manager -> Server -> Action
worker wc@WorkerConf{..} mgr server = do
sinfo <- newStreamInfo
tcont <- newThreadContinue
timeoutKillThread mgr $ go sinfo tcont
where
go sinfo tcont th = do
setThreadContinue tcont True
ex <- E.trySyncOrAsync $ do
T.pause th
Input strm req <- readInputQ
let req' = pauseRequestBody req th
setStreamInfo sinfo strm
T.resume th
T.tickle th
let aux = Aux th
server (Request req') aux $ response wc mgr th tcont strm (Request req')
cont1 <- case ex of
Right () -> return True
Left e@(SomeException _)
-- killed by the local worker manager
| Just ThreadKilled <- E.fromException e -> return False
-- killed by the local timeout manager
| Just T.TimeoutThread <- E.fromException e -> do
cleanup sinfo
return True
| otherwise -> do
cleanup sinfo
return True
cont2 <- getThreadContinue tcont
clearStreamInfo sinfo
when (cont1 && cont2) $ go sinfo tcont th
pauseRequestBody req th = req { inpObjBody = readBody' }
where
readBody = inpObjBody req
readBody' = do
T.pause th
bs <- readBody
T.resume th
return bs
cleanup sinfo = do
minp <- getStreamInfo sinfo
case minp of
Nothing -> return ()
Just strm -> workerCleanup strm
----------------------------------------------------------------
-- A reference is shared by a responder and its worker.
-- The reference refers to a value of this type as a return value.
-- If 'True', the worker continues to serve requests.
-- Otherwise, the worker finishes.
newtype ThreadContinue = ThreadContinue (IORef Bool)
{-# INLINE newThreadContinue #-}
newThreadContinue :: IO ThreadContinue
newThreadContinue = ThreadContinue <$> newIORef True
{-# INLINE setThreadContinue #-}
setThreadContinue :: ThreadContinue -> Bool -> IO ()
setThreadContinue (ThreadContinue ref) x = writeIORef ref x
{-# INLINE getThreadContinue #-}
getThreadContinue :: ThreadContinue -> IO Bool
getThreadContinue (ThreadContinue ref) = readIORef ref
----------------------------------------------------------------
-- | The type for cleaning up.
newtype StreamInfo a = StreamInfo (IORef (Maybe a))
{-# INLINE newStreamInfo #-}
newStreamInfo :: IO (StreamInfo a)
newStreamInfo = StreamInfo <$> newIORef Nothing
{-# INLINE clearStreamInfo #-}
clearStreamInfo :: StreamInfo a -> IO ()
clearStreamInfo (StreamInfo ref) = writeIORef ref Nothing
{-# INLINE setStreamInfo #-}
setStreamInfo :: StreamInfo a -> a -> IO ()
setStreamInfo (StreamInfo ref) inp = writeIORef ref $ Just inp
{-# INLINE getStreamInfo #-}
getStreamInfo :: StreamInfo a -> IO (Maybe a)
getStreamInfo (StreamInfo ref) = readIORef ref
| null | https://raw.githubusercontent.com/kazu-yamamoto/http2/88b3c192c251c15fa4dc426aa0372841dc3fee49/Network/HTTP2/Server/Worker.hs | haskell | # LANGUAGE OverloadedStrings #
--------------------------------------------------------------
--------------------------------------------------------------
parent stream
request
fixme: this value can be Nothing
| This function is passed to workers.
They also pass 'Response's from a server to this function.
This function enqueues commands for the HTTP/2 sender.
We must not exit this server application.
If the application exits, streaming would also be closed.
So, this work occupies this thread.
We need to increase the number of workers.
After this work, this thread stops to decrease
the number of workers.
Since the streaming body is a loop, we cannot control it.
So, let's serialize 'Builder' with a designated queue.
| Worker for server applications.
killed by the local worker manager
killed by the local timeout manager
--------------------------------------------------------------
A reference is shared by a responder and its worker.
The reference refers to a value of this type as a return value.
If 'True', the worker continues to serve requests.
Otherwise, the worker finishes.
--------------------------------------------------------------
| The type for cleaning up. | # LANGUAGE NamedFieldPuns #
# LANGUAGE PatternGuards #
# LANGUAGE RecordWildCards #
module Network.HTTP2.Server.Worker (
worker
, WorkerConf(..)
, fromContext
) where
import Control.Exception (AsyncException(..))
import Data.IORef
import qualified Network.HTTP.Types as H
import qualified System.TimeManager as T
import UnliftIO.Exception (SomeException(..))
import qualified UnliftIO.Exception as E
import UnliftIO.STM
import Imports hiding (insert)
import Network.HPACK
import Network.HPACK.Token
import Network.HTTP2.Arch
import Network.HTTP2.Frame
import Network.HTTP2.Server.Types
data WorkerConf a = WorkerConf {
readInputQ :: IO (Input a)
, writeOutputQ :: Output a -> IO ()
, workerCleanup :: a -> IO ()
, isPushable :: IO Bool
, insertStream :: StreamId -> a -> IO ()
, makePushStream :: a -> PushPromise -> IO (StreamId, StreamId, a)
}
fromContext :: Context -> WorkerConf Stream
fromContext ctx@Context{..} = WorkerConf {
readInputQ = atomically $ readTQueue $ inputQ $ toServerInfo roleInfo
, writeOutputQ = enqueueOutput outputQ
, workerCleanup = \strm -> do
closed ctx strm Killed
let frame = resetFrame InternalError (streamNumber strm)
enqueueControl controlQ $ CFrame frame
, isPushable = enablePush <$> readIORef http2settings
, insertStream = insert streamTable
, makePushStream = \pstrm _ -> do
ws <- initialWindowSize <$> readIORef http2settings
sid <- getMyNewStreamId ctx
newstrm <- newPushStream sid ws
let pid = streamNumber pstrm
return (pid, sid, newstrm)
}
pushStream :: WorkerConf a
-> a
-> ValueTable
-> [PushPromise]
-> IO OutputType
pushStream _ _ _ [] = return OObj
pushStream WorkerConf{..} pstrm reqvt pps0
| len == 0 = return OObj
| otherwise = do
pushable <- isPushable
if pushable then do
tvar <- newTVarIO 0
lim <- push tvar pps0 0
if lim == 0 then
return OObj
else
return $ OWait (waiter lim tvar)
else
return OObj
where
len = length pps0
increment tvar = atomically $ modifyTVar' tvar (+1)
waiter lim tvar = atomically $ do
n <- readTVar tvar
checkSTM (n >= lim)
push _ [] n = return (n :: Int)
push tvar (pp:pps) n = do
(pid, sid, newstrm) <- makePushStream pstrm pp
insertStream sid newstrm
let scheme = fromJust $ getHeaderValue tokenScheme reqvt
auth = fromJust (getHeaderValue tokenAuthority reqvt
<|> getHeaderValue tokenHost reqvt)
path = promiseRequestPath pp
promiseRequest = [(tokenMethod, H.methodGet)
,(tokenScheme, scheme)
,(tokenAuthority, auth)
,(tokenPath, path)]
ot = OPush promiseRequest pid
Response rsp = promiseResponse pp
out = Output newstrm rsp ot Nothing $ increment tvar
writeOutputQ out
push tvar pps (n + 1)
response :: WorkerConf a -> Manager -> T.Handle -> ThreadContinue -> a -> Request -> Response -> [PushPromise] -> IO ()
response wc@WorkerConf{..} mgr th tconf strm (Request req) (Response rsp) pps = case outObjBody rsp of
OutBodyNone -> do
setThreadContinue tconf True
writeOutputQ $ Output strm rsp OObj Nothing (return ())
OutBodyBuilder _ -> do
otyp <- pushStream wc strm reqvt pps
setThreadContinue tconf True
writeOutputQ $ Output strm rsp otyp Nothing (return ())
OutBodyFile _ -> do
otyp <- pushStream wc strm reqvt pps
setThreadContinue tconf True
writeOutputQ $ Output strm rsp otyp Nothing (return ())
OutBodyStreaming strmbdy -> do
otyp <- pushStream wc strm reqvt pps
spawnAction mgr
setThreadContinue tconf False
tbq <- newTBQueueIO 10
writeOutputQ $ Output strm rsp otyp (Just tbq) (return ())
let push b = do
T.pause th
atomically $ writeTBQueue tbq (StreamingBuilder b)
T.resume th
flush = atomically $ writeTBQueue tbq StreamingFlush
strmbdy push flush
atomically $ writeTBQueue tbq StreamingFinished
deleteMyId mgr
where
(_,reqvt) = inpObjHeaders req
worker :: WorkerConf a -> Manager -> Server -> Action
worker wc@WorkerConf{..} mgr server = do
sinfo <- newStreamInfo
tcont <- newThreadContinue
timeoutKillThread mgr $ go sinfo tcont
where
go sinfo tcont th = do
setThreadContinue tcont True
ex <- E.trySyncOrAsync $ do
T.pause th
Input strm req <- readInputQ
let req' = pauseRequestBody req th
setStreamInfo sinfo strm
T.resume th
T.tickle th
let aux = Aux th
server (Request req') aux $ response wc mgr th tcont strm (Request req')
cont1 <- case ex of
Right () -> return True
Left e@(SomeException _)
| Just ThreadKilled <- E.fromException e -> return False
| Just T.TimeoutThread <- E.fromException e -> do
cleanup sinfo
return True
| otherwise -> do
cleanup sinfo
return True
cont2 <- getThreadContinue tcont
clearStreamInfo sinfo
when (cont1 && cont2) $ go sinfo tcont th
pauseRequestBody req th = req { inpObjBody = readBody' }
where
readBody = inpObjBody req
readBody' = do
T.pause th
bs <- readBody
T.resume th
return bs
cleanup sinfo = do
minp <- getStreamInfo sinfo
case minp of
Nothing -> return ()
Just strm -> workerCleanup strm
newtype ThreadContinue = ThreadContinue (IORef Bool)
{-# INLINE newThreadContinue #-}
newThreadContinue :: IO ThreadContinue
newThreadContinue = ThreadContinue <$> newIORef True
{-# INLINE setThreadContinue #-}
setThreadContinue :: ThreadContinue -> Bool -> IO ()
setThreadContinue (ThreadContinue ref) x = writeIORef ref x
{-# INLINE getThreadContinue #-}
getThreadContinue :: ThreadContinue -> IO Bool
getThreadContinue (ThreadContinue ref) = readIORef ref
newtype StreamInfo a = StreamInfo (IORef (Maybe a))
{-# INLINE newStreamInfo #-}
newStreamInfo :: IO (StreamInfo a)
newStreamInfo = StreamInfo <$> newIORef Nothing
{-# INLINE clearStreamInfo #-}
clearStreamInfo :: StreamInfo a -> IO ()
clearStreamInfo (StreamInfo ref) = writeIORef ref Nothing
{-# INLINE setStreamInfo #-}
setStreamInfo :: StreamInfo a -> a -> IO ()
setStreamInfo (StreamInfo ref) inp = writeIORef ref $ Just inp
{-# INLINE getStreamInfo #-}
getStreamInfo :: StreamInfo a -> IO (Maybe a)
getStreamInfo (StreamInfo ref) = readIORef ref
|
6dc5cd46123652afdea98fdfb819b6f1e41edb80a72479c6364ad97410af9b68 | haskell-github/github | Webhooks.hs | -----------------------------------------------------------------------------
-- |
-- License : BSD-3-Clause
-- Maintainer : < >
--
module GitHub.Data.Webhooks where
import GitHub.Data.Id (Id)
import GitHub.Data.URL (URL)
import GitHub.Internal.Prelude
import Prelude ()
import qualified Data.Map as M
import qualified Data.Text as T
data RepoWebhook = RepoWebhook
{ repoWebhookUrl :: !URL
, repoWebhookTestUrl :: !URL
, repoWebhookId :: !(Id RepoWebhook)
, repoWebhookName :: !Text
, repoWebhookActive :: !Bool
, repoWebhookEvents :: !(Vector RepoWebhookEvent)
, repoWebhookConfig :: !(M.Map Text Text)
, repoWebhookLastResponse :: !RepoWebhookResponse
, repoWebhookUpdatedAt :: !UTCTime
, repoWebhookCreatedAt :: !UTCTime
}
deriving (Show, Data, Typeable, Eq, Ord, Generic)
instance NFData RepoWebhook where rnf = genericRnf
instance Binary RepoWebhook
-- | See </#events>.
data RepoWebhookEvent
= WebhookWildcardEvent
| WebhookCheckRunEvent
| WebhookCheckSuiteEvent
| WebhookCodeScanningAlert
| WebhookCommitCommentEvent
| WebhookContentReferenceEvent
| WebhookCreateEvent
| WebhookDeleteEvent
| WebhookDeployKeyEvent
| WebhookDeploymentEvent
| WebhookDeploymentStatusEvent
| WebhookDiscussion
| WebhookDiscussionComment
| WebhookDownloadEvent
| WebhookFollowEvent
| WebhookForkEvent
| WebhookGistEvent
| WebhookGitHubAppAuthorizationEvent
| WebhookGollumEvent
| WebhookInstallationEvent
| WebhookInstallationRepositoriesEvent
| WebhookIssueCommentEvent
| WebhookIssuesEvent
| WebhookLabelEvent
| WebhookMarketplacePurchaseEvent
| WebhookMemberEvent
| WebhookMembershipEvent
| WebhookMetaEvent
| WebhookMilestoneEvent
| WebhookOrgBlockEvent
| WebhookOrganizationEvent
| WebhookPackage
| WebhookPageBuildEvent
| WebhookPingEvent
| WebhookProjectCardEvent
| WebhookProjectColumnEvent
| WebhookProjectEvent
| WebhookPublicEvent
| WebhookPullRequestEvent
| WebhookPullRequestReviewCommentEvent
| WebhookPullRequestReviewEvent
| WebhookPushEvent
| WebhookRegistryPackageEvent
| WebhookReleaseEvent
| WebhookRepositoryDispatch
| WebhookRepositoryEvent
| WebhookRepositoryImportEvent
| WebhookRepositoryVulnerabilityAlertEvent
| WebhookSecretScanningAlert
| WebhookSecurityAdvisoryEvent
| WebhookSponsorship
| WebhookStarEvent
| WebhookStatusEvent
| WebhookTeamAddEvent
| WebhookTeamEvent
| WebhookWatchEvent
| WebhookWorkflowDispatch
| WebhookWorkflowRun
deriving (Show, Data, Typeable, Eq, Ord, Generic)
instance NFData RepoWebhookEvent where rnf = genericRnf
instance Binary RepoWebhookEvent
data RepoWebhookResponse = RepoWebhookResponse
{ repoWebhookResponseCode :: !(Maybe Int)
, repoWebhookResponseStatus :: !(Maybe Text)
, repoWebhookResponseMessage :: !(Maybe Text)
}
deriving (Show, Data, Typeable, Eq, Ord, Generic)
instance NFData RepoWebhookResponse where rnf = genericRnf
instance Binary RepoWebhookResponse
data PingEvent = PingEvent
{ pingEventZen :: !Text
, pingEventHook :: !RepoWebhook
, pingEventHookId :: !(Id RepoWebhook)
}
deriving (Show, Data, Typeable, Eq, Ord, Generic)
instance NFData PingEvent where rnf = genericRnf
instance Binary PingEvent
data NewRepoWebhook = NewRepoWebhook
{ newRepoWebhookName :: !Text
, newRepoWebhookConfig :: !(M.Map Text Text)
, newRepoWebhookEvents :: !(Maybe (Vector RepoWebhookEvent))
, newRepoWebhookActive :: !(Maybe Bool)
}
deriving (Eq, Ord, Show, Typeable, Data, Generic)
instance NFData NewRepoWebhook where rnf = genericRnf
instance Binary NewRepoWebhook
data EditRepoWebhook = EditRepoWebhook
{ editRepoWebhookConfig :: !(Maybe (M.Map Text Text))
, editRepoWebhookEvents :: !(Maybe (Vector RepoWebhookEvent))
, editRepoWebhookAddEvents :: !(Maybe (Vector RepoWebhookEvent))
, editRepoWebhookRemoveEvents :: !(Maybe (Vector RepoWebhookEvent))
, editRepoWebhookActive :: !(Maybe Bool)
}
deriving (Eq, Ord, Show, Typeable, Data, Generic)
instance NFData EditRepoWebhook where rnf = genericRnf
instance Binary EditRepoWebhook
-- JSON instances
instance FromJSON RepoWebhookEvent where
parseJSON = withText "RepoWebhookEvent" $ \t -> case T.toLower t of
"*" -> pure WebhookWildcardEvent
"check_run" -> pure WebhookCheckRunEvent
"check_suite" -> pure WebhookCheckSuiteEvent
"code_scanning_alert" -> pure WebhookCodeScanningAlert
"commit_comment" -> pure WebhookCommitCommentEvent
"content_reference" -> pure WebhookContentReferenceEvent
"create" -> pure WebhookCreateEvent
"delete" -> pure WebhookDeleteEvent
"deploy_key" -> pure WebhookDeployKeyEvent
"deployment" -> pure WebhookDeploymentEvent
"deployment_status" -> pure WebhookDeploymentStatusEvent
"discussion" -> pure WebhookDiscussion
"discussion_comment" -> pure WebhookDiscussionComment
"download" -> pure WebhookDownloadEvent
"follow" -> pure WebhookFollowEvent
"fork" -> pure WebhookForkEvent
"gist" -> pure WebhookGistEvent
"github_app_authorization" -> pure WebhookGitHubAppAuthorizationEvent
"gollum" -> pure WebhookGollumEvent
"installation" -> pure WebhookInstallationEvent
"installation_repositories" -> pure WebhookInstallationRepositoriesEvent
"issue_comment" -> pure WebhookIssueCommentEvent
"issues" -> pure WebhookIssuesEvent
"label" -> pure WebhookLabelEvent
"marketplace_purchase" -> pure WebhookMarketplacePurchaseEvent
"member" -> pure WebhookMemberEvent
"membership" -> pure WebhookMembershipEvent
"meta" -> pure WebhookMetaEvent
"milestone" -> pure WebhookMilestoneEvent
"org_block" -> pure WebhookOrgBlockEvent
"organization" -> pure WebhookOrganizationEvent
"package" -> pure WebhookPackage
"page_build" -> pure WebhookPageBuildEvent
"ping" -> pure WebhookPingEvent
"project" -> pure WebhookProjectEvent
"project_card" -> pure WebhookProjectCardEvent
"project_column" -> pure WebhookProjectColumnEvent
"public" -> pure WebhookPublicEvent
"pull_request" -> pure WebhookPullRequestEvent
"pull_request_review" -> pure WebhookPullRequestReviewEvent
"pull_request_review_comment" -> pure WebhookPullRequestReviewCommentEvent
"push" -> pure WebhookPushEvent
"registry_package" -> pure WebhookRegistryPackageEvent
"release" -> pure WebhookReleaseEvent
"repository" -> pure WebhookRepositoryEvent
"repository_dispatch" -> pure WebhookRepositoryDispatch
"repository_import" -> pure WebhookRepositoryImportEvent
"repository_vulnerability_alert" -> pure WebhookRepositoryVulnerabilityAlertEvent
"secret_scanning_alert" -> pure WebhookSecretScanningAlert
"security_advisory" -> pure WebhookSecurityAdvisoryEvent
"sponsorship" -> pure WebhookSponsorship
"star" -> pure WebhookStarEvent
"status" -> pure WebhookStatusEvent
"team" -> pure WebhookTeamEvent
"team_add" -> pure WebhookTeamAddEvent
"watch" -> pure WebhookWatchEvent
"workflow_dispatch" -> pure WebhookWorkflowDispatch
"workflow_run" -> pure WebhookWorkflowRun
_ -> fail $ "Unknown RepoWebhookEvent: " <> T.unpack t
instance ToJSON RepoWebhookEvent where
toJSON WebhookWildcardEvent = String "*"
toJSON WebhookCheckRunEvent = String "check_run"
toJSON WebhookCheckSuiteEvent = String "check_suite"
toJSON WebhookCodeScanningAlert = String "code_scanning_alert"
toJSON WebhookCommitCommentEvent = String "commit_comment"
toJSON WebhookContentReferenceEvent = String "content_reference"
toJSON WebhookCreateEvent = String "create"
toJSON WebhookDeleteEvent = String "delete"
toJSON WebhookDeployKeyEvent = String "deploy_key"
toJSON WebhookDeploymentEvent = String "deployment"
toJSON WebhookDeploymentStatusEvent = String "deployment_status"
toJSON WebhookDiscussion = String "discussion"
toJSON WebhookDiscussionComment = String "discussion_comment"
toJSON WebhookDownloadEvent = String "download"
toJSON WebhookFollowEvent = String "follow"
toJSON WebhookForkEvent = String "fork"
toJSON WebhookGistEvent = String "gist"
toJSON WebhookGitHubAppAuthorizationEvent = String "github_app_authorization"
toJSON WebhookGollumEvent = String "gollum"
toJSON WebhookInstallationEvent = String "installation"
toJSON WebhookInstallationRepositoriesEvent = String "installation_repositories"
toJSON WebhookIssueCommentEvent = String "issue_comment"
toJSON WebhookIssuesEvent = String "issues"
toJSON WebhookLabelEvent = String "label"
toJSON WebhookMarketplacePurchaseEvent = String "marketplace_purchase"
toJSON WebhookMemberEvent = String "member"
toJSON WebhookMembershipEvent = String "membership"
toJSON WebhookMetaEvent = String "meta"
toJSON WebhookMilestoneEvent = String "milestone"
toJSON WebhookOrgBlockEvent = String "org_block"
toJSON WebhookOrganizationEvent = String "organization"
toJSON WebhookPackage = String "package"
toJSON WebhookPageBuildEvent = String "page_build"
toJSON WebhookPingEvent = String "ping"
toJSON WebhookProjectCardEvent = String "project_card"
toJSON WebhookProjectColumnEvent = String "project_column"
toJSON WebhookProjectEvent = String "project"
toJSON WebhookPublicEvent = String "public"
toJSON WebhookPullRequestEvent = String "pull_request"
toJSON WebhookPullRequestReviewCommentEvent = String "pull_request_review_comment"
toJSON WebhookPullRequestReviewEvent = String "pull_request_review"
toJSON WebhookPushEvent = String "push"
toJSON WebhookRegistryPackageEvent = String "registry_package"
toJSON WebhookReleaseEvent = String "release"
toJSON WebhookRepositoryDispatch = String "repository_dispatch"
toJSON WebhookRepositoryEvent = String "repository"
toJSON WebhookRepositoryImportEvent = String "repository_import"
toJSON WebhookRepositoryVulnerabilityAlertEvent = String "repository_vulnerability_alert"
toJSON WebhookSecretScanningAlert = String "secret_scanning_alert"
toJSON WebhookSecurityAdvisoryEvent = String "security_advisory"
toJSON WebhookSponsorship = String "sponsorship"
toJSON WebhookStarEvent = String "star"
toJSON WebhookStatusEvent = String "status"
toJSON WebhookTeamAddEvent = String "team_add"
toJSON WebhookTeamEvent = String "team"
toJSON WebhookWatchEvent = String "watch"
toJSON WebhookWorkflowDispatch = String "workflow_dispatch"
toJSON WebhookWorkflowRun = String "workflow_run"
instance FromJSON RepoWebhook where
parseJSON = withObject "RepoWebhook" $ \o -> RepoWebhook
<$> o .: "url"
<*> o .: "test_url"
<*> o .: "id"
<*> o .: "name"
<*> o .: "active"
<*> o .: "events"
<*> o .: "config"
<*> o .: "last_response"
<*> o .: "updated_at"
<*> o .: "created_at"
instance FromJSON RepoWebhookResponse where
parseJSON = withObject "RepoWebhookResponse" $ \o -> RepoWebhookResponse
<$> o .: "code"
<*> o .:? "status"
<*> o .:? "message"
instance ToJSON NewRepoWebhook where
toJSON (NewRepoWebhook { newRepoWebhookName = name
, newRepoWebhookConfig = config
, newRepoWebhookEvents = events
, newRepoWebhookActive = active
}) = object
[ "name" .= name
, "config" .= config
, "events" .= events
, "active" .= active
]
instance ToJSON EditRepoWebhook where
toJSON (EditRepoWebhook { editRepoWebhookConfig = config
, editRepoWebhookEvents = events
, editRepoWebhookAddEvents = addEvents
, editRepoWebhookRemoveEvents = removeEvents
, editRepoWebhookActive = active
}) = object
[ "config" .= config
, "events" .= events
, "add_events" .= addEvents
, "remove_events" .= removeEvents
, "active" .= active
]
instance FromJSON PingEvent where
parseJSON = withObject "PingEvent" $ \o -> PingEvent
<$> o .: "zen"
<*> o .: "hook"
<*> o .: "hook_id"
| null | https://raw.githubusercontent.com/haskell-github/github/81d9b658c33a706f18418211a78d2690752518a4/src/GitHub/Data/Webhooks.hs | haskell | ---------------------------------------------------------------------------
|
License : BSD-3-Clause
| See </#events>.
JSON instances | Maintainer : < >
module GitHub.Data.Webhooks where
import GitHub.Data.Id (Id)
import GitHub.Data.URL (URL)
import GitHub.Internal.Prelude
import Prelude ()
import qualified Data.Map as M
import qualified Data.Text as T
data RepoWebhook = RepoWebhook
{ repoWebhookUrl :: !URL
, repoWebhookTestUrl :: !URL
, repoWebhookId :: !(Id RepoWebhook)
, repoWebhookName :: !Text
, repoWebhookActive :: !Bool
, repoWebhookEvents :: !(Vector RepoWebhookEvent)
, repoWebhookConfig :: !(M.Map Text Text)
, repoWebhookLastResponse :: !RepoWebhookResponse
, repoWebhookUpdatedAt :: !UTCTime
, repoWebhookCreatedAt :: !UTCTime
}
deriving (Show, Data, Typeable, Eq, Ord, Generic)
instance NFData RepoWebhook where rnf = genericRnf
instance Binary RepoWebhook
data RepoWebhookEvent
= WebhookWildcardEvent
| WebhookCheckRunEvent
| WebhookCheckSuiteEvent
| WebhookCodeScanningAlert
| WebhookCommitCommentEvent
| WebhookContentReferenceEvent
| WebhookCreateEvent
| WebhookDeleteEvent
| WebhookDeployKeyEvent
| WebhookDeploymentEvent
| WebhookDeploymentStatusEvent
| WebhookDiscussion
| WebhookDiscussionComment
| WebhookDownloadEvent
| WebhookFollowEvent
| WebhookForkEvent
| WebhookGistEvent
| WebhookGitHubAppAuthorizationEvent
| WebhookGollumEvent
| WebhookInstallationEvent
| WebhookInstallationRepositoriesEvent
| WebhookIssueCommentEvent
| WebhookIssuesEvent
| WebhookLabelEvent
| WebhookMarketplacePurchaseEvent
| WebhookMemberEvent
| WebhookMembershipEvent
| WebhookMetaEvent
| WebhookMilestoneEvent
| WebhookOrgBlockEvent
| WebhookOrganizationEvent
| WebhookPackage
| WebhookPageBuildEvent
| WebhookPingEvent
| WebhookProjectCardEvent
| WebhookProjectColumnEvent
| WebhookProjectEvent
| WebhookPublicEvent
| WebhookPullRequestEvent
| WebhookPullRequestReviewCommentEvent
| WebhookPullRequestReviewEvent
| WebhookPushEvent
| WebhookRegistryPackageEvent
| WebhookReleaseEvent
| WebhookRepositoryDispatch
| WebhookRepositoryEvent
| WebhookRepositoryImportEvent
| WebhookRepositoryVulnerabilityAlertEvent
| WebhookSecretScanningAlert
| WebhookSecurityAdvisoryEvent
| WebhookSponsorship
| WebhookStarEvent
| WebhookStatusEvent
| WebhookTeamAddEvent
| WebhookTeamEvent
| WebhookWatchEvent
| WebhookWorkflowDispatch
| WebhookWorkflowRun
deriving (Show, Data, Typeable, Eq, Ord, Generic)
instance NFData RepoWebhookEvent where rnf = genericRnf
instance Binary RepoWebhookEvent
data RepoWebhookResponse = RepoWebhookResponse
{ repoWebhookResponseCode :: !(Maybe Int)
, repoWebhookResponseStatus :: !(Maybe Text)
, repoWebhookResponseMessage :: !(Maybe Text)
}
deriving (Show, Data, Typeable, Eq, Ord, Generic)
instance NFData RepoWebhookResponse where rnf = genericRnf
instance Binary RepoWebhookResponse
data PingEvent = PingEvent
{ pingEventZen :: !Text
, pingEventHook :: !RepoWebhook
, pingEventHookId :: !(Id RepoWebhook)
}
deriving (Show, Data, Typeable, Eq, Ord, Generic)
instance NFData PingEvent where rnf = genericRnf
instance Binary PingEvent
data NewRepoWebhook = NewRepoWebhook
{ newRepoWebhookName :: !Text
, newRepoWebhookConfig :: !(M.Map Text Text)
, newRepoWebhookEvents :: !(Maybe (Vector RepoWebhookEvent))
, newRepoWebhookActive :: !(Maybe Bool)
}
deriving (Eq, Ord, Show, Typeable, Data, Generic)
instance NFData NewRepoWebhook where rnf = genericRnf
instance Binary NewRepoWebhook
data EditRepoWebhook = EditRepoWebhook
{ editRepoWebhookConfig :: !(Maybe (M.Map Text Text))
, editRepoWebhookEvents :: !(Maybe (Vector RepoWebhookEvent))
, editRepoWebhookAddEvents :: !(Maybe (Vector RepoWebhookEvent))
, editRepoWebhookRemoveEvents :: !(Maybe (Vector RepoWebhookEvent))
, editRepoWebhookActive :: !(Maybe Bool)
}
deriving (Eq, Ord, Show, Typeable, Data, Generic)
instance NFData EditRepoWebhook where rnf = genericRnf
instance Binary EditRepoWebhook
instance FromJSON RepoWebhookEvent where
parseJSON = withText "RepoWebhookEvent" $ \t -> case T.toLower t of
"*" -> pure WebhookWildcardEvent
"check_run" -> pure WebhookCheckRunEvent
"check_suite" -> pure WebhookCheckSuiteEvent
"code_scanning_alert" -> pure WebhookCodeScanningAlert
"commit_comment" -> pure WebhookCommitCommentEvent
"content_reference" -> pure WebhookContentReferenceEvent
"create" -> pure WebhookCreateEvent
"delete" -> pure WebhookDeleteEvent
"deploy_key" -> pure WebhookDeployKeyEvent
"deployment" -> pure WebhookDeploymentEvent
"deployment_status" -> pure WebhookDeploymentStatusEvent
"discussion" -> pure WebhookDiscussion
"discussion_comment" -> pure WebhookDiscussionComment
"download" -> pure WebhookDownloadEvent
"follow" -> pure WebhookFollowEvent
"fork" -> pure WebhookForkEvent
"gist" -> pure WebhookGistEvent
"github_app_authorization" -> pure WebhookGitHubAppAuthorizationEvent
"gollum" -> pure WebhookGollumEvent
"installation" -> pure WebhookInstallationEvent
"installation_repositories" -> pure WebhookInstallationRepositoriesEvent
"issue_comment" -> pure WebhookIssueCommentEvent
"issues" -> pure WebhookIssuesEvent
"label" -> pure WebhookLabelEvent
"marketplace_purchase" -> pure WebhookMarketplacePurchaseEvent
"member" -> pure WebhookMemberEvent
"membership" -> pure WebhookMembershipEvent
"meta" -> pure WebhookMetaEvent
"milestone" -> pure WebhookMilestoneEvent
"org_block" -> pure WebhookOrgBlockEvent
"organization" -> pure WebhookOrganizationEvent
"package" -> pure WebhookPackage
"page_build" -> pure WebhookPageBuildEvent
"ping" -> pure WebhookPingEvent
"project" -> pure WebhookProjectEvent
"project_card" -> pure WebhookProjectCardEvent
"project_column" -> pure WebhookProjectColumnEvent
"public" -> pure WebhookPublicEvent
"pull_request" -> pure WebhookPullRequestEvent
"pull_request_review" -> pure WebhookPullRequestReviewEvent
"pull_request_review_comment" -> pure WebhookPullRequestReviewCommentEvent
"push" -> pure WebhookPushEvent
"registry_package" -> pure WebhookRegistryPackageEvent
"release" -> pure WebhookReleaseEvent
"repository" -> pure WebhookRepositoryEvent
"repository_dispatch" -> pure WebhookRepositoryDispatch
"repository_import" -> pure WebhookRepositoryImportEvent
"repository_vulnerability_alert" -> pure WebhookRepositoryVulnerabilityAlertEvent
"secret_scanning_alert" -> pure WebhookSecretScanningAlert
"security_advisory" -> pure WebhookSecurityAdvisoryEvent
"sponsorship" -> pure WebhookSponsorship
"star" -> pure WebhookStarEvent
"status" -> pure WebhookStatusEvent
"team" -> pure WebhookTeamEvent
"team_add" -> pure WebhookTeamAddEvent
"watch" -> pure WebhookWatchEvent
"workflow_dispatch" -> pure WebhookWorkflowDispatch
"workflow_run" -> pure WebhookWorkflowRun
_ -> fail $ "Unknown RepoWebhookEvent: " <> T.unpack t
instance ToJSON RepoWebhookEvent where
toJSON WebhookWildcardEvent = String "*"
toJSON WebhookCheckRunEvent = String "check_run"
toJSON WebhookCheckSuiteEvent = String "check_suite"
toJSON WebhookCodeScanningAlert = String "code_scanning_alert"
toJSON WebhookCommitCommentEvent = String "commit_comment"
toJSON WebhookContentReferenceEvent = String "content_reference"
toJSON WebhookCreateEvent = String "create"
toJSON WebhookDeleteEvent = String "delete"
toJSON WebhookDeployKeyEvent = String "deploy_key"
toJSON WebhookDeploymentEvent = String "deployment"
toJSON WebhookDeploymentStatusEvent = String "deployment_status"
toJSON WebhookDiscussion = String "discussion"
toJSON WebhookDiscussionComment = String "discussion_comment"
toJSON WebhookDownloadEvent = String "download"
toJSON WebhookFollowEvent = String "follow"
toJSON WebhookForkEvent = String "fork"
toJSON WebhookGistEvent = String "gist"
toJSON WebhookGitHubAppAuthorizationEvent = String "github_app_authorization"
toJSON WebhookGollumEvent = String "gollum"
toJSON WebhookInstallationEvent = String "installation"
toJSON WebhookInstallationRepositoriesEvent = String "installation_repositories"
toJSON WebhookIssueCommentEvent = String "issue_comment"
toJSON WebhookIssuesEvent = String "issues"
toJSON WebhookLabelEvent = String "label"
toJSON WebhookMarketplacePurchaseEvent = String "marketplace_purchase"
toJSON WebhookMemberEvent = String "member"
toJSON WebhookMembershipEvent = String "membership"
toJSON WebhookMetaEvent = String "meta"
toJSON WebhookMilestoneEvent = String "milestone"
toJSON WebhookOrgBlockEvent = String "org_block"
toJSON WebhookOrganizationEvent = String "organization"
toJSON WebhookPackage = String "package"
toJSON WebhookPageBuildEvent = String "page_build"
toJSON WebhookPingEvent = String "ping"
toJSON WebhookProjectCardEvent = String "project_card"
toJSON WebhookProjectColumnEvent = String "project_column"
toJSON WebhookProjectEvent = String "project"
toJSON WebhookPublicEvent = String "public"
toJSON WebhookPullRequestEvent = String "pull_request"
toJSON WebhookPullRequestReviewCommentEvent = String "pull_request_review_comment"
toJSON WebhookPullRequestReviewEvent = String "pull_request_review"
toJSON WebhookPushEvent = String "push"
toJSON WebhookRegistryPackageEvent = String "registry_package"
toJSON WebhookReleaseEvent = String "release"
toJSON WebhookRepositoryDispatch = String "repository_dispatch"
toJSON WebhookRepositoryEvent = String "repository"
toJSON WebhookRepositoryImportEvent = String "repository_import"
toJSON WebhookRepositoryVulnerabilityAlertEvent = String "repository_vulnerability_alert"
toJSON WebhookSecretScanningAlert = String "secret_scanning_alert"
toJSON WebhookSecurityAdvisoryEvent = String "security_advisory"
toJSON WebhookSponsorship = String "sponsorship"
toJSON WebhookStarEvent = String "star"
toJSON WebhookStatusEvent = String "status"
toJSON WebhookTeamAddEvent = String "team_add"
toJSON WebhookTeamEvent = String "team"
toJSON WebhookWatchEvent = String "watch"
toJSON WebhookWorkflowDispatch = String "workflow_dispatch"
toJSON WebhookWorkflowRun = String "workflow_run"
instance FromJSON RepoWebhook where
parseJSON = withObject "RepoWebhook" $ \o -> RepoWebhook
<$> o .: "url"
<*> o .: "test_url"
<*> o .: "id"
<*> o .: "name"
<*> o .: "active"
<*> o .: "events"
<*> o .: "config"
<*> o .: "last_response"
<*> o .: "updated_at"
<*> o .: "created_at"
instance FromJSON RepoWebhookResponse where
parseJSON = withObject "RepoWebhookResponse" $ \o -> RepoWebhookResponse
<$> o .: "code"
<*> o .:? "status"
<*> o .:? "message"
instance ToJSON NewRepoWebhook where
toJSON (NewRepoWebhook { newRepoWebhookName = name
, newRepoWebhookConfig = config
, newRepoWebhookEvents = events
, newRepoWebhookActive = active
}) = object
[ "name" .= name
, "config" .= config
, "events" .= events
, "active" .= active
]
instance ToJSON EditRepoWebhook where
toJSON (EditRepoWebhook { editRepoWebhookConfig = config
, editRepoWebhookEvents = events
, editRepoWebhookAddEvents = addEvents
, editRepoWebhookRemoveEvents = removeEvents
, editRepoWebhookActive = active
}) = object
[ "config" .= config
, "events" .= events
, "add_events" .= addEvents
, "remove_events" .= removeEvents
, "active" .= active
]
instance FromJSON PingEvent where
parseJSON = withObject "PingEvent" $ \o -> PingEvent
<$> o .: "zen"
<*> o .: "hook"
<*> o .: "hook_id"
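The instance pairs above form a hand-maintained bidirectional mapping between constructors and wire strings; the `FromJSON` side must mirror the `ToJSON` side exactly. A small Python sketch (names are hypothetical, not part of github-types) of deriving both directions from a single table so they cannot drift apart:

```python
# One table drives both directions, so encode/decode cannot disagree.
EVENT_NAMES = {
    "WebhookPingEvent": "ping",
    "WebhookPushEvent": "push",
    "WebhookStarEvent": "star",
    "WebhookWildcardEvent": "*",
}

def to_json(event: str) -> str:
    """Encode a constructor name to its wire string (like toJSON)."""
    return EVENT_NAMES[event]

def from_json(name: str) -> str:
    """Decode a wire string back to a constructor name (like parseJSON)."""
    for ctor, wire in EVENT_NAMES.items():
        if wire == name:
            return ctor
    raise ValueError(f"Unknown RepoWebhookEvent: {name}")

# Round trip: every constructor survives encode -> decode.
assert all(from_json(to_json(c)) == c for c in EVENT_NAMES)
```

In Haskell the same invariant is usually guaranteed by deriving both instances from one association list or via Generics; writing the two instances by hand, as above, relies on review to keep them in sync.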
|
9e23d07ec69d763ca566b4cfab37a6c569a510ea8b781f4aace4e1ce75f0ff1a | wavewave/hoodle | Coroutine.hs | {-# LANGUAGE ScopedTypeVariables #-}
-- |
-- Module : Hoodle.Script.Coroutine
-- Copyright : (c) 2012-2015
--
-- License : BSD3
-- Maintainer : < >
-- Stability : experimental
-- Portability : GHC
module Hoodle.Script.Coroutine where
import Control.Lens (view)
import Control.Monad.State (get, liftIO)
import Control.Monad.Trans.Maybe (MaybeT (..))
import Data.Hoodle.Simple (Hoodle)
import qualified Hoodle.Script.Hook as H
import Hoodle.Type.Coroutine (MainCoroutine)
import Hoodle.Type.HoodleState (hookSet)
--
-- |
afterSaveHook :: FilePath -> Hoodle -> MainCoroutine ()
afterSaveHook fp hdl = do
xstate <- get
let aftersavehk = do
hset <- view hookSet xstate
H.afterSaveHook hset
maybe (return ()) (\f -> liftIO (f fp hdl)) aftersavehk
-- |
saveAsHook :: FilePath -> Hoodle -> MainCoroutine ()
saveAsHook _fp hdl = do
xstate <- get
let saveashk = do
hset <- view hookSet xstate
H.saveAsHook hset
maybe (return ()) (\f -> liftIO (f hdl)) saveashk
hoist :: (Monad m) => Maybe a -> MaybeT m a
hoist = MaybeT . return
-- |
recentFolderHook :: MainCoroutine (Maybe FilePath)
recentFolderHook = do
xstate <- get
(r :: Maybe FilePath) <- runMaybeT $ do
hset <- hoist (view hookSet xstate)
rfolder <- hoist (H.recentFolderHook hset)
liftIO rfolder
return r
-- |
embedPredefinedImageHook :: MainCoroutine (Maybe FilePath)
embedPredefinedImageHook = do
xstate <- get
(r :: Maybe FilePath) <- runMaybeT $ do
hset <- hoist (view hookSet xstate)
rfilename <- hoist (H.embedPredefinedImageHook hset)
liftIO rfilename
return r
-- | temporary
embedPredefinedImage2Hook :: MainCoroutine (Maybe FilePath)
embedPredefinedImage2Hook = do
xstate <- get
(r :: Maybe FilePath) <- runMaybeT $ do
hset <- hoist (view hookSet xstate)
rfilename <- hoist (H.embedPredefinedImage2Hook hset)
liftIO rfilename
return r
-- | temporary
embedPredefinedImage3Hook :: MainCoroutine (Maybe FilePath)
embedPredefinedImage3Hook = do
xstate <- get
(r :: Maybe FilePath) <- runMaybeT $ do
hset <- hoist (view hookSet xstate)
rfilename <- hoist (H.embedPredefinedImage3Hook hset)
liftIO rfilename
return r
| null | https://raw.githubusercontent.com/wavewave/hoodle/fa7481d14a53733b2f6ae9debc95357d904a943c/core/src/Hoodle/Script/Coroutine.hs | haskell | |
Module : Hoodle.Script.Coroutine
License : BSD3
Stability : experimental
|
|
|
|
| temporary
| temporary | # LANGUAGE ScopedTypeVariables #
Copyright : ( c ) 2012 - 2015
Maintainer : < >
Portability : GHC
module Hoodle.Script.Coroutine where
import Control.Lens (view)
import Control.Monad.State (get, liftIO)
import Control.Monad.Trans.Maybe (MaybeT (..))
import Data.Hoodle.Simple (Hoodle)
import qualified Hoodle.Script.Hook as H
import Hoodle.Type.Coroutine (MainCoroutine)
import Hoodle.Type.HoodleState (hookSet)
afterSaveHook :: FilePath -> Hoodle -> MainCoroutine ()
afterSaveHook fp hdl = do
xstate <- get
let aftersavehk = do
hset <- view hookSet xstate
H.afterSaveHook hset
maybe (return ()) (\f -> liftIO (f fp hdl)) aftersavehk
saveAsHook :: FilePath -> Hoodle -> MainCoroutine ()
saveAsHook _fp hdl = do
xstate <- get
let saveashk = do
hset <- view hookSet xstate
H.saveAsHook hset
maybe (return ()) (\f -> liftIO (f hdl)) saveashk
hoist :: (Monad m) => Maybe a -> MaybeT m a
hoist = MaybeT . return
recentFolderHook :: MainCoroutine (Maybe FilePath)
recentFolderHook = do
xstate <- get
(r :: Maybe FilePath) <- runMaybeT $ do
hset <- hoist (view hookSet xstate)
rfolder <- hoist (H.recentFolderHook hset)
liftIO rfolder
return r
embedPredefinedImageHook :: MainCoroutine (Maybe FilePath)
embedPredefinedImageHook = do
xstate <- get
(r :: Maybe FilePath) <- runMaybeT $ do
hset <- hoist (view hookSet xstate)
rfilename <- hoist (H.embedPredefinedImageHook hset)
liftIO rfilename
return r
embedPredefinedImage2Hook :: MainCoroutine (Maybe FilePath)
embedPredefinedImage2Hook = do
xstate <- get
(r :: Maybe FilePath) <- runMaybeT $ do
hset <- hoist (view hookSet xstate)
rfilename <- hoist (H.embedPredefinedImage2Hook hset)
liftIO rfilename
return r
embedPredefinedImage3Hook :: MainCoroutine (Maybe FilePath)
embedPredefinedImage3Hook = do
xstate <- get
(r :: Maybe FilePath) <- runMaybeT $ do
hset <- hoist (view hookSet xstate)
rfilename <- hoist (H.embedPredefinedImage3Hook hset)
liftIO rfilename
return r
|
5ca5b53a1cd24b627b5c818c66de921874b20d1df82d8c5c67c41ca4cabf99a9 | AccelerationNet/function-cache | ordered.lisp | (in-package :function-cache)
(deftype cache-node () '(or null cnode))
(defstruct (cnode
(:constructor cnode (cache-key result &optional older newer)))
cache-key
result
(older nil :type cache-node)
(newer nil :type cache-node))
(defclass ordered-cache-mixin (cache-capacity-mixin)
((oldest :initform nil :type cache-node :accessor oldest-node)
(newest :initform nil :type cache-node :accessor newest-node))
(:documentation "Mixin that keeps track of the order of cached results in a doubly linked list.
OLDEST references the oldest cached result, and NEWEST references the most recent."))
(defun %add-cached-node (cache node)
(declare (ordered-cache-mixin cache) (cnode node))
"Add a node to the last position of the cache."
(with-slots (oldest newest) cache
(etypecase newest
(cnode (setf (cnode-newer newest) node)
(setf (cnode-older node) newest)
(setf newest node))
(null (setf newest node)
(setf oldest node))))
node)
(defun %remove-cached-node (cache node)
(declare (ordered-cache-mixin cache) (cnode node))
"Remove a node from the cache."
(with-slots (oldest newest) cache
(let ((newer (cnode-newer node))
(older (cnode-older node)))
(if older
(setf (cnode-newer older) newer)
(setf oldest newer))
(if newer
(setf (cnode-older newer) older)
(setf newest older))))
node)
(defun %move-cached-node (cache node)
(declare (ordered-cache-mixin cache) (cnode node))
"Move a node to the end of the cache, should be called when a cached result has been used."
(%remove-cached-node cache node)
(setf (cnode-newer node) nil)
(setf (cnode-older node) nil)
(%add-cached-node cache node))
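The three `%`-helpers above implement an intrusive doubly linked list over `cnode`, with `oldest`/`newest` as the two ends; a "move" is just remove plus re-add at the newest end. A minimal Python analogue (class and method names are mine, not this library's):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.older = None   # like cnode-older
        self.newer = None   # like cnode-newer

class Order:
    """Tracks usage order; `oldest` and `newest` are the list ends."""
    def __init__(self):
        self.oldest = None
        self.newest = None

    def add(self, node):                 # mirrors %add-cached-node
        if self.newest is not None:
            self.newest.newer = node
            node.older = self.newest
            self.newest = node
        else:
            self.newest = self.oldest = node
        return node

    def remove(self, node):              # mirrors %remove-cached-node
        if node.older is not None:
            node.older.newer = node.newer
        else:
            self.oldest = node.newer
        if node.newer is not None:
            node.newer.older = node.older
        else:
            self.newest = node.older
        return node

    def touch(self, node):               # mirrors %move-cached-node
        self.remove(node)
        node.older = node.newer = None
        return self.add(node)

    def keys_oldest_first(self):
        out, n = [], self.oldest
        while n is not None:
            out.append(n.key)
            n = n.newer
        return out

o = Order()
a, b, c = (o.add(Node(k)) for k in "abc")
o.touch(a)                               # 'a' becomes most recently used
assert o.keys_oldest_first() == ["b", "c", "a"]
```

All three operations are O(1) pointer surgery, which is why LRU/MRU caches pair a hash table (for lookup) with exactly this list (for order).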
(defmethod get-cached-value :around ((cache ordered-cache-mixin) cache-key)
(multiple-value-bind (result-node cached-at) (call-next-method)
(if result-node
(progn
(%move-cached-node cache result-node) ; Move the result to the end if there was a cached result.
(values (cnode-result result-node) cached-at))
(values nil nil))))
(defmethod (setf get-cached-value) :around (new (cache ordered-cache-mixin) cache-key)
(let ((node (cnode cache-key new)))
(call-next-method node cache cache-key)
(%add-cached-node cache node)))
(defun sync-ordered-cache (cache)
(declare (ordered-cache-mixin cache))
"Remove any nodes from the dlist that are no longer in the actual cache."
(iter (for node first (oldest-node cache) then (cnode-newer node))
(while node)
(for key = (cnode-cache-key node))
(unless (key-cached? cache key)
(%remove-cached-node cache node))))
(defmethod clear-cache-partial-arguments :after ((cache ordered-cache-mixin) to-match)
(sync-ordered-cache cache))
(defmethod clear-cache :after ((cache ordered-cache-mixin) &optional args)
(declare (ignore args))
(sync-ordered-cache cache))
(defmethod purge-cache :after ((cache ordered-cache-mixin))
(sync-ordered-cache cache))
(defclass lru-cache (ordered-cache-mixin hash-table-function-cache)
()
(:documentation "LRU cache backed by a hash-table.
Maintains capacity by removing least recently used cached values."))
(defmethod reduce-cached-set ((cache lru-cache) n)
(iter
(with ht = (cached-results cache))
(for i from 0 below n)
(for node first (oldest-node cache) then (cnode-newer node))
(while node)
(remhash (cnode-cache-key node) ht)
(%remove-cached-node cache node)))
(defclass mru-cache (ordered-cache-mixin hash-table-function-cache)
()
(:documentation "MRU cache backed by a hash-table.
Maintains capacity by removing the most recently used cached values."))
(defmethod reduce-cached-set ((cache mru-cache) n)
(iter
(with ht = (cached-results cache))
(for i from 0 below n)
(for node first (newest-node cache) then (cnode-older node))
(while node)
(remhash (cnode-cache-key node) ht)
(%remove-cached-node cache node)))
| null | https://raw.githubusercontent.com/AccelerationNet/function-cache/6a5ada401e57da2c8abf046f582029926e61fce8/src/ordered.lisp | lisp | Move the result to the end if there was a cached result. | (in-package :function-cache)
(deftype cache-node () '(or null cnode))
(defstruct (cnode
(:constructor cnode (cache-key result &optional older newer)))
cache-key
result
(older nil :type cache-node)
(newer nil :type cache-node))
(defclass ordered-cache-mixin (cache-capacity-mixin)
((oldest :initform nil :type cache-node :accessor oldest-node)
(newest :initform nil :type cache-node :accessor newest-node))
(:documentation "Mixin that keeps track of the order of cached results in a doubly linked list.
OLDEST references the oldest cached result, and NEWEST references the most recent."))
(defun %add-cached-node (cache node)
(declare (ordered-cache-mixin cache) (cnode node))
"Add a node to the last position of the cache."
(with-slots (oldest newest) cache
(etypecase newest
(cnode (setf (cnode-newer newest) node)
(setf (cnode-older node) newest)
(setf newest node))
(null (setf newest node)
(setf oldest node))))
node)
(defun %remove-cached-node (cache node)
(declare (ordered-cache-mixin cache) (cnode node))
"Remove a node from the cache."
(with-slots (oldest newest) cache
(let ((newer (cnode-newer node))
(older (cnode-older node)))
(if older
(setf (cnode-newer older) newer)
(setf oldest newer))
(if newer
(setf (cnode-older newer) older)
(setf newest older))))
node)
(defun %move-cached-node (cache node)
(declare (ordered-cache-mixin cache) (cnode node))
"Move a node to the end of the cache, should be called when a cached result has been used."
(%remove-cached-node cache node)
(setf (cnode-newer node) nil)
(setf (cnode-older node) nil)
(%add-cached-node cache node))
(defmethod get-cached-value :around ((cache ordered-cache-mixin) cache-key)
(multiple-value-bind (result-node cached-at) (call-next-method)
(if result-node
(progn
(%move-cached-node cache result-node)
(values (cnode-result result-node) cached-at))
(values nil nil))))
(defmethod (setf get-cached-value) :around (new (cache ordered-cache-mixin) cache-key)
(let ((node (cnode cache-key new)))
(call-next-method node cache cache-key)
(%add-cached-node cache node)))
(defun sync-ordered-cache (cache)
(declare (ordered-cache-mixin cache))
"Remove any nodes from the dlist that are no longer in the actual cache."
(iter (for node first (oldest-node cache) then (cnode-newer node))
(while node)
(for key = (cnode-cache-key node))
(unless (key-cached? cache key)
(%remove-cached-node cache node))))
(defmethod clear-cache-partial-arguments :after ((cache ordered-cache-mixin) to-match)
(sync-ordered-cache cache))
(defmethod clear-cache :after ((cache ordered-cache-mixin) &optional args)
(declare (ignore args))
(sync-ordered-cache cache))
(defmethod purge-cache :after ((cache ordered-cache-mixin))
(sync-ordered-cache cache))
(defclass lru-cache (ordered-cache-mixin hash-table-function-cache)
()
(:documentation "LRU cache backed by a hash-table.
Maintains capacity by removing least recently used cached values."))
(defmethod reduce-cached-set ((cache lru-cache) n)
(iter
(with ht = (cached-results cache))
(for i from 0 below n)
(for node first (oldest-node cache) then (cnode-newer node))
(while node)
(remhash (cnode-cache-key node) ht)
(%remove-cached-node cache node)))
(defclass mru-cache (ordered-cache-mixin hash-table-function-cache)
()
(:documentation "MRU cache backed by a hash-table.
Maintains capacity by removing the most recently used cached values."))
(defmethod reduce-cached-set ((cache mru-cache) n)
(iter
(with ht = (cached-results cache))
(for i from 0 below n)
(for node first (newest-node cache) then (cnode-older node))
(while node)
(remhash (cnode-cache-key node) ht)
(%remove-cached-node cache node)))
|
2f8deaf080737b799335e7335d658ccb4db3b7ba93e432cbb22d4b301e53f258 | naominitel/hmx | hmx.ml | type const =
| CInt of int
| CBool of bool
type term =
| Var of string
| App of term * term
| Abs of string * term
| Let of string * term * term
| Const of const
type ty =
| TConst of string
| TVar of var
| TApp of ty * ty
and var = var_descr Union_find.point
and var_descr = {
mutable structure: ty option ;
mutable rank: int ;
name: string
}
let arrow = TConst "->"
let function_type t1 t2 =
TApp ((TApp (arrow, t1)), t2)
type ty_sch =
| Forall of var list * constr * ty
and constr =
| CBool of bool
| CApp of string * ty list
| CAnd of constr * constr
| CExists of var list * constr
| CDef of string * ty_sch * constr
| CInstance of string * ty
| CDump
let sch ty =
Forall ([], CBool true, ty)
let is_subtype = "="
let is_instance sch ty =
let Forall(v, c, t) = sch in
CExists (v, CAnd (c, CApp (is_subtype, [t ; ty])))
let has_instance sch =
let Forall(v, c, _) = sch in
CExists (v, c)
let letin var sch constr =
CDef (var, sch, constr)
let fresh_ty_var =
let next = ref 0 in
fun () ->
let name = Printf.sprintf "α%d" !next in
incr next ;
Union_find.fresh {
structure = None ;
name = name ;
rank = 0
}
let t_int = TConst "int"
let t_bool = TConst "bool"
let rec infer term ty = match term with
| Const (CInt _) -> CApp (is_subtype, [t_int ; ty])
| Const (CBool _) -> CApp (is_subtype, [t_bool ; ty])
| Var x -> CInstance (x, ty)
| Abs (x, t) ->
let x1 = fresh_ty_var () in
let x2 = fresh_ty_var () in
let constr_body = infer t (TVar x2) in
CExists ([x1 ; x2], CAnd (CDef (x, sch (TVar x1), constr_body),
CApp (is_subtype, [function_type (TVar x1) (TVar x2) ; ty])))
| Let (z, e, t) ->
let x = fresh_ty_var () in
letin z (Forall ([x], infer e (TVar x), TVar x)) (infer t ty)
| App (f, a) ->
let x2 = fresh_ty_var () in
CExists ([x2], CAnd (infer f (function_type (TVar x2) ty),
infer a (TVar x2)))
type def = Def of string * term
type prog = def list
let infer_prog p =
List.fold_right
(fun (Def (v, t)) acc ->
let x = fresh_ty_var () in
letin v (Forall ([x], infer t (TVar x), TVar x)) acc)
p CDump
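`infer` is HM(X)-style constraint generation: each syntactic form emits constraints over fresh type variables instead of unifying eagerly, and `App` introduces an existential for the argument type. A toy Python rendering of the Const/Var/App cases (the tuple encodings are illustrative, not the OCaml types above):

```python
import itertools
_fresh = itertools.count()

def fresh():
    """A fresh type-variable name, like fresh_ty_var."""
    return f"a{next(_fresh)}"

def infer(term, ty):
    """Return a constraint (nested tuples) asserting `term` has type `ty`."""
    kind = term[0]
    if kind == "int":                    # Const (CInt _)
        return ("=", "int", ty)
    if kind == "var":                    # Var x  ->  CInstance
        return ("instance", term[1], ty)
    if kind == "app":                    # App f a  ->  CExists/CAnd
        _, f, a = term
        x = fresh()                      # the argument's type
        return ("exists", x, ("and",
                infer(f, ("->", x, ty)),
                infer(a, x)))
    raise ValueError(kind)

c = infer(("app", ("var", "f"), ("int", 1)), "r")
# Structure: exists a. (f <: a -> r) and (int = a)
assert c[0] == "exists" and c[2][0] == "and"
```

A separate solver then walks the constraint tree, which is the split the HM(X) framework is built around: generation stays syntax-directed and trivial, solving carries all the unification logic.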
| null | https://raw.githubusercontent.com/naominitel/hmx/ed9ca9863a6a7691b50cd1f12440c410bc1f0b8a/hmx.ml | ocaml | type const =
| CInt of int
| CBool of bool
type term =
| Var of string
| App of term * term
| Abs of string * term
| Let of string * term * term
| Const of const
type ty =
| TConst of string
| TVar of var
| TApp of ty * ty
and var = var_descr Union_find.point
and var_descr = {
mutable structure: ty option ;
mutable rank: int ;
name: string
}
let arrow = TConst "->"
let function_type t1 t2 =
TApp ((TApp (arrow, t1)), t2)
type ty_sch =
| Forall of var list * constr * ty
and constr =
| CBool of bool
| CApp of string * ty list
| CAnd of constr * constr
| CExists of var list * constr
| CDef of string * ty_sch * constr
| CInstance of string * ty
| CDump
let sch ty =
Forall ([], CBool true, ty)
let is_subtype = "="
let is_instance sch ty =
let Forall(v, c, t) = sch in
CExists (v, CAnd (c, CApp (is_subtype, [t ; ty])))
let has_instance sch =
let Forall(v, c, _) = sch in
CExists (v, c)
let letin var sch constr =
CDef (var, sch, constr)
let fresh_ty_var =
let next = ref 0 in
fun () ->
let name = Printf.sprintf "α%d" !next in
incr next ;
Union_find.fresh {
structure = None ;
name = name ;
rank = 0
}
let t_int = TConst "int"
let t_bool = TConst "bool"
let rec infer term ty = match term with
| Const (CInt _) -> CApp (is_subtype, [t_int ; ty])
| Const (CBool _) -> CApp (is_subtype, [t_bool ; ty])
| Var x -> CInstance (x, ty)
| Abs (x, t) ->
let x1 = fresh_ty_var () in
let x2 = fresh_ty_var () in
let constr_body = infer t (TVar x2) in
CExists ([x1 ; x2], CAnd (CDef (x, sch (TVar x1), constr_body),
CApp (is_subtype, [function_type (TVar x1) (TVar x2) ; ty])))
| Let (z, e, t) ->
let x = fresh_ty_var () in
letin z (Forall ([x], infer e (TVar x), TVar x)) (infer t ty)
| App (f, a) ->
let x2 = fresh_ty_var () in
CExists ([x2], CAnd (infer f (function_type (TVar x2) ty),
infer a (TVar x2)))
type def = Def of string * term
type prog = def list
let infer_prog p =
List.fold_right
(fun (Def (v, t)) acc ->
let x = fresh_ty_var () in
letin v (Forall ([x], infer t (TVar x), TVar x)) acc)
p CDump
| |
765a7f3f9b67029f2ee0af07bd5b88728e0bbed3f02cace32146667f8ed7eaeb | ngless-toolkit/ngless | Transform.hs | Copyright 2016 - 2022 NGLess Authors
- License : MIT
- License: MIT
-}
# LANGUAGE FlexibleContexts #
module Transform
( transform
, pureTransform
, isVarUsed
, isVarUsed1
) where
import qualified Data.Text as T
import Control.Monad.Trans.Cont
import Control.Monad.Except
import Control.Monad.Writer
import Control.Monad.RWS
import Control.Arrow (first, second)
import Control.Monad.Identity (Identity(..), runIdentity)
import Control.Monad.State.Lazy
import Control.Monad.Extra (whenJust)
import Data.Maybe
import qualified Data.Hash.MD5 as MD5
import qualified Data.Map.Strict as M
import Data.List (sortOn)
import Language
import Modules
import Output (outputListLno', OutputType(..))
import NGLess
import Utils.Utils (uniq, secondM)
import NGLess.NGLEnvironment
import BuiltinFunctions (findFunction)
{-| NOTE
-
- Before interpretation, scripts are transformed to allow for several
- optimizations.
-
- INITIAL NORMALIZATION
-
- As a first step, the script is normalized, introducing temporary variables
- so that function calls do not contain nested expressions. For example:
-
- write(mapstats(samfile('input.sam')), ofile='stats.txt')
-
- is re-written to the equivalent of:
-
- temp$0 = samfile('input.sam')
- temp$1 = mapstats(temp$0)
- write(temp$1, ofile='stats.txt')
-
- Note that "temp$xx" are not valid ngless variable names. Thus, these
- temporary variables can only be introduced internally and will never clash
- with any user variables. All subsequent transformations can assume that the
- scripts have been normalized.
-
-}
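The normalization the NOTE describes is standard three-address flattening: every nested call is bound to a fresh temporary whose name cannot clash with user identifiers. A compact Python sketch of that pass (helper names are hypothetical):

```python
def flatten(expr, out, counter=[0]):
    """Flatten nested calls ('call', name, args...) into `out`, one call
    per statement. Returns the atom (string) naming the value of `expr`."""
    if isinstance(expr, str):            # already a variable/literal name
        return expr
    _, name, *args = expr
    flat_args = [flatten(a, out) for a in args]   # innermost calls first
    tmp = f"temp${counter[0]}"
    counter[0] += 1
    out.append((tmp, name, flat_args))
    return tmp

stmts = []
flatten(("call", "write",
         ("call", "mapstats", ("call", "samfile", "'input.sam'"))), stmts)
# One call per statement, innermost first:
#   temp$0 = samfile('input.sam')
#   temp$1 = mapstats(temp$0)
#   temp$2 = write(temp$1)
assert [name for _, name, _ in stmts] == ["samfile", "mapstats", "write"]
```

The mutable default `counter` keeps temp numbering global across calls, mimicking a fresh-name supply; a real implementation would thread that state explicitly.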
transform :: [Module] -> Script -> NGLessIO Script
transform mods sc = Script (nglHeader sc) <$> applyM transforms (nglBody sc)
where
applyM [] e = return e
applyM (t:ts) e = t e >>= applyM ts
transforms = preTransforms ++ modTransforms ++ builtinTransforms
modTransforms = map modTransform mods
preTransforms =
[ reassignPreprocess
, addTemporaries
, addOutputHash -- Hashing should be based on what the user input (pre-transforms)
, checkSimple
]
builtinTransforms =
[ writeToMove
, qcInPreprocess
, ifLenDiscardSpecial
, substrimReassign
, addFileChecks
, addRSChecks
, addIndexChecks
, addUseNewer
, addCountsCheck
]
{-| The condition is "one single function call per top level expression"
-}
checkSimple :: [(Int, Expression)] -> NGLessIO [(Int, Expression)]
checkSimple expr = forM_ expr (checkSimple1 . snd) *> return expr
where
checkSimple0 = \case
Condition{} -> throwShouldNotOccur "Non-simple expression (Condition)"
Assignment{} -> throwShouldNotOccur "Non-simple expression (Assignment)"
FunctionCall{} -> throwShouldNotOccur "Non-simple expression (FunctionCall)"
-- Rewriting for MethodCall is not implemented!
MethodCall{} -> throwShouldNotOccur "Non-simple expression (MethodCall)"
ListExpression s -> mapM_ checkSimple0 s
UnaryOp _ a -> checkSimple0 a
BinaryOp _ a b -> checkSimple0 a *> checkSimple0 b
IndexExpression e ix -> checkSimple0 e *> checkSimpleIndex ix
Sequence s -> mapM_ checkSimple0 s
Lookup{} -> return ()
ConstStr{} -> return ()
ConstInt{} -> return ()
ConstDouble{} -> return ()
ConstBool{} -> return ()
ConstSymbol{} -> return ()
BuiltinConstant{} -> return ()
Continue -> return ()
Discard -> return ()
Optimized{} -> return ()
checkSimpleIndex (IndexOne e) = checkSimple0 e
checkSimpleIndex (IndexTwo a b) = whenJust a checkSimple0 *> whenJust b checkSimple0
checkSimple1 = \case
Condition ifC ifT ifF ->
checkSimple0 ifC *>
checkSimple1 ifT *>
checkSimple1 ifF
Assignment _ e -> checkSimple1 e
-- NOT IMPLEMENTED, BUT SHOULD BE!
FunctionCall _ e kwargs Nothing ->
checkSimple0 e *>
forM_ kwargs (checkSimple0 . snd)
-- whenJust block (checkSimple1 . blockBody)
MethodCall _ e arg kwargs ->
checkSimple0 e *>
whenJust arg checkSimple0 *>
forM_ kwargs (checkSimple0 . snd)
Sequence s -> mapM_ checkSimple1 s
ListExpression s -> mapM_ checkSimple0 s
UnaryOp _ a -> checkSimple0 a
BinaryOp _ a b -> checkSimple0 a *> checkSimple0 b
e -> checkSimple0 e
asSequence :: [Expression] -> Expression
asSequence [e] = e
asSequence es = Sequence es
pureRecursiveTransform :: (Expression -> Expression) -> Expression -> Expression
pureRecursiveTransform f e = runIdentity (recursiveTransform (return . f) e)
-- | A little helper function which lifts a pure transform `Expression
-- -> Expression` into the generic `[(Int, Expression)] -> NGLessIO [(Int, Expression)]`
pureTransform :: (Expression -> Expression) -> [(Int,Expression)] -> NGLessIO [(Int, Expression)]
pureTransform f = return . map (second (pureRecursiveTransform f))
-- | Add an argument to a function call iff the expression includes that function call
addArgument :: T.Text -- ^ function name
-> (Variable, Expression) -- ^ new argument
-> Expression -- ^ expression to transform
-> Expression -- ^ transformed expression
addArgument func newArg expr = case expr of
Assignment v e -> Assignment v (addArgument func newArg e)
FunctionCall fname@(FuncName fname') e args b
| fname' == func ->
FunctionCall fname e (newArg:args) b
_ -> expr
-- | Checks if a variable is used in any of the given expressions
--
-- See 'isVarUsed1'
isVarUsed :: Variable -> [(Int, Expression)] -> Bool
isVarUsed v = any (isVarUsed1 v . snd)
-- | Checks if a variable is used in a single 'Expression'
--
-- See 'isVarUsed'
isVarUsed1 :: Variable -> Expression -> Bool
isVarUsed1 v expr = evalCont $ callCC $ \exit -> do
recursiveAnalyse (isVarUsed1' exit) expr
return False
where
isVarUsed1' :: (Bool -> Cont Bool ()) -> Expression -> Cont Bool ()
isVarUsed1' exit (Assignment v' _)
| v == v' = exit True
isVarUsed1' exit (Lookup _ v')
| v == v' = exit True
isVarUsed1' _ _ = return ()
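`isVarUsed1` runs a full traversal inside the continuation monad and uses `callCC` only to abort early on the first hit. In a language without first-class continuations, the same shape is usually an exception-based early exit, as in this Python sketch (the encodings are mine):

```python
class _Found(Exception):
    """Signals that the variable was found; plays the role of `exit True`."""

def is_var_used(var, expr):
    """expr: nested tuples such as ('lookup', v) or ('assign', v, e)."""
    def walk(e):
        if not isinstance(e, tuple):
            return
        if e[0] in ("lookup", "assign") and e[1] == var:
            raise _Found()               # abort the whole traversal
        for child in e[1:]:
            walk(child)
    try:
        walk(expr)
    except _Found:
        return True
    return False

assert is_var_used("x", ("seq", ("assign", "y", ("lookup", "x"))))
assert not is_var_used("z", ("seq", ("lookup", "x"), ("lookup", "y")))
```

The payoff is the same in both styles: the search stops at the first occurrence instead of visiting the rest of the tree.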
{- If a variable is not used after a call to write(), we can destroy it.
This is implemented by adding the argument __can_move=True to
write() calls -}
writeToMove :: [(Int, Expression)] -> NGLessIO [(Int, Expression)]
writeToMove = return . writeToMove' []
writeToMove' _ [] = []
writeToMove' blocked ((lno,expr):rest) = (lno, addMove toRemove expr):writeToMove' blocked' rest
where
toRemove = filter (`notElem` blocked) unused
unused = filter (not . flip isVarUsed rest) $ functionVars "write" expr
blocked' = blockhere ++ blocked
blockhere = case expr of
Assignment var (FunctionCall (FuncName fname) _ _ _)
| fname `elem` ["fastq", "paired", "samfile"] -> [var]
Assignment var (Lookup _ prev)
| prev `elem` blocked -> [var]
_ -> []
addMove :: [Variable] -> Expression -> Expression
addMove dead = pureRecursiveTransform addMove'
where
addMove' (FunctionCall f@(FuncName "write") e@(Lookup _ v) args b)
| v `elem` dead = FunctionCall f e ((Variable "__can_move", ConstBool True):args) b
addMove' e = e
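`writeToMove` is a last-use (liveness) analysis: a `write()` may destroy its input file only when the variable is dead in the remainder of the program. A Python sketch of that scan over a straight-line statement list (the boolean mirrors the `__can_move` flag above):

```python
def mark_moves(stmts):
    """stmts: list of ('write', var) / ('use', var). Returns a list of
    (stmt, can_move) pairs; can_move is True for a write whose variable
    is never mentioned by any later statement."""
    out = []
    for i, stmt in enumerate(stmts):
        can_move = (
            stmt[0] == "write"
            and all(later[1] != stmt[1] for later in stmts[i + 1:])
        )
        out.append((stmt, can_move))
    return out

prog = [("write", "sam"), ("use", "sam"), ("write", "sam")]
flags = [m for _, m in mark_moves(prog)]
assert flags == [False, False, True]   # only the final write may move
```

The quadratic rescan is fine here for the same reason the Haskell pass tolerates it: scripts are tiny, and preprocessing cost is irrelevant next to run time.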
-- | Variables used in calling the function func
functionVars :: T.Text -- ^ function name
-> Expression -- ^ expression to analyse
-> [Variable]
functionVars fname expr = execWriter (recursiveAnalyse fvars expr)
where
fvars :: Expression -> Writer [Variable] ()
fvars (FunctionCall (FuncName fname') (Lookup _ v) _ _)
| fname' == fname = tell [v]
fvars _ = return ()
qcInPreprocess :: [(Int, Expression)] -> NGLessIO [(Int, Expression)]
qcInPreprocess [] = return []
qcInPreprocess ((lno,expr):rest) = case fastQVar expr of
Nothing -> ((lno,expr):) <$> qcInPreprocess rest
Just (fname, v) -> if not $ canQCPreprocessTransform v rest
then ((lno,expr):) <$> qcInPreprocess rest
else do
let expr' = addArgument fname (Variable "__perform_qc", ConstBool False) expr
rest' = rewritePreprocess v rest
outputListLno' TraceOutput ["Transformation for QC triggered for variable ", show v, " on line ", show lno, "."]
((lno, expr'):) <$> qcInPreprocess rest'
rewritePreprocess _ [] = [] -- this should never happen
rewritePreprocess v ((lno,expr):rest) = case expr of
Assignment t (FunctionCall f@(FuncName "preprocess") e@(Lookup _ v') args b)
| v == v' ->
let expr' = FunctionCall f e ((Variable "__input_qc", ConstBool True):args) b
in (lno,Assignment t expr'):rest
_ -> (lno,expr):rewritePreprocess v rest
fastQVar :: Expression -> Maybe (T.Text, Variable)
fastQVar (Assignment v (FunctionCall (FuncName fname) _ _ _))
| fname `elem` ["fastq", "paired", "load_fastq_directory", "load_mocat_sample"] = Just (fname, v)
fastQVar _ = Nothing
-- The rule is: we can perform the transform if the first usage of the Variable
-- 'v' is in a preprocess call. Otherwise, it is not guaranteed to be safe
canQCPreprocessTransform :: Variable -> [(Int, Expression)] -> Bool
canQCPreprocessTransform _ [] = False
canQCPreprocessTransform v ((_,Assignment _ (FunctionCall (FuncName "preprocess") (Lookup _ v') _ _)):_)
| v' == v = True
canQCPreprocessTransform v ((_, expr):rest)
| isVarUsed1 v expr = False
| otherwise = canQCPreprocessTransform v rest
-- | 'ifLenDiscardSpecial' special cases a common case inside preprocess
-- blocks, namely:
--
-- if len(read) < #:
-- discard
--
-- gets rewritten to
--
-- Optimized (LenThresholdDiscard read < #)
--
ifLenDiscardSpecial :: [(Int, Expression)] -> NGLessIO [(Int, Expression)]
ifLenDiscardSpecial = pureTransform $ \case
(Condition (BinaryOp b (UnaryOp UOpLen (Lookup _ v)) (ConstInt thresh))
(Sequence [Discard])
(Sequence []))
| b `elem` [BOpLT, BOpLTE, BOpGT, BOpGTE] -> Optimized (LenThresholdDiscard v b (fromInteger thresh))
e -> e
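`ifLenDiscardSpecial` is a peephole rewrite: recognize one exact AST shape and replace it with a fused `Optimized` node, so the interpreter avoids re-evaluating the generic condition machinery per read. The same match-then-replace step in Python (node encodings are illustrative, and only one comparison operator is handled):

```python
def rewrite(node):
    """Fuse `if len(v) < k: discard` (empty else) into one optimized node."""
    if (isinstance(node, tuple) and node[0] == "if"
            and node[1][0] == "lt_len"          # condition: len(v) < k
            and node[2] == ("discard",)
            and node[3] == ()):                 # empty else-branch
        _, (_, var, k), _, _ = node
        return ("len_threshold_discard", var, k)
    return node

cond = ("if", ("lt_len", "read", 45), ("discard",), ())
assert rewrite(cond) == ("len_threshold_discard", "read", 45)
assert rewrite(("discard",)) == ("discard",)   # anything else passes through
```

The guard must match the whole shape, including the empty else-branch; a non-empty else falls through untouched, exactly as the catch-all `e -> e` case does above.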
substrimReassign :: [(Int, Expression)] -> NGLessIO [(Int, Expression)]
substrimReassign = pureTransform $ \case
(Assignment v (FunctionCall (FuncName "substrim") (Lookup _ v') [(Variable "min_quality", ConstInt mq)] Nothing))
| v == v' -> Optimized (SubstrimReassign v (fromInteger mq))
e -> e
-- | 'addFileChecks' implements the following transformation
--
-- variable = <non constant expression>
--
-- <code>
--
-- write(input, ofile="output/"+variable+".sam")
--
-- into
--
-- variable = <non constant expression>
-- __check_ofile("output/"+variable+".sam")
--
-- <code>
--
-- write(input, ofile="output/"+variable+".sam")
addFileChecks :: [(Int,Expression)] -> NGLessIO [(Int, Expression)]
addFileChecks sc = reverse <$> (checkIFiles (reverse sc) >>= checkOFiles)
-- convert to genericCheckUpfloat
where
-- This could be combined into a single pass
-- For script preprocessing, we generally disregard performance, however
checkIFiles = addFileChecks' "__check_ifile" ArgCheckFileReadable
checkOFiles = addFileChecks' "__check_ofile" ArgCheckFileWritable
addFileChecks' :: T.Text -> ArgCheck -> [(Int,Expression)] -> NGLessIO [(Int, Expression)]
addFileChecks' _ _ [] = return []
addFileChecks' checkFname tag ((lno,e):rest) = do
mods <- ngleLoadedModules <$> nglEnvironment
vars <- runNGLess $ execWriterT (recursiveAnalyse (getFileExpressions mods) e)
rest' <- addFileChecks' checkFname tag (addCheck vars (maybeAddChecks vars rest))
return ((lno,e):rest')
where
addCheck [(_, oexpr)] = ((lno, checkFileExpression oexpr):)
addCheck _ = id
maybeAddChecks :: [(Variable,Expression)] -> [(Int, Expression)] -> [(Int, Expression)]
maybeAddChecks _ [] = []
maybeAddChecks vars@[(v,complete)] ((lno',e'):rest') = case e' of
Assignment v' _
| v' == v -> (lno', checkFileExpression complete) : (lno', e') : rest'
_ -> (lno',e') : maybeAddChecks vars rest'
maybeAddChecks _ rest' = rest'
checkFileExpression complete = FunctionCall
(FuncName checkFname)
complete
[(Variable "original_lno", ConstInt (toInteger lno))]
Nothing
-- returns the variables used and expressions that depend on them
getFileExpressions :: [Module] -> Expression -> (WriterT [(Variable,Expression)] NGLess) ()
getFileExpressions mods (FunctionCall f expr args _) = case findFunction mods f of
Just finfo -> do
when (tag `elem` funcArgChecks finfo) $
extractExpressions (Just expr)
forM_ (funcKwArgs finfo) $ \ainfo ->
when (tag `elem` argChecks ainfo) $
extractExpressions (lookup (Variable $ argName ainfo) args)
Nothing -> throwShouldNotOccur ("Transform.getFileExpressions: Unknown function: '" ++ show f ++ "'. This should have been caught before")
getFileExpressions _ _ = return ()
extractExpressions :: (MonadWriter [(Variable, Expression)] m) => Maybe Expression -> m ()
extractExpressions (Just ofile) = case ofile of
BinaryOp _ re le -> case uniq (validVariables re ++ validVariables le) of
[v] -> tell [(v, ofile)]
_ -> return ()
Lookup _ v -> tell [(v, ofile)]
_ -> return ()
extractExpressions Nothing = return ()
validVariables (Lookup _ v) = [v]
validVariables (BinaryOp _ re le) = validVariables re ++ validVariables le
validVariables (ConstStr _) = []
        validVariables _ = [Variable "this", Variable "wont", Variable "work"] -- this causes the caller to bail out
addRSChecks :: [(Int,Expression)] -> NGLessIO [(Int, Expression)]
addRSChecks = return . genericCheckUpfloat addRSChecks'
addRSChecks' :: (Int, Expression) -> Maybe ([Variable],Expression)
addRSChecks' (lno, e) = case e of
Assignment _ (FunctionCall (FuncName "preprocess") lk@(Lookup _ v) _ _)
-> Just ([v],
FunctionCall (FuncName "__check_readset")
lk
[(Variable "original_lno", ConstInt (toInteger lno))]
Nothing)
_ -> Nothing
-- | 'addIndexChecks' implements the following transformation
--
-- array = <non constant expression>
--
-- <code>
--
-- array[ix]
--
-- into
--
-- array = <non constant expression>
-- __check_index_access(array, index1=ix, ...)
--
-- <code>
--
-- array[ix]
addIndexChecks :: [(Int,Expression)] -> NGLessIO [(Int, Expression)]
addIndexChecks = return . genericCheckUpfloat addIndexChecks'
addIndexChecks' :: (Int, Expression) -> Maybe ([Variable],Expression)
addIndexChecks' (lno, e) =
case execWriter (recursiveAnalyse extractIndexOne e) of
[] -> Nothing
vars -> Just (map fst vars, asSequence $ map (uncurry indexCheckExpr) vars)
where
extractIndexOne :: Expression -> Writer [(Variable, Expression)] ()
extractIndexOne (IndexExpression (Lookup _ v) (IndexOne ix1@ConstInt{})) = tell [(v, ix1)]
extractIndexOne _ = return ()
indexCheckExpr :: Variable -> Expression -> Expression
indexCheckExpr arr ix1 = FunctionCall
(FuncName "__check_index_access")
(Lookup Nothing arr)
[(Variable "original_lno", ConstInt (toInteger lno))
,(Variable "index1", ix1)]
Nothing
-- Many checks can be generalized so that certain expressions generate a
-- corresponding __check() function call. For example, for bounds checking, we transform
--
-- print(list[2])
--
-- into
--
-- __check_index_access(list, 2)
-- print(list[2])
--
--
-- More interesting, these can be "bubbled up" so that __check_index_access
-- moves up (floats up):
--
-- list = [1,2,3]
-- <code>
-- print(list[2])
--
-- transforms into
--
-- list = [1,2,3]
-- __check_index_access(list, 2)
-- <code>
-- print(list[2])
--
-- 'genericCheckUpfloat' generalizes this pattern
genericCheckUpfloat :: ((Int, Expression) -> Maybe ([Variable],Expression))
-> [(Int, Expression)]
-> [(Int, Expression)]
-- this is easier to do on the reversed script
genericCheckUpfloat f exprs = reverse $ genericCheckUpfloat' f (reverse exprs)
genericCheckUpfloat' :: ((Int, Expression) -> Maybe ([Variable],Expression))
-> [(Int, Expression)]
-> [(Int, Expression)]
genericCheckUpfloat' _ [] = []
genericCheckUpfloat' f (c@(lno, expr):rest) = case expr of
-- expand sequences
Sequence es -> genericCheckUpfloat' f (reverse [(lno,e) | e <- es] ++ rest)
    -- Conditions are tricky. At some point, NGLess would erroneously float
    -- checks above the Condition, so that
    --
    -- list = [1]
    --
    -- if len(list) > 1:
    --     print(list[1])
    --
    -- would trigger an error. Now, checks only float up within the block
Condition eC eT eF -> let
eT' = genericCheckUpfloat f [(lno, eT)]
eF' = genericCheckUpfloat f [(lno, eF)]
rest' = case f (lno,eC) of
Nothing -> rest
Just (vars, ne) -> floatDown vars (lno, ne) rest
untag tagged = asSequence (snd <$> tagged)
in
((lno, Condition eC (untag eT') (untag eF')):rest')
_ -> let
rest' = case recursiveCall f c of
Nothing -> rest
Just (vars, ne) -> floatDown vars (lno,ne) rest
in (c:genericCheckUpfloat' f rest')
recursiveCall :: ((Int, Expression) -> Maybe a) -> (Int, Expression) -> Maybe a
recursiveCall f (lno, e) = evalCont $ callCC $ \exit -> do
flip recursiveAnalyse e (\sub -> case f (lno, sub) of
Nothing -> return ()
j -> exit j)
return Nothing
floatDown :: [Variable] -> (Int, Expression) -> [(Int, Expression)] -> [(Int, Expression)]
floatDown _ e [] = [e]
floatDown vars e (c:rest)
| any (`isVarUsed1` snd c) vars = e : c : rest
| otherwise = c : floatDown vars e rest
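The interplay of `genericCheckUpfloat'` and `floatDown` can be seen in isolation: on the reversed statement list, an inserted check is pushed past every statement that does not mention the relevant variables. A minimal, self-contained sketch, using a toy statement representation (name plus variables used) rather than the NGLess AST:

```haskell
-- Toy model: a statement is a name plus the variables it uses.
-- floatDown pushes the check statement down the (reversed) list until
-- just before the first statement that uses one of the given variables.
floatDown :: [String] -> (String, [String]) -> [(String, [String])] -> [(String, [String])]
floatDown _ e [] = [e]
floatDown vars e (c:rest)
    | any (`elem` snd c) vars = e : c : rest
    | otherwise = c : floatDown vars e rest

main :: IO ()
main = print $ map fst $
    floatDown ["x"] ("check", ["x"])
        [("a", []), ("b", ["y"]), ("use_x", ["x"]), ("d", [])]
-- prints ["a","b","check","use_x","d"]
```

Because the caller reverses the script first, "pushing down" here corresponds to the check floating up to just after the statement that defines the variable.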
-- | Implements addition of temp$nn variables to simplify expressions
--
-- This allows the rest of the code to be simpler. Namely, there are no complex expressions.
addTemporaries = addTemporaries' 0
where
addTemporaries' :: Int -> [(Int,Expression)] -> NGLessIO [(Int,Expression)]
addTemporaries' _ [] = return []
addTemporaries' next ((lno,e):rest) = do
mods <- ngleLoadedModules <$> nglEnvironment
let (next', es) = addTemporaries1 mods next e
rest' <- addTemporaries' next' rest
let lno_e' = (lno,) <$> es
return $ lno_e' ++ rest'
addTemporaries1 :: [Module] -> Int -> Expression -> (Int, [Expression])
addTemporaries1 _ next e@(FunctionCall _ _ _ (Just _)) = (next, [e])
addTemporaries1 _ next e@(Assignment _ (FunctionCall _ _ _ (Just _))) = (next, [e])
addTemporaries1 mods next (Condition ifC ifT ifF) = let
(next1, ifC') = addTemporaries1 mods next ifC
(next2, ifT') = addTemporaries1 mods next1 ifT
(next3, ifF') = addTemporaries1 mods next2 ifF
in (next3, init ifC' ++ [Condition (last ifC') (asSequence ifT') (asSequence ifF')])
addTemporaries1 mods next expr = let (e', next', pre) = runRWS (recursiveTransform functionCallTemp expr) () next in
(next', combineExpr pre e')
where
isAssignTo v (Assignment v' _) = v == v'
isAssignTo _ _ = False
findDrop :: [a] -> (a -> Bool) -> Maybe ([a], a)
findDrop [] _ = Nothing
findDrop (x:xs) f
| f x = Just (xs, x)
| otherwise = first (x:) <$> findDrop xs f
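`findDrop` above is a small reusable helper: it locates the first element satisfying a predicate and returns the remaining list together with that element. A standalone copy for experimentation (same definition, plus the `first` import the surrounding module pulls from `Control.Arrow`):

```haskell
import Control.Arrow (first)

-- Find the first element matching the predicate; return the list without
-- it, together with the element itself.
findDrop :: [a] -> (a -> Bool) -> Maybe ([a], a)
findDrop [] _ = Nothing
findDrop (x:xs) f
    | f x = Just (xs, x)
    | otherwise = first (x:) <$> findDrop xs f

main :: IO ()
main = print (findDrop [1, 2, 3, 4 :: Int] even)
-- prints Just ([1,3,4],2)
```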
combineExpr :: [Expression] -> Expression -> [Expression]
combineExpr pre (Lookup _ v) = case findDrop pre (isAssignTo v) of
Just (pre', Assignment _ e') -> combineExpr pre' e'
_ -> error "This is impossible"
combineExpr pre (Assignment v' (Lookup _ vt@(Variable t)))
| T.isPrefixOf "temp$" t = case findDrop pre (isAssignTo vt) of
Just (pre', Assignment _ e) -> pre' ++ [Assignment v' e]
_ -> error "Impossible [combineExpr2]"
combineExpr pre e' = pre ++ [e']
functionCallTemp :: Expression -> RWS () [Expression] Int Expression
functionCallTemp e@(FunctionCall f _ _ _) = do
let t = funcRetType <$> findFunction mods f
if t == Just NGLVoid
then return e
else do
n <- get
let v = Variable (T.pack $ "temp$"++show n)
put (n + 1)
tell [Assignment v e]
return (Lookup t v)
    {- The code below seemed like a good idea, but breaks the early
     - error checking (as it relies on a very simplistic way of
     - "bubbling up" the error checking code):
    functionCallTemp e@BinaryOp{} = do
        n <- get
        let v = Variable (T.pack $ "temp$"++show n)
        put (n + 1)
        tell [Assignment v e]
        return (Lookup Nothing v)
    -}
functionCallTemp e = return e
{-| Calculation of hashes for output method calls
so that the hash depends only on the relevant (influencing the result) part of
the script.
Hashes for variables are stored in a map (as a state). For each expression
(top to bottom) first the block variables are added to the map (if present),
then hashes are calculated and applied (in lookups) recursively.
Each output call receives a new variable __hash storing the hash of its own
input expression (with hashes already applied inside).
-}
addOutputHash :: [(Int, Expression)] -> NGLessIO [(Int, Expression)]
addOutputHash expr_lst = do
nv <- ngleVersion <$> nglEnvironment
modules <- ngleLoadedModules <$> nglEnvironment
let modInfos = map modInfo modules
state0 = M.insert (Variable "ARGV") (T.pack "ARGV") M.empty
versionString = show nv ++ show (sortOn modName modInfos)
return $! evalState (mapM (secondM $ addOutputHash' versionString) expr_lst) state0
where
addOutputHash' :: String -> Expression -> State (M.Map Variable T.Text) Expression
addOutputHash' versionString expr = flip recursiveTransform expr $ \e -> case e of
Assignment v val -> do
h <- hashOf val
modify (M.insert v h)
return e
FunctionCall f@(FuncName fname) oarg kwargs block
| fname `elem` ["collect", "write"] -> do
h <- hashOf oarg
return (FunctionCall f oarg ((Variable "__hash", ConstStr h):kwargs) block)
_ -> return e
where
injectBlockVars :: Maybe Block -> M.Map Variable T.Text -> M.Map Variable T.Text
injectBlockVars Nothing m = m
injectBlockVars (Just (Block v@(Variable n) _)) m = M.insert v n m
hashOf :: Expression -> State (M.Map Variable T.Text) T.Text
hashOf e@(FunctionCall _ _ _ block) = withState (injectBlockVars block) $ hashOf' e
hashOf e = hashOf' e
hashOf' ex = do
expr' <- flip recursiveTransform ex $ \case
Lookup t v@(Variable n) -> do
h <- fromMaybe n <$> gets (M.lookup v)
return $! Lookup t (Variable h)
e -> return e
return . T.pack . MD5.md5s . MD5.Str . (versionString ++) . show $ expr'
-- In ngless 0.0, preprocess() would change its arguments, so that
--
-- preprocess(input) ...
--
-- was equivalent to
--
-- input = preprocess(input) ...
reassignPreprocess :: [(Int, Expression)] -> NGLessIO [(Int, Expression)]
reassignPreprocess sc = do
v <- ngleVersion <$> nglEnvironment
return $! case v of
NGLVersion 0 0 -> map (second reassignPreprocess') sc
_ -> sc
reassignPreprocess' :: Expression -> Expression
reassignPreprocess' e@(FunctionCall (FuncName "preprocess") (Lookup _ v) _ _) = Assignment v e
reassignPreprocess' e = e
-- | addUseNewer
--
-- Implements the following transformation:
--
-- mapped = select(mapped) using |mr|:
--     mr = (...)
--
-- into
--
-- mapped = select(mapped) using |mr|:
--     mr = (..., __version11_or_higher=True)
--
--
-- if the ngless declaration asks for "ngless 1.1" or higher
addUseNewer :: [(Int, Expression)] -> NGLessIO [(Int, Expression)]
addUseNewer exprs = do
v <- ngleVersion <$> nglEnvironment
if v >= NGLVersion 1 1
then do
return exprs
else do
let addUseNewer' e = flip recursiveTransform e $ \case
(MethodCall mname@(MethodName mname') arg0 arg1 kwargs)
| mname' `elem` ["filter", "allbest"] -> do
outputListLno' WarningOutput ["The filter() and allbest() methods have changed behaviour in NGLess 1.1. Now using old behaviour for compatibility, but, if possible, upgrade your version statement. This refers to how a corner case in computing match sizes/identities is handled and will have no practical impacts on almost all datasets."]
return (MethodCall mname arg0 arg1 ((Variable "__version11_or_higher", ConstBool True):kwargs))
e' -> return e'
mapM (secondM addUseNewer') exprs
addCountsCheck :: [(Int, Expression)] -> NGLessIO [(Int, Expression)]
addCountsCheck = return . genericCheckUpfloat countCheck
where
countCheck (lno, FunctionCall (FuncName "count") _ kwargs Nothing) = Just (extractVars kwargs, buildCheck lno kwargs)
countCheck _ = Nothing
buildCheck lno kwargs =
FunctionCall
(FuncName "__check_count")
(BuiltinConstant (Variable "__VOID"))
((Variable "original_lno", ConstInt (toInteger lno)):kwargs)
Nothing
extractVars kwargs = concat (usedVariables . snd <$> kwargs)
| null | https://raw.githubusercontent.com/ngless-toolkit/ngless/ddbc751270a624550d9e82328336a9c9ffdb4382/NGLess/Transform.hs | haskell | Hashing should be based on what the user input (pre-transforms)
whenJust block (checkSimple1 . blockBody)
-> Expression` into the generic `[(Int, Expression)] -> NGLessIO [(Int, Expression)]`
| Add an argument to a function call iff the expression includes that function call
^ function name
^ new argument
^ expression to transform
^ transformed expression
| Checks if a variable is used in any of the given expressions
See 'isVarUsed1'
| Checks if a variable is used in a single 'Expression'
See 'isVarUsed'
If a variable is not used after a call to write(), we can destroy it.
This is implemented by adding the argument __can_move=True to
write() calls
| Variables used in calling the function func
^ function name
expression to analyse
this should never happen
'v' is in a preproces call. Otherwise, it is not guaranteed to be safe
| 'ifLenDiscardSpecial' special cases a common case inside preprocess
blocks, namely:
if len(read) < #:
discard
gets rewritten to
Optimized (LenThresholdDiscard read < #)
| 'addFileChecks' implements the following transformation
variable = <non constant expression>
<code>
write(input, ofile="output/"+variable+".sam")
into
variable = <non constant expression>
__check_ofile("output/"+variable+".sam")
<code>
write(input, ofile="output/"+variable+".sam")
convert to genericCheckUpfloat
This could be combined into a single pass
For script preprocessing, we generally disregard performance, however
returns the variables used and expressions that depend on them
this causes the caller to bailout
| 'addIndexChecks' implements the following transformation
array = <non constant expression>
<code>
array[ix]
into
array = <non constant expression>
<code>
write(input, ofile="output/"+variable+".sam")
Many checks can be generalize so that certain expressions generate a
corresponding __check() function call. For example, bounds checks, transform
print(list[2])
into
print(list[2])
More interesting, these can be "bubbled up" so that __check_index_access
moves up (floats up):
list = [1,2,3]
<code>
print(list[2])
transforms into
list = [1,2,3]
<code>
print(list[2])
'genericCheckUpfloat' generalizes this pattern
this is easier to do on the reversed script
expand sequences
checks above the Condition, so that
print(list[1])
would trigger an error. Now, checks only float up within the block
| Implements addition of temp$nn variables to simplify expressions
This allows the rest of the code to be simpler. Namely, there are no complex expressions.
preprocess(input) ...
was equivalent to
input = preprocess(input) ...
| addUseNewer
Implements the following transformation:
mapped = select(mapped) using |mr|:
mapped = select(mapped) using |mr|:
if the ngless declaration asks for "ngless 1.1" or higher | Copyright 2016 - 2022 NGLess Authors
- License : MIT
- License: MIT
-}
{-# LANGUAGE FlexibleContexts #-}
module Transform
( transform
, pureTransform
, isVarUsed
, isVarUsed1
) where
import qualified Data.Text as T
import Control.Monad.Trans.Cont
import Control.Monad.Except
import Control.Monad.Writer
import Control.Monad.RWS
import Control.Arrow (first, second)
import Control.Monad.Identity (Identity(..), runIdentity)
import Control.Monad.State.Lazy
import Control.Monad.Extra (whenJust)
import Data.Maybe
import qualified Data.Hash.MD5 as MD5
import qualified Data.Map.Strict as M
import Data.List (sortOn)
import Language
import Modules
import Output (outputListLno', OutputType(..))
import NGLess
import Utils.Utils (uniq, secondM)
import NGLess.NGLEnvironment
import BuiltinFunctions (findFunction)
{-| NOTE
 -
 - Before interpretation, scripts are transformed to allow for several
 - optimizations.
 -
 - INITIAL NORMALIZATION
 -
 - As a first step, the script is normalized, introducing temporary variables
 - so that function calls do not contain nested expressions. For example:
 -
 - write(mapstats(samfile('input.sam')), ofile='stats.txt')
 -
 - is re-written to the equivalent of:
 -
 - temp$0 = samfile('input.sam')
 - temp$1 = mapstats(temp$0)
 - write(temp$1, ofile='stats.txt')
 -
 - Note that "temp$xx" are not valid ngless variable names. Thus, these
 - temporary variables can only be introduced internally and will never clash
 - with any user variables. All subsequent transformations can assume that the
 - scripts have been normalized.
 -}
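The normalization described in the NOTE can be prototyped on a toy expression type. The sketch below is illustrative only (it is not the NGLess AST, and unlike the real `addTemporaries` pass it also lifts the outermost call); it uses RWS in the same way, with the counter in the State part and the emitted bindings in the Writer part:

```haskell
import Control.Monad.RWS (RWS, runRWS, get, put, tell)

-- Toy expression type, standing in for the NGLess AST.
data E = Var String | Call String [E] deriving (Show, Eq)

-- Replace every call (innermost first) by a fresh temp$n variable,
-- emitting the corresponding binding.
flatten :: E -> RWS () [(String, E)] Int E
flatten (Call f args) = do
    args' <- mapM flatten args
    n <- get
    put (n + 1)
    let v = "temp$" ++ show n
    tell [(v, Call f args')]
    return (Var v)
flatten e = return e

-- All emitted bindings, in dependency order, plus the final expression.
normalize :: E -> [(String, E)]
normalize e = let (e', _, binds) = runRWS (flatten e) () 0
              in binds ++ [("result", e')]

main :: IO ()
main = mapM_ print (normalize (Call "write" [Call "mapstats" [Call "samfile" [Var "input"]]]))
```

Running it on the NOTE's example shows `samfile` bound to `temp$0`, `mapstats` to `temp$1`, and so on, mirroring the rewrite described above.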
transform :: [Module] -> Script -> NGLessIO Script
transform mods sc = Script (nglHeader sc) <$> applyM transforms (nglBody sc)
where
applyM [] e = return e
applyM (t:ts) e = t e >>= applyM ts
transforms = preTransforms ++ modTransforms ++ builtinTransforms
modTransforms = map modTransform mods
preTransforms =
[ reassignPreprocess
, addTemporaries
, checkSimple
]
builtinTransforms =
[ writeToMove
, qcInPreprocess
, ifLenDiscardSpecial
, substrimReassign
, addFileChecks
, addRSChecks
, addIndexChecks
, addUseNewer
, addCountsCheck
]
{-| The condition is "one single function call per top level expression"
 -}
checkSimple :: [(Int, Expression)] -> NGLessIO [(Int, Expression)]
checkSimple expr = forM_ expr (checkSimple1 . snd) *> return expr
where
checkSimple0 = \case
Condition{} -> throwShouldNotOccur "Non-simple expression (Condition)"
Assignment{} -> throwShouldNotOccur "Non-simple expression (Assignment)"
FunctionCall{} -> throwShouldNotOccur "Non-simple expression (FunctionCall)"
        -- Rewriting for MethodCall is not implemented!
        MethodCall{} -> throwShouldNotOccur "Non-simple expression (MethodCall)"
ListExpression s -> mapM_ checkSimple0 s
UnaryOp _ a -> checkSimple0 a
BinaryOp _ a b -> checkSimple0 a *> checkSimple0 b
IndexExpression e ix -> checkSimple0 e *> checkSimpleIndex ix
Sequence s -> mapM_ checkSimple0 s
Lookup{} -> return ()
ConstStr{} -> return ()
ConstInt{} -> return ()
ConstDouble{} -> return ()
ConstBool{} -> return ()
ConstSymbol{} -> return ()
BuiltinConstant{} -> return ()
Continue -> return ()
Discard -> return ()
Optimized{} -> return ()
checkSimpleIndex (IndexOne e) = checkSimple0 e
checkSimpleIndex (IndexTwo a b) = whenJust a checkSimple0 *> whenJust b checkSimple0
checkSimple1 = \case
Condition ifC ifT ifF ->
checkSimple0 ifC *>
checkSimple1 ifT *>
checkSimple1 ifF
Assignment _ e -> checkSimple1 e
        -- NOT IMPLEMENTED, BUT SHOULD BE!
        -- whenJust block (checkSimple1 . blockBody)
FunctionCall _ e kwargs Nothing ->
checkSimple0 e *>
forM_ kwargs (checkSimple0 . snd)
MethodCall _ e arg kwargs ->
checkSimple0 e *>
whenJust arg checkSimple0 *>
forM_ kwargs (checkSimple0 . snd)
Sequence s -> mapM_ checkSimple1 s
ListExpression s -> mapM_ checkSimple0 s
UnaryOp _ a -> checkSimple0 a
BinaryOp _ a b -> checkSimple0 a *> checkSimple0 b
e -> checkSimple0 e
asSequence :: [Expression] -> Expression
asSequence [e] = e
asSequence es = Sequence es
pureRecursiveTransform :: (Expression -> Expression) -> Expression -> Expression
pureRecursiveTransform f e = runIdentity (recursiveTransform (return . f) e)
-- | A little helper function which lifts a pure transform `Expression
-- -> Expression` into the generic `[(Int, Expression)] -> NGLessIO [(Int, Expression)]`
pureTransform :: (Expression -> Expression) -> [(Int,Expression)] -> NGLessIO [(Int, Expression)]
pureTransform f = return . map (second (pureRecursiveTransform f))
addArgument func newArg expr = case expr of
Assignment v e -> Assignment v (addArgument func newArg e)
FunctionCall fname@(FuncName fname') e args b
| fname' == func ->
FunctionCall fname e (newArg:args) b
_ -> expr
isVarUsed :: Variable -> [(Int, Expression)] -> Bool
isVarUsed v = any (isVarUsed1 v . snd)
isVarUsed1 :: Variable -> Expression -> Bool
isVarUsed1 v expr = evalCont $ callCC $ \exit -> do
recursiveAnalyse (isVarUsed1' exit) expr
return False
where
isVarUsed1' :: (Bool -> Cont Bool ()) -> Expression -> Cont Bool ()
isVarUsed1' exit (Assignment v' _)
| v == v' = exit True
isVarUsed1' exit (Lookup _ v')
| v == v' = exit True
isVarUsed1' _ _ = return ()
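`isVarUsed1` uses the continuation monad's `callCC` for an early exit: the traversal aborts on the first hit instead of scanning the whole expression. The same pattern in isolation, on a toy tree (the names and types here are illustrative, not from the NGLess code):

```haskell
import Control.Monad.Cont (Cont, callCC, runCont)

data Tree = Leaf Int | Node Tree Tree

-- Abort the whole traversal with True as soon as a matching leaf is seen;
-- fall through to False only if the walk completes without a match.
anyLeaf :: (Int -> Bool) -> Tree -> Bool
anyLeaf p tree = flip runCont id $ callCC $ \exit -> do
    let go (Leaf n) | p n       = exit True
                    | otherwise = return ()
        go (Node l r) = go l >> go r
    go tree
    return False

main :: IO ()
main = do
    let t = Node (Leaf 1) (Node (Leaf 2) (Leaf 3))
    print (anyLeaf (== 2) t)  -- True
    print (anyLeaf (> 5) t)   -- False
```

Invoking `exit` discards the rest of the traversal, which is exactly what makes `isVarUsed1` cheap on large scripts.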
writeToMove :: [(Int, Expression)] -> NGLessIO [(Int, Expression)]
writeToMove = return . writeToMove' []
writeToMove' _ [] = []
writeToMove' blocked ((lno,expr):rest) = (lno, addMove toRemove expr):writeToMove' blocked' rest
where
toRemove = filter (`notElem` blocked) unused
unused = filter (not . flip isVarUsed rest) $ functionVars "write" expr
blocked' = blockhere ++ blocked
blockhere = case expr of
Assignment var (FunctionCall (FuncName fname) _ _ _)
| fname `elem` ["fastq", "paired", "samfile"] -> [var]
Assignment var (Lookup _ prev)
| prev `elem` blocked -> [var]
_ -> []
addMove :: [Variable] -> Expression -> Expression
addMove dead = pureRecursiveTransform addMove'
where
addMove' (FunctionCall f@(FuncName "write") e@(Lookup _ v) args b)
| v `elem` dead = FunctionCall f e ((Variable "__can_move", ConstBool True):args) b
addMove' e = e
functionVars :: T.Text       -- ^ function name
             -> Expression   -- ^ expression to analyse
             -> [Variable]
functionVars fname expr = execWriter (recursiveAnalyse fvars expr)
where
fvars :: Expression -> Writer [Variable] ()
fvars (FunctionCall (FuncName fname') (Lookup _ v) _ _)
| fname' == fname = tell [v]
fvars _ = return ()
qcInPreprocess :: [(Int, Expression)] -> NGLessIO [(Int, Expression)]
qcInPreprocess [] = return []
qcInPreprocess ((lno,expr):rest) = case fastQVar expr of
Nothing -> ((lno,expr):) <$> qcInPreprocess rest
Just (fname, v) -> if not $ canQCPreprocessTransform v rest
then ((lno,expr):) <$> qcInPreprocess rest
else do
let expr' = addArgument fname (Variable "__perform_qc", ConstBool False) expr
rest' = rewritePreprocess v rest
outputListLno' TraceOutput ["Transformation for QC triggered for variable ", show v, " on line ", show lno, "."]
((lno, expr'):) <$> qcInPreprocess rest'
rewritePreprocess v ((lno,expr):rest) = case expr of
Assignment t (FunctionCall f@(FuncName "preprocess") e@(Lookup _ v') args b)
| v == v' ->
let expr' = FunctionCall f e ((Variable "__input_qc", ConstBool True):args) b
in (lno,Assignment t expr'):rest
_ -> (lno,expr):rewritePreprocess v rest
fastQVar :: Expression -> Maybe (T.Text, Variable)
fastQVar (Assignment v (FunctionCall (FuncName fname) _ _ _))
| fname `elem` ["fastq", "paired", "load_fastq_directory", "load_mocat_sample"] = Just (fname, v)
fastQVar _ = Nothing
-- The rule is: we can perform the transform if the first usage of the Variable
-- 'v' is in a preprocess call. Otherwise, it is not guaranteed to be safe
canQCPreprocessTransform :: Variable -> [(Int, Expression)] -> Bool
canQCPreprocessTransform _ [] = False
canQCPreprocessTransform v ((_,Assignment _ (FunctionCall (FuncName "preprocess") (Lookup _ v') _ _)):_)
| v' == v = True
canQCPreprocessTransform v ((_, expr):rest)
| isVarUsed1 v expr = False
| otherwise = canQCPreprocessTransform v rest
ifLenDiscardSpecial :: [(Int, Expression)] -> NGLessIO [(Int, Expression)]
ifLenDiscardSpecial = pureTransform $ \case
(Condition (BinaryOp b (UnaryOp UOpLen (Lookup _ v)) (ConstInt thresh))
(Sequence [Discard])
(Sequence []))
| b `elem` [BOpLT, BOpLTE, BOpGT, BOpGTE] -> Optimized (LenThresholdDiscard v b (fromInteger thresh))
e -> e
substrimReassign :: [(Int, Expression)] -> NGLessIO [(Int, Expression)]
substrimReassign = pureTransform $ \case
(Assignment v (FunctionCall (FuncName "substrim") (Lookup _ v') [(Variable "min_quality", ConstInt mq)] Nothing))
| v == v' -> Optimized (SubstrimReassign v (fromInteger mq))
e -> e
addFileChecks :: [(Int,Expression)] -> NGLessIO [(Int, Expression)]
addFileChecks sc = reverse <$> (checkIFiles (reverse sc) >>= checkOFiles)
where
checkIFiles = addFileChecks' "__check_ifile" ArgCheckFileReadable
checkOFiles = addFileChecks' "__check_ofile" ArgCheckFileWritable
addFileChecks' :: T.Text -> ArgCheck -> [(Int,Expression)] -> NGLessIO [(Int, Expression)]
addFileChecks' _ _ [] = return []
addFileChecks' checkFname tag ((lno,e):rest) = do
mods <- ngleLoadedModules <$> nglEnvironment
vars <- runNGLess $ execWriterT (recursiveAnalyse (getFileExpressions mods) e)
rest' <- addFileChecks' checkFname tag (addCheck vars (maybeAddChecks vars rest))
return ((lno,e):rest')
where
addCheck [(_, oexpr)] = ((lno, checkFileExpression oexpr):)
addCheck _ = id
maybeAddChecks :: [(Variable,Expression)] -> [(Int, Expression)] -> [(Int, Expression)]
maybeAddChecks _ [] = []
maybeAddChecks vars@[(v,complete)] ((lno',e'):rest') = case e' of
Assignment v' _
| v' == v -> (lno', checkFileExpression complete) : (lno', e') : rest'
_ -> (lno',e') : maybeAddChecks vars rest'
maybeAddChecks _ rest' = rest'
checkFileExpression complete = FunctionCall
(FuncName checkFname)
complete
[(Variable "original_lno", ConstInt (toInteger lno))]
Nothing
getFileExpressions :: [Module] -> Expression -> (WriterT [(Variable,Expression)] NGLess) ()
getFileExpressions mods (FunctionCall f expr args _) = case findFunction mods f of
Just finfo -> do
when (tag `elem` funcArgChecks finfo) $
extractExpressions (Just expr)
forM_ (funcKwArgs finfo) $ \ainfo ->
when (tag `elem` argChecks ainfo) $
extractExpressions (lookup (Variable $ argName ainfo) args)
Nothing -> throwShouldNotOccur ("Transform.getFileExpressions: Unknown function: '" ++ show f ++ "'. This should have been caught before")
getFileExpressions _ _ = return ()
extractExpressions :: (MonadWriter [(Variable, Expression)] m) => Maybe Expression -> m ()
extractExpressions (Just ofile) = case ofile of
BinaryOp _ re le -> case uniq (validVariables re ++ validVariables le) of
[v] -> tell [(v, ofile)]
_ -> return ()
Lookup _ v -> tell [(v, ofile)]
_ -> return ()
extractExpressions Nothing = return ()
validVariables (Lookup _ v) = [v]
validVariables (BinaryOp _ re le) = validVariables re ++ validVariables le
validVariables (ConstStr _) = []
addRSChecks :: [(Int,Expression)] -> NGLessIO [(Int, Expression)]
addRSChecks = return . genericCheckUpfloat addRSChecks'
addRSChecks' :: (Int, Expression) -> Maybe ([Variable],Expression)
addRSChecks' (lno, e) = case e of
Assignment _ (FunctionCall (FuncName "preprocess") lk@(Lookup _ v) _ _)
-> Just ([v],
FunctionCall (FuncName "__check_readset")
lk
[(Variable "original_lno", ConstInt (toInteger lno))]
Nothing)
_ -> Nothing
-- __check_index_access(array, index1=ix, ...)
addIndexChecks :: [(Int,Expression)] -> NGLessIO [(Int, Expression)]
addIndexChecks = return . genericCheckUpfloat addIndexChecks'
addIndexChecks' :: (Int, Expression) -> Maybe ([Variable],Expression)
addIndexChecks' (lno, e) =
case execWriter (recursiveAnalyse extractIndexOne e) of
[] -> Nothing
vars -> Just (map fst vars, asSequence $ map (uncurry indexCheckExpr) vars)
where
extractIndexOne :: Expression -> Writer [(Variable, Expression)] ()
extractIndexOne (IndexExpression (Lookup _ v) (IndexOne ix1@ConstInt{})) = tell [(v, ix1)]
extractIndexOne _ = return ()
indexCheckExpr :: Variable -> Expression -> Expression
indexCheckExpr arr ix1 = FunctionCall
(FuncName "__check_index_access")
(Lookup Nothing arr)
[(Variable "original_lno", ConstInt (toInteger lno))
,(Variable "index1", ix1)]
Nothing
-- __check_index_access(list, 2)
genericCheckUpfloat :: ((Int, Expression) -> Maybe ([Variable],Expression))
-> [(Int, Expression)]
-> [(Int, Expression)]
genericCheckUpfloat f exprs = reverse $ genericCheckUpfloat' f (reverse exprs)
genericCheckUpfloat' :: ((Int, Expression) -> Maybe ([Variable],Expression))
-> [(Int, Expression)]
-> [(Int, Expression)]
genericCheckUpfloat' _ [] = []
genericCheckUpfloat' f (c@(lno, expr):rest) = case expr of
Sequence es -> genericCheckUpfloat' f (reverse [(lno,e) | e <- es] ++ rest)
    -- Conditions are tricky. At some point, NGLess would erroneously float
    -- checks above the Condition, so that
    --
    -- list = [1]
    --
    -- if len(list) > 1:
    --     print(list[1])
    --
    -- would trigger an error. Now, checks only float up within the block
Condition eC eT eF -> let
eT' = genericCheckUpfloat f [(lno, eT)]
eF' = genericCheckUpfloat f [(lno, eF)]
rest' = case f (lno,eC) of
Nothing -> rest
Just (vars, ne) -> floatDown vars (lno, ne) rest
untag tagged = asSequence (snd <$> tagged)
in
((lno, Condition eC (untag eT') (untag eF')):rest')
_ -> let
rest' = case recursiveCall f c of
Nothing -> rest
Just (vars, ne) -> floatDown vars (lno,ne) rest
in (c:genericCheckUpfloat' f rest')
recursiveCall :: ((Int, Expression) -> Maybe a) -> (Int, Expression) -> Maybe a
recursiveCall f (lno, e) = evalCont $ callCC $ \exit -> do
flip recursiveAnalyse e (\sub -> case f (lno, sub) of
Nothing -> return ()
j -> exit j)
return Nothing
floatDown :: [Variable] -> (Int, Expression) -> [(Int, Expression)] -> [(Int, Expression)]
floatDown _ e [] = [e]
floatDown vars e (c:rest)
| any (`isVarUsed1` snd c) vars = e : c : rest
| otherwise = c : floatDown vars e rest
addTemporaries = addTemporaries' 0
where
addTemporaries' :: Int -> [(Int,Expression)] -> NGLessIO [(Int,Expression)]
addTemporaries' _ [] = return []
addTemporaries' next ((lno,e):rest) = do
mods <- ngleLoadedModules <$> nglEnvironment
let (next', es) = addTemporaries1 mods next e
rest' <- addTemporaries' next' rest
let lno_e' = (lno,) <$> es
return $ lno_e' ++ rest'
addTemporaries1 :: [Module] -> Int -> Expression -> (Int, [Expression])
addTemporaries1 _ next e@(FunctionCall _ _ _ (Just _)) = (next, [e])
addTemporaries1 _ next e@(Assignment _ (FunctionCall _ _ _ (Just _))) = (next, [e])
addTemporaries1 mods next (Condition ifC ifT ifF) = let
(next1, ifC') = addTemporaries1 mods next ifC
(next2, ifT') = addTemporaries1 mods next1 ifT
(next3, ifF') = addTemporaries1 mods next2 ifF
in (next3, init ifC' ++ [Condition (last ifC') (asSequence ifT') (asSequence ifF')])
addTemporaries1 mods next expr = let (e', next', pre) = runRWS (recursiveTransform functionCallTemp expr) () next in
(next', combineExpr pre e')
where
isAssignTo v (Assignment v' _) = v == v'
isAssignTo _ _ = False
findDrop :: [a] -> (a -> Bool) -> Maybe ([a], a)
findDrop [] _ = Nothing
findDrop (x:xs) f
| f x = Just (xs, x)
| otherwise = first (x:) <$> findDrop xs f
combineExpr :: [Expression] -> Expression -> [Expression]
combineExpr pre (Lookup _ v) = case findDrop pre (isAssignTo v) of
Just (pre', Assignment _ e') -> combineExpr pre' e'
_ -> error "This is impossible"
combineExpr pre (Assignment v' (Lookup _ vt@(Variable t)))
| T.isPrefixOf "temp$" t = case findDrop pre (isAssignTo vt) of
Just (pre', Assignment _ e) -> pre' ++ [Assignment v' e]
_ -> error "Impossible [combineExpr2]"
combineExpr pre e' = pre ++ [e']
functionCallTemp :: Expression -> RWS () [Expression] Int Expression
functionCallTemp e@(FunctionCall f _ _ _) = do
let t = funcRetType <$> findFunction mods f
if t == Just NGLVoid
then return e
else do
n <- get
let v = Variable (T.pack $ "temp$"++show n)
put (n + 1)
tell [Assignment v e]
return (Lookup t v)
    {- The code below seemed like a good idea, but breaks the early
     - error checking (as it relies on a very simplistic way of
     - "bubbling up" the error checking code):
    functionCallTemp e@BinaryOp{} = do
        n <- get
        let v = Variable (T.pack $ "temp$"++show n)
        put (n + 1)
        tell [Assignment v e]
        return (Lookup Nothing v)
    -}
functionCallTemp e = return e
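The `functionCallTemp` pass above hoists every non-void function call into a fresh `temp$N` assignment and replaces the call site with a lookup of that temporary. A standalone Python sketch of the same hoisting idea (hypothetical tuple-based AST, not NGLess's actual types):

```python
import itertools

def hoist_calls(expr):
    """Hoist nested calls into temp$N assignments and return
    (pre_statements, rewritten_expression), like functionCallTemp above."""
    pre = []
    counter = itertools.count()

    def walk(e):
        if isinstance(e, tuple) and e[0] == "call":
            args = [walk(a) for a in e[2]]        # rewrite arguments first
            temp = f"temp${next(counter)}"
            pre.append(("assign", temp, ("call", e[1], args)))
            return ("lookup", temp)               # call replaced by a lookup
        return e

    return pre, walk(expr)

pre, top = hoist_calls(("call", "f", [("call", "g", ["x"])]))
# pre == [('assign', 'temp$0', ('call', 'g', ['x'])),
#         ('assign', 'temp$1', ('call', 'f', [('lookup', 'temp$0')]))]
```

As in the Haskell version, the innermost call gets the lowest-numbered temporary, and the top-level expression reduces to a single lookup.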
{-| Calculation of hashes for output method calls
so that the hash depends only on the relevant (influencing the result) part of
the script.

Hashes for variables are stored in a map (as a state). For each expression
(top to bottom) first the block variables are added to the map (if present),
then hashes are calculated and applied (in lookups) recursively.

Each output call receives new variable __hash storing the hash of its own input
expression (with hashes already applied inside).
-}
addOutputHash :: [(Int, Expression)] -> NGLessIO [(Int, Expression)]
addOutputHash expr_lst = do
nv <- ngleVersion <$> nglEnvironment
modules <- ngleLoadedModules <$> nglEnvironment
let modInfos = map modInfo modules
state0 = M.insert (Variable "ARGV") (T.pack "ARGV") M.empty
versionString = show nv ++ show (sortOn modName modInfos)
return $! evalState (mapM (secondM $ addOutputHash' versionString) expr_lst) state0
where
addOutputHash' :: String -> Expression -> State (M.Map Variable T.Text) Expression
addOutputHash' versionString expr = flip recursiveTransform expr $ \e -> case e of
Assignment v val -> do
h <- hashOf val
modify (M.insert v h)
return e
FunctionCall f@(FuncName fname) oarg kwargs block
| fname `elem` ["collect", "write"] -> do
h <- hashOf oarg
return (FunctionCall f oarg ((Variable "__hash", ConstStr h):kwargs) block)
_ -> return e
where
injectBlockVars :: Maybe Block -> M.Map Variable T.Text -> M.Map Variable T.Text
injectBlockVars Nothing m = m
injectBlockVars (Just (Block v@(Variable n) _)) m = M.insert v n m
hashOf :: Expression -> State (M.Map Variable T.Text) T.Text
hashOf e@(FunctionCall _ _ _ block) = withState (injectBlockVars block) $ hashOf' e
hashOf e = hashOf' e
hashOf' ex = do
expr' <- flip recursiveTransform ex $ \case
Lookup t v@(Variable n) -> do
h <- fromMaybe n <$> gets (M.lookup v)
return $! Lookup t (Variable h)
e -> return e
return . T.pack . MD5.md5s . MD5.Str . (versionString ++) . show $ expr'
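The scheme described in the doc comment above — replace each variable occurrence by the hash of its defining expression, then MD5 the normalized expression — can be sketched standalone. A minimal Python illustration (a token-level stand-in for the real AST rewrite; the expression strings are made up):

```python
import hashlib

def expr_hash(expr, env, version="ngless-1.0"):
    """MD5 of the expression after substituting each known variable
    by the hash of the expression that defined it."""
    normalized = " ".join(env.get(tok, tok) for tok in expr.split())
    return hashlib.md5((version + normalized).encode()).hexdigest()

env = {}
env["input"] = expr_hash('fastq "sample.fq"', env)   # input = fastq(...)
h = expr_hash("write input", env)                    # write(input) gets __hash=h
```

Changing anything upstream of the output call (here, the argument of `fastq`) changes the hash, while unrelated parts of the script do not affect it.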
-- In ngless 0.0, preprocess() would change its arguments, so that
-- 'preprocess(input)' behaved like 'input = preprocess(input)';
-- the transform below reintroduces the assignment for such scripts.
reassignPreprocess :: [(Int, Expression)] -> NGLessIO [(Int, Expression)]
reassignPreprocess sc = do
v <- ngleVersion <$> nglEnvironment
return $! case v of
NGLVersion 0 0 -> map (second reassignPreprocess') sc
_ -> sc
reassignPreprocess' :: Expression -> Expression
reassignPreprocess' e@(FunctionCall (FuncName "preprocess") (Lookup _ v) _ _) = Assignment v e
reassignPreprocess' e = e
{- Adds the __version11_or_higher flag to filter()/allbest() calls, rewriting

    mr = (...)

into

    mr = (..., __version11_or_higher=True)
-}
addUseNewer :: [(Int, Expression)] -> NGLessIO [(Int, Expression)]
addUseNewer exprs = do
v <- ngleVersion <$> nglEnvironment
if v >= NGLVersion 1 1
then do
return exprs
else do
let addUseNewer' e = flip recursiveTransform e $ \case
(MethodCall mname@(MethodName mname') arg0 arg1 kwargs)
| mname' `elem` ["filter", "allbest"] -> do
outputListLno' WarningOutput ["The filter() and allbest() methods have changed behaviour in NGLess 1.1. Now using old behaviour for compatibility, but, if possible, upgrade your version statement. This refers to how a corner case in computing match sizes/identities is handled and will have no practical impacts on almost all datasets."]
return (MethodCall mname arg0 arg1 ((Variable "__version11_or_higher", ConstBool True):kwargs))
e' -> return e'
mapM (secondM addUseNewer') exprs
addCountsCheck :: [(Int, Expression)] -> NGLessIO [(Int, Expression)]
addCountsCheck = return . genericCheckUpfloat countCheck
where
countCheck (lno, FunctionCall (FuncName "count") _ kwargs Nothing) = Just (extractVars kwargs, buildCheck lno kwargs)
countCheck _ = Nothing
buildCheck lno kwargs =
FunctionCall
(FuncName "__check_count")
(BuiltinConstant (Variable "__VOID"))
((Variable "original_lno", ConstInt (toInteger lno)):kwargs)
Nothing
extractVars kwargs = concat (usedVariables . snd <$> kwargs)
8adcdf04d758feb091a64ba6b1f636555eba458756887832b431ac5f7b4fec73 | exercism/ocaml | example.ml |
let to_roman n =
assert (n <= 3000);
let build ones halves tens = function
| 0 -> ""
| 1 -> ones
| 2 -> ones ^ ones
| 3 -> ones ^ ones ^ ones
| 4 -> ones ^ halves
| 5 -> halves
| 6 -> halves ^ ones
| 7 -> halves ^ ones ^ ones
| 8 -> halves ^ ones ^ ones ^ ones
| 9 -> ones ^ tens
| _ -> assert false
in
let thousands n = build "M" "" "" (n / 1000 mod 10) in
let hundreds n = build "C" "D" "M" (n / 100 mod 10) in
let tens n = build "X" "L" "C" (n / 10 mod 10) in
let ones n = build "I" "V" "X" (n mod 10) in
thousands n ^ hundreds n ^ tens n ^ ones n | null | https://raw.githubusercontent.com/exercism/ocaml/914e58d48e58a96fe7635fe4b06cc103af4ed683/exercises/practice/roman-numerals/.meta/example.ml | ocaml |
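For comparison, the same per-digit construction in Python (an illustrative port of the OCaml above, not part of the exercism solution):

```python
def to_roman(n):
    """Build each decimal digit from its ones/halves/tens symbols,
    mirroring the OCaml `build` helper."""
    assert 0 <= n <= 3000
    def build(ones, halves, tens, d):
        return ["", ones, ones * 2, ones * 3, ones + halves, halves,
                halves + ones, halves + ones * 2, halves + ones * 3,
                ones + tens][d]
    return (build("M", "", "", n // 1000 % 10)
            + build("C", "D", "M", n // 100 % 10)
            + build("X", "L", "C", n // 10 % 10)
            + build("I", "V", "X", n % 10))

print(to_roman(1990))  # MCMXC
```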
ad47e5fd34b109b0ed5fbd57a7612da76de1d4e2b8d16b2f8212381aec630a9d | spawnfest/eep49ers | snmp_pdus_SUITE.erl | %%
%% %CopyrightBegin%
%%
%% Copyright Ericsson AB 2003-2020. All Rights Reserved.
%%
%% Licensed under the Apache License, Version 2.0 (the "License");
%% you may not use this file except in compliance with the License.
%% You may obtain a copy of the License at
%%
%%     http://www.apache.org/licenses/LICENSE-2.0
%%
%% Unless required by applicable law or agreed to in writing, software
%% distributed under the License is distributed on an "AS IS" BASIS,
%% WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
%% See the License for the specific language governing permissions and
%% limitations under the License.
%%
%% %CopyrightEnd%
%%
%%----------------------------------------------------------------------
%% Purpose:
%%----------------------------------------------------------------------
-module(snmp_pdus_SUITE).
%%----------------------------------------------------------------------
%% Include files
%%----------------------------------------------------------------------
-include_lib("common_test/include/ct.hrl").
-include("snmp_test_lib.hrl").
-include_lib("snmp/include/snmp_types.hrl").
%%----------------------------------------------------------------------
%% External exports
%%----------------------------------------------------------------------
-export([
suite/0, all/0, groups/0,
init_per_suite/1, end_per_suite/1,
init_per_group/2, end_per_group/2,
init_per_testcase/2, end_per_testcase/2,
otp7575/1,
otp8563/1,
otp9022/1,
otp10132/1
]).
%%======================================================================
%% Common Test interface functions
%%======================================================================
suite() ->
[{ct_hooks, [ts_install_cth]}].
all() ->
[{group, tickets}].
groups() ->
[{tickets, [], tickets_cases()}].
tickets_cases() ->
[
otp7575,
otp8563,
otp9022,
otp10132
].
%%
%% -----
%%
init_per_suite(Config0) when is_list(Config0) ->
?IPRINT("init_per_suite -> entry with"
"~n Config: ~p", [Config0]),
case ?LIB:init_per_suite(Config0) of
{skip, _} = SKIP ->
SKIP;
Config1 when is_list(Config1) ->
%% We need a monitor on this node also
snmp_test_sys_monitor:start(),
?IPRINT("init_per_suite -> end when"
"~n Config1: ~p", [Config1]),
Config1
end.
end_per_suite(Config0) when is_list(Config0) ->
?IPRINT("end_per_suite -> entry with"
"~n Config: ~p", [Config0]),
snmp_test_sys_monitor:stop(),
Config1 = ?LIB:end_per_suite(Config0),
?IPRINT("end_per_suite -> end"),
Config1.
%%
%% -----
%%
init_per_group(_GroupName, Config) ->
Config.
end_per_group(_GroupName, Config) ->
Config.
%%
%% -----
%%
init_per_testcase(_Case, Config) when is_list(Config) ->
?IPRINT("init_per_testcase -> entry with"
"~n Config: ~p", [Config]),
snmp_test_global_sys_monitor:reset_events(),
?IPRINT("init_per_testcase -> end"),
Config.
end_per_testcase(_Case, Config) when is_list(Config) ->
?IPRINT("end_per_testcase -> entry with"
"~n Config: ~p",
[Config]),
?IPRINT("system events during test: ~p",
[snmp_test_global_sys_monitor:events()]),
Config.
%%======================================================================
%% Test functions
%%======================================================================
otp7575(suite) -> [];
otp7575(doc) -> ["OTP-7575 - Message version"];
otp7575(Config) when is_list(Config) ->
?IPRINT("attempt to decode message with valid version"),
MsgWithOkVersion = <<48,39,2,1,0,4,6,112,117,98,108,105,99,160,26,2,2,1,49,2,1,0,2,1,0,48,14,48,12,6,8,43,6,1,2,1,1,5,0,5,0>>,
case (catch dec_message(MsgWithOkVersion)) of
Msg when is_record(Msg, message) ->
ok;
Unexpected1 ->
exit({unexpected_decode_result, 1, Unexpected1})
end,
?IPRINT("attempt to decode message with bad version"),
MsgWithBadVersion = <<48,48,2,10,1,1,1,1,1,1,1,1,1,1,4,6,112,117,98,108,105,99,160,26,2,2,1,49,2,1,0,2,1,0,48,14,48,12,6,8,43,6,1,2,1,1,5,0,5,0>>,
case (catch dec_message(MsgWithBadVersion)) of
{'EXIT', {bad_version, BadVersion}} when is_integer(BadVersion) ->
ok;
Unexpected2 ->
exit({unexpected_decode_result, 2, Unexpected2})
end,
?IPRINT("attempt to decode message with very bad version"),
MsgWithVeryBadVersion = <<48,49,2,11,1,1,1,1,1,1,1,1,1,1,1,4,6,112,117,98,108,105,99,160,26,2,2,1,49,2,1,0,2,1,0,48,14,48,12,6,8,43,6,1,2,1,1,5,0,5,0>>,
case (catch dec_message(MsgWithVeryBadVersion)) of
{'EXIT', {bad_version, {VersionSize, MaxVersionSize}}} when (VersionSize > MaxVersionSize) ->
ok;
Unexpected3 ->
exit({unexpected_decode_result, 3, Unexpected3})
end,
?IPRINT("done"),
ok.
otp8563(suite) -> [];
otp8563(doc) -> ["OTP-8563 - Counter64"];
otp8563(Config) when is_list(Config) ->
Val1 = 16#7fffffffffffffff,
?IPRINT("try encode and decode value 1: ~w (0x~.16b)", [Val1, Val1]),
Enc1 = snmp_pdus:enc_value('Counter64', Val1),
?IPRINT(" => ~w", [Enc1]),
{{'Counter64', Val1}, []} = snmp_pdus:dec_value(Enc1),
Val2 = Val1 + 1,
?IPRINT("try encode and decode value 2: ~w (0x~.16b)", [Val2, Val2]),
Enc2 = snmp_pdus:enc_value('Counter64', Val2),
?IPRINT(" => ~w", [Enc2]),
{{'Counter64', Val2}, []} = snmp_pdus:dec_value(Enc2),
Val3 = Val2 + 1,
?IPRINT("try encode and decode value 3: ~w (0x~.16b)", [Val3, Val3]),
Enc3 = snmp_pdus:enc_value('Counter64', Val3),
?IPRINT(" => ~w", [Enc3]),
{{'Counter64', Val3}, []} = snmp_pdus:dec_value(Enc3),
Val4 = 16#fffffffffffffffe,
?IPRINT("try encode and decode value 4: ~w (0x~.16b)", [Val4, Val4]),
Enc4 = snmp_pdus:enc_value('Counter64', Val4),
?IPRINT(" => ~w", [Enc4]),
{{'Counter64', Val4}, []} = snmp_pdus:dec_value(Enc4),
Val5 = Val4 + 1,
?IPRINT("try encode and decode value 5: ~w (0x~.16b)", [Val5, Val5]),
Enc5 = snmp_pdus:enc_value('Counter64', Val5),
?IPRINT(" => ~w", [Enc5]),
{{'Counter64', Val5}, []} = snmp_pdus:dec_value(Enc5),
Val6 = 16#ffffffffffffffff + 1,
?IPRINT("try and fail to encode value 6: ~w (0x~.16b)", [Val6, Val6]),
case (catch snmp_pdus:enc_value('Counter64', Val6)) of
{'EXIT', {error, {bad_counter64, Val6}}} ->
ok;
Unexpected6 ->
?IPRINT(" => ~w", [Unexpected6]),
exit({unexpected_encode_result, Unexpected6, Val6})
end,
Val7 = -1,
?IPRINT("try and fail to encode value 7: ~w", [Val7]),
case (catch snmp_pdus:enc_value('Counter64', Val7)) of
{'EXIT', {error, {bad_counter64, Val7}}} ->
ok;
Unexpected7 ->
?IPRINT(" => ~w", [Unexpected7]),
exit({unexpected_encode_result, Unexpected7, Val7})
end,
ok.
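The otp8563 case above pins down the Counter64 range: every value in 0..2^64-1 must round-trip through encode/decode, while 2^64 and -1 must be rejected as `bad_counter64`. A small Python sketch of the same boundary behaviour (plain unsigned-integer octets for illustration, not the full BER framing that snmp_pdus emits):

```python
def enc_counter64(v):
    """Accept exactly the range the test suite checks: 0 .. 2**64 - 1."""
    if not 0 <= v <= 2**64 - 1:
        raise ValueError("bad_counter64")
    # Big-endian content octets; a leading 0x00 keeps the value non-negative
    # when the top bit is set, as unsigned ASN.1 INTEGERs are encoded.
    n = max(1, (v.bit_length() + 8) // 8)
    return v.to_bytes(n, "big")

def dec_counter64(b):
    return int.from_bytes(b, "big")

for v in (0, 1, 2**63 - 1, 2**63, 2**64 - 2, 2**64 - 1):
    assert dec_counter64(enc_counter64(v)) == v
```

The otp9022 (Counter32) and otp10132 (TimeTicks) cases exercise the same pattern with a 32-bit upper bound.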
otp9022(suite) -> [];
otp9022(doc) -> ["OTP-9022 - Counter32"];
otp9022(Config) when is_list(Config) ->
Val0 = 2908389204,
?IPRINT("try encode and decode value 0: ~w (0x~.16b)", [Val0, Val0]),
Enc0 = snmp_pdus:enc_value('Counter32', Val0),
?IPRINT(" => ~w", [Enc0]),
{{'Counter32', Val0}, []} = snmp_pdus:dec_value(Enc0),
Val1 = 0,
?IPRINT("try encode and decode value 1: ~w (0x~.16b)", [Val1, Val1]),
Enc1 = snmp_pdus:enc_value('Counter32', Val1),
?IPRINT(" => ~w", [Enc1]),
{{'Counter32', Val1}, []} = snmp_pdus:dec_value(Enc1),
Val2 = Val1 + 1,
?IPRINT("try encode and decode value 2: ~w (0x~.16b)", [Val2, Val2]),
Enc2 = snmp_pdus:enc_value('Counter32', Val2),
?IPRINT(" => ~w", [Enc2]),
{{'Counter32', Val2}, []} = snmp_pdus:dec_value(Enc2),
Val3 = 16#7ffffffe,
?IPRINT("try encode and decode value 3: ~w (0x~.16b)", [Val3, Val3]),
Enc3 = snmp_pdus:enc_value('Counter32', Val3),
?IPRINT(" => ~w", [Enc3]),
{{'Counter32', Val3}, []} = snmp_pdus:dec_value(Enc3),
Val4 = Val3 + 1,
?IPRINT("try encode and decode value 4: ~w (0x~.16b)", [Val4, Val4]),
Enc4 = snmp_pdus:enc_value('Counter32', Val4),
?IPRINT(" => ~w", [Enc4]),
{{'Counter32', Val4}, []} = snmp_pdus:dec_value(Enc4),
Val5 = Val4 + 1,
?IPRINT("try encode and decode value 5: ~w (0x~.16b)", [Val5, Val5]),
Enc5 = snmp_pdus:enc_value('Counter32', Val5),
?IPRINT(" => ~w", [Enc5]),
{{'Counter32', Val5}, []} = snmp_pdus:dec_value(Enc5),
Val6 = 16#fffffffe,
?IPRINT("try encode and decode value 6: ~w (0x~.16b)", [Val6, Val6]),
Enc6 = snmp_pdus:enc_value('Counter32', Val6),
?IPRINT(" => ~w", [Enc6]),
{{'Counter32', Val6}, []} = snmp_pdus:dec_value(Enc6),
Val7 = Val6 + 1,
?IPRINT("try encode and decode value 7: ~w (0x~.16b)", [Val7, Val7]),
Enc7 = snmp_pdus:enc_value('Counter32', Val7),
?IPRINT(" => ~w", [Enc7]),
{{'Counter32', Val7}, []} = snmp_pdus:dec_value(Enc7),
Val8 = 16#ffffffff + 1,
?IPRINT("try and fail to encode value 8: ~w (0x~.16b)", [Val8, Val8]),
case (catch snmp_pdus:enc_value('Counter32', Val8)) of
{'EXIT', {error, {bad_counter32, Val8}}} ->
ok;
Unexpected8 ->
?IPRINT(" => ~w", [Unexpected8]),
exit({unexpected_encode_result, Unexpected8, Val8})
end,
Val9 = -1,
?IPRINT("try and fail to encode value 9: ~w", [Val9]),
case (catch snmp_pdus:enc_value('Counter32', Val9)) of
{'EXIT', {error, {bad_counter32, Val9}}} ->
ok;
Unexpected9 ->
?IPRINT(" => ~w", [Unexpected9]),
exit({unexpected_encode_result, Unexpected9, Val9})
end,
ok.
otp10132(suite) -> [];
otp10132(doc) -> ["OTP-10132 - TimeTicks"];
otp10132(Config) when is_list(Config) ->
Val0 = 2159001034,
?IPRINT("try encode and decode value 0: ~w (0x~.16b)", [Val0, Val0]),
Enc0 = snmp_pdus:enc_value('TimeTicks', Val0),
?IPRINT(" => ~w", [Enc0]),
{{'TimeTicks', Val0}, []} = snmp_pdus:dec_value(Enc0),
Val1 = 0,
?IPRINT("try encode and decode value 1: ~w (0x~.16b)", [Val1, Val1]),
Enc1 = snmp_pdus:enc_value('TimeTicks', Val1),
?IPRINT(" => ~w", [Enc1]),
{{'TimeTicks', Val1}, []} = snmp_pdus:dec_value(Enc1),
Val2 = Val1 + 1,
?IPRINT("try encode and decode value 2: ~w (0x~.16b)", [Val2, Val2]),
Enc2 = snmp_pdus:enc_value('TimeTicks', Val2),
?IPRINT(" => ~w", [Enc2]),
{{'TimeTicks', Val2}, []} = snmp_pdus:dec_value(Enc2),
Val3 = 16#7ffffffe,
?IPRINT("try encode and decode value 3: ~w (0x~.16b)", [Val3, Val3]),
Enc3 = snmp_pdus:enc_value('TimeTicks', Val3),
?IPRINT(" => ~w", [Enc3]),
{{'TimeTicks', Val3}, []} = snmp_pdus:dec_value(Enc3),
Val4 = Val3 + 1,
?IPRINT("try encode and decode value 4: ~w (0x~.16b)", [Val4, Val4]),
Enc4 = snmp_pdus:enc_value('TimeTicks', Val4),
?IPRINT(" => ~w", [Enc4]),
{{'TimeTicks', Val4}, []} = snmp_pdus:dec_value(Enc4),
Val5 = Val4 + 1,
?IPRINT("try encode and decode value 5: ~w (0x~.16b)", [Val5, Val5]),
Enc5 = snmp_pdus:enc_value('TimeTicks', Val5),
?IPRINT(" => ~w", [Enc5]),
{{'TimeTicks', Val5}, []} = snmp_pdus:dec_value(Enc5),
Val6 = 16#fffffffe,
?IPRINT("try encode and decode value 6: ~w (0x~.16b)", [Val6, Val6]),
Enc6 = snmp_pdus:enc_value('TimeTicks', Val6),
?IPRINT(" => ~w", [Enc6]),
{{'TimeTicks', Val6}, []} = snmp_pdus:dec_value(Enc6),
Val7 = Val6 + 1,
?IPRINT("try encode and decode value 7: ~w (0x~.16b)", [Val7, Val7]),
Enc7 = snmp_pdus:enc_value('TimeTicks', Val7),
?IPRINT(" => ~w", [Enc7]),
{{'TimeTicks', Val7}, []} = snmp_pdus:dec_value(Enc7),
Val8 = Val7 + 1,
?IPRINT("try and fail to encode value 8: ~w (0x~.16b)", [Val8, Val8]),
case (catch snmp_pdus:enc_value('TimeTicks', Val8)) of
{'EXIT', {error, {bad_timeticks, Val8}}} ->
ok;
Unexpected8 ->
?IPRINT(" => ~w", [Unexpected8]),
exit({unexpected_encode_result, Unexpected8, Val8})
end,
Val9 = -1,
?IPRINT("try and fail to encode value 9: ~w", [Val9]),
case (catch snmp_pdus:enc_value('TimeTicks', Val9)) of
{'EXIT', {error, {bad_timeticks, Val9}}} ->
ok;
Unexpected9 ->
?IPRINT(" => ~w", [Unexpected9]),
exit({unexpected_encode_result, Unexpected9, Val9})
end,
?IPRINT("done"),
ok.
%%======================================================================
%% Internal functions
%%======================================================================
dec_message(B) when is_binary(B) ->
L = binary_to_list(B),
snmp_pdus:dec_message(L).
| null | https://raw.githubusercontent.com/spawnfest/eep49ers/d1020fd625a0bbda8ab01caf0e1738eb1cf74886/lib/snmp/test/snmp_pdus_SUITE.erl | erlang |
c4abca788e5c4ad3939f12a60abfd8c0fb302aeb8bbaa829dc1d2fbedae8e546 | raviksharma/bartosz-basics-of-haskell | mymap.hs | import Data.Char -- for the example
import Prelude hiding (map)
-- show
map :: (a -> b) -> [a] -> [b]
map _ [] = []
map f (a : as) = f a : map f as
main = print $ map toUpper "hello world!"
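The two-clause structural recursion in `map` above translates directly; here is a hypothetical Python rendering of the same definition:

```python
def my_map(f, xs):
    """Two clauses, as in the Haskell version: empty list, then cons."""
    if not xs:
        return []
    head, *tail = xs          # pattern match (a : as)
    return [f(head)] + my_map(f, tail)

print("".join(my_map(str.upper, "hello world!")))  # HELLO WORLD!
```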
| null | https://raw.githubusercontent.com/raviksharma/bartosz-basics-of-haskell/86d40d831f61415ef0022bff7fe7060ae6a23701/07-tokenizer-higher-order-functions/mymap.hs | haskell |
9346072b9ea3a32837749ce66a56ed3b1dee58b1b15c5e4b0d619455de4307c2 | sadiqj/ocaml-esp32 | testerror.ml | (** Test that the right message errors are emitted by Arg *)
let usage= "Arg module testing"
let test total i (spec,anon,argv) =
let argv = Array.of_list ("testerror" :: argv) in
try Arg.parse_argv ~current:(ref 0) argv spec anon usage with
| Arg.Bad s-> Printf.printf "(%d/%d) Bad:\n%s\n" (i+1) total s
| Arg.Help s -> Printf.printf "(%d/%d) Help:\n%s\n" (i+1) total s
let tests = [
(** missing argument error *)
["-s", Arg.String ignore, "missing arg"], ignore, ["-s"]
(** No argument expected *)
; ["-set", Arg.Set (ref false), "no argument expected"], ignore, ["-set=true"]
(** help message *)
; [], ignore, ["-help" ]
(** wrong argument type *)
; ["-int", Arg.Int ignore, "wrong argument type" ], ignore, ["-int"; "not_an_int" ]
(** unknown option *)
; [], ignore, [ "-an-unknown-option" ]
(** user-error in anon fun *)
; [], (fun _ -> raise @@ Arg.Bad("User-raised error")), [ "argument" ]
(** user-error in anon fun *)
; ["-error",
Arg.Unit (fun () -> raise @@ Arg.Bad("User-raised error bis")),
"user raised error"]
, ignore, [ "-error" ]
]
let () =
let n = List.length tests in
List.iteri (test n) tests
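The same pattern — drive a tiny CLI spec and capture the library's error text instead of letting the program exit — looks like this with Python's argparse (an illustrative analogue of catching `Arg.Bad`, not a translation of the Arg API; requires Python ≥ 3.9 for `exit_on_error`):

```python
import argparse

def run(argv):
    """Parse argv with a small spec, returning the error message on failure."""
    p = argparse.ArgumentParser(prog="testerror", exit_on_error=False)
    p.add_argument("-i", type=int)
    try:
        p.parse_args(argv)
        return "ok"
    except argparse.ArgumentError as e:
        return f"Bad: {e}"

print(run(["-i", "3"]))           # ok
print(run(["-i", "not_an_int"]))  # Bad: ...
```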
| null | https://raw.githubusercontent.com/sadiqj/ocaml-esp32/33aad4ca2becb9701eb90d779c1b1183aefeb578/testsuite/tests/lib-arg/testerror.ml | ocaml | * Test that the right message errors are emitted by Arg
* missing argument error
* No argument expected
* help message
* wrong argument type
* unknown option
* user-error in anon fun
* user-error in anon fun |
let usage= "Arg module testing"
let test total i (spec,anon,argv) =
let argv = Array.of_list ("testerror" :: argv) in
try Arg.parse_argv ~current:(ref 0) argv spec anon usage with
| Arg.Bad s-> Printf.printf "(%d/%d) Bad:\n%s\n" (i+1) total s
| Arg.Help s -> Printf.printf "(%d/%d) Help:\n%s\n" (i+1) total s
let tests = [
["-s", Arg.String ignore, "missing arg"], ignore, ["-s"]
; ["-set", Arg.Set (ref false), "no argument expected"], ignore, ["-set=true"]
; [], ignore, ["-help" ]
; ["-int", Arg.Int ignore, "wrong argument type" ], ignore, ["-int"; "not_an_int" ]
; [], ignore, [ "-an-unknown-option" ]
; [], (fun _ -> raise @@ Arg.Bad("User-raised error")), [ "argument" ]
; ["-error",
Arg.Unit (fun () -> raise @@ Arg.Bad("User-raised error bis")),
"user raised error"]
, ignore, [ "-error" ]
]
let () =
let n = List.length tests in
List.iteri (test n) tests
|
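The testerror.ml record above drives Arg's error reporting through a table of cases; below is a minimal standalone sketch of the same pattern (the option name, usage string, and argv values are invented for illustration, not taken from the record):

```ocaml
(* Sketch: provoke a parse error and catch the exception Arg raises,
   mirroring the handler structure of the test above. *)
let () =
  let spec = ["-n", Arg.Int ignore, " expects an int"] in
  let argv = [| "prog"; "-n"; "not_an_int" |] in
  try Arg.parse_argv ~current:(ref 0) argv spec ignore "usage: prog [-n INT]"
  with
  | Arg.Bad msg  -> print_string ("Bad:\n" ^ msg)
  | Arg.Help msg -> print_string ("Help:\n" ^ msg)
```

`Arg.parse_argv` raises `Arg.Bad` here because `not_an_int` cannot be parsed as an integer; a `-help` argument would raise `Arg.Help` instead.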
66417ee0793e40cda12ee75d31f2c14f35a205493fa787df7d31d9ba1bac6317 | dradtke/Lisp-Text-Editor | button.lisp | (in-package :gtk-cffi)
(defclass button (bin)
())
(defcfun "gtk_button_new" :pointer)
(defcfun "gtk_button_new_with_label" :pointer (label gtk-string))
(defcfun "gtk_button_new_with_mnemonic" :pointer (label gtk-string))
(defcfun "gtk_button_new_from_stock" :pointer (label gtk-string))
(defmethod gconstructor ((button button)
&key label type &allow-other-keys)
"type can be :stock or :mnemonic, any other means button with label"
(if label
(let ((creator
(case type
(:stock #'gtk-button-new-from-stock)
(:mnemonic #'gtk-button-new-with-mnemonic)
(otherwise #'gtk-button-new-with-label))))
(funcall creator label))
(gtk-button-new)))
| null | https://raw.githubusercontent.com/dradtke/Lisp-Text-Editor/b0947828eda82d7edd0df8ec2595e7491a633580/quicklisp/dists/quicklisp/software/gtk-cffi-20120208-cvs/gtk/button.lisp | lisp | (in-package :gtk-cffi)
(defclass button (bin)
())
(defcfun "gtk_button_new" :pointer)
(defcfun "gtk_button_new_with_label" :pointer (label gtk-string))
(defcfun "gtk_button_new_with_mnemonic" :pointer (label gtk-string))
(defcfun "gtk_button_new_from_stock" :pointer (label gtk-string))
(defmethod gconstructor ((button button)
&key label type &allow-other-keys)
"type can be :stock or :mnemonic, any other means button with label"
(if label
(let ((creator
(case type
(:stock #'gtk-button-new-from-stock)
(:mnemonic #'gtk-button-new-with-mnemonic)
(otherwise #'gtk-button-new-with-label))))
(funcall creator label))
(gtk-button-new)))
| |
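The keyword protocol of `gconstructor` above can be exercised as follows. This is a hypothetical sketch: it assumes a working gtk-cffi environment, and "gtk-ok" is an illustrative stock id.

```lisp
;; Each :type value selects a different GTK constructor, per gconstructor above.
(make-instance 'button :label "Click me")              ; gtk_button_new_with_label
(make-instance 'button :label "_Save" :type :mnemonic) ; gtk_button_new_with_mnemonic
(make-instance 'button :label "gtk-ok" :type :stock)   ; gtk_button_new_from_stock
(make-instance 'button)                                ; plain gtk_button_new
```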
833959d3978876bf08ad0cfb3de0c5256a482d3e29d73af0fbe789665a4c28a4 | haskell-repa/repa | Source.hs |
module Data.Array.Repa.Flow.Seq.Source
( module Data.Array.Repa.Flow.Base
, Source (..)
, Step1 (..)
, Step8 (..)
, SourceState (..)
, startSource
, joinSourceStates
, getSourceState
, flow
, flowGuts)
where
import Data.Array.Repa.Bulk.Elt
import Data.Array.Repa.Flow.Base
import Data.Array.Repa.Flow.Seq.Base
import qualified Data.Array.Repa.Flow.Seq.Report as R
import qualified Data.Vector.Unboxed.Mutable as UM
import Prelude hiding (take)
import GHC.Exts
-- | A `Source` is an incremental element producer.
-- We can pull elements from a source without knowing where they come from.
--
-- Elements can be produced one at a time, or eight at a time as an
-- optional optimisation.
--
data Source mode a
= forall state. Source
{ -- | Representation of the source state depends on whether the
-- source has already been started.
sourceState :: SourceState mode state
-- | How many elements are still available.
, sourceSize :: state -> IO Size
-- | Report the current state of this flow.
, sourceReport :: state -> IO R.Report
-- | Takes a continuation and calls it with
        --   a `Step1` containing some data.
, sourceGet1 :: state -> (Step1 a -> IO ()) -> IO ()
-- | Takes a continuation and calls it with
-- a `Step8` containing some data.
, sourceGet8 :: state -> (Step8 a -> IO ()) -> IO ()
}
data Step1 a
        -- | An element and a flag saying whether a full 8 elements are
-- likely to be available next pull.
--
        --   We don't want to *force* the consumer to pull the full 8
-- if it doesn't want to, otherwise functions like folds would
-- become too complicated.
= Yield1 a Bool
-- | The source is finished, no more elements will ever be available.
| Done
-- | Provide eight elements in one go, or say to try to pull the full eight
--   later. The two cases are split like this to force loop unrolling in
-- the intermediate code.
data Step8 a
-- | Eight successive elements of the flow.
= Yield8 a a a a a a a a
        -- | The source cannot yield a full 8 elements right now.
-- You should use `sourceGet1` to get the next element and try
-- `sourceGet8` again later.
| Pull1
-------------------------------------------------------------------------------
-- | Holds an action to start the source,
-- or the current state if it has already been started.
data SourceState mode state where
SourceStateDelayed
:: IO state
-> SourceState FD state
SourceStateActive
:: state
-> SourceState FS state
-- | Start a source, making it active.
startSource :: Source FD a -> IO (Source FS a)
startSource (Source istate size report get1 get8)
= do state <- getSourceState istate
return $ Source (SourceStateActive state) size report get1 get8
-- | Join two source states of the same mode.
--
-- * If both states are delayed it the resulting action starts both.
--
-- * If both states are already active the result returns both.
--
joinSourceStates
:: SourceState mode stateA
-> SourceState mode stateB
-> SourceState mode (stateA, stateB)
joinSourceStates
(SourceStateDelayed startA)
(SourceStateDelayed startB)
= SourceStateDelayed
$ do stateA <- startA
stateB <- startB
return $ (stateA, stateB)
joinSourceStates
(SourceStateActive stateA)
(SourceStateActive stateB)
= SourceStateActive (stateA, stateB)
joinSourceStates _ _
= error "joinSourceStates: bogus warning suppression"
{-# INLINE joinSourceStates #-}
-- | Start a source state,
-- or return the existing state if it has already been started.
getSourceState :: SourceState mode state -> IO state
getSourceState fstate
= case fstate of
SourceStateDelayed mkState -> mkState
SourceStateActive state -> return state
{-# INLINE getSourceState #-}
-------------------------------------------------------------------------------
-- | Create a delayed source based on the element index of the flow.
--
-- This is typically used to read elements from some randomly accessible vector.
--
flow :: Elt a
=> Int# -- ^ Total number of elements.
-> (Int# -> a) -- ^ Function to get the element at the given index.
-> Source FD a
flow !len !load
| (istate, size, report, get1, get8) <- flowGuts len load
= Source istate size report get1 get8
{-# INLINE [1] flow #-}
flowGuts
:: Elt a
=> Int#
-> (Int# -> a)
-> ( SourceState FD (UM.IOVector Int)
, UM.IOVector Int -> IO Size
, UM.IOVector Int -> IO R.Report
, UM.IOVector Int -> (Step1 a -> IO ()) -> IO ()
, UM.IOVector Int -> (Step8 a -> IO ()) -> IO ())
flowGuts !len load
= (istate, size, report, get1, get8)
where
here = "seq.flow"
istate = SourceStateDelayed
$ do refIx <- inew 1
iwrite here refIx 0# 0#
return refIx
        {-# INLINE istate #-}
size refIx
= do !(I# ix) <- iread here refIx 0#
return $ Exact (len -# ix)
        {-# INLINE size #-}
report refIx
= do !ix <- iread here refIx 0#
return $ R.Flow (I# len) ix
        {-# NOINLINE report #-}
get1 refIx push1
= do !(I# ix) <- iread here refIx 0#
let !remain = len -# ix
if remain ># 0#
then do
iwrite here refIx 0# (ix +# 1#)
let !x = load ix
            -- Touch because we want to be sure it's unboxed as
            -- soon as we read it. If we don't touch it, and
-- the continuation uses the value in multiple
-- case branches then it can be reboxed and then
-- unboxed again multiple times.
touch x
push1 $ Yield1 x (remain >=# 9#)
else push1 Done
        {-# INLINE get1 #-}
get8 refIx push8
= do !(I# ix) <- iread here refIx 0#
let !remain = len -# ix
if remain >=# 8#
then do
iwrite here refIx 0# (ix +# 8#)
-- TODO: not sure whether we should force these here
let here' = return
!x0 <- here' $ load (ix +# 0#)
!x1 <- here' $ load (ix +# 1#)
!x2 <- here' $ load (ix +# 2#)
!x3 <- here' $ load (ix +# 3#)
!x4 <- here' $ load (ix +# 4#)
!x5 <- here' $ load (ix +# 5#)
!x6 <- here' $ load (ix +# 6#)
!x7 <- here' $ load (ix +# 7#)
push8 $ Yield8 x0 x1 x2 x3 x4 x5 x6 x7
else do
push8 Pull1
        {-# INLINE get8 #-}
{-# INLINE [1] flowGuts #-}
| null | https://raw.githubusercontent.com/haskell-repa/repa/c867025e99fd008f094a5b18ce4dabd29bed00ba/icebox/abandoned/repa-flow/Data/Array/Repa/Flow/Seq/Source.hs | haskell | | A `Source` is an incremental element producer.
We can pull elements from a source without knowing where they come from.
optional optimisation.
| Representation of the source state depends on whether the
source has already been started.
| How many elements are still available.
| Report the current state of this flow.
| Takes a continuation and calls it with
| Takes a continuation and calls it with
a `Step8` containing some data.
likely to be available next pull.
if it doesn't want to, otherwise functions like folds would
become too complicated.
| The source is finished, no more elements will ever be available.
the intermediate code.
| Eight successive elements of the flow.
You should use `sourceGet1` to get the next element and try
`sourceGet8` again later.
-----------------------------------------------------------------------------
| Holds an action to start the source,
or the current state if it has already been started.
| Start a source, making it active.
* If both states are delayed it the resulting action starts both.
* If both states are already active the result returns both.
# INLINE joinSourceStates #
| Start a source state,
or return the existing state if it has already been started.
-----------------------------------------------------------------------------
| Create a delayed source based on the element index of the flow.
This is typically used to read elements from some randomly accessible vector.
^ Total number of elements.
^ Function to get the element at the given index.
Touch because we want to be sure it's unboxed as
soon as we read it. If we don't touch it, and
the continuation uses the value in multiple
case branches then it can be reboxed and then
unboxed again multiple times.
TODO: not sure whether we should force these here |
module Data.Array.Repa.Flow.Seq.Source
( module Data.Array.Repa.Flow.Base
, Source (..)
, Step1 (..)
, Step8 (..)
, SourceState (..)
, startSource
, joinSourceStates
, getSourceState
, flow
, flowGuts)
where
import Data.Array.Repa.Bulk.Elt
import Data.Array.Repa.Flow.Base
import Data.Array.Repa.Flow.Seq.Base
import qualified Data.Array.Repa.Flow.Seq.Report as R
import qualified Data.Vector.Unboxed.Mutable as UM
import Prelude hiding (take)
import GHC.Exts
-- Elements can be produced one at a time, or eight at a time as an
data Source mode a
= forall state. Source
        { sourceState :: SourceState mode state
, sourceSize :: state -> IO Size
, sourceReport :: state -> IO R.Report
        --   a `Step1` containing some data.
, sourceGet1 :: state -> (Step1 a -> IO ()) -> IO ()
, sourceGet8 :: state -> (Step8 a -> IO ()) -> IO ()
}
data Step1 a
        -- | An element and a flag saying whether a full 8 elements are
        --   We don't want to *force* the consumer to pull the full 8
= Yield1 a Bool
| Done
-- | Provide eight elements in one go, or say to try to pull the full eight
--   later. The two cases are split like this to force loop unrolling in
data Step8 a
= Yield8 a a a a a a a a
        -- | The source cannot yield a full 8 elements right now.
| Pull1
data SourceState mode state where
SourceStateDelayed
:: IO state
-> SourceState FD state
SourceStateActive
:: state
-> SourceState FS state
startSource :: Source FD a -> IO (Source FS a)
startSource (Source istate size report get1 get8)
= do state <- getSourceState istate
return $ Source (SourceStateActive state) size report get1 get8
-- | Join two source states of the same mode.
joinSourceStates
:: SourceState mode stateA
-> SourceState mode stateB
-> SourceState mode (stateA, stateB)
joinSourceStates
(SourceStateDelayed startA)
(SourceStateDelayed startB)
= SourceStateDelayed
$ do stateA <- startA
stateB <- startB
return $ (stateA, stateB)
joinSourceStates
(SourceStateActive stateA)
(SourceStateActive stateB)
= SourceStateActive (stateA, stateB)
joinSourceStates _ _
= error "joinSourceStates: bogus warning suppression"
getSourceState :: SourceState mode state -> IO state
getSourceState fstate
= case fstate of
SourceStateDelayed mkState -> mkState
SourceStateActive state -> return state
{-# INLINE getSourceState #-}
flow :: Elt a
     => Int#
     -> (Int# -> a)
     -> Source FD a
flow !len !load
| (istate, size, report, get1, get8) <- flowGuts len load
= Source istate size report get1 get8
{-# INLINE [1] flow #-}
flowGuts
:: Elt a
=> Int#
-> (Int# -> a)
-> ( SourceState FD (UM.IOVector Int)
, UM.IOVector Int -> IO Size
, UM.IOVector Int -> IO R.Report
, UM.IOVector Int -> (Step1 a -> IO ()) -> IO ()
, UM.IOVector Int -> (Step8 a -> IO ()) -> IO ())
flowGuts !len load
= (istate, size, report, get1, get8)
where
here = "seq.flow"
istate = SourceStateDelayed
$ do refIx <- inew 1
iwrite here refIx 0# 0#
return refIx
        {-# INLINE istate #-}
size refIx
= do !(I# ix) <- iread here refIx 0#
return $ Exact (len -# ix)
        {-# INLINE size #-}
report refIx
= do !ix <- iread here refIx 0#
return $ R.Flow (I# len) ix
        {-# NOINLINE report #-}
get1 refIx push1
= do !(I# ix) <- iread here refIx 0#
let !remain = len -# ix
if remain ># 0#
then do
iwrite here refIx 0# (ix +# 1#)
let !x = load ix
touch x
push1 $ Yield1 x (remain >=# 9#)
else push1 Done
        {-# INLINE get1 #-}
get8 refIx push8
= do !(I# ix) <- iread here refIx 0#
let !remain = len -# ix
if remain >=# 8#
then do
iwrite here refIx 0# (ix +# 8#)
let here' = return
!x0 <- here' $ load (ix +# 0#)
!x1 <- here' $ load (ix +# 1#)
!x2 <- here' $ load (ix +# 2#)
!x3 <- here' $ load (ix +# 3#)
!x4 <- here' $ load (ix +# 4#)
!x5 <- here' $ load (ix +# 5#)
!x6 <- here' $ load (ix +# 6#)
!x7 <- here' $ load (ix +# 7#)
push8 $ Yield8 x0 x1 x2 x3 x4 x5 x6 x7
else do
push8 Pull1
        {-# INLINE get8 #-}
{-# INLINE [1] flowGuts #-}
|
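A small driver sketch for the Source API in the record above. `drain` is a hypothetical helper (not part of the module): it pattern-matches a started source and pulls elements with `sourceGet1` until `Done`. The sketch assumes the Source module above is in scope.

```haskell
{-# LANGUAGE MagicHash #-}
import Data.IORef
import GHC.Exts (Int (I#))

-- Collect every element of a started flow into a list.
drain :: Source FS a -> IO [a]
drain (Source (SourceStateActive st) _ _ get1 _) = do
    acc <- newIORef []
    let loop = get1 st $ \step -> case step of
          Yield1 x _ -> modifyIORef acc (x :) >> loop
          Done       -> return ()
    loop
    reverse <$> readIORef acc

main :: IO ()
main = do
    src <- startSource (flow 4# (\i -> 10 * I# i))
    xs  <- drain src
    print xs   -- [0,10,20,30]
```

The continuation style matches `sourceGet1`'s signature: each `Yield1` accumulates one element and re-enters the loop, so no element is forced more than once.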
cde1719dfb71816f4dc12e812308b3e183354a16ee9218c0935008d111bb00b4 | ideas-edu/ideas | TestSuite.hs | -----------------------------------------------------------------------------
-- Copyright 2019, Ideas project team. This file is distributed under the
-- terms of the Apache License 2.0. For more information, see the files
-- "LICENSE.txt" and "NOTICE.txt", which are included in the distribution.
-----------------------------------------------------------------------------
-- |
-- Maintainer :
-- Stability : provisional
-- Portability :  portable (depends on ghc)
--
-- A lightweight wrapper for organizing tests (including QuickCheck tests). It
-- introduces the notion of a test suite, and it stores the test results for
-- later inspection (e.g., for the generation of a test report). A TestSuite
-- is a monoid.
--
-----------------------------------------------------------------------------
module Ideas.Utils.TestSuite
( -- * TestSuite
TestSuite
, suite, useProperty, usePropertyWith
, assertTrue, assertNull, assertEquals, assertIO
, assertMessage, assertMessageIO
, onlyWarnings, rateOnError
-- * Running a test suite
, runTestSuite, runTestSuiteResult
-- * Test Suite Result
, Result, subResults, findSubResult
, justOneSuite, allMessages, topMessages
, nrOfTests, nrOfErrors, nrOfWarnings
, timeInterval, makeSummary, printSummary
-- * Message
, Message, message, warning, messageLines
-- * Status
, Status, HasStatus(..)
, isError, isWarning, isOk
-- * Rating
, Rating, HasRating(..)
) where
import Control.Exception
import Control.Monad
import Data.Foldable (toList)
import Data.IORef
import Data.List
import Data.Maybe
import Data.Semigroup as Sem
import Data.Time
import Ideas.Utils.Prelude (getDiffTime)
import System.IO
import Test.QuickCheck hiding (Result)
import qualified Data.Sequence as S
----------------------------------------------------------------
-- Test Suite
newtype TestSuite = TS (S.Seq Test)
data Test = Case String (IO Message)
| Suite String TestSuite
instance Sem.Semigroup TestSuite where
TS xs <> TS ys = TS (xs S.>< ys)
instance Monoid TestSuite where
mempty = TS mempty
mappend = (<>)
tests :: TestSuite -> [Test]
tests (TS xs) = toList xs
makeTestSuite :: Test -> TestSuite
makeTestSuite = TS . S.singleton
----------------------------------------------------------------
-- Test suite constructors
-- | Construct a (named) test suite containing test cases and other suites
suite :: String -> [TestSuite] -> TestSuite
suite s = makeTestSuite . Suite s . mconcat
-- | Turn a QuickCheck property into the test suite. The first argument is
-- a label for the property
useProperty :: Testable prop => String -> prop -> TestSuite
useProperty = flip usePropertyWith stdArgs
-- | Turn a QuickCheck property into the test suite, also providing a test
--   configuration ( )
usePropertyWith :: Testable prop => String -> Args -> prop -> TestSuite
usePropertyWith s args =
makeTestSuite . Case s . fmap make . quickCheckWithResult args {chatty=False}
where
make qc =
case qc of
Success {} ->
mempty
Failure {reason = msg} ->
message msg
NoExpectedFailure {} ->
message "no expected failure"
GaveUp {numTests = i} ->
warning ("passed only " ++ show i ++ " tests")
InsufficientCoverage {numTests = i} ->
warning ("only performed " ++ show i ++ " tests")
assertTrue :: String -> Bool -> TestSuite
assertTrue s = assertIO s . return
assertNull :: Show a => String -> [a] -> TestSuite
assertNull s xs = assertMessages s (null xs) (map show xs)
assertEquals :: (Eq a, Show a) => String -> a -> a -> TestSuite
assertEquals s x y = assertMessage s (x==y) $
"not equal " ++ show x ++ " and " ++ show y
assertMessage :: String -> Bool -> String -> TestSuite
assertMessage s b = assertMessages s b . return
assertMessages :: String -> Bool -> [String] -> TestSuite
assertMessages s b xs = makeTestSuite . Case s $ return $
if b then mempty else mconcat (map message xs)
assertIO :: String -> IO Bool -> TestSuite
assertIO s = makeTestSuite . Case s . fmap f
where
f b = if b then mempty else message "assertion failed"
assertMessageIO :: String -> IO Message -> TestSuite
assertMessageIO s = makeTestSuite . Case s
-- | All errors are turned into warnings
onlyWarnings :: TestSuite -> TestSuite
onlyWarnings = changeMessages $ \m ->
m { messageStatus = messageStatus m `min` Warning
, messageRating = mempty
}
rateOnError :: Int -> TestSuite -> TestSuite
rateOnError n = changeMessages $ \m ->
if isError m then m { messageRating = Rating n } else m
changeMessages :: (Message -> Message) -> TestSuite -> TestSuite
changeMessages f = changeTS
where
changeTS (TS xs) = TS (fmap changeTest xs)
changeTest (Case s io) = Case s (f <$> io)
changeTest (Suite s t) = Suite s (changeTS t)
----------------------------------------------------------------
-- Running a test suite
runTestSuite :: Bool -> TestSuite -> IO ()
runTestSuite chattyIO = void . runTestSuiteResult chattyIO
runTestSuiteResult :: Bool -> TestSuite -> IO Result
runTestSuiteResult chattyIO ts = do
hSetBuffering stdout NoBuffering
ref <- newIORef 0
result <- runner ref chattyIO ts
newline ref
return result
runner :: IORef Int -> Bool -> TestSuite -> IO Result
runner ref chattyIO = runTS
where
runTS :: TestSuite -> IO Result
runTS ts = do
(res, dt) <- getDiffTime (foldM addTest mempty (tests ts))
returnStrict res { diffTime = dt }
runTest :: Test -> IO Result
runTest t =
case t of
Suite s xs -> runSuite s xs
Case s io -> runTestCase s io
  runSuite :: String -> TestSuite -> IO Result
runSuite s ts = do
when chattyIO $ do
newline ref
putStrLn s
reset ref
result <- runTS ts
returnStrict (suiteResult s result)
runTestCase :: String -> IO Message -> IO Result
runTestCase s io = do
msg <- io `catch` handler
case messageStatus msg of
_ | not chattyIO -> return ()
Ok -> dot ref
_ -> do
newlineIndent ref
print msg
reset ref
returnStrict (caseResult (s, msg))
where
handler :: SomeException -> IO Message
handler = return . message . show
addTest :: Result -> Test -> IO Result
addTest res t = (res <>) <$> runTest t
-- formatting helpers
type WriteIO a = IORef Int -> IO a
newline :: WriteIO ()
newline ref = do
i <- readIORef ref
when (i>0) (putChar '\n')
reset ref
newlineIndent :: WriteIO ()
newlineIndent ref = do
newline ref
putStr " "
writeIORef ref 3
dot :: WriteIO ()
dot ref = do
i <- readIORef ref
unless (i>0 && i<60) (newlineIndent ref)
putChar '.'
modifyIORef ref (+1)
reset :: WriteIO ()
reset = (`writeIORef` 0)
----------------------------------------------------------------
-- Test Suite Result
data Result = Result
{ suites :: S.Seq (String, Result)
, cases :: S.Seq (String, Message)
, diffTime :: !NominalDiffTime
, nrOfTests :: !Int
, nrOfWarnings :: !Int
, nrOfErrors :: !Int
, resultRating :: !Rating
}
-- one-line summary
instance Show Result where
show result =
"(tests: " ++ show (nrOfTests result) ++
", errors: " ++ show (nrOfErrors result) ++
", warnings: " ++ show (nrOfWarnings result) ++
", " ++ show (diffTime result) ++ ")"
instance Sem.Semigroup Result where
x <> y = Result
{ suites = suites x S.>< suites y
, cases = cases x S.>< cases y
, diffTime = diffTime x + diffTime y
, nrOfTests = nrOfTests x + nrOfTests y
, nrOfWarnings = nrOfWarnings x + nrOfWarnings y
, nrOfErrors = nrOfErrors x + nrOfErrors y
, resultRating = resultRating x <> resultRating y
}
instance Monoid Result where
mempty = Result mempty mempty 0 0 0 0 mempty
mappend = (<>)
instance HasStatus Result where
getStatus r | nrOfErrors r > 0 = Error
| nrOfWarnings r > 0 = Warning
| otherwise = Ok
instance HasRating Result where
rating = rating . resultRating
rate n a = a {resultRating = Rating n}
suiteResult :: String -> Result -> Result
suiteResult s res = mempty
{ suites = S.singleton (s, res)
, nrOfTests = nrOfTests res
, nrOfWarnings = nrOfWarnings res
, nrOfErrors = nrOfErrors res
, resultRating = resultRating res
}
caseResult :: (String, Message) -> Result
caseResult x@(_, msg) =
case getStatus msg of
Ok -> new
Warning -> new { nrOfWarnings = 1 }
Error -> new { nrOfErrors = 1 }
where
new = mempty
{ cases = S.singleton x
, nrOfTests = 1
, resultRating = messageRating msg
}
subResults :: Result -> [(String, Result)]
subResults = toList . suites
topMessages :: Result -> [(String, Message)]
topMessages = toList . cases
allMessages :: Result -> [(String, Message)]
allMessages res =
topMessages res ++ concatMap (allMessages . snd) (subResults res)
findSubResult :: String -> Result -> Maybe Result
findSubResult name = listToMaybe . recs
where
recs = concatMap rec . subResults
rec (n, t)
| n == name = [t]
| otherwise = recs t
justOneSuite :: Result -> Maybe (String, Result)
justOneSuite res =
case subResults res of
[x] | S.null (cases res) -> Just x
_ -> Nothing
timeInterval :: Result -> Double
timeInterval = fromRational . toRational . diffTime
printSummary :: Result -> IO ()
printSummary = putStrLn . makeSummary
makeSummary :: Result -> String
makeSummary result = unlines $
[ line
, "Tests : " ++ show (nrOfTests result)
, "Errors : " ++ show (nrOfErrors result)
, "Warnings : " ++ show (nrOfWarnings result)
, ""
, "Time : " ++ show (diffTime result)
, ""
, "Suites: "
] ++ map f (subResults result)
++ [line]
where
line = replicate 75 '-'
f (name, r) = " " ++ name ++ " " ++ show r
-----------------------------------------------------
-- Message
data Message = M
{ messageStatus :: !Status
, messageRating :: !Rating
, messageLines :: [String]
}
deriving Eq
instance Show Message where
show a = st ++ sep ++ msg
where
msg = intercalate ", " (messageLines a)
sep = if null st || null msg then "" else ": "
st | isError a = "error"
| isWarning a = "warning"
| null (messageLines a) = "ok"
| otherwise = ""
instance Sem.Semigroup Message where
M s r xs <> M t q ys = M (s <> t) (r <> q) (xs <> ys)
instance Monoid Message where
mempty = M mempty mempty mempty
mappend = (<>)
instance HasStatus Message where
getStatus = messageStatus
instance HasRating Message where
rating = rating . messageRating
rate n a = a {messageRating = Rating n}
message :: String -> Message
message = M Error (Rating 0) . return
warning :: String -> Message
warning = M Warning mempty . return
-----------------------------------------------------
-- Status
data Status = Ok | Warning | Error
deriving (Eq, Ord)
instance Sem.Semigroup Status where
(<>) = max
instance Monoid Status where
mempty = Ok
mappend = (<>)
class HasStatus a where
getStatus :: a -> Status
isOk, isWarning, isError :: HasStatus a => a -> Bool
isOk = (== Ok) . getStatus
isWarning = (== Warning) . getStatus
isError = (== Error) . getStatus
-----------------------------------------------------
-- Rating
data Rating = Rating !Int | MaxRating
deriving (Eq, Ord)
instance Sem.Semigroup Rating where
(<>) = min
instance Monoid Rating where
mempty = MaxRating
mappend = (<>)
class HasRating a where
rating :: a -> Maybe Int
rate :: Int -> a -> a
instance HasRating Rating where
rating (Rating n) = Just n
rating MaxRating = Nothing
rate = const . Rating
-----------------------------------------------------
-- Utility function
returnStrict :: Monad m => a -> m a
returnStrict a = a `seq` return a | null | https://raw.githubusercontent.com/ideas-edu/ideas/f84907f92a8c407b7313f99e65a08d2646dc1565/src/Ideas/Utils/TestSuite.hs | haskell | ---------------------------------------------------------------------------
---------------------------------------------------------------------------
|
Maintainer :
Stability : provisional
introduces the notion of a test suite, and it stores the test results for
is a monoid.
---------------------------------------------------------------------------
* TestSuite
* Running a test suite
* Test Suite Result
* Message
* Status
* Rating
--------------------------------------------------------------
Test Suite
--------------------------------------------------------------
Test suite constructors
| Construct a (named) test suite containing test cases and other suites
a label for the property
| Turn a QuickCheck property into the test suite, also providing a test
| All errors are turned into warnings
--------------------------------------------------------------
Running a test suite
formatting helpers
--------------------------------------------------------------
Test Suite Result
---------------------------------------------------
Message
---------------------------------------------------
Status
---------------------------------------------------
Rating
---------------------------------------------------
Utility function
| -- Copyright 2019, Ideas project team. This file is distributed under the
-- terms of the Apache License 2.0. For more information, see the files
-- "LICENSE.txt" and "NOTICE.txt", which are included in the distribution.
-- Portability :  portable (depends on ghc)
-- A lightweight wrapper for organizing tests (including QuickCheck tests). It
-- later inspection (e.g., for the generation of a test report). A TestSuite
module Ideas.Utils.TestSuite
   ( TestSuite
, suite, useProperty, usePropertyWith
, assertTrue, assertNull, assertEquals, assertIO
, assertMessage, assertMessageIO
, onlyWarnings, rateOnError
, runTestSuite, runTestSuiteResult
, Result, subResults, findSubResult
, justOneSuite, allMessages, topMessages
, nrOfTests, nrOfErrors, nrOfWarnings
, timeInterval, makeSummary, printSummary
, Message, message, warning, messageLines
, Status, HasStatus(..)
, isError, isWarning, isOk
, Rating, HasRating(..)
) where
import Control.Exception
import Control.Monad
import Data.Foldable (toList)
import Data.IORef
import Data.List
import Data.Maybe
import Data.Semigroup as Sem
import Data.Time
import Ideas.Utils.Prelude (getDiffTime)
import System.IO
import Test.QuickCheck hiding (Result)
import qualified Data.Sequence as S
newtype TestSuite = TS (S.Seq Test)
data Test = Case String (IO Message)
| Suite String TestSuite
instance Sem.Semigroup TestSuite where
TS xs <> TS ys = TS (xs S.>< ys)
instance Monoid TestSuite where
mempty = TS mempty
mappend = (<>)
tests :: TestSuite -> [Test]
tests (TS xs) = toList xs
makeTestSuite :: Test -> TestSuite
makeTestSuite = TS . S.singleton
suite :: String -> [TestSuite] -> TestSuite
suite s = makeTestSuite . Suite s . mconcat
-- | Turn a QuickCheck property into the test suite. The first argument is
useProperty :: Testable prop => String -> prop -> TestSuite
useProperty = flip usePropertyWith stdArgs
--   configuration ( )
usePropertyWith :: Testable prop => String -> Args -> prop -> TestSuite
usePropertyWith s args =
makeTestSuite . Case s . fmap make . quickCheckWithResult args {chatty=False}
where
make qc =
case qc of
Success {} ->
mempty
Failure {reason = msg} ->
message msg
NoExpectedFailure {} ->
message "no expected failure"
GaveUp {numTests = i} ->
warning ("passed only " ++ show i ++ " tests")
InsufficientCoverage {numTests = i} ->
warning ("only performed " ++ show i ++ " tests")
assertTrue :: String -> Bool -> TestSuite
assertTrue s = assertIO s . return
assertNull :: Show a => String -> [a] -> TestSuite
assertNull s xs = assertMessages s (null xs) (map show xs)
assertEquals :: (Eq a, Show a) => String -> a -> a -> TestSuite
assertEquals s x y = assertMessage s (x==y) $
"not equal " ++ show x ++ " and " ++ show y
assertMessage :: String -> Bool -> String -> TestSuite
assertMessage s b = assertMessages s b . return
assertMessages :: String -> Bool -> [String] -> TestSuite
assertMessages s b xs = makeTestSuite . Case s $ return $
if b then mempty else mconcat (map message xs)
assertIO :: String -> IO Bool -> TestSuite
assertIO s = makeTestSuite . Case s . fmap f
where
f b = if b then mempty else message "assertion failed"
assertMessageIO :: String -> IO Message -> TestSuite
assertMessageIO s = makeTestSuite . Case s
onlyWarnings :: TestSuite -> TestSuite
onlyWarnings = changeMessages $ \m ->
m { messageStatus = messageStatus m `min` Warning
, messageRating = mempty
}
rateOnError :: Int -> TestSuite -> TestSuite
rateOnError n = changeMessages $ \m ->
if isError m then m { messageRating = Rating n } else m
changeMessages :: (Message -> Message) -> TestSuite -> TestSuite
changeMessages f = changeTS
where
changeTS (TS xs) = TS (fmap changeTest xs)
changeTest (Case s io) = Case s (f <$> io)
changeTest (Suite s t) = Suite s (changeTS t)
runTestSuite :: Bool -> TestSuite -> IO ()
runTestSuite chattyIO = void . runTestSuiteResult chattyIO
runTestSuiteResult :: Bool -> TestSuite -> IO Result
runTestSuiteResult chattyIO ts = do
hSetBuffering stdout NoBuffering
ref <- newIORef 0
result <- runner ref chattyIO ts
newline ref
return result
runner :: IORef Int -> Bool -> TestSuite -> IO Result
runner ref chattyIO = runTS
where
runTS :: TestSuite -> IO Result
runTS ts = do
(res, dt) <- getDiffTime (foldM addTest mempty (tests ts))
returnStrict res { diffTime = dt }
runTest :: Test -> IO Result
runTest t =
case t of
Suite s xs -> runSuite s xs
Case s io -> runTestCase s io
  runSuite :: String -> TestSuite -> IO Result
runSuite s ts = do
when chattyIO $ do
newline ref
putStrLn s
reset ref
result <- runTS ts
returnStrict (suiteResult s result)
runTestCase :: String -> IO Message -> IO Result
runTestCase s io = do
msg <- io `catch` handler
case messageStatus msg of
_ | not chattyIO -> return ()
Ok -> dot ref
_ -> do
newlineIndent ref
print msg
reset ref
returnStrict (caseResult (s, msg))
where
handler :: SomeException -> IO Message
handler = return . message . show
addTest :: Result -> Test -> IO Result
addTest res t = (res <>) <$> runTest t
type WriteIO a = IORef Int -> IO a
newline :: WriteIO ()
newline ref = do
i <- readIORef ref
when (i>0) (putChar '\n')
reset ref
newlineIndent :: WriteIO ()
newlineIndent ref = do
newline ref
putStr " "
writeIORef ref 3
dot :: WriteIO ()
dot ref = do
i <- readIORef ref
unless (i>0 && i<60) (newlineIndent ref)
putChar '.'
modifyIORef ref (+1)
reset :: WriteIO ()
reset = (`writeIORef` 0)
data Result = Result
{ suites :: S.Seq (String, Result)
, cases :: S.Seq (String, Message)
, diffTime :: !NominalDiffTime
, nrOfTests :: !Int
, nrOfWarnings :: !Int
, nrOfErrors :: !Int
, resultRating :: !Rating
}
-- one-line summary
instance Show Result where
show result =
"(tests: " ++ show (nrOfTests result) ++
", errors: " ++ show (nrOfErrors result) ++
", warnings: " ++ show (nrOfWarnings result) ++
", " ++ show (diffTime result) ++ ")"
instance Sem.Semigroup Result where
x <> y = Result
{ suites = suites x S.>< suites y
, cases = cases x S.>< cases y
, diffTime = diffTime x + diffTime y
, nrOfTests = nrOfTests x + nrOfTests y
, nrOfWarnings = nrOfWarnings x + nrOfWarnings y
, nrOfErrors = nrOfErrors x + nrOfErrors y
, resultRating = resultRating x <> resultRating y
}
instance Monoid Result where
mempty = Result mempty mempty 0 0 0 0 mempty
mappend = (<>)
instance HasStatus Result where
getStatus r | nrOfErrors r > 0 = Error
| nrOfWarnings r > 0 = Warning
| otherwise = Ok
instance HasRating Result where
rating = rating . resultRating
rate n a = a {resultRating = Rating n}
suiteResult :: String -> Result -> Result
suiteResult s res = mempty
{ suites = S.singleton (s, res)
, nrOfTests = nrOfTests res
, nrOfWarnings = nrOfWarnings res
, nrOfErrors = nrOfErrors res
, resultRating = resultRating res
}
caseResult :: (String, Message) -> Result
caseResult x@(_, msg) =
case getStatus msg of
Ok -> new
Warning -> new { nrOfWarnings = 1 }
Error -> new { nrOfErrors = 1 }
where
new = mempty
{ cases = S.singleton x
, nrOfTests = 1
, resultRating = messageRating msg
}
subResults :: Result -> [(String, Result)]
subResults = toList . suites
topMessages :: Result -> [(String, Message)]
topMessages = toList . cases
allMessages :: Result -> [(String, Message)]
allMessages res =
topMessages res ++ concatMap (allMessages . snd) (subResults res)
findSubResult :: String -> Result -> Maybe Result
findSubResult name = listToMaybe . recs
where
recs = concatMap rec . subResults
rec (n, t)
| n == name = [t]
| otherwise = recs t
justOneSuite :: Result -> Maybe (String, Result)
justOneSuite res =
case subResults res of
[x] | S.null (cases res) -> Just x
_ -> Nothing
timeInterval :: Result -> Double
timeInterval = fromRational . toRational . diffTime
printSummary :: Result -> IO ()
printSummary = putStrLn . makeSummary
makeSummary :: Result -> String
makeSummary result = unlines $
[ line
, "Tests : " ++ show (nrOfTests result)
, "Errors : " ++ show (nrOfErrors result)
, "Warnings : " ++ show (nrOfWarnings result)
, ""
, "Time : " ++ show (diffTime result)
, ""
, "Suites: "
] ++ map f (subResults result)
++ [line]
where
line = replicate 75 '-'
f (name, r) = " " ++ name ++ " " ++ show r
data Message = M
{ messageStatus :: !Status
, messageRating :: !Rating
, messageLines :: [String]
}
deriving Eq
instance Show Message where
show a = st ++ sep ++ msg
where
msg = intercalate ", " (messageLines a)
sep = if null st || null msg then "" else ": "
st | isError a = "error"
| isWarning a = "warning"
| null (messageLines a) = "ok"
| otherwise = ""
instance Sem.Semigroup Message where
M s r xs <> M t q ys = M (s <> t) (r <> q) (xs <> ys)
instance Monoid Message where
mempty = M mempty mempty mempty
mappend = (<>)
instance HasStatus Message where
getStatus = messageStatus
instance HasRating Message where
rating = rating . messageRating
rate n a = a {messageRating = Rating n}
message :: String -> Message
message = M Error (Rating 0) . return
warning :: String -> Message
warning = M Warning mempty . return
data Status = Ok | Warning | Error
deriving (Eq, Ord)
instance Sem.Semigroup Status where
(<>) = max
instance Monoid Status where
mempty = Ok
mappend = (<>)
class HasStatus a where
getStatus :: a -> Status
isOk, isWarning, isError :: HasStatus a => a -> Bool
isOk = (== Ok) . getStatus
isWarning = (== Warning) . getStatus
isError = (== Error) . getStatus
data Rating = Rating !Int | MaxRating
deriving (Eq, Ord)
instance Sem.Semigroup Rating where
(<>) = min
instance Monoid Rating where
mempty = MaxRating
mappend = (<>)
class HasRating a where
rating :: a -> Maybe Int
rate :: Int -> a -> a
instance HasRating Rating where
rating (Rating n) = Just n
rating MaxRating = Nothing
rate = const . Rating
returnStrict :: Monad m => a -> m a
returnStrict a = a `seq` return a |
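The `Status` instances above encode "worst status wins" via `(<>) = max` with `mempty = Ok`. A self-contained sketch of that combining rule (all names here are illustrative re-definitions, not imports from the module above):

```haskell
import Data.List (foldl')

-- Mirrors the ordering used above: Ok < Warning < Error.
data Status = Ok | Warning | Error deriving (Eq, Ord, Show)

-- Folding with max is exactly what mconcat does under (<>) = max, mempty = Ok.
combine :: [Status] -> Status
combine = foldl' max Ok

main :: IO ()
main = do
  print (combine [Ok, Warning, Ok])    -- Warning
  print (combine [Ok, Error, Warning]) -- Error
  print (combine [])                   -- Ok
```

The same shape explains why a suite's status is `Error` as soon as any test case errors.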
cc893f8449e58775d38427114ff8773a11ab9e1cbb322011f37ca3683c011033 | Fresheyeball/Shpadoinkle | Types.hs | {-# LANGUAGE AllowAmbiguousTypes #-}
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE DeriveAnyClass #-}
{-# LANGUAGE DeriveGeneric #-}
{-# LANGUAGE DerivingStrategies #-}
{-# LANGUAGE DuplicateRecordFields #-}
{-# LANGUAGE FlexibleContexts #-}
{-# LANGUAGE FlexibleInstances #-}
{-# LANGUAGE GeneralizedNewtypeDeriving #-}
{-# LANGUAGE InstanceSigs #-}
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE MultiParamTypeClasses #-}
{-# LANGUAGE OverloadedLabels #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE QuantifiedConstraints #-}
{-# LANGUAGE RankNTypes #-}
{-# LANGUAGE RecordWildCards #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE StandaloneDeriving #-}
{-# LANGUAGE TypeApplications #-}
{-# LANGUAGE TypeFamilies #-}
{-# LANGUAGE TypeOperators #-}
{-# LANGUAGE UndecidableInstances #-}
{-# OPTIONS_GHC -fno-warn-orphans #-}
{-# OPTIONS_GHC -fno-warn-redundant-constraints #-}
module Types (module Types, module Types.Prim) where
import Control.Lens as Lens (Identity, view,
(^.))
import Control.Monad.Except (MonadError (throwError),
MonadTrans (..))
import Data.Aeson (FromJSON, ToJSON)
import Data.Function (on)
import Data.Generics.Labels ()
import Data.Maybe (fromMaybe)
import Data.Proxy (Proxy (Proxy))
import Data.Text (Text)
import Database.Beam (Beamable, Columnar,
Database, DatabaseSettings,
Generic, Nullable,
Table (..), TableEntity,
defaultDbSettings)
import Servant.API (Capture, Delete,
FromHttpApiData, Get, JSON,
Post, Put, QueryParam, Raw,
ReqBody, ToHttpApiData,
type (:<|>) (..), type (:>))
import Shpadoinkle (Html, MonadJSM, NFData)
import qualified Shpadoinkle.Html as H
import Shpadoinkle.Router (HasRouter (type (:>>)),
Redirect (Redirect),
Routed (..), View, navigate)
import Shpadoinkle.Widgets.Form.Dropdown as Dropdown (Dropdown)
import Shpadoinkle.Widgets.Table as Table (Column, Row,
Sort (ASC, DESC),
SortCol (..),
Tabular (Effect, sortTable, toCell, toRows))
import Shpadoinkle.Widgets.Types (Field, Humanize (..),
Hygiene (Clean),
Input (Input, _value),
Pick (AtleastOne, One),
Present (present),
Search (Search),
Status (Edit, Errors, Valid),
Validate (rules),
fullOptions, fullOptionsMin)
import Shpadoinkle.Widgets.Validation (between, nonMEmpty, nonZero,
positive)
import Types.Prim (Description (..),
Operable (..), SKU (..),
SerialNumber (..),
SpaceCraftId (..),
Squadron (..))
data SpaceCraftT f = SpaceCraft
{ identity :: Columnar f SpaceCraftId
, sku :: Columnar f SKU
, description :: Columnar (Nullable f) Description
, serial :: Columnar f SerialNumber
, squadron :: Columnar f Squadron
, operable :: Columnar f Operable
} deriving (Generic, Beamable)
instance NFData (SpaceCraftT Identity)
instance Table SpaceCraftT where
newtype PrimaryKey SpaceCraftT f = SpaceCraftKey (Columnar f SpaceCraftId) deriving (Generic) deriving anyclass (Beamable)
primaryKey = SpaceCraftKey . identity
type SpaceCraft = SpaceCraftT Identity
deriving instance Eq SpaceCraft
deriving instance Ord SpaceCraft
deriving instance Show SpaceCraft
deriving instance ToJSON SpaceCraft
deriving instance FromJSON SpaceCraft
newtype DB f = DB { roster :: f (TableEntity SpaceCraftT) } deriving (Generic) deriving anyclass (Database be)
db :: DatabaseSettings be DB
db = defaultDbSettings
data SpaceCraftUpdate s = SpaceCraftUpdate
{ sku :: Field s Text Input SKU
, description :: Field s Text Input (Maybe Description)
, serial :: Field s Text Input SerialNumber
, squadron :: Field s Text (Dropdown 'One) Squadron
, operable :: Field s Text (Dropdown 'AtleastOne) Operable
} deriving Generic
instance ( NFData (Field s Text Input SKU)
, NFData (Field s Text Input (Maybe Description))
, NFData (Field s Text Input SerialNumber)
, NFData (Field s Text (Dropdown 'One) Squadron)
, NFData (Field s Text (Dropdown 'AtleastOne) Operable)
) => NFData (SpaceCraftUpdate s)
deriving instance Eq (SpaceCraftUpdate 'Valid)
deriving instance Ord (SpaceCraftUpdate 'Valid)
deriving instance Show (SpaceCraftUpdate 'Valid)
deriving instance ToJSON (SpaceCraftUpdate 'Valid)
deriving instance FromJSON (SpaceCraftUpdate 'Valid)
deriving instance Eq (SpaceCraftUpdate 'Edit)
deriving instance Ord (SpaceCraftUpdate 'Edit)
deriving instance Show (SpaceCraftUpdate 'Edit)
deriving instance ToJSON (SpaceCraftUpdate 'Edit)
deriving instance FromJSON (SpaceCraftUpdate 'Edit)
deriving instance Show (SpaceCraftUpdate 'Errors)
instance Validate SpaceCraftUpdate where
rules = SpaceCraftUpdate
{ sku = positive <> nonZero
, description = nonMEmpty
, serial = between (30, maxBound)
, squadron = maybe (throwError "Cannot be empty") pure
, operable = pure
}
data Roster = Roster
{ sort :: SortCol [SpaceCraft]
, search :: Input Search
, table :: [SpaceCraft]
}
deriving instance Eq Roster
deriving instance Ord Roster
deriving instance Show Roster
deriving instance Generic Roster
instance
( NFData (Column [SpaceCraft])
, NFData (SpaceCraftT Identity)) => NFData Roster
instance (ToJSON (Table.Column [SpaceCraft])) => ToJSON Roster
instance (FromJSON (Table.Column [SpaceCraft])) => FromJSON Roster
emptyEditForm :: SpaceCraftUpdate 'Edit
emptyEditForm = SpaceCraftUpdate
{ sku = Input Clean 0
, description = Input Clean Nothing
, serial = Input Clean 0
, squadron = fullOptions
, operable = fullOptionsMin
}
data ViewModel
= MEcho (Maybe Text)
| MList Roster
| MDetail (Maybe SpaceCraftId) (SpaceCraftUpdate 'Edit)
| M404
deriving (Eq, Ord, Show, Generic)
instance (ToJSON (Column [SpaceCraft])) => ToJSON ViewModel
instance (FromJSON (Column [SpaceCraft])) => FromJSON ViewModel
instance (NFData (Column [SpaceCraft])) => NFData ViewModel
data Route
= REcho (Maybe Text)
| RList (Input Search)
| RNew
| RExisting SpaceCraftId
deriving (Eq, Ord, Show, Generic, NFData, ToJSON, FromJSON)
type API = "api" :> "space-craft" :> Get '[JSON] [SpaceCraft]
:<|> "api" :> "space-craft" :> Capture "id" SpaceCraftId :> Get '[JSON] (Maybe SpaceCraft)
:<|> "api" :> "space-craft" :> Capture "id" SpaceCraftId :> ReqBody '[JSON] (SpaceCraftUpdate 'Valid) :> Post '[JSON] ()
:<|> "api" :> "space-craft" :> ReqBody '[JSON] (SpaceCraftUpdate 'Valid) :> Put '[JSON] SpaceCraftId
:<|> "api" :> "space-craft" :> ReqBody '[JSON] SpaceCraftId :> Delete '[JSON] ()
type SPA m = "app" :> "echo" :> QueryParam "echo" Text :> View m Text
:<|> "app" :> "new" :> View m ViewModel
:<|> "app" :> "edit" :> Capture "id" SpaceCraftId :> View m ViewModel
:<|> "app" :> QueryParam "search" Search :> View m ViewModel
:<|> Raw
routes :: SPA m :>> Route
routes = REcho
:<|> RNew
:<|> RExisting
:<|> RList . Input Clean . fromMaybe ""
:<|> RList (Input Clean "")
deriving newtype instance ToHttpApiData Search
deriving newtype instance FromHttpApiData Search
instance Routed (SPA m) Route where
redirect = \case
REcho t -> Redirect (Proxy @("app" :> "echo" :> QueryParam "echo" Text :> View m Text)) ($ t)
RNew -> Redirect (Proxy @("app" :> "new" :> View m ViewModel)) id
RExisting i -> Redirect (Proxy @("app" :> "edit" :> Capture "id" SpaceCraftId :> View m ViewModel)) ($ i)
RList s -> Redirect (Proxy @("app" :> QueryParam "search" Search :> View m ViewModel)) ($ Just (_value s))
class CRUDSpaceCraft m where
listSpaceCraft :: m [SpaceCraft]
getSpaceCraft :: SpaceCraftId -> m (Maybe SpaceCraft)
updateSpaceCraft :: SpaceCraftId -> SpaceCraftUpdate 'Valid -> m ()
createSpaceCraft :: SpaceCraftUpdate 'Valid -> m SpaceCraftId
deleteSpaceCraft :: SpaceCraftId -> m ()
instance (MonadTrans t, Monad m, CRUDSpaceCraft m) => CRUDSpaceCraft (t m) where
listSpaceCraft = lift listSpaceCraft
getSpaceCraft = lift . getSpaceCraft
updateSpaceCraft x = lift . updateSpaceCraft x
createSpaceCraft = lift . createSpaceCraft
deleteSpaceCraft = lift . deleteSpaceCraft
instance Humanize (Column [SpaceCraft]) where
humanize = \case
SKUT -> "SKU"
DescriptionT -> "Desc"
SerialNumberT -> "Serial #"
SquadronT -> "Squadron"
OperableT -> "Status"
ToolsT -> ""
data instance Column [SpaceCraft] =
SKUT | DescriptionT | SerialNumberT | SquadronT | OperableT | ToolsT
deriving (Eq, Ord, Show, Enum, Bounded, Generic, ToJSON, FromJSON, NFData)
newtype instance Row [SpaceCraft] = SpaceCraftRow { unRow :: SpaceCraft }
deriving (Eq, Ord, Show)
instance Tabular [SpaceCraft] where
type Effect [SpaceCraft] m = (MonadJSM m, CRUDSpaceCraft m)
toRows = fmap SpaceCraftRow
toCell :: forall m. Effect [SpaceCraft] m => [SpaceCraft] -> Row [SpaceCraft] -> Column [SpaceCraft] -> [Html m [SpaceCraft]]
toCell _ (SpaceCraftRow SpaceCraft {..}) = \case
SKUT -> present sku
DescriptionT -> present description
SerialNumberT -> present serial
SquadronT -> present squadron
OperableT -> present operable
ToolsT ->
[ H.div "btn-group"
[ H.button [ H.className "btn btn-sm btn-secondary",
H.onClickM_ $ navigate @(SPA m) (RExisting identity) ] [ "Edit" ]
, H.button [ H.className "btn btn-sm btn-secondary",
H.onClickM $ do
deleteSpaceCraft identity
return . Prelude.filter $ \x -> x ^. #identity /= identity ] [ "Delete" ]
]
]
sortTable (SortCol c d) = f $ case c of
SKUT -> g #sku
DescriptionT -> g #description
SerialNumberT -> g #serial
SquadronT -> g #squadron
OperableT -> g #operable
ToolsT -> \_ _ -> EQ
where f = case d of ASC -> id; DESC -> flip
g l = compare `on` Lens.view l . unRow
| null | https://raw.githubusercontent.com/Fresheyeball/Shpadoinkle/5e5fb636fb0b0e99f04bae0d75cff722a10463ae/examples/servant-crud/Types.hs | haskell |
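`sortTable` in Types.hs above builds a comparator with ``compare `on` field`` and reverses it with `flip` for `DESC`. A minimal runnable sketch of that trick (names here are illustrative, not the library's API):

```haskell
import Data.Function (on)
import Data.List (sortBy)

data Sort = ASC | DESC

-- Sort on a projection; DESC flips the comparator, exactly the
-- f/g pattern used in sortTable above.
sortOn' :: Ord b => Sort -> (a -> b) -> [a] -> [a]
sortOn' d f = sortBy (dir (compare `on` f))
  where dir = case d of ASC -> id; DESC -> flip

main :: IO ()
main = do
  print (sortOn' ASC  fst [(3,'c'),(1,'a'),(2,'b')])  -- [(1,'a'),(2,'b'),(3,'c')]
  print (sortOn' DESC fst [(3,'c'),(1,'a'),(2,'b')])  -- [(3,'c'),(2,'b'),(1,'a')]
```

Flipping the comparator rather than reversing the sorted list keeps the sort stable in the chosen direction.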
ade8d047cf7f61bbc1f9623b5a9b64c81610bcc52615f6faec36cebfa90a7474 | mk270/archipelago | parser.ml |
(*
   Archipelago, a multi-user dungeon (MUD) server, by Martin Keegan
   Copyright (C) 2009-2012  Martin Keegan

   This programme is free software; you may redistribute and/or modify
   it under the terms of the GNU Affero General Public Licence as published by
   the Free Software Foundation, either version 3 of said Licence, or
   (at your option) any later version.
*)
open Model
(* this needs to become a tuple *)
type search_space =
| Global
| Siblings
| Children
| ParentLinks
type match_style =
| ByName
| ByCode
type arg' =
| MO of match_style * mo_type list * search_space list
| ArgLiteral
type arg =
    (* constructors mirror the patterns in search_pattern below *)
  | ItemCarried          (* MO ([MO_Item;], [Children;]) *)
  | ItemInLocation       (* MO ([MO_Item;], [Siblings;]) *)
  | ItemPresent          (* MO ([MO_Item;], [Children; Siblings;]) *)
  | ItemOrPortalPresent  (* MO ([MO_Portal; MO_Item;], [ParentLinks; Children; Siblings;]) *)
  | LocationCode         (* MO ([MO_Room;], [Global;]) *)
  | MonsterPresent       (* MO ([MO_Monster;], [Children; Siblings;]) *)
  | ItemAnywhere         (* MO ([MO_Item;], [Global;]) *)
  | CurrentPlayer        (* MO ([MO_Player;], [Global;]) *)
  | Literal              (* ArgLiteral *)
  | PlayerPresent        (* MO ([MO_Player;], [Siblings;]) *)
let search_pattern = function
| ItemCarried -> MO (ByName, [MO_Item;], [Children;])
| ItemInLocation -> MO (ByName, [MO_Item;], [Siblings;])
| ItemPresent -> MO (ByName, [MO_Item;], [Children; Siblings;])
| ItemOrPortalPresent ->
MO (ByName, [MO_Portal; MO_Item;], [ParentLinks; Children; Siblings;])
| LocationCode -> MO (ByCode, [MO_Room;], [Global])
| MonsterPresent -> MO (ByName, [MO_Monster;], [Children; Siblings;])
| ItemAnywhere -> MO (ByName, [MO_Item;], [Global;])
| CurrentPlayer -> MO (ByName, [MO_Player;], [Global;])
| Literal -> ArgLiteral
| PlayerPresent -> MO (ByName, [MO_Player;], [Siblings;])
type role =
| Patient
| Recipient
| Instrument
| Searchterm
type unary = actor:mudobject -> unit
type binary = actor:mudobject -> patient:mudobject -> unit
type binary_word = actor:mudobject -> word:string -> unit
type ditrans = actor:mudobject -> patient:mudobject ->
    instrument:mudobject -> unit
type ditrans_word = actor:mudobject -> patient:mudobject -> word:string -> unit
type word =
| Var of role * arg
| Constant of string
| Rest_of_line of role * arg
type frame =
| Unary of unary
| Binary of binary
| Ditrans of ditrans
| Binary_word of binary_word
| Ditrans_word of ditrans_word
type resolved_arg =
| Mudobject of mudobject
| Word of string
type verb = {
v_name : string ;
v_args : word list ;
v_frame : frame ;
}
let delim_ws = Str.regexp " +"
let room_by_code name =
try let code = Loc_code.create name in
[ Search.room_by_code code; ]
with Loc_code.Invalid_loccode _ -> raise Not_found
let search_method name = function
| ByName -> Model.Props.match_name (Some name)
| ByCode -> let loc_code = Loc_code.create (String.uppercase name) in
Model.Props.match_loc_code (Some loc_code)
let objs_in_search_space ~actor = function
| Global -> Search.search_all ~ty:None ~name:None
| Siblings ->
List.filter (fun i -> i != actor) (Tree.children (Tree.parent actor))
| Children ->
Tree.children actor
| ParentLinks ->
Model.MudobjectSet.elements (Link.portals_in_room (Tree.parent actor))
let resolve_arg ~actor arg name =
let sp = search_pattern arg in
match sp with
| ArgLiteral -> Word name
| MO (meth, sought_types, search_spaces) ->
let f = search_method name meth in
let objs = fun i -> objs_in_search_space ~actor i in
let objects = List.map objs search_spaces in
let objects = List.flatten objects in
(* we now have all potentially searchable objects *)
let objects = List.filter f objects in
let filtered = List.map (
fun ty -> List.filter (fun o -> Model.mudobj_ty_match o (Some ty)) objects
) sought_types in
let filtered = List.flatten filtered in
if List.length filtered = 0
then raise Not_found
else Mudobject (List.hd filtered)
let mudobject_of_resolved_arg = function
| Mudobject mo -> mo
| _ -> failwith "Mudobject expected"
let word_of_resolved_arg = function
| Word s -> s
| _ -> failwith "Word expected"
exception Too_many_words
exception Constant_not_found
let exec verb ~actor line =
let rxp = Str.regexp " +" in
let do_match token s =
match token with
| Var (th, arg) ->
let words' = Str.bounded_split rxp s 2 in
let obj = resolve_arg ~actor arg (List.hd words') in
(match words' with
| [] -> failwith "no words left to parse"
| [hd] -> (Some (th, obj), "")
| hd :: tl ->
(Some (th, obj), List.hd tl))
| Constant s' ->
let words' = Str.bounded_split rxp s 2 in
(match words' with
| [] -> failwith "no words left to parse"
| [hd] ->
(if Utils.Ext.initial_match hd s'
then (None, "")
else raise Constant_not_found)
| hd :: tl ->
(if Utils.Ext.initial_match hd s'
then (None, List.hd tl)
else raise Constant_not_found)
)
| Rest_of_line (th, arg) ->
if 1 > String.length s
then failwith "no words left to parse"
else (Some (th, Word s), "")
in
let rec parse s tokens acc =
match tokens with
| [] -> acc (* ignore case of unparsed words at end *)
| hd :: tl ->
let role, rest_of_line = do_match hd s in
match role with
| Some r -> parse rest_of_line tl (r :: acc)
| None -> parse rest_of_line tl (acc)
in
let get_pairs () =
let tmp = Str.bounded_split rxp line 2 in
let line' = match tmp with
| [] -> failwith "no words on line"
| [hd] -> failwith "only one word on line"
| hd :: tl ->
assert (List.length tl = 1);
List.hd tl
in
parse line' verb.v_args []
in
let m_o_r_a = mudobject_of_resolved_arg in
let w_o_r_a = word_of_resolved_arg in
match verb.v_frame with
| Unary cmd -> cmd ~actor
| Binary cmd ->
let pairs = get_pairs () in
let patient = m_o_r_a (List.assoc Patient pairs) in
cmd ~actor ~patient
| Ditrans cmd ->
let pairs = get_pairs () in
let patient = m_o_r_a (List.assoc Patient pairs) in
let instrument = m_o_r_a (List.assoc Instrument pairs) in
cmd ~actor ~patient ~instrument
| Binary_word cmd ->
let pairs = get_pairs () in
let word = w_o_r_a (List.assoc Patient pairs) in
cmd ~actor ~word
| Ditrans_word cmd ->
let pairs = get_pairs () in
let patient = m_o_r_a (List.assoc Patient pairs) in
let word = w_o_r_a (List.assoc Instrument pairs) in
cmd ~actor ~patient ~word
let guard_exceptions ~actor e =
let msg = "An error occurred: " ^ (Printexc.to_string e) in
let bt = Printexc.get_backtrace () in
Game.emitl actor msg;
print_endline bt;
print_endline msg;
flush_all ()
let guard_exceptions ~actor = function
| Persona.Spell_failed (Persona.Spell_not_available) ->
Game.emitl actor "You don't know that spell."
| Persona.Spell_failed (Persona.Restricted r) ->
Game.emitl actor "Restricted"
| e -> guard_exceptions ~actor e
| null | https://raw.githubusercontent.com/mk270/archipelago/4241bdc994da6d846637bcc079051405ee905c9b/src/server/parser.ml | ocaml |
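`exec` in parser.ml above peels one word off the command line at a time with `Str.bounded_split rxp s 2`. A rough equivalent of that split step in Haskell, the dump's other language (`splitFirst` is an invented name, and unlike `Str.bounded_split` it only trims plain spaces):

```haskell
import Data.Char (isSpace)

-- Split a line into (first word, rest), analogous to
-- Str.bounded_split (Str.regexp " +") s 2 in parser.ml.
splitFirst :: String -> (String, String)
splitFirst s =
  let (w, r) = break isSpace (dropWhile isSpace s)
  in (w, dropWhile isSpace r)

main :: IO ()
main = do
  print (splitFirst "take rusty sword")  -- ("take","rusty sword")
  print (splitFirst "look")              -- ("look","")
```

Applying `splitFirst` repeatedly to the second component walks the token list the same way `do_match` consumes its input.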
we now have all potentially searchable objects
ignore case of unparsed words at end |
Archipelago , a multi - user dungeon ( MUD ) server , by ( C ) 2009 - 2012
This programme is free software ; you may redistribute and/or modify
it under the terms of the GNU Affero General Public Licence as published by
the Free Software Foundation , either version 3 of said Licence , or
( at your option ) any later version .
Archipelago, a multi-user dungeon (MUD) server, by Martin Keegan
Copyright (C) 2009-2012 Martin Keegan
This programme is free software; you may redistribute and/or modify
it under the terms of the GNU Affero General Public Licence as published by
the Free Software Foundation, either version 3 of said Licence, or
(at your option) any later version.
*)
open Model
type search_space =
| Global
| Siblings
| Children
| ParentLinks
type match_style =
| ByName
| ByCode
type arg' =
| MO of match_style * mo_type list * search_space list
| ArgLiteral
type arg =
MO ( [ MO_Item ; ] , [ Children ; ] )
MO ( [ MO_Item ; ] , [ Siblings ; ] )
MO ( [ MO_Item ; ] , [ Children ; Siblings ; ] )
MO ( [ MO_Portal ; MO_Item ; ] , [ ParentLinks ; Children ; Siblings ; ] )
MO ( [ MO_Monster ; ] , [ Children ; Siblings ; ] )
MO ( [ MO_Item ; ] , [ Global ; ] )
ArgLiteral
| CurrentPlayer
| PlayerPresent
let search_pattern = function
| ItemCarried -> MO (ByName, [MO_Item;], [Children;])
| ItemInLocation -> MO (ByName, [MO_Item;], [Siblings;])
| ItemPresent -> MO (ByName, [MO_Item;], [Children; Siblings;])
| ItemOrPortalPresent ->
MO (ByName, [MO_Portal; MO_Item;], [ParentLinks; Children; Siblings;])
| LocationCode -> MO (ByCode, [MO_Room;], [Global])
| MonsterPresent -> MO (ByName, [MO_Monster;], [Children; Siblings;])
| ItemAnywhere -> MO (ByName, [MO_Item;], [Global;])
| CurrentPlayer -> MO (ByName, [MO_Player;], [Global;])
| Literal -> ArgLiteral
| PlayerPresent -> MO (ByName, [MO_Player;], [Siblings;])
type role =
| Patient
| Recipient
| Instrument
| Searchterm
type unary = actor : mudobject -> unit
type binary = actor : mudobject -> patient : mudobject -> unit
type binary_word = actor : mudobject -> word : string -> unit
type ditrans = actor : mudobject -> patient : mudobject ->
instrument : mudobject -> unit
type ditrans_word = actor : mudobject -> patient : mudobject -> word : string -> unit
type word =
| Var of role * arg
| Constant of string
| Rest_of_line of role * arg
type frame =
| Unary of unary
| Binary of binary
| Ditrans of ditrans
| Binary_word of binary_word
| Ditrans_word of ditrans_word
type resolved_arg =
| Mudobject of mudobject
| Word of string
type verb = {
v_name : string ;
v_args : word list ;
v_frame : frame ;
}
let delim_ws = Str.regexp " +"
let room_by_code name =
try let code = Loc_code.create name in
[ Search.room_by_code code; ]
with Loc_code.Invalid_loccode _ -> raise Not_found
let search_method name = function
| ByName -> Model.Props.match_name (Some name)
| ByCode -> let loc_code = Loc_code.create (String.uppercase name) in
Model.Props.match_loc_code (Some loc_code)
let objs_in_search_space ~actor = function
| Global -> Search.search_all ~ty:None ~name:None
| Siblings ->
List.filter (fun i -> i != actor) (Tree.children (Tree.parent actor))
| Children ->
Tree.children actor
| ParentLinks ->
Model.MudobjectSet.elements (Link.portals_in_room (Tree.parent actor))
let resolve_arg ~actor arg name =
let sp = search_pattern arg in
match sp with
| ArgLiteral -> Word name
| MO (meth, sought_types, search_spaces) ->
let f = search_method name meth in
let objs = fun i -> objs_in_search_space ~actor i in
let objects = List.map objs search_spaces in
let objects = List.flatten objects in
let objects = List.filter f objects in
let filtered = List.map (
fun ty -> List.filter (fun o -> Model.mudobj_ty_match o (Some ty)) objects
) sought_types in
let filtered = List.flatten filtered in
if List.length filtered = 0
then raise Not_found
else Mudobject (List.hd filtered)
let mudobject_of_resolved_arg = function
| Mudobject mo -> mo
| _ -> failwith "Mudobject expected"
let word_of_resolved_arg = function
| Word s -> s
| _ -> failwith "Word expected"
exception Too_many_words
exception Constant_not_found
let exec verb ~actor line =
let rxp = Str.regexp " +" in
let do_match token s =
match token with
| Var (th, arg) ->
let words' = Str.bounded_split rxp s 2 in
let obj = resolve_arg ~actor arg (List.hd words') in
(match words' with
| [] -> failwith "no words left to parse"
| [hd] -> (Some (th, obj), "")
| hd :: tl ->
(Some (th, obj), List.hd tl))
| Constant s' ->
let words' = Str.bounded_split rxp s 2 in
(match words' with
| [] -> failwith "no worlds left to parse"
| [hd] ->
(if Utils.Ext.initial_match hd s'
then (None, "")
else raise Constant_not_found)
| hd :: tl ->
(if Utils.Ext.initial_match hd s'
then (None, List.hd tl)
else raise Constant_not_found)
)
| Rest_of_line (th, arg) ->
if 1 > String.length s
then failwith "no words left to parse"
else (Some (th, Word s), "")
in
  let rec parse s tokens acc =
    match tokens with
    | [] -> acc  (* all tokens consumed: return the accumulated pairs *)
    | hd :: tl ->
let role, rest_of_line = do_match hd s in
match role with
| Some r -> parse rest_of_line tl (r :: acc)
| None -> parse rest_of_line tl (acc)
in
let get_pairs () =
let tmp = Str.bounded_split rxp line 2 in
let line' = match tmp with
| [] -> failwith "no words on line"
| [hd] -> failwith "only one word on line"
| hd :: tl ->
assert (List.length tl = 1);
List.hd tl
in
parse line' verb.v_args []
in
let m_o_r_a = mudobject_of_resolved_arg in
let w_o_r_a = word_of_resolved_arg in
match verb.v_frame with
| Unary cmd -> cmd ~actor
| Binary cmd ->
let pairs = get_pairs () in
let patient = m_o_r_a (List.assoc Patient pairs) in
cmd ~actor ~patient
| Ditrans cmd ->
let pairs = get_pairs () in
let patient = m_o_r_a (List.assoc Patient pairs) in
let instrument = m_o_r_a (List.assoc Instrument pairs) in
cmd ~actor ~patient ~instrument
| Binary_word cmd ->
let pairs = get_pairs () in
let word = w_o_r_a (List.assoc Patient pairs) in
cmd ~actor ~word
| Ditrans_word cmd ->
let pairs = get_pairs () in
let patient = m_o_r_a (List.assoc Patient pairs) in
let word = w_o_r_a (List.assoc Instrument pairs) in
cmd ~actor ~patient ~word
let guard_exceptions ~actor e =
let msg = "An error occurred: " ^ (Printexc.to_string e) in
let bt = Printexc.get_backtrace () in
Game.emitl actor msg;
print_endline bt;
print_endline msg;
flush_all ()
let guard_exceptions ~actor = function
| Persona.Spell_failed (Persona.Spell_not_available) ->
Game.emitl actor "You don't know that spell."
| Persona.Spell_failed (Persona.Restricted r) ->
Game.emitl actor "Restricted"
| e -> guard_exceptions ~actor e
|
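The verb-resolution code above gathers candidate objects from several search spaces, flattens them, filters by name and then by the sought types, and takes the first survivor. As an illustration only — with stand-in types instead of the file's real Model/Tree/Search modules, and with every name below hypothetical — the shape of that pipeline is:

```ocaml
(* Illustrative sketch only: stand-in types, not the real Model/Tree/Search
   modules.  Candidates from every search space are flattened, filtered by
   name, then filtered by wanted type; the first survivor wins. *)
type obj = { name : string; ty : string }

let resolve ~spaces ~wanted_types sought =
  let candidates = List.flatten spaces in
  let named = List.filter (fun o -> o.name = sought) candidates in
  let typed =
    List.flatten
      (List.map (fun ty -> List.filter (fun o -> o.ty = ty) named) wanted_types)
  in
  match typed with
  | [] -> raise Not_found
  | o :: _ -> o

let () =
  let room = [ { name = "sword"; ty = "item" }; { name = "orc"; ty = "monster" } ] in
  let pack = [ { name = "lamp"; ty = "item" } ] in
  let o = resolve ~spaces:[ room; pack ] ~wanted_types:[ "item" ] "sword" in
  print_endline o.name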
3ad817af22cd1395340ef426c339d005a4553ddfde57dda87db4cbd03bea082e | kelamg/HtDP2e-workthrough | ex282.rkt | ;; The first three lines of this file were inserted by DrRacket. They record metadata
;; about the language level of this file in a form that our tools can easily process.
#reader(lib "htdp-intermediate-lambda-reader.ss" "lang")((modname ex282) (read-case-sensitive #t) (teachpacks ()) (htdp-settings #(#t constructor repeating-decimal #f #t none #f () #f)))
(define (f-plain x)
(* 10 x))
(define f-lambda
(lambda (x)
(* 10 x)))
; Number -> Boolean
(define (compare x)
(= (f-plain x) (f-lambda x)))
(define (loop x n)
(cond
[(= n 0) #true]
[else (and (compare x)
(loop x (sub1 n)))]))
; Run 100 times
(loop 100000 100) | null | https://raw.githubusercontent.com/kelamg/HtDP2e-workthrough/ec05818d8b667a3c119bea8d1d22e31e72e0a958/HtDP/Abstraction/ex282.rkt | racket |
47d336c65d7b31bc6494b1ce94d22ff44f494a038dfb1af81b13dc5d6966b818 | OCamlPro/ocamlexc | tbl.ml | (***********************************************************************)
(* *)
(* Objective Caml *)
(* *)
(*            Xavier Leroy, projet Cristal, INRIA Rocquencourt         *)
(* *)
(*  Copyright 1996 Institut National de Recherche en Informatique et   *)
(*  en Automatique.  Distributed only by permission.                   *)
(* *)
(***********************************************************************)
(* $Id: tbl.ml,v 1.2 1999/02/15 15:00:35 pessaux Exp $ *)
type ('a, 'b) t =
Empty
| Node of ('a, 'b) t * 'a * 'b * ('a, 'b) t * int
let empty = Empty
let height = function
Empty -> 0
| Node(_,_,_,_,h) -> h
let create l x d r =
let hl = height l and hr = height r in
Node(l, x, d, r, (if hl >= hr then hl + 1 else hr + 1))
let bal l x d r =
let hl = height l and hr = height r in
if hl > hr + 1 then
match l with
| Node (ll, lv, ld, lr, _) when height ll >= height lr ->
create ll lv ld (create lr x d r)
| Node (ll, lv, ld, Node (lrl, lrv, lrd, lrr, _), _) ->
create (create ll lv ld lrl) lrv lrd (create lrr x d r)
| _ -> assert false
else if hr > hl + 1 then
match r with
| Node (rl, rv, rd, rr, _) when height rr >= height rl ->
create (create l x d rl) rv rd rr
| Node (Node (rll, rlv, rld, rlr, _), rv, rd, rr, _) ->
create (create l x d rll) rlv rld (create rlr rv rd rr)
| _ -> assert false
else
create l x d r
let rec add x data = function
Empty ->
Node(Empty, x, data, Empty, 1)
| Node(l, v, d, r, h) ->
let c = compare x v in
if c = 0 then
Node(l, x, data, r, h)
else if c < 0 then
bal (add x data l) v d r
else
bal l v d (add x data r)
let rec find x = function
Empty ->
raise Not_found
| Node(l, v, d, r, _) ->
let c = compare x v in
if c = 0 then d
else find x (if c < 0 then l else r)
let rec mem x = function
Empty -> false
| Node(l, v, _, r, _) ->
let c = compare x v in
c = 0 || mem x (if c < 0 then l else r)
let rec merge t1 t2 =
match (t1, t2) with
(Empty, t) -> t
| (t, Empty) -> t
| (Node(l1, v1, d1, r1, _), Node(l2, v2, d2, r2, _)) ->
bal l1 v1 d1 (bal (merge r1 l2) v2 d2 r2)
let rec remove x = function
Empty ->
Empty
| Node(l, v, d, r, _) ->
let c = compare x v in
if c = 0 then
merge l r
else if c < 0 then
bal (remove x l) v d r
else
bal l v d (remove x r)
let rec iter f = function
Empty -> ()
| Node(l, v, d, r, _) ->
iter f l; f v d; iter f r
open Format
let print print_key print_data tbl =
open_hvbox 2;
print_string "[[";
iter (fun k d ->
open_box 2;
print_key k; print_string " ->"; print_space();
print_data d; print_string ";";
close_box(); print_space())
tbl;
print_string "]]";
close_box()
| null | https://raw.githubusercontent.com/OCamlPro/ocamlexc/a9ddcfff6f376f5dd6c5fcc3211b8f89e36f79fe/utils/tbl.ml | ocaml |
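For illustration, here is a small exercise of the balanced-tree association table defined in tbl.ml above, written as if appended to that file (`empty`, `add`, `find`, `mem`, `remove` and `iter` are the functions the file defines). This is a sketch, not part of the original source:

```ocaml
(* Illustrative only: written as if appended to tbl.ml, using the
   polymorphic-comparison table operations defined above. *)
let () =
  let t = add "one" 1 (add "two" 2 (add "three" 3 empty)) in
  assert (find "two" t = 2);        (* lookup by key *)
  assert (mem "three" t);
  let t' = remove "two" t in
  assert (not (mem "two" t'));      (* removal rebalances via merge/bal *)
  iter (fun k v -> Printf.printf "%s -> %d\n" k v) t'
```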
f5874a98c2303bdf2d6e865c4ad433e1a67cb41afce1cbf27d9acd80f722a538 | aryx/fork-efuns | color.mli |
(* helper to colorize a buffer *)
val color: Efuns.buffer -> Str.regexp -> bool (* strict *) -> Text.attribute ->
unit
(* to be set by each major programming mode *)
val color_func: (Efuns.buffer -> Text.point -> Text.point -> unit) Var.t
val color_region: Efuns.action
val color_buffer: Efuns.action
(* to be used in 'install' of a mode *)
val color_buffer_buf: Efuns.buffer -> unit
| null | https://raw.githubusercontent.com/aryx/fork-efuns/8f2f8f66879d45e26ecdca0033f9c92aec2b783d/features/color.mli | ocaml | helper to colorize a buffer
strict
to be set by each major programming mode
to be used in 'install' of a mode |
unit
val color_func: (Efuns.buffer -> Text.point -> Text.point -> unit) Var.t
val color_region: Efuns.action
val color_buffer: Efuns.action
val color_buffer_buf: Efuns.buffer -> unit
|
ac7519840c4ba4207a34c23ecca5470c01e331945c937be2bb667f56385a860b | ekmett/indexed | Product.hs | {-# LANGUAGE PolyKinds #-}
{-# LANGUAGE TypeOperators #-}
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE GADTs #-}
-----------------------------------------------------------------------------
-- |
-- Module : Indexed.Product
-- Copyright   :  (C) 2012 Edward Kmett
-- License : BSD-style (see the file LICENSE)
-- Maintainer  :  Edward Kmett <ekmett@gmail.com>
-- Stability : experimental
-- Portability : non-portable
--
-- Products of indexed functors
-----------------------------------------------------------------------------
module Indexed.Product
( (*)(..)
, ifst
, isnd
) where
import Control.Applicative
import Data.Monoid
import Indexed.Functor
-- import Indexed.Types
import Indexed.Foldable
import Indexed.Traversable
import Indexed.Monoid
-- | Indexed functor product
data (*) :: ((x -> *) -> y -> *) -> ((x -> *) -> y -> *) -> (x -> *) -> y -> * where
(:*) :: f a i -> g a i -> (f * g) a i
ifst :: (f * g) a i -> f a i
ifst (a :* _) = a
isnd :: (f * g) a i -> g a i
isnd (_ :* b) = b
instance (IFunctor f, IFunctor g) => IFunctor (f * g) where
imap f (a :* b) = imap f a :* imap f b
instance (IFoldable f, IFoldable g) => IFoldable (f * g) where
ifoldMap f (a :* b) = ifoldMap f a <> ifoldMap f b
instance (ITraversable f, ITraversable g) => ITraversable (f * g) where
itraverse f (a :* b) = (:*) <$> itraverse f a <*> itraverse f b
instance (IApplicative f, IApplicative g) => IApplicative (f * g) where
ireturn a = ireturn a :* ireturn a
(af :* bf) /*/ (aa :* ba) = (af /*/ aa) :* (bf /*/ ba)
instance (IMonad f, IMonad g) => IMonad (f * g) where
ibind f (a :* b) = ibind (ifst . f) a :* ibind (isnd . f) b
-- | Foldable product
data (&) :: (((i,i) -> *) -> (j,j) -> *) -> (((i,i) -> *) -> (j,j) -> *) -> ((i,i) -> *) -> (j,j) -> * where
(:&) :: f a '(x,y) -> g a '(y,z) -> (f & g) a '(x,z)
instance (IFunctor f, IFunctor g) => IFunctor (f & g) where
imap f (a :& b) = imap f a :& imap f b
instance (IFoldable f, IFoldable g) => IFoldable (f & g) where
ifoldMap f (a :& b) = ifoldMap f a <> ifoldMap f b
instance (IIFoldable f, IIFoldable g) => IIFoldable (f & g) where
iifoldMap f (a :& b) = iifoldMap f a >< iifoldMap f b
| null | https://raw.githubusercontent.com/ekmett/indexed/331b5dd12eee9dfa89d8bf2dda18dce04030167b/src/Indexed/Product.hs | haskell |
9374d54c2b554c4a2bae424e56b43d60ea6e74bb5026ab4c29402bbb004824ff | huangjs/cl | diffop.lisp | ;;; -*- Lisp -*-
;;; DIFFOP: A Library for making ' a more useful character in
;;;
;;; Loading this file sets things up so that you can do
;;;
;;; DEPENDS(F,X); => [F(X)]
;;;
;;; F'; => dF/dX
;;;
;;;                       3
;;;  F'3; or F''';  =>   d F
;;;                     -----
;;;                        3
;;;                      dX
;;;
;;; If a variable has more than one DEPENDS property, the variable
;;; which will be used is undefined.
;;; If a variable has no DEPENDS property, UND will be used as the
;;; variable to differentiate by.
(DEFUN INFER-DEPENDENCY (X)
(OR (CAR (MGET (CADR ($LISTOFVARS X)) 'DEPENDS)) '$UND))
(DECLARE (SPECIAL STRING))
;Makes awfully big assumptions about the internals of GRAM
(DEFUN PARSE-PRIME (OP LEFT)
(SETQ LEFT (CDR LEFT))
(CONS '$ANY
(LIST '($DIFF)
LEFT
(INFER-DEPENDENCY LEFT)
(+ -1
(FLATC OP)
(COND ((AND STRING (NUMBERP (CAR STRING)))
(POP STRING))
(T 0))))))
(DEFPROP $/' 195. LBP)
(DEFPROP $/' PARSE-PRIME LED)
(DEFPROP $/'/' 195. LBP)
(DEFPROP $/'/' PARSE-PRIME LED)
| null | https://raw.githubusercontent.com/huangjs/cl/96158b3f82f82a6b7d53ef04b3b29c5c8de2dbf7/lib/maxima/share/misc/diffop.lisp | lisp | -*- Lisp -*-
Loading this file sets things up so that you can do
DEPENDS(F,X); => [F(X)]
F'; => dF/dX
or F '' ' ; = > d F
-----
3
dX
which will be used is undefined.
variable to differentiate by.
Makes awfully big assumptions about the internals of GRAM | DIFFOP : A Library for making ' a more useful character in
3
If a variable has more than one DEPENDS property , the variable
If a variable has no DEPENDS property , UND will be used as the
(DEFUN INFER-DEPENDENCY (X)
(OR (CAR (MGET (CADR ($LISTOFVARS X)) 'DEPENDS)) '$UND))
(DECLARE (SPECIAL STRING))
(DEFUN PARSE-PRIME (OP LEFT)
(SETQ LEFT (CDR LEFT))
(CONS '$ANY
(LIST '($DIFF)
LEFT
(INFER-DEPENDENCY LEFT)
(+ -1
(FLATC OP)
(COND ((AND STRING (NUMBERP (CAR STRING)))
(POP STRING))
(T 0))))))
(DEFPROP $/' 195. LBP)
(DEFPROP $/' PARSE-PRIME LED)
(DEFPROP $/'/' 195. LBP)
(DEFPROP $/'/' PARSE-PRIME LED)
|
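The diffop.lisp header describes the prime notation it parses: F' is dF/dX, and F'3 or F''' is the third derivative. As an illustration only — a hypothetical standalone OCaml function, not part of the Maxima source — the surface rule the comments describe (count the quotes, but let an explicit trailing numeral give the order) can be sketched as:

```ocaml
(* Illustrative only: the prime-notation rule described in diffop.lisp's
   header comments.  "F'" is order 1, "F'''" is order 3, and an explicit
   trailing numeral as in "F'3" gives the order directly. *)
let parse_prime s =
  let n = String.length s in
  let i = String.index s '\'' in                  (* position of first quote *)
  let name = String.sub s 0 i in
  let rec quotes j = if j < n && s.[j] = '\'' then quotes (j + 1) else j in
  let j = quotes i in
  let order =
    if j < n then int_of_string (String.sub s j (n - j))  (* explicit numeral *)
    else j - i                                            (* count of quotes *)
  in
  (name, order)

let () =
  assert (parse_prime "F'" = ("F", 1));
  assert (parse_prime "F'''" = ("F", 3));
  assert (parse_prime "F'3" = ("F", 3))
```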
83bd28009c6255b7db0ce561ec6c99a753350bcb2f11251f58abce55e8698cd9 | jacquev6/DrawGrammar | DrawingTests.ml | (* Copyright 2017 Vincent Jacques < > *)
open General.Abbr
open DrawGrammar
let single_rule_grammars =
Grammar.[
("terminal", terminal "in a rounded rectangle");
("token", token "in a rounded rectangle");
("rule with a name longer than its definition", terminal "short");
("non-terminal", non_terminal "in a rectangle");
("special", special "in an octogon");
("sequence", sequence [terminal "t1"; non_terminal "nt"; terminal "t2"]);
("alternative", alternative [terminal "short"; non_terminal "longestttttt"; terminal "medium"]);
("alternative with null branch", alternative [null; terminal "t"]);
("sequence with null", sequence [terminal "a"; null; terminal "b"; terminal "c"]);
("repetition with long forward branch", repetition (terminal "long branch") (terminal "short"));
("repetition with long backward branch", repetition (terminal "short") (terminal "long branch"));
("repetition with null forward branch", repetition null (terminal "t"));
("repetition with null backward branch", repetition (terminal "t") null);
("nested alternatives", alternative [
alternative [terminal "t1"; terminal "t2"];
alternative [terminal "t3"; terminal "t4"];
]);
("nested sequences", sequence [
sequence [terminal "t1"; terminal "t2"];
sequence [terminal "t3"; terminal "t4"];
]);
("nested repetitions", repetition
(repetition (terminal "forward 1") (terminal "backward 1"))
(repetition (terminal "forward 2") (terminal "backward 2"))
);
("repetitions in sequence", sequence [
repetition (terminal "forward 1") (terminal "backward 1");
repetition (terminal "forward 2") (terminal "backward 2");
]);
("exceptions in sequence", sequence [
except (terminal "base 1") (terminal "except 1");
except (terminal "base 2") (terminal "except 2");
]);
("alternatives in sequence", sequence [
alternative [terminal "t1"; terminal "t2"];
alternative [terminal "t3"; terminal "t4"];
]);
("alternatives in repetition", repetition
(alternative [terminal "t1"; terminal "t2"])
(alternative [terminal "t3"; terminal "t4"])
);
("repetitions in alternative", alternative [
repetition (terminal "forward 1") (terminal "backward 1");
repetition (terminal "forward 2") (terminal "backward 2");
]);
("exception with long base", except (terminal "long base branch") (terminal "except"));
("exception with long except", except (terminal "base") (terminal "long except branch"));
("range with long bottom", range (terminal "short") (terminal "to quite long branch"));
("range with long top", range (terminal "quite a bit longer") (terminal "to short"));
("short range", range (terminal "a") (terminal "z"));
("ranges in repetition", repetition
(range (terminal "min 1") (terminal "max 1"))
(range (terminal "min 2") (terminal "max 2"))
);
]
|> Li.map ~f:(fun (name, definition) ->
(name, Grammar.(grammar [rule name definition]))
)
let tests =
single_rule_grammars @ Grammar.[
(
"several rules",
grammar [
rule "rule1" (terminal "t1");
rule "rule2" (terminal "t2");
]
);
]
| null | https://raw.githubusercontent.com/jacquev6/DrawGrammar/ee056a086ca0d8b18889fa06883287fda84807c3/src/DrawingTests.ml | ocaml |
7498cfa9c1baa4ebd4fb7db050354b142d2bec554411a66147f5af6b161dd5f4 | mejgun/haskell-tdlib | GroupCallParticipantVideoInfo.hs | {-# LANGUAGE OverloadedStrings #-}
-- |
module TD.Data.GroupCallParticipantVideoInfo where
import qualified Data.Aeson as A
import qualified Data.Aeson.Types as T
import qualified TD.Data.GroupCallVideoSourceGroup as GroupCallVideoSourceGroup
import qualified Utils as U
-- |
data GroupCallParticipantVideoInfo = -- | Contains information about a group call participant's video channel
GroupCallParticipantVideoInfo
{ -- | True, if the video is paused. This flag needs to be ignored, if new video frames are received
is_paused :: Maybe Bool,
-- | Video channel endpoint identifier
endpoint_id :: Maybe String,
-- | List of synchronization source groups of the video
source_groups :: Maybe [GroupCallVideoSourceGroup.GroupCallVideoSourceGroup]
}
deriving (Eq)
instance Show GroupCallParticipantVideoInfo where
show
GroupCallParticipantVideoInfo
{ is_paused = is_paused_,
endpoint_id = endpoint_id_,
source_groups = source_groups_
} =
"GroupCallParticipantVideoInfo"
++ U.cc
[ U.p "is_paused" is_paused_,
U.p "endpoint_id" endpoint_id_,
U.p "source_groups" source_groups_
]
instance T.FromJSON GroupCallParticipantVideoInfo where
parseJSON v@(T.Object obj) = do
t <- obj A..: "@type" :: T.Parser String
case t of
"groupCallParticipantVideoInfo" -> parseGroupCallParticipantVideoInfo v
_ -> mempty
where
parseGroupCallParticipantVideoInfo :: A.Value -> T.Parser GroupCallParticipantVideoInfo
parseGroupCallParticipantVideoInfo = A.withObject "GroupCallParticipantVideoInfo" $ \o -> do
is_paused_ <- o A..:? "is_paused"
endpoint_id_ <- o A..:? "endpoint_id"
source_groups_ <- o A..:? "source_groups"
return $ GroupCallParticipantVideoInfo {is_paused = is_paused_, endpoint_id = endpoint_id_, source_groups = source_groups_}
parseJSON _ = mempty
instance T.ToJSON GroupCallParticipantVideoInfo where
toJSON
GroupCallParticipantVideoInfo
{ is_paused = is_paused_,
endpoint_id = endpoint_id_,
source_groups = source_groups_
} =
A.object
[ "@type" A..= T.String "groupCallParticipantVideoInfo",
"is_paused" A..= is_paused_,
"endpoint_id" A..= endpoint_id_,
"source_groups" A..= source_groups_
]
| null | https://raw.githubusercontent.com/mejgun/haskell-tdlib/dc380d18d49eaadc386a81dc98af2ce00f8797c2/src/TD/Data/GroupCallParticipantVideoInfo.hs | haskell |
195c56674cde7e12f78bed31e0c007b78754abbae68e69b94b653f3b07da66e9 | steshaw/PLAR | tactics.ml | (* ========================================================================= *)
(* Goals, LCF-like tactics and Mizar-like proofs.                            *)
(*                                                                           *)
(* Copyright (c) 2003-2007, John Harrison. (See "LICENSE.txt" for details.)  *)
(* ========================================================================= *)
type goals =
Goals of ((string * fol formula) list * fol formula)list *
(thm list -> thm);;
(* ------------------------------------------------------------------------- *)
(* Printer for goals (just shows first goal plus total number).              *)
(* ------------------------------------------------------------------------- *)
let print_goal =
let print_hyp (l,fm) =
open_hbox(); print_string(l^":"); print_space();
print_formula print_atom fm; print_newline(); close_box() in
fun (Goals(gls,jfn)) ->
match gls with
(asl,w)::ogls ->
print_newline();
(if ogls = [] then print_string "1 subgoal:" else
(print_int (length gls);
print_string " subgoals starting with"));
print_newline();
do_list print_hyp (rev asl);
print_string "---> ";
open_hvbox 0; print_formula print_atom w; close_box();
print_newline()
| [] -> print_string "No subgoals";;
#install_printer print_goal;;
(* ------------------------------------------------------------------------- *)
(* Setting up goals and terminating them in a theorem. *)
(* ------------------------------------------------------------------------- *)
let set_goal p =
let chk th = if concl th = p then th else failwith "wrong theorem" in
Goals([[],p],fun [th] -> chk(modusponens th truth));;
let extract_thm gls =
match gls with
Goals([],jfn) -> jfn []
| _ -> failwith "extract_thm: unsolved goals";;
let tac_proof g prf = extract_thm(itlist (fun f -> f) (rev prf) g);;
let prove p prf = tac_proof (set_goal p) prf;;
(* ------------------------------------------------------------------------- *)
(* Conjunction introduction tactic. *)
(* ------------------------------------------------------------------------- *)
let conj_intro_tac (Goals((asl,And(p,q))::gls,jfn)) =
let jfn' (thp::thq::ths) =
jfn(imp_trans_chain [thp; thq] (and_pair p q)::ths) in
Goals((asl,p)::(asl,q)::gls,jfn');;
(* ------------------------------------------------------------------------- *)
(* Handy idiom for tactic that does not split subgoals. *)
(* ------------------------------------------------------------------------- *)
let jmodify jfn tfn (th::oths) = jfn(tfn th :: oths);;
(* ------------------------------------------------------------------------- *)
(* Version of gen_right with a bound variable change. *)
(* ------------------------------------------------------------------------- *)
let gen_right_alpha y x th =
let th1 = gen_right y th in
imp_trans th1 (alpha x (consequent(concl th1)));;
(* ------------------------------------------------------------------------- *)
(* Universal introduction.                                                   *)
(* ------------------------------------------------------------------------- *)
let forall_intro_tac y (Goals((asl,(Forall(x,p) as fm))::gls,jfn)) =
if mem y (fv fm) or exists (mem y ** fv ** snd) asl
then failwith "fix: variable already free in goal" else
Goals((asl,subst(x |=> Var y) p)::gls,
jmodify jfn (gen_right_alpha y x));;
(* ------------------------------------------------------------------------- *)
(* Another inference rule: |- P[t] ==> exists x. P[x]                        *)
(* ------------------------------------------------------------------------- *)
let right_exists x t p =
let th = contrapos(ispec t (Forall(x,Not p))) in
let Not(Not p') = antecedent(concl th) in
end_itlist imp_trans
[imp_contr p' False; imp_add_concl False (iff_imp1 (axiom_not p'));
iff_imp2(axiom_not (Not p')); th; iff_imp2(axiom_exists x p)];;
(* ------------------------------------------------------------------------- *)
(* Existential introduction. *)
(* ------------------------------------------------------------------------- *)
let exists_intro_tac t (Goals((asl,Exists(x,p))::gls,jfn)) =
Goals((asl,subst(x |=> t) p)::gls,
jmodify jfn (fun th -> imp_trans th (right_exists x t p)));;
(* ------------------------------------------------------------------------- *)
(* Implication introduction tactic. *)
(* ------------------------------------------------------------------------- *)
let imp_intro_tac s (Goals((asl,Imp(p,q))::gls,jfn)) =
let jmod = if asl = [] then add_assum True else imp_swap ** shunt in
Goals(((s,p)::asl,q)::gls,jmodify jfn jmod);;
(* ------------------------------------------------------------------------- *)
(* Append contextual hypothesis to unconditional theorem. *)
(* ------------------------------------------------------------------------- *)
let assumptate (Goals((asl,w)::gls,jfn)) th =
add_assum (list_conj (map snd asl)) th;;
(* ------------------------------------------------------------------------- *)
(* Get the first assumption (quicker than head of assumps result).          *)
(* ------------------------------------------------------------------------- *)
let firstassum asl =
let p = snd(hd asl) and q = list_conj(map snd (tl asl)) in
if tl asl = [] then imp_refl p else and_left p q;;
(* ------------------------------------------------------------------------- *)
(* Import "external" theorem. *)
(* ------------------------------------------------------------------------- *)
let using ths p g =
let ths' = map (fun th -> itlist gen (fv(concl th)) th) ths in
map (assumptate g) ths';;
(* ------------------------------------------------------------------------- *)
(* Turn assumptions p1,...,pn into theorems |- p1 /\ ... /\ pn ==> pi *)
(* ------------------------------------------------------------------------- *)
let rec assumps asl =
match asl with
[] -> []
| [l,p] -> [l,imp_refl p]
| (l,p)::lps ->
let ths = assumps lps in
let q = antecedent(concl(snd(hd ths))) in
let rth = and_right p q in
(l,and_left p q)::map (fun (l,th) -> l,imp_trans rth th) ths;;
(* ------------------------------------------------------------------------- *)
(* Produce canonical theorem from list of theorems or assumption labels. *)
(* ------------------------------------------------------------------------- *)
let by hyps p (Goals((asl,w)::gls,jfn)) =
let ths = assumps asl in map (fun s -> assoc s ths) hyps;;
(* ------------------------------------------------------------------------- *)
(* Main automatic justification step.                                        *)
(* ------------------------------------------------------------------------- *)
let justify byfn hyps p g =
match byfn hyps p g with
[th] when consequent(concl th) = p -> th
| ths ->
let th = lcffol(itlist (mk_imp ** consequent ** concl) ths p) in
if ths = [] then assumptate g th else imp_trans_chain ths th;;
(* ------------------------------------------------------------------------- *)
(* Nested subproof. *)
(* ------------------------------------------------------------------------- *)
let proof tacs p (Goals((asl,w)::gls,jfn)) =
[tac_proof (Goals([asl,p],fun [th] -> th)) tacs];;
(* ------------------------------------------------------------------------- *)
(* Trivial justification, producing no hypotheses. *)
(* ------------------------------------------------------------------------- *)
let at once p gl = [] and once = [];;
(* ------------------------------------------------------------------------- *)
(* Hence an automated terminal tactic. *)
(* ------------------------------------------------------------------------- *)
let auto_tac byfn hyps (Goals((asl,w)::gls,jfn) as g) =
let th = justify byfn hyps w g in
Goals(gls,fun ths -> jfn(th::ths));;
(* ------------------------------------------------------------------------- *)
(* A "lemma" tactic. *)
(* ------------------------------------------------------------------------- *)
let lemma_tac s p byfn hyps (Goals((asl,w)::gls,jfn) as g) =
let tr = imp_trans(justify byfn hyps p g) in
let mfn = if asl = [] then tr else imp_unduplicate ** tr ** shunt in
Goals(((s,p)::asl,w)::gls,jmodify jfn mfn);;
(* ------------------------------------------------------------------------- *)
(* Elimination tactic for existential quantification. *)
(* ------------------------------------------------------------------------- *)
let exists_elim_tac l fm byfn hyps (Goals((asl,w)::gls,jfn) as g) =
let Exists(x,p) = fm in
if exists (mem x ** fv) (w::map snd asl)
then failwith "exists_elim_tac: variable free in assumptions" else
let th = justify byfn hyps (Exists(x,p)) g in
let jfn' pth =
imp_unduplicate(imp_trans th (exists_left x (shunt pth))) in
Goals(((l,p)::asl,w)::gls,jmodify jfn jfn');;
(* ------------------------------------------------------------------------- *)
(* If |- p ==> r and |- q ==> r then |- p \/ q ==> r                         *)
(* ------------------------------------------------------------------------- *)
let ante_disj th1 th2 =
let p,r = dest_imp(concl th1) and q,s = dest_imp(concl th2) in
let ths = map contrapos [th1; th2] in
let th3 = imp_trans_chain ths (and_pair (Not p) (Not q)) in
let th4 = contrapos(imp_trans (iff_imp2(axiom_not r)) th3) in
let th5 = imp_trans (iff_imp1(axiom_or p q)) th4 in
right_doubleneg(imp_trans th5 (iff_imp1(axiom_not(Imp(r,False)))));;
(* ------------------------------------------------------------------------- *)
(* Elimination tactic for disjunction. *)
(* ------------------------------------------------------------------------- *)
let disj_elim_tac l fm byfn hyps (Goals((asl,w)::gls,jfn) as g) =
let th = justify byfn hyps fm g and Or(p,q) = fm in
let jfn' (pth::qth::ths) =
let th1 = imp_trans th (ante_disj (shunt pth) (shunt qth)) in
jfn(imp_unduplicate th1::ths) in
Goals(((l,p)::asl,w)::((l,q)::asl,w)::gls,jfn');;
(* ------------------------------------------------------------------------- *)
(* A simple example. *)
(* ------------------------------------------------------------------------- *)
START_INTERACTIVE;;
let g0 = set_goal
<<(forall x. x <= x) /\
(forall x y z. x <= y /\ y <= z ==> x <= z) /\
(forall x y. f(x) <= y <=> x <= g(y))
==> (forall x y. x <= y ==> f(x) <= f(y)) /\
(forall x y. x <= y ==> g(x) <= g(y))>>;;
let g1 = imp_intro_tac "ant" g0;;
let g2 = conj_intro_tac g1;;
let g3 = funpow 2 (auto_tac by ["ant"]) g2;;
extract_thm g3;;
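(* Illustrative sketch, not part of the original text: g3 above is built
   with funpow, which iterates a function n times. A self-contained toy
   copy, renamed funpow_sketch so it cannot shadow the library's funpow: *)

```ocaml
(* funpow_sketch n f x applies f to x n times; a hypothetical local copy
   of the library's funpow, for illustration only. *)
let rec funpow_sketch n f x =
  if n < 1 then x else funpow_sketch (n - 1) f (f x);;

(* Three doublings of 1 give 8. *)
let eight = funpow_sketch 3 (fun n -> n * 2) 1;;
```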
(* ------------------------------------------------------------------------- *)
(* All packaged up together. *)
(* ------------------------------------------------------------------------- *)
prove <<(forall x. x <= x) /\
(forall x y z. x <= y /\ y <= z ==> x <= z) /\
(forall x y. f(x) <= y <=> x <= g(y))
==> (forall x y. x <= y ==> f(x) <= f(y)) /\
(forall x y. x <= y ==> g(x) <= g(y))>>
[imp_intro_tac "ant";
conj_intro_tac;
auto_tac by ["ant"];
auto_tac by ["ant"]];;
END_INTERACTIVE;;
(* ------------------------------------------------------------------------- *)
(* Declarative proof. *)
(* ------------------------------------------------------------------------- *)
let multishunt i th =
let th1 = imp_swap(funpow i (imp_swap ** shunt) th) in
imp_swap(funpow (i-1) (unshunt ** imp_front 2) th1);;
let assume lps (Goals((asl,Imp(p,q))::gls,jfn)) =
if end_itlist mk_and (map snd lps) <> p then failwith "assume" else
let jfn' th = if asl = [] then add_assum True th
else multishunt (length lps) th in
Goals((lps@asl,q)::gls,jmodify jfn jfn');;
let note (l,p) = lemma_tac l p;;
let have p = note("",p);;
let so tac arg byfn =
tac arg (fun hyps p (Goals((asl,w)::_,_) as gl) ->
firstassum asl :: byfn hyps p gl);;
let fix = forall_intro_tac;;
let consider (x,p) = exists_elim_tac "" (Exists(x,p));;
let take = exists_intro_tac;;
let cases = disj_elim_tac "";;
(* ------------------------------------------------------------------------- *)
(* Thesis modification. *)
(* ------------------------------------------------------------------------- *)
let conclude p byfn hyps (Goals((asl,w)::gls,jfn) as gl) =
let th = justify byfn hyps p gl in
if p = w then Goals((asl,True)::gls,jmodify jfn (fun _ -> th)) else
let p',q = dest_and w in
if p' <> p then failwith "conclude: bad conclusion" else
let mfn th' = imp_trans_chain [th; th'] (and_pair p q) in
Goals((asl,q)::gls,jmodify jfn mfn);;
(* ------------------------------------------------------------------------- *)
(* A useful shorthand for solving the whole goal. *)
(* ------------------------------------------------------------------------- *)
let our thesis byfn hyps (Goals((asl,w)::gls,jfn) as gl) =
conclude w byfn hyps gl
and thesis = "";;
(* ------------------------------------------------------------------------- *)
(* Termination. *)
(* ------------------------------------------------------------------------- *)
let qed (Goals((asl,w)::gls,jfn) as gl) =
if w = True then Goals(gls,fun ths -> jfn(assumptate gl truth :: ths))
else failwith "qed: non-trivial goal";;
(* ------------------------------------------------------------------------- *)
(* A simple example. *)
(* ------------------------------------------------------------------------- *)
START_INTERACTIVE;;
let ewd954 = prove
<<(forall x y. x <= y <=> x * y = x) /\
(forall x y. f(x * y) = f(x) * f(y))
==> forall x y. x <= y ==> f(x) <= f(y)>>
[note("eq_sym",<<forall x y. x = y ==> y = x>>)
using [eq_sym <<|x|>> <<|y|>>];
note("eq_trans",<<forall x y z. x = y /\ y = z ==> x = z>>)
using [eq_trans <<|x|>> <<|y|>> <<|z|>>];
note("eq_cong",<<forall x y. x = y ==> f(x) = f(y)>>)
using [axiom_funcong "f" [<<|x|>>] [<<|y|>>]];
assume ["le",<<forall x y. x <= y <=> x * y = x>>;
"hom",<<forall x y. f(x * y) = f(x) * f(y)>>];
fix "x"; fix "y";
assume ["xy",<<x <= y>>];
so have <<x * y = x>> by ["le"];
so have <<f(x * y) = f(x)>> by ["eq_cong"];
so have <<f(x) = f(x * y)>> by ["eq_sym"];
so have <<f(x) = f(x) * f(y)>> by ["eq_trans"; "hom"];
so have <<f(x) * f(y) = f(x)>> by ["eq_sym"];
so conclude <<f(x) <= f(y)>> by ["le"];
qed];;
END_INTERACTIVE;;
(* ------------------------------------------------------------------------- *)
(* More examples not in the main text. *)
(* ------------------------------------------------------------------------- *)
START_INTERACTIVE;;
prove
<<(exists x. p(x)) ==> (forall x. p(x) ==> p(f(x)))
==> exists y. p(f(f(f(f(y)))))>>
[assume ["A",<<exists x. p(x)>>];
assume ["B",<<forall x. p(x) ==> p(f(x))>>];
note ("C",<<forall x. p(x) ==> p(f(f(f(f(x)))))>>)
proof
[have <<forall x. p(x) ==> p(f(f(x)))>> by ["B"];
so conclude <<forall x. p(x) ==> p(f(f(f(f(x)))))>> at once;
qed];
consider ("a",<<p(a)>>) by ["A"];
take <<|a|>>;
so conclude <<p(f(f(f(f(a)))))>> by ["C"];
qed];;
(* ------------------------------------------------------------------------- *)
(* Alternative formulation with lemma construct. *)
(* ------------------------------------------------------------------------- *)
let lemma (s,p) (Goals((asl,w)::gls,jfn) as gl) =
Goals((asl,p)::((s,p)::asl,w)::gls,
fun (thp::thw::oths) ->
jfn(imp_unduplicate(imp_trans thp (shunt thw)) :: oths)) in
prove
<<(exists x. p(x)) ==> (forall x. p(x) ==> p(f(x)))
==> exists y. p(f(f(f(f(y)))))>>
[assume ["A",<<exists x. p(x)>>];
assume ["B",<<forall x. p(x) ==> p(f(x))>>];
lemma ("C",<<forall x. p(x) ==> p(f(f(f(f(x)))))>>);
have <<forall x. p(x) ==> p(f(f(x)))>> by ["B"];
so conclude <<forall x. p(x) ==> p(f(f(f(f(x)))))>> at once;
qed;
consider ("a",<<p(a)>>) by ["A"];
take <<|a|>>;
so conclude <<p(f(f(f(f(a)))))>> by ["C"];
qed];;
(* ------------------------------------------------------------------------- *)
(* Running a series of proof steps one by one on goals.                      *)
(* ------------------------------------------------------------------------- *)
let run prf g = itlist (fun f -> f) (rev prf) g;;
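(* Illustrative sketch, not part of the original text: run is just a
   reverse-order fold of function application, so the recorded steps are
   applied in the order they were written. A toy analogue over integers,
   renamed run_demo to avoid shadowing the real run above: *)

```ocaml
(* run_demo applies each step in written order: folding right over the
   reversed list means the head of prf is applied first. *)
let run_demo prf g =
  List.fold_right (fun f -> f) (List.rev prf) g;;

(* (5 + 1) * 2 = 12: the increment runs before the doubling. *)
let twelve = run_demo [(fun n -> n + 1); (fun n -> n * 2)] 5;;
```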
(* ------------------------------------------------------------------------- *)
(* LCF-style interactivity.                                                  *)
(* ------------------------------------------------------------------------- *)
let current_goal = ref[set_goal False];;
let g x = current_goal := [set_goal x]; hd(!current_goal);;
let e t = current_goal := (t(hd(!current_goal))::(!current_goal));
hd(!current_goal);;
let es t = current_goal := (run t (hd(!current_goal))::(!current_goal));
hd(!current_goal);;
let b() = current_goal := tl(!current_goal); hd(!current_goal);;
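(* Illustrative sketch, not part of the original text: the same undo-stack
   discipline as g/e/b above, over a plain integer state, with names
   suffixed _demo so the real bindings are not shadowed: *)

```ocaml
(* demo_stack holds the history of states; the head is the current one. *)
let demo_stack = ref [0];;
let g_demo x = demo_stack := [x]; List.hd !demo_stack;;
let e_demo t =
  demo_stack := t (List.hd !demo_stack) :: !demo_stack;
  List.hd !demo_stack;;
let b_demo () = demo_stack := List.tl !demo_stack; List.hd !demo_stack;;

(* g_demo 1 starts at 1, e_demo (fun n -> n + 10) pushes 11, and
   b_demo() pops back to 1. *)
```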
(* ------------------------------------------------------------------------- *)
(* Examples. *)
(* ------------------------------------------------------------------------- *)
prove <<p(a) ==> (forall x. p(x) ==> p(f(x)))
==> exists y. p(y) /\ p(f(y))>>
[our thesis at once;
qed];;
prove
<<(exists x. p(x)) ==> (forall x. p(x) ==> p(f(x)))
==> exists y. p(f(f(f(f(y)))))>>
[assume ["A",<<exists x. p(x)>>];
assume ["B",<<forall x. p(x) ==> p(f(x))>>];
note ("C",<<forall x. p(x) ==> p(f(f(f(f(x)))))>>) proof
[have <<forall x. p(x) ==> p(f(f(x)))>> by ["B"];
so our thesis at once;
qed];
consider ("a",<<p(a)>>) by ["A"];
take <<|a|>>;
so our thesis by ["C"];
qed];;
prove <<forall a. p(a) ==> (forall x. p(x) ==> p(f(x)))
==> exists y. p(y) /\ p(f(y))>>
[fix "c";
assume ["A",<<p(c)>>];
assume ["B",<<forall x. p(x) ==> p(f(x))>>];
take <<|c|>>;
conclude <<p(c)>> by ["A"];
note ("C",<<p(c) ==> p(f(c))>>) by ["B"];
so our thesis by ["C"; "A"];
qed];;
prove <<p(c) ==> (forall x. p(x) ==> p(f(x)))
==> exists y. p(y) /\ p(f(y))>>
[assume ["A",<<p(c)>>];
assume ["B",<<forall x. p(x) ==> p(f(x))>>];
take <<|c|>>;
conclude <<p(c)>> by ["A"];
our thesis by ["A"; "B"];
qed];;
prove <<forall a. p(a) ==> (forall x. p(x) ==> p(f(x)))
==> exists y. p(y) /\ p(f(y))>>
[fix "c";
assume ["A",<<p(c)>>];
assume ["B",<<forall x. p(x) ==> p(f(x))>>];
take <<|c|>>;
conclude <<p(c)>> by ["A"];
note ("C",<<p(c) ==> p(f(c))>>) by ["B"];
our thesis by ["C"; "A"];
qed];;
prove <<forall a. p(a) ==> (forall x. p(x) ==> p(f(x)))
==> exists y. p(y) /\ p(f(y))>>
[fix "c";
assume ["A",<<p(c)>>];
assume ["B",<<forall x. p(x) ==> p(f(x))>>];
take <<|c|>>;
note ("D",<<p(c)>>) by ["A"];
note ("C",<<p(c) ==> p(f(c))>>) by ["B"];
our thesis by ["C"; "A"; "D"];
qed];;
prove <<(p(a) \/ p(b)) ==> q ==> exists y. p(y)>>
[assume ["A",<<p(a) \/ p(b)>>];
assume ["",<<q>>];
cases <<p(a) \/ p(b)>> by ["A"];
take <<|a|>>;
so our thesis at once;
qed;
take <<|b|>>;
so our thesis at once;
qed];;
prove
<<(p(a) \/ p(b)) /\ (forall x. p(x) ==> p(f(x))) ==> exists y. p(f(y))>>
[assume ["base",<<p(a) \/ p(b)>>;
"Step",<<forall x. p(x) ==> p(f(x))>>];
cases <<p(a) \/ p(b)>> by ["base"];
so note("A",<<p(a)>>) at once;
note ("X",<<p(a) ==> p(f(a))>>) by ["Step"];
take <<|a|>>;
our thesis by ["A"; "X"];
qed;
take <<|b|>>;
so our thesis by ["Step"];
qed];;
prove
<<(exists x. p(x)) ==> (forall x. p(x) ==> p(f(x))) ==> exists y. p(f(y))>>
[assume ["A",<<exists x. p(x)>>];
assume ["B",<<forall x. p(x) ==> p(f(x))>>];
consider ("a",<<p(a)>>) by ["A"];
so note ("concl",<<p(f(a))>>) by ["B"];
take <<|a|>>;
our thesis by ["concl"];
qed];;
prove <<(forall x. p(x) ==> q(x)) ==> (forall x. q(x) ==> p(x))
==> (p(a) <=> q(a))>>
[assume ["A",<<forall x. p(x) ==> q(x)>>];
assume ["B",<<forall x. q(x) ==> p(x)>>];
note ("von",<<p(a) ==> q(a)>>) by ["A"];
note ("bis",<<q(a) ==> p(a)>>) by ["B"];
our thesis by ["von"; "bis"];
qed];;
(*** Mizar-like
prove
<<(p(a) \/ p(b)) /\ (forall x. p(x) ==> p(f(x))) ==> exists y. p(f(y))>>
[assume ["A",<<antecedent>>];
note ("Step",<<forall x. p(x) ==> p(f(x))>>) by ["A"];
per_cases by ["A"];
suppose ("base",<<p(a)>>);
note ("X",<<p(a) ==> p(f(a))>>) by ["Step"];
take <<|a|>>;
our thesis by ["base"; "X"];
qed;
suppose ("base",<<p(b)>>);
our thesis by ["Step"; "base"];
qed;
endcase];;
*****)
END_INTERACTIVE;;
(* ------------------------------------------------------------------------- *)
(* Some amusing efficiency tests versus a "direct" spec. *)
(* ------------------------------------------------------------------------- *)
(***********
let test n = gen "x"
let double_th th =
let tm = concl th in modusponens (modusponens (and_pair tm tm) th) th;;
let testcase n =
gen "x" (funpow n double_th (lcftaut <<p(x) ==> q(1) \/ p(x)>>));;
let test n = time (spec <<|2|>>) (testcase n),
time (subst ("x" |=> <<|2|>>)) (concl(testcase n));
();;
test 10;;
test 11;;
test 12;;
test 13;;
test 14;;
test 15;;
****)
| null | https://raw.githubusercontent.com/steshaw/PLAR/c143b097d1028963f4c1d24f45a1a56c8b65b838/tactics.ml | ocaml | =========================================================================
=========================================================================
-------------------------------------------------------------------------
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Setting up goals and terminating them in a theorem.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Conjunction introduction tactic.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Handy idiom for tactic that does not split subgoals.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Version of gen_right with a bound variable change.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
-------------------------------------------------------------------------
-------------------------------------------------------------------------
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Existential introduction.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Implication introduction tactic.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Append contextual hypothesis to unconditional theorem.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Import "external" theorem.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Turn assumptions p1,...,pn into theorems |- p1 /\ ... /\ pn ==> pi
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Produce canonical theorem from list of theorems or assumption labels.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Nested subproof.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Trivial justification, producing no hypotheses.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Hence an automated terminal tactic.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
A "lemma" tactic.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Elimination tactic for existential quantification.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Elimination tactic for disjunction.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
A simple example.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
All packaged up together.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Declarative proof.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Thesis modification.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
A useful shorthand for solving the whole goal.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Termination.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
A simple example.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
More examples not in the main text.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Alternative formulation with lemma construct.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
-------------------------------------------------------------------------
-------------------------------------------------------------------------
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Examples.
-------------------------------------------------------------------------
-------------------------------------------------------------------------
Some amusing efficiency tests versus a "direct" spec.
------------------------------------------------------------------------- | Goals , LCF - like tactics and - like proofs .
Copyright ( c ) 2003 - 2007 , . ( See " LICENSE.txt " for details . )
type goals =
Goals of ((string * fol formula) list * fol formula)list *
(thm list -> thm);;
Printer for goals ( just shows first goal plus total number ) .
let print_goal =
let print_hyp (l,fm) =
open_hbox(); print_string(l^":"); print_space();
print_formula print_atom fm; print_newline(); close_box() in
fun (Goals(gls,jfn)) ->
match gls with
(asl,w)::ogls ->
print_newline();
(if ogls = [] then print_string "1 subgoal:" else
(print_int (length gls);
print_string " subgoals starting with"));
print_newline();
do_list print_hyp (rev asl);
print_string "---> ";
open_hvbox 0; print_formula print_atom w; close_box();
print_newline()
| [] -> print_string "No subgoals";;
#install_printer print_goal;;
let set_goal p =
let chk th = if concl th = p then th else failwith "wrong theorem" in
Goals([[],p],fun [th] -> chk(modusponens th truth));;
let extract_thm gls =
match gls with
Goals([],jfn) -> jfn []
| _ -> failwith "extract_thm: unsolved goals";;
let tac_proof g prf = extract_thm(itlist (fun f -> f) (rev prf) g);;
let prove p prf = tac_proof (set_goal p) prf;;
let conj_intro_tac (Goals((asl,And(p,q))::gls,jfn)) =
let jfn' (thp::thq::ths) =
jfn(imp_trans_chain [thp; thq] (and_pair p q)::ths) in
Goals((asl,p)::(asl,q)::gls,jfn');;
let jmodify jfn tfn (th::oths) = jfn(tfn th :: oths);;
let gen_right_alpha y x th =
let th1 = gen_right y th in
imp_trans th1 (alpha x (consequent(concl th1)));;
Universal introduction .
let forall_intro_tac y (Goals((asl,(Forall(x,p) as fm))::gls,jfn)) =
if mem y (fv fm) or exists (mem y ** fv ** snd) asl
then failwith "fix: variable already free in goal" else
Goals((asl,subst(x |=> Var y) p)::gls,
jmodify jfn (gen_right_alpha y x));;
Another inference rule : |- P[t ] = = > exists x. P[x ]
let right_exists x t p =
let th = contrapos(ispec t (Forall(x,Not p))) in
let Not(Not p') = antecedent(concl th) in
end_itlist imp_trans
[imp_contr p' False; imp_add_concl False (iff_imp1 (axiom_not p'));
iff_imp2(axiom_not (Not p')); th; iff_imp2(axiom_exists x p)];;
let exists_intro_tac t (Goals((asl,Exists(x,p))::gls,jfn)) =
Goals((asl,subst(x |=> t) p)::gls,
jmodify jfn (fun th -> imp_trans th (right_exists x t p)));;
let imp_intro_tac s (Goals((asl,Imp(p,q))::gls,jfn)) =
let jmod = if asl = [] then add_assum True else imp_swap ** shunt in
Goals(((s,p)::asl,q)::gls,jmodify jfn jmod);;
let assumptate (Goals((asl,w)::gls,jfn)) th =
add_assum (list_conj (map snd asl)) th;;
Get the first assumption ( quicker than head of assumps result ) .
let firstassum asl =
let p = snd(hd asl) and q = list_conj(map snd (tl asl)) in
if tl asl = [] then imp_refl p else and_left p q;;
let using ths p g =
let ths' = map (fun th -> itlist gen (fv(concl th)) th) ths in
map (assumptate g) ths';;
let rec assumps asl =
match asl with
[] -> []
| [l,p] -> [l,imp_refl p]
| (l,p)::lps ->
let ths = assumps lps in
let q = antecedent(concl(snd(hd ths))) in
let rth = and_right p q in
(l,and_left p q)::map (fun (l,th) -> l,imp_trans rth th) ths;;
let by hyps p (Goals((asl,w)::gls,jfn)) =
let ths = assumps asl in map (fun s -> assoc s ths) hyps;;
Main automatic justification step .
let justify byfn hyps p g =
match byfn hyps p g with
[th] when consequent(concl th) = p -> th
| ths ->
let th = lcffol(itlist (mk_imp ** consequent ** concl) ths p) in
if ths = [] then assumptate g th else imp_trans_chain ths th;;
let proof tacs p (Goals((asl,w)::gls,jfn)) =
[tac_proof (Goals([asl,p],fun [th] -> th)) tacs];;
let at once p gl = [] and once = [];;
let auto_tac byfn hyps (Goals((asl,w)::gls,jfn) as g) =
let th = justify byfn hyps w g in
Goals(gls,fun ths -> jfn(th::ths));;
let lemma_tac s p byfn hyps (Goals((asl,w)::gls,jfn) as g) =
let tr = imp_trans(justify byfn hyps p g) in
let mfn = if asl = [] then tr else imp_unduplicate ** tr ** shunt in
Goals(((s,p)::asl,w)::gls,jmodify jfn mfn);;
let exists_elim_tac l fm byfn hyps (Goals((asl,w)::gls,jfn) as g) =
let Exists(x,p) = fm in
if exists (mem x ** fv) (w::map snd asl)
then failwith "exists_elim_tac: variable free in assumptions" else
let th = justify byfn hyps (Exists(x,p)) g in
let jfn' pth =
imp_unduplicate(imp_trans th (exists_left x (shunt pth))) in
Goals(((l,p)::asl,w)::gls,jmodify jfn jfn');;
If |- p = = > r and |- q = = > r then |- p \/ q = = > r
let ante_disj th1 th2 =
let p,r = dest_imp(concl th1) and q,s = dest_imp(concl th2) in
let ths = map contrapos [th1; th2] in
let th3 = imp_trans_chain ths (and_pair (Not p) (Not q)) in
let th4 = contrapos(imp_trans (iff_imp2(axiom_not r)) th3) in
let th5 = imp_trans (iff_imp1(axiom_or p q)) th4 in
right_doubleneg(imp_trans th5 (iff_imp1(axiom_not(Imp(r,False)))));;
let disj_elim_tac l fm byfn hyps (Goals((asl,w)::gls,jfn) as g) =
let th = justify byfn hyps fm g and Or(p,q) = fm in
let jfn' (pth::qth::ths) =
let th1 = imp_trans th (ante_disj (shunt pth) (shunt qth)) in
jfn(imp_unduplicate th1::ths) in
Goals(((l,p)::asl,w)::((l,q)::asl,w)::gls,jfn');;
START_INTERACTIVE;;
let g0 = set_goal
<<(forall x. x <= x) /\
(forall x y z. x <= y /\ y <= z ==> x <= z) /\
(forall x y. f(x) <= y <=> x <= g(y))
==> (forall x y. x <= y ==> f(x) <= f(y)) /\
(forall x y. x <= y ==> g(x) <= g(y))>>;;
let g1 = imp_intro_tac "ant" g0;;
let g2 = conj_intro_tac g1;;
let g3 = funpow 2 (auto_tac by ["ant"]) g2;;
extract_thm g3;;
prove <<(forall x. x <= x) /\
(forall x y z. x <= y /\ y <= z ==> x <= z) /\
(forall x y. f(x) <= y <=> x <= g(y))
==> (forall x y. x <= y ==> f(x) <= f(y)) /\
(forall x y. x <= y ==> g(x) <= g(y))>>
[imp_intro_tac "ant";
conj_intro_tac;
auto_tac by ["ant"];
auto_tac by ["ant"]];;
END_INTERACTIVE;;
let multishunt i th =
let th1 = imp_swap(funpow i (imp_swap ** shunt) th) in
imp_swap(funpow (i-1) (unshunt ** imp_front 2) th1);;
let assume lps (Goals((asl,Imp(p,q))::gls,jfn)) =
if end_itlist mk_and (map snd lps) <> p then failwith "assume" else
let jfn' th = if asl = [] then add_assum True th
else multishunt (length lps) th in
Goals((lps@asl,q)::gls,jmodify jfn jfn');;
let note (l,p) = lemma_tac l p;;
let have p = note("",p);;
let so tac arg byfn =
tac arg (fun hyps p (Goals((asl,w)::_,_) as gl) ->
firstassum asl :: byfn hyps p gl);;
let fix = forall_intro_tac;;
let consider (x,p) = exists_elim_tac "" (Exists(x,p));;
let take = exists_intro_tac;;
let cases = disj_elim_tac "";;
let conclude p byfn hyps (Goals((asl,w)::gls,jfn) as gl) =
let th = justify byfn hyps p gl in
if p = w then Goals((asl,True)::gls,jmodify jfn (fun _ -> th)) else
let p',q = dest_and w in
if p' <> p then failwith "conclude: bad conclusion" else
let mfn th' = imp_trans_chain [th; th'] (and_pair p q) in
Goals((asl,q)::gls,jmodify jfn mfn);;
let our thesis byfn hyps (Goals((asl,w)::gls,jfn) as gl) =
conclude w byfn hyps gl
and thesis = "";;
let qed (Goals((asl,w)::gls,jfn) as gl) =
if w = True then Goals(gls,fun ths -> jfn(assumptate gl truth :: ths))
else failwith "qed: non-trivial goal";;
START_INTERACTIVE;;
let ewd954 = prove
<<(forall x y. x <= y <=> x * y = x) /\
(forall x y. f(x * y) = f(x) * f(y))
==> forall x y. x <= y ==> f(x) <= f(y)>>
[note("eq_sym",<<forall x y. x = y ==> y = x>>)
using [eq_sym <<|x|>> <<|y|>>];
note("eq_trans",<<forall x y z. x = y /\ y = z ==> x = z>>)
using [eq_trans <<|x|>> <<|y|>> <<|z|>>];
note("eq_cong",<<forall x y. x = y ==> f(x) = f(y)>>)
using [axiom_funcong "f" [<<|x|>>] [<<|y|>>]];
assume ["le",<<forall x y. x <= y <=> x * y = x>>;
"hom",<<forall x y. f(x * y) = f(x) * f(y)>>];
fix "x"; fix "y";
assume ["xy",<<x <= y>>];
so have <<x * y = x>> by ["le"];
so have <<f(x * y) = f(x)>> by ["eq_cong"];
so have <<f(x) = f(x * y)>> by ["eq_sym"];
so have <<f(x) = f(x) * f(y)>> by ["eq_trans"; "hom"];
so have <<f(x) * f(y) = f(x)>> by ["eq_sym"];
so conclude <<f(x) <= f(y)>> by ["le"];
qed];;
END_INTERACTIVE;;
START_INTERACTIVE;;
prove
<<(exists x. p(x)) ==> (forall x. p(x) ==> p(f(x)))
==> exists y. p(f(f(f(f(y)))))>>
[assume ["A",<<exists x. p(x)>>];
assume ["B",<<forall x. p(x) ==> p(f(x))>>];
note ("C",<<forall x. p(x) ==> p(f(f(f(f(x)))))>>)
proof
[have <<forall x. p(x) ==> p(f(f(x)))>> by ["B"];
so conclude <<forall x. p(x) ==> p(f(f(f(f(x)))))>> at once;
qed];
consider ("a",<<p(a)>>) by ["A"];
take <<|a|>>;
so conclude <<p(f(f(f(f(a)))))>> by ["C"];
qed];;
let lemma (s,p) (Goals((asl,w)::gls,jfn) as gl) =
Goals((asl,p)::((s,p)::asl,w)::gls,
fun (thp::thw::oths) ->
jfn(imp_unduplicate(imp_trans thp (shunt thw)) :: oths)) in
prove
<<(exists x. p(x)) ==> (forall x. p(x) ==> p(f(x)))
==> exists y. p(f(f(f(f(y)))))>>
[assume ["A",<<exists x. p(x)>>];
assume ["B",<<forall x. p(x) ==> p(f(x))>>];
lemma ("C",<<forall x. p(x) ==> p(f(f(f(f(x)))))>>);
have <<forall x. p(x) ==> p(f(f(x)))>> by ["B"];
so conclude <<forall x. p(x) ==> p(f(f(f(f(x)))))>> at once;
qed;
consider ("a",<<p(a)>>) by ["A"];
take <<|a|>>;
so conclude <<p(f(f(f(f(a)))))>> by ["C"];
qed];;
Running a series of proof steps one by one on goals .
let run prf g = itlist (fun f -> f) (rev prf) g;;
LCF - style interactivity .
let current_goal = ref[set_goal False];;
let g x = current_goal := [set_goal x]; hd(!current_goal);;
let e t = current_goal := (t(hd(!current_goal))::(!current_goal));
hd(!current_goal);;
let es t = current_goal := (run t (hd(!current_goal))::(!current_goal));
hd(!current_goal);;
let b() = current_goal := tl(!current_goal); hd(!current_goal);;
prove <<p(a) ==> (forall x. p(x) ==> p(f(x)))
==> exists y. p(y) /\ p(f(y))>>
[our thesis at once;
qed];;
prove
<<(exists x. p(x)) ==> (forall x. p(x) ==> p(f(x)))
==> exists y. p(f(f(f(f(y)))))>>
[assume ["A",<<exists x. p(x)>>];
assume ["B",<<forall x. p(x) ==> p(f(x))>>];
note ("C",<<forall x. p(x) ==> p(f(f(f(f(x)))))>>) proof
[have <<forall x. p(x) ==> p(f(f(x)))>> by ["B"];
so our thesis at once;
qed];
consider ("a",<<p(a)>>) by ["A"];
take <<|a|>>;
so our thesis by ["C"];
qed];;
prove <<forall a. p(a) ==> (forall x. p(x) ==> p(f(x)))
==> exists y. p(y) /\ p(f(y))>>
[fix "c";
assume ["A",<<p(c)>>];
assume ["B",<<forall x. p(x) ==> p(f(x))>>];
take <<|c|>>;
conclude <<p(c)>> by ["A"];
note ("C",<<p(c) ==> p(f(c))>>) by ["B"];
so our thesis by ["C"; "A"];
qed];;
prove <<p(c) ==> (forall x. p(x) ==> p(f(x)))
==> exists y. p(y) /\ p(f(y))>>
[assume ["A",<<p(c)>>];
assume ["B",<<forall x. p(x) ==> p(f(x))>>];
take <<|c|>>;
conclude <<p(c)>> by ["A"];
our thesis by ["A"; "B"];
qed];;
prove <<forall a. p(a) ==> (forall x. p(x) ==> p(f(x)))
==> exists y. p(y) /\ p(f(y))>>
[fix "c";
assume ["A",<<p(c)>>];
assume ["B",<<forall x. p(x) ==> p(f(x))>>];
take <<|c|>>;
conclude <<p(c)>> by ["A"];
note ("C",<<p(c) ==> p(f(c))>>) by ["B"];
our thesis by ["C"; "A"];
qed];;
prove <<forall a. p(a) ==> (forall x. p(x) ==> p(f(x)))
==> exists y. p(y) /\ p(f(y))>>
[fix "c";
assume ["A",<<p(c)>>];
assume ["B",<<forall x. p(x) ==> p(f(x))>>];
take <<|c|>>;
note ("D",<<p(c)>>) by ["A"];
note ("C",<<p(c) ==> p(f(c))>>) by ["B"];
our thesis by ["C"; "A"; "D"];
qed];;
prove <<(p(a) \/ p(b)) ==> q ==> exists y. p(y)>>
[assume ["A",<<p(a) \/ p(b)>>];
assume ["",<<q>>];
cases <<p(a) \/ p(b)>> by ["A"];
take <<|a|>>;
so our thesis at once;
qed;
take <<|b|>>;
so our thesis at once;
qed];;
prove
<<(p(a) \/ p(b)) /\ (forall x. p(x) ==> p(f(x))) ==> exists y. p(f(y))>>
[assume ["base",<<p(a) \/ p(b)>>;
"Step",<<forall x. p(x) ==> p(f(x))>>];
cases <<p(a) \/ p(b)>> by ["base"];
so note("A",<<p(a)>>) at once;
note ("X",<<p(a) ==> p(f(a))>>) by ["Step"];
take <<|a|>>;
our thesis by ["A"; "X"];
qed;
take <<|b|>>;
so our thesis by ["Step"];
qed];;
prove
<<(exists x. p(x)) ==> (forall x. p(x) ==> p(f(x))) ==> exists y. p(f(y))>>
[assume ["A",<<exists x. p(x)>>];
assume ["B",<<forall x. p(x) ==> p(f(x))>>];
consider ("a",<<p(a)>>) by ["A"];
so note ("concl",<<p(f(a))>>) by ["B"];
take <<|a|>>;
our thesis by ["concl"];
qed];;
prove <<(forall x. p(x) ==> q(x)) ==> (forall x. q(x) ==> p(x))
==> (p(a) <=> q(a))>>
[assume ["A",<<forall x. p(x) ==> q(x)>>];
assume ["B",<<forall x. q(x) ==> p(x)>>];
note ("von",<<p(a) ==> q(a)>>) by ["A"];
note ("bis",<<q(a) ==> p(a)>>) by ["B"];
our thesis by ["von"; "bis"];
qed];;
(* Mizar-like. *)
(****
prove
<<(p(a) \/ p(b)) /\ (forall x. p(x) ==> p(f(x))) ==> exists y. p(f(y))>>
[assume ["A",<<antecedent>>];
note ("Step",<<forall x. p(x) ==> p(f(x))>>) by ["A"];
per_cases by ["A"];
suppose ("base",<<p(a)>>);
note ("X",<<p(a) ==> p(f(a))>>) by ["Step"];
take <<|a|>>;
our thesis by ["base"; "X"];
qed;
suppose ("base",<<p(b)>>);
our thesis by ["Step"; "base"];
qed;
endcase];;
*****)
END_INTERACTIVE;;
(****
let test n = gen "x"
let double_th th =
let tm = concl th in modusponens (modusponens (and_pair tm tm) th) th;;
let testcase n =
gen "x" (funpow n double_th (lcftaut <<p(x) ==> q(1) \/ p(x)>>));;
let test n = time (spec <<|2|>>) (testcase n),
time (subst ("x" |=> <<|2|>>)) (concl(testcase n));
();;
test 10;;
test 11;;
test 12;;
test 13;;
test 14;;
test 15;;
****)
|
efaf7e28a91dd345fba4f2a56100b745bcd2df48b8c32c58119d503469d8b869 | abdulapopoola/SICPBook | 4.57.scm | #lang racket
(rule (can-replace ?person-1 ?person-2)
(and (or (and (job ?person-1 ?job)
(job ?person-2 ?job))
(and (job ?person-1 ?job1)
(job ?person-2 ?job2)
(can-do-job ?job1 ?job2)))
(not (same ?person-1 ?person-2))))
(can-replace ?person (Cy D.Fect))
(and (can-replace ?person1 ?person2)
(salary ?person1 v1)
(salary ?person2 v2)
(lisp-value < v1 v2)) | null | https://raw.githubusercontent.com/abdulapopoola/SICPBook/c8a0228ebf66d9c1ddc5ef1fcc1d05d8684f090a/Chapter%204/4.4/4.57.scm | scheme | #lang racket
|
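The `can-replace` rule above (SICP exercise 4.57) says person 1 can replace person 2 if they share a job, or person 1's job can cover person 2's, and they are not the same person. A hedged Python sketch of the same relation over a toy database (all names and data here are illustrative):

```python
# Toy version of the can-replace rule from SICP 4.57.
jobs = {                      # person -> job (illustrative data)
    "alice": "wheel",
    "ben": "programmer",
    "cy": "programmer",
}
can_do = {("wheel", "programmer")}   # job1 can cover job2

def can_replace(p1, p2):
    """p1 can replace p2 if same job, or p1's job covers p2's job."""
    if p1 == p2:
        return False
    j1, j2 = jobs[p1], jobs[p2]
    return j1 == j2 or (j1, j2) in can_do

# Everyone who can replace cy:
replacers = sorted(p for p in jobs if can_replace(p, "cy"))
```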
f232c4c87e78560c15453d8e5080d1f5edd9d6fe384367b89a1fbf387fb56981 | zachgk/catln | Parser.hs | --------------------------------------------------------------------
-- |
-- Module      : Syntax.Ct.Parser
-- Copyright   : (c) 2019
-- License     : MIT
-- Maintainer  :
-- Stability   : experimental
-- Portability : non-portable
--
-- This module is the main module for parsing. It will read in files
-- from their file paths and then parse into a 'RawPrgm'.
--------------------------------------------------------------------
{-# LANGUAGE OverloadedStrings #-}
module Syntax.Ct.Parser where
import Control.Applicative hiding (many, some)
import Text.Megaparsec
import Text.Megaparsec.Char
import CRes
import Data.Maybe
import Syntax.Ct.Parser.Decl
import Syntax.Ct.Parser.Expr
import Syntax.Ct.Parser.Lexer
import Syntax.Ct.Parser.Syntax
import Syntax.Ct.Parser.Type (pTypeStatement)
import Syntax.Ct.Prgm
import qualified Text.Megaparsec.Char.Lexer as L
pImport :: Parser String
pImport = do
_ <- symbol "import"
imp <- some printChar
_ <- newline
return imp
liftPStatement :: Parser PStatement -> Parser PStatementTree
liftPStatement pSt = L.indentBlock scn p
where
pack st children = return $ RawStatementTree st children
p = do
st <- pSt
return (L.IndentMany Nothing (pack st) pStatementTree)
pModule :: Parser PStatement
pModule = do
_ <- symbol "module"
name <- ttypeidentifier
return $ RawModule name (getPath name)
pCommentStatement :: Parser PStatementTree
pCommentStatement = do
c <- pComment
return $ RawStatementTree (RawAnnot c) []
pStatementTree :: Parser PStatementTree
pStatementTree = do
notFollowedBy newline
liftPStatement pTypeStatement
<|> pCommentStatement
<|> liftPStatement (RawAnnot <$> pCompAnnot)
<|> liftPStatement pModule
<|> liftPStatement pDeclStatement
pNothingNewline :: Parser (Maybe a)
pNothingNewline = do
_ <- newline
return Nothing
pPrgm :: Parser PPrgm
pPrgm = do
_ <- many newline
imports <- many pImport
statements <- many (Just <$> pStatementTree <|> pNothingNewline)
return (imports, catMaybes statements)
contents :: Parser a -> Parser a
contents p = do
r <- p
eof
return r
ctParser :: String -> IO (CRes PPrgm)
ctParser fileName = do
fileContents <- readFile fileName
return $ case runParser (contents pPrgm) fileName fileContents of
Left err -> CErr [MkCNote $ ParseCErr err]
Right prgm -> return prgm
parseRepl :: String -> PReplRes
parseRepl s = case runParser (contents p) "<repl>" s of
Left e@(ParseErrorBundle _ _) -> ReplErr e
Right (Left statement) -> ReplStatement statement
Right (Right expr) -> ReplExpr expr
where p = try (Left <$> pStatementTree) <|> try (Right <$> pExpr)
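`pStatementTree` above builds a tree of statements from indentation (via `L.indentBlock`): each statement owns the more-indented statements nested below it. A rough Python analogue of that indentation-to-tree step (a simplified sketch, not the megaparsec implementation):

```python
# Build (node, children) trees from (indent, text) pairs.
def parse_tree(lines, indent=0):
    """Consume consecutive lines at `indent`, nesting deeper lines as children."""
    out = []
    while lines and lines[0][0] == indent:
        _, text = lines.pop(0)
        children = parse_tree(lines, indent + 1)  # everything deeper belongs to us
        out.append((text, children))
    return out

src = [(0, "module A"), (1, "stmt1"), (2, "annot"), (1, "stmt2"), (0, "module B")]
tree = parse_tree(list(src))
```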
| null | https://raw.githubusercontent.com/zachgk/catln/3427e1d67e076bf84969f7f57ff1e66ae02c7b00/src/Syntax/Ct/Parser.hs | haskell |
f2e83689fe4bec04a43c8d97912da1c2a6465f1da45a8277ad2fc170eb9c361e | flipstone/orville | Entity.hs | module StrangeFieldNames.Entity where
import Data.Int (Int32)
import qualified Database.Orville.PostgreSQL as O
-- field is generic and is meant to be used with different column names to test
-- that crud actions work with various name styles Ex. camelCase, snake_case etc.
data CrudEntity key = CrudEntity
{ crudEntityId :: key
, crudEntityField :: Int32
} deriving (Show, Eq)
newtype CrudEntityId = CrudEntityId
{ unCrudEntityId :: Int32
} deriving (Show, Eq)
-- Take in the column name so we can test different styles.
crudEntityTable ::
String ->
O.TableDefinition (CrudEntity CrudEntityId) (CrudEntity ()) CrudEntityId
crudEntityTable columnName =
O.mkTableDefinition $
O.TableParams
{ O.tblName = "crudEntity"
, O.tblPrimaryKey = O.primaryKey crudEntityIdField
, O.tblMapper =
CrudEntity
<$> O.readOnlyField crudEntityIdField
<*> O.attrField crudEntityField (crudEntityRenameableField columnName)
, O.tblGetKey = crudEntityId
, O.tblSafeToDelete = []
, O.tblComments = O.noComments
}
crudEntityIdField :: O.FieldDefinition O.NotNull CrudEntityId
crudEntityIdField =
O.automaticIdField "id" `O.withConversion`
O.convertSqlType unCrudEntityId CrudEntityId
crudEntityRenameableField :: String -> O.FieldDefinition O.NotNull Int32
crudEntityRenameableField = O.int32Field
| null | https://raw.githubusercontent.com/flipstone/orville/aee8d7a47ab3a7b442fdb274dbb5a95d687a23ce/orville-postgresql/test/StrangeFieldNames/Entity.hs | haskell |
cb0e1cd144fa196105cfc1872b1b13ab3a0c613c75260f28bc0e6dd36e8eb8be | Haskell-OpenAPI-Code-Generator/Haskell-OpenAPI-Client-Code-Generator | Main.hs | # LANGUAGE DuplicateRecordFields #
{-# LANGUAGE OverloadedStrings #-}
# LANGUAGE TemplateHaskell #
-- | Functionality to Generate Haskell Code out of an OpenAPI definition File
module OpenAPI.Generate.Main where
import Control.Monad
import qualified Data.Bifunctor as BF
import qualified Data.Map as Map
import qualified Data.Maybe as Maybe
import qualified Data.Set as Set
import Data.Text (Text)
import qualified Data.Text as T
import Language.Haskell.TH
import Language.Haskell.TH.PprLib hiding ((<>))
import qualified OpenAPI.Common as OC
import qualified OpenAPI.Generate.Doc as Doc
import OpenAPI.Generate.Internal.Unknown
import OpenAPI.Generate.Internal.Util
import qualified OpenAPI.Generate.Model as Model
import qualified OpenAPI.Generate.ModelDependencies as Dep
import qualified OpenAPI.Generate.Monad as OAM
import qualified OpenAPI.Generate.Operation as Operation
import qualified OpenAPI.Generate.OptParse as OAO
import qualified OpenAPI.Generate.SecurityScheme as SecurityScheme
import qualified OpenAPI.Generate.Types as OAT
-- | Defines all the operations as functions and the common imports
defineOperations :: String -> OAT.OpenApiSpecification -> OAM.Generator (Q [Dep.ModuleDefinition], Dep.Models)
defineOperations moduleName specification =
let paths = Map.toList $ OAT.paths specification
in OAM.nested "paths" $ do
warnAboutUnknownOperations paths
fmap
( BF.bimap
( fmap concat
. sequence
)
Set.unions
)
. mapAndUnzipM (uncurry $ Operation.defineOperationsForPath moduleName)
$ paths
-- | Defines the @defaultURL@ and the @defaultConfiguration@ containing this URL.
defineConfigurationInformation :: String -> OAT.OpenApiSpecification -> Q Doc
defineConfigurationInformation moduleName spec =
let servers' = (OAT.servers :: OAT.OpenApiSpecification -> [OAT.ServerObject]) spec
defaultURL = getServerURL servers'
defaultURLName = mkName "defaultURL"
getServerURL = maybe "/" (OAT.url :: OAT.ServerObject -> Text) . Maybe.listToMaybe
defaultApplicationNameVarName = mkName "defaultApplicationName"
defaultApplicationName = OAT.title $ OAT.info spec
in Doc.addConfigurationModuleHeader moduleName
. vcat
<$> sequence
[ pure $
Doc.generateHaddockComment
[ "The default url specified by the OpenAPI specification",
"",
"@" <> defaultURL <> "@"
],
ppr
<$> [d|$(varP defaultURLName) = T.pack $(stringE $ T.unpack defaultURL)|],
pure $
Doc.generateHaddockComment
[ "The default application name used in the @User-Agent@ header which is based on the @info.title@ field of the specification",
"",
"@" <> defaultApplicationName <> "@"
],
ppr
<$> [d|$(varP defaultApplicationNameVarName) = T.pack $(stringE $ T.unpack defaultApplicationName)|],
pure $ Doc.generateHaddockComment ["The default configuration containing the 'defaultURL' and no authorization"],
ppr <$> [d|$(varP $ mkName "defaultConfiguration") = OC.Configuration $(varE defaultURLName) OC.anonymousSecurityScheme True $(varE defaultApplicationNameVarName)|]
]
-- | Defines all models in the components.schemas section of the 'OAT.OpenApiSpecification'
defineModels :: String -> OAT.OpenApiSpecification -> Dep.Models -> OAM.Generator (Q [Dep.ModuleDefinition])
defineModels moduleName spec operationDependencies =
let schemaDefinitions = Map.toList $ OAT.schemas $ OAT.components spec
in OAM.nested "components" $
OAM.nested "schemas" $ do
warnAboutUnknownWhiteListedOrOpaqueSchemas schemaDefinitions
models <- mapM (uncurry Model.defineModelForSchema) schemaDefinitions
whiteListedSchemas <- OAM.getSetting OAO.settingWhiteListedSchemas
let dependencies = Set.union operationDependencies $ Set.fromList $ fmap transformToModuleName whiteListedSchemas
pure $ Dep.getModelModulesFromModelsWithDependencies moduleName dependencies models
-- | Defines all supported security schemes from the 'OAT.OpenApiSpecification'.
defineSecuritySchemes :: String -> OAT.OpenApiSpecification -> OAM.Generator (Q Doc)
defineSecuritySchemes moduleName =
OAM.nested "components"
. fmap (fmap $ Doc.addSecuritySchemesModuleHeader moduleName)
. SecurityScheme.defineSupportedSecuritySchemes (T.pack moduleName)
. Maybe.mapMaybe
( \(name', scheme') -> case scheme' of
OAT.Concrete s -> Just (name', s)
OAT.Reference _ -> Nothing
)
. Map.toList
. OAT.securitySchemes
. OAT.components
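The generator above walks three parts of the specification: `paths` for operations, `components.schemas` for models, and `components.securitySchemes` for authentication. A minimal sketch of that traversal over a plain-dict spec (no code generation, just collection; field names follow the OpenAPI 3 layout, and the sample spec is illustrative):

```python
# Collect the raw inputs the generator works from: operations, schemas, schemes.
def collect(spec):
    ops = [(path, method)
           for path, item in spec.get("paths", {}).items()
           for method in item]                     # e.g. "get", "post"
    comps = spec.get("components", {})
    return (ops,
            sorted(comps.get("schemas", {})),
            sorted(comps.get("securitySchemes", {})))

spec = {  # tiny illustrative spec
    "paths": {"/pets": {"get": {}, "post": {}}},
    "components": {"schemas": {"Pet": {}}, "securitySchemes": {"basic": {}}},
}
ops, schemas, schemes = collect(spec)
```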
| null | https://raw.githubusercontent.com/Haskell-OpenAPI-Code-Generator/Haskell-OpenAPI-Client-Code-Generator/84e78c2c89ed7931a345ca4fe540c748c2600389/openapi3-code-generator/src/OpenAPI/Generate/Main.hs | haskell |
848e1b19f923bd28bb85ffa3b5ce4f4aa530a33ee910bd5fc634e481a3ae4ad5 | axman6/amazonka-s3-streaming | StreamingUpload.hs | {-# LANGUAGE BangPatterns #-}
# LANGUAGE DerivingStrategies #
# LANGUAGE DuplicateRecordFields #
# LANGUAGE ImportQualifiedPost #
# LANGUAGE LambdaCase #
# LANGUAGE NamedFieldPuns #
{-# LANGUAGE OverloadedStrings #-}
# LANGUAGE ParallelListComp #
# LANGUAGE ScopedTypeVariables #
module Amazonka.S3.StreamingUpload
( streamUpload
, ChunkSize
, minimumChunkSize
, NumThreads
, concurrentUpload
, UploadLocation(..)
, abortAllUploads
, module Amazonka.S3.CreateMultipartUpload
, module Amazonka.S3.CompleteMultipartUpload
) where
import Amazonka ( HashedBody(..), LogLevel(..), getFileSize, hashedFileRange, send, toBody )
import Amazonka.Crypto ( hash )
import Amazonka.Env ( Env, logger, manager )
import Amazonka.S3.AbortMultipartUpload ( AbortMultipartUploadResponse, newAbortMultipartUpload )
import Amazonka.S3.CompleteMultipartUpload
( CompleteMultipartUpload(..), CompleteMultipartUploadResponse, newCompleteMultipartUpload )
import Amazonka.S3.CreateMultipartUpload
( CreateMultipartUpload(..), CreateMultipartUploadResponse(..) )
import Amazonka.S3.ListMultipartUploads
( ListMultipartUploadsResponse(..), newListMultipartUploads, uploads )
import Amazonka.S3.Types
( BucketName, CompletedMultipartUpload(..), CompletedPart, MultipartUpload(..),
newCompletedMultipartUpload, newCompletedPart )
import Amazonka.S3.UploadPart ( UploadPartResponse(..), newUploadPart )
import Network.HTTP.Client ( managerConnCount, newManager )
import Network.HTTP.Client.TLS ( tlsManagerSettings )
import Control.Monad.Catch ( Exception, MonadCatch, onException )
import Control.Monad.IO.Class ( MonadIO, liftIO )
import Control.Monad.Trans.Class ( lift )
import Control.Monad.Trans.Resource ( MonadResource, runResourceT )
import Conduit ( MonadUnliftIO(..) )
import Data.Conduit ( ConduitT, Void, await, handleC, yield, (.|) )
import Data.Conduit.Combinators ( sinkList )
import Data.Conduit.Combinators qualified as CC
import Data.ByteString qualified as BS
import Data.ByteString.Builder ( stringUtf8 )
import Data.ByteString.Builder.Extra ( byteStringCopy, runBuilder )
import Data.ByteString.Internal ( ByteString(PS) )
import Data.List ( unfoldr )
import Data.List.NonEmpty ( fromList, nonEmpty )
import Data.Text ( Text )
import Control.Concurrent ( newQSem, signalQSem, waitQSem )
import Control.Concurrent.Async ( forConcurrently )
import Control.Exception.Base ( SomeException, bracket_ )
import Foreign.ForeignPtr ( ForeignPtr, mallocForeignPtrBytes, plusForeignPtr )
import Foreign.ForeignPtr.Unsafe ( unsafeForeignPtrToPtr )
import Control.DeepSeq ( rwhnf )
import Data.Foldable ( for_, traverse_ )
import Data.Typeable ( Typeable )
import Data.Word ( Word8 )
import Control.Monad ((>=>))
type ChunkSize = Int
type NumThreads = Int
-- | Minimum size of data which will be sent in a single part, currently 6MB
minimumChunkSize :: ChunkSize
minimumChunkSize = 6*1024*1024 -- Making this 5MB+1 seemed to cause AWS to complain
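S3 multipart uploads require every part except the last to be at least 5 MB, and allow at most 10,000 parts per upload; that is why `minimumChunkSize` is pinned above 5 MB here, and why `concurrentUpload` later grows the chunk size for very large inputs (its `calculateChunkSize` helper, truncated below). A sketch of that sizing rule (the exact rounding is illustrative, not copied from the library):

```python
MIN_CHUNK = 6 * 1024 * 1024   # mirrors minimumChunkSize above
MAX_PARTS = 10_000            # S3 limit on parts per multipart upload

def calculate_chunk_size(total_bytes, requested=MIN_CHUNK):
    """Grow the chunk size so the upload never exceeds MAX_PARTS parts."""
    chunk = max(MIN_CHUNK, requested)
    if total_bytes > chunk * MAX_PARTS:
        # Round up so MAX_PARTS chunks cover the whole payload.
        chunk = -(-total_bytes // MAX_PARTS)
    return chunk
```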
data StreamingError
= UnableToCreateMultipartUpload CreateMultipartUploadResponse
| FailedToUploadPiece UploadPartResponse
| Other String
deriving stock (Show, Eq, Typeable)
instance Exception StreamingError
{-|
Given a 'CreateMultipartUpload', creates a 'Sink' which will sequentially
upload the data streamed in in chunks of at least 'minimumChunkSize' and return either
the 'CompleteMultipartUploadResponse', or if an exception is thrown,
`AbortMultipartUploadResponse` and the exception as `SomeException`. If aborting
the upload also fails then the exception caused by the call to abort will be thrown.
'Amazonka.S3.ListMultipartUploads' can be used to list any pending
uploads - it is important to abort multipart uploads because you will
be charged for storage of the parts until it is completed or aborted.
See the AWS documentation for more details.
Internally, a single @chunkSize@d buffer will be allocated and reused between
requests to avoid holding onto incoming @ByteString@s.
May throw 'Amazonka.Error'
-}
streamUpload :: forall m. (MonadUnliftIO m, MonadResource m)
=> Env
-> Maybe ChunkSize -- ^ Optional chunk size
-> CreateMultipartUpload -- ^ Upload location
-> ConduitT ByteString Void m (Either (AbortMultipartUploadResponse, SomeException) CompleteMultipartUploadResponse)
streamUpload env mChunkSize multiPartUploadDesc@CreateMultipartUpload'{bucket = buck, key = k} = do
buffer <- liftIO $ allocBuffer chunkSize
unsafeWriteChunksToBuffer buffer
.| enumerateConduit
.| startUpload buffer
where
chunkSize :: ChunkSize
chunkSize = maybe minimumChunkSize (max minimumChunkSize) mChunkSize
logStr :: String -> m ()
logStr msg = do
liftIO $ logger env Debug $ stringUtf8 msg
startUpload :: Buffer
-> ConduitT (Int, BufferResult) Void m
(Either (AbortMultipartUploadResponse, SomeException)
CompleteMultipartUploadResponse)
startUpload buffer = do
CreateMultipartUploadResponse'{uploadId = upId} <- lift $ send env multiPartUploadDesc
lift $ logStr "\n**** Created upload\n"
handleC (cancelMultiUploadConduit upId) $
CC.mapM (multiUpload buffer upId)
.| finishMultiUploadConduit upId
multiUpload :: Buffer -> Text -> (Int, BufferResult) -> m (Maybe CompletedPart)
multiUpload buffer upId (partnum, result) = do
let !bs = bufferToByteString buffer result
!bsHash = hash bs
UploadPartResponse'{eTag} <- send env $! newUploadPart buck k partnum upId $! toBody $! HashedBytes bsHash bs
let !_ = rwhnf eTag
logStr $ "\n**** Uploaded part " <> show partnum
return $! newCompletedPart partnum <$> eTag
-- collect all the parts
finishMultiUploadConduit :: Text
-> ConduitT (Maybe CompletedPart) Void m
(Either (AbortMultipartUploadResponse, SomeException) CompleteMultipartUploadResponse)
finishMultiUploadConduit upId = do
parts <- sinkList
res <- lift $ send env $ (newCompleteMultipartUpload buck k upId)
{ multipartUpload =
Just $ newCompletedMultipartUpload {parts = sequenceA $ fromList parts}
}
return $ Right res
-- in case of an exception, return Left
cancelMultiUploadConduit :: Text -> SomeException
-> ConduitT i Void m
(Either (AbortMultipartUploadResponse, SomeException) CompleteMultipartUploadResponse)
cancelMultiUploadConduit upId exc = do
res <- lift $ send env $ newAbortMultipartUpload buck k upId
return $ Left (res, exc)
-- count from 1
enumerateConduit :: ConduitT a (Int, a) m ()
enumerateConduit = loop 1
where
loop i = await >>= maybe (return ()) (go i)
go i x = do
yield (i, x)
loop (i + 1)
{-# INLINE enumerateConduit #-}
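The control flow of `streamUpload` above is: create the multipart upload, send each buffered chunk as a numbered part while collecting the returned ETags, then either complete the upload with those parts or abort it on any exception. A hedged Python sketch of that protocol against a stub client (the client's method names are illustrative, not the real AWS API):

```python
# Skeleton of the multipart protocol streamUpload follows (stub client).
def stream_upload(client, chunks):
    upload_id = client.create()
    parts = []
    try:
        for number, chunk in enumerate(chunks, start=1):  # parts are 1-based
            etag = client.upload_part(upload_id, number, chunk)
            parts.append((number, etag))
        return ("completed", client.complete(upload_id, parts))
    except Exception as exc:
        client.abort(upload_id)                           # never leave parts behind
        return ("aborted", exc)

class FakeClient:
    """In-memory stand-in used only to exercise the flow."""
    aborted = False
    def create(self): return "uid-1"
    def upload_part(self, uid, n, chunk): return f"etag-{n}"
    def complete(self, uid, parts): return parts
    def abort(self, uid): self.aborted = True
```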
-- The number of bytes remaining in a buffer, and the pointer that backs it.
data Buffer = Buffer {remaining :: !Int, _fptr :: !(ForeignPtr Word8)}
data PutResult
= Ok Buffer -- Didn't fill the buffer, updated buffer.
| Full ByteString -- Buffer is full, the unwritten remaining string.
data BufferResult = FullBuffer | Incomplete Int
-- Accepts @ByteString@s and writes them into @Buffer@. When the buffer is full,
-- @FullBuffer@ is emitted. If there is no more input, @Incomplete@ is emitted with
-- the number of bytes remaining in the buffer.
unsafeWriteChunksToBuffer :: MonadIO m => Buffer -> ConduitT ByteString BufferResult m ()
unsafeWriteChunksToBuffer buffer0 = awaitLoop buffer0 where
awaitLoop buf =
await >>= maybe (yield $ Incomplete $ remaining buf)
(liftIO . putBuffer buf >=> \case
Full next -> yield FullBuffer *> chunkLoop buffer0 next
Ok buf' -> awaitLoop buf'
)
-- Handle inputs which are larger than the chunkSize
chunkLoop buf = liftIO . putBuffer buf >=> \case
Full next -> yield FullBuffer *> chunkLoop buffer0 next
Ok buf' -> awaitLoop buf'
bufferToByteString :: Buffer -> BufferResult -> ByteString
bufferToByteString (Buffer bufSize fptr) FullBuffer = PS fptr 0 bufSize
bufferToByteString (Buffer bufSize fptr) (Incomplete remaining) = PS fptr 0 (bufSize - remaining)
allocBuffer :: Int -> IO Buffer
allocBuffer chunkSize = Buffer chunkSize <$> mallocForeignPtrBytes chunkSize
putBuffer :: Buffer -> ByteString -> IO PutResult
putBuffer buffer bs
| BS.length bs <= remaining buffer =
Ok <$> unsafeWriteBuffer buffer bs
| otherwise = do
let (remainder,rest) = BS.splitAt (remaining buffer) bs
Full rest <$ unsafeWriteBuffer buffer remainder
-- The length of the bytestring must be less than or equal to the number
-- of bytes remaining.
unsafeWriteBuffer :: Buffer -> ByteString -> IO Buffer
unsafeWriteBuffer (Buffer remaining fptr) bs = do
let ptr = unsafeForeignPtrToPtr fptr
len = BS.length bs
_ <- runBuilder (byteStringCopy bs) ptr remaining
pure $ Buffer (remaining - len) (plusForeignPtr fptr len)
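`unsafeWriteChunksToBuffer` above reuses one fixed buffer: incoming strings are copied in until the buffer fills, a full chunk is emitted, and oversized inputs are split with the remainder carried over; whatever is left at end of input goes out as a short final chunk. The same state machine in plain Python (a bytearray instead of a ForeignPtr; a sketch, not the conduit code):

```python
# One reusable buffer: emit a chunk whenever it fills; flush the tail at EOF.
def rechunk(chunks, size):
    buf = bytearray()
    for bs in chunks:
        while len(buf) + len(bs) >= size:
            take = size - len(buf)            # bytes that still fit
            yield bytes(buf) + bs[:take]      # a full, size-byte part
            buf, bs = bytearray(), bs[take:]
        buf.extend(bs)
    if buf:
        yield bytes(buf)                      # the final, possibly short, part
```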
-- | Specifies whether to upload a file or 'ByteString'.
data UploadLocation
= FP FilePath -- ^ A file to be uploaded
  | BS ByteString -- ^ A strict 'ByteString'
{-|
Allows a file or 'ByteString' to be uploaded concurrently, using the
async library. The chunk size may optionally be specified, but will be at least
`minimumChunkSize`, and may be made larger than if the `ByteString` or file
is larger enough to cause more than 10,000 chunks.
Files are mmapped into 'chunkSize' chunks and each chunk is uploaded in parallel.
This considerably reduces the memory necessary compared to reading the contents
into memory as a strict 'ByteString'. The usual caveats about mmaped files apply:
if the file is modified during this operation, the data may become corrupt.
May throw `Amazonka.Error`, or `IOError`; an attempt is made to cancel the
multipart upload on any error, but this may also fail if, for example, the network
connection has been broken. See `abortAllUploads` for a crude cleanup method.
-}
concurrentUpload :: forall m.
(MonadResource m, MonadCatch m)
=> Env
-> Maybe ChunkSize -- ^ Optional chunk size
-> Maybe NumThreads -- ^ Optional number of threads to upload with
  -> UploadLocation -- ^ Whether to upload a file on disk or a `ByteString` that's already in memory.
-> CreateMultipartUpload -- ^ Description of where to upload.
-> m CompleteMultipartUploadResponse
concurrentUpload env' mChunkSize mNumThreads uploadLoc
multiPartUploadDesc@CreateMultipartUpload'{bucket = buck, key = k}
= do
CreateMultipartUploadResponse'{uploadId = upId} <- send env' multiPartUploadDesc
let logStr :: MonadIO n => String -> n ()
logStr = liftIO . logger env' Info . stringUtf8
calculateChunkSize :: Int -> Int
calculateChunkSize len =
let chunkSize' = maybe minimumChunkSize (max minimumChunkSize) mChunkSize
in if len `div` chunkSize' >= 10000 then len `div` 9999 else chunkSize'
mConnCount = managerConnCount tlsManagerSettings
nThreads = maybe mConnCount (max 1) mNumThreads
env <- if maybe False (> mConnCount) mNumThreads
then do
mgr' <- liftIO $ newManager tlsManagerSettings{managerConnCount = nThreads}
pure env'{manager = mgr'}
else pure env'
flip onException (send env (newAbortMultipartUpload buck k upId)) $ do
sem <- liftIO $ newQSem nThreads
uploadResponses <- case uploadLoc of
BS bytes ->
let chunkSize = calculateChunkSize $ BS.length bytes
in liftIO $ forConcurrently (zip [1..] $ chunksOf chunkSize bytes) $ \(partnum, chunk) ->
bracket_ (waitQSem sem) (signalQSem sem) $ do
logStr $ "Starting part: " ++ show partnum
UploadPartResponse'{eTag} <- runResourceT $ send env . newUploadPart buck k partnum upId . toBody $ chunk
logStr $ "Finished part: " ++ show partnum
pure $ newCompletedPart partnum <$> eTag
FP filePath -> do
fsize <- liftIO $ getFileSize filePath
let chunkSize = calculateChunkSize $ fromIntegral fsize
(count,lst) = fromIntegral fsize `divMod` chunkSize
params = [(partnum, chunkSize*offset, size)
| partnum <- [1..]
| offset <- [0..count]
| size <- (chunkSize <$ [0..count-1]) ++ [lst]
]
liftIO $ forConcurrently params $ \(partnum,off,size) ->
bracket_ (waitQSem sem) (signalQSem sem) $ do
logStr $ "Starting file part: " ++ show partnum
chunkStream <- hashedFileRange filePath (fromIntegral off) (fromIntegral size)
UploadPartResponse'{eTag} <- runResourceT $
send env . newUploadPart buck k partnum upId . toBody $ chunkStream
logStr $ "Finished file part: " ++ show partnum
pure $ newCompletedPart partnum <$> eTag
let parts = nonEmpty =<< sequence uploadResponses
send env $ (newCompleteMultipartUpload buck k upId)
{ multipartUpload = Just $ newCompletedMultipartUpload { parts } }
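The chunk sizing and part-splitting used by concurrentUpload can be checked with a small standalone sketch. The 6 MB floor and the 10,000-part cap come from the module above; the Python function names are mine and this is illustrative, not part of the package:

```python
MINIMUM_CHUNK_SIZE = 6 * 1024 * 1024  # the module's minimumChunkSize floor

def calculate_chunk_size(total_len, requested=None):
    # Mirrors calculateChunkSize: clamp the requested size up to the
    # minimum, then widen it if it would produce 10,000 or more parts.
    chunk = max(MINIMUM_CHUNK_SIZE, requested or MINIMUM_CHUNK_SIZE)
    if total_len // chunk >= 10_000:
        return total_len // 9_999
    return chunk

def part_ranges(fsize, chunk):
    # Mirrors the parallel list comprehension over divMod: split
    # [0, fsize) into (partnum, offset, size) triples, the final part
    # taking the remainder.
    count, last = divmod(fsize, chunk)
    sizes = [chunk] * count + [last]
    return [(i + 1, chunk * i, s) for i, s in enumerate(sizes)]
```

Note that, like the Haskell comprehension, `part_ranges` emits a trailing zero-length part when the file size is an exact multiple of the chunk size.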
-- | Aborts all uploads in a given bucket - useful for cleaning up.
abortAllUploads :: MonadResource m => Env -> BucketName -> m ()
abortAllUploads env buck = do
ListMultipartUploadsResponse' {uploads} <- send env $ newListMultipartUploads buck
flip (traverse_ . traverse_) uploads $ \MultipartUpload'{key, uploadId} -> do
let mki = (,) <$> key <*> uploadId
for_ mki $ \(key',uid) -> send env (newAbortMultipartUpload buck key' uid)
-- -analog-for-bytestring
justWhen :: (a -> Bool) -> (a -> b) -> a -> Maybe b
justWhen f g a = if f a then Just (g a) else Nothing
nothingWhen :: (a -> Bool) -> (a -> b) -> a -> Maybe b
nothingWhen f = justWhen (not . f)
chunksOf :: Int -> BS.ByteString -> [BS.ByteString]
chunksOf x = unfoldr (nothingWhen BS.null (BS.splitAt x))
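chunksOf above is an unfoldr over splitAt; a rough Python equivalent (illustrative only, the name is mine):

```python
def chunks_of(n, data):
    # Split `data` into successive pieces of length n; the last piece
    # may be shorter, and empty input yields no pieces, matching the
    # unfoldr/splitAt definition above.
    out = []
    while data:
        out.append(data[:n])
        data = data[n:]
    return out
```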
| null | https://raw.githubusercontent.com/axman6/amazonka-s3-streaming/4d1aa55fa5fc347f46567fd4efdd0da0a69cda96/src/Amazonka/S3/StreamingUpload.hs | haskell | # LANGUAGE BangPatterns #
# LANGUAGE OverloadedStrings #
Making this 5MB+1 seemed to cause AWS to complain
^ Optional chunk size
^ Upload location
collect all the parts
in case of an exception, return Left
The number of bytes remaining in a buffer, and the pointer that backs it.
Didn't fill the buffer, updated buffer.
Buffer is full, the unwritten remaining string.
the number of bytes remaining in the buffer.
Handle inputs which are larger than the chunkSize
The length of the bytestring must be less than or equal to the number
of bytes remaining.
| Specifies whether to upload a file or 'ByteString'.
^ A file to be uploaded
^ Optional chunk size
^ Optional number of threads to upload with
^ Description of where to upload.
| Aborts all uploads in a given bucket - useful for cleaning up.
-analog-for-bytestring | # LANGUAGE DerivingStrategies #
{-# LANGUAGE DuplicateRecordFields #-}
{-# LANGUAGE ImportQualifiedPost #-}
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE NamedFieldPuns #-}
{-# LANGUAGE ParallelListComp #-}
{-# LANGUAGE ScopedTypeVariables #-}
module Amazonka.S3.StreamingUpload
( streamUpload
, ChunkSize
, minimumChunkSize
, NumThreads
, concurrentUpload
, UploadLocation(..)
, abortAllUploads
, module Amazonka.S3.CreateMultipartUpload
, module Amazonka.S3.CompleteMultipartUpload
) where
import Amazonka ( HashedBody(..), LogLevel(..), getFileSize, hashedFileRange, send, toBody )
import Amazonka.Crypto ( hash )
import Amazonka.Env ( Env, logger, manager )
import Amazonka.S3.AbortMultipartUpload ( AbortMultipartUploadResponse, newAbortMultipartUpload )
import Amazonka.S3.CompleteMultipartUpload
( CompleteMultipartUpload(..), CompleteMultipartUploadResponse, newCompleteMultipartUpload )
import Amazonka.S3.CreateMultipartUpload
( CreateMultipartUpload(..), CreateMultipartUploadResponse(..) )
import Amazonka.S3.ListMultipartUploads
( ListMultipartUploadsResponse(..), newListMultipartUploads, uploads )
import Amazonka.S3.Types
( BucketName, CompletedMultipartUpload(..), CompletedPart, MultipartUpload(..),
newCompletedMultipartUpload, newCompletedPart )
import Amazonka.S3.UploadPart ( UploadPartResponse(..), newUploadPart )
import Network.HTTP.Client ( managerConnCount, newManager )
import Network.HTTP.Client.TLS ( tlsManagerSettings )
import Control.Monad.Catch ( Exception, MonadCatch, onException )
import Control.Monad.IO.Class ( MonadIO, liftIO )
import Control.Monad.Trans.Class ( lift )
import Control.Monad.Trans.Resource ( MonadResource, runResourceT )
import Conduit ( MonadUnliftIO(..) )
import Data.Conduit ( ConduitT, Void, await, handleC, yield, (.|) )
import Data.Conduit.Combinators ( sinkList )
import Data.Conduit.Combinators qualified as CC
import Data.ByteString qualified as BS
import Data.ByteString.Builder ( stringUtf8 )
import Data.ByteString.Builder.Extra ( byteStringCopy, runBuilder )
import Data.ByteString.Internal ( ByteString(PS) )
import Data.List ( unfoldr )
import Data.List.NonEmpty ( fromList, nonEmpty )
import Data.Text ( Text )
import Control.Concurrent ( newQSem, signalQSem, waitQSem )
import Control.Concurrent.Async ( forConcurrently )
import Control.Exception.Base ( SomeException, bracket_ )
import Foreign.ForeignPtr ( ForeignPtr, mallocForeignPtrBytes, plusForeignPtr )
import Foreign.ForeignPtr.Unsafe ( unsafeForeignPtrToPtr )
import Control.DeepSeq ( rwhnf )
import Data.Foldable ( for_, traverse_ )
import Data.Typeable ( Typeable )
import Data.Word ( Word8 )
import Control.Monad ((>=>))
type ChunkSize = Int
type NumThreads = Int
-- | Minimum size of data which will be sent in a single part, currently 6MB
minimumChunkSize :: ChunkSize
minimumChunkSize = 6 * 1024 * 1024
data StreamingError
= UnableToCreateMultipartUpload CreateMultipartUploadResponse
| FailedToUploadPiece UploadPartResponse
| Other String
deriving stock (Show, Eq, Typeable)
instance Exception StreamingError
{-|
Given a 'CreateMultipartUpload', creates a 'Sink' which will sequentially
upload the data streamed in in chunks of at least 'minimumChunkSize' and return either
the 'CompleteMultipartUploadResponse', or if an exception is thrown,
`AbortMultipartUploadResponse` and the exception as `SomeException`. If aborting
the upload also fails then the exception caused by the call to abort will be thrown.
'Amazonka.S3.ListMultipartUploads' can be used to list any pending
uploads - it is important to abort multipart uploads because you will
be charged for storage of the parts until it is completed or aborted.
See the AWS documentation for more details.
Internally, a single @chunkSize@d buffer will be allocated and reused between
requests to avoid holding onto incoming @ByteString@s.
May throw 'Amazonka.Error'
-}
streamUpload :: forall m. (MonadUnliftIO m, MonadResource m)
  => Env
  -> Maybe ChunkSize -- ^ Optional chunk size
  -> CreateMultipartUpload -- ^ Upload location
-> ConduitT ByteString Void m (Either (AbortMultipartUploadResponse, SomeException) CompleteMultipartUploadResponse)
streamUpload env mChunkSize multiPartUploadDesc@CreateMultipartUpload'{bucket = buck, key = k} = do
buffer <- liftIO $ allocBuffer chunkSize
unsafeWriteChunksToBuffer buffer
.| enumerateConduit
.| startUpload buffer
where
chunkSize :: ChunkSize
chunkSize = maybe minimumChunkSize (max minimumChunkSize) mChunkSize
logStr :: String -> m ()
logStr msg = do
liftIO $ logger env Debug $ stringUtf8 msg
startUpload :: Buffer
-> ConduitT (Int, BufferResult) Void m
(Either (AbortMultipartUploadResponse, SomeException)
CompleteMultipartUploadResponse)
startUpload buffer = do
CreateMultipartUploadResponse'{uploadId = upId} <- lift $ send env multiPartUploadDesc
lift $ logStr "\n**** Created upload\n"
handleC (cancelMultiUploadConduit upId) $
CC.mapM (multiUpload buffer upId)
.| finishMultiUploadConduit upId
multiUpload :: Buffer -> Text -> (Int, BufferResult) -> m (Maybe CompletedPart)
multiUpload buffer upId (partnum, result) = do
let !bs = bufferToByteString buffer result
!bsHash = hash bs
UploadPartResponse'{eTag} <- send env $! newUploadPart buck k partnum upId $! toBody $! HashedBytes bsHash bs
let !_ = rwhnf eTag
logStr $ "\n**** Uploaded part " <> show partnum
return $! newCompletedPart partnum <$> eTag
finishMultiUploadConduit :: Text
-> ConduitT (Maybe CompletedPart) Void m
(Either (AbortMultipartUploadResponse, SomeException) CompleteMultipartUploadResponse)
finishMultiUploadConduit upId = do
parts <- sinkList
res <- lift $ send env $ (newCompleteMultipartUpload buck k upId)
{ multipartUpload =
Just $ newCompletedMultipartUpload {parts = sequenceA $ fromList parts}
}
return $ Right res
cancelMultiUploadConduit :: Text -> SomeException
-> ConduitT i Void m
(Either (AbortMultipartUploadResponse, SomeException) CompleteMultipartUploadResponse)
cancelMultiUploadConduit upId exc = do
res <- lift $ send env $ newAbortMultipartUpload buck k upId
return $ Left (res, exc)
-- count from 1
enumerateConduit :: ConduitT a (Int, a) m ()
enumerateConduit = loop 1
where
loop i = await >>= maybe (return ()) (go i)
go i x = do
yield (i, x)
loop (i + 1)
{-# INLINE enumerateConduit #-}
data Buffer = Buffer {remaining :: !Int, _fptr :: !(ForeignPtr Word8)}
data PutResult
  = Ok Buffer -- Didn't fill the buffer, updated buffer.
  | Full ByteString -- Buffer is full, the unwritten remaining string.
data BufferResult = FullBuffer | Incomplete Int
-- Accepts @ByteString@s and writes them into @Buffer@. When the buffer is full,
-- @FullBuffer@ is emitted. If there is no more input, @Incomplete@ is emitted with
-- the number of bytes remaining in the buffer.
unsafeWriteChunksToBuffer :: MonadIO m => Buffer -> ConduitT ByteString BufferResult m ()
unsafeWriteChunksToBuffer buffer0 = awaitLoop buffer0 where
awaitLoop buf =
await >>= maybe (yield $ Incomplete $ remaining buf)
(liftIO . putBuffer buf >=> \case
Full next -> yield FullBuffer *> chunkLoop buffer0 next
Ok buf' -> awaitLoop buf'
)
chunkLoop buf = liftIO . putBuffer buf >=> \case
Full next -> yield FullBuffer *> chunkLoop buffer0 next
Ok buf' -> awaitLoop buf'
bufferToByteString :: Buffer -> BufferResult -> ByteString
bufferToByteString (Buffer bufSize fptr) FullBuffer = PS fptr 0 bufSize
bufferToByteString (Buffer bufSize fptr) (Incomplete remaining) = PS fptr 0 (bufSize - remaining)
allocBuffer :: Int -> IO Buffer
allocBuffer chunkSize = Buffer chunkSize <$> mallocForeignPtrBytes chunkSize
putBuffer :: Buffer -> ByteString -> IO PutResult
putBuffer buffer bs
| BS.length bs <= remaining buffer =
Ok <$> unsafeWriteBuffer buffer bs
| otherwise = do
let (remainder,rest) = BS.splitAt (remaining buffer) bs
Full rest <$ unsafeWriteBuffer buffer remainder
unsafeWriteBuffer :: Buffer -> ByteString -> IO Buffer
unsafeWriteBuffer (Buffer remaining fptr) bs = do
let ptr = unsafeForeignPtrToPtr fptr
len = BS.length bs
_ <- runBuilder (byteStringCopy bs) ptr remaining
pure $ Buffer (remaining - len) (plusForeignPtr fptr len)
data UploadLocation
    = FP FilePath -- ^ A file to be uploaded
    | BS ByteString -- ^ A strict 'ByteString'

{-|
Allows a file or 'ByteString' to be uploaded concurrently, using the
async library. The chunk size may optionally be specified, but will be at least
`minimumChunkSize`, and may be made larger than if the `ByteString` or file
is larger enough to cause more than 10,000 chunks.
Files are mmapped into 'chunkSize' chunks and each chunk is uploaded in parallel.
This considerably reduces the memory necessary compared to reading the contents
into memory as a strict 'ByteString'. The usual caveats about mmaped files apply:
if the file is modified during this operation, the data may become corrupt.
May throw `Amazonka.Error`, or `IOError`; an attempt is made to cancel the
multipart upload on any error, but this may also fail if, for example, the network
connection has been broken. See `abortAllUploads` for a crude cleanup method.
-}
concurrentUpload :: forall m.
(MonadResource m, MonadCatch m)
=> Env
  -> Maybe ChunkSize -- ^ Optional chunk size
  -> Maybe NumThreads -- ^ Optional number of threads to upload with
  -> UploadLocation -- ^ Whether to upload a file on disk or a `ByteString` that's already in memory.
  -> CreateMultipartUpload -- ^ Description of where to upload.
-> m CompleteMultipartUploadResponse
concurrentUpload env' mChunkSize mNumThreads uploadLoc
multiPartUploadDesc@CreateMultipartUpload'{bucket = buck, key = k}
= do
CreateMultipartUploadResponse'{uploadId = upId} <- send env' multiPartUploadDesc
let logStr :: MonadIO n => String -> n ()
logStr = liftIO . logger env' Info . stringUtf8
calculateChunkSize :: Int -> Int
calculateChunkSize len =
let chunkSize' = maybe minimumChunkSize (max minimumChunkSize) mChunkSize
in if len `div` chunkSize' >= 10000 then len `div` 9999 else chunkSize'
mConnCount = managerConnCount tlsManagerSettings
nThreads = maybe mConnCount (max 1) mNumThreads
env <- if maybe False (> mConnCount) mNumThreads
then do
mgr' <- liftIO $ newManager tlsManagerSettings{managerConnCount = nThreads}
pure env'{manager = mgr'}
else pure env'
flip onException (send env (newAbortMultipartUpload buck k upId)) $ do
sem <- liftIO $ newQSem nThreads
uploadResponses <- case uploadLoc of
BS bytes ->
let chunkSize = calculateChunkSize $ BS.length bytes
in liftIO $ forConcurrently (zip [1..] $ chunksOf chunkSize bytes) $ \(partnum, chunk) ->
bracket_ (waitQSem sem) (signalQSem sem) $ do
logStr $ "Starting part: " ++ show partnum
UploadPartResponse'{eTag} <- runResourceT $ send env . newUploadPart buck k partnum upId . toBody $ chunk
logStr $ "Finished part: " ++ show partnum
pure $ newCompletedPart partnum <$> eTag
FP filePath -> do
fsize <- liftIO $ getFileSize filePath
let chunkSize = calculateChunkSize $ fromIntegral fsize
(count,lst) = fromIntegral fsize `divMod` chunkSize
params = [(partnum, chunkSize*offset, size)
| partnum <- [1..]
| offset <- [0..count]
| size <- (chunkSize <$ [0..count-1]) ++ [lst]
]
liftIO $ forConcurrently params $ \(partnum,off,size) ->
bracket_ (waitQSem sem) (signalQSem sem) $ do
logStr $ "Starting file part: " ++ show partnum
chunkStream <- hashedFileRange filePath (fromIntegral off) (fromIntegral size)
UploadPartResponse'{eTag} <- runResourceT $
send env . newUploadPart buck k partnum upId . toBody $ chunkStream
logStr $ "Finished file part: " ++ show partnum
pure $ newCompletedPart partnum <$> eTag
let parts = nonEmpty =<< sequence uploadResponses
send env $ (newCompleteMultipartUpload buck k upId)
{ multipartUpload = Just $ newCompletedMultipartUpload { parts } }
abortAllUploads :: MonadResource m => Env -> BucketName -> m ()
abortAllUploads env buck = do
ListMultipartUploadsResponse' {uploads} <- send env $ newListMultipartUploads buck
flip (traverse_ . traverse_) uploads $ \MultipartUpload'{key, uploadId} -> do
let mki = (,) <$> key <*> uploadId
for_ mki $ \(key',uid) -> send env (newAbortMultipartUpload buck key' uid)
justWhen :: (a -> Bool) -> (a -> b) -> a -> Maybe b
justWhen f g a = if f a then Just (g a) else Nothing
nothingWhen :: (a -> Bool) -> (a -> b) -> a -> Maybe b
nothingWhen f = justWhen (not . f)
chunksOf :: Int -> BS.ByteString -> [BS.ByteString]
chunksOf x = unfoldr (nothingWhen BS.null (BS.splitAt x))
|
7d17be652659b6458509fed608741ddaeafcf78677b2d13441db487a89e3eb5a | yuanqing/code-problems | test.ml | open Bubble_sort
open OUnit2
let () = run_test_tt_main ("bubble_sort compare xs" >::: [
"empty list" >:: (fun _ ->
assert_equal [] (bubble_sort compare [])
);
"single item" >:: (fun _ ->
assert_equal [42] (bubble_sort compare [42])
);
"multiple items; ascending order" >:: (fun _ ->
assert_equal [4; 8; 15; 16; 23; 42]
(bubble_sort compare [42; 8; 15; 23; 4; 16])
);
"multiple items; descending order" >:: (fun _ ->
assert_equal [42; 23; 16; 15; 8; 4]
(bubble_sort (fun x y -> -(compare x y)) [42; 8; 15; 23; 4; 16])
);
])
| null | https://raw.githubusercontent.com/yuanqing/code-problems/30eb34ad616146306cddc50594a47deff111f341/src/bubble_sort/test.ml | ocaml | open Bubble_sort
open OUnit2
let () = run_test_tt_main ("bubble_sort compare xs" >::: [
"empty list" >:: (fun _ ->
assert_equal [] (bubble_sort compare [])
);
"single item" >:: (fun _ ->
assert_equal [42] (bubble_sort compare [42])
);
"multiple items; ascending order" >:: (fun _ ->
assert_equal [4; 8; 15; 16; 23; 42]
(bubble_sort compare [42; 8; 15; 23; 4; 16])
);
"multiple items; descending order" >:: (fun _ ->
assert_equal [42; 23; 16; 15; 8; 4]
(bubble_sort (fun x y -> -(compare x y)) [42; 8; 15; 23; 4; 16])
);
])
| |
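The OUnit suite above pins down the expected behaviour of `bubble_sort compare xs`, but the implementation lives in the separate `Bubble_sort` module. As an illustrative sketch (function names are mine), a comparator-driven bubble sort that satisfies the same cases:

```python
def bubble_sort(compare, xs):
    # Repeatedly swap adjacent items the comparator says are out of
    # order; stop early on a pass with no swaps.
    items = list(xs)
    n = len(items)
    for i in range(n):
        swapped = False
        for j in range(n - 1 - i):
            if compare(items[j], items[j + 1]) > 0:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:
            break
    return items

def compare(a, b):
    # Three-way comparison like OCaml's polymorphic `compare`.
    return (a > b) - (a < b)
```

Reversing the comparator, as the last test case does, yields descending order without changing the sort itself.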
371dc7fd9eb9c2db55da015a4265bce1d9ccd8aed5d8c1cde5c1de10029326bb | callum-oakley/advent-of-code | 04.clj | (ns aoc.2019.04
(:require
[clojure.test :refer [deftest is]]))
(defn parse [s]
(->> s (re-seq #"\d+") (map read-string)))
(defn valid? [f pass]
(let [digits (map int (str pass))]
(and (apply <= digits)
(some #(f 2 (count %)) (partition-by identity digits)))))
(defn part-* [input f]
(->> input (apply range) (filter #(valid? f %)) count))
(defn part-1 [input]
(part-* input <=))
(defn part-2 [input]
(part-* input =))
(deftest test-example
(is (valid? <= 111111))
(is (not (valid? <= 223450)))
(is (not (valid? <= 123789)))
(is (valid? = 112233))
(is (not (valid? = 123444)))
(is (valid? = 111122)))
| null | https://raw.githubusercontent.com/callum-oakley/advent-of-code/16a98d33fd8158a1b4f00776e74ead3242519833/src/aoc/2019/04.clj | clojure | (ns aoc.2019.04
(:require
[clojure.test :refer [deftest is]]))
(defn parse [s]
(->> s (re-seq #"\d+") (map read-string)))
(defn valid? [f pass]
(let [digits (map int (str pass))]
(and (apply <= digits)
(some #(f 2 (count %)) (partition-by identity digits)))))
(defn part-* [input f]
(->> input (apply range) (filter #(valid? f %)) count))
(defn part-1 [input]
(part-* input <=))
(defn part-2 [input]
(part-* input =))
(deftest test-example
(is (valid? <= 111111))
(is (not (valid? <= 223450)))
(is (not (valid? <= 123789)))
(is (valid? = 112233))
(is (not (valid? = 123444)))
(is (valid? = 111122)))
| |
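The `valid?` predicate above combines two checks: the digits must be non-decreasing, and some run of equal digits must satisfy the run-length test (`<=` for part 1, `=` for part 2). An illustrative Python translation (names are mine, `groupby` standing in for `partition-by identity`):

```python
from itertools import groupby

def valid(check, password):
    digits = str(password)
    if any(a > b for a, b in zip(digits, digits[1:])):
        return False  # digits must never decrease
    # partition-by identity, then test each run length with check(2, len)
    return any(check(2, len(list(g))) for _, g in groupby(digits))
```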
02c7eb32b80b81ab5a5e144cc80504173cc55c9295a6275d8ec190cd03692358 | collaborativetrust/WikiTrust | author_sig_7_4.ml |
Copyright (c) 2007-2008 The Regents of the University of California
All rights reserved.
Authors: Luca de Alfaro
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice,
this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
3. The names of the contributors may not be used to endorse or promote
products derived from this software without specific prior written
permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.
*)
open Eval_defs
(** Type of author signature *)
type packed_author_signature_t = int
type unpacked_author_signature_t = int * int * int * int
type author_signature_t = int
let mask = 0o177
let offset = 7
external hash_param : int -> int -> 'a -> int = "caml_hash_univ_param" "noalloc"
let hash x = 1 + (hash_param 10 100 x) mod 127
let empty_sigs = 0
let sexp_of_sigs = Sexplib.Conv.sexp_of_int
let sigs_of_sexp = Sexplib.Conv.int_of_sexp
let pack (a0: int) (a1: int) (a2: int) (a3: int) : packed_author_signature_t =
a0 lor ((a1 lor ((a2 lor (a3 lsl offset)) lsl offset)) lsl offset)
let unpack (p: packed_author_signature_t) : unpacked_author_signature_t =
let a0 = p land mask in
let b1 = p lsr offset in
let a1 = b1 land mask in
let b2 = b1 lsr offset in
let a2 = b2 land mask in
let b3 = b2 lsr offset in
let a3 = b3 land mask in
(a0, a1, a2, a3)
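pack/unpack above store four 7-bit hash slots in one integer, slot 0 in the low bits (mask 0o177, offset 7). An illustrative Python model of the same bit layout (names are mine):

```python
MASK = 0o177  # low 7 bits
OFFSET = 7

def pack(a0, a1, a2, a3):
    # Same nesting as the OCaml: a0 lands in the low 7 bits, a3 highest.
    return a0 | ((a1 | ((a2 | (a3 << OFFSET)) << OFFSET)) << OFFSET)

def unpack(p):
    # Recover the four 7-bit slots from a packed signature.
    return (p & MASK,
            (p >> OFFSET) & MASK,
            (p >> (2 * OFFSET)) & MASK,
            (p >> (3 * OFFSET)) & MASK)
```

As in `add_author`, packing a fresh hash in front of the first three slots drops the oldest one.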
(** [is_author_in_sigs id w sigs] returns [true] if author [id] is in the signatures [sigs] of
word [w], and returns [false] otherwise. *)
let is_author_in_sigs (id: int) (w: string) (sigs: packed_author_signature_t) : bool =
if is_anonymous id then true
else
let (a0, a1, a2, a3) = unpack sigs in
let h = hash (id, w) in
(h = a0 || h = a1 || h = a2 || h = a3)
(** [add_author id w sigs] adds author [id] to the signatures [sigs] for word [w],
and returns the new signature. It assumes that the author was not already in the
list. *)
let add_author (id: int) (w: string) (sigs: packed_author_signature_t) : packed_author_signature_t =
if is_anonymous id then sigs
else
let (a0, a1, a2, a3) = unpack sigs in
let h = hash (id, w) in
pack h a0 a1 a2
| null | https://raw.githubusercontent.com/collaborativetrust/WikiTrust/9dd056e65c37a22f67d600dd1e87753aa0ec9e2c/analysis/author_sig_7_4.ml | ocaml | * Type of author signature
* [is_author_in_sigs id w sigs] returns [true] if author [id] is in the signatures [sigs] of
word [w], and returns [false] otherwise. |
Copyright (c) 2007-2008 The Regents of the University of California
All rights reserved.
Authors: Luca de Alfaro
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice,
this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
3. The names of the contributors may not be used to endorse or promote
products derived from this software without specific prior written
permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.
*)
open Eval_defs
type packed_author_signature_t = int
type unpacked_author_signature_t = int * int * int * int
type author_signature_t = int
let mask = 0o177
let offset = 7
external hash_param : int -> int -> 'a -> int = "caml_hash_univ_param" "noalloc"
let hash x = 1 + (hash_param 10 100 x) mod 127
let empty_sigs = 0
let sexp_of_sigs = Sexplib.Conv.sexp_of_int
let sigs_of_sexp = Sexplib.Conv.int_of_sexp
let pack (a0: int) (a1: int) (a2: int) (a3: int) : packed_author_signature_t =
a0 lor ((a1 lor ((a2 lor (a3 lsl offset)) lsl offset)) lsl offset)
let unpack (p: packed_author_signature_t) : unpacked_author_signature_t =
let a0 = p land mask in
let b1 = p lsr offset in
let a1 = b1 land mask in
let b2 = b1 lsr offset in
let a2 = b2 land mask in
let b3 = b2 lsr offset in
let a3 = b3 land mask in
(a0, a1, a2, a3)
let is_author_in_sigs (id: int) (w: string) (sigs: packed_author_signature_t) : bool =
if is_anonymous id then true
else
let (a0, a1, a2, a3) = unpack sigs in
let h = hash (id, w) in
(h = a0 || h = a1 || h = a2 || h = a3)
(** [add_author id w sigs] adds author [id] to the signatures [sigs] for word [w],
and returns the new signature. It assumes that the author was not already in the
list. *)
let add_author (id: int) (w: string) (sigs: packed_author_signature_t) : packed_author_signature_t =
if is_anonymous id then sigs
else
let (a0, a1, a2, a3) = unpack sigs in
let h = hash (id, w) in
pack h a0 a1 a2
|
792f5159d1eae344c938bdb267150f3fb0e7081b7d42c77225c92d76b3d8d626 | dpiponi/Moodler | sum4.hs | do
(x0, y0) <- mouse
let (x, y) = quantise2 quantum (x0, y0)
root <- currentPlane
sum49 <- new' "sum4"
container7 <- container' "panel_sum4.png" (x+(0.0), y+(0.0)) (Inside root)
plugin10 <- plugin' (sum49 ! "signal1") (x+(-60.0), y+(72.0)) (Outside container7)
setColour plugin10 "#sample"
plugin11 <- plugin' (sum49 ! "signal2") (x+(-60.0), y+(24.0)) (Outside container7)
setColour plugin11 "#sample"
plugin12 <- plugin' (sum49 ! "signal3") (x+(-60.0), y+(-24.0)) (Outside container7)
setColour plugin12 "#sample"
plugin13 <- plugin' (sum49 ! "signal4") (x+(-60.0), y+(-72.0)) (Outside container7)
setColour plugin13 "#sample"
plugout14 <- plugout' (sum49 ! "result") (x+(60.0), y+(0.0)) (Outside container7)
setColour plugout14 "#sample"
recompile
return ()
| null | https://raw.githubusercontent.com/dpiponi/Moodler/a0c984c36abae52668d00f25eb3749e97e8936d3/Moodler/scripts/sum4.hs | haskell |
| |
e20d96c5699d7f3e99ab4e8469537ed56cdbfdf66f9c0f707ab6a315a77b4ff3 | bucko909/erlcapnp | capnp_raw.erl | -module(capnp_raw).
-compile([export_all]).
-record(message, {
segments :: tuple(),
current_segment :: binary(),
current_offset :: integer(),
depth=0
}).
-record(struct, {
data :: binary(),
pointers :: list(term())
}).
-record(list, {
data :: list(term())
}).
read_message(<<SegmentCountM1:32/unsigned-little-integer, Rest/binary>>) ->
SegmentLengthsLength = (SegmentCountM1+1)*4,
PaddingLength = case SegmentCountM1 rem 2 == 0 of true -> 0; false -> 4 end,
<<SegmentLengthsData:SegmentLengthsLength/binary, _Padding:PaddingLength/binary, DataArea/binary>> = Rest,
SegmentLengths = get_segment_lengths(SegmentCountM1+1, SegmentLengthsData),
Segments = [FirstSegment|_] = get_segments(SegmentLengths, DataArea),
#message{
segments = list_to_tuple(Segments),
current_segment = FirstSegment,
current_offset = 0
}.
get_segment_lengths(0, <<>>) ->
[];
get_segment_lengths(N, <<SegmentLength:32/unsigned-little-integer, Rest/binary>>) ->
[SegmentLength*8|get_segment_lengths(N-1, Rest)].
get_segments([], <<>>) ->
[];
get_segments([Length|RestLengths], RemData) ->
<<Data:Length/binary, Rest/binary>> = RemData,
[Data|get_segments(RestLengths, Rest)].
% The depth guard bounds recursion on far-pointer chains (the cap of 64 is an arbitrary choice).
decode_pointer(Message = #message{current_segment=CS, current_offset=Offset, depth=Depth}) when Depth < 64 ->
	OffsetSize = Offset * 8,
	<<_:OffsetSize/binary, X:8/binary, _/binary>> = CS,
	decode_pointer(X, Message#message{depth=Depth+1});
decode_pointer(_) ->
excess_depth.
decode_pointers(<<>>, _) ->
[];
decode_pointers(<<Pointer:8/binary, Rest/binary>>, Message=#message{current_offset=Offset}) ->
[decode_pointer(Pointer, Message)|decode_pointers(Rest, Message#message{current_offset=Offset+1})].
decode_pointer(<<0:64/little-integer>>, _) ->
null_pointer;
decode_pointer(<<DataOffsetLSB:6/little-integer, 0:2/little-integer, DataOffsetMSB:24/little-integer, DataSize:16/little-integer, PointerSize:16/little-integer>>, Message=#message{current_offset=CurrentOffset}) ->
DataOffset = DataOffsetLSB + (DataOffsetMSB bsl 6),
	%io:format("Struct pointer: ~p, ~p~n", [P, {CurrentOffset+1+DataOffset, DataSize, PointerSize}]),
(catch decode_struct(DataSize, PointerSize, Message#message{current_offset=CurrentOffset+1+DataOffset}));
decode_pointer(<<DataOffsetLSB:6/little-integer, 1:2/little-integer, DataOffsetMSB:24/little-integer, ListLengthLSB:5/little-integer, ElementSize:3/little-integer, ListLengthMSB:24/little-integer>>, Message=#message{current_offset=CurrentOffset}) ->
DataOffset = DataOffsetLSB + (DataOffsetMSB bsl 6),
ListLength = ListLengthLSB + (ListLengthMSB bsl 5),
	%io:format("List pointer: ~p, ~p~n", [P, {{CurrentOffset, DataOffsetLSB, DataOffsetMSB, CurrentOffset+1+DataOffset}, {ListLength, ListLengthLSB, ListLengthMSB}}]),
(catch decode_list(ElementSize, ListLength, Message#message{current_offset=CurrentOffset+1+DataOffset}));
decode_pointer(<<SegmentOffsetLSB:5/little-integer, LandingPadExtra:1/little-integer, 2:2/little-integer, SegmentOffsetMSB:24/little-integer, SegmentNumber:32/little-integer>>, Message) ->
SegmentOffset = SegmentOffsetLSB + (SegmentOffsetMSB bsl 5),
	%io:format("Far pointer: ~p, ~p~n", [P, {{SegmentOffset, SegmentOffsetLSB, SegmentOffsetMSB}, SegmentNumber, LandingPadExtra}]),
(catch decode_far_pointer(SegmentOffset, SegmentNumber, LandingPadExtra, Message));
decode_pointer(<<0:6/little-integer, 3:2/little-integer, 0:24/little-integer, CapabilityOffset:32/little-integer>>, #message{}) ->
%io:format("Capability: ~p, ~p~n", [P, CapabilityOffset]),
{not_implemented_capabilities, CapabilityOffset};
decode_pointer(Junk, _) ->
%io:format("Junk: ~p~n", [Junk]),
{junk, Junk}.
decode_struct(DataWords, PointerWords, Message=#message{current_segment=CS, current_offset=Offset}) ->
OffsetSize = Offset * 8,
DataSize = DataWords * 8,
PointerSize = PointerWords * 8,
<<_:OffsetSize/binary, Data:DataSize/binary, Pointers:PointerSize/binary, _/binary>> = CS,
%io:format("Data: ~p; Pointers: ~p~n", [Data, Pointers]),
#struct{
data=Data,
pointers=(catch decode_pointers(Pointers, Message#message{current_offset=Offset+DataWords}))
}.
decode_list(6, Length, Message=#message{current_segment=CS, current_offset=Offset}) ->
OffsetSize = Offset * 8,
PointerSize = Length * 8,
<<_:OffsetSize/binary, Pointers:PointerSize/binary, _/binary>> = CS,
%io:format("List pointers: ~p~n", [Pointers]),
#list{
data=(catch decode_pointers(Pointers, Message#message{current_offset=Offset+1}))
};
decode_list(7, _WordLength, Message=#message{current_segment=CS, current_offset=Offset}) ->
OffsetSize = Offset * 8,
<<_:OffsetSize/binary, CS1/binary>> = CS,
<<ListLengthLSB:6/little-integer, 0:2/little-integer, ListLengthMSB:24/little-integer, DataWords:16/little-integer, PointerWords:16/little-integer, _/binary>> = CS1,
ListLength = ListLengthLSB + (ListLengthMSB bsl 6),
%io:format("Struct list.~n"),
#list{
data=[ (catch decode_struct(DataWords, PointerWords, Message#message{current_offset=Offset+1+I*(DataWords+PointerWords)})) || I <- lists:seq(0, ListLength-1) ]
};
decode_list(Size, Length, #message{current_segment=CS, current_offset=Offset}) ->
BitSize = element(Size+1, {0, 1, 8, 16, 32, 64}),
OffsetSize = Offset * 8,
DataSize = BitSize * Length,
<<_:OffsetSize/binary, Data:DataSize/bitstring, _/binary>> = CS,
Elts = (catch decode_list_elements(BitSize, Length, Data)),
	%io:format("Element list: ~p~n", [Elts]),
#list{data=Elts}.
decode_list_elements(_, 0, <<>>) ->
[];
decode_list_elements(S, L, Data) ->
	<<Elt:S/little-integer, Rest/bitstring>> = Data,
%io:format("decode_list_elements ~p ~p ~p~n", [S, L, Data]),
[Elt|decode_list_elements(S, L-1, Rest)].
decode_far_pointer(SegmentOffset, SegmentNumber, 0, Message=#message{segments=Segments}) ->
NewSegment = element(SegmentNumber+1, Segments),
	%io:format("Far pointer ~p (~p)~n", [SegmentOffset, SegmentNumber]),
(catch decode_pointer(Message#message{current_segment=NewSegment, current_offset=SegmentOffset})).
test() ->
{ok, Data} = file:read_file("capnp.raw"),
Mes = read_message(Data),
decode_pointer(Mes).
| null | https://raw.githubusercontent.com/bucko909/erlcapnp/813d314ec02f7ca634d4b296e2f57fecf67173d0/src/capnp_raw.erl | erlang |
|
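`read_message` above parses the Cap'n Proto stream framing: a 32-bit little-endian segment count minus one, one 32-bit word length per segment, then padding so the data area starts on an 8-byte boundary. The same layout in a Python sketch (the function name is mine):

```python
def read_segment_table(buf):
    # First u32: segment count minus one (little-endian).
    count = int.from_bytes(buf[0:4], "little") + 1
    # One u32 length per segment, measured in 8-byte words.
    lengths = [int.from_bytes(buf[4 + 4 * i:8 + 4 * i], "little")
               for i in range(count)]
    header = 4 + 4 * count
    header += (-header) % 8          # pad the header to an 8-byte boundary
    segments, off = [], header
    for words in lengths:
        segments.append(buf[off:off + words * 8])
        off += words * 8
    return segments
```

For a single segment the header is already 8 bytes, so no padding is needed; for an even count-minus-one it also falls on a boundary, which is exactly the `rem 2` test in the Erlang code.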
037bd0aac324a40403f6ee15ddc91f764b8ffddeee54b3958bdb2b0a74ab6938 | jg513/enif_protobuf | ep_issue_19_tests.erl |
Copyright ( c ) ,
-module(ep_issue_19_tests).
-compile(export_all).
-include_lib("eunit/include/eunit.hrl").
-include_lib("gpb/include/gpb.hrl").
-record(m1, {a}).
issue_19_test() ->
Defs = [
{{msg, m1}, [
#field{name = a, fnum = 1, rnum = #m1.a, type = uint64, occurrence = required, opts = []}
]}
],
Bin = <<8, 181, 207, 209, 168, 154, 47>>,
enif_protobuf:load_cache(Defs),
{m1, 1621972248501} = enif_protobuf:decode(Bin, m1),
{m1, 1621972248501} = gpb:decode_msg(Bin, m1, Defs).
| null | https://raw.githubusercontent.com/jg513/enif_protobuf/15861eacfd87925297dd7a8a3acbaee42a1cfd4a/test/ep_issue_19_tests.erl | erlang |
| |
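The magic number `1621972248501` in the test above is just the varint payload of field 1. Decoding the wire bytes by hand with standard protobuf base-128 varint rules (a sketch, not part of the test suite):

```python
def decode_varint(data, pos=0):
    # Base-128, least-significant group first; the high bit flags continuation.
    shift = result = 0
    while True:
        b = data[pos]
        pos += 1
        result |= (b & 0x7F) << shift
        if not (b & 0x80):
            return result, pos
        shift += 7

raw = bytes([8, 181, 207, 209, 168, 154, 47])
key, pos = decode_varint(raw)       # 8 == (field 1 << 3) | wire type 0
value, _ = decode_varint(raw, pos)  # -> 1621972248501
```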
d780b241b8e1d28e6e681175462da87abd6b66baffdbe6f56b3238638d6ad715 | heraldry/heraldicon | text_field.cljs | (ns heraldicon.frontend.element.text-field
(:require
[heraldicon.frontend.element.core :as element]
[heraldicon.frontend.language :refer [tr]]
[heraldicon.frontend.tooltip :as tooltip]
[heraldicon.interface :as interface]
[re-frame.core :as rf]))
(defn text-field [context & {:keys [on-change style]}]
(when-let [option (interface/get-options context)]
(let [{:keys [inherited default]
:ui/keys [label tooltip placeholder]} option
current-value (interface/get-raw-data context)
value (or current-value
inherited
default)]
[:div.ui-setting
{:style style}
(when label
[:label [tr label]
[tooltip/info tooltip]])
[:div.option
[:input {:type "text"
:value value
:placeholder (tr placeholder)
:on-change #(let [value (-> % .-target .-value)]
(if on-change
(on-change value)
(rf/dispatch-sync [:set context value])))}]]])))
(defmethod element/element :ui.element/text-field [context]
[text-field context])
| null | https://raw.githubusercontent.com/heraldry/heraldicon/a931cf3affe14fcb0e2785744733af4389dd4a75/src/heraldicon/frontend/element/text_field.cljs | clojure |
| |
62899075e0cb6df69186329c27c7273b63fa61015d070a9d2d519ac1e92d33d7 | bos/llvm | Util.hs | {-# LANGUAGE ForeignFunctionInterface, ScopedTypeVariables, DeriveDataTypeable #-}
module LLVM.Core.Util(
-- * Module handling
Module(..), withModule, createModule, destroyModule, writeBitcodeToFile, readBitcodeFromFile,
getModuleValues, getFunctions, getGlobalVariables, valueHasType,
-- * Module provider handling
ModuleProvider(..), withModuleProvider, createModuleProviderForExistingModule,
-- * Pass manager handling
PassManager(..), withPassManager, createPassManager, createFunctionPassManager,
runFunctionPassManager, initializeFunctionPassManager, finalizeFunctionPassManager,
-- * Instruction builder
Builder(..), withBuilder, createBuilder, positionAtEnd, getInsertBlock,
-- * Basic blocks
BasicBlock,
appendBasicBlock, getBasicBlocks,
-- * Functions
Function,
addFunction, getParam, getParams,
-- * Structs
structType,
-- * Globals
addGlobal,
constString, constStringNul, constVector, constArray, constStruct,
-- * Instructions
makeCall, makeInvoke,
makeCallWithCc, makeInvokeWithCc,
withValue, getInstructions, getOperands,
-- * Uses and Users
hasUsers, getUsers, getUses, getUser, isChildOf, getDep,
-- * Misc
CString, withArrayLen,
withEmptyCString,
functionType, buildEmptyPhi, addPhiIns,
showTypeOf, getValueNameU, setValueNameU, getObjList, annotateValueList, isConstant,
-- * Transformation passes
addCFGSimplificationPass, addConstantPropagationPass, addDemoteMemoryToRegisterPass,
addGVNPass, addInstructionCombiningPass, addPromoteMemoryToRegisterPass, addReassociatePass,
addTargetData
) where
import Data.Typeable
import Data.List(intercalate)
import Control.Monad(liftM, filterM, when)
import Foreign.C.String (withCString, withCStringLen, CString, peekCString)
import Foreign.ForeignPtr (ForeignPtr, newForeignPtr, newForeignPtr_, withForeignPtr)
import Foreign.Ptr (Ptr, nullPtr)
import Foreign.Marshal.Array (withArrayLen, withArray, allocaArray, peekArray)
import Foreign.Marshal.Alloc (alloca)
import Foreign.Storable (Storable(..))
import Foreign.Marshal.Utils (fromBool)
import System.IO.Unsafe (unsafePerformIO)
import qualified LLVM.FFI.Core as FFI
import qualified LLVM.FFI.Target as FFI
import qualified LLVM.FFI.BitWriter as FFI
import qualified LLVM.FFI.BitReader as FFI
import qualified LLVM.FFI.Transforms.Scalar as FFI
type Type = FFI.TypeRef
-- unsafePerformIO just to wrap the non-effecting withArrayLen call
functionType :: Bool -> Type -> [Type] -> Type
functionType varargs retType paramTypes = unsafePerformIO $
withArrayLen paramTypes $ \ len ptr ->
return $ FFI.functionType retType ptr (fromIntegral len)
(fromBool varargs)
-- unsafePerformIO just to wrap the non-effecting withArrayLen call
structType :: [Type] -> Bool -> Type
structType types packed = unsafePerformIO $
withArrayLen types $ \ len ptr ->
return $ FFI.structType ptr (fromIntegral len) (if packed then 1 else 0)
--------------------------------------
-- Handle modules
-- Don't use a finalizer for the module, but instead provide an
-- explicit destructor. This is because handing a module to
-- a module provider changes ownership of the module to the provider,
-- and we don't want to free it by mistake.
-- | Type of top level modules.
newtype Module = Module {
fromModule :: FFI.ModuleRef
}
deriving (Show, Typeable)
withModule :: Module -> (FFI.ModuleRef -> IO a) -> IO a
withModule modul f = f (fromModule modul)
createModule :: String -> IO Module
createModule name =
withCString name $ \ namePtr -> do
liftM Module $ FFI.moduleCreateWithName namePtr
-- | Free all storage related to a module. *Note*, this is a dangerous call, since referring
-- to the module after this call is an error. The reason for the explicit call to free
-- the module instead of an automatic lifetime management is that modules have a
-- somewhat complicated ownership. Handing a module to a module provider changes
-- the ownership of the module, and the module provider will free the module when necessary.
destroyModule :: Module -> IO ()
destroyModule = FFI.disposeModule . fromModule
-- |Write a module to a file.
writeBitcodeToFile :: String -> Module -> IO ()
writeBitcodeToFile name mdl =
withCString name $ \ namePtr ->
withModule mdl $ \ mdlPtr -> do
rc <- FFI.writeBitcodeToFile mdlPtr namePtr
when (rc /= 0) $
ioError $ userError $ "writeBitcodeToFile: return code " ++ show rc
return ()
-- |Read a module from a file.
readBitcodeFromFile :: String -> IO Module
readBitcodeFromFile name =
withCString name $ \ namePtr ->
alloca $ \ bufPtr ->
alloca $ \ modPtr ->
alloca $ \ errStr -> do
rrc <- FFI.createMemoryBufferWithContentsOfFile namePtr bufPtr errStr
if rrc /= 0 then do
msg <- peek errStr >>= peekCString
ioError $ userError $ "readBitcodeFromFile: read return code " ++ show rrc ++ ", " ++ msg
else do
buf <- peek bufPtr
prc <- FFI.parseBitcode buf modPtr errStr
if prc /= 0 then do
msg <- peek errStr >>= peekCString
ioError $ userError $ "readBitcodeFromFile: parse return code " ++ show prc ++ ", " ++ msg
else do
ptr <- peek modPtr
return $ Module ptr
{-
                    liftM Module $ newForeignPtr FFI.ptrDisposeModule ptr
-}
getModuleValues :: Module -> IO [(String, Value)]
getModuleValues mdl = do
fs <- getFunctions mdl
gs <- getGlobalVariables mdl
return (fs ++ gs)
getFunctions :: Module -> IO [(String, Value)]
getFunctions mdl = getObjList withModule FFI.getFirstFunction FFI.getNextFunction mdl >>= filterM isIntrinsic >>= annotateValueList
getGlobalVariables :: Module -> IO [(String, Value)]
getGlobalVariables mdl = getObjList withModule FFI.getFirstGlobal FFI.getNextGlobal mdl >>= annotateValueList
-- This is safe because we just ask for the type of a value.
valueHasType :: Value -> Type -> Bool
valueHasType v t = unsafePerformIO $ do
vt <- FFI.typeOf v
    return $ vt == t  -- LLVM uses hash consing for types, so pointer equality works.
showTypeOf :: Value -> IO String
showTypeOf v = FFI.typeOf v >>= showType'
showType' :: Type -> IO String
showType' p = do
pk <- FFI.getTypeKind p
case pk of
FFI.VoidTypeKind -> return "()"
FFI.HalfTypeKind -> return "Half"
FFI.FloatTypeKind -> return "Float"
FFI.DoubleTypeKind -> return "Double"
FFI.X86_FP80TypeKind -> return "X86_FP80"
FFI.FP128TypeKind -> return "FP128"
FFI.PPC_FP128TypeKind -> return "PPC_FP128"
FFI.X86_MMXTypeKind -> return "X86_MMX"
FFI.MetadataTypeKind -> return "Metadata"
FFI.LabelTypeKind -> return "Label"
FFI.IntegerTypeKind -> do w <- FFI.getIntTypeWidth p; return $ "(IntN " ++ show w ++ ")"
FFI.FunctionTypeKind -> do
r <- FFI.getReturnType p
c <- FFI.countParamTypes p
let n = fromIntegral c
as <- allocaArray n $ \ args -> do
FFI.getParamTypes p args
peekArray n args
ts <- mapM showType' (as ++ [r])
return $ "(" ++ intercalate " -> " ts ++ ")"
FFI.StructTypeKind -> return "(Struct ...)"
FFI.ArrayTypeKind -> do n <- FFI.getArrayLength p; t <- FFI.getElementType p >>= showType'; return $ "(Array " ++ show n ++ " " ++ t ++ ")"
FFI.PointerTypeKind -> do t <- FFI.getElementType p >>= showType'; return $ "(Ptr " ++ t ++ ")"
FFI.VectorTypeKind -> do n <- FFI.getVectorSize p; t <- FFI.getElementType p >>= showType'; return $ "(Vector " ++ show n ++ " " ++ t ++ ")"
--------------------------------------
-- Handle module providers
-- | A module provider is used by the code generator to get access to a module.
newtype ModuleProvider = ModuleProvider {
fromModuleProvider :: ForeignPtr FFI.ModuleProvider
}
deriving (Show, Typeable)
withModuleProvider :: ModuleProvider -> (FFI.ModuleProviderRef -> IO a)
-> IO a
withModuleProvider = withForeignPtr . fromModuleProvider
-- | Turn a module into a module provider.
createModuleProviderForExistingModule :: Module -> IO ModuleProvider
createModuleProviderForExistingModule modul =
withModule modul $ \modulPtr -> do
ptr <- FFI.createModuleProviderForExistingModule modulPtr
        -- MPs given to the EE get taken over, so we should not GC them.
liftM ModuleProvider $ newForeignPtr_ {-FFI.ptrDisposeModuleProvider-} ptr
--------------------------------------
-- Handle instruction builders
newtype Builder = Builder {
fromBuilder :: ForeignPtr FFI.Builder
}
deriving (Show, Typeable)
withBuilder :: Builder -> (FFI.BuilderRef -> IO a) -> IO a
withBuilder = withForeignPtr . fromBuilder
createBuilder :: IO Builder
createBuilder = do
ptr <- FFI.createBuilder
liftM Builder $ newForeignPtr FFI.ptrDisposeBuilder ptr
positionAtEnd :: Builder -> FFI.BasicBlockRef -> IO ()
positionAtEnd bld bblk =
withBuilder bld $ \ bldPtr ->
FFI.positionAtEnd bldPtr bblk
getInsertBlock :: Builder -> IO FFI.BasicBlockRef
getInsertBlock bld =
withBuilder bld $ \ bldPtr ->
FFI.getInsertBlock bldPtr
--------------------------------------
type BasicBlock = FFI.BasicBlockRef
appendBasicBlock :: Function -> String -> IO BasicBlock
appendBasicBlock func name =
withCString name $ \ namePtr ->
FFI.appendBasicBlock func namePtr
getBasicBlocks :: Value -> IO [(String, Value)]
getBasicBlocks v = getObjList withValue FFI.getFirstBasicBlock FFI.getNextBasicBlock v >>= annotateValueList
--------------------------------------
type Function = FFI.ValueRef
addFunction :: Module -> FFI.Linkage -> String -> Type -> IO Function
addFunction modul linkage name typ =
withModule modul $ \ modulPtr ->
withCString name $ \ namePtr -> do
f <- FFI.addFunction modulPtr namePtr typ
FFI.setLinkage f (FFI.fromLinkage linkage)
return f
getParam :: Function -> Int -> Value
getParam f = FFI.getParam f . fromIntegral
getParams :: Value -> IO [(String, Value)]
getParams v = getObjList withValue FFI.getFirstParam FFI.getNextParam v >>= annotateValueList
--------------------------------------
addGlobal :: Module -> FFI.Linkage -> String -> Type -> IO Value
addGlobal modul linkage name typ =
withModule modul $ \ modulPtr ->
withCString name $ \ namePtr -> do
v <- FFI.addGlobal modulPtr typ namePtr
FFI.setLinkage v (FFI.fromLinkage linkage)
return v
-- unsafePerformIO is safe because it's only used for the withCStringLen conversion
constStringInternal :: Bool -> String -> (Value, Int)
constStringInternal nulTerm s = unsafePerformIO $
withCStringLen s $ \(sPtr, sLen) ->
return (FFI.constString sPtr (fromIntegral sLen) (fromBool (not nulTerm)), sLen)
constString :: String -> (Value, Int)
constString = constStringInternal False
constStringNul :: String -> (Value, Int)
constStringNul str =
let (cstr, n) = constStringInternal True str
in (cstr, n+1)
--------------------------------------
type Value = FFI.ValueRef
withValue :: Value -> (Value -> IO a) -> IO a
withValue v f = f v
makeCall :: Function -> FFI.BuilderRef -> [Value] -> IO Value
makeCall = makeCallWithCc FFI.C
makeCallWithCc :: FFI.CallingConvention -> Function -> FFI.BuilderRef -> [Value] -> IO Value
makeCallWithCc cc func bldPtr args = do
{-
print "makeCall"
FFI.dumpValue func
mapM_ FFI.dumpValue args
print "----------------------"
-}
withArrayLen args $ \ argLen argPtr ->
withEmptyCString $ \cstr -> do
i <- FFI.buildCall bldPtr func argPtr
(fromIntegral argLen) cstr
FFI.setInstructionCallConv i (FFI.fromCallingConvention cc)
return i
makeInvoke :: BasicBlock -> BasicBlock -> Function -> FFI.BuilderRef ->
[Value] -> IO Value
makeInvoke = makeInvokeWithCc FFI.C
makeInvokeWithCc :: FFI.CallingConvention -> BasicBlock -> BasicBlock -> Function -> FFI.BuilderRef ->
[Value] -> IO Value
makeInvokeWithCc cc norm expt func bldPtr args =
withArrayLen args $ \ argLen argPtr ->
withEmptyCString $ \cstr -> do
i <- FFI.buildInvoke bldPtr func argPtr (fromIntegral argLen) norm expt cstr
FFI.setInstructionCallConv i (FFI.fromCallingConvention cc)
return i
getInstructions :: Value -> IO [(String, Value)]
getInstructions bb = getObjList withValue FFI.getFirstInstruction FFI.getNextInstruction bb >>= annotateValueList
getOperands :: Value -> IO [(String, Value)]
getOperands ii = geto ii >>= annotateValueList
where geto i = do
num <- FFI.getNumOperands i
let oloop instr number total = if number >= total then return [] else do
o <- FFI.getOperand instr number
os <- oloop instr (number + 1) total
return (o : os)
oloop i 0 num
--------------------------------------
buildEmptyPhi :: FFI.BuilderRef -> Type -> IO Value
buildEmptyPhi bldPtr typ = do
withEmptyCString $ FFI.buildPhi bldPtr typ
withEmptyCString :: (CString -> IO a) -> IO a
withEmptyCString = withCString ""
addPhiIns :: Value -> [(Value, BasicBlock)] -> IO ()
addPhiIns inst incoming = do
let (vals, bblks) = unzip incoming
withArrayLen vals $ \ count valPtr ->
withArray bblks $ \ bblkPtr ->
FFI.addIncoming inst valPtr bblkPtr (fromIntegral count)
--------------------------------------
-- | Manage compile passes.
newtype PassManager = PassManager {
fromPassManager :: ForeignPtr FFI.PassManager
}
deriving (Show, Typeable)
withPassManager :: PassManager -> (FFI.PassManagerRef -> IO a)
-> IO a
withPassManager = withForeignPtr . fromPassManager
-- | Create a pass manager.
createPassManager :: IO PassManager
createPassManager = do
ptr <- FFI.createPassManager
liftM PassManager $ newForeignPtr FFI.ptrDisposePassManager ptr
-- | Create a pass manager for a module.
createFunctionPassManager :: ModuleProvider -> IO PassManager
createFunctionPassManager modul =
withModuleProvider modul $ \modulPtr -> do
ptr <- FFI.createFunctionPassManager modulPtr
liftM PassManager $ newForeignPtr FFI.ptrDisposePassManager ptr
-- | Add a control flow graph simplification pass to the manager.
addCFGSimplificationPass :: PassManager -> IO ()
addCFGSimplificationPass pm = withPassManager pm FFI.addCFGSimplificationPass
-- | Add a constant propagation pass to the manager.
addConstantPropagationPass :: PassManager -> IO ()
addConstantPropagationPass pm = withPassManager pm FFI.addConstantPropagationPass
addDemoteMemoryToRegisterPass :: PassManager -> IO ()
addDemoteMemoryToRegisterPass pm = withPassManager pm FFI.addDemoteMemoryToRegisterPass
-- | Add a global value numbering pass to the manager.
addGVNPass :: PassManager -> IO ()
addGVNPass pm = withPassManager pm FFI.addGVNPass
addInstructionCombiningPass :: PassManager -> IO ()
addInstructionCombiningPass pm = withPassManager pm FFI.addInstructionCombiningPass
addPromoteMemoryToRegisterPass :: PassManager -> IO ()
addPromoteMemoryToRegisterPass pm = withPassManager pm FFI.addPromoteMemoryToRegisterPass
addReassociatePass :: PassManager -> IO ()
addReassociatePass pm = withPassManager pm FFI.addReassociatePass
addTargetData :: FFI.TargetDataRef -> PassManager -> IO ()
addTargetData td pm = withPassManager pm $ FFI.addTargetData td
runFunctionPassManager :: PassManager -> Function -> IO Int
runFunctionPassManager pm fcn = liftM fromIntegral $ withPassManager pm $ \ pmref -> FFI.runFunctionPassManager pmref fcn
initializeFunctionPassManager :: PassManager -> IO Int
initializeFunctionPassManager pm = liftM fromIntegral $ withPassManager pm FFI.initializeFunctionPassManager
finalizeFunctionPassManager :: PassManager -> IO Int
finalizeFunctionPassManager pm = liftM fromIntegral $ withPassManager pm FFI.finalizeFunctionPassManager
--------------------------------------
-- The unsafePerformIO is just for the non-effecting withArrayLen
constVector :: Int -> [Value] -> Value
constVector n xs = unsafePerformIO $ do
let xs' = take n (cycle xs)
withArrayLen xs' $ \ len ptr ->
return $ FFI.constVector ptr (fromIntegral len)
-- The unsafePerformIO is just for the non-effecting withArrayLen
constArray :: Type -> Int -> [Value] -> Value
constArray t n xs = unsafePerformIO $ do
let xs' = take n (cycle xs)
withArrayLen xs' $ \ len ptr ->
return $ FFI.constArray t ptr (fromIntegral len)
-- The unsafePerformIO is just for the non-effecting withArrayLen
constStruct :: [Value] -> Bool -> Value
constStruct xs packed = unsafePerformIO $ do
withArrayLen xs $ \ len ptr ->
return $ FFI.constStruct ptr (fromIntegral len) (if packed then 1 else 0)
--------------------------------------
getValueNameU :: Value -> IO String
getValueNameU a = do
-- sometimes void values need explicit names too
cs <- FFI.getValueName a
str <- peekCString cs
if str == "" then return (show a) else return str
setValueNameU :: String -> Value -> IO ()
setValueNameU str a = do
withCString str $ \ strPtr ->
FFI.setValueName a strPtr
getObjList :: (t1 -> (t2 -> IO [Ptr a]) -> t) -> (t2 -> IO (Ptr a))
-> (Ptr a -> IO (Ptr a)) -> t1 -> t
getObjList withF firstF nextF obj = do
withF obj $ \ objPtr -> do
ofst <- firstF objPtr
let oloop p = if p == nullPtr then return [] else do
n <- nextF p
ps <- oloop n
return (p : ps)
oloop ofst
annotateValueList :: [Value] -> IO [(String, Value)]
annotateValueList vs = do
names <- mapM getValueNameU vs
return $ zip names vs
isConstant :: Value -> IO Bool
isConstant v = do
isC <- FFI.isConstant v
if isC == 0 then return False else return True
isIntrinsic :: Value -> IO Bool
isIntrinsic v = do
if FFI.getIntrinsicID v == 0 then return True else return False
--------------------------------------
type Use = FFI.UseRef
hasUsers :: Value -> IO Bool
hasUsers v = do
nU <- FFI.getNumUses v
if nU == 0 then return False else return True
getUses :: Value -> IO [Use]
getUses = getObjList withValue FFI.getFirstUse FFI.getNextUse
getUsers :: [Use] -> IO [(String, Value)]
getUsers us = mapM FFI.getUser us >>= annotateValueList
getUser :: Use -> IO Value
getUser = FFI.getUser
isChildOf :: BasicBlock -> Value -> IO Bool
isChildOf bb v = do
bb2 <- FFI.getInstructionParent v
if bb == bb2 then return True else return False
getDep :: Use -> IO (String, String)
getDep u = do
producer <- FFI.getUsedValue u >>= getValueNameU
consumer <- FFI.getUser u >>= getValueNameU
return (producer, consumer)
| null | https://raw.githubusercontent.com/bos/llvm/819b94d048c9d7787ce41cd7c71b84424e894f64/LLVM/Core/Util.hs | haskell | # LANGUAGE ForeignFunctionInterface, ScopedTypeVariables, DeriveDataTypeable #
* Module handling
* Module provider handling
* Pass manager handling
* Instruction builder
* Basic blocks
* Functions
* Structs
* Globals
* Instructions
* Uses and Users
* Misc
* Transformation passes
------------------------------------
Handle modules
Don't use a finalizer for the module, but instead provide an
explicit destructor. This is because handing a module to
a module provider changes ownership of the module to the provider,
and we don't want to free it by mistake.
| Type of top level modules.
| Free all storage related to a module. *Note*, this is a dangerous call, since referring
to the module after this call is an error. The reason for the explicit call to free
the module instead of an automatic lifetime management is that modules have a
somewhat complicated ownership. Handing a module to a module provider changes
the ownership of the module, and the module provider will free the module when necessary.
|Write a module to a file.
|Read a module from a file.
This is safe because we just ask for the type of a value.
------------------------------------
Handle module providers
| A module provider is used by the code generator to get access to a module.
| Turn a module into a module provider.
FFI.ptrDisposeModuleProvider
------------------------------------
Handle instruction builders
------------------------------------
------------------------------------
------------------------------------
------------------------------------
print "makeCall"
FFI.dumpValue func
mapM_ FFI.dumpValue args
print "----------------------"
------------------------------------
------------------------------------
| Manage compile passes.
| Create a pass manager.
| Create a pass manager for a module.
| Add a control flow graph simplification pass to the manager.
| Add a constant propagation pass to the manager.
| Add a global value numbering pass to the manager.
------------------------------------
------------------------------------
sometimes void values need explicit names too
------------------------------------ | module LLVM.Core.Util(
Module(..), withModule, createModule, destroyModule, writeBitcodeToFile, readBitcodeFromFile,
getModuleValues, getFunctions, getGlobalVariables, valueHasType,
ModuleProvider(..), withModuleProvider, createModuleProviderForExistingModule,
PassManager(..), withPassManager, createPassManager, createFunctionPassManager,
runFunctionPassManager, initializeFunctionPassManager, finalizeFunctionPassManager,
Builder(..), withBuilder, createBuilder, positionAtEnd, getInsertBlock,
BasicBlock,
appendBasicBlock, getBasicBlocks,
Function,
addFunction, getParam, getParams,
structType,
addGlobal,
constString, constStringNul, constVector, constArray, constStruct,
makeCall, makeInvoke,
makeCallWithCc, makeInvokeWithCc,
withValue, getInstructions, getOperands,
hasUsers, getUsers, getUses, getUser, isChildOf, getDep,
CString, withArrayLen,
withEmptyCString,
functionType, buildEmptyPhi, addPhiIns,
showTypeOf, getValueNameU, setValueNameU, getObjList, annotateValueList, isConstant,
addCFGSimplificationPass, addConstantPropagationPass, addDemoteMemoryToRegisterPass,
addGVNPass, addInstructionCombiningPass, addPromoteMemoryToRegisterPass, addReassociatePass,
addTargetData
) where
import Data.Typeable
import Data.List(intercalate)
import Control.Monad(liftM, filterM, when)
import Foreign.C.String (withCString, withCStringLen, CString, peekCString)
import Foreign.ForeignPtr (ForeignPtr, newForeignPtr, newForeignPtr_, withForeignPtr)
import Foreign.Ptr (Ptr, nullPtr)
import Foreign.Marshal.Array (withArrayLen, withArray, allocaArray, peekArray)
import Foreign.Marshal.Alloc (alloca)
import Foreign.Storable (Storable(..))
import Foreign.Marshal.Utils (fromBool)
import System.IO.Unsafe (unsafePerformIO)
import qualified LLVM.FFI.Core as FFI
import qualified LLVM.FFI.Target as FFI
import qualified LLVM.FFI.BitWriter as FFI
import qualified LLVM.FFI.BitReader as FFI
import qualified LLVM.FFI.Transforms.Scalar as FFI
type Type = FFI.TypeRef
-- unsafePerformIO just to wrap the non-effecting withArrayLen call
functionType :: Bool -> Type -> [Type] -> Type
functionType varargs retType paramTypes = unsafePerformIO $
withArrayLen paramTypes $ \ len ptr ->
return $ FFI.functionType retType ptr (fromIntegral len)
(fromBool varargs)
-- unsafePerformIO just to wrap the non-effecting withArrayLen call
structType :: [Type] -> Bool -> Type
structType types packed = unsafePerformIO $
withArrayLen types $ \ len ptr ->
return $ FFI.structType ptr (fromIntegral len) (if packed then 1 else 0)
newtype Module = Module {
fromModule :: FFI.ModuleRef
}
deriving (Show, Typeable)
withModule :: Module -> (FFI.ModuleRef -> IO a) -> IO a
withModule modul f = f (fromModule modul)
createModule :: String -> IO Module
createModule name =
withCString name $ \ namePtr -> do
liftM Module $ FFI.moduleCreateWithName namePtr
destroyModule :: Module -> IO ()
destroyModule = FFI.disposeModule . fromModule
writeBitcodeToFile :: String -> Module -> IO ()
writeBitcodeToFile name mdl =
withCString name $ \ namePtr ->
withModule mdl $ \ mdlPtr -> do
rc <- FFI.writeBitcodeToFile mdlPtr namePtr
when (rc /= 0) $
ioError $ userError $ "writeBitcodeToFile: return code " ++ show rc
return ()
readBitcodeFromFile :: String -> IO Module
readBitcodeFromFile name =
withCString name $ \ namePtr ->
alloca $ \ bufPtr ->
alloca $ \ modPtr ->
alloca $ \ errStr -> do
rrc <- FFI.createMemoryBufferWithContentsOfFile namePtr bufPtr errStr
if rrc /= 0 then do
msg <- peek errStr >>= peekCString
ioError $ userError $ "readBitcodeFromFile: read return code " ++ show rrc ++ ", " ++ msg
else do
buf <- peek bufPtr
prc <- FFI.parseBitcode buf modPtr errStr
if prc /= 0 then do
msg <- peek errStr >>= peekCString
ioError $ userError $ "readBitcodeFromFile: parse return code " ++ show prc ++ ", " ++ msg
else do
ptr <- peek modPtr
return $ Module ptr
{-
liftM Module $ newForeignPtr FFI.ptrDisposeModule ptr
-}
getModuleValues :: Module -> IO [(String, Value)]
getModuleValues mdl = do
fs <- getFunctions mdl
gs <- getGlobalVariables mdl
return (fs ++ gs)
getFunctions :: Module -> IO [(String, Value)]
getFunctions mdl = getObjList withModule FFI.getFirstFunction FFI.getNextFunction mdl >>= filterM isIntrinsic >>= annotateValueList
getGlobalVariables :: Module -> IO [(String, Value)]
getGlobalVariables mdl = getObjList withModule FFI.getFirstGlobal FFI.getNextGlobal mdl >>= annotateValueList
valueHasType :: Value -> Type -> Bool
valueHasType v t = unsafePerformIO $ do
vt <- FFI.typeOf v
return $ vt == t -- uses hash consing for types, so pointer equality works.
showTypeOf :: Value -> IO String
showTypeOf v = FFI.typeOf v >>= showType'
showType' :: Type -> IO String
showType' p = do
pk <- FFI.getTypeKind p
case pk of
FFI.VoidTypeKind -> return "()"
FFI.HalfTypeKind -> return "Half"
FFI.FloatTypeKind -> return "Float"
FFI.DoubleTypeKind -> return "Double"
FFI.X86_FP80TypeKind -> return "X86_FP80"
FFI.FP128TypeKind -> return "FP128"
FFI.PPC_FP128TypeKind -> return "PPC_FP128"
FFI.X86_MMXTypeKind -> return "X86_MMX"
FFI.MetadataTypeKind -> return "Metadata"
FFI.LabelTypeKind -> return "Label"
FFI.IntegerTypeKind -> do w <- FFI.getIntTypeWidth p; return $ "(IntN " ++ show w ++ ")"
FFI.FunctionTypeKind -> do
r <- FFI.getReturnType p
c <- FFI.countParamTypes p
let n = fromIntegral c
as <- allocaArray n $ \ args -> do
FFI.getParamTypes p args
peekArray n args
ts <- mapM showType' (as ++ [r])
return $ "(" ++ intercalate " -> " ts ++ ")"
FFI.StructTypeKind -> return "(Struct ...)"
FFI.ArrayTypeKind -> do n <- FFI.getArrayLength p; t <- FFI.getElementType p >>= showType'; return $ "(Array " ++ show n ++ " " ++ t ++ ")"
FFI.PointerTypeKind -> do t <- FFI.getElementType p >>= showType'; return $ "(Ptr " ++ t ++ ")"
FFI.VectorTypeKind -> do n <- FFI.getVectorSize p; t <- FFI.getElementType p >>= showType'; return $ "(Vector " ++ show n ++ " " ++ t ++ ")"
newtype ModuleProvider = ModuleProvider {
fromModuleProvider :: ForeignPtr FFI.ModuleProvider
}
deriving (Show, Typeable)
withModuleProvider :: ModuleProvider -> (FFI.ModuleProviderRef -> IO a)
-> IO a
withModuleProvider = withForeignPtr . fromModuleProvider
createModuleProviderForExistingModule :: Module -> IO ModuleProvider
createModuleProviderForExistingModule modul =
withModule modul $ \modulPtr -> do
ptr <- FFI.createModuleProviderForExistingModule modulPtr
-- MPs given to the EE get taken over, so we should not GC them.
liftM ModuleProvider $ newForeignPtr_ ptr
newtype Builder = Builder {
fromBuilder :: ForeignPtr FFI.Builder
}
deriving (Show, Typeable)
withBuilder :: Builder -> (FFI.BuilderRef -> IO a) -> IO a
withBuilder = withForeignPtr . fromBuilder
createBuilder :: IO Builder
createBuilder = do
ptr <- FFI.createBuilder
liftM Builder $ newForeignPtr FFI.ptrDisposeBuilder ptr
positionAtEnd :: Builder -> FFI.BasicBlockRef -> IO ()
positionAtEnd bld bblk =
withBuilder bld $ \ bldPtr ->
FFI.positionAtEnd bldPtr bblk
getInsertBlock :: Builder -> IO FFI.BasicBlockRef
getInsertBlock bld =
withBuilder bld $ \ bldPtr ->
FFI.getInsertBlock bldPtr
type BasicBlock = FFI.BasicBlockRef
appendBasicBlock :: Function -> String -> IO BasicBlock
appendBasicBlock func name =
withCString name $ \ namePtr ->
FFI.appendBasicBlock func namePtr
getBasicBlocks :: Value -> IO [(String, Value)]
getBasicBlocks v = getObjList withValue FFI.getFirstBasicBlock FFI.getNextBasicBlock v >>= annotateValueList
type Function = FFI.ValueRef
addFunction :: Module -> FFI.Linkage -> String -> Type -> IO Function
addFunction modul linkage name typ =
withModule modul $ \ modulPtr ->
withCString name $ \ namePtr -> do
f <- FFI.addFunction modulPtr namePtr typ
FFI.setLinkage f (FFI.fromLinkage linkage)
return f
getParam :: Function -> Int -> Value
getParam f = FFI.getParam f . fromIntegral
getParams :: Value -> IO [(String, Value)]
getParams v = getObjList withValue FFI.getFirstParam FFI.getNextParam v >>= annotateValueList
addGlobal :: Module -> FFI.Linkage -> String -> Type -> IO Value
addGlobal modul linkage name typ =
withModule modul $ \ modulPtr ->
withCString name $ \ namePtr -> do
v <- FFI.addGlobal modulPtr typ namePtr
FFI.setLinkage v (FFI.fromLinkage linkage)
return v
-- unsafePerformIO is safe because it's only used for the withCStringLen conversion
constStringInternal :: Bool -> String -> (Value, Int)
constStringInternal nulTerm s = unsafePerformIO $
withCStringLen s $ \(sPtr, sLen) ->
return (FFI.constString sPtr (fromIntegral sLen) (fromBool (not nulTerm)), sLen)
constString :: String -> (Value, Int)
constString = constStringInternal False
constStringNul :: String -> (Value, Int)
constStringNul str =
let (cstr, n) = constStringInternal True str
in (cstr, n+1)
type Value = FFI.ValueRef
withValue :: Value -> (Value -> IO a) -> IO a
withValue v f = f v
makeCall :: Function -> FFI.BuilderRef -> [Value] -> IO Value
makeCall = makeCallWithCc FFI.C
makeCallWithCc :: FFI.CallingConvention -> Function -> FFI.BuilderRef -> [Value] -> IO Value
makeCallWithCc cc func bldPtr args = do
withArrayLen args $ \ argLen argPtr ->
withEmptyCString $ \cstr -> do
i <- FFI.buildCall bldPtr func argPtr
(fromIntegral argLen) cstr
FFI.setInstructionCallConv i (FFI.fromCallingConvention cc)
return i
makeInvoke :: BasicBlock -> BasicBlock -> Function -> FFI.BuilderRef ->
[Value] -> IO Value
makeInvoke = makeInvokeWithCc FFI.C
makeInvokeWithCc :: FFI.CallingConvention -> BasicBlock -> BasicBlock -> Function -> FFI.BuilderRef ->
[Value] -> IO Value
makeInvokeWithCc cc norm expt func bldPtr args =
withArrayLen args $ \ argLen argPtr ->
withEmptyCString $ \cstr -> do
i <- FFI.buildInvoke bldPtr func argPtr (fromIntegral argLen) norm expt cstr
FFI.setInstructionCallConv i (FFI.fromCallingConvention cc)
return i
getInstructions :: Value -> IO [(String, Value)]
getInstructions bb = getObjList withValue FFI.getFirstInstruction FFI.getNextInstruction bb >>= annotateValueList
getOperands :: Value -> IO [(String, Value)]
getOperands ii = geto ii >>= annotateValueList
where geto i = do
num <- FFI.getNumOperands i
let oloop instr number total = if number >= total then return [] else do
o <- FFI.getOperand instr number
os <- oloop instr (number + 1) total
return (o : os)
oloop i 0 num
buildEmptyPhi :: FFI.BuilderRef -> Type -> IO Value
buildEmptyPhi bldPtr typ = do
withEmptyCString $ FFI.buildPhi bldPtr typ
withEmptyCString :: (CString -> IO a) -> IO a
withEmptyCString = withCString ""
addPhiIns :: Value -> [(Value, BasicBlock)] -> IO ()
addPhiIns inst incoming = do
let (vals, bblks) = unzip incoming
withArrayLen vals $ \ count valPtr ->
withArray bblks $ \ bblkPtr ->
FFI.addIncoming inst valPtr bblkPtr (fromIntegral count)
newtype PassManager = PassManager {
fromPassManager :: ForeignPtr FFI.PassManager
}
deriving (Show, Typeable)
withPassManager :: PassManager -> (FFI.PassManagerRef -> IO a)
-> IO a
withPassManager = withForeignPtr . fromPassManager
createPassManager :: IO PassManager
createPassManager = do
ptr <- FFI.createPassManager
liftM PassManager $ newForeignPtr FFI.ptrDisposePassManager ptr
createFunctionPassManager :: ModuleProvider -> IO PassManager
createFunctionPassManager modul =
withModuleProvider modul $ \modulPtr -> do
ptr <- FFI.createFunctionPassManager modulPtr
liftM PassManager $ newForeignPtr FFI.ptrDisposePassManager ptr
addCFGSimplificationPass :: PassManager -> IO ()
addCFGSimplificationPass pm = withPassManager pm FFI.addCFGSimplificationPass
addConstantPropagationPass :: PassManager -> IO ()
addConstantPropagationPass pm = withPassManager pm FFI.addConstantPropagationPass
addDemoteMemoryToRegisterPass :: PassManager -> IO ()
addDemoteMemoryToRegisterPass pm = withPassManager pm FFI.addDemoteMemoryToRegisterPass
addGVNPass :: PassManager -> IO ()
addGVNPass pm = withPassManager pm FFI.addGVNPass
addInstructionCombiningPass :: PassManager -> IO ()
addInstructionCombiningPass pm = withPassManager pm FFI.addInstructionCombiningPass
addPromoteMemoryToRegisterPass :: PassManager -> IO ()
addPromoteMemoryToRegisterPass pm = withPassManager pm FFI.addPromoteMemoryToRegisterPass
addReassociatePass :: PassManager -> IO ()
addReassociatePass pm = withPassManager pm FFI.addReassociatePass
addTargetData :: FFI.TargetDataRef -> PassManager -> IO ()
addTargetData td pm = withPassManager pm $ FFI.addTargetData td
runFunctionPassManager :: PassManager -> Function -> IO Int
runFunctionPassManager pm fcn = liftM fromIntegral $ withPassManager pm $ \ pmref -> FFI.runFunctionPassManager pmref fcn
initializeFunctionPassManager :: PassManager -> IO Int
initializeFunctionPassManager pm = liftM fromIntegral $ withPassManager pm FFI.initializeFunctionPassManager
finalizeFunctionPassManager :: PassManager -> IO Int
finalizeFunctionPassManager pm = liftM fromIntegral $ withPassManager pm FFI.finalizeFunctionPassManager
-- The unsafePerformIO is just for the non-effecting withArrayLen
constVector :: Int -> [Value] -> Value
constVector n xs = unsafePerformIO $ do
let xs' = take n (cycle xs)
withArrayLen xs' $ \ len ptr ->
return $ FFI.constVector ptr (fromIntegral len)
-- The unsafePerformIO is just for the non-effecting withArrayLen
constArray :: Type -> Int -> [Value] -> Value
constArray t n xs = unsafePerformIO $ do
let xs' = take n (cycle xs)
withArrayLen xs' $ \ len ptr ->
return $ FFI.constArray t ptr (fromIntegral len)
-- The unsafePerformIO is just for the non-effecting withArrayLen
constStruct :: [Value] -> Bool -> Value
constStruct xs packed = unsafePerformIO $ do
withArrayLen xs $ \ len ptr ->
return $ FFI.constStruct ptr (fromIntegral len) (if packed then 1 else 0)
getValueNameU :: Value -> IO String
getValueNameU a = do
cs <- FFI.getValueName a
str <- peekCString cs
if str == "" then return (show a) else return str
setValueNameU :: String -> Value -> IO ()
setValueNameU str a = do
withCString str $ \ strPtr ->
FFI.setValueName a strPtr
getObjList :: (t1 -> (t2 -> IO [Ptr a]) -> t) -> (t2 -> IO (Ptr a))
-> (Ptr a -> IO (Ptr a)) -> t1 -> t
getObjList withF firstF nextF obj = do
withF obj $ \ objPtr -> do
ofst <- firstF objPtr
let oloop p = if p == nullPtr then return [] else do
n <- nextF p
ps <- oloop n
return (p : ps)
oloop ofst
annotateValueList :: [Value] -> IO [(String, Value)]
annotateValueList vs = do
names <- mapM getValueNameU vs
return $ zip names vs
isConstant :: Value -> IO Bool
isConstant v = do
isC <- FFI.isConstant v
if isC == 0 then return False else return True
isIntrinsic :: Value -> IO Bool
isIntrinsic v = do
if FFI.getIntrinsicID v == 0 then return True else return False
type Use = FFI.UseRef
hasUsers :: Value -> IO Bool
hasUsers v = do
nU <- FFI.getNumUses v
if nU == 0 then return False else return True
getUses :: Value -> IO [Use]
getUses = getObjList withValue FFI.getFirstUse FFI.getNextUse
getUsers :: [Use] -> IO [(String, Value)]
getUsers us = mapM FFI.getUser us >>= annotateValueList
getUser :: Use -> IO Value
getUser = FFI.getUser
isChildOf :: BasicBlock -> Value -> IO Bool
isChildOf bb v = do
bb2 <- FFI.getInstructionParent v
if bb == bb2 then return True else return False
getDep :: Use -> IO (String, String)
getDep u = do
producer <- FFI.getUsedValue u >>= getValueNameU
consumer <- FFI.getUser u >>= getValueNameU
return (producer, consumer)
|
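The pass-manager helpers in the module above are normally chained together; the following driver is a hypothetical sketch (not part of the original file) showing one way to wire them up, assuming the module's own `createFunctionPassManager`, `add*Pass`, and run/initialize/finalize helpers are in scope:

```haskell
-- Hypothetical driver (illustration only): run a few standard
-- optimization passes over a single function using the helpers above.
optimizeFunction :: ModuleProvider -> Function -> IO Int
optimizeFunction mp fcn = do
    pm <- createFunctionPassManager mp
    addPromoteMemoryToRegisterPass pm
    addInstructionCombiningPass pm
    addCFGSimplificationPass pm
    _ <- initializeFunctionPassManager pm
    r <- runFunctionPassManager pm fcn  -- non-zero when the function was modified
    _ <- finalizeFunctionPassManager pm
    return r
```

The order mirrors a common pipeline: promote allocas to registers first so the combining and CFG-simplification passes have more to work with.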
9bb1a4325e72746d24daa104fd7df5bf1797fab270fbd34eef6da73468fb8e81 | plumatic/grab-bag | config.clj | {:machine {:instance-type :r3.8xlarge
:tags {:owner "grabbag-corp"}}
:service {:server-port 80}
:parameters {:swank-port 6446
:yourkit-port 10001
:forward-ports {6666 :swank-port
10001 :yourkit-port}
:email-level :fatal}
:envs {:crunked {:env :stage
:machine {:instance-type :cc2.8xlarge
:groups ["woven" "grabbag-model-explorer-crunked"]}
:service {:service-name-override "model-explorer-crunked"}
:parameters {:num-days 0.0}}
:test {:service {:server-port 5888}}}}
| null | https://raw.githubusercontent.com/plumatic/grab-bag/a15e943322fbbf6f00790ce5614ba6f90de1a9b5/service/model-explorer/src/model_explorer/config.clj | clojure | {:machine {:instance-type :r3.8xlarge
:tags {:owner "grabbag-corp"}}
:service {:server-port 80}
:parameters {:swank-port 6446
:yourkit-port 10001
:forward-ports {6666 :swank-port
10001 :yourkit-port}
:email-level :fatal}
:envs {:crunked {:env :stage
:machine {:instance-type :cc2.8xlarge
:groups ["woven" "grabbag-model-explorer-crunked"]}
:service {:service-name-override "model-explorer-crunked"}
:parameters {:num-days 0.0}}
:test {:service {:server-port 5888}}}}
| |
b988dce345ab76461b9f8fa6013247903a090e1c1e925b1f51c605c1a87ccca0 | stylewarning/quickutil | management.lisp | ;;;; management.lisp
;;;; Copyright (c) 2013
(defpackage #:quickutil-client-management
(:use #:cl #:asdf)
(:export #:load-quickutil-utilities
#:unload-quickutil-utilities
#:with-quickutil-utilities
#:*verbose*))
(in-package #:quickutil-client-management)
(defvar *verbose* nil
"Dictates whether loading should be verbose.")
(defun symbol-accessible-from-package (symbol package)
"Is the symbol SYMBOL accessible from the package PACKAGE?"
(multiple-value-bind (found-symbol status)
(find-symbol (symbol-name symbol) package)
(and
(if (null symbol)
status
(eq symbol found-symbol))
t)))
(defun unbind-symbol (symbol)
"Ensure the symbol denoting a variable, function, or macro is unbound.
A continuable error will be signaled if it is a symbol in the CL package."
(cond
((symbol-accessible-from-package symbol :common-lisp)
(cerror "Continue, doing nothing with the symbol."
"The symbol ~S is in the CL package and cannot be ~
unbound."
symbol))
(t
(when (boundp symbol)
(makunbound symbol))
(when (fboundp symbol)
(unless (special-operator-p symbol)
(fmakunbound symbol))))))
(defun clean-and-delete-package (package-designator)
"Clean up the package designated by PACKAGE-DESIGNATOR (unbind all
of the bound symbols), and delete the package, if it exists."
(when (or (packagep package-designator)
(find-package package-designator))
(let ((package (if (packagep package-designator)
package-designator
(find-package package-designator))))
;; Clean up all the symbols.
(do-symbols (sym package-designator)
(when (eq (symbol-package sym) package)
(unbind-symbol sym)))
;; Delete the package.
(delete-package package-designator))))
(defun load-quickutil-utilities (&key (verbose *verbose*))
(when verbose
(format t "~&;;; Loading Quickutil utilities...~%"))
(let ((*standard-output* (make-broadcast-stream)))
(operate 'load-op :quickutil-utilities :force t :verbose nil)))
(defun unload-quickutil-utilities (&key (verbose *verbose*))
(when verbose
(format t "~&;;; Clearing QUICKUTIL-UTILITIES system...~%"))
(asdf:clear-system :quickutil-utilities)
(when verbose
(format t "~&;;; Unloading QUICKUTIL-UTILITIES.UTILITIES...~%"))
(clean-and-delete-package '#:quickutil-utilities.utilities)
(when verbose
(format t "~&;;; Unloading QUICKUTIL-UTILITIES...~%"))
(clean-and-delete-package '#:quickutil-utilities)
(when verbose
(format t "~&;;; Collecting trash...~%"))
(trivial-garbage:gc :full t :verbose verbose))
(defmacro with-quickutil-utilities (&body body)
"Load Quickutil utilities, execute BODY, and unload them, returning
the last result of BODY."
`(unwind-protect (progn
(quickutil-client-management:load-quickutil-utilities)
,@body)
(quickutil-client-management:unload-quickutil-utilities)))
| null | https://raw.githubusercontent.com/stylewarning/quickutil/5adb3463d99095145325c4013117bd08a8f6cac2/quickutil-client/management.lisp | lisp | management.lisp
Clean up all the symbols.
Delete the package. | Copyright ( c ) 2013
(defpackage #:quickutil-client-management
(:use #:cl #:asdf)
(:export #:load-quickutil-utilities
#:unload-quickutil-utilities
#:with-quickutil-utilities
#:*verbose*))
(in-package #:quickutil-client-management)
(defvar *verbose* nil
"Dictates whether loading should be verbose.")
(defun symbol-accessible-from-package (symbol package)
"Is the symbol SYMBOL accessible from the package PACKAGE?"
(multiple-value-bind (found-symbol status)
(find-symbol (symbol-name symbol) package)
(and
(if (null symbol)
status
(eq symbol found-symbol))
t)))
(defun unbind-symbol (symbol)
"Ensure the symbol denoting a variable, function, or macro is unbound.
A continuable error will be signaled if it is a symbol in the CL package."
(cond
((symbol-accessible-from-package symbol :common-lisp)
(cerror "Continue, doing nothing with the symbol."
"The symbol ~S is in the CL package and cannot be ~
unbound."
symbol))
(t
(when (boundp symbol)
(makunbound symbol))
(when (fboundp symbol)
(unless (special-operator-p symbol)
(fmakunbound symbol))))))
(defun clean-and-delete-package (package-designator)
"Clean up the package designated by PACKAGE-DESIGNATOR (unbind all
of the bound symbols), and delete the package, if it exists."
(when (or (packagep package-designator)
(find-package package-designator))
(let ((package (if (packagep package-designator)
package-designator
(find-package package-designator))))
(do-symbols (sym package-designator)
(when (eq (symbol-package sym) package)
(unbind-symbol sym)))
(delete-package package-designator))))
(defun load-quickutil-utilities (&key (verbose *verbose*))
(when verbose
(format t "~&;;; Loading Quickutil utilities...~%"))
(let ((*standard-output* (make-broadcast-stream)))
(operate 'load-op :quickutil-utilities :force t :verbose nil)))
(defun unload-quickutil-utilities (&key (verbose *verbose*))
(when verbose
(format t "~&;;; Clearing QUICKUTIL-UTILITIES system...~%"))
(asdf:clear-system :quickutil-utilities)
(when verbose
(format t "~&;;; Unloading QUICKUTIL-UTILITIES.UTILITIES...~%"))
(clean-and-delete-package '#:quickutil-utilities.utilities)
(when verbose
(format t "~&;;; Unloading QUICKUTIL-UTILITIES...~%"))
(clean-and-delete-package '#:quickutil-utilities)
(when verbose
(format t "~&;;; Collecting trash...~%"))
(trivial-garbage:gc :full t :verbose verbose))
(defmacro with-quickutil-utilities (&body body)
"Load Quickutil utilities, execute BODY, and unload them, returning
the last result of BODY."
`(unwind-protect (progn
(quickutil-client-management:load-quickutil-utilities)
,@body)
(quickutil-client-management:unload-quickutil-utilities)))
|
6efb35f8fdc8067fa34e9872904ba9dc74bb1ae03f4a4338460b97546e481da8 | geophf/1HaskellADay | Exercise.hs | module Y2020.M08.D26.Exercise where
import Y2020.M08.D25.Exercise
{--
Yesterday, we read in the top-downloaded project gutenberg books with their
associated URLs to the book-pages, but, if you followed any of those links
you see those links aren't the books, themselves, but are the anchor-pages
to links to different readable kinds of those books.
Huh.
Today, we're going to take those links and download the plain-text versions
of those books. Why the plain-text versions? Because eh, that's why.
(Also, I like the plain text versions for vectorizing, but that's just
moiself (that is French) (no, it's not)).
Okay, so where is the plain text version of the book?
We have:
"Pride and Prejudice by Jane Austen" at
url:
The plain-text version is at: -0.txt
And ... probably (?) ... the other books follow suit? unless they're
multivolume 'How Rome was Built in One Day (psych!)' and who reads that?
(I've just insulted all the history-buffs in the world, but oh, well.)
SO!
With that tidbit, today's #haskell exercise is to download the top-100
gutenberg books into your very own Library.
--}
import Data.Map (Map)
type Title = String
type URL = FilePath
type BookInfo = (Title, URL)
type Text = String
type Library = Map BookInfo Text
importBook :: BookInfo -> IO Text
importBook bookInfo = undefined
importLibrary :: BookIndex -> IO Library
importLibrary bookinfos = undefined
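One possible shape for the two stubs above — a sketch only, not part of the exercise file: it assumes the `http-conduit` package's `Network.HTTP.Simple` API, a qualified `Data.Map` import at the top of the module, and that a `BookIndex` (from `Y2020.M08.D25.Exercise`, whose definition is not shown here) can be reduced to a plain list of `BookInfo` values:

```haskell
import qualified Data.Map as Map
import qualified Data.ByteString.Lazy.Char8 as L8
import Network.HTTP.Simple (httpLBS, getResponseBody, parseRequest)

-- Fetch one book's text over HTTP (uses the BookInfo/Text types above).
importBookSketch :: BookInfo -> IO Text
importBookSketch (_title, url) = do
    req <- parseRequest url
    L8.unpack . getResponseBody <$> httpLBS req

-- Build the Library by downloading every book in the list.
importLibrarySketch :: [BookInfo] -> IO Library
importLibrarySketch infos =
    Map.fromList <$> mapM (\bi -> (,) bi <$> importBookSketch bi) infos
```

The real plain-text URL scheme is elided in the text above, so the `url` field is taken as given rather than derived.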
| null | https://raw.githubusercontent.com/geophf/1HaskellADay/514792071226cd1e2ba7640af942667b85601006/exercises/HAD/Y2020/M08/D26/Exercise.hs | haskell | } | module Y2020.M08.D26.Exercise where
import Y2020.M08.D25.Exercise
-
Yesterday , we read in the top - downloaded project gutenberg books with their
associated URLs to the book - pages , but , if you followed any of those links
you see those links are n't the books , themselves , but are the anchor - pages
to links to different readable kinds of those books .
Huh .
Today , we 're going to take those links and download the plain - text versions
of those books . Why the plain - text versions ? Because eh , that 's why .
( Also , I like the plain text versions for , but that 's just
moiself ( that is French ) ( no , it 's not ) ) .
Okay , so where is the plain text version of the book ?
We have :
" Pride and Prejudice by " at
url :
The plain - text version is at : -0.txt
And ... probably ( ? ) ... the other books follow suit ? unless they 're
multivolume ' How Rome was Built in One Day ( psych ! ) ' and who reads that ?
( I 've just insulted all the history - buffs in the world , but oh , well . )
SO !
With that tidbit , today 's # haskell exercise is to download the top-100
gutenberg books into your very own Library .
-
Yesterday, we read in the top-downloaded project gutenberg books with their
associated URLs to the book-pages, but, if you followed any of those links
you see those links aren't the books, themselves, but are the anchor-pages
to links to different readable kinds of those books.
Huh.
Today, we're going to take those links and download the plain-text versions
of those books. Why the plain-text versions? Because eh, that's why.
(Also, I like the plain text versions for vectorizing, but that's just
moiself (that is French) (no, it's not)).
Okay, so where is the plain text version of the book?
We have:
"Pride and Prejudice by Jane Austen" at
url:
The plain-text version is at: -0.txt
And ... probably (?) ... the other books follow suit? unless they're
multivolume 'How Rome was Built in One Day (psych!)' and who reads that?
(I've just insulted all the history-buffs in the world, but oh, well.)
SO!
With that tidbit, today's #haskell exercise is to download the top-100
gutenberg books into your very own Library.
import Data.Map (Map)
type Title = String
type URL = FilePath
type BookInfo = (Title, URL)
type Text = String
type Library = Map BookInfo Text
importBook :: BookInfo -> IO Text
importBook bookInfo = undefined
importLibrary :: BookIndex -> IO Library
importLibrary bookinfos = undefined
|
27555bd2b2a27f79d98b57dd453a68fd88a9e60bb580d4b3209525d1d5ef6418 | souenzzo/pedestal-native | log.clj | (ns io.pedestal.log)
(defn default-tracer [& _])
(defn active-span [& _])
(defn TraceSpanBaggage [& _])
(defn -log-span [& _])
(defn log [& _])
(defn -counter [& _])
(defn jmx-reporter [& _])
(defn span-log-error-kind [& _])
(defn log-span [& _])
(defn -register [& _])
(defn LoggingMDC [& _])
(defn span [& _])
(defn ^:dynamic *mdc-context* [& _])
(defn -put-mdc [& _])
(defn -remove-mdc [& _])
(defmacro with-context-kv [& _])
(defn format-name [& _])
(defn -set-operation-name [& _])
(defn -active-span [& _])
(defn mdc-context-key [& _])
(defn finish-span [& _])
(defn log-level-dispatch [& _])
(defn metric-registry [& _])
(defn -meter [& _])
(defn -span [& _])
(defn histogram [& _])
(defn MetricRecorder [& _])
(defn -info [& _])
(defn -error [& _])
(defn -get-baggage [& _])
(defn -activate-span [& _])
(defn gauge [& _])
(defmacro spy [& _])
(defn span-baggage [& _])
(defmacro warn [& _])
(defn log-reporter [& _])
(defmacro trace [& _])
(defn -trace [& _])
(defn -clear-mdc [& _])
(defn span-log-stack [& _])
(defn TraceSpan [& _])
(defn span-log-error-obj [& _])
(defn -log-span-map [& _])
(defn span-log-event [& _])
(defmacro debug [& _])
(defn span-log-msg [& _])
(defn maybe-init-java-util-log [& _])
(defn -finish-span [& _])
(defn tag-span [& _])
(defn override-logger [& _])
(defn -gauge [& _])
(defn default-recorder [& _])
(defn -get-mdc [& _])
(defn counter [& _])
(defn -error-span [& _])
(defn -warn [& _])
(defn TraceSpanLog [& _])
(defn TraceOrigin [& _])
(defn -histogram [& _])
(defmacro with-context [& _])
(defn -debug [& _])
(defn meter [& _])
(defn LoggerSource [& _])
(defn -tag-span [& _])
(defn add-span-baggage! [& _])
(defmacro info [& _])
(defn -set-baggage [& _])
(defn -level-enabled? [& _])
(defn -get-baggage-map [& _])
(defn TraceSpanLogMap [& _])
(defmacro error [& _])
(defn -set-mdc [& _])
| null | https://raw.githubusercontent.com/souenzzo/pedestal-native/46e516751c12e70be8b3b2ff2e4ffc3e6afffc7d/projects/log-noop/src/io/pedestal/log.clj | clojure | (ns io.pedestal.log)
| |
935879ed78c3ddc37c33411fb9067fb88313b1afca42dca71eee725e71ceadc4 | yetanalytics/dl4clj | iterators.clj | (ns dl4clj.datasets.iterators
(:import [org.deeplearning4j.datasets.datavec RecordReaderDataSetIterator
RecordReaderMultiDataSetIterator$Builder RecordReaderMultiDataSetIterator
SequenceRecordReaderDataSetIterator]
[org.deeplearning4j.datasets.iterator
DoublesDataSetIterator FloatsDataSetIterator INDArrayDataSetIterator
AsyncDataSetIterator AsyncMultiDataSetIterator CombinedPreProcessor
CombinedPreProcessor$Builder CurvesDataSetIterator IteratorDataSetIterator
IteratorMultiDataSetIterator MovingWindowBaseDataSetIterator
MultipleEpochsIterator ReconstructionDataSetIterator SamplingDataSetIterator
ExistingDataSetIterator]
[org.deeplearning4j.datasets.iterator.impl MultiDataSetIteratorAdapter
ListDataSetIterator SingletonMultiDataSetIterator]
[org.deeplearning4j.datasets.iterator.impl
CifarDataSetIterator IrisDataSetIterator LFWDataSetIterator
MnistDataSetIterator RawMnistDataSetIterator]
[org.deeplearning4j.spark.iterator
PathSparkDataSetIterator
PathSparkMultiDataSetIterator
PortableDataStreamDataSetIterator
PortableDataStreamMultiDataSetIterator]
[java.util Random])
(:require [dl4clj.constants :refer [value-of]]
[dl4clj.berkeley :refer [new-pair]]
[dl4clj.helpers :refer :all]
[dl4clj.utils :refer [contains-many? generic-dispatching-fn builder-fn
obj-or-code?]]
[clojure.core.match :refer [match]]
[nd4clj.linalg.factory.nd4j :refer [vec-or-matrix->indarray]]))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; multimethod
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defmulti iterator
"Multimethod that builds a dataset iterator based on the supplied type and opts"
generic-dispatching-fn)
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; record reader dataset iterator mulimethods
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defmethod iterator :rr-dataset-iter [opts]
(let [config (:rr-dataset-iter opts)
{rr :record-reader
batch-size :batch-size
label-idx :label-idx
n-labels :n-possible-labels
l-idx-from :label-idx-from
l-idx-to :label-idx-to
regression? :regression?
max-n-batches :max-num-batches
converter :writeable-converter} config]
(match [config]
[{:writeable-converter _ :batch-size _ :label-idx-from _
:label-idx-to _ :n-possible-labels _ :max-num-batches _
:regression? _ :record-reader _}]
`(RecordReaderDataSetIterator. ~rr ~converter ~batch-size ~l-idx-from
~l-idx-to ~n-labels ~max-n-batches
~regression?)
[{:writeable-converter _ :batch-size _ :label-idx _
:n-possible-labels _ :max-num-batches _ :regression? _
:record-reader _}]
`(RecordReaderDataSetIterator. ~rr ~converter ~batch-size ~label-idx
~n-labels ~max-n-batches ~regression?)
[{:writeable-converter _ :batch-size _ :label-idx _
:n-possible-labels _ :regression? _ :record-reader _}]
`(RecordReaderDataSetIterator. ~rr ~converter ~batch-size ~label-idx
~n-labels ~regression?)
[{:record-reader _ :batch-size _ :label-idx-from _
:label-idx-to _ :regression? _}]
`(RecordReaderDataSetIterator.
~rr ~batch-size ~l-idx-from ~l-idx-to ~regression?)
[{:record-reader _ :batch-size _ :label-idx _
:n-possible-labels _ :max-num-batches _}]
`(RecordReaderDataSetIterator.
~rr ~batch-size ~label-idx ~n-labels ~max-n-batches)
[{:writeable-converter _ :batch-size _ :label-idx _
:n-possible-labels _ :record-reader _}]
`(RecordReaderDataSetIterator. ~rr ~converter ~batch-size ~label-idx ~n-labels)
[{:record-reader _ :batch-size _ :label-idx _ :n-possible-labels _}]
`(RecordReaderDataSetIterator.
~rr ~batch-size ~label-idx ~n-labels)
[{:writeable-converter _ :batch-size _ :record-reader _}]
`(RecordReaderDataSetIterator. ~rr ~converter ~batch-size)
[{:batch-size _ :record-reader _}]
`(RecordReaderDataSetIterator. ~rr ~batch-size))))
(defmethod iterator :seq-rr-dataset-iter [opts]
(let [config (:seq-rr-dataset-iter opts)
{rr :record-reader
m-batch-size :mini-batch-size
n-labels :n-possible-labels
label-idx :label-idx
regression? :regression?
labels-reader :labels-reader
features-reader :features-reader
alignment :alignment-mode} config]
(match [config]
[{:labels-reader _ :features-reader _ :mini-batch-size _
:n-possible-labels _ :regression? _ :alignment-mode _}]
`(SequenceRecordReaderDataSetIterator.
~features-reader ~labels-reader ~m-batch-size ~n-labels ~regression?
~(value-of-helper :seq-alignment-mode alignment))
[{:labels-reader _ :features-reader _ :mini-batch-size _
:n-possible-labels _ :regression? _}]
`(SequenceRecordReaderDataSetIterator.
~features-reader ~labels-reader ~m-batch-size ~n-labels ~regression?)
[{:record-reader _ :mini-batch-size _ :n-possible-labels _
:label-idx _ :regression? _}]
`(SequenceRecordReaderDataSetIterator.
~rr ~m-batch-size ~n-labels ~label-idx ~regression?)
[{:labels-reader _ :features-reader _ :mini-batch-size _ :n-possible-labels _}]
`(SequenceRecordReaderDataSetIterator.
~features-reader ~labels-reader ~m-batch-size ~n-labels)
[{:record-reader _ :mini-batch-size _ :n-possible-labels _ :label-idx _}]
`(SequenceRecordReaderDataSetIterator.
~rr ~m-batch-size ~n-labels ~label-idx))))
(defmethod iterator :multi-dataset-iter [opts]
(assert (integer? (:batch-size (:multi-dataset-iter opts)))
"you must supply batch-size and it must be an integer")
(let [config (:multi-dataset-iter opts)
{add-input :add-input
add-input-hot :add-input-one-hot
add-output :add-output
add-output-hot :add-output-one-hot
add-reader :add-reader
add-seq-reader :add-seq-reader
alignment :alignment-mode
batch-size :batch-size} config
{reader-name :reader-name
first-column :first-column
last-column :last-column} add-input
{hot-reader-name :reader-name
hot-column :column
hot-num-classes :n-classes} add-input-hot
{output-reader-name :reader-name
output-first-column :first-column
output-last-column :last-column} add-output
{hot-output-reader-name :reader-name
hot-output-column :column
hot-output-n-classes :n-classes} add-output-hot
{record-reader-name :reader-name
rr :record-reader} add-reader
{seq-reader-name :reader-name
seq-rr :record-reader} add-seq-reader
method-map {:input '.addInput
:input-one-hot '.addInputOneHot
:output '.addOutput
:output-one-hot '.addOutputOneHot
:reader '.addReader
:seq-reader '.addSequenceReader
:alignment '.sequenceAlignmentMode}
updated-opts {:alignment (if alignment (value-of-helper :multi-alignment-mode
alignment))
:reader (if add-reader [record-reader-name rr])
:seq-reader (if add-seq-reader [seq-reader-name seq-rr])
:output-one-hot (if add-output-hot [hot-output-reader-name
hot-output-column
hot-output-n-classes])
:output (if add-output [output-reader-name
output-first-column
output-last-column])
:input-one-hot (if add-input-hot [hot-reader-name
hot-column
hot-num-classes])
:input (if add-input [reader-name first-column last-column])}
opts* (into {} (filter val updated-opts))
b `(RecordReaderMultiDataSetIterator$Builder. ~batch-size)]
`(.build
~(builder-fn b method-map opts*))))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; dataset iterator mulimethods
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defmethod iterator :doubles-dataset-iter [opts]
(let [config (:doubles-dataset-iter opts)
{features :features
labels :labels
batch-size :batch-size} config]
`(DoublesDataSetIterator. [(new-pair :p1 (double-array ~features)
:p2 (double-array ~labels))]
~batch-size)))
(defmethod iterator :floats-dataset-iter [opts]
(let [config (:floats-dataset-iter opts)
{features :features
labels :labels
batch-size :batch-size} config]
`(FloatsDataSetIterator. [(new-pair :p1 (float-array ~features)
:p2 (float-array ~labels))]
~batch-size)))
(defmethod iterator :INDArray-dataset-iter [opts]
(let [config (:INDArray-dataset-iter opts)
{features :features
labels :labels
batch-size :batch-size} config]
`(INDArrayDataSetIterator. [(new-pair :p1 (vec-or-matrix->indarray ~features)
:p2 (vec-or-matrix->indarray ~labels))]
~batch-size)))
(defmethod iterator :iterator-multi-dataset-iter [opts]
(let [config (:iterator-multi-dataset-iter opts)
{iter :multi-dataset-iter
batch-size :batch-size} config]
`(IteratorMultiDataSetIterator. ~iter ~batch-size)))
(defmethod iterator :iterator-dataset-iter [opts]
(let [config (:iterator-dataset-iter opts)
{iter :iter
batch-size :batch-size} config]
`(IteratorDataSetIterator. ~iter ~batch-size)))
(defmethod iterator :async-multi-dataset-iter [opts]
(let [config (:async-multi-dataset-iter opts)
{iter :multi-dataset-iter
que-l :que-length} config]
`(AsyncMultiDataSetIterator. ~iter ~que-l)))
(defmethod iterator :moving-window-base-dataset-iter [opts]
(let [config (:moving-window-base-dataset-iter opts)
{batch :batch-size
n-examples :n-examples
data :dataset
window-rows :window-rows
window-columns :window-columns} config]
`(MovingWindowBaseDataSetIterator. ~batch ~n-examples ~data ~window-rows ~window-columns)))
(defmethod iterator :multiple-epochs-iter [opts]
(let [config (:multiple-epochs-iter opts)
{iter :iter
q-size :que-size
t-iterations :total-iterations
n-epochs :n-epochs
ds :dataset} config]
(match [config]
[{:n-epochs _ :iter _ :que-size _}]
`(MultipleEpochsIterator. ~n-epochs ~iter ~q-size)
[{:iter _ :que-size _ :total-iterations _}]
`(MultipleEpochsIterator. ~iter ~q-size ~t-iterations)
[{:n-epochs _ :iter _}]
`(MultipleEpochsIterator. ~n-epochs ~iter)
[{:n-epochs _ :dataset _}]
`(MultipleEpochsIterator. ~n-epochs ~ds)
[{:iter _ :total-iterations _}]
`(MultipleEpochsIterator. ~iter ~t-iterations))))
(defmethod iterator :reconstruction-dataset-iter [opts]
(let [config (:reconstruction-dataset-iter opts)
iter (:iter config)]
`(ReconstructionDataSetIterator. ~iter)))
(defmethod iterator :sampling-dataset-iter [opts]
(let [config (:sampling-dataset-iter opts)
{ds :sampling-source
batch-size :batch-size
n-samples :total-n-samples} config]
`(SamplingDataSetIterator. ~ds ~batch-size ~n-samples)))
(defmethod iterator :existing-dataset-iter [opts]
(let [config (:existing-dataset-iter opts)
{iterable :dataset
n-examples :total-examples
n-features :n-features
n-labels :n-labels
labels :labels
ds-iter :iter} config]
(match [config]
[{:dataset _ :total-examples _ :n-features _ :n-labels _}]
`(ExistingDataSetIterator. ~iterable ~n-examples ~n-features ~n-labels)
[{:dataset _ :labels _}]
`(ExistingDataSetIterator. ~iterable ~labels)
[{:iter _ :labels _}]
`(ExistingDataSetIterator. ~ds-iter ~labels)
[{:iter _}]
`(ExistingDataSetIterator. ~ds-iter)
[{:dataset _}]
`(ExistingDataSetIterator. ~iterable))))
(defmethod iterator :async-dataset-iter [opts]
(let [config (:async-dataset-iter opts)
{ds-iter :iter
que-size :que-size
que :que} config]
(match [config]
[{:iter _ :que _ :que-size _}]
`(AsyncDataSetIterator. ~ds-iter ~que-size ~que)
[{:iter _ :que-size _}]
`(AsyncDataSetIterator. ~ds-iter ~que-size)
[{:iter _}]
`(AsyncDataSetIterator. ~ds-iter))))
(defmethod iterator :ds-iter-to-multi-ds-iter [opts]
(let [conf (:ds-iter-to-multi-ds-iter opts)
iter (:iter conf)]
`(MultiDataSetIteratorAdapter. ~iter)))
(defmethod iterator :list-ds-iter [opts]
(let [conf (:list-ds-iter opts)
{ds :dataset
batch-size :batch-size} conf]
(if (contains? conf :batch-size)
`(ListDataSetIterator. ~ds ~batch-size)
`(ListDataSetIterator. ~ds))))
(defmethod iterator :multi-ds-to-multi-ds-iter [opts]
(let [conf (:multi-ds-to-multi-ds-iter opts)
mds (:multi-dataset conf)]
`(SingletonMultiDataSetIterator. ~mds)))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; default dataset iterator mulimethods
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defmethod iterator :curves-dataset-iter [opts]
(let [config (:curves-dataset-iter opts)
{batch :batch-size
n-examples :n-examples} config]
`(CurvesDataSetIterator. ~batch ~n-examples)))
(defmethod iterator :cifar-dataset-iter [opts]
(let [config (:cifar-dataset-iter opts)
{batch-size :batch-size
n-examples :n-examples
img-dims :img-dims
train? :train?
use-special-pre-process-cifar? :use-special-pre-process-cifar?
n-possible-labels :n-possible-labels
img-transform :img-transform} config
img `(int-array ~img-dims)]
(match [config]
[{:batch-size _ :n-examples _ :img-dims _ :n-possible-labels _
:img-transform _ :use-special-pre-process-cifar? _ :train? _}]
`(CifarDataSetIterator. ~batch-size ~n-examples ~img ~n-possible-labels
~img-transform ~use-special-pre-process-cifar? ~train?)
[{:batch-size _ :n-examples _ :img-dims _
:use-special-pre-process-cifar? _ :train? _}]
`(CifarDataSetIterator. ~batch-size ~n-examples ~img
~use-special-pre-process-cifar? ~train?)
[{:batch-size _ :n-examples _ :img-dims _ :train? _}]
`(CifarDataSetIterator. ~batch-size ~n-examples ~img ~train?)
[{:batch-size _ :n-examples _ :img-dims _}]
`(CifarDataSetIterator. ~batch-size ~n-examples ~img)
[{:batch-size _ :n-examples _ :train? _}]
`(CifarDataSetIterator. ~batch-size ~n-examples ~train?)
[{:batch-size _ :img-dims _}]
`(CifarDataSetIterator. ~batch-size ~img)
[{:batch-size _ :n-examples _}]
`(CifarDataSetIterator. ~batch-size ~n-examples))))
(defmethod iterator :iris-dataset-iter [opts]
(let [config (:iris-dataset-iter opts)
{batch-size :batch-size
n-examples :n-examples} config]
`(IrisDataSetIterator. ~batch-size ~n-examples)))
(defmethod iterator :lfw-dataset-iter [opts]
(let [config (:lfw-dataset-iter opts)
{img-dims :img-dims
batch-size :batch-size
n-examples :n-examples
use-subset? :use-subset?
train? :train?
split-train-test :split-train-test
n-labels :n-labels
seed :seed
label-generator :label-generator
image-transform :image-transform} config
img `(int-array ~img-dims)
rng (if (contains? config :seed)
`(new Random ~seed)
`(new Random 123))]
(match [config]
[{:batch-size _ :n-examples _ :img-dims _ :n-labels _
:use-subset? _ :label-generator _ :train? _ :split-train-test _
:seed _ :image-transform _}]
`(LFWDataSetIterator. ~batch-size ~n-examples ~img ~n-labels ~use-subset?
~label-generator ~train? ~split-train-test ~image-transform
~rng)
[{:batch-size _ :n-examples _ :img-dims _ :n-labels _
:use-subset? _ :label-generator _ :train? _ :split-train-test _
:seed _}]
`(LFWDataSetIterator. ~batch-size ~n-examples ~img ~n-labels ~use-subset?
~label-generator ~train? ~split-train-test ~rng)
[{:batch-size _ :n-examples _ :img-dims _ :n-labels _
:use-subset? _ :train? _ :split-train-test _ :seed _}]
`(LFWDataSetIterator. ~batch-size ~n-examples ~img ~n-labels ~use-subset?
~train? ~split-train-test ~rng)
[{:batch-size _ :n-examples _ :n-labels _ :train? _ :split-train-test _}]
`(LFWDataSetIterator. ~batch-size ~n-examples ~n-labels ~train? ~split-train-test)
[{:batch-size _ :n-examples _ :img-dims _ :train? _ :split-train-test _}]
`(LFWDataSetIterator. ~batch-size ~n-examples ~img ~train? ~split-train-test)
[{:batch-size _ :n-examples _ :img-dims _ }]
`(LFWDataSetIterator. ~batch-size ~n-examples ~img)
[{:batch-size _ :use-subset? _ :img-dims _ }]
`(LFWDataSetIterator. ~batch-size ~img ~use-subset?)
[{:batch-size _ :n-examples _}]
`(LFWDataSetIterator. ~batch-size ~n-examples)
[{:img-dims _ }]
`(LFWDataSetIterator. ~img))))
(defmethod iterator :mnist-dataset-iter [opts]
(let [config (:mnist-dataset-iter opts)
{batch-size :batch-size
train? :train?
seed :seed
n-examples :n-examples
binarize? :binarize?
shuffle? :shuffle?
batch :batch} config]
(match [config]
[{:batch _ :n-examples _ :binarize? _
:train? _ :shuffle? _ :seed _}]
`(MnistDataSetIterator. ~batch ~n-examples ~binarize? ~train?
~shuffle? ~(long seed))
[{:batch-size _ :train? _ :seed _}]
`(MnistDataSetIterator. ~batch-size ~train? ~(int seed))
[{:batch _ :n-examples _ :binarize? _}]
`(MnistDataSetIterator. ~batch ~n-examples ~binarize?)
[{:batch _ :n-examples _}]
`(MnistDataSetIterator. ~batch ~n-examples))))
(defmethod iterator :raw-mnist-dataset-iter [opts]
(let [config (:raw-mnist-dataset-iter opts)
{batch :batch
n-examples :n-examples} config]
`(RawMnistDataSetIterator. ~batch ~n-examples)))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; spark dataset iterator mulimethods
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; going to update once I get to updating spark stuff
(defmethod iterator :path-to-ds [opts]
(let [config (:path-to-ds opts)
{str-paths :string-paths
iter :iter} config]
(if str-paths
`(PathSparkDataSetIterator. ~str-paths)
`(PathSparkDataSetIterator. ~iter))))
(defmethod iterator :path-to-multi-ds [opts]
(let [config (:path-to-multi-ds opts)
{str-paths :string-paths
iter :iter} config]
(if str-paths
`(PathSparkMultiDataSetIterator. ~str-paths)
`(PathSparkMultiDataSetIterator. ~iter))))
(defmethod iterator :portable-ds-stream [opts]
(let [config (:portable-ds-stream opts)
{streams :streams
iter :iter} config]
(if streams
`(PortableDataStreamDataSetIterator. ~streams)
`(PortableDataStreamDataSetIterator. ~iter))))
(defmethod iterator :portable-multi-ds-stream [opts]
(let [config (:portable-multi-ds-stream opts)
{streams :streams
iter :iter} config]
(if streams
`(PortableDataStreamMultiDataSetIterator. ~streams)
`(PortableDataStreamMultiDataSetIterator. ~iter))))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; record reader dataset iterators user facing fns
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defn new-record-reader-dataset-iterator
"creates a new record reader dataset iterator by calling its constructor
with the supplied args. args are:
:record-reader (record-reader) a record reader, see datavec.api.records.readers
:as-code? (boolean), return java object or code for creating it
:batch-size (int) the batch size
:label-idx (int) the index of the labels in a dataset
:n-possible-labels (int) the number of possible labels
:label-idx-from (int) starting column for range of columns containing labels in the dataset
:label-idx-to (int) ending column for range of columns containing labels in the dataset
:regression? (boolean) are we dealing with a regression or classification problem
:max-num-batches (int) the maximum number of batches the iterator should go through
:writeable-converter (writable), converts a writable to another data type
- need to have a central source of their creation
-
- the classes which implement this interface
- opts are new-double-writable-converter, new-float-writable-converter
new-label-writer-converter, new-self-writable-converter
- see: dl4clj.datavec.api.io
see: "
[& {:keys [record-reader batch-size label-idx n-possible-labels
label-idx-from label-idx-to regression? max-num-batches
writeable-converter as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:rr-dataset-iter opts})]
(obj-or-code? as-code? code)))
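For instance, a hypothetical invocation (csv-rr stands in for a record reader built via datavec.api.records.readers, and the column numbers are made up):

```clojure
(comment
  ;; iris-style CSV: label in column 4, three possible classes
  (new-record-reader-dataset-iterator
   :record-reader csv-rr
   :batch-size 10
   :label-idx 4
   :n-possible-labels 3))
```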
(defn new-seq-record-reader-dataset-iterator
"creates a new sequence record reader dataset iterator by calling its constructor
with the supplied args. args are:
:record-reader (sequence-record-reader) a record reader, see datavec.api.records.readers
:as-code? (boolean), return java object or code for creating it
:mini-batch-size (int) the mini batch size
:n-possible-labels (int) the number of possible labels
:label-idx (int) the index of the labels in a dataset
:regression? (boolean) are we dealing with a regression or classification problem
  :labels-reader (record-reader) a record reader specifically for labels, see datavec.api.records.readers
  :features-reader (record-reader) a record reader specifically for features, see datavec.api.records.readers
:alignment-mode (keyword), one of :equal-length, :align-start, :align-end
-see
see: "
[& {:keys [record-reader mini-batch-size n-possible-labels label-idx regression?
labels-reader features-reader alignment-mode as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:seq-rr-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-record-reader-multi-dataset-iterator
;; update after refactor of multi method
"creates a new record reader multi dataset iterator by calling its builder with
the supplied args. args are:
:alignment-mode (keyword), one of :equal-length, :align-start, :align-end
-see:
  :batch-size (int), size of the batches the iterator uses for a single run
:add-seq-reader (map) {:reader-name (str) :record-reader (record-reader)}, a sequence record reader
-see: datavec.api.records.readers
:add-reader (map) {:reader-name (str) :record-reader (record-reader)}, a record reader
-see: datavec.api.records.readers
:add-output-one-hot (map) {:reader-name (str) :column (int) :n-classes (int)}
:add-input-one-hot (map) {:reader-name (str) :column (int) :n-classes (int)}
:add-input (map) {:reader-name (str) :first-column (int) :last-column (int)}
:add-output (map) {:reader-name (str) :first-column (int) :last-column (int)}
see: "
[& {:keys [alignment-mode batch-size add-seq-reader
add-reader add-output-one-hot add-output
add-input-one-hot add-input as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:multi-dataset-iter opts})]
(obj-or-code? as-code? code)))
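A hypothetical builder configuration, registering one assumed record reader rr under the name "rr" and splitting its columns into inputs and outputs:

```clojure
(comment
  (new-record-reader-multi-dataset-iterator
   :batch-size 32
   :add-reader {:reader-name "rr" :record-reader rr}
   :add-input {:reader-name "rr" :first-column 0 :last-column 3}
   :add-output {:reader-name "rr" :first-column 4 :last-column 4}))
```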
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; dataset iterators user facing fns
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defn new-async-dataset-iterator
  "AsyncDataSetIterator takes an existing DataSetIterator and loads one or more
DataSet objects from it using a separate thread. For data sets where
(next! some-iterator) is long running (limited by disk read or processing time for example)
this may improve performance by loading the next DataSet asynchronously
(i.e., while training is continuing on the previous DataSet).
Obviously this may use additional memory.
Note however that due to asynchronous loading of data, (next! iter n) is not supported.
:as-code? (boolean), return java object or code for creating it
:iter (ds-iterator), a dataset iterator
- see: dl4clj.datasets.iterators (this ns)
:que-size (int), the size of the que
:que (blocking-que), the que containing the dataset
see: "
[& {:keys [iter que-size que as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:async-dataset-iter opts})]
(obj-or-code? as-code? code)))
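e.g., wrapping an assumed iterator base-iter for asynchronous prefetching with a queue of 4:

```clojure
(comment
  (new-async-dataset-iterator :iter base-iter :que-size 4))
```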
(defn new-existing-dataset-iterator
"This wrapper provides DataSetIterator interface to existing datasets or dataset iterators
:dataset (iterable), an iterable object, some dataset
:as-code? (boolean), return java object or code for creating it
:total-examples (int), the total number of examples
:n-features (int), the total number of features in the dataset
:n-labels (int), the number of labels in a dataset
:labels (list), a list of labels as strings
:iter (iterator), a dataset iterator
- see: dl4clj.datasets.iterators (this ns)
see: "
[& {:keys [dataset total-examples n-features n-labels labels iter as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:existing-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-sampling-dataset-iterator
"A wrapper for a dataset to sample from.
This will randomly sample from the given dataset.
:sampling-source (dataset), the dataset to sample from
:as-code? (boolean), return java object or code for creating it
:batch-size (int), the batch size
:total-n-samples (int), the total number of desired samples from the dataset
see: "
[& {:keys [sampling-source batch-size total-n-samples as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:sampling-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-reconstruction-dataset-iterator
"Wraps a dataset iterator, setting the first (feature matrix) as the labels.
ds-iter (iterator), the iterator to wrap
- see: dl4clj.datasets.iterators (this ns)
:as-code? (boolean), return java object or code for creating it
see: "
[& {:keys [iter as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:reconstruction-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-multiple-epochs-iterator
"A dataset iterator for doing multiple passes over a dataset
:iter (dataset iterator), an iterator for a dataset
- see: dl4clj.datasets.iterators (this ns)
:as-code? (boolean), return java object or code for creating it
  :que-size (int), the size of the queue used to prefetch batches between passes
:total-iterations (long), the total number of times to run through the dataset
:n-epochs (int), the number of epochs to run
:dataset (dataset), a dataset
see: "
[& {:keys [iter que-size total-iterations n-epochs dataset as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:multiple-epochs-iter opts})]
(obj-or-code? as-code? code)))
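e.g., making ten passes over an assumed iterator train-iter:

```clojure
(comment
  (new-multiple-epochs-iterator :n-epochs 10 :iter train-iter))
```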
(defn new-moving-window-base-dataset-iterator
;; currently can't test this until I figure out the issue im running into with
;; moving window dataset fetcher
;; this calls the fetcher behind the scene
"DataSetIterator for moving window (rotating matrices)
:batch-size (int), the batch size
:as-code? (boolean), return java object or code for creating it
:n-examples (int), the total number of examples
:dataset (dataset), a dataset to make new examples from
:window-rows (int), the number of rows to rotate
:window-columns (int), the number of columns to rotate
see: "
[& {:keys [batch-size n-examples dataset window-rows window-columns as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:moving-window-base-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-async-multi-dataset-iterator
"Async prefetching iterator wrapper for MultiDataSetIterator implementations
use caution when using this with a CUDA backend
:multi-dataset-iter (multidataset iterator), iterator to wrap
:as-code? (boolean), return java object or code for creating it
:que-length (int), length of the que for async processing
see: "
[& {:keys [multi-dataset-iter que-length as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:async-multi-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-iterator-dataset-iterator
"A DataSetIterator that works on an Iterator, combining and splitting the input
DataSet objects as required to get a consistent batch size.
Typically used in Spark training, but may be used elsewhere.
NOTE: reset method is not supported here.
:iter (iter), an iterator containing datasets
- see: dl4clj.datasets.iterators (this ns)
:as-code? (boolean), return java object or code for creating it
:batch-size (int), the batch size
see: "
[& {:keys [iter batch-size as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:iterator-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-iterator-multi-dataset-iterator
"A DataSetIterator that works on an Iterator, combining and splitting the input
DataSet objects as required to get a consistent batch size.
Typically used in Spark training, but may be used elsewhere.
NOTE: reset method is not supported here.
:multi-dataset-iter (iter) an iterator containing multiple datasets
:as-code? (boolean), return java object or code for creating it
:batch-size (int), the batch size
see: "
[& {:keys [multi-dataset-iter batch-size as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:iterator-multi-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-doubles-dataset-iterator
"creates a dataset iterator which iterates over the supplied features and labels
:features (coll of doubles), a collection of doubles which acts as inputs
- [0.2 0.4 ...]
:labels (coll of doubles), a collection of doubles which acts as targets
- [0.4 0.8 ...]
:as-code? (boolean), return java object or code for creating it
:batch-size (int), the batch size
see: "
[& {:keys [features labels batch-size as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:doubles-dataset-iter opts})]
(obj-or-code? as-code? code)))
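A minimal, fully literal example: two feature/label doubles, batched one at a time.

```clojure
(comment
  (new-doubles-dataset-iterator
   :features [0.2 0.4]
   :labels [0.4 0.8]
   :batch-size 1))
```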
(defn new-floats-dataset-iterator
"creates a dataset iterator which iterates over the supplied iterable
:features (coll of floats), a collection of floats which acts as inputs
:labels (coll of floats), a collection of floats which acts as the targets
:as-code? (boolean), return java object or code for creating it
:batch-size (int), the batch size
see: "
[& {:keys [features labels batch-size as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:floats-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-INDArray-dataset-iterator
"creates a dataset iterator given a pair of INDArrays and a batch-size
:features (vec or INDArray), an INDArray which acts as inputs
- see: nd4clj.linalg.factory.nd4j
:labels (vec or INDArray), an INDArray which as the targets
- see: nd4clj.linalg.factory.nd4j
:as-code? (boolean), return java object or code for creating it
:batch-size (int), the batch size
see: "
[& {:keys [features labels batch-size as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:INDArray-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-multi-data-set-iterator-adapter
"Iterator that adapts a DataSetIterator to a MultiDataSetIterator
:iter (dataset-iterator), an iterator for a dataset
- see: dl4clj.datasets.iterators (this ns)
:as-code? (boolean), return java object or code for creating it
see: "
[& {:keys [iter as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:ds-iter-to-multi-ds-iter opts})]
(obj-or-code? as-code? code)))
(defn new-list-dataset-iterator
"creates a new list data set iterator given a collection of datasets.
:dataset (collection), a collection of dataset examples
- from (as-list dataset), from nd4clj.linalg.dataset.api.data-set
:batch-size (int), the batch size, if not supplied, defaults to 5
:as-code? (boolean), return java object or code for creating it
see: "
[& {:keys [dataset batch-size as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:list-ds-iter opts})]
(obj-or-code? as-code? code)))
(defn new-singleton-multi-dataset-iterator
"A very simple adapter class for converting a single MultiDataSet to a MultiDataSetIterator.
:multi-dataset (multi-dataset), the MultiDataSet to iterate over
:as-code? (boolean), return java object or code for creating it
see: "
[& {:keys [multi-dataset as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:multi-ds-to-multi-ds-iter opts})]
(obj-or-code? as-code? code)))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; default dataset iterators user facing fns
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defn new-curves-dataset-iterator
"creates a dataset iterator for curbes data
:batch-size (int), the size of the batch
:as-code? (boolean), return java object or code for creating it
:n-examples (int), the total number of examples
see: "
[& {:keys [batch-size n-examples as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:curves-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-cifar-data-set-iterator
"Load the images from the cifar dataset,
:batch-size (int), the batch size
:as-code? (boolean), return java object or code for creating it
:n-examples (int), the number of examples from the ds to include in the iterator
:img-dims (vector), desired dimensions of the images
- should contain 3 ints
:train? (boolean), are we training or testing?
:use-special-pre-process-cifar? (boolean), are we going to use the predefined preprocessor built for this dataset
- There is a special preProcessor used to normalize the dataset based on Sergey Zagoruyko example
:img-transform (map) config map for an image-transformation (as of writing this doc string, not implemented)
:n-possible-labels (int), specify the number of possible outputs/tags/classes for a given image
see:
and: "
[& {:keys [batch-size n-examples img-dims train?
use-special-pre-process-cifar?
n-possible-labels img-transform as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:cifar-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-iris-data-set-iterator
"IrisDataSetIterator handles traversing through the Iris Data Set.
:batch-size (int), size of the batch
:as-code? (boolean), return java object or code for creating it
:n-examples (int), number of examples to iterator over
see: "
[& {:keys [batch-size n-examples as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:iris-dataset-iter opts})]
(obj-or-code? as-code? code)))
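;; A hypothetical usage sketch. With the default :as-code? true the fn returns
;; the Clojure form that would construct the Java iterator; pass :as-code? false
;; to evaluate it (requires the deeplearning4j artifacts on the classpath):
(comment
 (new-iris-data-set-iterator :batch-size 10 :n-examples 150)
 ;; returns something like (IrisDataSetIterator. 10 150), unevaluated
 (new-iris-data-set-iterator :batch-size 10 :n-examples 150 :as-code? false))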
(defn new-lfw-data-set-iterator
"Creates a dataset iterator for the LFW image dataset.
:img-dims (vec), desired dimensions of the images
:as-code? (boolean), return java object or code for creating it
:batch-size (int), the batch size
:n-examples (int), number of examples to take from the dataset
:use-subset? (boolean), use a subset of the dataset or the whole thing
:train? (boolean), are we training a net or testing it
:split-train-test (double), the division between training and testing datasets
:n-labels (int), the number of possible classifications for a single image
:seed (int), number used to keep randomization consistent
:label-generator (label generator), call (new-parent-path-label-generator) or
(new-pattern-path-label-generator opts) to create a label generator to use
:image-transform (map), a transform to apply to the images,
- as of writing this doc string, this functionality not implemented
see: "
[& {:keys [img-dims batch-size n-examples use-subset? train? split-train-test
n-labels seed label-generator image-transform as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:lfw-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-mnist-data-set-iterator
"creates a dataset iterator for the Mnist dataset
:batch-size (int), the batch size
:as-code? (boolean), return java object or code for creating it
:train? (boolean), training or testing
:seed (int), used to consistently randomize the dataset
:n-examples (int), the overall number of examples
:binarize? (boolean), whether to binarize mnist or not
:shuffle? (boolean), whether to shuffle the dataset or not
:batch (int), size of each batch
 - supplying batch-size will retrieve the entire dataset whereas batch will get a subset
see: "
[& {:keys [batch-size train? seed n-examples binarize? shuffle? rng-seed batch
as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:mnist-dataset-iter opts})]
(obj-or-code? as-code? code)))
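;; Usage sketch (values are illustrative): the full MNIST training split,
;; shuffled with a fixed seed so runs are reproducible. This key combination
;; matches the six-arg MnistDataSetIterator constructor dispatched above:
(comment
 (new-mnist-data-set-iterator :batch 64 :n-examples 60000 :binarize? false
                              :train? true :shuffle? true :seed 123))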
(defn new-raw-mnist-data-set-iterator
"Mnist data with scaled pixels
:batch (int) size of each batch
:as-code? (boolean), return java object or code for creating it
:n-examples (int), the overall number of examples
see: "
[& {:keys [batch n-examples as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:raw-mnist-dataset-iter opts})]
(obj-or-code? as-code? code)))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; spark dataset iterator user facing fns
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defn new-path-spark-ds-iterator
"A DataSetIterator that loads serialized DataSet objects
from a String that represents the path
-note: DataSet objects saved from a DataSet to an output stream
:string-paths (coll), a collection of string paths representing the location
of the data-set streams
:iter (java.util.Iterator), an iterator for a collection of string paths
representing the location of the data-set streams
you should supply either :string-paths or :iter; if you supply both, the
collection of string paths will be used
see: "
[& {:keys [string-paths iter as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:path-to-ds opts})]
(obj-or-code? as-code? code)))
(defn new-path-spark-multi-ds-iterator
"A DataSetIterator that loads serialized MultiDataSet objects
from a String that represents the path
-note: DataSet objects saved from a MultiDataSet to an output stream
:string-paths (coll), a collection of string paths representing the location
of the data-set streams
:iter (java.util.Iterator), an iterator for a collection of string paths
representing the location of the data-set streams
you should supply either :string-paths or :iter; if you supply both, the
collection of string paths will be used
see: "
[& {:keys [string-paths iter as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:path-to-multi-ds opts})]
(obj-or-code? as-code? code)))
(defn new-spark-portable-datastream-ds-iterator
"A DataSetIterator that loads serialized DataSet objects
from a PortableDataStream, usually obtained from SparkContext.binaryFiles()
-note: DataSet objects saved from a DataSet to an output stream
:streams (coll), a collection of portable datastreams
:iter (java.util.Iterator), an iterator for a collection of portable datastreams
you should supply either :streams or :iter; if you supply both, the collection
of portable datastreams will be used
see: "
[& {:keys [streams iter as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:portable-ds-stream opts})]
(obj-or-code? as-code? code)))
(defn new-spark-portable-datastream-multi-ds-iterator
"A DataSetIterator that loads serialized MultiDataSet objects
from a PortableDataStream, usually obtained from SparkContext.binaryFiles()
-note: DataSet objects saved from a MultiDataSet to an output stream
:streams (coll), a collection of portable datastreams
:iter (java.util.Iterator), an iterator for a collection of portable datastreams
you should supply either :streams or :iter; if you supply both, the collection
of portable datastreams will be used
see: "
[& {:keys [streams iter as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:portable-multi-ds-stream opts})]
(obj-or-code? as-code? code)))
(ns dl4clj.datasets.iterators
(:import [org.deeplearning4j.datasets.datavec RecordReaderDataSetIterator
RecordReaderMultiDataSetIterator$Builder RecordReaderMultiDataSetIterator
SequenceRecordReaderDataSetIterator]
[org.deeplearning4j.datasets.iterator
DoublesDataSetIterator FloatsDataSetIterator INDArrayDataSetIterator
AsyncDataSetIterator AsyncMultiDataSetIterator CombinedPreProcessor
CombinedPreProcessor$Builder CurvesDataSetIterator IteratorDataSetIterator
IteratorMultiDataSetIterator MovingWindowBaseDataSetIterator
MultipleEpochsIterator ReconstructionDataSetIterator SamplingDataSetIterator
ExistingDataSetIterator]
[org.deeplearning4j.datasets.iterator.impl MultiDataSetIteratorAdapter
ListDataSetIterator SingletonMultiDataSetIterator]
[org.deeplearning4j.datasets.iterator.impl
CifarDataSetIterator IrisDataSetIterator LFWDataSetIterator
MnistDataSetIterator RawMnistDataSetIterator]
[org.deeplearning4j.spark.iterator
PathSparkDataSetIterator
PathSparkMultiDataSetIterator
PortableDataStreamDataSetIterator
PortableDataStreamMultiDataSetIterator]
[java.util Random])
(:require [dl4clj.constants :refer [value-of]]
[dl4clj.berkeley :refer [new-pair]]
[dl4clj.helpers :refer :all]
[dl4clj.utils :refer [contains-many? generic-dispatching-fn builder-fn
obj-or-code?]]
[clojure.core.match :refer [match]]
[nd4clj.linalg.factory.nd4j :refer [vec-or-matrix->indarray]]))
(defmulti iterator
"Multimethod that builds a dataset iterator based on the supplied type and opts"
generic-dispatching-fn)
(defmethod iterator :rr-dataset-iter [opts]
(let [config (:rr-dataset-iter opts)
{rr :record-reader
batch-size :batch-size
label-idx :label-idx
n-labels :n-possible-labels
l-idx-from :label-idx-from
l-idx-to :label-idx-to
regression? :regression?
max-n-batches :max-num-batches
converter :writeable-converter} config]
(match [config]
[{:writeable-converter _ :batch-size _ :label-idx-from _
:label-idx-to _ :n-possible-labels _ :max-num-batches _
:regression? _ :record-reader _}]
`(RecordReaderDataSetIterator. ~rr ~converter ~batch-size ~l-idx-from
~l-idx-to ~n-labels ~max-n-batches
~regression?)
[{:writeable-converter _ :batch-size _ :label-idx _
:n-possible-labels _ :max-num-batches _ :regression? _
:record-reader _}]
`(RecordReaderDataSetIterator. ~rr ~converter ~batch-size ~label-idx
~n-labels ~max-n-batches ~regression?)
[{:writeable-converter _ :batch-size _ :label-idx _
:n-possible-labels _ :regression? _ :record-reader _}]
`(RecordReaderDataSetIterator. ~rr ~converter ~batch-size ~label-idx
~n-labels ~regression?)
[{:record-reader _ :batch-size _ :label-idx-from _
:label-idx-to _ :regression? _}]
`(RecordReaderDataSetIterator.
~rr ~batch-size ~l-idx-from ~l-idx-to ~regression?)
[{:record-reader _ :batch-size _ :label-idx _
:n-possible-labels _ :max-num-batches _}]
`(RecordReaderDataSetIterator.
~rr ~batch-size ~label-idx ~n-labels ~max-n-batches)
[{:writeable-converter _ :batch-size _ :label-idx _
:n-possible-labels _ :record-reader _}]
`(RecordReaderDataSetIterator. ~rr ~converter ~batch-size ~label-idx ~n-labels)
[{:record-reader _ :batch-size _ :label-idx _ :n-possible-labels _}]
`(RecordReaderDataSetIterator.
~rr ~batch-size ~label-idx ~n-labels)
[{:writeable-converter _ :batch-size _ :record-reader _}]
`(RecordReaderDataSetIterator. ~rr ~converter ~batch-size)
[{:batch-size _ :record-reader _}]
`(RecordReaderDataSetIterator. ~rr ~batch-size))))
(defmethod iterator :seq-rr-dataset-iter [opts]
(let [config (:seq-rr-dataset-iter opts)
{rr :record-reader
m-batch-size :mini-batch-size
n-labels :n-possible-labels
label-idx :label-idx
regression? :regression?
labels-reader :labels-reader
features-reader :features-reader
alignment :alignment-mode} config]
(match [config]
[{:labels-reader _ :features-reader _ :mini-batch-size _
:n-possible-labels _ :regression? _ :alignment-mode _}]
`(SequenceRecordReaderDataSetIterator.
~features-reader ~labels-reader ~m-batch-size ~n-labels ~regression?
~(value-of-helper :seq-alignment-mode alignment))
[{:labels-reader _ :features-reader _ :mini-batch-size _
:n-possible-labels _ :regression? _}]
`(SequenceRecordReaderDataSetIterator.
~features-reader ~labels-reader ~m-batch-size ~n-labels ~regression?)
[{:record-reader _ :mini-batch-size _ :n-possible-labels _
:label-idx _ :regression? _}]
`(SequenceRecordReaderDataSetIterator.
~rr ~m-batch-size ~n-labels ~label-idx ~regression?)
[{:labels-reader _ :features-reader _ :mini-batch-size _ :n-possible-labels _}]
`(SequenceRecordReaderDataSetIterator.
~features-reader ~labels-reader ~m-batch-size ~n-labels)
[{:record-reader _ :mini-batch-size _ :n-possible-labels _ :label-idx _}]
`(SequenceRecordReaderDataSetIterator.
~rr ~m-batch-size ~n-labels ~label-idx))))
(defmethod iterator :multi-dataset-iter [opts]
(assert (integer? (:batch-size (:multi-dataset-iter opts)))
"you must supply batch-size and it must be an integer")
(let [config (:multi-dataset-iter opts)
{add-input :add-input
add-input-hot :add-input-one-hot
add-output :add-output
add-output-hot :add-output-one-hot
add-reader :add-reader
add-seq-reader :add-seq-reader
alignment :alignment-mode
batch-size :batch-size} config
{reader-name :reader-name
first-column :first-column
last-column :last-column} add-input
{hot-reader-name :reader-name
hot-column :column
hot-num-classes :n-classes} add-input-hot
{output-reader-name :reader-name
output-first-column :first-column
output-last-column :last-column} add-output
{hot-output-reader-name :reader-name
hot-output-column :column
hot-output-n-classes :n-classes} add-output-hot
{record-reader-name :reader-name
rr :record-reader} add-reader
{seq-reader-name :reader-name
seq-rr :record-reader} add-seq-reader
method-map {:input '.addInput
:input-one-hot '.addInputOneHot
:output '.addOutput
:output-one-hot '.addOutputOneHot
:reader '.addReader
:seq-reader '.addSequenceReader
:alignment '.sequenceAlignmentMode}
updated-opts {:alignment (if alignment (value-of-helper :multi-alignment-mode
alignment))
:reader (if add-reader [record-reader-name rr])
:seq-reader (if add-seq-reader [seq-reader-name seq-rr])
:output-one-hot (if add-output-hot [hot-output-reader-name
hot-output-column
hot-output-n-classes])
:output (if add-output [output-reader-name
output-first-column
output-last-column])
:input-one-hot (if add-input-hot [hot-reader-name
hot-column
hot-num-classes])
:input (if add-input [reader-name first-column last-column])}
opts* (into {} (filter val updated-opts))
b `(RecordReaderMultiDataSetIterator$Builder. ~batch-size)]
`(.build
~(builder-fn b method-map opts*))))
(defmethod iterator :doubles-dataset-iter [opts]
(let [config (:doubles-dataset-iter opts)
{features :features
labels :labels
batch-size :batch-size} config]
`(DoublesDataSetIterator. [(new-pair :p1 (double-array ~features)
:p2 (double-array ~labels))]
~batch-size)))
(defmethod iterator :floats-dataset-iter [opts]
(let [config (:floats-dataset-iter opts)
{features :features
labels :labels
batch-size :batch-size} config]
`(FloatsDataSetIterator. [(new-pair :p1 (float-array ~features)
:p2 (float-array ~labels))]
~batch-size)))
(defmethod iterator :INDArray-dataset-iter [opts]
(let [config (:INDArray-dataset-iter opts)
{features :features
labels :labels
batch-size :batch-size} config]
`(INDArrayDataSetIterator. [(new-pair :p1 (vec-or-matrix->indarray ~features)
:p2 (vec-or-matrix->indarray ~labels))]
~batch-size)))
(defmethod iterator :iterator-multi-dataset-iter [opts]
(let [config (:iterator-multi-dataset-iter opts)
{iter :multi-dataset-iter
batch-size :batch-size} config]
`(IteratorMultiDataSetIterator. ~iter ~batch-size)))
(defmethod iterator :iterator-dataset-iter [opts]
(let [config (:iterator-dataset-iter opts)
{iter :iter
batch-size :batch-size} config]
`(IteratorDataSetIterator. ~iter ~batch-size)))
(defmethod iterator :async-multi-dataset-iter [opts]
(let [config (:async-multi-dataset-iter opts)
{iter :multi-dataset-iter
que-l :que-length} config]
`(AsyncMultiDataSetIterator. ~iter ~que-l)))
(defmethod iterator :moving-window-base-dataset-iter [opts]
(let [config (:moving-window-base-dataset-iter opts)
{batch :batch-size
n-examples :n-examples
data :dataset
window-rows :window-rows
window-columns :window-columns} config]
`(MovingWindowBaseDataSetIterator. ~batch ~n-examples ~data ~window-rows ~window-columns)))
(defmethod iterator :multiple-epochs-iter [opts]
(let [config (:multiple-epochs-iter opts)
{iter :iter
q-size :que-size
t-iterations :total-iterations
n-epochs :n-epochs
ds :dataset} config]
(match [config]
[{:n-epochs _ :iter _ :que-size _}]
`(MultipleEpochsIterator. ~n-epochs ~iter ~q-size)
[{:iter _ :que-size _ :total-iterations _}]
`(MultipleEpochsIterator. ~iter ~q-size ~t-iterations)
[{:n-epochs _ :iter _}]
`(MultipleEpochsIterator. ~n-epochs ~iter)
[{:n-epochs _ :dataset _}]
`(MultipleEpochsIterator. ~n-epochs ~ds)
[{:iter _ :total-iterations _}]
`(MultipleEpochsIterator. ~iter ~t-iterations))))
(defmethod iterator :reconstruction-dataset-iter [opts]
(let [config (:reconstruction-dataset-iter opts)
iter (:iter config)]
`(ReconstructionDataSetIterator. ~iter)))
(defmethod iterator :sampling-dataset-iter [opts]
(let [config (:sampling-dataset-iter opts)
{ds :sampling-source
batch-size :batch-size
n-samples :total-n-samples} config]
`(SamplingDataSetIterator. ~ds ~batch-size ~n-samples)))
(defmethod iterator :existing-dataset-iter [opts]
(let [config (:existing-dataset-iter opts)
{iterable :dataset
n-examples :total-examples
n-features :n-features
n-labels :n-labels
labels :labels
ds-iter :iter} config]
(match [config]
[{:dataset _ :total-examples _ :n-features _ :n-labels _}]
`(ExistingDataSetIterator. ~iterable ~n-examples ~n-features ~n-labels)
[{:dataset _ :labels _}]
`(ExistingDataSetIterator. ~iterable ~labels)
[{:iter _ :labels _}]
`(ExistingDataSetIterator. ~ds-iter ~labels)
[{:iter _}]
`(ExistingDataSetIterator. ~ds-iter)
[{:dataset _}]
`(ExistingDataSetIterator. ~iterable))))
(defmethod iterator :async-dataset-iter [opts]
(let [config (:async-dataset-iter opts)
{ds-iter :iter
que-size :que-size
que :que} config]
(match [config]
[{:iter _ :que _ :que-size _}]
`(AsyncDataSetIterator. ~ds-iter ~que-size ~que)
[{:iter _ :que-size _}]
`(AsyncDataSetIterator. ~ds-iter ~que-size)
[{:iter _}]
`(AsyncDataSetIterator. ~ds-iter))))
(defmethod iterator :ds-iter-to-multi-ds-iter [opts]
(let [conf (:ds-iter-to-multi-ds-iter opts)
iter (:iter conf)]
`(MultiDataSetIteratorAdapter. ~iter)))
(defmethod iterator :list-ds-iter [opts]
(let [conf (:list-ds-iter opts)
{ds :dataset
batch-size :batch-size} conf]
(if (contains? conf :batch-size)
`(ListDataSetIterator. ~ds ~batch-size)
`(ListDataSetIterator. ~ds))))
(defmethod iterator :multi-ds-to-multi-ds-iter [opts]
(let [conf (:multi-ds-to-multi-ds-iter opts)
mds (:multi-dataset conf)]
`(SingletonMultiDataSetIterator. ~mds)))
(defmethod iterator :curves-dataset-iter [opts]
(let [config (:curves-dataset-iter opts)
{batch :batch-size
n-examples :n-examples} config]
`(CurvesDataSetIterator. ~batch ~n-examples)))
(defmethod iterator :cifar-dataset-iter [opts]
(let [config (:cifar-dataset-iter opts)
{batch-size :batch-size
n-examples :n-examples
img-dims :img-dims
train? :train?
use-special-pre-process-cifar? :use-special-pre-process-cifar?
n-possible-labels :n-possible-labels
img-transform :img-transform} config
img `(int-array ~img-dims)]
(match [config]
[{:batch-size _ :n-examples _ :img-dims _ :n-possible-labels _
:img-transform _ :use-special-pre-process-cifar? _ :train? _}]
`(CifarDataSetIterator. ~batch-size ~n-examples ~img ~n-possible-labels
~img-transform ~use-special-pre-process-cifar? ~train?)
[{:batch-size _ :n-examples _ :img-dims _
:use-special-pre-process-cifar? _ :train? _}]
`(CifarDataSetIterator. ~batch-size ~n-examples ~img
~use-special-pre-process-cifar? ~train?)
[{:batch-size _ :n-examples _ :img-dims _ :train? _}]
`(CifarDataSetIterator. ~batch-size ~n-examples ~img ~train?)
[{:batch-size _ :n-examples _ :img-dims _}]
`(CifarDataSetIterator. ~batch-size ~n-examples ~img)
[{:batch-size _ :n-examples _ :train? _}]
`(CifarDataSetIterator. ~batch-size ~n-examples ~train?)
[{:batch-size _ :img-dims _}]
`(CifarDataSetIterator. ~batch-size ~img)
[{:batch-size _ :n-examples _}]
`(CifarDataSetIterator. ~batch-size ~n-examples))))
(defmethod iterator :iris-dataset-iter [opts]
(let [config (:iris-dataset-iter opts)
{batch-size :batch-size
n-examples :n-examples} config]
`(IrisDataSetIterator. ~batch-size ~n-examples)))
(defmethod iterator :lfw-dataset-iter [opts]
(let [config (:lfw-dataset-iter opts)
{img-dims :img-dims
batch-size :batch-size
n-examples :n-examples
use-subset? :use-subset?
train? :train?
split-train-test :split-train-test
n-labels :n-labels
seed :seed
label-generator :label-generator
image-transform :image-transform} config
img `(int-array ~img-dims)
rng (if (contains? config :seed)
`(new Random ~seed)
`(new Random 123))]
(match [config]
[{:batch-size _ :n-examples _ :img-dims _ :n-labels _
:use-subset? _ :label-generator _ :train? _ :split-train-test _
:seed _ :image-transform _}]
`(LFWDataSetIterator. ~batch-size ~n-examples ~img ~n-labels ~use-subset?
~label-generator ~train? ~split-train-test ~image-transform
~rng)
[{:batch-size _ :n-examples _ :img-dims _ :n-labels _
:use-subset? _ :label-generator _ :train? _ :split-train-test _
:seed _}]
`(LFWDataSetIterator. ~batch-size ~n-examples ~img ~n-labels ~use-subset?
~label-generator ~train? ~split-train-test ~rng)
[{:batch-size _ :n-examples _ :img-dims _ :n-labels _
:use-subset? _ :train? _ :split-train-test _ :seed _}]
`(LFWDataSetIterator. ~batch-size ~n-examples ~img ~n-labels ~use-subset?
~train? ~split-train-test ~rng)
[{:batch-size _ :n-examples _ :n-labels _ :train? _ :split-train-test _}]
`(LFWDataSetIterator. ~batch-size ~n-examples ~n-labels ~train? ~split-train-test)
[{:batch-size _ :n-examples _ :img-dims _ :train? _ :split-train-test _}]
`(LFWDataSetIterator. ~batch-size ~n-examples ~img ~train? ~split-train-test)
[{:batch-size _ :n-examples _ :img-dims _ }]
`(LFWDataSetIterator. ~batch-size ~n-examples ~img)
[{:batch-size _ :use-subset? _ :img-dims _ }]
`(LFWDataSetIterator. ~batch-size ~img ~use-subset?)
[{:batch-size _ :n-examples _}]
`(LFWDataSetIterator. ~batch-size ~n-examples)
[{:img-dims _ }]
`(LFWDataSetIterator. ~img))))
(defmethod iterator :mnist-dataset-iter [opts]
(let [config (:mnist-dataset-iter opts)
{batch-size :batch-size
train? :train?
seed :seed
n-examples :n-examples
binarize? :binarize?
shuffle? :shuffle?
batch :batch} config]
(match [config]
[{:batch _ :n-examples _ :binarize? _
:train? _ :shuffle? _ :seed _}]
`(MnistDataSetIterator. ~batch ~n-examples ~binarize? ~train?
~shuffle? ~(long seed))
[{:batch-size _ :train? _ :seed _}]
`(MnistDataSetIterator. ~batch-size ~train? ~(int seed))
[{:batch _ :n-examples _ :binarize? _}]
`(MnistDataSetIterator. ~batch ~n-examples ~binarize?)
[{:batch _ :n-examples _}]
`(MnistDataSetIterator. ~batch ~n-examples))))
(defmethod iterator :raw-mnist-dataset-iter [opts]
(let [config (:raw-mnist-dataset-iter opts)
{batch :batch
n-examples :n-examples} config]
`(RawMnistDataSetIterator. ~batch ~n-examples)))
(defmethod iterator :path-to-ds [opts]
(let [config (:path-to-ds opts)
{str-paths :string-paths
iter :iter} config]
(if str-paths
`(PathSparkDataSetIterator. ~str-paths)
`(PathSparkDataSetIterator. ~iter))))
(defmethod iterator :path-to-multi-ds [opts]
(let [config (:path-to-multi-ds opts)
{str-paths :string-paths
iter :iter} config]
(if str-paths
`(PathSparkMultiDataSetIterator. ~str-paths)
`(PathSparkMultiDataSetIterator. ~iter))))
(defmethod iterator :portable-ds-stream [opts]
(let [config (:portable-ds-stream opts)
{streams :streams
iter :iter} config]
(if streams
`(PortableDataStreamDataSetIterator. ~streams)
`(PortableDataStreamDataSetIterator. ~iter))))
(defmethod iterator :portable-multi-ds-stream [opts]
(let [config (:portable-multi-ds-stream opts)
{streams :streams
iter :iter} config]
(if streams
`(PortableDataStreamMultiDataSetIterator. ~streams)
`(PortableDataStreamMultiDataSetIterator. ~iter))))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; record reader dataset iterators user facing fns
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defn new-record-reader-dataset-iterator
"creates a new record reader dataset iterator by calling its constructor
with the supplied args. args are:
:record-reader (record-reader) a record reader, see datavec.api.records.readers
:as-code? (boolean), return java object or code for creating it
:batch-size (int) the batch size
:label-idx (int) the index of the labels in a dataset
:n-possible-labels (int) the number of possible labels
:label-idx-from (int) starting column for range of columns containing labels in the dataset
:label-idx-to (int) ending column for range of columns containing labels in the dataset
:regression? (boolean) are we dealing with a regression or classification problem
:max-num-batches (int) the maximum number of batches the iterator should go through
:writeable-converter (writable), converts a writable to another data type
- need to have a central source of their creation
-
- the classes which implement this interface
- opts are new-double-writable-converter, new-float-writable-converter
new-label-writer-converter, new-self-writable-converter
- see: dl4clj.datavec.api.io
see: "
[& {:keys [record-reader batch-size label-idx n-possible-labels
label-idx-from label-idx-to regression? max-num-batches
writeable-converter as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:rr-dataset-iter opts})]
(obj-or-code? as-code? code)))
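;; Usage sketch for a classification CSV: columns 0-3 are features, column 4
;; holds one of 3 labels. The reader-creation fn name is assumed to live in
;; dl4clj.datavec.api.records.readers:
(comment
 (new-record-reader-dataset-iterator
  :record-reader (new-csv-record-reader)
  :batch-size 32
  :label-idx 4
  :n-possible-labels 3))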
(defn new-seq-record-reader-dataset-iterator
"creates a new sequence record reader dataset iterator by calling its constructor
with the supplied args. args are:
:record-reader (sequence-record-reader) a record reader, see datavec.api.records.readers
:as-code? (boolean), return java object or code for creating it
:mini-batch-size (int) the mini batch size
:n-possible-labels (int) the number of possible labels
:label-idx (int) the index of the labels in a dataset
:regression? (boolean) are we dealing with a regression or classification problem
:labels-reader (record-reader) a record reader specifically for labels, see datavec.api.records.readers
:features-reader (record-reader) a record reader specifically for features, see datavec.api.records.readers
:alignment-mode (keyword), one of :equal-length, :align-start, :align-end
-see
see: "
[& {:keys [record-reader mini-batch-size n-possible-labels label-idx regression?
labels-reader features-reader alignment-mode as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:seq-rr-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-record-reader-multi-dataset-iterator
"creates a new record reader multi dataset iterator by calling its builder with
the supplied args. args are:
:alignment-mode (keyword), one of :equal-length, :align-start, :align-end
-see:
:batch-size (int), size of the batches the iterator uses for a single run
:add-seq-reader (map) {:reader-name (str) :record-reader (record-reader)}, a sequence record reader
-see: datavec.api.records.readers
:add-reader (map) {:reader-name (str) :record-reader (record-reader)}, a record reader
-see: datavec.api.records.readers
:add-output-one-hot (map) {:reader-name (str) :column (int) :n-classes (int)}
:add-input-one-hot (map) {:reader-name (str) :column (int) :n-classes (int)}
:add-input (map) {:reader-name (str) :first-column (int) :last-column (int)}
:add-output (map) {:reader-name (str) :first-column (int) :last-column (int)}
see: "
[& {:keys [alignment-mode batch-size add-seq-reader
add-reader add-output-one-hot add-output
add-input-one-hot add-input as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:multi-dataset-iter opts})]
(obj-or-code? as-code? code)))
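;; Usage sketch wiring one named reader into separate input and one-hot output
;; spans (reader-creation fn name assumed; column indices are illustrative):
(comment
 (new-record-reader-multi-dataset-iterator
  :batch-size 32
  :add-reader {:reader-name "csv" :record-reader (new-csv-record-reader)}
  :add-input {:reader-name "csv" :first-column 0 :last-column 3}
  :add-output-one-hot {:reader-name "csv" :column 4 :n-classes 3}))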
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; dataset iterators user facing fns
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defn new-async-dataset-iterator
"AsyncDataSetIterator takes an existing DataFmulSetIterator and loads one or more
DataSet objects from it using a separate thread. For data sets where
(next! some-iterator) is long running (limited by disk read or processing time for example)
this may improve performance by loading the next DataSet asynchronously
(i.e., while training is continuing on the previous DataSet).
Obviously this may use additional memory.
Note however that due to asynchronous loading of data, (next! iter n) is not supported.
:as-code? (boolean), return java object or code for creating it
:iter (ds-iterator), a dataset iterator
- see: dl4clj.datasets.iterators (this ns)
:que-size (int), the size of the queue
:que (blocking-queue), the queue containing the dataset
see: "
[& {:keys [iter que-size que as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:async-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-existing-dataset-iterator
"This wrapper provides DataSetIterator interface to existing datasets or dataset iterators
:dataset (iterable), an iterable object, some dataset
:as-code? (boolean), return java object or code for creating it
:total-examples (int), the total number of examples
:n-features (int), the total number of features in the dataset
:n-labels (int), the number of labels in a dataset
:labels (list), a list of labels as strings
:iter (iterator), a dataset iterator
- see: dl4clj.datasets.iterators (this ns)
see: "
[& {:keys [dataset total-examples n-features n-labels labels iter as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:existing-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-sampling-dataset-iterator
"A wrapper for a dataset to sample from.
This will randomly sample from the given dataset.
:sampling-source (dataset), the dataset to sample from
:as-code? (boolean), return java object or code for creating it
:batch-size (int), the batch size
:total-n-samples (int), the total number of desired samples from the dataset
see: "
[& {:keys [sampling-source batch-size total-n-samples as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:sampling-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-reconstruction-dataset-iterator
"Wraps a dataset iterator, setting the first (feature matrix) as the labels.
ds-iter (iterator), the iterator to wrap
- see: dl4clj.datasets.iterators (this ns)
:as-code? (boolean), return java object or code for creating it
see: "
[& {:keys [iter as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:reconstruction-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-multiple-epochs-iterator
"A dataset iterator for doing multiple passes over a dataset
:iter (dataset iterator), an iterator for a dataset
- see: dl4clj.datasets.iterators (this ns)
:as-code? (boolean), return java object or code for creating it
:que-size (int), the size of the internal queue used to prefetch datasets
:total-iterations (long), the total number of times to run through the dataset
:n-epochs (int), the number of epochs to run
:dataset (dataset), a dataset
see: "
[& {:keys [iter que-size total-iterations n-epochs dataset as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:multiple-epochs-iter opts})]
(obj-or-code? as-code? code)))
(defn new-moving-window-base-dataset-iterator
"DataSetIterator for moving window (rotating matrices)
:batch-size (int), the batch size
:as-code? (boolean), return java object or code for creating it
:n-examples (int), the total number of examples
:dataset (dataset), a dataset to make new examples from
:window-rows (int), the number of rows to rotate
:window-columns (int), the number of columns to rotate
see: "
[& {:keys [batch-size n-examples dataset window-rows window-columns as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:moving-window-base-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-async-multi-dataset-iterator
"Async prefetching iterator wrapper for MultiDataSetIterator implementations
use caution when using this with a CUDA backend
:multi-dataset-iter (multidataset iterator), iterator to wrap
:as-code? (boolean), return java object or code for creating it
:que-length (int), length of the queue for async processing
see: "
[& {:keys [multi-dataset-iter que-length as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:async-multi-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-iterator-dataset-iterator
"A DataSetIterator that works on an Iterator, combining and splitting the input
DataSet objects as required to get a consistent batch size.
Typically used in Spark training, but may be used elsewhere.
NOTE: reset method is not supported here.
:iter (iter), an iterator containing datasets
- see: dl4clj.datasets.iterators (this ns)
:as-code? (boolean), return java object or code for creating it
:batch-size (int), the batch size
see: "
[& {:keys [iter batch-size as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:iterator-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-iterator-multi-dataset-iterator
"A DataSetIterator that works on an Iterator, combining and splitting the input
DataSet objects as required to get a consistent batch size.
Typically used in Spark training, but may be used elsewhere.
NOTE: reset method is not supported here.
:multi-dataset-iter (iter) an iterator containing multiple datasets
:as-code? (boolean), return java object or code for creating it
:batch-size (int), the batch size
see: "
[& {:keys [multi-dataset-iter batch-size as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:iterator-multi-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-doubles-dataset-iterator
"creates a dataset iterator which iterates over the supplied features and labels
:features (coll of doubles), a collection of doubles which acts as inputs
- [0.2 0.4 ...]
:labels (coll of doubles), a collection of doubles which acts as targets
- [0.4 0.8 ...]
:as-code? (boolean), return java object or code for creating it
:batch-size (int), the batch size
see: "
[& {:keys [features labels batch-size as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:doubles-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-floats-dataset-iterator
"creates a dataset iterator which iterates over the supplied iterable
:features (coll of floats), a collection of floats which acts as inputs
:labels (coll of floats), a collection of floats which acts as the targets
:as-code? (boolean), return java object or code for creating it
:batch-size (int), the batch size
see: "
[& {:keys [features labels batch-size as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:floats-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-INDArray-dataset-iterator
"creates a dataset iterator given a pair of INDArrays and a batch-size
:features (vec or INDArray), an INDArray which acts as inputs
- see: nd4clj.linalg.factory.nd4j
:labels (vec or INDArray), an INDArray which acts as the targets
- see: nd4clj.linalg.factory.nd4j
:as-code? (boolean), return java object or code for creating it
:batch-size (int), the batch size
see: "
[& {:keys [features labels batch-size as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:INDArray-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-multi-data-set-iterator-adapter
"Iterator that adapts a DataSetIterator to a MultiDataSetIterator
:iter (dataset-iterator), an iterator for a dataset
- see: dl4clj.datasets.iterators (this ns)
:as-code? (boolean), return java object or code for creating it
see: "
[& {:keys [iter as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:ds-iter-to-multi-ds-iter opts})]
(obj-or-code? as-code? code)))
(defn new-list-dataset-iterator
"creates a new list data set iterator given a collection of datasets.
:dataset (collection), a collection of dataset examples
- from (as-list dataset), from nd4clj.linalg.dataset.api.data-set
:batch-size (int), the batch size, if not supplied, defaults to 5
:as-code? (boolean), return java object or code for creating it
see: "
[& {:keys [dataset batch-size as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:list-ds-iter opts})]
(obj-or-code? as-code? code)))
(defn new-singleton-multi-dataset-iterator
"A very simple adapter class for converting a single MultiDataSet to a MultiDataSetIterator.
:multi-dataset (multi-dataset), the MultiDataSet to wrap
:as-code? (boolean), return java object or code for creating it
see: "
[& {:keys [multi-dataset as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:multi-ds-to-multi-ds-iter opts})]
(obj-or-code? as-code? code)))
;; default dataset iterators user facing fns
(defn new-curves-dataset-iterator
"creates a dataset iterator for curves data
:batch-size (int), the size of the batch
:as-code? (boolean), return java object or code for creating it
:n-examples (int), the total number of examples
see: "
[& {:keys [batch-size n-examples as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:curves-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-cifar-data-set-iterator
"Load the images from the cifar dataset,
:batch-size (int), the batch size
:as-code? (boolean), return java object or code for creating it
:n-examples (int), the number of examples from the ds to include in the iterator
:img-dims (vector), desired dimensions of the images
- should contain 3 ints
:train? (boolean), are we training or testing?
:use-special-pre-process-cifar? (boolean), are we going to use the predefined preprocessor built for this dataset
- There is a special preProcessor used to normalize the dataset based on Sergey Zagoruyko's example
:img-transform (map) config map for an image-transformation (as of writing this doc string, not implemented)
:n-possible-labels (int), specify the number of possible outputs/tags/classes for a given image
see:
and: "
[& {:keys [batch-size n-examples img-dims train?
use-special-pre-process-cifar?
n-possible-labels img-transform as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:cifar-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-iris-data-set-iterator
"IrisDataSetIterator handles traversing through the Iris Data Set.
:batch-size (int), size of the batch
:as-code? (boolean), return java object or code for creating it
:n-examples (int), number of examples to iterator over
see: "
[& {:keys [batch-size n-examples as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:iris-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-lfw-data-set-iterator
"Creates a dataset iterator for the LFW image dataset.
:img-dims (vec), desired dimensions of the images
:as-code? (boolean), return java object or code for creating it
:batch-size (int), the batch size
:n-examples (int), number of examples to take from the dataset
:use-subset? (boolean), use a subset of the dataset or the whole thing
:train? (boolean), are we training a net or testing it
:split-train-test (double), the division between training and testing datasets
:n-labels (int), the number of possible classifications for a single image
:seed (int), number used to keep randomization consistent
:label-generator (label generator), call (new-parent-path-label-generator) or
(new-pattern-path-label-generator opts) to create a label generator to use
:image-transform (map), a transform to apply to the images,
- as of writing this doc string, this functionality not implemented
see: "
[& {:keys [img-dims batch-size n-examples use-subset? train? split-train-test
n-labels seed label-generator image-transform as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:lfw-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-mnist-data-set-iterator
"creates a dataset iterator for the Mnist dataset
:batch-size (int), the batch size
:as-code? (boolean), return java object or code for creating it
:train? (boolean), training or testing
:seed (int), used to consistently randomize the dataset
:n-examples (int), the overall number of examples
:binarize? (boolean), whether to binarize mnist or not
:shuffle? (boolean), whether to shuffle the dataset or not
:batch (int), size of each batch
- supplying batch-size will retrieve the entire dataset whereas batch will get a subset
see: "
[& {:keys [batch-size train? seed n-examples binarize? shuffle? rng-seed batch
as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:mnist-dataset-iter opts})]
(obj-or-code? as-code? code)))
(defn new-raw-mnist-data-set-iterator
"Mnist data with scaled pixels
:batch (int), size of each batch
:as-code? (boolean), return java object or code for creating it
:n-examples (int), the overall number of examples
see: "
[& {:keys [batch n-examples as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:raw-mnist-dataset-iter opts})]
(obj-or-code? as-code? code)))
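"Scaled pixels" here just means the raw 0-255 byte intensities are mapped into [0, 1]. A minimal sketch of that normalization (illustrative Python, not the dl4j implementation):

```python
def scale_pixels(pixels):
    """Map raw 0-255 pixel intensities to floats in [0.0, 1.0]."""
    return [p / 255.0 for p in pixels]

batch = [0, 128, 255]
scaled = scale_pixels(batch)
assert scaled[0] == 0.0 and scaled[2] == 1.0
assert 0.0 <= scaled[1] <= 1.0
```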
;; spark dataset iterator user facing fns
(defn new-path-spark-ds-iterator
"A DataSetIterator that loads serialized DataSet objects
from a String that represents the path
-note: DataSet objects saved from a DataSet to an output stream
:string-paths (coll), a collection of string paths representing the location
of the data-set streams
:iter (java.util.Iterator), an iterator for a collection of string paths
representing the location of the data-set streams
you should supply either :string-paths or :iter, if you supply both, will default
to using the collection of string paths
see: "
[& {:keys [string-paths iter as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:path-to-ds opts})]
(obj-or-code? as-code? code)))
(defn new-path-spark-multi-ds-iterator
"A DataSetIterator that loads serialized MultiDataSet objects
from a String that represents the path
-note: MultiDataSet objects saved from a MultiDataSet to an output stream
:string-paths (coll), a collection of string paths representing the location
of the data-set streams
:iter (java.util.Iterator), an iterator for a collection of string paths
representing the location of the data-set streams
you should supply either :string-paths or :iter, if you supply both, will default
to using the collection of string paths
see: "
[& {:keys [string-paths iter as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:path-to-multi-ds opts})]
(obj-or-code? as-code? code)))
(defn new-spark-portable-datastream-ds-iterator
"A DataSetIterator that loads serialized DataSet objects
from a PortableDataStream, usually obtained from SparkContext.binaryFiles()
-note: DataSet objects saved from a DataSet to an output stream
:streams (coll), a collection of portable datastreams
:iter (java.util.Iterator), an iterator for a collection of portable datastreams
you should only supply :streams or :iter, if you supply both, will default to
using the collection of portable datastreams
see: "
[& {:keys [streams iter as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:portable-ds-stream opts})]
(obj-or-code? as-code? code)))
(defn new-spark-portable-datastream-multi-ds-iterator
"A DataSetIterator that loads serialized MultiDataSet objects
from a PortableDataStream, usually obtained from SparkContext.binaryFiles()
-note: MultiDataSet objects saved from a MultiDataSet to an output stream
:streams (coll), a collection of portable datastreams
:iter (java.util.Iterator), an iterator for a collection of portable datastreams
you should only supply :streams or :iter, if you supply both, will default to
using the collection of portable datastreams
see: "
[& {:keys [streams iter as-code?]
:or {as-code? true}
:as opts}]
(let [code (iterator {:portable-multi-ds-stream opts})]
(obj-or-code? as-code? code)))
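Most of the wrappers above reduce to the same contract: walk a dataset in fixed-size batches, honoring a total-examples cap. A toy Python version of that contract (illustrative only, not the dl4j API):

```python
def batched(examples, batch_size, total_examples=None):
    """Yield lists of at most batch_size items, stopping after total_examples."""
    if total_examples is not None:
        examples = examples[:total_examples]
    for i in range(0, len(examples), batch_size):
        yield examples[i:i + batch_size]

batches = list(batched(list(range(10)), batch_size=4, total_examples=9))
assert batches == [[0, 1, 2, 3], [4, 5, 6, 7], [8]]  # last batch may be short
```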
|
f505f1f1ce8d36338ca2ba774684b3730814c60388fd7fe407cd852ebf54fc4a | mstksg/advent-of-code-2022 | Prelude.hs | {-# OPTIONS_GHC -Wno-compat-unqualified-imports #-}
-- |
-- Module : AOC.Prelude
-- Copyright : (c) 2021
-- License : BSD3
--
-- Maintainer :
-- Stability : experimental
-- Portability : non-portable
--
-- Custom Prelude while developing challenges. Ideally, once challenges
-- are completed, an import to this module would be replaced with explicit
-- ones for future readers.
--
module AOC.Prelude (
module P
) where
import AOC.Common as P
import AOC.Common.Point as P
import AOC.Common.Search as P
import AOC.Solver as P
import AOC.Util as P
import Control.Applicative as P
import Control.DeepSeq as P
import Control.Lens as P hiding (uncons)
import Control.Monad as P
import Control.Monad.Except as P
import Control.Monad.State as P
import Data.Bifunctor as P
import Data.Char as P
import Data.Coerce as P
import Data.Containers.ListUtils as P
import Data.Either as P
import Data.Finite as P (Finite, packFinite, getFinite, modulo, finites)
import Data.Foldable as P
import Data.Function as P
import Data.Functor as P
import Data.IntMap as P (IntMap)
import Data.IntMap.NonEmpty as P (NEIntMap)
import Data.IntSet as P (IntSet)
import Data.IntSet.NonEmpty as P (NEIntSet)
import Data.Kind as P
import Data.List as P
import Data.List.NonEmpty as P (NonEmpty(..), nonEmpty)
import Data.List.Split as P
import Data.Map as P (Map)
import Data.Map.NonEmpty as P (NEMap)
import Data.Maybe as P
import Data.Ord as P
import Data.Semigroup as P
import Data.Set as P (Set)
import Data.Set.NonEmpty as P (NESet)
import Data.Text as P (Text)
import Data.Text.Encoding as P (encodeUtf8, decodeUtf8)
import Data.Time as P hiding (Day)
import Data.Traversable as P
import Data.Tuple as P
import Data.Void as P
import Debug.Trace as P
import GHC.Generics as P (Generic)
import Numeric.Natural as P
import Safe.Exact as P
import Safe.Foldable as P
import Text.Printf as P
import Text.Read as P (readMaybe)
| null | https://raw.githubusercontent.com/mstksg/advent-of-code-2022/bfbd94a1ed72ce6a0784677c4584f6a5308153ef/src/AOC/Prelude.hs | haskell | # OPTIONS_GHC -Wno-compat-unqualified-imports #
|
Module : AOC.Prelude
License : BSD3
Stability : experimental
Portability : non-portable
Custom Prelude while developing challenges. Ideally, once challenges
are completed, an import to this module would be replaced with explicit
ones for future readers.
|
-- Copyright : (c) 2021
-- Maintainer :
module AOC.Prelude (
module P
) where
import AOC.Common as P
import AOC.Common.Point as P
import AOC.Common.Search as P
import AOC.Solver as P
import AOC.Util as P
import Control.Applicative as P
import Control.DeepSeq as P
import Control.Lens as P hiding (uncons)
import Control.Monad as P
import Control.Monad.Except as P
import Control.Monad.State as P
import Data.Bifunctor as P
import Data.Char as P
import Data.Coerce as P
import Data.Containers.ListUtils as P
import Data.Either as P
import Data.Finite as P (Finite, packFinite, getFinite, modulo, finites)
import Data.Foldable as P
import Data.Function as P
import Data.Functor as P
import Data.IntMap as P (IntMap)
import Data.IntMap.NonEmpty as P (NEIntMap)
import Data.IntSet as P (IntSet)
import Data.IntSet.NonEmpty as P (NEIntSet)
import Data.Kind as P
import Data.List as P
import Data.List.NonEmpty as P (NonEmpty(..), nonEmpty)
import Data.List.Split as P
import Data.Map as P (Map)
import Data.Map.NonEmpty as P (NEMap)
import Data.Maybe as P
import Data.Ord as P
import Data.Semigroup as P
import Data.Set as P (Set)
import Data.Set.NonEmpty as P (NESet)
import Data.Text as P (Text)
import Data.Text.Encoding as P (encodeUtf8, decodeUtf8)
import Data.Time as P hiding (Day)
import Data.Traversable as P
import Data.Tuple as P
import Data.Void as P
import Debug.Trace as P
import GHC.Generics as P (Generic)
import Numeric.Natural as P
import Safe.Exact as P
import Safe.Foldable as P
import Text.Printf as P
import Text.Read as P (readMaybe)
|
7c7c147075e59f728f9e81cd75f274bccaa29576d0bf07d6ed018826a2700345 | skanev/playground | 80-tests.scm | (require rackunit rackunit/text-ui)
(load "../80.scm")
(define sicp-3.80-tests
(test-suite
"Tests for SICP exercise 3.80"
(check-equal? (stream-take (RLC1 10 0) 4)
'((10 . 0)
(10 . 1.0)
(9.5 . 1.9)
(8.55 . 2.66)))
))
(run-tests sicp-3.80-tests)
| null | https://raw.githubusercontent.com/skanev/playground/d88e53a7f277b35041c2f709771a0b96f993b310/scheme/sicp/03/tests/80-tests.scm | scheme | (require rackunit rackunit/text-ui)
(load "../80.scm")
(define sicp-3.80-tests
(test-suite
"Tests for SICP exercise 3.80"
(check-equal? (stream-take (RLC1 10 0) 4)
'((10 . 0)
(10 . 1.0)
(9.5 . 1.9)
(8.55 . 2.66)))
))
(run-tests sicp-3.80-tests)
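The expected pairs in the test are (vC, iL) samples of a series RLC circuit integrated with forward-Euler steps. The definition of `RLC1` lives in `80.scm` and is not shown here, but the expected output is consistent with R = 1, L = 1, C = 0.2 and dt = 0.1 (parameters inferred from the numbers, so treat them as an assumption). A quick Python check of that reading:

```python
def rlc_states(v0, i0, R=1.0, L=1.0, C=0.2, dt=0.1):
    """Yield (vC, iL) pairs under forward-Euler integration of
    dvC/dt = -iL / C  and  diL/dt = vC / L - R * iL / L."""
    v, i = v0, i0
    while True:
        yield (v, i)
        # simultaneous update: RHS uses the previous v and i
        v, i = v - dt * i / C, i + dt * (v / L - R * i / L)

states = rlc_states(10, 0)
expected = [(10, 0), (10.0, 1.0), (9.5, 1.9), (8.55, 2.66)]
for ev, ei in expected:
    v, i = next(states)
    assert abs(v - ev) < 1e-9 and abs(i - ei) < 1e-9
```

The four asserted pairs match the `stream-take` expectation in the test above.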
| |
b4023aeb8eaa490cb8988899aeb31af41c59ff078f76fdc637cbdb96c68da2fa | tweag/ormolu | empty-classes-out.hs | module Main where
-- | Foo!
class Foo a
-- | Bar!
class Bar a
| null | https://raw.githubusercontent.com/tweag/ormolu/34bdf62429768f24b70d0f8ba7730fc4d8ae73ba/data/examples/declaration/class/empty-classes-out.hs | haskell | | Foo!
| Bar! | module Main where
class Foo a
class Bar a
|
504cf44867d03b457234f65eae71c75e50eeb2c8d646d05dc2d722cb0e853c19 | nuscr/nuscr | pragma.mli | (** This module contains variable configurations, to be set by pragmas or
command line arguments, not to be changed for the duration of the program *)
type t =
| NestedProtocols
| ShowPragmas
| PrintUsage
| RefinementTypes
| SenderValidateRefinements
| ReceiverValidateRefinements
| ValidateRefinementSatisfiability
| ValidateRefinementProgress
[@@deriving show]
val pragma_of_string : string -> t
type pragmas = (t * string option) list [@@deriving show]
val solver_show_queries : unit -> bool
(** Whether to display queries to SMT solvers (with RefinementTypes pragma) *)
val set_solver_show_queries : bool -> unit
(** Set solver_show_queries *)
val nested_protocol_enabled : unit -> bool
(** Whether NestedProtocol pragma is enabled *)
val set_nested_protocol : bool -> unit
(** Set nested_protocol_enabled *)
val refinement_type_enabled : unit -> bool
(** Whether RefinementTypes pragma is enabled *)
val set_refinement_type : bool -> unit
(** Set refinement_type *)
val sender_validate_refinements : unit -> bool
(** When refinement types are enabled, senders should validate refinements *)
val set_sender_validate_refinements : bool -> unit
(** Set sender_validate_refinements *)
val receiver_validate_refinements : unit -> bool
(** When refinement types are enabled, receivers should validate refinements *)
val set_receiver_validate_refinements : bool -> unit
(** Set receiver_validate_refinements *)
val validate_refinement_satisfiability : unit -> bool
(** Validate whether a refined global type is semantically satisfiable *)
val set_validate_refinement_satisfiability : bool -> unit
(** Set validate_refinement_satisfiability *)
val validate_refinement_progress : unit -> bool
(** Validate whether a refined global type satisfies progress semantically *)
val set_validate_refinement_progress : bool -> unit
(** Set validate_refinement_progress *)
val verbose : unit -> bool
(** Whether to produce verbose outputs *)
val set_verbose : bool -> unit
(** Set verbose *)
val reset : unit -> unit
(** Reset all configuration to default *)
val load_from_pragmas : pragmas -> unit
(** Load config from pragmas *)
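The interface above is a set of global, mutable flags with getters, setters, a reset, and bulk loading from pragmas. A compact Python sketch of the same shape (flag names invented for illustration):

```python
# Global flag registry mirroring the getter/setter/reset interface.
_DEFAULTS = {
    "nested_protocol": False,
    "refinement_type": False,
    "verbose": False,
}
_config = dict(_DEFAULTS)

def get_flag(name):
    return _config[name]

def set_flag(name, value):
    _config[name] = value

def reset():
    """Restore every flag to its default, like `reset : unit -> unit`."""
    _config.clear()
    _config.update(_DEFAULTS)

def load_from_pragmas(pragmas):
    """Enable the flag named by each pragma, like `load_from_pragmas`."""
    for name in pragmas:
        set_flag(name, True)

load_from_pragmas(["nested_protocol", "verbose"])
assert get_flag("nested_protocol") and get_flag("verbose")
reset()
assert not get_flag("nested_protocol")
```

Keeping the state behind accessor functions, as the `.mli` does, is what lets the implementation guarantee "not to be changed for the duration of the program" outside of pragma/CLI loading.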
| null | https://raw.githubusercontent.com/nuscr/nuscr/7f350b978712dc639620f05c8e51ec261691f7cf/lib/utils/pragma.mli | ocaml | * This module contains variable configurations, to be set by pragmas or
command line arguments, not to be changed for the duration of the program
* Set solver_show_queries
* Set nested_protocol_enabled
* Whether RefinementTypes pragma is enabled
* Set refinement_type
* When refinement types are enabled, senders should validate refinements
* Set sender_validate_refinements
* When refinement types are enabled, receivers should validate refinements
* Set receiver_validate_refinements
* Validate whether a refined global type is semantically satisfiable
* Set validate_refinement_satisfiability
* Validate whether a refined global type satisfies progress semantically
* Set validate_refinement_progress
* Whether to produce verbose outputs
* Set verbose
* Reset all configuration to default
* Load config from pragmas |
type t =
| NestedProtocols
| ShowPragmas
| PrintUsage
| RefinementTypes
| SenderValidateRefinements
| ReceiverValidateRefinements
| ValidateRefinementSatisfiability
| ValidateRefinementProgress
[@@deriving show]
val pragma_of_string : string -> t
type pragmas = (t * string option) list [@@deriving show]
val solver_show_queries : unit -> bool
(** Whether to display queries to SMT solvers (with RefinementTypes pragma) *)
val set_solver_show_queries : bool -> unit
val nested_protocol_enabled : unit -> bool
(** Whether NestedProtocol pragma is enabled *)
val set_nested_protocol : bool -> unit
val refinement_type_enabled : unit -> bool
val set_refinement_type : bool -> unit
val sender_validate_refinements : unit -> bool
val set_sender_validate_refinements : bool -> unit
val receiver_validate_refinements : unit -> bool
val set_receiver_validate_refinements : bool -> unit
val validate_refinement_satisfiability : unit -> bool
val set_validate_refinement_satisfiability : bool -> unit
val validate_refinement_progress : unit -> bool
val set_validate_refinement_progress : bool -> unit
val verbose : unit -> bool
val set_verbose : bool -> unit
val reset : unit -> unit
val load_from_pragmas : pragmas -> unit
|
f83dbf5bb827f31f21322aa232a1b05ebec5176f375232786ebd9c0bb41f5dab | soupi/yh | VN.hs |
{-# LANGUAGE FunctionalDependencies #-}
{-# LANGUAGE MultiParamTypeClasses #-}
{-# LANGUAGE DuplicateRecordFields #-}
{-# LANGUAGE FlexibleInstances #-}
{-# LANGUAGE TemplateHaskell #-}
{-# LANGUAGE OverloadedStrings #-}
module VN where
import qualified SDL
import qualified Play.Engine.MySDL.MySDL as MySDL
import Play.Engine.Utils
import Play.Engine.Types
import Play.Engine.Input
import Play.Engine.Settings
import Data.Maybe
import Control.Monad.Except
import Control.Lens
import System.Random
import qualified Play.Engine.State as State
import qualified Play.Engine.Load as Load
import qualified Control.Monad.State as SM
import qualified Data.Map as M
import qualified Script
import qualified Play.Engine.Sprite as Spr
data State
= State
{ _bg :: Spr.Sprite
, _resources :: MySDL.Resources
, _script :: Script.Script
, _camera :: Int
}
makeFieldsNoPrefix ''State
wantedAssets :: [(String, MySDL.ResourceType FilePath)]
wantedAssets =
[ ("bg", MySDL.Texture "bg.png")
]
make :: Int -> Script.ScriptData -> State.State
make t sd = Load.mkState t (wantedAssets ++ Script.assets sd) (mkState $ Script.script sd)
mkState
:: (MySDL.Resources -> Script.Script)
-> MySDL.Resources -> Result State.State
mkState scrpt rs = do
state <- initState (scrpt rs) rs
pure $ State.mkState
state
update
render
initState :: Script.Script -> MySDL.Resources -> Result State
initState scrpt rs = do
case M.lookup "bg" (MySDL.textures rs) of
Nothing ->
throwError ["Texture not found: bg"]
Just bgt -> do
pure $ State
{ _bg = fromJust $ Spr.make $ Spr.simpleArgs (Point 800 1000) bgt
, _resources = rs
, _script = scrpt
, _camera = 0
}
update :: Input -> State -> Result (State.Command, State)
update input state = do
_wSize <- _windowSize <$> SM.get
(acts, script') <- Script.update input Nothing mempty (state ^. script)
let
newState =
state'
& set script script'
& over camera
(\c ->
if
| c <= 0 && Script.shake acts -> 60
| c <= 0 -> 0
| otherwise -> c - 1
)
where
state' =
if Script.stopTheWorld acts
then
state
else
state
& over bg
( case Script.changeSprite acts of
Nothing -> Spr.update Nothing False
Just sp -> const sp
)
pure (Script.command acts, newState)
render :: SDL.Renderer -> State -> IO ()
render renderer state = do
cam' <- Point <$> randomRIO (-1, 1) <*> randomRIO (-1, 1) :: IO FPoint
let cam = addPoint $ fmap (floor . (*) (fromIntegral $ state ^. camera `div` 3)) cam'
Spr.render renderer cam (Point 0 0) (state ^. bg . size) (state ^. bg)
Script.render renderer cam (state ^. script)
| null | https://raw.githubusercontent.com/soupi/yh/4cfbcc7da6f3ca5e330b7fd7d4adeacee7e2dff4/app/VN.hs | haskell | # LANGUAGE OverloadedStrings # |
# LANGUAGE FunctionalDependencies #
# LANGUAGE MultiParamTypeClasses #
# LANGUAGE DuplicateRecordFields #
# LANGUAGE FlexibleInstances #
# LANGUAGE TemplateHaskell #
module VN where
import qualified SDL
import qualified Play.Engine.MySDL.MySDL as MySDL
import Play.Engine.Utils
import Play.Engine.Types
import Play.Engine.Input
import Play.Engine.Settings
import Data.Maybe
import Control.Monad.Except
import Control.Lens
import System.Random
import qualified Play.Engine.State as State
import qualified Play.Engine.Load as Load
import qualified Control.Monad.State as SM
import qualified Data.Map as M
import qualified Script
import qualified Play.Engine.Sprite as Spr
data State
= State
{ _bg :: Spr.Sprite
, _resources :: MySDL.Resources
, _script :: Script.Script
, _camera :: Int
}
makeFieldsNoPrefix ''State
wantedAssets :: [(String, MySDL.ResourceType FilePath)]
wantedAssets =
[ ("bg", MySDL.Texture "bg.png")
]
make :: Int -> Script.ScriptData -> State.State
make t sd = Load.mkState t (wantedAssets ++ Script.assets sd) (mkState $ Script.script sd)
mkState
:: (MySDL.Resources -> Script.Script)
-> MySDL.Resources -> Result State.State
mkState scrpt rs = do
state <- initState (scrpt rs) rs
pure $ State.mkState
state
update
render
initState :: Script.Script -> MySDL.Resources -> Result State
initState scrpt rs = do
case M.lookup "bg" (MySDL.textures rs) of
Nothing ->
throwError ["Texture not found: bg"]
Just bgt -> do
pure $ State
{ _bg = fromJust $ Spr.make $ Spr.simpleArgs (Point 800 1000) bgt
, _resources = rs
, _script = scrpt
, _camera = 0
}
update :: Input -> State -> Result (State.Command, State)
update input state = do
_wSize <- _windowSize <$> SM.get
(acts, script') <- Script.update input Nothing mempty (state ^. script)
let
newState =
state'
& set script script'
& over camera
(\c ->
if
| c <= 0 && Script.shake acts -> 60
| c <= 0 -> 0
| otherwise -> c - 1
)
where
state' =
if Script.stopTheWorld acts
then
state
else
state
& over bg
( case Script.changeSprite acts of
Nothing -> Spr.update Nothing False
Just sp -> const sp
)
pure (Script.command acts, newState)
render :: SDL.Renderer -> State -> IO ()
render renderer state = do
cam' <- Point <$> randomRIO (-1, 1) <*> randomRIO (-1, 1) :: IO FPoint
let cam = addPoint $ fmap (floor . (*) (fromIntegral $ state ^. camera `div` 3)) cam'
Spr.render renderer cam (Point 0 0) (state ^. bg . size) (state ^. bg)
Script.render renderer cam (state ^. script)
|
a3080f5d09c04e62abe096d419b71d7a35c1958f9b8d30617157432b18e0642a | skymountain/MLSR | typing.ml | module S = Syntax
module T = Type
exception Error of string
;;
let err s = raise @@ Error s
;;
let type_const = function
| S.CInt _ -> T.TyInt
| S.CBool _ -> T.TyBool
| S.CStr _ -> T.TyStr
| S.CFloat _ -> T.TyFloat
| S.CUnit -> T.TyUnit
;;
let constraints_of_subst s =
List.map (fun (x, ty) -> (T.TyVar x, ty)) @@ T.TyvarMap.bindings s
;;
let constraints_of_subst_list ss =
List.flatten @@ List.map constraints_of_subst ss
;;
let solve =
let rec solve = function
| (T.TyVar _ as t1), (T.TyVar _ as t2)
| (T.TyBase _ as t1), (T.TyBase _ as t2)
| (T.TyVarFixed _ as t1), (T.TyVarFixed _ as t2)
when t1 = t2 -> T.TyvarMap.empty
| (T.TyVar x, ty) when not @@ T.TyvarSet.mem x (T.ftv_ty ty) ->
T.TyvarMap.singleton x ty
| (ty, T.TyVar x) when not @@ T.TyvarSet.mem x (T.ftv_ty ty) ->
T.TyvarMap.singleton x ty
| TyFun (t11, t12), TyFun (t21, t22)
| TyProd (t11, t12), TyProd (t21, t22)
| TySum (t11, t12), TySum (t21, t22)
->
let s1 = solve (t11, t21) in
let t12 = T.subst_ty s1 t12 in
let t22 = T.subst_ty s1 t22 in
let s2 = solve (t12, t22) in
let s1 = T.subst_subst ~by:s2 ~target:s1 in
T.TyvarMap.union (fun _ _ _ -> assert false) s1 s2
| TyList t1, TyList t2 -> solve (t1, t2)
| (( T.TyVar _
| T.TyBase _
| T.TyVarFixed _
| T.TyFun _
| T.TyProd _
| T.TySum _
| T.TyList _
) as t1), (_ as t2) ->
err @@ Printf.sprintf "Types \"%s\" and \"%s\" cannot be unified"
(T.string_of_ty t1) (T.string_of_ty t2)
in
List.fold_left (fun s (ty1, ty2) ->
let ty1 = T.subst_ty s ty1 in
let ty2 = T.subst_ty s ty2 in
let s' = solve (ty1, ty2) in
let s = T.subst_subst ~by:s' ~target:s in
T.TyvarMap.union (fun _ _ _ -> assert false) s s')
T.TyvarMap.empty
;;
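`solve` is first-order unification with an occurs check, folded over a list of constraints while composing substitutions. A compact Python rendition of the same algorithm (types encoded as tuples; simplified — no fixed type variables, only function types as a constructor):

```python
def apply(s, t):
    """Apply substitution s to type t; types are ("var", x), ("base", b),
    or ("fun", t1, t2)."""
    if t[0] == "var":
        return apply(s, s[t[1]]) if t[1] in s else t
    if t[0] == "fun":
        return ("fun", apply(s, t[1]), apply(s, t[2]))
    return t

def occurs(x, t):
    """Occurs check: does variable x appear free in type t?"""
    return t == ("var", x) or (t[0] == "fun" and (occurs(x, t[1]) or occurs(x, t[2])))

def unify(constraints):
    s = {}
    while constraints:
        t1, t2 = constraints.pop()
        t1, t2 = apply(s, t1), apply(s, t2)
        if t1 == t2:
            continue
        if t1[0] == "var" and not occurs(t1[1], t2):
            s[t1[1]] = t2
        elif t2[0] == "var" and not occurs(t2[1], t1):
            s[t2[1]] = t1
        elif t1[0] == "fun" and t2[0] == "fun":
            constraints += [(t1[1], t2[1]), (t1[2], t2[2])]
        else:
            raise TypeError(f"cannot unify {t1} and {t2}")
    return s

s = unify([(("fun", ("var", "a"), ("base", "int")),
            ("fun", ("base", "bool"), ("var", "b")))])
assert apply(s, ("var", "a")) == ("base", "bool")
assert apply(s, ("var", "b")) == ("base", "int")
```

The OCaml version additionally threads `subst_subst` so that earlier bindings are updated by later ones; the Python sketch gets the same effect by re-applying `s` lazily in `apply`.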
let rec type_expr ((var_env, op_env) as env) = function
| S.EId x -> begin
match Env.find_opt x var_env with
| Some tysc -> (T.instantiate tysc, T.TyvarMap.empty)
| None -> err @@ Printf.sprintf "Variable \"%s\" is not defined" x
end
| S.EConst c -> (TyBase (type_const c), T.TyvarMap.empty)
| S.ELet (x, e1, e2) ->
let ty1, s1 = type_expr env e1 in
let var_env' = T.subst_tyenv s1 var_env in
let tysc1 = T.closing var_env' ty1 in
let ty2, s2 = type_expr (Env.add x tysc1 var_env', op_env) e2 in
let s = solve @@ constraints_of_subst_list [s1; s2] in
(T.subst_ty s ty2, s)
| S.ELetRec (x, y, e1, e2) ->
let arg_ty = T.fresh_tyvar () in
let ret_ty = T.fresh_tyvar () in
let fun_ty = T.TyFun (arg_ty, ret_ty) in
let ty1, s1 =
let var_env' =
Env.add y (T.tysc_of_ty arg_ty) @@
Env.add x (T.tysc_of_ty fun_ty) @@
var_env
in
type_expr (var_env', op_env) e1
in
let s1 = solve @@ (ret_ty, ty1) :: (constraints_of_subst s1) in
let fun_ty' = T.subst_ty s1 fun_ty in
let var_env' = T.subst_tyenv s1 var_env in
let fun_tysc = T.closing var_env' fun_ty' in
let ty2, s2 = type_expr (Env.add x fun_tysc var_env', op_env) e2 in
let s = solve @@ constraints_of_subst_list [s1; s2] in
(T.subst_ty s ty2, s)
| S.EFun (x, e) ->
let arg_ty = Type.fresh_tyvar () in
let ret_ty, s =
let var_env' = Env.add x (T.tysc_of_ty arg_ty) var_env in
type_expr (var_env', op_env) e in
let fun_ty = T.TyFun (T.subst_ty s arg_ty, ret_ty) in
(fun_ty, s)
| S.EApp (e1, e2) ->
let fun_ty, s1 = type_expr env e1 in
let arg_ty, s2 = type_expr env e2 in
let ret_ty = T.fresh_tyvar () in
let c =
(fun_ty, T.TyFun (arg_ty, ret_ty)) ::
(constraints_of_subst_list [s1; s2])
in
let s = solve c in
(T.subst_ty s ret_ty, s)
| S.EPair (e1, e2) ->
let fst_ty, s1 = type_expr env e1 in
let snd_ty, s2 = type_expr env e2 in
let s = solve @@ constraints_of_subst_list [s1; s2] in
(T.subst_ty s (TyProd (fst_ty, snd_ty)), s)
| S.EHandle (e, ((ret_x, ret_body), ops)) ->
let handled_ty, s1 = type_expr env e in
let ret_ty, s2 =
let var_env' = Env.add ret_x (T.tysc_of_ty handled_ty) var_env in
type_expr (var_env', op_env) ret_body in
let s3 = type_handler env ret_ty ops in
let s = solve @@ constraints_of_subst_list [s1; s2; s3] in
(T.subst_ty s ret_ty, s)
| S.EInl e ->
let left_ty, s = type_expr env e in
let right_ty = T.fresh_tyvar () in
(T.TySum (left_ty, right_ty), s)
| S.EInr e ->
let right_ty, s = type_expr env e in
let left_ty = T.fresh_tyvar () in
(T.TySum (left_ty, right_ty), s)
| S.EList es ->
let elem_ty = T.fresh_tyvar () in
let c = List.fold_left (fun c e ->
let ty, s = type_expr env e in
(elem_ty, ty) :: ((constraints_of_subst s) @ c)) [] es
in
let s = solve c in
(T.TyList (T.subst_ty s elem_ty), s)
| S.EMatch (e, m) -> begin
let mty, sm = type_expr env e in
match m with
| MPair (x, y, e) ->
let x_ty = T.fresh_tyvar () in
let y_ty = T.fresh_tyvar () in
let cty, sc =
let var_env' =
Env.add x (T.tysc_of_ty x_ty) @@
Env.add y (T.tysc_of_ty y_ty) @@
var_env
in
type_expr (var_env', op_env) e
in
let c =
(T.TyProd (x_ty, y_ty), mty) ::
(constraints_of_subst_list [sm; sc])
in
let s = solve c in
(T.subst_ty s cty, s)
| MInj (x, ex, y, ey) ->
let x_ty = T.fresh_tyvar () in
let y_ty = T.fresh_tyvar () in
let x_cty, x_sc =
let var_env' = Env.add x (T.tysc_of_ty x_ty) var_env in
type_expr (var_env', op_env) ex
in
let y_cty, y_sc =
let var_env' = Env.add y (T.tysc_of_ty y_ty) var_env in
type_expr (var_env', op_env) ey
in
let c = (x_cty, y_cty) :: (T.TySum (x_ty, y_ty), mty) ::
(constraints_of_subst_list [sm; x_sc; y_sc]) in
let s = solve c in
(T.subst_ty s x_cty, s)
| MList (en, x, xs, ec) ->
let elem_ty = T.fresh_tyvar () in
let n_cty, n_sc = type_expr env en in
let c_cty, c_sc =
let var_env' =
Env.add x (T.tysc_of_ty elem_ty) @@
Env.add xs (T.tysc_of_ty (T.TyList elem_ty)) @@
var_env
in
type_expr (var_env', op_env) ec
in
let c = (n_cty, c_cty) :: (T.TyList elem_ty, mty) ::
(constraints_of_subst_list [sm; n_sc; c_sc]) in
let s = solve c in
(T.subst_ty s n_cty, s)
end
| S.EIf (ce, te, ee) -> begin
let cty, sc = type_expr env ce in
let tty, st = type_expr env te in
let ety, se = type_expr env ee in
let s = solve @@
(cty, TyBase TyBool) ::
(tty, ety) ::
(constraints_of_subst_list [sc; st; se])
in
(T.subst_ty s tty, s)
end
and type_handler env ret_ty ops =
let constraints = List.map (type_operation_clause env ret_ty) ops in
solve (List.flatten constraints)
and type_operation_clause (var_env, op_env) ret_ty
{ op_name; op_arg_var; op_cont_var; op_body } =
match T.OpMap.find_opt op_name op_env with
| None ->
err @@ Printf.sprintf "Effect operation \"%s\" is not declared" op_name
| Some ty_sig ->
let fixed_tyvars, dom_ty, codom_ty = T.instantiate_ty_sig ty_sig in
let arg_ty = T.tysc_of_ty dom_ty in
let cont_ty = T.tysc_of_ty @@ T.TyFun (codom_ty, ret_ty) in
let op_body_ty, s =
let var_env' =
Env.add op_arg_var arg_ty @@
Env.add op_cont_var cont_ty @@
var_env
in
type_expr (var_env', op_env) op_body
in
let free_fixed_tyvars =
T.TyvarSet.inter fixed_tyvars (T.ftv_ty op_body_ty)
in
if T.TyvarSet.is_empty free_fixed_tyvars then
(ret_ty, op_body_ty) :: (constraints_of_subst s)
else
      err "Type variables bound in an operation clause must not escape"
;;
let check_SR = ref true
;;
let signature_restriction =
let rec tyvars_at_pos = function
| T.TyVar x -> T.TyvarSet.singleton x
| T.TyVarFixed _ -> assert false
| T.TyBase _ -> T.TyvarSet.empty
| T.TyFun (arg_ty, ret_ty) ->
T.TyvarSet.union (tyvars_at_neg arg_ty) (tyvars_at_pos ret_ty)
| T.TyProd (ty1, ty2) | T.TySum (ty1, ty2) ->
T.TyvarSet.union (tyvars_at_pos ty1) (tyvars_at_pos ty2)
| T.TyList ty -> tyvars_at_pos ty
and tyvars_at_neg = function
| T.TyVar _ | TyBase _ -> T.TyvarSet.empty
| T.TyVarFixed _ -> assert false
| T.TyFun (arg_ty, ret_ty) ->
T.TyvarSet.union (tyvars_at_pos arg_ty) (tyvars_at_neg ret_ty)
| T.TyProd (ty1, ty2) | T.TySum (ty1, ty2) ->
T.TyvarSet.union (tyvars_at_neg ty1) (tyvars_at_neg ty2)
| T.TyList ty -> tyvars_at_neg ty
in
let rec tyvars_at_nonstrict_pos = function
| T.TyVar _ | T.TyBase _ -> T.TyvarSet.empty
| T.TyVarFixed _ -> assert false
| T.TyFun (arg_ty, ret_ty) ->
T.TyvarSet.union
(tyvars_at_neg arg_ty)
(tyvars_at_nonstrict_pos ret_ty)
| T.TyProd (ty1, ty2) | T.TySum (ty1, ty2) ->
T.TyvarSet.union
(tyvars_at_nonstrict_pos ty1)
(tyvars_at_nonstrict_pos ty2)
| T.TyList ty -> tyvars_at_nonstrict_pos ty
in
fun (tyvars, dom_ty, codom_ty) ->
let dom_sat = T.TyvarSet.disjoint
tyvars (tyvars_at_nonstrict_pos dom_ty)
in
let codom_sat = T.TyvarSet.disjoint
tyvars (tyvars_at_neg codom_ty)
in
if not !check_SR then
None
else if (not dom_sat) && (not codom_sat) then
Some "both of the domain and codomain types"
else if not dom_sat then
Some "the domain type"
else if not codom_sat then
Some "the codomain type"
else
None
;;
let check_closed_tyenv (var_env, op_env) =
assert (T.TyvarSet.is_empty @@ T.ftv_tyenv var_env);
assert (T.TyvarSet.is_empty @@ T.ftv_openv op_env)
;;
let type_decl ((var_env, op_env) as env) =
check_closed_tyenv env;
function
| S.DExpr e ->
let ty, _ = type_expr env e in
let tysc = T.closing var_env ty in
(env, tysc)
| S.DLet (x, e) ->
let ty, _ = type_expr env e in
let tysc = T.closing var_env ty in
let env' = (Env.add x tysc var_env, op_env) in
(env', tysc)
| S.DEff (op_name, ((tyvars, dom_ty, codom_ty) as ty_sig)) ->
match signature_restriction ty_sig with
| Some blame ->
err @@
Printf.sprintf
"The type signature does not follow the signature restriction on %s"
blame
| None ->
let tysc = T.closing var_env @@
T.instantiate (tyvars, T.TyFun (dom_ty, codom_ty))
in
let env' =
(Env.add op_name tysc var_env, Env.add op_name ty_sig op_env)
in
(env', tysc)
;;
| null | https://raw.githubusercontent.com/skymountain/MLSR/4da4eaccb596e4678838c272b78872073f43e29c/lib/typing.ml | ocaml | module S = Syntax
module T = Type
exception Error of string
;;
let err s = raise @@ Error s
;;
let type_const = function
| S.CInt _ -> T.TyInt
| S.CBool _ -> T.TyBool
| S.CStr _ -> T.TyStr
| S.CFloat _ -> T.TyFloat
| S.CUnit -> T.TyUnit
;;
let constraints_of_subst s =
List.map (fun (x, ty) -> (T.TyVar x, ty)) @@ T.TyvarMap.bindings s
;;
let constraints_of_subst_list ss =
List.flatten @@ List.map constraints_of_subst ss
;;
let solve =
let rec solve = function
| (T.TyVar _ as t1), (T.TyVar _ as t2)
| (T.TyBase _ as t1), (T.TyBase _ as t2)
| (T.TyVarFixed _ as t1), (T.TyVarFixed _ as t2)
when t1 = t2 -> T.TyvarMap.empty
| (T.TyVar x, ty) when not @@ T.TyvarSet.mem x (T.ftv_ty ty) ->
T.TyvarMap.singleton x ty
| (ty, T.TyVar x) when not @@ T.TyvarSet.mem x (T.ftv_ty ty) ->
T.TyvarMap.singleton x ty
| TyFun (t11, t12), TyFun (t21, t22)
| TyProd (t11, t12), TyProd (t21, t22)
| TySum (t11, t12), TySum (t21, t22)
->
let s1 = solve (t11, t21) in
let t12 = T.subst_ty s1 t12 in
let t22 = T.subst_ty s1 t22 in
let s2 = solve (t12, t22) in
let s1 = T.subst_subst ~by:s2 ~target:s1 in
T.TyvarMap.union (fun _ _ _ -> assert false) s1 s2
| TyList t1, TyList t2 -> solve (t1, t2)
| (( T.TyVar _
| T.TyBase _
| T.TyVarFixed _
| T.TyFun _
| T.TyProd _
| T.TySum _
| T.TyList _
) as t1), (_ as t2) ->
err @@ Printf.sprintf "Types \"%s\" and \"%s\" cannot be unified"
(T.string_of_ty t1) (T.string_of_ty t2)
in
List.fold_left (fun s (ty1, ty2) ->
let ty1 = T.subst_ty s ty1 in
let ty2 = T.subst_ty s ty2 in
let s' = solve (ty1, ty2) in
let s = T.subst_subst ~by:s' ~target:s in
T.TyvarMap.union (fun _ _ _ -> assert false) s s')
T.TyvarMap.empty
;;
let rec type_expr ((var_env, op_env) as env) = function
| S.EId x -> begin
match Env.find_opt x var_env with
| Some tysc -> (T.instantiate tysc, T.TyvarMap.empty)
| None -> err @@ Printf.sprintf "Variable \"%s\" is not defined" x
end
| S.EConst c -> (TyBase (type_const c), T.TyvarMap.empty)
| S.ELet (x, e1, e2) ->
let ty1, s1 = type_expr env e1 in
let var_env' = T.subst_tyenv s1 var_env in
let tysc1 = T.closing var_env' ty1 in
let ty2, s2 = type_expr (Env.add x tysc1 var_env', op_env) e2 in
let s = solve @@ constraints_of_subst_list [s1; s2] in
(T.subst_ty s ty2, s)
| S.ELetRec (x, y, e1, e2) ->
let arg_ty = T.fresh_tyvar () in
let ret_ty = T.fresh_tyvar () in
let fun_ty = T.TyFun (arg_ty, ret_ty) in
let ty1, s1 =
let var_env' =
Env.add y (T.tysc_of_ty arg_ty) @@
Env.add x (T.tysc_of_ty fun_ty) @@
var_env
in
type_expr (var_env', op_env) e1
in
let s1 = solve @@ (ret_ty, ty1) :: (constraints_of_subst s1) in
let fun_ty' = T.subst_ty s1 fun_ty in
let var_env' = T.subst_tyenv s1 var_env in
let fun_tysc = T.closing var_env' fun_ty' in
let ty2, s2 = type_expr (Env.add x fun_tysc var_env', op_env) e2 in
let s = solve @@ constraints_of_subst_list [s1; s2] in
(T.subst_ty s ty2, s)
| S.EFun (x, e) ->
let arg_ty = Type.fresh_tyvar () in
let ret_ty, s =
let var_env' = Env.add x (T.tysc_of_ty arg_ty) var_env in
type_expr (var_env', op_env) e in
let fun_ty = T.TyFun (T.subst_ty s arg_ty, ret_ty) in
(fun_ty, s)
| S.EApp (e1, e2) ->
let fun_ty, s1 = type_expr env e1 in
let arg_ty, s2 = type_expr env e2 in
let ret_ty = T.fresh_tyvar () in
let c =
(fun_ty, T.TyFun (arg_ty, ret_ty)) ::
(constraints_of_subst_list [s1; s2])
in
let s = solve c in
(T.subst_ty s ret_ty, s)
| S.EPair (e1, e2) ->
let fst_ty, s1 = type_expr env e1 in
let snd_ty, s2 = type_expr env e2 in
let s = solve @@ constraints_of_subst_list [s1; s2] in
(T.subst_ty s (TyProd (fst_ty, snd_ty)), s)
| S.EHandle (e, ((ret_x, ret_body), ops)) ->
let handled_ty, s1 = type_expr env e in
let ret_ty, s2 =
let var_env' = Env.add ret_x (T.tysc_of_ty handled_ty) var_env in
type_expr (var_env', op_env) ret_body in
let s3 = type_handler env ret_ty ops in
let s = solve @@ constraints_of_subst_list [s1; s2; s3] in
(T.subst_ty s ret_ty, s)
| S.EInl e ->
let left_ty, s = type_expr env e in
let right_ty = T.fresh_tyvar () in
(T.TySum (left_ty, right_ty), s)
| S.EInr e ->
let right_ty, s = type_expr env e in
let left_ty = T.fresh_tyvar () in
(T.TySum (left_ty, right_ty), s)
| S.EList es ->
let elem_ty = T.fresh_tyvar () in
let c = List.fold_left (fun c e ->
let ty, s = type_expr env e in
(elem_ty, ty) :: ((constraints_of_subst s) @ c)) [] es
in
let s = solve c in
(T.TyList (T.subst_ty s elem_ty), s)
| S.EMatch (e, m) -> begin
let mty, sm = type_expr env e in
match m with
| MPair (x, y, e) ->
let x_ty = T.fresh_tyvar () in
let y_ty = T.fresh_tyvar () in
let cty, sc =
let var_env' =
Env.add x (T.tysc_of_ty x_ty) @@
Env.add y (T.tysc_of_ty y_ty) @@
var_env
in
type_expr (var_env', op_env) e
in
let c =
(T.TyProd (x_ty, y_ty), mty) ::
(constraints_of_subst_list [sm; sc])
in
let s = solve c in
(T.subst_ty s cty, s)
| MInj (x, ex, y, ey) ->
let x_ty = T.fresh_tyvar () in
let y_ty = T.fresh_tyvar () in
let x_cty, x_sc =
let var_env' = Env.add x (T.tysc_of_ty x_ty) var_env in
type_expr (var_env', op_env) ex
in
let y_cty, y_sc =
let var_env' = Env.add y (T.tysc_of_ty y_ty) var_env in
type_expr (var_env', op_env) ey
in
let c = (x_cty, y_cty) :: (T.TySum (x_ty, y_ty), mty) ::
(constraints_of_subst_list [sm; x_sc; y_sc]) in
let s = solve c in
(T.subst_ty s x_cty, s)
| MList (en, x, xs, ec) ->
let elem_ty = T.fresh_tyvar () in
let n_cty, n_sc = type_expr env en in
let c_cty, c_sc =
let var_env' =
Env.add x (T.tysc_of_ty elem_ty) @@
Env.add xs (T.tysc_of_ty (T.TyList elem_ty)) @@
var_env
in
type_expr (var_env', op_env) ec
in
let c = (n_cty, c_cty) :: (T.TyList elem_ty, mty) ::
(constraints_of_subst_list [sm; n_sc; c_sc]) in
let s = solve c in
(T.subst_ty s n_cty, s)
end
| S.EIf (ce, te, ee) -> begin
let cty, sc = type_expr env ce in
let tty, st = type_expr env te in
let ety, se = type_expr env ee in
let s = solve @@
(cty, TyBase TyBool) ::
(tty, ety) ::
(constraints_of_subst_list [sc; st; se])
in
(T.subst_ty s tty, s)
end
and type_handler env ret_ty ops =
let constraints = List.map (type_operation_clause env ret_ty) ops in
solve (List.flatten constraints)
and type_operation_clause (var_env, op_env) ret_ty
{ op_name; op_arg_var; op_cont_var; op_body } =
match T.OpMap.find_opt op_name op_env with
| None ->
err @@ Printf.sprintf "Effect operation \"%s\" is not declared" op_name
| Some ty_sig ->
let fixed_tyvars, dom_ty, codom_ty = T.instantiate_ty_sig ty_sig in
let arg_ty = T.tysc_of_ty dom_ty in
let cont_ty = T.tysc_of_ty @@ T.TyFun (codom_ty, ret_ty) in
let op_body_ty, s =
let var_env' =
Env.add op_arg_var arg_ty @@
Env.add op_cont_var cont_ty @@
var_env
in
type_expr (var_env', op_env) op_body
in
let free_fixed_tyvars =
T.TyvarSet.inter fixed_tyvars (T.ftv_ty op_body_ty)
in
if T.TyvarSet.is_empty free_fixed_tyvars then
(ret_ty, op_body_ty) :: (constraints_of_subst s)
else
      err "Type variables bound in an operation clause must not escape"
;;
let check_SR = ref true
;;
let signature_restriction =
let rec tyvars_at_pos = function
| T.TyVar x -> T.TyvarSet.singleton x
| T.TyVarFixed _ -> assert false
| T.TyBase _ -> T.TyvarSet.empty
| T.TyFun (arg_ty, ret_ty) ->
T.TyvarSet.union (tyvars_at_neg arg_ty) (tyvars_at_pos ret_ty)
| T.TyProd (ty1, ty2) | T.TySum (ty1, ty2) ->
T.TyvarSet.union (tyvars_at_pos ty1) (tyvars_at_pos ty2)
| T.TyList ty -> tyvars_at_pos ty
and tyvars_at_neg = function
| T.TyVar _ | TyBase _ -> T.TyvarSet.empty
| T.TyVarFixed _ -> assert false
| T.TyFun (arg_ty, ret_ty) ->
T.TyvarSet.union (tyvars_at_pos arg_ty) (tyvars_at_neg ret_ty)
| T.TyProd (ty1, ty2) | T.TySum (ty1, ty2) ->
T.TyvarSet.union (tyvars_at_neg ty1) (tyvars_at_neg ty2)
| T.TyList ty -> tyvars_at_neg ty
in
let rec tyvars_at_nonstrict_pos = function
| T.TyVar _ | T.TyBase _ -> T.TyvarSet.empty
| T.TyVarFixed _ -> assert false
| T.TyFun (arg_ty, ret_ty) ->
T.TyvarSet.union
(tyvars_at_neg arg_ty)
(tyvars_at_nonstrict_pos ret_ty)
| T.TyProd (ty1, ty2) | T.TySum (ty1, ty2) ->
T.TyvarSet.union
(tyvars_at_nonstrict_pos ty1)
(tyvars_at_nonstrict_pos ty2)
| T.TyList ty -> tyvars_at_nonstrict_pos ty
in
fun (tyvars, dom_ty, codom_ty) ->
let dom_sat = T.TyvarSet.disjoint
tyvars (tyvars_at_nonstrict_pos dom_ty)
in
let codom_sat = T.TyvarSet.disjoint
tyvars (tyvars_at_neg codom_ty)
in
if not !check_SR then
None
else if (not dom_sat) && (not codom_sat) then
Some "both of the domain and codomain types"
else if not dom_sat then
Some "the domain type"
else if not codom_sat then
Some "the codomain type"
else
None
;;
let check_closed_tyenv (var_env, op_env) =
assert (T.TyvarSet.is_empty @@ T.ftv_tyenv var_env);
assert (T.TyvarSet.is_empty @@ T.ftv_openv op_env)
;;
let type_decl ((var_env, op_env) as env) =
check_closed_tyenv env;
function
| S.DExpr e ->
let ty, _ = type_expr env e in
let tysc = T.closing var_env ty in
(env, tysc)
| S.DLet (x, e) ->
let ty, _ = type_expr env e in
let tysc = T.closing var_env ty in
let env' = (Env.add x tysc var_env, op_env) in
(env', tysc)
| S.DEff (op_name, ((tyvars, dom_ty, codom_ty) as ty_sig)) ->
match signature_restriction ty_sig with
| Some blame ->
err @@
Printf.sprintf
"The type signature does not follow the signature restriction on %s"
blame
| None ->
let tysc = T.closing var_env @@
T.instantiate (tyvars, T.TyFun (dom_ty, codom_ty))
in
let env' =
(Env.add op_name tysc var_env, Env.add op_name ty_sig op_env)
in
(env', tysc)
;;
| |
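The `solve` function in the typing.ml row above folds first-order unification (with an occurs check) over a list of type-equality constraints, threading a substitution through. A minimal Python sketch of that idea — a simplified rendering, not the project's actual API; the tuple-tagged term shapes (`"var"`, `"base"`, `"fun"`) and function names are invented here for illustration, and sum/product/list constructors are omitted:

```python
def ftv(ty):
    """Free type variables of a term."""
    tag = ty[0]
    if tag == "var":
        return {ty[1]}
    if tag == "base":
        return set()
    return ftv(ty[1]) | ftv(ty[2])  # "fun" arg/result

def subst(s, ty):
    """Apply substitution s (dict: var name -> term), chasing chained bindings."""
    tag = ty[0]
    if tag == "var":
        return subst(s, s[ty[1]]) if ty[1] in s else ty
    if tag == "base":
        return ty
    return ("fun", subst(s, ty[1]), subst(s, ty[2]))

def unify(t1, t2):
    """Most general unifier of two terms, as a substitution dict."""
    if t1 == t2:
        return {}
    if t1[0] == "var" and t1[1] not in ftv(t2):  # occurs check
        return {t1[1]: t2}
    if t2[0] == "var" and t2[1] not in ftv(t1):
        return {t2[1]: t1}
    if t1[0] == "fun" and t2[0] == "fun":
        s1 = unify(t1[1], t2[1])
        s2 = unify(subst(s1, t1[2]), subst(s1, t2[2]))
        # Compose: push the later substitution through the earlier one.
        return {**{v: subst(s2, t) for v, t in s1.items()}, **s2}
    raise TypeError(f"cannot unify {t1} and {t2}")

def solve(constraints):
    """Fold unification over (ty1, ty2) pairs, like the OCaml `solve` above."""
    s = {}
    for t1, t2 in constraints:
        s.update(unify(subst(s, t1), subst(s, t2)))
    return s
```

For example, unifying `a -> a` with `int -> b` binds both `a` and `b` to `int`, while unifying `a` with `a -> int` fails the occurs check — mirroring the error path that raises `Error` in the OCaml version.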
8a06d1b1f9d0f57ce85a023e1f1ba0837e82dc059b7c7042ee776c8a430eb2a2 | ztellman/lamina | queue.clj | Copyright ( c ) . All rights reserved .
;; The use and distribution terms for this software are covered by the
;; Eclipse Public License 1.0 (-1.0.php)
;; which can be found in the file epl-v10.html at the root of this distribution.
;; By using this software in any fashion, you are agreeing to be bound by
;; the terms of this license.
;; You must not remove this notice, or any other, from this software.
(ns lamina.test.queue
(:use
[clojure test]
[lamina.test utils]
[lamina.time :only (invoke-in)])
(:require
[lamina.core.queue :as q]
[lamina.core.result :as r])
(:import
[lamina.core.queue Message]))
(defn enqueue
([q msg]
(q/enqueue q msg true nil))
([q msg persist?]
(q/enqueue q msg persist? nil))
([q msg persist? release-fn]
(q/enqueue q msg persist? release-fn)))
(defn receive
([q]
(q/receive q nil nil nil))
([q predicate false-value]
(q/receive q predicate false-value nil))
([q predicate false-value result-channel]
(q/receive q predicate false-value result-channel)))
(defn cancel-receive [q callback]
(q/cancel-receive q callback))
(defn test-queue [q-fn]
;; test drain
(let [q (q-fn)]
(enqueue q nil)
(enqueue q :a)
(is (= [nil :a] (map #(.message ^Message %) (q/drain q)))))
;; enqueue, then receive
(let [q (q-fn)]
(enqueue q 0 false)
(enqueue q nil)
(is (= nil @(receive q))))
;; multi-enqueue, then drain
(let [q (q-fn)]
(enqueue q 0)
(enqueue q 1)
(is (= [0 1] (map #(.message ^Message %) (q/drain q)))))
;; multi-receive, then enqueue
(let [q (q-fn)
a (receive q)
b (receive q)]
(enqueue q :a)
(enqueue q :b)
(is (= :a @a))
(is (= :b @b)))
;; enqueue, then receive with predicate, and then with explicit result channel
(let [q (q-fn)]
(enqueue q 3)
(enqueue q 4)
(is (= ::nope @(receive q even? ::nope)))
(is (= 1 @(receive q even? ::nope (r/success-result 1))))
(is (= 3 @(receive q odd? nil)))
(let [r (r/result-channel)]
(receive q even? nil r)
(is (= 4 @r))))
;; multi-receive with predicate, then enqueue
(let [q (q-fn)
a (receive q odd? ::nope)
b (receive q even? nil (r/success-result 1))
c (receive q even? nil)]
(enqueue q 2)
(is (= ::nope @a))
(is (= 1 @b))
(is (= 2 @c)))
;; enqueue, then receive with faulty predicate
(let [q (q-fn)
a (receive q (fn [_] (throw (Exception. "boom"))) nil)
b (receive q (constantly true) nil)]
(enqueue q :msg)
(is (thrown? Exception @a))
(is (= :msg @b)))
;; receive, cancel, receive, and enqueue
(let [q (q-fn)
a (receive q)]
(is (= true (cancel-receive q a)))
(is (= false (cancel-receive q (r/result-channel))))
(let [b (receive q)]
(enqueue q 6)
(is (= 6 @b))))
;; multi-receive, cancel, and enqueue
(let [q (q-fn)
a (receive q)
b (receive q)]
(is (= true (cancel-receive q a)))
(enqueue q :msg)
(is (= :msg @b)))
;; receive with already claimed result-channel, then enqueue
(let [q (q-fn)]
(receive q nil nil (r/success-result 1))
(enqueue q 8)
(is (= 8 @(receive q))))
;; enqueue, then receive with already claimed result-channel
(let [q (q-fn)]
(enqueue q 9)
(receive q nil nil (r/success-result 1))
(is (= 9 @(receive q)))))
(deftest test-basic-queue
(test-queue q/queue))
(deftest test-transactional-queue
(test-queue q/transactional-queue))
;;;
(defn stress-test-single-queue [q-fn]
(let [q (q-fn)]
(dotimes* [i 1e5]
(invoke-in 0.01 #(enqueue q i))
(Thread/yield)
(is (= i @(receive q))))))
(defn stress-test-closing-queue [q-fn]
(dotimes* [i 1e5]
(let [q (q-fn)
result (receive q)]
(invoke-in 0.1 #(q/close q))
(Thread/sleep 1)
(is (thrown? Exception @result)))))
(deftest ^:stress stress-test-basic-queue
(println "\n----\n test single queue \n---\n")
(stress-test-single-queue q/queue)
(println "\n----\n test closing queue \n---\n")
(stress-test-closing-queue q/queue))
(deftest ^:stress stress-test-transactional-queue
(println "\n----\n test single transactional queue \n---\n")
(stress-test-single-queue q/transactional-queue)
(println "\n----\n test closing transactional queue \n---\n")
(stress-test-closing-queue q/transactional-queue))
;;;
(defn benchmark-queue [type q r-fn]
(bench (str type "receive and enqueue")
(q/receive q)
(enqueue q 1))
(bench (str type "receive with explicit result-channel and enqueue")
(receive q nil nil (r-fn))
(enqueue q 1))
(bench (str type "receive, cancel, receive and enqueue")
(let [r (receive q nil nil)]
(cancel-receive q r))
(q/receive q)
(enqueue q 1))
(bench (str type "multi-receive and multi-enqueue")
(q/receive q)
(q/receive q)
(enqueue q 1)
(enqueue q 2))
(bench (str type "multi-receive, cancel, and enqueue")
(q/receive q)
(let [r (q/receive q)]
(cancel-receive q r))
(enqueue q 1))
(bench (str type "enqueue and receive")
(enqueue q 1)
(receive q))
(bench (str type "enqueue and receive with explicit result-channel")
(enqueue q 1)
(receive q nil nil (r-fn)))
(bench (str type "enqueue without persistence")
(q/enqueue q 1 false nil)))
(deftest ^:benchmark benchmark-basic-queue
(bench "create basic queue"
(q/queue nil))
(benchmark-queue "basic-queue - "
(q/queue nil)
r/result-channel))
(deftest ^:benchmark benchmark-transactional-queue
(bench "create transactional queue"
(q/queue nil))
(benchmark-queue "transactional-queue - "
(q/transactional-queue nil)
r/result-channel))
| null | https://raw.githubusercontent.com/ztellman/lamina/07c3fb84fdb3f4a4892c0bdbe3abf28788bdeb29/test/lamina/test/queue.clj | clojure | The use and distribution terms for this software are covered by the
Eclipse Public License 1.0 (http://opensource.org/licenses/eclipse-1.0.php)
which can be found in the file epl-v10.html at the root of this distribution.
By using this software in any fashion, you are agreeing to be bound by
the terms of this license.
You must not remove this notice, or any other, from this software.
test drain
enqueue, then receive
multi-enqueue, then drain
multi-receive, then enqueue
enqueue, then receive with predicate, and then with explicit result channel
multi-receive with predicate, then enqueue
enqueue, then receive with faulty predicate
receive, cancel, receive, and enqueue
multi-receive, cancel, and enqueue
receive with already claimed result-channel, then enqueue
enqueue, then receive with already claimed result-channel
| Copyright ( c ) . All rights reserved .
|
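The lamina queue tests above repeatedly exercise one rendezvous pattern: a `receive` registered before any message is later fulfilled by `enqueue`, in registration order, while messages enqueued with no pending receiver are buffered. A toy single-threaded Python sketch of just that pattern (hypothetical; lamina's real queue is concurrent and also supports predicates, cancellation, and result channels, none of which are modeled here):

```python
from collections import deque

class Queue:
    """Minimal enqueue/receive rendezvous, ignoring concurrency."""

    def __init__(self):
        self.messages = deque()   # buffered messages with no receiver yet
        self.pending = deque()    # receivers waiting for a message

    def receive(self):
        """Return a one-element list ("box") filled when a message arrives."""
        box = []
        if self.messages:
            box.append(self.messages.popleft())
        else:
            self.pending.append(box)
        return box

    def enqueue(self, msg):
        """Hand msg to the oldest pending receiver, or buffer it."""
        if self.pending:
            self.pending.popleft().append(msg)
        else:
            self.messages.append(msg)
```

This mirrors test cases such as "multi-receive, then enqueue" (`a` and `b` registered first get `:a` and `:b` in order) and "enqueue, then receive" (a buffered message is delivered immediately).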
c514414f651d527095ff3516bf4e014e683328afd64b206e830c4a2787614bbd | Ledest/otpbp | otpbp_error_logger.erl | -module(otpbp_error_logger).
-compile([{parse_transform, otpbp_pt}]).
-ifdef(HAVE_error_logger__get_format_depth_0).
-import(error_logger, [get_format_depth/0]).
-endif.
-ifndef(HAVE_error_logger__limit_term_1).
% OTP 20.0
-export([limit_term/1]).
-endif.
-ifndef(HAVE_error_logger__get_format_depth_0).
% OTP 20.0
-export([get_format_depth/0]).
-endif.
-ifndef(HAVE_error_logger__limit_term_1).
-spec limit_term(term()) -> term().
limit_term(Term) ->
case get_format_depth() of
unlimited -> Term;
D -> io_lib:limit_term(Term, D)
end.
-endif.
-ifndef(HAVE_error_logger__get_format_depth_0).
get_format_depth() ->
case application:get_env(kernel, error_logger_format_depth) of
{ok, Depth} when is_integer(Depth) -> max(10, Depth);
undefined -> unlimited
end.
-endif.
| null | https://raw.githubusercontent.com/Ledest/otpbp/f93923239b33cc05500733027e6e775090ac80ad/src/otpbp_error_logger.erl | erlang | -module(otpbp_error_logger).
-compile([{parse_transform, otpbp_pt}]).
-ifdef(HAVE_error_logger__get_format_depth_0).
-import(error_logger, [get_format_depth/0]).
-endif.
-ifndef(HAVE_error_logger__limit_term_1).
OTP 20.0
-export([limit_term/1]).
-endif.
-ifndef(HAVE_error_logger__get_format_depth_0).
OTP 20.0
-export([get_format_depth/0]).
-endif.
-ifndef(HAVE_error_logger__limit_term_1).
-spec limit_term(term()) -> term().
limit_term(Term) ->
case get_format_depth() of
unlimited -> Term;
D -> io_lib:limit_term(Term, D)
end.
-endif.
-ifndef(HAVE_error_logger__get_format_depth_0).
get_format_depth() ->
case application:get_env(kernel, error_logger_format_depth) of
{ok, Depth} when is_integer(Depth) -> max(10, Depth);
undefined -> unlimited
end.
-endif.
| |
9a2195f7e3087538b1f1912329c09302af214cd25b75bb450747829f32f730ae | pjotrp/guix | refresh.scm | ;;; GNU Guix --- Functional package management for GNU
Copyright © 2013 , 2014 , 2015 < >
;;; Copyright © 2013, 2014, 2015 <>
;;; Copyright © 2013 <>
;;; Copyright © 2014 <>
;;; Copyright © 2015 <>
;;; This file is part of GNU Guix.
;;;
;;; GNU Guix is free software; you can redistribute it and/or modify it
;;; under the terms of the GNU General Public License as published by
;;; the Free Software Foundation; either version 3 of the License, or (at
;;; your option) any later version.
;;;
;;; GNU Guix is distributed in the hope that it will be useful, but
;;; WITHOUT ANY WARRANTY; without even the implied warranty of
;;; MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
;;; GNU General Public License for more details.
;;;
;;; You should have received a copy of the GNU General Public License
;;; along with GNU Guix.  If not, see <http://www.gnu.org/licenses/>.
(define-module (guix scripts refresh)
#:use-module (guix ui)
#:use-module (guix hash)
#:use-module (guix scripts)
#:use-module (guix store)
#:use-module (guix utils)
#:use-module (guix packages)
#:use-module (guix upstream)
#:use-module (guix graph)
#:use-module (guix scripts graph)
#:use-module (guix monads)
#:use-module ((guix gnu-maintenance)
#:select (%gnu-updater %gnome-updater))
#:use-module (guix import elpa)
#:use-module (guix import cran)
#:use-module (guix gnupg)
#:use-module (gnu packages)
#:use-module ((gnu packages commencement) #:select (%final-inputs))
#:use-module (ice-9 match)
#:use-module (ice-9 regex)
#:use-module (ice-9 vlist)
#:use-module (ice-9 format)
#:use-module (srfi srfi-1)
#:use-module (srfi srfi-11)
#:use-module (srfi srfi-26)
#:use-module (srfi srfi-37)
#:use-module (rnrs io ports)
#:export (guix-refresh
%updaters))
;;;
;;; Command-line options.
;;;
(define %default-options
Alist of default option values .
'())
(define %options
;; Specification of the command-line options.
(list (option '(#\u "update") #f #f
(lambda (opt name arg result)
(alist-cons 'update? #t result)))
(option '(#\s "select") #t #f
(lambda (opt name arg result)
(match arg
((or "core" "non-core")
(alist-cons 'select (string->symbol arg)
result))
(x
(leave (_ "~a: invalid selection; expected `core' or `non-core'~%")
arg)))))
(option '(#\t "type") #t #f
(lambda (opt name arg result)
(let* ((not-comma (char-set-complement (char-set #\,)))
(names (map string->symbol
(string-tokenize arg not-comma))))
(alist-cons 'updaters names result))))
(option '(#\L "list-updaters") #f #f
(lambda args
(list-updaters-and-exit)))
(option '(#\e "expression") #t #f
(lambda (opt name arg result)
(alist-cons 'expression arg result)))
(option '(#\l "list-dependent") #f #f
(lambda (opt name arg result)
(alist-cons 'list-dependent? #t result)))
(option '("key-server") #t #f
(lambda (opt name arg result)
(alist-cons 'key-server arg result)))
(option '("gpg") #t #f
(lambda (opt name arg result)
(alist-cons 'gpg-command arg result)))
(option '("key-download") #t #f
(lambda (opt name arg result)
(match arg
((or "interactive" "always" "never")
(alist-cons 'key-download (string->symbol arg)
result))
(_
(leave (_ "unsupported policy: ~a~%")
arg)))))
(option '(#\h "help") #f #f
(lambda args
(show-help)
(exit 0)))
(option '(#\V "version") #f #f
(lambda args
(show-version-and-exit "guix refresh")))))
(define (show-help)
(display (_ "Usage: guix refresh [OPTION]... PACKAGE...
Update package definitions to match the latest upstream version.
When PACKAGE... is given, update only the specified packages. Otherwise
update all the packages of the distribution, or the subset thereof
specified with `--select'.\n"))
(display (_ "
-e, --expression=EXPR consider the package EXPR evaluates to"))
(display (_ "
-u, --update update source files in place"))
(display (_ "
-s, --select=SUBSET select all the packages in SUBSET, one of
`core' or `non-core'"))
(display (_ "
-t, --type=UPDATER,... restrict to updates from the specified updaters
(e.g., 'gnu')"))
(display (_ "
-L, --list-updaters list available updaters and exit"))
(display (_ "
-l, --list-dependent list top-level dependent packages that would need to
be rebuilt as a result of upgrading PACKAGE..."))
(newline)
(display (_ "
--key-server=HOST use HOST as the OpenPGP key server"))
(display (_ "
--gpg=COMMAND use COMMAND as the GnuPG 2.x command"))
(display (_ "
--key-download=POLICY
handle missing OpenPGP keys according to POLICY:
'always', 'never', and 'interactive', which is also
used when 'key-download' is not specified"))
(newline)
(display (_ "
-h, --help display this help and exit"))
(display (_ "
-V, --version display version information and exit"))
(newline)
(show-bug-report-information))
;;;
;;; Updates.
;;;
(define-syntax maybe-updater
;; Helper macro for 'list-updaters'.
(syntax-rules (=>)
((_ ((module => updater) rest ...) result)
(maybe-updater (rest ...)
(let ((iface (false-if-exception
(resolve-interface 'module)))
(tail result))
(if iface
(cons (module-ref iface 'updater) tail)
tail))))
((_ (updater rest ...) result)
(maybe-updater (rest ...)
(cons updater result)))
((_ () result)
(reverse result))))
(define-syntax-rule (list-updaters updaters ...)
"Expand to '(list UPDATERS ...)' but only the subset of UPDATERS that are
either unconditional, or have their requirement met.
A conditional updater has this form:
((SOME MODULE) => UPDATER)
meaning that UPDATER is added to the list if and only if (SOME MODULE) could
be resolved at run time.
This is a way to discard at macro expansion time updaters that depend on
unavailable optional dependencies such as Guile-JSON."
(maybe-updater (updaters ...) '()))
(define %updaters
;; List of "updaters" used by default. They are consulted in this order.
(list-updaters %gnu-updater
%gnome-updater
%elpa-updater
%cran-updater
((guix import pypi) => %pypi-updater)))
(define (lookup-updater name)
"Return the updater called NAME."
(or (find (lambda (updater)
(eq? name (upstream-updater-name updater)))
%updaters)
(leave (_ "~a: no such updater~%") name)))
(define (list-updaters-and-exit)
"Display available updaters and exit."
(format #t (_ "Available updaters:~%"))
(for-each (lambda (updater)
(format #t "- ~a: ~a~%"
(upstream-updater-name updater)
(_ (upstream-updater-description updater))))
%updaters)
(exit 0))
(define* (update-package store package updaters
#:key (key-download 'interactive))
"Update the source file that defines PACKAGE with the new version.
KEY-DOWNLOAD specifies a download policy for missing OpenPGP keys; allowed
values: 'interactive' (default), 'always', and 'never'."
(let-values (((version tarball)
(package-update store package updaters
#:key-download key-download))
((loc)
(or (package-field-location package 'version)
(package-location package))))
(when version
(if (and=> tarball file-exists?)
(begin
(format (current-error-port)
(_ "~a: ~a: updating from version ~a to version ~a...~%")
(location->string loc)
(package-name package)
(package-version package) version)
(let ((hash (call-with-input-file tarball
port-sha256)))
(update-package-source package version hash)))
(warning (_ "~a: version ~a could not be \
downloaded and authenticated; not updating~%")
(package-name package) version)))))
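`update-package` hashes the freshly downloaded tarball with `port-sha256` before rewriting the package's source field. A small Python sketch of that streaming-SHA-256 step (hypothetical helper name, not Guix code):

```python
import hashlib
import io

def port_sha256(stream, chunk_size=65536):
    """Stream an open binary port through SHA-256, as (guix hash) does for
    the downloaded tarball, and return the hex digest."""
    h = hashlib.sha256()
    for chunk in iter(lambda: stream.read(chunk_size), b""):
        h.update(chunk)
    return h.hexdigest()

# Example: hash an in-memory stand-in for the tarball.
digest = port_sha256(io.BytesIO(b"hello"))
print(digest)  # → 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
```

Reading in fixed-size chunks keeps memory flat no matter how large the tarball is, which is why both the Guile and Python versions hash from a port rather than slurping the file.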
;;;
;;; Dependents.
;;;
(define (all-packages)
"Return the list of all the distro's packages."
(fold-packages cons '()))
(define (list-dependents packages)
"List all the things that would need to be rebuilt if PACKAGES are changed."
(with-store store
(run-with-store store
;; Using %BAG-NODE-TYPE is more accurate than using %PACKAGE-NODE-TYPE
;; because it includes implicit dependencies.
(mlet %store-monad ((edges (node-back-edges %bag-node-type
(all-packages))))
(let* ((dependents (node-transitive-edges packages edges))
(covering (filter (lambda (node)
(null? (edges node)))
dependents)))
(match dependents
(()
(format (current-output-port)
(N_ "No dependents other than itself: ~{~a~}~%"
"No dependents other than themselves: ~{~a~^ ~}~%"
(length packages))
(map package-full-name packages)))
((x)
(format (current-output-port)
(_ "A single dependent package: ~a~%")
(package-full-name x)))
(lst
(format (current-output-port)
(N_ "Building the following package would ensure ~d \
dependent packages are rebuilt: ~*~{~a~^ ~}~%"
"Building the following ~d packages would ensure ~d \
dependent packages are rebuilt: ~{~a~^ ~}~%"
(length covering))
(length covering) (length dependents)
(map package-full-name covering))))
(return #t))))))
;;;
;;; Entry point.
;;;
(define (guix-refresh . args)
(define (parse-options)
;; Return the alist of option values.
(args-fold* args %options
(lambda (opt name arg result)
(leave (_ "~A: unrecognized option~%") name))
(lambda (arg result)
(alist-cons 'argument arg result))
%default-options))
(define (options->updaters opts)
;; Return the list of updaters to use.
(match (filter-map (match-lambda
(('updaters . names)
(map lookup-updater names))
(_ #f))
opts)
(()
;; Use the default updaters.
%updaters)
(lists
(concatenate lists))))
(define (keep-newest package lst)
;; If a newer version of PACKAGE is already in LST, return LST; otherwise
;; return LST minus the other version of PACKAGE in it, plus PACKAGE.
(let ((name (package-name package)))
(match (find (lambda (p)
(string=? (package-name p) name))
lst)
((? package? other)
(if (version>? (package-version other) (package-version package))
lst
(cons package (delq other lst))))
(_
(cons package lst)))))
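`keep-newest` keeps whichever entry of a package carries the greater version. A minimal Python sketch of the same rule, using a naive dotted-number comparator in place of Guix's `version>?` (which also handles letters and other suffixes) and toy `(name, version)` pairs:

```python
def version_key(v):
    # Naive comparator standing in for Guix's version>?; assumes "N.N..." form.
    return tuple(int(part) for part in v.split("."))

def keep_newest(package, lst):
    """If a newer version of the same package is already in lst, return lst
    unchanged; otherwise drop the older duplicate and add `package`."""
    name, version = package
    for other in lst:
        if other[0] == name:
            if version_key(other[1]) > version_key(version):
                return lst
            return [package] + [p for p in lst if p is not other]
    return [package] + lst

packages = [("gcc", "4.9"), ("guile", "2.0")]
print(keep_newest(("gcc", "5.1"), packages))  # gcc 4.9 is replaced by 5.1
```

As in the Scheme original, at most one entry per package name survives, so folding this over all packages deduplicates by name while preferring the highest version.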
(define core-package?
(let* ((input->package (match-lambda
((name (? package? package) _ ...) package)
(_ #f)))
(final-inputs (map input->package %final-inputs))
(core (append final-inputs
(append-map (compose (cut filter-map input->package <>)
package-transitive-inputs)
final-inputs)))
(names (delete-duplicates (map package-name core))))
(lambda (package)
"Return true if PACKAGE is likely a \"core package\"---i.e., one whose
update would trigger a complete rebuild."
;; Compare by name because packages in base.scm basically inherit
;; other packages. So, even if those packages are not core packages
;; themselves, updating them would also update those who inherit from
;; them.
;; XXX: Fails to catch MPFR/MPC, whose *source* is used as input.
(member (package-name package) names))))
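`core-package?` approximates the core set as the final inputs plus everything reachable through their transitive inputs, compared by name. A toy Python sketch of that reachability test over a hypothetical input graph (the XXX note above is exactly about why by-name matching can miss cases such as MPFR/MPC):

```python
def transitive_inputs(pkg, inputs):
    """Collect every package reachable from pkg through the inputs graph."""
    seen, stack = set(), [pkg]
    while stack:
        p = stack.pop()
        for dep in inputs.get(p, []):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen

# Hypothetical miniature of %final-inputs and their dependency graph.
final_inputs = ["gcc", "glibc"]
inputs = {"gcc": ["mpfr"], "mpfr": ["gmp"], "glibc": []}

core = set(final_inputs) | set().union(
    *(transitive_inputs(p, inputs) for p in final_inputs))

def core_package(name):
    """True if updating `name` would likely trigger a full rebuild."""
    return name in core

print(sorted(core))  # → ['gcc', 'glibc', 'gmp', 'mpfr']
```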
(let* ((opts (parse-options))
(update? (assoc-ref opts 'update?))
(updaters (options->updaters opts))
(list-dependent? (assoc-ref opts 'list-dependent?))
(key-download (assoc-ref opts 'key-download))
(packages
(match (filter-map (match-lambda
(('argument . spec)
;; Take either the specified version or the
;; latest one.
(specification->package spec))
(('expression . exp)
(read/eval-package-expression exp))
(_ #f))
opts)
(() ; default to all packages
(let ((select? (match (assoc-ref opts 'select)
('core core-package?)
('non-core (negate core-package?))
(_ (const #t)))))
(fold-packages (lambda (package result)
(if (select? package)
(keep-newest package result)
result))
'())))
(some ; user-specified packages
some))))
(with-error-handling
(cond
(list-dependent?
(list-dependents packages))
(update?
(let ((store (open-connection)))
(parameterize ((%openpgp-key-server
(or (assoc-ref opts 'key-server)
(%openpgp-key-server)))
(%gpg-command
(or (assoc-ref opts 'gpg-command)
(%gpg-command))))
(for-each
(cut update-package store <> updaters
#:key-download key-download)
packages))))
(else
(for-each (lambda (package)
(match (package-update-path package updaters)
((? upstream-source? source)
(let ((loc (or (package-field-location package 'version)
(package-location package))))
(format (current-error-port)
(_ "~a: ~a would be upgraded from ~a to ~a~%")
(location->string loc)
(package-name package) (package-version package)
(upstream-source-version source))))
(#f #f)))
packages))))))
| null | https://raw.githubusercontent.com/pjotrp/guix/96250294012c2f1520b67f12ea80bfd6b98075a2/guix/scripts/refresh.scm | scheme | GNU Guix --- Functional package management for GNU
This file is part of GNU Guix.
you can redistribute it and/or modify it
either version 3 of the License, or (at
your option) any later version.
GNU Guix is distributed in the hope that it will be useful, but
WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
Command-line options.
Specification of the command-line options.
Updates.
Helper macro for 'list-updaters'.
List of "updaters" used by default. They are consulted in this order.
Dependents.
Using %BAG-NODE-TYPE is more accurate than using %PACKAGE-NODE-TYPE
because it includes implicit dependencies.
Entry point.
Return the alist of option values.
Return the list of updaters to use.
Use the default updaters.
otherwise
Compare by name because packages in base.scm basically inherit
other packages. So, even if those packages are not core packages
themselves, updating them would also update those who inherit from
them.
Take either the specified version or the
latest one.
default to all packages
user-specified packages | ;;; Copyright © 2013, 2014, 2015 < >
;;; Copyright © 2013 < >
;;; Copyright © 2014 < >
;;; Copyright © 2015 < >
;;; under the terms of the GNU General Public License as published by
;;; You should have received a copy of the GNU General Public License
;;; along with GNU Guix. If not, see < / >.
(define-module (guix scripts refresh)
#:use-module (guix ui)
#:use-module (guix hash)
#:use-module (guix scripts)
#:use-module (guix store)
#:use-module (guix utils)
#:use-module (guix packages)
#:use-module (guix upstream)
#:use-module (guix graph)
#:use-module (guix scripts graph)
#:use-module (guix monads)
#:use-module ((guix gnu-maintenance)
#:select (%gnu-updater %gnome-updater))
#:use-module (guix import elpa)
#:use-module (guix import cran)
#:use-module (guix gnupg)
#:use-module (gnu packages)
#:use-module ((gnu packages commencement) #:select (%final-inputs))
#:use-module (ice-9 match)
#:use-module (ice-9 regex)
#:use-module (ice-9 vlist)
#:use-module (ice-9 format)
#:use-module (srfi srfi-1)
#:use-module (srfi srfi-11)
#:use-module (srfi srfi-26)
#:use-module (srfi srfi-37)
#:use-module (rnrs io ports)
#:export (guix-refresh
%updaters))
(define %default-options
;; Alist of default option values.
'())
(define %options
(list (option '(#\u "update") #f #f
(lambda (opt name arg result)
(alist-cons 'update? #t result)))
(option '(#\s "select") #t #f
(lambda (opt name arg result)
(match arg
((or "core" "non-core")
(alist-cons 'select (string->symbol arg)
result))
(x
(leave (_ "~a: invalid selection; expected `core' or `non-core'~%")
arg)))))
(option '(#\t "type") #t #f
(lambda (opt name arg result)
(let* ((not-comma (char-set-complement (char-set #\,)))
(names (map string->symbol
(string-tokenize arg not-comma))))
(alist-cons 'updaters names result))))
(option '(#\L "list-updaters") #f #f
(lambda args
(list-updaters-and-exit)))
(option '(#\e "expression") #t #f
(lambda (opt name arg result)
(alist-cons 'expression arg result)))
(option '(#\l "list-dependent") #f #f
(lambda (opt name arg result)
(alist-cons 'list-dependent? #t result)))
(option '("key-server") #t #f
(lambda (opt name arg result)
(alist-cons 'key-server arg result)))
(option '("gpg") #t #f
(lambda (opt name arg result)
(alist-cons 'gpg-command arg result)))
(option '("key-download") #t #f
(lambda (opt name arg result)
(match arg
((or "interactive" "always" "never")
(alist-cons 'key-download (string->symbol arg)
result))
(_
(leave (_ "unsupported policy: ~a~%")
arg)))))
(option '(#\h "help") #f #f
(lambda args
(show-help)
(exit 0)))
(option '(#\V "version") #f #f
(lambda args
(show-version-and-exit "guix refresh")))))
(define (show-help)
(display (_ "Usage: guix refresh [OPTION]... PACKAGE...
Update package definitions to match the latest upstream version.
When PACKAGE... is given, update only the specified packages. Otherwise
update all the packages of the distribution, or the subset thereof
specified with `--select'.\n"))
(display (_ "
-e, --expression=EXPR consider the package EXPR evaluates to"))
(display (_ "
-u, --update update source files in place"))
(display (_ "
-s, --select=SUBSET select all the packages in SUBSET, one of
`core' or `non-core'"))
(display (_ "
-t, --type=UPDATER,... restrict to updates from the specified updaters
(e.g., 'gnu')"))
(display (_ "
-L, --list-updaters list available updaters and exit"))
(display (_ "
-l, --list-dependent list top-level dependent packages that would need to
be rebuilt as a result of upgrading PACKAGE..."))
(newline)
(display (_ "
--key-server=HOST use HOST as the OpenPGP key server"))
(display (_ "
--gpg=COMMAND use COMMAND as the GnuPG 2.x command"))
(display (_ "
--key-download=POLICY
handle missing OpenPGP keys according to POLICY:
'always', 'never', and 'interactive', which is also
used when 'key-download' is not specified"))
(newline)
(display (_ "
-h, --help display this help and exit"))
(display (_ "
-V, --version display version information and exit"))
(newline)
(show-bug-report-information))
(define-syntax maybe-updater
(syntax-rules (=>)
((_ ((module => updater) rest ...) result)
(maybe-updater (rest ...)
(let ((iface (false-if-exception
(resolve-interface 'module)))
(tail result))
(if iface
(cons (module-ref iface 'updater) tail)
tail))))
((_ (updater rest ...) result)
(maybe-updater (rest ...)
(cons updater result)))
((_ () result)
(reverse result))))
(define-syntax-rule (list-updaters updaters ...)
"Expand to '(list UPDATERS ...)' but only the subset of UPDATERS that are
either unconditional, or have their requirement met.
A conditional updater has this form:
((SOME MODULE) => UPDATER)
meaning that UPDATER is added to the list if and only if (SOME MODULE) could
be resolved at run time.
This is a way to discard at macro expansion time updaters that depend on
unavailable optional dependencies such as Guile-JSON."
(maybe-updater (updaters ...) '()))
(define %updaters
(list-updaters %gnu-updater
%gnome-updater
%elpa-updater
%cran-updater
((guix import pypi) => %pypi-updater)))
(define (lookup-updater name)
"Return the updater called NAME."
(or (find (lambda (updater)
(eq? name (upstream-updater-name updater)))
%updaters)
(leave (_ "~a: no such updater~%") name)))
(define (list-updaters-and-exit)
"Display available updaters and exit."
(format #t (_ "Available updaters:~%"))
(for-each (lambda (updater)
(format #t "- ~a: ~a~%"
(upstream-updater-name updater)
(_ (upstream-updater-description updater))))
%updaters)
(exit 0))
(define* (update-package store package updaters
#:key (key-download 'interactive))
"Update the source file that defines PACKAGE with the new version.
KEY-DOWNLOAD specifies a download policy for missing OpenPGP keys; allowed
values: 'interactive' (default), 'always', and 'never'."
(let-values (((version tarball)
(package-update store package updaters
#:key-download key-download))
((loc)
(or (package-field-location package 'version)
(package-location package))))
(when version
(if (and=> tarball file-exists?)
(begin
(format (current-error-port)
(_ "~a: ~a: updating from version ~a to version ~a...~%")
(location->string loc)
(package-name package)
(package-version package) version)
(let ((hash (call-with-input-file tarball
port-sha256)))
(update-package-source package version hash)))
(warning (_ "~a: version ~a could not be \
downloaded and authenticated; not updating~%")
(package-name package) version)))))
(define (all-packages)
"Return the list of all the distro's packages."
(fold-packages cons '()))
(define (list-dependents packages)
"List all the things that would need to be rebuilt if PACKAGES are changed."
(with-store store
(run-with-store store
(mlet %store-monad ((edges (node-back-edges %bag-node-type
(all-packages))))
(let* ((dependents (node-transitive-edges packages edges))
(covering (filter (lambda (node)
(null? (edges node)))
dependents)))
(match dependents
(()
(format (current-output-port)
(N_ "No dependents other than itself: ~{~a~}~%"
"No dependents other than themselves: ~{~a~^ ~}~%"
(length packages))
(map package-full-name packages)))
((x)
(format (current-output-port)
(_ "A single dependent package: ~a~%")
(package-full-name x)))
(lst
(format (current-output-port)
(N_ "Building the following package would ensure ~d \
dependent packages are rebuilt: ~*~{~a~^ ~}~%"
"Building the following ~d packages would ensure ~d \
dependent packages are rebuilt: ~{~a~^ ~}~%"
(length covering))
(length covering) (length dependents)
(map package-full-name covering))))
(return #t))))))
(define (guix-refresh . args)
(define (parse-options)
(args-fold* args %options
(lambda (opt name arg result)
(leave (_ "~A: unrecognized option~%") name))
(lambda (arg result)
(alist-cons 'argument arg result))
%default-options))
(define (options->updaters opts)
(match (filter-map (match-lambda
(('updaters . names)
(map lookup-updater names))
(_ #f))
opts)
(()
%updaters)
(lists
(concatenate lists))))
(define (keep-newest package lst)
;; return LST minus the other version of PACKAGE in it, plus PACKAGE.
(let ((name (package-name package)))
(match (find (lambda (p)
(string=? (package-name p) name))
lst)
((? package? other)
(if (version>? (package-version other) (package-version package))
lst
(cons package (delq other lst))))
(_
(cons package lst)))))
(define core-package?
(let* ((input->package (match-lambda
((name (? package? package) _ ...) package)
(_ #f)))
(final-inputs (map input->package %final-inputs))
(core (append final-inputs
(append-map (compose (cut filter-map input->package <>)
package-transitive-inputs)
final-inputs)))
(names (delete-duplicates (map package-name core))))
(lambda (package)
"Return true if PACKAGE is likely a \"core package\"---i.e., one whose
update would trigger a complete rebuild."
;; XXX: Fails to catch MPFR/MPC, whose *source* is used as input.
(member (package-name package) names))))
(let* ((opts (parse-options))
(update? (assoc-ref opts 'update?))
(updaters (options->updaters opts))
(list-dependent? (assoc-ref opts 'list-dependent?))
(key-download (assoc-ref opts 'key-download))
(packages
(match (filter-map (match-lambda
(('argument . spec)
(specification->package spec))
(('expression . exp)
(read/eval-package-expression exp))
(_ #f))
opts)
(()
(let ((select? (match (assoc-ref opts 'select)
('core core-package?)
('non-core (negate core-package?))
(_ (const #t)))))
(fold-packages (lambda (package result)
(if (select? package)
(keep-newest package result)
result))
'())))
(some
some))))
(with-error-handling
(cond
(list-dependent?
(list-dependents packages))
(update?
(let ((store (open-connection)))
(parameterize ((%openpgp-key-server
(or (assoc-ref opts 'key-server)
(%openpgp-key-server)))
(%gpg-command
(or (assoc-ref opts 'gpg-command)
(%gpg-command))))
(for-each
(cut update-package store <> updaters
#:key-download key-download)
packages))))
(else
(for-each (lambda (package)
(match (package-update-path package updaters)
((? upstream-source? source)
(let ((loc (or (package-field-location package 'version)
(package-location package))))
(format (current-error-port)
(_ "~a: ~a would be upgraded from ~a to ~a~%")
(location->string loc)
(package-name package) (package-version package)
(upstream-source-version source))))
(#f #f)))
packages))))))
|
2ff8fee761d56e55687da6d46d11ddd6e2bfb40e24985e594a0d2afe657669dd | WhatsApp/eqwalizer | eqwalizer_specs.erl | %%% Copyright (c) Meta Platforms, Inc. and affiliates. All rights reserved.
%%%
%%% This source code is licensed under the Apache 2.0 license found in
%%% the LICENSE file in the root directory of this source tree.
-module(eqwalizer_specs).
-typing([eqwalizer]).
-compile([export_all, nowarn_export_all]).
%% -------- erlang --------
-spec 'erlang:abs'(number()) -> number().
'erlang:abs'(_) -> error(eqwalizer_specs).
-spec 'erlang:max'(A, B) -> A | B.
'erlang:max'(_, _) -> error(eqwalizer_specs).
-spec 'erlang:min'(A, B) -> A | B.
'erlang:min'(_, _) -> error(eqwalizer_specs).
-spec 'erlang:system_time'() -> pos_integer().
'erlang:system_time'() -> error(eqwalizer_specs).
-spec 'erlang:system_time'(erlang:time_unit()) -> pos_integer().
'erlang:system_time'(_) -> error(eqwalizer_specs).
%% -------- gb_sets --------
-spec 'gb_sets:empty'() -> gb_sets:set(none()).
'gb_sets:empty'() -> error(eqwalizer_specs).
-spec 'gb_sets:new'() -> gb_sets:set(none()).
'gb_sets:new'() -> error(eqwalizer_specs).
%% -------- lists --------
-spec 'lists:all'(fun((T) -> boolean()), [T]) -> boolean().
'lists:all'(_, _) -> error(eqwalizer_specs).
-spec 'lists:any'(fun((T) -> boolean()), [T]) -> boolean().
'lists:any'(_, _) -> error(eqwalizer_specs).
-spec 'lists:append'([[T]]) -> [T].
'lists:append'(_) -> error(eqwalizer_specs).
-spec 'lists:append'([T], [T]) -> [T].
'lists:append'(_, _) -> error(eqwalizer_specs).
-spec 'lists:delete'(T, [T]) -> [T].
'lists:delete'(_, _) -> error(eqwalizer_specs).
-spec 'lists:droplast'([T]) -> [T].
'lists:droplast'(_) -> error(eqwalizer_specs).
-spec 'lists:dropwhile'(fun((T) -> boolean()), [T]) -> [T].
'lists:dropwhile'(_, _) -> error(eqwalizer_specs).
-spec 'lists:duplicate'(non_neg_integer(), T) -> [T].
'lists:duplicate'(_, _) -> error(eqwalizer_specs).
-spec 'lists:filter'(fun((T) -> boolean()), [T]) -> [T].
'lists:filter'(_, _) -> error(eqwalizer_specs).
-spec 'lists:filtermap'(fun((T) -> boolean() | {'true', X}), [T]) -> [(T | X)].
'lists:filtermap'(_, _) -> error(eqwalizer_specs).
-spec 'lists:flatmap'(fun((A) -> [B]), [A]) -> [B].
'lists:flatmap'(_, _) -> error(eqwalizer_specs).
-spec 'lists:flatlength'([term()]) -> non_neg_integer().
'lists:flatlength'(_) -> error(eqwalizer_specs).
-spec 'lists:foldl'(fun((T, Acc) -> Acc), Acc, [T]) -> Acc.
'lists:foldl'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:foldr'(fun((T, Acc) -> Acc), Acc, [T]) -> Acc.
'lists:foldr'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:foreach'(fun((T) -> term()), [T]) -> ok.
'lists:foreach'(_, _) -> error(eqwalizer_specs).
-spec 'lists:join'(T, [T]) -> [T].
'lists:join'(_, _) -> error(eqwalizer_specs).
-spec 'lists:keydelete'(Key :: term(), N :: pos_integer(), [Tuple]) -> [Tuple].
'lists:keydelete'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:keyfind'(Key :: term(), N :: pos_integer(), [Tuple]) -> Tuple | false.
'lists:keyfind'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:keyreplace'(Key :: term(), N :: pos_integer(), [Tuple], Tuple) -> [Tuple].
'lists:keyreplace'(_, _, _, _) -> error(eqwalizer_specs).
-spec 'lists:keysearch'(Key :: term(), N :: pos_integer(), [Tuple]) -> {value, Tuple} | false.
'lists:keysearch'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:keytake'(Key :: term(), N :: pos_integer(), [Tuple]) -> {value, Tuple, [Tuple]} | false.
'lists:keytake'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:last'([T]) -> T.
'lists:last'(_) -> error(eqwalizer_specs).
-spec 'lists:map'(fun((A) -> B), [A]) -> [B].
'lists:map'(_, _) -> error(eqwalizer_specs).
-spec 'lists:mapfoldl'(fun((A, Acc) -> {B, Acc}), Acc, [A]) -> {[B], Acc}.
'lists:mapfoldl'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:mapfoldr'(fun((A, Acc) -> {B, Acc}), Acc, [A]) -> {[B], Acc}.
'lists:mapfoldr'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:max'([T]) -> T.
'lists:max'(_) -> error(eqwalizer_specs).
-spec 'lists:member'(T, [T]) -> boolean().
'lists:member'(_, _) -> error(eqwalizer_specs).
-spec 'lists:merge'([[T]]) -> [T].
'lists:merge'(_) -> error(eqwalizer_specs).
-spec 'lists:merge'([X], [Y]) -> [X | Y].
'lists:merge'(_, _) -> error(eqwalizer_specs).
-spec 'lists:merge'(fun((A, B) -> boolean()), [A], [B]) -> [A | B].
'lists:merge'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:merge3'([X], [Y], [Z]) -> [X | Y | Z].
'lists:merge3'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:min'([T]) -> T.
'lists:min'(_) -> error(eqwalizer_specs).
-spec 'lists:nth'(pos_integer(), [T]) -> T.
'lists:nth'(_, _) -> error(eqwalizer_specs).
-spec 'lists:nthtail'(pos_integer(), [T]) -> [T].
'lists:nthtail'(_, _) -> error(eqwalizer_specs).
-spec 'lists:partition'(fun((T) -> boolean()), [T]) -> {[T], [T]}.
'lists:partition'(_, _) -> error(eqwalizer_specs).
-spec 'lists:prefix'([T], [T]) -> boolean().
'lists:prefix'(_, _) -> error(eqwalizer_specs).
-spec 'lists:reverse'([T]) -> [T].
'lists:reverse'(_) -> error(eqwalizer_specs).
-spec 'lists:reverse'([T], [T]) -> [T].
'lists:reverse'(_, _) -> error(eqwalizer_specs).
-spec 'lists:rmerge'([X], [Y]) -> [X | Y].
'lists:rmerge'(_, _) -> error(eqwalizer_specs).
-spec 'lists:rmerge3'([X], [Y], [Z]) -> [X | Y | Z].
'lists:rmerge3'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:rumerge'([X], [Y]) -> [X | Y].
'lists:rumerge'(_, _) -> error(eqwalizer_specs).
-spec 'lists:rumerge'(fun((X, Y) -> boolean()), [X], [Y]) -> [(X | Y)].
'lists:rumerge'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:rumerge3'([X], [Y], [Z]) -> [X | Y | Z].
'lists:rumerge3'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:search'(fun((T) -> boolean()), [T]) -> {value, T} | false.
'lists:search'(_, _) -> error(eqwalizer_specs).
-spec 'lists:sort'([T]) -> [T].
'lists:sort'(_) -> error(eqwalizer_specs).
-spec 'lists:sort'(fun((T, T) -> boolean()), [T]) -> [T].
'lists:sort'(_, _) -> error(eqwalizer_specs).
-spec 'lists:split'(non_neg_integer(), [T]) -> {[T], [T]}.
'lists:split'(_, _) -> error(eqwalizer_specs).
-spec 'lists:splitwith'(fun((T) -> boolean()), [T]) -> {[T], [T]}.
'lists:splitwith'(_, _) -> error(eqwalizer_specs).
-spec 'lists:sublist'([T], Len :: non_neg_integer()) -> [T].
'lists:sublist'(_, _) -> error(eqwalizer_specs).
-spec 'lists:sublist'([T], Start :: pos_integer(), Len :: non_neg_integer()) -> [T].
'lists:sublist'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:subtract'([T], [T]) -> [T].
'lists:subtract'(_, _) -> error(eqwalizer_specs).
-spec 'lists:suffix'([T], [T]) -> boolean().
'lists:suffix'(_, _) -> error(eqwalizer_specs).
-spec 'lists:takewhile'(fun((T) -> boolean()), [T]) -> [T].
'lists:takewhile'(_, _) -> error(eqwalizer_specs).
-spec 'lists:umerge'([[T]]) -> [T].
'lists:umerge'(_) -> error(eqwalizer_specs).
-spec 'lists:umerge'([A], [B]) -> [A | B].
'lists:umerge'(_, _) -> error(eqwalizer_specs).
-spec 'lists:umerge'(fun((A, B) -> boolean()), [A], [B]) -> [A | B].
'lists:umerge'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:umerge3'([A], [B], [C]) -> [A | B | C].
'lists:umerge3'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:unzip'([{A, B}]) -> {[A], [B]}.
'lists:unzip'(_) -> error(eqwalizer_specs).
-spec 'lists:unzip3'([{A, B, C}]) -> {[A], [B], [C]}.
'lists:unzip3'(_) -> error(eqwalizer_specs).
-spec 'lists:usort'([T]) -> [T].
'lists:usort'(_) -> error(eqwalizer_specs).
-spec 'lists:usort'(fun((T, T) -> boolean()), [T]) -> [T].
'lists:usort'(_, _) -> error(eqwalizer_specs).
-spec 'lists:zf'(fun((T) -> boolean() | {'true', X}), [T]) -> [(T | X)].
'lists:zf'(_, _) -> error(eqwalizer_specs).
-spec 'lists:zip'([A], [B]) -> [{A, B}].
'lists:zip'(_, _) -> error(eqwalizer_specs).
-spec 'lists:zip3'([A], [B], [C]) -> [{A, B, C}].
'lists:zip3'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:zipwith'(fun((X, Y) -> T), [X], [Y]) -> [T].
'lists:zipwith'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:zipwith3'(fun((X, Y, Z) -> T), [X], [Y], [Z]) -> [T].
'lists:zipwith3'(_, _, _, _) -> error(eqwalizer_specs).
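The refined `lists:filtermap/2` spec above returns `[(T | X)]` because the callback may keep an element (`true`), drop it (`false`), or substitute a new value (`{'true', X}`), so the result mixes original and replacement types. A Python mirror of those semantics (illustrative only, not part of eqWAlizer):

```python
def filtermap(fun, lst):
    """Mirror Erlang's lists:filtermap/2: fun returns False (drop the
    element), True (keep it as-is), or ('true', x) (emit x instead)."""
    out = []
    for item in lst:
        r = fun(item)
        if r is True:
            out.append(item)
        elif isinstance(r, tuple) and r[0] == 'true':
            out.append(r[1])
        # anything else (False) drops the element
    return out

# Keep the even numbers, doubled.
print(filtermap(lambda n: ('true', n * 2) if n % 2 == 0 else False, [1, 2, 3, 4]))  # → [4, 8]
```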
%% -------- maps --------
-spec 'maps:find'(Key, #{Key => Value}) -> {ok, Value} | error.
'maps:find'(_, _) -> error(eqwalizer_specs).
-spec 'maps:from_list'([{Key, Value}]) -> #{Key => Value}.
'maps:from_list'(_) -> error(eqwalizer_specs).
-spec 'maps:merge'(#{Key => Value}, #{Key => Value}) -> #{Key => Value}.
'maps:merge'(_, _) -> error(eqwalizer_specs).
-spec 'maps:put'(Key, Value, #{Key => Value}) -> #{Key => Value}.
'maps:put'(_, _, _) -> error(eqwalizer_specs).
-spec 'maps:remove'(Key, #{Key => Value}) -> #{Key => Value}.
'maps:remove'(_, _) -> error(eqwalizer_specs).
-spec 'maps:update_with'(Key, fun((Value1) -> Value2), #{Key => Value1}) -> #{Key => Value1 | Value2}.
'maps:update_with'(_, _, _) -> error(eqwalizer_specs).
-spec 'maps:update_with'(Key, fun((Value1) -> Value2), Value2, #{Key => Value1}) -> #{Key => Value1 | Value2}.
'maps:update_with'(_, _, _, _) -> error(eqwalizer_specs).
%% -------- proplists --------
-spec 'proplists:delete'(term(), [A]) -> [A].
'proplists:delete'(_, _) -> error(eqwalizer_specs).
-spec 'proplists:from_map'(#{K => V}) -> [{K, V}].
'proplists:from_map'(_) -> error(eqwalizer_specs).
%% -------- timer --------
-spec 'timer:tc'(fun(() -> T)) -> {integer(), T}.
'timer:tc'(_) -> error(eqwalizer_specs).
| null | https://raw.githubusercontent.com/WhatsApp/eqwalizer/8017d486c025eaa5c35ced1481ad5bad0f665efa/mini-elp/test_projects/parse_error/eqwalizer/src/eqwalizer_specs.erl | erlang |
the LICENSE file in the root directory of this source tree.
-------- erlang --------
-------- gb_sets --------
-------- lists --------
-------- maps --------
-------- proplists --------
-------- timer -------- | Copyright ( c ) Meta Platforms , Inc. and affiliates . All rights reserved .
This source code is licensed under the Apache 2.0 license found in
-module(eqwalizer_specs).
-typing([eqwalizer]).
-compile([export_all, nowarn_export_all]).
-spec 'erlang:abs'(number()) -> number().
'erlang:abs'(_) -> error(eqwalizer_specs).
-spec 'erlang:max'(A, B) -> A | B.
'erlang:max'(_, _) -> error(eqwalizer_specs).
-spec 'erlang:min'(A, B) -> A | B.
'erlang:min'(_, _) -> error(eqwalizer_specs).
-spec 'erlang:system_time'() -> pos_integer().
'erlang:system_time'() -> error(eqwalizer_specs).
-spec 'erlang:system_time'(erlang:time_unit()) -> pos_integer().
'erlang:system_time'(_) -> error(eqwalizer_specs).
-spec 'gb_sets:empty'() -> gb_sets:set(none()).
'gb_sets:empty'() -> error(eqwalizer_specs).
-spec 'gb_sets:new'() -> gb_sets:set(none()).
'gb_sets:new'() -> error(eqwalizer_specs).
-spec 'lists:all'(fun((T) -> boolean()), [T]) -> boolean().
'lists:all'(_, _) -> error(eqwalizer_specs).
-spec 'lists:any'(fun((T) -> boolean()), [T]) -> boolean().
'lists:any'(_, _) -> error(eqwalizer_specs).
-spec 'lists:append'([[T]]) -> [T].
'lists:append'(_) -> error(eqwalizer_specs).
-spec 'lists:append'([T], [T]) -> [T].
'lists:append'(_, _) -> error(eqwalizer_specs).
-spec 'lists:delete'(T, [T]) -> [T].
'lists:delete'(_, _) -> error(eqwalizer_specs).
-spec 'lists:droplast'([T]) -> [T].
'lists:droplast'(_) -> error(eqwalizer_specs).
-spec 'lists:dropwhile'(fun((T) -> boolean()), [T]) -> [T].
'lists:dropwhile'(_, _) -> error(eqwalizer_specs).
-spec 'lists:duplicate'(non_neg_integer(), T) -> [T].
'lists:duplicate'(_, _) -> error(eqwalizer_specs).
-spec 'lists:filter'(fun((T) -> boolean()), [T]) -> [T].
'lists:filter'(_, _) -> error(eqwalizer_specs).
-spec 'lists:filtermap'(fun((T) -> boolean() | {'true', X}), [T]) -> [(T | X)].
'lists:filtermap'(_, _) -> error(eqwalizer_specs).
-spec 'lists:flatmap'(fun((A) -> [B]), [A]) -> [B].
'lists:flatmap'(_, _) -> error(eqwalizer_specs).
-spec 'lists:flatlength'([term()]) -> non_neg_integer().
'lists:flatlength'(_) -> error(eqwalizer_specs).
-spec 'lists:foldl'(fun((T, Acc) -> Acc), Acc, [T]) -> Acc.
'lists:foldl'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:foldr'(fun((T, Acc) -> Acc), Acc, [T]) -> Acc.
'lists:foldr'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:foreach'(fun((T) -> term()), [T]) -> ok.
'lists:foreach'(_, _) -> error(eqwalizer_specs).
-spec 'lists:join'(T, [T]) -> [T].
'lists:join'(_, _) -> error(eqwalizer_specs).
-spec 'lists:keydelete'(Key :: term(), N :: pos_integer(), [Tuple]) -> [Tuple].
'lists:keydelete'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:keyfind'(Key :: term(), N :: pos_integer(), [Tuple]) -> Tuple | false.
'lists:keyfind'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:keyreplace'(Key :: term(), N :: pos_integer(), [Tuple], Tuple) -> [Tuple].
'lists:keyreplace'(_, _, _, _) -> error(eqwalizer_specs).
-spec 'lists:keysearch'(Key :: term(), N :: pos_integer(), [Tuple]) -> {value, Tuple} | false.
'lists:keysearch'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:keytake'(Key :: term(), N :: pos_integer(), [Tuple]) -> {value, Tuple, [Tuple]} | false.
'lists:keytake'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:last'([T]) -> T.
'lists:last'(_) -> error(eqwalizer_specs).
-spec 'lists:map'(fun((A) -> B), [A]) -> [B].
'lists:map'(_, _) -> error(eqwalizer_specs).
-spec 'lists:mapfoldl'(fun((A, Acc) -> {B, Acc}), Acc, [A]) -> {[B], Acc}.
'lists:mapfoldl'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:mapfoldr'(fun((A, Acc) -> {B, Acc}), Acc, [A]) -> {[B], Acc}.
'lists:mapfoldr'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:max'([T]) -> T.
'lists:max'(_) -> error(eqwalizer_specs).
-spec 'lists:member'(T, [T]) -> boolean().
'lists:member'(_, _) -> error(eqwalizer_specs).
-spec 'lists:merge'([[T]]) -> [T].
'lists:merge'(_) -> error(eqwalizer_specs).
-spec 'lists:merge'([X], [Y]) -> [X | Y].
'lists:merge'(_, _) -> error(eqwalizer_specs).
-spec 'lists:merge'(fun((A, B) -> boolean()), [A], [B]) -> [A | B].
'lists:merge'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:merge3'([X], [Y], [Z]) -> [X | Y | Z].
'lists:merge3'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:min'([T]) -> T.
'lists:min'(_) -> error(eqwalizer_specs).
-spec 'lists:nth'(pos_integer(), [T]) -> T.
'lists:nth'(_, _) -> error(eqwalizer_specs).
-spec 'lists:nthtail'(pos_integer(), [T]) -> [T].
'lists:nthtail'(_, _) -> error(eqwalizer_specs).
-spec 'lists:partition'(fun((T) -> boolean()), [T]) -> {[T], [T]}.
'lists:partition'(_, _) -> error(eqwalizer_specs).
-spec 'lists:prefix'([T], [T]) -> boolean().
'lists:prefix'(_, _) -> error(eqwalizer_specs).
-spec 'lists:reverse'([T]) -> [T].
'lists:reverse'(_) -> error(eqwalizer_specs).
-spec 'lists:reverse'([T], [T]) -> [T].
'lists:reverse'(_, _) -> error(eqwalizer_specs).
-spec 'lists:rmerge'([X], [Y]) -> [X | Y].
'lists:rmerge'(_, _) -> error(eqwalizer_specs).
-spec 'lists:rmerge3'([X], [Y], [Z]) -> [X | Y | Z].
'lists:rmerge3'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:rumerge'([X], [Y]) -> [X | Y].
'lists:rumerge'(_, _) -> error(eqwalizer_specs).
-spec 'lists:rumerge'(fun((X, Y) -> boolean()), [X], [Y]) -> [(X | Y)].
'lists:rumerge'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:rumerge3'([X], [Y], [Z]) -> [X | Y | Z].
'lists:rumerge3'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:search'(fun((T) -> boolean()), [T]) -> {value, T} | false.
'lists:search'(_, _) -> error(eqwalizer_specs).
-spec 'lists:sort'([T]) -> [T].
'lists:sort'(_) -> error(eqwalizer_specs).
-spec 'lists:sort'(fun((T, T) -> boolean()), [T]) -> [T].
'lists:sort'(_, _) -> error(eqwalizer_specs).
-spec 'lists:split'(non_neg_integer(), [T]) -> {[T], [T]}.
'lists:split'(_, _) -> error(eqwalizer_specs).
-spec 'lists:splitwith'(fun((T) -> boolean()), [T]) -> {[T], [T]}.
'lists:splitwith'(_, _) -> error(eqwalizer_specs).
-spec 'lists:sublist'([T], Len :: non_neg_integer()) -> [T].
'lists:sublist'(_, _) -> error(eqwalizer_specs).
-spec 'lists:sublist'([T], Start :: pos_integer(), Len :: non_neg_integer()) -> [T].
'lists:sublist'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:subtract'([T], [T]) -> [T].
'lists:subtract'(_, _) -> error(eqwalizer_specs).
-spec 'lists:suffix'([T], [T]) -> boolean().
'lists:suffix'(_, _) -> error(eqwalizer_specs).
-spec 'lists:takewhile'(fun((T) -> boolean()), [T]) -> [T].
'lists:takewhile'(_, _) -> error(eqwalizer_specs).
-spec 'lists:umerge'([[T]]) -> [T].
'lists:umerge'(_) -> error(eqwalizer_specs).
-spec 'lists:umerge'([A], [B]) -> [A | B].
'lists:umerge'(_, _) -> error(eqwalizer_specs).
-spec 'lists:umerge'(fun((A, B) -> boolean()), [A], [B]) -> [A | B].
'lists:umerge'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:umerge3'([A], [B], [C]) -> [A | B | C].
'lists:umerge3'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:unzip'([{A, B}]) -> {[A], [B]}.
'lists:unzip'(_) -> error(eqwalizer_specs).
-spec 'lists:unzip3'([{A, B, C}]) -> {[A], [B], [C]}.
'lists:unzip3'(_) -> error(eqwalizer_specs).
-spec 'lists:usort'([T]) -> [T].
'lists:usort'(_) -> error(eqwalizer_specs).
-spec 'lists:usort'(fun((T, T) -> boolean()), [T]) -> [T].
'lists:usort'(_, _) -> error(eqwalizer_specs).
-spec 'lists:zf'(fun((T) -> boolean() | {'true', X}), [T]) -> [(T | X)].
'lists:zf'(_, _) -> error(eqwalizer_specs).
-spec 'lists:zip'([A], [B]) -> [{A, B}].
'lists:zip'(_, _) -> error(eqwalizer_specs).
-spec 'lists:zip3'([A], [B], [C]) -> [{A, B, C}].
'lists:zip3'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:zipwith'(fun((X, Y) -> T), [X], [Y]) -> [T].
'lists:zipwith'(_, _, _) -> error(eqwalizer_specs).
-spec 'lists:zipwith3'(fun((X, Y, Z) -> T), [X], [Y], [Z]) -> [T].
'lists:zipwith3'(_, _, _, _) -> error(eqwalizer_specs).
-spec 'maps:find'(Key, #{Key => Value}) -> {ok, Value} | error.
'maps:find'(_, _) -> error(eqwalizer_specs).
-spec 'maps:from_list'([{Key, Value}]) -> #{Key => Value}.
'maps:from_list'(_) -> error(eqwalizer_specs).
-spec 'maps:merge'(#{Key => Value}, #{Key => Value}) -> #{Key => Value}.
'maps:merge'(_, _) -> error(eqwalizer_specs).
-spec 'maps:put'(Key, Value, #{Key => Value}) -> #{Key => Value}.
'maps:put'(_, _, _) -> error(eqwalizer_specs).
-spec 'maps:remove'(Key, #{Key => Value}) -> #{Key => Value}.
'maps:remove'(_, _) -> error(eqwalizer_specs).
-spec 'maps:update_with'(Key, fun((Value1) -> Value2), #{Key => Value1}) -> #{Key => Value1 | Value2}.
'maps:update_with'(_, _, _) -> error(eqwalizer_specs).
-spec 'maps:update_with'(Key, fun((Value1) -> Value2), Value2, #{Key => Value1}) -> #{Key => Value1 | Value2}.
'maps:update_with'(_, _, _, _) -> error(eqwalizer_specs).
-spec 'proplists:delete'(term(), [A]) -> [A].
'proplists:delete'(_, _) -> error(eqwalizer_specs).
-spec 'proplists:from_map'(#{K => V}) -> [{K, V}].
'proplists:from_map'(_) -> error(eqwalizer_specs).
-spec 'timer:tc'(fun(() -> T)) -> {integer(), T}.
'timer:tc'(_) -> error(eqwalizer_specs).
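The specs above only state types; the contract of `lists:mapfoldl/3` (map and fold in one left-to-right pass, returning the mapped list together with the final accumulator) is easy to sketch in Python. This is an illustrative helper of mine, not part of any library referenced here:

```python
def mapfoldl(f, acc, xs):
    """Mirror of the Erlang spec: f(elem, acc) -> (mapped, new_acc),
    result is (mapped_list, final_acc)."""
    out = []
    for x in xs:
        y, acc = f(x, acc)
        out.append(y)
    return out, acc

# Double each element while accumulating a running sum:
doubled, total = mapfoldl(lambda x, a: (2 * x, a + x), 0, [1, 2, 3])
# doubled == [2, 4, 6], total == 6
```

`lists:mapfoldr/3` is the same shape, just folding from the right.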
|
874f8e62860e342f05755f7a29b8b9761cc736442c50ee0817a5b3b360a67916 | flodihn/NextGen | libid_sup.erl | %%----------------------------------------------------------------------
@author < >
%% @doc
This is the supervisor for the i d library ' ' .
%% @end
%%---------------------------------------------------------------------
-module(libid_sup).
-behaviour(supervisor).
-export([
start_link/0,
init/1
]).
start_link() ->
supervisor:start_link({local, libid_sup}, libid_sup, []).
init([]) ->
RestartStrategy = one_for_one,
MaxRestarts = 10,
MaxSecondsBetweenRestarts = 30,
SupFlags = {RestartStrategy, MaxRestarts, MaxSecondsBetweenRestarts},
Restart = permanent,
Shutdown = 2000,
LibPlayer = {'libid', {libid_srv, start_link,
[libid_std_impl]}, Restart, Shutdown, worker, dynamic},
{ok, {SupFlags, [LibPlayer]}}.
| null | https://raw.githubusercontent.com/flodihn/NextGen/3da1c3ee0d8f658383bdf5fccbdd49ace3cdb323/AreaServer/src/libid/libid_sup.erl | erlang | ----------------------------------------------------------------------
@doc
@end
--------------------------------------------------------------------- | @author < >
This is the supervisor for the i d library ' ' .
-module(libid_sup).
-behaviour(supervisor).
-export([
start_link/0,
init/1
]).
start_link() ->
supervisor:start_link({local, libid_sup}, libid_sup, []).
init([]) ->
RestartStrategy = one_for_one,
MaxRestarts = 10,
MaxSecondsBetweenRestarts = 30,
SupFlags = {RestartStrategy, MaxRestarts, MaxSecondsBetweenRestarts},
Restart = permanent,
Shutdown = 2000,
LibPlayer = {'libid', {libid_srv, start_link,
[libid_std_impl]}, Restart, Shutdown, worker, dynamic},
{ok, {SupFlags, [LibPlayer]}}.
|
705f0826e6461c92badfecfa29b18db44f02290b8d6f6a5e982874e21b6ba857 | PacktWorkshops/The-Clojure-Workshop | tennis.clj | (ns packt-clj.tennis
(:require [clojure.math.numeric-tower :as math]
[clojure.java.io :as io]
[clojure.data.csv :as csv]
[clojure.set :as set]
[semantic-csv.core :as sc]))
(defn match-probability [player-1-rating player-2-rating]
(/ 1
(+ 1 (math/expt 10 (/ (- player-2-rating player-1-rating) 400)))))
(defn recalculate-rating [k previous-rating expected-outcome real-outcome]
(+ previous-rating (* k (- real-outcome expected-outcome))))
(defn elo-world-simple
([csv k]
(with-open [r (io/reader csv)]
(->> (csv/read-csv r)
sc/mappify
(sc/cast-with {:winner_sets_won sc/->int
:loser_sets_won sc/->int
:winner_games_won sc/->int
:loser_games_won sc/->int})
(reduce (fn [{:keys [players] :as acc} {:keys [:winner_name :winner_slug
:loser_name :loser_slug] :as match}]
(let [winner-rating (get players winner_slug 400)
loser-rating (get players loser_slug 400)
winner-probability (match-probability winner-rating loser-rating)
loser-probability (- 1 winner-probability)
predictable-match? (not= winner-rating loser-rating)
prediction-correct? (> winner-rating loser-rating)
correct-predictions (if (and predictable-match? prediction-correct?)
(inc (:correct-predictions acc))
(:correct-predictions acc))
predictable-matches (if predictable-match?
(inc (:predictable-match-count acc))
(:predictable-match-count acc))]
(-> acc
(assoc :predictable-match-count predictable-matches)
(assoc :correct-predictions correct-predictions)
(assoc-in [:players winner_slug] (recalculate-rating k winner-rating winner-probability 1))
(assoc-in [:players loser_slug] (recalculate-rating k loser-rating loser-probability 0))
(update :match-count inc))))
{:players {}
:match-count 0
:predictable-match-count 0
:correct-predictions 0})))))
(defn k-adjustment-by-match [k {:keys [winner_sets_won loser_sets_won
winner_games_won loser_games_won]}]
(cond (> (- winner_sets_won loser_sets_won) 1)
(* k 1.2)
(< (- winner_games_won loser_games_won) 3)
(* k 0.6)
::otherwise
k))
(defn k-adjustment-by-ratings [k winner-rating loser-rating]
(cond (= winner-rating loser-rating)
(* k 0.5)
(and (> loser-rating winner-rating)
(> (- loser-rating winner-rating) 100))
(* k 1.1)
(> loser-rating winner-rating)
(* k 1.2)
::otherwise
k))
(defn elo-world-k-adjustment
([csv k k-fn]
(with-open [r (io/reader csv)]
(->> (csv/read-csv r)
sc/mappify
(sc/cast-with {:winner_sets_won sc/->int
:loser_sets_won sc/->int
:winner_games_won sc/->int
:loser_games_won sc/->int})
(reduce (fn [{:keys [players] :as acc} {:keys [:winner_name :winner_slug
:loser_name :loser_slug] :as match}]
(let [winner-rating (get players winner_slug 400)
loser-rating (get players loser_slug 400)
winner-probability (match-probability winner-rating loser-rating)
loser-probability (- 1 winner-probability)
predictable-match? (not= winner-rating loser-rating)
prediction-correct? (> winner-rating loser-rating)
correct-predictions (if (and predictable-match? prediction-correct?)
(inc (:correct-predictions acc))
(:correct-predictions acc))
predictable-matches (if predictable-match?
(inc (:predictable-match-count acc))
(:predictable-match-count acc))
adjusted-k (k-fn k winner-rating loser-rating)]
(-> acc
(assoc :predictable-match-count predictable-matches)
(assoc :correct-predictions correct-predictions)
(assoc-in [:players winner_slug] (recalculate-rating adjusted-k winner-rating winner-probability 1))
(assoc-in [:players loser_slug] (recalculate-rating adjusted-k loser-rating loser-probability 0))
(update :match-count inc))))
{:players {}
:match-count 0
:predictable-match-count 0
:correct-predictions 0})))))
(defn elo-world-k-adjustment-acc-param
([acc csv k k-fn]
(with-open [r (io/reader csv)]
(->> (csv/read-csv r)
sc/mappify
(sc/cast-with {:winner_sets_won sc/->int
:loser_sets_won sc/->int
:winner_games_won sc/->int
:loser_games_won sc/->int})
(reduce (fn [{:keys [players] :as acc} {:keys [:winner_name :winner_slug
:loser_name :loser_slug] :as match}]
(let [winner-rating (get players winner_slug 400)
loser-rating (get players loser_slug 400)
winner-probability (match-probability winner-rating loser-rating)
loser-probability (- 1 winner-probability)
predictable-match? (not= winner-rating loser-rating)
prediction-correct? (> winner-rating loser-rating)
correct-predictions (if (and predictable-match? prediction-correct?)
(inc (:correct-predictions acc))
(:correct-predictions acc))
predictable-matches (if predictable-match?
(inc (:predictable-match-count acc))
(:predictable-match-count acc))
adjusted-k (k-fn k winner-rating loser-rating)]
(-> acc
(assoc :predictable-match-count predictable-matches)
(assoc :correct-predictions correct-predictions)
(assoc-in [:players winner_slug] (recalculate-rating adjusted-k winner-rating winner-probability 1))
(assoc-in [:players loser_slug] (recalculate-rating adjusted-k loser-rating loser-probability 0))
(update :match-count inc))))
acc)))))
(defn multi-elo [csv-list k k-fn]
(reduce (fn [acc csv]
(elo-world-k-adjustment-acc-param acc csv k k-fn))
{:players {}
:match-count 0
:predictable-match-count 0
:correct-predictions 0}
csv-list))
;;; to call multi-elo
(comment
;; packt-clj.tennis>
(def multi-results (multi-elo
                     ["match_scores_1877-1967_unindexed_csv.csv"
                      "match_scores_1968-1990_unindexed_csv.csv"
                      "match_scores_1991-2016_unindexed_csv.csv"]
                     40
                     k-adjustment-by-ratings)))
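The two arithmetic helpers above (`match-probability`, `recalculate-rating`) are the standard Elo formulas and translate directly. A Python sketch of the same arithmetic (function names are mine):

```python
def match_probability(r1, r2):
    # P(player 1 beats player 2) under the Elo logistic model
    return 1.0 / (1.0 + 10 ** ((r2 - r1) / 400.0))

def recalculate_rating(k, previous, expected, actual):
    # Move the rating by k times the "surprise" (actual - expected)
    return previous + k * (actual - expected)

# Equal ratings predict 50/50; a win then gains k/2 points:
p = match_probability(400, 400)            # 0.5
new_r = recalculate_rating(32, 400, p, 1)  # 416.0
```

Note the two probabilities of a match always sum to 1, which is why the Clojure code computes the loser's probability as `(- 1 winner-probability)`.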
| null | https://raw.githubusercontent.com/PacktWorkshops/The-Clojure-Workshop/3d309bb0e46a41ce2c93737870433b47ce0ba6a2/Chapter05/Activity5.01/tennis.clj | clojure | to call multi-elo
packt-clj.tennis> | (ns packt-clj.tennis
(:require [clojure.math.numeric-tower :as math]
[clojure.java.io :as io]
[clojure.data.csv :as csv]
[clojure.set :as set]
[semantic-csv.core :as sc]))
(defn match-probability [player-1-rating player-2-rating]
(/ 1
(+ 1 (math/expt 10 (/ (- player-2-rating player-1-rating) 400)))))
(defn recalculate-rating [k previous-rating expected-outcome real-outcome]
(+ previous-rating (* k (- real-outcome expected-outcome))))
(defn elo-world-simple
([csv k]
(with-open [r (io/reader csv)]
(->> (csv/read-csv r)
sc/mappify
(sc/cast-with {:winner_sets_won sc/->int
:loser_sets_won sc/->int
:winner_games_won sc/->int
:loser_games_won sc/->int})
(reduce (fn [{:keys [players] :as acc} {:keys [:winner_name :winner_slug
:loser_name :loser_slug] :as match}]
(let [winner-rating (get players winner_slug 400)
loser-rating (get players loser_slug 400)
winner-probability (match-probability winner-rating loser-rating)
loser-probability (- 1 winner-probability)
predictable-match? (not= winner-rating loser-rating)
prediction-correct? (> winner-rating loser-rating)
correct-predictions (if (and predictable-match? prediction-correct?)
(inc (:correct-predictions acc))
(:correct-predictions acc))
predictable-matches (if predictable-match?
(inc (:predictable-match-count acc))
(:predictable-match-count acc))]
(-> acc
(assoc :predictable-match-count predictable-matches)
(assoc :correct-predictions correct-predictions)
(assoc-in [:players winner_slug] (recalculate-rating k winner-rating winner-probability 1))
(assoc-in [:players loser_slug] (recalculate-rating k loser-rating loser-probability 0))
(update :match-count inc))))
{:players {}
:match-count 0
:predictable-match-count 0
:correct-predictions 0})))))
(defn k-adjustment-by-match [k {:keys [winner_sets_won loser_sets_won
winner_games_won loser_games_won]}]
(cond (> (- winner_sets_won loser_sets_won) 1)
(* k 1.2)
(< (- winner_games_won loser_games_won) 3)
(* k 0.6)
::otherwise
k))
(defn k-adjustment-by-ratings [k winner-rating loser-rating]
(cond (= winner-rating loser-rating)
(* k 0.5)
(and (> loser-rating winner-rating)
(> (- loser-rating winner-rating) 100))
(* k 1.1)
(> loser-rating winner-rating)
(* k 1.2)
::otherwise
k))
(defn elo-world-k-adjustment
([csv k k-fn]
(with-open [r (io/reader csv)]
(->> (csv/read-csv r)
sc/mappify
(sc/cast-with {:winner_sets_won sc/->int
:loser_sets_won sc/->int
:winner_games_won sc/->int
:loser_games_won sc/->int})
(reduce (fn [{:keys [players] :as acc} {:keys [:winner_name :winner_slug
:loser_name :loser_slug] :as match}]
(let [winner-rating (get players winner_slug 400)
loser-rating (get players loser_slug 400)
winner-probability (match-probability winner-rating loser-rating)
loser-probability (- 1 winner-probability)
predictable-match? (not= winner-rating loser-rating)
prediction-correct? (> winner-rating loser-rating)
correct-predictions (if (and predictable-match? prediction-correct?)
(inc (:correct-predictions acc))
(:correct-predictions acc))
predictable-matches (if predictable-match?
(inc (:predictable-match-count acc))
(:predictable-match-count acc))
adjusted-k (k-fn k winner-rating loser-rating)]
(-> acc
(assoc :predictable-match-count predictable-matches)
(assoc :correct-predictions correct-predictions)
(assoc-in [:players winner_slug] (recalculate-rating adjusted-k winner-rating winner-probability 1))
(assoc-in [:players loser_slug] (recalculate-rating adjusted-k loser-rating loser-probability 0))
(update :match-count inc))))
{:players {}
:match-count 0
:predictable-match-count 0
:correct-predictions 0})))))
(defn elo-world-k-adjustment-acc-param
([acc csv k k-fn]
(with-open [r (io/reader csv)]
(->> (csv/read-csv r)
sc/mappify
(sc/cast-with {:winner_sets_won sc/->int
:loser_sets_won sc/->int
:winner_games_won sc/->int
:loser_games_won sc/->int})
(reduce (fn [{:keys [players] :as acc} {:keys [:winner_name :winner_slug
:loser_name :loser_slug] :as match}]
(let [winner-rating (get players winner_slug 400)
loser-rating (get players loser_slug 400)
winner-probability (match-probability winner-rating loser-rating)
loser-probability (- 1 winner-probability)
predictable-match? (not= winner-rating loser-rating)
prediction-correct? (> winner-rating loser-rating)
correct-predictions (if (and predictable-match? prediction-correct?)
(inc (:correct-predictions acc))
(:correct-predictions acc))
predictable-matches (if predictable-match?
(inc (:predictable-match-count acc))
(:predictable-match-count acc))
adjusted-k (k-fn k winner-rating loser-rating)]
(-> acc
(assoc :predictable-match-count predictable-matches)
(assoc :correct-predictions correct-predictions)
(assoc-in [:players winner_slug] (recalculate-rating adjusted-k winner-rating winner-probability 1))
(assoc-in [:players loser_slug] (recalculate-rating adjusted-k loser-rating loser-probability 0))
(update :match-count inc))))
acc)))))
(defn multi-elo [csv-list k k-fn]
(reduce (fn [acc csv]
(elo-world-k-adjustment-acc-param acc csv k k-fn))
{:players {}
:match-count 0
:predictable-match-count 0
:correct-predictions 0}
csv-list))
(comment
(def multi-results (multi-elo
                     ["match_scores_1877-1967_unindexed_csv.csv"
                      "match_scores_1968-1990_unindexed_csv.csv"
                      "match_scores_1991-2016_unindexed_csv.csv"]
                     40
                     k-adjustment-by-ratings)))
|
64069b62b6931c84775911c9a576c55790dbc10c3b6754fb7ab04374b49ecf30 | YoshikuniJujo/funpaala | bmi.hs | bmi :: Double -> Double -> Double
bmi h w = w / (h / 100) ^ 2
isObese :: Double -> Double -> Bool
isObese h w = bmi h w >= 25
| null | https://raw.githubusercontent.com/YoshikuniJujo/funpaala/5366130826da0e6b1180992dfff94c4a634cda99/samples/06_type/bmi.hs | haskell | bmi :: Double -> Double -> Double
bmi h w = w / (h / 100) ^ 2
isObese :: Double -> Double -> Bool
isObese h w = bmi h w >= 25
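The BMI formula above (weight in kg divided by height in metres squared, with heights given in cm) is a one-liner in any language; a Python rendering of the same two definitions:

```python
def bmi(height_cm, weight_kg):
    # weight (kg) / height (m)^2, height supplied in centimetres
    return weight_kg / (height_cm / 100) ** 2

def is_obese(height_cm, weight_kg):
    # 25 is the threshold used in the Haskell code above
    return bmi(height_cm, weight_kg) >= 25

# 100 kg at 200 cm is exactly BMI 25, right on the threshold:
assert bmi(200, 100) == 25.0
assert is_obese(200, 100)
```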
| |
ca05588ac1e7eef22b9ebb05cc9409a9c561593b72517bb09fae4365cdd3b9b5 | UU-ComputerScience/uhc | ClassCtxtRed1.hs | {- ----------------------------------------------------------------------------------------
what : correct context reduction, not failing with 'cannot prove' Integral a
expected: ok
---------------------------------------------------------------------------------------- -}
module ClassCtxtRed1 where
import Debug.Trace
data FM a = FM a
lkup :: (Eq a, Show a) => FM a -> a -> a
lkup f a = if a == a then trace (show a {- -} ++ show (4::Int)) a else a
main = return ()
| null | https://raw.githubusercontent.com/UU-ComputerScience/uhc/f2b94a90d26e2093d84044b3832a9a3e3c36b129/EHC/test/regress/99/ClassCtxtRed1.hs | haskell | ----------------------------------------------------------------------------------------
what : correct context reduction, not failing with 'cannot prove' Integral a
expected: ok
----------------------------------------------------------------------------------------
|
module ClassCtxtRed1 where
import Debug.Trace
data FM a = FM a
lkup :: (Eq a, Show a) => FM a -> a -> a
lkup f a = if a == a then trace (show a ++ show (4::Int)) a else a
main = return ()
|
86cf684ca45bfd012cdafc96496f7e7577b26abdba3a6e4886c2243b82f90f6d | racket/racket7 | contract.rkt | #lang racket/base
(require (for-syntax racket/base))
(provide check)
(define-syntax-rule (check who pred arg)
(unless (pred arg)
(raise-argument-error who (as-string pred) arg)))
(define-syntax (as-string stx)
(syntax-case stx ()
[(_ id)
(datum->syntax stx (symbol->string (syntax-e #'id)) stx)]))
| null | https://raw.githubusercontent.com/racket/racket7/5dbb62c6bbec198b4a790f1dc08fef0c45c2e32b/racket/src/expander/common/contract.rkt | racket | #lang racket/base
(require (for-syntax racket/base))
(provide check)
(define-syntax-rule (check who pred arg)
(unless (pred arg)
(raise-argument-error who (as-string pred) arg)))
(define-syntax (as-string stx)
(syntax-case stx ()
[(_ id)
(datum->syntax stx (symbol->string (syntax-e #'id)) stx)]))
| |
5a5a7242ee85e27e2346ecbc34ecead2ce7df4452a1d75b1dd3bb8b3b06293e2 | jiangpengnju/htdp2e | space_invader_game.v0.rkt | The first three lines of this file were inserted by . They record metadata
;; about the language level of this file in a form that our tools can easily process.
#reader(lib "htdp-beginner-reader.ss" "lang")((modname designing_with_itemizations_again) (read-case-sensitive #t) (teachpacks ()) (htdp-settings #(#t constructor repeating-decimal #f #t none #f () #f)))
; Sample Problem: Design a game program using the 2htdp/universe library for
; playing a simple space invader game.
; The player is in control of a tank (a small rectangle) that must defend our
; planet (the bottom of the canvas) from a UFO that descends from the top of
; the canvas to the bottom. In order to stop the UFO from landing, the player
; may fire a single missile (a triangle smaller than the tank) by hitting the
; space bar. In response, the missile emerges from the tank.
; If the UFO collides with the missile, the player wins; otherwise the UFO
; lands and the player loses.
; Here are some details concerning the three game objects and their movements.
;
; First, the tank moves at a constant speed along the bottom of the canvas
; though the player may use the left arrow key and the right arrow key to
; change directions.
; Second, the UFO descends at a constant velocity but makes small random
; jumps to the left or right.
; Third, once fired the missile ascends along a straight vertical line
; at a constant speed at least twice as fast as the UFO descends.
; Finally, the UFO and the missile collide if their reference points are
; close enough, for whatever you think “close enough” means. | null | https://raw.githubusercontent.com/jiangpengnju/htdp2e/d41555519fbb378330f75c88141f72b00a9ab1d3/fixed-size-data/itemizations_and_structures/space_invader_game.v0.rkt | racket | about the language level of this file in a form that our tools can easily process.
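The spec leaves "close enough" open. One concrete reading is a Euclidean distance threshold on the two reference points; the threshold constant below is my own choice, purely for illustration:

```python
import math

CLOSE_ENOUGH = 10.0  # pixels; an arbitrary threshold for this sketch

def collide(ufo_x, ufo_y, missile_x, missile_y):
    # UFO and missile collide when their reference points are within
    # CLOSE_ENOUGH pixels of each other.
    return math.hypot(ufo_x - missile_x, ufo_y - missile_y) <= CLOSE_ENOUGH

assert collide(100, 50, 106, 58)      # distance 10 -> collision
assert not collide(100, 50, 100, 61)  # distance 11 -> miss
```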
playing a simple space invader game.
The palyer is in control of a tank (a small rectangle) that must defend our
planet (the buttom of the canvas) from a UFO that descends from the top of
the canvas to the bottom. In order to stop the UFO from landing, the player
may fire a single missile (a triangle smaller than the tank) by hitting the
space bar. In response, the missile emerges from the tank.
If the UFO collides with the missile, the player wins; otherwise the UFO
lands and the player loses.
though the player may use the left arrow key and the right arrow key to
change directions.
jumps to the left or right.
Third, once fired the missile ascends along a straight vertical line
at a constant speed at least twice as fast as the UFO descends.
Finally, the UFO and the missile collide if their reference points are
close enough, for whatever you think “close enough” means. | The first three lines of this file were inserted by . They record metadata
#reader(lib "htdp-beginner-reader.ss" "lang")((modname designing_with_itemizations_again) (read-case-sensitive #t) (teachpacks ()) (htdp-settings #(#t constructor repeating-decimal #f #t none #f () #f)))
Sample Problem : Design a game program using the 2htdp / universe library for
Here are some details concerning the three game objects and their movements .
First , the tank moves a constant speed along the bottom of the canvas
Second , the UFO descends at a constant velocity but makes small random |
78db294f4e48c51991bb35de8ecf871afd5317794e8584ee6f82b74615a437c4 | ocharles/blog | 2014-03-24-loops.hs | # LANGUAGE GeneralizedNewtypeDeriving #
import Control.Applicative
import Data.IORef
import Control.Monad (join)
import Control.Concurrent.MVar
import Data.Map (Map)
import Data.Traversable (for)
import qualified Data.Map as Map
import Data.Functor.Compose
import Data.Foldable
getAllUsers :: IO [Bool]
getAllUsers = undefined
getUserById = undefined
doSomething = undefined
example1 :: IO ()
example1 = do
userIds <- getAllUsers
users <- for userIds $ \userId -> do
getUserById userId
doSomething users
data ExpandedEntity = ExpandedEntity Bool (Maybe Bool) (Maybe Bool)
getAllEntities :: IO [Bool]
getAllEntities = undefined
getEntityTypeById = undefined
getEntityOwnerById = undefined
entityTypeId :: Bool -> Int
entityTypeId = undefined
entityOwnerId :: Bool -> Int
entityOwnerId = undefined
better :: IO ()
better = do
entities <- getAllEntities
expandedEntities <- for entities $ \entity -> do
entityType <- getEntityTypeById (entityTypeId entity)
entityOwner <- getEntityOwnerById (entityOwnerId entity)
return $ ExpandedEntity entity entityType entityOwner
doSomething expandedEntities
getEntityTypesById = undefined
getEntityOwnersById = undefined
correct = do
entities <- getAllEntities
let entityTypeIds = map entityTypeId entities
entityOwnerIds = map entityOwnerId entities
entityTypes <- getEntityTypesById entityTypeIds
entityOwners <- getEntityOwnersById entityOwnerIds
doSomething $ flip map entities $ \entity ->
ExpandedEntity entity
(entityTypeId entity `lookup` entityTypes)
(entityOwnerId entity `lookup` entityOwners)
data Query k v = Query (IORef (Map k [MVar (Maybe v)]))
(Query keys) @? k = do
result <- newEmptyMVar
modifyIORef' keys (Map.insertWith (++) k [result])
return (takeMVar result)
getEntityById = undefined
ohNo :: IO ()
ohNo = do entity <- getEntityById 1
if entityOwnerId entity `mod` 2 == 0
then do owner <- getEntityOwnerById (entityOwnerId entity)
return (entity, Just owner)
else return (entity, Nothing)
return ()
newtype Querying a = Querying { unQuerying :: Compose IO IO a }
deriving (Functor, Applicative)
(@?!) :: (Ord k, Eq k) => Query k v -> k -> Querying (Maybe v)
(Query keys) @?! k = Querying $ Compose $ do
result <- newEmptyMVar
modifyIORef' keys (Map.insertWith (++) k [result])
return (takeMVar result)
-- withQuery :: (Ord k, Eq k) => ([k] -> IO (Map.Map k v)) -> (Query k v -> Querying a) -> Querying a
withQuery runner k = Querying $ Compose $ do
	-- Create an IORef to keep track of requested keys and result MVars
keysRef <- newIORef Map.empty
	-- Run the first phase of the Querying action
getResponse <- getCompose $ unQuerying (k (Query keysRef))
-- Check which keys were requested and perform a query
keys <- readIORef keysRef
qResults <- runner (Map.keys keys)
-- Populate all MVars with results
flip Map.traverseWithKey keys $ \k mvars ->
for_ mvars $ \mvar ->
putMVar mvar (Map.lookup k qResults)
	-- Return the IO action that reads from the MVar
return getResponse
runQuerying :: Querying a -> IO a
runQuerying (Querying (Compose io)) = join io
getUserAgesById :: [Int] -> IO (Map.Map Int Int)
getUserAgesById keys = do
putStrLn $ "Looking up " ++ show keys
return $ Map.fromList $ [(1, 1), (2, 2)]
example :: IO (Maybe Int)
example = runQuerying $
withQuery getUserAgesById $ \usersAgeById ->
liftA2 (+) <$> (usersAgeById @?! 1) <*> (usersAgeById @?! 2)
| null | https://raw.githubusercontent.com/ocharles/blog/fa8e911d3c03b134eee891d187a1bb574f23a530/code/2014-03-24-loops.hs | haskell | Create a IORef to keep track of requested keys and result MVars
Check which keys were requested and perform a query
Populate all MVars with results | # LANGUAGE GeneralizedNewtypeDeriving #
import Control.Applicative
import Data.IORef
import Control.Monad (join)
import Control.Concurrent.MVar
import Data.Map (Map)
import Data.Traversable (for)
import qualified Data.Map as Map
import Data.Functor.Compose
import Data.Foldable
getAllUsers :: IO [Bool]
getAllUsers = undefined
getUserById = undefined
doSomething = undefined
example1 :: IO ()
example1 = do
userIds <- getAllUsers
users <- for userIds $ \userId -> do
getUserById userId
doSomething users
data ExpandedEntity = ExpandedEntity Bool (Maybe Bool) (Maybe Bool)
getAllEntities :: IO [Bool]
getAllEntities = undefined
getEntityTypeById = undefined
getEntityOwnerById = undefined
entityTypeId :: Bool -> Int
entityTypeId = undefined
entityOwnerId :: Bool -> Int
entityOwnerId = undefined
better :: IO ()
better = do
entities <- getAllEntities
expandedEntities <- for entities $ \entity -> do
entityType <- getEntityTypeById (entityTypeId entity)
entityOwner <- getEntityOwnerById (entityOwnerId entity)
return $ ExpandedEntity entity entityType entityOwner
doSomething expandedEntities
getEntityTypesById = undefined
getEntityOwnersById = undefined
correct = do
entities <- getAllEntities
let entityTypeIds = map entityTypeId entities
entityOwnerIds = map entityOwnerId entities
entityTypes <- getEntityTypesById entityTypeIds
entityOwners <- getEntityOwnersById entityOwnerIds
doSomething $ flip map entities $ \entity ->
ExpandedEntity entity
(entityTypeId entity `lookup` entityTypes)
(entityOwnerId entity `lookup` entityOwners)
data Query k v = Query (IORef (Map k [MVar (Maybe v)]))
(Query keys) @? k = do
result <- newEmptyMVar
modifyIORef' keys (Map.insertWith (++) k [result])
return (takeMVar result)
getEntityById = undefined
ohNo :: IO ()
ohNo = do entity <- getEntityById 1
if entityOwnerId entity `mod` 2 == 0
then do owner <- getEntityOwnerById (entityOwnerId entity)
return (entity, Just owner)
else return (entity, Nothing)
return ()
newtype Querying a = Querying { unQuerying :: Compose IO IO a }
deriving (Functor, Applicative)
(@?!) :: (Ord k, Eq k) => Query k v -> k -> Querying (Maybe v)
(Query keys) @?! k = Querying $ Compose $ do
result <- newEmptyMVar
modifyIORef' keys (Map.insertWith (++) k [result])
return (takeMVar result)
-- withQuery :: (Ord k, Eq k) => ([k] -> IO (Map.Map k v)) -> (Query k v -> Querying a) -> Querying a
withQuery runner k = Querying $ Compose $ do
keysRef <- newIORef Map.empty
	-- Run the first phase of the Querying action
getResponse <- getCompose $ unQuerying (k (Query keysRef))
keys <- readIORef keysRef
qResults <- runner (Map.keys keys)
flip Map.traverseWithKey keys $ \k mvars ->
for_ mvars $ \mvar ->
putMVar mvar (Map.lookup k qResults)
	-- Return the IO action that reads from the MVar
return getResponse
runQuerying :: Querying a -> IO a
runQuerying (Querying (Compose io)) = join io
getUserAgesById :: [Int] -> IO (Map.Map Int Int)
getUserAgesById keys = do
putStrLn $ "Looking up " ++ show keys
return $ Map.fromList $ [(1, 1), (2, 2)]
example :: IO (Maybe Int)
example = runQuerying $
withQuery getUserAgesById $ \usersAgeById ->
liftA2 (+) <$> (usersAgeById @?! 1) <*> (usersAgeById @?! 2)
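The `Querying` applicative above batches per-key lookups into a single query by splitting evaluation into two phases: an outer `IO` that only *registers* keys, and an inner `IO` that reads results after one batched fetch. The same idea can be sketched without any applicative machinery; this is a stripped-down illustration of mine, not the blog post's API:

```python
def run_batched(fetch_many, build):
    """Two-phase evaluation: phase one registers keys via ask(),
    then a single fetch_many(keys) call runs, then phase two reads
    the results out through the thunks ask() returned."""
    wanted, results = [], {}

    def ask(key):
        # Phase one: remember the key, hand back a phase-two reader.
        wanted.append(key)
        return lambda: results.get(key)

    phase_two = build(ask)              # registers every key up front
    results.update(fetch_many(wanted))  # one batched query for all keys
    return phase_two()

def build(ask):
    a, b = ask(1), ask(2)          # both keys requested before any fetch
    return lambda: a() + b()       # combined only after the batch ran

# Stand-in for a single SQL query answering many keys at once:
total = run_batched(lambda keys: {k: k * 10 for k in keys}, build)
# total == 30, and fetch_many saw both keys in one call
```

This mirrors why `Querying` composes with `<*>` but cannot offer a lawful monad: the keys must be knowable before any result is available.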
|
8ea6fe46f9cd74d62423b1caf6b07064cc06a2788f8ad2e3cdf75f0e8902aed8 | teamwalnut/graphql-ppx | log.ml | let log msg = if Ppx_config.verbose_logging () then print_endline msg
let error_log = prerr_endline
| null | https://raw.githubusercontent.com/teamwalnut/graphql-ppx/8276452ebe8d89a748b6b267afc94161650ab620/src/graphql_compiler/log.ml | ocaml | let log msg = if Ppx_config.verbose_logging () then print_endline msg
let error_log = prerr_endline
| |
3ef72419c558593569176a16b23bc8638f524c2dba594b4e3fdbfa90ab7e31f4 | RichiH/git-repair | Git.hs | git repository handling
-
 - This is written to be completely independent of git-annex and should be
- suitable for other uses.
-
- Copyright 2010-2012 Joey Hess <>
-
- Licensed under the GNU GPL version 3 or higher.
-}
{-# LANGUAGE CPP #-}
module Git (
Repo(..),
Ref(..),
fromRef,
Branch,
Sha,
Tag,
repoIsUrl,
repoIsSsh,
repoIsHttp,
repoIsLocal,
repoIsLocalBare,
repoIsLocalUnknown,
repoDescribe,
repoLocation,
repoPath,
localGitDir,
attributes,
hookPath,
assertLocal,
adjustPath,
relPath,
) where
import Network.URI (uriPath, uriScheme, unEscapeString)
#ifndef mingw32_HOST_OS
import System.Posix.Files
#endif
import Common
import Git.Types
#ifndef mingw32_HOST_OS
import Utility.FileMode
#endif
{- User-visible description of a git repo. -}
repoDescribe :: Repo -> String
repoDescribe Repo { remoteName = Just name } = name
repoDescribe Repo { location = Url url } = show url
repoDescribe Repo { location = Local { worktree = Just dir } } = dir
repoDescribe Repo { location = Local { gitdir = dir } } = dir
repoDescribe Repo { location = LocalUnknown dir } = dir
repoDescribe Repo { location = Unknown } = "UNKNOWN"
{- Location of the repo, either as a path or url. -}
repoLocation :: Repo -> String
repoLocation Repo { location = Url url } = show url
repoLocation Repo { location = Local { worktree = Just dir } } = dir
repoLocation Repo { location = Local { gitdir = dir } } = dir
repoLocation Repo { location = LocalUnknown dir } = dir
repoLocation Repo { location = Unknown } = error "unknown repoLocation"
{- Path to a repository. For non-bare, this is the worktree, for bare,
 - it's the gitdir, and for URL repositories, is the path on the remote
 - host. -}
repoPath :: Repo -> FilePath
repoPath Repo { location = Url u } = unEscapeString $ uriPath u
repoPath Repo { location = Local { worktree = Just d } } = d
repoPath Repo { location = Local { gitdir = d } } = d
repoPath Repo { location = LocalUnknown dir } = dir
repoPath Repo { location = Unknown } = error "unknown repoPath"
{- Path to a local repository's .git directory. -}
localGitDir :: Repo -> FilePath
localGitDir Repo { location = Local { gitdir = d } } = d
localGitDir _ = error "unknown localGitDir"
{- Some code needs to vary between URL and normal repos,
- or bare and non-bare, these functions help with that. -}
repoIsUrl :: Repo -> Bool
repoIsUrl Repo { location = Url _ } = True
repoIsUrl _ = False
repoIsSsh :: Repo -> Bool
repoIsSsh Repo { location = Url url }
| scheme == "ssh:" = True
-- git treats these the same as ssh
| scheme == "git+ssh:" = True
| scheme == "ssh+git:" = True
| otherwise = False
where
scheme = uriScheme url
repoIsSsh _ = False
repoIsHttp :: Repo -> Bool
repoIsHttp Repo { location = Url url }
| uriScheme url == "http:" = True
| uriScheme url == "https:" = True
| otherwise = False
repoIsHttp _ = False
repoIsLocal :: Repo -> Bool
repoIsLocal Repo { location = Local { } } = True
repoIsLocal _ = False
repoIsLocalBare :: Repo -> Bool
repoIsLocalBare Repo { location = Local { worktree = Nothing } } = True
repoIsLocalBare _ = False
repoIsLocalUnknown :: Repo -> Bool
repoIsLocalUnknown Repo { location = LocalUnknown { } } = True
repoIsLocalUnknown _ = False
assertLocal :: Repo -> a -> a
assertLocal repo action
| repoIsUrl repo = error $ unwords
[ "acting on non-local git repo"
, repoDescribe repo
, "not supported"
]
| otherwise = action
{- Path to a repository's gitattributes file. -}
attributes :: Repo -> FilePath
attributes repo
| repoIsLocalBare repo = repoPath repo ++ "/info/.gitattributes"
| otherwise = repoPath repo ++ "/.gitattributes"
{- Path to a given hook script in a repository, only if the hook exists
- and is executable. -}
hookPath :: String -> Repo -> IO (Maybe FilePath)
hookPath script repo = do
let hook = localGitDir repo </> "hooks" </> script
ifM (catchBoolIO $ isexecutable hook)
( return $ Just hook , return Nothing )
where
#if mingw32_HOST_OS
isexecutable f = doesFileExist f
#else
isexecutable f = isExecutable . fileMode <$> getFileStatus f
#endif
{- Makes the path to a local Repo be relative to the cwd. -}
relPath :: Repo -> IO Repo
relPath = adjustPath torel
where
torel p = do
p' <- relPathCwdToFile p
if null p'
then return "."
else return p'
{- Adjusts the path to a local Repo using the provided function. -}
adjustPath :: (FilePath -> IO FilePath) -> Repo -> IO Repo
adjustPath f r@(Repo { location = l@(Local { gitdir = d, worktree = w }) }) = do
d' <- f d
w' <- maybe (pure Nothing) (Just <$$> f) w
return $ r
{ location = l
{ gitdir = d'
, worktree = w'
}
}
adjustPath f r@(Repo { location = LocalUnknown d }) = do
d' <- f d
return $ r { location = LocalUnknown d' }
adjustPath _ r = pure r
| null | https://raw.githubusercontent.com/RichiH/git-repair/c61b677e7a67a286df34c0629c52aeae9be9299a/Git.hs | haskell | User-visible description of a git repo.
Location of the repo, either as a path or url.
Path to a local repository's .git directory.
Some code needs to vary between URL and normal repos,
- or bare and non-bare, these functions help with that.
git treats these the same as ssh
Path to a repository's gitattributes file.
Path to a given hook script in a repository, only if the hook exists
- and is executable. | {- git repository handling
-
- This is written to be completely independent of git-annex and should be
- suitable for other uses.
-
- Copyright 2010-2012 Joey Hess <>
-
- Licensed under the GNU GPL version 3 or higher.
-}
{-# LANGUAGE CPP #-}
module Git (
Repo(..),
Ref(..),
fromRef,
Branch,
Sha,
Tag,
repoIsUrl,
repoIsSsh,
repoIsHttp,
repoIsLocal,
repoIsLocalBare,
repoIsLocalUnknown,
repoDescribe,
repoLocation,
repoPath,
localGitDir,
attributes,
hookPath,
assertLocal,
adjustPath,
relPath,
) where
import Network.URI (uriPath, uriScheme, unEscapeString)
#ifndef mingw32_HOST_OS
import System.Posix.Files
#endif
import Common
import Git.Types
#ifndef mingw32_HOST_OS
import Utility.FileMode
#endif
repoDescribe :: Repo -> String
repoDescribe Repo { remoteName = Just name } = name
repoDescribe Repo { location = Url url } = show url
repoDescribe Repo { location = Local { worktree = Just dir } } = dir
repoDescribe Repo { location = Local { gitdir = dir } } = dir
repoDescribe Repo { location = LocalUnknown dir } = dir
repoDescribe Repo { location = Unknown } = "UNKNOWN"
repoLocation :: Repo -> String
repoLocation Repo { location = Url url } = show url
repoLocation Repo { location = Local { worktree = Just dir } } = dir
repoLocation Repo { location = Local { gitdir = dir } } = dir
repoLocation Repo { location = LocalUnknown dir } = dir
repoLocation Repo { location = Unknown } = error "unknown repoLocation"
{- Path to a repository. For non-bare, this is the worktree, for bare,
- it's the gitdir, and for URL repositories, is the path on the remote
- host. -}
repoPath :: Repo -> FilePath
repoPath Repo { location = Url u } = unEscapeString $ uriPath u
repoPath Repo { location = Local { worktree = Just d } } = d
repoPath Repo { location = Local { gitdir = d } } = d
repoPath Repo { location = LocalUnknown dir } = dir
repoPath Repo { location = Unknown } = error "unknown repoPath"
localGitDir :: Repo -> FilePath
localGitDir Repo { location = Local { gitdir = d } } = d
localGitDir _ = error "unknown localGitDir"
repoIsUrl :: Repo -> Bool
repoIsUrl Repo { location = Url _ } = True
repoIsUrl _ = False
repoIsSsh :: Repo -> Bool
repoIsSsh Repo { location = Url url }
| scheme == "ssh:" = True
| scheme == "git+ssh:" = True
| scheme == "ssh+git:" = True
| otherwise = False
where
scheme = uriScheme url
repoIsSsh _ = False
repoIsHttp :: Repo -> Bool
repoIsHttp Repo { location = Url url }
| uriScheme url == "http:" = True
| uriScheme url == "https:" = True
| otherwise = False
repoIsHttp _ = False
repoIsLocal :: Repo -> Bool
repoIsLocal Repo { location = Local { } } = True
repoIsLocal _ = False
repoIsLocalBare :: Repo -> Bool
repoIsLocalBare Repo { location = Local { worktree = Nothing } } = True
repoIsLocalBare _ = False
repoIsLocalUnknown :: Repo -> Bool
repoIsLocalUnknown Repo { location = LocalUnknown { } } = True
repoIsLocalUnknown _ = False
assertLocal :: Repo -> a -> a
assertLocal repo action
| repoIsUrl repo = error $ unwords
[ "acting on non-local git repo"
, repoDescribe repo
, "not supported"
]
| otherwise = action
attributes :: Repo -> FilePath
attributes repo
| repoIsLocalBare repo = repoPath repo ++ "/info/.gitattributes"
| otherwise = repoPath repo ++ "/.gitattributes"
hookPath :: String -> Repo -> IO (Maybe FilePath)
hookPath script repo = do
let hook = localGitDir repo </> "hooks" </> script
ifM (catchBoolIO $ isexecutable hook)
( return $ Just hook , return Nothing )
where
#if mingw32_HOST_OS
isexecutable f = doesFileExist f
#else
isexecutable f = isExecutable . fileMode <$> getFileStatus f
#endif
{- Makes the path to a local Repo be relative to the cwd. -}
relPath :: Repo -> IO Repo
relPath = adjustPath torel
where
torel p = do
p' <- relPathCwdToFile p
if null p'
then return "."
else return p'
{- Adjusts the path to a local Repo using the provided function. -}
adjustPath :: (FilePath -> IO FilePath) -> Repo -> IO Repo
adjustPath f r@(Repo { location = l@(Local { gitdir = d, worktree = w }) }) = do
d' <- f d
w' <- maybe (pure Nothing) (Just <$$> f) w
return $ r
{ location = l
{ gitdir = d'
, worktree = w'
}
}
adjustPath f r@(Repo { location = LocalUnknown d }) = do
d' <- f d
return $ r { location = LocalUnknown d' }
adjustPath _ r = pure r
|
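The Git.hs file above classifies a repo by its URI scheme: `ssh:`, `git+ssh:` and `ssh+git:` all count as ssh transports, `http:`/`https:` as http, and a scheme-less path as local. A minimal Python sketch of the same scheme test (the helper name and return labels are illustrative, not from the file):

```python
from urllib.parse import urlparse

# Schemes that git treats as ssh transports, mirroring repoIsSsh above.
# Note: Haskell's uriScheme keeps the trailing ':' ("ssh:"); urlparse does not.
SSH_SCHEMES = {"ssh", "git+ssh", "ssh+git"}
HTTP_SCHEMES = {"http", "https"}

def classify_remote(url):
    """Classify a git remote URL roughly the way repoIsSsh/repoIsHttp/repoIsLocal do."""
    scheme = urlparse(url).scheme
    if scheme in SSH_SCHEMES:
        return "ssh"
    if scheme in HTTP_SCHEMES:
        return "http"
    if scheme == "":
        return "local"  # a plain filesystem path has no scheme
    return "other"
```

scp-style remotes such as `git@host:repo.git` are not URIs and would need a separate check before this one.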
5842b8ca9a417abe81fd2b869d151e7dcb28f67a89b58533cece0090f460d1c9 | well-typed/large-records | Rewriter.hs | {-# LANGUAGE LambdaCase #-}
{-# LANGUAGE RecordWildCards #-}
{-# LANGUAGE ViewPatterns #-}
{-# LANGUAGE NamedFieldPuns #-}
module Data.Record.Anon.Internal.Plugin.TC.Rewriter (rewrite) where
import Data.Record.Anon.Internal.Plugin.TC.Row.KnownRow (KnownRow)
import Data.Record.Anon.Internal.Plugin.TC.Row.ParsedRow (Fields)
import Data.Record.Anon.Internal.Plugin.TC.GhcTcPluginAPI
import Data.Record.Anon.Internal.Plugin.TC.NameResolution
import Data.Record.Anon.Internal.Plugin.TC.TyConSubst
import qualified Data.Record.Anon.Internal.Plugin.TC.Row.KnownField as KnownField
import qualified Data.Record.Anon.Internal.Plugin.TC.Row.KnownRow as KnownRow
import qualified Data.Record.Anon.Internal.Plugin.TC.Row.ParsedRow as ParsedRow
rewrite :: ResolvedNames -> UniqFM TyCon TcPluginRewriter
rewrite rn@ResolvedNames{..} = listToUFM [
(tyConFieldTypes , rewriteRecordMetadataOf tyConFieldTypes rn)
, (tyConSimpleFieldTypes , rewriteRecordMetadataOf tyConSimpleFieldTypes rn)
]
data Args = Args {
-- | Functor argument, if any
argsFunctor :: Maybe Type
-- | Parsed fields
, argsParsedFields :: Maybe Fields
-- | Known record, if all fields are known
, argsParsedKnown :: Maybe (KnownRow Type)
}
mkArgs :: TyConSubst -> ResolvedNames -> Maybe Type -> Type -> Args
mkArgs tcs rn argsFunctor r = Args{..}
where
argsParsedFields = ParsedRow.parseFields tcs rn r
argsParsedKnown = ParsedRow.allKnown =<< argsParsedFields
parseArgs :: [Ct] -> ResolvedNames -> [Type] -> Args
parseArgs given rn = \case
[_k, f, r] -> mkArgs tcs rn (Just f) r
[ r] -> mkArgs tcs rn Nothing r
args -> panic $ concat [
"Data.Record.Anon.Plugin.Rewriter.parseArgs: "
, "unexpected arguments: "
, showSDocUnsafe (ppr args)
]
where
tcs :: TyConSubst
tcs = mkTyConSubst given
rewriteRecordMetadataOf :: TyCon -> ResolvedNames -> TcPluginRewriter
rewriteRecordMetadataOf fun rn given args@(parseArgs given rn -> Args{..}) =
-- trace _debugInput $
-- trace _debugParsed $
case argsParsedKnown of
Nothing ->
return TcPluginNoRewrite
Just knownFields ->
return TcPluginRewriteTo {
tcRewriterNewWanteds = []
, tcPluginReduction =
mkTyFamAppReduction
"large-anon"
Nominal
fun
args
(computeMetadataOf argsFunctor knownFields)
}
where
_debugInput :: String
_debugInput = unlines [
"*** input"
, concat [
"given:"
, showSDocUnsafe (ppr given)
]
, concat [
"args: "
, showSDocUnsafe (ppr args)
]
]
_debugParsed :: String
_debugParsed = unlines [
"*** parsed"
, concat [
"parsedFields: "
, showSDocUnsafe (ppr argsParsedFields)
]
, concat [
"mKnownFields: "
, showSDocUnsafe (ppr argsParsedKnown)
]
]
computeMetadataOf :: Maybe Type -> KnownRow Type -> TcType
computeMetadataOf mf r =
mkPromotedListTy
(mkTupleTy Boxed [mkTyConTy typeSymbolKindCon, liftedTypeKind])
(map (KnownField.toType mf) $ KnownRow.toList r)
| null | https://raw.githubusercontent.com/well-typed/large-records/fb983aa136c2602499c2421323bd52b6a54b7c9a/large-anon/src/Data/Record/Anon/Internal/Plugin/TC/Rewriter.hs | haskell | # LANGUAGE LambdaCase #
# LANGUAGE ViewPatterns #
| Functor argument, if any
| Known record, if all fields are known
trace _debugInput $
trace _debugParsed $ | {-# LANGUAGE RecordWildCards #-}
{-# LANGUAGE NamedFieldPuns #-}
module Data.Record.Anon.Internal.Plugin.TC.Rewriter (rewrite) where
import Data.Record.Anon.Internal.Plugin.TC.Row.KnownRow (KnownRow)
import Data.Record.Anon.Internal.Plugin.TC.Row.ParsedRow (Fields)
import Data.Record.Anon.Internal.Plugin.TC.GhcTcPluginAPI
import Data.Record.Anon.Internal.Plugin.TC.NameResolution
import Data.Record.Anon.Internal.Plugin.TC.TyConSubst
import qualified Data.Record.Anon.Internal.Plugin.TC.Row.KnownField as KnownField
import qualified Data.Record.Anon.Internal.Plugin.TC.Row.KnownRow as KnownRow
import qualified Data.Record.Anon.Internal.Plugin.TC.Row.ParsedRow as ParsedRow
rewrite :: ResolvedNames -> UniqFM TyCon TcPluginRewriter
rewrite rn@ResolvedNames{..} = listToUFM [
(tyConFieldTypes , rewriteRecordMetadataOf tyConFieldTypes rn)
, (tyConSimpleFieldTypes , rewriteRecordMetadataOf tyConSimpleFieldTypes rn)
]
data Args = Args {
argsFunctor :: Maybe Type
, argsParsedFields :: Maybe Fields
, argsParsedKnown :: Maybe (KnownRow Type)
}
mkArgs :: TyConSubst -> ResolvedNames -> Maybe Type -> Type -> Args
mkArgs tcs rn argsFunctor r = Args{..}
where
argsParsedFields = ParsedRow.parseFields tcs rn r
argsParsedKnown = ParsedRow.allKnown =<< argsParsedFields
parseArgs :: [Ct] -> ResolvedNames -> [Type] -> Args
parseArgs given rn = \case
[_k, f, r] -> mkArgs tcs rn (Just f) r
[ r] -> mkArgs tcs rn Nothing r
args -> panic $ concat [
"Data.Record.Anon.Plugin.Rewriter.parseArgs: "
, "unexpected arguments: "
, showSDocUnsafe (ppr args)
]
where
tcs :: TyConSubst
tcs = mkTyConSubst given
rewriteRecordMetadataOf :: TyCon -> ResolvedNames -> TcPluginRewriter
rewriteRecordMetadataOf fun rn given args@(parseArgs given rn -> Args{..}) =
case argsParsedKnown of
Nothing ->
return TcPluginNoRewrite
Just knownFields ->
return TcPluginRewriteTo {
tcRewriterNewWanteds = []
, tcPluginReduction =
mkTyFamAppReduction
"large-anon"
Nominal
fun
args
(computeMetadataOf argsFunctor knownFields)
}
where
_debugInput :: String
_debugInput = unlines [
"*** input"
, concat [
"given:"
, showSDocUnsafe (ppr given)
]
, concat [
"args: "
, showSDocUnsafe (ppr args)
]
]
_debugParsed :: String
_debugParsed = unlines [
"*** parsed"
, concat [
"parsedFields: "
, showSDocUnsafe (ppr argsParsedFields)
]
, concat [
"mKnownFields: "
, showSDocUnsafe (ppr argsParsedKnown)
]
]
computeMetadataOf :: Maybe Type -> KnownRow Type -> TcType
computeMetadataOf mf r =
mkPromotedListTy
(mkTupleTy Boxed [mkTyConTy typeSymbolKindCon, liftedTypeKind])
(map (KnownField.toType mf) $ KnownRow.toList r)
|
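The Rewriter.hs plugin above only fires when every field of the row is statically known (`argsParsedKnown` is `Just`); otherwise it returns `TcPluginNoRewrite`. A small Python sketch of that all-or-nothing guard (names are illustrative; `None` stands in for an unsolved type variable):

```python
def all_known(fields):
    """Mirror ParsedRow.allKnown: return the field list only if no entry is
    still unresolved; a single None (an unsolved type variable) spoils it."""
    out = []
    for field in fields:
        if field is None:
            return None
        out.append(field)
    return out

def rewrite_metadata(fields):
    """Mirror rewriteRecordMetadataOf's guard: None means 'no rewrite',
    a list means the reduced, fully-known metadata."""
    known = all_known(fields)
    if known is None:
        return None  # corresponds to TcPluginNoRewrite
    return [(name, ty) for name, ty in known]
```

Refusing to rewrite on partial information is what lets the constraint solver retry later, once more type variables have been solved.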
659219fedaf62b3d03f7151fe98abb71e848635262da9ef735faed298929df4e | w3ntao/sicp-solution | circuit.rkt | #lang racket
(require (combine-in (only-in "../AbstractionOfData/table.rkt"
make-table
insert!
lookup)
(only-in "wire.rkt"
make-wire
set-signal!
get-signal
add-action!)
(only-in "gate.rkt"
full-adder)
(only-in "agenda.rkt"
make-agenda
current-time
first-agenda-item
empty-agenda?
remove-first-agenda-item!)))
(define (ripple-carry-adder circuit-A circuit-B circuit-S agenda)
(define (A n)
(get-wire n circuit-A))
(define (B n)
(get-wire n circuit-B))
(define (S n)
(get-wire n circuit-S))
(define (ripple-carry-adder-unit n C-in)
(cond ((< n 0)
C-in)
(else
(let ((C-out (make-wire)))
(begin (full-adder (A n)
(B n)
C-in
(S n)
C-out
agenda)
(ripple-carry-adder-unit (- n 1)
C-out))))))
(define (same-number? a b c)
(and (= a b)
(= a c)
(= b c)))
(let ((C-0 (make-wire)))
(set-signal! C-0 0)
(cond ((same-number? (bit-length circuit-A)
(bit-length circuit-B)
(bit-length circuit-S))
(ripple-carry-adder-unit (- (bit-length circuit-S)
1)
C-0))
(else
(error "bit length mismatched -- RIPPLE-CARRY-ADDER")))))
; build n bits circuit
(define (make-circuit num)
(let ((circuit (make-table)))
(define (legal-range? n)
(and (< n num)
(> n -1)))
(define (put-wire n)
(begin
(insert! n
(make-wire)
circuit)
(when (> n 0)
(put-wire (- n 1)))))
(define (set-my-bit! n new-value)
(cond ((legal-range? n)
(set-signal! (lookup n circuit)
new-value))
(else
(error "Invalid bit -- SET-MY-BITS!" n))))
(define (get-my-bit n)
(cond ((legal-range? n)
(get-signal (lookup n
circuit)))
(else
(error "Invalid bit -- GET-MY-BIT" n))))
(define (get-my-wire n)
(cond ((legal-range? n)
(lookup n
circuit))
(else
(error "Invalid wire -- GET-MY-WIRE" n))))
(define (display-my-circuit)
(define (display-circuit-unit n)
(display (get-my-bit n))
(display " ")
(when (< n
(- num 1))
(display-circuit-unit (+ n 1))))
(display-circuit-unit 0)
(newline))
(define (dispatch m)
(cond ((eq? m 'get-bit) get-my-bit)
((eq? m 'set-bit!) set-my-bit!)
((eq? m 'get-wire) get-my-wire)
((eq? m 'display-circuit) display-my-circuit)
((eq? m 'bit-length) num)
(else
(error "Unknown operation -- CIRCUIT" m))))
(cond ((> num 0)
(begin
(put-wire (- num 1))
dispatch))
(else
(error "Invalid bit -- MAKE-CIRCUIT" num)))))
(define (get-bit n circuit)
((circuit 'get-bit) n))
(define (set-bit! n new-value circuit)
((circuit 'set-bit!) n new-value))
(define (get-wire n circuit)
((circuit 'get-wire) n))
(define (display-circuit circuit)
((circuit 'display-circuit)))
(define (bit-length circuit)
(circuit 'bit-length))
(define (propagate agenda)
(if (empty-agenda? agenda)
'done
(let ((first-item (first-agenda-item agenda)))
(first-item)
(remove-first-agenda-item! agenda)
(propagate agenda))))
(define (probe name wire agenda)
(add-action! wire
(lambda ()
(newline)
(display name)
(display " ")
(display (current-time agenda))
(display " New-value = ")
(display (get-signal wire)))))
(define (circuit-probe name circuit agenda)
(define (wire-probe name wire)
(add-action! wire
(lambda ()
(display name)
(display " ")
(display (current-time agenda))
(newline)
(display-circuit circuit))))
(define (circuit-probe-unit n)
(wire-probe name
(get-wire n circuit))
(when (< n
(- (bit-length circuit)
1))
(circuit-probe-unit (+ n 1))))
(circuit-probe-unit 0))
(define the-agenda (make-agenda))
(define circuit-a (make-circuit 3))
(define circuit-b (make-circuit 3))
(define circuit-s (make-circuit 3))
(circuit-probe 'circuit-s circuit-s the-agenda)
(ripple-carry-adder circuit-a circuit-b circuit-s the-agenda)
(set-bit! 0 1 circuit-a)
(set-bit! 1 1 circuit-a)
;(set-bit! 1 1 circuit-b)
(set-bit! 2 1 circuit-b)
(propagate the-agenda)
(newline)
(display "circuit s")
(newline)
(display-circuit circuit-s)
(newline)
| null | https://raw.githubusercontent.com/w3ntao/sicp-solution/00be3a7b4da50bb266f8a2db521a24e9f8c156be/ToolBox/IntegratedCircuit/circuit.rkt | racket | build n bits circuit | #lang racket
(require (combine-in (only-in "../AbstractionOfData/table.rkt"
make-table
insert!
lookup)
(only-in "wire.rkt"
make-wire
set-signal!
get-signal
add-action!)
(only-in "gate.rkt"
full-adder)
(only-in "agenda.rkt"
make-agenda
current-time
first-agenda-item
empty-agenda?
remove-first-agenda-item!)))
(define (ripple-carry-adder circuit-A circuit-B circuit-S agenda)
(define (A n)
(get-wire n circuit-A))
(define (B n)
(get-wire n circuit-B))
(define (S n)
(get-wire n circuit-S))
(define (ripple-carry-adder-unit n C-in)
(cond ((< n 0)
C-in)
(else
(let ((C-out (make-wire)))
(begin (full-adder (A n)
(B n)
C-in
(S n)
C-out
agenda)
(ripple-carry-adder-unit (- n 1)
C-out))))))
(define (same-number? a b c)
(and (= a b)
(= a c)
(= b c)))
(let ((C-0 (make-wire)))
(set-signal! C-0 0)
(cond ((same-number? (bit-length circuit-A)
(bit-length circuit-B)
(bit-length circuit-S))
(ripple-carry-adder-unit (- (bit-length circuit-S)
1)
C-0))
(else
(error "bit length mismatched -- RIPPLE-CARRY-ADDER")))))
(define (make-circuit num)
(let ((circuit (make-table)))
(define (legal-range? n)
(and (< n num)
(> n -1)))
(define (put-wire n)
(begin
(insert! n
(make-wire)
circuit)
(when (> n 0)
(put-wire (- n 1)))))
(define (set-my-bit! n new-value)
(cond ((legal-range? n)
(set-signal! (lookup n circuit)
new-value))
(else
(error "Invalid bit -- SET-MY-BITS!" n))))
(define (get-my-bit n)
(cond ((legal-range? n)
(get-signal (lookup n
circuit)))
(else
(error "Invalid bit -- GET-MY-BIT" n))))
(define (get-my-wire n)
(cond ((legal-range? n)
(lookup n
circuit))
(else
(error "Invalid wire -- GET-MY-WIRE" n))))
(define (display-my-circuit)
(define (display-circuit-unit n)
(display (get-my-bit n))
(display " ")
(when (< n
(- num 1))
(display-circuit-unit (+ n 1))))
(display-circuit-unit 0)
(newline))
(define (dispatch m)
(cond ((eq? m 'get-bit) get-my-bit)
((eq? m 'set-bit!) set-my-bit!)
((eq? m 'get-wire) get-my-wire)
((eq? m 'display-circuit) display-my-circuit)
((eq? m 'bit-length) num)
(else
(error "Unknown operation -- CIRCUIT" m))))
(cond ((> num 0)
(begin
(put-wire (- num 1))
dispatch))
(else
(error "Invalid bit -- MAKE-CIRCUIT" num)))))
(define (get-bit n circuit)
((circuit 'get-bit) n))
(define (set-bit! n new-value circuit)
((circuit 'set-bit!) n new-value))
(define (get-wire n circuit)
((circuit 'get-wire) n))
(define (display-circuit circuit)
((circuit 'display-circuit)))
(define (bit-length circuit)
(circuit 'bit-length))
(define (propagate agenda)
(if (empty-agenda? agenda)
'done
(let ((first-item (first-agenda-item agenda)))
(first-item)
(remove-first-agenda-item! agenda)
(propagate agenda))))
(define (probe name wire agenda)
(add-action! wire
(lambda ()
(newline)
(display name)
(display " ")
(display (current-time agenda))
(display " New-value = ")
(display (get-signal wire)))))
(define (circuit-probe name circuit agenda)
(define (wire-probe name wire)
(add-action! wire
(lambda ()
(display name)
(display " ")
(display (current-time agenda))
(newline)
(display-circuit circuit))))
(define (circuit-probe-unit n)
(wire-probe name
(get-wire n circuit))
(when (< n
(- (bit-length circuit)
1))
(circuit-probe-unit (+ n 1))))
(circuit-probe-unit 0))
(define the-agenda (make-agenda))
(define circuit-a (make-circuit 3))
(define circuit-b (make-circuit 3))
(define circuit-s (make-circuit 3))
(circuit-probe 'circuit-s circuit-s the-agenda)
(ripple-carry-adder circuit-a circuit-b circuit-s the-agenda)
(set-bit! 0 1 circuit-a)
(set-bit! 1 1 circuit-a)
;(set-bit! 1 1 circuit-b)
(set-bit! 2 1 circuit-b)
(propagate the-agenda)
(newline)
(display "circuit s")
(newline)
(display-circuit circuit-s)
(newline)
|
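The circuit.rkt file above chains `full-adder` units into a ripple-carry adder over equal-width bit circuits. A compact Python sketch of the same construction (here index 0 is taken as the least-significant bit, whereas the Racket version feeds the initial carry into the highest index; names are illustrative):

```python
def full_adder(a, b, c_in):
    """One-bit full adder: returns (sum_bit, carry_out)."""
    s = a ^ b ^ c_in
    c_out = (a & b) | (a & c_in) | (b & c_in)
    return s, c_out

def ripple_carry_add(a_bits, b_bits):
    """Chain full adders over two equal-width bit lists; the carry out of the
    last stage is dropped, matching the fixed-width circuit above."""
    if len(a_bits) != len(b_bits):
        raise ValueError("bit length mismatched -- RIPPLE-CARRY-ADDER")
    carry = 0
    s_bits = []
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)
        s_bits.append(s)
    return s_bits
```

Like the Racket circuit, a result wider than the fixed bit length simply loses its final carry.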
44a71c3d1ca8e16434187b38af26bafb2c1717d29504b01e5c3e5b4f7d0d2f5a | Incanus3/ExiL | backward.lisp | (in-package :integration-tests)
(declaim (optimize (compilation-speed 0) (debug 3) (space 0) (speed 0)))
(defclass backward-integration-tests (integration-tests) ())
(defmethod set-up ((tests backward-integration-tests))
(call-next-method)
(unwatch all))
(def-test-method test-fact-matching ((tests backward-integration-tests) :run nil)
(deffacts world
(in box hall)
(color box blue)
(size box big))
(reset)
(defgoal (in ?object hall))
(defgoal (color ?object blue))
(defgoal (size ?object big))
(back-run)
(with-slots (env) tests
(assert-false (eenv::goals env))
(assert-equal (eenv::used-substitutions env) '((?object . box)))))
(def-test-method test-fact-matching-with-backtracking
((tests backward-integration-tests) :run nil)
(deffacts world
(in box1 hall)
(color box1 green)
(in box2 hall)
(color box2 blue)
(size box2 small)
(in box3 hall)
(color box3 blue)
(size box3 big))
(reset)
(defgoal (in ?object hall))
(defgoal (color ?object blue))
(defgoal (size ?object big))
(back-run)
(with-slots (env) tests
(assert-false (eenv::goals env))
(assert-equal (eenv::used-substitutions env) '((?object . box3)))))
(def-test-method test-fact-matching-with-alternative-answers
((tests backward-integration-tests) :run nil)
(deffacts world
(in box1 hall)
(color box1 blue)
(in box2 hall)
(color box2 red)
(size box2 big)
(in box3 hall)
(color box3 red)
(size box3 small)
(in box4 hall)
(color box4 red)
(size box4 big))
(reset)
(defgoal (in ?object hall))
(defgoal (color ?object red))
(defgoal (size ?object big))
(back-run)
(with-slots (env) tests
(assert-false (eenv::goals env))
(assert-equal (eenv::used-substitutions env) '((?object . box2)))
(back-run)
(assert-false (eenv::goals env))
(assert-equal (eenv::used-substitutions env) '((?object . box4)))))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(def-test-method test-rule-matching
((tests backward-integration-tests) :run nil)
(deffacts world
(female jane)
(parent-of jane george))
(defrule mother
(female ?mother)
(parent-of ?mother ?child)
=>
(assert (mother-of ?mother ?child)))
(reset)
(defgoal (mother-of ?mother-of-george george))
(back-run)
(with-slots (env) tests
(assert-false (eenv::goals env))
(assert-equal (eenv::used-substitutions env) '((?mother-of-george . jane)))))
(def-test-method test-rule-matching-with-backtracking
((tests backward-integration-tests) :run nil)
(deffacts world
(female jane)
(parent-of jane george))
(defrule mother-is-daughter-of-grandpa
(grandpa-of ?grandpa ?child)
(daughter-of ?mother ?grandpa)
=>
(assert (mother-of ?mother ?child)))
(defrule mother-is-female-parent
(female ?mother)
(parent-of ?mother ?child)
=>
(assert (mother-of ?mother ?child)))
(reset)
(defgoal (mother-of ?mother-of-george george))
(back-run)
(with-slots (env) tests
(assert-false (eenv::goals env))
(assert-equal (eenv::used-substitutions env) '((?mother-of-george . jane)))))
(def-test-method test-rule-matching-with-backtracking-inverted-rules
((tests backward-integration-tests) :run nil)
(deffacts world
(female jane)
(parent-of jane george))
(defrule mother-is-female-parent
(female ?mother)
(parent-of ?mother ?child)
=>
(assert (mother-of ?mother ?child)))
(defrule mother-is-daughter-of-grandpa
(grandpa-of ?grandpa ?child)
(daughter-of ?mother ?grandpa)
=>
(assert (mother-of ?mother ?child)))
(reset)
(defgoal (mother-of ?mother-of-george george))
(back-run)
(with-slots (env) tests
(assert-false (eenv::goals env))
(assert-equal (eenv::used-substitutions env) '((?mother-of-george . jane)))))
(def-test-method test-rule-matching-with-alternative-answers
((tests backward-integration-tests) :run nil)
(deffacts world
(grandpa-of joseph george)
(daughter-of jane joseph)
(female jane)
(parent-of jane george))
(defrule mother-is-daughter-of-grandpa
(grandpa-of ?grandpa ?child)
(daughter-of ?mother ?grandpa)
=>
(assert (mother-of ?mother ?child)))
(defrule mother-is-female-parent
(female ?mother)
(parent-of ?mother ?child)
=>
(assert (mother-of ?mother ?child)))
(reset)
(defgoal (mother-of ?mother-of-george george))
(with-slots (env) tests
(back-run)
(assert-false (eenv::goals env))
(assert-equal (eenv::all-used-substitutions env)
'((?mother-of-george . jane)
(?child . george)
(?grandpa . joseph)))
(back-run)
(assert-false (eenv::goals env))
(assert-equal (eenv::all-used-substitutions env)
'((?mother-of-george . jane)
(?child . george)))))
(def-test-method test-backward-chaining-overall
((tests backward-integration-tests) :run nil)
(deffacts world
(in box1 hall)
(color box1 blue)
(in box2 hall)
(color box2 red)
(size box2 big)
(in box3 hall)
(color box3 red)
(size box3 small)
(in box4 hall)
(color box4 red)
(size box4 big))
;; this actually adds conditions to goals again, but since only
;; assertions are allowed in backward-chaining rules' activations
;; this will never cycle
(defrule grow-box
(in ?box hall)
(color ?box red)
(size ?box small)
=>
(assert (size ?box big)))
(reset)
(defgoal (in ?object hall))
(defgoal (color ?object red))
(defgoal (size ?object big))
(back-run)
(with-slots (env) tests
(assert-false (eenv::goals env))
(assert-equal (eenv::used-substitutions env) '((?object . box2)))
(back-run)
(assert-false (eenv::goals env))
(assert-equal (eenv::used-substitutions env) '((?object . box3)))
(back-run)
(assert-false (eenv::goals env))
(assert-equal (eenv::used-substitutions env) '((?object . box4)))))
(add-test-suite 'backward-integration-tests)
;;(textui-test-run (get-suite backward-integration-tests))
| null | https://raw.githubusercontent.com/Incanus3/ExiL/de0f7c37538cecb7032cc1f2aa070524b0bc048d/src/tests/integration/backward.lisp | lisp |
this actually adds conditions to goals again, but since only
assertions are allowed in backward-chaining rules' activations
this will never cycle
(textui-test-run (get-suite backward-integration-tests)) | (in-package :integration-tests)
(declaim (optimize (compilation-speed 0) (debug 3) (space 0) (speed 0)))
(defclass backward-integration-tests (integration-tests) ())
(defmethod set-up ((tests backward-integration-tests))
(call-next-method)
(unwatch all))
(def-test-method test-fact-matching ((tests backward-integration-tests) :run nil)
(deffacts world
(in box hall)
(color box blue)
(size box big))
(reset)
(defgoal (in ?object hall))
(defgoal (color ?object blue))
(defgoal (size ?object big))
(back-run)
(with-slots (env) tests
(assert-false (eenv::goals env))
(assert-equal (eenv::used-substitutions env) '((?object . box)))))
(def-test-method test-fact-matching-with-backtracking
((tests backward-integration-tests) :run nil)
(deffacts world
(in box1 hall)
(color box1 green)
(in box2 hall)
(color box2 blue)
(size box2 small)
(in box3 hall)
(color box3 blue)
(size box3 big))
(reset)
(defgoal (in ?object hall))
(defgoal (color ?object blue))
(defgoal (size ?object big))
(back-run)
(with-slots (env) tests
(assert-false (eenv::goals env))
(assert-equal (eenv::used-substitutions env) '((?object . box3)))))
(def-test-method test-fact-matching-with-alternative-answers
((tests backward-integration-tests) :run nil)
(deffacts world
(in box1 hall)
(color box1 blue)
(in box2 hall)
(color box2 red)
(size box2 big)
(in box3 hall)
(color box3 red)
(size box3 small)
(in box4 hall)
(color box4 red)
(size box4 big))
(reset)
(defgoal (in ?object hall))
(defgoal (color ?object red))
(defgoal (size ?object big))
(back-run)
(with-slots (env) tests
(assert-false (eenv::goals env))
(assert-equal (eenv::used-substitutions env) '((?object . box2)))
(back-run)
(assert-false (eenv::goals env))
(assert-equal (eenv::used-substitutions env) '((?object . box4)))))
(def-test-method test-rule-matching
((tests backward-integration-tests) :run nil)
(deffacts world
(female jane)
(parent-of jane george))
(defrule mother
(female ?mother)
(parent-of ?mother ?child)
=>
(assert (mother-of ?mother ?child)))
(reset)
(defgoal (mother-of ?mother-of-george george))
(back-run)
(with-slots (env) tests
(assert-false (eenv::goals env))
(assert-equal (eenv::used-substitutions env) '((?mother-of-george . jane)))))
(def-test-method test-rule-matching-with-backtracking
((tests backward-integration-tests) :run nil)
(deffacts world
(female jane)
(parent-of jane george))
(defrule mother-is-daughter-of-grandpa
(grandpa-of ?grandpa ?child)
(daughter-of ?mother ?grandpa)
=>
(assert (mother-of ?mother ?child)))
(defrule mother-is-female-parent
(female ?mother)
(parent-of ?mother ?child)
=>
(assert (mother-of ?mother ?child)))
(reset)
(defgoal (mother-of ?mother-of-george george))
(back-run)
(with-slots (env) tests
(assert-false (eenv::goals env))
(assert-equal (eenv::used-substitutions env) '((?mother-of-george . jane)))))
(def-test-method test-rule-matching-with-backtracking-inverted-rules
((tests backward-integration-tests) :run nil)
(deffacts world
(female jane)
(parent-of jane george))
(defrule mother-is-female-parent
(female ?mother)
(parent-of ?mother ?child)
=>
(assert (mother-of ?mother ?child)))
(defrule mother-is-daughter-of-grandpa
(grandpa-of ?grandpa ?child)
(daughter-of ?mother ?grandpa)
=>
(assert (mother-of ?mother ?child)))
(reset)
(defgoal (mother-of ?mother-of-george george))
(back-run)
(with-slots (env) tests
(assert-false (eenv::goals env))
(assert-equal (eenv::used-substitutions env) '((?mother-of-george . jane)))))
(def-test-method test-rule-matching-with-alternative-answers
((tests backward-integration-tests) :run nil)
(deffacts world
(grandpa-of joseph george)
(daughter-of jane joseph)
(female jane)
(parent-of jane george))
(defrule mother-is-daughter-of-grandpa
(grandpa-of ?grandpa ?child)
(daughter-of ?mother ?grandpa)
=>
(assert (mother-of ?mother ?child)))
(defrule mother-is-female-parent
(female ?mother)
(parent-of ?mother ?child)
=>
(assert (mother-of ?mother ?child)))
(reset)
(defgoal (mother-of ?mother-of-george george))
(with-slots (env) tests
(back-run)
(assert-false (eenv::goals env))
(assert-equal (eenv::all-used-substitutions env)
'((?mother-of-george . jane)
(?child . george)
(?grandpa . joseph)))
(back-run)
(assert-false (eenv::goals env))
(assert-equal (eenv::all-used-substitutions env)
'((?mother-of-george . jane)
(?child . george)))))
(def-test-method test-backward-chaining-overall
((tests backward-integration-tests) :run nil)
(deffacts world
(in box1 hall)
(color box1 blue)
(in box2 hall)
(color box2 red)
(size box2 big)
(in box3 hall)
(color box3 red)
(size box3 small)
(in box4 hall)
(color box4 red)
(size box4 big))
(defrule grow-box
(in ?box hall)
(color ?box red)
(size ?box small)
=>
(assert (size ?box big)))
(reset)
(defgoal (in ?object hall))
(defgoal (color ?object red))
(defgoal (size ?object big))
(back-run)
(with-slots (env) tests
(assert-false (eenv::goals env))
(assert-equal (eenv::used-substitutions env) '((?object . box2)))
(back-run)
(assert-false (eenv::goals env))
(assert-equal (eenv::used-substitutions env) '((?object . box3)))
(back-run)
(assert-false (eenv::goals env))
(assert-equal (eenv::used-substitutions env) '((?object . box4)))))
(add-test-suite 'backward-integration-tests)
|
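The backward.lisp tests above exercise ExiL's backward chaining: goals containing `?variables` are matched against facts, backtracking to enumerate alternative substitutions. A tiny Python sketch of that goal-solving loop (unification over ground facts only; `defrule`-style rule expansion is omitted):

```python
def is_var(term):
    return isinstance(term, str) and term.startswith("?")

def unify(pattern, fact, subst):
    """Match one goal pattern against one ground fact under subst; return the
    extended substitution, or None on mismatch."""
    if len(pattern) != len(fact):
        return None
    subst = dict(subst)
    for p, f in zip(pattern, fact):
        p = subst.get(p, p)  # apply an existing binding, if any
        if is_var(p):
            subst[p] = f
        elif p != f:
            return None
    return subst

def solve(goals, facts, subst=None):
    """Depth-first, backtracking goal solver over ground facts -- the
    fact-matching core of back-run."""
    if subst is None:
        subst = {}
    if not goals:
        yield subst
        return
    first, rest = goals[0], goals[1:]
    for fact in facts:
        extended = unify(first, fact, subst)
        if extended is not None:
            yield from solve(rest, facts, extended)
```

With the box-world facts from the first test, asking for a big blue box in the hall yields exactly one substitution, and calling the generator again (like a repeated `back-run`) would produce the alternatives.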
04e250fcc9aeecc952eaae5633cf52bf38e5506141f40072367373c94fb81e77 | babashka/book | compile.clj | #!/usr/bin/env bb
(require '[babashka.process :as p])
(def out-page (str "gh-pages/"
(or (System/getenv "BABASHKA_BOOK_MAIN")
"master")
".html"))
(-> (p/$ asciidoctor src/book.adoc -o ~out-page -a docinfo=shared)
(p/check))
(binding [*out* *err*]
(println "Done writing to" out-page))
| null | https://raw.githubusercontent.com/babashka/book/e64c94c6649d76a7b3dae51fd1875e5eb9cd3dc0/script/compile.clj | clojure | #!/usr/bin/env bb
(require '[babashka.process :as p])
(def out-page (str "gh-pages/"
(or (System/getenv "BABASHKA_BOOK_MAIN")
"master")
".html"))
(-> (p/$ asciidoctor src/book.adoc -o ~out-page -a docinfo=shared)
(p/check))
(binding [*out* *err*]
(println "Done writing to" out-page))
| |
290e8e877228482e57d335072ecbd1dfc262eed11abab1cbf1551aff85c454c4 | Clozure/ccl-tests | declaration.lsp | ;-*- Mode: Lisp -*-
Author :
Created : Sun May 29 07:16:15 2005
;;;; Contains: Tests of the DECLARATION declarations
(in-package :cl-test)
(deftest declaration.1
(progn (declaim (declaration)) nil)
nil)
(deftest declaration.2
(progn (proclaim '(declaration)) nil)
nil)
(deftest declaration.3
(let ((sym (gensym))
(sym2 (gensym)))
(proclaim `(declaration ,sym ,sym2))
nil)
nil)
;;; For the error tests, see the page in the CLHS for TYPE:
;;; "A symbol cannot be both the name of a type and the name
;;; of a declaration. Defining a symbol as the name of a class,
;;; structure, condition, or type, when the symbol has been
;;; declared as a declaration name, or vice versa, signals an error."
;;; Declare these only if bad declarations produce warnings.
(when (block done
(handler-bind ((warning #'(lambda (c) c (return-from done t))))
(eval `(let () (declare (,(gensym))) nil))))
(deftest declaration.4
(let ((sym (gensym)))
(proclaim `(declaration ,sym))
(eval `(signals-error-always (deftype ,sym () t) error)))
t t)
(deftest declaration.5
(let ((sym (gensym)))
(proclaim `(declaration ,sym))
(eval `(signals-error-always (defstruct ,sym a b c) error)))
t t)
(deftest declaration.6
(let ((sym (gensym)))
(proclaim `(declaration ,sym))
(eval `(signals-error-always (defclass ,sym () (a b c)) error)))
t t)
(deftest declaration.7
(let ((sym (gensym)))
(proclaim `(declaration ,sym))
(eval `(signals-error-always (define-condition ,sym (condition) (a b c))
error)))
t t)
(deftest declaration.8
(let ((sym (gensym)))
(eval `(deftype ,sym () 'error))
(eval `(signals-error-always (proclaim '(declaration ,sym))
error)))
t t)
(deftest declaration.9
(let ((sym (gensym)))
(eval `(defstruct ,sym a b c))
(eval `(signals-error-always (proclaim '(declaration ,sym))
error)))
t t)
(deftest declaration.10
(let ((sym (gensym)))
(eval `(defclass ,sym () (a b c)))
(eval `(signals-error-always (proclaim '(declaration ,sym))
error)))
t t)
(deftest declaration.11
(let ((sym (gensym)))
(eval `(define-condition ,sym (condition) (a b c)))
(eval `(signals-error-always (proclaim '(declaration ,sym))
error)))
t t)
)
| null | https://raw.githubusercontent.com/Clozure/ccl-tests/0478abddb34dbc16487a1975560d8d073a988060/ansi-tests/declaration.lsp | lisp | -*- Mode: Lisp -*-
Contains: Tests of the DECLARATION declarations
For the error tests, see the page in the CLHS for TYPE:
"A symbol cannot be both the name of a type and the name
of a declaration. Defining a symbol as the name of a class,
structure, condition, or type, when the symbol has been
declared as a declaration name, or vice versa, signals an error."
Declare these only if bad declarations produce warnings. | Author :
Created : Sun May 29 07:16:15 2005
(in-package :cl-test)
(deftest declaration.1
(progn (declaim (declaration)) nil)
nil)
(deftest declaration.2
(progn (proclaim '(declaration)) nil)
nil)
(deftest declaration.3
(let ((sym (gensym))
(sym2 (gensym)))
(proclaim `(declaration ,sym ,sym2))
nil)
nil)
(when (block done
(handler-bind ((warning #'(lambda (c) c (return-from done t))))
(eval `(let () (declare (,(gensym))) nil))))
(deftest declaration.4
(let ((sym (gensym)))
(proclaim `(declaration ,sym))
(eval `(signals-error-always (deftype ,sym () t) error)))
t t)
(deftest declaration.5
(let ((sym (gensym)))
(proclaim `(declaration ,sym))
(eval `(signals-error-always (defstruct ,sym a b c) error)))
t t)
(deftest declaration.6
(let ((sym (gensym)))
(proclaim `(declaration ,sym))
(eval `(signals-error-always (defclass ,sym () (a b c)) error)))
t t)
(deftest declaration.7
(let ((sym (gensym)))
(proclaim `(declaration ,sym))
(eval `(signals-error-always (define-condition ,sym (condition) (a b c))
error)))
t t)
(deftest declaration.8
(let ((sym (gensym)))
(eval `(deftype ,sym () 'error))
(eval `(signals-error-always (proclaim '(declaration ,sym))
error)))
t t)
(deftest declaration.9
(let ((sym (gensym)))
(eval `(defstruct ,sym a b c))
(eval `(signals-error-always (proclaim '(declaration ,sym))
error)))
t t)
(deftest declaration.10
(let ((sym (gensym)))
(eval `(defclass ,sym () (a b c)))
(eval `(signals-error-always (proclaim '(declaration ,sym))
error)))
t t)
(deftest declaration.11
(let ((sym (gensym)))
(eval `(define-condition ,sym (condition) (a b c)))
(eval `(signals-error-always (proclaim '(declaration ,sym))
error)))
t t)
)
|
6caad76cf3d923a0e7b985a7122dcbe7412965240150ed62e8612b33bfaee0de | Helium4Haskell/helium | ImportTwo2.hs | module ImportTwo2 where
y = 3 | null | https://raw.githubusercontent.com/Helium4Haskell/helium/5928bff479e6f151b4ceb6c69bbc15d71e29eb47/test/correct/ImportTwo2.hs | haskell | module ImportTwo2 where
y = 3 | |
eb98d91819d6077a04a4889651a63e075d1c2d7083a25ff3c8a20224161add85 | BranchTaken/Hemlock | test_stress2.ml | open! Basis.Rudiments
open! Basis
open Ordset
let test () =
(* test is n^2 time complexity, so keep n small. *)
let rec test n i e ordset = begin
match i < n with
| false -> ordset
| true -> begin
(* Hash i in order to test semi-random insertion order. *)
let h = Hash.(t_of_state (Uns.hash_fold i State.empty)) in
let ordset' = remove h (test n (succ i) e (insert h ordset)) in
assert (equal ordset ordset');
assert (equal ordset (union ordset ordset'));
assert (equal ordset (inter ordset ordset'));
assert (equal e (diff ordset ordset'));
ordset'
end
end in
let e = empty (module U128) in
let _ = test 100L 0L e e in
()
let _ = test ()
| null | https://raw.githubusercontent.com/BranchTaken/Hemlock/a21b462fe7f70475591d2ffae185c91552bf6372/bootstrap/test/basis/ordset/test_stress2.ml | ocaml | test is n^2 time complexity, so keep n small.
Hash i in order to test semi-random insertion order. | open! Basis.Rudiments
open! Basis
open Ordset
let test () =
let rec test n i e ordset = begin
match i < n with
| false -> ordset
| true -> begin
let h = Hash.(t_of_state (Uns.hash_fold i State.empty)) in
let ordset' = remove h (test n (succ i) e (insert h ordset)) in
assert (equal ordset ordset');
assert (equal ordset (union ordset ordset'));
assert (equal ordset (inter ordset ordset'));
assert (equal e (diff ordset ordset'));
ordset'
end
end in
let e = empty (module U128) in
let _ = test 100L 0L e e in
()
let _ = test ()
|
57fb6c362e5948915116b347dd47dbf74400b6764e40d16fc03d74ef363f0912 | cedlemo/OCaml-GI-ctypes-bindings-generator | Pack_direction.mli | open Ctypes
type t = Ltr | Rtl | Ttb | Btt
val of_value:
Unsigned.uint32 -> t
val to_value:
t -> Unsigned.uint32
val t_view: t typ
| null | https://raw.githubusercontent.com/cedlemo/OCaml-GI-ctypes-bindings-generator/21a4d449f9dbd6785131979b91aa76877bad2615/tools/Gtk3/Pack_direction.mli | ocaml | open Ctypes
type t = Ltr | Rtl | Ttb | Btt
val of_value:
Unsigned.uint32 -> t
val to_value:
t -> Unsigned.uint32
val t_view: t typ
| |
cf5c9b33cf98b977cd0432f008b36bd1353a39d98c25eeb7b8ad48e2b6007c40 | holdybot/holdybot | computation.clj | (ns parky.computation
(:require [java-time :as jt]
[parky.layout :refer [*identity*]]
[postal.core :as postal]
[parky.config :refer [env]]
[parky.db.core :as db]
[clojure.tools.logging :as log]
[conman.core :as conman]
[clj-http.client :as client]
[clojure.tools.reader.edn :as edn])
(:import (java.time LocalDateTime ZoneId LocalDate LocalTime)
(java.util.concurrent TimeUnit)))
(defonce last-computed (atom {}))
(defn- is-fn-bang-hour? [date zone bang-fn]
(let [timezone (get-in zone [:timezone] "Europe/Berlin")
bang-hour (get-in zone [:bang-hour] 16)
bang-minute (get-in zone [:bang-minute] 0)]
(bang-fn (jt/zoned-date-time) (jt/adjust (jt/zoned-date-time date timezone) (jt/local-time bang-hour bang-minute)))))
(defn is-before-bang-hour? [date zone]
(is-fn-bang-hour? date zone jt/before?))
(defn is-after-bang-hour? [date zone]
(is-fn-bang-hour? date zone jt/after?))
(defn flat-vals [eligibles] (reduce-kv (fn [m k v] (assoc m k (first v))) {} eligibles))
(defn- compute-winners [pendings zone date slots-count]
(let [user-points (flat-vals (group-by :email (db/get-points {:tenant_id (get-in *identity* [:tenant :id])
:parking_zone (:zone zone)
:parking_name (:name zone)
:to date
:from (jt/adjust date jt/minus (jt/days (Integer/valueOf (or (:days-look-back zone) 30))))
:emails (map :email pendings)})))
eligibles (for [pending pendings]
(assoc pending :points (or (get-in user-points [(:email pending) :points]) 0)))
sorted-points (sort-by :points < (shuffle eligibles))] ;; we need to shuffle here, to randomize prefered type requests
(log/info "Sorted by points" sorted-points)
(let [winners (take slots-count sorted-points)]
(log/info "Winners" winners)
winners)))
(defn- notification [email subject body]
(log/info "Send email" email subject)
(future (let [result (postal/send-message (get-in env [:smtp :transport])
{:from (get-in env [:smtp :from])
:to (clojure.string/trim email)
:subject subject
:body body})]
(log/info (:error result) (:message result)))))
(defn- notification-activated [email date parking-zone parking-name slot-name]
(notification email
(str "You have won a space " slot-name " in " parking-zone " " parking-name " on " (jt/format "yyyy-MM-dd" date))
(str "Congratulations! Your " (get env :app-name "Holdy") " at https://" (get-in *identity* [:tenant :host]) "/#/calendar/" (ring.util.codec/url-encode parking-zone) "/" (ring.util.codec/url-encode parking-name))))
(defn- notification-deactivated [email date parking-zone parking-name]
(notification email
(str "Sorry, there is no free space for your request in " parking-zone " " parking-name " on " (jt/format "yyyy-MM-dd" date))
(str "Your " (get env :app-name "Holdy") " at https://" (get-in *identity* [:tenant :host]) "/#/calendar/" (ring.util.codec/url-encode parking-zone) "/" (ring.util.codec/url-encode parking-name))))
(defn notification-deactivated-by-admin [email date parking-zone parking-name]
(notification email
(str "Sorry, admin has just cancelled your space in " parking-zone " " parking-name " on " (jt/format "yyyy-MM-dd" date))
(str "Your " (get env :app-name "Holdy") " at https://" (get-in *identity* [:tenant :host]) "/#/calendar/" (ring.util.codec/url-encode parking-zone) "/" (ring.util.codec/url-encode parking-name))))
(defn notification-gave-up [email date parking-zone parking-name]
(notification email
(str "Someone has given up their space in " parking-zone " " parking-name " on " (jt/format "yyyy-MM-dd" date))
(str "If you still need the place, please make a reservation asap. If the space is still free, you will get it immediately. Your " (get env :app-name "Holdy") " at https://" (get-in *identity* [:tenant :host]) "/#/calendar/" (ring.util.codec/url-encode parking-zone) "/" (ring.util.codec/url-encode parking-name))))
(defn notification-visitor-request [admin-email user-name email date parking-zone parking-name]
(notification admin-email
(str "Dear admin, visitor " user-name " " email " asks for space in " parking-zone " " parking-name)
(str "Please check their request for " date ". Your " (get env :app-name "Holdy") " at https://" (get-in *identity* [:tenant :host]) "/#/calendar/" (ring.util.codec/url-encode parking-zone) "/" (ring.util.codec/url-encode parking-name))))
(defn get-slots [date zone]
(let [taken-slot-names (into #{} (map :slot_name (db/get-taken-slots {:tenant_id (get-in *identity* [:tenant :id])
:parking_day date
:parking_zone (:zone zone)
:parking_name (:name zone)})))
out-slots (into #{} (map :email (db/get-out-slots {:tenant_id (get-in *identity* [:tenant :id])
:parking_day date
:parking_zone (:zone zone)
:parking_name (:name zone)})))
slots (filter #(and (not (taken-slot-names (:name %)))
(if (some? (:owner %)) (out-slots (:owner %)) true)) (get-in zone [:slots]))]
(log/debug "Slots" taken-slot-names out-slots slots)
slots))
(defn group-by-type [slots]
(let [m (atom {})]
(doseq [slot slots]
(if (seq (:types slot))
(doseq [type (:types slot)]
(swap! m assoc type (conj (get @m type []) slot)))
(swap! m assoc nil (conj (get @m nil []) slot))))
@m))
(defn compute-user-slots [date zone winners slots]
(let [yesterdays-active (if (seq slots)
(db/get-taken-slots-for-emails {:tenant_id (get-in *identity* [:tenant :id])
:parking_day (jt/minus date (jt/days 1))
:parking_zone (:zone zone)
:parking_name (:name zone)
:emails (map :email winners)
:slot_names (map :name slots)})
[])
winners-by-email (flat-vals (group-by :email winners))
yesterdays-active-winners (flat-vals (group-by :email yesterdays-active))
slots-by-type (group-by-type slots)
slot-set (vec (map :name slots))
winners-atom (atom {})]
(doseq [winner winners]
(if (and
(some? (get yesterdays-active-winners (:email winner)))
(contains? (vec (map :name (get slots-by-type (:parking_type winner)))) (get-in yesterdays-active-winners [(:email winner) :slot_name])))
(swap! winners-atom assoc (get-in yesterdays-active-winners [(:email winner) :slot_name]) (:email winner))
(let [matching-slots (map :name (get slots-by-type (:parking_type winner)))
free-slot (first (filter (complement (into #{} (keys @winners-atom))) (remove nil? (cons (get-in yesterdays-active-winners [(:email winner) :slot_name]) matching-slots))))]
(when (some? free-slot)
(swap! winners-atom assoc free-slot (:email winner))))))
(let [assigned-winners (into #{} (vals @winners-atom))]
(doseq [winner winners]
(when (not (contains? assigned-winners (:email winner)))
(let [some-free-slot (first (filter (complement (into #{} (keys @winners-atom))) slot-set))]
(when some-free-slot
(swap! winners-atom assoc some-free-slot (:email winner)))))))
(vec (map (fn [[slot-name email]] [email slot-name (get-in winners-by-email [email :user_name])]) @winners-atom))))
[: email : : user_name ]
(defn ms-teams-msg [parking-zone parking-name date winners]
{"@context" ""
"@type" "MessageCard"
"potentialAction" [{"@type" "OpenUri"
"name" (str "Show in " (get env :app-name "Holdy"))
"targets" [{"os" "default"
"uri" (str "https://" (get-in *identity* [:tenant :host]) "/#/calendar/" (ring.util.codec/url-encode parking-zone) "/" (ring.util.codec/url-encode parking-name))}]}]
"sections" [{"facts" (map (fn [[email slot-name user-name]]
{:name slot-name
:value user-name}) winners)
"text" "Congratulations!"}]
"summary" (str (get env :app-name "Holdy") " results")
"themeColor" "0072C6"
"title" (str (get env :app-name "Holdy") " winners " date)})
(defn slack-msg [parking-zone parking-name date winners]
{"text" (str "*Congratulations!*\t" (get env :app-name "Holdy") " winners" date "\n\n" (clojure.string/join (map (fn [[email slot-name user-name]]
(str "\n*" slot-name "*\t" user-name)) winners)))
"attachments" [{"fallback" (str "Show " (get env :app-name "Holdy") " https://" (get-in *identity* [:tenant :host]) "/#/calendar/" (ring.util.codec/url-encode parking-zone) "/" (ring.util.codec/url-encode parking-name))
"actions" [{"type" "button"
"text" (str "Open in "(get env :app-name "Holdy"))
"url" (str "https://" (get-in *identity* [:tenant :host]) "/#/calendar/" (ring.util.codec/url-encode parking-zone) "/" (ring.util.codec/url-encode parking-name))}]}]})
(defn notify-winners [zone date winners]
(log/debug "notify" winners)
(when (seq winners)
(when-let [url (get-in zone [:teams-hook-url])]
(future (client/post url {:form-params (ms-teams-msg (:zone zone) (:name zone) date winners) :content-type :json})))
(when-let [url (get-in zone [:slack-hook-url])]
(future (client/post url {:form-params (slack-msg (:zone zone) (:name zone) date winners) :content-type :json})))))
(defn activate-winners [zone date more-users-than-slots? winners slots deactivate-remainings? include-inactive?]
(let [activation-points (if more-users-than-slots? 7 5)
users-slots (compute-user-slots date zone winners slots)]
(log/info "Winners" users-slots)
(doseq [[email slot-name] users-slots]
(log/info "Activating" email slot-name)
(db/activate-parking! {:tenant_id (get-in *identity* [:tenant :id])
:parking_zone (:zone zone)
:parking_name (:name zone)
:parking_day date
:points activation-points
:slot_name slot-name
:email email
:statuses (if include-inactive? ["inactive" "pending"] ["pending"])
:on_behalf_of include-inactive?})
(notification-activated email date (:zone zone) (:name zone) slot-name))
(when deactivate-remainings?
(let [remainings (db/get-pending-parkings-by-day {:tenant_id (get-in *identity* [:tenant :id])
:parking_day date
:parking_zone (:zone zone)
:parking_name (:name zone)})]
(db/deactivate-remainings! {:tenant_id (get-in *identity* [:tenant :id])
:parking_zone (:zone zone)
:parking_name (:name zone)
:parking_day date})
(doseq [remaining remainings]
(notification-deactivated (:email remaining) date (:zone zone) (:name zone)))))
(when (seq users-slots)
(notify-winners zone date users-slots))))
(defn- compute [zone date slots-count]
(log/info "Start Computing" (:zone zone) (:name zone) date)
(conman/with-transaction [parky.db.core/*db*]
(let [pendings (if (:disable-auto-assignment zone)
(db/get-pending-parkings-by-day-filtered-by-greenlist {:tenant_id (get-in *identity* [:tenant :id])
:parking_day date
:parking_zone (:zone zone)
:parking_name (:name zone)})
(db/get-pending-parkings-by-day {:tenant_id (get-in *identity* [:tenant :id])
:parking_day date
:parking_zone (:zone zone)
:parking_name (:name zone)}))]
(when (seq pendings)
(let [more-users-than-slots? (> (count pendings) slots-count)
winners (compute-winners pendings zone date slots-count)]
(activate-winners zone date more-users-than-slots? winners (get-slots date zone) true false)))))
(log/info "End Computing" (:zone zone) (:name zone) date))
; fixme copy/paste
(defn- generate-filtered-days [from days filter-fn]
(vec (take days (remove nil? (remove #(filter-fn %) (jt/iterate jt/plus from (jt/days 1)))))))
(defn compute-now []
(try
(let [local-time (LocalTime/now (ZoneId/of "UTC"))
current-seconds (+ (* 3600 (.getHour local-time)) (* 60 (.getMinute local-time)) (.getSecond local-time))
date (jt/plus (jt/local-date) (jt/days 1))]
(doseq [tenant-id (db/get-all-computable-tenants-id {:bang_seconds_utc current-seconds
:computed_date date})]
(let [tenant (db/get-whole-tenant-by-id tenant-id)
settings (edn/read-string (:settings tenant))]
(binding [*identity* {:tenant (merge tenant {:settings settings})}]
(doseq [zone (:zones settings)]
(let [slots-count (count (get-slots date zone))]
(compute zone date slots-count)
(doseq [slot (:slots zone)] ; fixme perf
(when (:out-by-default slot)
(when (get-in slot [:owner])
(let [filtered-days (into #{} (remove nil? (map-indexed #(when %2 (inc %1)) (get zone :disabled-days (repeat 7 false)))))
target-date (first (generate-filtered-days (jt/adjust date jt/plus (jt/days 1)) 1 #(filtered-days (.getValue (jt/day-of-week %)))))
is-active-day (= target-date (jt/adjust date jt/plus (jt/days 1)))]
(when is-active-day
(db/set-out-by-default! {:tenant_id (get-in *identity* [:tenant :id])
:parking_day target-date
:parking_zone (:zone zone)
:parking_name (:name zone)
:email (get slot :owner)
:user_name (or (get slot :name) (get slot :owner))})))))))
(db/delete-old-outs! {:tenant_id (get-in *identity* [:tenant :id])
:parking_day date
:parking_zone (:zone zone)
:parking_name (:name zone)}))
(db/update-tenant-dates! {:computed_date date
:bang_seconds_utc (- (+ (* 3600 (Integer/valueOf (get settings :bang-hour 16))) (* 60 (Integer/valueOf (get settings :bang-minute 0)))) (.getTotalSeconds (.getOffset (.getRules (ZoneId/of (get settings :timezone "Europe/Berlin"))) (LocalDateTime/now (ZoneId/of "UTC")))))
:tenant_id (:id tenant)})))))
(catch Exception e
(log/error e))))
| null | https://raw.githubusercontent.com/holdybot/holdybot/1039abe34fd58eaf1a62bbea6592db678cb6aef6/src/clj/parky/computation.clj | clojure | we need to shuffle here, to randomize prefered type requests
fixme copy/paste
fixme perf | (ns parky.computation
(:require [java-time :as jt]
[parky.layout :refer [*identity*]]
[postal.core :as postal]
[parky.config :refer [env]]
[parky.db.core :as db]
[clojure.tools.logging :as log]
[conman.core :as conman]
[clj-http.client :as client]
[clojure.tools.reader.edn :as edn])
(:import (java.time LocalDateTime ZoneId LocalDate LocalTime)
(java.util.concurrent TimeUnit)))
(defonce last-computed (atom {}))
(defn- is-fn-bang-hour? [date zone bang-fn]
(let [timezone (get-in zone [:timezone] "Europe/Berlin")
bang-hour (get-in zone [:bang-hour] 16)
bang-minute (get-in zone [:bang-minute] 0)]
(bang-fn (jt/zoned-date-time) (jt/adjust (jt/zoned-date-time date timezone) (jt/local-time bang-hour bang-minute)))))
(defn is-before-bang-hour? [date zone]
(is-fn-bang-hour? date zone jt/before?))
(defn is-after-bang-hour? [date zone]
(is-fn-bang-hour? date zone jt/after?))
(defn flat-vals [eligibles] (reduce-kv (fn [m k v] (assoc m k (first v))) {} eligibles))
(defn- compute-winners [pendings zone date slots-count]
(let [user-points (flat-vals (group-by :email (db/get-points {:tenant_id (get-in *identity* [:tenant :id])
:parking_zone (:zone zone)
:parking_name (:name zone)
:to date
:from (jt/adjust date jt/minus (jt/days (Integer/valueOf (or (:days-look-back zone) 30))))
:emails (map :email pendings)})))
eligibles (for [pending pendings]
(assoc pending :points (or (get-in user-points [(:email pending) :points]) 0)))
(log/info "Sorted by points" sorted-points)
(let [winners (take slots-count sorted-points)]
(log/info "Winners" winners)
winners)))
(defn- notification [email subject body]
(log/info "Send email" email subject)
(future (let [result (postal/send-message (get-in env [:smtp :transport])
{:from (get-in env [:smtp :from])
:to (clojure.string/trim email)
:subject subject
:body body})]
(log/info (:error result) (:message result)))))
(defn- notification-activated [email date parking-zone parking-name slot-name]
(notification email
(str "You have won a space " slot-name " in " parking-zone " " parking-name " on " (jt/format "yyyy-MM-dd" date))
(str "Congratulations! Your " (get env :app-name "Holdy") " at https://" (get-in *identity* [:tenant :host]) "/#/calendar/" (ring.util.codec/url-encode parking-zone) "/" (ring.util.codec/url-encode parking-name))))
(defn- notification-deactivated [email date parking-zone parking-name]
(notification email
(str "Sorry, there is no free space for your request in " parking-zone " " parking-name " on " (jt/format "yyyy-MM-dd" date))
(str "Your " (get env :app-name "Holdy") " at https://" (get-in *identity* [:tenant :host]) "/#/calendar/" (ring.util.codec/url-encode parking-zone) "/" (ring.util.codec/url-encode parking-name))))
(defn notification-deactivated-by-admin [email date parking-zone parking-name]
(notification email
(str "Sorry, admin has just cancelled your space in " parking-zone " " parking-name " on " (jt/format "yyyy-MM-dd" date))
(str "Your " (get env :app-name "Holdy") " at https://" (get-in *identity* [:tenant :host]) "/#/calendar/" (ring.util.codec/url-encode parking-zone) "/" (ring.util.codec/url-encode parking-name))))
(defn notification-gave-up [email date parking-zone parking-name]
(notification email
(str "Someone has given up their space in " parking-zone " " parking-name " on " (jt/format "yyyy-MM-dd" date))
(str "If you still need the place, please make a reservation asap. If the space is still free, you will get it immediately. Your " (get env :app-name "Holdy") " at https://" (get-in *identity* [:tenant :host]) "/#/calendar/" (ring.util.codec/url-encode parking-zone) "/" (ring.util.codec/url-encode parking-name))))
(defn notification-visitor-request [admin-email user-name email date parking-zone parking-name]
(notification admin-email
(str "Dear admin, visitor " user-name " " email " asks for space in " parking-zone " " parking-name)
(str "Please check their request for " date ". Your " (get env :app-name "Holdy") " at https://" (get-in *identity* [:tenant :host]) "/#/calendar/" (ring.util.codec/url-encode parking-zone) "/" (ring.util.codec/url-encode parking-name))))
(defn get-slots [date zone]
(let [taken-slot-names (into #{} (map :slot_name (db/get-taken-slots {:tenant_id (get-in *identity* [:tenant :id])
:parking_day date
:parking_zone (:zone zone)
:parking_name (:name zone)})))
out-slots (into #{} (map :email (db/get-out-slots {:tenant_id (get-in *identity* [:tenant :id])
:parking_day date
:parking_zone (:zone zone)
:parking_name (:name zone)})))
slots (filter #(and (not (taken-slot-names (:name %)))
(if (some? (:owner %)) (out-slots (:owner %)) true)) (get-in zone [:slots]))]
(log/debug "Slots" taken-slot-names out-slots slots)
slots))
(defn group-by-type [slots]
(let [m (atom {})]
(doseq [slot slots]
(if (seq (:types slot))
(doseq [type (:types slot)]
(swap! m assoc type (conj (get @m type []) slot)))
(swap! m assoc nil (conj (get @m nil []) slot))))
@m))
(defn compute-user-slots [date zone winners slots]
(let [yesterdays-active (if (seq slots)
(db/get-taken-slots-for-emails {:tenant_id (get-in *identity* [:tenant :id])
:parking_day (jt/minus date (jt/days 1))
:parking_zone (:zone zone)
:parking_name (:name zone)
:emails (map :email winners)
:slot_names (map :name slots)})
[])
winners-by-email (flat-vals (group-by :email winners))
yesterdays-active-winners (flat-vals (group-by :email yesterdays-active))
slots-by-type (group-by-type slots)
slot-set (vec (map :name slots))
winners-atom (atom {})]
(doseq [winner winners]
(if (and
(some? (get yesterdays-active-winners (:email winner)))
(contains? (vec (map :name (get slots-by-type (:parking_type winner)))) (get-in yesterdays-active-winners [(:email winner) :slot_name])))
(swap! winners-atom assoc (get-in yesterdays-active-winners [(:email winner) :slot_name]) (:email winner))
(let [matching-slots (map :name (get slots-by-type (:parking_type winner)))
free-slot (first (filter (complement (into #{} (keys @winners-atom))) (remove nil? (cons (get-in yesterdays-active-winners [(:email winner) :slot_name]) matching-slots))))]
(when (some? free-slot)
(swap! winners-atom assoc free-slot (:email winner))))))
(let [assigned-winners (into #{} (vals @winners-atom))]
(doseq [winner winners]
(when (not (contains? assigned-winners (:email winner)))
(let [some-free-slot (first (filter (complement (into #{} (keys @winners-atom))) slot-set))]
(when some-free-slot
(swap! winners-atom assoc some-free-slot (:email winner)))))))
(vec (map (fn [[slot-name email]] [email slot-name (get-in winners-by-email [email :user_name])]) @winners-atom))))
[: email : : user_name ]
(defn ms-teams-msg [parking-zone parking-name date winners]
{"@context" ""
"@type" "MessageCard"
"potentialAction" [{"@type" "OpenUri"
"name" (str "Show in " (get env :app-name "Holdy"))
"targets" [{"os" "default"
"uri" (str "https://" (get-in *identity* [:tenant :host]) "/#/calendar/" (ring.util.codec/url-encode parking-zone) "/" (ring.util.codec/url-encode parking-name))}]}]
"sections" [{"facts" (map (fn [[email slot-name user-name]]
{:name slot-name
:value user-name}) winners)
"text" "Congratulations!"}]
"summary" (str (get env :app-name "Holdy") " results")
"themeColor" "0072C6"
"title" (str (get env :app-name "Holdy") " winners " date)})
(defn slack-msg [parking-zone parking-name date winners]
{"text" (str "*Congratulations!*\t" (get env :app-name "Holdy") " winners" date "\n\n" (clojure.string/join (map (fn [[email slot-name user-name]]
(str "\n*" slot-name "*\t" user-name)) winners)))
"attachments" [{"fallback" (str "Show " (get env :app-name "Holdy") " https://" (get-in *identity* [:tenant :host]) "/#/calendar/" (ring.util.codec/url-encode parking-zone) "/" (ring.util.codec/url-encode parking-name))
"actions" [{"type" "button"
"text" (str "Open in "(get env :app-name "Holdy"))
"url" (str "https://" (get-in *identity* [:tenant :host]) "/#/calendar/" (ring.util.codec/url-encode parking-zone) "/" (ring.util.codec/url-encode parking-name))}]}]})
(defn notify-winners [zone date winners]
(log/debug "notify" winners)
(when (seq winners)
(when-let [url (get-in zone [:teams-hook-url])]
(future (client/post url {:form-params (ms-teams-msg (:zone zone) (:name zone) date winners) :content-type :json})))
(when-let [url (get-in zone [:slack-hook-url])]
(future (client/post url {:form-params (slack-msg (:zone zone) (:name zone) date winners) :content-type :json})))))
(defn activate-winners [zone date more-users-than-slots? winners slots deactivate-remainings? include-inactive?]
(let [activation-points (if more-users-than-slots? 7 5)
users-slots (compute-user-slots date zone winners slots)]
(log/info "Winners" users-slots)
(doseq [[email slot-name] users-slots]
(log/info "Activating" email slot-name)
(db/activate-parking! {:tenant_id (get-in *identity* [:tenant :id])
:parking_zone (:zone zone)
:parking_name (:name zone)
:parking_day date
:points activation-points
:slot_name slot-name
:email email
:statuses (if include-inactive? ["inactive" "pending"] ["pending"])
:on_behalf_of include-inactive?})
(notification-activated email date (:zone zone) (:name zone) slot-name))
(when deactivate-remainings?
(let [remainings (db/get-pending-parkings-by-day {:tenant_id (get-in *identity* [:tenant :id])
:parking_day date
:parking_zone (:zone zone)
:parking_name (:name zone)})]
(db/deactivate-remainings! {:tenant_id (get-in *identity* [:tenant :id])
:parking_zone (:zone zone)
:parking_name (:name zone)
:parking_day date})
(doseq [remaining remainings]
(notification-deactivated (:email remaining) date (:zone zone) (:name zone)))))
(when (seq users-slots)
(notify-winners zone date users-slots))))
(defn- compute [zone date slots-count]
(log/info "Start Computing" (:zone zone) (:name zone) date)
(conman/with-transaction [parky.db.core/*db*]
(let [pendings (if (:disable-auto-assignment zone)
(db/get-pending-parkings-by-day-filtered-by-greenlist {:tenant_id (get-in *identity* [:tenant :id])
:parking_day date
:parking_zone (:zone zone)
:parking_name (:name zone)})
(db/get-pending-parkings-by-day {:tenant_id (get-in *identity* [:tenant :id])
:parking_day date
:parking_zone (:zone zone)
:parking_name (:name zone)}))]
(when (seq pendings)
(let [more-users-than-slots? (> (count pendings) slots-count)
winners (compute-winners pendings zone date slots-count)]
(activate-winners zone date more-users-than-slots? winners (get-slots date zone) true false)))))
(log/info "End Computing" (:zone zone) (:name zone) date))
(defn- generate-filtered-days [from days filter-fn]
(vec (take days (remove nil? (remove #(filter-fn %) (jt/iterate jt/plus from (jt/days 1)))))))
(defn compute-now []
(try
(let [local-time (LocalTime/now (ZoneId/of "UTC"))
current-seconds (+ (* 3600 (.getHour local-time)) (* 60 (.getMinute local-time)) (.getSecond local-time))
date (jt/plus (jt/local-date) (jt/days 1))]
(doseq [tenant-id (db/get-all-computable-tenants-id {:bang_seconds_utc current-seconds
:computed_date date})]
(let [tenant (db/get-whole-tenant-by-id tenant-id)
settings (edn/read-string (:settings tenant))]
(binding [*identity* {:tenant (merge tenant {:settings settings})}]
(doseq [zone (:zones settings)]
(let [slots-count (count (get-slots date zone))]
(compute zone date slots-count)
(when (:out-by-default slot)
(when (get-in slot [:owner])
(let [filtered-days (into #{} (remove nil? (map-indexed #(when %2 (inc %1)) (get zone :disabled-days (repeat 7 false)))))
target-date (first (generate-filtered-days (jt/adjust date jt/plus (jt/days 1)) 1 #(filtered-days (.getValue (jt/day-of-week %)))))
is-active-day (= target-date (jt/adjust date jt/plus (jt/days 1)))]
(when is-active-day
(db/set-out-by-default! {:tenant_id (get-in *identity* [:tenant :id])
:parking_day target-date
:parking_zone (:zone zone)
:parking_name (:name zone)
:email (get slot :owner)
:user_name (or (get slot :name) (get slot :owner))})))))))
(db/delete-old-outs! {:tenant_id (get-in *identity* [:tenant :id])
:parking_day date
:parking_zone (:zone zone)
:parking_name (:name zone)}))
(db/update-tenant-dates! {:computed_date date
:bang_seconds_utc (- (+ (* 3600 (Integer/valueOf (get settings :bang-hour 16))) (* 60 (Integer/valueOf (get settings :bang-minute 0)))) (.getTotalSeconds (.getOffset (.getRules (ZoneId/of (get settings :timezone "Europe/Berlin"))) (LocalDateTime/now (ZoneId/of "UTC")))))
:tenant_id (:id tenant)})))))
(catch Exception e
(log/error e))))
|
97df0dbc676f917d10c0f70147eb34babb3cccf02ce906a2522f349316cbbeda | larcenists/larceny | prefix-larceny.scm | ;INSERTCODE
;------------------------------------------------------------------------------
(error-handler
(lambda l
(decode-error l)
(display "bench DIED!") (newline) (exit 118)))
(define (run-bench name count ok? run)
(let loop ((i 0) (result (list 'undefined)))
(if (< i count)
(loop (+ i 1) (run))
result)))
(define (run-benchmark name count ok? run-maker . args)
(newline)
(let* ((run (apply run-maker args))
(result (time (run-bench name count ok? run))))
(if (not (ok? result))
(begin
(display "*** wrong result ***")
(newline)
(display "*** got: ")
(write result)
(newline))))
(exit 0))
(define (fatal-error . args)
(apply error #f args))
(define (call-with-output-file/truncate filename proc)
(call-with-output-file filename proc))
; Bitwise operations on exact integers.
; From the draft reference implementation of R6RS generic arithmetic.
(define (bitwise-or i j)
(if (and (fixnum? i) (fixnum? j))
(fxlogior i j)
(if (and (exact? i)
(integer? i)
(exact? j)
(integer? j))
(cond ((or (= i -1) (= j -1))
-1)
((= i 0)
j)
((= j 0)
i)
(else
(let* ((i0 (if (odd? i) 1 0))
(j0 (if (odd? j) 1 0))
(i1 (- i i0))
(j1 (- j j0))
(i/2 (quotient i1 2))
(j/2 (quotient j1 2))
(hi (* 2 (bitwise-or i/2 j/2)))
(lo (if (= 0 (+ i0 j0)) 0 1)))
(+ hi lo))))
(error "illegal argument to bitwise-or" i j))))
(define (bitwise-and i j)
(if (and (fixnum? i) (fixnum? j))
(fxlogand i j)
(if (and (exact? i)
(integer? i)
(exact? j)
(integer? j))
(cond ((or (= i 0) (= j 0))
0)
((= i -1)
j)
((= j -1)
i)
(else
(let* ((i0 (if (odd? i) 1 0))
(j0 (if (odd? j) 1 0))
(i1 (- i i0))
(j1 (- j j0))
(i/2 (quotient i1 2))
(j/2 (quotient j1 2))
(hi (* 2 (bitwise-and i/2 j/2)))
(lo (* i0 j0)))
(+ hi lo))))
(error "illegal argument to bitwise-and" i j))))
(define (bitwise-not i)
(if (fixnum? i)
(fxlognot i)
(if (and (exact? i)
(integer? i))
(cond ((= i -1)
0)
((= i 0)
-1)
(else
(let* ((i0 (if (odd? i) 1 0))
(i1 (- i i0))
(i/2 (quotient i1 2))
(hi (* 2 (bitwise-not i/2)))
(lo (- 1 i0)))
(+ hi lo))))
(error "illegal argument to bitwise-not" i))))
;------------------------------------------------------------------------------
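The three fallbacks above implement bitwise operations on unbounded exact integers using only odd/even tests and halving. As a sanity check of that decomposition, here is a hypothetical Python port (not part of the benchmark prefix) that can be compared directly against Python's native `|`, `&`, and `~` on positive and negative inputs:

```python
def bitwise_or(i, j):
    # Base cases mirror the Scheme cond clauses.
    if i == -1 or j == -1:
        return -1
    if i == 0:
        return j
    if j == 0:
        return i
    i0, j0 = i % 2, j % 2   # low bits; Python's % yields 0/1 even for negatives
    # i - i0 is even, so floor division by 2 is exact, matching Scheme's quotient.
    hi = 2 * bitwise_or((i - i0) // 2, (j - j0) // 2)
    lo = 0 if (i0 + j0) == 0 else 1
    return hi + lo

def bitwise_and(i, j):
    if i == 0 or j == 0:
        return 0
    if i == -1:
        return j
    if j == -1:
        return i
    i0, j0 = i % 2, j % 2
    return 2 * bitwise_and((i - i0) // 2, (j - j0) // 2) + i0 * j0

def bitwise_not(i):
    if i == -1:
        return 0
    if i == 0:
        return -1
    i0 = i % 2
    return 2 * bitwise_not((i - i0) // 2) + (1 - i0)
```

The recursion always bottoms out at 0 or -1, the two fixed points of two's-complement halving, which is why those are the base cases.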
; Macros ...
(if-fixflo
(begin
; Specialize fixnum and flonum arithmetic.
(define-syntax FLOATvector-const
(syntax-rules ()
((FLOATvector-const x ...) '#(x ...))))
(define-syntax FLOATvector?
(syntax-rules ()
((FLOATvector? x) (vector? x))))
(define-syntax FLOATvector
(syntax-rules ()
((FLOATvector x ...) (vector x ...))))
(define-syntax FLOATmake-vector
(syntax-rules ()
((FLOATmake-vector n) (make-vector n 0.0))
((FLOATmake-vector n init) (make-vector n init))))
(define-syntax FLOATvector-ref
(syntax-rules ()
((FLOATvector-ref v i) (vector-ref v i))))
(define-syntax FLOATvector-set!
(syntax-rules ()
((FLOATvector-set! v i x) (vector-set! v i x))))
(define-syntax FLOATvector-length
(syntax-rules ()
((FLOATvector-length v) (vector-length v))))
(define-syntax nuc-const
(syntax-rules ()
((FLOATnuc-const x ...) '#(x ...))))
(define-syntax FLOAT+
(syntax-rules ()
((FLOAT+ x ...) (fl+ x ...))))
(define-syntax FLOAT-
(syntax-rules ()
((FLOAT- x ...) (fl- x ...))))
(define-syntax FLOAT*
(syntax-rules ()
((FLOAT* x ...) (fl* x ...))))
(define-syntax FLOAT/
(syntax-rules ()
((FLOAT/ x ...) (/ x ...)))) ; FIXME
(define-syntax FLOAT=
(syntax-rules ()
((FLOAT= x y) (fl= x y))))
(define-syntax FLOAT<
(syntax-rules ()
((FLOAT< x y) (fl< x y))))
(define-syntax FLOAT<=
(syntax-rules ()
((FLOAT<= x y) (fl<= x y))))
(define-syntax FLOAT>
(syntax-rules ()
((FLOAT> x y) (fl> x y))))
(define-syntax FLOAT>=
(syntax-rules ()
((FLOAT>= x y) (fl>= x y))))
(define-syntax FLOATnegative?
(syntax-rules ()
((FLOATnegative? x) (fl< x 0.0))))
(define-syntax FLOATpositive?
(syntax-rules ()
((FLOATpositive? x) (fl< 0.0 x))))
(define-syntax FLOATzero?
(syntax-rules ()
((FLOATzero? x) (fl= 0.0 x))))
(define-syntax FLOATabs
(syntax-rules ()
((FLOATabs x) (abs x)))) ; FIXME
(define-syntax FLOATsin
(syntax-rules ()
((FLOATsin x) (sin x)))) ; FIXME
(define-syntax FLOATcos
(syntax-rules ()
((FLOATcos x) (cos x)))) ; FIXME
(define-syntax FLOATatan
(syntax-rules ()
((FLOATatan x) (atan x)))) ; FIXME
(define-syntax FLOATsqrt
(syntax-rules ()
((FLOATsqrt x) (sqrt x)))) ; FIXME
(define-syntax FLOATmin
(syntax-rules ()
((FLOATmin x y) (min x y)))) ; FIXME
(define-syntax FLOATmax
(syntax-rules ()
((FLOATmax x y) (max x y)))) ; FIXME
(define-syntax FLOATround
(syntax-rules ()
((FLOATround x) (round x)))) ; FIXME
(define-syntax FLOATinexact->exact
(syntax-rules ()
((FLOATinexact->exact x) (inexact->exact x))))
(define (GENERIC+ x y) (+ x y))
(define (GENERIC- x y) (- x y))
(define (GENERIC* x y) (* x y))
(define (GENERIC/ x y) (/ x y))
(define (GENERICquotient x y) (quotient x y))
(define (GENERICremainder x y) (remainder x y))
(define (GENERICmodulo x y) (modulo x y))
(define (GENERIC= x y) (= x y))
(define (GENERIC< x y) (< x y))
(define (GENERIC<= x y) (<= x y))
(define (GENERIC> x y) (> x y))
(define (GENERIC>= x y) (>= x y))
(define (GENERICexpt x y) (expt x y))
(define-syntax +
(syntax-rules ()
((+ x ...) (fx+ x ...))))
(define-syntax -
(syntax-rules ()
((- x ...) (fx- x ...))))
(define-syntax *
(syntax-rules ()
((* x ...) (fx* x ...))))
;(define-syntax quotient
; (syntax-rules ()
;  ((quotient x ...) (quotient x ...)))) ; FIXME
;(define-syntax modulo
; (syntax-rules ()
;  ((modulo x ...) (modulo x ...)))) ; FIXME
;(define-syntax remainder
; (syntax-rules ()
;  ((remainder x ...) (remainder x ...)))) ; FIXME
(define-syntax =
(syntax-rules ()
((= x y) (fx= x y))))
(define-syntax <
(syntax-rules ()
((< x y) (fx< x y))))
(define-syntax <=
(syntax-rules ()
((<= x y) (fx<= x y))))
(define-syntax >
(syntax-rules ()
((> x y) (fx> x y))))
(define-syntax >=
(syntax-rules ()
((>= x y) (fx>= x y))))
(define-syntax negative?
(syntax-rules ()
((negative? x) (fxnegative? x))))
(define-syntax positive?
(syntax-rules ()
((positive? x) (fxpositive? x))))
(define-syntax zero?
(syntax-rules ()
((zero? x) (fxzero? x))))
;(define-syntax odd?
; (syntax-rules ()
;  ((odd? x) (odd? x)))) ; FIXME
;(define-syntax even?
; (syntax-rules ()
;  ((even? x) (even? x)))) ; FIXME
(define-syntax bitwise-or
(syntax-rules ()
((bitwise-or x y) (fxlogior x y))))
(define-syntax bitwise-and
(syntax-rules ()
((bitwise-and x y) (fxlogand x y))))
(define-syntax bitwise-not
(syntax-rules ()
((bitwise-not x) (fxlognot x))))
)
(begin
; Don't specialize fixnum and flonum arithmetic.
(define-syntax FLOATvector-const
(syntax-rules ()
((FLOATvector-const x ...) '#(x ...))))
(define-syntax FLOATvector?
(syntax-rules ()
((FLOATvector? x) (vector? x))))
(define-syntax FLOATvector
(syntax-rules ()
((FLOATvector x ...) (vector x ...))))
(define-syntax FLOATmake-vector
(syntax-rules ()
((FLOATmake-vector n) (make-vector n 0.0))
((FLOATmake-vector n init) (make-vector n init))))
(define-syntax FLOATvector-ref
(syntax-rules ()
((FLOATvector-ref v i) (vector-ref v i))))
(define-syntax FLOATvector-set!
(syntax-rules ()
((FLOATvector-set! v i x) (vector-set! v i x))))
(define-syntax FLOATvector-length
(syntax-rules ()
((FLOATvector-length v) (vector-length v))))
(define-syntax nuc-const
(syntax-rules ()
((FLOATnuc-const x ...) '#(x ...))))
(define-syntax FLOAT+
(syntax-rules ()
((FLOAT+ x ...) (+ x ...))))
(define-syntax FLOAT-
(syntax-rules ()
((FLOAT- x ...) (- x ...))))
(define-syntax FLOAT*
(syntax-rules ()
((FLOAT* x ...) (* x ...))))
(define-syntax FLOAT/
(syntax-rules ()
((FLOAT/ x ...) (/ x ...))))
(define-syntax FLOAT=
(syntax-rules ()
((FLOAT= x y) (= x y))))
(define-syntax FLOAT<
(syntax-rules ()
((FLOAT< x y) (< x y))))
(define-syntax FLOAT<=
(syntax-rules ()
((FLOAT<= x y) (<= x y))))
(define-syntax FLOAT>
(syntax-rules ()
((FLOAT> x y) (> x y))))
(define-syntax FLOAT>=
(syntax-rules ()
((FLOAT>= x y) (>= x y))))
(define-syntax FLOATnegative?
(syntax-rules ()
((FLOATnegative? x) (negative? x))))
(define-syntax FLOATpositive?
(syntax-rules ()
((FLOATpositive? x) (positive? x))))
(define-syntax FLOATzero?
(syntax-rules ()
((FLOATzero? x) (zero? x))))
(define-syntax FLOATabs
(syntax-rules ()
((FLOATabs x) (abs x))))
(define-syntax FLOATsin
(syntax-rules ()
((FLOATsin x) (sin x))))
(define-syntax FLOATcos
(syntax-rules ()
((FLOATcos x) (cos x))))
(define-syntax FLOATatan
(syntax-rules ()
((FLOATatan x) (atan x))))
(define-syntax FLOATsqrt
(syntax-rules ()
((FLOATsqrt x) (sqrt x))))
(define-syntax FLOATmin
(syntax-rules ()
((FLOATmin x y) (min x y))))
(define-syntax FLOATmax
(syntax-rules ()
((FLOATmax x y) (max x y))))
(define-syntax FLOATround
(syntax-rules ()
((FLOATround x) (round x))))
(define-syntax FLOATinexact->exact
(syntax-rules ()
((FLOATinexact->exact x) (inexact->exact x))))
; Generic arithmetic.
(define-syntax GENERIC+
(syntax-rules ()
((GENERIC+ x ...) (+ x ...))))
(define-syntax GENERIC-
(syntax-rules ()
((GENERIC- x ...) (- x ...))))
(define-syntax GENERIC*
(syntax-rules ()
((GENERIC* x ...) (* x ...))))
(define-syntax GENERIC/
(syntax-rules ()
((GENERIC/ x ...) (/ x ...))))
(define-syntax GENERICquotient
(syntax-rules ()
((GENERICquotient x y) (quotient x y))))
(define-syntax GENERICremainder
(syntax-rules ()
((GENERICremainder x y) (remainder x y))))
(define-syntax GENERICmodulo
(syntax-rules ()
((GENERICmodulo x y) (modulo x y))))
(define-syntax GENERIC=
(syntax-rules ()
((GENERIC= x y) (= x y))))
(define-syntax GENERIC<
(syntax-rules ()
((GENERIC< x y) (< x y))))
(define-syntax GENERIC<=
(syntax-rules ()
((GENERIC<= x y) (<= x y))))
(define-syntax GENERIC>
(syntax-rules ()
((GENERIC> x y) (> x y))))
(define-syntax GENERIC>=
(syntax-rules ()
((GENERIC>= x y) (>= x y))))
(define-syntax GENERICexpt
(syntax-rules ()
((GENERICexpt x y) (expt x y))))
)
)
;------------------------------------------------------------------------------
| null | https://raw.githubusercontent.com/larcenists/larceny/fef550c7d3923deb7a5a1ccd5a628e54cf231c75/test/Benchmarking/CrossPlatform/prefix/prefix-larceny.scm | scheme | INSERTCODE
------------------------------------------------------------------------------
From the draft reference implementation of R6RS generic arithmetic.
------------------------------------------------------------------------------
(define-syntax quotient
(syntax-rules ()
FIXME
(define-syntax modulo
(syntax-rules ()
FIXME
(define-syntax remainder
(syntax-rules ()
FIXME
(define-syntax odd?
(syntax-rules ()
FIXME
(define-syntax even?
(syntax-rules ()
FIXME
Don't specialize fixnum and flonum arithmetic.
------------------------------------------------------------------------------ | (error-handler
(lambda l
(decode-error l)
(display "bench DIED!") (newline) (exit 118)))
(define (run-bench name count ok? run)
(let loop ((i 0) (result (list 'undefined)))
(if (< i count)
(loop (+ i 1) (run))
result)))
(define (run-benchmark name count ok? run-maker . args)
(newline)
(let* ((run (apply run-maker args))
(result (time (run-bench name count ok? run))))
(if (not (ok? result))
(begin
(display "*** wrong result ***")
(newline)
(display "*** got: ")
(write result)
(newline))))
(exit 0))
(define (fatal-error . args)
(apply error #f args))
(define (call-with-output-file/truncate filename proc)
(call-with-output-file filename proc))
; Bitwise operations on exact integers.
(define (bitwise-or i j)
(if (and (fixnum? i) (fixnum? j))
(fxlogior i j)
(if (and (exact? i)
(integer? i)
(exact? j)
(integer? j))
(cond ((or (= i -1) (= j -1))
-1)
((= i 0)
j)
((= j 0)
i)
(else
(let* ((i0 (if (odd? i) 1 0))
(j0 (if (odd? j) 1 0))
(i1 (- i i0))
(j1 (- j j0))
(i/2 (quotient i1 2))
(j/2 (quotient j1 2))
(hi (* 2 (bitwise-or i/2 j/2)))
(lo (if (= 0 (+ i0 j0)) 0 1)))
(+ hi lo))))
(error "illegal argument to bitwise-or" i j))))
(define (bitwise-and i j)
(if (and (fixnum? i) (fixnum? j))
(fxlogand i j)
(if (and (exact? i)
(integer? i)
(exact? j)
(integer? j))
(cond ((or (= i 0) (= j 0))
0)
((= i -1)
j)
((= j -1)
i)
(else
(let* ((i0 (if (odd? i) 1 0))
(j0 (if (odd? j) 1 0))
(i1 (- i i0))
(j1 (- j j0))
(i/2 (quotient i1 2))
(j/2 (quotient j1 2))
(hi (* 2 (bitwise-and i/2 j/2)))
(lo (* i0 j0)))
(+ hi lo))))
(error "illegal argument to bitwise-and" i j))))
(define (bitwise-not i)
(if (fixnum? i)
(fxlognot i)
(if (and (exact? i)
(integer? i))
(cond ((= i -1)
0)
((= i 0)
-1)
(else
(let* ((i0 (if (odd? i) 1 0))
(i1 (- i i0))
(i/2 (quotient i1 2))
(hi (* 2 (bitwise-not i/2)))
(lo (- 1 i0)))
(+ hi lo))))
(error "illegal argument to bitwise-not" i))))
; Macros ...
(if-fixflo
(begin
; Specialize fixnum and flonum arithmetic.
(define-syntax FLOATvector-const
(syntax-rules ()
((FLOATvector-const x ...) '#(x ...))))
(define-syntax FLOATvector?
(syntax-rules ()
((FLOATvector? x) (vector? x))))
(define-syntax FLOATvector
(syntax-rules ()
((FLOATvector x ...) (vector x ...))))
(define-syntax FLOATmake-vector
(syntax-rules ()
((FLOATmake-vector n) (make-vector n 0.0))
((FLOATmake-vector n init) (make-vector n init))))
(define-syntax FLOATvector-ref
(syntax-rules ()
((FLOATvector-ref v i) (vector-ref v i))))
(define-syntax FLOATvector-set!
(syntax-rules ()
((FLOATvector-set! v i x) (vector-set! v i x))))
(define-syntax FLOATvector-length
(syntax-rules ()
((FLOATvector-length v) (vector-length v))))
(define-syntax nuc-const
(syntax-rules ()
((FLOATnuc-const x ...) '#(x ...))))
(define-syntax FLOAT+
(syntax-rules ()
((FLOAT+ x ...) (fl+ x ...))))
(define-syntax FLOAT-
(syntax-rules ()
((FLOAT- x ...) (fl- x ...))))
(define-syntax FLOAT*
(syntax-rules ()
((FLOAT* x ...) (fl* x ...))))
(define-syntax FLOAT/
(syntax-rules ()
((FLOAT/ x ...) (/ x ...))))
(define-syntax FLOAT=
(syntax-rules ()
((FLOAT= x y) (fl= x y))))
(define-syntax FLOAT<
(syntax-rules ()
((FLOAT< x y) (fl< x y))))
(define-syntax FLOAT<=
(syntax-rules ()
((FLOAT<= x y) (fl<= x y))))
(define-syntax FLOAT>
(syntax-rules ()
((FLOAT> x y) (fl> x y))))
(define-syntax FLOAT>=
(syntax-rules ()
((FLOAT>= x y) (fl>= x y))))
(define-syntax FLOATnegative?
(syntax-rules ()
((FLOATnegative? x) (fl< x 0.0))))
(define-syntax FLOATpositive?
(syntax-rules ()
((FLOATpositive? x) (fl< 0.0 x))))
(define-syntax FLOATzero?
(syntax-rules ()
((FLOATzero? x) (fl= 0.0 x))))
(define-syntax FLOATabs
(syntax-rules ()
((FLOATabs x) (abs x))))
(define-syntax FLOATsin
(syntax-rules ()
((FLOATsin x) (sin x))))
(define-syntax FLOATcos
(syntax-rules ()
((FLOATcos x) (cos x))))
(define-syntax FLOATatan
(syntax-rules ()
((FLOATatan x) (atan x))))
(define-syntax FLOATsqrt
(syntax-rules ()
((FLOATsqrt x) (sqrt x))))
(define-syntax FLOATmin
(syntax-rules ()
((FLOATmin x y) (min x y))))
(define-syntax FLOATmax
(syntax-rules ()
((FLOATmax x y) (max x y))))
(define-syntax FLOATround
(syntax-rules ()
((FLOATround x) (round x))))
(define-syntax FLOATinexact->exact
(syntax-rules ()
((FLOATinexact->exact x) (inexact->exact x))))
(define (GENERIC+ x y) (+ x y))
(define (GENERIC- x y) (- x y))
(define (GENERIC* x y) (* x y))
(define (GENERIC/ x y) (/ x y))
(define (GENERICquotient x y) (quotient x y))
(define (GENERICremainder x y) (remainder x y))
(define (GENERICmodulo x y) (modulo x y))
(define (GENERIC= x y) (= x y))
(define (GENERIC< x y) (< x y))
(define (GENERIC<= x y) (<= x y))
(define (GENERIC> x y) (> x y))
(define (GENERIC>= x y) (>= x y))
(define (GENERICexpt x y) (expt x y))
(define-syntax +
(syntax-rules ()
((+ x ...) (fx+ x ...))))
(define-syntax -
(syntax-rules ()
((- x ...) (fx- x ...))))
(define-syntax *
(syntax-rules ()
((* x ...) (fx* x ...))))
(define-syntax =
(syntax-rules ()
((= x y) (fx= x y))))
(define-syntax <
(syntax-rules ()
((< x y) (fx< x y))))
(define-syntax <=
(syntax-rules ()
((<= x y) (fx<= x y))))
(define-syntax >
(syntax-rules ()
((> x y) (fx> x y))))
(define-syntax >=
(syntax-rules ()
((>= x y) (fx>= x y))))
(define-syntax negative?
(syntax-rules ()
((negative? x) (fxnegative? x))))
(define-syntax positive?
(syntax-rules ()
((positive? x) (fxpositive? x))))
(define-syntax zero?
(syntax-rules ()
((zero? x) (fxzero? x))))
(define-syntax bitwise-or
(syntax-rules ()
((bitwise-or x y) (fxlogior x y))))
(define-syntax bitwise-and
(syntax-rules ()
((bitwise-and x y) (fxlogand x y))))
(define-syntax bitwise-not
(syntax-rules ()
((bitwise-not x) (fxlognot x))))
)
(begin
(define-syntax FLOATvector-const
(syntax-rules ()
((FLOATvector-const x ...) '#(x ...))))
(define-syntax FLOATvector?
(syntax-rules ()
((FLOATvector? x) (vector? x))))
(define-syntax FLOATvector
(syntax-rules ()
((FLOATvector x ...) (vector x ...))))
(define-syntax FLOATmake-vector
(syntax-rules ()
((FLOATmake-vector n) (make-vector n 0.0))
((FLOATmake-vector n init) (make-vector n init))))
(define-syntax FLOATvector-ref
(syntax-rules ()
((FLOATvector-ref v i) (vector-ref v i))))
(define-syntax FLOATvector-set!
(syntax-rules ()
((FLOATvector-set! v i x) (vector-set! v i x))))
(define-syntax FLOATvector-length
(syntax-rules ()
((FLOATvector-length v) (vector-length v))))
(define-syntax nuc-const
(syntax-rules ()
((FLOATnuc-const x ...) '#(x ...))))
(define-syntax FLOAT+
(syntax-rules ()
((FLOAT+ x ...) (+ x ...))))
(define-syntax FLOAT-
(syntax-rules ()
((FLOAT- x ...) (- x ...))))
(define-syntax FLOAT*
(syntax-rules ()
((FLOAT* x ...) (* x ...))))
(define-syntax FLOAT/
(syntax-rules ()
((FLOAT/ x ...) (/ x ...))))
(define-syntax FLOAT=
(syntax-rules ()
((FLOAT= x y) (= x y))))
(define-syntax FLOAT<
(syntax-rules ()
((FLOAT< x y) (< x y))))
(define-syntax FLOAT<=
(syntax-rules ()
((FLOAT<= x y) (<= x y))))
(define-syntax FLOAT>
(syntax-rules ()
((FLOAT> x y) (> x y))))
(define-syntax FLOAT>=
(syntax-rules ()
((FLOAT>= x y) (>= x y))))
(define-syntax FLOATnegative?
(syntax-rules ()
((FLOATnegative? x) (negative? x))))
(define-syntax FLOATpositive?
(syntax-rules ()
((FLOATpositive? x) (positive? x))))
(define-syntax FLOATzero?
(syntax-rules ()
((FLOATzero? x) (zero? x))))
(define-syntax FLOATabs
(syntax-rules ()
((FLOATabs x) (abs x))))
(define-syntax FLOATsin
(syntax-rules ()
((FLOATsin x) (sin x))))
(define-syntax FLOATcos
(syntax-rules ()
((FLOATcos x) (cos x))))
(define-syntax FLOATatan
(syntax-rules ()
((FLOATatan x) (atan x))))
(define-syntax FLOATsqrt
(syntax-rules ()
((FLOATsqrt x) (sqrt x))))
(define-syntax FLOATmin
(syntax-rules ()
((FLOATmin x y) (min x y))))
(define-syntax FLOATmax
(syntax-rules ()
((FLOATmax x y) (max x y))))
(define-syntax FLOATround
(syntax-rules ()
((FLOATround x) (round x))))
(define-syntax FLOATinexact->exact
(syntax-rules ()
((FLOATinexact->exact x) (inexact->exact x))))
; Generic arithmetic.
(define-syntax GENERIC+
(syntax-rules ()
((GENERIC+ x ...) (+ x ...))))
(define-syntax GENERIC-
(syntax-rules ()
((GENERIC- x ...) (- x ...))))
(define-syntax GENERIC*
(syntax-rules ()
((GENERIC* x ...) (* x ...))))
(define-syntax GENERIC/
(syntax-rules ()
((GENERIC/ x ...) (/ x ...))))
(define-syntax GENERICquotient
(syntax-rules ()
((GENERICquotient x y) (quotient x y))))
(define-syntax GENERICremainder
(syntax-rules ()
((GENERICremainder x y) (remainder x y))))
(define-syntax GENERICmodulo
(syntax-rules ()
((GENERICmodulo x y) (modulo x y))))
(define-syntax GENERIC=
(syntax-rules ()
((GENERIC= x y) (= x y))))
(define-syntax GENERIC<
(syntax-rules ()
((GENERIC< x y) (< x y))))
(define-syntax GENERIC<=
(syntax-rules ()
((GENERIC<= x y) (<= x y))))
(define-syntax GENERIC>
(syntax-rules ()
((GENERIC> x y) (> x y))))
(define-syntax GENERIC>=
(syntax-rules ()
((GENERIC>= x y) (>= x y))))
(define-syntax GENERICexpt
(syntax-rules ()
((GENERICexpt x y) (expt x y))))
)
)
|
247c161c252216a6b6f6224a4b320d007d459afdf932d276ed5f6d906b4f05f5 | brendanhay/gogol | Patch.hs | # LANGUAGE DataKinds #
# LANGUAGE DeriveGeneric #
# LANGUAGE DerivingStrategies #
# LANGUAGE DuplicateRecordFields #
# LANGUAGE FlexibleInstances #
# LANGUAGE GeneralizedNewtypeDeriving #
# LANGUAGE LambdaCase #
{-# LANGUAGE OverloadedStrings #-}
# LANGUAGE PatternSynonyms #
# LANGUAGE RecordWildCards #
{-# LANGUAGE StrictData #-}
# LANGUAGE TypeFamilies #
# LANGUAGE TypeOperators #
# LANGUAGE NoImplicitPrelude #
# OPTIONS_GHC -fno - warn - duplicate - exports #
# OPTIONS_GHC -fno - warn - name - shadowing #
# OPTIONS_GHC -fno - warn - unused - binds #
# OPTIONS_GHC -fno - warn - unused - imports #
# OPTIONS_GHC -fno - warn - unused - matches #
-- |
-- Module      : Gogol.Classroom.Courses.CourseWorkMaterials.Patch
-- Copyright   : (c) 2015-2022 Brendan Hay
-- License     : Mozilla Public License, v. 2.0.
-- Maintainer  : Brendan Hay <brendan.g.hay+gogol@gmail.com>
-- Stability : auto-generated
-- Portability : non-portable (GHC extensions)
--
-- Updates one or more fields of a course work material. This method returns the following error codes: * @PERMISSION_DENIED@ if the requesting developer project did not create the corresponding course work material, or for access errors. * @INVALID_ARGUMENT@ if the request is malformed. * @FAILED_PRECONDITION@ if the requested course work material has already been deleted. * @NOT_FOUND@ if the requested course or course work material does not exist.
--
-- /See:/ <https://developers.google.com/classroom/ Google Classroom API Reference> for @classroom.courses.courseWorkMaterials.patch@.
module Gogol.Classroom.Courses.CourseWorkMaterials.Patch
( -- * Resource
ClassroomCoursesCourseWorkMaterialsPatchResource,
-- ** Constructing a Request
ClassroomCoursesCourseWorkMaterialsPatch (..),
newClassroomCoursesCourseWorkMaterialsPatch,
)
where
import Gogol.Classroom.Types
import qualified Gogol.Prelude as Core
-- | A resource alias for @classroom.courses.courseWorkMaterials.patch@ method which the
-- 'ClassroomCoursesCourseWorkMaterialsPatch' request conforms to.
type ClassroomCoursesCourseWorkMaterialsPatchResource =
"v1"
Core.:> "courses"
Core.:> Core.Capture "courseId" Core.Text
Core.:> "courseWorkMaterials"
Core.:> Core.Capture "id" Core.Text
Core.:> Core.QueryParam "$.xgafv" Xgafv
Core.:> Core.QueryParam "access_token" Core.Text
Core.:> Core.QueryParam "callback" Core.Text
Core.:> Core.QueryParam "updateMask" Core.FieldMask
Core.:> Core.QueryParam "uploadType" Core.Text
Core.:> Core.QueryParam "upload_protocol" Core.Text
Core.:> Core.QueryParam "alt" Core.AltJSON
Core.:> Core.ReqBody '[Core.JSON] CourseWorkMaterial
Core.:> Core.Patch '[Core.JSON] CourseWorkMaterial
-- | Updates one or more fields of a course work material. This method returns the following error codes: * @PERMISSION_DENIED@ if the requesting developer project did not create the corresponding course work material, or for access errors. * @INVALID_ARGUMENT@ if the request is malformed. * @FAILED_PRECONDITION@ if the requested course work material has already been deleted. * @NOT_FOUND@ if the requested course or course work material does not exist.
--
-- /See:/ 'newClassroomCoursesCourseWorkMaterialsPatch' smart constructor.
data ClassroomCoursesCourseWorkMaterialsPatch = ClassroomCoursesCourseWorkMaterialsPatch
{ -- | V1 error format.
xgafv :: (Core.Maybe Xgafv),
-- | OAuth access token.
accessToken :: (Core.Maybe Core.Text),
-- | JSONP
callback :: (Core.Maybe Core.Text),
-- | Identifier of the course. This identifier can be either the Classroom-assigned identifier or an alias.
courseId :: Core.Text,
-- | Identifier of the course work material.
id :: Core.Text,
-- | Multipart request metadata.
payload :: CourseWorkMaterial,
-- | Mask that identifies which fields on the course work material to update. This field is required to do an update. The update fails if invalid fields are specified. If a field supports empty values, it can be cleared by specifying it in the update mask and not in the course work material object. If a field that does not support empty values is included in the update mask and not set in the course work material object, an @INVALID_ARGUMENT@ error is returned. The following fields may be specified by teachers: * @title@ * @description@ * @state@ * @scheduled_time@ * @topic_id@
updateMask :: (Core.Maybe Core.FieldMask),
-- | Legacy upload protocol for media (e.g. \"media\", \"multipart\").
uploadType :: (Core.Maybe Core.Text),
-- | Upload protocol for media (e.g. \"raw\", \"multipart\").
uploadProtocol :: (Core.Maybe Core.Text)
}
deriving (Core.Eq, Core.Show, Core.Generic)
-- | Creates a value of 'ClassroomCoursesCourseWorkMaterialsPatch' with the minimum fields required to make a request.
newClassroomCoursesCourseWorkMaterialsPatch ::
-- | Identifier of the course. This identifier can be either the Classroom-assigned identifier or an alias. See 'courseId'.
Core.Text ->
-- | Identifier of the course work material. See 'id'.
Core.Text ->
-- | Multipart request metadata. See 'payload'.
CourseWorkMaterial ->
ClassroomCoursesCourseWorkMaterialsPatch
newClassroomCoursesCourseWorkMaterialsPatch courseId id payload =
ClassroomCoursesCourseWorkMaterialsPatch
{ xgafv = Core.Nothing,
accessToken = Core.Nothing,
callback = Core.Nothing,
courseId = courseId,
id = id,
payload = payload,
updateMask = Core.Nothing,
uploadType = Core.Nothing,
uploadProtocol = Core.Nothing
}
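The `updateMask` behaviour documented above (only masked fields change, and a masked field omitted from the request body is cleared) is easiest to see with a toy model. This is a hypothetical Python sketch of the PATCH/FieldMask contract, not gogol or Classroom code:

```python
def apply_update_mask(resource, patch, update_mask):
    """Apply `patch` to `resource`, touching only the fields named in
    `update_mask`; a field that is in the mask but missing from `patch`
    is cleared, matching the updateMask semantics described above."""
    out = dict(resource)
    for field in update_mask:
        if field in patch:
            out[field] = patch[field]
        else:
            out.pop(field, None)   # in the mask but not in the body => clear
    return out

material = {"title": "Old title", "description": "Keep me", "state": "DRAFT"}
# Only `title` is masked, so `description` and `state` are untouched.
updated = apply_update_mask(material, {"title": "New title"}, ["title"])
```

Sending an unmasked field in the body is simply ignored under this model, which is why the real API reports @INVALID_ARGUMENT@ only for malformed masks rather than for extra body fields.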
instance
Core.GoogleRequest
ClassroomCoursesCourseWorkMaterialsPatch
where
type
Rs ClassroomCoursesCourseWorkMaterialsPatch =
CourseWorkMaterial
type
Scopes ClassroomCoursesCourseWorkMaterialsPatch =
'[Classroom'Courseworkmaterials]
requestClient
ClassroomCoursesCourseWorkMaterialsPatch {..} =
go
courseId
id
xgafv
accessToken
callback
updateMask
uploadType
uploadProtocol
(Core.Just Core.AltJSON)
payload
classroomService
where
go =
Core.buildClient
( Core.Proxy ::
Core.Proxy
ClassroomCoursesCourseWorkMaterialsPatchResource
)
Core.mempty
| null | https://raw.githubusercontent.com/brendanhay/gogol/77394c4e0f5bd729e6fe27119701c45f9d5e1e9a/lib/services/gogol-classroom/gen/Gogol/Classroom/Courses/CourseWorkMaterials/Patch.hs | haskell | # LANGUAGE OverloadedStrings #
# LANGUAGE StrictData #
|
Stability : auto-generated
* Resource
** Constructing a Request
'ClassroomCoursesCourseWorkMaterialsPatch' request conforms to.
/See:/ 'newClassroomCoursesCourseWorkMaterialsPatch' smart constructor.
| V1 error format.
| OAuth access token.
| Identifier of the course. This identifier can be either the Classroom-assigned identifier or an alias.
| Identifier of the course work material.
| Multipart request metadata.
| Mask that identifies which fields on the course work material to update. This field is required to do an update. The update fails if invalid fields are specified. If a field supports empty values, it can be cleared by specifying it in the update mask and not in the course work material object. If a field that does not support empty values is included in the update mask and not set in the course work material object, an @INVALID_ARGUMENT@ error is returned. The following fields may be specified by teachers: * @title@ * @description@ * @state@ * @scheduled_time@ * @topic_id@
| Upload protocol for media (e.g. \"raw\", \"multipart\").
| Creates a value of 'ClassroomCoursesCourseWorkMaterialsPatch' with the minimum fields required to make a request.
| Identifier of the course. This identifier can be either the Classroom-assigned identifier or an alias. See 'courseId'.
| Identifier of the course work material. See 'id'.
| Multipart request metadata. See 'payload'. | # LANGUAGE DataKinds #
# LANGUAGE DeriveGeneric #
# LANGUAGE DerivingStrategies #
# LANGUAGE DuplicateRecordFields #
# LANGUAGE FlexibleInstances #
# LANGUAGE GeneralizedNewtypeDeriving #
# LANGUAGE LambdaCase #
# LANGUAGE PatternSynonyms #
# LANGUAGE RecordWildCards #
# LANGUAGE TypeFamilies #
# LANGUAGE TypeOperators #
# LANGUAGE NoImplicitPrelude #
# OPTIONS_GHC -fno - warn - duplicate - exports #
# OPTIONS_GHC -fno - warn - name - shadowing #
# OPTIONS_GHC -fno - warn - unused - binds #
# OPTIONS_GHC -fno - warn - unused - imports #
{-# OPTIONS_GHC -fno-warn-unused-matches #-}
-- Module      : Gogol.Classroom.Courses.CourseWorkMaterials.Patch
-- Copyright   : (c) 2015-2022 Brendan Hay
-- License     : Mozilla Public License, v. 2.0.
-- Maintainer  : Brendan Hay <brendan.g.hay+gogol@gmail.com>
-- Portability : non-portable (GHC extensions)
-- Updates one or more fields of a course work material. This method returns the following error codes: * @PERMISSION_DENIED@ if the requesting developer project did not create the corresponding course work material, or for access errors. * @INVALID_ARGUMENT@ if the request is malformed. * @FAILED_PRECONDITION@ if the requested course work material has already been deleted. * @NOT_FOUND@ if the requested course or course work material does not exist.
-- /See:/ <https://developers.google.com/classroom/ Google Classroom API Reference> for @classroom.courses.courseWorkMaterials.patch@.
module Gogol.Classroom.Courses.CourseWorkMaterials.Patch
ClassroomCoursesCourseWorkMaterialsPatchResource,
ClassroomCoursesCourseWorkMaterialsPatch (..),
newClassroomCoursesCourseWorkMaterialsPatch,
)
where
import Gogol.Classroom.Types
import qualified Gogol.Prelude as Core
-- | A resource alias for @classroom.courses.courseWorkMaterials.patch@ method which the
type ClassroomCoursesCourseWorkMaterialsPatchResource =
"v1"
Core.:> "courses"
Core.:> Core.Capture "courseId" Core.Text
Core.:> "courseWorkMaterials"
Core.:> Core.Capture "id" Core.Text
Core.:> Core.QueryParam "$.xgafv" Xgafv
Core.:> Core.QueryParam "access_token" Core.Text
Core.:> Core.QueryParam "callback" Core.Text
Core.:> Core.QueryParam "updateMask" Core.FieldMask
Core.:> Core.QueryParam "uploadType" Core.Text
Core.:> Core.QueryParam "upload_protocol" Core.Text
Core.:> Core.QueryParam "alt" Core.AltJSON
Core.:> Core.ReqBody '[Core.JSON] CourseWorkMaterial
Core.:> Core.Patch '[Core.JSON] CourseWorkMaterial
-- | Updates one or more fields of a course work material. This method returns the following error codes: * @PERMISSION_DENIED@ if the requesting developer project did not create the corresponding course work material, or for access errors. * @INVALID_ARGUMENT@ if the request is malformed. * @FAILED_PRECONDITION@ if the requested course work material has already been deleted. * @NOT_FOUND@ if the requested course or course work material does not exist.
data ClassroomCoursesCourseWorkMaterialsPatch = ClassroomCoursesCourseWorkMaterialsPatch
xgafv :: (Core.Maybe Xgafv),
accessToken :: (Core.Maybe Core.Text),
-- | JSONP
callback :: (Core.Maybe Core.Text),
courseId :: Core.Text,
id :: Core.Text,
payload :: CourseWorkMaterial,
updateMask :: (Core.Maybe Core.FieldMask),
-- | Legacy upload protocol for media (e.g. \"media\", \"multipart\").
uploadType :: (Core.Maybe Core.Text),
uploadProtocol :: (Core.Maybe Core.Text)
}
deriving (Core.Eq, Core.Show, Core.Generic)
newClassroomCoursesCourseWorkMaterialsPatch ::
Core.Text ->
Core.Text ->
CourseWorkMaterial ->
ClassroomCoursesCourseWorkMaterialsPatch
newClassroomCoursesCourseWorkMaterialsPatch courseId id payload =
ClassroomCoursesCourseWorkMaterialsPatch
{ xgafv = Core.Nothing,
accessToken = Core.Nothing,
callback = Core.Nothing,
courseId = courseId,
id = id,
payload = payload,
updateMask = Core.Nothing,
uploadType = Core.Nothing,
uploadProtocol = Core.Nothing
}
instance
Core.GoogleRequest
ClassroomCoursesCourseWorkMaterialsPatch
where
type
Rs ClassroomCoursesCourseWorkMaterialsPatch =
CourseWorkMaterial
type
Scopes ClassroomCoursesCourseWorkMaterialsPatch =
'[Classroom'Courseworkmaterials]
requestClient
ClassroomCoursesCourseWorkMaterialsPatch {..} =
go
courseId
id
xgafv
accessToken
callback
updateMask
uploadType
uploadProtocol
(Core.Just Core.AltJSON)
payload
classroomService
where
go =
Core.buildClient
( Core.Proxy ::
Core.Proxy
ClassroomCoursesCourseWorkMaterialsPatchResource
)
Core.mempty
|
9cf9b881eb4e0884fd7c5c4ec0fbf0fb83c87d92fd49c995fce17c91b78e4010 | RefactoringTools/HaRe | D.hs | module C where
-- Test for refactor of if to case
-- The comments on the then and else legs should be preserved
foo x = if (odd x)
then do
-- This is an odd result
bob x 1
else do
-- This is an even result
bob x 2
bob x y = x + y
| null | https://raw.githubusercontent.com/RefactoringTools/HaRe/ef5dee64c38fb104e6e5676095946279fbce381c/test/testdata/Case/D.hs | haskell | Test for refactor of if to case
The comments on the then and else legs should be preserved
This is an odd result
This is an even result | module C where
foo x = if (odd x)
then do
bob x 1
else do
bob x 2
bob x y = x + y
|