added
stringdate 2025-04-01 04:05:38
2025-04-01 07:14:06
| created
timestamp[us]date 2001-10-09 16:19:16
2025-01-01 03:51:31
| id
stringlengths 4
10
| metadata
dict | source
stringclasses 2
values | text
stringlengths 0
1.61M
|
|---|---|---|---|---|---|
2025-04-01T06:38:27.266245
| 2019-11-08T10:54:55
|
519948419
|
{
"authors": [
"Trollwut",
"hacknug"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5459",
"repo": "dracula/dracula-theme",
"url": "https://github.com/dracula/dracula-theme/issues/347"
}
|
gharchive/issue
|
Dracula for Vimium-FF (Firefox addon)
I made a custom CSS for the Firefox addon Vimium-FF.
If you want, it could be added to the collection of Dracula themes.
My repo is accessible here: https://github.com/Trollwut/vimium-dracula
Hey @Trollwut 👋 README.md should look like the template. If you could make a change and let me know once it's been updated, I'll invite you to the org so you can transfer the repo and maintain it there 👍
hey @hacknug !
Thanks for that! I've rewritten the README.md to fit to the template. Only the link to the contributers is not set yet, as I don't know how the repo's full name will be in the Dracula repo. :)
May you please have a look at it?
Link will be dracula/vimium if it's also compatible with the linked Chrome version. Please confirm it will be.
Just sent you the invite to join the org. Once you join, you'll be able to transfer your repo to it (make sure you do this so GitHub takes care of redirecting users visiting your current URL). let me know once it's done and I'll set the right permission for you to take care of it 😉
yeah bby, tested it myself!
Working on Chromium 78 with the latest Vimium 1.64.6. Will tell you, when I tranfered the repo!
Aaaand it's transfered!
I selected only the Vimium group to have access to it. Please adjust if this wasn't sufficient.
Also please check if I did that right, as this was my first transfer of a repo :)
@Trollwut everything looking good. Thank you so much for your contribution! 🎉
|
2025-04-01T06:38:27.268980
| 2022-09-11T09:52:16
|
1368892431
|
{
"authors": [
"hajosattila",
"syrofoam"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5460",
"repo": "dracula/foot",
"url": "https://github.com/dracula/foot/issues/3"
}
|
gharchive/issue
|
Shell completion (and other) text not visible (very faint)
Arch Linux | 5.19.7-AMD | sway version: 1.7 | foot version: 1.13.1 | Dracula theme
foot.ini: https://pastebin.com/s2tRCWkb
Sorry for my bad English!
Issue was patched with the merge request and the colors where updated.
If you update or download latest you should be good.
|
2025-04-01T06:38:27.360714
| 2022-09-14T00:11:13
|
1372146382
|
{
"authors": [
"drahnr",
"lopopolo"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5461",
"repo": "drahnr/cargo-spellcheck",
"url": "https://github.com/drahnr/cargo-spellcheck/issues/277"
}
|
gharchive/issue
|
reflow sub command transposes // and leading space
Describe the bug
cargo spellcheck reflow produces bad comments.
To Reproduce
Steps to reproduce the behaviour:
A file containing:
use std::any::Any;
use std::borrow::Cow;
use std::collections::HashSet;
use std::ffi::CStr;
use std::hash::{Hash, Hasher};
use std::ptr::NonNull;
use crate::def::{ConstantNameError, EnclosingRubyScope, Free, Method, NotDefinedError};
use crate::error::Error;
use crate::ffi::InterpreterExtractError;
use crate::method;
use crate::sys;
use crate::Artichoke;
mod registry;
pub use registry::Registry;
#[derive(Debug)]
pub struct Builder<'a> {
interp: &'a mut Artichoke,
spec: &'a Spec,
is_mrb_tt_data: bool,
super_class: Option<NonNull<sys::RClass>>,
methods: HashSet<method::Spec>,
}
impl<'a> Builder<'a> {
#[must_use]
pub fn for_spec(interp: &'a mut Artichoke, spec: &'a Spec) -> Self {
Self {
interp,
spec,
is_mrb_tt_data: false,
super_class: None,
methods: HashSet::default(),
}
}
#[must_use]
pub fn value_is_rust_object(mut self) -> Self {
self.is_mrb_tt_data = true;
self
}
pub fn with_super_class<T, U>(mut self, classname: U) -> Result<Self, Error>
where
T: Any,
U: Into<Cow<'static, str>>,
{
let state = self.interp.state.as_deref().ok_or_else(InterpreterExtractError::new)?;
let rclass = if let Some(spec) = state.classes.get::<T>() {
spec.rclass()
} else {
return Err(NotDefinedError::super_class(classname.into()).into());
};
let rclass = unsafe { self.interp.with_ffi_boundary(|mrb| rclass.resolve(mrb))? };
if let Some(rclass) = rclass {
self.super_class = Some(rclass);
Ok(self)
} else {
Err(NotDefinedError::super_class(classname.into()).into())
}
}
pub fn add_method<T>(mut self, name: T, method: Method, args: sys::mrb_aspec) -> Result<Self, ConstantNameError>
where
T: Into<Cow<'static, str>>,
{
let spec = method::Spec::new(method::Type::Instance, name.into(), method, args)?;
self.methods.insert(spec);
Ok(self)
}
pub fn add_self_method<T>(
mut self,
name: T,
method: Method,
args: sys::mrb_aspec,
) -> Result<Self, ConstantNameError>
where
T: Into<Cow<'static, str>>,
{
let spec = method::Spec::new(method::Type::Class, name.into(), method, args)?;
self.methods.insert(spec);
Ok(self)
}
pub fn define(self) -> Result<(), NotDefinedError> {
use sys::mrb_vtype::MRB_TT_DATA;
let name = self.spec.name_c_str().as_ptr();
let mut super_class = if let Some(super_class) = self.super_class {
super_class
} else {
// SAFETY: Although this direct access of the `mrb` property on the
// interp does not go through `Artichoke::with_ffi_boundary`, no
// `MRB_API` functions are called, which means it is not required to
// re-box the Artichoke `State` into the `mrb_state->ud` pointer.
//
// This code only performs a memory access to read a field from the
// `mrb_state`.
let rclass = unsafe { self.interp.mrb.as_ref().object_class };
NonNull::new(rclass).ok_or_else(|| NotDefinedError::super_class("Object"))?
};
let rclass = self.spec.rclass();
let rclass = unsafe { self.interp.with_ffi_boundary(|mrb| rclass.resolve(mrb)) };
let mut rclass = if let Ok(Some(rclass)) = rclass {
rclass
} else if let Some(enclosing_scope) = self.spec.enclosing_scope() {
let scope = unsafe { self.interp.with_ffi_boundary(|mrb| enclosing_scope.rclass(mrb)) };
if let Ok(Some(mut scope)) = scope {
let rclass = unsafe {
self.interp.with_ffi_boundary(|mrb| {
sys::mrb_define_class_under(mrb, scope.as_mut(), name, super_class.as_mut())
})
};
let rclass = rclass.map_err(|_| NotDefinedError::class(self.spec.name()))?;
NonNull::new(rclass).ok_or_else(|| NotDefinedError::class(self.spec.name()))?
} else {
return Err(NotDefinedError::enclosing_scope(enclosing_scope.fqname().into_owned()));
}
} else {
let rclass = unsafe {
self.interp
.with_ffi_boundary(|mrb| sys::mrb_define_class(mrb, name, super_class.as_mut()))
};
let rclass = rclass.map_err(|_| NotDefinedError::class(self.spec.name()))?;
NonNull::new(rclass).ok_or_else(|| NotDefinedError::class(self.spec.name()))?
};
for method in &self.methods {
unsafe {
method.define(self.interp, rclass.as_mut())?;
}
}
// If a `Spec` defines a `Class` whose instances own a pointer to a
// Rust object, mark them as `MRB_TT_DATA`.
if self.is_mrb_tt_data {
unsafe {
sys::mrb_sys_set_instance_tt(rclass.as_mut(), MRB_TT_DATA);
}
}
Ok(())
}
}
#[derive(Debug, Clone, PartialEq, Eq)]
pub struct Rclass {
name: &'static CStr,
enclosing_scope: Option<EnclosingRubyScope>,
}
impl Rclass {
#[must_use]
pub const fn new(name: &'static CStr, enclosing_scope: Option<EnclosingRubyScope>) -> Self {
Self { name, enclosing_scope }
}
/// Resolve a type's [`sys::RClass`] using its enclosing scope and name.
///
/// # Safety
///
/// This function must be called within an [`Artichoke::with_ffi_boundary`]
/// closure because the FFI APIs called in this function may require access
/// to the Artichoke [`State`](crate::state::State).
pub unsafe fn resolve(&self, mrb: *mut sys::mrb_state) -> Option<NonNull<sys::RClass>> {
let class_name = self.name.as_ptr();
if let Some(ref scope) = self.enclosing_scope {
// short circuit if enclosing scope does not exist.
let mut scope = scope.rclass(mrb)?;
let is_defined_under = sys::mrb_class_defined_under(mrb, scope.as_mut(), class_name);
if is_defined_under {
// Enclosing scope exists.
// Class is defined under the enclosing scope.
let class = sys::mrb_class_get_under(mrb, scope.as_mut(), class_name);
NonNull::new(class)
} else {
// Enclosing scope exists.
// Class is not defined under the enclosing scope.
None
}
} else {
let is_defined = sys::mrb_class_defined(mrb, class_name);
if is_defined {
// Class exists in root scope.
let class = sys::mrb_class_get(mrb, class_name);
NonNull::new(class)
} else {
// Class does not exist in root scope.
None
}
}
}
}
#[derive(Debug)]
pub struct Spec {
name: Cow<'static, str>,
name_cstr: &'static CStr,
data_type: Box<sys::mrb_data_type>,
enclosing_scope: Option<EnclosingRubyScope>,
}
impl Spec {
pub fn new<T>(
name: T,
name_cstr: &'static CStr,
enclosing_scope: Option<EnclosingRubyScope>,
free: Option<Free>,
) -> Result<Self, ConstantNameError>
where
T: Into<Cow<'static, str>>,
{
let name = name.into();
// SAFETY: The constructed `mrb_data_type` has `'static` lifetime:
//
// - `name_cstr` is `&'static` so it will outlive the `data_type`.
// - `Spec` does not offer mutable access to these fields.
let data_type = sys::mrb_data_type {
struct_name: name_cstr.as_ptr(),
dfree: free,
};
let data_type = Box::new(data_type);
Ok(Self {
name,
name_cstr,
data_type,
enclosing_scope,
})
}
#[must_use]
pub fn data_type(&self) -> *const sys::mrb_data_type {
self.data_type.as_ref()
}
#[must_use]
pub fn name(&self) -> Cow<'static, str> {
match &self.name {
Cow::Borrowed(name) => Cow::Borrowed(name),
Cow::Owned(name) => name.clone().into(),
}
}
#[must_use]
pub fn name_c_str(&self) -> &'static CStr {
self.name_cstr
}
#[must_use]
pub fn enclosing_scope(&self) -> Option<&EnclosingRubyScope> {
self.enclosing_scope.as_ref()
}
#[must_use]
pub fn fqname(&self) -> Cow<'_, str> {
if let Some(scope) = self.enclosing_scope() {
let mut fqname = String::from(scope.fqname());
fqname.push_str("::");
fqname.push_str(self.name.as_ref());
fqname.into()
} else {
self.name.as_ref().into()
}
}
#[must_use]
pub fn rclass(&self) -> Rclass {
Rclass::new(self.name_cstr, self.enclosing_scope.clone())
}
}
impl Hash for Spec {
fn hash<H: Hasher>(&self, state: &mut H) {
self.name().hash(state);
self.enclosing_scope().hash(state);
}
}
impl Eq for Spec {}
impl PartialEq for Spec {
fn eq(&self, other: &Self) -> bool {
self.fqname() == other.fqname()
}
}
#[cfg(test)]
mod tests {
use spinoso_exception::StandardError;
use crate::extn::core::kernel::Kernel;
use crate::test::prelude::*;
struct RustError;
#[test]
fn super_class() {
let mut interp = interpreter();
let spec = class::Spec::new("RustError", qed::const_cstr_from_str!("RustError\0"), None, None).unwrap();
class::Builder::for_spec(&mut interp, &spec)
.with_super_class::<StandardError, _>("StandardError")
.unwrap()
.define()
.unwrap();
interp.def_class::<RustError>(spec).unwrap();
let result = interp.eval(b"RustError.new.is_a?(StandardError)").unwrap();
let result = result.try_convert_into::<bool>(&interp).unwrap();
assert!(result, "RustError instances are instance of StandardError");
let result = interp.eval(b"RustError < StandardError").unwrap();
let result = result.try_convert_into::<bool>(&interp).unwrap();
assert!(result, "RustError inherits from StandardError");
}
#[test]
fn rclass_for_undef_root_class() {
let mut interp = interpreter();
let spec = class::Spec::new("Foo", qed::const_cstr_from_str!("Foo\0"), None, None).unwrap();
let rclass = unsafe { interp.with_ffi_boundary(|mrb| spec.rclass().resolve(mrb)) }.unwrap();
assert!(rclass.is_none());
}
#[test]
fn rclass_for_undef_nested_class() {
let mut interp = interpreter();
let scope = interp.module_spec::<Kernel>().unwrap().unwrap();
let spec = class::Spec::new(
"Foo",
qed::const_cstr_from_str!("Foo\0"),
Some(EnclosingRubyScope::module(scope)),
None,
)
.unwrap();
let rclass = unsafe { interp.with_ffi_boundary(|mrb| spec.rclass().resolve(mrb)) }.unwrap();
assert!(rclass.is_none());
}
#[test]
fn rclass_for_nested_class() {
let mut interp = interpreter();
interp.eval(b"module Foo; class Bar; end; end").unwrap();
let spec = module::Spec::new(&mut interp, "Foo", qed::const_cstr_from_str!("Foo\0"), None).unwrap();
let spec = class::Spec::new(
"Bar",
qed::const_cstr_from_str!("Bar\0"),
Some(EnclosingRubyScope::module(&spec)),
None,
)
.unwrap();
let rclass = unsafe { interp.with_ffi_boundary(|mrb| spec.rclass().resolve(mrb)) }.unwrap();
assert!(rclass.is_some());
}
#[test]
fn rclass_for_nested_class_under_class() {
let mut interp = interpreter();
interp.eval(b"class Foo; class Bar; end; end").unwrap();
let spec = class::Spec::new("Foo", qed::const_cstr_from_str!("Foo\0"), None, None).unwrap();
let spec = class::Spec::new(
"Bar",
qed::const_cstr_from_str!("Bar\0"),
Some(EnclosingRubyScope::class(&spec)),
None,
)
.unwrap();
let rclass = unsafe { interp.with_ffi_boundary(|mrb| spec.rclass().resolve(mrb)) }.unwrap();
assert!(rclass.is_some());
}
}
Run cargo spellcheck reflow
Observe this malformed diff:
diff --git i/artichoke-backend/src/class.rs w/artichoke-backend/src/class.rs
index 941b22a09c..42f4a3a881 100644
--- i/artichoke-backend/src/class.rs
+++ w/artichoke-backend/src/class.rs
@@ -138,8 +138,8 @@ impl<'a> Builder<'a> {
}
}
- // If a `Spec` defines a `Class` whose instances own a pointer to a
- // Rust object, mark them as `MRB_TT_DATA`.
+ // If a `Spec` defines a `Class` whose instances own a pointer to a Rust
+ //object, mark them as `MRB_TT_DATA`.
if self.is_mrb_tt_data {
unsafe {
sys::mrb_sys_set_instance_tt(rclass.as_mut(), MRB_TT_DATA);
@@ -175,13 +175,13 @@ impl Rclass {
let mut scope = scope.rclass(mrb)?;
let is_defined_under = sys::mrb_class_defined_under(mrb, scope.as_mut(), class_name);
if is_defined_under {
- // Enclosing scope exists.
- // Class is defined under the enclosing scope.
+ // Enclosing scope exists. Class is defined under the enclosing
+ //scope.
let class = sys::mrb_class_get_under(mrb, scope.as_mut(), class_name);
NonNull::new(class)
} else {
- // Enclosing scope exists.
- // Class is not defined under the enclosing scope.
+ // Enclosing scope exists. Class is not defined under the
+ //enclosing scope.
None
}
} else {
Expected behavior
No extra space before comment on second line, a space after the //.
Screenshots
Please complete the following information:
System: macOS
Obtained: cargo
Version: cargo-spellcheck 0.11.3
Additional context
Does it only happen with two line comments?
#238 could be related
@drahnr from a quick peek, it occurs in more places than just two line comments. This one is such an example:
diff --git i/artichoke-backend/src/def.rs w/artichoke-backend/src/def.rs
index 9d1a766659..82963c879e 100644
--- i/artichoke-backend/src/def.rs
+++ w/artichoke-backend/src/def.rs
@@ -54,10 +54,10 @@ where
// Rather than attempt a free and virtually guaranteed segfault, log
// loudly and short-circuit; a leak is better than a crash.
//
- // `box_unbox_free::<T>` is only ever called in an FFI context when
- // there are C frames in the stack. Using `eprintln!` or unwrapping the
- // error from `write!` here is undefined behavior and may result in an
- // abort. Instead, suppress the error.
+ // `box_unbox_free::<T>` is only ever called in an FFI context when there
+ //are C frames in the stack. Using `eprintln!` or unwrapping the error
+ //from `write!` here is undefined behavior and may result in an abort.
+ //Instead, suppress the error.
let _ignored = write!(
io::stderr(),
"Received null pointer in box_unbox_free::<{}>",
Full diff
diff --git i/README.md w/README.md
index 69faaff23a..a6dbbd3053 100644
--- i/README.md
+++ w/README.md
@@ -135,10 +135,10 @@ If Artichoke does not run Ruby source code in the same way that MRI does, it is
a bug and we would appreciate if you [filed an issue so we can fix
it][file-an-issue].
-If you would like to contribute code 👩💻👨💻, find an issue that looks interesting
-and leave a comment that you're beginning to investigate. If there is no issue,
-please file one before beginning to work on a PR. [Good first issues are labeled
-`E-easy`][e-easy].
+If you would like to contribute code 👩💻👨💻, find an issue that looks
+interesting and leave a comment that you're beginning to investigate. If there
+is no issue, please file one before beginning to work on a PR. [Good first
+issues are labeled `E-easy`][e-easy].
### Discussion
diff --git i/artichoke-backend/src/class.rs w/artichoke-backend/src/class.rs
index 840e98fdc3..d65a201c31 100644
--- i/artichoke-backend/src/class.rs
+++ w/artichoke-backend/src/class.rs
@@ -138,8 +138,8 @@ impl<'a> Builder<'a> {
}
}
- // If a `Spec` defines a `Class` whose instances own a pointer to a
- // Rust object, mark them as `MRB_TT_DATA`.
+ // If a `Spec` defines a `Class` whose instances own a pointer to a Rust
+ //object, mark them as `MRB_TT_DATA`.
if self.is_mrb_tt_data {
unsafe {
sys::mrb_sys_set_instance_tt(rclass.as_mut(), MRB_TT_DATA);
@@ -175,13 +175,13 @@ impl Rclass {
let mut scope = scope.rclass(mrb)?;
let is_defined_under = sys::mrb_class_defined_under(mrb, scope.as_mut(), class_name);
if is_defined_under {
- // Enclosing scope exists.
- // Class is defined under the enclosing scope.
+ // Enclosing scope exists. Class is defined under the enclosing
+ //scope.
let class = sys::mrb_class_get_under(mrb, scope.as_mut(), class_name);
NonNull::new(class)
} else {
- // Enclosing scope exists.
- // Class is not defined under the enclosing scope.
+ // Enclosing scope exists. Class is not defined under the
+ //enclosing scope.
None
}
} else {
diff --git i/artichoke-backend/src/class/registry.rs w/artichoke-backend/src/class/registry.rs
index 2244c18b67..6369f1448e 100644
--- i/artichoke-backend/src/class/registry.rs
+++ w/artichoke-backend/src/class/registry.rs
@@ -233,9 +233,8 @@ where
self.0.shrink_to_fit();
}
- /// Shrinks the capacity of the registry with a lower bound.
- /// The capacity will remain at least as large as both the length and the
- /// supplied value.
+ /// Shrinks the capacity of the registry with a lower bound. The capacity
+ /// will remain at least as large as both the length and the supplied value.
///
/// If the current capacity is less than the lower limit, this is a no-op.
pub fn shrink_to(&mut self, min_capacity: usize) {
diff --git i/artichoke-backend/src/def.rs w/artichoke-backend/src/def.rs
index 9d1a766659..82963c879e 100644
--- i/artichoke-backend/src/def.rs
+++ w/artichoke-backend/src/def.rs
@@ -54,10 +54,10 @@ where
// Rather than attempt a free and virtually guaranteed segfault, log
// loudly and short-circuit; a leak is better than a crash.
//
- // `box_unbox_free::<T>` is only ever called in an FFI context when
- // there are C frames in the stack. Using `eprintln!` or unwrapping the
- // error from `write!` here is undefined behavior and may result in an
- // abort. Instead, suppress the error.
+ // `box_unbox_free::<T>` is only ever called in an FFI context when there
+ //are C frames in the stack. Using `eprintln!` or unwrapping the error
+ //from `write!` here is undefined behavior and may result in an abort.
+ //Instead, suppress the error.
let _ignored = write!(
io::stderr(),
"Received null pointer in box_unbox_free::<{}>",
@@ -120,9 +120,9 @@ pub struct ModuleScope {
/// Typesafe wrapper for the [`RClass *`](sys::RClass) of the enclosing scope
/// for an mruby `Module` or `Class`.
///
-/// In Ruby, classes and modules can be defined inside another class or
-/// module. mruby only supports resolving [`RClass`](sys::RClass) pointers
-/// relative to an enclosing scope. This can be the top level with
+/// In Ruby, classes and modules can be defined inside another class or module.
+/// mruby only supports resolving [`RClass`](sys::RClass) pointers relative to
+/// an enclosing scope. This can be the top level with
/// [`mrb_class_get`](sys::mrb_class_get) and
/// [`mrb_module_get`](sys::mrb_module_get) or it can be under another class
/// with [`mrb_class_get_under`](sys::mrb_class_get_under) or module with
diff --git i/artichoke-backend/src/error.rs w/artichoke-backend/src/error.rs
index a64747ceba..95d097dbe9 100644
--- i/artichoke-backend/src/error.rs
+++ w/artichoke-backend/src/error.rs
@@ -97,9 +97,9 @@ where
// `mrb_exc_raise` will call longjmp which will unwind the stack.
sys::mrb_exc_raise(mrb, exc);
- // SAFETY: This line is unreachable because `raise` will unwind the
- // stack with `longjmp` when calling `sys::mrb_exc_raise` in the
- // preceding line.
+ // SAFETY: This line is unreachable because `raise` will unwind the stack
+ //with `longjmp` when calling `sys::mrb_exc_raise` in the preceding
+ //line.
hint::unreachable_unchecked()
}
@@ -107,8 +107,8 @@ where
// log loudly to stderr and attempt to fallback to a runtime error.
emit_fatal_warning!("Unable to raise exception: {:?}", exception);
- // Any non-`Copy` objects that we haven't cleaned up at this point will
- // leak, so drop everything.
+ // Any non-`Copy` objects that we haven't cleaned up at this point will leak,
+ //so drop everything.
drop(exception);
// `mrb_sys_raise` will call longjmp which will unwind the stack.
diff --git i/artichoke-backend/src/exception_handler.rs w/artichoke-backend/src/exception_handler.rs
index c4d3dd2759..cd65f63061 100644
--- i/artichoke-backend/src/exception_handler.rs
+++ w/artichoke-backend/src/exception_handler.rs
@@ -130,14 +130,13 @@ impl From<CaughtException> for Error {
pub fn last_error(interp: &mut Artichoke, exception: Value) -> Result<Error, Error> {
let mut arena = interp.create_arena_savepoint()?;
- // Clear the current exception from the mruby interpreter so subsequent
- // calls to the mruby VM are not tainted by an error they did not
- // generate.
+ // Clear the current exception from the mruby interpreter so subsequent calls
+ //to the mruby VM are not tainted by an error they did not generate.
//
- // We must clear the pointer at the beginning of this function so we can
- // use the mruby VM to inspect the exception once we turn it into an
- // `mrb_value`. `Value::funcall` handles errors by calling this
- // function, so not clearing the exception results in a stack overflow.
+ // We must clear the pointer at the beginning of this function so we can use
+ //the mruby VM to inspect the exception once we turn it into an `mrb_value`.
+ //`Value::funcall` handles errors by calling this function, so not clearing
+ //the exception results in a stack overflow.
// Generate exception metadata in by executing the Ruby code:
//
@@ -146,11 +145,11 @@ pub fn last_error(interp: &mut Artichoke, exception: Value) -> Result<Error, Err
// message = exception.message
// ```
- // Sometimes when hacking on `extn/core` it is possible to enter a
- // crash loop where an exception is captured by this handler, but
- // extracting the exception name or backtrace throws again.
- // Un-commenting the following print statement will at least get you the
- // exception class and message, which should help debugging.
+ // Sometimes when hacking on `extn/core` it is possible to enter a crash loop
+ //where an exception is captured by this handler, but extracting the
+ //exception name or backtrace throws again. Un-commenting the following
+ //print statement will at least get you the exception class and message,
+ //which should help debugging.
//
// ```
// let message = exception.funcall(&mut arena, "message", &[], None)?;
diff --git i/artichoke-backend/src/extn/core/array/mod.rs w/artichoke-backend/src/extn/core/array/mod.rs
index 21262148db..f2709adabc 100644
--- i/artichoke-backend/src/extn/core/array/mod.rs
+++ w/artichoke-backend/src/extn/core/array/mod.rs
@@ -265,9 +265,9 @@ impl BoxUnboxVmValue for Array {
// SAFETY: `Array` is backed by a `Vec` which can allocate at
// most `isize::MAX` bytes.
//
- // `mrb_value` is not a ZST, so in practice, `len` and
- // `capacity` will never overflow `mrb_int`, which is an `i64`
- // on 64-bit targets.
+ // `mrb_value` is not a ZST, so in practice, `len` and `capacity`
+ //will never overflow `mrb_int`, which is an `i64` on 64-bit
+ //targets.
//
// On 32-bit targets, `usize` is `u32` which will never overflow
// `i64`. Artichoke unconditionally compiles mruby with `-DMRB_INT64`.
diff --git i/artichoke-backend/src/extn/core/array/trampoline.rs w/artichoke-backend/src/extn/core/array/trampoline.rs
index 04a0992e72..0d9526a38d 100644
--- i/artichoke-backend/src/extn/core/array/trampoline.rs
+++ w/artichoke-backend/src/extn/core/array/trampoline.rs
@@ -228,9 +228,9 @@ pub fn initialize(
second: Option<Value>,
block: Option<Block>,
) -> Result<Value, Error> {
- // Pack an empty `Array` into the given uninitialized `RArray *` so it can
- // be safely marked if an mruby allocation occurs and a GC is triggered in
- // `Array::initialize`.
+ // Pack an empty `Array` into the given uninitialized `RArray *` so it can be
+ //safely marked if an mruby allocation occurs and a GC is triggered in
+ //`Array::initialize`.
//
// Allocations are likely in the case where a block is passed to
// `Array#initialize` or when the first and second args must be coerced with
@@ -241,9 +241,9 @@ pub fn initialize(
}
pub fn initialize_copy(interp: &mut Artichoke, ary: Value, mut from: Value) -> Result<Value, Error> {
- // Pack an empty `Array` into the given uninitialized `RArray *` so it can
- // be safely marked if an mruby allocation occurs and a GC is triggered in
- // `Array::initialize`.
+ // Pack an empty `Array` into the given uninitialized `RArray *` so it can be
+ //safely marked if an mruby allocation occurs and a GC is triggered in
+ //`Array::initialize`.
//
// This ensures the given `RArry *` is initialized even if a non-`Array`
// object is called with `Array#initialize_copy` and the
@@ -314,8 +314,8 @@ pub fn reverse_bang(interp: &mut Artichoke, mut ary: Value) -> Result<Value, Err
}
let mut array = unsafe { Array::unbox_from_value(&mut ary, interp)? };
- // SAFETY: Reversing an `Array` in place does not reallocate it. The array
- // is repacked without any intervening interpreter heap allocations.
+ // SAFETY: Reversing an `Array` in place does not reallocate it. The array is
+ //repacked without any intervening interpreter heap allocations.
unsafe {
let array_mut = array.as_inner_mut();
array_mut.reverse();
@@ -346,8 +346,8 @@ pub fn shift(interp: &mut Artichoke, mut ary: Value, count: Option<Value>) -> Re
// garbage collection, otherwise marking the children in `ary` will have
// undefined behavior.
//
- // The call to `Array::alloc_value` happens outside this block after
- // the `Array` has been repacked.
+ // The call to `Array::alloc_value` happens outside this block after the
+ //`Array` has been repacked.
let shifted = unsafe {
let array_mut = array.as_inner_mut();
let shifted = array_mut.shift_n(count);
@@ -360,8 +360,8 @@ pub fn shift(interp: &mut Artichoke, mut ary: Value, count: Option<Value>) -> Re
Array::alloc_value(shifted, interp)
} else {
- // SAFETY: The call to `Array::shift` will potentially invalidate the
- // raw parts stored in `ary`'s `RArray*`.
+ // SAFETY: The call to `Array::shift` will potentially invalidate the raw
+ //parts stored in `ary`'s `RArray*`.
//
// The raw parts in `ary`'s `RArray *` must be repacked before a
// potential garbage collection, otherwise marking the children in `ary`
diff --git i/artichoke-backend/src/extn/core/array/wrapper.rs w/artichoke-backend/src/extn/core/array/wrapper.rs
index 96f1e98087..d871d3a39e 100644
--- i/artichoke-backend/src/extn/core/array/wrapper.rs
+++ w/artichoke-backend/src/extn/core/array/wrapper.rs
@@ -397,7 +397,7 @@ impl Array {
/// Returns a reference to an element at the index.
///
- /// Unlike [`Vec`], this method does not support indexing with a range. See
+ /// Unlike [`Vec`], this method does not support indexing with a range. See
/// the [`slice`](Self::slice) method for retrieving a sub-slice from the
/// array.
#[inline]
diff --git i/artichoke-backend/src/extn/core/float/mod.rs w/artichoke-backend/src/extn/core/float/mod.rs
index c2ab5259a2..e74b685da5 100644
--- i/artichoke-backend/src/extn/core/float/mod.rs
+++ w/artichoke-backend/src/extn/core/float/mod.rs
@@ -146,13 +146,10 @@ impl Float {
///
/// Other modes include:
///
- /// | mode | value |
- /// |------------------------------------|-------|
- /// | Indeterminable | -1 |
- /// | Rounding towards zero | 0 |
- /// | Rounding to the nearest number | 1 |
- /// | Rounding towards positive infinity | 2 |
- /// | Rounding towards negative infinity | 3 |
+ /// | mode | value | |------------------------------------|-------| |
+ /// Indeterminable | -1 | | Rounding towards zero | 0 | | Rounding to the
+ /// nearest number | 1 | | Rounding towards positive infinity | 2 | |
+ /// Rounding towards negative infinity | 3 |
///
/// # Rust Caveats
///
diff --git i/artichoke-backend/src/extn/core/integer/mod.rs w/artichoke-backend/src/extn/core/integer/mod.rs
index 3e64dad9f1..413d84f54c 100644
--- i/artichoke-backend/src/extn/core/integer/mod.rs
+++ w/artichoke-backend/src/extn/core/integer/mod.rs
@@ -104,8 +104,8 @@ impl Integer {
message.extend_from_slice(b") not supported");
Err(NotImplementedError::from(message).into())
} else {
- // When no encoding is supplied, MRI assumes the encoding is
- // either ASCII or ASCII-8BIT.
+ // When no encoding is supplied, MRI assumes the encoding is either
+ //ASCII or ASCII-8BIT.
//
// - `Integer`s from 0..127 result in a `String` with ASCII
// encoding.
@@ -283,7 +283,8 @@ mod tests {
let expected = -i64::from(x) / i64::from(y);
quotient == expected
} else {
- // Round negative integer division toward negative infinity.
+ // Round negative integer division toward negative
+ //infinity.
let expected = (-i64::from(x) / i64::from(y)) - 1;
quotient == expected
}
@@ -311,7 +312,8 @@ mod tests {
let expected = -i64::from(x) / i64::from(y);
quotient == expected
} else {
- // Round negative integer division toward negative infinity.
+ // Round negative integer division toward negative
+ //infinity.
let expected = (-i64::from(x) / i64::from(y)) - 1;
quotient == expected
}
diff --git i/artichoke-backend/src/extn/core/kernel/require.rs w/artichoke-backend/src/extn/core/kernel/require.rs
index 24814f818f..710d7a673a 100644
--- i/artichoke-backend/src/extn/core/kernel/require.rs
+++ w/artichoke-backend/src/extn/core/kernel/require.rs
@@ -1,4 +1,4 @@
-//! [`Kernel#require`](https://ruby-doc.org/core-3.1.2/Kernel.html#method-i-require)
+//! //! [`Kernel#require`](https://ruby-doc.org/core-3.1.2/Kernel.html#method-i-require)
use std::path::{Path, PathBuf};
@@ -11,9 +11,9 @@ use crate::extn::prelude::*;
use crate::state::parser::Context;
pub fn load(interp: &mut Artichoke, mut filename: Value) -> Result<Loaded, Error> {
- // SAFETY: The extracted byte slice is converted to an owned `Vec<u8>`
- // before the interpreter is used again which protects against a garbage
- // collection invalidating the pointer.
+ // SAFETY: The extracted byte slice is converted to an owned `Vec<u8>` before
+ //the interpreter is used again which protects against a garbage collection
+ //invalidating the pointer.
let filename = unsafe { implicitly_convert_to_string(interp, &mut filename)? };
if filename.find_byte(b'\0').is_some() {
return Err(ArgumentError::with_message("path name contains null byte").into());
@@ -41,9 +41,9 @@ pub fn load(interp: &mut Artichoke, mut filename: Value) -> Result<Loaded, Error
}
pub fn require(interp: &mut Artichoke, mut filename: Value) -> Result<Required, Error> {
- // SAFETY: The extracted byte slice is converted to an owned `Vec<u8>`
- // before the interpreter is used again which protects against a garbage
- // collection invalidating the pointer.
+ // SAFETY: The extracted byte slice is converted to an owned `Vec<u8>` before
+ //the interpreter is used again which protects against a garbage collection
+ //invalidating the pointer.
let filename = unsafe { implicitly_convert_to_string(interp, &mut filename)? };
if filename.find_byte(b'\0').is_some() {
return Err(ArgumentError::with_message("path name contains null byte").into());
@@ -72,9 +72,9 @@ pub fn require(interp: &mut Artichoke, mut filename: Value) -> Result<Required,
#[allow(clippy::module_name_repetitions)]
pub fn require_relative(interp: &mut Artichoke, mut filename: Value, base: RelativePath) -> Result<Required, Error> {
- // SAFETY: The extracted byte slice is converted to an owned `Vec<u8>`
- // before the interpreter is used again which protects against a garbage
- // collection invalidating the pointer.
+ // SAFETY: The extracted byte slice is converted to an owned `Vec<u8>` before
+ //the interpreter is used again which protects against a garbage collection
+ //invalidating the pointer.
let filename = unsafe { implicitly_convert_to_string(interp, &mut filename)? };
if filename.find_byte(b'\0').is_some() {
return Err(ArgumentError::with_message("path name contains null byte").into());
diff --git i/artichoke-backend/src/extn/core/kernel/trampoline.rs w/artichoke-backend/src/extn/core/kernel/trampoline.rs
index 028784a1c8..d672da7a33 100644
--- i/artichoke-backend/src/extn/core/kernel/trampoline.rs
+++ w/artichoke-backend/src/extn/core/kernel/trampoline.rs
@@ -54,8 +54,8 @@ pub fn integer(interp: &mut Artichoke, mut val: Value, base: Option<Value>) -> R
// https://github.com/ruby/ruby/blob/v3_1_2/object.c#L3127-L3132
if let Ok(f) = val.try_convert_into::<f64>(interp) {
- // TODO: handle exception kwarg and return `nil` if it is false and f is not finite.
- // https://github.com/ruby/ruby/blob/v3_1_2/object.c#L3129
+ // TODO: handle exception kwarg and return `nil` if it is false and f is
+ //not finite. https://github.com/ruby/ruby/blob/v3_1_2/object.c#L3129
// https://github.com/ruby/ruby/blob/v3_1_2/object.c#L3131
// https://github.com/ruby/ruby/blob/v3_1_2/bignum.c#L5230-L5235
diff --git i/artichoke-backend/src/extn/core/matchdata/trampoline.rs w/artichoke-backend/src/extn/core/matchdata/trampoline.rs
index 33bbe9808b..1510e4564a 100644
--- i/artichoke-backend/src/extn/core/matchdata/trampoline.rs
+++ w/artichoke-backend/src/extn/core/matchdata/trampoline.rs
@@ -123,8 +123,8 @@ pub fn element_reference(
return interp.try_convert_mut(matched);
}
- // NOTE(lopopolo): Encapsulation is broken here by reaching into the
- // inner regexp.
+ // NOTE(lopopolo): Encapsulation is broken here by reaching into the inner
+ //regexp.
let captures_len = data.regexp.inner().captures_len(None)?;
let rangelen = i64::try_from(captures_len).map_err(|_| ArgumentError::with_message("input string too long"))?;
let at = match elem.is_range(interp, rangelen)? {
diff --git i/artichoke-backend/src/extn/core/math/trampoline.rs w/artichoke-backend/src/extn/core/math/trampoline.rs
index ebb7a90bf6..3d22c757b9 100644
--- i/artichoke-backend/src/extn/core/math/trampoline.rs
+++ w/artichoke-backend/src/extn/core/math/trampoline.rs
@@ -114,8 +114,8 @@ pub fn ldexp(interp: &mut Artichoke, fraction: Value, exponent: Value) -> Result
return Err(RangeError::with_message("float NaN out of range of integer").into());
}
Err(Ok(exp)) => {
- // This saturating cast will be rejected by the `i32::try_from`
- // below if `exp` is too large.
+ // This saturating cast will be rejected by the `i32::try_from` below
+ //if `exp` is too large.
exp as i64
}
Err(Err(err)) => return Err(err),
diff --git i/artichoke-backend/src/extn/core/numeric/mod.rs w/artichoke-backend/src/extn/core/numeric/mod.rs
index eaa6c68b01..7b5c8cc843 100644
--- i/artichoke-backend/src/extn/core/numeric/mod.rs
+++ w/artichoke-backend/src/extn/core/numeric/mod.rs
@@ -42,8 +42,8 @@ pub enum Coercion {
///
/// # Coercion enum
///
-/// Artichoke represents the `[y, x]` tuple Array as the [`Coercion`] enum, which
-/// orders its values `Coercion::Integer(x, y)`.
+/// Artichoke represents the `[y, x]` tuple Array as the [`Coercion`] enum,
+/// which orders its values `Coercion::Integer(x, y)`.
///
/// [numeric]: https://ruby-doc.org/core-3.1.2/Numeric.html#method-i-coerce
pub fn coerce(interp: &mut Artichoke, x: Value, y: Value) -> Result<Coercion, Error> {
diff --git i/artichoke-backend/src/extn/core/regexp/backend/onig.rs w/artichoke-backend/src/extn/core/regexp/backend/onig.rs
index 58767984dd..45e08ef1ec 100644
--- i/artichoke-backend/src/extn/core/regexp/backend/onig.rs
+++ w/artichoke-backend/src/extn/core/regexp/backend/onig.rs
@@ -118,9 +118,9 @@ impl RegexpType for Onig {
// Explicitly suppress this error because `debug` is infallible and
// cannot panic.
//
- // In practice this error will never be triggered since the only
- // fallible call in `format_debug_escape_into` is to `write!` which
- // never `panic!`s for a `String` formatter, which we are using here.
+ // In practice this error will never be triggered since the only fallible
+ //call in `format_debug_escape_into` is to `write!` which never
+ //`panic!`s for a `String` formatter, which we are using here.
let _ = format_debug_escape_into(&mut pattern, self.source.pattern());
debug.push_str(pattern.replace('/', r"\/").as_str());
debug.push('/');
diff --git i/artichoke-backend/src/extn/core/regexp/backend/regex/mod.rs w/artichoke-backend/src/extn/core/regexp/backend/regex/mod.rs
index 05ec97e933..f7e7dfcc58 100644
--- i/artichoke-backend/src/extn/core/regexp/backend/regex/mod.rs
+++ w/artichoke-backend/src/extn/core/regexp/backend/regex/mod.rs
@@ -1,3 +1,3 @@
-// TODO(GH-490): Add `regex::Binary` implementation of `RegexType`.
-// pub mod binary;
+// TODO(GH-490): Add `regex::Binary` implementation of `RegexType`. pub mod
+ //binary;
pub mod utf8;
diff --git i/artichoke-backend/src/extn/core/regexp/backend/regex/utf8.rs w/artichoke-backend/src/extn/core/regexp/backend/regex/utf8.rs
index 0e52f3bf0f..6ec8d9b1a5 100644
--- i/artichoke-backend/src/extn/core/regexp/backend/regex/utf8.rs
+++ w/artichoke-backend/src/extn/core/regexp/backend/regex/utf8.rs
@@ -127,9 +127,9 @@ impl RegexpType for Utf8 {
// Explicitly suppress this error because `debug` is infallible and
// cannot panic.
//
- // In practice this error will never be triggered since the only
- // fallible call in `format_debug_escape_into` is to `write!` which
- // never `panic!`s for a `String` formatter, which we are using here.
+ // In practice this error will never be triggered since the only fallible
+ //call in `format_debug_escape_into` is to `write!` which never
+ //`panic!`s for a `String` formatter, which we are using here.
let _ = format_debug_escape_into(&mut pattern, self.source.pattern());
debug.push_str(pattern.replace('/', r"\/").as_str());
debug.push('/');
@@ -177,8 +177,8 @@ impl RegexpType for Utf8 {
if let Some(captures) = self.regex.captures(haystack) {
// per the [docs] for `captures.len()`:
//
- // > This is always at least 1, since every regex has at least one
- // > capture group that corresponds to the full match.
+ // > This is always at least 1, since every regex has at least one >
+ //capture group that corresponds to the full match.
//
// [docs]: https://docs.rs/regex/1.3.4/regex/struct.Captures.html#method.len
interp.set_active_regexp_globals(captures.len().checked_sub(1).unwrap_or_default())?;
@@ -259,8 +259,8 @@ impl RegexpType for Utf8 {
if let Some(captures) = self.regex.captures(target) {
// per the [docs] for `captures.len()`:
//
- // > This is always at least 1, since every regex has at least one
- // > capture group that corresponds to the full match.
+ // > This is always at least 1, since every regex has at least one >
+ //capture group that corresponds to the full match.
//
// [docs]: https://docs.rs/regex/1.3.4/regex/struct.Captures.html#method.len
interp.set_active_regexp_globals(captures.len().checked_sub(1).unwrap_or_default())?;
@@ -307,8 +307,8 @@ impl RegexpType for Utf8 {
if let Some(captures) = self.regex.captures(haystack) {
// per the [docs] for `captures.len()`:
//
- // > This is always at least 1, since every regex has at least one
- // > capture group that corresponds to the full match.
+ // > This is always at least 1, since every regex has at least one >
+ //capture group that corresponds to the full match.
//
// [docs]: https://docs.rs/regex/1.3.4/regex/struct.Captures.html#method.len
interp.set_active_regexp_globals(captures.len().checked_sub(1).unwrap_or_default())?;
diff --git i/artichoke-backend/src/extn/core/regexp/syntax.rs w/artichoke-backend/src/extn/core/regexp/syntax.rs
index 5b82baa35c..4a1d410b70 100644
--- i/artichoke-backend/src/extn/core/regexp/syntax.rs
+++ w/artichoke-backend/src/extn/core/regexp/syntax.rs
@@ -1,9 +1,8 @@
// This module is forked from `regex-syntax` crate @ `26f7318e`.
//
-// https://github.com/rust-lang/regex/blob/26f7318e2895eae56e95a260e81e2d48b90e7c25/regex-syntax/src/lib.rs
+// //https://github.com/rust-lang/regex/blob/26f7318e2895eae56e95a260e81e2d48b90e7c25/regex-syntax/src/lib.rs
//
-// MIT License
-// Copyright (c) 2014 The Rust Project Developers
+// MIT License Copyright (c) 2014 The Rust Project Developers
#![allow(clippy::match_same_arms)]
@@ -52,8 +51,8 @@ pub fn escape_into(text: &str, buf: &mut String) {
pub fn is_meta_character(c: char) -> bool {
match c {
'\\' | '.' | '+' | '*' | '?' | '(' | ')' | '|' | '[' | ']' | '{' | '}' | '^' | '$' | '#' | '-' => true,
- // This match arm differs from `regex-syntax` by including '/'.
- // Ruby uses '/' to mark `Regexp` literals in source code.
+ // This match arm differs from `regex-syntax` by including '/'. Ruby uses
+ //'/' to mark `Regexp` literals in source code.
'/' => true,
// This match arm differs from `regex-syntax` by including ' ' (an ASCII
// space character). Ruby always escapes ' ' in calls to `Regexp::escape`.
diff --git i/artichoke-backend/src/extn/core/string/ffi.rs w/artichoke-backend/src/extn/core/string/ffi.rs
index 682719c0d2..6592b0f68c 100644
--- i/artichoke-backend/src/extn/core/string/ffi.rs
+++ w/artichoke-backend/src/extn/core/string/ffi.rs
@@ -184,7 +184,8 @@ unsafe extern "C" fn mrb_str_resize(mrb: *mut sys::mrb_state, s: sys::mrb_value,
match len.checked_sub(s.len()) {
Some(0) => {}
Some(additional) => s.try_reserve(additional)?,
- // If the given length is less than the length of the `String`, truncate.
+ // If the given length is less than the length of the `String`,
+ //truncate.
None => s.truncate(len),
}
Ok(())
@@ -220,9 +221,9 @@ unsafe extern "C" fn mrb_str_resize(mrb: *mut sys::mrb_state, s: sys::mrb_value,
// This is not possible on stable Rust since `TryReserveErrorKind` is
// unstable.
Err(_) => {
- // NOTE: This code can't use an `Error` unified exception trait object.
- // Since we're in memory error territory, we're not sure if we can
- // allocate the `Box` it requires.
+ // NOTE: This code can't use an `Error` unified exception trait
+ //object. Since we're in memory error territory, we're not sure if
+ //we can allocate the `Box` it requires.
let err = NoMemoryError::with_message("out of memory");
error::raise(guard, err);
}
@@ -496,7 +497,8 @@ unsafe extern "C" fn mrb_string_cstr(mrb: *mut sys::mrb_state, s: sys::mrb_value
// #define mrb_str_to_inum(mrb, str, base, badcheck) mrb_str_to_integer(mrb, str, base, badcheck)
// ```
//
-// This function converts a numeric string to numeric `mrb_value` with the given base.
+// This function converts a numeric string to numeric `mrb_value` with the given
+ //base.
#[no_mangle]
unsafe extern "C" fn mrb_str_to_integer(
mrb: *mut sys::mrb_state,
@@ -606,8 +608,8 @@ unsafe extern "C" fn mrb_str_cat(
if let Ok(mut string) = String::unbox_from_value(&mut s, &mut guard) {
let slice = slice::from_raw_parts(ptr.cast::<u8>(), len);
- // SAFETY: The string is repacked before any intervening uses of
- // `interp` which means no mruby heap allocations can occur.
+ // SAFETY: The string is repacked before any intervening uses of `interp`
+ //which means no mruby heap allocations can occur.
let string_mut = string.as_inner_mut();
string_mut.extend_from_slice(slice);
let inner = string.take();
diff --git i/artichoke-backend/src/extn/core/string/mod.rs w/artichoke-backend/src/extn/core/string/mod.rs
index d8f3ba100e..d42d324d5c 100644
--- i/artichoke-backend/src/extn/core/string/mod.rs
+++ w/artichoke-backend/src/extn/core/string/mod.rs
@@ -34,8 +34,8 @@ impl BoxUnboxVmValue for String {
) -> Result<UnboxedValueGuard<'a, Self::Guarded>, Error> {
let _ = interp;
- // Make sure we have a String otherwise extraction will fail.
- // This check is critical to the safety of accessing the `value` union.
+ // Make sure we have a String otherwise extraction will fail. This check
+ //is critical to the safety of accessing the `value` union.
if value.ruby_type() != Ruby::String {
let mut message = std::string::String::from("uninitialized ");
message.push_str(Self::RUBY_TYPE);
@@ -129,9 +129,9 @@ impl BoxUnboxVmValue for String {
}
fn free(data: *mut c_void) {
- // this function is never called. `String` is freed directly in the VM
- // by calling `mrb_gc_free_str` which is defined in
- // `extn/core/string/ffi.rs`.
+ // this function is never called. `String` is freed directly in the VM by
+ //calling `mrb_gc_free_str` which is defined in
+ //`extn/core/string/ffi.rs`.
//
// `String` should not have a destructor registered in the class
// registry.
@@ -168,8 +168,8 @@ mod tests {
#[test]
fn modifying_and_repacking_encoding_zeroes_old_encoding_flags() {
let mut interp = interpreter();
- // Modify the encoding of a binary string in place to be UTF-8 by
- // pushing a UTF-8 string into an empty binary string.
+ // Modify the encoding of a binary string in place to be UTF-8 by pushing
+ //a UTF-8 string into an empty binary string.
//
// Test for the newly taken UTF-8 encoding by ensuring that the char
// length of the string is 1.
diff --git i/artichoke-backend/src/extn/core/string/mruby.rs w/artichoke-backend/src/extn/core/string/mruby.rs
index 51d9a1593a..ab24142d8e 100644
--- i/artichoke-backend/src/extn/core/string/mruby.rs
+++ w/artichoke-backend/src/extn/core/string/mruby.rs
@@ -22,7 +22,11 @@ pub fn init(interp: &mut Artichoke) -> InitializeResult<()> {
.add_method("[]=", string_aset, sys::mrb_args_any())?
.add_method("ascii_only?", string_ascii_only, sys::mrb_args_none())?
.add_method("b", string_b, sys::mrb_args_none())?
- .add_method("bytes", string_bytes, sys::mrb_args_none())? // This does not support the deprecated block form
+ .add_method("bytes", string_bytes, sys::mrb_args_none())? // This does
+ //not support
+ //the
+ //deprecated
+ //block form
.add_method("bytesize", string_bytesize, sys::mrb_args_none())?
.add_method("byteslice", string_byteslice, sys::mrb_args_req_and_opt(1, 1))?
.add_method("capitalize", string_capitalize, sys::mrb_args_any())?
@@ -30,14 +34,25 @@ pub fn init(interp: &mut Artichoke) -> InitializeResult<()> {
.add_method("casecmp", string_casecmp_ascii, sys::mrb_args_req(1))?
.add_method("casecmp?", string_casecmp_unicode, sys::mrb_args_req(1))?
.add_method("center", string_center, sys::mrb_args_req_and_opt(1, 1))?
- .add_method("chars", string_chars, sys::mrb_args_none())? // This does not support the deprecated block form
+ .add_method("chars", string_chars, sys::mrb_args_none())? // This does
+ //not support
+ //the
+ //deprecated
+ //block form
.add_method("chomp", string_chomp, sys::mrb_args_opt(1))?
.add_method("chomp!", string_chomp_bang, sys::mrb_args_opt(1))?
.add_method("chop", string_chop, sys::mrb_args_none())?
.add_method("chop!", string_chop_bang, sys::mrb_args_none())?
.add_method("chr", string_chr, sys::mrb_args_none())?
.add_method("clear", string_clear, sys::mrb_args_none())?
- .add_method("codepoints", string_codepoints, sys::mrb_args_none())? // This does not support the deprecated block form
+ .add_method("codepoints", string_codepoints, sys::mrb_args_none())? // //This
+ //does
+ //not
+ //support
+ //the
+ //deprecated
+ //block
+ //form
.add_method("concat", string_concat, sys::mrb_args_any())?
.add_method("downcase", string_downcase, sys::mrb_args_any())?
.add_method("downcase!", string_downcase_bang, sys::mrb_args_any())?
@@ -47,7 +62,12 @@ pub fn init(interp: &mut Artichoke) -> InitializeResult<()> {
.add_method("hash", string_hash, sys::mrb_args_none())?
.add_method("include?", string_include, sys::mrb_args_req(1))?
.add_method("index", string_index, sys::mrb_args_req_and_opt(1, 1))?
- .add_method("initialize", string_initialize, sys::mrb_args_opt(1))? // TODO: support encoding and capacity kwargs
+ .add_method("initialize", string_initialize, sys::mrb_args_opt(1))? // //TODO:
+ //support
+ //encoding
+ //and
+ //capacity
+ //kwargs
.add_method("initialize_copy", string_initialize_copy, sys::mrb_args_req(1))?
.add_method("inspect", string_inspect, sys::mrb_args_none())?
.add_method("intern", string_intern, sys::mrb_args_none())?
diff --git i/artichoke-backend/src/extn/core/string/trampoline.rs w/artichoke-backend/src/extn/core/string/trampoline.rs
index 68b9441beb..432403ddff 100644
--- i/artichoke-backend/src/extn/core/string/trampoline.rs
+++ w/artichoke-backend/src/extn/core/string/trampoline.rs
@@ -41,8 +41,8 @@ pub fn add(interp: &mut Artichoke, mut value: Value, mut other: Value) -> Result
let to_append = unsafe { implicitly_convert_to_string(interp, &mut other)? };
let mut concatenated = s.clone();
- // XXX: This call doesn't do a check to see if we'll exceed the max allocation
- // size and may panic or abort.
+ // XXX: This call doesn't do a check to see if we'll exceed the max
+ //allocation size and may panic or abort.
concatenated.extend_from_slice(to_append);
super::String::alloc_value(concatenated, interp)
}
@@ -59,12 +59,12 @@ pub fn append(interp: &mut Artichoke, mut value: Value, mut other: Value) -> Res
let mut s = unsafe { super::String::unbox_from_value(&mut value, interp)? };
if let Ok(int) = other.try_convert_into::<i64>(interp) {
- // SAFETY: The string is repacked before any intervening uses of
- // `interp` which means no mruby heap allocations can occur.
+ // SAFETY: The string is repacked before any intervening uses of `interp`
+ //which means no mruby heap allocations can occur.
unsafe {
let string_mut = s.as_inner_mut();
- // XXX: This call doesn't do a check to see if we'll exceed the max allocation
- // size and may panic or abort.
+ // XXX: This call doesn't do a check to see if we'll exceed the max
+ //allocation size and may panic or abort.
string_mut
.try_push_int(int)
.map_err(|err| RangeError::from(err.message()))?;
@@ -129,12 +129,13 @@ pub fn append(interp: &mut Artichoke, mut value: Value, mut other: Value) -> Res
// `interp` which means no mruby heap allocations can occur.
unsafe {
let string_mut = s.as_inner_mut();
- // XXX: This call doesn't do a check to see if we'll exceed the max allocation
- // size and may panic or abort.
+ // XXX: This call doesn't do a check to see if we'll exceed the
+ //max allocation size and may panic or abort.
string_mut.extend_from_slice(other.as_slice());
if !matches!(other.encoding(), Encoding::Utf8) && !other.is_ascii_only() {
- // encodings are incompatible if other is not UTF-8 and is non-ASCII
+ // encodings are incompatible if other is not UTF-8 and is
+ //non-ASCII
string_mut.set_encoding(other.encoding());
}
@@ -177,8 +178,8 @@ pub fn append(interp: &mut Artichoke, mut value: Value, mut other: Value) -> Res
// `interp` which means no mruby heap allocations can occur.
unsafe {
let string_mut = s.as_inner_mut();
- // XXX: This call doesn't do a check to see if we'll exceed the max allocation
- // size and may panic or abort.
+ // XXX: This call doesn't do a check to see if we'll exceed the
+ //max allocation size and may panic or abort.
string_mut.extend_from_slice(other.as_slice());
// Set encoding to `other.encoding()` if other is non-ASCII.
@@ -229,8 +230,8 @@ pub fn append(interp: &mut Artichoke, mut value: Value, mut other: Value) -> Res
// `interp` which means no mruby heap allocations can occur.
unsafe {
let string_mut = s.as_inner_mut();
- // XXX: This call doesn't do a check to see if we'll exceed the max allocation
- // size and may panic or abort.
+ // XXX: This call doesn't do a check to see if we'll exceed the
+ //max allocation size and may panic or abort.
string_mut.extend_from_slice(other.as_slice());
let s = s.take();
@@ -274,8 +275,8 @@ pub fn append(interp: &mut Artichoke, mut value: Value, mut other: Value) -> Res
// `interp` which means no mruby heap allocations can occur.
unsafe {
let string_mut = s.as_inner_mut();
- // XXX: This call doesn't do a check to see if we'll exceed the max allocation
- // size and may panic or abort.
+ // XXX: This call doesn't do a check to see if we'll exceed the
+ //max allocation size and may panic or abort.
string_mut.extend_from_slice(other.as_slice());
if !other.is_ascii_only() {
@@ -291,8 +292,8 @@ pub fn append(interp: &mut Artichoke, mut value: Value, mut other: Value) -> Res
// `interp` which means no mruby heap allocations can occur.
unsafe {
let string_mut = s.as_inner_mut();
- // XXX: This call doesn't do a check to see if we'll exceed the max allocation
- // size and may panic or abort.
+ // XXX: This call doesn't do a check to see if we'll exceed the
+ //max allocation size and may panic or abort.
string_mut.extend_from_slice(other.as_slice());
let s = s.take();
@@ -365,10 +366,10 @@ pub fn aref(
// => nil
// ```
//
- // Don't specialize on the case where `index == len` because the provided
- // length can change the result. Even if the length argument is not
- // given, we still need to preserve the encoding of the source string,
- // so fall through to the happy path below.
+ // Don't specialize on the case where `index == len` because the
+ //provided length can change the result. Even if the length argument
+ //is not given, we still need to preserve the encoding of the source
+ //string, so fall through to the happy path below.
Some(index) if index > s.len() => return Ok(Value::nil()),
Some(index) => index,
};
@@ -468,8 +469,8 @@ pub fn aref(
return Ok(Value::nil());
}
}
- // The overload of `String#[]` that takes a `String` **only** takes `String`s.
- // No implicit conversion is performed.
+ // The overload of `String#[]` that takes a `String` **only** takes
+ //`String`s. No implicit conversion is performed.
//
// ```
// [3.0.1] > s = "abc"
@@ -487,9 +488,9 @@ pub fn aref(
// ```
if let Ok(substring) = unsafe { super::String::unbox_from_value(&mut first, interp) } {
if s.index(&*substring, None).is_some() {
- // Indexing with a `String` returns a newly allocated object that
- // has the same encoding as the index, regardless of the encoding on
- // the receiver.
+ // Indexing with a `String` returns a newly allocated object that has
+ //the same encoding as the index, regardless of the encoding on the
+ //receiver.
//
// ```
// [3.0.2] > s = "abc"
@@ -702,12 +703,14 @@ pub fn byteslice(
let length = if let Some(length) = length {
length
} else {
- // Per the docs -- https://ruby-doc.org/core-3.1.2/String.html#method-i-byteslice
+ // Per the docs --
+ //https://ruby-doc.org/core-3.1.2/String.html#method-i-byteslice
//
- // > If passed a single Integer, returns a substring of one byte at that position.
+ // > If passed a single Integer, returns a substring of one byte at that
+ //position.
//
- // NOTE: Index out a single byte rather than a slice to avoid having
- // to do an overflow check on the addition.
+ // NOTE: Index out a single byte rather than a slice to avoid having to
+ //do an overflow check on the addition.
if let Some(&byte) = s.get(index) {
let s = super::String::with_bytes_and_encoding(vec![byte], s.encoding());
// ```
@@ -862,7 +865,8 @@ pub fn casecmp_ascii(interp: &mut Artichoke, mut value: Value, mut other: Value)
pub fn casecmp_unicode(interp: &mut Artichoke, mut value: Value, mut other: Value) -> Result<Value, Error> {
let s = unsafe { super::String::unbox_from_value(&mut value, interp)? };
- // TODO: this needs to do an implicit conversion, but we need a Spinoso string.
+ // TODO: this needs to do an implicit conversion, but we need a Spinoso
+ //string.
if let Ok(other) = unsafe { super::String::unbox_from_value(&mut other, interp) } {
let eql = *s == *other;
Ok(interp.convert(eql))
@@ -1045,8 +1049,8 @@ pub fn downcase_bang(interp: &mut Artichoke, mut value: Value) -> Result<Value,
// which means no mruby heap allocations can occur.
unsafe {
let string_mut = s.as_inner_mut();
- // `make_lowercase` might reallocate the string and invalidate the
- // boxed pointer, capacity, length triple.
+ // `make_lowercase` might reallocate the string and invalidate the boxed
+ //pointer, capacity, length triple.
string_mut.make_lowercase();
let s = s.take();
@@ -1150,8 +1154,8 @@ pub fn initialize(interp: &mut Artichoke, mut value: Value, from: Option<Value>)
Vec::new()
};
- // If we are calling `initialize` on an already initialized `String`,
- // pluck out the inner buffer and drop it so we don't leak memory.
+ // If we are calling `initialize` on an already initialized `String`, pluck
+ //out the inner buffer and drop it so we don't leak memory.
//
// ```console
// [3.0.2] > s = "abc"
@@ -1411,8 +1415,8 @@ pub fn setbyte(interp: &mut Artichoke, mut value: Value, index: Value, byte: Val
index
} else {
let mut message = String::from("index ");
- // Suppress error because `String`'s `fmt::Write` impl is infallible.
- // (It will abort on OOM).
+ // Suppress error because `String`'s `fmt::Write` impl is infallible. (It
+ //will abort on OOM).
let _ignored = write!(&mut message, "{} out of string", index);
return Err(IndexError::from(message).into());
};
@@ -1550,8 +1554,8 @@ pub fn upcase_bang(interp: &mut Artichoke, mut value: Value) -> Result<Value, Er
// which means no mruby heap allocations can occur.
unsafe {
let string_mut = s.as_inner_mut();
- // `make_uppercase` might reallocate the string and invalidate the
- // boxed pointer, capacity, length triple.
+ // `make_uppercase` might reallocate the string and invalidate the boxed
+ //pointer, capacity, length triple.
string_mut.make_uppercase();
let s = s.take();
diff --git i/artichoke-backend/src/extn/core/symbol/ffi.rs w/artichoke-backend/src/extn/core/symbol/ffi.rs
index<PHONE_NUMBER>..b4bb7f18d2 100644
--- i/artichoke-backend/src/extn/core/symbol/ffi.rs
+++ w/artichoke-backend/src/extn/core/symbol/ffi.rs
@@ -60,7 +60,7 @@ unsafe extern "C" fn mrb_intern_str(mrb: *mut sys::mrb_state, name: sys::mrb_val
}
}
-/* `mrb_intern_check` series functions returns 0 if the symbol is not defined */
+/* `mrb_intern_check` series functions returns 0 if the symbol is not defined*/
// ```c
// MRB_API mrb_sym mrb_intern_check(mrb_state*,const char*,size_t);
@@ -207,8 +207,8 @@ unsafe extern "C" fn mrb_sym_dump(mrb: *mut sys::mrb_state, sym: sys::mrb_sym) -
unwrap_interpreter!(mrb, to => guard, or_else = ptr::null());
if let Ok(Some(bytes)) = guard.lookup_symbol(sym) {
let bytes = bytes.to_vec();
- // Allocate a buffer with the lifetime of the interpreter and return
- // a pointer to it.
+ // Allocate a buffer with the lifetime of the interpreter and return a
+ //pointer to it.
if let Ok(string) = guard.try_convert_mut(bytes) {
if let Ok(bytes) = string.try_convert_into_mut::<&[u8]>(&mut guard) {
return bytes.as_ptr().cast();
diff --git i/artichoke-backend/src/extn/core/symbol/mod.rs w/artichoke-backend/src/extn/core/symbol/mod.rs
index 3194eeefa0..474b5c79ed 100644
--- i/artichoke-backend/src/extn/core/symbol/mod.rs
+++ w/artichoke-backend/src/extn/core/symbol/mod.rs
@@ -22,8 +22,8 @@ impl BoxUnboxVmValue for Symbol {
) -> Result<UnboxedValueGuard<'a, Self::Guarded>, Error> {
let _ = interp;
- // Make sure we have a Symbol otherwise extraction will fail.
- // This check is critical to the safety of accessing the `value` union.
+ // Make sure we have a Symbol otherwise extraction will fail. This check
+ //is critical to the safety of accessing the `value` union.
if value.ruby_type() != Ruby::Symbol {
let mut message = String::from("uninitialized ");
message.push_str(Self::RUBY_TYPE);
diff --git i/artichoke-backend/src/extn/core/time/mruby.rs w/artichoke-backend/src/extn/core/time/mruby.rs
index 4aa4e7e816..2fcf476ece 100644
--- i/artichoke-backend/src/extn/core/time/mruby.rs
+++ w/artichoke-backend/src/extn/core/time/mruby.rs
@@ -13,8 +13,8 @@ pub fn init(interp: &mut Artichoke) -> InitializeResult<()> {
}
let spec = class::Spec::new("Time", TIME_CSTR, None, Some(def::box_unbox_free::<time::Time>))?;
- // NOTE: The ordering of method declarations in the builder below is the
- // same as in `Init_Time` in MRI `time.c`.
+ // NOTE: The ordering of method declarations in the builder below is the same
+ //as in `Init_Time` in MRI `time.c`.
class::Builder::for_spec(interp, &spec)
.value_is_rust_object()
// Constructor
diff --git i/artichoke-backend/src/extn/core/time/offset.rs w/artichoke-backend/src/extn/core/time/offset.rs
index a7cd0c21a4..29695b76de 100644
--- i/artichoke-backend/src/extn/core/time/offset.rs
+++ w/artichoke-backend/src/extn/core/time/offset.rs
@@ -52,8 +52,8 @@ impl TryConvertMut<Value, Option<Offset>> for Artichoke {
}
}
- // Based on the above logic, the only option in the hash is `in`.
- // >0 keys, and all other keys are rejected).
+ // Based on the above logic, the only option in the hash is `in`. >0
+ //keys, and all other keys are rejected).
let mut in_value = hash.get(0).expect("Only the `in` parameter should be available").1;
match in_value.ruby_type() {
diff --git i/artichoke-backend/src/extn/core/time/subsec.rs w/artichoke-backend/src/extn/core/time/subsec.rs
index d86f332c46..2bf49f7cfd 100644
--- i/artichoke-backend/src/extn/core/time/subsec.rs
+++ w/artichoke-backend/src/extn/core/time/subsec.rs
@@ -62,10 +62,10 @@ impl TryConvertMut<Option<Value>, SubsecMultiplier> for Artichoke {
}
}
-/// A struct that represents the adjustment needed to a `Time` based on a
-/// the parsing of optional Ruby Values. Seconds can require adjustment as a
-/// means for handling overflow of values. e.g. `1_001` millis can be requested
-/// which should result in 1 seconds, and `1_000_000` nanoseconds.
+/// A struct that represents the adjustment needed to a `Time` based on a the
+/// parsing of optional Ruby Values. Seconds can require adjustment as a means
+/// for handling overflow of values. e.g. `1_001` millis can be requested which
+/// should result in 1 seconds, and `1_000_000` nanoseconds.
///
/// Note: Negative nanoseconds are not supported, thus any negative adjustment
/// will generally result in at least -1 second, and the relevant positive
@@ -103,9 +103,9 @@ impl TryConvertMut<(Option<Value>, Option<Value>), Subsec> for Artichoke {
let seconds_base = NANOS_IN_SECOND / multiplier_nanos;
if subsec.ruby_type() == Ruby::Float {
- // FIXME: The below deviates from the MRI implementation of
- // Time. MRI uses `to_r` for subsec calculation on floats
- // subsec nanos, and this could result in different values.
+ // FIXME: The below deviates from the MRI implementation of Time.
+ //MRI uses `to_r` for subsec calculation on floats subsec nanos,
+ //and this could result in different values.
let subsec: f64 = self.try_convert(subsec)?;
@@ -119,9 +119,9 @@ impl TryConvertMut<(Option<Value>, Option<Value>), Subsec> for Artichoke {
return Err(FloatDomainError::with_message("Infinity").into());
}
- // These conversions are luckily not lossy. `seconds_base`
- // and `multiplier_nanos` are guaranteed to be represented
- // without loss in a f64.
+ // These conversions are luckily not lossy. `seconds_base` and
+ //`multiplier_nanos` are guaranteed to be represented without
+ //loss in a f64.
#[allow(clippy::cast_precision_loss)]
let seconds_base = seconds_base as f64;
#[allow(clippy::cast_precision_loss)]
@@ -133,10 +133,10 @@ impl TryConvertMut<(Option<Value>, Option<Value>), Subsec> for Artichoke {
// `is_sign_negative()` is not enough here, since this logic
// should also be skilled for negative zero.
if subsec < -0.0 {
- // Nanos always needs to be a positive u32. If subsec
- // is negative, we will always need remove one second.
- // Nanos can then be adjusted since it will always be
- // the inverse of the total nanos in a second.
+ // Nanos always needs to be a positive u32. If subsec is
+ //negative, we will always need remove one second. Nanos can
+ //then be adjusted since it will always be the inverse of
+ //the total nanos in a second.
secs -= 1.0;
#[allow(clippy::cast_precision_loss)]
@@ -159,18 +159,17 @@ impl TryConvertMut<(Option<Value>, Option<Value>), Subsec> for Artichoke {
} else {
let subsec: i64 = implicitly_convert_to_int(self, subsec)?;
- // The below calculations should always be safe. The
- // multiplier is guaranteed to not be 0, the remainder
- // should never overflow, and is guaranteed to be less
- // than u32::MAX.
+ // The below calculations should always be safe. The multiplier
+ //is guaranteed to not be 0, the remainder should never
+ //overflow, and is guaranteed to be less than u32::MAX.
let mut secs = subsec / seconds_base;
let mut nanos = (subsec % seconds_base) * multiplier_nanos;
if subsec.is_negative() {
- // Nanos always needs to be a positive u32. If subsec
- // is negative, we will always need remove one second.
- // Nanos can then be adjusted since it will always be
- // the inverse of the total nanos in a second.
+ // Nanos always needs to be a positive u32. If subsec is
+ //negative, we will always need remove one second. Nanos can
+ //then be adjusted since it will always be the inverse of
+ //the total nanos in a second.
secs = secs
.checked_sub(1)
.ok_or(ArgumentError::with_message("Time too small"))?;
diff --git i/artichoke-backend/src/extn/mod.rs w/artichoke-backend/src/extn/mod.rs
index 79ccd7979e..7fbaf5bc1d 100644
--- i/artichoke-backend/src/extn/mod.rs
+++ w/artichoke-backend/src/extn/mod.rs
@@ -1,5 +1,5 @@
-// This pragma is needed to allow passing `Value` by value in all the mruby
-// and Rust trampolines.
+// This pragma is needed to allow passing `Value` by value in all the mruby and
+ //Rust trampolines.
#![allow(clippy::needless_pass_by_value)]
use crate::release_metadata::ReleaseMetadata;
diff --git i/artichoke-backend/src/extn/prelude.rs w/artichoke-backend/src/extn/prelude.rs
index e80f1e5cb6..099e160b61 100644
--- i/artichoke-backend/src/extn/prelude.rs
+++ w/artichoke-backend/src/extn/prelude.rs
@@ -1,5 +1,4 @@
-//! A "prelude" for users of the `extn` module in the `artichoke-backend`
-//! crate.
+//! A "prelude" for users of the `extn` module in the `artichoke-backend` crate.
//!
//! This prelude is similar to the standard library's prelude in that you'll
//! almost always want to import its entire contents, but unlike the standard
diff --git i/artichoke-backend/src/extn/stdlib/json/mod.rs w/artichoke-backend/src/extn/stdlib/json/mod.rs
index e6a4fd0cbc..849051d129 100644
--- i/artichoke-backend/src/extn/stdlib/json/mod.rs
+++ w/artichoke-backend/src/extn/stdlib/json/mod.rs
@@ -14,9 +14,9 @@ static JSON_PURE_PARSER_RUBY_SOURCE: &[u8] = include_bytes!("vendor/json/pure/pa
pub fn init(interp: &mut Artichoke) -> InitializeResult<()> {
let spec = module::Spec::new(interp, "JSON", JSON_CSTR, None)?;
interp.def_module::<Json>(spec)?;
- // NOTE(lopopolo): This setup of the JSON gem in the virtual file system does not include
- // any of the `json/add` sources for serializing "extra" types like `Time`
- // and `BigDecimal`, not all of which Artichoke supports.
+ // NOTE(lopopolo): This setup of the JSON gem in the virtual file system does
+ //not include any of the `json/add` sources for serializing "extra" types
+ //like `Time` and `BigDecimal`, not all of which Artichoke supports.
interp.def_rb_source_file("json.rb", JSON_RUBY_SOURCE)?;
interp.def_rb_source_file("json/common.rb", JSON_COMMON_RUBY_SOURCE)?;
interp.def_rb_source_file("json/generic_object.rb", JSON_GENERIC_OBJECT_RUBY_SOURCE)?;
diff --git i/artichoke-backend/src/fmt.rs w/artichoke-backend/src/fmt.rs
index d18c01ba02..ed2343cda2 100644
--- i/artichoke-backend/src/fmt.rs
+++ w/artichoke-backend/src/fmt.rs
@@ -16,7 +16,7 @@ use crate::Artichoke;
/// This error type can also be used to convert generic [`fmt::Error`] into an
/// [`Error`], such as when formatting integers with [`write!`].
///
-/// This error type wraps a [`fmt::Error`].
+/// This error type wraps a [`fmt::Error`].
///
/// # Examples
///
diff --git i/artichoke-backend/src/gc.rs w/artichoke-backend/src/gc.rs
index 985f9e46aa..e0c1ca83ca 100644
--- i/artichoke-backend/src/gc.rs
+++ w/artichoke-backend/src/gc.rs
@@ -10,8 +10,8 @@ use arena::{ArenaIndex, ArenaSavepointError};
pub trait MrbGarbageCollection {
/// Create a savepoint in the GC arena.
///
- /// Savepoints allow mruby to deallocate all the objects created via the
- /// C API.
+ /// Savepoints allow mruby to deallocate all the objects created via the C
+ /// API.
///
/// Normally objects created via the C API are marked as permanently alive
/// ("white" GC color) with a call to [`mrb_gc_protect`].
@@ -251,8 +251,8 @@ mod tests {
interp.full_gc().unwrap();
assert_eq!(
interp.live_object_count(),
- // plus 1 because stack keep is enabled in eval which marks the
- // last returned value as live.
+ // plus 1 because stack keep is enabled in eval which marks the last
+ //returned value as live.
baseline_object_count + 1,
"Started with {} live objects, ended with {}. Potential memory leak!",
baseline_object_count,
diff --git i/artichoke-backend/src/gc/arena.rs w/artichoke-backend/src/gc/arena.rs
index 4b187c0292..46932b333a 100644
--- i/artichoke-backend/src/gc/arena.rs
+++ w/artichoke-backend/src/gc/arena.rs
@@ -70,9 +70,9 @@ impl From<ArenaSavepointError> for Error {
/// Arena savepoints ensure mruby objects are reaped even when allocated with
/// the C API.
///
-/// mruby manages objects created via the C API in a memory construct called
-/// the [arena]. The arena is a stack and objects stored there are permanently
-/// alive to avoid having to track lifetimes externally to the interpreter.
+/// mruby manages objects created via the C API in a memory construct called the
+/// [arena]. The arena is a stack and objects stored there are permanently alive
+/// to avoid having to track lifetimes externally to the interpreter.
///
/// An [`ArenaIndex`] is an index to some position of the stack. When restoring
/// an `ArenaIndex`, the stack pointer is moved. All objects beyond the pointer
@@ -134,8 +134,8 @@ impl<'a> DerefMut for ArenaIndex<'a> {
impl<'a> Drop for ArenaIndex<'a> {
fn drop(&mut self) {
let idx = self.index;
- // We can't panic in a drop impl, so ignore errors when crossing the
- // FFI boundary.
+ // We can't panic in a drop impl, so ignore errors when crossing the FFI
+ //boundary.
let _ignored = unsafe {
self.interp
.with_ffi_boundary(|mrb| sys::mrb_sys_gc_arena_restore(mrb, idx))
diff --git i/artichoke-backend/src/globals.rs w/artichoke-backend/src/globals.rs
index 08de0382ee..21c91b31c7 100644
--- i/artichoke-backend/src/globals.rs
+++ w/artichoke-backend/src/globals.rs
@@ -6,8 +6,8 @@ use crate::sys;
use crate::value::Value;
use crate::Artichoke;
-// TODO: Handle invalid variable names. For now this is delegated to mruby.
-// The parser in `spinoso-symbol` can handle this.
+// TODO: Handle invalid variable names. For now this is delegated to mruby. The
+ //parser in `spinoso-symbol` can handle this.
impl Globals for Artichoke {
type Value = Value;
diff --git i/artichoke-backend/src/interpreter.rs w/artichoke-backend/src/interpreter.rs
index 7e5708bfb0..05fa7c5a08 100644
--- i/artichoke-backend/src/interpreter.rs
+++ w/artichoke-backend/src/interpreter.rs
@@ -63,9 +63,9 @@ pub fn interpreter_with_config(config: ReleaseMetadata<'_>) -> Result<Artichoke,
}
arena.restore();
- // mruby lazily initializes some core objects like `top_self` and generates
- // a lot of garbage on start-up. Eagerly initialize the interpreter to
- // provide predictable initialization behavior.
+ // mruby lazily initializes some core objects like `top_self` and generates a
+ //lot of garbage on start-up. Eagerly initialize the interpreter to provide
+ //predictable initialization behavior.
interp.create_arena_savepoint()?.interp().eval(&[])?;
if let GcState::Enabled = prior_gc_state {
diff --git i/artichoke-backend/src/lib.rs w/artichoke-backend/src/lib.rs
index 7d861dea63..7a64e5fa48 100644
--- i/artichoke-backend/src/lib.rs
+++ w/artichoke-backend/src/lib.rs
@@ -2,8 +2,8 @@
#![warn(clippy::pedantic)]
#![warn(clippy::cargo)]
#![allow(clippy::missing_errors_doc)]
-#![allow(clippy::question_mark)] // https://github.com/rust-lang/rust-clippy/issues/8281
-#![allow(clippy::unnecessary_lazy_evaluations)] // https://github.com/rust-lang/rust-clippy/issues/8109
+#![allow(clippy::question_mark)] // //https://github.com/rust-lang/rust-clippy/issues/8281
+ //https://github.com/rust-lang/rust-clippy/issues/8109
#![cfg_attr(test, allow(clippy::non_ascii_literal))]
#![allow(unknown_lints)]
// #![warn(missing_docs)]
@@ -28,8 +28,8 @@
//!
//! ### Evaling Source Code
//!
-//! The `artichoke-backend` interpreter implements
-//! [`Eval` from `artichoke-core`](crate::core::Eval).
+//! The `artichoke-backend` interpreter implements [`Eval` from
+//! `artichoke-core`](crate::core::Eval).
//!
//! ```rust
//! use artichoke_backend::prelude::*;
@@ -68,8 +68,8 @@
//!
//! ## Virtual File System and `Kernel#require`
//!
-//! The `artichoke-backend` interpreter includes an in-memory virtual
-//! file system. The file system stores Ruby sources and Rust extension functions
+//! The `artichoke-backend` interpreter includes an in-memory virtual file
+//! system. The file system stores Ruby sources and Rust extension functions
//! that are similar to MRI C extensions.
//!
//! The virtual file system enables applications built with `artichoke-backend`
diff --git i/artichoke-backend/src/load_path.rs w/artichoke-backend/src/load_path.rs
index 895c86c2c8..c3ec1ed071 100644
--- i/artichoke-backend/src/load_path.rs
+++ w/artichoke-backend/src/load_path.rs
@@ -32,8 +32,8 @@ pub use native::Native;
/// Directory at which Ruby sources and extensions are stored in the virtual
/// file system.
///
-/// `RUBY_LOAD_PATH` is the default current working directory for
-/// [`Memory`] file systems.
+/// `RUBY_LOAD_PATH` is the default current working directory for [`Memory`]
+/// file systems.
///
/// [`Hybrid`] file systems locate the path on a [`Memory`] file system.
#[cfg(not(windows))]
@@ -42,8 +42,8 @@ pub const RUBY_LOAD_PATH: &str = "/artichoke/virtual_root/src/lib";
/// Directory at which Ruby sources and extensions are stored in the virtual
/// file system.
///
-/// `RUBY_LOAD_PATH` is the default current working directory for
-/// [`Memory`] file systems.
+/// `RUBY_LOAD_PATH` is the default current working directory for [`Memory`]
+/// file systems.
///
/// [`Hybrid`] file systems locate the path on a [`Memory`] file system.
#[cfg(windows)]
diff --git i/artichoke-backend/src/macros.rs w/artichoke-backend/src/macros.rs
index d888165fae..c55affb3ef 100644
--- i/artichoke-backend/src/macros.rs
+++ w/artichoke-backend/src/macros.rs
@@ -19,8 +19,8 @@ macro_rules! emit_fatal_warning {
// called when there are foreign C frames in the stack and panics are
// either undefined behavior or will result in an abort.
//
- // Ensure the returned error is dropped so we don't leave anything on
- // the stack in the event of a foreign unwind.
+ // Ensure the returned error is dropped so we don't leave anything on the
+ //stack in the event of a foreign unwind.
let maybe_err = ::std::write!(::std::io::stderr(), "fatal[artichoke-backend]: ");
drop(maybe_err);
let maybe_err = ::std::writeln!(::std::io::stderr(), $($arg)+);
@@ -96,8 +96,8 @@ pub mod argspec {
pub const REST: &CStr = qed::const_cstr_from_str!("*\0");
}
-/// Extract [`sys::mrb_value`]s from a [`sys::mrb_state`] to adapt a C
-/// entry point to a Rust implementation of a Ruby function.
+/// Extract [`sys::mrb_value`]s from a [`sys::mrb_state`] to adapt a C entry
+/// point to a Rust implementation of a Ruby function.
///
/// This macro exists because the mruby VM [does not validate argspecs] attached
/// to native functions.
diff --git i/artichoke-backend/src/module.rs w/artichoke-backend/src/module.rs
index 7958fa7ba6..f9869de06a 100644
--- i/artichoke-backend/src/module.rs
+++ w/artichoke-backend/src/module.rs
@@ -137,13 +137,13 @@ impl Rclass {
let is_defined_under =
sys::mrb_const_defined_at(mrb, sys::mrb_sys_obj_value(scope.cast::<c_void>().as_mut()), self.sym);
if is_defined_under {
- // Enclosing scope exists.
- // Module is defined under the enclosing scope.
+ // Enclosing scope exists. Module is defined under the enclosing
+ //scope.
let module = sys::mrb_module_get_under(mrb, scope.as_mut(), module_name);
NonNull::new(module)
} else {
- // Enclosing scope exists.
- // Module is not defined under the enclosing scope.
+ // Enclosing scope exists. Module is not defined under the
+ //enclosing scope.
None
}
} else {
diff --git i/artichoke-backend/src/module/registry.rs w/artichoke-backend/src/module/registry.rs
index 06587e7b0a..ac76beff82 100644
--- i/artichoke-backend/src/module/registry.rs
+++ w/artichoke-backend/src/module/registry.rs
@@ -233,9 +233,8 @@ where
self.0.shrink_to_fit();
}
- /// Shrinks the capacity of the registry with a lower bound.
- /// The capacity will remain at least as large as both the length and the
- /// supplied value.
+ /// Shrinks the capacity of the registry with a lower bound. The capacity
+ /// will remain at least as large as both the length and the supplied value.
///
/// If the current capacity is less than the lower limit, this is a no-op.
pub fn shrink_to(&mut self, min_capacity: usize) {
diff --git i/artichoke-backend/src/sys/args.rs w/artichoke-backend/src/sys/args.rs
index 0e80b577e4..a17bfc27da 100644
--- i/artichoke-backend/src/sys/args.rs
+++ w/artichoke-backend/src/sys/args.rs
@@ -259,7 +259,7 @@ pub mod specifiers {
/// The following args specified are optional.
pub const FOLLOWING_ARGS_OPTIONAL: &str = "|";
- /// Retrieve a Boolean indicating whether the previous optional argument
- /// was given.
+ /// Retrieve a Boolean indicating whether the previous optional argument was
+ /// given.
pub const PREVIOUS_OPTIONAL_ARG_GIVEN: &str = "?";
}
diff --git i/artichoke-backend/src/sys/mod.rs w/artichoke-backend/src/sys/mod.rs
index ed374e56ed..50234d7ae7 100644
--- i/artichoke-backend/src/sys/mod.rs
+++ w/artichoke-backend/src/sys/mod.rs
@@ -22,7 +22,8 @@ mod args;
#[allow(clippy::all)]
#[allow(clippy::pedantic)]
#[allow(clippy::restriction)]
-#[cfg_attr(test, allow(deref_nullptr))] // See https://github.com/rust-lang/rust-bindgen/issues/1651.
+#[cfg_attr(test, allow(deref_nullptr))] // See
+ //https://github.com/rust-lang/rust-bindgen/issues/1651.
mod ffi {
include!(concat!(env!("OUT_DIR"), "/ffi.rs"));
}
diff --git i/artichoke-backend/src/sys/protect.rs w/artichoke-backend/src/sys/protect.rs
index 25782f4c1b..262970f6cf 100644
--- i/artichoke-backend/src/sys/protect.rs
+++ w/artichoke-backend/src/sys/protect.rs
@@ -58,8 +58,8 @@ trait Protect {
unsafe extern "C" fn run(mrb: *mut sys::mrb_state, data: sys::mrb_value) -> sys::mrb_value;
}
-// `Funcall` must be `Copy` because we may unwind past the frames in which
-// it is used with `longjmp` which does not allow Rust to run destructors.
+// `Funcall` must be `Copy` because we may unwind past the frames in which it is
+ //used with `longjmp` which does not allow Rust to run destructors.
#[derive(Clone, Copy)]
struct Funcall<'a> {
slf: sys::mrb_value,
@@ -76,9 +76,9 @@ impl<'a> Protect for Funcall<'a> {
// allow Rust to run destructors.
let Self { slf, func, args, block } = *Box::from_raw(ptr.cast::<Self>());
- // This will always unwrap because we've already checked that we
- // have fewer than `MRB_FUNCALL_ARGC_MAX` args, which is less than
- // `i64` max value.
+ // This will always unwrap because we've already checked that we have
+ //fewer than `MRB_FUNCALL_ARGC_MAX` args, which is less than `i64` max
+ //value.
let argslen = if let Ok(argslen) = i64::try_from(args.len()) {
argslen
} else {
@@ -93,8 +93,8 @@ impl<'a> Protect for Funcall<'a> {
}
}
-// `Eval` must be `Copy` because we may unwind past the frames in which
-// it is used with `longjmp` which does not allow Rust to run destructors.
+// `Eval` must be `Copy` because we may unwind past the frames in which it is
+ //used with `longjmp` which does not allow Rust to run destructors.
#[derive(Clone, Copy)]
struct Eval<'a> {
context: *mut sys::mrbc_context,
@@ -106,8 +106,8 @@ impl<'a> Protect for Eval<'a> {
let ptr = sys::mrb_sys_cptr_ptr(data);
let Self { context, code } = *Box::from_raw(ptr.cast::<Self>());
- // Execute arbitrary ruby code, which may generate objects with C APIs
- // if backed by Rust functions.
+ // Execute arbitrary ruby code, which may generate objects with C APIs if
+ //backed by Rust functions.
//
// `mrb_load_nstring_ctx` sets the "stack keep" field on the context
// which means the most recent value returned by eval will always be
@@ -116,8 +116,8 @@ impl<'a> Protect for Eval<'a> {
}
}
-// `BlockYield` must be `Copy` because we may unwind past the frames in which
-// it is used with `longjmp` which does not allow Rust to run destructors.
+// `BlockYield` must be `Copy` because we may unwind past the frames in which it
+ //is used with `longjmp` which does not allow Rust to run destructors.
#[derive(Clone, Copy)]
struct BlockYield {
block: sys::mrb_value,
@@ -154,8 +154,8 @@ pub enum Range {
Out,
}
-// `IsRange` must be `Copy` because we may unwind past the frames in which
-// it is used with `longjmp` which does not allow Rust to run destructors.
+// `IsRange` must be `Copy` because we may unwind past the frames in which it is
+ //used with `longjmp` which does not allow Rust to run destructors.
#[derive(Default, Debug, Clone, Copy)]
struct IsRange {
value: sys::mrb_value,
diff --git i/artichoke-backend/src/types.rs w/artichoke-backend/src/types.rs
index dcd16e8013..620930e706 100644
--- i/artichoke-backend/src/types.rs
+++ w/artichoke-backend/src/types.rs
@@ -19,14 +19,14 @@ pub fn ruby_from_mrb_value(value: sys::mrb_value) -> Ruby {
// in the `sys::mrb_vtype` enum C source.
#[allow(clippy::match_same_arms)]
match value.tt {
- // `nil` is implemented with the `MRB_TT_FALSE` type tag in mruby
- // (since both values are falsy). The difference is that Booleans are
- // non-zero `Fixnum`s.
+ // `nil` is implemented with the `MRB_TT_FALSE` type tag in mruby (since
+ //both values are falsy). The difference is that Booleans are non-zero
+ //`Fixnum`s.
MRB_TT_FALSE if unsafe { sys::mrb_sys_value_is_nil(value) } => Ruby::Nil,
MRB_TT_FALSE => Ruby::Bool,
- // `MRB_TT_FREE` is a marker type tag that indicates to the mruby
- // VM that an object is unreachable and should be deallocated by the
- // garbage collector.
+ // `MRB_TT_FREE` is a marker type tag that indicates to the mruby VM that
+ //an object is unreachable and should be deallocated by the garbage
+ //collector.
MRB_TT_FREE => Ruby::Unreachable,
MRB_TT_TRUE => Ruby::Bool,
MRB_TT_INTEGER => Ruby::Fixnum,
@@ -39,8 +39,8 @@ pub fn ruby_from_mrb_value(value: sys::mrb_value) -> Ruby {
MRB_TT_OBJECT => Ruby::Object,
MRB_TT_CLASS => Ruby::Class,
MRB_TT_MODULE => Ruby::Module,
- // `MRB_TT_ICLASS` is an internal use type tag meant for holding
- // mixed in modules.
+ // `MRB_TT_ICLASS` is an internal use type tag meant for holding mixed in
+ //modules.
MRB_TT_ICLASS => Ruby::Unreachable,
// `MRB_TT_SCLASS` represents a singleton class, or a class that is
// defined anonymously, e.g. `c1` or `c2` below:
@@ -52,8 +52,8 @@ pub fn ruby_from_mrb_value(value: sys::mrb_value) -> Ruby {
// c2 = (class <<cls; self; end)
// ```
//
- // mruby also uses the term singleton method to refer to methods
- // defined on an object's eigenclass, e.g. `bar` below:
+ // mruby also uses the term singleton method to refer to methods defined
+ //on an object's eigenclass, e.g. `bar` below:
//
// ```ruby
// class Foo; end
@@ -70,12 +70,12 @@ pub fn ruby_from_mrb_value(value: sys::mrb_value) -> Ruby {
MRB_TT_STRING => Ruby::String,
MRB_TT_RANGE => Ruby::Range,
MRB_TT_EXCEPTION => Ruby::Exception,
- // NOTE(lopopolo): This might be an internal closure symbol table,
- // rather than the `ENV` core object.
+ // NOTE(lopopolo): This might be an internal closure symbol table, rather
+ //than the `ENV` core object.
MRB_TT_ENV => Ruby::Unreachable,
- // `MRB_TT_DATA` is a type tag for wrapped C pointers. It is used
- // to indicate that an `mrb_value` has an owned pointer to an
- // external data structure stored in its `value.p` field.
+ // `MRB_TT_DATA` is a type tag for wrapped C pointers. It is used to
+ //indicate that an `mrb_value` has an owned pointer to an external data
+ //structure stored in its `value.p` field.
MRB_TT_DATA => Ruby::Data,
// NOTE(lopopolo): `Fiber`s are unimplemented in Artichoke.
MRB_TT_FIBER => Ruby::Fiber,
diff --git i/artichoke-backend/src/value.rs w/artichoke-backend/src/value.rs
index d0b71ce81a..eaf26c7fb9 100644
--- i/artichoke-backend/src/value.rs
+++ w/artichoke-backend/src/value.rs
@@ -210,8 +210,8 @@ impl ValueCore for Value {
}
fn respond_to(&self, interp: &mut Self::Artichoke, method: &str) -> Result<bool, Self::Error> {
- // Look up a method in the mruby VM's method table for this value's
- // class object.
+ // Look up a method in the mruby VM's method table for this value's class
+ //object.
let method_sym = if let Some(sym) = interp.check_interned_string(method)? {
sym
} else {
diff --git i/artichoke-core/src/class_registry.rs w/artichoke-core/src/class_registry.rs
index 56a59ca642..08d7b719a6 100644
--- i/artichoke-core/src/class_registry.rs
+++ w/artichoke-core/src/class_registry.rs
@@ -10,7 +10,8 @@ pub trait ClassRegistry {
/// Concrete value type for boxed Ruby values.
type Value;
- /// Concrete error type for errors encountered when manipulating the class registry.
+ /// Concrete error type for errors encountered when manipulating the class
+ /// registry.
type Error;
/// Type representing a class specification.
@@ -39,7 +40,8 @@ pub trait ClassRegistry {
where
T: Any;
- /// Retrieve whether a class definition exists from the state bound to Rust type `T`.
+ /// Retrieve whether a class definition exists from the state bound to Rust
+ /// type `T`.
///
/// # Errors
///
diff --git i/artichoke-core/src/convert.rs w/artichoke-core/src/convert.rs
index feb1e34755..c258a54543 100644
--- i/artichoke-core/src/convert.rs
+++ w/artichoke-core/src/convert.rs
@@ -4,8 +4,7 @@
///
/// Implementors may not allocate on the interpreter heap.
///
-/// See [`core::convert::From`].
-/// See [`ConvertMut`].
+/// See [`core::convert::From`]. See [`ConvertMut`].
pub trait Convert<T, U> {
/// Performs the infallible conversion.
fn convert(&self, from: T) -> U;
@@ -15,8 +14,7 @@ pub trait Convert<T, U> {
///
/// Implementors may not allocate on the interpreter heap.
///
-/// See [`core::convert::TryFrom`].
-/// See [`TryConvertMut`].
+/// See [`core::convert::TryFrom`]. See [`TryConvertMut`].
#[allow(clippy::module_name_repetitions)]
pub trait TryConvert<T, U> {
/// Error type for failed conversions.
@@ -35,8 +33,7 @@ pub trait TryConvert<T, U> {
///
/// Implementors may allocate on the interpreter heap.
///
-/// See [`core::convert::From`].
-/// See [`Convert`].
+/// See [`core::convert::From`]. See [`Convert`].
#[allow(clippy::module_name_repetitions)]
pub trait ConvertMut<T, U> {
/// Performs the infallible conversion.
@@ -47,8 +44,7 @@ pub trait ConvertMut<T, U> {
///
/// Implementors may allocate on the interpreter heap.
///
-/// See [`core::convert::TryFrom`].
-/// See [`TryConvert`].
+/// See [`core::convert::TryFrom`]. See [`TryConvert`].
pub trait TryConvertMut<T, U> {
/// Error type for failed conversions.
type Error;
diff --git i/artichoke-core/src/debug.rs w/artichoke-core/src/debug.rs
index c828cd82e2..fcd628ea3b 100644
--- i/artichoke-core/src/debug.rs
+++ w/artichoke-core/src/debug.rs
@@ -13,7 +13,8 @@ pub trait Debug {
/// Some immediate types like `true`, `false`, and `nil` are shown by value
/// rather than by class.
///
- /// This function suppresses all errors and returns an empty string on error.
+ /// This function suppresses all errors and returns an empty string on
+ /// error.
fn inspect_type_name_for_value(&mut self, value: Self::Value) -> &str;
/// Return the class name for the given value's type.
@@ -21,6 +22,7 @@ pub trait Debug {
/// Even immediate types will have their class name spelled out. For
/// example, calling this function with `nil` will return `"NilClass"`.
///
- /// This function suppresses all errors and returns an empty string on error.
+ /// This function suppresses all errors and returns an empty string on
+ /// error.
fn class_name_for_value(&mut self, value: Self::Value) -> &str;
}
diff --git i/artichoke-core/src/file.rs w/artichoke-core/src/file.rs
index 722b894b1a..9dd45462c7 100644
--- i/artichoke-core/src/file.rs
+++ w/artichoke-core/src/file.rs
@@ -2,8 +2,8 @@
/// Rust extension hook that can be required.
///
-/// `File`s are mounted in the interpreter file system and can modify interpreter
-/// state when they are loaded.
+/// `File`s are mounted in the interpreter file system and can modify
+/// interpreter state when they are loaded.
pub trait File {
/// Concrete type for interpreter.
type Artichoke;
diff --git i/artichoke-core/src/globals.rs w/artichoke-core/src/globals.rs
index b9217df7b5..c457cc430a 100644
--- i/artichoke-core/src/globals.rs
+++ w/artichoke-core/src/globals.rs
@@ -48,10 +48,10 @@ pub trait Globals {
///
/// # Compatibility Notes
///
- /// Getting a global that is currently may return `Ok(None)` even through
- /// a non-existent global resolves to `nil` in the Ruby VM. Consult the
- /// documentation on implementations of this trait for implementation-defined
- /// behavior.
+ /// Getting a global that is currently may return `Ok(None)` even through a
+ /// non-existent global resolves to `nil` in the Ruby VM. Consult the
+ /// documentation on implementations of this trait for
+ /// implementation-defined behavior.
///
/// # Errors
///
diff --git i/artichoke-core/src/hash.rs w/artichoke-core/src/hash.rs
index 520d8fb595..02323bb494 100644
--- i/artichoke-core/src/hash.rs
+++ w/artichoke-core/src/hash.rs
@@ -4,10 +4,10 @@ use core::hash::BuildHasher;
/// A trait for retrieving an interpreter-global [`BuildHasher`].
///
-/// The [`BuildHasher`] associated with the interpreter is for creating instances
-/// of [`Hasher`]. A `BuildHasher` is typically used (e.g., by `HashMap`) to
-/// create [`Hasher`]s for each key such that they are hashed independently of
-/// one another, since [`Hasher`]s contain state.
+/// The [`BuildHasher`] associated with the interpreter is for creating
+/// instances of [`Hasher`]. A `BuildHasher` is typically used (e.g., by
+/// `HashMap`) to create [`Hasher`]s for each key such that they are hashed
+/// independently of one another, since [`Hasher`]s contain state.
///
/// By associating one [`BuildHasher`] with the interpreter, identical Ruby
/// objects should hash identically, even if the interpreter's [`BuildHasher`]
diff --git i/artichoke-core/src/intern.rs w/artichoke-core/src/intern.rs
index ca0fa7333b..34f247e5dc 100644
--- i/artichoke-core/src/intern.rs
+++ w/artichoke-core/src/intern.rs
@@ -10,7 +10,7 @@ use alloc::borrow::Cow;
/// Store and retrieve byte strings that have the same lifetime as the
/// interpreter.
///
-/// See the [Ruby `Symbol` type][symbol].
+/// See the [Ruby `Symbol` `Symbol` type][symbol].
///
/// [symbol]: https://ruby-doc.org/core-3.1.2/Symbol.html
pub trait Intern {
diff --git i/artichoke-core/src/lib.rs w/artichoke-core/src/lib.rs
index 974e342434..a208fb35ff 100644
--- i/artichoke-core/src/lib.rs
+++ w/artichoke-core/src/lib.rs
@@ -90,11 +90,11 @@
//!
//! # Examples
//!
-//! [`artichoke-backend`](https://artichoke.github.io/artichoke/artichoke_backend/)
+//! //! [`artichoke-backend`](https://artichoke.github.io/artichoke/artichoke_backend/)
//! is one implementation of the `artichoke-core` traits.
//!
-//! To use all the APIs defined in Artichoke Core, bring the traits into
-//! scope by importing the prelude:
+//! To use all the APIs defined in Artichoke Core, bring the traits into scope
+//! by importing the prelude:
//!
//! ```
//! use artichoke_core::prelude::*;
diff --git i/artichoke-core/src/load.rs w/artichoke-core/src/load.rs
index 1494181b3f..94301cde46 100644
--- i/artichoke-core/src/load.rs
+++ w/artichoke-core/src/load.rs
@@ -79,10 +79,9 @@ impl From<Required> for bool {
/// In Ruby, `load` is stateless. All sources passed to `load` are loaded for
/// every method call.
///
-/// Each time a file is loaded, it is parsed and executed by the
-/// interpreter. If the file executes without raising an error, the file is
-/// successfully loaded and Rust callers can expect a [`Loaded::Success`]
-/// variant.
+/// Each time a file is loaded, it is parsed and executed by the interpreter. If
+/// the file executes without raising an error, the file is successfully loaded
+/// and Rust callers can expect a [`Loaded::Success`] variant.
///
/// If the file raises an exception as it is required, Rust callers can expect
/// an `Err` variant. The file is not added to the set of loaded features.
@@ -125,14 +124,14 @@ pub trait LoadSources {
/// Concrete type for errors returned by `File::require`.
type Exception;
- /// Add a Rust extension hook to the virtual file system. A stub Ruby file is
- /// added to the file system and [`File::require`] will dynamically define
- /// Ruby items when invoked via `Kernel#require`.
+ /// Add a Rust extension hook to the virtual file system. A stub Ruby file
+ /// is added to the file system and [`File::require`] will dynamically
+ /// define Ruby items when invoked via `Kernel#require`.
///
- /// If `path` is a relative path, the Ruby source is added to the
- /// file system relative to `RUBY_LOAD_PATH`. If the path is absolute, the
- /// file is placed directly on the file system. Ancestor directories are
- /// created automatically.
+ /// If `path` is a relative path, the Ruby source is added to the file
+ /// system relative to `RUBY_LOAD_PATH`. If the path is absolute, the file
+ /// is placed directly on the file system. Ancestor directories are created
+ /// automatically.
///
/// # Errors
///
@@ -146,10 +145,10 @@ pub trait LoadSources {
/// Add a Ruby source to the virtual file system.
///
- /// If `path` is a relative path, the Ruby source is added to the
- /// file system relative to `RUBY_LOAD_PATH`. If the path is absolute, the
- /// file is placed directly on the file system. Ancestor directories are
- /// created automatically.
+ /// If `path` is a relative path, the Ruby source is added to the file
+ /// system relative to `RUBY_LOAD_PATH`. If the path is absolute, the file
+ /// is placed directly on the file system. Ancestor directories are created
+ /// automatically.
///
/// # Errors
///
@@ -219,8 +218,8 @@ pub trait LoadSources {
/// Require source located at the given path.
///
- /// Query the underlying virtual file system for a source file and require it
- /// onto the interpreter. This requires files with the following steps:
+ /// Query the underlying virtual file system for a source file and require
+ /// it onto the interpreter. This requires files with the following steps:
///
/// 1. Retrieve and execute the extension hook, if any.
/// 2. Read file contents and [`eval`](crate::eval::Eval) them.
diff --git i/artichoke-core/src/module_registry.rs w/artichoke-core/src/module_registry.rs
index 0e1c9a478d..9810582513 100644
--- i/artichoke-core/src/module_registry.rs
+++ w/artichoke-core/src/module_registry.rs
@@ -10,7 +10,8 @@ pub trait ModuleRegistry {
/// Concrete value type for boxed Ruby values.
type Value;
- /// Concrete error type for errors encountered when manipulating the module registry.
+ /// Concrete error type for errors encountered when manipulating the module
+ /// registry.
type Error;
/// Type representing a module specification.
@@ -27,7 +28,8 @@ pub trait ModuleRegistry {
where
T: Any;
- /// Retrieve a module definition from the interpreter bound to Rust type `T`.
+ /// Retrieve a module definition from the interpreter bound to Rust type
+ /// `T`.
///
/// This function returns `None` if type `T` has not had a module spec
/// registered for it using [`ModuleRegistry::def_module`].
diff --git i/artichoke-core/src/parser.rs w/artichoke-core/src/parser.rs
index ac49985def..b43e9bb2cc 100644
--- i/artichoke-core/src/parser.rs
+++ w/artichoke-core/src/parser.rs
@@ -66,8 +66,8 @@ pub trait Parser {
pub enum IncrementLinenoError {
/// An overflow occurred when incrementing the line number.
///
- /// This error is reported based on the internal parser storage width
- /// and contains the max value the parser can store.
+ /// This error is reported based on the internal parser storage width and
+ /// contains the max value the parser can store.
Overflow(usize),
}
diff --git i/artichoke-core/src/regexp.rs w/artichoke-core/src/regexp.rs
index 8ccc82cefd..7f2699f678 100644
--- i/artichoke-core/src/regexp.rs
+++ w/artichoke-core/src/regexp.rs
@@ -18,8 +18,8 @@ pub trait Regexp {
///
/// Per the Ruby documentation:
///
- /// > `$1`, `$2` and so on contain text matching first, second, etc capture
- /// > group.
+ /// > `$1`, `$2` and so on contain text matching first, second, etc capture >
+ /// group.
///
/// # Errors
///
@@ -34,8 +34,8 @@ pub trait Regexp {
///
/// Per the Ruby documentation:
///
- /// > `$1`, `$2` and so on contain text matching first, second, etc capture
- /// > group.
+ /// > `$1`, `$2` and so on contain text matching first, second, etc capture >
+ /// group.
///
/// # Errors
///
diff --git i/artichoke-core/src/types.rs w/artichoke-core/src/types.rs
index 24a9f33231..5f6b8df6bd 100644
--- i/artichoke-core/src/types.rs
+++ w/artichoke-core/src/types.rs
@@ -91,8 +91,8 @@ pub enum Ruby {
Object,
/// Ruby `Proc` type.
///
- /// `Proc` is a callable closure that captures lexical scope. `Proc`s can
- /// be arbitrary arity and may or may not enforce this arity when called.
+ /// `Proc` is a callable closure that captures lexical scope. `Proc`s can be
+ /// arbitrary arity and may or may not enforce this arity when called.
Proc,
/// Ruby `Range` type.
///
diff --git i/artichoke-load-path/src/rubylib.rs w/artichoke-load-path/src/rubylib.rs
index 910fa6c856..6ad4ad3cd4 100644
--- i/artichoke-load-path/src/rubylib.rs
+++ w/artichoke-load-path/src/rubylib.rs
@@ -83,9 +83,9 @@ impl Rubylib {
/// This source loader grants access to the host file system. The `Rubylib`
/// loader does not support native extensions.
///
- /// This method returns [`None`] if there are errors resolving the
- /// `RUBYLIB` environment variable, if the `RUBYLIB` environment variable is
- /// not set, if the current working directory cannot be retrieved, or if the
+ /// This method returns [`None`] if there are errors resolving the `RUBYLIB`
+ /// environment variable, if the `RUBYLIB` environment variable is not set,
+ /// if the current working directory cannot be retrieved, or if the
/// `RUBYLIB` environment variable does not contain any paths.
///
/// [current working directory]: env::current_dir
diff --git i/mezzaluna-feature-loader/src/feature/mod.rs w/mezzaluna-feature-loader/src/feature/mod.rs
index 7b0e97aa32..c1d1fa6e09 100644
--- i/mezzaluna-feature-loader/src/feature/mod.rs
+++ w/mezzaluna-feature-loader/src/feature/mod.rs
@@ -80,9 +80,9 @@ impl Feature {
/// Get the path associated with this feature.
///
- /// The path returned by this method is not guaranteed to be the same as
- /// the path returned by [`LoadedFeatures::features`] since features may
- /// be deduplicated by their physical location in the underlying loaders.
+ /// The path returned by this method is not guaranteed to be the same as the
+ /// path returned by [`LoadedFeatures::features`] since features may be
+ /// deduplicated by their physical location in the underlying loaders.
///
/// # Examples
///
diff --git i/mezzaluna-feature-loader/src/lib.rs w/mezzaluna-feature-loader/src/lib.rs
index 73b7123a79..a77a5e17b7 100644
--- i/mezzaluna-feature-loader/src/lib.rs
+++ w/mezzaluna-feature-loader/src/lib.rs
@@ -1,7 +1,7 @@
#![warn(clippy::all)]
#![warn(clippy::pedantic)]
#![warn(clippy::cargo)]
-#![allow(clippy::question_mark)] // https://github.com/rust-lang/rust-clippy/issues/8281
+#![allow(clippy::question_mark)] // //https://github.com/rust-lang/rust-clippy/issues/8281
#![allow(unknown_lints)]
#![warn(missing_docs)]
#![warn(missing_debug_implementations)]
diff --git i/mezzaluna-feature-loader/src/loaders/disk.rs w/mezzaluna-feature-loader/src/loaders/disk.rs
index f2f8e6846b..ac7ed49baf 100644
--- i/mezzaluna-feature-loader/src/loaders/disk.rs
+++ w/mezzaluna-feature-loader/src/loaders/disk.rs
@@ -123,8 +123,8 @@ impl Disk {
/// This source loader grants access to the host file system. The `Disk`
/// loader does not support native extensions.
///
- /// This method returns [`None`] if the given `load_path` does not contain any
- /// paths.
+ /// This method returns [`None`] if the given `load_path` does not contain
+ /// any paths.
///
/// [`load_path`]: Self::load_path
/// [`set_load_path`]: Self::set_load_path
diff --git i/mezzaluna-feature-loader/src/loaders/memory.rs w/mezzaluna-feature-loader/src/loaders/memory.rs
index d1ae68f143..182750e34c 100644
--- i/mezzaluna-feature-loader/src/loaders/memory.rs
+++ w/mezzaluna-feature-loader/src/loaders/memory.rs
@@ -100,8 +100,9 @@ impl Memory {
///
/// # Panics
///
- /// If the given path is an absolute path outside of this loader's [load
- /// path], this function will panic.
+ /// If the given path is an absolute path outside of this loader's
+ /// [load
+ path], this function will panic.
///
/// If the given path has already been inserted into the in-memory file
/// system, this function will panic.
@@ -150,8 +151,9 @@ impl Memory {
///
/// # Panics
///
- /// If the given path is an absolute path outside of this loader's [load
- /// path], this function will panic.
+ /// If the given path is an absolute path outside of this loader's
+ /// [load
+ path], this function will panic.
///
/// If the given path has already been inserted into the in-memory file
/// system, this function will panic.
diff --git i/mezzaluna-feature-loader/src/loaders/rubylib.rs w/mezzaluna-feature-loader/src/loaders/rubylib.rs
index ab1d558de1..f4451a976f 100644
--- i/mezzaluna-feature-loader/src/loaders/rubylib.rs
+++ w/mezzaluna-feature-loader/src/loaders/rubylib.rs
@@ -79,9 +79,9 @@ impl Rubylib {
/// This source loader grants access to the host file system. The `Rubylib`
/// loader does not support native extensions.
///
- /// This method returns [`None`] if there are errors resolving the
- /// `RUBYLIB` environment variable, if the `RUBYLIB` environment variable is
- /// not set, if the current working directory cannot be retrieved, or if the
+ /// This method returns [`None`] if there are errors resolving the `RUBYLIB`
+ /// environment variable, if the `RUBYLIB` environment variable is not set,
+ /// if the current working directory cannot be retrieved, or if the
/// `RUBYLIB` environment variable does not contain any paths.
///
/// [current working directory]: env::current_dir
diff --git i/scolapasta-aref/src/lib.rs w/scolapasta-aref/src/lib.rs
index d2c7cdbb07..8d517ce743 100644
--- i/scolapasta-aref/src/lib.rs
+++ w/scolapasta-aref/src/lib.rs
@@ -36,7 +36,8 @@
#![no_std]
-/// Convert a signed aref offset to a `usize` index into the underlying container.
+/// Convert a signed aref offset to a `usize` index into the underlying
+/// container.
///
/// Negative indexes are interpreted as indexing from the end of the container
/// as long as their magnitude is less than the given length.
diff --git i/scolapasta-int-parse/src/error.rs w/scolapasta-int-parse/src/error.rs
index cfc3ff4ed3..a8f935e52f 100644
--- i/scolapasta-int-parse/src/error.rs
+++ w/scolapasta-int-parse/src/error.rs
@@ -170,8 +170,8 @@ pub enum InvalidRadixExceptionKind {
///
/// [`ArgumentError`]: https://ruby-doc.org/core-3.1.2/ArgumentError.html
ArgumentError,
- /// If the given radix falls outside the range of an [`i32`], the error should
- /// be mapped to a [`RangeError`]:
+ /// If the given radix falls outside the range of an [`i32`], the error
+ /// should be mapped to a [`RangeError`]:
///
/// ```console
/// [3.1.2] > begin; Integer "123", (2 ** 31 + 1); rescue => e; p e; end
diff --git i/scolapasta-int-parse/src/lib.rs w/scolapasta-int-parse/src/lib.rs
index 0045e39440..b0ab0e9e09 100644
--- i/scolapasta-int-parse/src/lib.rs
+++ w/scolapasta-int-parse/src/lib.rs
@@ -20,7 +20,8 @@
//! Parse a given byte string and optional radix into an [`i64`].
//!
-//! [`parse`] wraps [`i64::from_str_radix`] by normalizing the input byte string:
+//! [`parse`] wraps [`i64::from_str_radix`] by normalizing the input byte
+//! string:
//!
//! - Assert the byte string is ASCII and does not contain NUL bytes.
//! - Parse the radix to ensure it is in range and valid for the given input
diff --git i/scolapasta-int-parse/src/parser.rs w/scolapasta-int-parse/src/parser.rs
index 805c982159..2fe647ec76 100644
--- i/scolapasta-int-parse/src/parser.rs
+++ w/scolapasta-int-parse/src/parser.rs
@@ -46,9 +46,10 @@ impl<'a> State<'a> {
// => 21
// ```
//
- // In bases below 10, the string representation for large numbers will
- // be longer, but pre-allocating for these uncommon cases seems wasteful.
- // The `String` will reallocate if it needs to in these pathological cases.
+ // In bases below 10, the string representation for large numbers will be
+ //longer, but pre-allocating for these uncommon cases seems wasteful.
+ //The `String` will reallocate if it needs to in these pathological
+ //cases.
const PRE_ALLOCATED_DIGIT_CAPACITY: usize = 21;
match self {
diff --git i/scolapasta-int-parse/src/radix.rs w/scolapasta-int-parse/src/radix.rs
index 4a6f70b9ca..5d9b069809 100644
--- i/scolapasta-int-parse/src/radix.rs
+++ w/scolapasta-int-parse/src/radix.rs
@@ -595,10 +595,8 @@ mod tests {
#[test]
fn negative_radix_with_inline_base_and_leading_spaces_ignores() {
- // [3.1.2] > Integer " 0123", -6
- // => 83
- // [3.1.2] > Integer " 0x123", -6
- // => 291
+ // [3.1.2] > Integer " 0123", -6 => 83 [3.1.2] > Integer " 0x123", -6 =>
+ //291
let subject = " 0123".try_into().unwrap();
let radix = Radix::try_base_from_str_and_i64(subject, -6).unwrap();
assert_eq!(radix, None);
diff --git i/scolapasta-path/src/paths/windows.rs w/scolapasta-path/src/paths/windows.rs
index 3756ddb478..7bc90bb8fc 100644
--- i/scolapasta-path/src/paths/windows.rs
+++ w/scolapasta-path/src/paths/windows.rs
@@ -196,8 +196,8 @@ mod tests {
// ([]uint16=`[0xdcc0 0x2e 0x74 0x78 0x74]`)
// ```
//
- // and attempt to read it by calling `ioutil.ReadDir` and reading all
- // the files that come back.
+ // and attempt to read it by calling `ioutil.ReadDir` and reading all the
+ //files that come back.
//
// See: https://github.com/golang/go/issues/32334#issue-450436484
diff --git i/scolapasta-string-escape/src/string.rs w/scolapasta-string-escape/src/string.rs
index 315358bf22..cc40782ded 100644
--- i/scolapasta-string-escape/src/string.rs
+++ w/scolapasta-string-escape/src/string.rs
@@ -25,8 +25,7 @@ use crate::literal::{ascii_char_with_escape, Literal};
///
/// # Errors
///
-/// This method only returns an error when the given writer returns an
-/// error.
+/// This method only returns an error when the given writer returns an error.
pub fn format_debug_escape_into<W, T>(mut dest: W, message: T) -> fmt::Result
where
W: Write,
diff --git i/spinoso-array/src/array/mod.rs w/spinoso-array/src/array/mod.rs
index 0382db5e4d..f7818fc6ae 100644
--- i/spinoso-array/src/array/mod.rs
+++ w/spinoso-array/src/array/mod.rs
@@ -5,8 +5,8 @@
//! in `std`. [`SmallArray`](smallvec::SmallArray) is based on [`SmallVec`].
//! [`TinyArray`](tinyvec::TinyArray) is based on [`TinyVec`].
//!
-//! The smallvec backend uses small vector optimization to store
-//! [some elements][inline-capacity] inline without spilling to the heap.
+//! The smallvec backend uses small vector optimization to store [some
+//! elements][inline-capacity] inline without spilling to the heap.
//!
//! The `SmallArray` backend requires the `small-array` Cargo feature to be
//! enabled.
diff --git i/spinoso-array/src/array/smallvec/mod.rs w/spinoso-array/src/array/smallvec/mod.rs
index b8ddb284cc..4e77346e9f 100644
--- i/spinoso-array/src/array/smallvec/mod.rs
+++ w/spinoso-array/src/array/smallvec/mod.rs
@@ -481,7 +481,7 @@ impl<T> SmallArray<T> {
/// Returns a reference to an element at the index.
///
- /// Unlike [`Vec`], this method does not support indexing with a range. See
+ /// Unlike [`Vec`], this method does not support indexing with a range. See
/// the [`slice`](Self::slice) method for retrieving a sub-slice from the
/// array.
///
diff --git i/spinoso-array/src/array/tinyvec/mod.rs w/spinoso-array/src/array/tinyvec/mod.rs
index 8c6aea2e71..3c87deaea0 100644
--- i/spinoso-array/src/array/tinyvec/mod.rs
+++ w/spinoso-array/src/array/tinyvec/mod.rs
@@ -476,7 +476,7 @@ where
/// Returns a reference to an element at the index.
///
- /// Unlike [`Vec`], this method does not support indexing with a range. See
+ /// Unlike [`Vec`], this method does not support indexing with a range. See
/// the [`slice`](Self::slice) method for retrieving a sub-slice from the
/// array.
///
@@ -882,8 +882,8 @@ impl<T> TinyArray<T>
where
T: Clone + Default,
{
- /// Construct a new `TinyArray<T>` with length `len` and all elements set
- /// to `default`. The `TinyArray` will have capacity at least `len`.
+ /// Construct a new `TinyArray<T>` with length `len` and all elements set to
+ /// `default`. The `TinyArray` will have capacity at least `len`.
///
/// # Examples
///
diff --git i/spinoso-array/src/array/vec/mod.rs w/spinoso-array/src/array/vec/mod.rs
index 98385ed9cf..28e37ab67c 100644
--- i/spinoso-array/src/array/vec/mod.rs
+++ w/spinoso-array/src/array/vec/mod.rs
@@ -501,7 +501,7 @@ impl<T> Array<T> {
/// Returns a reference to an element at the index.
///
- /// Unlike [`Vec`], this method does not support indexing with a range. See
+ /// Unlike [`Vec`], this method does not support indexing with a range. See
/// the [`slice`](Self::slice) method for retrieving a sub-slice from the
/// array.
///
diff --git i/spinoso-array/src/lib.rs w/spinoso-array/src/lib.rs
index 6e2dc43963..3572bdb945 100644
--- i/spinoso-array/src/lib.rs
+++ w/spinoso-array/src/lib.rs
@@ -109,8 +109,8 @@
//!
//! # Panics
//!
-//! `Array`s in this crate do not expose panicking slicing operations (except for
-//! their [`Index`] and [`IndexMut`] implementations). Instead of panicking,
+//! `Array`s in this crate do not expose panicking slicing operations (except
+//! for their [`Index`] and [`IndexMut`] implementations). Instead of panicking,
//! slicing APIs operate until the end of the vector or return `&[]`. Mutating
//! APIs extend `Array`s on out of bounds access.
//!
diff --git i/spinoso-env/src/env/memory.rs w/spinoso-env/src/env/memory.rs
index 7cf4477520..20838817b2 100644
--- i/spinoso-env/src/env/memory.rs
+++ w/spinoso-env/src/env/memory.rs
@@ -84,9 +84,9 @@ impl Memory {
// https://doc.rust-lang.org/std/env/fn.set_var.html
// https://doc.rust-lang.org/std/env/fn.remove_var.html
//
- // This function may panic if key is empty, contains an ASCII equals
- // sign '=' or the NUL character '\0', or when the value contains the
- // NUL character.
+ // This function may panic if key is empty, contains an ASCII equals sign
+ //'=' or the NUL character '\0', or when the value contains the NUL
+ //character.
if name.is_empty() {
// MRI accepts empty names on get and should always return `nil`
// since empty names are invalid at the OS level.
@@ -142,9 +142,9 @@ impl Memory {
// https://doc.rust-lang.org/std/env/fn.set_var.html
// https://doc.rust-lang.org/std/env/fn.remove_var.html
//
- // This function may panic if key is empty, contains an ASCII equals
- // sign '=' or the NUL character '\0', or when the value contains the
- // NUL character.
+ // This function may panic if key is empty, contains an ASCII equals sign
+ //'=' or the NUL character '\0', or when the value contains the NUL
+ //character.
if name.find_byte(b'\0').is_some() {
let message = "bad environment variable name: contains null byte";
Err(ArgumentError::with_message(message).into())
diff --git i/spinoso-env/src/env/system.rs w/spinoso-env/src/env/system.rs
index ac25525d75..9b081534f8 100644
--- i/spinoso-env/src/env/system.rs
+++ w/spinoso-env/src/env/system.rs
@@ -80,7 +80,8 @@ impl System {
///
/// # Implementation notes
///
- /// This method accesses the host system's environment using [`env::var_os`].
+ /// This method accesses the host system's environment using
+ /// [`env::var_os`].
///
/// # Examples
///
@@ -109,9 +110,9 @@ impl System {
// https://doc.rust-lang.org/std/env/fn.set_var.html
// https://doc.rust-lang.org/std/env/fn.remove_var.html
//
- // This function may panic if key is empty, contains an ASCII equals
- // sign '=' or the NUL character '\0', or when the value contains the
- // NUL character.
+ // This function may panic if key is empty, contains an ASCII equals sign
+ //'=' or the NUL character '\0', or when the value contains the NUL
+ //character.
if name.is_empty() {
// MRI accepts empty names on get and should always return `nil`
// since empty names are invalid at the OS level.
@@ -140,8 +141,8 @@ impl System {
///
/// # Implementation notes
///
- /// This method accesses the host system's environment using [`env::set_var`]
- /// and [`env::remove_var`].
+ /// This method accesses the host system's environment using
+ /// [`env::set_var`] and [`env::remove_var`].
///
/// # Examples
///
@@ -181,9 +182,9 @@ impl System {
// https://doc.rust-lang.org/std/env/fn.set_var.html
// https://doc.rust-lang.org/std/env/fn.remove_var.html
//
- // This function may panic if key is empty, contains an ASCII equals
- // sign '=' or the NUL character '\0', or when the value contains the
- // NUL character.
+ // This function may panic if key is empty, contains an ASCII equals sign
+ //'=' or the NUL character '\0', or when the value contains the NUL
+ //character.
if name.find_byte(b'\0').is_some() {
let message = "bad environment variable name: contains null byte";
Err(ArgumentError::with_message(message).into())
@@ -222,7 +223,8 @@ impl System {
///
/// # Implementation notes
///
- /// This method accesses the host system's environment using [`env::vars_os`].
+ /// This method accesses the host system's environment using
+ /// [`env::vars_os`].
///
/// # Examples
///
diff --git i/spinoso-env/src/lib.rs w/spinoso-env/src/lib.rs
index 2d629d1212..3beef86c11 100644
--- i/spinoso-env/src/lib.rs
+++ w/spinoso-env/src/lib.rs
@@ -47,7 +47,8 @@
//!
//! # Examples
//!
-//! Using the in-memory backend allows safely manipulating an emulated environment:
+//! Using the in-memory backend allows safely manipulating an emulated
+//! environment:
//!
//! ```
//! # use spinoso_env::Memory;
@@ -186,7 +187,8 @@ impl error::Error for Error {
///
/// Argument errors have an associated message.
///
-/// This error corresponds to the [Ruby `ArgumentError` Exception class].
+/// This error corresponds to the [Ruby `ArgumentError`
+/// `ArgumentError` Exception class].
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/argumenterror.rs w/spinoso-exception/src/core/argumenterror.rs
index 276969db2c..86c5373a17 100644
--- i/spinoso-exception/src/core/argumenterror.rs
+++ w/spinoso-exception/src/core/argumenterror.rs
@@ -45,15 +45,14 @@ impl ArgumentError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"ArgumentError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `ArgumentError` Ruby exception with the given
- /// message.
+ /// Construct a new, `ArgumentError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/encodingerror.rs w/spinoso-exception/src/core/encodingerror.rs
index bce3a2e8a1..04313d98f3 100644
--- i/spinoso-exception/src/core/encodingerror.rs
+++ w/spinoso-exception/src/core/encodingerror.rs
@@ -45,15 +45,14 @@ impl EncodingError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"EncodingError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `EncodingError` Ruby exception with the given
- /// message.
+ /// Construct a new, `EncodingError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/eoferror.rs w/spinoso-exception/src/core/eoferror.rs
index 65fa57f16e..deef23437d 100644
--- i/spinoso-exception/src/core/eoferror.rs
+++ w/spinoso-exception/src/core/eoferror.rs
@@ -46,15 +46,14 @@ impl EOFError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"EOFError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `EOFError` Ruby exception with the given
- /// message.
+ /// Construct a new, `EOFError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/exception.rs w/spinoso-exception/src/core/exception.rs
index add5b17347..0fd23e2a1f 100644
--- i/spinoso-exception/src/core/exception.rs
+++ w/spinoso-exception/src/core/exception.rs
@@ -45,15 +45,14 @@ impl Exception {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"Exception";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `Exception` Ruby exception with the given
- /// message.
+ /// Construct a new, `Exception` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/fatal.rs w/spinoso-exception/src/core/fatal.rs
index ba05401c89..32e2846d41 100644
--- i/spinoso-exception/src/core/fatal.rs
+++ w/spinoso-exception/src/core/fatal.rs
@@ -45,15 +45,14 @@ impl Fatal {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"fatal";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `fatal` Ruby exception with the given
- /// message.
+ /// Construct a new, `fatal` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/fibererror.rs w/spinoso-exception/src/core/fibererror.rs
index 2a57d97961..e51fe9b622 100644
--- i/spinoso-exception/src/core/fibererror.rs
+++ w/spinoso-exception/src/core/fibererror.rs
@@ -45,15 +45,14 @@ impl FiberError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"FiberError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `FiberError` Ruby exception with the given
- /// message.
+ /// Construct a new, `FiberError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/floatdomainerror.rs w/spinoso-exception/src/core/floatdomainerror.rs
index d69464b0ff..92e7023cf4 100644
--- i/spinoso-exception/src/core/floatdomainerror.rs
+++ w/spinoso-exception/src/core/floatdomainerror.rs
@@ -45,9 +45,9 @@ impl FloatDomainError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"FloatDomainError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
diff --git i/spinoso-exception/src/core/frozenerror.rs w/spinoso-exception/src/core/frozenerror.rs
index a68be40b7b..380c7358bf 100644
--- i/spinoso-exception/src/core/frozenerror.rs
+++ w/spinoso-exception/src/core/frozenerror.rs
@@ -45,15 +45,14 @@ impl FrozenError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"FrozenError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `FrozenError` Ruby exception with the given
- /// message.
+ /// Construct a new, `FrozenError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/indexerror.rs w/spinoso-exception/src/core/indexerror.rs
index dd61dcf331..970214c051 100644
--- i/spinoso-exception/src/core/indexerror.rs
+++ w/spinoso-exception/src/core/indexerror.rs
@@ -45,15 +45,14 @@ impl IndexError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"IndexError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `IndexError` Ruby exception with the given
- /// message.
+ /// Construct a new, `IndexError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/interrupt.rs w/spinoso-exception/src/core/interrupt.rs
index f9924ca1ea..3c7fa3cdae 100644
--- i/spinoso-exception/src/core/interrupt.rs
+++ w/spinoso-exception/src/core/interrupt.rs
@@ -45,15 +45,14 @@ impl Interrupt {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"Interrupt";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `Interrupt` Ruby exception with the given
- /// message.
+ /// Construct a new, `Interrupt` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/ioerror.rs w/spinoso-exception/src/core/ioerror.rs
index 0f926d29d9..96c214ece6 100644
--- i/spinoso-exception/src/core/ioerror.rs
+++ w/spinoso-exception/src/core/ioerror.rs
@@ -46,15 +46,14 @@ impl IOError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"IOError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `IOError` Ruby exception with the given
- /// message.
+ /// Construct a new, `IOError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/keyerror.rs w/spinoso-exception/src/core/keyerror.rs
index b5049f24c3..f46ec41427 100644
--- i/spinoso-exception/src/core/keyerror.rs
+++ w/spinoso-exception/src/core/keyerror.rs
@@ -45,15 +45,14 @@ impl KeyError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"KeyError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `KeyError` Ruby exception with the given
- /// message.
+ /// Construct a new, `KeyError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/loaderror.rs w/spinoso-exception/src/core/loaderror.rs
index 93b0ce489f..c281941608 100644
--- i/spinoso-exception/src/core/loaderror.rs
+++ w/spinoso-exception/src/core/loaderror.rs
@@ -45,15 +45,14 @@ impl LoadError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"LoadError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `LoadError` Ruby exception with the given
- /// message.
+ /// Construct a new, `LoadError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/localjumperror.rs w/spinoso-exception/src/core/localjumperror.rs
index 470a9430ed..d6cd4757a2 100644
--- i/spinoso-exception/src/core/localjumperror.rs
+++ w/spinoso-exception/src/core/localjumperror.rs
@@ -45,15 +45,14 @@ impl LocalJumpError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"LocalJumpError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `LocalJumpError` Ruby exception with the given
- /// message.
+ /// Construct a new, `LocalJumpError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/nameerror.rs w/spinoso-exception/src/core/nameerror.rs
index 6a1912d8c5..83c2f72c77 100644
--- i/spinoso-exception/src/core/nameerror.rs
+++ w/spinoso-exception/src/core/nameerror.rs
@@ -45,15 +45,14 @@ impl NameError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"NameError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `NameError` Ruby exception with the given
- /// message.
+ /// Construct a new, `NameError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/nomemoryerror.rs w/spinoso-exception/src/core/nomemoryerror.rs
index c629495ed3..581db5baeb 100644
--- i/spinoso-exception/src/core/nomemoryerror.rs
+++ w/spinoso-exception/src/core/nomemoryerror.rs
@@ -45,15 +45,14 @@ impl NoMemoryError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"NoMemoryError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `NoMemoryError` Ruby exception with the given
- /// message.
+ /// Construct a new, `NoMemoryError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/nomethoderror.rs w/spinoso-exception/src/core/nomethoderror.rs
index 51eb9cc97d..f56c0e8c03 100644
--- i/spinoso-exception/src/core/nomethoderror.rs
+++ w/spinoso-exception/src/core/nomethoderror.rs
@@ -45,15 +45,14 @@ impl NoMethodError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"NoMethodError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `NoMethodError` Ruby exception with the given
- /// message.
+ /// Construct a new, `NoMethodError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/notimplementederror.rs w/spinoso-exception/src/core/notimplementederror.rs
index e736cd650e..08e355513c 100644
--- i/spinoso-exception/src/core/notimplementederror.rs
+++ w/spinoso-exception/src/core/notimplementederror.rs
@@ -45,9 +45,9 @@ impl NotImplementedError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"NotImplementedError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
diff --git i/spinoso-exception/src/core/rangeerror.rs w/spinoso-exception/src/core/rangeerror.rs
index 1559606ff4..eac71799fe 100644
--- i/spinoso-exception/src/core/rangeerror.rs
+++ w/spinoso-exception/src/core/rangeerror.rs
@@ -45,15 +45,14 @@ impl RangeError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"RangeError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `RangeError` Ruby exception with the given
- /// message.
+ /// Construct a new, `RangeError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/regexperror.rs w/spinoso-exception/src/core/regexperror.rs
index 05a44aca00..418358d434 100644
--- i/spinoso-exception/src/core/regexperror.rs
+++ w/spinoso-exception/src/core/regexperror.rs
@@ -45,15 +45,14 @@ impl RegexpError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"RegexpError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `RegexpError` Ruby exception with the given
- /// message.
+ /// Construct a new, `RegexpError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/runtimeerror.rs w/spinoso-exception/src/core/runtimeerror.rs
index 11eb629e7d..116690f327 100644
--- i/spinoso-exception/src/core/runtimeerror.rs
+++ w/spinoso-exception/src/core/runtimeerror.rs
@@ -45,15 +45,14 @@ impl RuntimeError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"RuntimeError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `RuntimeError` Ruby exception with the given
- /// message.
+ /// Construct a new, `RuntimeError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/scripterror.rs w/spinoso-exception/src/core/scripterror.rs
index c632f5a862..0322b08048 100644
--- i/spinoso-exception/src/core/scripterror.rs
+++ w/spinoso-exception/src/core/scripterror.rs
@@ -45,15 +45,14 @@ impl ScriptError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"ScriptError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `ScriptError` Ruby exception with the given
- /// message.
+ /// Construct a new, `ScriptError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/securityerror.rs w/spinoso-exception/src/core/securityerror.rs
index f8706531e6..20b5467c4f 100644
--- i/spinoso-exception/src/core/securityerror.rs
+++ w/spinoso-exception/src/core/securityerror.rs
@@ -45,15 +45,14 @@ impl SecurityError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"SecurityError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `SecurityError` Ruby exception with the given
- /// message.
+ /// Construct a new, `SecurityError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/signalexception.rs w/spinoso-exception/src/core/signalexception.rs
index 77e01b511b..28246a0382 100644
--- i/spinoso-exception/src/core/signalexception.rs
+++ w/spinoso-exception/src/core/signalexception.rs
@@ -45,9 +45,9 @@ impl SignalException {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"SignalException";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
diff --git i/spinoso-exception/src/core/standarderror.rs w/spinoso-exception/src/core/standarderror.rs
index 310e75db53..c5e8c1de78 100644
--- i/spinoso-exception/src/core/standarderror.rs
+++ w/spinoso-exception/src/core/standarderror.rs
@@ -45,15 +45,14 @@ impl StandardError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"StandardError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `StandardError` Ruby exception with the given
- /// message.
+ /// Construct a new, `StandardError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/stopiteration.rs w/spinoso-exception/src/core/stopiteration.rs
index 9653d851a6..5309cffba2 100644
--- i/spinoso-exception/src/core/stopiteration.rs
+++ w/spinoso-exception/src/core/stopiteration.rs
@@ -45,15 +45,14 @@ impl StopIteration {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"StopIteration";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `StopIteration` Ruby exception with the given
- /// message.
+ /// Construct a new, `StopIteration` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/syntaxerror.rs w/spinoso-exception/src/core/syntaxerror.rs
index 84556aba49..89784c1715 100644
--- i/spinoso-exception/src/core/syntaxerror.rs
+++ w/spinoso-exception/src/core/syntaxerror.rs
@@ -45,15 +45,14 @@ impl SyntaxError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"SyntaxError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `SyntaxError` Ruby exception with the given
- /// message.
+ /// Construct a new, `SyntaxError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/systemcallerror.rs w/spinoso-exception/src/core/systemcallerror.rs
index eac05c8bf7..8c1e8f8727 100644
--- i/spinoso-exception/src/core/systemcallerror.rs
+++ w/spinoso-exception/src/core/systemcallerror.rs
@@ -45,9 +45,9 @@ impl SystemCallError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"SystemCallError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
diff --git i/spinoso-exception/src/core/systemexit.rs w/spinoso-exception/src/core/systemexit.rs
index 96fcc43e02..cb1e6287cc 100644
--- i/spinoso-exception/src/core/systemexit.rs
+++ w/spinoso-exception/src/core/systemexit.rs
@@ -45,15 +45,14 @@ impl SystemExit {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"SystemExit";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `SystemExit` Ruby exception with the given
- /// message.
+ /// Construct a new, `SystemExit` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/systemstackerror.rs w/spinoso-exception/src/core/systemstackerror.rs
index 1d7f73b580..dc767c7539 100644
--- i/spinoso-exception/src/core/systemstackerror.rs
+++ w/spinoso-exception/src/core/systemstackerror.rs
@@ -45,9 +45,9 @@ impl SystemStackError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"SystemStackError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
diff --git i/spinoso-exception/src/core/threaderror.rs w/spinoso-exception/src/core/threaderror.rs
index 9f55fb12e2..90a09a19f5 100644
--- i/spinoso-exception/src/core/threaderror.rs
+++ w/spinoso-exception/src/core/threaderror.rs
@@ -45,15 +45,14 @@ impl ThreadError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"ThreadError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `ThreadError` Ruby exception with the given
- /// message.
+ /// Construct a new, `ThreadError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/typeerror.rs w/spinoso-exception/src/core/typeerror.rs
index f099b21a2d..591e1c9912 100644
--- i/spinoso-exception/src/core/typeerror.rs
+++ w/spinoso-exception/src/core/typeerror.rs
@@ -45,15 +45,14 @@ impl TypeError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"TypeError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
- /// Construct a new, `TypeError` Ruby exception with the given
- /// message.
+ /// Construct a new, `TypeError` Ruby exception with the given message.
///
/// # Examples
///
diff --git i/spinoso-exception/src/core/uncaughtthrowerror.rs w/spinoso-exception/src/core/uncaughtthrowerror.rs
index 3f7de347a8..9b35f69fba 100644
--- i/spinoso-exception/src/core/uncaughtthrowerror.rs
+++ w/spinoso-exception/src/core/uncaughtthrowerror.rs
@@ -45,9 +45,9 @@ impl UncaughtThrowError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"UncaughtThrowError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
diff --git i/spinoso-exception/src/core/zerodivisionerror.rs w/spinoso-exception/src/core/zerodivisionerror.rs
index 02d692e0cb..faad1c22a1 100644
--- i/spinoso-exception/src/core/zerodivisionerror.rs
+++ w/spinoso-exception/src/core/zerodivisionerror.rs
@@ -45,9 +45,9 @@ impl ZeroDivisionError {
pub const fn new() -> Self {
const DEFAULT_MESSAGE: &[u8] = b"ZeroDivisionError";
- // `Exception` objects initialized via (for example)
- // `raise RuntimeError` or `RuntimeError.new` have `message`
- // equal to the exception's class name.
+ // `Exception` objects initialized via (for example) `raise RuntimeError`
+ //or `RuntimeError.new` have `message` equal to the exception's class
+ //name.
let message = Cow::Borrowed(DEFAULT_MESSAGE);
Self { message }
}
diff --git i/spinoso-math/src/lib.rs w/spinoso-math/src/lib.rs
index 7fbc09aea2..32099630a1 100644
--- i/spinoso-math/src/lib.rs
+++ w/spinoso-math/src/lib.rs
@@ -196,9 +196,9 @@ impl error::Error for Error {
///
/// Domain errors have an associated message.
///
-/// This error corresponds to the [Ruby `Math::DomainError` Exception class]. It
-/// can be used to differentiate between [`NaN`](f64::NAN) inputs and what would
-/// be `NaN` outputs.
+/// This error corresponds to the [Ruby `Math::DomainError`
+/// `Math::DomainError` Exception class]. It can be used to differentiate
+/// between [`NaN`](f64::NAN) inputs and what would be `NaN` outputs.
///
/// # Examples
///
diff --git i/spinoso-math/src/math.rs w/spinoso-math/src/math.rs
index cd8610803a..30428d9be8 100644
--- i/spinoso-math/src/math.rs
+++ w/spinoso-math/src/math.rs
@@ -208,8 +208,8 @@ pub fn atan2(value: f64, other: f64) -> f64 {
///
/// # Errors
///
-/// If the result of computing the inverse hyperbolic tangent is [`NAN`]
-/// a domain error is returned.
+/// If the result of computing the inverse hyperbolic tangent is [`NAN`] a
+/// domain error is returned.
///
/// [`NAN`]: f64::NAN
#[inline]
@@ -486,29 +486,12 @@ pub fn gamma(value: f64) -> Result<f64, DomainError> {
// and might be an approximation so include a lookup table for as many `n`
// as can fit in the float mantissa.
const FACTORIAL_TABLE: [f64; 23] = [
- 1.0_f64, // fact(0)
- 1.0, // fact(1)
- 2.0, // fact(2)
- 6.0, // fact(3)
- 24.0, // fact(4)
- 120.0, // fact(5)
- 720.0, // fact(6)
- 5_040.0, // fact(7)
- 40_320.0, // fact(8)
- 362_880.0, // fact(9)
- 3_628_800.0, // fact(10)
- 39_916_800.0, // fact(11)
- 479_001_600.0, // fact(12)
- 6_227_020_800.0, // fact(13)
- 87_178_291_200.0, // fact(14)
- 1_307_674_368_000.0, // fact(15)
- 20_922_789_888_000.0, // fact(16)
- 355_687_428_096_000.0, // fact(17)
- 6_402_373_705_728_000.0, // fact(18)
- 121_645_100_408_832_000.0, // fact(19)
- 2_432_902_008_176_640_000.0, // fact(20)
- 51_090_942_171_709_440_000.0, // fact(21)
- 1_124_000_727_777_607_680_000.0, // fact(22)
+ 1.0_f64, // fact(0) fact(1) fact(2) fact(3)
+ //fact(4) fact(5) fact(6) fact(7)
+ //fact(8) fact(9) fact(10) fact(11)
+ //fact(12) fact(13) fact(14) fact(15)
+ //fact(16) fact(17) fact(18) fact(19)
+ //fact(20) fact(21) fact(22)
];
match value {
value if value.is_infinite() && value.is_sign_negative() => Err(DomainError::with_message(
diff --git i/spinoso-random/src/lib.rs w/spinoso-random/src/lib.rs
index 710453fdf5..c3088ddd3f 100644
--- i/spinoso-random/src/lib.rs
+++ w/spinoso-random/src/lib.rs
@@ -265,7 +265,8 @@ impl error::Error for InitializeError {}
/// This error is returned by [`urandom()`]. See its documentation for more
/// details.
///
-/// This error corresponds to the [Ruby `RuntimeError` Exception class].
+/// This error corresponds to the [Ruby `RuntimeError`
+/// `RuntimeError` Exception class].
///
/// # Examples
///
@@ -332,7 +333,8 @@ impl error::Error for UrandomError {}
/// This error is returned by [`new_seed`]. See its documentation for more
/// details.
///
-/// This error corresponds to the [Ruby `RuntimeError` Exception class].
+/// This error corresponds to the [Ruby `RuntimeError`
+/// `RuntimeError` Exception class].
///
/// # Examples
///
@@ -397,7 +399,8 @@ impl error::Error for NewSeedError {}
/// This error is returned by [`rand()`]. See its documentation for more
/// details.
///
-/// This error corresponds to the [Ruby `ArgumentError` Exception class].
+/// This error corresponds to the [Ruby `ArgumentError`
+/// `ArgumentError` Exception class].
///
/// # Examples
///
diff --git i/spinoso-regexp/src/debug.rs w/spinoso-regexp/src/debug.rs
index 794adeb4ca..d55f583fc1 100644
--- i/spinoso-regexp/src/debug.rs
+++ w/spinoso-regexp/src/debug.rs
@@ -52,8 +52,7 @@ impl Delimiters {
///
/// # Examples
///
-/// UTF-8 regexp patterns and options are formatted in a debug
-/// representation:
+/// UTF-8 regexp patterns and options are formatted in a debug representation:
///
/// ```
/// use spinoso_regexp::Debug;
@@ -95,8 +94,9 @@ pub struct Debug<'a> {
}
impl<'a> Debug<'a> {
- /// Construct a new `Debug` iterator with a regexp source, [options
- /// modifiers], and [encoding modifiers].
+ /// Construct a new `Debug` iterator with a regexp source,
+ /// [options
+ modifiers], and [encoding modifiers].
///
/// # Examples
///
@@ -199,8 +199,8 @@ impl<'a> Iterator for Debug<'a> {
self.source = &self.source[size..];
Some(ch)
}
- // Otherwise, we've gotten invalid UTF-8, which means this is not a
- // printable char.
+ // Otherwise, we've gotten invalid UTF-8, which means this is not
+ //a printable char.
None => {
let (chunk, remainder) = self.source.split_at(size);
self.source = remainder;
diff --git i/spinoso-regexp/src/encoding.rs w/spinoso-regexp/src/encoding.rs
index 879da6a0a7..179bf7d89a 100644
--- i/spinoso-regexp/src/encoding.rs
+++ w/spinoso-regexp/src/encoding.rs
@@ -32,10 +32,10 @@ impl error::Error for InvalidEncodingError {}
/// The encoding of a Regexp literal.
///
-/// Regexps are assumed to use the source encoding but literals may override
-/// the encoding with a Regexp modifier.
+/// Regexps are assumed to use the source encoding but literals may override the
+/// encoding with a Regexp modifier.
///
-/// See [`Regexp` encoding][regexp-encoding].
+/// See [`Regexp` [`Regexp` encoding][regexp-encoding].
///
/// [regexp-encoding]: https://ruby-doc.org/core-3.1.2/Regexp.html#class-Regexp-label-Encoding
#[derive(Debug, Clone, Copy, PartialOrd, Ord)]
diff --git i/spinoso-regexp/src/error.rs w/spinoso-regexp/src/error.rs
index 00aceb73f6..656a74c08a 100644
--- i/spinoso-regexp/src/error.rs
+++ w/spinoso-regexp/src/error.rs
@@ -63,7 +63,8 @@ impl error::Error for Error {
///
/// Argument errors have an associated message.
///
-/// This error corresponds to the [Ruby `ArgumentError` Exception class].
+/// This error corresponds to the [Ruby `ArgumentError`
+/// `ArgumentError` Exception class].
///
/// # Examples
///
diff --git i/spinoso-regexp/src/lib.rs w/spinoso-regexp/src/lib.rs
index d360452d54..71f4242047 100644
--- i/spinoso-regexp/src/lib.rs
+++ w/spinoso-regexp/src/lib.rs
@@ -3,8 +3,7 @@
#![warn(clippy::cargo)]
#![cfg_attr(test, allow(clippy::non_ascii_literal))]
#![allow(unknown_lints)]
-// TODO: warn on missing docs once crate is API-complete.
-// #![warn(missing_docs)]
+// TODO: warn on missing docs once crate is API-complete. #![warn(missing_docs)]
#![warn(missing_debug_implementations)]
#![warn(missing_copy_implementations)]
#![warn(rust_2018_idioms)]
diff --git i/spinoso-regexp/src/options.rs w/spinoso-regexp/src/options.rs
index b3d63ec0d5..72d0826007 100644
--- i/spinoso-regexp/src/options.rs
+++ w/spinoso-regexp/src/options.rs
@@ -112,8 +112,8 @@ impl From<u8> for Options {
impl From<i64> for Options {
/// Truncate the given `i64` to one byte and generate flags.
///
- /// See `From<u8>`. For a conversion that fails if the given `i64` is
- /// larger than [`u8::MAX`], see [`try_from_int`].
+ /// See `From<u8>`. For a conversion that fails if the given `i64` is larger
+ /// than [`u8::MAX`], see [`try_from_int`].
///
/// [`try_from_int`]: Self::try_from_int
fn from(flags: i64) -> Self {
@@ -487,7 +487,8 @@ mod tests {
#[test]
fn make_options_all_opts() {
- // `ALL_REGEXP_OPTS` is equivalent to `EXTENDED | IGNORECASE | MULTILINE` flags.
+ // `ALL_REGEXP_OPTS` is equivalent to
+ //`EXTENDED | IGNORECASE | MULTILINE` flags.
let mut opts = Options::new();
opts.flags |= Flags::ALL_REGEXP_OPTS;
assert_ne!(Options::from(Flags::EXTENDED), opts);
diff --git i/spinoso-regexp/src/regexp/regex/utf8/mod.rs w/spinoso-regexp/src/regexp/regex/utf8/mod.rs
index d623fa4c19..8fae30ccb3 100644
--- i/spinoso-regexp/src/regexp/regex/utf8/mod.rs
+++ w/spinoso-regexp/src/regexp/regex/utf8/mod.rs
@@ -215,7 +215,8 @@ impl Utf8 {
Ok(pos)
}
- /// Check whether this regexp matches the given haystack starting at an offset.
+ /// Check whether this regexp matches the given haystack starting at an
+ /// offset.
///
/// If the given offset is negative, it counts backward from the end of the
/// haystack.
@@ -392,9 +393,8 @@ mod tests {
(B("xyz"), "xyz"),
(B("🦀"), "🦀"),
(B("铁锈"), "铁锈"),
- // Invalid UTF-8 patterns are not supported 👇
- // (B(b"\xFF\xFE"), r"\xFF\xFE"),
- // (B(b"abc \xFF\xFE xyz"), r"abc \xFF\xFE xyz"),
+ // Invalid UTF-8 patterns are not supported 👇 (B(b"\xFF\xFE"),
+ //r"\xFF\xFE"), (B(b"abc \xFF\xFE xyz"), r"abc \xFF\xFE xyz"),
];
for (pattern, display) in test_cases {
let regexp = make(pattern, None, Encoding::None);
@@ -411,7 +411,8 @@ mod tests {
(B("\0"), r"/\x00/m", Options::from(Flags::MULTILINE)),
(B(b"\x0a"), "/\n/", Options::default()),
(B("\x0B"), "/\x0B/", Options::default()),
- // NOTE: the control characters, not a raw string, are in the debug output.
+ // NOTE: the control characters, not a raw string, are in the debug
+ //output.
(B("\n\r\t"), "/\n\r\t/", Options::default()),
(B("\n\r\t"), "/\n\r\t/mix", Options::from(Flags::ALL_REGEXP_OPTS)),
(
@@ -460,9 +461,9 @@ mod tests {
),
(B("铁锈"), "/铁锈/m", Options::from(Flags::MULTILINE)),
(B("铁+锈*"), "/铁+锈*/mix", Options::from(Flags::ALL_REGEXP_OPTS)),
- // Invalid UTF-8 patterns are not supported 👇
- // (B(b"\xFF\xFE"), r"\xFF\xFE", Options::default()),
- // (B(b"abc \xFF\xFE xyz"), r"abc \xFF\xFE xyz", Options::default()),
+ // Invalid UTF-8 patterns are not supported 👇 (B(b"\xFF\xFE"),
+ //r"\xFF\xFE", Options::default()), (B(b"abc \xFF\xFE xyz"), r"abc
+ //\xFF\xFE xyz", Options::default()),
];
for (pattern, debug, options) in test_cases {
let regexp = make(pattern, Some(options), Encoding::None);
diff --git i/spinoso-securerandom/src/lib.rs w/spinoso-securerandom/src/lib.rs
index 2e1c5fc642..7aa7792c4d 100644
--- i/spinoso-securerandom/src/lib.rs
+++ w/spinoso-securerandom/src/lib.rs
@@ -127,7 +127,8 @@ pub enum Error {
/// This may mean that too many random bytes were requested or the system is
/// out of memory.
///
- /// See [`TryReserveError`] and [`TryReserveErrorKind`] for more information.
+ /// See [`TryReserveError`] and [`TryReserveErrorKind`] for more
+ /// information.
///
/// [`TryReserveErrorKind`]: std::collections::TryReserveErrorKind
Memory(TryReserveError),
@@ -182,7 +183,8 @@ impl error::Error for Error {
///
/// Argument errors have an associated message.
///
-/// This error corresponds to the [Ruby `ArgumentError` Exception class].
+/// This error corresponds to the [Ruby `ArgumentError`
+/// `ArgumentError` Exception class].
///
/// # Examples
///
@@ -472,15 +474,15 @@ pub fn random_bytes(len: Option<i64>) -> Result<Vec<u8>, Error> {
pub enum Max {
/// Generate floats in the range `[0, max)`.
///
- /// If `max` is less than or equal to zero, the range defaults to floats
- /// in `[0.0, 1.0]`.
+ /// If `max` is less than or equal to zero, the range defaults to floats in
+ /// `[0.0, 1.0]`.
///
/// If `max` is [`NaN`](f64::NAN), an error is returned.
Float(f64),
/// Generate signed integers in the range `[0, max)`.
///
- /// If `max` is less than or equal to zero, the range defaults to floats
- /// in `[0.0, 1.0]`.
+ /// If `max` is less than or equal to zero, the range defaults to floats in
+ /// `[0.0, 1.0]`.
Integer(i64),
/// Generate floats in the range `[0.0, 1.0]`.
None,
@@ -679,8 +681,8 @@ pub fn urlsafe_base64(len: Option<i64>, padding: bool) -> Result<String, Error>
/// Generate a random sequence of ASCII alphanumeric bytes.
///
-/// If `len` is [`Some`] and non-negative, generate a [`String`] of `len`
-/// random ASCII alphanumeric bytes. If `len` is [`None`], generate 16 random
+/// If `len` is [`Some`] and non-negative, generate a [`String`] of `len` random
+/// ASCII alphanumeric bytes. If `len` is [`None`], generate 16 random
/// alphanumeric bytes.
///
/// The returned [`Vec<u8>`](Vec) is guaranteed to contain only ASCII bytes.
diff --git i/spinoso-securerandom/src/uuid.rs w/spinoso-securerandom/src/uuid.rs
index 719128c0cc..0b5fa40974 100644
--- i/spinoso-securerandom/src/uuid.rs
+++ w/spinoso-securerandom/src/uuid.rs
@@ -17,8 +17,7 @@ use crate::{Error, RandomBytesError};
/// [RFC 4122, Section 4.1]: https://tools.ietf.org/html/rfc4122#section-4.1
const OCTETS: usize = 16;
-// See the BNF from JDK 8 that confirms stringified UUIDs are 36 characters
-// long:
+// See the BNF from JDK 8 that confirms stringified UUIDs are 36 characters long:
//
// https://docs.oracle.com/javase/8/docs/api/java/util/UUID.html#toString--
const ENCODED_LENGTH: usize = 36;
@@ -33,7 +32,8 @@ pub fn v4() -> Result<String, Error> {
let mut bytes = [0; OCTETS];
get_random_bytes(OsRng, &mut bytes)?;
- // Per RFC 4122, Section 4.4, set bits for version and `clock_seq_hi_and_reserved`.
+ // Per RFC 4122, Section 4.4, set bits for version and
+ //`clock_seq_hi_and_reserved`.
bytes[6] = (bytes[6] & 0x0f) | 0x40;
bytes[8] = (bytes[8] & 0x3f) | 0x80;
diff --git i/spinoso-string/src/buf/nul_terminated_vec.rs w/spinoso-string/src/buf/nul_terminated_vec.rs
index 8f7abe334a..dbe0afce4d 100644
--- i/spinoso-string/src/buf/nul_terminated_vec.rs
+++ w/spinoso-string/src/buf/nul_terminated_vec.rs
@@ -15,8 +15,7 @@ fn ensure_nul_terminated(vec: &mut Vec<u8>) {
const NUL_BYTE: u8 = 0;
let spare_capacity = vec.spare_capacity_mut();
- // If the vec has spare capacity, set the first and last bytes to NUL.
- // See:
+ // If the vec has spare capacity, set the first and last bytes to NUL. See:
//
// - https://github.com/artichoke/artichoke/pull/1976#discussion_r932782264
// - https://github.com/artichoke/artichoke/blob/16c869a9ad29acfe143bfcc011917ef442ccac54/artichoke-backend/vendor/mruby/src/string.c#L36-L38
@@ -88,8 +87,8 @@ impl Deref for Buf {
impl DerefMut for Buf {
#[inline]
fn deref_mut(&mut self) -> &mut Self::Target {
- // SAFETY: the mutable reference given out is a slice, NOT the
- // underlying `Vec`, so the allocation cannot change size.
+ // SAFETY: the mutable reference given out is a slice, NOT the underlying
+ //`Vec`, so the allocation cannot change size.
&mut *self.inner
}
}
diff --git i/spinoso-string/src/chars.rs w/spinoso-string/src/chars.rs
index a54dbb5e20..50597bcef7 100644
--- i/spinoso-string/src/chars.rs
+++ w/spinoso-string/src/chars.rs
@@ -197,7 +197,8 @@ impl<'a> Iterator for ConventionallyUtf8<'a> {
Some(ch)
} else {
let (invalid_utf8_bytes, remainder) = self.bytes.split_at(size);
- // Invalid UTF-8 bytes are yielded as byte slices one byte at a time.
+ // Invalid UTF-8 bytes are yielded as byte slices one byte at a
+ //time.
self.invalid_bytes = InvalidBytes::with_bytes(invalid_utf8_bytes);
self.bytes = remainder;
self.invalid_bytes.next()
diff --git i/spinoso-string/src/codepoints.rs w/spinoso-string/src/codepoints.rs
index 21ba542ffd..80d3aadc3b 100644
--- i/spinoso-string/src/codepoints.rs
+++ w/spinoso-string/src/codepoints.rs
@@ -118,9 +118,9 @@ impl InvalidCodepointError {
// formatted as `0x...`.
const MESSAGE_MAX_LENGTH: usize = 27 + 2 + mem::size_of::<u32>() * 2;
let mut s = alloc::string::String::with_capacity(MESSAGE_MAX_LENGTH);
- // In practice, the errors from `write!` below are safe to ignore
- // because the `core::fmt::Write` impl for `String` will never panic
- // and these `String`s will never approach `isize::MAX` bytes.
+ // In practice, the errors from `write!` below are safe to ignore because
+ //the `core::fmt::Write` impl for `String` will never panic and these
+ //`String`s will never approach `isize::MAX` bytes.
//
// See the `core::fmt::Display` impl for `InvalidCodepointError`.
let _ = write!(s, "{}", self);
diff --git i/spinoso-string/src/enc/mod.rs w/spinoso-string/src/enc/mod.rs
index f152a7856d..a24896635c 100644
--- i/spinoso-string/src/enc/mod.rs
+++ w/spinoso-string/src/enc/mod.rs
@@ -93,9 +93,9 @@ impl Ord for EncodedString {
//
// Per the docs in `std`:
//
-// > In particular `Eq`, `Ord` and `Hash` must be equivalent for borrowed and
-// > owned values: `x.borrow() == y.borrow()` should give the same result as
-// > `x == y`.
+// > In particular `Eq`, `Ord` and `Hash` must be equivalent for borrowed and >
+ //owned values: `x.borrow() == y.borrow()` should give the same result as > `x
+//== y`.
impl Borrow<[u8]> for EncodedString {
#[inline]
fn borrow(&self) -> &[u8] {
diff --git i/spinoso-string/src/enc/utf8/mod.rs w/spinoso-string/src/enc/utf8/mod.rs
index 02a633020c..b0b3a5ce66 100644
--- i/spinoso-string/src/enc/utf8/mod.rs
+++ w/spinoso-string/src/enc/utf8/mod.rs
@@ -208,25 +208,26 @@ impl Utf8String {
#[inline]
#[must_use]
pub fn get_char(&self, index: usize) -> Option<&'_ [u8]> {
- // Fast path rejection for indexes beyond bytesize, which is
- // cheap to retrieve.
+ // Fast path rejection for indexes beyond bytesize, which is cheap to
+ //retrieve.
if index >= self.len() {
return None;
}
- // Fast path for trying to treat the conventionally UTF-8 string
- // as entirely ASCII.
+ // Fast path for trying to treat the conventionally UTF-8 string as
+ //entirely ASCII.
//
- // If the string is either all ASCII or all ASCII for a prefix
- // of the string that contains the range we wish to slice,
- // fallback to byte slicing as in the ASCII and binary fast path.
+ // If the string is either all ASCII or all ASCII for a prefix of the
+ //string that contains the range we wish to slice, fallback to byte
+ //slicing as in the ASCII and binary fast path.
let consumed = match self.inner.find_non_ascii_byte() {
None => return self.inner.get(index..=index),
Some(idx) if idx > index => return self.inner.get(index..=index),
Some(idx) => idx,
};
let mut slice = &self.inner[consumed..];
- // TODO: See if we can use `get_unchecked` as implemented in `fn char_len`
- // Count of "characters" remaining until the `index`th character.
+ // TODO: See if we can use `get_unchecked` as implemented in
+ //`fn char_len` Count of "characters" remaining until the `index`th
+ //character.
let mut remaining = index - consumed;
// This loop will terminate when either:
//
@@ -237,43 +238,39 @@ impl Utf8String {
// The loop will advance by at least one byte every iteration.
loop {
match bstr::decode_utf8(slice) {
- // If we've run out of slice while trying to find the
- // `index`th character, the lookup fails and we return `nil`.
+ // If we've run out of slice while trying to find the `index`th
+ //character, the lookup fails and we return `nil`.
(_, 0) => return None,
- // The next two arms mean we've reached the `index`th
- // character. Either return the next valid UTF-8
- // character byte slice or, if the next bytes are an
- // invalid UTF-8 sequence, the next byte.
+ // The next two arms mean we've reached the `index`th character.
+ //Either return the next valid UTF-8 character byte slice or, if
+ //the next bytes are an invalid UTF-8 sequence, the next byte.
(Some(_), size) if remaining == 0 => return Some(&slice[..size]),
- // Size is guaranteed to be positive per the first arm
- // which means this slice operation will not panic.
+ // Size is guaranteed to be positive per the first arm which
+ //means this slice operation will not panic.
(None, _) if remaining == 0 => return Some(&slice[..1]),
- // We found a single UTF-8 encoded character keep track
- // of the count and advance the substring to continue
- // decoding.
+ // We found a single UTF-8 encoded character keep track of the
+ //count and advance the substring to continue decoding.
(Some(_), size) => {
slice = &slice[size..];
remaining -= 1;
}
- // The next two arms handle the case where we have
- // encountered an invalid UTF-8 byte sequence.
+ // The next two arms handle the case where we have encountered an
+ //invalid UTF-8 byte sequence.
//
- // In this case, `decode_utf8` will return slices whose
- // length is `1..=3`. The length of this slice is the
- // number of "characters" we can advance the loop by.
+ // In this case, `decode_utf8` will return slices whose length is
+ //`1..=3`. The length of this slice is the number of
+ //"characters" we can advance the loop by.
//
- // If the invalid UTF-8 sequence contains more bytes
- // than we have remaining to get to the `index`th char,
- // then the target character is inside the invalid UTF-8
- // sequence.
+ // If the invalid UTF-8 sequence contains more bytes than we have
+ //remaining to get to the `index`th char, then the target
+ //character is inside the invalid UTF-8 sequence.
(None, size) if remaining < size => return Some(&slice[remaining..=remaining]),
- // If there are more characters remaining than the number
- // of bytes yielded in the invalid UTF-8 byte sequence,
- // count `size` bytes and advance the slice to continue
- // decoding.
+ // If there are more characters remaining than the number of
+ //bytes yielded in the invalid UTF-8 byte sequence, count `size`
+ //bytes and advance the slice to continue decoding.
(None, size) => {
slice = &slice[size..];
remaining -= size;
@@ -328,8 +325,8 @@ impl Utf8String {
return Some(&[]);
}
- // If the start of the range is beyond the character count of the
- // string, the whole lookup must fail.
+ // If the start of the range is beyond the character count of the string,
+ //the whole lookup must fail.
//
// Slice lookups where the start is just beyond the last character index
// always return an empty slice.
@@ -395,24 +392,23 @@ impl Utf8String {
_ => {}
}
- // Fast path for trying to treat the conventionally UTF-8 string
- // as entirely ASCII.
+ // Fast path for trying to treat the conventionally UTF-8 string as
+ //entirely ASCII.
//
- // If the string is either all ASCII or all ASCII for the subset
- // of the string we wish to slice, fallback to byte slicing as in
- // the ASCII and binary fast path.
+ // If the string is either all ASCII or all ASCII for the subset of the
+ //string we wish to slice, fallback to byte slicing as in the ASCII and
+ //binary fast path.
//
- // Perform the same saturate-to-end slicing mechanism if `end`
- // is beyond the character length of the string.
+ // Perform the same saturate-to-end slicing mechanism if `end` is beyond
+ //the character length of the string.
let consumed = match self.inner.find_non_ascii_byte() {
- // The entire string is ASCII, so byte indexing <=> char
- // indexing.
+ // The entire string is ASCII, so byte indexing <=> char indexing.
None => return self.inner.get(start..end).or_else(|| self.inner.get(start..)),
- // The whole substring we are interested in is ASCII, so
- // byte indexing is still valid.
+ // The whole substring we are interested in is ASCII, so byte
+ //indexing is still valid.
Some(non_ascii_byte_offset) if non_ascii_byte_offset > end => return self.get(start..end),
- // We turn non-ASCII somewhere inside before the substring
- // we're interested in, so consume that much.
+ // We turn non-ASCII somewhere inside before the substring we're
+ //interested in, so consume that much.
Some(non_ascii_byte_offset) if non_ascii_byte_offset <= start => non_ascii_byte_offset,
// This means we turn non-ASCII somewhere inside the substring.
// Consume up to start.
@@ -436,12 +432,10 @@ impl Utf8String {
// `start`th character, the lookup fails and we return `nil`.
(_, 0) => return None,
- // We found a single UTF-8 encoded character. keep track
- // of the count and advance the substring to continue
- // decoding.
+ // We found a single UTF-8 encoded character. keep track of
+ //the count and advance the substring to continue decoding.
//
- // If there's only one more to go, advance and stop the
- // loop.
+ // If there's only one more to go, advance and stop the loop.
(Some(_), size) if remaining == 1 => break &slice[size..],
// Otherwise, keep track of the character we observed and
// advance the slice to continue decoding.
@@ -457,14 +451,13 @@ impl Utf8String {
// length is `1..=3`. The length of this slice is the
// number of "characters" we can advance the loop by.
//
- // If the invalid UTF-8 sequence contains more bytes
- // than we have remaining to get to the `start`th char,
- // then we can break the loop directly.
+ // If the invalid UTF-8 sequence contains more bytes than we
+ //have remaining to get to the `start`th char, then we can
+ //break the loop directly.
(None, size) if remaining <= size => break &slice[remaining..],
- // If there are more characters remaining than the number
- // of bytes yielded in the invalid UTF-8 byte sequence,
- // count `size` bytes and advance the slice to continue
- // decoding.
+ // If there are more characters remaining than the number of
+ //bytes yielded in the invalid UTF-8 byte sequence, count
+ //`size` bytes and advance the slice to continue decoding.
(None, size) => {
slice = &slice[size..];
remaining -= size;
@@ -475,12 +468,11 @@ impl Utf8String {
// Scan the slice for the span of characters we want to return.
remaining = end - start;
- // We know `remaining` is not zero because we fast-pathed that
- // case above.
+ // We know `remaining` is not zero because we fast-pathed that case
+ //above.
debug_assert!(remaining > 0);
- // keep track of the start of the substring from the `start`th
- // character.
+ // keep track of the start of the substring from the `start`th character.
let substr = slice;
// This loop will terminate when either:
@@ -496,38 +488,36 @@ impl Utf8String {
// character, saturate the slice to the end of the string.
(_, 0) => return Some(substr),
- // We found a single UTF-8 encoded character. keep track
- // of the count and advance the substring to continue
- // decoding.
+ // We found a single UTF-8 encoded character. keep track of the
+ //count and advance the substring to continue decoding.
//
- // If there's only one more to go, advance and stop the
- // loop.
+ // If there's only one more to go, advance and stop the loop.
(Some(_), size) if remaining == 1 => {
- // Push `endth` more positive because this match has
- // the effect of shrinking `slice`.
+ // Push `endth` more positive because this match has the
+ //effect of shrinking `slice`.
let endth = substr.len() - slice.len() + size;
return Some(&substr[..endth]);
}
- // Otherwise, keep track of the character we observed and
- // advance the slice to continue decoding.
+ // Otherwise, keep track of the character we observed and advance
+ //the slice to continue decoding.
(Some(_), size) => {
slice = &slice[size..];
remaining -= 1;
}
- // The next two arms handle the case where we have
- // encountered an invalid UTF-8 byte sequence.
+ // The next two arms handle the case where we have encountered an
+ //invalid UTF-8 byte sequence.
//
- // In this case, `decode_utf8` will return slices whose
- // length is `1..=3`. The length of this slice is the
- // number of "characters" we can advance the loop by.
+ // In this case, `decode_utf8` will return slices whose length is
+ //`1..=3`. The length of this slice is the number of
+ //"characters" we can advance the loop by.
//
- // If the invalid UTF-8 sequence contains more bytes
- // than we have remaining to get to the `end`th char,
- // then we can break the loop directly.
+ // If the invalid UTF-8 sequence contains more bytes than we have
+ //remaining to get to the `end`th char, then we can break the
+ //loop directly.
(None, size) if remaining <= size => {
- // For an explanation of this arithmetic:
- // If we're trying to slice:
+ // For an explanation of this arithmetic: If we're trying to
+ //slice:
//
// ```
// s = "a\xF0\x9F\x87"
@@ -548,10 +538,9 @@ impl Utf8String {
let endth = substr.len() - slice.len() + remaining;
return Some(&substr[..endth]);
}
- // If there are more characters remaining than the number
- // of bytes yielded in the invalid UTF-8 byte sequence,
- // count `size` bytes and advance the slice to continue
- // decoding.
+ // If there are more characters remaining than the number of
+ //bytes yielded in the invalid UTF-8 byte sequence, count `size`
+ //bytes and advance the slice to continue decoding.
(None, size) => {
slice = &slice[size..];
remaining -= size;
@@ -657,19 +646,18 @@ impl Utf8String {
// Turkic or ASCII-only modes
#[inline]
pub fn make_capitalized(&mut self) {
- // This allocation assumes that in the common case, capitalizing
- // and lower-casing `char`s do not change the length of the
- // `String`.
+ // This allocation assumes that in the common case, capitalizing and
+ //lower-casing `char`s do not change the length of the `String`.
//
- // Use a `Vec` here instead of a `Buf` to ensure at most one alloc
- // fix-up happens instead of alloc fix-ups being O(chars).
+ // Use a `Vec` here instead of a `Buf` to ensure at most one alloc fix-up
+ //happens instead of alloc fix-ups being O(chars).
let mut replacement = Vec::with_capacity(self.len());
let mut bytes = self.inner.as_slice();
match bstr::decode_utf8(bytes) {
(Some(ch), size) => {
- // Converting a UTF-8 character to uppercase may yield
- // multiple codepoints.
+ // Converting a UTF-8 character to uppercase may yield multiple
+ //codepoints.
for ch in ch.to_uppercase() {
replacement.push_char(ch);
}
@@ -686,8 +674,8 @@ impl Utf8String {
while !bytes.is_empty() {
let (ch, size) = bstr::decode_utf8(bytes);
if let Some(ch) = ch {
- // Converting a UTF-8 character to lowercase may yield
- // multiple codepoints.
+ // Converting a UTF-8 character to lowercase may yield multiple
+ //codepoints.
for ch in ch.to_lowercase() {
replacement.push_char(ch);
}
@@ -703,19 +691,19 @@ impl Utf8String {
#[inline]
pub fn make_lowercase(&mut self) {
- // This allocation assumes that in the common case, lower-casing
- // `char`s do not change the length of the `String`.
+ // This allocation assumes that in the common case, lower-casing `char`s
+ //do not change the length of the `String`.
//
- // Use a `Vec` here instead of a `Buf` to ensure at most one alloc
- // fix-up happens instead of alloc fix-ups being O(chars).
+ // Use a `Vec` here instead of a `Buf` to ensure at most one alloc fix-up
+ //happens instead of alloc fix-ups being O(chars).
let mut replacement = Vec::with_capacity(self.len());
let mut bytes = self.inner.as_slice();
while !bytes.is_empty() {
let (ch, size) = bstr::decode_utf8(bytes);
if let Some(ch) = ch {
- // Converting a UTF-8 character to lowercase may yield
- // multiple codepoints.
+ // Converting a UTF-8 character to lowercase may yield multiple
+ //codepoints.
for ch in ch.to_lowercase() {
replacement.push_char(ch);
}
@@ -731,19 +719,19 @@ impl Utf8String {
#[inline]
pub fn make_uppercase(&mut self) {
- // This allocation assumes that in the common case, upper-casing
- // `char`s do not change the length of the `String`.
+ // This allocation assumes that in the common case, upper-casing `char`s
+ //do not change the length of the `String`.
//
- // Use a `Vec` here instead of a `Buf` to ensure at most one alloc
- // fix-up happens instead of alloc fix-ups being O(chars).
+ // Use a `Vec` here instead of a `Buf` to ensure at most one alloc fix-up
+ //happens instead of alloc fix-ups being O(chars).
let mut replacement = Vec::with_capacity(self.len());
let mut bytes = self.inner.as_slice();
while !bytes.is_empty() {
let (ch, size) = bstr::decode_utf8(bytes);
if let Some(ch) = ch {
- // Converting a UTF-8 character to lowercase may yield
- // multiple codepoints.
+ // Converting a UTF-8 character to lowercase may yield multiple
+ //codepoints.
for ch in ch.to_uppercase() {
replacement.push_char(ch);
}
@@ -795,8 +783,8 @@ impl Utf8String {
// FIXME: this allocation can go away if `ConventionallyUtf8` impls
// `DoubleEndedIterator`.
let chars = ConventionallyUtf8::from(&self.inner[..]).collect::<Vec<_>>();
- // Use a `Vec` here instead of a `Buf` to ensure at most one alloc
- // fix-up happens instead of alloc fix-ups being O(chars).
+ // Use a `Vec` here instead of a `Buf` to ensure at most one alloc fix-up
+ //happens instead of alloc fix-ups being O(chars).
let mut replacement = Vec::with_capacity(self.inner.len());
for &bytes in chars.iter().rev() {
replacement.extend_from_slice(bytes);
@@ -949,7 +937,7 @@ mod tests {
#[test]
fn char_len_utf8() {
- // https://github.com/minimaxir/big-list-of-naughty-strings/blob/894882e7/blns.txt#L147-L157
+ // //https://github.com/minimaxir/big-list-of-naughty-strings/blob/894882e7/blns.txt#L147-L157
let s = Utf8String::from("Ω≈ç√∫˜µ≤≥÷");
assert_eq!(s.char_len(), 10);
let s = Utf8String::from("åß∂ƒ©˙∆˚¬…æ");
@@ -978,14 +966,14 @@ mod tests {
// effectively cause rendering issues or character-length issues to
// validate product globalization readiness.
//
- // https://github.com/minimaxir/big-list-of-naughty-strings/blob/894882e7/blns.txt#L202-L224
+ // //https://github.com/minimaxir/big-list-of-naughty-strings/blob/894882e7/blns.txt#L202-L224
let s = Utf8String::from("表ポあA鷗ŒéB逍Üߪąñ丂㐀𠀀");
assert_eq!(s.char_len(), 17);
}
#[test]
fn char_len_two_byte_chars() {
- // https://github.com/minimaxir/big-list-of-naughty-strings/blob/894882e7/blns.txt#L188-L196
+ // //https://github.com/minimaxir/big-list-of-naughty-strings/blob/894882e7/blns.txt#L188-L196
let s = Utf8String::from("田中さんにあげて下さい");
assert_eq!(s.char_len(), 11);
let s = Utf8String::from("パーティーへ行かないか");
@@ -1008,19 +996,21 @@ mod tests {
#[test]
fn char_len_space_chars() {
- // Whitespace: all the characters with category `Zs`, `Zl`, or `Zp` (in Unicode
- // version 8.0.0), plus `U+0009 (HT)`, `U+000B (VT)`, `U+000C (FF)`, `U+0085 (NEL)`,
- // and `U+200B` (ZERO WIDTH SPACE), which are in the C categories but are often
- // treated as whitespace in some contexts.
+ // Whitespace: all the characters with category `Zs`, `Zl`, or `Zp` (in
+ //Unicode version 8.0.0), plus `U+0009 (HT)`, `U+000B (VT)`,
+ //`U+000C (FF)`, `U+0085 (NEL)`, and `U+200B` (ZERO WIDTH SPACE), which
+ //are in the C categories but are often treated as whitespace in some
+ //contexts.
//
- // This file unfortunately cannot express strings containing
- // `U+0000`, `U+000A`, or `U+000D` (`NUL`, `LF`, `CR`).
+ // This file unfortunately cannot express strings containing `U+0000`,
+ //`U+000A`, or `U+000D` (`NUL`, `LF`, `CR`).
//
// The next line may appear to be blank or mojibake in some viewers.
//
- // The next line may be flagged for "trailing whitespace" in some viewers.
+ // The next line may be flagged for "trailing whitespace" in some
+ //viewers.
//
- // https://github.com/minimaxir/big-list-of-naughty-strings/blob/894882e7/blns.txt#L131
+ // //https://github.com/minimaxir/big-list-of-naughty-strings/blob/894882e7/blns.txt#L131
let bytes = "
";
let s = Utf8String::from(bytes);
@@ -1097,7 +1087,8 @@ mod tests {
// Changes length when case changes
// https://github.com/minimaxir/big-list-of-naughty-strings/blob/894882e7/blns.txt#L226-L232
let varying_length = Utf8String::from("zȺȾ");
- // There doesn't appear to be any RTL scripts that have cases, but might as well make sure
+ // There doesn't appear to be any RTL scripts that have cases, but might
+ //as well make sure
let rtl = Utf8String::from("مرحبا الخرشوف");
let capitalize: fn(&Utf8String) -> Utf8String = |value: &Utf8String| {
@@ -1184,16 +1175,17 @@ mod tests {
//
// Per `bstr`:
//
- // The bytes `\xF0\x9F\x87` could lead to a valid UTF-8 sequence, but 3 of them
- // on their own are invalid. Only one replacement codepoint is substituted,
- // which demonstrates the "substitution of maximal subparts" strategy.
+ // The bytes `\xF0\x9F\x87` could lead to a valid UTF-8 sequence, but 3
+ //of them on their own are invalid. Only one replacement codepoint is
+ //substituted, which demonstrates the "substitution of maximal subparts"
+ //strategy.
let s = Utf8String::from(b"\xF0\x9F\x87");
assert_eq!(s.chr(), b"\xF0");
}
#[test]
fn get_char_slice_valid_range() {
- let s = Utf8String::from(b"a\xF0\x9F\x92\x8E\xFF".to_vec()); // "a💎\xFF"
+ let s = Utf8String::from(b"a\xF0\x9F\x92\x8E\xFF".to_vec()); // //"a💎\xFF"
assert_eq!(s.get_char_slice(0..0), Some(&b""[..]));
assert_eq!(s.get_char_slice(0..1), Some(&b"a"[..]));
assert_eq!(s.get_char_slice(0..2), Some("a💎".as_bytes()));
@@ -1207,7 +1199,7 @@ mod tests {
#[test]
#[allow(clippy::reversed_empty_ranges)]
fn get_char_slice_invalid_range() {
- let s = Utf8String::from(b"a\xF0\x9F\x92\x8E\xFF".to_vec()); // "a💎\xFF"
+ let s = Utf8String::from(b"a\xF0\x9F\x92\x8E\xFF".to_vec()); // //"a💎\xFF"
assert_eq!(s.get_char_slice(4..5), None);
assert_eq!(s.get_char_slice(4..1), None);
assert_eq!(s.get_char_slice(3..1), Some(&b""[..]));
diff --git i/spinoso-string/src/impls.rs w/spinoso-string/src/impls.rs
index 4b445c16d8..c232e9dae2 100644
--- i/spinoso-string/src/impls.rs
+++ w/spinoso-string/src/impls.rs
@@ -206,15 +206,15 @@ impl DerefMut for String {
}
}
-// This impl of `Borrow<[u8]>` is permissible due to the behavior of
-// `PartialEq`, `Hash`, and `Ord` impls on `String` which only rely on the byte
-// slice contents in the underlying encoded string.
+// This impl of `Borrow<[u8]>` is permissible due to the behavior of `PartialEq`,
+ //`Hash`, and `Ord` impls on `String` which only rely on the byte slice contents
+//in the underlying encoded string.
//
// Per the docs in `std`:
//
-// > In particular `Eq`, `Ord` and `Hash` must be equivalent for borrowed and
-// > owned values: `x.borrow() == y.borrow()` should give the same result as
-// > `x == y`.
+// > In particular `Eq`, `Ord` and `Hash` must be equivalent for borrowed and >
+ //owned values: `x.borrow() == y.borrow()` should give the same result as > `x
+//== y`.
impl Borrow<[u8]> for String {
#[inline]
fn borrow(&self) -> &[u8] {
diff --git i/spinoso-string/src/inspect.rs w/spinoso-string/src/inspect.rs
index 5be7b04e2b..1d02b5046c 100644
--- i/spinoso-string/src/inspect.rs
+++ w/spinoso-string/src/inspect.rs
@@ -87,9 +87,9 @@ impl<'a> Inspect<'a> {
/// Write an `Inspect` iterator into the given destination using the debug
/// representation of the byte buffer associated with a source `String`.
///
- /// This formatter writes content like `"spinoso"` and `"invalid-\xFF-utf8"`.
- /// To see example output of the underlying iterator, see the `Inspect`
- /// documentation.
+ /// This formatter writes content like `"spinoso"` and
+ /// `"invalid-\xFF-utf8"`. To see example output of the underlying iterator,
+ /// see the `Inspect` documentation.
///
/// To write binary output, use [`write_into`], which requires the **std**
/// feature to be activated.
@@ -134,9 +134,9 @@ impl<'a> Inspect<'a> {
/// Write an `Inspect` iterator into the given destination using the debug
/// representation of the byte buffer associated with a source `String`.
///
- /// This formatter writes content like `"spinoso"` and `"invalid-\xFF-utf8"`.
- /// To see example output of the underlying iterator, see the `Inspect`
- /// documentation.
+ /// This formatter writes content like `"spinoso"` and
+ /// `"invalid-\xFF-utf8"`. To see example output of the underlying iterator,
+ /// see the `Inspect` documentation.
///
/// To write to a [formatter], use [`format_into`].
///
diff --git i/spinoso-string/src/iter.rs w/spinoso-string/src/iter.rs
index 67cfce8623..b6b61e4225 100644
--- i/spinoso-string/src/iter.rs
+++ w/spinoso-string/src/iter.rs
@@ -143,8 +143,8 @@ impl<'a> IterMut<'a> {
/// Views the underlying data as a subslice of the original data.
///
- /// To avoid creating `&mut` references that alias, this is forced to consume
- /// the iterator.
+ /// To avoid creating `&mut` references that alias, this is forced to
+ /// consume the iterator.
///
/// # Examples
///
diff --git i/spinoso-string/src/lib.rs w/spinoso-string/src/lib.rs
index 87bb579728..3b8f0be089 100644
--- i/spinoso-string/src/lib.rs
+++ w/spinoso-string/src/lib.rs
@@ -3,8 +3,7 @@
#![warn(clippy::cargo)]
#![cfg_attr(test, allow(clippy::non_ascii_literal))]
#![allow(unknown_lints)]
-// TODO: warn on missing docs once crate is API-complete.
-// #![warn(missing_docs)]
+// TODO: warn on missing docs once crate is API-complete. #![warn(missing_docs)]
#![warn(missing_debug_implementations)]
#![warn(missing_copy_implementations)]
#![warn(rust_2018_idioms)]
@@ -282,8 +281,8 @@ impl String {
/// If `len` is greater than the string's current length, this has no
/// effect.
///
- /// Note that this method has no effect on the allocated capacity
- /// of the string.
+ /// Note that this method has no effect on the allocated capacity of the
+ /// string.
///
/// # Examples
///
@@ -416,7 +415,8 @@ impl String {
/// using one of the safe operations instead, such as [`truncate`],
/// [`extend`], or [`clear`].
///
- /// This function can change the return value of [`String::is_valid_encoding`].
+ /// This function can change the return value of
+ /// [`String::is_valid_encoding`].
///
/// # Safety
///
@@ -805,16 +805,14 @@ impl String {
self.inner.reserve_exact(additional);
}
- /// Tries to reserve the minimum capacity for exactly `additional`
- /// elements to be inserted in the `String`. After calling
- /// `try_reserve_exact`, capacity will be greater than or equal to
- /// `self.len() + additional` if it returns `Ok(())`. Does nothing if the
- /// capacity is already sufficient.
+ /// Tries to reserve the minimum capacity for exactly `additional` elements
+ /// to be inserted in the `String`. After calling `try_reserve_exact`,
+ /// capacity will be greater than or equal to `self.len() + additional` if
+ /// it returns `Ok(())`. Does nothing if the capacity is already sufficient.
///
- /// Note that the allocator may give the collection more space than
- /// it requests. Therefore, capacity can not be relied upon to be
- /// precisely minimal. Prefer [`try_reserve`] if future insertions are
- /// expected.
+ /// Note that the allocator may give the collection more space than it
+ /// requests. Therefore, capacity can not be relied upon to be precisely
+ /// minimal. Prefer [`try_reserve`] if future insertions are expected.
///
/// # Errors
///
@@ -1050,8 +1048,8 @@ impl String {
///
/// # Examples
///
- /// For [UTF-8] strings, the given codepoint is converted to a Unicode scalar
- /// value before appending:
+ /// For [UTF-8] strings, the given codepoint is converted to a Unicode
+ /// scalar value before appending:
///
/// ```
/// use spinoso_string::String;
@@ -1356,9 +1354,9 @@ impl String {
pub fn unicode_casecmp(&self, other: &String, options: CaseFold) -> Option<bool> {
let left = self.as_slice();
let right = other.as_slice();
- // If both `String`s are conventionally UTF-8, they must be case
- // compared using the given case folding strategy. This requires the
- // `String`s be well-formed UTF-8.
+ // If both `String`s are conventionally UTF-8, they must be case compared
+ //using the given case folding strategy. This requires the `String`s be
+ //well-formed UTF-8.
if let (Encoding::Utf8, Encoding::Utf8) = (self.encoding(), other.encoding()) {
if let (Ok(left), Ok(right)) = (str::from_utf8(left), str::from_utf8(right)) {
// Both slices are UTF-8, compare with the given Unicode case
@@ -1494,8 +1492,8 @@ impl String {
#[inline]
#[must_use]
pub fn chomp<T: AsRef<[u8]>>(&mut self, separator: Option<T>) -> bool {
- // convert to a concrete type and delegate to a single `chomp` impl
- // to minimize code duplication when monomorphizing.
+ // convert to a concrete type and delegate to a single `chomp` impl to
+ //minimize code duplication when monomorphizing.
if let Some(sep) = separator {
chomp(self, Some(sep.as_ref()))
} else {
@@ -1505,7 +1503,8 @@ impl String {
/// Modifies this `String` in-place and removes the last character.
///
- /// This method returns a [`bool`] that indicates if this string was modified.
+ /// This method returns a [`bool`] that indicates if this string was
+ /// modified.
///
/// If the string ends with `\r\n`, both characters are removed. When
/// applying `chop` to an empty string, the string remains empty.
@@ -1642,18 +1641,19 @@ impl String {
if let Some(offset) = offset {
let buf = buf.get(offset..)?;
let index = buf.find(needle)?;
- // This addition is guaranteed not to overflow because the result is
- // a valid index of the underlying `Vec`.
+ // This addition is guaranteed not to overflow because the result
+ //is a valid index of the underlying `Vec`.
//
- // `self.buf.len() < isize::MAX` because `self.buf` is a `Vec` and
- // `Vec` documents `isize::MAX` as its maximum allocation size.
+ // `self.buf.len() < isize::MAX` because `self.buf` is a `Vec`
+ //and `Vec` documents `isize::MAX` as its maximum allocation
+ //size.
Some(index + offset)
} else {
buf.find(needle)
}
}
- // convert to a concrete type and delegate to a single `index` impl
- // to minimize code duplication when monomorphizing.
+ // convert to a concrete type and delegate to a single `index` impl to
+ //minimize code duplication when monomorphizing.
let needle = needle.as_ref();
inner(self.inner.as_slice(), needle, offset)
}
@@ -1670,8 +1670,8 @@ impl String {
buf.rfind(needle)
}
}
- // convert to a concrete type and delegate to a single `rindex` impl
- // to minimize code duplication when monomorphizing.
+ // convert to a concrete type and delegate to a single `rindex` impl to
+ //minimize code duplication when monomorphizing.
let needle = needle.as_ref();
inner(self.inner.as_slice(), needle, offset)
}
@@ -2034,8 +2034,8 @@ fn chomp(string: &mut String, separator: Option<&[u8]>) -> bool {
}
Some(separator) if string.inner.ends_with(separator) => {
let original_len = string.len();
- // This subtraction is guaranteed not to panic because
- // `separator` is a substring of `buf`.
+ // This subtraction is guaranteed not to panic because `separator` is
+ //a substring of `buf`.
let truncate_to_len = original_len - separator.len();
string.inner.truncate(truncate_to_len);
// Separator is non-empty and we are always truncating, so this
diff --git i/spinoso-symbol/src/casecmp/unicode.rs w/spinoso-symbol/src/casecmp/unicode.rs
index 2f0c404344..ed73820655 100644
--- i/spinoso-symbol/src/casecmp/unicode.rs
+++ w/spinoso-symbol/src/casecmp/unicode.rs
@@ -49,9 +49,9 @@ where
// Encoding mismatch, the bytes are not comparable using Unicode case
// folding.
//
- // > `nil` is returned if the two symbols have incompatible encodings,
- // > or if `other_symbol` is not a symbol.
- // > <https://ruby-doc.org/core-3.1.2/Symbol.html#method-i-casecmp-3F>
+ // > `nil` is returned if the two symbols have incompatible encodings, > or
+ //if `other_symbol` is not a symbol. >
+ //<https://ruby-doc.org/core-3.1.2/Symbol.html#method-i-casecmp-3F>
(Ok(_), Err(_)) | (Err(_), Ok(_)) => return Ok(None),
};
Ok(Some(cmp))
diff --git i/spinoso-symbol/src/ident.rs w/spinoso-symbol/src/ident.rs
index a88a8eec33..087c3147ea 100644
--- i/spinoso-symbol/src/ident.rs
+++ w/spinoso-symbol/src/ident.rs
@@ -322,7 +322,8 @@ impl TryFrom<&[u8]> for IdentifierType {
}
}
-/// Error type returned from the [`FromStr`] implementation on [`IdentifierType`].
+/// Error type returned from the [`FromStr`] implementation on
+/// [`IdentifierType`].
///
/// # Examples
///
@@ -504,8 +505,8 @@ fn is_ident_char(ch: char) -> bool {
/// Scan the [`char`]s in the input until either invalid UTF-8 or an invalid
/// ident is found. See [`is_ident_char`].
///
-/// This method returns `Some(index)` of the start of the first invalid ident
-/// or `None` if the whole input is a valid ident.
+/// This method returns `Some(index)` of the start of the first invalid ident or
+/// `None` if the whole input is a valid ident.
///
/// Empty slices are not valid idents.
#[inline]
diff --git i/spinoso-symbol/src/inspect.rs w/spinoso-symbol/src/inspect.rs
index ef056ac623..72a7b3fa6f 100644
--- i/spinoso-symbol/src/inspect.rs
+++ w/spinoso-symbol/src/inspect.rs
@@ -91,9 +91,9 @@ impl<'a> Inspect<'a> {
/// representation of the interned byte slice associated with the symbol in
/// the underlying interner.
///
- /// This formatter writes content like `:spinoso` and `:"invalid-\xFF-utf8"`.
- /// To see example output of the underlying iterator, see the `Inspect`
- /// documentation.
+ /// This formatter writes content like `:spinoso` and
+ /// `:"invalid-\xFF-utf8"`. To see example output of the underlying
+ /// iterator, see the `Inspect` documentation.
///
/// To write binary output, use [`write_into`], which requires the **std**
/// feature to be activated.
@@ -135,9 +135,9 @@ impl<'a> Inspect<'a> {
/// representation of the interned byte slice associated with the symbol in
/// the underlying interner.
///
- /// This formatter writes content like `:spinoso` and `:"invalid-\xFF-utf8"`.
- /// To see example output of the underlying iterator, see the `Inspect`
- /// documentation.
+ /// This formatter writes content like `:spinoso` and
+ /// `:"invalid-\xFF-utf8"`. To see example output of the underlying
+ /// iterator, see the `Inspect` documentation.
///
/// To write to a [formatter], use [`format_into`].
///
diff --git i/spinoso-symbol/src/lib.rs w/spinoso-symbol/src/lib.rs
index 969f76caa4..18e3731284 100644
--- i/spinoso-symbol/src/lib.rs
+++ w/spinoso-symbol/src/lib.rs
@@ -157,8 +157,8 @@ impl std::error::Error for SymbolOverflowError {}
/// Identifier bound to an interned byte string.
///
-/// A `Symbol` allows retrieving a reference to the original interned
-/// byte string. Equivalent `Symbol`s will resolve to an identical byte string.
+/// A `Symbol` allows retrieving a reference to the original interned byte
+/// string. Equivalent `Symbol`s will resolve to an identical byte string.
///
/// `Symbol`s are based on a `u32` index. They are cheap to compare and cheap to
/// copy.
@@ -176,11 +176,11 @@ impl Borrow<u32> for Symbol {
impl Symbol {
/// Construct a new `Symbol` from the given `u32`.
///
- /// `Symbol`s constructed manually may fail to resolve to an underlying
- /// byte string.
+ /// `Symbol`s constructed manually may fail to resolve to an underlying byte
+ /// string.
///
- /// `Symbol`s are not constrained to the interner which created them.
- /// No runtime checks ensure that the underlying interner is called with a
+ /// `Symbol`s are not constrained to the interner which created them. No
+ /// runtime checks ensure that the underlying interner is called with a
/// `Symbol` that the interner itself issued.
///
/// # Examples
diff --git i/spinoso-time/src/time/tzrs/convert.rs w/spinoso-time/src/time/tzrs/convert.rs
index 6874b2cc37..3ef023eed3 100644
--- i/spinoso-time/src/time/tzrs/convert.rs
+++ w/spinoso-time/src/time/tzrs/convert.rs
@@ -37,8 +37,8 @@ impl fmt::Display for Time {
impl Time {
/// Formats _time_ according to the directives in the given format string.
///
- /// Can be used to implement [`Time#strftime`]. The resulting string should be
- /// treated as an ASCII-encoded string.
+ /// Can be used to implement [`Time#strftime`]. The resulting string should
+ /// be treated as an ASCII-encoded string.
///
/// # Examples
///
diff --git i/spinoso-time/src/time/tzrs/error.rs w/spinoso-time/src/time/tzrs/error.rs
index 7473f43ccd..3f36c7928d 100644
--- i/spinoso-time/src/time/tzrs/error.rs
+++ w/spinoso-time/src/time/tzrs/error.rs
@@ -22,8 +22,8 @@ pub enum TimeError {
/// Note: [`tz::error::DateTimeError`] is only thrown from `tz-rs` when a
/// provided component value is out of range.
///
- /// Note: This is different from how MRI ruby is implemented. e.g. Second
- /// 60 is valid in MRI, and will just add an additional second instead of
+ /// Note: This is different from how MRI ruby is implemented. e.g. Second 60
+ /// is valid in MRI, and will just add an additional second instead of
/// erroring.
ComponentOutOfRangeError(DateTimeError),
@@ -103,8 +103,8 @@ impl From<TzError> for TimeError {
// Allowing matching arms due to documentation
#[allow(clippy::match_same_arms)]
match error {
- // These two are generally recoverable within the usable of `spinoso_time`
- // TzError::DateTimeError(error) => Self::from(error),
+ // These two are generally recoverable within the usable of
+ //`spinoso_time` TzError::DateTimeError(error) => Self::from(error),
TzError::ProjectDateTimeError(error) => Self::from(error),
// The rest will bleed through, but are included here for reference
diff --git i/spinoso-time/src/time/tzrs/math.rs w/spinoso-time/src/time/tzrs/math.rs
index bfc2433c44..76de1f99e4 100644
--- i/spinoso-time/src/time/tzrs/math.rs
+++ w/spinoso-time/src/time/tzrs/math.rs
@@ -74,7 +74,8 @@ impl Time {
new_nanos -= NANOS_IN_SECOND;
}
- // Rounding should never cause an error generating a new time since it's always a truncation
+ // Rounding should never cause an error generating a new time
+ //since it's always a truncation
let dt = DateTime::from_timespec_and_local(unix_time, new_nanos, local_time_type)
.expect("Could not round the datetime");
Self {
@@ -171,8 +172,8 @@ impl Time {
// Subtraction
impl Time {
- /// Subtraction — Subtracts the given duration from _time_ and returns
- /// that value as a new `Time` object.
+ /// Subtraction — Subtracts the given duration from _time_ and returns that
+ /// value as a new `Time` object.
///
/// # Errors
///
diff --git i/spinoso-time/src/time/tzrs/mod.rs w/spinoso-time/src/time/tzrs/mod.rs
index 44582c9f6c..82f03c6cdd 100644
--- i/spinoso-time/src/time/tzrs/mod.rs
+++ w/spinoso-time/src/time/tzrs/mod.rs
@@ -141,7 +141,8 @@ impl Time {
///
/// # Errors
///
- /// Can produce a [`TimeError`], generally when provided values are out of range.
+ /// Can produce a [`TimeError`], generally when provided values are out of
+ /// range.
///
/// [`Time#new`]: https://ruby-doc.org/core-3.1.2/Time.html#method-c-new
/// [`Timezone`]: https://ruby-doc.org/core-3.1.2/Time.html#class-Time-label-Timezone+argument
@@ -175,7 +176,8 @@ impl Time {
// upstream has provided a test case which means we have a test that
// simulates this failure condition and requires us to handle it.
//
- // See: https://github.com/x-hgg-x/tz-rs/issues/34#issuecomment-1206140198
+ // See:
+ //https://github.com/x-hgg-x/tz-rs/issues/34#issuecomment-1206140198
let dt = found_date_times.latest().ok_or(TimeError::Unknown)?;
Ok(Self { inner: dt, offset })
}
@@ -197,7 +199,8 @@ impl Time {
///
/// # Errors
///
- /// Can produce a [`TimeError`], however these should never been seen in regular usage.
+ /// Can produce a [`TimeError`], however these should never been seen in
+ /// regular usage.
///
/// [`Time#now`]: https://ruby-doc.org/core-3.1.2/Time.html#method-c-now
#[inline]
@@ -228,7 +231,8 @@ impl Time {
///
/// # Errors
///
- /// Can produce a [`TimeError`], however these should not be seen during regular usage.
+ /// Can produce a [`TimeError`], however these should not be seen during
+ /// regular usage.
///
/// [`Time#at`]: https://ruby-doc.org/core-3.1.2/Time.html#method-c-at
#[inline]
@@ -264,7 +268,8 @@ impl TryFrom<ToA> for Time {
///
/// # Errors
///
- /// Can produce a [`TimeError`], generally when provided values are out of range.
+ /// Can produce a [`TimeError`], generally when provided values are out of
+ /// range.
#[inline]
fn try_from(to_a: ToA) -> Result<Self> {
let offset = Offset::try_from(to_a.zone).unwrap_or_else(|_| Offset::utc());
diff --git i/spinoso-time/src/time/tzrs/offset.rs w/spinoso-time/src/time/tzrs/offset.rs
index 13b77bfc30..5290f2497e 100644
--- i/spinoso-time/src/time/tzrs/offset.rs
+++ w/spinoso-time/src/time/tzrs/offset.rs
@@ -61,8 +61,8 @@ fn local_time_zone() -> TimeZoneRef<'static> {
GMT
}
-/// Generates a [+/-]HHMM timezone format from a given number of seconds
-/// Note: the actual seconds element is effectively ignored here
+/// Generates a [+/-]HHMM timezone format from a given number of seconds Note:
+/// the actual seconds element is effectively ignored here
#[inline]
#[must_use]
fn offset_hhmm_from_seconds(seconds: i32) -> String {
@@ -311,8 +311,8 @@ impl TryFrom<&str> for Offset {
// includes all sorts of numerals, including Devanagari and
// Kannada, which don't parse into an `i32` using `FromStr`.
//
- // `[[:digit:]]` is documented to be an ASCII character class
- // for only digits 0-9.
+ // `[[:digit:]]` is documented to be an ASCII character class for
+ //only digits 0-9.
//
// See:
// - https://docs.rs/regex/latest/regex/#perl-character-classes-unicode-friendly
diff --git i/spinoso-time/src/time/tzrs/parts.rs w/spinoso-time/src/time/tzrs/parts.rs
index e02a8138a7..33a0d515e7 100644
--- i/spinoso-time/src/time/tzrs/parts.rs
+++ w/spinoso-time/src/time/tzrs/parts.rs
@@ -60,8 +60,9 @@ impl Time {
/// Returns the second of the minute (0..60) for _time_.
///
- /// Seconds range from zero to 60 to allow the system to inject [leap
- /// seconds].
+ /// Seconds range from zero to 60 to allow the system to inject
+ /// [leap
+ seconds].
///
/// Can be used to implement [`Time#sec`].
///
@@ -316,8 +317,8 @@ impl Time {
self.inner.local_time_type().is_dst()
}
- /// Returns an integer representing the day of the week, `0..=6`, with Sunday
- /// == 0.
+ /// Returns an integer representing the day of the week, `0..=6`, with
+ /// Sunday == 0.
///
/// Can be used to implement [`Time#wday`].
///
diff --git i/src/bin/airb.rs w/src/bin/airb.rs
index 47d657c252..f142194010 100644
--- i/src/bin/airb.rs
+++ w/src/bin/airb.rs
@@ -11,8 +11,8 @@
#![warn(unused_qualifications)]
#![warn(variant_size_differences)]
-//! `airb` is the Artichoke implementation of `irb` and is an interactive Ruby shell
-//! and [REPL][repl].
+//! `airb` is the Artichoke implementation of `irb` and is an interactive Ruby
+//! shell and [REPL][repl].
//!
//! `airb` is a readline enabled shell, although it does not persist history.
//!
diff --git i/src/bin/artichoke.rs w/src/bin/artichoke.rs
index 5aab468a74..2be5d61deb 100644
--- i/src/bin/artichoke.rs
+++ w/src/bin/artichoke.rs
@@ -173,7 +173,7 @@ fn command() -> Command<'static> {
//
// `ripgrep` is licensed with the MIT License Copyright (c) 2015 Andrew Gallant.
//
-// https://github.com/BurntSushi/ripgrep/blob/9f924ee187d4c62aa6ebe4903d0cfc6507a5adb5/LICENSE-MIT
+// //https://github.com/BurntSushi/ripgrep/blob/9f924ee187d4c62aa6ebe4903d0cfc6507a5adb5/LICENSE-MIT
//
// See https://github.com/artichoke/artichoke/issues/1301.
@@ -195,12 +195,12 @@ where
if err.use_stderr() {
return Err(err.into());
}
- // Explicitly ignore any error returned by write!. The most likely error
- // at this point is a broken pipe error, in which case, we want to ignore
- // it and exit quietly.
+ // Explicitly ignore any error returned by write!. The most likely error at
+ //this point is a broken pipe error, in which case, we want to ignore it and
+ //exit quietly.
//
- // (This is the point of this helper function. clap's functionality for
- // doing this will panic on a broken pipe error.)
+ // (This is the point of this helper function. clap's functionality for doing
+ //this will panic on a broken pipe error.)
let _ignored = write!(io::stdout(), "{}", err);
process::exit(0);
}
diff --git i/src/lib.rs w/src/lib.rs
index 1c574a3a75..27657f0b4a 100644
--- i/src/lib.rs
+++ w/src/lib.rs
@@ -13,9 +13,11 @@
//! Artichoke Ruby
//!
-//! This crate is a Rust and Ruby implementation of the [Ruby programming
-//! language][rubylang]. Artichoke is not production-ready, but intends to be a
-//! [MRI-compliant][rubyspec] implementation of [recent MRI Ruby][mri-target].
+//! This crate is a Rust and Ruby implementation of the
+//! [Ruby programming
+ language][rubylang]. Artichoke is not production-ready,
+//! but intends to be a [MRI-compliant][rubyspec] implementation of
+//! [recent MRI Ruby][mri-target].
//!
//! [mri-target]: https://github.com/artichoke/artichoke/blob/trunk/RUBYSPEC.md#mri-target
//!
diff --git i/src/parser.rs w/src/parser.rs
index c25cc27df2..f9eb79b410 100644
--- i/src/parser.rs
+++ w/src/parser.rs
@@ -193,8 +193,8 @@ impl<'a> Parser<'a> {
EXPR_ENDFN => false,
// jump keyword like break, return, ...
EXPR_MID => false,
- // this token is unreachable and is used to do integer math on the
- // values of `mrb_lex_state_enum`.
+ // this token is unreachable and is used to do integer math on
+ //the values of `mrb_lex_state_enum`.
EXPR_MAX_STATE => false,
};
if code_has_unterminated_expression {
@@ -216,7 +216,7 @@ impl<'a> Drop for Parser<'a> {
sys::mrb_parser_free(parser.as_mut());
});
}
- // There is no need to free `context` since it is owned by the
- // Artichoke state.
+ // There is no need to free `context` since it is owned by the Artichoke
+ //state.
}
}
diff --git i/src/ruby.rs w/src/ruby.rs
index a5710565d5..41a9d67279 100644
--- i/src/ruby.rs
+++ w/src/ruby.rs
@@ -219,9 +219,9 @@ fn load_error<P: AsRef<OsStr>>(file: P, message: &str) -> Result<String, Error>
// This function exists to provide a workaround for Artichoke not being able to
// read from the local file system.
//
-// By passing the `--with-fixture PATH` argument, this function loads the file
-// at `PATH` into memory and stores it in the interpreter bound to the
-// `$fixture` global.
+// By passing the `--with-fixture PATH` argument, this function loads the file at
+ //`PATH` into memory and stores it in the interpreter bound to the `$fixture`
+//global.
#[inline]
fn setup_fixture_hack<P: AsRef<Path>>(interp: &mut Artichoke, fixture: P) -> Result<(), Error> {
let data = if let Ok(data) = fs::read(fixture.as_ref()) {
as an aside it looks like grapheme cluster emojis are reflowed differently than prettier does. Maybe cargo-spellcheck is counting these as multiple characters?
this reflow is also broken:
diff --git i/spinoso-time/src/time/tzrs/parts.rs w/spinoso-time/src/time/tzrs/parts.rs
index e02a8138a7..33a0d515e7 100644
--- i/spinoso-time/src/time/tzrs/parts.rs
+++ w/spinoso-time/src/time/tzrs/parts.rs
@@ -60,8 +60,9 @@ impl Time {
/// Returns the second of the minute (0..60) for _time_.
///
- /// Seconds range from zero to 60 to allow the system to inject [leap
- /// seconds].
+ /// Seconds range from zero to 60 to allow the system to inject
+ /// [leap
+ seconds].
///
/// Can be used to implement [`Time#sec`].
///
@@ -316,8 +317,8 @@ impl Time {
self.inner.local_time_type().is_dst()
}
- /// Returns an integer representing the day of the week, `0..=6`, with Sunday
- /// == 0.
+ /// Returns an integer representing the day of the week, `0..=6`, with
+ /// Sunday == 0.
///
/// Can be used to implement [`Time#wday`].
///
block quotes in // comments appear to be reflowed incorrectly as well
Graphemes are not handled, I didn't figure out a way to reliably count their length. See #143
|
2025-04-01T06:38:27.394521
| 2024-01-17T11:44:37
|
2086015889
|
{
"authors": [
"AnomalRoil",
"sergeevabc"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5462",
"repo": "drand/tlock",
"url": "https://github.com/drand/tlock/issues/78"
}
|
gharchive/issue
|
Windows binary
Dear @drand, Windows 7 x64 user here. Could you be so kind to generate .exe for the rest of us who are ordinary users without compilers?
Sounds like a good idea. I'll release a v1.1.0 soon and will include binaries for Windows, Linux and Macos.
You can now find a Windows binary in our latest release!
https://github.com/drand/tlock/releases
I'll have to tweak the releaser a bit to avoid packaging binaries in .tar.gz archives, but these can be extracted using 7Zip and probably many other Archive tools on Windows.
@AnomalRoil, I am a Windows 7 x64 user (mentioned in the first message), but it seems you built an executable using the latest version of a compiler, from which the ability to build for that OS was treacherously removed. This is what I see on the screen:
$ tle.exe
Exception 0xc0000005 0x8 0x0 0x0
PC=0x0
runtime.asmstdcall()
$GOROOT/src/runtime/sys_windows_amd64.s:65 +0x75 fp=0x22fca0 sp=0x22fc80 pc=0x46da75
rax 0x0
rbx 0xfe2c80
rcx 0x1036dc0
rdi 0x7fffffde000
rsi 0x22fea0
rbp 0x22fde0
rsp 0x22fc78
r8 0x0
r9 0x22fee0
r10 0x1008818
r11 0x21
r12 0x22fec0
r13 0x1
r14 0xfe2620
r15 0x0
rip 0x0
rflags 0x10293
cs 0x33
fs 0x53
gs 0x2b
I guess this might be caused by issues in our goreleaser script. Let me re-open the issue.
Could you try with these 2 binaries and let me know if either work for you?
Just running ./releasetle.exe -v and ./localtle.exe -v should be enough to check if they work:
tle-win.zip
@AnomalRoil, none of them work, alas.
It seems Windows 7 might not be supported anymore since Go 1.21 (https://github.com/golang/go/issues/57003), that's an unfortunate decision given the current market penetration of Windows 7 and seems a bit quick given how Windows XP is currently still supported by MS for paid customers, AFAIK.
Could you try with this binary compiled with Go1.20 which should still support Windows 7 ?
tle-1.20.zip
@AnomalRoil, this binary works as expected, thank you.
That's why I used the word 'treacherously' in relation to what @golang did. Those folks from California lost touch with reality and stopped taking into account which OS people like me outside the golden billion use and will use for a foreseeable future. It's especially ridiculous when they cite OS distribution statistics taken from a site like StatCounter, given that it has been distorted for many years by the fact that tracker scripts are blocked at the browser level, even without extensions like uBlock. I do not remember such betrayal, for example, by the C++ devs. And now we are forced to negotiate with app developers like you to compile, let’s say, legacy versions. Recently, @shenwei356 and I were working on improving bRename, a file renaming app, and faced the same situation. Indeed, the word 'segregation' fits here, not 'progress'.
This should now be fixed in https://github.com/drand/tlock/releases/tag/v1.1.1
|
2025-04-01T06:38:27.396572
| 2019-12-26T23:05:46
|
542690298
|
{
"authors": [
"codebycliff",
"kapso",
"pedrofurtado"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5463",
"repo": "drapergem/draper",
"url": "https://github.com/drapergem/draper/issues/869"
}
|
gharchive/issue
|
Ruby 2.7.0 support
Seeing this warning...
lib/ruby/gems/2.7.0/gems/draper-3.1.0/lib/draper/delegation.rb:10: warning: Using the last argument as keyword parameters is deprecated; maybe ** should be added to the call
@olleolleolle @codebycliff 🤝
Fixed by: https://github.com/drapergem/draper/pull/870
|
2025-04-01T06:38:27.451564
| 2018-06-06T17:55:04
|
329971149
|
{
"authors": [
"drewish",
"gvgramazio"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5464",
"repo": "drewish/jekyll-font-awesome-sass",
"url": "https://github.com/drewish/jekyll-font-awesome-sass/issues/3"
}
|
gharchive/issue
|
[REQUEST] - Support for FontAwesome 5
font-awesome-sass is now at version 5.0.13 but your plugin only support version 4. Could you please update it?
I'm not currently using it myself so I haven't had any push to upgrade it, but happy to have a PR.
Well turns out there was already a PR so merged that and published an updated gem.
Yeah, I've already saw that PR. That was the mainly reason for asking to update. :)
By the way, thanks for the update. :D
|
2025-04-01T06:38:27.453350
| 2020-10-23T16:21:14
|
728350236
|
{
"authors": [
"drewvigne",
"duffel90"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5465",
"repo": "drewvigne/arduino_nano_33_ant",
"url": "https://github.com/drewvigne/arduino_nano_33_ant/issues/1"
}
|
gharchive/issue
|
Powermeter
Hey,
i found you project over youtube. The last thing I did was to rebuild the powermeter after this project https://github.com/rrrlasse/powerino/wiki. But I would like to get my data directly on my Garmin watch. Is it possible to send the data via ANT with the already existing project?
Best regards
Hi duffel90,
Certain Garmin watches support ANT power meter data, yours might be one of them. So long as it is, the existing project should work fine.
Drew
|
2025-04-01T06:38:27.464215
| 2016-04-02T04:22:58
|
145333858
|
{
"authors": [
"Bandito11",
"dylanvdmerwe",
"jgw96",
"tlancina"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5466",
"repo": "driftyco/ionic-cli",
"url": "https://github.com/driftyco/ionic-cli/issues/892"
}
|
gharchive/issue
|
ENOENT spawn error with ionic start
From @Bandito11 on April 2, 2016 2:8
Note: for support questions, please use one of these channels:
https://forum.ionicframework.com/
http://ionicworldwide.herokuapp.com/
Short description of the problem:
I upgraded to beta.23 and started to get an error when created a new project using 'ionic start foo --v2 --ts'.
The error in red says
"Unable to spawn commandError: spawn npm ENOENT (CLI v2.0.0-beta.23)."
I don't even know what it did to the project.
What behavior are you expecting?
Steps to reproduce:
ionic start foo --v2 --ts
it builds and creates the proyect
insert any relevant code between the above and below backticks
Other information: (e.g. stacktraces, related issues, suggestions how to fix, stackoverflow links, forum links, etc)
Which Ionic Version? 1.x or 2.x
Run ionic info from terminal/cmd prompt: (paste output below)
Copied from original issue: driftyco/ionic#6018
Hello! Thanks for opening an issue with us! Since this is an issue related to the ionic cli and not the framework i will be moving this issue to that repo. Feel free to continue the conversation over there! Thanks!
I think I solved it. I had python2 and python3 installed so after I deleted that everything was solved.
I think this should be added to the documents even though I think this pertains node and not the ionic cli
@Bandito11 what version of nodejs are you using? You shouldn't have to uninstall python to be able to use the ionic cli.
I have the latest one. I uninstalled and installed everything yesterday because I've been using ASP.Net and Python3 (already had Python 2 installed) for college since January and wanted to have the latest npm and node. When I was 'ionic start'-ing the project it was mentioning that python 3 was in the path (don't remember what else it said) so I deleted python 3 and everything was fixed.
I am using the latest ionic@beta too. It isbeta23
Hmm, ok, thanks for the info! I will have to look more into this.
This error is happening on my side and I have not installed Python.
The npm install task originally used exec to run npm install, which was sometimes caused issues so it was switched to spawn, which doesn't work/works differently on Windows.
Switched to using https://github.com/IndigoUnited/node-cross-spawn in the latest beta, so this should be resolved, but let me know if you're still having issues, thanks!
|
2025-04-01T06:38:27.469746
| 2014-08-29T21:09:31
|
41539782
|
{
"authors": [
"gastonbesada",
"mhartington"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5467",
"repo": "driftyco/ionic-site",
"url": "https://github.com/driftyco/ionic-site/issues/210"
}
|
gharchive/issue
|
[Feature request] - Full features demo in the homepage
Hi,
I'm reviewing http://www.idangero.us/framework7/ and I found very helpful the full features demo in the homepage. Could you review if is possible make some similar in ionic's site?
Thanks and regards
For demos, we have opted to keep them under our codepen account.
http://codepen.io/ionic/public-list/
|
2025-04-01T06:38:27.476783
| 2017-04-26T12:59:03
|
224459106
|
{
"authors": [
"Iyashu5040",
"brandyscarney",
"jgw96"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5468",
"repo": "driftyco/ionic",
"url": "https://github.com/driftyco/ionic/issues/11373"
}
|
gharchive/issue
|
$label-ios-text-color missing
Ionic version: (check one with "x")
[ ] 1.x
[ ] 2.x
[X ] 3.x
I'm submitting a ... (check one with "x")
[X ] bug report
[ ] feature request
[ ] support request => Please do not submit support requests here, use one of these channels: https://forum.ionicframework.com/ or http://ionicworldwide.herokuapp.com/
Current behavior:
There is no $label-ios-text-color sass variable. There are equivalents for md and wp.
Expected behavior:
A $label-ios-text-color variable is needed to be able to override label colors on ios like on other platforms.
Ionic info: (run ionic info from a terminal/cmd prompt and paste output below):
Cordova CLI: 6.5.0
Ionic Framework Version: 3.0.1
Ionic CLI Version: 2.2.3
Ionic App Lib Version: 2.2.1
Ionic App Scripts Version: 1.3.0
ios-deploy version: Not installed
ios-sim version: Not installed
OS: Windows 10
Node Version: v6.9.1
Xcode version: Not installed
Hello, thanks for opening an issue with us, we will look into this.
I was making a repo for a different issue and decided to just add this one as well. Run this project with ionic serve --lab and compare Android and iOS, go to the Sign Up page and you'll see the color difference. The variables are set at the bottom of variables.scss. Here's the commit with the changes:
https://github.com/Iyashu5040/ionic-conference-app/commit/810a2574d07f6757db15a24ca546353f934c2c58
The variable doesn't exist because we don't change the styling on the iOS label. We could add one that just styles initial, but if you wanted to match the way material design works it would be the following styling:
.item-input .label-ios,
.item-select .label-ios,
.item-datetime .label-ios {
color: color($colors, primary);
}
Thanks for the clarification and the example, I appreciate it. Regarding adding the variable: I'd like to argue for adding it, for the sake of making customisation easier and more consistent across platforms.
@Iyashu5040 I've added it back to master. Could you try out the following nightly and let me know if you have any issues?
npm install --save --save-exact<EMAIL_ADDRESS>
Thanks @brandyscarney! I've tested with the nightly version and the style applies correctly.
|
2025-04-01T06:38:27.485055
| 2017-05-22T13:00:56
|
230388166
|
{
"authors": [
"jgw96",
"stalniy"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5469",
"repo": "driftyco/ionic",
"url": "https://github.com/driftyco/ionic/issues/11752"
}
|
gharchive/issue
|
viewWillEnter is not triggered on App instance when switching between tabs
Ionic version: (check one with "x")
[ ] 1.x (For Ionic 1.x issues, please use https://github.com/driftyco/ionic-v1)
[x] 2.x
[x] 3.x
I'm submitting a ... (check one with "x")
[x] bug report
[ ] feature request
[ ] support request => Please do not submit support requests here, use one of these channels: https://forum.ionicframework.com/ or http://ionicworldwide.herokuapp.com/
Current behavior:
When switching between tabs in ion-tabs ionic emits value on viewWillEnter observable on App instance only once (when component is created and doesn't emit for subsequent navigations)
Expected behavior:
When switching between tabs in ion-tabs ionic emits viewWillEnter observable on App instance
Steps to reproduce:
Create an ionic app with tabs
Inject App and subscribe to viewWillEnter
Navigate between tabs
Other information:
issue caused by this line https://github.com/driftyco/ionic/blob/master/src/components/tabs/tabs.ts#L383
I think to fix the issue the line should be changed to:
selectedPage && selectedTab._willEnter(selectedPage);
The same should be done for other leave/enter events.
The fix is quite important for properly and easily tracking analytics in the whole application because it allows to track all navigation events through subscription to Apps viewWillEnter observable.
I'm ready to create PR if you think the suggested approach is correct
Ionic info: (run ionic info from a terminal/cmd prompt and paste output below):
Cordova CLI: 7.0.1
Ionic Framework Version: 2.2.0
Ionic CLI Version: 2.1.18
Ionic App Lib Version: 2.1.9
ios-deploy version: 1.9.0
ios-sim version: 5.0.8
OS: OS X El Capitan
Node Version: v6.9.1
Xcode version: Xcode 8.2.1 Build version 8C1002
Hello, thanks for using Ionic! I am going to close this issue as a duplicate of https://github.com/driftyco/ionic/issues/11694 and we can continue the discussion on that issue 😃 .
|
2025-04-01T06:38:27.490148
| 2015-05-22T13:34:54
|
79457641
|
{
"authors": [
"Ionitron",
"timuric"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5470",
"repo": "driftyco/ionic",
"url": "https://github.com/driftyco/ionic/issues/3814"
}
|
gharchive/issue
|
Layout glitches on Android when keyboard is opening or closing.
There are short visual distortions on keyboard transitions. However Ionic View app does not have such issue, thus it is seems to be solvable. My device is Nexus 4, Android 5.1. However I noticed similar behaviour on other devices.
Link to the video: http://d.pr/v/16ErH
Test app : http://d.pr/f/V56v
Does anyone has a clue what is the reason for that?
Greetings @timuric!
My sensors indicate that you need to update your issue through our custom issue form. We are now requiring all issues to be submitted this way, to ensure that we have all of the information necessary to fix them as quickly as possible.
Click Here To Update Your Issue
I will have no choice but to close this issue if it is not resubmitted through the form. Please fill out the rest of the form, so that I may use my friendly robot powers to assist you.
Thank you!
Greetings @timuric!
I've closed this issue because my sensors indicated it was old and inactive, and may have already been fixed in recent versions of Ionic. However, if you are still experiencing this issue, please feel free to reopen this issue by creating a new one, and include any examples and other necessary information, so that we can look into it further.
Thank you for allowing me to assist you.
|
2025-04-01T06:38:27.497077
| 2015-09-27T22:04:11
|
108563571
|
{
"authors": [
"Ionitron",
"Urigo",
"kaiquewdev",
"mlynch",
"yumikohey"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5471",
"repo": "driftyco/ionic",
"url": "https://github.com/driftyco/ionic/issues/4435"
}
|
gharchive/issue
|
bug: Meteor driftyco:ionic Blank Screen on ios simulator / ios devices
Type: bug
Platform: ios 7 webview
I am building a Meteor hybrid app that using AngularJS and ionic, so I decided to use driftyco:ionic, since meteoric:ionic doesn't support Angular.
Everything works fine on web browser, however, it is blank when it runs on ios devices or simulator. I get rid of driftyco:ionic, content is able to show on ios devices. Thus, I believe there is something I did wrong, so I got a blank page.
Could you check my repo https://github.com/yumikohey/bringMe ?
I deployed to <IP_ADDRESS>(most updated) and bring-me.meteor.com(older version)
Thanks for your help.
Greetings @yumikohey!
My sensors indicate that you need to update your issue through our custom issue form. We are now requiring all issues to be submitted this way, to ensure that we have all of the information necessary to fix them as quickly as possible.
Click Here To Update Your Issue
I will have no choice but to close this issue if it is not resubmitted through the form. Please fill out the rest of the form, so that I may use my friendly robot powers to assist you.
Thank you!
Hi @yumikohey. Seems like a better question for @Urigo
@yumikohey can you please open an issue on the Angular Meteor repo with your source code?
thanks
@yumikohey Look at the repo, check pull request that i sent to your project.
This is a meteor environment issue, i guess.
|
2025-04-01T06:38:27.503906
| 2016-08-22T10:30:24
|
172418921
|
{
"authors": [
"jgw96",
"navid045",
"nkaredia"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5472",
"repo": "driftyco/ionic",
"url": "https://github.com/driftyco/ionic/issues/7819"
}
|
gharchive/issue
|
Need a new option for inputs at the alertController component
Short description of the problem:
We can't add some custom attributes for inputs in alertController.
What behavior are you expecting?
I think it is necessary that we could add custom attributes like maxlength, id, style and etc to inputs in alertController.
I suggest we have a new option in inputs named "attr" that can set custom attributes for inputs. for example like this:
let alert = this.alertCtrl.create({
title: 'Login',
inputs: [
{
name: 'test',
placeholder: 'Test name',
attr: { maxlength: 6, id: "test", style:"color:red; font-size: 14px;" }
}
],
buttons: [
{
text: 'Cancel',
role: 'cancel',
handler: data => {
console.log('Cancel clicked');
}
}
]
});
alert.present();
}
Which Ionic Version? 1.x or 2.x
2.x
Plunker that shows an example of your issue
For Ionic 1 issues - http://plnkr.co/edit/Xo1QyAUx35ny1Xf9ODHx?p=preview
For Ionic 2 issues - http://plnkr.co/edit/me3Uk0GKWVRhZWU0usad?p=preview
Run ionic info from terminal/cmd prompt: (paste output below)
Your system information:
Cordova CLI: 6.2.0
Gulp version: CLI version 3.8.11
Gulp local:
Ionic CLI Version: 2.0.0-beta.31
Ionic App Lib Version: 2.0.0-beta.17
OS: Windows 8
Node Version: v4.4.0
Hello! We are moving our feature requests to a new feature request doc. I have moved this feature request to the doc and because of this I will be closing this issue for now. Thanks for using Ionic!
Hello all! While this is an awesome feature request it is not something that we plan on doing anytime soon. Because of this I am going to move this to our internal feature tracking repo for now as it is just "collecting dust" here. Once we decide to implement this I will move it back. Thanks everyone for using Ionic!
This issue was moved to driftyco/ionic-feature-requests#54
alertoptions = {
title: 'sometitle',
inputs: [{name: 'someName', id: 'someID'}]
}
alert.present().then(v => {
// will fire this when modal is completely loaded with animations
let id = document.getElementById('someID');
if (id) {
// Do your thing
}
});
Although this is just a workaround, I think Ionic team should include a feature to add custom attributes to handle form validation and other native HTML5 features for inputs.
|
2025-04-01T06:38:27.508941
| 2015-08-18T23:55:26
|
101775395
|
{
"authors": [
"Alphatiger",
"Ionitron"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5473",
"repo": "driftyco/ng-cordova",
"url": "https://github.com/driftyco/ng-cordova/issues/944"
}
|
gharchive/issue
|
IOS pushnotification sound
In NgCordova push notification sample code
$rootScope.$on('$cordovaPush:notificationReceived', function(event, notification) {
if (notification.alert) {
navigator.notification.alert(notification.alert);
}
if (notification.sound) {
var snd = new Media(event.sound);
snd.play();
}
A sound file is being played explicitly, is this mandatory or if i skip it will it play the default sound?
And does the sound plays (default or custom) plays in background?
I tested with all possibe scenarios but but currently with out the above sound code, sound is not playing
Is the above code mandatory
Greetings @Alphatiger!
I've closed this issue because my sensors indicated it was old and inactive, and may have already been fixed in recent versions of Ionic. However, if you are still experiencing this issue, please feel free to reopen this issue by creating a new one, and include any examples and other necessary information, so that we can look into it further.
Thank you for allowing me to assist you.
|
2025-04-01T06:38:27.512196
| 2024-04-29T06:05:51
|
2268197946
|
{
"authors": [
"MJRT",
"janat08"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5474",
"repo": "drizzle-team/drizzle-orm",
"url": "https://github.com/drizzle-team/drizzle-orm/issues/2226"
}
|
gharchive/issue
|
[FEATURE]: cloudflare d1 session support
Describe what you want
https://blog.cloudflare.com/building-d1-a-global-database
This should be usable in graphql too.
Have any plan for this? This will greatly enhance the user experience.
|
2025-04-01T06:38:27.525842
| 2024-10-07T15:16:43
|
2570757872
|
{
"authors": [
"AndriiSherman",
"Krinopotam",
"john-griffin",
"kylesloper",
"marc-neander",
"notcodev",
"ryanxcharles"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5475",
"repo": "drizzle-team/drizzle-orm",
"url": "https://github.com/drizzle-team/drizzle-orm/issues/3057"
}
|
gharchive/issue
|
[BUG]: Unable to run migrations using drizzle-kit 0.25.0
What version of drizzle-orm are you using?
0.33.0
What version of drizzle-kit are you using?
0.25.0
Describe the Bug
pnpm --package=drizzle-kit --package=drizzle-orm --package=pg -c dlx 'drizzle-kit migrate'
gives
Error [ERR_PACKAGE_PATH_NOT_EXPORTED]: Package subpath './casing' is not defined by "exports" in /Users/marc.neander/Library/Caches/pnpm/dlx/3zgskc56eje2wm6mx5ky2kqzoe/192677b77f9-1747/node_modules/drizzle-orm/package.json
Expected behavior
No response
Environment & setup
No response
Can't reproduce, did you try to force reinstall node_modules?
Can't reproduce, did you try to force reinstall node_modules?
Happens in CI without any cache
What version of drizzle-orm are you using?
0.33.0
What version of drizzle-kit are you using?
0.25.0
Describe the Bug
pnpm --package=drizzle-kit --package=drizzle-orm --package=pg -c dlx 'drizzle-kit migrate'
gives
Error [ERR_PACKAGE_PATH_NOT_EXPORTED]: Package subpath './casing' is not defined by "exports" in /Users/marc.neander/Library/Caches/pnpm/dlx/3zgskc56eje2wm6mx5ky2kqzoe/192677b77f9-1747/node_modules/drizzle-orm/package.json
Expected behavior
No response
Environment & setup
No response
Encountered the same problem. drizzle-kit 0.25.0 only works correctly with drizzle-orm 0.34.0. Update the version of drizzle-orm to resolve the issue
What version of drizzle-orm are you using?
0.33.0
What version of drizzle-kit are you using?
0.25.0
Describe the Bug
pnpm --package=drizzle-kit --package=drizzle-orm --package=pg -c dlx 'drizzle-kit migrate'
gives
Error [ERR_PACKAGE_PATH_NOT_EXPORTED]: Package subpath './casing' is not defined by "exports" in /Users/marc.neander/Library/Caches/pnpm/dlx/3zgskc56eje2wm6mx5ky2kqzoe/192677b77f9-1747/node_modules/drizzle-orm/package.json
Expected behavior
No response
Environment & setup
No response
Encountered the same problem. drizzle-kit 0.25.0 only works correctly with drizzle-orm 0.34.0. Update the version of drizzle-orm to resolve the issue
Can confirm that this solves it. Had a bad break running migrations with latest packages inbetween orm and kit releases i guess
Can confirm this as not completed.
Using pnpm
<EMAIL_ADDRESS><EMAIL_ADDRESS>Error [ERR_PACKAGE_PATH_NOT_EXPORTED]: Package subpath './casing' is not defined by "exports" in C:[path to project]/node_modules\drizzle-orm\package.json
Cleared pnpm cache and forcefully remove and updated modules all to no avail.
Also getting this after upgrading to the following versions using pnpm:
<EMAIL_ADDRESS><EMAIL_ADDRESS>
I was able to reproduce this by upgrading a single package in my pnpm monorepo to the following versions:
<EMAIL_ADDRESS><EMAIL_ADDRESS>After I upgraded every app in the monorepo to the same versions the issue goes away.
I was able to reproduce this by upgrading a single package in my pnpm monorepo to the following versions:
<EMAIL_ADDRESS><EMAIL_ADDRESS>After I upgraded every app in the monorepo to the same versions the issue goes away.
Thank you! I faced with two different problems when I was trying to create migrations and I've tried different versions of this packages and this versions combination works fine
I had this bug. I fixed it. Here's how.
I had a monorepo with pnpm with multiple different versions of drizzle installed. I removed drizzle as a dependency from modules that didn't need it. This left only one version of drizzle, up to date, in only one package. I deleted node_modules, reinstalled everything, and the problem vanished.
|
2025-04-01T06:38:27.543917
| 2023-04-20T20:43:07
|
1677403629
|
{
"authors": [
"aaronnickovich",
"drodil"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5476",
"repo": "drodil/backstage-plugin-toolbox",
"url": "https://github.com/drodil/backstage-plugin-toolbox/issues/36"
}
|
gharchive/issue
|
feat: YAML Diff
Tool that compares two YAML text and shows the diff
General diff functionality would be nice too, not just for YAMLs :)
@drodil good idea! I'll see if I can make it generic :)
PR to solve this issue: #40
|
2025-04-01T06:38:27.546800
| 2016-06-26T10:09:23
|
162321246
|
{
"authors": [
"cgroschupp",
"droe"
],
"license": "BSD-2-Clause",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5477",
"repo": "droe/sslsplit",
"url": "https://github.com/droe/sslsplit/issues/137"
}
|
gharchive/issue
|
The sslsplit child process continues, even if the master process terminated.
See example to reproduce it:
sslsplit -p sslsplit.pid -P http <IP_ADDRESS> 8080 www.example.com 80&
ps x | grep sslsplit | grep -v grep
3018 s002 S 0:00.00 sslsplit -p sslsplit.pid -P http <IP_ADDRESS> 8080 www.example.com 80
3019 s002 S 0:00.00 sslsplit -p sslsplit.pid -P http <IP_ADDRESS> 8080 www.example.com 80
kill $(cat sslsplit.pid)
ps x | grep sslsplit | grep -v grep
3019 s002 S 0:00.00 sslsplit -p sslsplit.pid -P http <IP_ADDRESS> 8080 www.example.com 80
sslsplit -V
SSLsplit 0.5.0 (built 2016-06-25)
Copyright (c) 2009-2016, Daniel Roethlisberger<EMAIL_ADDRESS>http://www.roe.ch/SSLsplit
Build info: OSX:10.10.5 XNU:2782.40.9:sw_vers:2782.50.1 V:FILE
Features: -DHAVE_DARWIN_LIBPROC -DHAVE_PF
NAT engines: pf*
Local process info support: yes (Darwin libproc)
compiled against OpenSSL 1.0.2h 3 May 2016 (1000208f)
rtlinked against OpenSSL 1.0.2h 3 May 2016 (1000208f)
OpenSSL has support for TLS extensions
TLS Server Name Indication (SNI) supported
OpenSSL is thread-safe with THREADID
Using SSL_MODE_RELEASE_BUFFERS
SSL/TLS protocol availability: ssl3 tls10 tls11 tls12
SSL/TLS algorithm availability: RSA DSA ECDSA DH ECDH EC
OpenSSL option availability: SSL_OP_NO_COMPRESSION SSL_OP_NO_TICKET SSL_OP_ALLOW_UNSAFE_LEGACY_RENEGOTIATION SSL_OP_DONT_INSERT_EMPTY_FRAGMENTS SSL_OP_NO_SESSION_RESUMPTION_ON_RENEGOTIATION SSL_OP_TLS_ROLLBACK_BUG
compiled against libevent 2.0.22-stable
rtlinked against libevent 2.0.22-stable
8 CPU cores detected
I will take a look. Any reason why you are not using daemon mode? (-d)
I can reproduce the problem. The PID file always points to the parent process. When the parent process is killed, the child process does not terminate. When on the other hand the child process is killed, all is well, the parent process gets notified and terminates gracefully.
The above commits fix a number of issues that would lead to the parent process being stuck in wait() while still having signals queued to forward to the child process. The notable commits are abc86df, adding SIGTERM which was missing in the list of signals forwarded to the client process, and 5ece01a, preventing the server from being stuck in wait() after all privsep client sockets send a close message before the child process actually terminates.
This fixes the issue for me. Please test and report back if it resolved the issue for you too.
Works for me too.
|
2025-04-01T06:38:27.557063
| 2021-04-21T08:29:00
|
863587904
|
{
"authors": [
"PandaUncle",
"nuo-promise",
"zendwang"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5478",
"repo": "dromara/shenyu",
"url": "https://github.com/dromara/shenyu/issues/1294"
}
|
gharchive/issue
|
PluginHandle configuration not work
Describe the bug
Configure PluginHandle .
PluginName: divide
FieldName: timeout
DefaultValue: 10000
I hope when rule registered,the timeout is 10000, but still 3000.
In the plugin processing management, configure the timeout of the divide plugin to 10000 (the system default is 3000). After the rule is registered, the timeout of the rule is still 3000. I originally hoped that the registered rules could be 10000 in accordance with the configuration timeout in the processing management
Environment
Soul version(s): 2.3.0
Steps to reproduce
Expected behavior
I will follow up this question.keep contact!
we will fix it on next version.
The current version of the interface registration does not go to the pluginhandle table to read the plugin attribute value. Under our internal evaluation, this function will be added if necessary.
Need to be optimized? @yu199195 @dengliming
After discussion, support is not considered for the time being, and the issue will be closed in one day.
|
2025-04-01T06:38:27.565482
| 2017-12-11T19:41:32
|
281142991
|
{
"authors": [
"coveralls",
"julianoes"
],
"license": "bsd-3-clause",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5479",
"repo": "dronecore/DroneCore",
"url": "https://github.com/dronecore/DroneCore/pull/198"
}
|
gharchive/pull-request
|
Package integration test, generate .deb and .rpm
This moves from checkinstall to fpm which allows us to generate multiple packages, one for the library and one for the integration tests.
Coverage remained the same at 16.051% when pulling ad6d8611e86c2659645de870edee520ee843ee8d on integration-tests-exe into 7225b768e006d2a01aea0c57289889ae56c96a2f on develop.
Coverage remained the same at 16.051% when pulling cdf884904a83638a1f5bada13277b8fb5bafdca4 on integration-tests-exe into 7225b768e006d2a01aea0c57289889ae56c96a2f on develop.
Coverage remained the same at 16.051% when pulling 96c2dae692fbee72cb9203f5892306f063b42183 on integration-tests-exe into 7225b768e006d2a01aea0c57289889ae56c96a2f on develop.
Coverage remained the same at 16.051% when pulling 251c6ae0d7f561169f89e50b3d912359af20bd7a on integration-tests-exe into 7225b768e006d2a01aea0c57289889ae56c96a2f on develop.
Coverage decreased (-0.3%) to 15.78% when pulling e18736b99d02c3256632654a7bb8d3e5dbb781c1 on integration-tests-exe into 7225b768e006d2a01aea0c57289889ae56c96a2f on develop.
|
2025-04-01T06:38:27.573664
| 2017-12-29T18:56:35
|
285139389
|
{
"authors": [
"duckhang9113",
"jyopari"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5480",
"repo": "dronekit/dronekit-python",
"url": "https://github.com/dronekit/dronekit-python/issues/778"
}
|
gharchive/issue
|
mavproxy command not found
Hi,
I installed everything properly and I keep getting this error:
bash: mavproxy.py: command not found
do you know what up?
I have the same problem. I use ubuntu 17 version. I use the following instruction to install: https://ardupilot.github.io/MAVProxy/html/getting_started/download_and_installation.html#linux
|
2025-04-01T06:38:27.589574
| 2022-02-12T07:10:19
|
1133663886
|
{
"authors": [
"CLAassistant",
"ZacSweers",
"digitalbuddha",
"eleventigerssc",
"liutikas"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5481",
"repo": "dropbox/focus",
"url": "https://github.com/dropbox/focus/pull/3"
}
|
gharchive/pull-request
|
Disable configuration caching on CreateFocusSettingsTask
Partially handles #2 but doesn't implement actual support for configuration caching. This instead marks this task as incompatible with it, allowing Gradle to gracefully fall back to regular mode when this task runs rather than require users to manually disable CC first.
Gradle 7.5 will introduce APIs for depending on the resolved dependency graph in a CC-friendly way, but that's not ready yet. Example of that can be found here though: https://github.com/adammurdoch/dependency-graph-as-task-inputs/blob/main/plugins/src/main/java/TestPlugin.java#L31-L35
Note this also raises the minimum Gradle version to 7.4 unless you want to dynamically gate the API call at runtime on the gradle version
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.You have signed the CLA already but the status is still pending? Let us recheck it.
The CLA link leads me to an inactionable screen (I can't appear to click anything or input anything?)
Weird, seems to be loading fine for me. Try again?
Worked on mobile 👍
Raising apis seems fine, if we run into problems internally I’ll follow up and gate
Thanks Zac!
@digitalbuddha @ZacSweers it would be great if you could gate this call for Gradle versions below 7 for the poor folks who haven't got to the latest yet.
I think that you should just update. I know the Gradle ecosystem has a long track record of supporting old versions for long periods of time, and I think that's significantly held back the Gradle ecosystem's evolution. If you care enough about your build QoL to adopt a tool like this, you should care enough to update it regularly IMO
+1 Zac, configuration caching and project isolation compatibility is simply not something that will be possible to adopt by plugins without ugly hacks. Let's keep this modern!
|
2025-04-01T06:38:27.596164
| 2023-02-28T21:07:30
|
1603814259
|
{
"authors": [
"CLAassistant",
"denyszhak"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5482",
"repo": "dropbox/sqlalchemy-stubs",
"url": "https://github.com/dropbox/sqlalchemy-stubs/pull/255"
}
|
gharchive/pull-request
|
stub Select.filter and Select.filter_by
PR for https://github.com/dropbox/sqlalchemy-stubs/issues/254
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.dzhak seems not to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you have already a GitHub account, please add the email address used for this commit to your account.You have signed the CLA already but the status is still pending? Let us recheck it.
See https://github.com/dropbox/sqlalchemy-stubs/pull/256
|
2025-04-01T06:38:27.598512
| 2015-10-23T22:57:55
|
113113754
|
{
"authors": [
"Bjvanminnen",
"joshlory"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5483",
"repo": "droplet-editor/droplet",
"url": "https://github.com/droplet-editor/droplet/pull/131"
}
|
gharchive/pull-request
|
Restore update locations when a socket reparse fails
Prior to this change, socket reparse would modify the updates array of locations, but not restore the original values if the parent fails to parse with the new socket value. This would cause Container#getFromLocation to throw after falling off the end of the linked list.
The fix is to save the update locations before attempting to parse the socket as a new block, and restore them if the parent reparse fails.
This is a better fix for the JavaScript function sockets bug than https://github.com/droplet-editor/droplet/pull/130, which only treats the symptom but not the root cause.
Do you have enough understanding now to update the comment at the beginning of the method?
# Don't reparse sockets. When we reparse sockets,
# reparse them first, then try reparsing their parent and
# make sure everything checks out.
Changes lgtm. Good job tracking this down.
|
2025-04-01T06:38:27.606795
| 2022-08-27T08:18:35
|
1352981704
|
{
"authors": [
"rhowe"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5484",
"repo": "dropwizard/dropwizard",
"url": "https://github.com/dropwizard/dropwizard/pull/5770"
}
|
gharchive/pull-request
|
Remove redundant outputPatternAsHeader logging setting from tests
This defaults to false - we don't need to set it to false explicitly.
See https://logback.qos.ch/manual/encoders.html#outputPatternAsHeader
If you'd rather merge into 2.0.x, close this and look at https://github.com/dropwizard/dropwizard/pull/5769 instead
@rhowe Thanks a lot for the cleanup. Solid work as always. heart
This one was @zUniQueX really, I just pressed the buttons
|
2025-04-01T06:38:27.608597
| 2021-03-30T03:08:13
|
844020051
|
{
"authors": [
"psuriana",
"xinyuan-huang"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5485",
"repo": "drorlab/atom3d",
"url": "https://github.com/drorlab/atom3d/issues/22"
}
|
gharchive/issue
|
DB5 dataset for Protein Interface Prediction is empty
Hi, thanks for this amazing and comprehensive work!
After I download the full dataset from https://www.atom3d.ai/pip.html, I load the DB5 dataset, and print its length which is 0. The DIPS part data is correct.
Thanks for bug report! We've fixed and re-uploaded the DB5 dataset.
|
2025-04-01T06:38:27.618906
| 2016-12-17T16:02:38
|
196228807
|
{
"authors": [
"drphilmarshall",
"jennykim1016"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5486",
"repo": "drphilmarshall/OM10",
"url": "https://github.com/drphilmarshall/OM10/pull/49"
}
|
gharchive/pull-request
|
Lenspop Magnitude Calculation Demo
I have one question about the pull request. May I submit the request when I think the new codes are ready to be reviewed or every time when I am committing changes to my master branch? I am making sure that I am committing all the changes to github whenever I am changing the files in my computer.
I started working on making the diagnostic plots after December 10th, so I had to make this new branch named color to only include the commits that I made before December 10th. This branch has the first version of the lenspop demo notebook, and I am working on my master branch to make the diagnostic plots in addition to fixing some of the features in the first demo notebook.
Thanks!
p.s: I would not want this to be merged with the base OM10 directory yet!
@jennykim1016 Looking at this notebook again, in the light of your more recent work on the diagnostic plots, I wonder if we can use it to demonstrate the difference between the star forming and old stellar population SEDs, as well as showing the contrast with a quasar at z=2. I'll merge this PR and play around a bit, if I can fid some time! :-) Nice to have a basic demo in place though. Thanks!
|
2025-04-01T06:38:27.630804
| 2017-02-24T18:31:38
|
210123795
|
{
"authors": [
"BenoitClaveau",
"freewil"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5487",
"repo": "drudge/node-gpg",
"url": "https://github.com/drudge/node-gpg/issues/23"
}
|
gharchive/issue
|
How use clearsign
The command to sign my file asks me a secret code.
gpg --clearsign file.txt
How to proceed with node-gpg to input my secret code.
Thanks
This is similar to #20 I believe, where you need to input a passpharse.
|
2025-04-01T06:38:27.632446
| 2017-08-04T05:06:18
|
247903776
|
{
"authors": [
"gianm",
"jihoonson",
"lizhanhui",
"vongosling"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5488",
"repo": "druid-io/druid",
"url": "https://github.com/druid-io/druid/pull/4648"
}
|
gharchive/pull-request
|
Update to use latest Apache RocketMQ
RocketMQ has been incubating on Apache for a while and new version has been released thereafter. This PR is to update the plugin in order to be compatible with the latest release of Apache RocketMQ.
Please fix the build failure.
@lizhanhui, the error seems to be an unused import.
[ERROR] /home/travis/build/druid-io/druid/extensions-contrib/druid-rocketmq/src/main/java/io/druid/firehose/rocketmq/RocketMQFirehoseFactory.java:38:8: Unused import - java.util.Iterator. [UnusedImports]
@lizhanhui
|
2025-04-01T06:38:27.695884
| 2017-01-31T23:54:24
|
204459381
|
{
"authors": [
"timriley"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5489",
"repo": "dry-rb/dry-web-roda",
"url": "https://github.com/dry-rb/dry-web-roda/pull/28"
}
|
gharchive/pull-request
|
Test behaviour of newly generated app
Adopt the approach (and much of the support code) taken by hanami in their CLI testing. Now, instead of having a "dummy" app baked into our spec/ dir, we generate a new dry-web-roda app completely from scratch, following the approach a user would take as closely as possible, including bundling all the gems for the app. With the app generated, our first test is to actually boot it up and then ensure a page renders as we expect. This should serve as a helpful smoke test for setup/configuration/dependency issues in the future.
Our next step here would be to throw some extra files into the generated project (migration, relation, repo, routes, extra view) to ensure the whole persistence layer works. (This would be for another PR and another day, though 😄)
Thanks @jodosha for the code and the blog posts sharing this approach!
Adopt the approach (and much of the support code) taken by hanami in their CLI testing. Now, instead of having a "dummy" app baked into our spec/ dir, we generate a new dry-web-roda app completely from scratch, following the approach a user would take as closely as possible, including bundling all the gems for the app. With the app generated, our first test is to actually boot it up and then ensure a page renders as we expect. This should serve as a helpful smoke test for setup/configuration/dependency issues in the future.
Our next step here would be to throw some extra files into the generated project (migration, relation, repo, routes, extra view) to ensure the whole persistence layer works. (This would be for another PR and another day, though 😄)
Thanks @jodosha for the code and the blog posts sharing this approach!
|
2025-04-01T06:38:27.698765
| 2024-05-24T00:22:42
|
2314060937
|
{
"authors": [
"duanhongyi"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5490",
"repo": "drycc/workflow",
"url": "https://github.com/drycc/workflow/issues/53"
}
|
gharchive/issue
|
add volumes:client support
commands:
drycc volumes:client put xxx.file vol://s1/etc
drycc volumes:client get vol://s1/etc/xxx.file
Components:
workflow-cli
controller
controller-go-sdk
Implementation:
generate a temporary filer pod
filer mounting volumes for operation
complete
|
2025-04-01T06:38:27.719219
| 2022-10-23T17:53:00
|
1419872979
|
{
"authors": [
"arpanghosh2416",
"mriganka56"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5491",
"repo": "dsc-iem/Tourist-Guiding-App-Hacktoberfest22",
"url": "https://github.com/dsc-iem/Tourist-Guiding-App-Hacktoberfest22/pull/13"
}
|
gharchive/pull-request
|
Service Layer added on full terms
Service layer features added more.
Thank You @arpanghosh2416 for your contribution
|
2025-04-01T06:38:27.781020
| 2023-08-18T19:23:22
|
1857187136
|
{
"authors": [
"carganillox",
"dsdanielpark"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5492",
"repo": "dsdanielpark/Bard-API",
"url": "https://github.com/dsdanielpark/Bard-API/issues/168"
}
|
gharchive/issue
|
Request Bard API Key
Name : Joms
Email<EMAIL_ADDRESS>Purpose : Educational Purposes only. Thank you! 😇
Hello,
The Bard API package is an unofficial Python package, so we cannot provide an official Google API key.
Please check https://github.com/dsdanielpark/Bard-API#google-palm-api for more information or reach out to Google Cloud services for an official request. As far as I know, currently, only some restricted whitelists have access to the API, so it seems like you are using this package unofficially.
Also, please refer to the Readme for sufficient information related to the Bard API package.
Thank you.
|
2025-04-01T06:38:27.787832
| 2022-05-04T10:04:23
|
1225163841
|
{
"authors": [
"HerkulaasCombrink",
"aidanhorn",
"vukosim"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5493",
"repo": "dsfsi/covid19za",
"url": "https://github.com/dsfsi/covid19za/issues/970"
}
|
gharchive/issue
|
Closing and Archiving the Project
This is an initial Issue on working towards closing and archiving this project. If there are TODOs that you want to create or higlight, please do so and use the milestone
Suggested Data Archiving -> 1 July 2022
Please see above @HerkulaasCombrink @shaze @dmackie @lrossouw @lizelgreyling @elolelo @anelda
You can add ideas for Archiving Prep here https://docs.google.com/document/d/13PkZ5bdyGF4T6kCVG8e58Znl4ieUA7hFiC4PgSMlt5E/edit?usp=sharing
Ask for editing permission.
@HerkulaasCombrink and Jonas are busy with this - in addition to the other team members.
We are starting by consolidating the issues, but then we might open new issues (if they are data related) and we are going to update any information, if needed.
Hello, I see that the NICD media alerts page stopped releasing covid-19 updates after 29 July 2022. There is an official dashboard which gives up-to-date stats, including by province and municipality. Can you please tell me how we can get those numbers in tabular format?
Hey @aidanhorn Can you maybe create a seperate issue with details and pictures and we can then share and ask for any volunteers who can help with something that can fill in whatever is missing.
Hi @vukosim, I have looked at the linked official dashboard, and I don't see how to get historical numbers at a sub-national level. At least if we knew how to do that, then we would know how to ask for volunteer help.
@aidanhorn Actually found the ArcGIS API endpoint, so if someone can extract from there https://gis.nicd.ac.za/server/rest/
|
2025-04-01T06:38:27.792186
| 2021-11-16T21:49:19
|
1055387537
|
{
"authors": [
"dsietz",
"jqnatividad"
],
"license": "apache-2.0",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5494",
"repo": "dsietz/test-data-generation",
"url": "https://github.com/dsietz/test-data-generation/pull/98"
}
|
gharchive/pull-request
|
Performance
Hi @dsietz ,
Hope you don't mind but I took the liberty of doing a quick pass at improving performance.
I squeezed about 10% better performance from my testing with these tweaks.
BTW, I noticed that you do multi-threading in the library, is that something that can be exposed to the Data Sample Parser? Several qsv commands support multi-threading and have CLI options for it to take advantage of additional processors.
If so, let me know so I can create an enhancement issue for it.
Thanks again!
@jqnatividad Thank you very much for the performance improvements and sharing your code. I'm always open to code contributions and feedback. Your changes make sense and are more eloquent than what I had. Going forward, please make sure to create Pull Requests onto the Development branch so that I can integrate it with my own changes before pushing it up to Master.
Unfortunately, I don't have the time necessary to support continued development (enhancements) on this package, however if you'd like to join as a contributor of this package, you are more than welcome. Just let me know.
Hi @dsietz,
Thanks for your prompt reply and merging my enhancements.
And yes, I'd be more than happy to help maintain and enhance test-data-generation!
It fills an underserved need, and I can imagine several enhancements already.
|
2025-04-01T06:38:27.868119
| 2016-03-01T16:41:21
|
137625509
|
{
"authors": [
"dthree",
"looterz"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5495",
"repo": "dthree/vantage",
"url": "https://github.com/dthree/vantage/issues/50"
}
|
gharchive/issue
|
vantage-auth-basic missing
It looks like the repository no longer exists, and the pam repository is abandoned. I plan on implementing public / private-key authentication of my own, would you be interested in a repository link once it's done to add to the readme?
Sure! Thanks!
|
2025-04-01T06:38:27.869343
| 2019-04-24T19:22:52
|
436865989
|
{
"authors": [
"frolosofsky"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5496",
"repo": "dtr-org/unit-e",
"url": "https://github.com/dtr-org/unit-e/pull/1019"
}
|
gharchive/pull-request
|
Reduce circular dependencies in esperanza
Moves ExtractValidatorPubKey and ExtractValidatorAddress to the esperanza/script.cpp
< Circular dependency: esperanza/checks -> esperanza/finalizationstate -> esperanza/checks
< Circular dependency: esperanza/finalizationstate -> validation -> esperanza/finalizationstate
< Circular dependency: consensus/tx_verify -> esperanza/finalizationstate -> validation -> consensus/tx_verify
< Circular dependency: esperanza/finalizationstate -> validation -> finalization/vote_recorder -> esperanza/finalizationstate
Force pushed to resolve conflicts.
|
2025-04-01T06:38:27.887190
| 2024-08-02T18:13:27
|
2445518524
|
{
"authors": [
"HT-7",
"deaveipslon",
"dubrowgn"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5497",
"repo": "dubrowgn/wattz",
"url": "https://github.com/dubrowgn/wattz/issues/43"
}
|
gharchive/issue
|
Add the option in mA
Add the option to display in mA (MILIAMPERE), either positive if it is in the socket charging, or negative if it is out of the socket discharging. And also add the option of the icon update interval.
I initially used dynamic units, but never observed values small enough to actually display mA/mW/etc. This does overlap a bit with another request to increase precision in the main app view, so this might become more relevant.
I initially used dynamic units, but never observed values small enough to actually display mA/mW/etc. This does overlap a bit with another request to increase precision in the main app view, so this might become more relevant.
I'd be very grateful if you could add this option, so it's easier to know how much the charger is charging and how much the phone is draining. Thanks for taking it into consideration.
I was going to request this. Thank you.
|
2025-04-01T06:38:27.908061
| 2024-08-07T21:17:14
|
2454366303
|
{
"authors": [
"JohnHVancouver",
"wearpants",
"wuputah"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5498",
"repo": "duckdb/pg_duckdb",
"url": "https://github.com/duckdb/pg_duckdb/issues/105"
}
|
gharchive/issue
|
security: GUC to block access to local filesystem
Discussed this a long time ago... but currently we allow DuckDB to read from the local filesystem. This is a security risk; the CSV reader is particularly easy to use here since it will read just about any plain text file.
This should instead be controllable via a GUC, default disabled, that can only be enabled by superuser.
Another option would be to restrict it to certain directories?
Yeah, could certainly do that as a further enhancement. My thought was you either are cool with accessing the filesystem (for testing, running on localhost, etc), or you're not (hosted / production environment).
I have production uses for local data (not just dev/testing) & so would like this restricted to certain directories, instead of just on/off
/etc/passwd is a world-readable CSV file 😅
ah, excellent!
|
2025-04-01T06:38:27.912333
| 2023-03-23T10:37:55
|
1637263475
|
{
"authors": [
"CDRussell"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5499",
"repo": "duckduckgo/Android",
"url": "https://github.com/duckduckgo/Android/pull/2996"
}
|
gharchive/pull-request
|
Trigger translations
Task/Issue URL: https://app.asana.com/0/0/1204249256538142/f
Description
Steps to test this PR
Feature 1
[ ]
[ ]
UI changes
Before
After
!(Upload before screenshot)
(Upload after screenshot)
Closing in favor of using https://github.com/duckduckgo/Android/pull/3017 instead
|
2025-04-01T06:38:27.919988
| 2024-08-28T13:33:18
|
2492095779
|
{
"authors": [
"CrisBarreiro"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5500",
"repo": "duckduckgo/Android",
"url": "https://github.com/duckduckgo/Android/pull/4947"
}
|
gharchive/pull-request
|
Replace "Always Ask" with "Ask Every Time"
Task/Issue URL:
Description
Steps to test this PR
Feature 1
[ ]
[ ]
UI changes
Before
After
!(Upload before screenshot)
(Upload after screenshot)
[!WARNING]
This pull request is not mergeable via GitHub because a downstack PR is open. Once all requirements are satisfied, merge this PR as a stack on Graphite.
Learn more
#4947 👈
#4663 : 10 other dependent PRs (#4748 , #4752 , #4780 and 7 others)
develop
This stack of pull requests is managed by Graphite. Learn more about stacking.
Join @CrisBarreiro and the rest of your teammates on Graphite
|
2025-04-01T06:38:27.963711
| 2021-07-15T14:34:31
|
945453515
|
{
"authors": [
"damirka",
"duesee"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5501",
"repo": "duesee/abnf",
"url": "https://github.com/duesee/abnf/pull/18"
}
|
gharchive/pull-request
|
Bump nom to 0.7.0-alpha1, add Cargo.lock
Hi there!
Follow up on PR to https://github.com/duesee/abnf-core/pull/3. Reasons for the bump are described in the abnf-core PR.
Should be merged (if merged) only after abnf-core is updated and published to crates.io because of dependency on bumped version.
Tested on this branch with cargo git imports: https://github.com/damirka/abnf/tree/bump-nom-crate
Hey again :-) Sorry, I am a confused now... The PR title says "0.7.0-alpha1", the commit message says "0.6.5", but the change is to use nom "6.2.1"?
I thought we want to use nom 7.0.0-alpha1?
Could you please clarify what you want to archive and cleanup the commits? Maybe it would be easier to start with the abnf-core crate. Than we can come back here.
Done. Used force to rewrite commit message.
Also minor nit: Could you increase the version to 0.12.0 (due to the same reasons as in abnf-core) and use the commit message "Use nom 7.0.0-alpha1 and abnf-core 0.5.0 to avoid issues with dependencies"? Thank you!
I will merge this as soon as abnf-core 0.5.0 is published on crates.io.
@duesee done!
https://crates.io/crates/abnf/0.12.0 Hope that helps!
|
2025-04-01T06:38:27.965711
| 2024-05-23T20:51:48
|
2313823355
|
{
"authors": [
"dufoli"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5502",
"repo": "dufoli/Salesforce-Inspector-Advanced",
"url": "https://github.com/dufoli/Salesforce-Inspector-Advanced/issues/116"
}
|
gharchive/issue
|
apex: unit test
Unit test
[ ] ability to launch unit test (select or all)
[ ] display result of unit test : ApexTestRunResult global and ApexTestResult/ApexTestQueueItem by method
[ ] display code coverage after test run
cf https://github.com/sorenkrabbe/Chrome-Salesforce-inspector/issues/112
|
2025-04-01T06:38:27.972153
| 2016-06-28T21:29:13
|
162792055
|
{
"authors": [
"duizendnegen",
"jbailey4"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5503",
"repo": "duizendnegen/ember-cli-lazy-load",
"url": "https://github.com/duizendnegen/ember-cli-lazy-load/issues/11"
}
|
gharchive/issue
|
loader.js is not merged in the vendor.js file
When using this addon with an updated version of ember-cli >= 2.3.0-beta.2, the loader.js module is not loaded in the vendor.js file, resulting in a JS error stating that define is not defined.
I believe this is due to an upstream change in ember-cli, which changed the loader.js dep to a node module in package.json. Previously loader.js was a bower component.
See: https://github.com/ember-cli/ember-cli/blob/v2.3.0-beta.2/CHANGELOG.md#230-beta2
Thanks for the heads-up - I updated the readme that an exact ember-cli version of 2.2.0 is needed. I'm open for PRs to improve this, but personally, I am not pushing this project much further and rather looking at ember-engines.
Yeah I definitely agree moving forward with ember-engines is much better long-term. However, our team is desperate for lazy loading and need an implementation for the short term. This addon works great for that use case and doesn't require a ton of changes to the overall app structure.
I have a fix for this issue locally that works with newer versions of ember-cli and would be happy to submit a PR.
Happily accepting a PR! Especially if it supports both ember-cli versions
On Wed, 29 Jun 2016, 17:24 Joshua Bailey<EMAIL_ADDRESS>wrote:
Yeah I definitely agree moving forward with ember-engines is much better
long-term. However, our team is desperate for lazy loading and need an
implementation for the short term. This addon works great for that use case
and doesn't require a ton of changes to the overall app structure.
I have a fix for this issue locally that works with newer versions of
ember-cli and would be happy to submit a PR.
—
You are receiving this because you modified the open/close state.
Reply to this email directly, view it on GitHub
https://github.com/duizendnegen/ember-cli-lazy-load/issues/11#issuecomment-229391792,
or mute the thread
https://github.com/notifications/unsubscribe/AAoow5aR0DWVYfrMbnZ1BxHIICMXIBzcks5qQo5HgaJpZM4JAjoQ
.
|
2025-04-01T06:38:27.974660
| 2015-12-19T03:28:37
|
123058348
|
{
"authors": [
"dularion",
"zinnfamily"
],
"license": "mit",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5504",
"repo": "dularion/streama",
"url": "https://github.com/dularion/streama/issues/108"
}
|
gharchive/issue
|
user invites
I am running streama behind a nginx ssl reverse proxy and the invite urls comeback as Sorry, you're not authorized to view this page. I tried https:// removing http:// also remove the :8080. Nothing works.
Did you make sure to include the https to your base path in your settings? Can you paste the url of a working page and the invite url ?
Where is the base path in your settings?
on your installation, go to Admin -> Settings.
Closing until further notice
|
2025-04-01T06:38:27.979601
| 2023-09-12T09:01:07
|
1892028390
|
{
"authors": [
"cjfff"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5505",
"repo": "dumlj/dumlj-build",
"url": "https://github.com/dumlj/dumlj-build/issues/3"
}
|
gharchive/issue
|
支持 eslint 只引用不声明能力
因为 import/no-extraneous-dependencies 中 packageDir 只会合并依赖,而真正模块可能并没有声明,这样容易导致发布后项目缺少依赖。
例如:项目中有两模块A,B,AB 均没有相互依赖。当 A 依赖声明 lodash,B 引用了 lodash 并且没有声明依赖。这时通过 packageDir: [folder A, folder B], 这时 eslint 会通过检测,而实际 B 需要告警依赖 lodash 并没有声明。
DEMO: https://stackblitz.com/edit/node-4meg4u?file=.eslintrc.js,package.json,packages%2FA%2Findex.js,packages%2FB%2Findex.js,packages%2FB%2Fpacakge.json
I found the solution here, there is the brief introduction for it.
It work for the single repository well but when it comes to the monorepo it provider the option which is the packageDir.
If it includes the all of the repositories in contains in the nomorepo, it'll read the dependencies from another package, that's what we don't need.
So, with this situation we can follow the solution in the issue I posted above.
there is the solution link to https://github.com/dumlj/dumlj-build/pull/4
|
2025-04-01T06:38:28.008505
| 2020-04-16T08:00:16
|
600829108
|
{
"authors": [
"autholykos"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5508",
"repo": "dusk-network/dusk-blockchain",
"url": "https://github.com/dusk-network/dusk-blockchain/issues/412"
}
|
gharchive/issue
|
Maintainer package should use rusk RPC directly
The StakeAutomaton uses the node RPC to handle automation of bid and stake. It should instead use rusk. This also means removing the reference to ristretto.Scalar from the code
Duplicate of #492. Closing
|
2025-04-01T06:38:28.015768
| 2024-10-26T07:12:55
|
2615617947
|
{
"authors": [
"Fraggle",
"flvndvd"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5509",
"repo": "dust-tt/dust",
"url": "https://github.com/dust-tt/dust/pull/8260"
}
|
gharchive/pull-request
|
Upg: stop fetching feature flags on all requests, do it on-demand
Description
Querying the feature flag is the sixth query that cost us the most CPU according to GCP insight.
It's very fast but done 9 millions times / day.
This refactor the code to fetch the feature flags only when needed.
If it's efficient, we should then look into the other query done 9 millions times (active subscription + plan), and then the other ones done 5-3 millions times (such as fetching the groups, the memberships etc..).
Risk
I tested locally. Worst case, the feature behind feature flags are disabled.
Deploy Plan
Deploy front
🔥 🔥
|
2025-04-01T06:38:28.033815
| 2019-07-20T09:48:22
|
470657068
|
{
"authors": [
"nwthomas"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5510",
"repo": "dustinmyers/react-conflux",
"url": "https://github.com/dustinmyers/react-conflux/pull/8"
}
|
gharchive/pull-request
|
Create Documentation Website
Create Documentation Website ⚛️
This pull request updates the conflux repository to contain source code for the documentation website.
Pull Request List 🔥
[x] Install Gatsby
[x] Setup Gatsby configuration files
[x] Configure Node-Sass with global variables, styles, and mixins
[x] Install needed dependencies and plugins in Gatsby
[x] Make landing page for documentation site
[ ] Craft documentation and pull content from CMS such as Contentful
[ ] Complete accessibility features for superb user experience
[ ] Integrate markdown files for documentation along with code snippets
[ ] Implement fuzzy search in search bar
[ ] Update with About.html page
[ ] Finish out Contact.html page in Gatsby
[ ] Make custom 404 page
Closed in lieu of using a different setup for our documentation website.
|
2025-04-01T06:38:28.151334
| 2017-11-15T10:34:05
|
274107633
|
{
"authors": [
"LSAMIJN",
"alinabee",
"aprentis",
"bkanaki",
"dusty-nv",
"e-mily",
"engineer1982",
"flurpo",
"jonwilliams84",
"kanakiyab",
"linusali",
"neildotwilliams",
"nikever",
"niyazFattahov",
"sms720"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5511",
"repo": "dusty-nv/jetson-inference",
"url": "https://github.com/dusty-nv/jetson-inference/issues/160"
}
|
gharchive/issue
|
RTSP source in detectnet-camera example
Hi everybody.
I want to get images from my Hikvision camera. RTSP link is: rtsp://<IP_ADDRESS>:554/Streaming/Channels/102
I have successfully used this pipeline in my OpenCV(3.3.1) app, but it does not work in detectnet-camera example, which i`ve built on my x86_64 PC(Cuda 8.0, OpenCV 3.3.1, TensorRT version 2.1, build 2102
)
I have added this line to code cloned from this repo:
ss<< "rtspsrc location=rtsp://<IP_ADDRESS>:554/Streaming/Channels/102 protocols=udp latency=0 ! decodebin ! videoconvert ! appsink name=mysink ";
Those lines were in the output.
`[gstreamer] initialized gstreamer, version <IP_ADDRESS>
[gstreamer] gstreamer decoder pipeline string:
rtspsrc location=rtsp://<IP_ADDRESS>:554/Streaming/Channels/102 protocols=udp latency=0 ! decodebin ! videoconvert ! appsink name=mysink
detectnet-camera: successfully initialized video device
width: 1280
height: 720
depth: 24 (bpp)
[gstreamer] gstreamer transitioning pipeline to GST_STATE_PLAYING
[gstreamer] gstreamer changed state from NULL to READY ==> mysink
[gstreamer] gstreamer changed state from NULL to READY ==> videoconvert0
[gstreamer] gstreamer changed state from NULL to READY ==> typefind
[gstreamer] gstreamer changed state from NULL to READY ==> decodebin0
[gstreamer] gstreamer changed state from NULL to READY ==> rtspsrc0
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline0
[gstreamer] gstreamer changed state from READY to PAUSED ==> videoconvert0
[gstreamer] gstreamer changed state from READY to PAUSED ==> typefind
[gstreamer] gstreamer msg progress ==> rtspsrc0
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtspsrc0
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline0
[gstreamer] gstreamer msg new-clock ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> videoconvert0
[gstreamer] gstreamer msg progress ==> rtspsrc0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtspsrc0
[gstreamer] gstreamer msg progress ==> rtspsrc0
detectnet-camera: camera open for streaming
detectnet-camera: failed to capture frame
detectnet-camera: failed to convert from NV12 to RGBA
detectNet::Detect( 0x(nil), 1280, 720 ) -> invalid parameters
[cuda] cudaNormalizeRGBA((float4*)imgRGBA, make_float2(0.0f, 255.0f), (float4*)imgRGBA, make_float2(0.0f, 1.0f), camera->GetWidth(), camera->GetHeight())
[cuda] invalid device pointer (error 17) (hex 0x11)
[cuda] /home/aprentis/jetson-inference/detectnet-camera/detectnet-camera.cpp:247
[cuda] registered 14745600 byte openGL texture for interop access (1280x720)
`
What`s wrong? Thanks in advance!
As it says, 'detectnet-camera: failed to convert from NV12 to RGBA'
There is a conversion to RGBAf format from NV12 or RGB in the file gstCamera.cpp: in function ConvertRGBA. Either you modify the check for onboard camera or change your pipeline to use 'nvvidconv' to generate the output in 'NV12' format instead of using videoconvert.
Hope this works.
I had the same issue and managed to fix it using @omaralvarez's Pull Request described here: https://github.com/dusty-nv/jetson-inference/issues/88
After taking in the pull request code, in detectnet-camera I replaced the line:
gstCamera* camera = gstCamera::Create(DEFAULT_CAMERA);
With:
gstPipeline* pipeline = gstPipeline::Create(
"rtspsrc location=rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov ! queue ! rtph264depay ! h264parse ! queue ! omxh264dec ! appsink name=mysink",
240,
160,
12
);
(Replacing the RTSP address, height & width accordingly + #include "gstPipeline.h" at the top)
Yes, that will work fine because he gets rid of the onboard camera check and hence, there is just one conversion method, RGBtoRGBAf which is the one you should use.
Could you post the full code on how to use gstPipeline?
Dear,
is it possible to see full code for this? I'm new to GStreamer and i'm looking for several days already for a simple an clear example how to play a rtsp stream from an IP camera from the imagenet-camera source code.
your help is much appreciated
The full code for gstPipeline is in the Pull Request:
https://github.com/dusty-nv/jetson-inference/pull/93/commits/2717e8914dad03116641247ed2dd9ebc88379d4c
Hi, thank you for the code.
but I have some issues when trying to compile
(original jetson-inference is compiling and working)
I'm running on a jetson TX2 with jetpack 4.2.2.
Can you give me any direction where to start looking for a solution?
Thank you in advance.
nvidia@tx2:~/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/build$ make -j2
[ 5%] Building NVCC (Device) object CMakeFiles/jetson-inference.dir/util/cuda/jetson-inference_generated_cudaYUV-YV12.cu.o
[ 5%] Building NVCC (Device) object CMakeFiles/jetson-inference.dir/jetson-inference_generated_imageNet.cu.o
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
[ 7%] Building NVCC (Device) object CMakeFiles/jetson-inference.dir/util/cuda/jetson-inference_generated_cudaFont.cu.o
:0:7: warning: ISO C++11 requires whitespace after the macro name
[ 10%] Building NVCC (Device) object CMakeFiles/jetson-inference.dir/util/cuda/jetson-inference_generated_cudaNormalize.cu.o
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
[ 12%] Building NVCC (Device) object CMakeFiles/jetson-inference.dir/util/cuda/jetson-inference_generated_cudaOverlay.cu.o
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/util/cuda/cudaOverlay.cu(29): warning: variable "thick" was declared but never referenced
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/util/cuda/cudaOverlay.cu(8): warning: function "eq_less" was declared but never referenced
:0:7: warning: ISO C++11 requires whitespace after the macro name
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/util/cuda/cudaOverlay.cu(29): warning: variable "thick" was declared but never referenced
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/util/cuda/cudaOverlay.cu(8): warning: function "eq_less" was declared but never referenced
:0:7: warning: ISO C++11 requires whitespace after the macro name
[ 15%] Building NVCC (Device) object CMakeFiles/jetson-inference.dir/util/cuda/jetson-inference_generated_cudaRGB.cu.o
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
[ 17%] Building NVCC (Device) object CMakeFiles/jetson-inference.dir/util/cuda/jetson-inference_generated_cudaResize.cu.o
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
[ 20%] Building NVCC (Device) object CMakeFiles/jetson-inference.dir/util/cuda/jetson-inference_generated_cudaYUV-NV12.cu.o
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
[ 22%] Building NVCC (Device) object CMakeFiles/jetson-inference.dir/util/cuda/jetson-inference_generated_cudaYUV-YUYV.cu.o
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
:0:7: warning: ISO C++11 requires whitespace after the macro name
Scanning dependencies of target jetson-inference
[ 25%] Building CXX object CMakeFiles/jetson-inference.dir/detectNet.cpp.o
[ 27%] Building CXX object CMakeFiles/jetson-inference.dir/imageNet.cpp.o
In file included from /home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.h:9:0,
from /home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:5:
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/tensorNet.h:51:46: error: ‘vector’ in namespace ‘std’ does not name a template type
const char* input_blob, const std::vectorstd::string& output_blobs,
^~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/tensorNet.h:51:52: error: expected ‘,’ or ‘...’ before ‘<’ token
const char* input_blob, const std::vectorstd::string& output_blobs,
^
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/tensorNet.h:93:20: error: ‘vector’ in namespace ‘std’ does not name a template type
const std::vectorstd::string& outputs,
^~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/tensorNet.h:93:26: error: expected ‘,’ or ‘...’ before ‘<’ token
const std::vectorstd::string& outputs,
^
In file included from /home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.h:9:0,
from /home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:5:
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/tensorNet.h:170:7: error: ‘vector’ in namespace ‘std’ does not name a template type
std::vector mOutputs;
^~~~~~
In file included from /home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:5:0:
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.h:117:7: error: ‘vector’ in namespace ‘std’ does not name a template type
std::vectorstd::string mClassSynset; // 1000 class ID's (ie n01580077, n04325704)
^~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.h:118:7: error: ‘vector’ in namespace ‘std’ does not name a template type
std::vectorstd::string mClassDesc;
^~~~~~
In file included from /home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:5:0:
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.h: In member function ‘const char* imageNet::GetClassDesc(uint32_t) const’:
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.h:90:68: error: ‘mClassDesc’ was not declared in this scope
inline const char* GetClassDesc( uint32_t index ) const { return mClassDesc[index].c_str(); }
^~~~~~~~~~
In file included from /home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.h:9:0,
from /home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:5:
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/tensorNet.h:51:46: error: ‘vector’ in namespace ‘std’ does not name a template type
const char* input_blob, const std::vectorstd::string& output_blobs,
^~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/tensorNet.h:51:52: error: expected ‘,’ or ‘...’ before ‘<’ token
const char* input_blob, const std::vectorstd::string& output_blobs,
^
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/tensorNet.h:93:20: error: ‘vector’ in namespace ‘std’ does not name a template type
const std::vectorstd::string& outputs,
^~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/tensorNet.h:93:26: error: expected ‘,’ or ‘...’ before ‘<’ token
const std::vectorstd::string& outputs,
^
In file included from /home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.h:9:0,
from /home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:5:
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/tensorNet.h:170:7: error: ‘vector’ in namespace ‘std’ does not name a template type
std::vector mOutputs;
^~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.h:90:68: note: suggested alternative: ‘GetClassDesc’
inline const char* GetClassDesc( uint32_t index ) const { return mClassDesc[index].c_str(); }
^~~~~~~~~~
GetClassDesc
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.h: In member function ‘const char* imageNet::GetClassSynset(uint32_t) const’:
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.h:95:70: error: ‘mClassSynset’ was not declared in this scope
inline const char* GetClassSynset( uint32_t index ) const { return mClassSynset[index].c_str(); }
^~~~~~~~~~~~
In file included from /home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:5:0:
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.h: In member function ‘uint32_t detectNet::GetMaxBoundingBoxes() const’:
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.h:122:56: error: ‘mOutputs’ was not declared in this scope
inline uint32_t GetMaxBoundingBoxes() const { return mOutputs[1].dims.w * mOutputs[1].dims.h * mOutputs[1].dims.c; }
^~~~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.h:95:70: note: suggested alternative: ‘GetClassSynset’
inline const char* GetClassSynset( uint32_t index ) const { return mClassSynset[index].c_str(); }
^~~~~~~~~~~~
GetClassSynset
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.h:122:56: note: suggested alternative: ‘puts’
inline uint32_t GetMaxBoundingBoxes() const { return mOutputs[1].dims.w * mOutputs[1].dims.h * mOutputs[1].dims.c; }
^~~~~~~~
puts
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.h: In member function ‘uint32_t detectNet::GetNumClasses() const’:
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.h:127:51: error: ‘mOutputs’ was not declared in this scope
inline uint32_t GetNumClasses() const { return mOutputs[0].dims.c; }
^~~~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.h:127:51: note: suggested alternative: ‘puts’
inline uint32_t GetNumClasses() const { return mOutputs[0].dims.c; }
^~~~~~~~
puts
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp: In member function ‘bool imageNet::init(const char*, const char*, const char*, const char*, const char*, const char*, uint32_t)’:
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:128:19: error: ‘mOutputs’ was not declared in this scope
mOutputClasses = mOutputs[0].dims.c;
^~~~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp: In static member function ‘static detectNet* detectNet::Create(const char*, const char*, const char*, float, const char*, const char*, const char*, uint32_t)’:
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:56:7: error: ‘vector’ is not a member of ‘std’
std::vectorstd::string output_blobs;
^~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:128:19: note: suggested alternative: ‘output’
mOutputClasses = mOutputs[0].dims.c;
^~~~~~~~
output
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:56:25: error: expected primary-expression before ‘>’ token
std::vectorstd::string output_blobs;
^
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:130:36: error: ‘mClassSynset’ was not declared in this scope
if( !loadClassInfo(class_path) || mClassSynset.size() != mOutputClasses || mClassDesc.size() != mOutputClasses )
^~~~~~~~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:56:27: error: ‘output_blobs’ was not declared in this scope
std::vectorstd::string output_blobs;
^~~~~~~~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:130:36: note: suggested alternative: ‘GetClassSynset’
if( !loadClassInfo(class_path) || mClassSynset.size() != mOutputClasses || mClassDesc.size() != mOutputClasses )
^~~~~~~~~~~~
GetClassSynset
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:130:77: error: ‘mClassDesc’ was not declared in this scope
if( !loadClassInfo(class_path) || mClassSynset.size() != mOutputClasses || mClassDesc.size() != mOutputClasses )
^~~~~~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:56:27: note: suggested alternative: ‘input_blob’
std::vectorstd::string output_blobs;
^~~~~~~~~~~~
input_blob
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:130:77: note: suggested alternative: ‘GetClassDesc’
if( !loadClassInfo(class_path) || mClassSynset.size() != mOutputClasses || mClassDesc.size() != mOutputClasses )
^~~~~~~~~~
GetClassDesc
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp: In member function ‘bool imageNet::loadClassInfo(const char*)’:
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:231:4: error: ‘mClassSynset’ was not declared in this scope
mClassSynset.push_back(a);
^~~~~~~~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp: At global scope:
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:197:29: error: variable or field ‘mergeRect’ declared void
static void mergeRect( std::vector& rects, const float6& rect )
^~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:197:29: error: ‘vector’ is not a member of ‘std’
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:197:42: error: expected primary-expression before ‘>’ token
static void mergeRect( std::vector& rects, const float6& rect )
^
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:197:45: error: ‘rects’ was not declared in this scope
static void mergeRect( std::vector& rects, const float6& rect )
^~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:197:45: note: suggested alternative: ‘gets’
static void mergeRect( std::vector& rects, const float6& rect )
^~~~~
gets
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:197:52: error: expected primary-expression before ‘const’
static void mergeRect( std::vector& rects, const float6& rect )
^~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp: In member function ‘bool detectNet::Detect(float*, uint32_t, uint32_t, float*, int*, float*)’:
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:247:43: error: ‘mOutputs’ was not declared in this scope
void* inferenceBuffers[] = { mInputCUDA, mOutputs[OUTPUT_CVG].CUDA, mOutputs[OUTPUT_BBOX].CUDA };
^~~~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:231:4: note: suggested alternative: ‘GetClassSynset’
mClassSynset.push_back(a);
^~~~~~~~~~~~
GetClassSynset
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:232:4: error: ‘mClassDesc’ was not declared in this scope
mClassDesc.push_back(b);
^~~~~~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:247:43: note: suggested alternative: ‘puts’
void* inferenceBuffers[] = { mInputCUDA, mOutputs[OUTPUT_CVG].CUDA, mOutputs[OUTPUT_BBOX].CUDA };
^~~~~~~~
puts
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:232:4: note: suggested alternative: ‘GetClassDesc’
mClassDesc.push_back(b);
^~~~~~~~~~
GetClassDesc
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:245:4: error: ‘mClassSynset’ was not declared in this scope
mClassSynset.push_back(a);
^~~~~~~~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:267:49: error: ‘class nvinfer1::Dims3’ has no member named ‘w’
const float cell_width = /width/ mInputDims.w / ow;
^
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:268:50: error: ‘class nvinfer1::Dims3’ has no member named ‘h’
const float cell_height = /height/ mInputDims.h / oh;
^
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:270:56: error: ‘class nvinfer1::Dims3’ has no member named ‘w’
const float scale_x = float(width) / float(mInputDims.w);
^
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:271:57: error: ‘class nvinfer1::Dims3’ has no member named ‘h’
const float scale_y = float(height) / float(mInputDims.h);
^
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:280:7: error: ‘vector’ is not a member of ‘std’
std::vector< std::vector > rects;
^~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:280:20: error: ‘vector’ is not a member of ‘std’
std::vector< std::vector > rects;
^~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:280:33: error: expected primary-expression before ‘>’ token
std::vector< std::vector > rects;
^
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:280:35: error: expected primary-expression before ‘>’ token
std::vector< std::vector > rects;
^
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:280:37: error: ‘rects’ was not declared in this scope
std::vector< std::vector > rects;
^~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:245:4: note: suggested alternative: ‘GetClassSynset’
mClassSynset.push_back(a);
^~~~~~~~~~~~
GetClassSynset
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:246:4: error: ‘mClassDesc’ was not declared in this scope
mClassDesc.push_back(str);
^~~~~~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:280:37: note: suggested alternative: ‘gets’
std::vector< std::vector > rects;
^~~~~
gets
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/detectNet.cpp:307:6: error: ‘mergeRect’ was not declared in this scope
mergeRect( rects[z], make_float6(x1, y1, x2, y2, coverage, z) );
^~~~~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:246:4: note: suggested alternative: ‘GetClassDesc’
mClassDesc.push_back(str);
^~~~~~~~~~
GetClassDesc
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:252:56: error: ‘mClassSynset’ was not declared in this scope
printf("imageNet -- loaded %zu class info entries\n", mClassSynset.size());
^~~~~~~~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:252:56: note: suggested alternative: ‘GetClassSynset’
printf("imageNet -- loaded %zu class info entries\n", mClassSynset.size());
^~~~~~~~~~~~
GetClassSynset
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp: In member function ‘int imageNet::Classify(float*, uint32_t, uint32_t, float*)’:
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:286:43: error: ‘mOutputs’ was not declared in this scope
void* inferenceBuffers[] = { mInputCUDA, mOutputs[0].CUDA };
^~~~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:286:43: note: suggested alternative: ‘puts’
void* inferenceBuffers[] = { mInputCUDA, mOutputs[0].CUDA };
^~~~~~~~
puts
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:303:49: error: ‘mClassDesc’ was not declared in this scope
printf("class %04zu - %f (%s)\n", n, value, mClassDesc[n].c_str());
^~~~~~~~~~
/home/nvidia/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/imageNet.cpp:303:49: note: suggested alternative: ‘GetClassDesc’
printf("class %04zu - %f (%s)\n", n, value, mClassDesc[n].c_str());
^~~~~~~~~~
GetClassDesc
CMakeFiles/jetson-inference.dir/build.make:125: recipe for target 'CMakeFiles/jetson-inference.dir/detectNet.cpp.o' failed
make[2]: *** [CMakeFiles/jetson-inference.dir/detectNet.cpp.o] Error 1
make[2]: *** Waiting for unfinished jobs....
CMakeFiles/jetson-inference.dir/build.make:149: recipe for target 'CMakeFiles/jetson-inference.dir/imageNet.cpp.o' failed
make[2]: *** [CMakeFiles/jetson-inference.dir/imageNet.cpp.o] Error 1
CMakeFiles/Makefile2:67: recipe for target 'CMakeFiles/jetson-inference.dir/all' failed
make[1]: *** [CMakeFiles/jetson-inference.dir/all] Error 2
Makefile:129: recipe for target 'all' failed
make: *** [all] Error 2
nvidia@tx2:~/jetson-inference-2717e8914dad03116641247ed2dd9ebc88379d4c/build$
Hi,
I managed to get a RTSP stream from an IP camera with pipeline.
But I don't get how to make it work with the gstCamera that is used in the jetson-inference example
how should i make the pipeline as video source of the gstCamera?
I'm really sorry if i ask supid questions but this is completely new to me.
When I start program now, I have my IP camera live image on top of the onboard camera of my jetson...
complete c++ code:
#include "gstCamera.h"
#include "glDisplay.h"
#include "detectNet.h"
#include "commandLine.h"
#include <signal.h>
bool signal_recieved = false;
void sig_handler(int signo)
{
if( signo == SIGINT )
{
printf("received SIGINT\n");
signal_recieved = true;
}
}
int usage()
{
printf("usage: detectnet-camera [-h] [--network NETWORK] [--threshold THRESHOLD]\n");
printf(" [--camera CAMERA] [--width WIDTH] [--height HEIGHT]\n\n");
printf("Locate objects in a live camera stream using an object detection DNN.\n\n");
printf("optional arguments:\n");
printf(" --help show this help message and exit\n");
printf(" --network NETWORK pre-trained model to load (see below for options)\n");
printf(" --overlay OVERLAY detection overlay flags (e.g. --overlay=box,labels,conf)\n");
printf(" valid combinations are: 'box', 'labels', 'conf', 'none'\n");
printf(" --alpha ALPHA overlay alpha blending value, range 0-255 (default: 120)\n");
printf(" --camera CAMERA index of the MIPI CSI camera to use (e.g. CSI camera 0),\n");
printf(" or for VL42 cameras the /dev/video device to use.\n");
printf(" by default, MIPI CSI camera 0 will be used.\n");
printf(" --width WIDTH desired width of camera stream (default is 1280 pixels)\n");
printf(" --height HEIGHT desired height of camera stream (default is 720 pixels)\n");
printf(" --threshold VALUE minimum threshold for detection (default is 0.5)\n\n");
printf("%s\n", detectNet::Usage());
return 0;
}
int main( int argc, char** argv )
{
/*
* parse command line
*/
commandLine cmdLine(argc, argv);
if( cmdLine.GetFlag("help") )
return usage();
/*
* attach signal handler
*/
if( signal(SIGINT, sig_handler) == SIG_ERR )
printf("\ncan't catch SIGINT\n");
/**Added by me for rtsp streaming */
GstElement *pipeline;
GstBus *bus;
GstMessage *msg;
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Build the pipeline */
pipeline = gst_parse_launch ("playbin uri=rtsp://admin:admin@<IP_ADDRESS>:554/cam/realmonitor?channel=1&subtype=1&unicast=true&proto=Onvif",NULL);
/* Start playing IP Camera */
gst_element_set_state (pipeline, GST_STATE_PLAYING);
/*
How to set pipeline as gstCamera source instead of onboard cam of TX2?
*/
/* create the camera device */
gstCamera* camera = gstCamera::Create(cmdLine.GetInt("width", gstCamera::DefaultWidth),cmdLine.GetInt("height", gstCamera::DefaultHeight),0);
if( !camera )
{
printf("\ndetectnet-camera: failed to initialize camera device\n");
return 0;
}
printf("\ndetectnet-camera: successfully initialized camera device\n");
printf(" width: %u\n", camera->GetWidth());
printf(" height: %u\n", camera->GetHeight());
printf(" depth: %u (bpp)\n\n", camera->GetPixelDepth());
/*
* create detection network
*/
detectNet* net = detectNet::Create(argc, argv);
if( !net )
{
printf("detectnet-camera: failed to load detectNet model\n");
return 0;
}
// parse overlay flags
const uint32_t overlayFlags = detectNet::OverlayFlagsFromStr(cmdLine.GetString("overlay", "box,labels,conf"));
/*
* create openGL window
*/
glDisplay* display = glDisplay::Create();
if( !display )
printf("detectnet-camera: failed to create openGL display\n");
/*
* start streaming
*/
if( !camera->Open() )
{
printf("detectnet-camera: failed to open camera for streaming\n");
return 0;
}
printf("detectnet-camera: camera open for streaming\n");
/*
* processing loop
*/
float confidence = 0.0f;
while( !signal_recieved )
{
// capture RGBA image
float* imgRGBA = NULL;
if( !camera->CaptureRGBA(&imgRGBA, 1000) )
printf("detectnet-camera: failed to capture RGBA image from camera\n");
// detect objects in the frame
detectNet::Detection* detections = NULL;
const int numDetections = net->Detect(imgRGBA, camera->GetWidth(), camera->GetHeight(), &detections, overlayFlags);
if( numDetections > 0 )
{
printf("%i objects detected\n", numDetections);
for( int n=0; n < numDetections; n++ )
{
printf("detected obj %i class #%u (%s) confidence=%f\n", n, detections[n].ClassID, net->GetClassDesc(detections[n].ClassID), detections[n].Confidence);
printf("bounding box %i (%f, %f) (%f, %f) w=%f h=%f\n", n, detections[n].Left, detections[n].Top, detections[n].Right, detections[n].Bottom, detections[n].Width(), detections[n].Height());
}
}
// update display
if( display != NULL )
{
// render the image
display->RenderOnce(imgRGBA, camera->GetWidth(), camera->GetHeight());
// update the status bar
char str[256];
sprintf(str, "TensorRT %i.%i.%i | %s | Network %.0f FPS", NV_TENSORRT_MAJOR, NV_TENSORRT_MINOR, NV_TENSORRT_PATCH, precisionTypeToStr(net->GetPrecision()), net->GetNetworkFPS());
display->SetTitle(str);
// check if the user quit
if( display->IsClosed() )
signal_recieved = true;
}
// print out timing info
net->PrintProfilerTimes();
}
/*
* destroy resources
*/
printf("detectnet-camera: shutting down...\n");
SAFE_DELETE(camera);
SAFE_DELETE(display);
SAFE_DELETE(net);
printf("detectnet-camera: shutdown complete.\n");
return 0;
}
I've updated the jetson inference code with alinabee revisions, when I run sudo make I get the following error:
QMutex: No such file or directory
#include compilation terminated.
Any idea how to get this dependency installed on my Jetson Nano?
Thanks
I also get this:
QMutex: No such file or directory
#include compilation terminated.
Any ideas gratefully received!
Here is what I did to get my external Hikvision IP camera feed working with detectnet example.
## Install v4l2loopback utilities and kernel driver
$ sudo apt install v4l2loopback-utils v4l2loopback-dkms
## Install ffmpeg
$ sudo apt install ffmpeg
## Load the v4l2loopback driver. This will create /dev/video0 device
$ sudo modprobe v4l2loopback
## Using ffmpeg pull rtsp stream from camera and push it to the video device created by
## v4l2loopback kernel module.
$ ffmpeg -thread_queue_size 512 \
-i<EMAIL_ADDRESS>\
-vcodec rawvideo -vf scale=640:480 -f v4l2 \
-threads 0 -pix_fmt yuyv422 /dev/video0
## Now you can run the detectnet or imagenet example against the video device
## make sure to match height and width specified in ffmpeg command here.
## I was getting gstreamer error when the size is not matched.
$ detectnet-camera.py \
--network=ped-100 \
--width=640 --height=360 \
--camera=/dev/video0 \
--threshold=1.8 --overlay=box
I tried that also...I've been trying to get IP camera to work for quite a while with no luck...I followed your patch and go a gstreamer error below..The stream runs great in browser but nothing on my TX2. .Any ideas? Any Hints? Best,
Sudo modprobe v4l2loopback
ffmpeg -thread_queue_size 512 -i http://<IP_ADDRESS>:8888/ir.mjpeg -vcodec rawvideo -vf scale=320:240 -f v4l2 -threads 0 -pix_fmt yuv420p /dev/video1
./detectnet-camera --width=320 --height=240 --camera=/dev/video1 --overlay=box
OpenGL] glDisplay -- X screen 0 resolution: 1280x1024
[OpenGL] glDisplay -- display device initialized
[gstreamer] opening gstCamera for streaming, transitioning pipeline to GST_STATE_PLAYING
[gstreamer] gstreamer changed state from NULL to READY ==> mysink
[gstreamer] gstreamer changed state from NULL to READY ==> videoconvert1
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter1
[gstreamer] gstreamer changed state from NULL to READY ==> videoconvert0
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter0
[gstreamer] gstreamer changed state from NULL to READY ==> v4l2src0
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline0
[gstreamer] gstreamer changed state from READY to PAUSED ==> videoconvert1
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter1
[gstreamer] gstreamer changed state from READY to PAUSED ==> videoconvert0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter0
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> v4l2src0
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline0
[gstreamer] gstreamer msg new-clock ==> pipeline0
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer msg stream-start ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> videoconvert1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> videoconvert0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> v4l2src0
[gstreamer] gstCamera onEOS
[gstreamer] gstreamer v4l2src0 ERROR Internal data stream error.
[gstreamer] gstreamer Debugging info: gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
[gstreamer] gstreamer changed state from READY to PAUSED ==> mysink
detectnet-camera: camera open for streaming
Here is what I did to get my external Hikvision IP camera feed working with detectnet example.
## Install v4l2loopback utilities and kernel driver
$ sudo apt install v4l2loopback-utils v4l2loopback-dkms
## Install ffmpeg
$ sudo apt install ffmpeg
## Load the v4l2loopback driver. This will create /dev/video0 device
$ sudo modprobe v4l2loopback
## Using ffmpeg pull rtsp stream from camera and push it to the video device created by
## v4l2loopback kernel module.
$ ffmpeg -thread_queue_size 512 \
-i<EMAIL_ADDRESS>\
-vcodec rawvideo -vf scale=640:480 -f v4l2 \
-threads 0 -pix_fmt yuyv422 /dev/video0
## Now you can run the detectnet or imagenet example against the video device
## make sure to match height and width specified in ffmpeg command here.
## I was getting gstreamer error when the size is not matched.
$ detectnet-camera.py \
--network=ped-100 \
--width=640 --height=360 \
--camera=/dev/video0 \
--threshold=1.8 --overlay=box
I use the stream from my Raspberry Pi running MotionEyeOS using this method. Thanks for the solution. I did tweak the ffmpeg command line slightly.
$ ffmpeg -thread_queue_size 512 -i http://<IP_ADDRESS>:8081/ -vcodec rawvideo -vf scale=1280:720 -f v4l2 -threads 0 -pix_fmt yuyv422 /dev/video2
$ detectnet-camera.py
--network=ped-100
--width=1280 --height=720
--camera=/dev/video2
--threshold=1.8 --overlay=box
Here is what I did to get my external Hikvision IP camera feed working with detectnet example.
## Install v4l2loopback utilities and kernel driver
$ sudo apt install v4l2loopback-utils v4l2loopback-dkms
## Install ffmpeg
$ sudo apt install ffmpeg
## Load the v4l2loopback driver. This will create /dev/video0 device
$ sudo modprobe v4l2loopback
## Using ffmpeg pull rtsp stream from camera and push it to the video device created by
## v4l2loopback kernel module.
$ ffmpeg -thread_queue_size 512 \
-i<EMAIL_ADDRESS>\
-vcodec rawvideo -vf scale=640:480 -f v4l2 \
-threads 0 -pix_fmt yuyv422 /dev/video0
## Now you can run the detectnet or imagenet example against the video device
## make sure to match height and width specified in ffmpeg command here.
## I was getting gstreamer error when the size is not matched.
$ detectnet-camera.py \
--network=ped-100 \
--width=640 --height=360 \
--camera=/dev/video0 \
--threshold=1.8 --overlay=box
That worked fine for me, thanks! But I had some issues that were solved wrapping up the rtsp url with "rtsp://...". Here is a video of everything working together: https://youtu.be/PLBffle0CcQ
Hello, how can I put rtsp in "videoOutput"? I need steram detectnet from jetson nano over lan by RTSP.
Hi @niyazFattahov, jetson-inference/jetson-utils doesn't support RTSP output, as it requires support for special server code. Otherwise I would add it if it were simple.
Note that DeepStream supports RTSP output and has support for the RTSP server if you need that.
What about this: https://github.com/GStreamer/gst-rtsp-server/blob/1.14.5/examples/test-launch.c
can it be used somehow?
I tested rtsp only from csi camera with gstreamer like this:
./test-launch "videotestsrc ! nvvidconv ! nvv4l2h264enc ! h264parse ! rtph264pay name=pay0 pt=96"
(https://forums.developer.nvidia.com/t/jetson-nano-faq/82953)
thank you.
In theory, yes, some similar code / dependencies would need to be integrated into the videoOutput class in order to support RTSP output.
(## Install v4l2loopback utilities and kernel driver
$ sudo apt install v4l2loopback-utils v4l2loopback-dkms
Install ffmpeg
$ sudo apt install ffmpeg
Load the v4l2loopback driver. This will create /dev/video0 device
$ sudo modprobe v4l2loopback
Using ffmpeg pull rtsp stream from camera and push it to the video device created by
v4l2loopback kernel module.
$ ffmpeg -thread_queue_size 512
-i<EMAIL_ADDRESS>-vcodec rawvideo -vf scale=640:480 -f v4l2
-threads 0 -pix_fmt yuyv422 /dev/video0
Now you can run the detectnet or imagenet example against the video device
make sure to match height and width specified in ffmpeg command here.
I was getting gstreamer error when the size is not matched.
$ detectnet-camera.py
--network=ped-100
--width=640 --height=360
--camera=/dev/video0
--threshold=1.8 --overlay=box #)
@neildotwilliams @dusty-nv
I've tried this on Jetson TX2 but I got this error
(aititx2-2@aititx22-desktop:~$ ffmpeg -thread_queue_size 2631 -i "rtsp://aititx2-2@aititx22-desktop@<IP_ADDRESS>:1935/profile" -vcodec rawvideo -vf scale=1280:720 -f v4l2 -threads 0 -pix_fmt yuyv422 /dev/video0
ffmpeg version 3.4.8-0ubuntu0.2 Copyright (c) 2000-2020 the FFmpeg developers
built with gcc 7 (Ubuntu/Linaro 7.5.0-3ubuntu1~18.04)
configuration: --prefix=/usr --extra-version=0ubuntu0.2 --toolchain=hardened --libdir=/usr/lib/aarch64-linux-gnu --incdir=/usr/include/aarch64-linux-gnu --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-librsvg --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
libavutil 55. 78.100 / 55. 78.100
libavcodec 57.107.100 / 57.107.100
libavformat 57. 83.100 / 57. 83.100
libavdevice 57. 10.100 / 57. 10.100
libavfilter 6.107.100 / 6.107.100
libavresample 3. 7. 0 / 3. 7. 0
libswscale 4. 8.100 / 4. 8.100
libswresample 2. 9.100 / 2. 9.100
libpostproc 54. 7.100 / 54. 7.100
[rtsp @ 0x559b6b26b0] max delay reached. need to consume packet
[rtsp @ 0x559b6b26b0] RTP: missed 2 packets
[rtsp @ 0x559b6b26b0] max delay reached. need to consume packet
[rtsp @ 0x559b6b26b0] RTP: missed 2 packets
[rtsp @ 0x559b6b26b0] max delay reached. need to consume packet
[rtsp @ 0x559b6b26b0] RTP: missed 1 packets
Input #0, rtsp, from 'rtsp://aititx2-2@aititx22-desktop@<IP_ADDRESS>:1935/profile':
Metadata:
title : Unnamed
comment : N/A
Duration: N/A, start: 0.000000, bitrate: N/A
Stream #0:0: Video: h264 (Constrained Baseline), yuv420p(progressive), 720x1280, 90k tbr, 90k tbn, 180k tbc
Stream #0:1: Audio: aac (LC), 32000 Hz, stereo, fltp
Stream mapping:
Stream #0:0 -> #0:0 (h264 (native) -> rawvideo (native))
Press [q] to stop, [?] for help
[v4l2 @ 0x559b7204e0] Frame rate very high for a muxer not efficiently supporting it.
Please consider specifying a lower framerate, a different muxer or -vsync 2
[v4l2 @ 0x559b7204e0] ioctl(VIDIOC_G_FMT): Invalid argument
Could not write header for output file #0 (incorrect codec parameters ?): Invalid argument
Error initializing output stream 0:0 --
Conversion failed!
)
please help! I use RSTP to connect to my android phone camera.
|
2025-04-01T06:38:28.170100
| 2020-10-23T05:45:53
|
727932005
|
{
"authors": [
"dutscher",
"mkay581"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5512",
"repo": "dutscher/stencil-storybook",
"url": "https://github.com/dutscher/stencil-storybook/issues/3"
}
|
gharchive/issue
|
Does it work with latest Stencil v1?
Hello @dutscher and thank you so much for creating this package! :heart:
I see that it supports Stencil v2 but our organization can't quite move to v2 just yet, so we're on the latest version of v1. Would this package work with the latest v1?
Thanks for any information.
storybook runs alone and stencil is connected via proxy. so i promise you that your "old" stencil will work.
give it a try, clone and set down the stencil version in package.json. thr sample component and scss should run also under older version.
Ah okay, that's great. I'll give it a try!
please let me know if it really works :)
|
2025-04-01T06:38:28.172332
| 2022-06-23T20:19:22
|
1282894004
|
{
"authors": [
"duzda",
"lm41"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5513",
"repo": "duzda/deezer-enhanced",
"url": "https://github.com/duzda/deezer-enhanced/issues/31"
}
|
gharchive/issue
|
[Feature Request] Set output device
It would be pretty awesome, if deezer-enhanced would provide the posibility to select the output device. I have the GoXLR, which has 5 Output channels and everytime the audio starts again I have to go to pavoucontrol and select the music channel. (Default is System).
I see, sounds great, I'll look into it!
This issue is probably upstream.
https://github.com/electron/electron/issues/27581
https://github.com/electron/electron/issues/7470
I'm closing this because the error never occurred again. I don't know what it was, but it seems to be gone now.
|
2025-04-01T06:38:28.210213
| 2022-04-20T03:09:22
|
1209104738
|
{
"authors": [
"dvanoni",
"fredericky123"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5514",
"repo": "dvanoni/notero",
"url": "https://github.com/dvanoni/notero/issues/58"
}
|
gharchive/issue
|
can we make the structure of notion the same as zotero collections?
Right now, all items are in the same folder in Notero, can we make the structure of notion the same as zotero collections
(The same item can exist in different folders)
Hi @fredericky123, thanks for the feedback!
There was some discussion around this topic in #6, and the decision was made to sync to a single Notion database for now. To achieve something similar to what you're asking, the idea we had was to sync a Collection property into Notion as described in #30. Then you could make as many different views of that database as you wanted based on the Collection property or any combination of other properties. This felt like the more "Notion way" of doing things.
Do you think having a Collection property in Notion as described in #30 would meet your needs?
Thanks @dvanoni ! This is exactly what I want. Have it already been realized? How to configure it?
By the way, when enable the collections to sync, can we have a toogle to sync all collections? Right now, we have to choose each single collection, since I have a lot of collections, it seems time-consuming to do this.
@fredericky123, unfortunately this functionality isn't built yet. Keep an eye on #30, and I'll be sure to post an update when it's ready.
I pulled out your second question into its own issue so we can track that separately: #63
I'm going to close this issue as I believe we should have these points covered by #30 and #63. Feel free to add your input on those!
|
2025-04-01T06:38:28.223851
| 2024-05-19T05:06:25
|
2304432637
|
{
"authors": [
"LebronXierunfeng",
"dian1414"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5515",
"repo": "dvlab-research/LongLoRA",
"url": "https://github.com/dvlab-research/LongLoRA/issues/185"
}
|
gharchive/issue
|
Something wrong with the torch version
I followed the steps in readme but I encounted the following errors in SFT.
[WARNING] async_io requires the dev libaio .so object and headers but these were not found.
[WARNING] async_io: please install the libaio-dev package with apt
[WARNING] If libaio is already installed (perhaps from source), try setting the CFLAGS and LDFLAGS environment variables to where it can be found.
[WARNING] Please specify the CUTLASS repo directory as environment variable $CUTLASS_PATH
[WARNING] sparse_attn requires a torch version >= 1.5 and < 2.0 but detected 2.3
[WARNING] using untested triton version (2.2.0), only 1.0.0 is known to be compatible
[2024-05-19 04:59:05,039] [INFO] [comm.py:637:init_distributed] cdb=None
[2024-05-19 04:59:05,082] [INFO] [real_accelerator.py:203:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2024-05-19 04:59:05,108] [INFO] [real_accelerator.py:203:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[WARNING] async_io requires the dev libaio .so object and headers but these were not found.
[WARNING] async_io requires the dev libaio .so object and headers but these were not found.
[WARNING] async_io: please install the libaio-dev package with apt
[WARNING] If libaio is already installed (perhaps from source), try setting the CFLAGS and LDFLAGS environment variables to where it can be found.
[WARNING] Please specify the CUTLASS repo directory as environment variable $CUTLASS_PATH
[WARNING] sparse_attn requires a torch version >= 1.5 and < 2.0 but detected 2.3
[WARNING] using untested triton version (2.2.0), only 1.0.0 is known to be compatible
[WARNING] async_io: please install the libaio-dev package with apt
[WARNING] If libaio is already installed (perhaps from source), try setting the CFLAGS and LDFLAGS environment variables to where it can be found.
[WARNING] Please specify the CUTLASS repo directory as environment variable $CUTLASS_PATH
[WARNING] sparse_attn requires a torch version >= 1.5 and < 2.0 but detected 2.3
[WARNING] using untested triton version (2.2.0), only 1.0.0 is known to be compatible
[rank5]: Traceback (most recent call last):
[rank5]: File "/home/zhangyan/Dynathink/LongLoRA/fine-tune.py", line 211, in
[rank5]: train()
[rank5]: File "/home/zhangyan/Dynathink/LongLoRA/fine-tune.py", line 106, in train
[rank5]: model_args, training_args = parser.parse_args_into_dataclasses()
[rank5]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank5]: File "/home/zhangyan/anaconda3/envs/longlora/lib/python3.12/site-packages/transformers/hf_argparser.py", line 347, in parse_args_into_dataclasses
[rank5]: raise ValueError(f"Some specified arguments are not used by the HfArgumentParser: {remaining_args}")
[rank5]: ValueError: Some specified arguments are not used by the HfArgumentParser: [' ', ' ', ' ', ' ', ' ', ' ', ' ', ' ', ' ', ' ', ' ', ' ', ' ', ' ', ' ', ' ', ' ', ' ', ' ', ' ', ' ', ' ', ' ']
In requirements.txt the version of torch is >=2.0.0, but the errors show that I need to install torch<2.0.0. How can I deal with the problem?
the same problem
|
2025-04-01T06:38:28.255755
| 2024-10-27T11:41:07
|
2616536081
|
{
"authors": [
"AqibAliMughal",
"dwip708"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5516",
"repo": "dwip708/EcoFusion",
"url": "https://github.com/dwip708/EcoFusion/pull/23"
}
|
gharchive/pull-request
|
Chore: Improved the overall cart UI including empty cart
Issue # 15
I divided the main section into 80-20%
Implemented bootstrap card styling to make it more attractive.
Iterated the same items in the pricing module to reflect the pricing more clearly.
PREVIOUSLY
NOW
(1/2)
(2/2)
Please have a look at the PR
Great work @AqibAliMughal.
|
2025-04-01T06:38:28.259612
| 2020-05-09T10:25:02
|
615143734
|
{
"authors": [
"dword1511",
"rmalchow",
"tilllt"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5517",
"repo": "dword1511/raspiraw",
"url": "https://github.com/dword1511/raspiraw/issues/2"
}
|
gharchive/issue
|
is the new "high quality" camera supported?
hi,
rpi2dng barfs with images from the new raspberry camera. is this supposed to work? here's a raw (captured with raspi-still --raw) in case this helps.
https://rm.ignorelist.com/s/yrWY8362a6wNmjC
.rm
the sensor is a "IMX477R"
Not yet... Will look into it when I have time
Here are some infos about other tools that do the dng conversion from the HQ Camera Raw Images: https://www.raspberrypi.org/forums/viewtopic.php?p=1665327#p1665327
|
2025-04-01T06:38:28.325505
| 2018-04-10T10:24:25
|
312862188
|
{
"authors": [
"despo"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5518",
"repo": "dxw/teacher-vacancy-service",
"url": "https://github.com/dxw/teacher-vacancy-service/pull/112"
}
|
gharchive/pull-request
|
Feature/26 fix payscale ordering
updates pay scale labels
retrieves in correct order
Fix pay_scale ordering
Correct naming of pay scale options
|
2025-04-01T06:38:28.326276
| 2023-01-23T11:05:58
|
1552901521
|
{
"authors": [
"dy1ng"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5519",
"repo": "dy1ng/teamcity-documentation",
"url": "https://github.com/dy1ng/teamcity-documentation/pull/104"
}
|
gharchive/pull-request
|
Update preinstalled software list for TCC agents
This PR was created by an automated build in TeamCity Cloud
Closing this PR as obsolete, there is a more recent one
|
2025-04-01T06:38:28.364351
| 2024-09-12T01:17:46
|
2521106978
|
{
"authors": [
"dylansnyk"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5520",
"repo": "dylansnyk/backstage",
"url": "https://github.com/dylansnyk/backstage/pull/4195"
}
|
gharchive/pull-request
|
[Snyk] Security upgrade @backstage/theme from 0.0.0-use.local to 0.1.1
Snyk has created this PR to fix 4 vulnerabilities in the yarn dependencies of this project.
Snyk changed the following file(s):
plugins/example-todo-list/package.json
Note for zero-installs users
If you are using the Yarn feature zero-installs that was introduced in Yarn V2, note that this PR does not update the .yarn/cache/ directory meaning this code cannot be pulled and immediately developed on as one would expect for a zero-install project - you will need to run yarn to update the contents of the ./yarn/cache directory.
If you are not using zero-install you can ignore this as your flow should likely be unchanged.
⚠️ Warning
Failed to update the yarn.lock, please update manually before merging.
Vulnerabilities that will be fixed with an upgrade:
Issue
Score
Asymmetric Resource Consumption (Amplification) SNYK-JS-BODYPARSER-7926860
112
Cross-site Scripting SNYK-JS-EXPRESS-7926867
80
Cross-site Scripting SNYK-JS-SEND-7926862
80
Cross-site Scripting SNYK-JS-SERVESTATIC-7926865
80
[!IMPORTANT]
Check the changes in this PR to ensure they won't cause issues with your project.
Max score is 1000. Note that the real score may have changed since the PR was raised.
This PR was automatically created by Snyk using the credentials of a real user.
Note: You are seeing this because you or someone else with access to this repository has authorized Snyk to open fix PRs.
For more information:
🧐 View latest project report
📜 Customise PR templates
🛠 Adjust project settings
📚 Read about Snyk's upgrade logic
Learn how to fix vulnerabilities with free interactive lessons:
🦉 Cross-site Scripting
🎉 Snyk hasn't found any issues so far.
✅ code/snyk check is completed. No issues were found. (View Details)
|
2025-04-01T06:38:28.378067
| 2024-10-06T20:14:14
|
2568872863
|
{
"authors": [
"dylansnyk"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5521",
"repo": "dylansnyk/backstage",
"url": "https://github.com/dylansnyk/backstage/pull/4621"
}
|
gharchive/pull-request
|
[Snyk] Security upgrade @backstage/plugin-auth-node from 0.0.0-use.local to 0.1.0
Snyk has created this PR to fix 1 vulnerabilities in the yarn dependencies of this project.
Snyk changed the following file(s):
plugins/catalog-backend/package.json
Note for zero-installs users
If you are using the Yarn feature zero-installs that was introduced in Yarn V2, note that this PR does not update the .yarn/cache/ directory meaning this code cannot be pulled and immediately developed on as one would expect for a zero-install project - you will need to run yarn to update the contents of the ./yarn/cache directory.
If you are not using zero-install you can ignore this as your flow should likely be unchanged.
⚠️ Warning
Failed to update the yarn.lock, please update manually before merging.
Vulnerabilities that will be fixed with an upgrade:
Issue
Score
Cross-site Scripting (XSS) SNYK-JS-COOKIE-8163060
44
[!IMPORTANT]
Check the changes in this PR to ensure they won't cause issues with your project.
Max score is 1000. Note that the real score may have changed since the PR was raised.
This PR was automatically created by Snyk using the credentials of a real user.
Note: You are seeing this because you or someone else with access to this repository has authorized Snyk to open fix PRs.
For more information:
🧐 View latest project report
📜 Customise PR templates
🛠 Adjust project settings
📚 Read about Snyk's upgrade logic
Learn how to fix vulnerabilities with free interactive lessons:
🦉 Cross-site Scripting (XSS)
🎉 Snyk hasn't found any issues so far.
✅ security/snyk check is completed. No issues were found. (View Details)
✅ license/snyk check is completed. No issues were found. (View Details)
|
2025-04-01T06:38:28.396954
| 2023-10-23T22:59:50
|
1958166786
|
{
"authors": [
"bhelx",
"thomasdarimont"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5522",
"repo": "dylibso/chicory",
"url": "https://github.com/dylibso/chicory/issues/52"
}
|
gharchive/issue
|
Initial benchmark setup
It is advisable to have solid benchmarks from the beginning to quickly and easily assess the performance impact of different implementations.
For this we could leverage jmh.
Some example JMH integrations can be found here: https://github.com/FasterXML/jackson-benchmarks/tree/2.15
The idea of this issue is to provide an initial setup to enable the project to write easy and reliable micro benchmarks.
I started a branch for this this morning. We've been mostly using the tests to measure performance but i think it makes more sense to get a basic micro benchmark setup working. JMH will provide more control and also i'd like to separate out things like parsing, compilation, and runtime performance. Will ping you when i get a PR started @danielperano
I think this has been solved by https://github.com/dylibso/chicory/pull/280 in the meantime.
|
2025-04-01T06:38:28.403534
| 2023-08-25T19:17:30
|
1867570947
|
{
"authors": [
"sebastian-correa"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5523",
"repo": "dynaconf/dynaconf",
"url": "https://github.com/dynaconf/dynaconf/pull/989"
}
|
gharchive/pull-request
|
Improve Configuration and Validation sections of the docs
Summary
Fixes #911 and #968 by adding documentation on some Dynaconf parameters and reworking the Validation section of the docs.
Details
Not much detail to give, as context is present in the 2 linked issues.
Aside from the obvious additions to configuration.md and validation.md, I updated the Pyhon version in CONTRIBUTING.md to 3.8+ and added .DS_Store file to .gitignore. In particular, any of these 2 unrelated changes could be dropped from this PR if needed :).
Questions
Links between Markdown files don't seem to work on my local docs (make docs). Did I do something wrong or do they get fixed when deployed to the web?
Notes
Please feel free to ask any changes in this PR. The idea is to improve Dynaconf.
Maybe I have to do absolute links (without the ./ at the start?
Thanks @pedro-psb for your input and patience. I pushed some changes, you can look at the diff here.
Please don't hesitate to re-ask for changes, and please mark as solved any discussions you think should be closed.
Hey @pedro-psb, I'll look at your comments during the weekend. My bad with the links though, I could've sworn that I went through all of them :S.
@pedro-psb sorry this is taking so long, I don't have much time mid-week to tackle this PR :S. I've now re-read and it looks like its better now. I'll look check all links from the dev deploy once it's ready.
|
2025-04-01T06:38:28.424543
| 2022-05-09T18:15:35
|
1230077784
|
{
"authors": [
"andynataco"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5524",
"repo": "dynata/rex-sdk-python",
"url": "https://github.com/dynata/rex-sdk-python/pull/12"
}
|
gharchive/pull-request
|
md formatting
Added some python syntax highlighting to code blocks.
@ginotrombetti can you review this when you get a chance?
|
2025-04-01T06:38:28.425804
| 2021-06-15T18:07:06
|
921664882
|
{
"authors": [
"LucasHocker",
"TechShady"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5525",
"repo": "dynatrace-oss/DynatraceDashboardPowerups",
"url": "https://github.com/dynatrace-oss/DynatraceDashboardPowerups/issues/57"
}
|
gharchive/issue
|
Add color override for OOTB Honeycomb Tiles
When viewing certain honeycomb tiles, like synthetics, the chart displays only green/red based on the success/failure of the synthetic monitor. If a monitor is disabled and it's last run was successful, the cell shows green. This request is to override that green color for disabled monitor (last run was good) or red color for disabled monitor (last run was bad) to grey. This would depend on the status of the monitor. The field is called status and I can filter the chart on that, but not sure if "status" is exposed where the powerup could just color those cells grey. Hopefully it is...
this doesn't seem possible without making some sort of api call
|
2025-04-01T06:38:28.516623
| 2023-07-06T16:38:49
|
1791886272
|
{
"authors": [
"imartinezortiz"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5526",
"repo": "e-ucm/docker-limesurvey",
"url": "https://github.com/e-ucm/docker-limesurvey/issues/6"
}
|
gharchive/issue
|
KC and mariadb upgrade
Check following resources:
https://github.com/keycloak/keycloak/tree/main/quarkus/container
https://github.com/keycloak/keycloak-containers/tree/main/docker-compose-examples
https://github.com/MariaDB/mariadb-docker/ issues/94
https://mariadb.org/mariadb-server-docker-official-images-healthcheck-without-mysqladmin/
https://github.com/MariaDB/mariadb-docker/ pull/508
Try to send a PR to upstream now that seems that the project is active again:
https://github.com/Frankniesten/Limesurvey-SAML-Authentication/ pull/6
Beware that LS seems to create responses' tables using MyISAM but seems that now can be configured:
https://github.com/LimeSurvey/LimeSurvey/ pull/1043
https://mariadb.com/kb/en/converting-tables-from-myisam-to-innodb/
|
2025-04-01T06:38:28.644335
| 2021-02-18T16:41:28
|
811270249
|
{
"authors": [
"alissonalberini",
"neldreth2021"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5527",
"repo": "eKoopmans/html2pdf.js",
"url": "https://github.com/eKoopmans/html2pdf.js/issues/396"
}
|
gharchive/issue
|
Downloading Dashboard Comprised of Google Charts extends past width of PDF
I have a simple div tag denoted with the id: dashboard and my code is as the following:
//Download PDF of Dashboard
$('#options-menu-download-as-pdf').on('click', function () {
var dashboard = document.getElementById('dashboard');
var options = {
filename: 'Report.pdf',
html2canvas: {
scale: 2
},
pagebreak: {
mode: ['avoid-all', 'css', 'legacy']
},
jsPDF: {
format: [500, 200],
unit: 'mm',
orientation: 'landscape'
}
}
html2pdf().from(dashboard).set(options).save();
});
This does everything I would like except I would like to be able to assign the width and height dynamically (I can do this already by window size or the current h/w of the dashboard in HTML but that could lead to poorly rendered PDFs if the user has their browser window not at full screen).
It appears as though the charts themselves cannot be manipulated to fit within the h/w of the pdf so the get cut off, whereas my grid styling actually fits to the dimensions of he document.
I looked at this thread: https://github.com/eKoopmans/html2pdf.js/issues/44 that discusses a 'fit-to-width' option. I am missing an example of that in documentation so I am not sure if that is an option with html2pdf, the underlying jsPDF, or even a CSS option.
How about: $("#Elementd).css("height", "297mm");
Or have you already found a solution?
|
2025-04-01T06:38:28.651028
| 2024-08-20T05:57:49
|
2474781250
|
{
"authors": [
"JesusPoderoso",
"SunLiangcan"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5528",
"repo": "eProsima/Fast-DDS-python",
"url": "https://github.com/eProsima/Fast-DDS-python/issues/182"
}
|
gharchive/issue
|
'HelloWorldPubSubType' object has no attribute 'setName'. Did you mean: 'set_name'?
E:\test\build\Release>python HelloWorldExample.py -p publisher
Creating publisher.
Traceback (most recent call last):
File "E:\test\build\Release\HelloWorldExample.py", line 206, in
writer = Writer(args.domain, args.machine)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\test\build\Release\HelloWorldExample.py", line 113, in init
self.topic_data_type.setName("HelloWorldDataType")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'HelloWorldPubSubType' object has no attribute 'setName'. Did you mean: 'set_name'?
Hi @SunLiangcan, thanks for your report.
I am assuming you are using the Fast DDS python version from master branch.
Fast DDS, and Fast DDS Python, are near to a mayor release (v3.0.0 and v2.0.0, respectively). That entails several API breaks and refactors. For that reason, and until these versions are released, it is strongly advisable to use the latest stable branch, 2.14.x (and python v1.4.2).
It seems that performing the TopicDataType refactor in the python repository we missed updating that example.
We will fix it and come back to you, sorry for the inconvenience.
ok,use 1.4.2 no problem
Hi @SunLiangcan, we have already fixed it (#183) in master
|
2025-04-01T06:38:28.652211
| 2021-12-21T09:10:03
|
1085601499
|
{
"authors": [
"JLBuenoLopez-eProsima",
"MiguelCompany"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5529",
"repo": "eProsima/Fast-DDS-statistics-backend",
"url": "https://github.com/eProsima/Fast-DDS-statistics-backend/pull/141"
}
|
gharchive/pull-request
|
[13307] is_metatraffic documentation
#126 extended public API but no documentation was included. This PR solves the issue.
Merge after #140 (only last commit is relevant)
@JLBuenoLopez-eProsima I think this one has to be rebased
|
2025-04-01T06:38:28.867009
| 2017-09-20T09:32:23
|
259099521
|
{
"authors": [
"2hf",
"zhaochao"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5530",
"repo": "eayunstack/trove",
"url": "https://github.com/eayunstack/trove/pull/43"
}
|
gharchive/pull-request
|
Avoid deleting parent backup failed caused by 404
Catch NotFound exception in trove API level, and return directly to avoid dropping out while deleting parent backup that has multiple children backups.
Signed-off-by: Fan Zhang<EMAIL_ADDRESS>
commit message 里加上 issue 的信息。
|
2025-04-01T06:38:28.869039
| 2022-08-10T05:00:33
|
1334070532
|
{
"authors": [
"SamTheGeek",
"ebaauw"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5531",
"repo": "ebaauw/homebridge-zp",
"url": "https://github.com/ebaauw/homebridge-zp/issues/198"
}
|
gharchive/issue
|
Allow settings-only mode
There should be a way to only show settings (and even user-defined lists of settings) in HomeKit as a multi-switch accessory.
This would allow voice control of settings such as speech enhancement or night mode without requiring any other visibility of Sonos in HK.
Ideally, I would be able to configure SonosZP to show a single tile that allowed me to turn on or off Night Mode and Speech Enhancement and any subwoofer I had paired to a given zone/surround setup. This would allow me to automate the "evening" Sonos settings along with lights and other HK devices.
You can change the settings from Siri: use Eve or another decent HomeKit app to create a scene with the setting. Then recall that scene from Siri.
Only exposing a single setting is beyond the scope of Homebridge ZP. If you don’t like to see the accessories exposed by Homebridge ZP, don’t use that plugin. You could probably use a plugin like homebridge-commander to expose dummy switches that issue zp commands to change a setting.
|
2025-04-01T06:38:28.870770
| 2022-09-16T02:45:45
|
1375323559
|
{
"authors": [
"aakash-priyadarshi",
"ebankoff"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5532",
"repo": "ebankoff/Beast_Bomber",
"url": "https://github.com/ebankoff/Beast_Bomber/issues/48"
}
|
gharchive/issue
|
Country number is not supported
Hi, I was trying to use the SMS spammer on my own number. But it seems my Indian number is not supported. Can you help me with adding my country code to the script? What changes I must do and where.
Can you help me with adding my country code to the script? What changes I must do and where.
Hello, it is unlikely that I will be able to add support for Indian phone numbers🤷
|
2025-04-01T06:38:28.872286
| 2023-10-02T18:36:48
|
1922427406
|
{
"authors": [
"BrutalBirdie",
"louim"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5533",
"repo": "ebbba-org/ansible-role-coturn",
"url": "https://github.com/ebbba-org/ansible-role-coturn/pull/7"
}
|
gharchive/pull-request
|
Allow port 80 for certificate renewal
Let's Encrypt requires port 80 to be open for certificate renewal. This commit adds port 80 to the firewall rules for coturn when using TLS.
OH! Sorry somehow I missed this PR completely.
Will work on it right away.
Moved to PR #10
|
2025-04-01T06:38:28.880879
| 2021-10-21T09:33:50
|
1032271056
|
{
"authors": [
"ESapenaVentura",
"aaclan-ebi",
"ke4",
"ofanobilbao"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5534",
"repo": "ebi-ait/hca-ebi-wrangler-central",
"url": "https://github.com/ebi-ait/hca-ebi-wrangler-central/issues/526"
}
|
gharchive/issue
|
Fix BAM to FASTQ - d5410c6e-612d-421a-a66f-2de5e04dd050
SOP: https://ebi-ait.github.io/hca-ebi-wrangler-central/SOPs/update_ena_runs_SOP.html
I started to download the files from DCP Data Browser (https://data.humancellatlas.org/explore/projects/abe1a013-af7a-45ed-8c26-f3793c24a1f4/get-curl-command) to the EBI cluster into the nfs/production/hca/d5410c6e-612d-421a-a66f-2de5e04dd050 folder. 126 files/ 3.09 TB so it is going to take a while.
Started checksum generation on the EBI cluster to /nfs/production/hca/d5410c6e-612d-421a-a66f-2de5e04dd050/md5_checksum.txt.
Started uploading the files to ENA's FTP server into /d5410c6e-612d-421a-a66f-2de5e04dd050 folder.
Checksum calculation is done and stored here: /nfs/production/hca/d5410c6e-612d-421a-a66f-2de5e04dd050/md5_checksum.txt.
Files has been submitted to ENA and we got back the accessions:
<?xml version='1.0' encoding='UTF-8'?>
<RECEIPT receiptDate="2021-11-29T16:08:45.085Z" submissionFile="submission.xml" success="true">
<RUN accession="ERR7441145" alias="sequencingRun_90b34757-474f-42c3-9d31-683a5a0a84bd_1" status="PRIVATE" />
<RUN accession="ERR7441146" alias="sequencingRun_0f14c412-5014-4ac0-9a71-858b2f047777_1" status="PRIVATE" />
<RUN accession="ERR7441147" alias="sequencingRun_082e87ac-5cf6-4bad-bedb-5f6591b8f566_1" status="PRIVATE" />
<RUN accession="ERR7441148" alias="sequencingRun_157ba915-28d7-4d80-89ad-71c8291dbc05_1" status="PRIVATE" />
<RUN accession="ERR7441149" alias="sequencingRun_cc5a78a1-539f-4dec-80b6-62f35dcafd89_1" status="PRIVATE" />
<RUN accession="ERR7441150" alias="sequencingRun_01f7c3d0-d4be-432d-aa25-8c7fbce20b49_1" status="PRIVATE" />
<RUN accession="ERR7441151" alias="sequencingRun_dc31f31d-ab56-4025-9834-99be638a2d50_1" status="PRIVATE" />
<RUN accession="ERR7441152" alias="sequencingRun_b6dec4a6-2d9b-40ac-80c4-41dce01aea46_1" status="PRIVATE" />
<RUN accession="ERR7441153" alias="sequencingRun_51fb7eb7-a422-482e-a98e-c9e6f9628e97_1" status="PRIVATE" />
<RUN accession="ERR7441154" alias="sequencingRun_d8c08782-6f69-4314-947c-1afe6928cbce_1" status="PRIVATE" />
<RUN accession="ERR7441155" alias="sequencingRun_b3ce1085-08dc-42ff-a609-6968315327a8_1" status="PRIVATE" />
<RUN accession="ERR7441156" alias="sequencingRun_6a0f0064-ba67-43d6-985e-68d8edcf8c0b_1" status="PRIVATE" />
<RUN accession="ERR7441157" alias="sequencingRun_afd0ea55-e710-4b46-bb05-2423e491b6f5_1" status="PRIVATE" />
<RUN accession="ERR7441158" alias="sequencingRun_13a062ba-2b8e-43a1-bc2a-bc17f650b37d_1" status="PRIVATE" />
<RUN accession="ERR7441159" alias="sequencingRun_6d273f72-f55c-4c8e-b91e-29e762194c3f_1" status="PRIVATE" />
<RUN accession="ERR7441160" alias="sequencingRun_37cad11b-c8c9-4d1f-b715-498b0f8d4b35_1" status="PRIVATE" />
<RUN accession="ERR7441161" alias="sequencingRun_0b52914d-687b-44d1-9a70-a95df55ed502_1" status="PRIVATE" />
<RUN accession="ERR7441162" alias="sequencingRun_3a20b6a5-6652-4486-86bc-842c7c31c343_1" status="PRIVATE" />
<RUN accession="ERR7441163" alias="sequencingRun_44b8ad82-1109-4543-a534-a85b34c2c301_1" status="PRIVATE" />
<RUN accession="ERR7441164" alias="sequencingRun_548a75b4-ba45-4700-b7bb-656c3995c316_1" status="PRIVATE" />
<RUN accession="ERR7441165" alias="sequencingRun_2c2c943c-1c0e-462c-b630-8a91a1f0fb94_1" status="PRIVATE" />
<RUN accession="ERR7441166" alias="sequencingRun_83b474d3-c20f-48f6-95a0-b0fa2269f14d_1" status="PRIVATE" />
<SUBMISSION accession="ERA7498444" alias="SUBMISSION-29-11-2021-16:08:43:157" />
<MESSAGES />
<ACTIONS>ADD</ACTIONS>
</RECEIPT>
Filed ticket to ENA HelpDesk: [ENA DATA STATUS #549358]
Submitter: Broker
Name: Karoly Erdos
Email<EMAIL_ADDRESS>CCEmails<EMAIL_ADDRESS>Subject: Suppressing old ENA runs
Query is related to: Suppression
I work on: Humans
Organisms classification: Not applicable
The work is: Other/not sure (Raw sequencing reads)
Message Body:
Hi,
Could you please suppress the following ENA runs from this study (ERP120466 / PRJEB37165):
ERR4336830
ERR4336831
ERR4336832
ERR4336833
ERR4336834
ERR4336835
ERR4336836
ERR4336837
ERR4336838
ERR4336839
ERR4336840
ERR4336841
ERR4336842
ERR4336843
ERR4336844
ERR4336845
ERR4336846
ERR4336847
ERR4336848
ERR4336849
ERR4336850
ERR4336851
These were BAM files and they had been replaced with FASTQ files.
Please DO NOT delete/suppress the experiment and the new FASTQ files.
Many thanks,
Karoly
We have to monitor ENA when are the new FASTQ files are available in the browser. Normally it takes at least 48 hours.
Project to check: https://www.ebi.ac.uk/ena/browser/view/PRJEB37165
Bam files have not yet been deleted as per Monday Dec 6th, 10:20 AM
@ke4 to confirm if bam files are deleted in the ENA study today.
|
2025-04-01T06:38:28.882308
| 2021-02-17T12:12:04
|
810129393
|
{
"authors": [
"K-r-ll",
"amalik01"
],
"license": "CC-BY-4.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5535",
"repo": "ebi-chebi/ChEBI",
"url": "https://github.com/ebi-chebi/ChEBI/issues/3999"
}
|
gharchive/issue
|
Wrong ontology for norbornene and premutilin
These relationships are wrong because alkenes are acyclic while norbornene and premutilin are cyclic.
norbornene (CHEBI:52286) is a alkene (CHEBI:32878)
premutilin (CHEBI:142455) is a alkene
Thanks. Now corrected.
|
2025-04-01T06:38:28.912799
| 2019-08-30T01:45:26
|
487245288
|
{
"authors": [
"progital"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5536",
"repo": "ecadlabs/tezos-ts",
"url": "https://github.com/ecadlabs/tezos-ts/pull/32"
}
|
gharchive/pull-request
|
#9 Get block
Not ready for merging yet.
Created PR for reviewing and discussion.
@carte7000 done
|
2025-04-01T06:38:28.929179
| 2022-10-28T17:58:21
|
1427597676
|
{
"authors": [
"benjoz"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5537",
"repo": "echoboomer/incident-bot",
"url": "https://github.com/echoboomer/incident-bot/issues/35"
}
|
gharchive/issue
|
No incident recorded in the database
it seems that the incidents are not saved in DB
A critical error is visible in the logs
INFO:bot.incident.incident:Creating incident channel: inc-202210281751-test-4
ERROR:bot.incident.incident:Error sending message to incident digest channel: The request to the Slack API failed. (url: https://www.slack.com/api/chat.postMessage)
The server responded with: {'ok': False, 'error': 'not_in_channel'}
INFO:bot.incident.incident:Sending message to digest channel for: inc-202210281751-test-4
INFO:bot.incident.incident:Writing incident entry to database for inc-202210281751-test-4...
CRITICAL:bot.incident.incident:Error writing entry to database: local variable 'digest_message' referenced before assignment
ERROR:bot.models.incident:Incident update failed for inc-202210281751-test-4: No row was found when one was required```
False positive : incident-bot was not added to the incident digest channel, which means it was not able to post a message.
Then everything failed afterward
|
2025-04-01T06:38:28.934356
| 2024-08-15T21:35:03
|
2469009018
|
{
"authors": [
"blaubaer",
"coveralls"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5538",
"repo": "echocat/slf4g",
"url": "https://github.com/echocat/slf4g/pull/32"
}
|
gharchive/pull-request
|
Fix link error related to Go 1.23 release
Since release of Go 1.23 go:linkname is a problem and the hook into testing.(*common).logDepth does not longer work. Therefore (although ugly), we have to disable this feature inside of github.com/echocat/slf4g/sdk/testlog for now.
Pull Request Test Coverage Report for Build<PHONE_NUMBER>7
Details
0 of 0 changed or added relevant lines in 0 files are covered.
No unchanged relevant lines lost coverage.
Overall coverage remained the same at 97.987%
Totals
Change from base Build<PHONE_NUMBER>6:
0.0%
Covered Lines:
3992
Relevant Lines:
4074
💛 - Coveralls
|
2025-04-01T06:38:28.938077
| 2023-03-16T14:14:02
|
1627587574
|
{
"authors": [
"FrankSchnicke"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5539",
"repo": "eclipse-aas4j/aas4j-model-generator",
"url": "https://github.com/eclipse-aas4j/aas4j-model-generator/issues/22"
}
|
gharchive/issue
|
Update GitHub Actions
Due to the migration, some actions do not work anymore
Currently, we have three actions:
Deploy to maven central: Is obsolete due to mandatory Jenkins usage --> Delete
Deploy to java Model: Is still needed. However, in the current state, it is still referencing the admin shell io packages --> Update
Run Tests: is still needed. Most likely an update to also cover the main branch is sufficient.
|
2025-04-01T06:38:28.941735
| 2023-04-20T22:55:37
|
1677527448
|
{
"authors": [
"marco-miller",
"santimchp"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5540",
"repo": "eclipse-cdt-cloud/theia-trace-extension",
"url": "https://github.com/eclipse-cdt-cloud/theia-trace-extension/issues/964"
}
|
gharchive/issue
|
"org.eclipse.tracecompass.incubator.trace.server.jersey.rest.core.id" could not be found in the registry
Hi,
Just trying to follow the steps on the README file to run a demo on a Linux machine using the Trace extension on Theia.
I cloned the repo
I have the prerequisite of Java 11 as required
I completed the section Build the extension and example application
And the problem arise on the next step on Try the trace extension section, when I run the command: yarn start:server then the following error message is saved on the log file at (theia-trace-extension/trace-compass-server/configuration/):
!ENTRY org.eclipse.osgi 4 0 2023-04-20 23:02:54.573
!MESSAGE Application error
!STACK 1
java.lang.RuntimeException: Application "org.eclipse.tracecompass.incubator.trace.server.jersey.rest.core.id" could not be found in the registry. The applications available are: org.eclipse.equinox.app.error.
at org.eclipse.equinox.internal.app.EclipseAppContainer.startDefaultApp(EclipseAppContainer.java:252)
at org.eclipse.equinox.internal.app.MainApplicationLauncher.run(MainApplicationLauncher.java:33)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:136)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:104)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:402)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:255)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:659)
at org.eclipse.equinox.launcher.Main.basicRun(Main.java:596)
at org.eclipse.equinox.launcher.Main.run(Main.java:1467)
at org.eclipse.equinox.launcher.Main.main(Main.java:1440)
What is the error about and the solution for it?
Thank you.
Thanks for having reported this. I just pushed a PR (linked herein) to fix the Java version in the README. Using 17 instead of 11 works locally for me, and should be the required version AFAIK.
|
2025-04-01T06:38:29.021071
| 2024-10-25T18:40:07
|
2614866290
|
{
"authors": [
"rpoet-jh"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5541",
"repo": "eclipse-pass/pass-support",
"url": "https://github.com/eclipse-pass/pass-support/pull/129"
}
|
gharchive/pull-request
|
Update README files for default prop values cleanup
As part of https://github.com/eclipse-pass/main/issues/1066
@markpatton I realized I have to fix the ITs for pass-support after merging the pass-core change for default prop values. I'll fix them and then reopen this PR.
Apparently github has some issues with reopened PRs and rebase and merge. Closed this PR and opened a new PR https://github.com/eclipse-pass/pass-support/pull/130
|
2025-04-01T06:38:29.044266
| 2023-12-13T19:50:06
|
2040365890
|
{
"authors": [
"stephanbcbauer"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5542",
"repo": "eclipse-tractusx/eclipse-tractusx.github.io",
"url": "https://github.com/eclipse-tractusx/eclipse-tractusx.github.io/issues/563"
}
|
gharchive/issue
|
fix(tutorials): link to perquisites doesn't work
What
There is a link to prerequisites in the Skills required section. This link doesn't work, and ends up in a Page not found.
Why
The link doesn't work.
closed by #623
|
2025-04-01T06:38:29.051297
| 2024-09-20T15:40:29
|
2539119953
|
{
"authors": [
"Usmanfee",
"manojava-gk"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5543",
"repo": "eclipse-tractusx/portal-frontend",
"url": "https://github.com/eclipse-tractusx/portal-frontend/pull/1134"
}
|
gharchive/pull-request
|
fix(service marketplace): list all active services
Description
Service marketplace should have all the active services but now with implemented solution we can see all the services.
Changelog entry:
- fixed service marketplace to display all active available servicesp[#1143](https://github.com/eclipse-tractusx/portal-frontend/issues/1143)
Why
Service Marketplace hasn't have the all the active services available throughout from all the service provider. Now, for each service provider marketplace we can observe all the services.
Issue
https://github.com/eclipse-tractusx/portal-frontend/issues/1143
Checklist
[x] I have performed a self-review of my own code
[x] I have successfully tested my changes locally
@ma3u @oyo can you please review this PR ?
currently, I can't add you in this PR as reviewer. However, I have already created the ticket to become tractusx contributor: https://github.com/eclipse-tractusx/sig-infra/issues/547 . Thank you!
@evegufy I have updated the dependency file but the error message still persists and not any clue with the error message.
signal need to update DEPENDENCIES.
@manojava-gk I have introduced enum for frequently used string for sorting type . can you have a look please ? :)
@Usmanfee Just to maintain consistency use PascalCase instead. See other examples in the page.
@manojava-gk we are also use camelcase format in e.g. PAGES and OVERLAYS enum . I am using camelcase since this is the pattern backend support.
@Usmanfee one more thing, constants file is a place where we host most common things in the app. I do not think this sorting type is a generic one. it is very specific to the back end api. I prefer to define this in the specific api types's file.
In future we can define this in the constants when all apis sorting types are unified.
CC: @oyo
@manojava-gk I have updated the code based on your suggestion. could you please have a look now ? Thank you :)
Looks fine now
@oyo Thanks for your feedback :) . I have resolved your comments with suggested changes.
@evegufy I have reverted changelog changes. @oyo could you please re-approve and merge it again ? Thank you
|
2025-04-01T06:38:29.054912
| 2023-05-23T15:34:37
|
1722324452
|
{
"authors": [
"evegufy",
"oyo"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5544",
"repo": "eclipse-tractusx/portal-frontend",
"url": "https://github.com/eclipse-tractusx/portal-frontend/pull/34"
}
|
gharchive/pull-request
|
chore(docker hub): clean up after registry move
Description
change registry in package.json
Why
moved to docker hub
Issue
https://github.com/eclipse-tractusx/portal-frontend/pull/19
Checklist
[x] I have performed a self-review of my own code
[x] I have successfully tested my changes locally
@oyo could you please check if this script is still needed? It contains references to the old registry.
@oyo could you please check if this script is still needed? It contains references to the old registry.
@evegufy That is a very old script initially meant to build and push images to the azure container registry that we had before ghcr. It's not used any more and we can delete it.
|
2025-04-01T06:38:29.057275
| 2023-10-18T05:34:02
|
1948862756
|
{
"authors": [
"jjeroch",
"stefan-ettl",
"stephanbcbauer"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5545",
"repo": "eclipse-tractusx/sig-release",
"url": "https://github.com/eclipse-tractusx/sig-release/issues/246"
}
|
gharchive/issue
|
Separate Issuer Component (removed from MIW)
Description
Remove Issuer Component from MIW.
Issuer Component to be transferred to separate component,
Harmonize approach with other dataspaces, GAIA-X and EDC and reduce restrictions.
Impact
Additional information
[ ] I'm willing to contribute to this feature
Won't do in PI11
replaced with new feature https://github.com/eclipse-tractusx/sig-release/issues/416
|
2025-04-01T06:38:29.061069
| 2023-10-24T14:33:53
|
1959403673
|
{
"authors": [
"FaGru3n",
"igorsvetlov"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5546",
"repo": "eclipse-tractusx/sig-release",
"url": "https://github.com/eclipse-tractusx/sig-release/issues/318"
}
|
gharchive/issue
|
Story "Introduce Supply Chain Domain Ontology"
# Story "Introduce Supply Chain Domain Ontology"
Repository https://github.com/catenax-ng/product-ontology
Statement
As an ESS use case developer, I want to have a supply-chain domain ontology such that I can formulate my use case roles, permissions and skills with a vocabulary based on BPN entities paired with abstract material flows.
Acceptance Criteria
Domain Ontology Exists
Domain Ontology Validates
Estimation
5 SP
Originally posted by @drcgjung in https://github.com/eclipse-tractusx/sig-release/issues/280#issuecomment-1770735415
wrong issue place @igorsvetlov could you please raise the issue under https://github.com/catenax-ng/product-ontology
@FaGru3n please re-open for now. We'll discuss with @drcgjung where to place this one and other related stories.
|
2025-04-01T06:38:29.063509
| 2023-07-24T04:44:11
|
1817606614
|
{
"authors": [
"richaashara",
"shijinrajbosch",
"tunacicek"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5547",
"repo": "eclipse-tractusx/sldt-semantic-hub",
"url": "https://github.com/eclipse-tractusx/sldt-semantic-hub/issues/138"
}
|
gharchive/issue
|
Update esmf-sdk version
update esmf-sdk version to 2.2.3
PR:139 will close this issue.
Hi @richaashara ,
thanks for your feedback and PR. We will take a look over the PR.
Hi @richaashara ,
Thanks for the PR. This PR is incomplete since unit tests are failing.
We are also working on ESMF SDK vesion update.
Updated to 2.4.2. Issue will be closed.
|
2025-04-01T06:38:29.068627
| 2023-08-14T12:39:34
|
1849697512
|
{
"authors": [
"almadigabor",
"ds-mwesener"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5548",
"repo": "eclipse-tractusx/traceability-foss",
"url": "https://github.com/eclipse-tractusx/traceability-foss/issues/241"
}
|
gharchive/issue
|
fix: get rid of ghcr.io image references
I've fount multiple docker image links referencing ghcr.io registry. Please get rid of them and reference the DockerHub images instead. Some examples:
trivy.yml worfklow
values.yaml in main chart
values.yaml in backend and frontend chart
It's okay to build for both registries but DockerHub is preferred and should be used in cases mentioned above. See TRG4.05.
Hi @almadigabor I removed the main references and only left the ghcr repository in cases where we build on both registry based on repository. Although the 3 mentioned have been changed!
Please find attached links:
Trivy: https://github.com/eclipse-tractusx/traceability-foss/blob/main/.github/workflows/trivy.yml
Backend Values: https://github.com/eclipse-tractusx/traceability-foss/blob/main/charts/traceability-foss/charts/backend/values.yaml
Frontend Values: https://github.com/eclipse-tractusx/traceability-foss/blob/main/charts/traceability-foss/charts/frontend/values.yaml
Main Values: https://github.com/eclipse-tractusx/traceability-foss/blob/main/charts/traceability-foss/values.yaml
Please let me know if you need anything else.
Thanks in advance!
Hey! Looks good!
|
2025-04-01T06:38:29.070911
| 2024-07-30T23:00:26
|
2438808641
|
{
"authors": [
"gregmedd",
"stevenhartley"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5549",
"repo": "eclipse-uprotocol/up-spec",
"url": "https://github.com/eclipse-uprotocol/up-spec/issues/209"
}
|
gharchive/issue
|
Clarify requirements of L3 services and L3 clients
The current up-l3 specs combine the requirements for the service with the requirements for clients communicating with those services. We have found it difficult to parse what we are expected to implement for client code in up-cpp vs what is a behavior expected of the service. For example, it is not clear in the USubscription spec whether the state diagram is something the client does when subscribing to a topic, or if it represents an internal state per-topic within the service.
I would recommend splitting the specs into separate documents. One could focus on the requirements of the service itself, such as its states, behaviors, inputs, and outputs. The other(s) could cover how clients interact with those services - what RPC methods are available, what data they're expected to send, what steps they need to follow, are they expected to subscribe for notifications, etc. The client specs could be defined entirely in terms of layer 2 components and operations, abstracting any protocol details below layer 2.
After splitting these specs, the file tree would probably be something like this:
up-l3
├── usubscription
│ └── v3
│ ├── README.adoc
│ ├── service.adoc
│ ├── client_publisher.adoc
│ └── client_subscriber.adoc
└── utwin
└── v2
├── README.adoc
├── service.adoc
└── client.adoc
this has been done for uDiscovery, Need to open a separate issue for uSubscription.
|
2025-04-01T06:38:29.382209
| 2021-01-20T13:23:59
|
789975722
|
{
"authors": [
"HGuillemet",
"agibsonccc",
"jljljl"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5550",
"repo": "eclipse/deeplearning4j",
"url": "https://github.com/eclipse/deeplearning4j/issues/9159"
}
|
gharchive/issue
|
"Cannot do backward pass: all epsilons not set" when using frozen layer
I have a convolution layer that, when its output feeds a frozen deconvolution2D layer, triggers the following exception:
java.lang.IllegalStateException: Cannot do backward pass: all epsilons not set. Layer "fc" (idx 1876), numInputs :1; numOutputs: 1
at org.deeplearning4j.nn.graph.vertex.impl.LayerVertex.doBackward(LayerVertex.java:133)
at org.deeplearning4j.nn.graph.ComputationGraph.calcBackpropGradients(ComputationGraph.java:2713)
at org.deeplearning4j.nn.graph.ComputationGraph.computeGradientAndScore(ComputationGraph.java:1382)
at org.deeplearning4j.nn.graph.ComputationGraph.computeGradientAndScore(ComputationGraph.java:1342)
at org.deeplearning4j.optimize.solvers.BaseOptimizer.gradientAndScore(BaseOptimizer.java:170)
at org.deeplearning4j.optimize.solvers.StochasticGradientDescent.optimize(StochasticGradientDescent.java:63)
at org.deeplearning4j.optimize.Solver.optimize(Solver.java:52)
at org.deeplearning4j.nn.graph.ComputationGraph.fitHelper(ComputationGraph.java:1166)
at org.deeplearning4j.nn.graph.ComputationGraph.fit(ComputationGraph.java:1116)
at org.deeplearning4j.nn.graph.ComputationGraph.fit(ComputationGraph.java:1083)
at org.deeplearning4j.nn.graph.ComputationGraph.fit(ComputationGraph.java:1019)
Here is the creation of the frozen layer:
graph.appendLayer("transpose",
new FrozenLayer(
new Deconvolution2D.Builder(new int[]{upFactor * 2, upFactor * 2}, new int[]{upFactor, upFactor},
new int[]{upFactor / 2, upFactor / 2})
.nOut(numClasses)
.hasBias(false)
.weightInit(linearInterpolationInit)
.build()
)
);
if I remove the FrozenLayer (keeping an unfrozen Deconvolution2D layer), the exception doesn't show up.
Using DL4J beta 7.
@HGuillemet please check with debugger if it goes into org.deeplearning4j.nn.layers.FrozenLayer.java:159 at gradientAndScore() method
Also seems like "fc" layer is another layer and it's not provided here. Debugging requires full code or probably at least graph declaration and initialization.
@HGuillemet (and future readers) I'll get to this after the release. I plan on doing a bug fix sprint for bugs like this and in samediff. Sorry for the wait.
@HGuillemet I'll be getting to this now that M1 is out. If you have any updates on this, let me know. Feedback is appreciated!
Closing this, we support backprop with: FrozenLayerWithBackprop - usages here. https://github.com/eclipse/deeplearning4j/blob/5e8951cd8ee8106bb393635f840c398a1759b2fa/deeplearning4j/deeplearning4j-core/src/test/java/org/deeplearning4j/nn/layers/FrozenLayerWithBackpropTest.java#L90 I assume this is for GANs.
|
2025-04-01T06:38:29.592649
| 2019-08-31T03:44:16
|
487725817
|
{
"authors": [
"jeenbroekstra"
],
"license": "bsd-3-clause",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5551",
"repo": "eclipse/rdf4j",
"url": "https://github.com/eclipse/rdf4j/pull/1530"
}
|
gharchive/pull-request
|
JMH runner ignores lock to avoid parallel builds failing
This PR addresses GitHub issue: #1529 .
Briefly describe the changes proposed in this PR:
configured surefire plugin set jmh.ignoreLock to true to avoid benchmark execution failing
ran local parallel test, seems to solve the issue (but proof is in Jenkins obviously)
Merging this in immediately in a "let's see if it actually works" kinda way.
|
2025-04-01T06:38:29.596187
| 2022-04-22T13:01:27
|
1212334239
|
{
"authors": [
"hmottestad"
],
"license": "bsd-3-clause",
"license_source": "bigquery",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5552",
"repo": "eclipse/rdf4j",
"url": "https://github.com/eclipse/rdf4j/pull/3820"
}
|
gharchive/pull-request
|
Merge develop into main
GitHub issue resolved: #
Briefly describe the changes proposed in this PR:
PR Author Checklist (see the contributor guidelines for more details):
[ ] my pull request is self-contained
[ ] I've added tests for the changes I made
[ ] I've applied code formatting (you can use mvn process-resources to format from the command line)
[ ] I've squashed my commits where necessary
[ ] every commit message starts with the issue number (GH-xxxx) followed by a meaningful description of the change
Main requires PR verify for java 8 :(
|
2025-04-01T06:38:29.634614
| 2017-11-28T17:57:19
|
277480468
|
{
"authors": [
"Vogel612",
"genie-winery"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5553",
"repo": "eclipse/winery",
"url": "https://github.com/eclipse/winery/pull/203"
}
|
gharchive/pull-request
|
Update third-party-libraries with latest output
third-party-libraries.xlsx as generated from eb34c80cd1c511ed7184f4ba4993a95208753e42
Can one of the admins verify this patch?
|
2025-04-01T06:38:29.672875
| 2024-09-30T08:58:31
|
2556010091
|
{
"authors": [
"EddyCMWF",
"garciampred"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5554",
"repo": "ecmwf-projects/cads-adaptors",
"url": "https://github.com/ecmwf-projects/cads-adaptors/issues/216"
}
|
gharchive/issue
|
CadsObs adaptor to use the retrieve_list_of_results method
For consistency with other adaptors, the CadsObs adaptor should use the retrieve_list_of_results method.
This returns a list of paths, and is called by the default self.retrieve method. The self.make_download_object is then responsible turning the list of paths into a downloadable object for the retreive-api.
If the default self.make_download_object is incompatible with the adaptor, then please define your own retrieve method which handles the creation of a download object, which must be an open binary file.
This will give more flexibility with anticipated functionality, and allows use with the MultiAdaptor.
(Markel, this is lower priority than the task that Paul has given you, we can discuss in the next sprint meeting)
It should be easy to do, we only need to be careful with the error handling.
|
2025-04-01T06:38:29.679783
| 2024-02-16T10:37:13
|
2138316787
|
{
"authors": [
"corentincarton"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5555",
"repo": "ecmwf/pyflow",
"url": "https://github.com/ecmwf/pyflow/issues/39"
}
|
gharchive/issue
|
Support for ecflow-generated variables
Is your feature request related to a problem? Please describe.
We can't use ecflow-generated variables in scripts as they are not supported by pyflow, which means they are not detected and exported at the beginning of the script. For instance, if I create a RepeatDate("YMD",...), ecflow will generate the following variables:
YMD
YMD_DD
YMD_DOW
YMD_JULIAN
YMD_MM
YMD_YYYY
But we can't use those because they are not added to the exportables list in pyflow (see https://github.com/ecmwf/pyflow/blob/master/pyflow/attributes.py#L259 and https://github.com/ecmwf/pyflow/blob/master/pyflow/nodes.py#L401). The RepeatDate class is only linked to one exported variable, following the name provided in the RepeatDate class.
The FAMILY variable, attached to a Family node is another example, but there are many others. We should list them.
Describe the solution you'd like
We could maybe add a list of exportables attached to an Exportable class?
Describe alternatives you've considered
No response
Additional context
No response
Organisation
No response
List of ECFLOW generated variables (missing repeat and maybe other attributes): https://ecflow.readthedocs.io/en/latest/ug/user_manual/ecflow_variables/generated_variables.html
|
2025-04-01T06:38:29.761712
| 2023-02-12T06:55:16
|
1581147805
|
{
"authors": [
"ecyrbe",
"pkrinesh"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5556",
"repo": "ecyrbe/zodios",
"url": "https://github.com/ecyrbe/zodios/pull/331"
}
|
gharchive/pull-request
|
Update helpers.html
typo at makeEndpoint
update signle to single
Thank you, but sorry, gh-page is not where you fix issues. it's on /website on .md files.
Sorry, I tried to find it and didn't know how I ended up on gh-page
|
2025-04-01T06:38:29.765379
| 2023-09-22T18:23:50
|
1909344806
|
{
"authors": [
"Ivan-267"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5557",
"repo": "edbeeching/godot_rl_agents",
"url": "https://github.com/edbeeching/godot_rl_agents/pull/148"
}
|
gharchive/pull-request
|
Require sb3 version 2 or newer
Our environment will not work with older sb3 versions, due to using gymnasium (where older sb3 versions used gym).
Edit: This may cause an issue with rllib, I will check the test results and see if it can be addressed if it does.
It seems that older Ray is installed on the tests that pass, so by specifying the gymnasium version we could be enforcing older Ray (which works with our config) to be installed. On the newer version of Ray, I also locally had some issues unless I try a different path format. Even outside of the path issue, I also had some issues with one of my environments with both Ray versions, something about the format of observations, but I didn't test it in-depth, so I don't know the issue was related to my Godot environment (which worked with SB3 which I mostly use for now), or something else.
I'll add the gymnasium version as a quick temporary fix, if it works, we can merge the PR for now since it's focused on ensuring the correct SB3 version, and in the future we can test our Ray configuration and see if something needs to be updated.
With this approach, newer Ray 2.7 seems to work, although I didn't perform much testing to see if everything works with this as before. As a quick fix I've changed the folder path to an absolute path as that seems to fix the error. An alternative could be to use older Ray or find a different fix for the issue.
It should be noted that one of my environments didn't work with rllib (some observation type error), but I think that holds for both Ray versions, and may need more troubleshooting at some future point as I'm not sure about the cause yet (maybe something with the env itself even though it works with SB3). Jumperhard worked on my PC as well, but there might be some difference in observations from my env (which has some floats and a raycast obs array).
For future reference and in case similar errors appear in the future, the error was:
ValueError: The two structures don't have the same nested structure.
Entire second structure:
OrderedDict([('obs', .)])
|
2025-04-01T06:38:29.841222
| 2021-09-18T03:38:34
|
999893432
|
{
"authors": [
"codecov-commenter",
"edge-minato"
],
"license": "MIT",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5558",
"repo": "edge-minato/pypj",
"url": "https://github.com/edge-minato/pypj/pull/37"
}
|
gharchive/pull-request
|
cicd: specify unittest directory
Description
cicd: specify unittest directory
Type of change
[ ] Bug fix
[ ] New feature
[ ] Document
[x] Test
[x] CI/CD
[x] Refactor
How has this been tested
[x] Unittest
[ ] Others:
Checklist:
[x] I have made corresponding changes to the documentation
[x] My changes generate no new warnings
[x] I have added tests that prove my fix is effective or that my feature works
[x] New and existing unit tests pass locally with my changes
[x] Let's make the world better✨😋🐍🌎
Codecov Report
Merging #37 (9336a49) into main (c60b059) will increase coverage by 0.00%.
The diff coverage is 100.00%.
@@ Coverage Diff @@
## main #37 +/- ##
=======================================
Coverage 95.56% 95.57%
=======================================
Files 21 21
Lines 609 610 +1
Branches 36 36
=======================================
+ Hits 582 583 +1
Misses 18 18
Partials 9 9
Impacted Files
Coverage Δ
pypj/task/githubactions.py
100.00% <100.00%> (ø)
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update c60b059...9336a49. Read the comment docs.
|
2025-04-01T06:38:29.845025
| 2024-01-15T17:42:24
|
2082459432
|
{
"authors": [
"rbino",
"szakhlypa"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5559",
"repo": "edgehog-device-manager/edgehog",
"url": "https://github.com/edgehog-device-manager/edgehog/pull/429"
}
|
gharchive/pull-request
|
Restore Tenant Provisioning
Move it to the Tenants API, removing the Provisioning context.
Use AshJsonApi to expose a new JsonApi compatible API.
Edgehog.Tenants.Reconciler.Behaviour expects Edgehog.Tenants.Tenant to have a typespec @type t().
Ash.Resource has @type record :: struct().
I think we can start with @type record :: Ash.Resource.record() for Tenant and Tenant.record() in Reconciler.Behaviour
|
2025-04-01T06:38:29.864386
| 2019-05-28T22:04:17
|
449498146
|
{
"authors": [
"codecov-io",
"tsconn23"
],
"license": "Apache-2.0",
"license_source": "github-api",
"license_type": "permissive",
"provenance": "gharchive-dolma-0000.json.gz:5560",
"repo": "edgexfoundry/edgex-go",
"url": "https://github.com/edgexfoundry/edgex-go/pull/1383"
}
|
gharchive/pull-request
|
Update edgex modules to v0.1.0 pre-edinburgh
Fix #1380
In preparation for cutting the edinburgh branch, we need to consume
v0.1.0 in the go.mod for all edgex modules
go-mod-core-contracts
go-mod-messaging
go-mod-registry
Signed-off-by: Trevor Conn<EMAIL_ADDRESS>
Codecov Report
:exclamation: No coverage uploaded for pull request base (master@823421a). Click here to learn what that means.
The diff coverage is n/a.
@@ Coverage Diff @@
## master #1383 +/- ##
=========================================
Coverage ? 17.02%
=========================================
Files ? 78
Lines ? 8789
Branches ? 0
=========================================
Hits ? 1496
Misses ? 7142
Partials ? 151
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 823421a...904ce62. Read the comment docs.
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.