Dataset schema (string columns: min/max length; int64 columns: min/max value):

- repo_name: string, 8–38 chars
- pr_number: int64, 3–47.1k
- pr_title: string, 8–175 chars
- pr_description: string, 2–19.8k chars
- author: null
- date_created: string, 25 chars
- date_merged: string, 25 chars
- filepath: string, 6–136 chars
- before_content: string, 54–884k chars
- after_content: string, 56–884k chars
- pr_author: string, 3–21 chars
- previous_commit: string, 40 chars
- pr_commit: string, 40 chars
- comment: string, 2–25.4k chars
- comment_author: string, 3–29 chars
- __index_level_0__: int64, 0–5.1k
ceph/ceph-ansible
7,326
common: stop using ceph/daemon entrypoint
Major refactor to make ceph-ansible use the `ceph/ceph` container image instead of `ceph/daemon`.
null
2022-10-07 12:52:53+00:00
2023-05-31 21:07:13+00:00
roles/ceph-nfs/templates/ceph-nfs.service.j2
[Unit] Description=NFS-Ganesha file server Documentation=http://github.com/nfs-ganesha/nfs-ganesha/wiki {% if container_binary == 'docker' %} After=docker.service network-online.target local-fs.target time-sync.target Requires=docker.service {% else %} After=network-online.target local-fs.target time-sync.target {% end...
[Unit] Description=NFS-Ganesha file server Documentation=http://github.com/nfs-ganesha/nfs-ganesha/wiki {% if container_binary == 'docker' %} After=docker.service network-online.target local-fs.target time-sync.target Requires=docker.service {% else %} After=network-online.target local-fs.target time-sync.target {% end...
guits
5cd692dcdc8dd55036b8c283f1bca56f79965c1d
23a8bbc6c59d78e982f343c06cfb5fe8e86cb757
hi @gouthampacha, sorry for the late answer. I've no idea about NFSv3 here, but this part needs to be reworked indeed.
guits
13
ceph/ceph-ansible
7,226
[skip ci] Refresh /etc/ceph/osd json files content before zapping the disks
If the physical disk to device path mapping has changed since the last ceph-volume simple scan (e.g. addition or removal of disks), the wrong disk could be deleted.

Closes: https://bugzilla.redhat.com/show_bug.cgi?id=2071035
Signed-off-by: Teoman ONAY <tonay@redhat.com>
null
2022-07-04 10:03:03+00:00
2022-07-11 07:14:41+00:00
infrastructure-playbooks/shrink-osd.yml
--- # This playbook shrinks Ceph OSDs that have been created with ceph-volume. # It can remove any number of OSD(s) from the cluster and ALL THEIR DATA # # Use it like this: # ansible-playbook shrink-osd.yml -e osd_to_kill=0,2,6 # Prompts for confirmation to shrink, defaults to no and # doesn't shrink the clust...
--- # This playbook shrinks Ceph OSDs that have been created with ceph-volume. # It can remove any number of OSD(s) from the cluster and ALL THEIR DATA # # Use it like this: # ansible-playbook shrink-osd.yml -e osd_to_kill=0,2,6 # Prompts for confirmation to shrink, defaults to no and # doesn't shrink the clust...
asm0deuz
dffe7b47de70b6eeec71a3fa86f8c407adb4dd8e
64e08f2c0bdea6f4c4ad5862dc8f350c6adbe2cd
```suggestion
- name: refresh /etc/ceph/osd files
```
guits
14
ceph/ceph-ansible
7,226
[skip ci] Refresh /etc/ceph/osd json files content before zapping the disks
If the physical disk to device path mapping has changed since the last ceph-volume simple scan (e.g. addition or removal of disks), the wrong disk could be deleted.

Closes: https://bugzilla.redhat.com/show_bug.cgi?id=2071035
Signed-off-by: Teoman ONAY <tonay@redhat.com>
null
2022-07-04 10:03:03+00:00
2022-07-11 07:14:41+00:00
infrastructure-playbooks/shrink-osd.yml
--- # This playbook shrinks Ceph OSDs that have been created with ceph-volume. # It can remove any number of OSD(s) from the cluster and ALL THEIR DATA # # Use it like this: # ansible-playbook shrink-osd.yml -e osd_to_kill=0,2,6 # Prompts for confirmation to shrink, defaults to no and # doesn't shrink the clust...
--- # This playbook shrinks Ceph OSDs that have been created with ceph-volume. # It can remove any number of OSD(s) from the cluster and ALL THEIR DATA # # Use it like this: # ansible-playbook shrink-osd.yml -e osd_to_kill=0,2,6 # Prompts for confirmation to shrink, defaults to no and # doesn't shrink the clust...
asm0deuz
dffe7b47de70b6eeec71a3fa86f8c407adb4dd8e
64e08f2c0bdea6f4c4ad5862dc8f350c6adbe2cd
When passing a list of osd ids (`-e osd_to_kill=1,3,5`), if the osds are on the same host, that command would be run multiple times unnecessarily on that host.
guits
15
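The review note above points out that a per-host command repeats once per OSD id when several OSDs share a host. A minimal Python sketch of the fix's idea — group the OSD ids by host first, so anything per-host runs exactly once (the pair list and function name are illustrative, not ceph-ansible code):

```python
from collections import defaultdict

def group_osds_by_host(osd_host_pairs):
    """Group (osd_id, hostname) pairs so a per-host command can run once.

    `osd_host_pairs` is a hypothetical list of tuples resolved from
    something like `-e osd_to_kill=1,3,5`.
    """
    by_host = defaultdict(list)
    for osd_id, host in osd_host_pairs:
        by_host[host].append(osd_id)
    return dict(by_host)

# osds 1 and 3 share "node1", so "node1" appears once in the result
grouped = group_osds_by_host([(1, "node1"), (3, "node1"), (5, "node2")])
print(grouped)  # {'node1': [1, 3], 'node2': [5]}
```

This mirrors what Ansible's `run_once`/grouped loops achieve: one invocation per host rather than one per OSD id.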
ceph/ceph-ansible
7,226
[skip ci] Refresh /etc/ceph/osd json files content before zapping the disks
If the physical disk to device path mapping has changed since the last ceph-volume simple scan (e.g. addition or removal of disks), the wrong disk could be deleted.

Closes: https://bugzilla.redhat.com/show_bug.cgi?id=2071035
Signed-off-by: Teoman ONAY <tonay@redhat.com>
null
2022-07-04 10:03:03+00:00
2022-07-11 07:14:41+00:00
infrastructure-playbooks/shrink-osd.yml
--- # This playbook shrinks Ceph OSDs that have been created with ceph-volume. # It can remove any number of OSD(s) from the cluster and ALL THEIR DATA # # Use it like this: # ansible-playbook shrink-osd.yml -e osd_to_kill=0,2,6 # Prompts for confirmation to shrink, defaults to no and # doesn't shrink the clust...
--- # This playbook shrinks Ceph OSDs that have been created with ceph-volume. # It can remove any number of OSD(s) from the cluster and ALL THEIR DATA # # Use it like this: # ansible-playbook shrink-osd.yml -e osd_to_kill=0,2,6 # Prompts for confirmation to shrink, defaults to no and # doesn't shrink the clust...
asm0deuz
dffe7b47de70b6eeec71a3fa86f8c407adb4dd8e
64e08f2c0bdea6f4c4ad5862dc8f350c6adbe2cd
```suggestion
- name: refresh /etc/ceph/osd files
  ceph_volume_simple_scan:
    path: "/var/lib/ceph/osd/{{ cluster }}-{{ item.2 }}"
    cluster: "{{ cluster }}"
    force: true
  environment:
    CEPH_CONTAINER_IMAGE: "{{ ceph_docker_registry + '/' + ceph_docker_image + ':' + ceph_d...
```
guits
16
ceph/ceph-ansible
7,226
[skip ci] Refresh /etc/ceph/osd json files content before zapping the disks
If the physical disk to device path mapping has changed since the last ceph-volume simple scan (e.g. addition or removal of disks), the wrong disk could be deleted.

Closes: https://bugzilla.redhat.com/show_bug.cgi?id=2071035
Signed-off-by: Teoman ONAY <tonay@redhat.com>
null
2022-07-04 10:03:03+00:00
2022-07-11 07:14:41+00:00
infrastructure-playbooks/shrink-osd.yml
--- # This playbook shrinks Ceph OSDs that have been created with ceph-volume. # It can remove any number of OSD(s) from the cluster and ALL THEIR DATA # # Use it like this: # ansible-playbook shrink-osd.yml -e osd_to_kill=0,2,6 # Prompts for confirmation to shrink, defaults to no and # doesn't shrink the clust...
--- # This playbook shrinks Ceph OSDs that have been created with ceph-volume. # It can remove any number of OSD(s) from the cluster and ALL THEIR DATA # # Use it like this: # ansible-playbook shrink-osd.yml -e osd_to_kill=0,2,6 # Prompts for confirmation to shrink, defaults to no and # doesn't shrink the clust...
asm0deuz
dffe7b47de70b6eeec71a3fa86f8c407adb4dd8e
64e08f2c0bdea6f4c4ad5862dc8f350c6adbe2cd
```suggestion
```
guits
17
ceph/ceph-ansible
7,226
[skip ci] Refresh /etc/ceph/osd json files content before zapping the disks
If the physical disk to device path mapping has changed since the last ceph-volume simple scan (e.g. addition or removal of disks), the wrong disk could be deleted.

Closes: https://bugzilla.redhat.com/show_bug.cgi?id=2071035
Signed-off-by: Teoman ONAY <tonay@redhat.com>
null
2022-07-04 10:03:03+00:00
2022-07-11 07:14:41+00:00
infrastructure-playbooks/shrink-osd.yml
--- # This playbook shrinks Ceph OSDs that have been created with ceph-volume. # It can remove any number of OSD(s) from the cluster and ALL THEIR DATA # # Use it like this: # ansible-playbook shrink-osd.yml -e osd_to_kill=0,2,6 # Prompts for confirmation to shrink, defaults to no and # doesn't shrink the clust...
--- # This playbook shrinks Ceph OSDs that have been created with ceph-volume. # It can remove any number of OSD(s) from the cluster and ALL THEIR DATA # # Use it like this: # ansible-playbook shrink-osd.yml -e osd_to_kill=0,2,6 # Prompts for confirmation to shrink, defaults to no and # doesn't shrink the clust...
asm0deuz
dffe7b47de70b6eeec71a3fa86f8c407adb4dd8e
64e08f2c0bdea6f4c4ad5862dc8f350c6adbe2cd
```suggestion
```
doesn't make sense to keep this if we know it will run only in non-containerized deployments.
guits
18
ceph/ceph-ansible
7,226
[skip ci] Refresh /etc/ceph/osd json files content before zapping the disks
If the physical disk to device path mapping has changed since the last ceph-volume simple scan (e.g. addition or removal of disks), the wrong disk could be deleted.

Closes: https://bugzilla.redhat.com/show_bug.cgi?id=2071035
Signed-off-by: Teoman ONAY <tonay@redhat.com>
null
2022-07-04 10:03:03+00:00
2022-07-11 07:14:41+00:00
infrastructure-playbooks/shrink-osd.yml
--- # This playbook shrinks Ceph OSDs that have been created with ceph-volume. # It can remove any number of OSD(s) from the cluster and ALL THEIR DATA # # Use it like this: # ansible-playbook shrink-osd.yml -e osd_to_kill=0,2,6 # Prompts for confirmation to shrink, defaults to no and # doesn't shrink the clust...
--- # This playbook shrinks Ceph OSDs that have been created with ceph-volume. # It can remove any number of OSD(s) from the cluster and ALL THEIR DATA # # Use it like this: # ansible-playbook shrink-osd.yml -e osd_to_kill=0,2,6 # Prompts for confirmation to shrink, defaults to no and # doesn't shrink the clust...
asm0deuz
dffe7b47de70b6eeec71a3fa86f8c407adb4dd8e
64e08f2c0bdea6f4c4ad5862dc8f350c6adbe2cd
this command won't work in a containerized deployment; you must run it from within a container.
guits
19
ceph/ceph-ansible
7,226
[skip ci] Refresh /etc/ceph/osd json files content before zapping the disks
If the physical disk to device path mapping has changed since the last ceph-volume simple scan (e.g. addition or removal of disks), the wrong disk could be deleted.

Closes: https://bugzilla.redhat.com/show_bug.cgi?id=2071035
Signed-off-by: Teoman ONAY <tonay@redhat.com>
null
2022-07-04 10:03:03+00:00
2022-07-11 07:14:41+00:00
infrastructure-playbooks/shrink-osd.yml
--- # This playbook shrinks Ceph OSDs that have been created with ceph-volume. # It can remove any number of OSD(s) from the cluster and ALL THEIR DATA # # Use it like this: # ansible-playbook shrink-osd.yml -e osd_to_kill=0,2,6 # Prompts for confirmation to shrink, defaults to no and # doesn't shrink the clust...
--- # This playbook shrinks Ceph OSDs that have been created with ceph-volume. # It can remove any number of OSD(s) from the cluster and ALL THEIR DATA # # Use it like this: # ansible-playbook shrink-osd.yml -e osd_to_kill=0,2,6 # Prompts for confirmation to shrink, defaults to no and # doesn't shrink the clust...
asm0deuz
dffe7b47de70b6eeec71a3fa86f8c407adb4dd8e
64e08f2c0bdea6f4c4ad5862dc8f350c6adbe2cd
```suggestion
command: "{{ container_binary }} exec ceph-osd-{{ item.2 }} ceph-volume simple scan --force /var/lib/ceph/osd/{{ cluster }}-{{ item.2 }}"
```
guits
20
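The suggestion above templates a `container exec` wrapper around `ceph-volume`. A small Python sketch of how that Jinja expression expands for concrete values (the function and sample values are illustrative, not from a real cluster):

```python
def build_scan_cmd(container_binary, osd_id, cluster="ceph"):
    """Render the containerized ceph-volume command from the review
    suggestion for one OSD. Mirrors the Jinja template:
    {{ container_binary }} exec ceph-osd-{{ item.2 }} ceph-volume
    simple scan --force /var/lib/ceph/osd/{{ cluster }}-{{ item.2 }}
    """
    return (
        f"{container_binary} exec ceph-osd-{osd_id} "
        f"ceph-volume simple scan --force "
        f"/var/lib/ceph/osd/{cluster}-{osd_id}"
    )

print(build_scan_cmd("podman", 2))
# podman exec ceph-osd-2 ceph-volume simple scan --force /var/lib/ceph/osd/ceph-2
```

Running the scan through `podman`/`docker exec` is what makes it work in containerized deployments, where `ceph-volume` is not on the host.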
ceph/ceph-ansible
7,197
fix(ceph-grafana): make dashboard download work again
This fixes the dashboard download for pacific and later. Since ceph switched to Prometheus Monitoring Mixins the path to the generated dashboards has changed. It is still working for octopus but it's broken from pacific onwards. This change fixes the issue. Currently I only added the two latest releases to the chec...
null
2022-06-10 15:27:51+00:00
2022-06-14 12:36:24+00:00
roles/ceph-grafana/tasks/configure_grafana.yml
--- - name: install ceph-grafana-dashboards package on RedHat or SUSE package: name: ceph-grafana-dashboards state: "{{ (upgrade_ceph_packages|bool) | ternary('latest','present') }}" register: result until: result is succeeded when: - not containerized_deployment | bool - ansible_facts['os_famil...
--- - name: install ceph-grafana-dashboards package on RedHat or SUSE package: name: ceph-grafana-dashboards state: "{{ (upgrade_ceph_packages|bool) | ternary('latest','present') }}" register: result until: result is succeeded when: - not containerized_deployment | bool - ansible_facts['os_famil...
mitch000001
8a5fb702f2a3df46834baf6019285463bbfcc4fb
4edaab5f4c5445cb1fafc5d8824c49717e9f96c8
```suggestion
url: "https://raw.githubusercontent.com/ceph/ceph/{{ grafana_dashboard_version }}/monitoring/ceph-mixin/dashboards_out/{{ item }}"
```
guits
21
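The suggested URL above reflects the ceph-mixin move: from pacific onwards, generated dashboards live under `monitoring/ceph-mixin/dashboards_out`. A hypothetical helper showing how the templated URL expands (names and sample values are illustrative):

```python
def dashboard_url(version, item):
    """Build the raw.githubusercontent.com URL for a generated
    ceph-mixin dashboard, mirroring the suggested Jinja expression."""
    return ("https://raw.githubusercontent.com/ceph/ceph/"
            f"{version}/monitoring/ceph-mixin/dashboards_out/{item}")

print(dashboard_url("v16.2.9", "ceph-cluster.json"))
```

For octopus and earlier, the dashboards still live in the old path, which is why the thread below discusses backwards compatibility across stable branches.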
ceph/ceph-ansible
7,197
fix(ceph-grafana): make dashboard download work again
This fixes the dashboard download for pacific and later. Since ceph switched to Prometheus Monitoring Mixins the path to the generated dashboards has changed. It is still working for octopus but it's broken from pacific onwards. This change fixes the issue. Currently I only added the two latest releases to the chec...
null
2022-06-10 15:27:51+00:00
2022-06-14 12:36:24+00:00
roles/ceph-grafana/tasks/configure_grafana.yml
--- - name: install ceph-grafana-dashboards package on RedHat or SUSE package: name: ceph-grafana-dashboards state: "{{ (upgrade_ceph_packages|bool) | ternary('latest','present') }}" register: result until: result is succeeded when: - not containerized_deployment | bool - ansible_facts['os_famil...
--- - name: install ceph-grafana-dashboards package on RedHat or SUSE package: name: ceph-grafana-dashboards state: "{{ (upgrade_ceph_packages|bool) | ternary('latest','present') }}" register: result until: result is succeeded when: - not containerized_deployment | bool - ansible_facts['os_famil...
mitch000001
8a5fb702f2a3df46834baf6019285463bbfcc4fb
4edaab5f4c5445cb1fafc5d8824c49717e9f96c8
```suggestion
```
guits
22
ceph/ceph-ansible
7,197
fix(ceph-grafana): make dashboard download work again
This fixes the dashboard download for pacific and later. Since ceph switched to Prometheus Monitoring Mixins the path to the generated dashboards has changed. It is still working for octopus but it's broken from pacific onwards. This change fixes the issue. Currently I only added the two latest releases to the chec...
null
2022-06-10 15:27:51+00:00
2022-06-14 12:36:24+00:00
roles/ceph-grafana/tasks/configure_grafana.yml
--- - name: install ceph-grafana-dashboards package on RedHat or SUSE package: name: ceph-grafana-dashboards state: "{{ (upgrade_ceph_packages|bool) | ternary('latest','present') }}" register: result until: result is succeeded when: - not containerized_deployment | bool - ansible_facts['os_famil...
--- - name: install ceph-grafana-dashboards package on RedHat or SUSE package: name: ceph-grafana-dashboards state: "{{ (upgrade_ceph_packages|bool) | ternary('latest','present') }}" register: result until: result is succeeded when: - not containerized_deployment | bool - ansible_facts['os_famil...
mitch000001
8a5fb702f2a3df46834baf6019285463bbfcc4fb
4edaab5f4c5445cb1fafc5d8824c49717e9f96c8
The change I proposed would be backwards compatible as the dashboards for octopus and earlier live in the old location still.
mitch000001
23
ceph/ceph-ansible
7,197
fix(ceph-grafana): make dashboard download work again
This fixes the dashboard download for pacific and later. Since ceph switched to Prometheus Monitoring Mixins the path to the generated dashboards has changed. It is still working for octopus but it's broken from pacific onwards. This change fixes the issue. Currently I only added the two latest releases to the chec...
null
2022-06-10 15:27:51+00:00
2022-06-14 12:36:24+00:00
roles/ceph-grafana/tasks/configure_grafana.yml
--- - name: install ceph-grafana-dashboards package on RedHat or SUSE package: name: ceph-grafana-dashboards state: "{{ (upgrade_ceph_packages|bool) | ternary('latest','present') }}" register: result until: result is succeeded when: - not containerized_deployment | bool - ansible_facts['os_famil...
--- - name: install ceph-grafana-dashboards package on RedHat or SUSE package: name: ceph-grafana-dashboards state: "{{ (upgrade_ceph_packages|bool) | ternary('latest','present') }}" register: result until: result is succeeded when: - not containerized_deployment | bool - ansible_facts['os_famil...
mitch000001
8a5fb702f2a3df46834baf6019285463bbfcc4fb
4edaab5f4c5445cb1fafc5d8824c49717e9f96c8
I mean, we could also change this like that within master and cherry-pick into stable-6.0 onwards. Works for me. Is that the direction to go?
mitch000001
24
ceph/ceph-ansible
7,197
fix(ceph-grafana): make dashboard download work again
This fixes the dashboard download for pacific and later. Since ceph switched to Prometheus Monitoring Mixins the path to the generated dashboards has changed. It is still working for octopus but it's broken from pacific onwards. This change fixes the issue. Currently I only added the two latest releases to the chec...
null
2022-06-10 15:27:51+00:00
2022-06-14 12:36:24+00:00
roles/ceph-grafana/tasks/configure_grafana.yml
--- - name: install ceph-grafana-dashboards package on RedHat or SUSE package: name: ceph-grafana-dashboards state: "{{ (upgrade_ceph_packages|bool) | ternary('latest','present') }}" register: result until: result is succeeded when: - not containerized_deployment | bool - ansible_facts['os_famil...
--- - name: install ceph-grafana-dashboards package on RedHat or SUSE package: name: ceph-grafana-dashboards state: "{{ (upgrade_ceph_packages|bool) | ternary('latest','present') }}" register: result until: result is succeeded when: - not containerized_deployment | bool - ansible_facts['os_famil...
mitch000001
8a5fb702f2a3df46834baf6019285463bbfcc4fb
4edaab5f4c5445cb1fafc5d8824c49717e9f96c8
> I mean, we could also change this like that within master and cherry-pick into stable-6.0 onwards. Works for me. Is that the direction to go?

yes
guits
25
ceph/ceph-ansible
7,197
fix(ceph-grafana): make dashboard download work again
This fixes the dashboard download for pacific and later. Since ceph switched to Prometheus Monitoring Mixins the path to the generated dashboards has changed. It is still working for octopus but it's broken from pacific onwards. This change fixes the issue. Currently I only added the two latest releases to the chec...
null
2022-06-10 15:27:51+00:00
2022-06-14 12:36:24+00:00
roles/ceph-grafana/tasks/configure_grafana.yml
--- - name: install ceph-grafana-dashboards package on RedHat or SUSE package: name: ceph-grafana-dashboards state: "{{ (upgrade_ceph_packages|bool) | ternary('latest','present') }}" register: result until: result is succeeded when: - not containerized_deployment | bool - ansible_facts['os_famil...
--- - name: install ceph-grafana-dashboards package on RedHat or SUSE package: name: ceph-grafana-dashboards state: "{{ (upgrade_ceph_packages|bool) | ternary('latest','present') }}" register: result until: result is succeeded when: - not containerized_deployment | bool - ansible_facts['os_famil...
mitch000001
8a5fb702f2a3df46834baf6019285463bbfcc4fb
4edaab5f4c5445cb1fafc5d8824c49717e9f96c8
> The change I proposed would be backwards compatible as the dashboards for octopus and earlier live in the old location still.

the branch 'main' isn't intended to be used for deploying stable releases.
guits
26
ceph/ceph-ansible
7,197
fix(ceph-grafana): make dashboard download work again
This fixes the dashboard download for pacific and later. Since ceph switched to Prometheus Monitoring Mixins the path to the generated dashboards has changed. It is still working for octopus but it's broken from pacific onwards. This change fixes the issue. Currently I only added the two latest releases to the chec...
null
2022-06-10 15:27:51+00:00
2022-06-14 12:36:24+00:00
roles/ceph-grafana/tasks/configure_grafana.yml
--- - name: install ceph-grafana-dashboards package on RedHat or SUSE package: name: ceph-grafana-dashboards state: "{{ (upgrade_ceph_packages|bool) | ternary('latest','present') }}" register: result until: result is succeeded when: - not containerized_deployment | bool - ansible_facts['os_famil...
--- - name: install ceph-grafana-dashboards package on RedHat or SUSE package: name: ceph-grafana-dashboards state: "{{ (upgrade_ceph_packages|bool) | ternary('latest','present') }}" register: result until: result is succeeded when: - not containerized_deployment | bool - ansible_facts['os_famil...
mitch000001
8a5fb702f2a3df46834baf6019285463bbfcc4fb
4edaab5f4c5445cb1fafc5d8824c49717e9f96c8
Changed.
mitch000001
27
ceph/ceph-ansible
7,181
[skip ci] rbd-mirror: major refactor
- Use config-key store to add cluster peer.
- Support multiple pools mirroring.

Signed-off-by: Guillaume Abrioux <gabrioux@redhat.com>
null
2022-05-12 15:23:32+00:00
2022-07-29 15:33:26+00:00
roles/ceph-rbd-mirror/tasks/configure_mirroring.yml
--- - name: enable mirroring on the pool command: "{{ container_exec_cmd | default('') }} rbd --cluster {{ cluster }} --keyring /etc/ceph/{{ cluster }}.client.rbd-mirror.{{ ansible_facts['hostname'] }}.keyring --name client.rbd-mirror.{{ ansible_facts['hostname'] }} mirror pool enable {{ ceph_rbd_mirror_pool }} {{ ce...
--- - name: cephx tasks when: - cephx | bool block: - name: get client.bootstrap-rbd-mirror from ceph monitor ceph_key: name: client.bootstrap-rbd-mirror cluster: "{{ cluster }}" output_format: plain state: info environment: CEPH_CONTAINER_IMAGE: "{{ ceph_...
guits
3a8daafbe8c9023c6dcd8034adfcc98893e5c303
b74ff6e22c0d1b95e71384e4d7e2fb2ad556ac39
add a `ceph_pool` task? (create the pool if it doesn't exist already?)
guits
28
ceph/ceph-ansible
7,181
[skip ci] rbd-mirror: major refactor
- Use config-key store to add cluster peer.
- Support multiple pools mirroring.

Signed-off-by: Guillaume Abrioux <gabrioux@redhat.com>
null
2022-05-12 15:23:32+00:00
2022-07-29 15:33:26+00:00
roles/ceph-rbd-mirror/tasks/configure_mirroring.yml
--- - name: enable mirroring on the pool command: "{{ container_exec_cmd | default('') }} rbd --cluster {{ cluster }} --keyring /etc/ceph/{{ cluster }}.client.rbd-mirror.{{ ansible_facts['hostname'] }}.keyring --name client.rbd-mirror.{{ ansible_facts['hostname'] }} mirror pool enable {{ ceph_rbd_mirror_pool }} {{ ce...
--- - name: cephx tasks when: - cephx | bool block: - name: get client.bootstrap-rbd-mirror from ceph monitor ceph_key: name: client.bootstrap-rbd-mirror cluster: "{{ cluster }}" output_format: plain state: info environment: CEPH_CONTAINER_IMAGE: "{{ ceph_...
guits
3a8daafbe8c9023c6dcd8034adfcc98893e5c303
b74ff6e22c0d1b95e71384e4d7e2fb2ad556ac39
legacy from testing?
```suggestion
retries: 60
```
guits
29
yogeshojha/rengine
1,100
Fix report generation when `Ignore Informational Vulnerabilities` checked
When **Ignore Informational Vulnerabilities** is checked, info vuln data is still displayed. I've reworked the queries that display vulnerabilities to prevent info vulns from displaying in:
- **Quick summary** Info blue box
- **Reconnaissance Findings**
- **Vulnerabilities Discovered** Info blue box

I've also fi...
null
2023-12-05 01:25:41+00:00
2023-12-08 05:48:36+00:00
web/startScan/views.py
import markdown from celery import group from weasyprint import HTML from datetime import datetime from django.contrib import messages from django.db.models import Count from django.http import HttpResponse, HttpResponseRedirect, JsonResponse from django.shortcuts import get_object_or_404, render from django.template....
import markdown from celery import group from weasyprint import HTML from datetime import datetime from django.contrib import messages from django.db.models import Count from django.http import HttpResponse, HttpResponseRedirect, JsonResponse from django.shortcuts import get_object_or_404, render from django.template....
psyray
4341d9834865240222a8dc72c01caaec0d7bed44
69231095782663fe0fe8b0e49b8aa995aa042723
In a Django template you can use the `|length` filter: `{{ all_vulnerabilities|length }}`.
yogeshojha
0
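The review above recommends Django's built-in `|length` template filter instead of passing a separate count into the context. As a rough sketch (not Django's actual source), the filter essentially calls `len()` and, if memory serves, falls back to `0` when the value has no length:

```python
def length_filter(value):
    """Rough stand-in for Django's `|length` template filter:
    returns len(value), falling back to 0 on unsized values
    (an assumption about the filter's forgiving error handling)."""
    try:
        return len(value)
    except (TypeError, ValueError):
        return 0

all_vulnerabilities = ["XSS", "SQLi", "open redirect"]
print(length_filter(all_vulnerabilities))  # 3
print(length_filter(None))                 # 0
```

In the template this means `{{ all_vulnerabilities|length }}` can replace a hand-computed `total_vulnerabilities`-style context variable.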
yogeshojha/rengine
1,100
Fix report generation when `Ignore Informational Vulnerabilities` checked
When **Ignore Informational Vulnerabilities** is checked, info vuln data is still displayed. I've reworked the queries that display vulnerabilities to prevent info vulns from displaying in:
- **Quick summary** Info blue box
- **Reconnaissance Findings**
- **Vulnerabilities Discovered** Info blue box

I've also fi...
null
2023-12-05 01:25:41+00:00
2023-12-08 05:48:36+00:00
web/templates/report/template.html
<html> <head> <meta charset="utf-8"> <title>Report</title> <link href="https://fonts.googleapis.com/css2?family=Inter:wght@100;200;300;400;500&display=swap" rel="stylesheet"> <style> @page { size: A4; @top-left { background: {{primary_color}}; content: counte...
<html> <head> <meta charset="utf-8"> <title>Report</title> <link href="https://fonts.googleapis.com/css2?family=Inter:wght@100;200;300;400;500&display=swap" rel="stylesheet"> <style> @page { size: A4; @top-left { background: {{primary_color}}; content: counte...
psyray
4341d9834865240222a8dc72c01caaec0d7bed44
69231095782663fe0fe8b0e49b8aa995aa042723
## Potentially unsafe external link

External links without noopener/noreferrer are a potential security risk. [Show more details](https://github.com/yogeshojha/rengine/security/code-scanning/171)
github-advanced-security[bot]
1
yogeshojha/rengine
1,100
Fix report generation when `Ignore Informational Vulnerabilities` checked
When **Ignore Informational Vulnerabilities** is checked, info vuln data is still displayed. I've reworked the queries that display vulnerabilities to prevent info vulns from displaying in:
- **Quick summary** Info blue box
- **Reconnaissance Findings**
- **Vulnerabilities Discovered** Info blue box

I've also fi...
null
2023-12-05 01:25:41+00:00
2023-12-08 05:48:36+00:00
web/templates/report/template.html
<html> <head> <meta charset="utf-8"> <title>Report</title> <link href="https://fonts.googleapis.com/css2?family=Inter:wght@100;200;300;400;500&display=swap" rel="stylesheet"> <style> @page { size: A4; @top-left { background: {{primary_color}}; content: counte...
<html> <head> <meta charset="utf-8"> <title>Report</title> <link href="https://fonts.googleapis.com/css2?family=Inter:wght@100;200;300;400;500&display=swap" rel="stylesheet"> <style> @page { size: A4; @top-left { background: {{primary_color}}; content: counte...
psyray
4341d9834865240222a8dc72c01caaec0d7bed44
69231095782663fe0fe8b0e49b8aa995aa042723
## Potentially unsafe external link

External links without noopener/noreferrer are a potential security risk. [Show more details](https://github.com/yogeshojha/rengine/security/code-scanning/172)
github-advanced-security[bot]
2
yogeshojha/rengine
1,100
Fix report generation when `Ignore Informational Vulnerabilities` checked
When **Ignore Informational Vulnerabilities** is checked, info vuln data is still displayed. I've reworked the queries that display vulnerabilities to prevent info vulns from displaying in:
- **Quick summary** Info blue box
- **Reconnaissance Findings**
- **Vulnerabilities Discovered** Info blue box

I've also fi...
null
2023-12-05 01:25:41+00:00
2023-12-08 05:48:36+00:00
web/templates/report/template.html
<html> <head> <meta charset="utf-8"> <title>Report</title> <link href="https://fonts.googleapis.com/css2?family=Inter:wght@100;200;300;400;500&display=swap" rel="stylesheet"> <style> @page { size: A4; @top-left { background: {{primary_color}}; content: counte...
<html> <head> <meta charset="utf-8"> <title>Report</title> <link href="https://fonts.googleapis.com/css2?family=Inter:wght@100;200;300;400;500&display=swap" rel="stylesheet"> <style> @page { size: A4; @top-left { background: {{primary_color}}; content: counte...
psyray
4341d9834865240222a8dc72c01caaec0d7bed44
69231095782663fe0fe8b0e49b8aa995aa042723
Fixed
psyray
3
yogeshojha/rengine
1,100
Fix report generation when `Ignore Informational Vulnerabilities` checked
When **Ignore Informational Vulnerabilities** is checked, info vuln data is still displayed. I've reworked the queries that display vulnerabilities to prevent info vulns from displaying in:
- **Quick summary** Info blue box
- **Reconnaissance Findings**
- **Vulnerabilities Discovered** Info blue box

I've also fi...
null
2023-12-05 01:25:41+00:00
2023-12-08 05:48:36+00:00
web/templates/report/template.html
<html> <head> <meta charset="utf-8"> <title>Report</title> <link href="https://fonts.googleapis.com/css2?family=Inter:wght@100;200;300;400;500&display=swap" rel="stylesheet"> <style> @page { size: A4; @top-left { background: {{primary_color}}; content: counte...
<html> <head> <meta charset="utf-8"> <title>Report</title> <link href="https://fonts.googleapis.com/css2?family=Inter:wght@100;200;300;400;500&display=swap" rel="stylesheet"> <style> @page { size: A4; @top-left { background: {{primary_color}}; content: counte...
psyray
4341d9834865240222a8dc72c01caaec0d7bed44
69231095782663fe0fe8b0e49b8aa995aa042723
Fixed
psyray
4
yogeshojha/rengine
1,100
Fix report generation when `Ignore Informational Vulnerabilities` checked
When **Ignore Informational Vulnerabilities** is checked, info vuln data is still displayed. I've reworked the queries that display vulnerabilities to prevent info vulns from displaying in:
- **Quick summary** Info blue box
- **Reconnaissance Findings**
- **Vulnerabilities Discovered** Info blue box

I've also fi...
null
2023-12-05 01:25:41+00:00
2023-12-08 05:48:36+00:00
web/templates/report/template.html
<html> <head> <meta charset="utf-8"> <title>Report</title> <link href="https://fonts.googleapis.com/css2?family=Inter:wght@100;200;300;400;500&display=swap" rel="stylesheet"> <style> @page { size: A4; @top-left { background: {{primary_color}}; content: counte...
<html> <head> <meta charset="utf-8"> <title>Report</title> <link href="https://fonts.googleapis.com/css2?family=Inter:wght@100;200;300;400;500&display=swap" rel="stylesheet"> <style> @page { size: A4; @top-left { background: {{primary_color}}; content: counte...
psyray
4341d9834865240222a8dc72c01caaec0d7bed44
69231095782663fe0fe8b0e49b8aa995aa042723
same here as well: `{{ all_vulnerabilities|length }}`
yogeshojha
5
yogeshojha/rengine
1,071
Fixes for #1033, #1026, #1027
Fixes
- Fix Dashboard redirection error (fixes #1026)
- Fix message color (red made the message read as an error, which was confusing) (fixes #1027)
- Update nuclei for v3; nuclei v3 requires go 1.21 (fixes #1033)

https://github.com/projectdiscovery/nuclei#install-nuclei
null
2023-11-23 10:33:54+00:00
2023-11-23 13:04:18+00:00
web/scanEngine/templates/scanEngine/lookup.html
{% extends 'base/base.html' %} {% load static %} {% load custom_tags %} {% block title %} Interesting entries Lookup {% endblock title %} {% block custom_js_css_link %} {% endblock custom_js_css_link %} {% block breadcrumb_title %} <li class="breadcrumb-item"><a href="{% url 'scan_engine_index' current_project.slug ...
{% extends 'base/base.html' %} {% load static %} {% load custom_tags %} {% block title %} Interesting entries Lookup {% endblock title %} {% block custom_js_css_link %} {% endblock custom_js_css_link %} {% block breadcrumb_title %} <li class="breadcrumb-item"><a href="{% url 'scan_engine_index' current_project.slug ...
yogeshojha
6c1ec3124b55404eae84c8ac721ad067563b9243
b190060d07e6ed4d6bfd969481ab6d54779c09a0
```suggestion <span class="text-muted">Please use a comma (,) to separate the keywords.</span> ```
AnonymousWP
6
yogeshojha/rengine
1,063
Fix crash on saving endpoint (FFUF related only)
Fix #1006

I've added:
- a **try except** block to catch the error on duplicate records returned by **get_or_create** in the **saving_endpoint** method
- a **check** on endpoint existence in the **dir_file_fuzz** method

Errors are logged to the console with the URL.

![image](https://github.com/yogeshojha/rengine/assets/...
null
2023-11-22 02:57:45+00:00
2023-11-27 12:37:27+00:00
web/reNgine/tasks.py
import csv import json import os import pprint import subprocess import time import validators import whatportis import xmltodict import yaml import tldextract import concurrent.futures from datetime import datetime from urllib.parse import urlparse from api.serializers import SubdomainSerializer from celery import ch...
import csv import json import os import pprint import subprocess import time import validators import whatportis import xmltodict import yaml import tldextract import concurrent.futures from datetime import datetime from urllib.parse import urlparse from api.serializers import SubdomainSerializer from celery import ch...
psyray
7c01a46cea370e74385682ba7c28eaf4e58f5d69
2e089dc62f1bd64aa481750da10fa750e3aa232d
Is this comment still needed?
AnonymousWP
7
yogeshojha/rengine
1,063
Fix crash on saving endpoint (FFUF related only)
Fix #1006 I've added : - a **try except** block to catch error on duplicate record returned by **get_or_create** in **saving_endpoint** method - a **check** on endpoint existence in **dir_file_fuzz** method Errors are logged to the console with the URL. ![image](https://github.com/yogeshojha/rengine/assets/...
null
2023-11-22 02:57:45+00:00
2023-11-27 12:37:27+00:00
web/reNgine/tasks.py
import csv import json import os import pprint import subprocess import time import validators import whatportis import xmltodict import yaml import tldextract import concurrent.futures from datetime import datetime from urllib.parse import urlparse from api.serializers import SubdomainSerializer from celery import ch...
import csv import json import os import pprint import subprocess import time import validators import whatportis import xmltodict import yaml import tldextract import concurrent.futures from datetime import datetime from urllib.parse import urlparse from api.serializers import SubdomainSerializer from celery import ch...
psyray
7c01a46cea370e74385682ba7c28eaf4e58f5d69
2e089dc62f1bd64aa481750da10fa750e3aa232d
Don't know, I could delete it
psyray
8
yogeshojha/rengine
1,063
Fix crash on saving endpoint (FFUF related only)
Fix #1006 I've added : - a **try except** block to catch error on duplicate record returned by **get_or_create** in **saving_endpoint** method - a **check** on endpoint existence in **dir_file_fuzz** method Errors are logged to the console with the URL. ![image](https://github.com/yogeshojha/rengine/assets/...
null
2023-11-22 02:57:45+00:00
2023-11-27 12:37:27+00:00
web/reNgine/tasks.py
import csv import json import os import pprint import subprocess import time import validators import whatportis import xmltodict import yaml import tldextract import concurrent.futures from datetime import datetime from urllib.parse import urlparse from api.serializers import SubdomainSerializer from celery import ch...
import csv import json import os import pprint import subprocess import time import validators import whatportis import xmltodict import yaml import tldextract import concurrent.futures from datetime import datetime from urllib.parse import urlparse from api.serializers import SubdomainSerializer from celery import ch...
psyray
7c01a46cea370e74385682ba7c28eaf4e58f5d69
2e089dc62f1bd64aa481750da10fa750e3aa232d
Not needed, feel free to remove
yogeshojha
9
yogeshojha/rengine
1,063
Fix crash on saving endpoint (FFUF related only)
Fix #1006 I've added : - a **try except** block to catch error on duplicate record returned by **get_or_create** in **saving_endpoint** method - a **check** on endpoint existence in **dir_file_fuzz** method Errors are logged to the console with the URL. ![image](https://github.com/yogeshojha/rengine/assets/...
null
2023-11-22 02:57:45+00:00
2023-11-27 12:37:27+00:00
web/reNgine/tasks.py
import csv import json import os import pprint import subprocess import time import validators import whatportis import xmltodict import yaml import tldextract import concurrent.futures from datetime import datetime from urllib.parse import urlparse from api.serializers import SubdomainSerializer from celery import ch...
import csv import json import os import pprint import subprocess import time import validators import whatportis import xmltodict import yaml import tldextract import concurrent.futures from datetime import datetime from urllib.parse import urlparse from api.serializers import SubdomainSerializer from celery import ch...
psyray
7c01a46cea370e74385682ba7c28eaf4e58f5d69
2e089dc62f1bd64aa481750da10fa750e3aa232d
@psyray Remove it please so I can merge it.
AnonymousWP
10
yogeshojha/rengine
1,058
fix: ffuf ANSI code processing preventing task to finish
Should - [ ] fix #1006 Needs to be tested for potential impact on other tasks (e.g: dalfox)
null
2023-11-21 11:54:34+00:00
2023-11-24 03:10:39+00:00
web/reNgine/tasks.py
import csv import json import os import pprint import subprocess import time import validators import whatportis import xmltodict import yaml import tldextract import concurrent.futures from datetime import datetime from urllib.parse import urlparse from api.serializers import SubdomainSerializer from celery import ch...
import csv import json import os import pprint import subprocess import time import validators import whatportis import xmltodict import yaml import tldextract import concurrent.futures from datetime import datetime from urllib.parse import urlparse from api.serializers import SubdomainSerializer from celery import ch...
ocervell
b557c6b8b70ea554c232095bf2fbb213e6d3648f
0ded32c1bee7852e7fc5daea0fb6de999097400b
## Overly permissive regular expression range Suspicious character range that is equivalent to \[@A-Z\]. [Show more details](https://github.com/yogeshojha/rengine/security/code-scanning/167)
github-advanced-security[bot]
11
yogeshojha/rengine
1,058
fix: ffuf ANSI code processing preventing task to finish
Should - [ ] fix #1006 Needs to be tested for potential impact on other tasks (e.g: dalfox)
null
2023-11-21 11:54:34+00:00
2023-11-24 03:10:39+00:00
web/reNgine/tasks.py
import csv import json import os import pprint import subprocess import time import validators import whatportis import xmltodict import yaml import tldextract import concurrent.futures from datetime import datetime from urllib.parse import urlparse from api.serializers import SubdomainSerializer from celery import ch...
import csv import json import os import pprint import subprocess import time import validators import whatportis import xmltodict import yaml import tldextract import concurrent.futures from datetime import datetime from urllib.parse import urlparse from api.serializers import SubdomainSerializer from celery import ch...
ocervell
b557c6b8b70ea554c232095bf2fbb213e6d3648f
0ded32c1bee7852e7fc5daea0fb6de999097400b
## Overly permissive regular expression range Suspicious character range that is equivalent to \[0-9:;<=>?\]. [Show more details](https://github.com/yogeshojha/rengine/security/code-scanning/168)
github-advanced-security[bot]
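The two code-scanning alerts above flag character ranges in the ffuf ANSI-processing regex as "overly permissive": `[@-Z]` is byte-for-byte equivalent to `[@A-Z]`, and `[0-?]` to `[0-9:;<=>?]`. For ECMA-48 CSI escape sequences those byte ranges are exactly what is intended; writing them out explicitly keeps the same match set while silencing the scanner. A minimal sketch (not the PR's actual code):

```python
import re

# ECMA-48 CSI sequences: ESC '[' + parameter bytes 0x30-0x3F ([0-9:;<=>?])
# + intermediate bytes 0x20-0x2F ([ -/]) + a final byte 0x40-0x7E ([@-~]).
# Spelling the parameter range out explicitly matches the same bytes as
# the flagged [0-?] shorthand, without the "suspicious range" alert.
ANSI_CSI = re.compile(r'\x1b\[[0-9:;<=>?]*[ -/]*[@-~]')

def strip_ansi(text: str) -> str:
    """Remove ANSI CSI escape sequences (colors, cursor moves) from tool output."""
    return ANSI_CSI.sub('', text)
```

For example, `strip_ansi('\x1b[31mFAIL\x1b[0m')` yields plain `FAIL`.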
12
yogeshojha/rengine
973
Add non-interactive installation parameter
Add a non-interactive installation method via a new parameter to be passed to the install.sh script. Essential for automated/industrialized systems (e.g. via Ansible or another automated environment creation system).
null
2023-10-12 01:09:15+00:00
2023-11-21 12:49:22+00:00
Makefile
.DEFAULT_GOAL:=help # Credits: https://github.com/sherifabdlnaby/elastdocker/ # This for future release of Compose that will use Docker Buildkit, which is much efficient. COMPOSE_PREFIX_CMD := COMPOSE_DOCKER_CLI_BUILD=1 COMPOSE_ALL_FILES := -f docker-compose.yml SERVICES := db web proxy redis celery celery-...
include .env .DEFAULT_GOAL:=help # Credits: https://github.com/sherifabdlnaby/elastdocker/ # This for future release of Compose that will use Docker Buildkit, which is much efficient. COMPOSE_PREFIX_CMD := COMPOSE_DOCKER_CLI_BUILD=1 COMPOSE_ALL_FILES := -f docker-compose.yml SERVICES := db web proxy redis c...
C0wnuts
3dd700357a4bd5701b07ede4511f66042655be00
64b7f291240b3b8853e3cec7ee6230827c97b907
What's the addition/benefit of adding the .env file to `Makefile`?
AnonymousWP
13
yogeshojha/rengine
973
Add non-interactive installation parameter
Add a non-interactive installation method via a new parameter to be passed to the install.sh script. Essential for automated/industrialized systems (e.g. via Ansible or another automated environment creation system).
null
2023-10-12 01:09:15+00:00
2023-11-21 12:49:22+00:00
Makefile
.DEFAULT_GOAL:=help # Credits: https://github.com/sherifabdlnaby/elastdocker/ # This for future release of Compose that will use Docker Buildkit, which is much efficient. COMPOSE_PREFIX_CMD := COMPOSE_DOCKER_CLI_BUILD=1 COMPOSE_ALL_FILES := -f docker-compose.yml SERVICES := db web proxy redis celery celery-...
include .env .DEFAULT_GOAL:=help # Credits: https://github.com/sherifabdlnaby/elastdocker/ # This for future release of Compose that will use Docker Buildkit, which is much efficient. COMPOSE_PREFIX_CMD := COMPOSE_DOCKER_CLI_BUILD=1 COMPOSE_ALL_FILES := -f docker-compose.yml SERVICES := db web proxy redis c...
C0wnuts
3dd700357a4bd5701b07ede4511f66042655be00
64b7f291240b3b8853e3cec7ee6230827c97b907
To give it access to variables located in the .env and avoid writing the username/email directly in the Makefile. This way, all configuration elements remain centralized in the .env file. Following the Django documentation, for non-interactive createsuperuser process, you need to specify --username and --email argumen...
C0wnuts
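The explanation above — centralizing credentials in `.env` so the Makefile can run `createsuperuser` non-interactively — rests on Django's behavior: with `--noinput`, username and email come from CLI arguments and the password is read from the `DJANGO_SUPERUSER_PASSWORD` environment variable. A hedged sketch of how such a call could be assembled (variable names follow the thread's `.env` keys; this is not the PR's literal code):

```python
import os
import subprocess

def build_createsuperuser_cmd(env: dict) -> list:
    """Build the non-interactive `manage.py createsuperuser` command.

    With --noinput, Django takes --username/--email from the arguments and
    reads DJANGO_SUPERUSER_PASSWORD from the process environment, so no
    credential needs to be hard-coded in the Makefile itself.
    """
    cmd = [
        'python3', 'manage.py', 'createsuperuser', '--noinput',
        '--username', env['DJANGO_SUPERUSER_USERNAME'],
        '--email', env['DJANGO_SUPERUSER_EMAIL'],
    ]
    # Real invocation (commented out here so the sketch stays side-effect free):
    # subprocess.run(cmd, env={**os.environ, **env}, check=True)
    return cmd
```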
14
yogeshojha/rengine
973
Add non-interactive installation parameter
Add a non-interactive installation method via a new parameter to be passed to the install.sh script. Essential for automated/industrialized systems (e.g. via Ansible or another automated environment creation system).
null
2023-10-12 01:09:15+00:00
2023-11-21 12:49:22+00:00
README.md
<p align="center"> <a href="https://rengine.wiki"><img src=".github/screenshots/banner.gif" alt=""/></a> </p> <p align="center"><a href="https://github.com/yogeshojha/rengine/releases" target="_blank"><img src="https://img.shields.io/badge/version-v2.0.0-informational?&logo=none" alt="reNgine Latest Version" /></a>&nb...
<p align="center"> <a href="https://rengine.wiki"><img src=".github/screenshots/banner.gif" alt=""/></a> </p> <p align="center"><a href="https://github.com/yogeshojha/rengine/releases" target="_blank"><img src="https://img.shields.io/badge/version-v2.0.0-informational?&logo=none" alt="reNgine Latest Version" /></a>&nb...
C0wnuts
3dd700357a4bd5701b07ede4511f66042655be00
64b7f291240b3b8853e3cec7ee6230827c97b907
```suggestion 1. Edit the `.env` file, **please make sure to change the password for postgresql `POSTGRES_PASSWORD`!** ```
AnonymousWP
15
yogeshojha/rengine
973
Add non-interactive installation parameter
Add a non-interactive installation method via a new parameter to be passed to the install.sh script. Essential for automated/industrialized systems (e.g. via Ansible or another automated environment creation system).
null
2023-10-12 01:09:15+00:00
2023-11-21 12:49:22+00:00
README.md
<p align="center"> <a href="https://rengine.wiki"><img src=".github/screenshots/banner.gif" alt=""/></a> </p> <p align="center"><a href="https://github.com/yogeshojha/rengine/releases" target="_blank"><img src="https://img.shields.io/badge/version-v2.0.0-informational?&logo=none" alt="reNgine Latest Version" /></a>&nb...
<p align="center"> <a href="https://rengine.wiki"><img src=".github/screenshots/banner.gif" alt=""/></a> </p> <p align="center"><a href="https://github.com/yogeshojha/rengine/releases" target="_blank"><img src="https://img.shields.io/badge/version-v2.0.0-informational?&logo=none" alt="reNgine Latest Version" /></a>&nb...
C0wnuts
3dd700357a4bd5701b07ede4511f66042655be00
64b7f291240b3b8853e3cec7ee6230827c97b907
```suggestion 1. **Optional, only for non-interactive install**: In the `.env` file, **please make sure to change the super admin values!** ```
AnonymousWP
16
yogeshojha/rengine
973
Add non-interactive installation parameter
Add a non-interactive installation method via a new parameter to be passed to the install.sh script. Essential for automated/industrialized systems (e.g. via Ansible or another automated environment creation system).
null
2023-10-12 01:09:15+00:00
2023-11-21 12:49:22+00:00
README.md
<p align="center"> <a href="https://rengine.wiki"><img src=".github/screenshots/banner.gif" alt=""/></a> </p> <p align="center"><a href="https://github.com/yogeshojha/rengine/releases" target="_blank"><img src="https://img.shields.io/badge/version-v2.0.0-informational?&logo=none" alt="reNgine Latest Version" /></a>&nb...
<p align="center"> <a href="https://rengine.wiki"><img src=".github/screenshots/banner.gif" alt=""/></a> </p> <p align="center"><a href="https://github.com/yogeshojha/rengine/releases" target="_blank"><img src="https://img.shields.io/badge/version-v2.0.0-informational?&logo=none" alt="reNgine Latest Version" /></a>&nb...
C0wnuts
3dd700357a4bd5701b07ede4511f66042655be00
64b7f291240b3b8853e3cec7ee6230827c97b907
```suggestion `DJANGO_SUPERUSER_USERNAME`: web interface admin username (used to login to the web interface). ```
AnonymousWP
17
yogeshojha/rengine
973
Add non-interactive installation parameter
Add a non-interactive installation method via a new parameter to be passed to the install.sh script. Essential for automated/industrialized systems (e.g. via Ansible or another automated environment creation system).
null
2023-10-12 01:09:15+00:00
2023-11-21 12:49:22+00:00
README.md
<p align="center"> <a href="https://rengine.wiki"><img src=".github/screenshots/banner.gif" alt=""/></a> </p> <p align="center"><a href="https://github.com/yogeshojha/rengine/releases" target="_blank"><img src="https://img.shields.io/badge/version-v2.0.0-informational?&logo=none" alt="reNgine Latest Version" /></a>&nb...
<p align="center"> <a href="https://rengine.wiki"><img src=".github/screenshots/banner.gif" alt=""/></a> </p> <p align="center"><a href="https://github.com/yogeshojha/rengine/releases" target="_blank"><img src="https://img.shields.io/badge/version-v2.0.0-informational?&logo=none" alt="reNgine Latest Version" /></a>&nb...
C0wnuts
3dd700357a4bd5701b07ede4511f66042655be00
64b7f291240b3b8853e3cec7ee6230827c97b907
```suggestion `DJANGO_SUPERUSER_EMAIL`: web interface admin email. ```
AnonymousWP
18
yogeshojha/rengine
973
Add non-interactive installation parameter
Add a non-interactive installation method via a new parameter to be passed to the install.sh script. Essential for automated/industrialized systems (e.g. via Ansible or another automated environment creation system).
null
2023-10-12 01:09:15+00:00
2023-11-21 12:49:22+00:00
README.md
<p align="center"> <a href="https://rengine.wiki"><img src=".github/screenshots/banner.gif" alt=""/></a> </p> <p align="center"><a href="https://github.com/yogeshojha/rengine/releases" target="_blank"><img src="https://img.shields.io/badge/version-v2.0.0-informational?&logo=none" alt="reNgine Latest Version" /></a>&nb...
<p align="center"> <a href="https://rengine.wiki"><img src=".github/screenshots/banner.gif" alt=""/></a> </p> <p align="center"><a href="https://github.com/yogeshojha/rengine/releases" target="_blank"><img src="https://img.shields.io/badge/version-v2.0.0-informational?&logo=none" alt="reNgine Latest Version" /></a>&nb...
C0wnuts
3dd700357a4bd5701b07ede4511f66042655be00
64b7f291240b3b8853e3cec7ee6230827c97b907
```suggestion `DJANGO_SUPERUSER_PASSWORD`: web interface admin password (used to login to the web interface). ```
AnonymousWP
19
yogeshojha/rengine
973
Add non-interactive installation parameter
Add a non-interactive installation method via a new parameter to be passed to the install.sh script. Essential for automated/industrialized systems (e.g. via Ansible or another automated environment creation system).
null
2023-10-12 01:09:15+00:00
2023-11-21 12:49:22+00:00
README.md
<p align="center"> <a href="https://rengine.wiki"><img src=".github/screenshots/banner.gif" alt=""/></a> </p> <p align="center"><a href="https://github.com/yogeshojha/rengine/releases" target="_blank"><img src="https://img.shields.io/badge/version-v2.0.0-informational?&logo=none" alt="reNgine Latest Version" /></a>&nb...
<p align="center"> <a href="https://rengine.wiki"><img src=".github/screenshots/banner.gif" alt=""/></a> </p> <p align="center"><a href="https://github.com/yogeshojha/rengine/releases" target="_blank"><img src="https://img.shields.io/badge/version-v2.0.0-informational?&logo=none" alt="reNgine Latest Version" /></a>&nb...
C0wnuts
3dd700357a4bd5701b07ede4511f66042655be00
64b7f291240b3b8853e3cec7ee6230827c97b907
```suggestion Or for a non-interactive installation, use `-n` argument (make sure you've modified the `.env` file before launching the installation). ```
AnonymousWP
20
yogeshojha/rengine
963
2.0-jasper release
### Added - Projects: Projects allow you to efficiently organize their web application reconnaissance efforts. With this feature, you can create distinct project spaces, each tailored to a specific purpose, such as personal bug bounty hunting, client engagements, or any other specialized recon task. - Roles and Pe...
null
2023-10-02 07:51:35+00:00
2023-10-07 10:37:23+00:00
README.md
<p align="center"> <a href="https://rengine.wiki"><img src=".github/screenshots/banner.gif" alt=""/></a> </p> <p align="center"><a href="https://github.com/yogeshojha/rengine/releases" target="_blank"><img src="https://img.shields.io/badge/version-v1.2.0-informational?&logo=none" alt="reNgine Latest Version" /></a>&nb...
<p align="center"> <a href="https://rengine.wiki"><img src=".github/screenshots/banner.gif" alt=""/></a> </p> <p align="center"><a href="https://github.com/yogeshojha/rengine/releases" target="_blank"><img src="https://img.shields.io/badge/version-v2.0.0-informational?&logo=none" alt="reNgine Latest Version" /></a>&nb...
yogeshojha
3c60bc1ee495044794d91edee0c96fff73ab46c7
5413708d243799a5271440c47c6f98d0c51154ca
The hyperlinks seem incorrect or I am mentioned by accident. :p
AnonymousWP
21
yogeshojha/rengine
963
2.0-jasper release
### Added - Projects: Projects allow you to efficiently organize their web application reconnaissance efforts. With this feature, you can create distinct project spaces, each tailored to a specific purpose, such as personal bug bounty hunting, client engagements, or any other specialized recon task. - Roles and Pe...
null
2023-10-02 07:51:35+00:00
2023-10-07 10:37:23+00:00
README.md
<p align="center"> <a href="https://rengine.wiki"><img src=".github/screenshots/banner.gif" alt=""/></a> </p> <p align="center"><a href="https://github.com/yogeshojha/rengine/releases" target="_blank"><img src="https://img.shields.io/badge/version-v1.2.0-informational?&logo=none" alt="reNgine Latest Version" /></a>&nb...
<p align="center"> <a href="https://rengine.wiki"><img src=".github/screenshots/banner.gif" alt=""/></a> </p> <p align="center"><a href="https://github.com/yogeshojha/rengine/releases" target="_blank"><img src="https://img.shields.io/badge/version-v2.0.0-informational?&logo=none" alt="reNgine Latest Version" /></a>&nb...
yogeshojha
3c60bc1ee495044794d91edee0c96fff73ab46c7
5413708d243799a5271440c47c6f98d0c51154ca
Ditto.
AnonymousWP
22
yogeshojha/rengine
963
2.0-jasper release
### Added - Projects: Projects allow you to efficiently organize their web application reconnaissance efforts. With this feature, you can create distinct project spaces, each tailored to a specific purpose, such as personal bug bounty hunting, client engagements, or any other specialized recon task. - Roles and Pe...
null
2023-10-02 07:51:35+00:00
2023-10-07 10:37:23+00:00
README.md
<p align="center"> <a href="https://rengine.wiki"><img src=".github/screenshots/banner.gif" alt=""/></a> </p> <p align="center"><a href="https://github.com/yogeshojha/rengine/releases" target="_blank"><img src="https://img.shields.io/badge/version-v1.2.0-informational?&logo=none" alt="reNgine Latest Version" /></a>&nb...
<p align="center"> <a href="https://rengine.wiki"><img src=".github/screenshots/banner.gif" alt=""/></a> </p> <p align="center"><a href="https://github.com/yogeshojha/rengine/releases" target="_blank"><img src="https://img.shields.io/badge/version-v2.0.0-informational?&logo=none" alt="reNgine Latest Version" /></a>&nb...
yogeshojha
3c60bc1ee495044794d91edee0c96fff73ab46c7
5413708d243799a5271440c47c6f98d0c51154ca
Thank you for pointing out ;p
yogeshojha
23
yogeshojha/rengine
963
2.0-jasper release
### Added - Projects: Projects allow you to efficiently organize their web application reconnaissance efforts. With this feature, you can create distinct project spaces, each tailored to a specific purpose, such as personal bug bounty hunting, client engagements, or any other specialized recon task. - Roles and Pe...
null
2023-10-02 07:51:35+00:00
2023-10-07 10:37:23+00:00
web/dashboard/views.py
from datetime import timedelta from targetApp.models import Domain from startScan.models import * from django.utils import timezone from django.shortcuts import render, redirect from django.http import HttpResponse from django.db.models.functions import TruncDay from django.contrib.auth.decorators import login_requir...
import json import logging from datetime import timedelta from django.contrib.auth import get_user_model from django.contrib import messages from django.contrib.auth import update_session_auth_hash from django.contrib.auth.forms import PasswordChangeForm from django.contrib.auth.signals import user_logged_in, user_lo...
yogeshojha
3c60bc1ee495044794d91edee0c96fff73ab46c7
5413708d243799a5271440c47c6f98d0c51154ca
## Information exposure through an exception [Stack trace information](1) flows to this location and may be exposed to an external user. [Stack trace information](2) flows to this location and may be exposed to an external user. [Show more details](https://github.com/yogeshojha/rengine/security/code-scanning/148)
github-advanced-security[bot]
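The alert above concerns stack-trace information flowing into a response visible to external users. The standard remedy is to log the full traceback server-side and hand the client only a generic message. A minimal illustrative sketch (not the project's actual handler):

```python
import logging
import traceback

logger = logging.getLogger(__name__)

def safe_error_response(exc: Exception) -> dict:
    """Log the full traceback server-side, but return only a generic
    message, so stack traces never reach an external user."""
    logger.error('Unhandled error: %s\n%s', exc, traceback.format_exc())
    return {'status': False, 'error': 'An internal error occurred.'}
```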
24
yogeshojha/rengine
963
2.0-jasper release
### Added - Projects: Projects allow you to efficiently organize their web application reconnaissance efforts. With this feature, you can create distinct project spaces, each tailored to a specific purpose, such as personal bug bounty hunting, client engagements, or any other specialized recon task. - Roles and Pe...
null
2023-10-02 07:51:35+00:00
2023-10-07 10:37:23+00:00
web/startScan/templates/startScan/history.html
{% extends 'base/base.html' %} {% load static %} {% load humanize %} {% block title %} Scan history {% endblock title %} {% block custom_js_css_link %} <link rel="stylesheet" type="text/css" href="{% static 'plugins/datatable/datatables.css' %}"> <link rel="stylesheet" type="text/css" href="{% static 'plugins/datatab...
{% extends 'base/base.html' %} {% load static %} {% load humanize %} {% load permission_tags %} {% block title %} Scan history {% endblock title %} {% block custom_js_css_link %} {% endblock custom_js_css_link %} {% block breadcrumb_title %} <li class="breadcrumb-item active" aria-current="page">Scan History</li> {%...
yogeshojha
3c60bc1ee495044794d91edee0c96fff73ab46c7
5413708d243799a5271440c47c6f98d0c51154ca
## DOM text reinterpreted as HTML [DOM text](1) is reinterpreted as HTML without escaping meta-characters. [Show more details](https://github.com/yogeshojha/rengine/security/code-scanning/153)
github-advanced-security[bot]
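This alert (repeated for several spots in `history.html`) fires when scan-derived text is inserted into the DOM without escaping meta-characters. The fix in any layer is the same principle: escape before embedding. A server-side Python analog of that principle (the template's own fix would be in JavaScript, so treat this only as an illustration):

```python
from html import escape

def render_cell(value) -> str:
    """Escape scan-derived text before embedding it in markup, so a
    hostile subdomain like '<img src=x onerror=...>' renders as text,
    not as live HTML."""
    return escape(str(value), quote=True)
```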
25
yogeshojha/rengine
963
2.0-jasper release
### Added - Projects: Projects allow you to efficiently organize their web application reconnaissance efforts. With this feature, you can create distinct project spaces, each tailored to a specific purpose, such as personal bug bounty hunting, client engagements, or any other specialized recon task. - Roles and Pe...
null
2023-10-02 07:51:35+00:00
2023-10-07 10:37:23+00:00
web/startScan/templates/startScan/history.html
{% extends 'base/base.html' %} {% load static %} {% load humanize %} {% block title %} Scan history {% endblock title %} {% block custom_js_css_link %} <link rel="stylesheet" type="text/css" href="{% static 'plugins/datatable/datatables.css' %}"> <link rel="stylesheet" type="text/css" href="{% static 'plugins/datatab...
{% extends 'base/base.html' %} {% load static %} {% load humanize %} {% load permission_tags %} {% block title %} Scan history {% endblock title %} {% block custom_js_css_link %} {% endblock custom_js_css_link %} {% block breadcrumb_title %} <li class="breadcrumb-item active" aria-current="page">Scan History</li> {%...
yogeshojha
3c60bc1ee495044794d91edee0c96fff73ab46c7
5413708d243799a5271440c47c6f98d0c51154ca
## DOM text reinterpreted as HTML [DOM text](1) is reinterpreted as HTML without escaping meta-characters. [Show more details](https://github.com/yogeshojha/rengine/security/code-scanning/154)
github-advanced-security[bot]
26
yogeshojha/rengine
963
2.0-jasper release
### Added - Projects: Projects allow you to efficiently organize their web application reconnaissance efforts. With this feature, you can create distinct project spaces, each tailored to a specific purpose, such as personal bug bounty hunting, client engagements, or any other specialized recon task. - Roles and Pe...
null
2023-10-02 07:51:35+00:00
2023-10-07 10:37:23+00:00
web/startScan/templates/startScan/history.html
{% extends 'base/base.html' %} {% load static %} {% load humanize %} {% block title %} Scan history {% endblock title %} {% block custom_js_css_link %} <link rel="stylesheet" type="text/css" href="{% static 'plugins/datatable/datatables.css' %}"> <link rel="stylesheet" type="text/css" href="{% static 'plugins/datatab...
{% extends 'base/base.html' %} {% load static %} {% load humanize %} {% load permission_tags %} {% block title %} Scan history {% endblock title %} {% block custom_js_css_link %} {% endblock custom_js_css_link %} {% block breadcrumb_title %} <li class="breadcrumb-item active" aria-current="page">Scan History</li> {%...
yogeshojha
3c60bc1ee495044794d91edee0c96fff73ab46c7
5413708d243799a5271440c47c6f98d0c51154ca
## DOM text reinterpreted as HTML [DOM text](1) is reinterpreted as HTML without escaping meta-characters. [Show more details](https://github.com/yogeshojha/rengine/security/code-scanning/155)
github-advanced-security[bot]
27
yogeshojha/rengine
963
2.0-jasper release
### Added - Projects: Projects allow you to efficiently organize their web application reconnaissance efforts. With this feature, you can create distinct project spaces, each tailored to a specific purpose, such as personal bug bounty hunting, client engagements, or any other specialized recon task. - Roles and Pe...
null
2023-10-02 07:51:35+00:00
2023-10-07 10:37:23+00:00
web/startScan/templates/startScan/history.html
{% extends 'base/base.html' %} {% load static %} {% load humanize %} {% block title %} Scan history {% endblock title %} {% block custom_js_css_link %} <link rel="stylesheet" type="text/css" href="{% static 'plugins/datatable/datatables.css' %}"> <link rel="stylesheet" type="text/css" href="{% static 'plugins/datatab...
{% extends 'base/base.html' %} {% load static %} {% load humanize %} {% load permission_tags %} {% block title %} Scan history {% endblock title %} {% block custom_js_css_link %} {% endblock custom_js_css_link %} {% block breadcrumb_title %} <li class="breadcrumb-item active" aria-current="page">Scan History</li> {%...
yogeshojha
3c60bc1ee495044794d91edee0c96fff73ab46c7
5413708d243799a5271440c47c6f98d0c51154ca
## DOM text reinterpreted as HTML [DOM text](1) is reinterpreted as HTML without escaping meta-characters. [Show more details](https://github.com/yogeshojha/rengine/security/code-scanning/156)
github-advanced-security[bot]
28
yogeshojha/rengine
814
Fixes required to get install script working
Update celery version to 5.2.7 in requirements.txt Update go version to 1.20 in Dockerfile These changes are required for install.sh to complete on Ubuntu 22.04.1 LTS (GNU/Linux 5.15.0-66-generic x86_64) (digital ocean droplet). Note: Saw some warning/maybe error related to whatisport (or similar) pypi package. ...
null
2023-02-12 05:24:03+00:00
2023-03-02 17:25:01+00:00
web/Dockerfile
# Base image FROM ubuntu:20.04 # Labels and Credits LABEL \ name="reNgine" \ author="Yogesh Ojha <yogesh.ojha11@gmail.com>" \ description="reNgine is a automated pipeline of recon process, useful for information gathering during web application penetration testing." # Environment Variables ENV DEBIAN_FRON...
# Base image FROM ubuntu:20.04 # Labels and Credits LABEL \ name="reNgine" \ author="Yogesh Ojha <yogesh.ojha11@gmail.com>" \ description="reNgine is a automated pipeline of recon process, useful for information gathering during web application penetration testing." # Environment Variables ENV DEBIAN_FRON...
m00tiny
5e3c04c336d3798fbff20f362a8091dface53203
a0ce6a270a30e8e3c224b6141555a530b9b6c50e
- RUN wget https://go.dev/dl/go1.20.1.linux-amd64.tar.gz
cybersaki
29
yogeshojha/rengine
680
Release/1.3.1
# Fixes - Fix for #643 Downloading issue for Subdomain and Endpoints - Fix for #627 Too many Targets causes issues while loading datatable - Fix version Numbering issue
null
2022-08-12 12:46:29+00:00
2022-08-12 12:57:24+00:00
web/targetApp/templates/target/list.html
{% extends 'base/base.html' %} {% load static %} {% load humanize %} {% block title %} List all targets {% endblock title %} {% block custom_js_css_link %} <link rel="stylesheet" type="text/css" href="{% static 'plugins/datatable/datatables.css' %}"> <link rel="stylesheet" type="text/css" href="{% static 'plugins/dat...
{% extends 'base/base.html' %} {% load static %} {% load humanize %} {% block title %} List all targets {% endblock title %} {% block custom_js_css_link %} <link rel="stylesheet" type="text/css" href="{% static 'plugins/datatable/datatables.css' %}"> <link rel="stylesheet" type="text/css" href="{% static 'plugins/dat...
yogeshojha
758debc4e79b5dc3f1ee29fcabcacb8e15656a94
0caa3a6f04a26f9f3554e1617c7f369a9b10330e
## DOM text reinterpreted as HTML [DOM text](1) is reinterpreted as HTML without escaping meta-characters. [Show more details](https://github.com/yogeshojha/rengine/security/code-scanning/135)
github-advanced-security[bot]
30
yogeshojha/rengine
664
Release/1.3.0
## 1.3.0 **Release Date: July 11, 2022** ## Added - Geographic Distribution of Assets Map ## Fixes - WHOIS Provider Changed - Fixed Dark UI Issues - Fix HTTPX Issue
null
2022-07-10 17:49:40+00:00
2022-07-18 19:30:04+00:00
web/static/custom/custom.js
function checkall(clickchk, relChkbox) { var checker = $('#' + clickchk); var multichk = $('.' + relChkbox); checker.click(function() { multichk.prop('checked', $(this).prop('checked')); }); } function multiCheck(tb_var) { tb_var.on("change", ".chk-parent", function() { var e = $(this).closest("table").find(...
function checkall(clickchk, relChkbox) { var checker = $('#' + clickchk); var multichk = $('.' + relChkbox); checker.click(function() { multichk.prop('checked', $(this).prop('checked')); }); } function multiCheck(tb_var) { tb_var.on("change", ".chk-parent", function() { var e = $(this).closest("table").find(...
yogeshojha
9fb69660d763b79e6ab505099c7e4cd58f19761c
18be197fed32ce87979564bb50f002e46290bc3f
## Inefficient regular expression This part of the regular expression may cause exponential backtracking on strings starting with '0' and containing many repetitions of '0'. [Show more details](https://github.com/yogeshojha/rengine/security/code-scanning/134)
github-advanced-security[bot]
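The alert above describes catastrophic backtracking: a nested quantifier such as `(0+)+$` forces the engine to try exponentially many partitions of a long run of `'0'` characters when the match ultimately fails. An equivalent pattern without the nesting matches the same strings in linear time. A small sketch of the safe form (illustrative, not the flagged expression from `custom.js`):

```python
import re

# A pattern like (0+)+$ backtracks exponentially on inputs such as
# '0' * 30 + 'x'.  The un-nested equivalent 0+$ accepts exactly the
# same strings without the blow-up.
SAFE_TRAILING_ZEROS = re.compile(r'0+$')

def ends_in_zeros(s: str) -> bool:
    """True if the string ends with one or more '0' characters."""
    return SAFE_TRAILING_ZEROS.search(s) is not None
```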
31
yogeshojha/rengine
530
Fix #529
Nuclei returns the response to stdout: `{"template-id":"tech-detect","info":{"name":"Wappalyzer Technology Detection","author":["hakluke"],"tags":["tech"],"reference":null,"severity":"info"},"matcher-name":"nginx","type":"http","host":"https://example.com:443","matched-at":"https://example.com:443","timestamp":"2021-1...
null
2021-10-31 10:27:33+00:00
2021-11-01 16:58:16+00:00
web/reNgine/tasks.py
import os import traceback import yaml import json import csv import validators import random import requests import logging import metafinder.extractor as metadata_extractor import whatportis import subprocess from selenium.webdriver.firefox.options import Options as FirefoxOptions from selenium import webdriver fro...
import os import traceback import yaml import json import csv import validators import random import requests import logging import metafinder.extractor as metadata_extractor import whatportis import subprocess from selenium.webdriver.firefox.options import Options as FirefoxOptions from selenium import webdriver fro...
radaram
43af3a6aecdece4923ee74b108853f7b9c51ed12
27d6ec5827a51fd74e3ab97a5cef38fc7f5d9168
But we can remove this; I'm not sure if this `matched` is returned by some specific version.
yogeshojha
32
yogeshojha/rengine
530
Fix #529
Nuclei returns the response to stdout: `{"template-id":"tech-detect","info":{"name":"Wappalyzer Technology Detection","author":["hakluke"],"tags":["tech"],"reference":null,"severity":"info"},"matcher-name":"nginx","type":"http","host":"https://example.com:443","matched-at":"https://example.com:443","timestamp":"2021-1...
null
2021-10-31 10:27:33+00:00
2021-11-01 16:58:16+00:00
web/reNgine/tasks.py
import os import traceback import yaml import json import csv import validators import random import requests import logging import metafinder.extractor as metadata_extractor import whatportis import subprocess from selenium.webdriver.firefox.options import Options as FirefoxOptions from selenium import webdriver fro...
import os import traceback import yaml import json import csv import validators import random import requests import logging import metafinder.extractor as metadata_extractor import whatportis import subprocess from selenium.webdriver.firefox.options import Options as FirefoxOptions from selenium import webdriver fro...
radaram
43af3a6aecdece4923ee74b108853f7b9c51ed12
27d6ec5827a51fd74e3ab97a5cef38fc7f5d9168
Can you please confirm if it is always returning `matched-at` in the latest Nuclei release? Thanks
yogeshojha
33
yogeshojha/rengine
530
Fix #529
Nuclei returns the response to stdout: `{"template-id":"tech-detect","info":{"name":"Wappalyzer Technology Detection","author":["hakluke"],"tags":["tech"],"reference":null,"severity":"info"},"matcher-name":"nginx","type":"http","host":"https://example.com:443","matched-at":"https://example.com:443","timestamp":"2021-1...
null
2021-10-31 10:27:33+00:00
2021-11-01 16:58:16+00:00
web/reNgine/tasks.py
import os import traceback import yaml import json import csv import validators import random import requests import logging import metafinder.extractor as metadata_extractor import whatportis import subprocess from selenium.webdriver.firefox.options import Options as FirefoxOptions from selenium import webdriver fro...
import os import traceback import yaml import json import csv import validators import random import requests import logging import metafinder.extractor as metadata_extractor import whatportis import subprocess from selenium.webdriver.firefox.options import Options as FirefoxOptions from selenium import webdriver fro...
radaram
43af3a6aecdece4923ee74b108853f7b9c51ed12
27d6ec5827a51fd74e3ab97a5cef38fc7f5d9168
Yes, of course. In Nuclei 2.5.2 the key passed was `matched`: https://github.com/projectdiscovery/nuclei/blob/v2.5.2/v2/pkg/output/output.go#L78 `Matched string json:"matched,omitempty"` In Nuclei 2.5.3 (the latest version) the key passed is `matched-at`: `Matched string json:"matched-at,omitempty"` https://github.com/projectdiscover...
radaram
34
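The version difference described above can be handled by accepting either key. A minimal sketch — the helper name is mine, not from the PR:

```python
import json

def get_matched_url(line):
    """Return the matched URL from one line of Nuclei JSON output,
    accepting both the pre-2.5.3 'matched' key and the 2.5.3+
    'matched-at' key."""
    result = json.loads(line)
    return result.get('matched-at') or result.get('matched')

old_style = '{"template-id": "tech-detect", "matched": "https://example.com:443"}'
new_style = '{"template-id": "tech-detect", "matched-at": "https://example.com:443"}'
assert get_matched_url(old_style) == "https://example.com:443"
assert get_matched_url(new_style) == "https://example.com:443"
```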
yogeshojha/rengine
530
Fix #529
Nuclei returns the response to stdout: `{"template-id":"tech-detect","info":{"name":"Wappalyzer Technology Detection","author":["hakluke"],"tags":["tech"],"reference":null,"severity":"info"},"matcher-name":"nginx","type":"http","host":"https://example.com:443","matched-at":"https://example.com:443","timestamp":"2021-1...
null
2021-10-31 10:27:33+00:00
2021-11-01 16:58:16+00:00
web/reNgine/tasks.py
import os import traceback import yaml import json import csv import validators import random import requests import logging import metafinder.extractor as metadata_extractor import whatportis import subprocess from selenium.webdriver.firefox.options import Options as FirefoxOptions from selenium import webdriver fro...
import os import traceback import yaml import json import csv import validators import random import requests import logging import metafinder.extractor as metadata_extractor import whatportis import subprocess from selenium.webdriver.firefox.options import Options as FirefoxOptions from selenium import webdriver fro...
radaram
43af3a6aecdece4923ee74b108853f7b9c51ed12
27d6ec5827a51fd74e3ab97a5cef38fc7f5d9168
Can I delete the old condition `if 'matched' in json_st`?
radaram
35
yogeshojha/rengine
527
web/Dockerfile: Update Go to v1.17 and add command to update Nuclei & Nuclei Templates
- Starting in Go 1.17, installing executables with go get is deprecated. go install may be used instead. [Deprecation of 'go get' for installing executables](https://golang.org/doc/go-get-install-deprecation). - Install and update Go package with `go install -v example.com/cmd@latest` or `GO111MODULE=on go install -v...
null
2021-10-25 09:33:01+00:00
2021-12-14 03:04:58+00:00
web/Dockerfile
# Base image FROM ubuntu:20.04 # Labels and Credits LABEL \ name="reNgine" \ author="Yogesh Ojha <yogesh.ojha11@gmail.com>" \ description="reNgine is a automated pipeline of recon process, useful for information gathering during web application penetration testing." # Environment Variables ENV DEBIAN_FRON...
# Base image FROM ubuntu:20.04 # Labels and Credits LABEL \ name="reNgine" \ author="Yogesh Ojha <yogesh.ojha11@gmail.com>" \ description="reNgine is a automated pipeline of recon process, useful for information gathering during web application penetration testing." # Environment Variables ENV DEBIAN_FRON...
0x71rex
c46402b6381c17252e27baf0b1961849c51439a0
cf30e98e0440424019cb2cad600892ce405f850e
Sorry, my bad. What a silly mistake, there was an extra '-v' on line 74 :grin:
0x71rex
36
yogeshojha/rengine
527
web/Dockerfile: Update Go to v1.17 and add command to update Nuclei & Nuclei Templates
- Starting in Go 1.17, installing executables with go get is deprecated. go install may be used instead. [Deprecation of 'go get' for installing executables](https://golang.org/doc/go-get-install-deprecation). - Install and update Go package with `go install -v example.com/cmd@latest` or `GO111MODULE=on go install -v...
null
2021-10-25 09:33:01+00:00
2021-12-14 03:04:58+00:00
web/Dockerfile
# Base image FROM ubuntu:20.04 # Labels and Credits LABEL \ name="reNgine" \ author="Yogesh Ojha <yogesh.ojha11@gmail.com>" \ description="reNgine is a automated pipeline of recon process, useful for information gathering during web application penetration testing." # Environment Variables ENV DEBIAN_FRON...
# Base image FROM ubuntu:20.04 # Labels and Credits LABEL \ name="reNgine" \ author="Yogesh Ojha <yogesh.ojha11@gmail.com>" \ description="reNgine is a automated pipeline of recon process, useful for information gathering during web application penetration testing." # Environment Variables ENV DEBIAN_FRON...
0x71rex
c46402b6381c17252e27baf0b1961849c51439a0
cf30e98e0440424019cb2cad600892ce405f850e
Haven't removed the second "-v" on line 74 btw. 😁
0x71rex
37
yogeshojha/rengine
468
Installation: Check docker running status before installing reNgine.
### Changes - checks docker run status. Before it used to execute all `make build` commands even if docker was not running. - Early error shown to the user - Terminal text color changed - made this sentence "Before running this script, please make sure Docker is running and you have made changes to .env file....
null
2021-08-24 16:52:53+00:00
2021-08-27 02:54:16+00:00
install.sh
#!/bin/bash tput setaf 2; cat web/art/1.0.txt tput setaf 3; echo "Before running this script, please make sure you have made changes to .env file." tput setaf 1; echo "Changing the postgres username & password from .env is highly recommended." tput setaf 4; read -p "Are you sure, you made changes to .env file (y/n)?...
#!/bin/bash tput setaf 2; cat web/art/1.0.txt tput setaf 1; echo "Before running this script, please make sure Docker is running and you have made changes to .env file." tput setaf 2; echo "Changing the postgres username & password from .env is highly recommended." tput setaf 4; read -p "Are you sure, you made chang...
sbimochan
2bd2219659fcf0f0541fc4879bd69bfa79a500c7
e98433517e4a6198e6e2208fdf1b324f41be5bcb
It won't run on macOS or Windows.
sbimochan
38
yogeshojha/rengine
468
Installation: Check docker running status before installing reNgine.
### Changes - checks docker run status. Before it used to execute all `make build` commands even if docker was not running. - Early error shown to the user - Terminal text color changed - made this sentence "Before running this script, please make sure Docker is running and you have made changes to .env file....
null
2021-08-24 16:52:53+00:00
2021-08-27 02:54:16+00:00
install.sh
#!/bin/bash tput setaf 2; cat web/art/1.0.txt tput setaf 3; echo "Before running this script, please make sure you have made changes to .env file." tput setaf 1; echo "Changing the postgres username & password from .env is highly recommended." tput setaf 4; read -p "Are you sure, you made changes to .env file (y/n)?...
#!/bin/bash tput setaf 2; cat web/art/1.0.txt tput setaf 1; echo "Before running this script, please make sure Docker is running and you have made changes to .env file." tput setaf 2; echo "Changing the postgres username & password from .env is highly recommended." tput setaf 4; read -p "Are you sure, you made chang...
sbimochan
2bd2219659fcf0f0541fc4879bd69bfa79a500c7
e98433517e4a6198e6e2208fdf1b324f41be5bcb
This quick install script is only for Ubuntu/Debian bases OS. https://rengine.wiki/install/quick-install/ Other OS will continue to use https://rengine.wiki/install/install/
yogeshojha
39
yogeshojha/rengine
468
Installation: Check docker running status before installing reNgine.
### Changes - checks docker run status. Before it used to execute all `make build` commands even if docker was not running. - Early error shown to the user - Terminal text color changed - made this sentence "Before running this script, please make sure Docker is running and you have made changes to .env file....
null
2021-08-24 16:52:53+00:00
2021-08-27 02:54:16+00:00
install.sh
#!/bin/bash tput setaf 2; cat web/art/1.0.txt tput setaf 3; echo "Before running this script, please make sure you have made changes to .env file." tput setaf 1; echo "Changing the postgres username & password from .env is highly recommended." tput setaf 4; read -p "Are you sure, you made changes to .env file (y/n)?...
#!/bin/bash tput setaf 2; cat web/art/1.0.txt tput setaf 1; echo "Before running this script, please make sure Docker is running and you have made changes to .env file." tput setaf 2; echo "Changing the postgres username & password from .env is highly recommended." tput setaf 4; read -p "Are you sure, you made chang...
sbimochan
2bd2219659fcf0f0541fc4879bd69bfa79a500c7
e98433517e4a6198e6e2208fdf1b324f41be5bcb
But `docker info` would run on all OSes, right? Sorry, why did it fail the docker test before?
sbimochan
40
yogeshojha/rengine
468
Installation: Check docker running status before installing reNgine.
### Changes - checks docker run status. Before it used to execute all `make build` commands even if docker was not running. - Early error shown to the user - Terminal text color changed - made this sentence "Before running this script, please make sure Docker is running and you have made changes to .env file....
null
2021-08-24 16:52:53+00:00
2021-08-27 02:54:16+00:00
install.sh
#!/bin/bash tput setaf 2; cat web/art/1.0.txt tput setaf 3; echo "Before running this script, please make sure you have made changes to .env file." tput setaf 1; echo "Changing the postgres username & password from .env is highly recommended." tput setaf 4; read -p "Are you sure, you made changes to .env file (y/n)?...
#!/bin/bash tput setaf 2; cat web/art/1.0.txt tput setaf 1; echo "Before running this script, please make sure Docker is running and you have made changes to .env file." tput setaf 2; echo "Changing the postgres username & password from .env is highly recommended." tput setaf 4; read -p "Are you sure, you made chang...
sbimochan
2bd2219659fcf0f0541fc4879bd69bfa79a500c7
e98433517e4a6198e6e2208fdf1b324f41be5bcb
Correct me if I am wrong: `docker info` will return true if Docker is installed, and it has nothing to do with whether Docker is running or not. You can give it a try. Stop the docker service: `sudo systemctl stop docker` Now try with docker info: `sudo docker info` On the other hand, try with `sudo systemctl...
yogeshojha
41
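The behaviour being debated above — `docker info` exiting non-zero when the daemon is stopped, even though the binary exists — is what makes it usable as a gate. A hedged sketch in Python rather than the PR's shell script, with the command made configurable so the logic can be exercised without Docker installed:

```python
import subprocess

def daemon_running(cmd="docker"):
    """True when `<cmd> info` exits 0, i.e. the daemon answered.
    A stopped daemon makes `docker info` exit non-zero even when the
    docker binary itself is installed."""
    try:
        return subprocess.run([cmd, "info"], capture_output=True).returncode == 0
    except FileNotFoundError:
        return False

# Exercised with stand-in commands, since Docker may not be present here:
assert daemon_running("true") is True     # `true` ignores its args, exits 0
assert daemon_running("false") is False   # `false` always exits 1
assert daemon_running("no-such-binary-xyz") is False
```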
yogeshojha/rengine
468
Installation: Check docker running status before installing reNgine.
### Changes - checks docker run status. Before it used to execute all `make build` commands even if docker was not running. - Early error shown to the user - Terminal text color changed - made this sentence "Before running this script, please make sure Docker is running and you have made changes to .env file....
null
2021-08-24 16:52:53+00:00
2021-08-27 02:54:16+00:00
install.sh
#!/bin/bash tput setaf 2; cat web/art/1.0.txt tput setaf 3; echo "Before running this script, please make sure you have made changes to .env file." tput setaf 1; echo "Changing the postgres username & password from .env is highly recommended." tput setaf 4; read -p "Are you sure, you made changes to .env file (y/n)?...
#!/bin/bash tput setaf 2; cat web/art/1.0.txt tput setaf 1; echo "Before running this script, please make sure Docker is running and you have made changes to .env file." tput setaf 2; echo "Changing the postgres username & password from .env is highly recommended." tput setaf 4; read -p "Are you sure, you made chang...
sbimochan
2bd2219659fcf0f0541fc4879bd69bfa79a500c7
e98433517e4a6198e6e2208fdf1b324f41be5bcb
Let me test on my friend's Linux.
sbimochan
42
yogeshojha/rengine
468
Installation: Check docker running status before installing reNgine.
### Changes - checks docker run status. Before it used to execute all `make build` commands even if docker was not running. - Early error shown to the user - Terminal text color changed - made this sentence "Before running this script, please make sure Docker is running and you have made changes to .env file....
null
2021-08-24 16:52:53+00:00
2021-08-27 02:54:16+00:00
install.sh
#!/bin/bash tput setaf 2; cat web/art/1.0.txt tput setaf 3; echo "Before running this script, please make sure you have made changes to .env file." tput setaf 1; echo "Changing the postgres username & password from .env is highly recommended." tput setaf 4; read -p "Are you sure, you made changes to .env file (y/n)?...
#!/bin/bash tput setaf 2; cat web/art/1.0.txt tput setaf 1; echo "Before running this script, please make sure Docker is running and you have made changes to .env file." tput setaf 2; echo "Changing the postgres username & password from .env is highly recommended." tput setaf 4; read -p "Are you sure, you made chang...
sbimochan
2bd2219659fcf0f0541fc4879bd69bfa79a500c7
e98433517e4a6198e6e2208fdf1b324f41be5bcb
Confirmed. Your way is right for Linux; my way was right for macOS.
sbimochan
43
yogeshojha/rengine
468
Installation: Check docker running status before installing reNgine.
### Changes - checks docker run status. Before it used to execute all `make build` commands even if docker was not running. - Early error shown to the user - Terminal text color changed - made this sentence "Before running this script, please make sure Docker is running and you have made changes to .env file....
null
2021-08-24 16:52:53+00:00
2021-08-27 02:54:16+00:00
install.sh
#!/bin/bash tput setaf 2; cat web/art/1.0.txt tput setaf 3; echo "Before running this script, please make sure you have made changes to .env file." tput setaf 1; echo "Changing the postgres username & password from .env is highly recommended." tput setaf 4; read -p "Are you sure, you made changes to .env file (y/n)?...
#!/bin/bash tput setaf 2; cat web/art/1.0.txt tput setaf 1; echo "Before running this script, please make sure Docker is running and you have made changes to .env file." tput setaf 2; echo "Changing the postgres username & password from .env is highly recommended." tput setaf 4; read -p "Are you sure, you made chang...
sbimochan
2bd2219659fcf0f0541fc4879bd69bfa79a500c7
e98433517e4a6198e6e2208fdf1b324f41be5bcb
Sure, take your time. Mac and Windows users will continue to go through the detailed installation steps. We also need to install make, and as I said, this script is only intended for Ubuntu/Debian; that's why I've done `apt install make`, which would fail on macOS anyway.
yogeshojha
44
yogeshojha/rengine
468
Installation: Check docker running status before installing reNgine.
### Changes - checks docker run status. Before it used to execute all `make build` commands even if docker was not running. - Early error shown to the user - Terminal text color changed - made this sentence "Before running this script, please make sure Docker is running and you have made changes to .env file....
null
2021-08-24 16:52:53+00:00
2021-08-27 02:54:16+00:00
install.sh
#!/bin/bash tput setaf 2; cat web/art/1.0.txt tput setaf 3; echo "Before running this script, please make sure you have made changes to .env file." tput setaf 1; echo "Changing the postgres username & password from .env is highly recommended." tput setaf 4; read -p "Are you sure, you made changes to .env file (y/n)?...
#!/bin/bash tput setaf 2; cat web/art/1.0.txt tput setaf 1; echo "Before running this script, please make sure Docker is running and you have made changes to .env file." tput setaf 2; echo "Changing the postgres username & password from .env is highly recommended." tput setaf 4; read -p "Are you sure, you made chang...
sbimochan
2bd2219659fcf0f0541fc4879bd69bfa79a500c7
e98433517e4a6198e6e2208fdf1b324f41be5bcb
Yep, so what we can do is: I will accept and merge this PR, and if you want to work on an install script that works across different OSes, that would be awesome.
yogeshojha
45
yogeshojha/rengine
468
Installation: Check docker running status before installing reNgine.
### Changes - checks docker run status. Before it used to execute all `make build` commands even if docker was not running. - Early error shown to the user - Terminal text color changed - made this sentence "Before running this script, please make sure Docker is running and you have made changes to .env file....
null
2021-08-24 16:52:53+00:00
2021-08-27 02:54:16+00:00
install.sh
#!/bin/bash tput setaf 2; cat web/art/1.0.txt tput setaf 3; echo "Before running this script, please make sure you have made changes to .env file." tput setaf 1; echo "Changing the postgres username & password from .env is highly recommended." tput setaf 4; read -p "Are you sure, you made changes to .env file (y/n)?...
#!/bin/bash tput setaf 2; cat web/art/1.0.txt tput setaf 1; echo "Before running this script, please make sure Docker is running and you have made changes to .env file." tput setaf 2; echo "Changing the postgres username & password from .env is highly recommended." tput setaf 4; read -p "Are you sure, you made chang...
sbimochan
2bd2219659fcf0f0541fc4879bd69bfa79a500c7
e98433517e4a6198e6e2208fdf1b324f41be5bcb
Sounds perfect.
sbimochan
46
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
This now hard-codes the md5 hash. In the current version you can choose the hash function.
PaulWestenthanner
0
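A sketch of keeping the hash function configurable instead of hard-coding md5, as the review asks: `hashlib.new` accepts any supported algorithm by name. The function below is illustrative, not the encoder's actual code, and the `hash_method` parameter name is an assumption:

```python
import hashlib

def column_index(value, hash_method="md5", n_components=8):
    # Any algorithm hashlib supports can be passed by name, so md5 need
    # not be hard-coded into the hashing loop.
    digest = hashlib.new(hash_method, value.encode("utf-8")).digest()
    return int.from_bytes(digest, byteorder="big") % n_components

assert column_index("a") == column_index("a")    # deterministic
assert 0 <= column_index("a", "sha256") < 8      # other algorithms work too
```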
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
Do you know how much benefit the change from int(hexdigest) to int.from_bytes alone brings? I saw you mentioned 40%-60% for the three changes combined. I think this is less readable and could use a comment. Also, the byteorder might depend on the machine (cf. https://docs.python.org/3/library/stdtypes.html#int.from_bytes ...
PaulWestenthanner
1
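On the byteorder concern raised above: `int.from_bytes` takes the byte order as an explicit argument, so with `byteorder="big"` the result does not depend on the machine and equals the hex-string interpretation. A small check (illustrative, not the encoder's code):

```python
import hashlib

N_COMPONENTS = 8
data = b"category_value"

via_hex = int(hashlib.md5(data).hexdigest(), 16) % N_COMPONENTS
via_bytes = int.from_bytes(hashlib.md5(data).digest(), byteorder="big") % N_COMPONENTS

# hexdigest() is the big-endian hex rendering of digest(), so both routes
# yield the same integer and therefore the same column index on any platform.
assert via_hex == via_bytes
```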
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
you don't need the auto sample attribute anymore
PaulWestenthanner
2
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
`n_process` sounds like an integer. Better call it `process_list` or something more telling. You only copied the name but probably it's time to change it now
PaulWestenthanner
3
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
this ignores the `max_samples` parameter and might lead to the process crashing in case of too much data / too few CPUs
PaulWestenthanner
4
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
please add the `process_creation_method` parameter in the documentation and explain the options
PaulWestenthanner
5
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
how is this the same as the old code? wasn't the old code just doing `shm_result[column_index] += 1`?
PaulWestenthanner
6
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
why is this 2x2?
PaulWestenthanner
7
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
Splitting this way is not very elegant if the last chunk has to be treated separately. Using numpy array split could be helpful: https://stackoverflow.com/a/75981560. Wouldn't it be easier if the `hash_chunk` function would hash a chunk and return an array? Then it wouldn't need the `shm_result` and `shm_offset` pa...
PaulWestenthanner
8
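A pure-Python sketch of the even chunking that `np.array_split` provides (semantics assumed equivalent): the first `len(seq) % n` chunks get one extra element, so the last chunk never needs special-casing:

```python
def split_evenly(seq, n):
    # Mirrors np.array_split's sizing: q + 1 elements for the first r
    # chunks and q for the rest, where q, r = divmod(len(seq), n).
    q, r = divmod(len(seq), n)
    chunks, start = [], 0
    for i in range(n):
        size = q + (1 if i < r else 0)
        chunks.append(seq[start:start + size])
        start += size
    return chunks

chunks = split_evenly(list(range(10)), 3)
assert [len(c) for c in chunks] == [4, 3, 3]   # no stray last chunk
assert sum(chunks, []) == list(range(10))      # nothing lost or duplicated
```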
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
do you need to re-assign this to `np_result` or will this be updated in place?
PaulWestenthanner
9
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found it weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns, took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
Removing it!
bkhant1
10
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found it weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns, took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
I will add a comment! The `digest` version is about 30% faster than the `hexdigest` version. On my machine: ```python > import hashlib > hasher = hashlib.md5() > hasher.update(b"abdcde1234") > %timeit int(hasher.hexdigest(), 16) 659 ns ± 29.5 ns per loop (mean ± std. dev. of 7 runs, 1000000 loops each) > %time...
bkhant1
11
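The equivalence of the two conversions timed in the comment above can be checked directly (a minimal sketch using the same sample input as the comment, not the encoder's real code):

```python
import hashlib

hasher = hashlib.md5()
hasher.update(b"abdcde1234")

# Both paths yield the same integer; int.from_bytes skips the
# intermediate hex-string round-trip, which is why it is faster.
via_hex = int(hasher.hexdigest(), 16)
via_bytes = int.from_bytes(hasher.digest(), byteorder="big")
print(via_hex == via_bytes)  # True
```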
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found it weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns, took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
That is actually another advantage of using shared memory instead of queues. ### In the previous implementation, there were 3 main things that could take up memory: 1. the input dataframe, stored in `self.X` - it is copied across all processes whether we fork or spawn because the process target is `self.require...
bkhant1
12
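The shared-memory idea described in the comment above, with workers filling their own slice of one preallocated result block in place rather than sending rows over a queue, can be sketched roughly like this (shapes and the written row range are invented for illustration):

```python
import numpy as np
from multiprocessing import shared_memory

n_rows, n_components = 6, 4

# One preallocated block; each worker would attach by name and fill
# only its own row range, so no per-row result travels over a queue.
shm = shared_memory.SharedMemory(create=True, size=n_rows * n_components * 8)
result = np.ndarray((n_rows, n_components), dtype=np.int64, buffer=shm.buf)
result[:] = 0

result[2:4, :] = 1  # e.g. the chunk responsible for rows 2-3

total = int(result.sum())

del result          # drop the numpy view before releasing the buffer
shm.close()
shm.unlink()
print(total)  # 8
```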
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found it weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns, took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
It was, but it was in the context of writing to a subset of the output dataframe. In that case we write to the "global" output dataframe, so we need to add the offset specific to the chunk we're processing. I was worried about introducing a regression so I added `test_simple_example` where the expected result is h...
bkhant1
13
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found it weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns, took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
`shm` stands for "shared_memory" - will update the variable name 👍 > Wouldn't it be easier if the hash_chunk function would hash a chunk and return an array. Then it wouldn't need the shm_result and shm_offset parameters (what does shm stand for btw?). Then you'd just concatenate all the chunks in the end? That...
bkhant1
14
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found it weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns, took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
My bad thanks for catching that! I'm replacing it with `hash_ctor = getattr(hashlib, hash_key)` outside the loop and `hash_ctor()` in the loop which is even faster! ```python > ctor = getattr(hashlib, 'md5') > %timeit ctor() 190 ns ± 0.927 ns per loop (mean ± std. dev. of 7 runs, 10000000 loops each) > %timeit ...
bkhant1
15
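The constructor-lookup trick mentioned in the comment above can be sketched as follows (`hash_key` and the sample values are illustrative, not the PR's actual loop):

```python
import hashlib

hash_key = "md5"

# Resolving the constructor once avoids the per-call string dispatch
# that hashlib.new(hash_key) would perform inside the loop.
hash_ctor = getattr(hashlib, hash_key)

digests = []
for value in (b"a", b"b", b"c"):
    hasher = hash_ctor()
    hasher.update(value)
    digests.append(hasher.hexdigest())

print(digests[0] == hashlib.md5(b"a").hexdigest())  # True
```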
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found it weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns, took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
Ok that makes sense. Which python/hashlib version are you using btw? Running `hasher.update("abc")` results in a `TypeError` on my machine since it requires (as per documentation) bytes not strings. With `hasher.update("abc".encode("utf-8"))` I'm getting the same results as you
PaulWestenthanner
16
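A minimal reproduction of the bytes-vs-str point raised above (illustrative only, not library code):

```python
import hashlib

hasher = hashlib.md5()
try:
    hasher.update("abc")  # hashlib rejects str objects
except TypeError:
    hasher.update("abc".encode("utf-8"))  # bytes are accepted

print(hasher.hexdigest() == hashlib.md5(b"abc").hexdigest())  # True
```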
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found it weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns, took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
I copied the wrong line from my shell! I updated my comment above.
bkhant1
17
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found it weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns, took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
I'm not sure 😅 I am just using it to get the type of the default int array name in [that line](https://github.com/scikit-learn-contrib/category_encoders/pull/428/files/206de2d8327489c4ff4d2c7c4f566c0fc06210c1#diff-5871b042f65ccab77377b2e9a92ea2c9651cc039b020835b6d77bfcb01ffe475R187) so it could be 1 by 1!
bkhant1
18
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found it weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns, took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
It gets updated in place!
bkhant1
19
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found it weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns, took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
Nice catch! Without calling the `system()` function this was always `False`, wasn't it?
PaulWestenthanner
20
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found it weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns, took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
I've seen your implementation of the multiprocessing pool (here https://github.com/bkhant1/category_encoders/compare/all_optis...bkhant1:category_encoders:multiproc_pool?expand=1) and like it a lot. I think it is very clean and you should add it to the PR
PaulWestenthanner
21
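The `ProcessPoolExecutor`-based pool praised in the comment above could look roughly like this; `hash_chunk`, the chunking scheme, and the worker count are hypothetical stand-ins, not the actual PR code:

```python
from concurrent.futures import ProcessPoolExecutor
import hashlib

def hash_chunk(values):
    # Hypothetical stand-in for the per-chunk worker; the real encoder
    # folds each digest into n_components output columns.
    return [hashlib.md5(v.encode("utf-8")).hexdigest() for v in values]

def hash_all(values, n_chunks=2):
    # Round-robin chunking keeps chunk sizes within 1 of each other.
    chunks = [values[i::n_chunks] for i in range(n_chunks)]
    with ProcessPoolExecutor(max_workers=n_chunks) as pool:
        results = list(pool.map(hash_chunk, chunks))
    return [h for chunk in results for h in chunk]

if __name__ == "__main__":
    print(len(hash_all(["a", "b", "c", "d"])))  # 4
```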
scikit-learn-contrib/category_encoders
428
Optimise `HashingEncoder` for both large and small dataframes
I used the HashingEncoder recently and found it weird that any call to `fit` or `transform`, even for a dataframe with only 10s of rows and a couple of columns, took at least 2s... I also had quite a large amount of data to encode, and that took a long time. That got me started on improving the performance of Hashi...
null
2023-10-08 15:09:46+00:00
2023-11-11 14:34:26+00:00
category_encoders/hashing.py
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import math import platform __author__ = 'willmcginnis', 'LiuShulun' class HashingEncoder(util.BaseEncoder, util.Unsuper...
"""The hashing module contains all methods and classes related to the hashing trick.""" import sys import hashlib import category_encoders.utils as util import multiprocessing import pandas as pd import numpy as np import math import platform from concurrent.futures import ProcessPoolExecutor __author__ = 'willmcginn...
bkhant1
26ef26106fcbadb281c162b76258955f66f2c741
5c94e27436a3cf837d7c84a71c566e8320ce512f
Maybe it would make sense to change it to 1 by 1 as it would be sort of a minimal example. Also probably add a comment that you only need it for the datatype so people won't wonder why it is there
PaulWestenthanner
22
scikit-learn-contrib/category_encoders
398
(WIP) Partial fix for getting feature names out
I think this is a partial fix for this open issue: https://github.com/scikit-learn-contrib/category_encoders/issues/395 It remains to check the behaviour of other estimators that are not ONE_TO_ONE. Please let me know if you like the work in progress and I will try to continue.
null
2023-02-23 13:33:41+00:00
2023-03-13 11:48:24+00:00
category_encoders/__init__.py
""" .. module:: category_encoders :synopsis: :platform: """ from category_encoders.backward_difference import BackwardDifferenceEncoder from category_encoders.binary import BinaryEncoder from category_encoders.gray import GrayEncoder from category_encoders.count import CountEncoder from category_encoders.hashing...
""" .. module:: category_encoders :synopsis: :platform: """ from category_encoders.backward_difference import BackwardDifferenceEncoder from category_encoders.binary import BinaryEncoder from category_encoders.gray import GrayEncoder from category_encoders.count import CountEncoder from category_encoders.hashing...
JaimeArboleda
5eb7a2d6359d680bdadd0534bdb983e712a47f9c
570827e6b48737d0c9aece8aca31edd6da02c1b2
I don't really like the input warning here. I know of some users who use this library in their project and they try to suppress the warning for their end-users. Also, at the moment it does not work anyway. I'd be in favor of not issuing a warning here but having something on the `index.rst` (cf. the comment below)
PaulWestenthanner
23
scikit-learn-contrib/category_encoders
398
(WIP) Partial fix for getting feature names out
I think this is a partial fix for this open issue: https://github.com/scikit-learn-contrib/category_encoders/issues/395 It remains to check the behaviour of other estimators that are not ONE_TO_ONE. Please let me know if you like the work in progress and I will try to continue.
null
2023-02-23 13:33:41+00:00
2023-03-13 11:48:24+00:00
category_encoders/__init__.py
""" .. module:: category_encoders :synopsis: :platform: """ from category_encoders.backward_difference import BackwardDifferenceEncoder from category_encoders.binary import BinaryEncoder from category_encoders.gray import GrayEncoder from category_encoders.count import CountEncoder from category_encoders.hashing...
""" .. module:: category_encoders :synopsis: :platform: """ from category_encoders.backward_difference import BackwardDifferenceEncoder from category_encoders.binary import BinaryEncoder from category_encoders.gray import GrayEncoder from category_encoders.count import CountEncoder from category_encoders.hashing...
JaimeArboleda
5eb7a2d6359d680bdadd0534bdb983e712a47f9c
570827e6b48737d0c9aece8aca31edd6da02c1b2
I fully agree, in fact warnings are annoying. Let's remove this. Do you want me to add a new commit, or do you prefer to do it in the merge process?
JaimeArboleda
24
scikit-learn-contrib/category_encoders
398
(WIP) Partial fix for getting feature names out
I think this is a partial fix for this open issue: https://github.com/scikit-learn-contrib/category_encoders/issues/395 It remains to check the behaviour of other estimators that are not ONE_TO_ONE. Please let me know if you like the work in progress and I will try to continue.
null
2023-02-23 13:33:41+00:00
2023-03-13 11:48:24+00:00
category_encoders/utils.py
"""A collection of shared utilities for all encoders, not intended for external use.""" from abc import abstractmethod from enum import Enum, auto import warnings import pandas as pd import numpy as np import sklearn.base from sklearn.base import BaseEstimator, TransformerMixin from sklearn.exceptions import NotFitted...
"""A collection of shared utilities for all encoders, not intended for external use.""" from abc import abstractmethod from enum import Enum, auto import warnings import pandas as pd import numpy as np import sklearn.base from sklearn.base import BaseEstimator, TransformerMixin from sklearn.exceptions import NotFitted...
JaimeArboleda
5eb7a2d6359d680bdadd0534bdb983e712a47f9c
570827e6b48737d0c9aece8aca31edd6da02c1b2
So we're not actually doing anything with the `input_features` parameter? I thought the point of including it would be to explicitly tell every encoder how to calculate output features from the given input features (this is what the 1-1 mixin does, right?) For some encoders it gives a not-fitted error, but others don't...
PaulWestenthanner
25
scikit-learn-contrib/category_encoders
398
(WIP) Partial fix for getting feature names out
I think this is a partial fix for this open issue: https://github.com/scikit-learn-contrib/category_encoders/issues/395 It remains to check the behaviour of other estimators that are not ONE_TO_ONE. Please let me know if you like the work in progress and I will try to continue.
null
2023-02-23 13:33:41+00:00
2023-03-13 11:48:24+00:00
category_encoders/utils.py
"""A collection of shared utilities for all encoders, not intended for external use.""" from abc import abstractmethod from enum import Enum, auto import warnings import pandas as pd import numpy as np import sklearn.base from sklearn.base import BaseEstimator, TransformerMixin from sklearn.exceptions import NotFitted...
"""A collection of shared utilities for all encoders, not intended for external use.""" from abc import abstractmethod from enum import Enum, auto import warnings import pandas as pd import numpy as np import sklearn.base from sklearn.base import BaseEstimator, TransformerMixin from sklearn.exceptions import NotFitted...
JaimeArboleda
5eb7a2d6359d680bdadd0534bdb983e712a47f9c
570827e6b48737d0c9aece8aca31edd6da02c1b2
Well, let me explain (for simplicity, every time I say a `Pipeline` it could be that or a `ColumnTransformer` or a `FeatureUnion`...): * We need to have the parameter `input_features` because otherwise, when the Encoder is part of a `Pipeline`, there will be an `Exception: BaseEncoder.get_feature_names_out() takes 1 ...
JaimeArboleda
26
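The `input_features` contract described in the discussion above can be illustrated with a toy 1-to-1 transformer (a hypothetical sketch, not the library's actual mixin):

```python
import numpy as np

class OneToOneSketch:
    """Hypothetical 1-to-1 transformer, not category_encoders' mixin."""

    def __init__(self):
        self.feature_names_in_ = None

    def fit(self, columns):
        self.feature_names_in_ = list(columns)
        return self

    def get_feature_names_out(self, input_features=None):
        # A pipeline passes the upstream names in; stand-alone use
        # falls back to the names recorded during fit.
        if input_features is not None:
            return np.asarray(input_features, dtype=object)
        if self.feature_names_in_ is None:
            raise ValueError("Transformer has not been fitted yet")
        return np.asarray(self.feature_names_in_, dtype=object)

enc = OneToOneSketch().fit(["a", "b"])
print(list(enc.get_feature_names_out()))            # ['a', 'b']
print(list(enc.get_feature_names_out(["x", "y"])))  # ['x', 'y']
```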
scikit-learn-contrib/category_encoders
398
(WIP) Partial fix for getting feature names out
I think this is a partial fix for this open issue: https://github.com/scikit-learn-contrib/category_encoders/issues/395 It remains to check the behaviour of other estimators that are not ONE_TO_ONE. Please let me know if you like the work in progress and I will try to continue.
null
2023-02-23 13:33:41+00:00
2023-03-13 11:48:24+00:00
category_encoders/utils.py
"""A collection of shared utilities for all encoders, not intended for external use.""" from abc import abstractmethod from enum import Enum, auto import warnings import pandas as pd import numpy as np import sklearn.base from sklearn.base import BaseEstimator, TransformerMixin from sklearn.exceptions import NotFitted...
"""A collection of shared utilities for all encoders, not intended for external use.""" from abc import abstractmethod from enum import Enum, auto import warnings import pandas as pd import numpy as np import sklearn.base from sklearn.base import BaseEstimator, TransformerMixin from sklearn.exceptions import NotFitted...
JaimeArboleda
5eb7a2d6359d680bdadd0534bdb983e712a47f9c
570827e6b48737d0c9aece8aca31edd6da02c1b2
I agree that the proposed change is better than what we have at the moment since it fixes a bug. I've got one more question: If we change all encoders to properly work with the input features, will we get compatibility even without setting `set_output=pandas`? Even if we still do all the internal stuff in pandas? If...
PaulWestenthanner
27
scikit-learn-contrib/category_encoders
398
(WIP) Partial fix for getting feature names out
I think this is a partial fix for this open issue: https://github.com/scikit-learn-contrib/category_encoders/issues/395 It remains to check the behaviour of other estimators that are not ONE_TO_ONE. Please let me know if you like the work in progress and I will try to continue.
null
2023-02-23 13:33:41+00:00
2023-03-13 11:48:24+00:00
category_encoders/utils.py
"""A collection of shared utilities for all encoders, not intended for external use.""" from abc import abstractmethod from enum import Enum, auto import warnings import pandas as pd import numpy as np import sklearn.base from sklearn.base import BaseEstimator, TransformerMixin from sklearn.exceptions import NotFitted...
"""A collection of shared utilities for all encoders, not intended for external use.""" from abc import abstractmethod from enum import Enum, auto import warnings import pandas as pd import numpy as np import sklearn.base from sklearn.base import BaseEstimator, TransformerMixin from sklearn.exceptions import NotFitted...
JaimeArboleda
5eb7a2d6359d680bdadd0534bdb983e712a47f9c
570827e6b48737d0c9aece8aca31edd6da02c1b2
Well, my gut feeling is that it's not worth it: it will still be a very big effort and it will only fix one subtle bug (which doesn't make anything crash) and there will still be some inconsistencies. Let me try to explain... * **Current Situation**: If the `category_encoder` is inside a composed `sklearn` transfom...
JaimeArboleda
28
scikit-learn-contrib/category_encoders
398
(WIP) Partial fix for getting feature names out
I think this is a partial fix for this open issue: https://github.com/scikit-learn-contrib/category_encoders/issues/395 It remains to check the behaviour of other estimators that are not ONE_TO_ONE. Please let me know if you like the work in progress and I will try to continue.
null
2023-02-23 13:33:41+00:00
2023-03-13 11:48:24+00:00
category_encoders/utils.py
"""A collection of shared utilities for all encoders, not intended for external use.""" from abc import abstractmethod from enum import Enum, auto import warnings import pandas as pd import numpy as np import sklearn.base from sklearn.base import BaseEstimator, TransformerMixin from sklearn.exceptions import NotFitted...
"""A collection of shared utilities for all encoders, not intended for external use.""" from abc import abstractmethod from enum import Enum, auto import warnings import pandas as pd import numpy as np import sklearn.base from sklearn.base import BaseEstimator, TransformerMixin from sklearn.exceptions import NotFitted...
JaimeArboleda
5eb7a2d6359d680bdadd0534bdb983e712a47f9c
570827e6b48737d0c9aece8aca31edd6da02c1b2
Sorry that I did not answer your question properly: regarding the versions, I would not specify the dependency of 1.2 in the requirements or the setup, because with the simple fix there the remaining bugs are not, in my opinion, a big issue. If you want, I could refactor the tests so that I don't require the version...
JaimeArboleda
29
scikit-learn-contrib/category_encoders
398
(WIP) Partial fix for getting feature names out
I think this is a partial fix for this open issue: https://github.com/scikit-learn-contrib/category_encoders/issues/395 It remains to check the behaviour of other estimators that are not ONE_TO_ONE. Please let me know if you like the work in progress and I will try to continue.
null
2023-02-23 13:33:41+00:00
2023-03-13 11:48:24+00:00
category_encoders/utils.py
"""A collection of shared utilities for all encoders, not intended for external use.""" from abc import abstractmethod from enum import Enum, auto import warnings import pandas as pd import numpy as np import sklearn.base from sklearn.base import BaseEstimator, TransformerMixin from sklearn.exceptions import NotFitted...
"""A collection of shared utilities for all encoders, not intended for external use.""" from abc import abstractmethod from enum import Enum, auto import warnings import pandas as pd import numpy as np import sklearn.base from sklearn.base import BaseEstimator, TransformerMixin from sklearn.exceptions import NotFitted...
JaimeArboleda
5eb7a2d6359d680bdadd0534bdb983e712a47f9c
570827e6b48737d0c9aece8aca31edd6da02c1b2
> I've got one more question: If we change all encoders to properly work with the input features, will we get compatibility even without setting `set_output=pandas`? Ok I see, this is not true since the columns to encode are always referenced by name in the `cols` parameter. Then I agree with you that changing `g...
PaulWestenthanner
30
scikit-learn-contrib/category_encoders
398
(WIP) Partial fix for getting feature names out
I think this is a partial fix for this open issue: https://github.com/scikit-learn-contrib/category_encoders/issues/395 It remains to check the behaviour of other estimators that are not ONE_TO_ONE. Please let me know if you like the work in progress and I will try to continue.
null
2023-02-23 13:33:41+00:00
2023-03-13 11:48:24+00:00
category_encoders/utils.py
"""A collection of shared utilities for all encoders, not intended for external use.""" from abc import abstractmethod from enum import Enum, auto import warnings import pandas as pd import numpy as np import sklearn.base from sklearn.base import BaseEstimator, TransformerMixin from sklearn.exceptions import NotFitted...
"""A collection of shared utilities for all encoders, not intended for external use.""" from abc import abstractmethod from enum import Enum, auto import warnings import pandas as pd import numpy as np import sklearn.base from sklearn.base import BaseEstimator, TransformerMixin from sklearn.exceptions import NotFitted...
JaimeArboleda
5eb7a2d6359d680bdadd0534bdb983e712a47f9c
570827e6b48737d0c9aece8aca31edd6da02c1b2
I pretty much agree with you that we should go for the super simple fix. Let me think a little about the version stuff, maybe there is another solution, but I think we're very close to getting this merged
PaulWestenthanner
31
scikit-learn-contrib/category_encoders
398
(WIP) Partial fix for getting feature names out
I think this is a partial fix for this opened issue: https://github.com/scikit-learn-contrib/category_encoders/issues/395 It remains to check the behaviour of other estimators that are not ONE_TO_ONE. Please, let me know if you like the work in progress and I will try to continue.
null
2023-02-23 13:33:41+00:00
2023-03-13 11:48:24+00:00
category_encoders/utils.py
"""A collection of shared utilities for all encoders, not intended for external use.""" from abc import abstractmethod from enum import Enum, auto import warnings import pandas as pd import numpy as np import sklearn.base from sklearn.base import BaseEstimator, TransformerMixin from sklearn.exceptions import NotFitted...
"""A collection of shared utilities for all encoders, not intended for external use.""" from abc import abstractmethod from enum import Enum, auto import warnings import pandas as pd import numpy as np import sklearn.base from sklearn.base import BaseEstimator, TransformerMixin from sklearn.exceptions import NotFitted...
JaimeArboleda
5eb7a2d6359d680bdadd0534bdb983e712a47f9c
570827e6b48737d0c9aece8aca31edd6da02c1b2
Hi, I've added some thoughts on the version. I'll resolve the discussion here. Please have a look at the other comment I've made and if you're fine with it we're ready to merge!
PaulWestenthanner
32
scikit-learn-contrib/category_encoders
398
(WIP) Partial fix for getting feature names out
I think this is a partial fix for this opened issue: https://github.com/scikit-learn-contrib/category_encoders/issues/395 It remains to check the behaviour of other estimators that are not ONE_TO_ONE. Please, let me know if you like the work in progress and I will try to continue.
null
2023-02-23 13:33:41+00:00
2023-03-13 11:48:24+00:00
category_encoders/utils.py
"""A collection of shared utilities for all encoders, not intended for external use.""" from abc import abstractmethod from enum import Enum, auto import warnings import pandas as pd import numpy as np import sklearn.base from sklearn.base import BaseEstimator, TransformerMixin from sklearn.exceptions import NotFitted...
"""A collection of shared utilities for all encoders, not intended for external use.""" from abc import abstractmethod from enum import Enum, auto import warnings import pandas as pd import numpy as np import sklearn.base from sklearn.base import BaseEstimator, TransformerMixin from sklearn.exceptions import NotFitted...
JaimeArboleda
5eb7a2d6359d680bdadd0534bdb983e712a47f9c
570827e6b48737d0c9aece8aca31edd6da02c1b2
Hi! Ok, I have added both suggestions!
JaimeArboleda
33
scikit-learn-contrib/category_encoders
398
(WIP) Partial fix for getting feature names out
I think this is a partial fix for this opened issue: https://github.com/scikit-learn-contrib/category_encoders/issues/395 It remains to check the behaviour of other estimators that are not ONE_TO_ONE. Please, let me know if you like the work in progress and I will try to continue.
null
2023-02-23 13:33:41+00:00
2023-03-13 11:48:24+00:00
docs/source/index.rst
.. Category Encoders documentation master file, created by sphinx-quickstart on Sat Jan 16 13:08:19 2016. You can adapt this file completely to your liking, but it should at least contain the root `toctree` directive. Category Encoders ================= A set of scikit-learn-style transformers for encoding c...
.. Category Encoders documentation master file, created by sphinx-quickstart on Sat Jan 16 13:08:19 2016. You can adapt this file completely to your liking, but it should at least contain the root `toctree` directive. Category Encoders ================= A set of scikit-learn-style transformers for encoding c...
JaimeArboleda
5eb7a2d6359d680bdadd0534bdb983e712a47f9c
570827e6b48737d0c9aece8aca31edd6da02c1b2
could you please introduce a section `known issues` here after usage and before contents, that states something like: """ CategoryEncoders internally works with pandas `DataFrames` as opposed to sklearn which works with numpy arrays. This can cause problems in sklearn versions prior to `1.2.0`. In order to ensure f...
PaulWestenthanner
34
scikit-learn-contrib/category_encoders
398
(WIP) Partial fix for getting feature names out
I think this is a partial fix for this opened issue: https://github.com/scikit-learn-contrib/category_encoders/issues/395 It remains to check the behaviour of other estimators that are not ONE_TO_ONE. Please, let me know if you like the work in progress and I will try to continue.
null
2023-02-23 13:33:41+00:00
2023-03-13 11:48:24+00:00
docs/source/index.rst
.. Category Encoders documentation master file, created by sphinx-quickstart on Sat Jan 16 13:08:19 2016. You can adapt this file completely to your liking, but it should at least contain the root `toctree` directive. Category Encoders ================= A set of scikit-learn-style transformers for encoding c...
.. Category Encoders documentation master file, created by sphinx-quickstart on Sat Jan 16 13:08:19 2016. You can adapt this file completely to your liking, but it should at least contain the root `toctree` directive. Category Encoders ================= A set of scikit-learn-style transformers for encoding c...
JaimeArboleda
5eb7a2d6359d680bdadd0534bdb983e712a47f9c
570827e6b48737d0c9aece8aca31edd6da02c1b2
I think it's very good, yes. Thanks! I don't see any need for more. Let me add this piece and remove the other one so that you can merge :)
JaimeArboleda
35