Python program for matching file names and copying them into a zip is not finding a specific file? |
|python|css|python-3.x|regex| |
null |
|amazon-web-services|amazon-ec2|amazon-ecs| |
null |
```
> sam |> purrr::map_dfr(tibble::as_tibble)
# A tibble: 3 × 22
symbol name price changesPercentage change dayLow dayHigh yearHigh yearLow
<chr> <chr> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 COUNCODO… Coun… 4.8 0 0 4.6 4.9 7.05 3.4
2 SPENCERS… Spen… 91.3 0.44 0.4 90.6 94.6 139. 51.5
3 SRPL-RE.… NA 0.05 -50 -0.05 0.05 0.1 0.1 0.05
# ℹ 13 more variables: marketCap <dbl>, priceAvg50 <dbl>, priceAvg200 <dbl>,
# exchange <chr>, volume <dbl>, avgVolume <dbl>, open <dbl>,
# previousClose <dbl>, eps <dbl>, pe <dbl>, earningsAnnouncement <chr>,
# sharesOutstanding <dbl>, timestamp <dbl>
``` |
JavaScript: odd / even / null / NaN |
|javascript|if-statement| |
null |
The **problem** is that for the call `my_function<int>(myvar)` the generic version `my_function(ArgsT&&... args)` is a **better match** because it doesn't require a *conversion to const* for its argument. That is, the first overload `my_function(const StringT& name, ArgsT&&... args)` requires a *conversion to const* for its first parameter `name`, and hence it is a **worse match** than the generic version `my_function(ArgsT&&... args)`.
To show this, below is a [contrived example](https://godbolt.org/z/GGdGEe7hr):
```
#include <iostream>
template<typename T, typename... Args> void func(const T&, Args&&...)
{
std::cout << "const T& version";
}
template<typename... Args> void func(Args&&...)
{
std::cout << "Without const version";
}
int main()
{
func(4); //prints Without const version
}
```
---
<h4>Solution</h4>
One way to solve this is to pass `name` by value, removing the `const&` from its declaration as shown below:
```
template <typename MyType, typename StringT=std::string, typename... ArgsT,
typename = std::enable_if_t<std::is_constructible_v<std::string, StringT>>>
//------------------------------------vvvvvvv------------------------->removed const & from here
std::shared_ptr<MyType> my_function(StringT name, ArgsT&&... args) {
std::cout << "name: " << name << std::endl;
auto t = std::make_shared<MyType>(5);
return t;
}
```
[Working demo](https://godbolt.org/z/j6Y8n4bfG) |
I'm searching for a hooking framework for Go binaries on Android. For C++ binaries on Android we have the Frida hooking framework and several publicly available PLT-hook libraries.
Is there a hooking framework, or something similar, for Go binaries on Android?
For example, in Frida, to hook libc's `system` we can write:
```
Interceptor.attach(Module.findExportByName('libc.so', 'system'), {
onEnter: function(args) {
console.log('[+] system called with command:', Memory.readUtf8String(args[0]));
},
});
```
I just need something similar for Go. For example, if I want to hook the `time.Now()` function in a Go binary, which is visible in the binary's exports, how can I do that without manually recovering the function layout from IDA etc.?
|
Hook For GoLang Android |
|android|go|hook|reverse-engineering| |
null |
What is the best use of NumPy arrays in Python?
I understand that NumPy is implemented in C and optimizes memory usage. However, I have never used a NumPy array in my production code. Am I overlooking a way to improve my code?
I have read several articles and referenced old class notes, but I am still not sure when I should opt for a NumPy array over a Pandas data frame. |
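As a minimal sketch of the kind of code where a NumPy array pays off (the data here is invented for illustration): element-wise arithmetic that would otherwise need a Python-level loop becomes a single vectorized expression executed in C over a contiguous, homogeneously-typed buffer.

```python
import numpy as np

# Plain Python: element-wise work needs an explicit loop or comprehension.
prices = [10.0, 20.0, 30.0]
discounted = [p * 0.9 for p in prices]

# NumPy: the same operation is one vectorized expression over the whole array.
arr = np.array(prices)
discounted_arr = arr * 0.9

print(discounted_arr.tolist())
```

For mixed-type, labeled, tabular data a Pandas DataFrame is usually the better fit; a bare NumPy array shines for homogeneous numeric data and heavy element-wise math.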
When to Use a Numpy Array |
|python|pandas|numpy|optimization|memory| |
null |
I am having trouble correctly displaying the difference between two time values in a template. Here is my code.
**models.py**
```
class UserTimesheet(models.Model):
employee = models.ForeignKey(Employee, models.SET_NULL, blank=True, null=True)
date = models.DateField()
monday_start_time = models.TimeField(_(u"Start Time"), null=True, blank=True)
monday_end_time = models.TimeField(_(u"End Time"), null=True, blank=True, )
@property
def get_monday_total(self):
if self.monday_start_time is not None and self.monday_end_time is not None:
return self.monday_end_time - self.monday_start_time
```
I have tried using:
```
{{ get_monday_total }}
```
```
{{ UserTimesheet.get_monday_total }}
```
And a lot of other variations, but I cannot, for the life of me, figure out how to display this value in a template. Any ideas?
Any and all help is appreciated. |
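For reference, `datetime.time` values do not support subtraction directly (which is what the property above attempts), so a minimal standalone sketch of the computation might anchor both times on an arbitrary common date first; the function name here is hypothetical:

```python
from datetime import date, datetime, time, timedelta
from typing import Optional

def time_difference(start: Optional[time], end: Optional[time]) -> Optional[timedelta]:
    """datetime.time objects can't be subtracted, so combine each with
    the same dummy date and subtract the resulting datetimes."""
    if start is None or end is None:
        return None
    anchor = date(2000, 1, 1)  # any date works; it cancels out
    return datetime.combine(anchor, end) - datetime.combine(anchor, start)

print(time_difference(time(9, 0), time(17, 30)))  # 8:30:00
```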
null |
I am creating an API that searches for and reverses an entry in the `account.move` model. I am able to find the correct entry and reverse it using the `refund_moves()` method. However, whenever I try to confirm the reversed entry using the `action_post()` method, I get an "Expected singleton: res.company()" error.
I've used the `action_post()` method before on other models such as `sale.order`/`account.move` and it works fine.
Code:
```python
@http.route('/update_invoice', website="false", auth='custom_auth', type='json', methods=['POST'])
#Searching for entry
invoice = request.env['account.move'].sudo().search([('matter_id','=',matterID),('account_id','=',accountID),('move_type','=','out_invoice'),('company_id','=',creditor.id)])
if invoice:
#Create Reversal
move_reversal = request.env['account.move.reversal'].with_context(active_model="account.move", active_ids=invoice.id).sudo().create({
'date': intakeDate,
'reason': 'Balance Adjustment',
'journal_id': invoice.journal_id.id,
})
#Reverse Entry
move_reversal.refund_moves()
#Search for created reversed entry
refundInvoice = request.env['account.move'].sudo().search([('name','=',"/"),('company_id','=',creditor.id),('move_type','=','out_refund')])
if refundInvoice:
_logger.info("Refund Invoice Found")
#Error occurs
refundInvoice.action_post()
```
Custom Authorization:
```python
@classmethod
def _auth_method_custom_auth(cls):
access_token = request.httprequest.headers.get('Authorization')
_logger.info(access_token)
if not access_token:
_logger.info('Access Token Missing')
raise BadRequest('Missing Access Token')
if access_token.startswith('Bearer '):
access_token = access_token[7:]
_logger.info(access_token)
user_id = request.env["res.users.apikeys"]._check_credentials(scope='odoo.restapi', key=access_token)
if not user_id:
_logger.info('No user with api key found')
raise BadRequest('Access token Invalid')
request.update_env(user=user_id)
```
Traceback:
```
Traceback (most recent call last):
File "/home/odoo/src/odoo/odoo/models.py", line 5841, in ensure_one
_id, = self._ids
ValueError: not enough values to unpack (expected 1, got 0)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/odoo/src/odoo/odoo/http.py", line 2189, in __call__
response = request._serve_db()
File "/home/odoo/src/odoo/odoo/http.py", line 1765, in _serve_db
return service_model.retrying(self._serve_ir_http, self.env)
File "/home/odoo/src/odoo/odoo/service/model.py", line 133, in retrying
result = func()
File "/home/odoo/src/odoo/odoo/http.py", line 1792, in _serve_ir_http
response = self.dispatcher.dispatch(rule.endpoint, args)
File "/home/odoo/src/odoo/odoo/http.py", line 1996, in dispatch
result = self.request.registry['ir.http']._dispatch(endpoint)
File "/home/odoo/src/odoo/addons/website/models/ir_http.py", line 235, in _dispatch
response = super()._dispatch(endpoint)
File "/home/odoo/src/odoo/odoo/addons/base/models/ir_http.py", line 222, in _dispatch
result = endpoint(**request.params)
File "/home/odoo/src/odoo/odoo/http.py", line 722, in route_wrapper
result = endpoint(self, *args, **params_ok)
File "/home/odoo/src/user/account_ext/controllers/main.py", line 411, in update_invoice
refundInvoice.action_post()
File "/home/odoo/src/odoo/addons/sale/models/account_move.py", line 63, in action_post
res = super(AccountMove, self).action_post()
File "/home/odoo/src/enterprise/account_accountant/models/account_move.py", line 76, in action_post
res = super().action_post()
File "/home/odoo/src/odoo/addons/account/models/account_move.py", line 4072, in action_post
other_moves._post(soft=False)
File "/home/odoo/src/enterprise/sale_subscription/models/account_move.py", line 13, in _post
posted_moves = super()._post(soft=soft)
File "/home/odoo/src/enterprise/account_asset/models/account_move.py", line 109, in _post
posted = super()._post(soft)
File "/home/odoo/src/odoo/addons/sale/models/account_move.py", line 99, in _post
posted = super()._post(soft)
File "/home/odoo/src/enterprise/account_reports/models/account_move.py", line 48, in _post
return super()._post(soft)
File "/home/odoo/src/enterprise/account_avatax/models/account_move.py", line 15, in _post
res = super()._post(soft=soft)
File "/home/odoo/src/enterprise/account_invoice_extract/models/account_invoice.py", line 262, in _post
posted = super()._post(soft)
File "/home/odoo/src/enterprise/account_inter_company_rules/models/account_move.py", line 14, in _post
posted = super()._post(soft)
File "/home/odoo/src/enterprise/account_external_tax/models/account_move.py", line 53, in _post
return super()._post(soft=soft)
File "/home/odoo/src/enterprise/account_accountant/models/account_move.py", line 68, in _post
posted = super()._post(soft)
File "/home/odoo/src/odoo/addons/account/models/account_move.py", line 3876, in _post
draft_reverse_moves.reversed_entry_id._reconcile_reversed_moves(draft_reverse_moves, self._context.get('move_reverse_cancel', False))
File "/home/odoo/src/odoo/addons/account/models/account_move.py", line 3694, in _reconcile_reversed_moves
lines.with_context(move_reverse_cancel=move_reverse_cancel).reconcile()
File "/home/odoo/src/odoo/addons/account/models/account_move_line.py", line 2935, in reconcile
return self._reconcile_plan([self])
File "/home/odoo/src/odoo/addons/account/models/account_move_line.py", line 2345, in _reconcile_plan
self._reconcile_plan_with_sync(plan_list, all_amls)
File "/home/odoo/src/odoo/addons/account/models/account_move_line.py", line 2492, in _reconcile_plan_with_sync
exchange_diff_values = exchange_lines_to_fix._prepare_exchange_difference_move_vals(
File "/home/odoo/src/odoo/addons/account/models/account_move_line.py", line 2603, in _prepare_exchange_difference_move_vals
accounting_exchange_date = journal.with_context(move_date=exchange_date).accounting_date
File "/home/odoo/src/odoo/odoo/fields.py", line 1207, in __get__
self.compute_value(recs)
File "/home/odoo/src/odoo/odoo/fields.py", line 1389, in compute_value
records._compute_field_value(self)
File "/home/odoo/src/odoo/addons/mail/models/mail_thread.py", line 424, in _compute_field_value
return super()._compute_field_value(field)
File "/home/odoo/src/odoo/odoo/models.py", line 4867, in _compute_field_value
fields.determine(field.compute, self)
File "/home/odoo/src/odoo/odoo/fields.py", line 102, in determine
return needle(*args)
File "/home/odoo/src/odoo/addons/account/models/account_journal.py", line 366, in _compute_accounting_date
journal.accounting_date = temp_move._get_accounting_date(move_date, has_tax)
File "/home/odoo/src/odoo/addons/account/models/account_move.py", line 4358, in _get_accounting_date
lock_dates = self._get_violated_lock_dates(invoice_date, has_tax)
File "/home/odoo/src/odoo/addons/account/models/account_move.py", line 4389, in _get_violated_lock_dates
return self.company_id._get_violated_lock_dates(invoice_date, has_tax)
File "/home/odoo/src/odoo/addons/account/models/company.py", line 369, in _get_violated_lock_dates
self.ensure_one()
File "/home/odoo/src/odoo/odoo/models.py", line 5844, in ensure_one
raise ValueError("Expected singleton: %s" % self)
ValueError: Expected singleton: res.company()
``` |
```
class CircularWheelView: UIView {
    private let imageLayer = CALayer()

    var image: UIImage? {
        didSet {
            setNeedsLayout()
        }
    }

    var labelTexts: [String] = [""] {
        didSet {
            drawTextLabels()
        }
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        // Update the image layer when the image changes or the view's bounds change
        if let image = image {
            let imageSize = CGSize(width: bounds.width / 2, height: bounds.height / 2)
            imageLayer.frame = CGRect(x: (bounds.width - imageSize.width) / 2, y: (bounds.height - imageSize.height) / 2, width: imageSize.width, height: imageSize.height)
            imageLayer.contents = image.cgImage
        }
    }

    override func draw(_ rect: CGRect) {
        // Draw the circular ring
        let ringPath = UIBezierPath(ovalIn: bounds.insetBy(dx: 20, dy: 20))
        UIColor.lightGray.setStroke()
        ringPath.lineWidth = 40
        ringPath.stroke()
        // Draw the text labels along the circumference
        drawTextLabels()
    }

    private func drawTextLabels() {
        let center = CGPoint(x: bounds.midX, y: bounds.midY)
        let radius = min(bounds.width, bounds.height) / 2.3
        let labelCount = labelTexts.count // Number of text labels
        let labelAngle = CGFloat.pi * 2.0 / CGFloat(labelCount)
        for i in 0..<labelCount {
            let label = UILabel()
            label.text = labelTexts[i]
            label.sizeToFit()
            // Calculate label position
            let angle = CGFloat(i) * labelAngle
            let x = center.x + radius * cos(angle)
            let y = center.y + radius * sin(angle)
            // Adjust label position to make it circular
            label.center = CGPoint(x: x, y: y)
            label.transform = CGAffineTransform(rotationAngle: angle + CGFloat.pi / 2)
            addSubview(label)
        }
    }

    func rotateWheel() {
        let rotationAnimation = CABasicAnimation(keyPath: "transform.rotation")
        rotationAnimation.fromValue = 0.0
        rotationAnimation.toValue = CGFloat.pi * 2.0
        rotationAnimation.duration = 2.0
        rotationAnimation.repeatCount = .infinity
        layer.add(rotationAnimation, forKey: "rotate")
    }
}
```
**Usage:**
```
let circularWheelView = CircularWheelView(frame: CGRect(x: 50, y: 100, width: 300, height: 300))
circularWheelView.labelTexts = ["Settings 1", "Setting 2", "Setting 3", "Setting 4", "Setting 5"]
circularWheelView.image = UIImage(resource: .letter)
circularWheelView.backgroundColor = .clear
view.addSubview(circularWheelView)
// Start rotating the wheel
circularWheelView.rotateWheel()
```
[Please check video for reference][1]
[1]: https://drive.google.com/file/d/1U9hPo0tuLHBIrrBKSyUxYLRNDZsbbOoq/view?usp=share_link |
SafeAreaView doesn't include tab-bar? Expo, Expo Router |
|react-native|expo|expo-router| |
null |
I have hosted a React app on an IIS server inside `wwwroot/{myFolder}`. I am currently facing the below error: [Unexpected Application Error! (404 Not Found)](https://i.stack.imgur.com/888re.png)
I already have a `web.config` file inside `wwwroot/{myFolder}`, which looks like this:
```
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="React Routes" stopProcessing="true">
          <match url=".*" />
          <conditions logicalGrouping="MatchAll">
            <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
            <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
            <add input="{REQUEST_URI}" pattern="^/(api)" negate="true" />
          </conditions>
          <action type="Rewrite" url="/index.html" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```
And I have also installed the URL Rewrite Module.
|
`auto` return type cannot do that. The type of the returned value must be known statically. You could consider returning a `std::variant<std::string, int>`. However, I suspect that `crow::json::rvalue` is already some kind of variant.
You could make the function a template:
```
template <typename T>
T getAs(const crow::json::rvalue &data);
```
such that the caller specifies the type:
```
std::string s = getAs<std::string>(x);
```
If it is only 2 types, though, I would rather recommend writing two overloads: `getAsString` and `getAsInt`. |
```
a['Str'] = a['a'].str.split(' ')  # split text on blanks
a['Str'] = a['Str'].str.get(0)    # get the first item of each Str entry
print(a)
``` |
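A runnable sketch of the same idea (the frame `a` here is invented for illustration):

```python
import pandas as pd

# Illustrative data; column 'a' holds space-separated text.
a = pd.DataFrame({'a': ['foo bar baz', 'hello world']})
a['Str'] = a['a'].str.split(' ')  # split each value on blanks -> lists
a['Str'] = a['Str'].str.get(0)    # keep the first token of each list
print(a['Str'].tolist())  # ['foo', 'hello']
```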
I've tried inserting an image file into a MySQL DB using Struts2 but I'm getting the below error.
ognl.MethodFailedException: Method "execute" failed for object com.motorola.action.LicenseAction@569b1c56 [java.lang.AbstractMethodError: Method com/mysql/jdbc/ServerPreparedStatement.setBlob(ILjava/io/InputStream;)V is abstract]
at ognl.OgnlRuntime.callAppropriateMethod(OgnlRuntime.java:1556)
at ognl.ObjectMethodAccessor.callMethod(ObjectMethodAccessor.java:68)
at com.opensymphony.xwork2.ognl.accessor.XWorkMethodAccessor.callMethodWithDebugInfo(XWorkMethodAccessor.java:96)
at com.opensymphony.xwork2.ognl.accessor.XWorkMethodAccessor.callMethod(XWorkMethodAccessor.java:88)
at ognl.OgnlRuntime.callMethod(OgnlRuntime.java:1620)
at ognl.ASTMethod.getValueBody(ASTMethod.java:91)
at ognl.SimpleNode.evaluateGetValueBody(SimpleNode.java:212)
at ognl.SimpleNode.getValue(SimpleNode.java:258)
at ognl.Ognl.getValue(Ognl.java:470)
at ognl.Ognl.getValue(Ognl.java:434)
at com.opensymphony.xwork2.ognl.OgnlUtil$3.execute(OgnlUtil.java:371)
at com.opensymphony.xwork2.ognl.OgnlUtil.compileAndExecuteMethod(OgnlUtil.java:423)
at com.opensymphony.xwork2.ognl.OgnlUtil.callMethod(OgnlUtil.java:369)
at com.opensymphony.xwork2.DefaultActionInvocation.invokeAction(DefaultActionInvocation.java:436)
at com.opensymphony.xwork2.DefaultActionInvocation.invokeActionOnly(DefaultActionInvocation.java:291)
at com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:252)
at org.apache.struts2.interceptor.debugging.DebuggingInterceptor.intercept(DebuggingInterceptor.java:253)
at com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:247)
at com.opensymphony.xwork2.interceptor.DefaultWorkflowInterceptor.doIntercept(DefaultWorkflowInterceptor.java:176)
at com.opensymphony.xwork2.interceptor.MethodFilterInterceptor.intercept(MethodFilterInterceptor.java:98)
at com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:247)
at com.opensymphony.xwork2.validator.ValidationInterceptor.doIntercept(ValidationInterceptor.java:260)
at org.apache.struts2.interceptor.validation.AnnotationValidationInterceptor.doIntercept(AnnotationValidationInterceptor.java:52)
at com.opensymphony.xwork2.interceptor.MethodFilterInterceptor.intercept(MethodFilterInterceptor.java:98)
at com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:247)
at com.opensymphony.xwork2.interceptor.ConversionErrorInterceptor.doIntercept(ConversionErrorInterceptor.java:139)
at com.opensymphony.xwork2.interceptor.MethodFilterInterceptor.intercept(MethodFilterInterceptor.java:98)
at com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:247)
at com.opensymphony.xwork2.interceptor.ParametersInterceptor.doIntercept(ParametersInterceptor.java:134)
at com.opensymphony.xwork2.interceptor.MethodFilterInterceptor.intercept(MethodFilterInterceptor.java:98)
at com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:247)
at com.opensymphony.xwork2.interceptor.ParametersInterceptor.doIntercept(ParametersInterceptor.java:134)
at com.opensymphony.xwork2.interceptor.MethodFilterInterceptor.intercept(MethodFilterInterceptor.java:98)
at com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:247)
at com.opensymphony.xwork2.interceptor.StaticParametersInterceptor.intercept(StaticParametersInterceptor.java:199)
at com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:247)
at org.apache.struts2.interceptor.MultiselectInterceptor.intercept(MultiselectInterceptor.java:69)
at com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:247)
at org.apache.struts2.interceptor.DateTextFieldInterceptor.intercept(DateTextFieldInterceptor.java:115)
at com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:247)
at org.apache.struts2.interceptor.CheckboxInterceptor.intercept(CheckboxInterceptor.java:88)
at com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:247)
at org.apache.struts2.interceptor.FileUploadInterceptor.intercept(FileUploadInterceptor.java:324)
at com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:247)
at com.opensymphony.xwork2.interceptor.ModelDrivenInterceptor.intercept(ModelDrivenInterceptor.java:99)
at com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:247)
at com.opensymphony.xwork2.interceptor.ScopedModelDrivenInterceptor.intercept(ScopedModelDrivenInterceptor.java:139)
at com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:247)
at com.opensymphony.xwork2.interceptor.ChainingInterceptor.intercept(ChainingInterceptor.java:157)
at com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:247)
at com.opensymphony.xwork2.interceptor.PrepareInterceptor.doIntercept(PrepareInterceptor.java:174)
at com.opensymphony.xwork2.interceptor.MethodFilterInterceptor.intercept(MethodFilterInterceptor.java:98)
at com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:247)
at org.apache.struts2.interceptor.I18nInterceptor.intercept(I18nInterceptor.java:123)
at com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:247)
at org.apache.struts2.interceptor.ServletConfigInterceptor.intercept(ServletConfigInterceptor.java:171)
at com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:247)
at com.opensymphony.xwork2.interceptor.AliasInterceptor.intercept(AliasInterceptor.java:201)
at com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:247)
at com.opensymphony.xwork2.interceptor.ExceptionMappingInterceptor.intercept(ExceptionMappingInterceptor.java:193)
at com.opensymphony.xwork2.DefaultActionInvocation.invoke(DefaultActionInvocation.java:247)
at org.apache.struts2.factory.StrutsActionProxy.execute(StrutsActionProxy.java:53)
at org.apache.struts2.dispatcher.Dispatcher.serviceAction(Dispatcher.java:577)
at org.apache.struts2.dispatcher.ExecuteOperations.executeAction(ExecuteOperations.java:81)
at org.apache.struts2.dispatcher.filter.StrutsPrepareAndExecuteFilter.doFilter(StrutsPrepareAndExecuteFilter.java:143)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:212)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:141)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:616)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:522)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1095)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:672)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1500)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Unknown Source)
Caused by: java.lang.AbstractMethodError: Method com/mysql/jdbc/ServerPreparedStatement.setBlob(ILjava/io/InputStream;)V is abstract
at com.mysql.jdbc.ServerPreparedStatement.setBlob(ServerPreparedStatement.java)
at com.motorola.dao.FileUploadDAO.save(FileUploadDAO.java:28)
at com.motorola.action.LicenseAction.execute(LicenseAction.java:64)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at ognl.OgnlRuntime.invokeMethod(OgnlRuntime.java:899)
at ognl.OgnlRuntime.callAppropriateMethod(OgnlRuntime.java:1544)
... 82 more
Below is my view, action class, POJO and DAO class code.
**licUpload.jsp:**
<%@ taglib prefix="s" uri="/struts-tags"%>
<%@ page import="java.util.ArrayList" %>
<%@ page import="java.util.HashMap" %>
<html>
<head>
<link href="Css/motoCSS.css" rel="stylesheet" type="text/css" />
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
<meta http-equiv="X-Frame-Options" content="deny">
<meta http auto-config="false" disable-url-rewriting="true">
</head>
<body style="background-color:#4682B4">
<s:form name="licUploadForm" action="imageupload" method="post" enctype="multipart/form-data" theme="simple" >
<table width="100%" border="0" cellspacing="0" cellpadding="0">
<tr>
<td height="190" colspan="3"> </td>
</tr>
<tr>
<td> </td>
<td width="30%" class="login" style="background-repeat: no-repeat">
<table width="100%" border="0" cellspacing="0" cellpadding="0">
<tr>
<td height="60" colspan="2"> </td>
</tr>
<tr align="center">
<td colspan="2"><strong>License Upload</strong></td>
</tr>
<tr></tr>
<tr>
<td align="right" height="35" style="font-size: 13px;">Licence ID</td>
<td> <s:textfield name="licID" id="licID"/></td>
</tr>
<tr>
<td align="right" style="font-size: 13px;">Start Date</td>
<td>
<s:textfield name="startDate" id="startDate"/></td>
</tr>
<tr>
<td align="right" height="35" style="font-size: 13px;">End Date</td>
<td>
<s:textfield name="endDate" id="endDate"/>
</td>
</tr>
<tr>
<td align="right" height="35" style="font-size: 13px;">Upload License</td>
<td> <s:file name="license" label="Select license" size="40" />
</td>
</tr>
<tr>
<td height="40" colspan="2" align="center">
<s:submit class="button" value="Upload" />
<s:submit class="button" value="Cancel"/>
</td>
</tr>
</table>
</td>
<td> </td>
</tr>
</table>
</s:form>
</body>
<HEAD>
</HEAD>
</html>
**LicenseAction.java:**
package com.motorola.action;
import java.io.File;
import javax.servlet.http.HttpServletRequest;
import org.apache.struts2.ServletActionContext;
import com.motorola.dao.FileUploadDAO;
import com.opensymphony.xwork2.ActionSupport;
import com.motorola.pojo.LicensePOJO;
public class LicenseAction extends ActionSupport {
private String licID;
private String startDate;
private String endDate;
private File license;
public String getLicID() {
return licID;
}
public void setLicID(String licID) {
this.licID = licID;
}
public String getStartDate() {
return startDate;
}
public void setStartDate(String startDate) {
this.startDate = startDate;
}
public String getEndDate() {
return endDate;
}
public void setEndDate(String endDate) {
this.endDate = endDate;
}
public File getLicense() {
return license;
}
public void setLicense(File license) {
this.license = license;
}
@Override
public String execute() {
System.out.println("inside licenseAction");
HttpServletRequest req = ServletActionContext.getRequest();
int i=FileUploadDAO.save(this);
if(i>0){
return "success";
}
return "error";
}
}
**FileUploadDAO.java:**
package com.motorola.dao;
import java.io.FileInputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;
import com.motorola.action.LicenseAction;
import dbconnecetion.DBConnection;
public class FileUploadDAO {
public static int save(LicenseAction lm){
int status=0;
Connection con = DBConnection.getConnection();
try{
System.out.println("inside FileUploadDAO");
PreparedStatement ps=con.prepareStatement("insert into customer_license_table(license_id,lic_start_date,lic_end_date,license_copy) values(?,?,?,?)");
ps.setString(1,lm.getLicID());
ps.setString(2, lm.getStartDate());
ps.setString(3, lm.getEndDate());
// for inserting pdf in database
System.out.println("input stream is*****"+lm.getLicense());
FileInputStream inputStream = new FileInputStream(lm.getLicense());
System.out.println("input stream is*****"+inputStream);
ps.setBlob(4, inputStream);
status = ps.executeUpdate();
}catch(Exception e){e.printStackTrace();}
return status;
}
}
**struts.xml:**
```xml
<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE struts PUBLIC
"-//Apache Software Foundation//DTD Struts Configuration 2.5//EN"
"http://struts.apache.org/dtds/struts-2.5.dtd">
<struts>
<constant name="struts.devMode" value="true" />
<package name="default" extends="struts-default">
<action name="loginSubmit" class="com.motorola.action.LoginAction">
<result name="admin">/AdminUser.jsp</result>
<result name="service">/ServiceCenter.jsp</result>
<result name="customer">/RequestManagement.jsp</result>
<result name="ERROR">/index.jsp</result>
</action>
<action name="addCustomer" class="com.motorola.action.CustomerMgtAction">
<result name="success">/CustomerManagement.jsp</result>
<result name="error">/index.jsp</result>
</action>
<action name="requestMgt" class="com.motorola.action.RequestMgtAction">
<result name="success">/RequestManagement.jsp</result>
</action>
<action name="viewReqDetails" class="com.motorola.action.RequestMgtDetailsAction">
<result name="success">/RequestMgmtDetails.jsp</result>
<result name="error">/RequestManagement.jsp</result>
</action>
<action name="viewAllCustomers" class="com.motorola.action.ViewCustomersAction">
<result name="success">/ViewCustomers.jsp</result>
<result name="error">/AdminUser.jsp</result>
</action>
<action name="acceptCustDetails" class="com.motorola.action.UpdateRequestStatusAction">
<result name="success">/RequestManagement.jsp</result>
<result name="error">/RequestMgmtDetails.jsp</result>
</action>
<action name="viewLicense" class="com.motorola.action.ViewLicenseAction">
<result name="success">/ViewLicense.jsp</result>
</action>
<action name="imageupload" class="com.motorola.action.LicenseAction">
<result name="success">/ViewCustomers.jsp</result>
</action>
</package>
</struts>
```
I tried with `preparedStatement.setBinaryStream(int parameterIndex, InputStream x);`
but I am still unable to insert the image file into the database. |
I am trying to run a Node.js-based Docker container on a k8s cluster.
The code refuses to run, and I continuously get these errors:
`Navigation frame was detached`
`Requesting main frame too early`
I cut the code down to a minimal version that should do some work:
```
const puppeteer = require('puppeteer');
const os = require('os');
function delay(time) {
return new Promise(resolve => setTimeout(resolve, time));
}
const platform = os.platform();
(async () => {
console.log('started')
let browserConfig = {}
if (platform === 'linux') {
browserConfig = {
executablePath: '/usr/bin/google-chrome',
headless: true,
args:['--no-sandbox','--disable-web-security','--disable-features=IsolateOrigins,site-per-process','--disable-gpu', '--disable-dev-shm-usage']
}
}
else {
browserConfig = {
headless: true,
args:['--no-sandbox','--disable-web-security','--disable-features=IsolateOrigins,site-per-process','--disable-gpu', '--disable-dev-shm-usage']
}
}
console.log('create browser')
const browser = await puppeteer.launch(browserConfig)
console.log('create page')
const page = await browser.newPage()
await page.setViewport({ width: 1920, height: 926 })
console.log('browsing to url')
await page.goto("https://www.example.com", {
waitUntil: 'load',
timeout: 3000000
})
console.log('waiting')
await delay(5000)
console.log('get content')
const s = await page.content();
console.log('content', s);
console.log('close browser')
await browser.close()
console.log('finished')
})();
```
That code runs fine with the local Node.js CLI, and also when packed into a Docker image using this Dockerfile:
```
# Install dependencies only when needed
FROM node:20.11.1-slim AS deps
ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD true
RUN apt-get update && \
apt-get install -y libc6 && \
apt-get install -y git && \
rm -rf /var/lib/apt/lists/*
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm ci --legacy-peer-deps
# Rebuild the source code only when needed
FROM node:20.11.1-slim AS builder
WORKDIR /app
COPY . .
COPY --from=deps /app/node_modules ./node_modules
ARG NODE_ENV=production
RUN echo ${NODE_ENV}
# RUN NODE_ENV=${NODE_ENV} npm run build
# Production image, copy all the files and run next
FROM node:20.11.1-slim AS runner
ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD true
WORKDIR /app
RUN apt-get update && apt-get install -y gnupg wget && \
wget --quiet --output-document=- https://dl-ssl.google.com/linux/linux_signing_key.pub | gpg --dearmor > /etc/apt/trusted.gpg.d/google-archive.gpg && \
echo "deb [arch=amd64] http://dl.google.com/linux/chrome/deb/ stable main" > /etc/apt/sources.list.d/google-chrome.list && \
apt-get update && \
apt-get install -y google-chrome-stable --no-install-recommends && \
rm -rf /var/lib/apt/lists/*
COPY --from=builder /app/node_modules ./node_modules
COPY --from=builder /app/package.json ./package.json
COPY --from=builder /app/app.js ./app.js
# Expose
EXPOSE 3000
# CMD ["node", "app.js"]
CMD node app.js
```
Running this image in local Docker works.
However, when I deploy it to a Kubernetes cluster, it fails with those errors.
[k8s error message](https://i.stack.imgur.com/t2Fpw.png)
This is very frustrating :( help will be much appreciated. |
You don't need to transform that original payload. Using the Foreach scope, you can select the collection to iterate over inside the payload, then use the Object Store connector's Store operation.
Example:
``` lang-xml
<foreach collection="#[payload.payload]">
<os:store key="#[payload.EMPNAME]">
<os:value>#[payload.EMPID]</os:value>
</os:store>
</foreach>
```
As you can see in the [documentation of the Object Store connector][1], there is no operation to store all keys at once.
[1]: https://docs.mulesoft.com/object-store-connector/latest/object-store-connector-reference#operations |
I'm trying to secure my ASP.NET Core 8 Web API with Keycloak.
I've created a realm and a user with confidential access type, so it needs a secret. I think everything is implemented correctly, but when I try to make a request from Swagger UI, it gives me the following CORS error:
> Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at http://localhost:7777/realms/dc2-local/protocol/openid-connect/auth?xxx (Reason: CORS header ‘Access-Control-Allow-Origin’ missing). Status code: 200.
When I click the link, it redirects correctly to the Keycloak login page, and after that I can make requests without the CORS error. But on the first request, when I need to authenticate myself, I get the error.
This is my CORS policy:
```csharp
builder.Services.AddCors(options =>
{
    options.AddDefaultPolicy(builder =>
    {
        builder
            .AllowAnyOrigin()
            .AllowAnyMethod()
            .AllowAnyHeader();
    });
});

/* other code */

app.UseRouting();
app.UseCors();
app.UseAuthentication();
app.UseAuthorization();
```
And this are my request headers:
```
GET /realms/dc2-local/protocol/openid-connect/auth?xxx HTTP/1.1
Host: localhost:7777
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:109.0) Gecko/20100101 Firefox/115.0
Accept: application/json
Accept-Language: pt-PT
Accept-Encoding: gzip, deflate, br
Origin: http://localhost:5055
Referer: http://localhost:5055/
DNT: 1
Connection: keep-alive
Sec-Fetch-Dest: empty
Sec-Fetch-Mode: cors
Sec-Fetch-Site: cross-site
```
|
|cors|swagger|asp.net-core-webapi|keycloak| |
I have a problem with the `merge` function in Kaggle. When I run it in RStudio it works, but not in Kaggle.
I have two different tables with the same columns: "Id" and "ActivityDay".
I want to merge the two tables into one by "Id" AND "ActivityDay".
*my code:*
```
daily_Calories_Steps <- merge(daily_calories, dailySteps_merged, by = c("Id","ActivityDay"))
```
*error message:*
```
Error in fix.by(by.x, x): 'by' must specify a uniquely valid column
Traceback:
1. merge(daily_calories, dailySteps_merged, by = c("Id", "ActivityDay"))
2. merge.data.frame(daily_calories, dailySteps_merged, by = c("Id", "ActivityDay"))
3. fix.by(by.x, x)
4. stop(ngettext(sum(bad), "'by' must specify a uniquely valid column", "'by' must specify uniquely valid columns"), domain = NA)
```
Thanks for helping...
I want to make a notebook in Kaggle, but this code doesn't work. |
You have inconsistent file names for the password file (the htpasswd command is missing a `/`).
htpasswd command:
>
> sudo htpasswd -c /etc/apache2.htpasswd
my-site.conf:
>
> AuthUserFile /etc/apache2/.htpasswd
The first command creates the file `/etc/apache2.htpasswd`, but the config expects the file `/etc/apache2/.htpasswd` (note the extra `/`). |
I have a list of about 5k URLs. I want to crawl each of these websites and find links to a certain other page, which most of these websites have.
To solve this issue I wrote a Python script, which works but is too slow. I was hoping to find a better solution than mine.
The code below takes about 2 minutes to complete for just 10 links. Is there any way to make it faster, or is there some other method altogether?
```
import asyncio
import re

import aiohttp
from bs4 import BeautifulSoup
from urllib.parse import urljoin

# placeholders -- these names are used below but were not included in the snippet
headers = {'User-Agent': 'Mozilla/5.0'}
pattern = re.compile(r'fee')
links = ['https://example.com']

timeout = 5
async def fetch_html(session, url):
async with session.get(url, headers=headers, timeout=timeout) as response:
return await response.text()
async def find_fee_page(session, start_url, max_depth=2):
visited_urls = set()
queue = [(start_url, 0)]
ans_list = []
while queue:
current_url, depth = queue.pop(0)
if current_url in visited_urls or depth > max_depth:
continue
visited_urls.add(current_url)
try:
html = await fetch_html(session, current_url)
soup = BeautifulSoup(html, 'html.parser')
for link in soup.find_all('a', href=True):
absolute_link = urljoin(current_url, link['href'])
if pattern.search(absolute_link.lower()):
ans_list.append(absolute_link)
return absolute_link
elif absolute_link != current_url:
queue.append((absolute_link, depth + 1))
except Exception as e:
print(f"Error processing {current_url}: {e}")
return None
async def main():
start_urls = links[:10]
async with aiohttp.ClientSession() as session:
tasks = [find_fee_page(session, url) for url in start_urls]
results = await asyncio.gather(*tasks)
for url, result in zip(start_urls, results):
if result:
print(f"Fee page found for {url}: {result}")
else:
print(f"No fee page found for {url}")
asyncio.run(main())
```
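For what it's worth, a common pattern for speeding this kind of crawl up is to schedule all start URLs at once and cap the number of in-flight requests with a semaphore, instead of processing a handful at a time. This is a generic sketch (the names and the limit are illustrative, not from the original script):

```python
import asyncio

async def bounded_fetch(sem, url, fetch):
    # the semaphore caps how many fetches are in flight at once
    async with sem:
        return await fetch(url)

async def crawl_all(urls, fetch, limit=50):
    # one task per start URL, all running concurrently under the cap
    sem = asyncio.Semaphore(limit)
    tasks = [bounded_fetch(sem, u, fetch) for u in urls]
    # return_exceptions=True keeps one failing site from cancelling the rest
    return await asyncio.gather(*tasks, return_exceptions=True)
```

Here `fetch` would be something like the existing `fetch_html` bound to a shared `aiohttp` session; with a cap like this, all 5k sites can be scheduled up front and the limit tuned against what the target sites tolerate.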
|
How to crawl 5000 different URLs to find certain links |
|python|web-scraping|async-await|web-crawler| |
null |
I want to populate my ListView with 16 names of districts, but when I open the page the ListView shows 16 rows without content. Is there also any other option to display the content of a txt file line by line?
My Code:
ObservableCollection<Wojewodschaften> listModel = new
ObservableCollection<Wojewodschaften>();
string content;
string[] Wojewodztwa = new string[] { "" };
private async void LoadWoj()
{
using (var stream = await FileSystem.OpenAppPackageFileAsync("wojewodschaften.txt"))
{
using (var reader = new StreamReader(stream))
{
content = await reader.ReadToEndAsync();
}
Wojewodztwa = content.Split('\n'); //Split the content after a new row.
foreach (var item in Wojewodztwa)
{
Console.WriteLine(item);
Wojewodschaften wojewodztwa = new Wojewodschaften()
{
NameOfWojewodschaften = item
};
listModel.Add(wojewodztwa);
ListOfWojewodschaftenList.ItemsSource = listModel;
}
}
}
Xaml Code
<ListView
x:Name="ListOfWojewodschaftenList">
<ListView.ItemTemplate>
<DataTemplate>
<ViewCell
Height="20"
>
<Label
Text="{Binding NamWojewodschaften}"
TextColor="Black"
FontSize="20"
VerticalOptions="CenterAndExpand"
HorizontalOptions="CenterAndExpand"
/>
</ViewCell>
</DataTemplate>
</ListView.ItemTemplate>
</ListView> |
I have a grpc server which starts and stops a task. The task is run as a goroutine. I want to be able to forcefully stop the goroutine if a stop request is received regardless of what stage of execution the goroutine is in. The code looks similar to the following:
``` go
func Test() {
test1()
test2()
test3()
...
}
```
``` go
// in a grpc server method
go Test()
```
In this example, `Test()` takes a long time to complete (it controls some hardware which could take multiple seconds before a response is received). This `Test()` method is called from a parent as a goroutine.
I could use context or channels to stop the goroutine; however, this means that I would need to check the context everywhere throughout the method. In addition, there are some critical reasons why I need the Test() method to end as abruptly as possible, even if it is currently executing code that doesn't include the context-checking logic.
I was reading that goroutines are meant to be stopped collaboratively. However since this option does not work for my use case, what other options do I have? Can I start a thread in go that is not a goroutine or are there any libraries that will let me achieve what I want? |
Alternatives to kill a goroutine/thread completely externally |
|multithreading|go|goroutine| |
Currently you are blocking the thread where events come in, and IO operations can take some time. You could use something like the following. When data arrives from an event, you immediately hand it to a handler which keeps the data in a queue that is safe to access from many threads and uses first-in-first-out order. This handler runs a loop in a separate thread which either saves data or waits for more data to come in. This way your main thread is free to take more data events, and the data-handler thread's job is to save them. The two threads have a producer/consumer relationship. In this example data comes in at a much faster rate (100 ms/packet) than it is saved (1000 ms), but all data is still saved.
```
import android.util.Log;
import java.util.concurrent.LinkedBlockingQueue;
public class DataHandler implements Runnable {
private final LinkedBlockingQueue<Data> queue = new LinkedBlockingQueue<>();
private Thread thread;
void start() {
Log.i("handler", "Trying to start");
thread = new Thread(this);
thread.start();
}
void stop() {
Log.i("handler", "Trying to stop");
thread.interrupt();
}
void addData(Data data) {
Log.i("handler", "Adding data");
queue.add(data);
}
@Override
public void run() {
Log.i("handler", "Starting");
while (true) {
try {
if (queue.peek() != null) {
saveData(queue.take());
} else {
Log.i("handler", "Waiting");
Thread.sleep(1000);
}
} catch (InterruptedException e) {
break;
}
}
Log.i("handler", "Exiting");
}
private void saveData(Data data) throws InterruptedException {
Log.i("handler", "Saving Data");
Thread.sleep(1000);
Log.i("handler", "Saved Data");
}
}
```
Use it like this:
```
void fakeData() throws InterruptedException {
DataHandler handler = new DataHandler();
handler.start();
Thread.sleep(100);
handler.addData(new Data());
Thread.sleep(100);
handler.addData(new Data());
Thread.sleep(100);
handler.addData(new Data());
Thread.sleep(5000);
handler.stop();
}
``` |
Puppeteer on Kubernetes throws errors: "Navigation frame was detached", "Requesting main frame too early" |
|node.js|docker|kubernetes|puppeteer| |
null |
I upgraded my react-native project from Expo 48 to 50 and the Gradle wrapper to `8.7`. After resolving some other issues, I am stuck on this one:
```
Execution failed for task ':expo-permissions:compileDebugKotlin'.
> 'compileDebugJavaWithJavac' task (current target is 17) and 'compileDebugKotlin' task (current target is 11) jvm target compatibility should be set to the same Java version.
```
Things I tried:
- Using Java toolchain (https://kotlinlang.org/docs/gradle-configure-project.html#gradle-java-toolchains-support)
- Adding different plugins including `org.gradle.toolchains.foojay-resolver-convention`
- Adding `compatibilityOptions` and `kotlin` and `kotlinOptions` blocks with hardCoded Java version and `jvmToolchain(17)`
- Updating the Java version to 17 for gradle **Settings -> Build, Execution, Deployment -> Gradle -> Gradle JDK**
and many other things but nothing seems to help. |
Here is the code
```
import os

import requests
from tqdm import tqdm

# placeholder -- HEADERS is used below but was not defined in the snippet
HEADERS = {'User-Agent': 'Mozilla/5.0'}

def download_from_dict(path_link_dict, folder):
counter = 0
for path, link, name in tqdm(path_link_dict):
counter = counter + 1
if os.path.isfile(folder + path + name):
print('[ Already there! ] ' + name)
continue
if not os.path.isdir(folder + path):
os.makedirs(folder + path)
response = requests.get(link, headers=HEADERS)
with open(folder + path + name, 'wb') as file:
file.write(response.content)
print('[*] Downloaded ' + name)
```
output is
```
progress bar..
[*] Downloaded something..
Progress bar..
[*] Downloaded something
```
desired output (I want the bar to stay at the bottom of the terminal):
```
[*] Downloaded something
[*] Downloaded something
[*] Downloaded something
progress bar..
```
I have tried the `leave=False`, `position=0`, and `bar_format` parameters of the tqdm function, but they didn't work.
I also tried printing `\r` before `[*] Downloaded`, but the bar is longer than the printed statement, so only part of it is cleared.
I have read the tqdm docs and couldn't solve this problem.
Can you help?
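For reference, the way tqdm itself recommends mixing prints with a live bar is `tqdm.write()`: it prints the message above the bar and then redraws the bar, so the bar stays pinned at the bottom of the terminal. A minimal sketch (the file names are made up):

```python
import time
from tqdm import tqdm

for name in tqdm(["a.bin", "b.bin", "c.bin"]):
    time.sleep(0.01)
    # printed above the bar; the bar is redrawn underneath it
    tqdm.write('[*] Downloaded ' + name)
```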
|
How to keep tqdm progress bar on the bottom of the terminal? |
|python|terminal|printing|carriage-return|tqdm| |
null |
Is it possible to make a .NET Framework application work on Linux?
We have a pretty large WPF application which uses .NET Remoting and makes calls to SQL Server.
Converting the .NET Framework code to .NET Core is not an option, as it would need a lot of effort.
I am seeing mixed content on the internet about the feasibility of using Wine with Ubuntu, and all articles point to migrating to .NET Core first.
So, is it possible to make this .NET application work on Linux without migrating to .NET Core? If it is possible, what are some pointers to consider and where should I start?
.Net Framework application with .Net Remoting in Linux OS |
|linux|ubuntu|.net-4.8|wine|.net-remoting| |
Add two time values and display in template |
|python|django|django-crispy-forms| |
`alert` is also a member of `window`, so why can the custom alert function be executed while the custom close function cannot? What is the difference between them?
```html
<html>
  <head>
    <meta charset="utf-8" />
    <meta http-equiv="X-UA-Compatible" content="IE=edge" />
  </head>
  <body>
    <script>
      function close() {
        console.log('close function');
      }
      function alert() {
        console.log('alert function');
      }
    </script>
    <button onclick="close()">close btn</button>
    <button onclick="alert()">alert btn</button>
  </body>
</html>
```
|
I am using selenium-manager in Python, and I am getting an error in GitLab CI. The error is as follows.
```json
{
  "level": "ERROR",
  "timestamp": 1708741147,
  "message": "error sending request for url (https://googlechromelabs.github.io/chrome-for-testing/known-good-versions-with-downloads.json): error trying to connect: dns error: failed to lookup address information: Name does not resolve"
}
```
To find the cause, I tried to run selenium-manager directly, and I got the same error. To investigate further, I tried to fetch the JSON file with `wget` immediately before the run, as shown below, and it succeeded. Only selenium-manager failed.
```yaml
- cat /etc/resolv.conf
- dig googlechromelabs.github.io +qr
- wget https://googlechromelabs.github.io/chrome-for-testing/known-good-versions-with-downloads.json
- /builds/.venv/lib/python3.10/site-packages/selenium/webdriver/common/linux/selenium-manager --browser chrome --debug --language-binding python --output json
```
Do you know what the cause is? |
When should I use a Python Numpy array?
I understand that NumPy is implemented in C and optimizes memory usage. However, I have never used a NumPy array in my production code. Am I overlooking a way to improve my code?
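As a generic illustration (not tied to any particular codebase) of where a NumPy array pays off: homogeneous numeric math, where one vectorized expression replaces a Python-level loop over a list:

```python
import numpy as np

a = np.array([1, 2, 3, 4])
doubled = a * 2      # elementwise arithmetic, executed in C
total = a.sum()      # fast reduction over the whole array
print(doubled.tolist(), total)
```

A rough rule of thumb: NumPy arrays for homogeneous numeric computation, pandas DataFrames for labeled, mixed-type tabular data; a DataFrame column is typically backed by a NumPy array underneath anyway.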
I have read several articles and referenced old class notes. But I am still not sure when I should opt for using Numpy arrays over a Pandas data frame. |
Let's say I have this example:
```
<div>
<p>some text <em>emphasized text</em> some other text</p>
<p><em>The paragraph I want to capture</em></p>
<p>some text <em>emphasized text</em> some other text and <em>other em text</em> until the end.</p>
</div>
```
What I want to select is the second paragraph (but it may be the third or first as well). The thing is that here `p` and `em` are adjacent: there is no text between `<p>` and `<em>`, neither at the beginning nor at the end. All the text is inside `<em>xyz</em>`.
How can I get it with an XPath query?
I tried `//p/em`, `//p/child::em`, and `//em/parent::p`; all of these select the three paragraphs, as all the `<em>` elements are children of `<p>` elements. |
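One XPath that captures this "all text is inside em" condition is `//p[em and not(text()[normalize-space()])]`: a `p` that has an `em` child and no non-whitespace text node directly under it. A sketch using lxml (assuming lxml is available):

```python
from lxml import html

doc = html.fromstring("""
<div>
  <p>some text <em>emphasized text</em> some other text</p>
  <p><em>The paragraph I want to capture</em></p>
  <p>some text <em>emphasized text</em> and <em>other em text</em> until the end.</p>
</div>
""")
# em child present, and no non-whitespace text node directly under the <p>
hits = doc.xpath('//p[em and not(text()[normalize-space()])]')
print([p.text_content() for p in hits])
```

Note that in the XPath data model the tail text after an `</em>` counts as a text node child of the `p`, which is why the first and third paragraphs are excluded.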
Xpath Select contiguous parent from child or adjacent child of parent |
I am looking for a way of cutting a section diagonally, kind of like the image below using CSS. It's very important that the cut stays the same on all page dimensions and doesn't overlap with the text, while starting from the bottom left corner.
[![enter image description here][1]][1]
The section is a div, if it matters.
I hope you guys can help a bit
This is what I tried, but even if I change my values to fit one resolution, the moment I resize the page, it gets weird
```css
.cut
{
position: relative;
overflow: hidden;
}
.cut:after
{
content: "";
position: absolute;
top: 65%;
left: 0;
height: 104%;
width: 130%;
background: red;
transform: rotate(-5deg);
}
```
[1]: https://i.stack.imgur.com/oLsyR.png |
null |
Recently I was asked to maintain an old image processing project (5 years old) at my company, and it uses OpenCL.
There is a piece of code which works like this:
```
if (oneKernelFlag == true)
    launch a gamma correction kernel on the whole image
else
    break the image into grids (e.g. 2x2)
    for loop (...)  // iterate for each grid
        launch the same gamma correction kernel on each grid
```
Similar logic is used for applying kernels in a few other functions.
The oneKernelFlag is hardcoded, and the project is built separately for each hardware product.
I noticed that execution is much faster when we launch a single kernel (oneKernelFlag == true) compared to multiple kernel launches: almost a 30% reduction in timing.
**Now, I am confused: what is the use of launching the same kernel multiple times on smaller problem spaces? When is this useful?**
Please help. The original developer and the documentation are unavailable, and I could not find concrete details online.
I am trying to create an add-in setup project in VS 2022 for Revit, and I ran into some problems.
I want to create a custom installation wizard where the following actions should be included:
1. License agreement (like the image)
2. Checkboxes to choose the version (like the image)
Example: if (cb2019 == true) => install directory "C:/ProgramData/Revit/Addin/2019";
if (cb2020 == true) => install directory "C:/ProgramData/Revit/Addin/2020" |
How to create a custom setup wizard |
|c#|installation|revit-api| |
null |
If you try to run the test module, it fails because it is not on the path, as @tromgy explained. However, to run tests you don't need the init files or to mess with `sys.path`.
With pytest, you are not supposed to run the test module itself; instead, run pytest from the command line with `python -m pytest tests`. Just make sure you are running from your project folder, i.e. "tau-intro-to-pytest". If you want to run a specific test function, you can [specify that from the command line][1] as well, but there are good VS Code extensions to do that without writing lengthy command line calls. The Python [extension has a test explorer][2] included in it, but I like the UI of [Python Test Explorer for VS Code][3] better.
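For instance, a minimal module that pytest's default discovery picks up needs nothing beyond the naming convention (files named `test_*.py`, functions named `test_*`); the file name here is illustrative:

```python
# tests/test_math.py -- any file matching test_*.py is collected
def add(a, b):
    return a + b

def test_add():
    # collected and run automatically because the name starts with test_
    assert add(2, 3) == 5
```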
[1]: https://docs.pytest.org/en/6.2.x/usage.html#specifying-tests-selecting-tests
[2]: https://code.visualstudio.com/docs/python/testing
[3]: https://marketplace.visualstudio.com/items?itemName=LittleFoxTeam.vscode-python-test-adapter |
null |
I'm getting the following data:
`{'name_work': 'рациональные числа', 'writing_date': datetime.datetime(2024, 3, 15, 16, 18, 37, tzinfo=datetime.timezone.utc), 'student_type': <TypeStudentWork: TypeStudentWork object (2)>, 'number_of_tasks': '10', 'student': <User: Dear Student>, 'example': <Example: ответы к 1 самостоятельной>, 'teacher': <SimpleLazyObject: <User: Diana Berkina>>, 'image_work': <InMemoryUploadedFile: только_ответы.jpg (image/jpeg)>}`
My view is a class that receives this data and sends it to the form:
```python
class CreateStudentWorkView(View):
def post(self, request, type_work_id):
try:
type_work_obj = TypeStudentWork.objects.get(pk=type_work_id)
students = User.objects.filter(student_class=type_work_obj.school_class)
print(students)
for student in students:
image_file = request.FILES.get("image_work_" + str(student.id))
if image_file:
initial_data = {
'name_work': type_work_obj.name_work,
'writing_date': type_work_obj.writing_date,
'student_type': type_work_obj,
'number_of_tasks': '10',
'student': student,
'example': type_work_obj.example,
'teacher': request.user,
'image_work': image_file
}
print("Initial data for student", student.username, ":", initial_data)
form = StudentWorkForm(initial=initial_data)
print(form.errors)
if form.is_valid():
form.save()
else:
print("Errors for student", student.username, ":", form.errors)
except Exception as e:
print(e)
return redirect('moderator')
```
This is the model of the work I'm trying to save:
```python
class StudentWork(models.Model):
name_work = models.CharField(max_length=55)
writing_date = models.DateTimeField()
student_type = models.ForeignKey(
TypeStudentWork,
related_name='type_works',
on_delete=models.CASCADE,
blank=True,
null=True
)
number_of_tasks = models.CharField(max_length=5, default="5")
student = models.ForeignKey(
User,
related_name='student',
on_delete=models.SET_NULL,
null=True
)
example = models.ForeignKey(
Example,
on_delete=models.SET_NULL,
null=True
)
image_work = models.ImageField(upload_to='image')
text_work = models.TextField(null=True, blank=True)
proven_work = models.TextField(null=True, blank=True)
assessment = models.CharField(max_length=10, null=True, blank=True)
teacher = models.ForeignKey(
User,
related_name='teacher',
on_delete=models.SET_NULL,
blank=True,
null=True
)
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)
deleted_at = models.DateTimeField(blank=True, null=True)
```
I've tried extracting the id into the fields whose type is ForeignKey, but as I understand it, I need objects, and I don't know how to get them correctly. |
null |
|c#|asp.net| |
Is this what you're looking for?
```
for i = 1, 10 do
    local part = display.newImage("part" .. i .. ".png", 0, 0)
    part:scale(0.5, 0.5)
    part.isVisible = false
end
```
|
Better to do this using a subdomain, but anyway
- Create a folder with the desired name inside the cPanel public_html
folder, for example, **Laravel**
- Put the contents of the public folder related to Laravel into this
folder
- Create another folder named **app** (or any other desired name) in
  cPanel, outside the **public_html** folder, and put the complete contents
  of Laravel, except the **public** folder, into this folder.
[![enter image description here][1]][1]
- Now go to the public_html folder and then enter the Laravel folder
and open the index.php file and apply the following changes and save
the file:
-
require __DIR__.'/../app/vendor/autoload.php';
$maintenance = __DIR__.'/../app/storage/framework/maintenance.php';
$app = require_once __DIR__.'/../app/bootstrap/app.php';
Update the `.env` file according to the database information and then open the address of your Laravel program
[1]: https://i.stack.imgur.com/g0hsT.jpg |
The simplest option to find the `project-id`:
[![Find the project-id.][1]][1]
If this option is not available, you can search it on the source page:
1. Right-click the main GitLab page of your project
2. Select "View Page Source" (`Ctrl + U` in Chrome)
3. Search for:
```txt
Project ID:
```
There you go.
[1]: https://i.stack.imgur.com/hSf6o.png |
Three cases call GetDetailData, but only the "BOML" call fails with the error: Must declare the scalar variable "@pDBMode". The other calls do not error.
The SQL string is:
string SQLStr = "Exec PR_SPSA03_Qry @pDBMode, @pDBKind, @pCompNo, @pProdNo, @pflow, @pSeq "
After I add a space after "@pDBMode ,", there is no error.
What is the reason?
```csharp
switch (DBKind)
{
case "BOML":
bls = GetDetailData<BOMDTO>(DBKind, ProdNo, ref boms, ref Msg);
if (bls)
{
HF_BData.Set("BOML", JsonConvert.SerializeObject(boms));
grid_BOM.DataSource = boms;
grid_BOM.DataBind();
}
break;
case "PDML":
bls = GetDetailData<PDMDTO>(DBKind, ProdNo, ref pdms, ref Msg);
if (bls)
{
HF_BData.Set("PDML", JsonConvert.SerializeObject(pdms));
grid_PDM.DataSource = pdms;
grid_PDM.DataBind();
}
break;
case "PDFL":
bls = GetDetailData<PDFDTO>(DBKind, ProdNo, ref pdfs, ref Msg);
if (bls)
{
HF_BData.Set("PDFL", JsonConvert.SerializeObject(pdfs));
grid_PDF.DataSource = pdfs;
grid_PDF.DataBind();
}
break;
}
private bool GetDetailData<T>(string DBKind, string ProdNo, ref List<T> Data, ref string Msg)
{
string SQLStr = "Exec PR_SPSA03_Qry @pDBMode, @pDBKind, @pCompNo, @pProdNo, @pflow, @pSeq ";
Function fn = new Function();
DynamicParameters dynParams = new DynamicParameters();
dynParams.Add("@pDBMode", "B");
dynParams.Add("@pDBKind", DBKind);
dynParams.Add("@pCompNo", Session["CompNo"].ToString());
dynParams.Add("@pProdNo", ProdNo);
dynParams.Add("@pflow", "");
dynParams.Add("@pSeq", "");
bool bls = fn.GetData<T>("SPS", SQLStr, dynParams, ref Data, ref Msg);
return bls;
}
public bool GetData<T>(string ConStr, string SqlStr, DynamicParameters parameters, ref List<T> ReturnData, ref string Msg)
{
bool bls = false;
ConStr = GetConStr(ConStr);
try
{
using (SqlConnection conn = new SqlConnection(ConStr))
{
conn.Open();
ReturnData = conn.Query<T>(SqlStr, parameters).ToList();
}
bls = true;
}
catch (Exception ex)
{
Msg = ex.Message;
}
return bls;
}
```
|
How to populate a ListView from a text file in Xamarin Forms |
|xamarin-forms-4| |
I have a simple gRPC asynchronous server.
It's also multi-threaded: one thread handles one completion queue.
But thread scalability is very poor as the number of threads grows:
- For small messages, the best performance is with only one thread.
- For big messages, performance only doubles from 1 to 4 threads, and a higher thread count doesn't increase performance further (on a server with 64 cores).

Is this normal behaviour?
The server looks like this:
```c++
class Server final
{
public:
Server(const Config& config, Service& sync_service)
: m_config(config),
m_sync_service(sync_service)
{
// Nothing to do
}
~Server()
{
stop();
}
void stop()
{
m_stopped = true;
m_server->Shutdown();
for(auto& cq: m_cqs)
{
cq->Shutdown();
}
}
void run_and_wait(const std::string& grpc_address_port)
{
grpc::ServerBuilder builder;
builder.AddListeningPort(grpc_address_port, grpc::InsecureServerCredentials());
builder.RegisterService(&m_async_service);
const int num_threads = m_config.threads;
const int threads_per_cq = 1;
assert(num_threads % threads_per_cq == 0);
for(int i = 0; i < ceil_div(num_threads, threads_per_cq); i++)
{
auto& cq = m_cqs.emplace_back(builder.AddCompletionQueue());
}
// Finally assemble the server.
m_server = builder.BuildAndStart();
std::cout << "Server listening on " << grpc_address_port << std::endl;
// Proceed to the server's main loop.
std::vector<std::thread> threads;
for(int i = 0; i < num_threads; i++)
{
grpc::ServerCompletionQueue* cq = m_cqs[i / threads_per_cq].get();
for(int j = 0; j < m_config.concurrent_calldatas; j++)
{
new CallDataUnary<IMethod>(m_async_service, m_sync_service, *cq);
}
threads.emplace_back([this, cq]() {
handle_rpcs(*cq);
});
}
// Just wait all poller threads to stop
for(auto& thread: threads)
{
if(thread.joinable())
{ thread.join(); }
}
}
private:
class CallDataBase
{
public:
virtual ~CallDataBase() = default;
virtual void proceed() = 0;
virtual void wait_for_new_request() = 0;
enum class State
{
WAIT_REQUEST,
FINISH
};
protected:
State m_state{State::WAIT_REQUEST};
};
template<typename IMethod>
class CallDataUnary : public CallDataBase
{
public:
CallDataUnary(AsyncService& service, Service& sync_service, grpc::ServerCompletionQueue& cq)
: m_service(service),
m_sync_service(sync_service),
m_writer(&m_context),
m_cq(cq)
{
wait_for_request();
}
void wait_for_new_request()
{
(new CallDataUnary(m_service, m_sync_service, m_cq));
}
void wait_for_request()
{
// Equivalent of calling the RequestXXX()
IMethod::request(
m_service,
&m_context,
&m_request,
&m_writer,
&m_cq,
&m_cq,
this
);
}
void proceed() final
{
if(m_state == State::WAIT_REQUEST)
{
wait_for_new_request();
m_state = CallDataBase::State::FINISH;
// Call the business logic
grpc::Status status = IMethod::dispatch_grpc_sync(m_sync_service, &m_context, &m_request, &m_response);
// At the end to avoid data races.
m_writer.Finish(m_response, status, this);
}
else
{
GPR_ASSERT(m_state == State::FINISH);
delete this;
}
}
private:
/// The service of the RPC method that this CallData is listening to.
AsyncService& m_service;
Service& m_sync_service;
// The gRPC request for this RPC call.
grpc::ServerContext m_context;
/// The gRPC request sent by the client.
RequestCPP m_request;
/// The gRPC response to send to the client when the request is completed.
ResponseCPP m_response;
/// The writer to write the response to the client.
grpc::ServerAsyncResponseWriter<ResponseCPP> m_writer;
grpc::ServerCompletionQueue& m_cq;
};
// This can be run in multiple threads if needed.
void handle_rpcs(grpc::ServerCompletionQueue& cq)
{
void* tag; // uniquely identifies a request.
bool ok;
while(!m_stopped)
{
// Block waiting to read the next event from the completion queue. The
// event is uniquely identified by its tag, which in this case is the
// memory address of a CallData instance.
// The return value of Next should always be checked. This return value
// tells us whether there is any kind of event or m_cq is shutting down.
GPR_ASSERT(cq.Next(&tag, &ok));
auto call = static_cast<CallDataBase*>(tag);
if(ok)
{
call->proceed();
}
else
{
call->wait_for_new_request();
delete call;
}
}
}
Config m_config;
std::vector<std::unique_ptr<grpc::ServerCompletionQueue>> m_cqs;
AsyncService m_async_service;
Service& m_sync_service;
std::unique_ptr<grpc::Server> m_server;
bool m_stopped{false};
};
``` |
gRPC async server: poor thread scalability |
|c++|grpc| |
> Could an unitialized pointer be a NULL pointer?
An uninitialized pointer could be a null pointer.
(Do not use “NULL” for a null pointer. `NULL` is a specific macro defined by standard C headers with some different implications.)
> From what I have read a NULL pointer points to the memory location "0"…
Many C implementations use memory address zero for null pointers, but this is not the only possibility. The C standard merely requires a null pointer to compare unequal to any object or function in the C program.
> … and an unitialized pointer points to a random location?
No, “random” is a [different concept](https://en.wikipedia.org/wiki/Randomness). Saying something is random is asserting there is a lack of pattern or predictability to it. While it would not violate the C standard for an uninitialized object to behave randomly, the C standard does not say it does that, and it is unlikely that an uninitialized object would actually behave randomly rather than having at least some pattern of behavior arising out of the programming environment, although not a pattern controlled by the C standard.
An uninitialized object is said to have an *indeterminate* value. An indeterminate value is not any particular value but is a description of how the object behaves. Effectively, it means the C standard does not require the object to have a determined value—in other words, the value is not fixed; it can appear to be different at different times—and does not require a C implementation to take the value from any particular place, including the memory reserved for the object.
This means that, during optimization, a compiler may take the value of the object from memory, from a processor register, or anywhere else, and it may use different sources at different times, and other aspects of the program or the environment may change whatever source or sources the program is using for the value. So the value of the object in the program may appear to change.
For example, if `a` is an uninitialized `int`, then `printf("%d\n", a); printf("%d\n", a);` could print two different numbers.
> Could it be that this random location sometimes is the "0" so that it is a NULL pointer as well?
It can happen that the value used for an indeterminate pointer is a null pointer. |
You can use these rules for the first two steps:

    rules:
      - if: '$CI_MERGE_REQUEST && $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "main"'
        when: always
      - if: '$CI_MERGE_REQUEST == null'
        when: always

And to run only on frontend changes, filter by folder structure like this:

    only:
      - changes:
          - web/**/*
      - master
|
I am trying to deploy Airflow on Kubernetes with Istio. Here is my VirtualService config:
    apiVersion: networking.istio.io/v1beta1
    kind: VirtualService
    metadata:
      name: myapp-virtualservice
      namespace: mynamespace
    spec:
      hosts:
      - "myapp.example.com"
      gateways:
      - mygateway
      http:
      - match:
        - uri:
            prefix: /api/v1/
        route:
        - destination:
            host: backend-service
            port:
              number: 8000
      - match:
        - uri:
            prefix: /airflow/home/
        rewrite:
          uri: /home
        route:
        - destination:
            host: airflow-service
            port:
              number: 8080
      - match:
        - uri:
            prefix: /
        route:
        - destination:
            host: frontend-service
            port:
              number: 443
So when I access https://myapp.example.com/airflow/home/, it reaches the Airflow webserver in the pod, and I can see this log:

    10.196.182.95 - - [20/Mar/2024:15:51:39 +0530] "GET /home HTTP/1.1" 302 319 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36 Edg/122.0.0.0"

But then it tries redirecting to the login page based on the `Location` header, `https://myapp.example.com/login/?next=https%3A%2F%2Fmyapp.example.com%2Fhome`, but it cannot find that page, I think. It then redirects to `https://myapp.example.com/404?next=https:%2F%2Fmyapp.example.com%2Fhome`, and that's it. I cannot reach the Airflow UI at all; I get a 404 error every time.

How can I fix the redirection in this case?
Here is my airflow.cfg for webserver:
    [webserver]
    base_url = https://myapp.example.com/airflow/
    web_server_host = 0.0.0.0
    web_server_port = 8080
    web_server_worker_timeout = 1200
    enable_proxy_fix = True
    web_server_ssl_cert = /airflow/cert/tls.crt
    web_server_ssl_key = /airflow/cert/tls.key
I tried accessing the webserver without Istio:

    kubectl port-forward svc/airflow-service 8080:8080

and I was able to reach the Airflow UI and the login page on localhost:8080 locally on my machine, so it seems that Airflow is set up correctly but something might be wrong with Istio. Any ideas?
EDIT: This approach worked, but it's not very clean, and honestly I would prefer to have something that works in a proper way:
    - match:
      - uri:
          prefix: /airflow/home/
      - uri:
          prefix: /airflow/home
      rewrite:
        uri: /home
      route:
      - destination:
          host: airflow-service
          port:
            number: 8080
    - match:
      - uri:
          prefix: /login
      rewrite:
        uri: /login
      route:
      - destination:
          host: airflow-service
          port:
            number: 8080
So in this setup I can go to `myapp.example.com/login` and log in to Airflow first. It then redirects me to `myapp.example.com/home` (which is a page in my base app, not in Airflow). But since I am now logged in, I can access `myapp.example.com/airflow/home`, and the Airflow app no longer redirects me to the `myapp.example.com/login` page (which previously caused the 404), so I am able to use Airflow.

But it would be nice to have the `/airflow/login` and `/airflow/home` redirection working correctly, served from a proper `/airflow/login` page rather than `/login`. |
Data
```
CREATE TABLE mytable(
ID INTEGER NOT NULL
,PartNum VARCHAR(40) NOT NULL
,EnteredOn VARCHAR(40) NOT NULL
,PickedTime VARCHAR(40)
,DeliveredTime VARCHAR(40)
);
INSERT INTO mytable(ID,PartNum,EnteredOn,PickedTime,DeliveredTime) VALUES
(100,'50A','2024-03-28 08:59:13.727','2024-03-28 09:30:20.237','2024-03-28 09:56:42.570'),
(125,'60B','2024-03-28 08:59:22.290','2024-03-28 09:31:32.543','2024-03-28 09:56:50.683'),
(171,'50A','2024-03-28 14:31:28.480',NULL,NULL),
(211,'70B','2024-03-28 14:31:33.613',NULL,NULL);
```
Use unpivot and pivot together
```
select *
from
(
select CONCAT(PartNum,'_',ID) COL
,st,
t_stamp
from (select ID,PartNum,
EnteredOn as Entered,
PickedTime as Picked,
DeliveredTime as Delivered
from mytable) m
unpivot
(
t_stamp
for st in ([Entered]
,[Picked]
,[Delivered]
)
) unpiv
) src
pivot
(
max(st)
for COL in ([50A_100], [60B_125], [50A_171],[70B_211])
) piv;
```
You can use a dynamic query as follows:
```
DECLARE @SQL nvarchar(max)
DECLARE @cols nvarchar(max)
set @cols=(select string_agg(CONCAT('[',PartNum,'_',ID,']'),',') from mytable)
--print(@cols)
set @SQL='
select *
from
(
select CONCAT(PartNum,''_'',ID) COL
,st,
t_stamp
from (select ID,PartNum,
EnteredOn as Entered,
PickedTime as Picked,
DeliveredTime as Delivered
from mytable) m
unpivot
(
t_stamp
for st in ([Entered]
,[Picked]
,[Delivered]
)
) unpiv
) src
pivot
(
max(st)
for COL in (' + @cols + ' )
) piv;
'
--print @SQL
exec(@SQL)
```
[dbfiddle](https://dbfiddle.uk/Nim6l3cd) |
**Description**:
I recently purchased a Moto e22 device running Android 12 and encountered several issues while trying to build my Flutter project using Android Studio.
**First Problem:**
After creating a new Flutter project with no modifications, when I attempted to run the project from Android Studio using the play button, the build succeeded, but I encountered a white screen with no logs in the run tab. However, if I manually kill the app and relaunch it on the Android device without any further build from Android Studio, it runs without any issues.
**Second Problem:**
When I tried running the project using flutter run --verbose or flutter run --debug, the app installed successfully on my Android device, but I got stuck with the message "Waiting for VM Service port to be available" or "Built build/app/outputs/flutter-apk/app-debug.apk" without any further actions such as hot reload or hot restart.
Here's a summary of my `flutter doctor` output:
```
Doctor summary (to see all details, run flutter doctor -v):
[✓] Flutter (Channel main, 3.21.0-17.0.pre.19, on macOS 12.6.8 21G725 darwin-x64, locale en-GB)
[✓] Android toolchain - develop for Android devices (Android SDK version 34.0.0)
[!] Xcode - develop for iOS and macOS (Xcode 14.2)
! CocoaPods 1.12.1 out of date (1.13.0 is recommended).
CocoaPods is used to retrieve the iOS and macOS platform side's plugin code that responds to your plugin usage on the Dart side.
Without CocoaPods, plugins will not work on iOS or macOS.
For more info, see https://flutter.dev/platform-plugins
To upgrade see https://guides.cocoapods.org/using/getting-started.html#updating-cocoapods for instructions.
[✓] Chrome - develop for the web
[✓] Android Studio (version 2023.2)
[✓] VS Code (version 1.84.2)
[✓] Connected device (3 available)
[✓] Network resources
! Doctor found issues in 1 category.
```
I've attempted various solutions found in threads, including changing the Flutter channel, but the issues persist.
Here's the output of my logcat:
```
2024-03-29 06:58:54.905 669-830 UxUtility ven...hardware.mtkpower@1.0-service E notifyAppState error = NULL
2024-03-29 06:58:54.926 2937-2937 PhoneInterfaceManager com.android.phone E [PhoneIntfMgr] getCarrierPackageNamesForIntentAndPhone: No UICC
2024-03-29 06:58:54.943 3886-3886 AidRoutingManager com.android.nfc E Size of routing table804
2024-03-29 06:58:54.958 681-681 BpTransact...edListener surfaceflinger E Failed to transact (-32)
2024-03-29 06:58:54.958 681-681 BpTransact...edListener surfaceflinger E Failed to transact (-32)
2024-03-29 06:58:54.986 681-834 BufferQueueDebug surfaceflinger E [com.android.settings/com.android.settings.SubSettings#0](this:0xb400007a470532e8,id:-1,api:0,p:-1,c:-1) id info cannot be read from 'com.android.settings/com.android.settings.SubSettings#0'
2024-03-29 06:58:55.381 681-834 BufferQueueDebug surfaceflinger E [Splash Screen com.whimstech.driverapp.new_flutter_project#0](this:0xb400007a47069f68,id:-1,api:0,p:-1,c:-1) id info cannot be read from 'Splash Screen com.whimstech.driverapp.new_flutter_project#0'
2024-03-29 06:58:55.494 669-830 UxUtility ven...hardware.mtkpower@1.0-service E notifyAppState error = NULL
2024-03-29 06:58:55.633 6026-6043 OpenGLRenderer com.android.settings E EglManager::makeCurrent mED = 0x1, surface = 0x0, mEC = 0xb4000078bd6be650, error = EGL_SUCCESS
2024-03-29 06:58:55.741 6249-6271 QT pid-6249 E [QT]file does not exist
2024-03-29 06:58:56.370 681-835 BufferQueueDebug surfaceflinger E [com.whimstech.driverapp.new_flutter_project/com.whimstech.driverapp.new_flutter_project.MainActivity#0](this:0xb400007a47042188,id:-1,api:0,p:-1,c:-1) id info cannot be read from 'com.whimstech.driverapp.new_flutter_project/com.whimstech.driverapp.new_flutter_project.MainActivity#0'
2024-03-29 06:58:56.400 6249-6269 OpenGLRenderer com...driverapp.new_flutter_project E EglManager::makeCurrent mED = 0x1, surface = 0x0, mEC = 0xb4000078bd6b8110, error = EGL_SUCCESS
2024-03-29 06:58:56.400 681-835 BufferQueueDebug surfaceflinger E [SurfaceView[com.whimstech.driverapp.new_flutter_project/com.whimstech.driverapp.new_flutter_project.MainActivity](BLAST)#0](this:0xb400007a47056078,id:-1,api:0,p:-1,c:-1) id info cannot be read from 'SurfaceView[com.whimstech.driverapp.new_flutter_project/com.whimstech.driverapp.new_flutter_project.MainActivity](BLAST)#0'
2024-03-29 06:58:56.407 681-835 HWComposer surfaceflinger E getSupportedContentTypes: getSupportedContentTypes failed for display 0: Unsupported (8)
2024-03-29 06:58:56.413 6249-6276 ion com...driverapp.new_flutter_project E ioctl c0044901 failed with code -1: Invalid argument
2024-03-29 06:59:00.457 24427-4404 WakeLock com.google.android.gms.persistent E GCM_HB_ALARM release without a matched acquire!
2024-03-29 06:59:00.470 6249-6269 OpenGLRenderer com...driverapp.new_flutter_project E fbcNotifyFrameComplete error: undefined symbol: fbcNotifyFrameComplete
2024-03-29 06:59:00.470 6249-6269 OpenGLRenderer com...driverapp.new_flutter_project E fbcNotifyNoRender error: undefined symbol: fbcNotifyNoRender
```
|
Flutter project build issues on Moto e22 running Android 12 |
|java|android|flutter|logcat| |
I don't have sudo access and contacting the sysadmin takes a non-trivial amount of time.
Here is the output of `nvcc -V`:

    nvcc: NVIDIA (R) Cuda compiler driver
    Copyright (c) 2005-2024 NVIDIA Corporation
    Built on Tue_Feb_27_16:19:38_PST_2024
    Cuda compilation tools, release 12.4, V12.4.99
    Build cuda_12.4.r12.4/compiler.33961263_0
Output of `nvidia-smi`
```
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 550.67 Driver Version: 550.67 CUDA Version: 12.4 |
|-----------------------------------------+------------------------+----------------------+
| GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|=========================================+========================+======================|
| 0 NVIDIA RTX A6000 Off | 00000000:1C:00.0 Off | Off |
| 30% 32C P8 19W / 300W | 23MiB / 49140MiB | 0% Default |
| | | N/A |
+-----------------------------------------+------------------------+----------------------+
| 1 NVIDIA RTX A6000 Off | 00000000:1E:00.0 Off | Off |
| 30% 33C P8 20W / 300W | 11MiB / 49140MiB | 0% Default |
| | | N/A |
+-----------------------------------------+------------------------+----------------------+
| 2 NVIDIA RTX A6000 Off | 00000000:3D:00.0 Off | Off |
| 30% 32C P8 27W / 300W | 11MiB / 49140MiB | 0% Default |
| | | N/A |
+-----------------------------------------+------------------------+----------------------+
| 3 NVIDIA RTX A6000 Off | 00000000:3E:00.0 Off | Off |
| 30% 34C P8 25W / 300W | 11MiB / 49140MiB | 0% Default |
| | | N/A |
+-----------------------------------------+------------------------+----------------------+
| 4 NVIDIA RTX A6000 Off | 00000000:3F:00.0 Off | Off* |
|ERR! 49C P5 ERR! / 300W | 11MiB / 49140MiB | 0% Default |
| | | N/A |
+-----------------------------------------+------------------------+----------------------+
| 5 NVIDIA RTX A6000 Off | 00000000:40:00.0 Off | Off |
| 30% 31C P8 6W / 300W | 11MiB / 49140MiB | 0% Default |
| | | N/A |
+-----------------------------------------+------------------------+----------------------+
| 6 NVIDIA RTX A6000 Off | 00000000:41:00.0 Off | Off |
| 30% 31C P8 16W / 300W | 11MiB / 49140MiB | 0% Default |
| | | N/A |
+-----------------------------------------+------------------------+----------------------+
| 7 NVIDIA RTX A6000 Off | 00000000:5E:00.0 Off | Off |
| 30% 29C P8 6W / 300W | 11MiB / 49140MiB | 0% Default |
| | | N/A |
+-----------------------------------------+------------------------+----------------------+
+-----------------------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=========================================================================================|
| 0 N/A N/A 4216 G /usr/libexec/Xorg 9MiB |
| 0 N/A N/A 4466 G /usr/bin/gnome-shell 4MiB |
| 1 N/A N/A 4216 G /usr/libexec/Xorg 4MiB |
| 2 N/A N/A 4216 G /usr/libexec/Xorg 4MiB |
| 3 N/A N/A 4216 G /usr/libexec/Xorg 4MiB |
| 4 N/A N/A 4216 G /usr/libexec/Xorg 4MiB |
| 5 N/A N/A 4216 G /usr/libexec/Xorg 4MiB |
| 6 N/A N/A 4216 G /usr/libexec/Xorg 4MiB |
| 7 N/A N/A 4216 G /usr/libexec/Xorg 4MiB |
+-----------------------------------------------------------------------------------------+
```
When I try to run:
```
cuda_available = torch.cuda.is_available()
print("CUDA Available:", cuda_available)
if cuda_available:
print("CUDA version:", torch.version.cuda)
print("cuDNN version:", torch.backends.cudnn.version())
else:
print("CUDA not available")
```
I get the following error:

    /home/user_name/anaconda3/envs/llm2/lib/python3.10/site-packages/torch/cuda/__init__.py:141: UserWarning: CUDA initialization: CUDA driver initialization failed, you might not have a CUDA gpu. (Triggered internally at ../c10/cuda/CUDAFunctions.cpp:108.)
      return torch._C._cuda_getDeviceCount() > 0
    CUDA Available: False
    CUDA not available

Is it possible to fix this error without sudo access?
The two possible solutions are:

1. Update the drivers
2. Build PyTorch for CUDA 12.4 from source

IIRC both of these require sudo access.
|
Very simple: do the compression, then snip the result.
    import gzip

    plain = b"Stuff"
    compressed = gzip.compress(plain)
    bad_compressed = compressed[:-1]

    gzip.decompress(bad_compressed)  # EOFError
Even easier, just two bytes is enough for the `gzip` module to recognise the gzip format, but is obviously not a complete file.
    bad_compressed = b'\x1f\x8b'
    gzip.decompress(bad_compressed)  # EOFError
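Either snipped stream then fails exactly as intended; a quick self-check of the truncation trick:

```python
import gzip

# Compress, then drop the final byte of the 8-byte gzip trailer
# (CRC32 + uncompressed size) to simulate a truncated file.
bad_compressed = gzip.compress(b"Stuff")[:-1]

try:
    gzip.decompress(bad_compressed)
    raised = False
except EOFError:
    raised = True

print("EOFError raised:", raised)  # EOFError raised: True
```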
This is in-memory for the simplicity of demonstration; it would work the same if you manipulated the file instead of the string. For example:
echo Stuff | gzip | head -c 2 >file.gz |
Recently I have been working on embedded device development with .NET. Not only the program but also the database is stored on the SD card.

Now I have a situation.

Because the embedded device is used in a workshop, the database may be damaged when there is an uncontrollable power outage, or when wires fall off due to people moving around, causing the equipment to suddenly lose power.

There is nothing I can do to avoid the events that cause a blackout, so I have to find a way to back up the database. When EF Core cannot read the database, I will use the backup to restore it.

An immature idea I have is to create another identical database and regularly back up the contents of the main database to this backup database. Although I know both databases may be damaged at the same time, it at least reduces the probability of data loss to a certain extent.

I guess that either EF Core or SQLite should have some way to deal with a power outage that occurs while the database is being read or written. Unfortunately, I really couldn't find a relevant solution, so I had to think of using two databases to avoid failures.
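The periodic-copy idea maps directly onto SQLite's online backup API, which copies a live database page by page inside its own transaction, so the copy stays consistent even while the source is being written. EF Core has no wrapper for it, but the mechanism is language-agnostic; here is a sketch using Python's `sqlite3` module purely to illustrate the idea (file names are placeholders):

```python
import os
import sqlite3
import tempfile

tmp = tempfile.mkdtemp()
main_path = os.path.join(tmp, "main.db")      # stand-in for the device DB
backup_path = os.path.join(tmp, "backup.db")  # the periodic backup copy

# Populate the "main" database.
src = sqlite3.connect(main_path)
src.execute("CREATE TABLE readings (id INTEGER PRIMARY KEY, v REAL)")
src.execute("INSERT INTO readings (v) VALUES (42.0)")
src.commit()

# SQLite's online backup API: safe to run while src is still in use.
dst = sqlite3.connect(backup_path)
src.backup(dst)

count = dst.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
src.close()
dst.close()
```

Separately, enabling SQLite's write-ahead log (`PRAGMA journal_mode=WAL;`) is usually the first line of defence against power-loss corruption, since it makes interrupted writes recoverable on the next open.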
How should I backup SQLite database by EFCore? |
PostgreSQL doesn't respond to `select * from claims where case_number='22222'`, but it does respond to `select * from claim`. I don't understand the reason; can anyone help?

I have restarted PostgreSQL, and afterwards it works normally for 5-10 minutes.

PostgreSQL version 12 |
PostgreSQL doesn't respond to `select * from claims where case_number='22222'` but responds to `select * from claim` |
|postgresql| |
null |
I want to send image push and pull events to Kafka using the Azure ACR webhook feature.

Event Hub is not available due to cost.

The webhook is configured to post to the Kafka REST Proxy URI with the header `Content-Type: application/vnd.kafka.json.v2+json`.
When an event occurs:

Headers:

    User-Agent: AzureContainerRegistry/1.0
    traceparent: 00-1318068fc8e0ca8904b19865b65d551b-d98ef2405fa798bf-01
    Content-Type: application/vnd.kafka.json.v2+json; charset=utf-8
    Content-Length: 104

Payload:

    {
        "id": "a00c973c-a717-425d-9db6-874a6288728c",
        "timestamp": "2024-03-13T06:56:49.2277133Z",
        "action": "ping"
    }
The message is sent, but the response is `{"error_code":422,"message":"Unrecognized field: id"}`.

This seems to be because the webhook payload is sent without the `records`/`value` envelope that the REST Proxy expects:

    { "records":[{"value": { *ACTUAL DATA* }}]}
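For reference, the `json.v2` embedded format always needs that envelope. If a small relay could be placed in front of the proxy, the wrapping itself is trivial; a Python sketch, purely to illustrate the required shape:

```python
import json

def wrap_for_kafka_rest(raw_payload: dict) -> dict:
    """Wrap an arbitrary JSON payload in the records/value envelope
    required by the Kafka REST Proxy's json.v2 embedded format."""
    return {"records": [{"value": raw_payload}]}

# The ACR webhook's ping payload from above.
acr_event = {
    "id": "a00c973c-a717-425d-9db6-874a6288728c",
    "timestamp": "2024-03-13T06:56:49.2277133Z",
    "action": "ping",
}

body = json.dumps(wrap_for_kafka_rest(acr_event))
```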
In this environment, where the message payload format cannot be edited, is there a solution that adapts the message to the expected format before sending it to Kafka, or a more efficient REST-based solution that can accept such a simple message and forward it to Kafka?

Please let me know if there is a configuration that allows sending messages to Kafka using Azure webhooks without Event Hub.

Thank you. |
How to input data through Kafka REST in an environment where message payload cannot be edited |