I am trying to solve an optimization problem on a graph, where basically there is a cost for transporting a "package" over a single edge, and I have multiple sou
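The visible description (per-edge transport costs, multiple sources) reads like a minimum-cost flow setup; since the question is cut off, the following is only a minimal sketch of that framing using networkx, with a made-up tiny instance (node names, supplies, and capacities are all hypothetical):

    import networkx as nx

    # Hypothetical tiny instance: two sources supply "packages", one sink needs them,
    # and each edge's 'weight' is the cost of moving one package over that edge.
    G = nx.DiGraph()
    G.add_node("s1", demand=-3)   # supplies 3 packages (negative demand = supply)
    G.add_node("s2", demand=-2)
    G.add_node("t", demand=5)     # needs 5 packages
    G.add_edge("s1", "hub", weight=1, capacity=10)
    G.add_edge("s2", "hub", weight=2, capacity=10)
    G.add_edge("hub", "t", weight=1, capacity=10)

    flow = nx.min_cost_flow(G)    # flow[u][v] = packages sent over edge (u, v)
    print(flow, nx.cost_of_flow(G, flow))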
I am trying to decompose an overlapping set of cuboids into non-overlapping ones; the fewer non-overlapping cuboids used, the better. For example, in the case belo
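Since the example in the question is cut off, the sketch below only illustrates the usual building block for such decompositions: subtracting one axis-aligned box from another, which yields at most six disjoint pieces. The box representation and the greedy decompose() loop are my own assumptions, and note that greedy subtraction does not minimize the number of resulting cuboids, which is the stated goal.

    def subtract(a, b):
        """Pieces of axis-aligned box a = ((x0, y0, z0), (x1, y1, z1)) lying outside box b."""
        lo = [max(a[0][i], b[0][i]) for i in range(3)]
        hi = [min(a[1][i], b[1][i]) for i in range(3)]
        if any(lo[i] >= hi[i] for i in range(3)):
            return [a]                        # no overlap: a is already disjoint from b
        pieces, rlo, rhi = [], list(a[0]), list(a[1])
        for i in range(3):                    # slice off the slabs of a outside the overlap, axis by axis
            if rlo[i] < lo[i]:
                pieces.append((tuple(rlo), tuple(rhi[:i] + [lo[i]] + rhi[i+1:])))
                rlo[i] = lo[i]
            if hi[i] < rhi[i]:
                pieces.append((tuple(rlo[:i] + [hi[i]] + rlo[i+1:]), tuple(rhi)))
                rhi[i] = hi[i]
        return pieces                         # what remains of a lies inside b and is dropped

    def decompose(boxes):
        disjoint = []
        for b in boxes:
            pieces = [b]
            for kept in disjoint:
                pieces = [q for p in pieces for q in subtract(p, kept)]
            disjoint.extend(pieces)
        return disjoint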
I have a function shown below:

    def _extract_parent(matched_list, json_data, info_type):
        return [json_data[match_lst][info_type] for match_lst in matched_list]
I have been trying to write code using a genetic algorithm to find the minimum number of NAND gates, but I have no idea of how to take a chromosome or case to find na
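One common encoding, offered only as a hedged sketch since the target function in the question is cut off: a chromosome is a list of gates, each gate a pair of indices into the signals produced so far (primary inputs first, then earlier gate outputs), with the last gate taken as the circuit output. The evaluate()/random_chromosome() names and the XOR target are my own; only the encoding and fitness evaluation are shown, not the mutation/crossover loop.

    import itertools, random

    def evaluate(chromosome, n_inputs, target):
        """Number of truth-table rows where the circuit's last gate matches target(*inputs)."""
        correct = 0
        for bits in itertools.product([0, 1], repeat=n_inputs):
            signals = list(bits)
            for i, j in chromosome:
                signals.append(1 - (signals[i] & signals[j]))  # NAND of two earlier signals
            if signals[-1] == target(*bits):
                correct += 1
        return correct

    def random_chromosome(n_inputs, n_gates):
        chrom = []
        for g in range(n_gates):
            limit = n_inputs + g              # a gate may only read inputs or earlier gates
            chrom.append((random.randrange(limit), random.randrange(limit)))
        return chrom

    # Toy check via random search; a GA would mutate/crossover these index pairs instead.
    best = max((random_chromosome(2, 4) for _ in range(2000)),
               key=lambda c: evaluate(c, 2, lambda a, b: a ^ b))
    print(best, evaluate(best, 2, lambda a, b: a ^ b))   # 4/4 rows correct means XOR was found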
I know there is a longest increasing subsequence algorithm that runs in O(n log n) (https://www.geeksforgeeks.org/longest-monotonically-increasing-subsequence-siz
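For reference, the O(n log n) approach keeps, for each subsequence length, the smallest possible tail value and updates it with binary search. A minimal Python sketch follows; it returns only the length, which may not be what the cut-off question actually asks for.

    import bisect

    def lis_length(nums):
        tails = []                            # tails[i] = smallest tail of an increasing subsequence of length i + 1
        for x in nums:
            i = bisect.bisect_left(tails, x)  # first position whose tail is >= x
            if i == len(tails):
                tails.append(x)               # x extends the longest subsequence found so far
            else:
                tails[i] = x                  # x gives a smaller tail for length i + 1
        return len(tails)

    print(lis_length([10, 9, 2, 5, 3, 7, 101, 18]))   # 4  (e.g. 2, 3, 7, 18)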
Before reading, I must emphasise that I have demanding performance requirements (not premature optimization; processing millions of messages and need to design with p
I want to run the same Java Spark Streaming job (10-second micro batches) through 2 instances (sparkStr1 and sparkStr2). Mainly, they consume the same Kafka topic (3
    #include <stdio.h>
    #include <iostream>
    #include <string>
    #include <chrono>
    #include <memory>
    #include <cstdlib>
    #include <
This code block is from the OR-Tools docs, and I want to remove these for-loops. Is there a way to vectorize the code? The issue here is that I expect to have the n
This is a follow-up to my previous question here. I have an optimization model that tries to find the highest coverage of a set of probes to a sequence. I approac
I'm developing a Python GraphQL API server using FastAPI and Strawberry, and I'm implementing this feature: I have two entities, User and Order, which have a on
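Assuming the cut-off sentence describes a one-to-many User-to-Order relationship, a minimal Strawberry sketch might look like the following. The field names, the in-memory data, and the resolver body are hypothetical; a real server would back the resolver with a database query or DataLoader and mount the schema on FastAPI via strawberry.fastapi.GraphQLRouter.

    from typing import List
    import strawberry

    @strawberry.type
    class Order:
        id: int
        total: float

    @strawberry.type
    class User:
        id: int
        name: str

        @strawberry.field
        def orders(self) -> List[Order]:
            # Hypothetical: resolved lazily, only when the query asks for a user's orders.
            return [Order(id=1, total=9.99)]

    @strawberry.type
    class Query:
        @strawberry.field
        def user(self, id: int) -> User:
            return User(id=id, name="alice")

    schema = strawberry.Schema(query=Query)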
Let's say we have a struct Original like this: class Original { int x; bool y; bool z; }; Due to alignment, sizeof(Original) is 8 bytes: 4 for the int,
While testing things around Compiler Explorer, I tried out the following overflow-free function for calculating the average of 2 unsigned 32-bit integers: uint3
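The function itself is cut off, but one well-known overflow-free formulation relies on the identity a + b = (a ^ b) + 2*(a & b); it is shown below in Python with explicit 32-bit masking to mimic uint32_t arithmetic (whether this matches the question's exact code is an assumption).

    M = 0xFFFFFFFF                     # mask to simulate uint32_t

    def avg_u32(a, b):
        a &= M
        b &= M
        # (a & b) + ((a ^ b) >> 1) == floor((a + b) / 2), and no intermediate value exceeds max(a, b)
        return ((a & b) + ((a ^ b) >> 1)) & M

    print(avg_u32(0xFFFFFFFF, 0xFFFFFFFD))   # 4294967294, with no 32-bit overflow along the way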
I am attempting to solve a coding challenge, but my solution is not very performant; I'm looking for advice or suggestions on how I can improve my algorithm.
I am trying to optimize a portfolio for a certain time period. Error message:
I have a function:

    function t = strength(x)
    t(1) = -0.804 - 0.001.*x(1) - 0.001.*x(2) + 0.0.*x(3) + 0.0.*x(4) - 0.031.*x(5) + 0.005.*x(6) - 0.002.*x(7) + 0.001.*x(8) - 0.735.
    from collections import deque

    def findMax(hardDiskSpace, k):
        n = len(hardDiskSpace)
        if n * k == 0:
            return []
        if k > n:
            return []
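Judging from the deque import and the signature, this looks like a sliding-window maximum problem; since the body is cut off, here is only the standard monotonic-deque technique as a separate sketch (the function and argument names are mine, not from the question).

    from collections import deque

    def sliding_window_max(values, k):
        dq, out = deque(), []          # dq holds indices; values[dq[0]] is the current window's max
        for i, v in enumerate(values):
            while dq and values[dq[-1]] <= v:
                dq.pop()               # drop elements that can never be a future maximum
            dq.append(i)
            if dq[0] <= i - k:
                dq.popleft()           # drop the index that just left the window
            if i >= k - 1:
                out.append(values[dq[0]])
        return out

    print(sliding_window_max([1, 3, -1, -3, 5, 3, 6, 7], 3))   # [3, 3, 5, 5, 6, 7]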
I have a simple JSON-reading class that should grab values from a JSON object and put them into C# variables. Right now it uses 8 if statements, but I was wondering
I have the following objective function: Minimize(sum(abs(di[i] - d[i]))), and I want to add a constraint to make sure that d[i] appears at least k times i
I want to disable the computation of several filters during a Predict call with TensorFlow 2 and Keras. Do I have to modify the TensorFlow source code to achieve
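You generally should not need to modify the TensorFlow source; one workaround (an assumption on my part, since the question is cut off) is to multiply a conv layer's output channels by a fixed 0/1 mask, which zeroes the unwanted filters' contributions rather than truly skipping their computation. The model below is a made-up toy.

    import numpy as np
    import tensorflow as tf

    mask = np.ones(8, dtype="float32")
    mask[[2, 5]] = 0.0                          # hypothetical: "disable" filters 2 and 5

    inputs = tf.keras.Input(shape=(32, 32, 3))
    x = tf.keras.layers.Conv2D(8, 3, padding="same", activation="relu")(inputs)
    # Broadcasting multiplies each of the 8 channels by its mask entry, so masked filters output zeros.
    x = tf.keras.layers.Lambda(lambda t: t * tf.constant(mask))(x)
    outputs = tf.keras.layers.GlobalAveragePooling2D()(x)
    model = tf.keras.Model(inputs, outputs)

    model.predict(np.zeros((1, 32, 32, 3), dtype="float32"))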