I have to generate a list of prime numbers smaller than a given number and then find all pairs of the generated prime numbers that add up to that value. E.g. nu
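The question is truncated, but the two steps it describes (generate primes below n, then find pairs summing to n) can be sketched with a sieve plus a set lookup; the names `primesBelow` and `primePairs` are mine, not from the question:

```scala
object PrimePairs {
  // Sieve of Eratosthenes: all primes strictly below n
  def primesBelow(n: Int): List[Int] = {
    val sieve = Array.fill(n max 2)(true)
    sieve(0) = false
    sieve(1) = false
    for (i <- 2 until n if sieve(i); j <- i * i until n by i) sieve(j) = false
    (2 until n).filter(sieve(_)).toList
  }

  // Unordered pairs of primes that add up to n
  def primePairs(n: Int): List[(Int, Int)] = {
    val primes = primesBelow(n)
    val isPrime = primes.toSet
    // p <= n - p avoids emitting each pair twice
    for (p <- primes if p <= n - p && isPrime(n - p)) yield (p, n - p)
  }
}
```

For n = 10 this yields the pairs (3, 7) and (5, 5).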
I am using json4s to parse a JSON object, which gives me a result like this: JObject(List((x,JArray(List(JString(x_value)))), (y,JArray(List(JString(y_value)))
I am trying to define typeclass instances for an inner class. As a minimal example, I have code that looks like this: trait Typeclass[T] class Outer: class
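The question's code is cut off, but a common way to give an inner class a typeclass instance is to define the instance inside the outer class, so each outer value `o` carries an instance for its own path-dependent type `o.Inner`. A minimal sketch (brace syntax, and the member `label` is mine for demonstration):

```scala
trait Typeclass[T] { def label: String }

class Outer {
  class Inner
  // One instance per Outer value; seen from a value `o`,
  // its type is Typeclass[o.Inner] (path-dependent)
  implicit val innerTc: Typeclass[Inner] = new Typeclass[Inner] {
    def label: String = "inner"
  }
}

object Demo {
  def describe[T](x: T)(implicit tc: Typeclass[T]): String = tc.label

  def run(): String = {
    val o = new Outer
    import o._             // brings innerTc into lexical scope
    describe(new o.Inner)  // summons Typeclass[o.Inner]
  }
}
```

Note that instances for `Inner` types from two different `Outer` values are distinct, since `o1.Inner` and `o2.Inner` are different types.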
I'm trying to build https://github.com/gatling/gatling but the compilation fails. The steps I undertook: Installed sbt using the documentation. Cloned the githu
I have 2 dataframes; the first one has 53 columns and the second one has 132 columns. I want to compare the 2 dataframes and remove all the columns that are not
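The core of comparing the two frames' schemas is a set intersection over column names. A minimal sketch of that logic with plain collections (the helper name `common` is mine); the comment shows how the same idea would apply to Spark DataFrames:

```scala
object CommonColumns {
  // Column names present in both frames, preserving the first frame's order
  def common(cols1: Seq[String], cols2: Seq[String]): Seq[String] = {
    val in2 = cols2.toSet
    cols1.filter(in2)
  }
}

// On Spark DataFrames the same idea would look like:
//   val keep = CommonColumns.common(df1.columns, df2.columns)
//   val trimmed = df1.select(keep.map(col): _*)
```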
How can I split a string into key-value pairs using Scala in an efficient way? I would like to split the emp string below into key-value pairs. var emp = "Map(employees -&g
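The example string is truncated, so its exact shape is an assumption; assuming it looks like the `toString` of a Map, i.e. `"Map(k1 -> v1, k2 -> v2)"`, a single pass of splitting works:

```scala
object EmpParser {
  // Parses a string shaped like "Map(k1 -> v1, k2 -> v2)" into a Map.
  // Assumes keys and values contain no ", " or " -> " themselves.
  def parse(s: String): Map[String, String] = {
    val inner = s.stripPrefix("Map(").stripSuffix(")").trim
    if (inner.isEmpty) Map.empty
    else inner.split(", ").iterator.map { pair =>
      val Array(k, v) = pair.split(" -> ", 2)
      k.trim -> v.trim
    }.toMap
  }
}
```

For nested or quoted values this simple splitting breaks down and a real parser would be needed.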
My Scala code that used to work fine with Databricks runtime 5.5 LTS is not working with runtime 7.3 LTS and above. I have tried upgrading the Microsoft libraries acc
I have the following HTTP-based application that routes every request to an Akka Actor which uses a long chain of Akka Actors to process the request. path("p
Reduce can be seen as a fold that doesn't take an initial element (the first element of the collection is used instead). I assume there is a reason for that design decision, but I can't find it.
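The relationship can be made concrete: reduceLeft is exactly foldLeft seeded with the collection's first element, which is also why reduce cannot be total on empty collections while fold can. A minimal sketch:

```scala
object ReduceViaFold {
  // reduceLeft expressed as foldLeft seeded with the first element
  def reduceLeft[A](xs: List[A])(op: (A, A) => A): A = xs match {
    case head :: tail => tail.foldLeft(head)(op)
    case Nil          => throw new UnsupportedOperationException("empty.reduceLeft")
  }
}
```

So `reduceLeft(List(10, 1, 2))(_ - _)` computes `(10 - 1) - 2 = 7`, just as the standard library's `reduceLeft` does.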
import org.apache.spark.sql.SparkSession

object RDDBroadcast extends App {
  val spark = SparkSession.builder()
    .appName("SparkByExamples.com")
    .maste
I tried posting this question to Scala Users but no reply yet. How does one go about working with the new experimental Scala 3 with Scala.JS? I can’t find
I am trying to understand and incorporate upper bound types with overriding in my system, but have not been able to achieve it without some ugly code. I have th
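The question's own code is cut off, but a common pattern combining upper bounds with overriding is to bound a type parameter and let subclasses fix it to a narrower type, so their overrides can return that narrower type directly. A hedged sketch (all names here are illustrative, not from the question):

```scala
trait Animal { def name: String }
class Dog(val name: String) extends Animal

// The upper bound A <: Animal lets subclasses choose a narrower A,
// and overrides can then use that narrower type without casts
abstract class Shelter[A <: Animal] {
  def resident: A
  def greeting: String = s"resident: ${resident.name}"
}

class DogShelter(dog: Dog) extends Shelter[Dog] {
  override def resident: Dog = dog   // override with the more specific type
}
```

Callers of `DogShelter#resident` get a `Dog` statically, while generic code written against `Shelter[A]` still compiles thanks to the bound.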
I am having trouble retrieving the old value of a column before casting it in Spark. Initially, all my inputs are strings, and I want to cast the column num1 into a
We are running a stateful structured streaming job which reads from Kafka and writes to HDFS. And we are hitting this exception: 17/12/08 05:20:12 ERROR FileFor
I'm trying to unit test my actor's handling of a "Terminated" message for a child actor. The code under test is something like this: case Terminated(termin
I have a dataframe that looks like the one below: id pub_date version unique_id c_id p_id type source lni001 20220301 1
I am relatively new to Scala and also new to Doobie. I am connecting to SQL Server 2014 and need to create a temp table and subsequently insert into that temp
How can I integrate a Kafka producer with Spark stateful streaming that uses checkpoints along with StreamingContext.getOrCreate? I read this post: How to write s
I am trying to extract a value from an array in SparkSQL, but am getting the error below: Example column customer_details {"original_customer_id":"ch_382820","fi
I have written a small application, but my app doesn't wait for my actors to finish and stops them before their actions are completed. I