Scala is a general-purpose programming language principally targeting the Java Virtual Machine. Designed to express common programming patterns in a concise, elegant, and type-safe way, it fuses both imperative and functional programming styles. Its key features are: advanced static type system ...


1 vote | 1 answer | 17 views

Convert Source[ByteString, Future[IOResult]] to List[String]

I'm trying to read a file. I need to read this file line by line into a list. val res: stream.scaladsl.Source[ByteString, Future[IOResult]] = Ftp.fromPath(Paths.get(uri), ftpSettings) How can I convert ...
0 votes | 0 answers | 2 views

LWJGL 2D Texturing

I'm having some issues binding a texture to a rectangle in OpenGL (using LWJGL). When the window opens nothing but a blank screen is shown. I have been using this tutorial for the majority of ...
0 votes | 1 answer | 6 views

Gson import error in Scala

I am using the Gson library for parsing JSON data. I am trying to run a program from the terminal as follows: scala -classpath "*.jar" JsonParsing.scala To which I am getting the following error: ...
0 votes | 0 answers | 4 views

ensimeConfig creates directories java and scala-2.11, which I don't need

When I run ensimeConfig, it creates directories such as src/main/java src/main/scala-2.11 which I don't need, since I have my sources always inside src/main/scala How can I avoid such behaviour? ...
0 votes | 0 answers | 8 views

How to insert a vector of ones into a matrix?

I have a vector and a matrix: the vector is 1 1 and the matrix is 0 0 / 0 0. I want to prepend the vector to the matrix to produce: 1 0 0 / 1 0 0. So far I have: val dv = DenseVector(1.0,1.0); val dm = DenseMatrix.zeros[Double](2,2) ...
2 votes | 1 answer | 45 views

Stacking Free Monads

I'm learning about the Free monads, and I've put together a simple example in Scala where I use them to define two domain specific languages. The first monad deals with the side effects of a ...
6 votes | 5 answers | 13k views

Scala beginners - simplest way to count words in file

I'm trying to code, in the simplest way possible, a program to count word occurrences in a file in Scala. So far I have this piece of code: import scala.io.Codec.string2codec import scala.io.Source ...
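A minimal word-count sketch using only the standard library (counting words in a string stands in here for reading the file with Source; the object and method names are illustrative, not the asker's code):

```scala
// Count word occurrences in a body of text with only the standard library.
// A String input stands in for Source.fromFile(path).mkString.
object WordCount {
  def countWords(text: String): Map[String, Int] =
    text.toLowerCase
      .split("\\W+")            // split on runs of non-word characters
      .filter(_.nonEmpty)       // drop empty tokens from leading separators
      .groupBy(identity)        // group identical words together
      .map { case (word, occurrences) => word -> occurrences.length }

  def main(args: Array[String]): Unit = {
    val counts = countWords("the quick fox jumps over the lazy dog the end")
    println(counts("the")) // 3
  }
}
```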
0 votes | 2 answers | 13 views

Scala: Pattern matching a generic case class for “concrete” cases

Let's say I have: sealed trait Data final case class TextData() extends Data final case class ImageData() extends Data final case class MetaData[D <: Data](data: D) extends Data I want to ...
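As a hedged sketch of the ADT from the question: matching on the outer constructors avoids unchecked type-erasure matches as long as you bind MetaData's wrapped value instead of matching on its type parameter (the describe helper is hypothetical, not from the question):

```scala
sealed trait Data
final case class TextData() extends Data
final case class ImageData() extends Data
final case class MetaData[D <: Data](data: D) extends Data

object DataDemo {
  // Recursing on the bound `inner` value avoids matching on the erased D.
  def describe(d: Data): String = d match {
    case TextData()      => "text"
    case ImageData()     => "image"
    case MetaData(inner) => s"meta(${describe(inner)})"
  }

  def main(args: Array[String]): Unit =
    println(describe(MetaData(TextData()))) // meta(text)
}
```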
1 vote | 0 answers | 10 views

Why is multiple inheritance not supported for classes/abstract classes but supported for traits in Scala?

Let me explain my observation in Scala about multiple inheritance: multiple inheritance is not possible in Scala for classes. I can understand this is due to the "Diamond Problem" http://stackoverflow....
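For context, the reason traits can be mixed in multiply is Scala's linearization, which fixes a single, unambiguous super-call order and so sidesteps the diamond ambiguity; a small sketch (class and method names are illustrative):

```scala
// Diamond shape: B and C both extend A, and D mixes in both.
trait A { def msg: String = "A" }
trait B extends A { override def msg: String = "B>" + super.msg }
trait C extends A { override def msg: String = "C>" + super.msg }

// The linearization of D is D, C, B, A. Each `super` call follows that
// order, so there is exactly one resolution of msg: no diamond ambiguity.
class D extends B with C

object LinearizationDemo {
  def main(args: Array[String]): Unit =
    println(new D().msg) // C>B>A
}
```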
-1 votes | 0 answers | 43 views

Modifying input so it is all the same type

Within my system, I thought I had multiple lists that I was concatenating. However, I am not sure why I have a List of a List, so when I try to put them together it shows "List(DATA > 23) ,4)". What ...
0 votes | 0 answers | 6 views

Type error when chaining map with toSet and using function literal with underscore

Passing a function literal with underscore as an argument to map, chained with toSet on another collection (e.g. List), results in a type error: scala> List(1, 2, 3).toSet map (_.toString) <console&...
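For context, the usual workaround (a sketch, not taken from any answer to this question) is to pin the element type so the compiler can type the underscore; toSet's type parameter is otherwise left open, because Set is invariant:

```scala
object ToSetMapDemo {
  // List(1, 2, 3).toSet map (_.toString) fails to compile: toSet's type
  // parameter stays open, so `_` has no known parameter type.
  // Pinning the element type lets inference proceed:
  val viaAnnotation: Set[String] = List(1, 2, 3).toSet[Int].map(_.toString)

  // Alternatively, map on the List first, where inference already works:
  val viaMapFirst: Set[String] = List(1, 2, 3).map(_.toString).toSet

  def main(args: Array[String]): Unit =
    println(viaAnnotation == viaMapFirst) // true
}
```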
1 vote | 0 answers | 19 views

Scala sys.process does not seem to block

I'm calling Scala's sys.process to execute an AWS file copy from within a spark application. According to the docs, the ProcessBuilder returned by Scala's sys.process.Process (cmd) should block until ...
3 votes | 1 answer | 58 views

Build a List of Foos with a Specific Value at Compile-Time

Given: case class Foo(x: BigDecimal) I'd like to, at compile-time, build a List[Foo] where each Foo must have a BigDecimal value of 5. So, I'd expect the following code to compile: type Foo5Only = ...
1 vote | 0 answers | 12 views

Avro - code-generation approach vs non-code generation approach

I'm new to Avro. The official documentation indicates that there are two possible approaches to using Avro. With code generation, classes are auto-generated from Avro schema files by the Avro ...
0 votes | 0 answers | 15 views

Alternative to Kafka mirror maker tool

I am trying to find an alternative to the Kafka MirrorMaker tool for replicating between Kafka clusters. So far I found Uber's uReplicator tool, which is an improvement on the Kafka MirrorMaker tool, but ...
0 votes | 0 answers | 11 views

IntelliJ cannot resolve Spark external libraries

I'm trying to set up an Apache Spark project (pulled the latest master branch) in my IntelliJ: IntelliJ IDEA 2016.2.5 Build #IC-162.2228.15, built on October 14, 2016 JRE: 1.8.0_112-release-287-b2 x86_64 JVM:...
0 votes | 1 answer | 16 views

Play 2.5 type mismatch Session type

Using deadbolt2 I have the following controller function implementation: def restricted: Action = deadbolt.Restrict(List(Array(USER_ROLE)))() { request => Future { val localUser = ...
0 votes | 0 answers | 28 views

Memory usage of a Scala program

I am trying to find the memory usage of a Scala program, and I am even getting a result. I am referring to the following code: object Test { def main(args: Array[String]): Unit = { val mb = ...
0 votes | 0 answers | 13 views

Rebuild Query before execution

I'm researching using jOOQ as the abstraction layer over configurable JDBC backends. One required option is a very specific database, which supports JDBC, but it has an uncommon ...
3 votes | 1 answer | 67 views

Scalaz: combine Writer and State (and/or Lens)

I'm trying to combine Writer and State (through Lens). I'm pretty sure I need monad transformers, but I have difficulty figuring out how to use the T versions and how to build this properly. Right ...
1 vote | 1 answer | 31 views

Circe trait fields not included in json

I have a simple trait which is mixed into some case classes. When converting instances of those classes to JSON via circe, I realized that fields with default values in the trait are not included in the JSON string. I'...
2 votes | 1 answer | 24 views

Reading massive JSON files into Spark Dataframe

I have a large nested NDJ (new line delimited JSON) file that I need to read into a single spark dataframe and save to parquet. In an attempt to render the schema I use this function: def ...
0 votes | 1 answer | 247 views

Spark & Scala - NullPointerException in RDD traversal

I have a number of CSV files and need to combine them into an RDD by part of their filenames. For example, for the below files: $ ls 20140101_1.csv 20140101_3.csv 20140201_2.csv 20140301_1.csv ...
1 vote | 2 answers | 27 views

How to enforce a context bound on a wildcard in Scala?

I have an implicit helper set up like this: trait Helper[T] { def help(entry: T): Unit } object Helpers { implicit object XHelper extends Helper[X] { override def help(entry: X): Unit = {...
0 votes | 0 answers | 10 views

Pubsub using Topic Groups with Akka Cluster

I'm trying to create a pubsub-style application with Akka Cluster. I'm reading the docs about pubsub and am now trying to run their example. My basic workflow is this: run the subscriber (who ...
-1 votes | 4 answers | 37 views

Loading CSV in Spark

I'm attempting the Kaggle Titanic example using Spark ML and Scala. I'm attempting to load the first training file, but I am running into a strange error: java.io.IOException: Could not read footer: ...
1 vote | 1 answer | 278 views

Java spark example runs wrong with error: java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$

I wrote an example with Spark Maven support in IntelliJ IDEA. The Spark version is 2.0.0, the Hadoop version is 2.7.3, and the Scala version is 2.11.8. The environment in the system and the IDE is the same version. Then ...
-2 votes | 0 answers | 29 views

How to pass a dataframe as a function parameter in Spark

I am implementing a program which takes a whole data frame as a parameter. I know this may not be supported in Spark, but I would like to know whether there is a good way to solve my problem. I have a Spark ...
0 votes | 1 answer | 28 views

Create Hive table on top of data created in Spark

I have created data in ORC format under Spark like this: var select: String = "SELECT ..." sqlContext.sql(select).write.format("orc").save("/tmp/out/20160101") select = "SELECT ..." sqlContext.sql(...
2 votes | 1 answer | 1k views

H2 in-memory database: JdbcSQLException: Table "USERINFO" not found; SQL statement:

I am using the H2 in-memory database with the Play Framework. I am a beginner, so I decided to make a little login-register app to get started with Scala and Play. I decided to use the H2 in-memory database, but it ...
1 vote | 1 answer | 17 views

How can I connect to 2 Kafka topics at a time, but process only 1 at a time?

I want to know how I can connect to 2 Kafka topics from within a single Spark Streaming job, but only process the messages of 1 of the topics at a time. I want the ability to process from topic #1, and ...
0 votes | 0 answers | 14 views

Python: save pandas data frame to parquet file

Is it possible to save a pandas data frame directly to a parquet file? If not, what would be the suggested process? The aim is to be able to send the parquet file to another team, which they can use ...
1 vote | 1 answer | 28 views

Defining a Generic Method in a Trait When Using Algebraic Data Type in Scala

I have defined an algebraic data type, and I wanted to make it fairly general and scalable. However, Scala's type system is giving me a nightmare. sealed trait Tree[+A] { def evaluateTree(): A = this ...
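One common way to keep such an ADT general (a sketch assuming a simple leaf/branch tree, not the asker's exact definition) is to define a single generic fold and derive specific evaluations from it:

```scala
sealed trait Tree[+A]
final case class Leaf[A](value: A) extends Tree[A]
final case class Branch[A](left: Tree[A], right: Tree[A]) extends Tree[A]

object TreeDemo {
  // One generic fold; concrete evaluations become one-liners on top of it.
  def fold[A, B](t: Tree[A])(onLeaf: A => B)(combine: (B, B) => B): B = t match {
    case Leaf(v)      => onLeaf(v)
    case Branch(l, r) => combine(fold(l)(onLeaf)(combine), fold(r)(onLeaf)(combine))
  }

  def main(args: Array[String]): Unit = {
    val t: Tree[Int] = Branch(Leaf(1), Branch(Leaf(2), Leaf(3)))
    println(fold(t)(identity[Int])(_ + _)) // 6
  }
}
```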
0 votes | 3 answers | 31 views

Spark: read Parquet file and process it

I am new to Spark 1.6. I'd like to read a Parquet file and process it. To simplify, suppose we have a Parquet file with this structure: id, amount, label, and I have 3 rules: amount < 10000 => label=...
2 votes | 3 answers | 44 views

asInstanceOf[T] but with Option[T] as result

I have an instance and a type, and want to get Some in case casting is possible and None if not. Currently I use my own implementation: def tryCast[T](o: Any)(implicit manifest: Manifest[T]): Option[T] = {...
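A sketch of the same idea using ClassTag rather than the deprecated Manifest, which makes the pattern match a real runtime check (the object name is illustrative):

```scala
import scala.reflect.ClassTag

object TryCastDemo {
  // With a ClassTag in scope, `case t: T` becomes a genuine runtime
  // class check instead of an unchecked (erased) match.
  def tryCast[T](o: Any)(implicit ct: ClassTag[T]): Option[T] = o match {
    case t: T => Some(t)
    case _    => None
  }

  def main(args: Array[String]): Unit = {
    println(tryCast[String]("hello")) // Some(hello)
    println(tryCast[String](42))      // None
  }
}
```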
0 votes | 2 answers | 16 views

Creating a single column table with Slick

I am pretty used to writing the standard Slick boilerplate code like this. Suppose I am creating a table called Foo with columns id and name. We can write: case class Foo(id: Long, name: String) ...
13 votes | 1 answer | 5k views

How to define schema for custom type in Spark SQL?

The following example code tries to put some case objects into a dataframe. The code includes the definition of a case object hierarchy and a case class using this trait: import org.apache.spark.{...
0 votes | 1 answer | 18 views

Passing JSON String to Google Charts in Play Framework controller to Scala view is not recognized nor do the charts display

I have a controller in Play that passes a string in JSON format to a Scala view: public Result reportsPieChart() { String jsonString = "{cols: [{id: 'task', label: 'Task', type: 'string'}, {...
2 votes | 2 answers | 42 views

What is the scalaz filterM method?

This scalaz tutorial provides an example of using the filterM method, but it does not explain it: List(1, 2, 3) filterM { x => List(true, false) } res19: List[List[Int]] = List(List(1, 2, 3), List(...
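To illustrate what filterM does without depending on scalaz, here is a standard-library sketch of the same semantics specialised to List: each element's predicate returns a monadic Boolean, and the result enumerates every combination of keep/drop decisions, so a constant List(true, false) yields the power set, matching the shape of the res19 output above:

```scala
object FilterMDemo {
  // filterM specialised to the List monad: the predicate returns a
  // List[Boolean], and each keep/drop combination yields one output list.
  def filterM[A](xs: List[A])(p: A => List[Boolean]): List[List[A]] = xs match {
    case Nil => List(Nil)
    case h :: t =>
      for {
        keep <- p(h)             // each possible decision for the head
        rest <- filterM(t)(p)    // each possible filtering of the tail
      } yield if (keep) h :: rest else rest
  }

  def main(args: Array[String]): Unit =
    // A constant List(true, false) enumerates all subsets (the power set).
    println(filterM(List(1, 2, 3))(_ => List(true, false)))
}
```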
0 votes | 0 answers | 38 views

Scala - delegate implementation design pattern (without reflection?)

I would like to expose a public API (a kind of Runnable) and let users implement it, and then execute that code against our servers (given the class name containing the code to run). Users provide a ...
0 votes | 0 answers | 10 views

How to reinitialize the connection pool in Play Framework 2.5

I am running tests with an embedded DB where I get the connection string before the test suite. I am injecting that connection string into the application via implicit override lazy val app = new ...
2 votes | 1 answer | 26 views

How to use a logging trait inside a function in Spark without getting a "task not serializable" error

I tried copying the logging trait from Spark to my project. Previously I was using print instead of log. The below code was working fine with the print statement. But after adding the log, I am ...
0 votes | 1 answer | 38 views

Is there a general way of transforming a Future[Reader[A, X]] to Reader[A, Future[X]]?

Such a transformation is possible for any functor, not only Future: implicit class RichFunctorReader[F[_]: Functor, A, B](fr: F[Reader[A, B]]) { def toReaderFunctor: Reader[A, F[B]] = Reader { a =&...
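Since a Reader[A, B] is essentially a function A => B, a standard-library sketch of this transformation for Future needs only map, which is exactly the "any functor" point the question makes (the swap name and demo values are my own, not from the question):

```scala
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._
import ExecutionContext.Implicits.global

object ReaderSwapDemo {
  // Treating Reader[A, B] as A => B: turning Future[A => B] into
  // A => Future[B] only requires `map` under the Future.
  def swap[A, B](fr: Future[A => B]): A => Future[B] =
    a => fr.map(f => f(a))

  def main(args: Array[String]): Unit = {
    val futureReader: Future[Int => String] = Future.successful(n => s"env=$n")
    val readerFuture: Int => Future[String] = swap(futureReader)
    println(Await.result(readerFuture(42), 1.second)) // env=42
  }
}
```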
2 votes | 1 answer | 116 views

Swagger-play throws exception during app initialization

I have a problem when I'm trying to attach Swagger to a Play Framework app. The Swagger lib scans uninitialized classes and causes problems. Any advice on how to deal with it? I extracted part of the app as an example: ...
-1 votes | 0 answers | 17 views

Google spell mistyping API

I'm writing a piece of code to correct mistyped words (both English words and famous names, etc.). I want the function to work the same as when we enter a word into the Google search engine, and it ...
0 votes | 1 answer | 22 views

Why does the Akka Streams TestSource not receive every pull as a request?

I am trying to track down a bug in a custom Flow where there were extra requests going upstream. I wrote a test case which should have failed given the behaviour and logging I was seeing but it didn't....
1 vote | 1 answer | 28 views

Generic json decoder trait with Circe implementation

I have a trait used to inject a JSON decoder as a dependency into components of my project: trait JsonDecoder { def apply[T](s: String): Option[T] } When I try to implement it with Circe: import io....
0 votes | 2 answers | 39 views

Method Definition When Using Algebraic Data Type in Scala

I want to implement a library in Scala. I'm just starting and I'm already having trouble designing it in a modular and scalable way. I need some help! For instance, I have defined a tree ADT: sealed ...
0 votes | 3 answers | 62 views

Generating adaptors for higher-kinded interfaces

I often find myself in a scenario where I have defined an interface like so: trait FooInterface [T[_]] { def barA (): T[Int] def barB (): T[Int] def barC (): T[Int] } I then write a few ...
-1 votes | 1 answer | 55 views

File name in Spark Scala showing wrong result

Hi, I am using the below code to get the name of the file in Spark, but when I append it to an RDD[String] I get a different value. Below is my code: import org.apache.spark.rdd.RDD val text: ...