Posted in Functional Programming in Scala

Functional Program Design

 

JSON

Case classes are Scala's preferred way to define complex data. Below is one way to represent JSON data:

abstract class JSON
case class JSeq (elems: List[JSON])           extends JSON
case class JObj (bindings: Map[String, JSON]) extends JSON
case class JNum (num: Double)                 extends JSON
case class JStr (str: String)                 extends JSON
case class JBool(b: Boolean)                  extends JSON
case object JNull                             extends JSON
 
object JSONApp extends App {
  def show(json: JSON): String = json match {
    case JSeq(elems) => "[" + (elems map show mkString ",") + "]"
    case JObj(bindings) =>
      val assocs = bindings map {
        case (key, value) => "\"" + key + "\": " + show(value)
      }
      "{" + (assocs mkString ",") + "}"
    case JNum(num) => num.toString
    case JStr(str) => "\"" + str + "\""
    case JBool(b) => b.toString
    case JNull => "null"
  }
 
  val data = JObj(Map(
    "firstName" -> JStr("John"),
    "lastName" -> JStr("Smith"),
    "address" -> JObj(Map(
      "streetAddress" -> JStr("21 2nd Street"),
      "state" -> JStr("NY"),
      "postalCode" -> JNum(10021)
    )),
    "phoneNumbers" -> JSeq(List(
      JObj(Map(
        "type" -> JStr("home"), "number" -> JStr("212 555-1234")
      )),
      JObj(Map(
        "type" -> JStr("fax"), "number"-> JStr("646 555-4567")
      ))
    ))
  ))
  println(show(data))
}

The left-hand side of a for-expression generator can also be a pattern. The example below lists the first and last names of everyone whose phone number starts with 212:

val data: List[JSON] = ...
for {
  JObj(bindings) <- data
  JSeq(phones) = bindings("phoneNumbers")
  JObj(phone) <- phones
  JStr(digits) = phone("number")
  if digits startsWith "212"
} yield (bindings("firstName"), bindings("lastName"))

Here JObj(bindings) <- data acts as an implicit filter: elements of data that do not match the JObj pattern are simply skipped.

case class Book(title: String, authors: List[String])

val books: List[Book] = List(
  Book(title = "Book 1",
       authors = List("Author-1", "Author-2")),
  Book(title = "Book 2",
       authors = List("Bird, Richard", "Wadler, Phil")),
  ....)

{ for {
    b1 <- books
    b2 <- books
    if b1.title < b2.title
    a1 <- b1.authors
    a2 <- b2.authors
    if a1 == a2
} yield a1
}.distinct

distinct is needed to remove duplicates: an author who has written three or more of the listed books would otherwise appear in the result more than once (once per pair of their books).

Partial Functions

A partial function, as opposed to a total function, provides an answer only for a subset of the possible inputs, and declares the inputs it can handle via <code>isDefinedAt</code>, as follows:

val divide = new PartialFunction[Int, Int] {
    def apply(x: Int) = 42 / x
    def isDefinedAt(x: Int) = x != 0
}
scala> divide(0)
java.lang.ArithmeticException: / by zero

scala> divide.isDefinedAt(1)
res0: Boolean = true

scala> if (divide.isDefinedAt(1)) divide(1)
res1: AnyVal = 42

scala> divide.isDefinedAt(0)
res2: Boolean = false

A more common way of writing partial functions is with a case block, which gives a default implementation of isDefinedAt:

val divide2: PartialFunction[Int, Int] = {
    case d: Int if d != 0 => 42 / d
}

scala> divide2(0)
scala.MatchError: 0 (of class java.lang.Integer)

scala> divide2.isDefinedAt(0)
res0: Boolean = false

scala> divide2.isDefinedAt(1)
res1: Boolean = true

Note that the exception changes to MatchError instead of ArithmeticException when using a case block.

Partial functions are handled well by collect, as shown below:

scala> List(41, "cat") map { case i: Int ⇒ i + 1 }
scala.MatchError: cat (of class java.lang.String)

scala> List(41, "cat") collect { case i: Int ⇒ i + 1 }
res1: List[Int] = List(42)

The magic here is that collect expects a PartialFunction and invokes its isDefinedAt method. If we define the partial function inline, the compiler knows it is a partial function, so we avoid spelling out the PartialFunction trait explicitly.

Seq, Set and Map are also partial functions:

val pets = List("cat", "dog", "frog")
scala> pets(0)
res13: java.lang.String = cat
scala> pets(3)
java.lang.IndexOutOfBoundsException: 3
scala> pets.isDefinedAt(0)
res14: Boolean = true
scala> pets.isDefinedAt(3)
res15: Boolean = false
scala> Seq(1, 2, 42) collect pets //safely collect values of indexes
res16: Seq[java.lang.String] = List(dog, frog)

Checking isDefinedAt everywhere can be painful; luckily Scala provides the lift method, which converts a partial function into a total function returning an Option:

scala> pets.lift(0)
res17: Option[java.lang.String] = Some(cat)
scala> pets.lift(42)
res18: Option[java.lang.String] = None
scala> pets.lift(0) map ("I love my " + _) getOrElse ""
res19: java.lang.String = I love my cat
scala> pets.lift(42) map ("I love my " + _) getOrElse ""
res20: java.lang.String = ""

(The above notes on partial functions have been taken from this blog.)

We can chain partial functions together using orElse or andThen, which, like lift, are defined in the PartialFunction trait.

// converts 1 to "one", etc., up to 5
val convert1to5 = new PartialFunction[Int, String] {....}
// converts 6 to "six", etc., up to 10
val convert6to10 = new PartialFunction[Int, String] {....}
scala> val handle1to10 = convert1to5 orElse convert6to10
handle1to10: PartialFunction[Int,String] = 
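For concreteness, here is a minimal runnable sketch of the chaining above, with the elided bodies filled in by me (the case-block bodies are my assumption, not the original code):

// Hypothetical bodies for the two partial functions elided above.
val convert1to5: PartialFunction[Int, String] = {
  case 1 => "one"
  case 2 => "two"
  case 3 => "three"
  case 4 => "four"
  case 5 => "five"
}
val convert6to10: PartialFunction[Int, String] = {
  case 6 => "six"
  case 7 => "seven"
  case 8 => "eight"
  case 9 => "nine"
  case 10 => "ten"
}

// orElse tries the first function and falls back to the second.
val handle1to10 = convert1to5 orElse convert6to10
handle1to10(7)                              // "seven"

// andThen post-processes the result of a partial function.
val shout1to5 = convert1to5 andThen (_.toUpperCase)
shout1to5(3)                                // "THREE"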

withFilter

Scala provides another variation of filter called withFilter, which does not create a new collection the way filter does. It acts like a view that restricts the elements passed on to subsequent calls to map, flatMap, etc.

scala> List(1,2,3).filter(_ == 2)
res2: List[Int] = List(2)

scala> List(1,2,3).withFilter(_ == 2)
res3: scala.collection.generic.FilterMonadic[Int,List[Int]] = scala.collection.TraversableLike$WithFilter@78405e6

We can’t use filter after applying withFilter but we can apply multiple withFilters

scala> List(1,2,3).withFilter(_ == 2).filter(x => true)
:11: error: value filter is not a member of scala.collection.generic.FilterMonadic[Int,List[Int]]
       List(1,2,3).withFilter(_ == 2).filter(x => true)
                                      ^

scala> List(1,2,3).withFilter(_ == 2).withFilter(x => true)
res6: scala.collection.generic.FilterMonadic[Int,List[Int]] = scala.collection.TraversableLike$WithFilter@6705ba33

The Scala compiler translates for-expressions into the higher-order functions map, flatMap and withFilter. For example, for (x <- e1) yield e2
is translated to e1.map(x => e2).
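As a small illustration (my own example, not from the lecture), a for-expression with two generators and a filter desugars roughly like this:

val pairs = for {
  x <- List(1, 2, 3)
  if x % 2 == 1
  y <- List(10, 20)
} yield (x, y)

// roughly equivalent to:
val pairs2 = List(1, 2, 3)
  .withFilter(x => x % 2 == 1)
  .flatMap(x => List(10, 20).map(y => (x, y)))

// both give List((1,10), (1,20), (3,10), (3,20))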

Translation of for is not limited to lists or sequences or even collections.
It is based solely on the presence of the methods map, flatMap and withFilter. This lets us use for syntax for our own types as well – you only need to define map, flatMap and withFilter for those types.
There are many types for which this is useful: arrays, iterators, databases, XML data, optional values, parsers, etc.

As long as the client interface to the database defines the methods map, flatMap and withFilter, we can use for syntax for database querying.
This is the basis of the Scala database-connection frameworks ScalaQuery and Slick.
Similar ideas underlie Microsoft's LINQ.

Streams

Streams are similar to Lists, except that their tails are evaluated only on demand.

Streams are defined from a constant Stream.empty and a constructor Stream.cons.

val xs = Stream.cons(1, Stream.cons(2, Stream.empty)) 

They can also be defined like the other collections by using the
object Stream as a factory Stream(1, 2, 3)

The toStream method on a collection will turn the collection into a stream:
(1 to 1000).toStream // res0: Stream[Int] = Stream(1, ?)

Stream supports almost all methods of List, except for ::, which always produces a List. Use #:: instead to produce a Stream; it can be used in expressions as well as in patterns.

To find the second prime number between 1000 and 10000:
((1000 to 10000).toStream filter isPrime)(1)

x #:: xs == Stream.cons(x, xs)
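To make the prime example above self-contained, here is a sketch with a simple isPrime definition (the definition is mine; the course assumes one is already in scope):

def isPrime(n: Int): Boolean =
  n > 1 && (2 to math.sqrt(n).toInt).forall(n % _ != 0)

// Thanks to laziness, only as many stream elements as needed are tested:
((1000 to 10000).toStream filter isPrime)(1)   // 1013 (1009 is the first prime, 1013 the second)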

Even the implementation of Streams is very close to that of Lists, the only major difference being the use of call-by-name for the second parameter of cons (and similarly in operations like filter), which is what gives us lazy evaluation:

object Stream {
  def cons[T](hd: T, tl: => Stream[T]) = new Stream[T] {
    def isEmpty = false
    def head = hd
    def tail = tl
  }
  val empty = new Stream[Nothing] {
    def isEmpty = true
    def head = throw new NoSuchElementException("empty.head")
    def tail = throw new NoSuchElementException("empty.tail")
  }
}
Posted in Raspberry Pi

Compile FFMPEG for Raspberry Pi 3

Get FFMPEG source code:

$ git clone https://git.ffmpeg.org/ffmpeg.git ffmpeg
$ cd ffmpeg
$ mkdir dependencies
$ cd dependencies/
$ mkdir output

Compile libx264

$ git clone http://git.videolan.org/git/x264.git
$ cd x264/
$ ./configure --enable-static --prefix=/home/pi/ffmpeg/dependencies/output/
$ make -j4     (NOTE: this utilizes 4 threads/cores and is not applicable for Raspberry Pi Zero)
$ make install
$ cd ..

Compile ALSA

$ wget ftp://ftp.alsa-project.org/pub/lib/alsa-lib-1.1.1.tar.bz2
$ tar xjf alsa-lib-1.1.1.tar.bz2
$ cd alsa-lib-1.1.1/
$ ./configure --prefix=/home/pi/ffmpeg/dependencies/output
$ make -j4
$ make install
$ cd ..

Compile FDK-AAC

Installing build tools:

$ sudo apt-get install pkg-config autoconf automake libtool

Compile fdk-aac

$ git clone https://github.com/mstorsjo/fdk-aac.git
$ cd fdk-aac
$ ./autogen.sh
$ ./configure --enable-shared --enable-static
$ make -j4
$ sudo make install
$ sudo ldconfig
$ cd ..

Compile FFMPEG

$ cd ..
$ ./configure --prefix=/home/pi/ffmpeg/dependencies/output \
    --enable-gpl --enable-libx264 --enable-nonfree --enable-libfdk-aac \
    --enable-omx --enable-omx-rpi \
    --extra-cflags="-I/home/pi/ffmpeg/dependencies/output/include" \
    --extra-ldflags="-L/home/pi/ffmpeg/dependencies/output/lib" \
    --extra-libs="-lx264 -lpthread -lm -ldl"
$ make -j4
$ make install

--enable-omx --enable-omx-rpi : these flags enable FFmpeg's OpenMAX (OMX) hardware-accelerated encoding on the Raspberry Pi

Posted in Raspberry Pi

From Zero to Raspberry Pi Zero

I was pretty eager to try out the Raspberry Pi Zero after the recent addition of a camera connector in the v1.3 revision. Luckily I got hold of a Raspberry Pi Zero from Adafruit last week, and here is my setup.

One pain point in setting up a Raspberry Pi Zero is all the accessories it needs to get up and running (mini-HDMI-to-HDMI cable, micro-USB OTG cable, power cable, monitor, keyboard, mouse, WiFi dongle, etc.; the Pi Zero also doesn't come with a built-in USB hub like its bigger brother, the Raspberry Pi 3). Thanks to this great article by Andrew, we can set up the Pi Zero as a USB gadget so that I can simply SSH into it from my MacBook over USB.

  1. Once the SD card is flashed with the latest Raspbian Jessie Lite image (full Jessie works equally well), open up the boot partition in Finder/Windows Explorer on your Mac/PC, add dtoverlay=dwc2 on a new line at the bottom of the config.txt file, then save the file.
  2. Open up cmdline.txt. Be careful: each parameter is separated by a single space (it does not use newlines). Insert modules-load=dwc2,g_ether after rootwait.
  3. That's it! Now hook up the Pi Zero to your MacBook/PC with a micro-USB to USB cable and you can SSH into it as
    ssh pi@raspberrypi.local

 

Setting up WiFi dongle

Edit the WiFi configuration with sudo nano /etc/wpa_supplicant/wpa_supplicant.conf as follows:

ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1

network={
ssid="SSID-here"
psk="password-here"
}

Mounting USB drives

Type sudo blkid to identify the drive id.

Posted in Functional Programming in Scala

Transition to Functional programming paradigm

This post lists key points I noticed while trying to shift gears to functional programming using Scala (a JVM language), coming from a 10+ year background in traditional object-oriented programming. I've relied extensively on Coursera's Scala course and online documentation/blogs to come up with this content.

What is Functional programming

Mainstream languages like C, C++ and Java are based on the idea of updating values in place. They presuppose that the way to solve a programming problem is to have procedures that read and write values to memory locations, whether directly visible to the programmer (by means of pointers) or not.

Other languages, like Lisp, are based on a different assumption: a computation is best expressed as a set of functions over immutable values. So, in order to produce any kind of repetitive computation, recursion is essential. This assumption has strong roots in lambda calculus and related mathematical formalisms.

Tail-recursion:

Recursion is often looked down upon in imperative-style programming, but it is the basic building block of functional programming. The stack-overflow issue still exists if there are too many recursive calls, and that's where tail recursion comes into play. We can convert a recursive function into a tail-recursive one by introducing an intermediate (helper) function that carries an accumulator. This lets the compiler reuse the same stack frame for every recursive call instead of creating a new one for each.
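A minimal sketch of that accumulator pattern, using factorial as the example (the example is mine, not from the course notes):

import scala.annotation.tailrec

// Plain recursion: one stack frame per call, so very deep recursion risks a StackOverflowError.
def factorial(n: Int): Long =
  if (n == 0) 1L else n * factorial(n - 1)

// Tail-recursive version: an inner helper carries an accumulator, so the recursive
// call is the last action and the compiler can reuse a single stack frame.
def factorialTR(n: Int): Long = {
  @tailrec
  def loop(i: Int, acc: Long): Long =
    if (i == 0) acc else loop(i - 1, i * acc)
  loop(n, 1L)
}

factorialTR(20)   // 2432902008176640000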

Higher-order functions and Currying:

Functions are first-class citizens and are treated just like primitive types: you can pass a function as a parameter, return one from a function, and define a val or var to hold one.

Currying is when you break down a function that takes multiple arguments into a series of functions that each take part of the arguments. The advantage is that it offers a great amount of expressiveness and reusability. Function application associates to the left in the context of currying. Essentially, this lets you express a solution much as you would write a mathematical equation, where an imperative language might take many more lines of code.
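A small sketch of currying and partial application (my own example):

// A curried function: applying the first argument returns another function.
def add(x: Int)(y: Int): Int = x + y

val add5: Int => Int = add(5) _   // partially applied: a reusable "add 5" function
add5(10)                          // 15
add(1)(2)                         // application associates to the left: (add(1))(2) == 3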

Function-types:

Function types associate to the right: Int => Int => Int means Int => (Int => Int).

Persistent Data-structures:

Classes:

Scala creates a new type and a constructor for every class definition, as in the examples below:

class Example(x: Int, y: Int)

class Rational(x: Int, y: Int) {
  def numer = x
  def denom = y
  override def toString = numer + "/" + denom
}

val x = new Rational(1,2)
x.numer
x.denom

Functions within a class are called methods. They differ quite a bit from function values, as we'll see later.

We can add a 'require' clause to the class definition to enforce restrictions on the class parameters during construction. An IllegalArgumentException is thrown if the condition is not met.

require (y!= 0, "Denominator must be non-zero")

Similarly, there exists an assert method with the same signature, which throws an AssertionError.

  • require enforces a pre-condition on the caller of a function
  • assert checks the function code itself (see the sketch below)
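A small sketch contrasting the two (the example is mine):

// require: a pre-condition the *caller* must satisfy.
class Rational(x: Int, y: Int) {
  require(y != 0, "Denominator must be non-zero")
  def numer = x
  def denom = y
}

// assert: a sanity check on the function's own code.
def sqrt(x: Double): Double = {
  require(x >= 0, "sqrt requires a non-negative argument")  // caller's obligation
  val result = math.sqrt(x)
  assert(result >= 0)                                       // our own invariant
  result
}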

Every class definition introduces an implicit constructor, called the 'primary' constructor, which takes all the class parameters and executes all statements in the class body (such as the 'require' above).

We can define multiple constructors by using ‘this‘ keyword as follows:

def this(x: Int) = this(x, 1) // this auxiliary constructor calls the primary constructor

Any method with a parameter can be used as an infix operator, for example:

x.add(y) can be re-written as x add y

Symbols can be used as identifiers for variables or methods, e.g. +, *&^%, etc.

We can define a prefix operator like -y by declaring it as a unary operator, as follows:

def unary_- : Rational = new Rational (-numer, denom)

You can create a singleton by replacing the "class" keyword with "object".

Traits

Traits let us achieve multiple inheritance in Scala. They're similar to Java interfaces, except that traits can contain method definitions (Java 8 also supports default methods in interfaces). Traits cannot have (value) parameters, though; only classes can.

class Square extends Shape with Planar with Movable ...

We can instantiate Traits using Anonymous classes as follows:

trait Generator[+T] {
   def generate: T
}

val integers = new Generator[Int] {
    val rand = new java.util.Random
    def generate = rand.nextInt()
}

We are not instantiating the trait directly, but rather creating an anonymous object for the trait to attach to, so we can use the trait's functionality without defining a named class that extends it.

Scala class Hierarchy

All Scala programs automatically import the members of the following packages/objects: scala, java.lang and scala.Predef.

_ is the wildcard in Scala

[Image: Scala class hierarchy]

“Any” is the base type of all types and contains methods like ‘==’

“AnyRef” is just an alias for java.lang.Object and is the base class of all reference types

“AnyVal” is the base type of all primitive (value) types

“Nothing” is a subtype of every other type. It is used to signal abnormal termination (for example, a ‘throw Exc’ expression aborts evaluation and its type is Nothing) and also as the element type of empty collections, as in Set[Nothing]

“Null” is a subtype of every class that inherits from Object (i.e. every reference type) and is thus incompatible with AnyVal. Every reference type has null as a value.

Below is an example of a class hierarchy showing polymorphism in action. Note that type parameters don't affect Scala at runtime, as it uses type erasure.

trait List[T] {
    def isEmpty: Boolean
    def head: T
    def tail: List[T]
}

class Cons[T] (val head: T, val tail: List[T]) extends List[T] {
    def isEmpty = false
}

class Nil[T] extends List[T] {
    def  isEmpty = true
    def head = throw new NoSuchElementException("Nil.head")
    def tail = throw new NoSuchElementException("Nil.tail")
}

The above code creates what is called an immutable linked list, which is fundamental to functional programming and is constructed from two building blocks:

Nil:      the empty list

Cons:  a cell containing an element and the remainder of the list

Below is a graphical representation of an immutable linked list:

[Image: immutable linked list]

 

Functions as Objects

Functions, too, are objects in Scala. The function type A => B is just an abbreviation for scala.Function1[A, B], which is defined as follows:

package scala
trait Function1[A,B] {
 def apply(x: A): B
}

There are also traits Function2, Function3, etc., supporting up to 22 parameters.

An anonymous function such as (x: Int) => x * x would be expanded as follows:

class AnonFun extends Function1[Int, Int] {
    def apply(x: Int) = x * x
}
new AnonFun

This in turn can be expressed as follows, using anonymous class syntax (as in Java):

new Function1[Int, Int] {
   def apply(x:Int) = x*x
}

Function calls such as f(a, b) are expanded to f.apply(a, b). For example:
val f = (x: Int) => x * x
f(7)

would be translated to:

val f = new Function1[Int, Int] {
  def apply(x: Int) = x * x
}
f.apply(7)

ETA Expansion

Note that a method such as def f(x: Int): Boolean = ... is not itself a function value (converting methods eagerly would lead to infinite expansion, since apply is itself a method). But when the name of a method is used in a place where a function type is expected, it is converted to the function value (x: Int) => f(x), which gets expanded as below. This is known as eta-expansion:

new Function1[Int, Boolean] {
    def apply(x: Int) = f(x)
}

Functional decomposition with Pattern matching

A case class definition adds the ability to pattern match on a class. The modifier (1) adds a companion ‘object’ singleton declaration for syntactic convenience and (2) provides a concrete subclass with no body.

Since the concrete sub-class has no body, we will be using Pattern matching using ‘match’ keyword.

For example:

trait Expr
case class Number(n: Int) extends Expr
case class Sum(e1: Expr, e2: Expr) extends Expr

def eval(e: Expr): Int = e match {
  case Number(n) => n
  case Sum(e1, e2) => eval(e1) + eval(e2)
}
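For instance, evaluating a small expression tree (a usage example I've added):

eval(Sum(Number(1), Sum(Number(2), Number(3))))   // 6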

NOTE: match evaluation is sequential: the cases are tried from top to bottom, and evaluation stops at the first matching case.

Lists

Compared to arrays, lists are immutable and recursive, whereas arrays are flat.

val l = List(1,2,3) is equivalent to using the cons operator: 1 :: 2 :: 3 :: Nil.
Note that operators ending in ‘:’ are right-associative (and are methods of their right operand), so the above expands to Nil.::(3).::(2).::(1)

All operations on lists can be expressed in terms of ‘head’, ‘tail’ and ‘isEmpty’

Lists can also be used in pattern matching as follows:
1 :: 2 :: xs          lists starting with 1 and then 2
x :: Nil or List(x)   lists of length 1
List() or Nil         the empty list
List(2 :: xs)         a list whose only element is another list that starts with 2

More methods on lists: xs.length ; xs.last ; xs.init ; xs take n ; xs drop n ; xs(n) ; xs ++ ys ; ::: (list concatenation) ; xs.reverse ; xs updated (n, x) ; xs indexOf x ; xs contains x

Pairs and Tuples

A pair consisting of x and y is written as (x, y).

val pair = ("answer",42)
Type of above pair is (String,Int). Pairs can be used as patterns.
val (label,pair) = pair This returns label: String = answer, value: Int = 42

A tuple expression (e1, …, en) is an abbreviation for scala.Tuplen(e1, …, en), where n is the number of elements.

Type inference

Consider below code

def msort[T](xs: List[T])(lt: (T,T) => Boolean): List[T] = {......}
val nums = List(2,4,-1,9)
msort(nums)((x: Int,y: Int) => x < y)

The msort call can be re-written as msort(nums)((x, y) => x < y), since the Scala compiler can infer the parameter types of x and y from the type of nums.
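For completeness, a sketch of what the elided msort body might look like (my own filling-in, a standard merge sort parameterized by the comparison function):

def msort[T](xs: List[T])(lt: (T, T) => Boolean): List[T] = {
  val n = xs.length / 2
  if (n == 0) xs
  else {
    // merge two already-sorted lists according to lt
    def merge(xs: List[T], ys: List[T]): List[T] = (xs, ys) match {
      case (Nil, _) => ys
      case (_, Nil) => xs
      case (x :: xs1, y :: ys1) =>
        if (lt(x, y)) x :: merge(xs1, ys) else y :: merge(xs, ys1)
    }
    val (fst, snd) = xs splitAt n
    merge(msort(fst)(lt), msort(snd)(lt))
  }
}

msort(List(2, 4, -1, 9))((x, y) => x < y)   // List(-1, 2, 4, 9)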

Implicit parameters

scala> implicit def v = 7
v: Int
scala> implicit var x = 10L
x: Long
// i is implicit
scala> def pp(a:Int)(implicit i:Int) = println(a,i)
pp: (a: Int)(implicit i: Int)Unit
scala> pp(3)
(3,7)

In the above example, the Scala compiler searches for an implicit definition that:

– is marked ‘implicit’
– has a compatible type
– is visible at the point of the function call, or is defined in a companion object associated with the implicit's type

Higher order List functions

Normal List operations include the following:

  • transforming each element of a list
  • retrieving list of all elements satisfying a criteria
  • combining elements of a list

Scala lets us implement the above patterns generically using higher-order functions.

Below is a screenshot of my IntelliJ IDEA workspace with some examples:

[Screenshot: IntelliJ IDEA workspace with examples]
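A few representative one-liners covering the three patterns above (my own examples):

val xs = List(1, 2, 3, 4, 5)

xs map (_ * 2)           // transform every element:              List(2, 4, 6, 8, 10)
xs filter (_ % 2 == 0)   // keep elements satisfying a predicate: List(2, 4)
xs reduceLeft (_ + _)    // combine all elements:                 15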

Reduction of Lists

A common operation on lists is to combine the elements with a given operator.

reduceLeft inserts given binary operator between adjacent elements of a list

List(x1,….,xn) reduceLeft op = (…(x1 op x2) op …) op xn

Using reduceLeft, we can define sum and product as follows:

def sum(xs: List[Int]) = (0 :: xs) reduceLeft ((x,y) => x + y)
def product(xs: List[Int]) = (1 :: xs) reduceLeft ((x,y) => x*y)

The above definitions can be re-written using the wildcard pattern as follows:

def sum(xs: List[Int]) = (0 :: xs) reduceLeft (_ + _)
def product(xs: List[Int]) = (1 :: xs) reduceLeft (_ * _)

foldLeft is similar to reduceLeft except that it takes an accumulator as an additional parameter, which is returned when foldLeft is called on an empty list.

def sum(xs: List[Int]) = (xs foldLeft 0) (_ + _)
def product(xs: List[Int]) = (xs foldLeft 1) (_ * _)

foldLeft and foldRight produce the same result (with possible differences in efficiency) only when the operator is associative and commutative.
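A quick check with an operator that is neither associative nor commutative (my own example) shows how the two folds can differ:

(List(1, 2, 3) foldLeft 0)(_ - _)    // ((0 - 1) - 2) - 3  = -6
(List(1, 2, 3) foldRight 0)(_ - _)   // 1 - (2 - (3 - 0))  =  2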

Other Collections

Lists are linear, i.e. access to the first element is much faster than access to the middle or the end.

Scala offers an alternative sequence implementation, Vector, which is immutable and offers more evenly balanced access patterns.


When is a List preferred to a Vector ?

List: when our operations work on the head and tail of a sequence, since these are constant-time on lists but more complicated on a Vector

Vector: when we need bulk operations like Map, Filter, Fold

Vectors are created analogously to lists, and support the same operations, with one exception: instead of <code>::</code> we use <code>x +: xs</code> (creates a new vector with leading element x, followed by all elements of xs) or <code>xs :+ x</code> (creates a new vector with trailing element x, preceded by all elements of xs).
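For example (my own snippet):

val xs = Vector(1, 2, 3)

0 +: xs   // Vector(0, 1, 2, 3)  -- new leading element
xs :+ 4   // Vector(1, 2, 3, 4)  -- new trailing element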


More functions on Sequences


xs zip ys      a sequence of pairs drawn from corresponding elements of xs and ys

xs.unzip       splits a sequence of pairs xs into a pair of two sequences, consisting of the first and second elements of all the pairs

(1 to 2) flatMap (x => (3 to 4) map (y => (x,y)))

res8: scala.collection.immutable.IndexedSeq[(Int, Int)] = Vector((1,3), (1,4), (2,3), (2,4))

The expression above lists all combinations of numbers x and y where x is drawn from 1 to 2 and y from 3 to 4.

Below function:

def scalarProduct(xs: Vector[Double],ys: Vector[Double]) : Double = (xs zip ys).map(xy => xy._1 * xy._2).sum

is equivalent to the following definition, which uses a case pattern:

def scalarProduct_WithCase(xs: Vector[Double],ys: Vector[Double]) : Double = (xs zip ys).map{case (x,y) => x * y}.sum

Note that {case p1 => e1 ... case pn => en} is the same as x => x match {case p1 => e1 ... case pn => en}

We can combine a sequence of sequences into one using flatten.

It is also true that xs flatMap f is equal to (xs map f).flatten
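For instance (my own example):

val words = List("ab", "cd")

words flatMap (_.toList)         // List(a, b, c, d)
(words map (_.toList)).flatten   // List(a, b, c, d)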

All collection types share a common set of core methods: map, flatMap, filter, foldLeft and foldRight; the latter two reduce a collection to a single value.

For Expressions

Higher-order functions like map, flatMap and filter can sometimes make a program difficult to understand, in which case we can use for-expressions instead.

For eg: if persons is a list of elements of class Person with fields name and age,

case class Person(name: String,age: Int)

we can obtain the names of persons over 20 as follows:

for (p <- persons if p.age > 20) yield p.name

which is equivalent to the expression below, which is somewhat harder to read:

persons filter (p=> p.age > 20) map (p => p.name)

A for expression is of the form:

for (s) yield e, where s is a sequence of generators and filters, and e is an expression whose value is returned by each iteration.

We can also use {s} instead of (s), which lets us write the sequence of generators and filters on multiple lines without semicolons.

With For expression, we can re-write scalarProduct as follows:

def scalarProduct(xs: List[Double], ys: List[Double]): Double = (for ((x, y) <- xs zip ys) yield x * y).sum

Sets

A set is written analogously to a sequence:

val fruit = Set("apple","banana","pear")
val s = (1 to 6).toSet

Most operations on sequences are also available on sets, inherited from Iterable, for example:

s map (_ + 2)
fruit filter (_.startsWith("app"))
s.nonEmpty

Differences between Set and Sequence:

    1. Sets are un-ordered
    2. No duplicate elements. s map (_ / 2) //Set(2,0,3,1)
    3. The fundamental operation on a Set is contains, just as head/tail are for Lists and indexing is for Vectors

Maps

The class Map[Key, Value] extends the collection type Iterable[(Key, Value)], so it supports all operations that iterables do.

val capitalOfCountry = Map("US" -> "Washington", "Switzerland" -> "Bern")
val countryOfCapital = capitalOfCountry map {case(x,y) => (y,x)}

Note that maps extend iterables of key/value pairs; key -> value is just an alternative way of writing (key, value).

The class Map[Key, Value] also extends the function type Key => Value, so maps can be used anywhere functions can. In particular, a map can be applied to key arguments.
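For example, using the capitalOfCountry map defined above:

// A Map[String, String] is also a String => String, so it can be passed
// wherever a one-argument function is expected:
List("US", "Switzerland") map capitalOfCountry   // List(Washington, Bern)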

Applying a map to a non-existent key gives an error:

capitalOfCountry("Andorra")//java.util.NoSuchElementException: key not found: Andorra

To query a map without knowing whether a key exists, we can use get, which returns an Option value:

 capitalOfCountry get "US" //Some("Washington")
      capitalOfCountry get "Andorra" //None

Since Option values are case class instances (Some or None), we can decompose them using pattern matching:

def showCapital(country: String) = capitalOfCountry.get(country) match {
  case Some(capital) => capital
  case None => "missing data"
}
showCapital("US") // "Washington"

Maps are partial functions: applying a map to a missing key throws an exception. The operation withDefaultValue turns a map into a total function.

val cap1 = capitalOfCountry withDefaultValue ""
cap1("Andorra") //""

Repeated parameter

For convenience, we can convert the call below:

class Polynom(val terms: Map[Int, Double]) {
...
}
new Polynom(Map(1 -> 2.0, 3 -> 4.0, 5 -> 6.2))

to

def Polynom(bindings: (Int, Double)*) =
    new Polynom(bindings.toMap withDefaultValue 0)
...
Polynom(1 -> 2.0, 3 -> 4.0, 5 -> 6.2)

Inside the Polynom function, bindings is seen as a Seq[(Int, Double)].

sortWith and groupBy

val fruit = List("apple","pear","orange","pineapple")
fruit sortWith (_.length < _.length)//List("pear","apple","orange","pinepapple")
fruit.sorted //List("pear","apple","orange","pinepapple")

groupBy partitions a collection into a map of collections according to a discriminator function f

fruit groupBy (_.head) //Map(p -> List(pear,pineapple), a -> List(apple), o -> List(orange))
Posted in Raspberry Pi

Youtube Live streaming using Raspberry Pi

This post describes how to stream video and audio to a YouTube Live event using a Raspberry Pi. There are lots of sites online that describe how to live stream a webcam video to YouTube with a Raspberry Pi, but nowhere did I find a working solution for live streaming both audio and video. Hence this post.

Bit of background:

I work for a non-profit organization that needed free 24/7 webcam streaming. They had been using VLC in tandem with QuickTime broadcasting for the last 10 years, which worked great until the recent advent of HTML5 and mobile platforms. It became obsolete, especially due to its need for the QuickTime browser plugin and its lack of support for mobile platforms.

After some research, YouTube Live events turned out to be the only free streaming solution. Initially I tried using the Adobe Flash Media Live Encoder to stream to YouTube, but it is a user-interface-driven tool and not a fire-and-forget type of solution. This is where the Raspberry Pi fits the job.

Raspberry Pi 2: a tiny, cheap yet savory unix box


Needed components:

  1. Raspberry Pi 2 or 3
  2. SD card to load Raspbian OS
  3. Camera component
  4. WiFi adapter (only for the Raspberry Pi 2; WiFi & Bluetooth are built in from the Raspberry Pi 3 onwards)
  5. USB Audio adapter
  6. Power adapter
  7. Raspberry Pi case (optional)
  8. Keyboard, mouse, HDMI cable, monitor (optional; only needed until you set up SSH)

[WORK in PROGRESS]