
# Too many arguments for method map (Spark code) #

Here is the relevant part of my code:

val orderItems = sc.textFile("/public/retail_db/order_items")
val orders = sc.textFile("/public/retail_db/orders")
...
dailyRevenuePerProductId.take(10).sortBy(_._1._2).foreach(println)

Now I want to sort in descending order by the revenue, so I added ", false" to the sortBy call:

dailyRevenuePerProductId.take(10).sortBy(_._2, false).foreach(println)

and I get:

:44: error: too many arguments for method sortBy: (f: (((String, Int), Float)) => B)(implicit ord: Ordering[B])Array[((String, Int), Float)]

The false option is available according to the doc I checked for Spark 1.6.2, which is the lab version. Is this caused by a different Spark version?

It is not able to map directly, so try doing:

dailyRevenuePerProductId.take(10).sortBy(rec => rec._1._2).foreach(println)

Thank you very much; I would have to make a change while giving the key to the sortBy function. I tried doing a sort in descending order based on the revenue and it came out fine. I understand this can be achieved by using sortWith, but I just want to know why sortBy throws that error for me.
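For what it is worth, the error comes from where sortBy is being called: take(10) returns a local Scala Array, so the call resolves to ArrayOps.sortBy, which accepts only the key function (plus an implicit Ordering); the ascending flag exists only on RDD.sortBy. The snippet below is a minimal sketch of the alternatives; sampleTop10 and its values are made-up stand-ins for dailyRevenuePerProductId.take(10), whose ((String, Int), Float) element type is taken from the error message.

```scala
// Minimal sketch (plain Scala, no Spark needed for the Array part).
// sampleTop10 is a made-up stand-in for dailyRevenuePerProductId.take(10);
// the element type ((String, Int), Float) = ((date, product_id), revenue)
// comes from the signature shown in the error message.
val sampleTop10: Array[((String, Int), Float)] = Array(
  (("2013-07-25", 1004), 5599.72f),
  (("2013-07-25", 957), 4499.70f),
  (("2013-07-26", 1073), 2999.85f)
)

// Array.sortBy (really ArrayOps/SeqLike.sortBy) takes only the key function:
//   def sortBy[B](f: A => B)(implicit ord: Ordering[B]): Array[A]
// so sortBy(_._2, false) is "too many arguments"; the ascending flag is RDD-only.

sampleTop10.sortBy(-_._2).foreach(println)                          // negate the numeric key
sampleTop10.sortBy(_._2)(Ordering[Float].reverse).foreach(println)  // reversed Ordering
sampleTop10.sortWith(_._2 > _._2).foreach(println)                  // sortWith, as mentioned above

// On the RDD itself the two-argument form does exist (RDD.sortBy(f, ascending)),
// so sorting before take gives the global top 10 by revenue:
// dailyRevenuePerProductId.sortBy(_._2, false).take(10).foreach(println)
```

Note that sorting on the RDD before take also changes the result: it gives the global top 10 by revenue rather than a re-ordering of whichever 10 records take(10) happened to return.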

Thank you for your reply, but what am I missing in my script? I just refactored it; here is the complete code:

val ordersData = sc.textFile("/user/varunu28/retail_db/orders")
val order_itemsData = sc.textFile("/user/varunu28/retail_db/order_items")
val productData = sc.textFile("/user/varunu28/retail_db/products")
val ordersJoin = orderMap.join(order_itemsMap)
val productsJoin = productsMap.join(ordersJoinMap)
val dailyRevenuePerProdId = productsJoin.
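The refactored script above never shows how orderMap, order_itemsMap, productsMap, and ordersJoinMap are built, and the dailyRevenuePerProdId line is cut off. Purely as a hedged sketch of how those pieces could fit together, here is one possibility, assuming the standard retail_db column layout and that only COMPLETE/CLOSED orders count toward revenue; everything not quoted in the thread (column indexes, the filter, the map bodies) is an assumption, not the poster's actual code.

```scala
// Hedged sketch only -- column positions assume the standard retail_db layout:
//   orders:      order_id,order_date,order_customer_id,order_status
//   order_items: order_item_id,order_item_order_id,order_item_product_id,
//                order_item_quantity,order_item_subtotal,order_item_product_price
//   products:    product_id,product_category_id,product_name,...
val ordersData      = sc.textFile("/user/varunu28/retail_db/orders")
val order_itemsData = sc.textFile("/user/varunu28/retail_db/order_items")
val productData     = sc.textFile("/user/varunu28/retail_db/products")

// (order_id, order_date), keeping only completed orders (assumption)
val orderMap = ordersData.map(_.split(",")).
  filter(r => r(3) == "COMPLETE" || r(3) == "CLOSED").
  map(r => (r(0).toInt, r(1)))

// (order_id, (product_id, subtotal))
val order_itemsMap = order_itemsData.map(_.split(",")).
  map(r => (r(1).toInt, (r(2).toInt, r(4).toFloat)))

// (order_id, (order_date, (product_id, subtotal)))
val ordersJoin = orderMap.join(order_itemsMap)

// re-keyed by product_id so it can be joined with the products data set
val ordersJoinMap = ordersJoin.
  map { case (_, (date, (productId, subtotal))) => (productId, (date, subtotal)) }

// (product_id, product_name)
val productsMap = productData.map(_.split(",")).
  map(r => (r(0).toInt, r(2)))

// (product_id, (product_name, (order_date, subtotal)))
val productsJoin = productsMap.join(ordersJoinMap)

// ((order_date, product_id), revenue); the product name could be carried along here if needed
val dailyRevenuePerProductId = productsJoin.
  map { case (productId, (_, (date, subtotal))) => ((date, productId), subtotal) }.
  reduceByKey(_ + _)

dailyRevenuePerProductId.sortBy(_._2, false).take(10).foreach(println)
```

With this shape, dailyRevenuePerProductId ends up as an RDD of ((String, Int), Float), i.e. ((date, product_id), revenue), which matches the signature in the error message earlier in the thread.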
