apache spark - Scala - Expanding an argument list in a pattern matching expression


I'm new to Scala and I'm trying to use it to interface with Spark. I'm running into a problem writing a generic CSV-to-DataFrame function. For example, I've got a CSV with 50 fields, the first of which are task, name, and id. I can get the following to work:

import java.io.StringReader
import scala.collection.JavaConversions._ // lets .map work on readAll()'s java.util.List
import com.opencsv.CSVReader              // older opencsv versions use au.com.bytecode.opencsv
import org.apache.spark.sql.Row

val reader = new CSVReader(new StringReader(txt))

reader.readAll().map {
  case Array(task, name, id, _*) => Row(task, name, id)
  case unexpectedArrayForm =>
    throw new RuntimeException("Record did not have correct number of fields: " + unexpectedArrayForm.mkString(","))
}

However, I'd rather not have to hard-code the number of fields needed to create the Spark Row. I tried this:

val reader = new CSVReader(new StringReader(txt))

reader.readAll().map {
  case Array(args @ _*) => Row(args)
  case unexpectedArrayForm =>
    throw new RuntimeException("Record did not have correct number of fields: " + unexpectedArrayForm.mkString(","))
}

But this creates a Row object with a single element. How can I make args expand inside Row(args), so that if I have an array of N elements I get a Row with N elements?

This should do the trick:

val reader = new CSVReader(new StringReader(txt))

reader.readAll().map {
  case a: Array[String] => Row(a: _*)
  case unexpectedArrayForm =>
    throw new RuntimeException("Record did not have correct number of fields: " + unexpectedArrayForm.mkString(","))
}
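The `: _*` type ascription tells the compiler to pass the array's elements as individual varargs instead of as one argument. A minimal sketch of the difference, using made-up sample data and only Row from spark-sql:

import org.apache.spark.sql.Row

val fields: Array[String] = Array("task-1", "alice", "42")

val expanded = Row(fields: _*) // three-element Row: [task-1,alice,42]
val wrapped  = Row(fields)     // one-element Row whose single value is the whole array

assert(expanded.size == 3)
assert(wrapped.size == 1)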

Edited to correct the omission of the Array type.
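To close the loop on the original goal of a generic CSV-to-DataFrame function, here is a hedged sketch of feeding the expanded rows into a DataFrame. The SparkSession value `spark`, the input `txt`, and the generated column names col0, col1, ... are assumptions for illustration, not part of the question:

import java.io.StringReader
import scala.collection.JavaConverters._
import com.opencsv.CSVReader // older opencsv versions use au.com.bytecode.opencsv
import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{StringType, StructField, StructType}

// Assumes a SparkSession named `spark` and the CSV text in `txt`.
val reader = new CSVReader(new StringReader(txt))
val rows = reader.readAll().asScala.map {
  case a: Array[String] => Row(a: _*)
}

// One nullable StringType column per field; the names col0..colN-1 are illustrative.
val schema = StructType((0 until rows.head.size).map(i => StructField(s"col$i", StringType)))
val df = spark.createDataFrame(spark.sparkContext.parallelize(rows), schema)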

