I am working on a challenge that comes with a large base64 file: each line holds two characters, and there are 150+ million lines.
Luckily we can use the base64 stream decoder, base64.NewDecoder. It takes an io.Reader and returns another io.Reader whose output can be copied into an io.Writer. It also takes care of the newlines.
The sample code accepts a base64 encoded file (whitespace does not matter) via -file/--file and stores the decoded bytes in filename-out.
Two interesting things:
- os.Open returns an *os.File, which implements io.Reader, meaning we can pass it to any function that accepts one.
- Since the base64 stream decoder also returns an io.Reader, we can use io.Copy to send its output to the output file directly.
The code is pretty simple: we open the input file and the output file, pass the input file to the base64 stream decoder, copy its output to the output file, and we're done.
Now this code is pretty fast and simple. For example, our input file of 150+ million lines was decoded and written to disk in less than three seconds (Measure-Command is PowerShell's equivalent of time):
PS> Measure-Command{go run .\base64streamdecoder-example.go -file base64file}
Days : 0
Hours : 0
Minutes : 0
Seconds : 2
Milliseconds : 960
Ticks : 29604607
TotalDays : 3.42645914351852E-05
TotalHours : 0.000822350194444444
TotalMinutes : 0.0493410116666667
TotalSeconds : 2.9604607
TotalMilliseconds : 2960.4607
Such is the magic of io.Reader.