Hacker News

Anyone who's generating multi-GB JSON files on purpose has some explaining to do.


Logs. JSONL is a popular streaming format.


I guess, but you can grep JSONL just like a regular log file, so you don't need the sophisticated tools discussed in this article.

> 2. Each Line is a Valid JSON Value

> 3. Line Separator is '\n'

https://jsonlines.org/


Yes - 100%. I spend hours a day blasting through line-delimited JSON, and I always pre-filter with egrep, only moving to things like jq once the log size is (hopefully dramatically) reduced.
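A minimal sketch of that pre-filter pattern (the file `events.jsonl` and its fields are made up for illustration): grep discards the lines jq would never need to parse, and only the reduced file hits the slower tool.

```shell
# Fabricate a tiny JSONL log standing in for a huge one.
cat > events.jsonl <<'EOF'
{"level":"info","msg":"started","user":"alice"}
{"level":"error","msg":"timeout","user":"bob"}
{"level":"info","msg":"heartbeat","user":"alice"}
{"level":"error","msg":"disk full","user":"carol"}
EOF

# Cheap pass first: grep cuts 4 lines down to the 2 we care about.
grep '"level":"error"' events.jsonl > errors.jsonl
wc -l < errors.jsonl

# Only the reduced file reaches jq (guarded in case jq isn't installed).
if command -v jq >/dev/null 2>&1; then
  jq -r '.user' errors.jsonl
fi
```

On a multi-GB file the grep stage is often orders of magnitude faster than parsing every line as JSON, so this ordering matters.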

Also - with JSONL you can just grab the first 10,000 or so lines and tweak your query against that sample before throwing it at the full log.
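That sampling trick can be sketched like this (the file `big.jsonl` and its schema are hypothetical; here a fake log is generated just to have something to cut):

```shell
# Fake a 50,000-line JSONL log so the sketch is self-contained.
seq 1 50000 | awk '{printf "{\"id\":%d,\"ok\":%s}\n", $1, ($1%2 ? "true" : "false")}' > big.jsonl

# Iterate on your query against a 10,000-line prefix, not the full file.
head -n 10000 big.jsonl > sample.jsonl
wc -l < sample.jsonl

# Once the query looks right on the sample, point it at big.jsonl instead.
if command -v jq >/dev/null 2>&1; then
  jq -c 'select(.ok)' sample.jsonl | head -n 3
fi
```

Because each line is an independent JSON value, a prefix of the file is always itself valid JSONL, which is what makes this safe.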

With that said - this entire thread has been gold - lots of useful strategies for working with large JSON files.



