How do I extract values from nested JSON?

After parsing some JSON:

data = JSON.parse(data)['info']
puts data

I get:

[
  {
    "title"=>"CEO",
    "name"=>"George",
    "columns"=>[
      { "display_name"=>"Salary", "value"=>"3.85" },
      { "display_name"=>"Bonus", "value"=>"994.19" },
      { "display_name"=>"Increment", "value"=>"8.15" }
    ]
  }
]

columns has nested data of its own.

I want to save the data in a database or CSV file.

title, name, value_Salary, value_Bonus, value_Increment

But I'm not concerned with display_name; I just want the value from the
first element of columns, the second element, and so on.

OK, I tried data.map after converting to a hash, and hash.flatten, but
couldn't find a way out. I also tried .map{|x| x['columns']}.map{|s|
s['value']} to at least get the values separately, but that didn't work
either.
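For what it's worth, the chained maps come close; the catch is that the first .map returns an array of arrays, so the second .map sees whole arrays rather than column hashes. A minimal sketch (the JSON string is reconstructed from the dump above):

```ruby
require 'json'

json = '{"info":[{"title":"CEO","name":"George","columns":[' \
       '{"display_name":"Salary","value":"3.85"},' \
       '{"display_name":"Bonus","value":"994.19"},' \
       '{"display_name":"Increment","value":"8.15"}]}]}'
data = JSON.parse(json)['info']

# .map { |x| x['columns'] } yields an array of arrays; flat_map
# flattens one level so the second .map sees each column hash.
values = data.flat_map { |x| x['columns'] }.map { |s| s['value'] }

p values  # => ["3.85", "994.19", "8.15"]
```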

Got it to work with

records = data.map { |record|
  title, name = record.values_at('title', 'name')
  values = record['columns'].map { |column| column['value'] }

  [title, name, *values]
}
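For the sample data above, that yields one row per record; a quick self-contained check (the JSON string is reconstructed from the dump above):

```ruby
require 'json'

json = '{"info":[{"title":"CEO","name":"George","columns":[' \
       '{"display_name":"Salary","value":"3.85"},' \
       '{"display_name":"Bonus","value":"994.19"},' \
       '{"display_name":"Increment","value":"8.15"}]}]}'
data = JSON.parse(json)['info']

# One flat row per record: title, name, then the column values in order.
records = data.map { |record|
  title, name = record.values_at('title', 'name')
  values = record['columns'].map { |column| column['value'] }
  [title, name, *values]
}

p records  # => [["CEO", "George", "3.85", "994.19", "8.15"]]
```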

but I have another JSON:

data = [
  {
    "title" => "CEO",
    "name" => "George",
    "columns" => [
      { "display_name" => "Address", "value" => "Albany" },
      { "display_name" => "Phone", "value" => "47123" },
      { "display_name" => "Mobile", "value" => "784123" }
    ]
  }
]

  • here the initial title and name remain the same as in the previous
    JSON, but the values inside differ

so I have to join/merge the two sets of values inside columns. Any
ideas on how?

To clarify further…

I have two hashes…

data = JSON.parse(data)['info']
puts data

I get:

[
  {
    "title"=>"CEO",
    "name"=>"George",
    "columns"=>[
      { "display_name"=>"Salary", "value"=>"3.85" },
      { "display_name"=>"Bonus", "value"=>"994.19" },
      { "display_name"=>"Increment", "value"=>"8.15" }
    ]
  }
]

data2 = JSON.parse(data2)['info']
puts data2

[
  {
    "title"=>"CEO",
    "name"=>"George",
    "columns"=>[
      { "display_name"=>"Address", "value"=>"Albany" },
      { "display_name"=>"Phone", "value"=>"47123" },
      { "display_name"=>"Mobile", "value"=>"784123" }
    ]
  }
]

I want to join the values inside "columns" into one hash when the
conditions are met, e.g. name == "George" in both hashes.

the required output to be like

[
  {
    "title"=>"CEO",
    "name"=>"George",
    "columns"=>[
      { "display_name"=>"Salary", "value"=>"3.85" },
      { "display_name"=>"Bonus", "value"=>"994.19" },
      { "display_name"=>"Increment", "value"=>"8.15" },
      { "display_name"=>"Address", "value"=>"Albany" },
      { "display_name"=>"Phone", "value"=>"47123" },
      { "display_name"=>"Mobile", "value"=>"784123" }
    ]
  }
]

What I have tried is zip, merge, inject, and join, but the best I can
get is a new Hash that discards all the first values and stores the
second.
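One way to sketch the merge directly on the parsed structures, assuming name is unique within each array (the merge_columns name is mine, not from the thread):

```ruby
# Index the second array's records by name, then append the matching
# record's columns onto each record from the first array.
def merge_columns(data, data2)
  by_name = data2.each_with_object({}) { |rec, h| h[rec['name']] = rec }
  data.map do |rec|
    other = by_name[rec['name']]
    extra = other ? other['columns'] : []
    rec.merge('columns' => rec['columns'] + extra)
  end
end

data  = [{ 'title' => 'CEO', 'name' => 'George',
           'columns' => [{ 'display_name' => 'Salary', 'value' => '3.85' },
                         { 'display_name' => 'Bonus',  'value' => '994.19' }] }]
data2 = [{ 'title' => 'CEO', 'name' => 'George',
           'columns' => [{ 'display_name' => 'Address', 'value' => 'Albany' }] }]

merged = merge_columns(data, data2)
puts merged.first['columns'].map { |c| c['value'] }.join(', ')
# prints "3.85, 994.19, Albany"
```

Records in data with no match in data2 keep their original columns unchanged.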

On Dec 1, 2013, at 11:44 AM, Dr.Mohamed Ajmal A. [email protected]
wrote:

What I have tried is zip, merge, inject, join, but the best I can get is
a new Hash that discards all first values and store second.

If the order of the elements in the nested columns array is always the
same then the approach you used to extract the values is fine.

Assuming that the order is always the same, you can extract the values
from the second structure as you did with the first. Once you have the
two in arrays, build a Hash using a unique value as the key (the name in
your example) from the first, then merge the values you want from the
second.

For example:

Extract records and records2 as you did before, then:

people = {}

# Build the hashes from the first array, using the name (row[1]) as the key
records.each do |row|
  people[row[1]] = {
    title: row[0],
    name: row[1],
    salary: row[2],
    bonus: row[3],
    increment: row[4]
  }
end

# Merge values from the second array, using the same key
records2.each do |row|
  if people.has_key?(row[1])
    people[row[1]].merge!({
      address: row[2],
      phone: row[3],
      mobile: row[4]
    })
  end
end

If it’s possible for either set of records to have duplicate entries for
the key, you will lose some data with this approach.

If you want to save the data to a database, then the hashes are what you
need:

people.values

If you need the values as an array instead of a hash, to save to CSV,
then:

people.map {|k,v| v.values}
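Putting that together for the CSV route, a sketch (the header row is just the hash keys; the sample people hash mirrors the merged data above):

```ruby
require 'csv'

people = {
  'George' => { title: 'CEO', name: 'George', salary: '3.85',
                bonus: '994.19', increment: '8.15',
                address: 'Albany', phone: '47123', mobile: '784123' }
}

csv = CSV.generate do |out|
  out << people.values.first.keys     # header row from the hash keys
  people.each_value { |person| out << person.values }
end

puts csv
```

Ruby hashes preserve insertion order, so the header and value columns line up as long as every person hash was built with the same keys in the same order.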

This approach is fine if the data is fairly small. Otherwise, it will be
very inefficient in terms of processing and memory because it performs
multiple passes and keeps multiple copies of the data in memory. If you
have no control over the JSON data, and the data is fairly small
(compared to your system memory), and this is a one-off task, I wouldn't
worry about the inefficiency too much.

Hope that helps,
Ammar

Thanks Ammar, got it working along similar lines.