Merging Tables
334 post(s)
#15-Jul-18 00:55

Hi guys,

In 9, I've got field data arriving in about 40 discrete CSVs per day. The client wants "one compiled file, please", every day.

Currently, I load all into GlobalMapper, then export as a single CSV.

What's the best analog to this in 9?

In my head I'm thinking the options are:

a) Import all as table components, then copy and paste each one into a master table each day (sad face)

b) Import all as table components, then keep adding UNION ALL lines to a query referencing the new tables

c) Import as above, turn into drawing, use the Merge facilities.

Anything better I'm not thinking of? Like... mouse-select x number of table components, then go File .. Export? That would be slick (the schemas are identical, btw).


8,167 post(s)
#15-Jul-18 01:49

d) Import all as table components, then use a SELECT INTO query that calls a script function that iterates over all user tables in the project (possibly with a filter), compiles a query using all table names, and returns a table of combined results.

In other words your b), but with enumeration of tables automated.

Do your tables usually have a fixed prefix? Are they in a specific named folder? Otherwise all tables except mfd_root and mfd_meta themselves could be assumed to be targets.
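The query-compiling step in d) is easy to sketch outside Manifold. A minimal Python sketch of just the text-building logic (the table names, prefix filter, and mfd_* exclusions below are placeholders; in a real script they'd come from the script API enumerating components in the project):

```python
def build_union_query(tables, prefix=None):
    """Compile a UNION ALL query over all candidate table names."""
    # Skip the system tables, and apply the optional prefix filter.
    targets = [t for t in tables
               if t not in ("mfd_root", "mfd_meta")
               and (prefix is None or t.startswith(prefix))]
    if not targets:
        raise ValueError("no tables matched")
    return "\nUNION ALL\n".join(f"SELECT * FROM [{t}]" for t in targets)

# Hypothetical project contents:
project_tables = ["mfd_root", "mfd_meta",
                  "field_2018_07_15_a", "field_2018_07_15_b"]
print(build_union_query(project_tables, prefix="field_"))
```

The resulting text can feed a SELECT INTO; the enumeration and execution parts are Manifold-specific and not shown here.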


1,872 post(s)
#15-Jul-18 07:36

Or you could just use one of my favourites:

Aussie Nature Shots

hphillips
14 post(s)
#23-Jul-18 16:04

Or use a shell script?

for file in *.csv; do
    cat "$file" >> todays_files.csv
done

Sorry, this isn't an M9 solution, nor a complete one
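One catch with plain concatenation: if each CSV has a header row, the headers get duplicated in the output. A small Python sketch that merges everything matching a pattern but keeps only the first header (filenames are hypothetical):

```python
import glob
import os

def merge_csvs(pattern, out_path):
    """Concatenate all CSVs matching pattern, keeping only the first header row."""
    wrote_header = False
    with open(out_path, "w", newline="") as out:
        for path in sorted(glob.glob(pattern)):
            # Don't read the output file back into itself.
            if os.path.abspath(path) == os.path.abspath(out_path):
                continue
            with open(path, newline="") as f:
                header = f.readline()
                if not wrote_header:
                    out.write(header)
                    wrote_header = True
                out.write(f.read())

# Example usage:
# merge_csvs("*.csv", "todays_files.csv")
```

This assumes identical schemas across files, as in the original question.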


8,037 post(s)
#23-Jul-18 08:43

If the filenames are all fixed, I'd do this:

1. Create a MAP file and link all CSVs (don't need to import them).

2. Create a model table which will receive all data in the MAP file (if the fields in all files are the same, just copy and paste a table from one of the linked data sources into the MAP file, then add MFD_ID / MFD_ID_X if you want).

3. Create a query that would first do DELETE FROM <model table> and then INSERT INTO <model table> (<field1>, <field2>, ...) SELECT <field1>, <field2> ... FROM <CSV data source>::<CSV table> from each data source.

Then the process of compiling a report is: update the CSV files, open the MAP file which has them all linked, run the query, export the table to the format you want.
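The step-3 query is mechanical to write by hand, but with 40 data sources the text gets long, so generating it can help. A Python sketch of the text generation only (the model table, field, and data source names are placeholders):

```python
def build_refresh_query(model_table, fields, sources):
    """Build DELETE + INSERT INTO query text for a list of (data source, table) pairs."""
    cols = ", ".join(f"[{f}]" for f in fields)
    lines = [f"DELETE FROM [{model_table}];"]
    for src, tbl in sources:
        lines.append(
            f"INSERT INTO [{model_table}] ({cols}) "
            f"SELECT {cols} FROM [{src}]::[{tbl}];"
        )
    return "\n".join(lines)

# Hypothetical names:
print(build_refresh_query(
    "master",
    ["site", "reading"],
    [("day1 csv", "day1"), ("day2 csv", "day2")],
))
```

The generated text follows the DELETE-then-INSERT pattern described above; paste it into a query component (or build it in the script part if the source list changes).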

If the filenames change, there's no way to avoid manual input without writing a script, like Tim says. I'd just write a script to copy the data, but you can use a combination of a query and a script to keep the script part as small as possible. It's fairly straightforward either way; tell us if you need help.

Manifold User Community Use Agreement Copyright (C) 2007-2017 Manifold Software Limited. All rights reserved.