Linking Large ADF in Manifold 9

843 post(s)
#04-Apr-18 05:13

I have downloaded a very large (67GB) adf dataset of Australian elevations which I would like to link into Manifold 9.

I have tried linking the hdr file, and after 1 hour the import appeared to have completed, but I can't do anything; I just get "invalid heap" messages.

Any ideas what might be going wrong?

Is import the best approach for this? I only need read-only access.

Would it be better to add as a datasource? Or is that just the same thing via a different menu?

My main aim is to be able to browse and export manageable chunks of this dataset to use in Manifold 8 more efficiently than downloading bits from Geoscience Australia.


5,436 post(s)
#04-Apr-18 06:36

Any ideas what might be going wrong?

Not without *way* more information on details. You know the drill: What version of 9 are you using? Tell us about your computer system... free disk space, where the TEMP folder is located, size of pagefile, etc, etc. Is the download on the same machine? On the same hard disk? Is the data accessed through a network on an archival disk shared by several local machines? Etc., etc.

Do you have a link to the data? Have you contacted tech support? What did they say, if you did contact them?

9 is generally very good at working with big data on even small machines, but a) it can make mistakes, which can be found only with lots of detail, and b) it cannot work miracles, such as enabling use of very large data on machines that do not have enough free space to work with big data. So,... to get to interesting causes of problems we should begin by collecting all details necessary to exclude obvious issues.

Would it be better to add as a datasource? Or is that just the same thing via a different menu?

In general, yes. But it is not always the same thing.

Best of all would be to import into Manifold and then save that .map. Link that .map into any future projects that will use the data. Why use .map? Because it is a faster format than .adf.


8,568 post(s)
#04-Apr-18 07:53

The number one thing to sort out is the 'invalid heap' messages, and we need more data to figure out where they come from (e.g., is the data set vector or raster? Is it linking or importing? You say 'linking' in the title, but then mention that the 'import' appeared to have completed). Please contact tech support.

Some time to pre-process a large data set is expected. If the data set is raster, we have to build intermediate levels; if it is vector, we have to build a spatial index. In both cases we need to read the entire data set, which takes time. Moreover, if we are talking about linking, it is important to let the data source save the composed data by answering 'Yes' to the 'Save data?' prompt when you close or save the MAP file; otherwise the data source will have to recompute it in future sessions. But, again, the most important thing now is to figure out where the 'invalid heap' messages come from: it could be bad data in the file, it could be an issue on our side, etc.
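For readers wondering how much extra data those intermediate levels add: with standard 2x-per-axis downsampling (generic raster-pyramid arithmetic, not necessarily Manifold's exact internal layout), the levels sum to roughly 1/4 + 1/16 + ... ≈ 1/3 of the base size. A back-of-envelope sketch:

```python
def pyramid_overhead(base_gb, factor=2, levels=32):
    """Total size of the downsampled intermediate levels of a raster pyramid.

    Each level shrinks by `factor` per axis, i.e. factor**2 in pixel count,
    so the series 1/4 + 1/16 + ... converges to 1/3 for factor=2.
    """
    total = 0.0
    size = float(base_gb)
    for _ in range(levels):
        size /= factor ** 2
        total += size
    return total

# For a 67 GB base raster, the levels add about a third again on top.
print(round(pyramid_overhead(67), 2))  # 22.33
```

This is only the level data; MAP file structures and temporary copies add more on top, which is why the free-space rule of thumb below is larger than 4/3.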


843 post(s)
#04-Apr-18 09:53

It is a raster dataset: surface elevation derived from SRTM 1-second data for the whole of Australia.

DEM-H dataset from here:!aac46307-fce8-449d-e044-00144fdd4fa6

It is saved on a network drive.

Running Manifold

97GB free on SSD

I used the file menu - import.

It then displays, I think, "copying data" for about an hour.

Really just experimenting with 9 at the moment, but this seemed like something it would be good for.

Might try an import overnight.

The dataset works fine with QGIS, just a bit slow, though not an hour slow.


5,436 post(s)
#04-Apr-18 10:40

The dataset works fine with QGIS, just a bit slow, though not an hour slow.

Manifold .map isn't slow. It tends to pop open instantly. From the third paragraph in the Importing and Linking topic:

Importing large files can take a long time because the imported data will be analyzed and stored in special, pre-computed data structures within the Manifold file that allow subsequent reads and writes to be very fast. It pays to be patient with such imports as once the data is imported and stored within a Manifold project file access to that data will usually be far faster than it was in the original format. Once imported the data will open instantly thereafter.

So yes, the initial import can be slow. After that it is very fast, often instantaneous.

Be that as it may, our task is to find out why you are getting an error. One obvious thing to do is to simplify the situation to exclude causes of error that have nothing to do with Manifold. For example:

It is saved on a network drive.

OK. Networks often have errors. Do all your work locally to eliminate the possibility of network errors causing symptoms. If everything works perfectly locally but doesn't when you start using your network, it can make sense to take a look at your network.

97GB free on SSD

I don't recall if ADF is a compressed format. If it were a 67 GB TIF you wouldn't have enough free space to save a big project on that drive. See "Must Have Free Space on Disk" in the Performance Tips topic.

It also helps to have the info I mentioned earlier. You don't say anything about your pagefile, TEMP folder, etc.

About the data: your link opens to a page with many links on it. The DEM-H link is only 26.8 GB. Is that it?


8,568 post(s)
#04-Apr-18 16:59

If 97 GB is all space that is available for an import of a 67 GB ADF (= all space available for converting ADF to MAP, which involves creating additional data for intermediate levels as well as for MAP file structures), then this is very likely not enough space and we are looking at an 'out of disk space' error.

This (not enough space) might have an effect on the performance of the import as well.

The rule of thumb is having the temp space be 3 times the size of imported data, before accounting for compression. This tries to cover data which might have to be added plus one temporary copy for the save.
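Plugging the numbers from this thread into that rule of thumb (a rough sketch, treating the 67 GB ADF as if it were already uncompressed):

```python
adf_gb = 67              # size of the source ADF data set
free_gb = 97             # free space reported on the SSD
needed_gb = 3 * adf_gb   # rule of thumb: 3x the imported data

print(needed_gb)             # 201
print(free_gb >= needed_gb)  # False -> consistent with an out-of-disk-space failure
```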


843 post(s)
#06-Apr-18 02:59

Probably need to stick with QGIS then for this dataset.


5,436 post(s)
#06-Apr-18 05:16

Why? So you can run slow with it every time you use it? Is that not a case of "penny wise and pound foolish"?

Why not figure out how to import into Manifold, if possible, save as a .map and then thereafter every time run fast?

I write "figure out" because sometimes the different workflow used by different tools needs to be learned, including how to deal with their particular constraints or requirements. That's often very positive because it can open more efficient methods, and sometimes what appears to be a hassle initially can save much time in the long run if it points out a weakness in other workflow or infrastructure that could be holding you back in other areas.

An example of that might be (just saying "might" as you need to be on the scene to know) a limitation of 97 GB of free space. That is an absurdly small amount of free space if you are working with data sets where a single one of them is 67 GB in size. Thinking "well, I can get around it this once using this slow tool to squeeze by..." may certainly be OK on a one-time basis, but only until the next "one-time" rolls around. :-)

Many of us have had that feeling, of knowing we are working with too little free space for the size data we routinely manipulate, but feeling a need to just get by with the task right now. When you start having to think about "do I have enough room for this next file?" on disk, well, that's a sign the low amount of free space has to be dealt with. Move out what you don't need or get more space.

In my own work I often make copies willy-nilly so I have backups, and backups to backups. The result? The other day I realized on my primary workstation I had about 50 GB free space left despite having plenty of terabytes in local storage plus connections to effectively limitless archival storage. Doh. I had been wasting time moving files about to open up enough free space for new, big data I was acquiring.

So I invested some time doing spring cleaning, organizing more rational archives, consolidating copies of copies that had proliferated and what do you know... 2 TB free space!

Cannot resist adding... even at that, 2 TB is nothing. Heck, a 2 TB SSD is a mere $410 these days. Visiting just now and taking a look, there are plenty of 8 TB drives at prices around $230. Buy a 12 TB drive for $450. Having vast storage space is one of the cheapest, yet most effective investments you can make.


843 post(s)
#06-Apr-18 08:08

Why? Because it is not so easy to retrofit a large hard drive in a laptop, and I might only use that dataset once a year or so, so in this case it is not worth fiddling with. I'm sure there are other applications Manifold 9 will be good for.


5,436 post(s)
#06-Apr-18 10:08

it is not so easy to retrofit a large hard drive in a laptop,

Ah, I admire your ambition. :-) But, you know, the above is why tech support asks about systems before offering advice, debugging, etc. It helps to know if somebody is running 32-bit or 64-bit, working on a laptop, etc.

But... since you've raised the question it seems only right to follow up a bit on this for the benefit of other forum users.

Are you running 32-bit or 64-bit on your laptop?

How much RAM memory do you have in your laptop?

What version of Windows are you running?


In the "for what it is worth" department, consider upgrading the drive in your laptop. I've done that many times in various laptops and, so far, it's always worked perfectly and has been much easier than I thought it would be.

Consider that even if you do this only once a year or so, are you going to leave a big data set on a laptop if space is tight? I tried downloading the link, the one I asked about:

The DEM-H link is only 26.8 GB. Is that it?

... and the site is very slow, reporting about 9 to 12 hours for the download. The download was interrupted when I tried two days ago, so I've just launched it again. The site choked on both Chrome and Opera, so I had to use Edge, which is not as good as either Chrome or Opera for restarting an interrupted download. If something takes nine hours to download, why ever repeat that?


843 post(s)
#09-Apr-18 07:46

64-bit Windows 10 Pro, 12 GB of RAM, only Intel graphics.

The dataset is saved on our file server, so no need to download it again. The download link is correct, it is only 26.8GB because it is zipped.

Putting a big drive in the laptop is possible, but since there is space for only one drive I would lose the benefits of an SSD.

Keeping data on the laptop would prevent others in the office from getting to it. We would also have to implement backups for the laptop.

We only ever need small, catchment-size chunks of the data at any time, so QGIS works well to export a suitable chunk to work with. Having it in Manifold 9 format would probably work even better, but not enough to bother with for this particular dataset, particularly as we only have one 9 licence to try out for the moment, so the other 16 Manifold users would not be able to take advantage.

I am not following up with support because I don't actually need to get at that data now; I just thought it might be a useful dataset for showing how good Manifold 9 is.


4,212 post(s)
#09-Apr-18 11:46

(I also appreciate it btw, it's a magnificent data set to have on tap - thanks!)


843 post(s)
#27-Dec-18 07:30

I thought I would have another go at this since I just got a new computer with lots of disk space.

It now appears to finish importing, but instead of getting a DEM of Australia, I get a good one of Western Australia, with a copy of itself placed to the east, which may be concerning to the residents of Victoria who will suddenly find themselves swimming.

Anyone have an idea what has gone wrong here?

Link to a screenshot showing the dataset as displayed in QGIS (left) and Manifold 9 (right)

I've overlaid the Australian coastline on the manifold one to show where it ended up.

Fortunately the bit I use most (west) is working fine.


5,436 post(s)
#27-Dec-18 08:29

Anyone have an idea what has gone wrong here?

Yes. What went wrong is you didn't answer my question back in April. :-)

I tried downloading the link, the one I asked about:

The DEM-H link is only 26.8 GB. Is that it?

I gave up downloading until I knew the effort would be for the right file. (I'm saying that with good humor... no hard feelings.) But without the actual file in hand it's not possible to debug the situation.

Two things to try:

1) Try importing it into 9 using the GDAL path. What happens then?

2) Report a bug to tech support and be ready to provide full info, including on what happened using GDAL. It may or may not be a bug in Manifold, but that's the right next step.

Such issues usually fall into one of four classes:

1) A bug in how the data was written into the file format. (If the bug is in the same software that wrote the file and now reads it successfully, that does not mean the format is correctly written.)

2) A bug in how Manifold reads that format.

3) A gray area in the format spec, where there are different, reasonable ways to interpret how data should be written to and read from the format.

4) User error. This is much less likely given the automated nature of most format imports, but it does occur.

At times you get all four mixed together.

The only way you can sort such things out is with a copy of the original data, all info on the data, and every detail of how it was imported. Given that you can't upload 28 GB to the forum, if the GDAL import works then that's a classic bug report.

Unless the originating server has gotten more reliable in the last seven months it might be best if you offered to upload the original data to tech support if you report a bug.


8,743 post(s)
#27-Dec-18 21:37

I get essentially the same result as Mike, after setting up the .ADF source as a new data source.

DEM-H downloaded from (permalink). Zip source extracted.

In Manifold:

  • Create > New Data Source...
  • File: adf
  • ... a05f7893-0050-7506-e044-00144fdd4fa6\hdr.adf

The Migrating Data phase took about 34 minutes (according to the log).


2018-12-28 09:31:36 -- Manifold System Beta

2018-12-28 09:31:36 -- Starting up

2018-12-28 09:31:36 Log file: C:\Users\tjhb\AppData\Local\Manifold\v9.0\20181228.log

2018-12-28 09:31:37 -- Startup complete (0.750 sec)

2018-12-28 09:31:37 -- Create: (New Project) (0.015 sec)

2018-12-28 09:44:46 -- Create cache: D:\Downloads\\a05f7893-0050-7506-e044-00144fdd4fa6\_mfd.mapcache (0.000 sec)

2018-12-28 10:18:53 -- Cache: (root)::[Data Source]::[a05f7893-0050-7506-e044-00144fdd4fa6 Tiles] (2040.949 sec)

2018-12-28 10:18:54 Render: [Data Source]::[a05f7893-0050-7506-e044-00144fdd4fa6] (0.296 sec)

The resulting image (no style applied):

The result in Global Mapper 18.2:

Here is what the table looks like in Manifold 9. Note the unusual tile size (512x4). There are 8,843,400 tiles.

DEM-H GM18.png
DEM-H table.png


8,743 post(s)
#27-Dec-18 22:32

A couple more comments.

The table includes a full rectangle of tiles, including tiles without visible pixels (many). That seems unusual (and wasteful).

The number of 512x4 tiles in the X direction is 289 (0..288), and in Y is 30600 (0..30599).

The image is truncated at X = 143--it repeats from X = 144--which is exactly halfway across the image.
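For anyone following along, those figures are internally consistent. A quick arithmetic check using the numbers above:

```python
tile_w, tile_h = 512, 4        # unusual tile size reported by Manifold
tiles_x, tiles_y = 289, 30600  # tiles in X (0..288) and in Y (0..30599)

total_tiles = tiles_x * tiles_y
print(total_tiles)  # 8843400, matching the table

# Truncation at tile X = 143, repeating from X = 144:
# tiles 0..143 make 144 columns, just under half of 289.
print(round(144 / tiles_x, 3))  # 0.498 -> almost exactly halfway across
```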

To check these things I used this query, which draws indexed rectangles for each tile in an image. In case someone else finds it useful.

-- source

VALUE @table TABLE = [Data Source]::[a05f7893-0050-7506-e044-00144fdd4fa6 Tiles];
VALUE @image TABLE = [Data Source]::[a05f7893-0050-7506-e044-00144fdd4fa6];

-- source metadata

VALUE @field NVARCHAR = ComponentProperty(@image, 'FieldTile');
VALUE @dimXY INT32X2 = CAST(ComponentProperty(@table, 'FieldTileSize.' + @field) AS INT32X2);
VALUE @dimX INT32 = VectorValue(@dimXY, 0);
VALUE @dimY INT32 = VectorValue(@dimXY, 1);

-- target

CREATE TABLE [tile boxes Table] (
    [mfd_id] INT64,
    [X] INT32,
    [Y] INT32,
    [Geom] GEOM,
    INDEX [mfd_id_x] BTREE ([mfd_id]),
    INDEX [Geom_x] RTREE ([Geom]),
    PROPERTY 'FieldCoordSystem.Geom'
        ComponentProperty(@table, 'FieldCoordSystem.' + @field) -- indexed space
);

CREATE DRAWING [tile boxes] (
    PROPERTY 'Table' PragmaValue('creatednamequoted'),
    PROPERTY 'FieldGeom' 'Geom',
    PROPERTY 'StyleAreaColorBack' '{ "Value": -16777216 }' -- clear
);

INSERT INTO [tile boxes Table] ([X], [Y], [Geom])
SELECT
    [X],
    [Y],
    -- rectangle from the tile's corner coordinates; the exact call was
    -- garbled in this forum copy, reconstructed here from the two corners
    GeomMakeRect(VectorMakeX4(
        [X] * @dimX, [Y] * @dimY,
        ([X] + 1) * @dimX, ([Y] + 1) * @dimY))
FROM @image;

Draw image tile boxes.sql


8,743 post(s)
#28-Dec-18 02:30

I tried one more thing: retiling the image into 2048x2048px tiles. Code attached, with thanks to Adam's example here--updated for new functions added since.

There are now only 4380 tiles.

Much the same result:

DEM-H retiled.png
Retile to 2048x2048.sql


843 post(s)
#28-Dec-18 01:41

Anyone know if M9 can utilise the GDAL installed with QGIS, or if a separate installation is necessary?

I tried adding the QGIS folders with GDAL DLLs to the PATH, but nothing seems to happen when I try to import with GDAL in Manifold, so I assume that has not worked.


5,436 post(s)
#28-Dec-18 15:44

I tried adding the QGIS folders with GDAL DLLs to the PATH, but nothing seems to happen when I try to import with GDAL in Manifold, so I assume that has not worked.

There are detailed instructions in the GDAL / OGR topic. As the topic notes...

Testing shows the biggest problem users have with GDAL/OGR is getting GDAL installed on their machine, and with updating the Windows PATH environmental variable to correctly point at the GDAL installation.

... so it provides detailed instructions. I followed those instructions a few months ago and have had no problems since then using GDAL from Manifold on my machine. Should still work.
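One quick way to confirm a PATH edit actually took effect is to check the value programmatically rather than eyeballing it. A small sketch (the directories shown are hypothetical; on a real machine pass `os.environ["PATH"]` and the actual GDAL or QGIS folder):

```python
import ntpath  # Windows path semantics, usable on any OS

def dir_on_path(directory, path_value, sep=";"):
    """Return True if `directory` appears as an entry in a Windows PATH string,
    ignoring case and trailing separators."""
    target = ntpath.normcase(ntpath.normpath(directory))
    return any(ntpath.normcase(ntpath.normpath(entry)) == target
               for entry in path_value.split(sep) if entry.strip())

# Hypothetical PATH value for illustration.
path = r"C:\Windows\System32;C:\Program Files\GDAL"
print(dir_on_path(r"C:\Program Files\GDAL", path))      # True
print(dir_on_path(r"C:\Program Files\QGIS\bin", path))  # False
```

Note that a PATH change made after Manifold was launched is not seen by the running process; restarting Manifold (or logging out and back in, for a system-wide change) is needed before it takes effect.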


8,743 post(s)
#30-Dec-18 07:12

Still works.

I have just done an Import using Manifold with GDAL 2.2.3 x64 (MSVC 2015 version) from Tamas Szekeres' GISInternals.

There is no truncation/repetition.

The tile dimensions are the same (512x4), and the number of tiles is also the same.

I will do a more exact comparison tomorrow and report any differences.

DEM-H via GDAL 2.2.3.png


8,743 post(s)
#01-Jan-19 00:56

Same (correct) result as above after following the exact earlier workflow, except using GDAL.

In Manifold (after deleting the previous mapcache file):

  • Create > New Data Source...
  • File: gdal
  • ... a05f7893-0050-7506-e044-00144fdd4fa6\hdr.adf

1,918 post(s)
#28-Dec-18 09:58

I have the full data set, which also includes the full DEM-H in over 800 tiles. Importing or linking the full .ADF gives the same split result as above, but importing or linking any of the individual tiles places them correctly.

Aussie Nature Shots

Manifold User Community Use Agreement Copyright (C) 2007-2017 Manifold Software Limited. All rights reserved.