Home - Cutting Edge / All posts - Manifold System

8,259 post(s)
#12-Mar-18 17:01

SHA256: c5c3be2cc5c218f8c412579a6d2b4b5709398a9041ad6a0c8cda3670dc92d18b

SHA256: d09b6d54592512e452911804ff008f4f7fe600d86a9558c9d96bba7ae583bd22


8,259 post(s)
#12-Mar-18 17:02


Composing contour areas or lines includes optimizations for empty or missing tiles.

Composing contour areas or lines includes optimizations for tiles with a small height range.

(Note: the performance increases from the above two optimizations depend on the data, but (a) there is nearly always an increase, and (b) usually, the increase is significant.)

Composing contour areas or lines and tracing areas reports progress.

Previews for Geoms Adjacent, Geoms Contained, Geoms Containing, Geoms Intersecting, Geoms Touching, Normalized Geoms templates in the Select and Transform panes are more accurate.

The Select pane includes an 'Allow parallel execution' option (on by default) used for overlay templates.

The Select and Transform panes stop the preview prior to performing the operation. (This was wasting resources unnecessarily if the preview was still running.)

Connecting to a WMS server assumes that servers using old versions of the WMS standard (pre-1.1.1) ignore axes order for the coordinate system and always use XY.

Reading a DWG or DXF file better handles block inserts and performs faster.

Exporting a drawing to a DXF file exports areas as hatches instead of exporting area boundaries as lines.

End of list.
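The first two items in the list above (skipping empty tiles and doing minimal work on tiles with a small height range) can be sketched as tile-level pruning. This is hypothetical logic illustrating the idea, not Manifold's actual implementation:

```python
import math

def levels_in_tile(tile_min, tile_max, level_min, level_max, step):
    """Return the contour levels that can intersect a tile, given the
    tile's height range. Empty/missing tiles yield no levels at all."""
    if tile_min is None:                 # empty or missing tile: skip entirely
        return []
    lo = max(tile_min, level_min)        # clip the level range to the tile
    hi = min(tile_max, level_max)
    if lo > hi:
        return []
    first = level_min + math.ceil((lo - level_min) / step) * step
    levels = []
    v = first
    while v <= hi:
        levels.append(v)
        v += step
    return levels

# A nearly flat tile spanning only a few metres needs very few levels:
print(levels_in_tile(269.0, 283.0, 269, 1028, 10))   # [269, 279]
# An empty tile contributes no work at all:
print(levels_in_tile(None, None, 269, 1028, 10))     # []
```

Performance then scales with how many levels actually pass through each tile, which matches the note that gains depend on the data.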


5,119 post(s)
#12-Mar-18 18:43

I'd like to thank everyone who sent in data sets as part of the contouring discussion in this thread, as those were a big help in producing the optimizations in 165.5. Having good example data and descriptions of workflow made it possible to turn this around rapidly. If anyone ever found anything quicker than 9 for contouring, please re-run using 165.5 to see how it has improved. :-)


3,097 post(s)
#12-Mar-18 20:34

I will try giving it a shot tomorrow or Wednesday. But out of curiosity, what improvements did you see using the large DEM I sent you? (This question is for either Dimitri or any of the other folks who got hold of the Garrett County DEM.)

159 post(s)
#12-Mar-18 20:59

I reported that 5 ft contour lines on a 2.8G DEM (sftopo_2236_contour_lines) took 363.645 sec on a Core i7 6700 (4 cores, 8 threads). Looking at the query, I think it was actually 3 ft contours. Just reran with 165.5: it took 30.061 seconds, over 10X faster. 5 ft contour lines took 19.601 seconds.

dyalsjas
96 post(s)
#13-Mar-18 02:17

Original Run

Garrett County DEM

Manifold Cutting Edge build

Transform Contour Areas:

Min Height 269m, Max Height 1028m, Step 10m

560.416 seconds (9 minutes, 20 seconds)

New Build Run

Garrett County DEM

Manifold Cutting Edge build

Transform Contour Areas:

Min Height 269m, Max Height 1028m, Step 10m

57.851 seconds


8,410 post(s)
#13-Mar-18 02:39

Speed is one thing. Noise is another.

It's not there yet!

Examples soon.


1,668 post(s)
#13-Mar-18 03:32

Is this the sort of thing you mean Tim?


Landsystems Ltd ... Know your land |


5,119 post(s)
#13-Mar-18 04:49

What happens when you normalize those results?


8,410 post(s)
#13-Mar-18 05:58

Kind of...

The attached 8.0.30 result took about one hour. (A bit more; Manifold 8 didn't log a time natively so I was reliant on my watch.) Notice how smooth it is. Some very small artefacts that should ideally not be there, but almost no contour collisions.

The result took 69 seconds. There is a significant number of contour collisions, though very few small artefacts.

The result took 715 seconds (11m 55s). As far as I can tell, the result is exactly the same as the result for

These are all 15m contours from SRTM at one arc-second per pixel for the whole South Island of New Zealand (32401 x 28801 px, FP32, missing pixels filled with -32767).

The screenshots are details from a flat area of Canterbury directly west of Banks Peninsula. The match between zooms and scales is rough.

Timings are for i7-4790 (without Meltdown/Spectre mitigations).

I much prefer the result given by 8.0.30. I don't think I could use the results from 9, however quick. No result is perfect.



8,410 post(s)
#13-Mar-18 06:36

This may be a case of asking a silly question (to test new software) and getting a silly answer.

SRTM is somewhat noisy.

I will try again after an appropriate Gaussian blur tomorrow.


5,119 post(s)
#13-Mar-18 07:19

SRTM is somewhat noisy.

I'll leave it to adamw for a more precise discussion, but I do not believe it is so much that SRTM is noisy; rather, it does not match reality, so that drawing contours at certain levels of detail will not be free of odd contour features that the data insists should be there.

SRTM is integer data: 7, 8, 9 and so on. That is not the physical reality of the surfaces SRTM represents, which contour smoothly at scales both below and above the pixel size of SRTM. If you use a step function to represent non-stepped data, and you contour at levels of detail where those not-the-same-as-reality steps are a factor, you get artifacts from procedures that create contours according to the generally understood conventions of what contours are supposed to mean.

None of this is unusual: the common recipe to which adamw refers I believe comes from ESRI advice. I suppose other approaches might be to use a blur, or remove resultant artifacts with normalization.


8,259 post(s)
#13-Mar-18 07:04

There should be no difference between those two results. You say you aren't seeing any, and I am not seeing any either from a cursory look, so that's good.

There might be a difference between the results produced by 9 and 8. The screens show some. What remains is to determine whether these differences are good or bad. The important thing is that both 8 and 9 use the same underlying model for the surface, so if their results differ it is a simple case of one of them being wrong. (Other programs may use a slightly different model, and so their contours might be legitimately different from those produced by 8 or 9, but this is not the case here.) And we did fix a number of inaccuracies compared to 8, particularly in edge cases like the contour height coinciding exactly with the height of flat areas of specific shapes. It might be that it is 8 that is wrong, although yes, it is possible that it is 9.

Could we have the file and the contour transform parameters?

Whatever the case, a common recipe to get rid of spikes is to alter contour heights slightly so that they don't coincide exactly with heights in the raster. That is, instead of producing contours every 10 meters starting at 500, produce contours every 10 meters starting at 500.001.
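The recipe can be checked mechanically: on integer height data, an integer contour level coincides exactly with many samples, while a tiny offset avoids every coincidence. A minimal sketch with made-up heights:

```python
# Integer heights, as in SRTM (values invented for illustration).
heights = [498, 499, 500, 500, 500, 501, 502, 500, 499]

def exact_hits(samples, level):
    """Count samples lying exactly on the contour level; each such
    coincidence forces a degenerate (zero-length or spike) crossing."""
    return sum(1 for h in samples if h == level)

print(exact_hits(heights, 500))      # 4: four samples sit exactly on the level
print(exact_hits(heights, 500.001))  # 0: every crossing is a clean interpolation
```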


8,259 post(s)
#13-Mar-18 07:26

On a closer look, it seems to me that the screen for 8 and the screens for 9 show contours for different heights.

That might be fixed by having the UI for entering contouring parameters round the min / max height to the step. (Currently in the transform, if you specify that you want heights between 120 and 445 with a step of 100, you get 120, 220, 320, 420, while you perhaps want 200, 300, 400.)


8,410 post(s)
#13-Mar-18 21:12

On a closer look, it seems to me that the screen for 8 and the screens for 9 show contours for different heights.

I've checked, and I'm afraid that is not the case.

So back to:

The important thing is that both 8 and 9 use the same underlying model for the surface so it's a simple case of one of them being wrong. ... we did fix a number of inaccuracies compared to 8 particularly in edge cases like contour height coinciding exactly with that of flat areas of specific shapes. It might be that it is 8 that is wrong, although yes, it is possible that it is 9.

Could we have the file and the contour transform parameters?

I am packaging up a single 1x1° tile of SRTM data, including the area shown in previous screenshots, and will list exact steps in 8 and 9.

The original data source is here. Login required. I will also provide a direct link.


8,410 post(s)
#13-Mar-18 21:37

The original GeoTIFF version of the tile is also at Dropbox here.


8,410 post(s)
#13-Mar-18 21:52

Steps in Manifold 8.0.29:

  • Import surface s44_e172_1arc_v3.tif (see above)
  • Leave as Int16, 3601 x 3601px
  • Open surface
  • Surface > Contours...
  • Name: ...
  • Create: lines
  • Heights: remove 2 suggested heights
  • Heights: Add sequence > Offset: 0.00, Step: 15 > OK > OK

[Takes about 1m 55s on this laptop.]


8,410 post(s)
#13-Mar-18 22:17

Steps in Manifold 9:

  • Import image s44_e172_1arc_v3.tif
  • Table [s44_e172_1arc_v3 Tiles] > Properties > FieldCoordSystem.Tile...
  • Adjust projection from standard EPSG:4326 to use XY order, by adding an override inside the mfd JSON string:

EPSG:4326,mfd:{ "LocalOffsetX": 171.9998611111111, "LocalOffsetY": -44.00013888888889, "LocalScaleX": 0.0002777777777777778, "LocalScaleY": 0.0002777777777777778, "Axes": "XY"}

(If the projection is not adjusted, then while the image may display correctly, the contours will inherit YX order and appear inverted.)

  • Open image
  • Contents > Transform > Contour lines
  • Min height: 0
  • Max height: 1930
  • Step: 15
  • Allow parallel execution
  • Add component

[Takes about 1.5s on same laptop used with Manifold 8.]
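For what it is worth, the LocalOffset / LocalScale values in the JSON above encode a plain linear pixel-to-degrees mapping: 0.0002777... is 1/3600 of a degree (one arc-second), and the offsets place the first pixel centre exactly on the 172E / -44 grid node, consistent with SRTM samples being grid-centred. A sketch of the arithmetic (axis and row-direction details simplified):

```python
SCALE = 1.0 / 3600.0              # one arc-second = 0.0002777... degrees
OFFSET_X = 171.9998611111111      # LocalOffsetX from the JSON above
OFFSET_Y = -44.00013888888889     # LocalOffsetY from the JSON above

def pixel_center(col, row):
    """Map a pixel index to degrees. The offset points at the corner of
    the corner pixel, so the centre is half a pixel further in."""
    return (OFFSET_X + (col + 0.5) * SCALE,
            OFFSET_Y + (row + 0.5) * SCALE)

x0, y0 = pixel_center(0, 0)
print(round(x0, 9), round(y0, 9))   # 172.0 -44.0: the first sample is on the grid node
```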


8,410 post(s)
#14-Mar-18 00:05

Some screenshots to show what I am looking at.

Zoom box.png shows the full SRTM tile, with a blue box showing the area of detail used for the next three images.

Zoom M8.png shows the zoomed area with contours created in Manifold 8.

Zoom M9 shows the area with contours created in Manifold 9.

Zoom GM 18 shows the area with contours made in Global Mapper 18.2.

(In GM the contours were made without optional smoothing, simplification or filtering, and again from 0 to 1930m, step 15m. Execution time was roughly 20s, on the same machine used in Manifold 8 and 9.)

I've kept the formatting and shading as similar as possible between all three examples.

The Manifold 8 and Global Mapper results are very similar, though the Manifold 8 result has some tiny branches not present in Global Mapper (no complaint about that--a small blur gets rid of those).

The Manifold 9 result looks a bit different. It seems to have extra noise or artefacts. See the adjacent and linked contour rings, mainly near the centre of the screenshot.

Zoom box.png
Zoom GM18.png
Zoom M8.png
Zoom M9.png


8,410 post(s)
#14-Mar-18 00:38

Another interesting difference (maybe worth looking at) is a small cluster in the upper part of the large band of forest in the upper right of the screenshots. See the yellow box.

There are just a few pixels at 75m elevation here, which Manifold 8 and Global Mapper 18 both pick up (showing 3 small rings, almost the same though not quite), but Manifold 9 does not (no 75m contour here).

Detail image.png


4,206 post(s)
#14-Mar-18 04:59

Great data Tim, thanks!


8,259 post(s)
#23-Mar-18 09:28

There is a general note on differences between contours produced by 8 and 9 in the build notes for

A smaller note on this specific case:

The branch in the yellow box above, which 8 creates and 9 does not, is all on pixels of the same height, 75, which coincides with the contour height. The surrounding pixels are all lower; this is a flat peak. So 8 circles it and 9 doesn't. If the surrounding pixels were all higher, it would be the reverse: 9 would circle the pixels and 8 wouldn't.

As an illustration, I negated the surface and built a contour at height -75, using 8. Here are the results:

The contour at 75 on the original surface is thin black, the contour at -75 on the negated surface is wide cyan.
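The flat-peak asymmetry described above comes down to which comparison is used at the boundary: whether pixels exactly at the contour height count as inside or outside the ring. A toy illustration (not either product's actual algorithm), matching the behaviour observed here:

```python
# A flat peak: the centre sample at exactly 75, surrounded by lower ground.
neighborhood = [74, 74, 74,
                74, 75, 74,
                74, 74, 74]

def enclosed(samples, level, inclusive):
    """Would a closed ring at `level` enclose the centre of this 3x3
    patch? `inclusive` decides whether height == level counts as above."""
    center = samples[4]
    above = center >= level if inclusive else center > level
    return above and all(s < level for i, s in enumerate(samples) if i != 4)

print(enclosed(neighborhood, 75, inclusive=True))   # True: ring drawn (as 8 does here)
print(enclosed(neighborhood, 75, inclusive=False))  # False: no ring (as 9 does here)
```

Negating the surface swaps which convention captures the flat region, which is why the negated-surface experiment above reverses the behaviour.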



8,410 post(s)
#14-Mar-18 01:42

Lastly, a project with all 3 contour results together (after a bit of renaming), in case anyone is interested.

s44_e172 compare.mxb (Dropbox, 72MB).

There are noticeable differences between Manifold 8 and Manifold 9 in all flat areas. Manifold 9 is sometimes better, possibly.

Global Mapper is basically like Manifold 8--but notice the half-pixel offset between the GM contours and Manifold contours (both 8 and 9). I think that is because Global Mapper observes the distinction between Pixel-is-Point and Pixel-is-Area raster data, whereas Manifold always assumes Pixel-is-Area. (Other terms for these are grid-centred data and cell-centred data, respectively.) I believe GM is right in this case, since SRTM data is Pixel-is-Point (grid-centred)--that is why its one-degree tiles at one second resolution are 3601 x 3601 pixels, not 3600 x 3600. The TIFF metadata for the tile confirms "Pixel is Point".

But that is another matter!


8,259 post(s)
#14-Mar-18 08:13

Regarding pixel-is-point and pixel-is-area, Manifold only assumes pixel-is-area in that the parameters of the coordinate system define the shift to the corner of the corner pixel, not to the center of the corner pixel. Transforms make individual decisions, and contours in particular assume pixel-is-point.

We will take a look at the offset, it is either us or Global Mapper misregistering the image by applying the shift parameters to center instead of corner or vice versa.
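The half-pixel offset is just a difference in what the georeference origin names: for grid-centred (pixel-is-point) data it names the first sample itself, for cell-centred (pixel-is-area) data it names the corner of the first cell, and converting between the two is a half-cell shift. A sketch for a one-arc-second SRTM tile:

```python
CELL = 1.0 / 3600.0   # one arc-second in degrees

def point_to_area_origin(x_node, y_node, cell=CELL):
    """Convert a grid-centred origin (coordinates of the first sample)
    to the equivalent cell-centred corner origin: half a cell outward."""
    return x_node - cell / 2.0, y_node - cell / 2.0

# SRTM tile s44_e172: the first sample sits exactly on the 172E / -44 node.
corner_x, corner_y = point_to_area_origin(172.0, -44.0)
print(corner_x, corner_y)   # 171.9998611..., -44.0001388...: the corner registration
```

Misapplying this shift (or applying it twice) is exactly the half-pixel misregistration discussed above.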


5,119 post(s)
#14-Mar-18 04:58

Great work and great illustrations! These are really impressive.

The Manifold 8 and Global Mapper results are very similar, though the Manifold 8 result has some tiny branches not present in Global Mapper (no complaint about that--a small blur gets rid of those).

The Manifold 9 result looks a bit different. It seems to have extra noise or artefacts. See the adjacent and linked contour rings, mainly near the centre of the screenshot.

But the above is too broad an opinion to be used to draw any conclusions. The only technical conclusion you can draw from the screenshots presented is that all three results are different. Observing they are all "similar" to some degree doesn't really tell you much, nor does picking a few contours out of zillions help much in characterizing exactly how they are different from each other, at least not without a careful look at the underlying data in each case.

For example, sure, there are a) spots where GM and 8 show small contours where 9 does not, but conversely there are locations where b) 9 shows small contours where 8 or GM may not. Without a careful consideration of what the actual data is in each such spot, calling b) a case of "noise" while not calling a) a case of "noise" is expressing a pre-formed opinion, not making a technical observation.

I 100% agree that global factors such as treating data as pixel-is-area or pixel-is-point should be reckoned with explicitly, via options. Those are pretty clear base conditions, where making comparisons at the highest level helps a lot.

But further down, sorting out what is "noise" and what is the drawing of a line that the data rightfully compels should be there, can only be done by examining the data.

It also depends on what you set as the objective of the package in terms of capturing the information in the original data and preserving it in the transformation from raster to vector. If the vector lines really do capture the data, then you should be able to re-create the raster from the vectors in a reverse transformation.

How that works in practice: in the case of contour lines, somebody might call a spike (that is, a line which extends from an otherwise closed line figure) "noise." But if that line correctly represents a col effect on either side, you need that stub contour line to correctly re-create the raster surface from the vector lines in a reverse vector-to-raster transformation. If you remove that line because you don't like spikes, you have changed what the vector data says about the form, the undulations, of the surface. Re-creating the raster from vectors in which such spikes have been removed will result in a surface that is different from the original, in that it will be missing the col.

9 deals with many such cases that 8 did not, which I think results in more accurate contours. But accurate contours are not always the objective: people might prefer pretty contours (in the sense of being smoother, more continuous, or seeming more orderly) to accurate ones. If GM is indeed more similar to 8 when you look at various cases in detail, that might (don't know... just saying "might") indicate that GM, like 8, is not as accurate as 9.


8,259 post(s)
#14-Mar-18 08:05

Thanks a lot for the detailed description with illustrations and data.

We'll look into what is going on.

We have been constructing test cases and looking into the code yesterday and I believe we identified one issue which we should fix. After we do this, we will look into the differences between contours produced by 8 and 9 on this data and on several other data sets, to check the specifics in each case and make adjustments as necessary.


3,097 post(s)
#13-Mar-18 13:16

I reran my data as well. I went from 1524s to 182s, roughly an 8.4X improvement. However, when looking at the ArcGIS and Manifold results, you can see that ArcGIS (black lines) includes more vertices and may give a more natural-looking contour.

So, like Tim, I think there are some issues that still must be dealt with.

However, this is zoomed in pretty far, so at this point, it is all estimation by both software products and neither version is "correct". They are just approximations.

I wrote an update on my blog here.



8,259 post(s)
#13-Mar-18 14:45

For what it is worth, we have been getting even higher performance increases on that particular data set (from 1500+ sec to 95 sec instead of "just" to 182 sec). That might be related to the fact that you were running the test using the Viewer right after a big import that shook up a lot of memory, while we were running the test using a fresh instance of 9 with a MAP file created by a prior session.

Regarding the difference in results, indeed that might be both versions being correct with respect to their raster models. We might obviously include support for different models, but we think it would be even better if we allowed, say, interpolating or otherwise pre-processing the raster as an intermediate step before building contours (or anything else). That way, the user would have a very flexible way to make the raster richer / smoother / whatever, with as many parameters as desired. We have some of that already, that can and perhaps should be extended.


3,097 post(s)
#13-Mar-18 14:48

That is great. And yes, I raised that question about the Viewer here; I'm glad to have you confirm it.

Mike Pelletier

1,542 post(s)
#13-Mar-18 20:07

Speed is nice but quality is really important. The right tool for the job is what matters so yes, please make the tool great with those pre-processing steps. Add quick ways to change parameters and view results as well. That will save tons of time and lead to a job done right. Sorry for this cheer leading post, but trying to balance all talk about raw speed of processes :-)


5,119 post(s)
#14-Mar-18 05:10

Speed is nice but quality is really important. The right tool for the job is what matters so yes, please make the tool great with those pre-processing steps.

You have to be careful when you use the word "quality" because that means different things to different people. Is it an improvement in "quality" when you remove details to make something prettier?

For example, do you consider a vector representation that accurately represents the original raster to be a higher "quality" thing than one which does not? A test of that is whether you can recover the original raster from the vector using a vector-to-raster transformation.

By "quality" some people mean a prettier vector, not a more accurate one. If you interpolate a raster before vectorizing it, the result of the raster-to-vector operation is a vector representation of an interpolated surface, not the original surface. That is often significantly prettier than an accurate vector representation because contours can be smoother and more orderly. Interpolation cannot make a surface more accurate by adding details that are not in the original data, but just like applying a blur to a photograph, it can remove details that are there.

For the same reason photographers will use "soft focus" to blur out pimples and other undesired, but genuine, details in portraits, you can apply a variety of techniques to elevation data to get prettier, but less accurate contour lines. I suggest that being able to do such things is great, a wonderful part of the toolkit, but I would respectfully suggest that the basis for having them is artistry, and not call it "quality".

Mike Pelletier

1,542 post(s)
#16-Mar-18 14:19

Well said Dimitri. My intent was simply to encourage a little focus on more tools (the right tool creates the desired outcome, or quality) rather than on raw speed and a clean interface. I'm sure that will happen after 9's infrastructure gets fully developed, but the current process of dabbling in various areas (like style) leaves us wondering a bit :-)

StanNWT
89 post(s)
#15-Mar-18 18:53

Hi Adam,

Might it be a good idea, for demonstration purposes, to use that same BOEM Gulf of Mexico multibeam bathymetry you reference in your YouTube video, to illustrate the speed of working with large DEMs and also the speed of contouring with Manifold?

I contoured the east and west TIFF images on my home laptop and it accomplished the task in ~49 seconds and ~39 seconds respectively on the TIFFs where the elevation values are in feet.

My laptop is a Windows 10 Pro Dell XPS 15 circa 2014 (Core i7 4712HQ, 16 GB DDR3 1600 MHz RAM, 1 TB 5400 RPM hard drive with a 32 GB mSATA cache drive, Nvidia 750M). The source files in the Manifold project document were on a Corsair GTX 256 GB thumb drive, whose specs state 450 MB/s read and 350 MB/s write speeds, but I usually see only a little over 100 MB/s when unzipping those BOEM files using WinZip 18.

If you pick as a data source the pre-existing Esri created SHP file contours and compare the results of the contouring with what Manifold can do you will get a feeling for the accuracy, precision, quality, etc., of what Manifold does vs. what ArcGIS does.

You can see at the bottom of the page the download files:

  • (Size: 449 MB) – Eastern half of the bathymetry grid. Depth in feet.
  • (Size: 497 MB) – Eastern half of the bathymetry grid. Depth in meters.
  • (Size: 573 MB) – Hillshaded bathymetry of the Eastern grid. Vertically exaggerated by 5x.
  • (Size: 869 MB) – Western half of the bathymetry grid. Depth in feet.
  • (Size: 501 MB) – Western half of the bathymetry grid. Depth in meters.
  • (Size: 572 MB) – Hillshaded bathymetry of the Western grid. Vertically exaggerated by 5x.
  • Gulf_Bathymetric_Contours.lpk (Size: 363 MB) – ESRI ArcMap layer package containing bathymetric contours of varying intervals.
  • (Size: 527 MB) – Zipped shapefiles for bathymetric contours of varying intervals.

Personally, I want to take my 30 m USGS NED DEM and generate contours on it. It's 140.61 GB in size. The bounding box is (Top 83.0016106799 N, Left -180 W, Right -88.9984386585 W, Bottom 50.9966324193 N). The data set is 327,603 x 115,217 pixels. Cell size is (0.00027778, 0.00027778) degrees (x, y). You can see the coverage in the attached JPG. This data set would really push the contouring transform nicely. I'm getting a new workstation at work, hopefully online by May, and I'll try to push it then. There are around 1,800 one-arc-second tiles in that data set.



8,259 post(s)
#16-Mar-18 07:50

That's a good pointer, thanks.

We will check the contours produced by our code vs the pre-existing ones produced by ESRI, like you say. We can perform such comparisons on any data, but in this case, since the contour files have been published, they are kind of "official", which makes comparing against them more valuable.

It might indeed be a good idea to do a demo of contours on the data in general - perhaps after we have easy means to merge West and East together.

When you get around to testing contours on your big data set, please consider reporting the results on the forum; we are interested in how it will go. The next build will contain several relevant optimizations and additions. We do test against synthetic data sets of similar size, but there are frequently important insights which you can only get from someone else's data.


5,119 post(s)
#16-Mar-18 09:21

Might it be a good idea, for demonstration purposes, to use that same BOEM Gulf of Mexico multibeam bathymetry you reference in your YouTube video, to illustrate the speed of working with large DEMs and also the speed of contouring with Manifold?

I contoured the east and west TIFF images on my home laptop and it accomplished the task in ~49 seconds and ~39 seconds respectively on the TIFFs where the elevation values are in feet.

As impressive as it is that your laptop could do that so quickly, anything that takes more than a few seconds is not right for a video. Videos are necessarily mass market, which means they must be created for highly impatient attention spans.

So, if Manifold can do something in 39 seconds that requires 39 minutes in a different package, well, that's wonderful when you read it in text as in this post, but 39 seconds is an eternity on video. The average YouTube visitor isn't going to stick around while nothing happens for 39 seconds.

Some videos cheat that by saying "oh, let's pause until this is done..." and then come back, but we do not like doing that in Manifold, where we prefer whenever possible for people to see the true, authentic effect as it happens in real time.

It would be great, by the way, if you could report analogous timings on your laptop with Arc. Please report all the settings and workflow in both cases so apples to apples comparisons can be made.

By the way, I got a kick out of this...

My laptop is [...] 16GB DDR3 [...]

... It's 140.61 GB in size.

... Ambitious! :-) For the first few trials on your laptop, I would recommend starting with a part of the data set and then scaling up, so you know when to launch the job at the end of the day to leave it cooking overnight.

I have to admit to being curious... what is the use case, the end need, for creating contours on all of Alaska and a big part of Canada all at once?

Also, as Art Lembo has pointed out in other posts... Until you get your new workstation, I'd recommend getting an inexpensive 3 TB external hard disk for extra space on your laptop. Those have become very inexpensive. Run it over USB 3.0 and it will be faster than a thumb drive, with plenty of extra space as well.

For your new workstation, get a Ryzen with lots of cores, maybe even a Threadripper with 32 cores if you can swing it. :-)

By the way...

I contoured the east and west TIFF images on my home laptop and it accomplished the task in ~49 seconds and ~39 seconds respectively on the TIFFs where the elevation values are in feet.

... what settings for contours did you use?

I just tried on a really old and slow Core i7 with 24 GB running Windows 10, with data on an external hard disk, making contours from -3300 to 0 with a step of 300:

Transform (Contour Areas): [BOEMbathyW_m] (28.559 sec)

Transform (Contour Areas): [BOEMbathyE_m] (46.007 sec)

All 8 hypercores were busy, but of course with more memory, some SSD and so on, it would be much quicker.


5,119 post(s)
#16-Mar-18 12:25

Just to follow up...

I contoured the east and west TIFF images on my home laptop and it accomplished the task in ~49 seconds and ~39 seconds respectively on the TIFFs where the elevation values are in feet.

Oops... I forgot to ask... Did you contour areas or contour lines?

Here are some updated numbers. I ran contour areas and contour lines on an old Intel Core i7 machine with 24 GB of RAM and also on an old AMD FX machine with 16 GB of RAM. Neither had SSD but instead was running on plain, old slow hard disk. Both were running Windows 10, and both computed contours from -3300 to 0 in steps of 300:


AMD FX:

Transform (Contour Areas): [BOEMbathyW_m] (13.797 sec)

Transform (Contour Lines): [BOEMbathyW_m] (8.422 sec)

Transform (Contour Areas): [BOEMbathyE_m] (23.626 sec)

Transform (Contour Lines): [BOEMbathyE_m] (16.157 sec)

Intel i7:

Transform (Contour Areas): [BOEMbathyW_m] (28.559 sec)

Transform (Contour Lines): [BOEMbathyW_m] (14.532 sec)

Transform (Contour Areas): [BOEMbathyE_m] (44.531 sec)

Transform (Contour Lines): [BOEMbathyE_m] (27.763 sec)

An AMD FX has eight real cores while the Core i7 has four cores that can be treated as eight hypercores. What is interesting about the above timings is that if the task is big compared to the system RAM available then they are affected by Windows caching of disk read/writes. To get the above numbers, I ran each trial twice in immediate succession. The first one, while Windows was sorting out cache allocated to other things, was usually significantly slower.

Anyway, getting the biggest number, contour areas on the larger of the two data sets, down to 23 seconds on a sub-$100 CPU is pretty good, especially considering that only 16 GB of RAM is an absurdly small amount these days. I was surprised to see so little on that machine; it's just a spare that sits in a corner somewhere on our local net, and nobody had noticed it has so little RAM.

StanNWT
89 post(s)
#17-Mar-18 04:29

My new workstation is a Dell 7820. Dual Intel Xeon hexacore 3.4 GHz. 128 GB 2666 MHz DDR4 RAM. Dual Quadro P4000 graphics cards. A 2 TB Samsung 960 Pro M.2 will be the boot drive, plus an extra 1 TB Dell Class 40 OPAL 2.0 M.2 drive which will be delivered as the boot drive, but I'm going to switch that to the Samsung one. Intel 550x dual 10GbE. That should make Manifold 9 scream. Our organization requires Intel vPro and an Intel NIC; otherwise I'd have targeted a Threadripper 1950X due to more PCIe lanes.

Perhaps I'll be able to get the Areca 12-drive Thunderbolt 3 RAID, but for now DROBO 5Ds will have to do. I've got tons of space. Our organization tries to force everything to be network driven, but GIS that isn't based on WMS, WFS, WCS, etc. is impossibly slow under normal circumstances, let alone at the 3.5 MB/s network throughput I get from my remote site to the corporate datacenter. Yes, I live in the middle of nowhere. I'd love to have a 10GbE high-end NAS RAID, or a good server with Fibre Channel connected RAID via 10GbE, but that's an expensive pipe dream.

StanNWT
89 post(s)
#17-Mar-18 04:15

I contoured the data using lines, with an interval of 100 ft. Yes, I used the TIFFs for depths in feet; I normally use meters but wanted to try the feet. My range was -11000 to 0. I did this for both data sets. The large 140 GB DEM is in an Esri File Geodatabase on my DROBO 5D at work. The source DEM TIFF tiles are stored on a DROBO 5D connected via USB 3.0, with 10.49 TB of storage and 7.5 TB free. All my Manifold projects are stored on a separate DROBO 5D with 21 TB of storage and 18 TB free. I'm thinking of trying to get the Areca 12-drive Thunderbolt 3 RAID so I'd have around 2 GB/s throughput on my new workstation.


5,119 post(s)
#17-Mar-18 07:08

The large 140 GB DEM is in an Esri File GeoDatabase on my DROBO5D at work.

You could leave it in the gdb, but of course that would be significantly slower than bringing it into a .map.

Sounds like you have some great hardware in your future! :-)


8,259 post(s)
#20-Mar-18 15:36

Well, there is no good way to read raster data out of GDBs, so this has to be converted to some other format currently.


8,259 post(s)
#13-Mar-18 07:20

Here is where those spikes come from:

If you contour the above using 5 as the height, there will be a diagonal spike in the center.

Spikes usually appear when contouring data whose heights have been forced to integers; a common recipe is to produce contours at non-integer heights. Spikes can be removed in postprocessing using one of the Normalize transforms. All contouring algorithms produce spikes; if the output of a particular algorithm does not have them, this means they have been removed, either as the last step of the algorithm or possibly a little earlier.
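One way to see where the diagonal spike comes from: with linear interpolation, an edge whose endpoint equals the contour level "crosses" exactly at that corner, so a cell with two diagonal corners at the level traces a degenerate diagonal segment. A sketch (the cell values are invented to match the description above):

```python
def crossing(h0, h1, level):
    """Position (0..1 along an edge) where the contour crosses, by
    linear interpolation; None if the edge does not straddle the level."""
    if h0 == h1 or not (min(h0, h1) <= level <= max(h0, h1)):
        return None
    return (level - h0) / (h1 - h0)

# A cell whose two diagonal corners sit exactly at the contour level 5:
#   5  4
#   6  5
edges = [(5, 4), (4, 5), (5, 6), (6, 5)]
print([crossing(a, b, 5) for a, b in edges])      # [0.0, 1.0, 0.0, 1.0]: every
# crossing lands exactly on a corner, tracing a diagonal spike through the cell.
print([crossing(a, b, 5.001) for a, b in edges])  # the 4-to-5 edges no longer cross
```

This is also why the non-integer-level recipe works: at 5.001, no crossing can coincide with a corner of integer height.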



8,410 post(s)
#13-Mar-18 07:43

Given the hour, I'm going to defer understanding these replies properly until tomorrow.

Meanwhile I have uploaded the combined South Island SRTM data to Dropbox.

NB FP32 but with original values unchanged, except that all invalid heights are replaced by -32767. Manifold 8 file format.
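The invalid-height replacement described above can be sketched like this (my own sketch; the plausible-elevation bounds are assumptions for illustration, not the actual criteria used):

```python
# Sketch: replace invalid SRTM heights with the nodata sentinel -32767,
# leaving valid values untouched. The validity bounds below are assumed
# for illustration only.
SRTM_NODATA = -32767

def clean_heights(values, valid_min=-500.0, valid_max=9000.0):
    """Map anything outside the plausible elevation range to nodata."""
    return [v if valid_min <= v <= valid_max else SRTM_NODATA
            for v in values]

print(clean_heights([12.0, 32768.0, 431.5]))  # [12.0, -32767, 431.5]
```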


8,410 post(s)
#13-Mar-18 07:51

But yes in real life I would always apply a slight Gaussian blur before contouring. You are both right. (Even if I must wait till morning to understand exactly how.)
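The "slight Gaussian blur before contouring" step can be sketched as a small separable filter in pure Python. This is a sketch only; a real workflow would use the GIS's own raster transforms or something like scipy.ndimage.gaussian_filter:

```python
# Sketch of a slight Gaussian blur applied to a DEM before contouring.
import math

def gaussian_kernel(sigma, radius=None):
    """Normalized 1D Gaussian weights over [-radius, radius]."""
    radius = radius if radius is not None else max(1, int(3 * sigma))
    weights = [math.exp(-0.5 * (i / sigma) ** 2)
               for i in range(-radius, radius + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def blur(grid, sigma=0.7):
    """Separable Gaussian blur with edge clamping."""
    k = gaussian_kernel(sigma)
    r = len(k) // 2
    rows, cols = len(grid), len(grid[0])
    def clamp(v, lo, hi):
        return max(lo, min(hi, v))
    # horizontal pass
    tmp = [[sum(k[j + r] * grid[y][clamp(x + j, 0, cols - 1)]
                for j in range(-r, r + 1)) for x in range(cols)]
           for y in range(rows)]
    # vertical pass
    return [[sum(k[j + r] * tmp[clamp(y + j, 0, rows - 1)][x]
                 for j in range(-r, r + 1)) for x in range(cols)]
            for y in range(rows)]

dem = [[0, 0, 0], [0, 10, 0], [0, 0, 0]]   # a single noisy spike
smoothed = blur(dem, sigma=0.7)
print(smoothed[1][1] < 10)                  # True: the spike is damped
```

Damping single-cell noise like this keeps the contours from wrapping tiny artifacts, which is the point of blurring first.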


6,247 post(s)
#13-Mar-18 08:32

Reading a DWG is VERY fast.

I'd like to see hatches imported as areas, too.

AllainR11 post(s)
#14-Mar-18 00:41

I tried to import a new AutoCAD 2018 sample (attached) but could not see anything. I tried both the Viewer and the cutting edge Manifold System.

Would you mind trying it please?

Thank you,

Drive shaft.dwg


1,668 post(s)
#14-Mar-18 01:32

No luck in M8 or M9 with the same cutting edge build:

2018-03-14 14:30:38 *** Unknown file format.

ArcMap 10.4.1 and Global Mapper 17 import it fine.

Landsystems Ltd ... Know your land |


5,119 post(s)
#14-Mar-18 05:13

How did you try importing it? Manifold often provides different ways to import the same thing. Did you try all of them? For example, have you tried importing it using Manifold's GDAL dataport?


6,247 post(s)
#14-Mar-18 07:14

I have checked for errors in ACAD 2013. None.

The current ACAD file versions are "unknown file formats".

From ACAD 2013 saved as

  • ACAD 2000.DWG
  • ACAD 2000.DXF
  • ACAD R14.DWG

These imported, but the import missed the parts of type ACAD_PROXY_ENTITY.

But - hey - a hatch imported as an area, which I missed yesterday in another import of an ACAD 2000.DWG.

In ACAD I exploded the entities of type ACAD_PROXY_ENTITY and saved as ACAD 2000.DWG (attached)

This imported without problems.

The import in Mfd8 has errors in the hatch that are corrected in Mfd

Drive shaft2000.dwg

AllainR11 post(s)
#14-Mar-18 11:20

Thank you all for testing that file!

Trying different import methods is a good idea, but for a specific file type I think it would be better if we could just select the file and get it in.

Thank you all again,

234 post(s)
#15-Mar-18 01:53

The DWG file also opened in ArcGIS 10.2.2, Global Mapper 15, MapInfo v12, and LibreCAD (all 32-bit versions), and in DraftSight and MSCAD 2018 (64-bit versions).

Granted, some of the drawings were ugly, but no errors were encountered.

Just Remember, You are unique, just like everybody else!


5,119 post(s)
#15-Mar-18 06:45

Granted, some of the drawings were ugly, but no errors were encountered.

Could you clarify?

For example, do you mean all packages imported all objects exactly identically, but that some packages used formatting you found ugly?

Or, do you mean no packages reported errors during import, but that the drawings created by different packages were different in that the objects created were different?

In either case, which results to your mind were ugly and why do you consider those ugly?


5,119 post(s)
#15-Mar-18 08:21

I'd be interested in hearing which were "ugly"... but, in the meantime I tried a few myself.

1. LibreCAD imports something, but apparently not completely as compared to others:

2. A free resource from a well-respected CAD company, Bentley, is their free CAD viewer. Anyone willing to register with personal details, and willing to burn about a gigabyte of space, can download it and use it for free. What is useful about this is that Bentley utilizes the paid RealDWG library that AutoDesk licenses for about $5000 the first year and $2500 per year thereafter. So... in theory this should produce the same results as everybody else who uses RealDWG:

As you can see from the above screenshot, which is clearly an incorrect import, even using AutoDesk's own code within a package from one of the world's most experienced and most respected CAD companies is no guarantee the import will be accomplished correctly.

3. AutoDesk's own online viewer, a useful "gold standard," produces the following:


Anyway, my point with the above is not just which packages can "open" a file and "import" it, but which managed to import it correctly. If anything, many people would prefer no import at all to one which imports data with numerous errors that are difficult to detect, and which then propagate through workflows and projects.

I therefore respectfully ask all contributors when remarking if something does or does not import a specific dwg to clearly state whether the import is accomplished accurately, or if there are any errors in the import such as some objects not being imported, geometry being imported incorrectly, etc.



1,886 post(s)
#15-Mar-18 09:16

CADWizz Ultra is a useful viewer/converter which opens the original dwg correctly, other than any labels if they are in the original.

Aussie Nature Shots


6,247 post(s)
#15-Mar-18 09:29

It's not easy to say more than "ugly" if you have no clue about what the ACAD file should look like.

But like Manifold Viewer, AutoDesk has a free viewer for DWG files too, and similar to Manifold Viewer it allows you to check the visibility state of layers (very important; this should be reflected in the layer bar of the imported map) and the types of objects.

Ask Google for the latest version (2018) of DWG TrueView. You can at least see what's missing from the import.

Most important: you can convert new ACAD formats and downgrade to ACAD 2000, which is the latest format Mfd 8 + 9 can import, with the restriction of some entity types AFAIK.


5,119 post(s)
#15-Mar-18 12:53

Ask google for the last version (2018) of DWG TrueView.

Thanks for the tip! I knew AutoDesk had a viewer but somehow jumped to AutoCAD Viewer, which is their online thing that kept crashing.

I'll download DWG TrueView and see what that does. It's easy to download, since the installation for the viewer is a mere 790 MB for 64-bit Windows. :-)


5,119 post(s)
#15-Mar-18 13:40

Here is the dwg in AutoDesk DWG TrueView 2018...

Thanks for the tip, Klaus!


234 post(s)
#16-Mar-18 04:51


My apologies for using the word ugly.

I should have said that some of the results were less than desirable, and in some cases incomplete.

The included images depict the results of my opening of the files with various software packages I listed above.

(I would have included the images in the body of this message, but for some reason the "Insert Image" button does not seem to be working for me.)

ArcMap 10.2.2 is nearly six years old.

MapInfo 12 is 5 years old.

Global Mapper 15 is six years old.

I got the same results as displayed above with LibreCAD.

I do not have screen shots of the results with Draftsight, or with MSCAD, but the drawings matched, and were verified on a system loaded with AutoCAD C3D.

I am of the belief that if a software package cannot open or import a specific type of file, that file type should not be included in the supported file types list.

I am also of the belief that if a software package is only able to import or open a specific type of file up to a certain version, that information should be included in the dialog.

Just one fool's opinion.


Just Remember, You are unique, just like everybody else!


6,247 post(s)
#16-Mar-18 07:24

None of the programs can claim to import the file acceptably. Each of them has its deficiencies (ArcMap 10 misses the hatch in the structure on the right).

So the problem is obviously rooted in AutoDesk's policy of "promoting" each new version with incompatible and inessential additions. I had never before met an ACAD_PROXY_ENTITY.

As there are alternatives, I press every associate to deliver R14.DXF files if they want to use ACAD as an exchange format. And every one of them has had this experience before. Modern ACAD formats are no standard!

They all stick to AutoCAD because they have invested so much in training staff to tame the beast. And the engineers themselves are rarely competent to correct the little typo that slipped into the last version, so they enhance their status through access to a draftsperson. Those are the two reasons why this business model is taking so long to die off.


1,264 post(s)
#19-Mar-18 16:20

ACAD_PROXY_ENTITY is a workaround AutoDesk uses in basic AutoCAD to allow the more complex drawing features created by the specialist versions of AutoCAD. A common example would be opening a drawing file prepared in Civil 3D 2018 with plain AutoCAD 2018: if the Civil 3D 2018 object enabler is installed in AutoCAD 2018 (usually a separate install), the Civil 3D model features (TIN, surface, contours, road design features, etc.) can be viewed but not modified. If the object enabler isn't installed, even basic AutoCAD can't display these kinds of features.

When I need to bring in such features to another software package I have to use Civil 3D to convert the objects, typically contours, to basic line contour features that can be imported without issue.


519 post(s)
#31-Mar-18 21:03

TrueView is 750 MB. I tried Serif DrawPlus exporting to AutoCAD in DXF and DWG format for several AutoCAD versions (2000, 2004, R12); not all formats are supported by Manifold 9. The import of *.dwg 2004 and *.dwg R12 fails.

Only the *.dwg R12 file shows nothing when opened in TrueView 2018.



join image "Because my dad promised me" interstellar from Manifold: Time by Stephen Baxter. power Math destruction


6,247 post(s)
#31-Mar-18 22:26

None of the files contains any objects. They are all empty when opened in AutoCAD Map 3D 2017.


8,259 post(s)
#16-Mar-18 08:06

I think Dimitri just wanted to know which converters did a good job vs which did a bad job + how to tell a good job vs a bad job (it is not always obvious).

We hear you on the deficiencies of CAD imports. We completely agree that importing CAD files works worse than importing almost anything else. This is a constant annoyance to our users, who have to jump through hoops (converting to older formats, converting complex shapes into simpler ones) to try to make sure their data survives. There are reasons for this mess outside of our control. However, we are looking for ways to improve things, and have some ideas.


8,259 post(s)
#20-Mar-18 15:40

An update:

We will likely issue tomorrow. It will contain several improvements for contours among other things, based on feedback (thanks!). We'd like to then issue a public build in a few days and proceed to the next series of builds (there is a small stash of new features for it already which we are holding back because we want to issue the public build first).


514 post(s)
#20-Mar-18 16:14


Is there a general road map for Manifold 9 new features that can be shared with the forum? No problem if there isn't.


How soon?

522 post(s)
#20-Mar-18 17:38

And by general, I assume he means no details but more of a topical road map...without dates. I used to buy software when I was in the Air Force and had to answer this question before getting funded. There's no way to hold a programmer to a date, but there is always a big picture road map of modules to the end point. Once a module had been introduced then we could discuss details of that module. Whether you are willing to share such a road map is, of course, up to you.

It was very refreshing to watch the enhancement of the contour algorithms in real time. Although I did not check the logs to measure it, I could feel the speed improvement in my smaller .dem files with the 165.5 update.


5,119 post(s)
#20-Mar-18 18:45

See the 9 FAQ page, some comments there. For more real time, the discussions in this forum are pretty good. As remarked below, there is a lot of "community driven" to what takes priority on the short list.

It was very refreshing to watch the enhancement of the contour algorithms in real time.

More coming soon. :-)

578 post(s)
#20-Mar-18 22:37

On the subject of free AutoCAD viewers, DraftSight by Dassault is the one that I use. It is actually a fully AutoCAD-compatible 2D CAD program. I use it to clean up drawings before pulling them into QGIS. Yes, QGIS. Pulling AutoCAD drawings into Manifold was always rotten, as it made a drawing for each layer. QGIS makes one layer for all the polylines, and the layer name in AutoCAD becomes an attribute in a column. Strangely, that fits the "everything is a table" mantra much better than how M8 handled things.

Manifold User Community Use Agreement Copyright (C) 2007-2017 Manifold Software Limited. All rights reserved.