Home - Cutting Edge / All posts - Manifold System

9,480 post(s)
#12-Aug-19 16:03

Here is a new build.

SHA256: d0520025a07a9099f6f5e1b4440a43f5bf116e54463e4aa5f2f38257ef7d960d

SHA256: 4e05455b5a42483770661400ae85484d1ecfa52ee634de968c90ad8519f6dca4


9,480 post(s)
#12-Aug-19 16:05


Folders in the Layers pane are toggled independently from layers. A layer is considered to be visible when it is turned on and all folders it is in are also turned on.
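For anyone scripting against this, the rule can be sketched in plain Python (illustrative only, not Manifold object-model code; `layer_visible` is my name):

```python
# Sketch of the visibility rule above: a layer renders only when its own
# toggle and the toggles of every folder containing it are all on.
def layer_visible(layer_on, containing_folders_on):
    return layer_on and all(containing_folders_on)

layer_visible(True, [True, True])   # visible
layer_visible(True, [True, False])  # hidden: one containing folder is off
```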

The drop-down menu for favorite coordinate systems marks coordinate systems which override metrics with a trailing '#' symbol.

(Fix) Tables with more than 10 fields no longer display with wrong field order in a table window.

Shift-clicking (instead of Ctrl-Shift-clicking) a field header in a table window no longer sorts records. (Supported clicks after the change: Ctrl-click = sort table on clicked field replacing previous sort order, Ctrl-Shift-click = add clicked field to existing sort order.)

Invoking View - Zoom to Native in a map window for an image adjusts zoom center to avoid smoothing image pixels. Resizing a window might need a re-adjustment.

Invoking View - Zoom to Native in a map window for a drawing, labels or map adjusts zoom center to make entered coordinates whole numbers (the adjustment is opposite to the one needed to avoid smoothing pixels for images). Resizing a window might need a re-adjustment.

New query functions:

  • TileGeomToValues - takes an image and a geom, returns a table with all pixels under geom (fields: X, Y, Value).
  • TileGeomToValuesX2 / X3 / X4 - variants of TileGeomToValues that return pixel values as an x2, x3, x4 (for images with multiple channels).
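As a rough illustration of what TileGeomToValues returns (a plain-Python sketch, not Manifold SQL; `pixels_under_rect` and the rectangle-only "geom" are simplifications of mine), each pixel falling under the geom yields one record with fields X, Y, Value:

```python
# Conceptual sketch of TileGeomToValues: for each pixel of the image
# whose center falls under the geometry, emit one (X, Y, Value) record.
# A simple axis-aligned rectangle stands in for an arbitrary geom.
def pixels_under_rect(image, rect):
    """image: 2D list of pixel values; rect: (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = rect
    records = []
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            # pixel center is at (x + 0.5, y + 0.5)
            if xmin <= x + 0.5 <= xmax and ymin <= y + 0.5 <= ymax:
                records.append((x, y, value))
    return records

image = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 9]]
pixels_under_rect(image, (0, 0, 2, 2))  # pixels (0,0), (1,0), (0,1), (1,1)
```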

The LAS dataport optimizes spatial searches without thinning that use big rects, and returns records progressively (instead of first collecting all records to return).

Snapping to coordinates in a map window performs faster on data sources that support data thinning (MAP, LAS).

The object model supports expression contexts:

  • Schema.Constraint has a new property: ExpressionContext (string, read / write).
  • Schema.Field has a new property: ExpressionContext (string, read / write).
  • Schema.AddConstraint has a new optional parameter for expression context (string).
  • Schema.AddFieldComputed has a new optional parameter for expression context (string).

The object model supports searching for records via an r-tree index with thinning:

  • Table has a new method: SearchBatchRTreeThin which takes the name of an r-tree index, search keys (ValueSet with a rect for the indexed geometry field), array of fields to output, X and Y size of a grid to use for thinning (eg, window size), and a boolean flag allowing (true) or disallowing (false) to reduce the number of coordinates in returned geometry values. Thinning is currently implemented by r-tree indexes in MAP and LAS files. If an index does not support thinning, SearchBatchRTreeThin works as SearchBatchRTree.
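The thinning idea can be sketched outside Manifold in a few lines of plain Python (illustrative only; `thin_to_grid` is my name, and a real r-tree index thins during the search rather than over a flat list): bucket candidate records into an nx-by-ny grid, e.g. the window size, and keep at most one record per cell:

```python
# Conceptual sketch of the thinning described for SearchBatchRTreeThin:
# given candidate points in a search rect, keep at most one point per
# cell of an (nx x ny) grid, so a render never fetches more records
# than it has pixels to display them in.
def thin_to_grid(points, rect, nx, ny):
    xmin, ymin, xmax, ymax = rect
    cell_w = (xmax - xmin) / nx
    cell_h = (ymax - ymin) / ny
    kept = {}
    for x, y in points:
        cell = (min(int((x - xmin) / cell_w), nx - 1),
                min(int((y - ymin) / cell_h), ny - 1))
        kept.setdefault(cell, (x, y))  # first point seen in each cell wins
    return list(kept.values())
```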

(Changes to the API doc reflecting the additions above are coming.)

Cutting edge builds expire on the first day of the third month following the build month (for example, this cutting edge build will continue working during August, September and October, and will expire on the 1st of November). One month or less before the expiration date, starting a cutting edge build will open the log window and output: "This cutting edge build is going to expire soon. Check for newer cutting edge builds on". After the expiration date, attempting to start a cutting edge build will fail with: "This cutting edge build has expired. Check for newer cutting edge builds on".
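The expiration rule works out as follows (plain-Python sketch; `expiration_date` is my name):

```python
# Sketch of the expiration rule: a build made in month M expires on the
# 1st of month M+3 (an August build runs through October, expires 1 Nov).
import datetime

def expiration_date(build_date):
    month = build_date.month + 3
    year = build_date.year + (month - 1) // 12
    month = (month - 1) % 12 + 1
    return datetime.date(year, month, 1)

print(expiration_date(datetime.date(2019, 8, 12)))  # 2019-11-01
```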

(Fix) Registering the ODBC driver for cutting edge builds no longer fails to use the 'experimental' name for the driver.

The ODBC driver uses a 4-component version instead of a 3-component one. (This is an old limitation that comes from the times when we did not support ODBC 3.x completely. We now do, so the limitation no longer applies, and we can assign the ODBC driver the exact same version as the product.)

End of list.


1,738 post(s)
#12-Aug-19 22:04

I'm very interested to see the TileGeomToValues family of functions and would welcome any examples. Can these also be used to update the pixels under a geom in a performant way or is this something for a future function?

Also just a comment about the cache optimisations in the previous build. I suspect that it was something to do with these or other changes, but I went from only being able to generate watercourses/watersheds on relatively small DEMs to being able to generate watercourses/watersheds on SRTM data for the whole of the New Zealand mainland in 10 mins and 20 mins respectively on my 5-year-old machine. Amazing work!

Landsystems Ltd ... Know your land |


9,480 post(s)
#13-Aug-19 12:29

An example for TileGeomToValues - take an image, overlay a drawing with areas, return 3 highest pixels under each area:



-- take all pixels, return n highest
FUNCTION highestCoords(@t TABLE, @n INT32) TABLE AS (
  SELECT * FROM @t ORDER BY [Value] DESC FETCH @n
) END;

-- prepare to convert coordinates from drawing to image
VALUE @conv TABLE = CALL CoordConverterMake(
  ComponentCoordSystem([german_alps]), -- to
  ComponentCoordSystem([sample_areas]) -- from
);

-- select 3 highest pixels in german_alps under each object in sample_areas
SELECT [mfd_id], SPLIT CALL highestCoords(
  CALL TileGeomToValues([german_alps], CoordConvert(@conv, [Geom])), 3)
FROM [sample_areas];

Regarding writing back into a raster, TileGeomToValues[Xn] functions are not really suited for that. What are you looking to do specifically, and what do you want done in the case of conflicts (pixels covered by more than one object)? We have a couple of ideas of our own, it's all completely doable, but we'd like to hear about uses first.

The performance boosts for watersheds that you see are pretty likely related to cache optimizations, indeed. Thanks for your kind words. :-)

atrushwo
74 post(s)
#13-Aug-19 17:43

Interesting example. A nice succinct answer to the transfer highest coordinate thread. I too am interested in a Vector to Raster function. Basically you would use an area to modify a raster within the area's extents.

An example of this would be for urban catchment delineation. One of our issues is with the quality of our full-feature to bare-earth conversion. Often things like canopies or parked vehicles distort the curb and gutter, which is the primary flow path. To get around this, we depress the roadway by the height of the curb. This gives us more representative streamlines and routes runoff towards catch basins.

Another example of this would be for two dimensional flow modelling. To make the modelling software work, we have to use a bare earth raster. However, we all know that flow cannot travel through buildings. Our workaround is to raise the building footprints a few metres such that our software routes flows around the raised buildings.

In Manifold 8, we would accomplish these tasks by transferring the selection from the vector layer to the raster. We would then use the Transform menu to raise/lower the selected pixels. The transfer selection step essentially eliminates the issue of many-to-one conflicts because you can only select the same pixel once. Not sure what the best method of accomplishing this in M9 would be...


1,738 post(s)
#14-Aug-19 02:42

Hi Adam,

As Andrew rightly points out, in M8 this largely meant either transferring a selection from another component to an image or creating a freehand selection in an image and updating the selected pixels in some way. To me, this was one of those magical functions which I have used innumerable times and which set Manifold 8 apart from the pack. It is something that I have long hoped would be added to 9, and perhaps extended to include both freehand selections and updating pixels using other components and selections within these components (either with a selection flag or with values).

By way of examples, I have used the M8 functions (in conjunction with M8's 'Selections' pane tools) to build proposed coastal marine structures into LiDAR DEMs using architects' CAD drawings. This is typically to create the base data for hydrodynamic modelling software to simulate the effects of proposed structures upon currents and the patterns of sediment erosion/deposition.

I also use it to:

- Insert or remove flood control structures in an 'as-built' state into LiDAR data.

- As part of a process to estimate the floodplain extent of rivers from LiDAR.

- Assess river flood inundation in response to flood control structure breaches.

- In preparation for various hydrological modelling, similar to Andrew's examples.

I have also used it numerous times as a component of creating cartographic presentations.

I should also mention at this point that the 'Modify Selection' tools in M8 were often invaluable for working with selections once transferred, allowing them to be expanded, shrunk or smoothed.

My current pressing need is to allow me to complete a script which produces layers of marine connected and disconnected flood inundation at incremental intervals. This is to support an NZ wide effort to create public facing indicative flood extent viewers in response to sea level rise or other flooding events, similar to the NOAA sea level rise viewer:

I have labored to create a performant script in ArcMap but in parallel have been replicating the process in M9. I have all of the steps in M9 apart from setting the marine disconnected pixels to a value of 99 (green pixels in NOAA’s viewer) for which I need to be able to update pixels under drawing objects.

In ArcMap the un-optimised process takes ~20 hours to run on my test data set. In contrast, I estimate that with the ability to update my pixels, the same will take ~10 minutes in M9 and would offer the additional benefit of being able to process all data sets at once.

This is something I obviously want to pursue as we will be assisting other authorities around the country with building their own data sets. It also seems perfectly suited to M9’s grid analytics and in addition to saving countless labor hours will allow me to really showcase the utility of M9 to my employers and others.

and what do you want done in the case of conflicts (pixels covered by more than one object)?

Typically I have used this as per the M8 implementation of transferring a selection and then updating the pixels in some way which avoids this problem. In the case that we can update pixels under multiple (potentially overlapping) drawing objects, could the transfer rules not be called into play to determine how to deal with this?



9,551 post(s)
#14-Aug-19 04:01

Excellent posts by Dan and Andrew. I would just say "+1", but would like to add one simple example, and two comments, one regarding overlaps and the other regarding selections.

The example: flattening lakes and waterbodies to a known elevation. This is very similar to Dan's inundation examples of course. It also seems similar to filling sinks to a known elevation, except for being restricted to given vector objects, and applying per-object elevation defined by an attribute.

Re overlaps:

what do you want done in the case of conflicts (pixels covered by more than one object)?

Well mainly, no removal of overlaps by implicit normalization. In my opinion that is nearly always a bad thing, unless it is essential to make an algorithm finish in reasonable time, or to avoid circularity. Leave it to the user to remove overlaps in advance, if they are errors.

If overlaps or other conflicts exist, then I like the simple rule "last object wins". For now, this is not under user control, since Z-order is undefined. Arbitrary assignment (including multiple sequential assignments when multiple threads are in play) is normally perfectly OK in the meantime. If and when it is not OK, the user has the option of normalizing topology in advance, or manual clipping in a defined order, or detecting and harmonizing crossed lines. (Only the user can know whether this is necessary.) When it becomes possible to define Z-order by the value of a field, then "last object wins" should I think become "highest Z-value wins".
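The "last object wins" rule amounts to painting objects into the raster in draw order, so later objects simply overwrite earlier ones in shared pixels (plain-Python sketch with axis-aligned rectangles standing in for areas; `burn` is my name, not anything in Manifold):

```python
# Sketch of "last object wins" when burning vector values into a raster:
# paint objects in order; later assignments overwrite earlier ones in
# overlapping pixels. Per-object values model an elevation attribute.
def burn(raster, objects):
    """objects: list of ((x0, y0, x1, y1), value), painted in order."""
    for (x0, y0, x1, y1), value in objects:
        for y in range(y0, y1):
            for x in range(x0, x1):
                raster[y][x] = value
    return raster

grid = [[0] * 4 for _ in range(4)]
burn(grid, [((0, 0, 3, 3), 1), ((2, 2, 4, 4), 2)])  # overlap at (2,2) -> 2
```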

Lastly regarding selections. The Manifold 8 approach of first selecting pixels under geometry, then applying a transform to selected pixels, is convenient enough but seems to me (a) to involve an unnecessary step, and (b) not to fit the Manifold 9 image tile model well at all. Applying values directly from geometry to touching pixels seems more efficient and more natural to me.

All of this is possible already with 9, but it feels like a lot of work. It would be really great to see it built in and easy.


1,738 post(s)
#14-Aug-19 04:12

Applying values directly from geometry to pixels seems more efficient and more natural to me.

Agreed, but the ability to make irregular selections by freehand or component transfer and work with these is also extremely useful. As such I would like to see both.



9,551 post(s)
#14-Aug-19 06:11

Can you give an example, of where that is a necessary (or best) way of working?

In particular, drawing a freehand selection of pixels.


9,480 post(s)
#14-Aug-19 09:03

Thanks a lot to all of you, this is exactly the feedback that we wanted.

It looks like extending the selection in images from full tiles to pixels and keeping it easy to alter / manipulate just the selection in queries would be a big step forward so we are perhaps going to do this first and other things second.

(We have been planning to allow selecting individual pixels in images, as opposed to just selecting tiles, from the very beginning. The current model of selection in an image being a subset of records = full tiles in terms of pixels was mostly a temporary stopgap. We did this limited selection model for images because there are multiple ways to go about storing selected pixels, and we wanted to spend some time on raster analysis tools first, both simple tools like filters and more complex tools like watersheds, to see what their preferences regarding how to store selections would be. We now have a pretty good idea regarding what we need; this involves reorganizing / extending images themselves, too. This is a fairly big chunk of work that we are planning to do in the near future. As a bird's-eye view, we have three such big chunks of work on our plate: "geometry", "rasters" and "servers". We are currently busy with other things, but we make gradual progress on each of these chunks in parallel.)


9,551 post(s)
#14-Aug-19 09:41

We have been planning to allow selecting individual pixels in images, as opposed to just selecting tiles, from the very beginning

Can I ask why you have never said so?

It is not a rhetorical question. Really, why.

Neglecting to disclose important design intentions like this (even through two long successive betas) only wastes goodwill and time.


6,438 post(s)
#14-Aug-19 10:45

Can I ask why you have never said so?

That's been discussed in the past, as part of plans to implement rasters, and to backfill various 8 capabilities into 9.


9,480 post(s)
#14-Aug-19 12:04

I tried to look through previous discussions on image selections and our plans regarding that in the beta areas of this forum. It seems we didn't say anything like "image selection is always going to be whole tiles", but - at least in what I have been able to find quickly - we didn't say "we are going to switch from selecting full tiles to selecting pixels" either. I guess by not saying that in the end we do want means to restrict operations to individual pixels rather than tiles, or by not saying it clearly enough, we might have created a wrong impression here. Apologies for that.

Part of this is perhaps because while we always wanted to be able to restrict operations to individual pixels rather than tiles, we were considering different ways of doing this and in some of those ways we would have things that wouldn't be called "selection" (eg, several approaches we considered were using just "masks").

Again, apologies for the confusion.


9,480 post(s)
#14-Aug-19 12:33

(Here is a post from Dimitri that talked about selecting pixels: ... pixel level selections ... There are probably other posts like that, but I agree it's pretty buried.)

atrushwo
74 post(s)
#17-Aug-19 02:46

Hey Adam, glad to see pixel selection is in the works. Having access to individual pixels for manipulation of their channels and elevation would be fantastic. It would also be very useful to have access to the pixel's visibility state as well.

In Manifold 8, we would turn some pixels off to force streamline terminations at that pixel's centroid. Combining this with the ability to transfer vectors to rasters would allow for much more control over the raster itself.


1,738 post(s)
#14-Aug-19 20:45

Great news. Thanks for the explanation and bird's-eye view. I look forward to seeing some of these features in future builds.


808 post(s)
#13-Aug-19 22:14

Speaking of snapping...

Snapping only seems to work when creating a new element. Can it be turned on for editing an existing element?


9,480 post(s)
#14-Aug-19 08:28

Yes, we will do this.


9,480 post(s)
#21-Aug-19 10:59

We updated the API doc for the object model.

There are new entries for Schema.Constraint.ExpressionContext, Schema.Field.ExpressionContext, and Table.SearchBatchRTreeThin. Schema.AddConstraint and Schema.AddFieldComputed document the recently added parameter for the expression context. We also cleaned up a number of examples and added several new examples and some new notes to existing functions.

There is a gotcha: Table.SearchBatchRTreeThin, as documented, takes an additional parameter (touching) that was added to the function after the current build. To make the examples for that function in the documentation work in the current build, remove that parameter from the calls.

428 post(s)
#24-Aug-19 21:53


I updated Dump_Code add-in with ExpressionContext.


9,480 post(s)
#23-Aug-19 13:55

Status update:

We are looking to issue the next build somewhere next week. The focus is on vector editing tools, plus we are going to have a number of additions in other areas.

Mike Pelletier

1,829 post(s)
#23-Aug-19 18:52

This will be most welcome, Adam. Hopefully Mfd is keeping an eye on QGIS vector editing tools as a good example of the tools that are needed. The COGO tools, however, should include a way to track and make future edits to a traverse (implemented similarly to the coordinate tab?), as well as the ability to enter a scale factor value (e.g., 0.999586) that applies to all distances entered. Also, please include the ability to enter distances as a simple math phrase (e.g., 243*5) instead of having to do this on a calculator first and then entering 1215.
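The two entry conveniences suggested above can be sketched in plain Python (illustrative only; `entered_distance` and the tiny expression evaluator are mine, not anything in Manifold): evaluate a simple arithmetic phrase for the distance, then apply the ground-to-grid scale factor:

```python
# Sketch: accept a simple arithmetic phrase for a distance (e.g. "243*5")
# and apply a scale factor (e.g. 0.999586) to every distance entered.
# A restricted ast walk keeps the evaluation safe (no eval()).
import ast
import operator

_OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
        ast.Mult: operator.mul, ast.Div: operator.truediv}

def _eval(node):
    if isinstance(node, ast.Expression):
        return _eval(node.body)
    if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
        return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
    if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
        return node.value
    raise ValueError("not a simple arithmetic phrase")

def entered_distance(text, scale_factor=1):
    return _eval(ast.parse(text, mode="eval")) * scale_factor

print(entered_distance("243*5"))          # 1215
entered_distance("100", 0.999586)         # scaled ground-to-grid distance
```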


9,551 post(s)
#24-Aug-19 10:38

With respect, I think that this might be why Dimitri has always insisted on blind voting being the only reliable criterion for feature requests.

Your "should" is not right. As I understand it, no one outside the United States of America ever has to deal with traverses (in that sense). It is a primitive historic system which you are condemned to working with locally.

That is a good reason for you to write and share custom scripts.

Mike Pelletier

1,829 post(s)
#24-Aug-19 18:21

By traverses I mean a series of distances and bearings to describe many different things, such as property lines and utilities. Are we on the same page? If so, I'm surprised to hear this is not one of the common practices around our friendly globe.

This is a repetitive job that needs to be well integrated into the software's workflow, and custom scripts often don't do that as well. Also, these suggestions are seemingly not that challenging for Mfd. Sharing suggestions is common practice in this forum that often leads to good discussion.

As always, thanks for sharing your opinion Tim, even when you are wrong. Just kidding, no right or wrong here.


9,551 post(s)
#24-Aug-19 22:57

It was only your word "should" that hit me the wrong way Mike. Sorry for being picky.

But no. I don't know about the rest of the globe, but in New Zealand, while distances and bearings are recorded on parcel records, which have legal standing (alongside physical marks), no one except professional surveyors would ever need to work with them, because all* of our property data is also available in a common, accurate, authoritative GIS format, publicly available online for free, and that public data suffices for GIS, property, utility, engineering, and just about all other practical purposes. Alongside GPS and GLONASS of course.

(*If you go back more than, say, 50 years then some property transactions have not been digitised. And original traverse books have generally not been, I think. But all boundaries and recent transactions are digital.)

I think the only time you'd need to use distances and bearings in New Zealand is if there were a boundary dispute in court. And then you would need a surveyor.

New Zealand is a small country though. Easy to have good systems.


9,551 post(s)
#24-Aug-19 23:20

P.s. we use the Torrens system. (I see Colorado has a "limited implementation" of it. Sounds better for lawyers. And someone needs to retranslate the English entry for Russia. Interesting though, big country.)


6,438 post(s)
#25-Aug-19 07:33

And someone needs to retranslate the English entry for Russia.

Yes, definitely it needs a new translation... but the gist of it is interesting.

Drilling into that entry's claim of an open registry available online with "simple web forms", you can find government sites like this one, which provides, for free, all parcels (!) in Russia, a region about three times the East-West extent of the lower 48 US. It automatically gets more or less detailed as you zoom in/out.

For the link above, I just zoomed in at random to the outskirts of Suzdal, a small town full of ancient churches and monasteries that eight hundred years ago was the capital. If you click on one of the parcels you get a pop up with info (in Cyrillic, unfortunately... don't they know that the entire Web is supposed to be in English?...).

It's a total hassle that in the US there is no nationwide, free, online site providing parcel maps. I can add a cadastral parcel layer to a Manifold project for France, New Zealand, Russia, or many other countries, but not in the US. Sigh!

Does anybody know of a list of countries for which parcels are published openly online?

By the way, it's interesting how comments like Tim's can lead to fixes and improvements in Manifold. Drilling into Tim's link and comment I noticed that you can create WMS data sources for sites like the one for the link I provided, but in that particular case when Manifold builds a hierarchy under the data source for the various data sets brought in, the folders are named with ???? characters, a classic indication something about fonts / languages is not all there. I couldn't get the vector layers to work, so the WMS layers aren't as sharp as what you see in the browser URL link I provided.

I've filed a report to get the ??? issue cleared up, as well as a note on the vector layers. Getting JSON server layers to work I've found is tricky given how people tend to implement them in so many different ways, but we may as well get as many possible variations into the zoo of variations that Manifold can handle.

For those who are curious, I've attached an .mxb project with a location, Suzdal, with a Map that shows approximately the same spot as the browser URL link above. In the project, if you open up the data source used, it shows the ??? problem. The map opens using a Bing satellite layer as the top background layer, so you can see how the parcels overlay satellite photography reasonably well (as, I guess, one should expect from a Torrens system agency archive...). If you turn off the Bing satellite layer, the next layer is a Yandex streets layer, showing the churches in the view. It's interesting that Suzdal is the kind of place where if you just zoom at random into town you're likely to hit a few churches...

Thanks for the link! I sure wish the US had open, nationwide, online publication of parcel maps.

[Edit] Can't resist noting... just realized that the .mxb I attached is only 2kb. It's wild you can publish such a tiny project with a location in it that has the potential to convey so much info in such a small file (using links, of course...), that anybody in the world can see, for free, using Viewer... :-)


Mike Pelletier

1,829 post(s)
#26-Aug-19 05:29

The Russian site is very impressive Dimitri and yes it is cool how discussion can unearth new understanding.

As for your desire to have U.S. parcels nationwide online for free, I've added it to the wishlist :-)

I can say that Colorado's state government does have GIS parcel info from participating Counties (primarily for emergency services), many Counties still don't have their parcel info in a GIS however.

drtees
99 post(s)
#26-Aug-19 16:44

I cannot speak for the rest of the US, but Washington State does have a database containing all parcels within the state. It takes a bit of time to generate the map on my computer using M9 (likely, I am not doing something to harness the power of M9). It is painful to open it in M8.

One of the problems I see with a national parcel map is an extension of the problems with the Washington State map. Parcel information is maintained at a county level rather than the state level. Not all counties have the resources to digitize their parcel information, or if they do, they do not necessarily keep it up-to-date. Even worse is that county parcel data that is available sometimes does not match up with actual surveyed data. This is usually the result of a county georegistering a paper parcel map and digitizing the boundaries rather than translating metes and bounds. It is embarrassing to use GIS to research a property only to find out later that the boundaries I used were wrong when the professional site survey comes in.

Sad in a way that a country as rich as the US is so backward technologically at a very basic level.


6,438 post(s)
#26-Aug-19 19:16

Washington State does have a database containing all parcels within the state. It takes a bit of time to generate the map on my computer using M9 (likely, I am not doing something to harness the power of M9).

Do you have a link to the data? Parcel data is great "real life" data, so I'm curious how best to approach it.

181 post(s)
#26-Aug-19 20:34

US Florida parcel data here (by county)

Parcel boundary files:

drtees
99 post(s)
#27-Aug-19 01:07


You might try this link to download the gdb file.

If that doesn't work, navigate to the Washington Geospatial Open Data Portal. The parcel data will be under the Boundaries category. Search "parcel" to winnow out all the boundary databases that are not pertinent to just parcel data.

Hope this helps.

40 post(s)
#04-Sep-19 22:31

British Columbia partially attributed parcel data:

Fully attributed data are available only to provincial gov't employees. This is a work in progress and being built around ArcGIS parcel fabric functionality.

808 post(s)
#26-Aug-19 20:31

A couple of years ago a chief appraiser from another (Texas) county called me to get a Google Earth overlay for her county. I am still shocked at the condition of her GIS. Even in the county seat they have holes, gaps, overlaps, and trapezoids instead of the squared and rectangular parcels designed back in the 1800s. They have a part-time former employee doing the maps for them. I don't see how these can be in compliance with the state standards unless they also have the paper maps to fall back on. I would guess they have 3 man-years of work to repair the GIS.

Bad Parcel Map.jpg


6,438 post(s)
#27-Aug-19 08:04

That's a wild example, for sure. But it might not be the fault of the GIS guy. One problem with "metes and bounds" (COGO) title descriptions is that it is very difficult to tell if there are any overlaps, gaps or other problems until you precisely plot out all the parcels in the region. I wouldn't be surprised if, years ago, people who drew paper maps based on title descriptions made informal adjustments so the map would come out right.

It could well be that the first time any of those old title descriptions have been precisely checked is when a GIS is used to draw an exact shape that corresponds to the metes and bounds description, with the results being like the image you posted. I have a lot of respect for surveyor accuracy in past years, and even past centuries, but when you have a title description that does not automatically provide a precise visualization, there have got to be errors in some of those parcel descriptions.

It would be interesting to do some basic research to see just how common errors are, at least those that lead to overlaps and such.

808 post(s)
#27-Aug-19 21:14

I've not seen the word COGO in years. I am happily unencumbered by ESRI users in my area. I use ArcGIS ONLY to create a shape file for our contractors (they need all the index files created by Arc). Anyway, I jumped to YouTube to see how they do traverse-type COGO work in Arc. It seems to take about 4x more clicks to do anything than it should, and that is after you have put everything into the proper format. However, I did see something I liked.

I rely on Tim Osborne's Plot Traverse add-in for M8 for my daily work. Without that I would likely be using Arc software. To summarize how that works, the user translates metes and bounds from a deed into a format suitable for M8 to read. Plot Traverse has a lot of distance units to choose from and even distinguishes between California varas and Texas varas. The input table is a text document that looks like this, but other formats can be used...

s62:59:13w, 85

n85:27:46w, 90

n8:26:46w, 177.99

s8:26:46e, 177.99

It is well thought out, and works well for me.
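The call format above can be parsed and accumulated into coordinates with a few lines of plain Python (a sketch of what a plot-traverse tool does, not Tim Osborne's actual add-in; `bearing_to_azimuth` and `plot_traverse` are my names):

```python
# Sketch of parsing quadrant-bearing calls like "s62:59:13w, 85" and
# accumulating traverse vertices from a starting point.
import math
import re

def bearing_to_azimuth(bearing):
    """'s62:59:13w' -> azimuth in degrees clockwise from north."""
    m = re.fullmatch(r"([ns])(\d+):(\d+):(\d+(?:\.\d+)?)([ew])",
                     bearing.strip().lower())
    if not m:
        raise ValueError(bearing)
    ns, deg, mins, secs, ew = m.groups()
    angle = int(deg) + int(mins) / 60 + float(secs) / 3600
    if ns == "n":
        return angle if ew == "e" else 360 - angle
    return 180 - angle if ew == "e" else 180 + angle

def plot_traverse(calls, start=(0.0, 0.0)):
    """calls: list of 'bearing, distance' strings; returns vertex list."""
    points = [start]
    for call in calls:
        bearing, distance = call.split(",")
        az = math.radians(bearing_to_azimuth(bearing))
        x, y = points[-1]
        points.append((x + float(distance) * math.sin(az),
                       y + float(distance) * math.cos(az)))
    return points
```

Curves (delta/radius/tangent/arc length) would need extra handling, as dchall8 notes below for the older curve notations.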

Plot Traverse does not handle curves as used by some surveyors. Some surveyors would walk through a curve in 200-foot chord lengths and just note distance and direction. Some would use delta, radius, tangent, and arc length. More recently curve data comes with additional data of a chord distance and direction. With that I can use Plot Traverse without breaking my stride. But with the older form of delta/radius/etc., I had to leave that as an open jaw and figure out the curve after plotting the rest of the points.

The "new" thing I saw today was a table in Arc with the calls. I'm not certain if the table is an input table or a result table, but if it was the input table, that would be pretty good. The table has fields for OBJECTID, SHAPE (point or line I assume), SHAPE_Length, Direction, Distance, Delta, Radius, Tangent, ArcLength, and Side. The direction and distance are basically in the format used by Osborne. If M9 had an INPUT table that could be filled out with distances and directions (and the curve data), that would work great for me. It would create a permanent record within Manifold for future users to review previous work. With a time stamp it could become part of the archive. I would want to see fields to store notes like deed volume, page, number, place of beginning, abstract or subdivision, surveyor name, date, etc. An added benefit would be the ability to label the lines and curves with distance and direction as Arc does.

Metes and bounds might be primitive and historic, but I cannot imagine how the rest of the world describes parcels of land without corner-to-corner distance and direction.

/editorial The Torrens system seems to be what we in Texas would call a nationwide abstract plant. Most states are a collection of counties. Inside each county is a collection of abstracts. Abstracts used to be called surveys back at the time of Texas sovereignty from Mexico, when the counties were not yet set in stone. Later on, surveys were renumbered with unique numbers and called abstracts. We have title companies which perform as abstract plants. According to Webster, an abstract plant is a comprehensive record maintained by a title-insurance company indicating liens, encumbrances, and defects affecting the title to properties located in the community where the company operates as insurer (not often in formal use). To create a Torrens-type system in Texas would be going up against all the established title companies and putting them out of business, or at least making them a government agency. Part of the problem here is that people are allowed to write and file their own deeds, as are attorneys. Today I looked at a string of deeds from some Houston attorneys which had enough typos and omissions as to be unenforceable. The client created a trust for her property, but the attorneys did not get the name of their own client right and did not get the tract number right. Now she's trying to sell, and they perpetuated the incorrect owner and tract number once more. Title companies almost never make a mistake (IMHO), but attorney- and self-written deeds have a significant mistake between 50% and 80% of the time. /endeditorial



9,551 post(s)
#27-Aug-19 22:24

/editorial ... /endeditorial

Yes. The essence of the Torrens system is that evidence of title is not required.

Instead, formal registration of title simply is title. Even, in principle, if details were wrong, though of course you can apply to have a mistake corrected.

There is one register, recording all titles (and other instruments such as easements), and all survey plans.

Historic titles and plans can be interesting for other purposes, but chains of title have no bearing on the present ownership of land.

There are no title deeds, just the register. There is no title insurance, and no need for any.

Mike Pelletier

1,829 post(s)
#27-Aug-19 23:44

Great posts everyone. I can add that GIS parcel mapping in the U.S. (at least I think this is true for the rest of the country) is done for tax purposes. This means that accuracy only needs to be sufficient for taxation and not for construction or legal transfer. By accuracy I mean the shape of the parcel and its placement on the land. Generally urban areas are mapped pretty well, while remote lands are often quite rough due to lack of GPS info for survey monuments. Parcel mappers use a variety of data sources to try to get the parcel mapped well, but often this does not include GPS control of actual survey monuments.

The Plot traverse add-in from Tim Osborne is really good and essential for this type of work. It demonstrates a strong desire to use Manifold for mapping by at least one surveyor :-) Still, Mfd could implement it better and more seamlessly, allowing the traverse to plot as the data is entered and helping to show errors early on, as well as adding dchall8's good wishlist above.


9,551 post(s)
#28-Aug-19 04:49

I can add that GIS parcel mapping in the U.S. (at least I think this is true for the rest of the country) is done for tax purposes. This means that accuracy only needs to be sufficient for taxation and not for construction or legal transfer.


No doubt that is true. But in that case, you would need something else for legal transfer, construction, engineering, hydrology, environmental mitigation, biodiversity, and every other purpose.

What do you have instead, that is accurate enough for those purposes?

181 post(s)
#28-Aug-19 06:40

Exactly correct. Who could imagine that poorly funded government agencies would be expected to produce good GIS data for all the something else? Consider the national census, constantly under attack in the USA and Canada: US Census TIGER is the base for geocoding and navigation apps in the US, but the companies who use it do not support the US Census as a matter of some kind of policy.

I am right now engaged with some potential litigation over my property boundaries here in Florida. I have a good surveyor who is working from the legal description and old plat maps. That is what one does in a legal process. One goes back to the ground spatial data descriptions in the legal record.

808 post(s)
#28-Aug-19 14:21

The oil and gas industry goes to some extremes in my opinion, but I see their point. When they site a drilling location, they have a surveyor mark a 50-ft grid on a 5-acre square (roughly 500 x 500 ft). Trimble-type GPS enhancers are pretty standard for surveyors in my part of Texas, simply because you can't get a job with an oil company without one. The company installs utility poles at the four corners with LiDAR scanners and lets those run. The idea is to scan the 5-acre surface to determine the exact amount of fill needed to bring the elevation up to a specified level. That way they can drill to a precise depth according to their lease. Leasing often specifies a 10-ft depth band, leaving other bands open for other leasing.
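
The fill calculation described is simple arithmetic once the grid is scanned: sum, over every grid cell, the height deficit times the cell area. A back-of-envelope Python sketch, with made-up elevations and a hypothetical function name:

```python
# Hypothetical: ground elevations sampled on a 50-ft grid; compute cubic yards
# of fill needed to bring the pad up to a target elevation.
CELL = 50.0  # grid spacing in feet

def fill_volume_cuyd(elevations, target):
    """Sum per-cell fill (ft^3) and convert to cubic yards (27 ft^3 per yd^3)."""
    cubic_ft = sum(
        max(target - z, 0.0) * CELL * CELL  # cells already above target need no fill
        for row in elevations
        for z in row
    )
    return cubic_ft / 27.0

# Example: a flat pad 2 ft below target over a 10 x 10 grid of cells
grid = [[98.0] * 10 for _ in range(10)]
volume = fill_volume_cuyd(grid, 100.0)  # 500,000 ft^3, about 18,518.5 cu yd
```

Real pads would of course use the LiDAR point cloud rather than a toy grid, but the bookkeeping is the same.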

Mike Pelletier

1,829 post(s)
#28-Aug-19 15:54

The accuracy achieved depends on the accuracy needed. For property transfer, monuments rule: you own what is within your survey monuments, which are described in your property deed. The deed can be metes and bounds tied to some other monuments, or as described in a subdivision plat.

We have to buy title insurance to help protect against property loss. Sounds like the state covers that in New Zealand. How accurate is the mapping of GIS parcels there?

In either system, someone has to draw the properties at whatever the desired accuracy. It would be nice if Mfd 9, with its speed, stability, etc. could be that tool.

808 post(s)
#29-Aug-19 15:42

Monuments. I'm sure we all have stories of our favorite monuments. I see a lot of monuments described as a point. e.g. N37:43:25E, 2352.34 feet to a point. That surveyor has retired but still works with a younger guy off and on. Other monuments have disappeared since 1853 when the surveyor found a dead mesquite tree marked with an X on the bank of the river. Today's surveyors like to tell stories of the time they found the rock mound marking the corner of an old Spanish league or labor.
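
Incidentally, a call in the format quoted above (N37:43:25E, 2352.34 feet) is easy to parse mechanically, which is exactly why an input table would pay off. A rough sketch, assuming this one specific text format (real deed language varies wildly):

```python
import re

# Parse a call like "N37:43:25E, 2352.34 feet" into a quadrant bearing and a distance.
CALL_RE = re.compile(
    r"([NS])(\d+):(\d+):(\d+(?:\.\d+)?)([EW]),\s*([\d.]+)\s*feet"
)

def parse_call(text):
    """Return ((quadrant, decimal degrees, quadrant), distance_feet)."""
    m = CALL_RE.match(text)
    if not m:
        raise ValueError(f"unrecognized call: {text!r}")
    ns, deg, mins, secs, ew, dist = m.groups()
    decimal = float(deg) + float(mins) / 60 + float(secs) / 3600
    return (ns, decimal, ew), float(dist)

bearing, distance = parse_call("N37:43:25E, 2352.34 feet")
```

Finding the monument the call runs to is, as the stories above show, the hard part no software solves.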

A surveyor just walked out of the office. We were talking about his Leica radio monuments being nails in locations where the safety of his radio is more or less assured. I'm good with fence corners if I can find them.


9,551 post(s)
#27-Aug-19 23:13


There is one register, recording all titles (and other interests such as easements), and all survey plans.


6,438 post(s)
#27-Aug-19 16:49

It takes a bit of time to generate the map on my computer using M9

You don't say what you're doing, but I guess you just linked in the GDB and are not using the data in .map format. GDB is slow. Manifold .map is very fast.

Here is what I did:

1. Download the .gdb from the link you supplied.

2. Link the gdb into a Manifold project.

3. Using the step by step procedure given in this topic, export it to a Manifold .map project. That takes 7 minutes, about right given GDB is slow and it's over a GB of vector data. But that's a one-time import and then you're free of GDB-slowness.

4. Open that new Manifold project. Opens instantly. You can zoom in and pan effortlessly, alt-click on a parcel to see the record data, etc. Works fast.

drtees

99 post(s)
#27-Aug-19 17:07

Again, you are right. I did link to the GDB file. My workload of late has not allowed me the luxury of time to really learn how to use M9. Instead, I initially used M9 and the GDB file to select parcel data for counties that haven't made such data public (usually the less populated counties), and export that data to M8.

I may be a troglodyte, but I do try to look up at the sky once in a while!

Mike Pelletier

1,829 post(s)
#26-Aug-19 05:15

Thanks for the link and info, Tim. You're right on target about our system being better for lawyers as well as other property-related jobs :-)

Nevertheless, there are lots of surveyors and CAD operators who might be very happy using Manifold instead of CAD. Often it is very helpful to have lots of additional layers in your mapping to provide clues to finding solutions for what is being mapped. The ability to put a traverse into the software with background info helps show which distance or bearing is in error.

Drawing the traverse in another package, or even in a script that isn't well integrated within Manifold, is too much hassle. Better to spot the error on the map, adjust the traverse accordingly, and have the software instantly display the correction. This addition would seemingly be easy to implement and would open up a lot more potential buyers, in my humble opinion. More useful than editing coordinates directly, which I have never used. Others may find that useful, of course.
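
The "show which distance or bearing is in error" part boils down to checking traverse misclosure: a closed traverse should end where it began, and a blunder shows up as a closure vector roughly the size of the error. A minimal sketch, assuming legs are given as (azimuth in degrees, distance):

```python
import math

def misclosure(legs):
    """Given legs as (azimuth_degrees, distance), return the closure vector
    (from the end point back to the point of beginning) and its length.
    A closed traverse with no blunders should close to near zero."""
    dx = sum(d * math.sin(math.radians(az)) for az, d in legs)
    dy = sum(d * math.cos(math.radians(az)) for az, d in legs)
    return (-dx, -dy), math.hypot(dx, dy)

# A square traverse closes exactly; a bad distance on one leg shows up immediately.
square = [(0, 100), (90, 100), (180, 100), (270, 100)]
_, err = misclosure(square)  # effectively zero
_, bad = misclosure([(0, 100), (90, 100), (180, 110), (270, 100)])  # about 10 ft
```

Redrawing the traverse and recomputing this as each call is entered is exactly the instant feedback being asked for.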


9,480 post(s)
#02-Sep-19 08:07

One more status update:

We need a few more days to finish the build; it is going out this week.

The main culprit is snaps. We are capturing more data than previous builds did because of new features, and we are making sure this works fast and the operations stay responsive even when the amounts of data are big. 8 was cheating a bit here in that it was capturing data for snaps while rendering, but we can do better.


9,480 post(s)
#08-Sep-19 07:51

One more update:

We are nearly done with the build. There are a couple of things that need finishing touches, all of them related to vector editing. We are going to finish them up in the first half of the coming week. Apologies for the delay; some things took surprising amounts of time to work through (with all the extensions to geometry, etc., compared to 8).

We have been reading various suggestions for vector editing on the forum, we'll revisit them after the build.

On metes and bounds, we aren't doing them directly now, but we have had plans to add them in some form and we will do so. At the very least, we can re-create the add-in that worked in 8, referenced above, and make a similar add-in for 9. (Maybe it is a good idea to have this as an add-in for other reasons: to create a useful example of an add-in and prove out the infrastructure around add-ins, add necessary API calls if they aren't in the object model yet, etc. We'll see.)

Mike Pelletier

1,829 post(s)
#08-Sep-19 20:57

Thanks for the update and the peek at what is cooking in the Mfd kitchen.

Those are some good pluses of doing metes and bounds as an add-in. Perhaps it would also allow users to customize it for a certain need.

However, I think from a marketing standpoint it would be a negative, because this tool is a primary need for many users and they would likely rather see it built in if they're new to Mfd.


9,551 post(s)
#08-Sep-19 23:17

My 2c: a factory-built add-in, with a little bit of custom GUI, would be priceless.

One solid example would be a catalyst for more (whether factory-built, factory-curated, or third-party, for free or for a fee), lowering the cost of customization.

There is no reason why some approved add-ins should not be shipped as a standard package, even as part of the product itself if that were desired. (Personally I would prefer a separate add-in package.)

That would be a marketing asset, in my opinion.

808 post(s)
#09-Sep-19 15:17

Thank you so much for acknowledging that metes and bounds input is important to many of us, if only in the USA. I like the approach, also.

Manifold User Community Use Agreement Copyright (C) 2007-2019 Manifold Software Limited. All rights reserved.