Cleaning up GPX data with M9
Forest
594 post(s)
#19-Jul-19 07:50

I am trying to make a simple shapefile for ESRI users that contains a polyline for each day I recorded a track. Getting the GPX files into M9 is easy, and clipping off the bits outside the project area is also easy. The Merge Drawing command in the Edit menu is a work of art. My trouble was that I could not edit the schema to delete the GPX data fields that I do not need in the shapefile. Also, when I export the merged drawing, it splits into two shapefiles: one with regular polylines for the data that was clipped, and one with polyline-Z geometry for the GPS data that did not cross the project boundary and was not clipped. The exported polylines also crashed the latest QGIS.
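[Editor's aside: the two-shapefile split happens because tracks whose points carry elevation export as polyline-Z while fully clipped tracks lose Z. A quick, GIS-independent way to see which tracks in a GPX file carry elevation is to look for `<ele>` elements. This is a minimal sketch using only the Python standard library; the embedded GPX text and track names are hypothetical stand-ins for a real file.]

```python
import xml.etree.ElementTree as ET

# Hypothetical minimal GPX content standing in for a real GPS file.
GPX = """<?xml version="1.0"?>
<gpx xmlns="http://www.topografix.com/GPX/1/1" version="1.1" creator="demo">
  <trk><name>day1</name><trkseg>
    <trkpt lat="51.0" lon="-114.0"><ele>1048.0</ele></trkpt>
    <trkpt lat="51.1" lon="-114.1"><ele>1050.5</ele></trkpt>
  </trkseg></trk>
  <trk><name>day2</name><trkseg>
    <trkpt lat="51.2" lon="-114.2"/>
  </trkseg></trk>
</gpx>"""

NS = {"gpx": "http://www.topografix.com/GPX/1/1"}

def tracks_with_elevation(gpx_text):
    """Map each track name to True if any of its points carries an <ele> tag."""
    root = ET.fromstring(gpx_text)
    result = {}
    for trk in root.findall("gpx:trk", NS):
        name = trk.findtext("gpx:name", default="(unnamed)", namespaces=NS)
        result[name] = trk.find(".//gpx:trkpt/gpx:ele", NS) is not None
    return result

print(tracks_with_elevation(GPX))  # → {'day1': True, 'day2': False}
```

Tracks reported `True` here would be the ones expected to land in the polyline-Z shapefile on export.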

I was defeated and ended up doing the work in QGIS.

In M9, I could not just copy and paste features from one drawing to another: I often could not select the GPX polyline in a drawing, and I still do not know why. Yes, I read the manual and must be missing something.

A final note: when clipping, I can't really see the target-data tint applied to black polylines on a grey clipping-polygon background. The tint could be amped up a bit to make it more visible, at least for the default case with no formatting applied, with black lines on a grey background.

Dimitri


5,491 post(s)
#19-Jul-19 09:32

My troubles were that I can't edit the schema to delete the GPX data fields that I do not need in the shapefile.

Why not? Launch schema, select what you want to delete, press the delete button. See the Schema topic. What happens when you try that?

In M9, I could not just copy and paste the features from one drawing to another, I often could not select the GPX polyline in a drawing and I still do not know why. Yes, I read the manual and must be missing something.

Sure sounds like you're missing something. The above is trivial. If you can't select something, you probably don't have an index. That's highlighted in the Selection topic right at the beginning, so if you read that, maybe it's something else.

If you tell us what you did, step by step, then we can help out. Also, please don't just say "I read the manual" without saying what topics you read. For example, did you read Copy and Paste between Drawings? If so, what part of that process doesn't work for you?

The rule of thumb is that if some very basic step does not work, stop, read the topic that covers that basic step. Can't delete a field in the schema? Read the Schema topic. Can't select something? Read the Selection topic, and so on. If that doesn't immediately work for you, don't beat your head on it. Post here in the forum, where missing a basic step is easy to fix.

But if you keep going and let it snowball, then there are very many basic things not being done and the combined mess is much harder to sort out.

I can't really see the target-data tint applied to polylines that are black on a grey clipping polygon background.

Why not change the grey background to something else using Style? That takes a second, if that.

The exported polylines also crashed the latest QGIS.

Make sure you report that to QGIS tech support, so they can fix the bug.

adamw


8,634 post(s)
#25-Jul-19 14:42

Regarding this:

The exported polylines also crashed the latest QGIS.

Which ones: those without Z, or those with Z? We used to have problems with SHP files exported with Z, but we think we fixed them, and QGIS used to have problems of its own with SHP files with Z.

You can remove Z from geometry values prior to the export using the Remove Z transform template; that way the export will create only one SHP file.
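[Editor's aside: conceptually, dropping Z just means discarding the third ordinate from each coordinate before export. The sketch below is not Manifold's Remove Z template, just a plain-Python illustration of the operation on a track stored as coordinate tuples; the sample coordinates are made up.]

```python
def remove_z(line):
    """Drop the Z ordinate from a polyline given as (x, y[, z]) tuples.

    The starred pattern absorbs an optional Z, so 2D input passes
    through unchanged while 3D input is flattened to 2D.
    """
    return [(x, y) for x, y, *_ in line]

track_z = [(-114.0, 51.0, 1048.0), (-114.1, 51.1, 1050.5)]
print(remove_z(track_z))  # → [(-114.0, 51.0), (-114.1, 51.1)]
```

With every track flattened this way, all geometry shares one type and a single SHP file results.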

We hear you on not being able to select data in order to copy / paste it. As Dimitri said, this most likely happens because the drawing does not have a unique index. Adding a unique index is pretty easy now (you don't have to write any SQL; just invoke a dialog and press a toolbar button), but we can perhaps side-step the issue for imports entirely by automatically creating a unique index for each imported component via an option.
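[Editor's aside: the underlying idea is that selection needs a way to address one record unambiguously, which a table without a unique key cannot do. The sketch below illustrates this with Python's built-in sqlite3 as a stand-in; it is not Manifold's SQL or its index dialog, and the table and column names are hypothetical.]

```python
import sqlite3

# A table with no unique key: rows cannot be addressed individually,
# which is the same reason a drawing without a unique index cannot
# support selection.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE tracks (name TEXT)")
con.executemany("INSERT INTO tracks VALUES (?)", [("day1",), ("day2",)])

# Add an explicit unique id (SQLite's hidden rowid made visible)
# and enforce uniqueness with an index.
con.execute("ALTER TABLE tracks ADD COLUMN id INTEGER")
con.execute("UPDATE tracks SET id = rowid")
con.execute("CREATE UNIQUE INDEX tracks_id ON tracks(id)")

# Now one specific feature can be picked out unambiguously.
row = con.execute("SELECT name FROM tracks WHERE id = 2").fetchone()
print(row[0])  # → day2
```

The proposed import option would amount to doing this id-plus-index step automatically for every imported component.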

Forest
594 post(s)
#26-Jul-19 00:15

I am persistent, and usually by the time I call for help I have tried something 50 times, burned the job budget, and debugged the situation as far as I can. It is getting to the stage where I have to tell myself *never* to try a new operation without first making up some trivial test data and making sure I can get that working. When I post something, I don't have much time or energy left to narrow the issue down to a tight specification; I just want to see if it is an issue anyone else has had before I commit to documenting it and submitting it to tech. So I really appreciate your assistance.

The field-deleting problem was user error: I didn't select the fields with the Control key, so the delete function was not available. Lesson to me: making a row the current row and selecting it in the schema are not the same thing.

One of the main reasons I use Manifold instead of QGIS is that Manifold handles malformed shapefiles, which crash QGIS (even 3.8). In general, Manifold is better than QGIS for data management, and I just need to get these last challenges behind me. I am just transitioning to using it regularly.

The Manifold help system is probably best for new users. I can't read it from top to bottom, as I have seen nearly all of it before on many occasions, and it is hard to find the small bits that I really need. ESRI has this issue even worse; I have watched lots of 90-minute videos without finding the few bits I really needed. QGIS has a panel with hints in some of its dialogs, and these are really helpful. Something similar happens if you google an Excel issue: you almost always get a three-point list that tells you what to do. The way help is provided at the moment is not "just in time" style, and there is an inefficiency here that I can't quite put my finger on.

I will check, try to further isolate the Z polyline issues above, and get back to you. When I have made dozens of attempts at something, the story gets messy and I have to clean it up before reporting it.

Thanks heaps.

Dimitri


5,491 post(s)
#26-Jul-19 09:27

there is an inefficiency here

I agree 100%.

I think there are several sources of inefficiency in trying to find what you need in the doc. Listing those may help in building solutions.

First, 9 has become a fairly large system, no longer just the Radian engine with an SQL interface. So covering all it can do necessarily involves a lot of text to be searched to find what you want. As it gets built out we will increasingly encounter network effects between new features, where adding what may seem at first glance to be a simple new feature might geometrically expand what can be done given existing capabilities. 9 could get very much larger, requiring much more documentation, general education, and tips.

Second, 9 is evolving fairly rapidly. It often does so in internal/infrastructure ways that have subtle, but important, effects on the user interface. That has the virtue of trying out new things in a community driven model, but the downside of relearning in the context of frequently changing documentation. It could be that a topic already read needs to be re-read to work the thing as it is today.

Third, 9 is still adding user interfaces that provide greater point-and-click ease, with many more to go, which would reduce the need for more complex learning, such as learning to use SQL. For a core constituency of data engineers the use of SQL is, in many ways, self-documenting, in that knowing it and having a list of functions with a few examples gets you off and running. But there seems little point in teaching SQL in point and click style when, before such teaching can be written, point and click dialogs that are easier to learn will appear.

Last, I think, is the limitation on documentation resources. Given finite resources, the first task is to document what is there: to get it all down in an understandable English explanation. That comes first; then, given more time, summaries and quick guides for various specific interests can appear. Manual indexing is like that too, because it is very costly to change as topics are constantly altered.

To give you an idea of the resources required, I think in the past three weeks there have been several thousand changes involving nearly 400 topics, with over a thousand illustrations being added or changed. Changes are barely incorporated by the time the next build arrives.

I think in many ways there is a fundamental trade-off: for the lowest cost and greatest capability, it helps to have a system that is evolving in a no-frills way. You have to be fearless in buying into improved engineering, stronger machinery, and a better GUI, no matter what the cost to existing documentation.

Once the main parts of the GUI are filled out in what is likely to be enduring form, then I expect we'll see improvements like quick "how to" guides, better indexing, jumps to topics from within dialogs, and so on.

Until then, we can use this forum as a big, parallelized search engine, collectively applying our intelligence and our individual exploration of release notes, experimentation, and documentation to help each other whenever any one of us gets stuck.

Manifold User Community Use Agreement Copyright (C) 2007-2017 Manifold Software Limited. All rights reserved.