Importing old Mapinfo TAB files - some work some don't
hgmonaro
45 post(s)
#22-Sep-17 02:13

I'm trying to import some old MapInfo TAB topography files, and some work fine while others don't. Of course, the ones I want are the ones that don't work. I'm getting the attached error message (in Viewer it's similar, i.e. *** File data appears to be corrupted. If you downloaded this file from Internet, try downloading it again.).

I've saved 2 drawings (each has a .DAT, .ID, .IND, .MAP & the .TAB) into the attached zip file. The AUSTOWN one imports into Manifold & Viewer; the VICLOC one gets the error. Looking at the .TAB files in a text editor shows they have different 'version' values (300 & 410), and I found an old forum post mentioning that ones with numbers greater than 300 'can be trouble', but with no explanation attached.

I tried importing into another GIS (Maptitude) and it also had issues, so maybe the files really are corrupt. If anyone has MapInfo installed, can you see whether you can open the files?

Cheers, HG

Attachments:
error_msg.jpg
topo.zip

adamw


10,447 post(s)
#22-Sep-17 07:57

Well, our dataport stops at the header version check: the file uses a version of the format that we do not recognize. What *does* read the file? GDAL / QGIS?

otm_shank
50 post(s)
#24-Sep-17 23:50

I've tried opening the VICLOC table in MapInfo; it comes up with the error "Unable to open encrypted table. Invalid key."

I wasn't aware MapInfo TAB files could be encrypted and require a key; however, I found this old reference for ABS Census data which mentions it: http://www.abs.gov.au/websitedbs/D3110129.nsf/cc4409445c9ea0864a25679d007d51db/3c8ad6c57ce51037ca25688900228d2d!OpenDocument

So it looks like there's no issue with Manifold, and the file probably isn't corrupted. It's just that you need a 'key' to open it. Was a workspace (.wor) file provided with the .tab file? It may contain the license number.

Tim

hgmonaro
45 post(s)
#25-Sep-17 05:29

Adam, nothing I have opens the files that won't open in Manifold (I only have Manifold, Manifold Viewer & the above-mentioned Maptitude). The files do come from the CDATA96 product that Tim links to in his reply. I do have some of the .wor files that the linked document talks about, but I don't have MapInfo installed, although I do have the installation discs. I'm not sure it will install: I'm pretty sure (working off memory here!) I moved to Manifold when MapInfo stopped working after an OS upgrade and the local agent wanted too much money for the upgrade. I was hoping I could use the road network in these files, but I think that's not going to be possible.

Thanks to both of you for your help. I'm going to move on.

P.S. Tim, I noticed you were 'mucking around' with the G-NAF. Any luck? I've loaded it into Manifold, but it's too big and slow on my PC to be usable. Even state-based versions are ponderous (VIC/NSW/QLD); the other states sort of work but are still very slow. I was thinking of starting another post about it, but I'm not sure that's appropriate on the forum unless the issues are tech-based.

otm_shank
50 post(s)
#26-Sep-17 02:37

Unfortunately I haven't had time to play around with it recently. My aim is to develop a script which will import all the PSV files, do the joins etc. to create state tables (as well as an Australia-wide table), and then export these to the relevant GIS file formats as required. So far, however, I've found the processing (table joins, column updates etc.) wasn't that much faster in Radian Studio than in MapInfo; apparently this is because some of the SQL functions still run on a single core only. The Manifold team told me there are plans to improve on this in the future, so I'm looking forward to that!
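To give an idea of the join step, each one would look something like the query below. This is only a sketch: the table names are from my VIC files, [VIC_ADDRESSES] is a made-up output name, and in practice each state table carries across many more columns than shown.

-- Sketch: merge geocodes onto the address detail table into a new state table.
-- SELECT ... INTO creates and fills the result table; THREADS spreads the work.
SELECT d.[ADDRESS_DETAIL_PID],
       g.[LONGITUDE], g.[LATITUDE]
INTO [VIC_ADDRESSES]
FROM [VIC_ADDRESS_DETAIL_psv] AS d
  JOIN [VIC_ADDRESS_DEFAULT_GEOCODE_psv] AS g
    ON d.[ADDRESS_DETAIL_PID] = g.[ADDRESS_DETAIL_PID]
THREADS SystemCpuCount();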

Dimitri


7,413 post(s)
#26-Sep-17 07:43

Could you post the SQL you use? There are many good reasons to do so.

One reason is to enable your colleagues in this forum to offer advice on what might be missed opportunities to write faster SQL. The query engine does what you order it to do, after all, and we all from time to time tell such engines to do their work in ways that might be significantly slower than other approaches.

The second is that the Engineering plan for optimizing the query engine tries first to visit those happy places where people and queries go all the time. The optimizer as it is now is pretty good, but it is not remotely close to how good it can be given steady background work, a bit more with every build, to tune how it reckons the best way to approach the particular decomposition of a given query. The more Engineering knows about those cases where the optimizer misses a chance to speed up a well-written query, the higher such cases can bubble up in priority for improvements.

Keep in mind, also, that parallelism is not a magic bullet. Some things just don't go any faster but instead will go slower if you parallelize them. As a thought experiment, consider whatever is the internal processing required to open a standard Windows File - Open dialog, a totally trivial thing. You're not going to get that any faster by trying to parallelize it so 16 CPU cores and 4000 GPU cores help open the dialog and then all work together to draw an Open button and a Cancel button. Trying to do that will only go slower.

Likewise, if a task is totally disk bound, like doing nothing but copying a file from one disk to another, you're not going to get significant performance gains by tossing a hundred cores at it.

The optimizer has to reckon such issues, and to do so in cases that are far more complex than deciding whether it makes sense to throw a thousand cores at drawing an Open button and a Cancel button. Like I say, it's pretty good now, the measure of "good" being whether a human could do better by hand-coding, but there is always room for improvement.

Radian is usually significantly faster than MapInfo, but the guys at MapInfo have very good skills and they tend to do a very good job of implementation. Because of that, MapInfo is a useful reference to measure performance, either of the core system or of the optimizer, against a well-implemented standard. So it is very helpful to know detailed comparisons, exactly what is being done in MapInfo and likewise exactly what is being done in Radian, with numbers on the results.

So... post that SQL, let the SQL wizards in this forum comment and then the result of all that joint thought will indeed be great input for tuning the system.

otm_shank
50 post(s)
#26-Sep-17 22:41

Hi Dimitri, sure, the SQL is below. Basically I have 2 tables (3.6m rows in each), and I need to copy values from one to the other based on a unique join value. In Radian this takes 150 seconds; in MapInfo it takes 230 seconds.

I understand that not everything can/should be parallelized. One of the other operations I'll be doing regularly is updating values based on a point-in-polygon type join, so use of the parallelized SQL functions such as GeomOverlayContainingPar() will hopefully provide more significant speed improvements (there's a sketch of that after the update query below).

UPDATE (
  SELECT [VIC_ADDRESS_DETAIL_psv].[ADDRESS_DETAIL_PID],
         [VIC_ADDRESS_DETAIL_psv].[Longitude],
         [VIC_ADDRESS_DEFAULT_GEOCODE_psv].[LONGITUDE] AS [n_Longitude],
         [VIC_ADDRESS_DETAIL_psv].[Latitude],
         [VIC_ADDRESS_DEFAULT_GEOCODE_psv].[LATITUDE] AS [n_Latitude]
  FROM [VIC_ADDRESS_DETAIL_psv] JOIN [VIC_ADDRESS_DEFAULT_GEOCODE_psv]
    ON [VIC_ADDRESS_DETAIL_psv].[ADDRESS_DETAIL_PID] = [VIC_ADDRESS_DEFAULT_GEOCODE_psv].[ADDRESS_DETAIL_PID]
  THREADS SystemCpuCount()
) SET [Latitude] = [n_Latitude], [Longitude] = [n_Longitude];
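For the point-in-polygon updates, the shape I have in mind is roughly the one below. Again just a sketch: [Addresses] and [Meshblocks] are made-up drawing tables, and the exact GeomOverlayContainingPar() arguments are worth double-checking against the query builder.

-- For each point in the hypothetical [Addresses] drawing, return the records
-- of the hypothetical [Meshblocks] areas drawing containing it, with the
-- overlay run across all CPU cores; SPLIT CALL expands the matches per record.
SELECT [ADDRESS_DETAIL_PID],
  SPLIT CALL GeomOverlayContainingPar([Meshblocks], [Geom], 0,
    ThreadConfig(SystemCpuCount()))
FROM [Addresses];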

hgmonaro
45 post(s)
#27-Sep-17 01:32

Attached is my method of loading the G-NAF data (the Word doc) and my empty .map file. I'm not claiming my scripts are the best, but they do work.

Attachments:
GNAF.map
Loading GNAF.docx

mortlock
1 post(s)
#16-Aug-18 07:02

Hey there,

I came across this post in my attempt to access the 1996 census data. After a month my library was finally able to get the CDs for me; unfortunately, however, I also ran into the weird "invalid key" situation, which led me here.

I have the geographic boundaries; I just need a way of seeing the data tables. I don't suppose you managed to find a way to access these tables?
