Polyline drawing layer converted to points drawing. Both tables show the same number of records.
StanNWT
14 post(s)
#04-Oct-17 19:08

I'm using Radian Studio 9.0.163 and I have loaded as a source database an Esri File GeoDatabase built in 10.x.x. I have the entire Canada landmass CanVec 1:50,000 data set as vector point, line and polygon feature classes. The entire geodatabase is 46 GB. I've opened the river polyline layer as a data source. The drawing and table come in fine. Rendering that polyline drawing isn't instantaneous the first time I add it to the map, but that's fine.

As a test of the triangulation transform, to further push Radian after the 15 million point test of the California road points, I wanted to do the same thing for all the vertices of the entire river layer. There are over 15.3 million records in the polyline drawing / table. When I preview the Convert to Points transform it's instantaneous. When I Add Component it takes a few minutes, I think 1,440 seconds. When I look at the last record in each table, both the polyline and point tables cite the same number of records in the mfd_id field on the far left. How can I have a point drawing / table showing the same number of records when there's clearly over 100 million and likely over 200 million points in that drawing / table? Is the mfd_id a holdover from the record IDs in the polyline drawing / table?

When I do the Triangulate All or the Voronoi transform via the pop-up at a large scale it's instantaneous, but it fails to draw when I zoom out, at least within a minute. Yes, I know I shouldn't be so impatient; after all, I work in an ArcGIS Desktop (32-bit) world all day long.

I could try it in Manifold Future as well. But I assume the same results.

I'm going to try this next on the full Canada hydro network data set.

Dimitri


4,193 post(s)
#04-Oct-17 20:25

I know you're trying to provide full detail but you're leaving out a lot that is important. For example,

I have loaded as a source database

... is that linked in or imported? Imported means it is in Radian. Linked as a data source means the data is sitting out there in the geodatabase, so you are testing the performance of the ESRI geodatabase format more than you are testing Radian.

I could try it in Manifold Future as well.

Always use Future. I know it doesn't seem that long (only a month) since Future forked away from Radian, but a month in Manifold engineering time is hundreds of items, some of which are significant improvements in performance, bug fixes, etc.

StanNWT
14 post(s)
#04-Oct-17 20:58

Well, I used the add from source, picked the fgb option. Then when I wanted to convert to points, I think it automatically copied the data into Radian: I could see a copy transform apparently selected and running, and I don't think I chose that myself. The polyline drawing is inside the project, in the list. But would it still be linked to the file geodatabase?

If you haven't guessed, I am new to Radian, hence the testing, but I have been using GIS since Map II Processor on a Mac Classic in the early 1990s.

I'm also typing this on my smartphone at work. My Radian install is on my laptop at home, where I've been doing the testing, so I'm not looking at an active screen; I'm writing it all from memory of what I did last night.

tjhb

7,452 post(s)
#05-Oct-17 00:50

Appreciated that you're not in front of Radian now, so you can't do much about this at the moment, but it really does help if you can write each step using the names of the actual dialogs and options.

I used the add from source, picked the fgb option.

I think that is File > Create > New Data Source, Type: 'File: gdb'.

But to answer this:

How can I have a point drawing / table showing the same number of records when there's clearly over 100 million and likely over 200 million points in that drawing / table?

You mention the "Convert to Points" transform. This might seem like splitting hairs, but it's not: there's no transform called "Convert to Points", but only "Convert to Point", singular.

"Convert to Point" takes a geom of any type, and converts it to a geom of point type. If the source geom has many coordinates (vertices), the result is a point geom with many coordinates, i.e. a multipoint.

That is a one-to-one transform, one object in, one object out.

It's not the transform you want, since you want individual points in the result. For that, use "Decompose to Coordinates" instead.
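To make the difference concrete, here is a minimal sketch in Manifold SQL, using the same GeomToCoords / SPLIT CALL pattern that the Decompose to Coordinates template generates (the drawing name [Lines] and its geom field [Geom] are hypothetical):

FUNCTION splitgeom(arg GEOM) TABLE AS
  -- GeomToCoords returns a table with one record per coordinate of arg;
  -- GeomMakePoint turns each coordinate into a single-point geom.
  (SELECT GeomMakePoint([XY]) AS [Geom] FROM CALL GeomToCoords(arg))
END;
-- SPLIT CALL expands the per-geom table in place, so a line with
-- 50 vertices yields 50 point records rather than one multipoint record.
SELECT [mfd_id], SPLIT CALL splitgeom([Geom]) FROM [Lines];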

(While you're at it, do use the Edit Query button for each transform, for an idea of how they work--even if you can only follow a bit at first.)

Dimitri


4,193 post(s)
#05-Oct-17 07:05

"Convert to Point" takes a geom of any type, and converts it to a geom of point type. If the source geom has many coordinates (vertices), the result is a point geom with many coordinates, i.e. a multipoint.

I find that illustrations help me visualize how the different transforms work. Almost all of the templates have illustrations of starting data and of the result, shown in the Transform Templates - Geom topic.

To find a particular template quickly, launch that topic in your browser and then hit Ctrl-F (works in most browsers... I tend to use Opera, Edge or Chrome and it works in all three) to open a Find box. Enter "Convert to Point" without quotes and the browser will quickly find the template on the page you are viewing.

The other thing that helps a lot when working with transforms is to do what that topic does: create a drawing (takes 3 seconds...), add some lines or areas to it (10 seconds), and then try out the different transforms to see what they do. They all preview what they do, so you can try out a dozen different transforms in a minute or two. Do that to get your head around what you propose to do, because it is a lot quicker to try stuff out with a few dozen objects than with 100 million objects. :-)
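If you would rather script that scratch drawing than click it together, here is a minimal sketch for the query pane, reusing the same CREATE TABLE / CREATE DRAWING syntax that transform queries auto-generate (all names here are made up):

-- A throwaway table plus a drawing over it, to experiment on.
CREATE TABLE [Scratch] (
  [mfd_id] INT64,
  [Geom] GEOM,
  INDEX [mfd_id_x] BTREE ([mfd_id]),
  INDEX [Geom_x] RTREE ([Geom])
);
CREATE DRAWING [Scratch Drawing] (
  PROPERTY 'Table' '[Scratch]',
  PROPERTY 'FieldGeom' 'Geom'
);

Add a few lines or areas in the drawing window, then point the transforms at [Scratch Drawing].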

Can't resist adding... if anybody wants a classic example of why many people curse whoever invented the idea of "multipoints", that is, a branched, single point object that to every sensible observer looks like multiple, separate point objects, well... this thread fits right in. Multipoints have their uses, I concede, but they sure do provide opportunities for confusion.

StanNWT
14 post(s)
#05-Oct-17 07:09

You are correct that it makes it difficult to help someone when they're not using the precise language and steps involved.

I took another run at the process of taking the rivers polyline layer from the File GeoDatabase and creating a point for each vertex.

The rivers polyline layer is dragged from the File GeoDatabase source onto an existing map. The polyline feature class is in the geographic coordinate system NAD 83 CSRS, which shows up in the layer contents as Latitude / Longitude. The other existing layers in the map are Lambert Conformal Conic. If I double-click the table for that particular river polyline layer to check the number of records, it takes around 8 minutes to read the table, at a rate of 48,000/s. I then highlight the rivers polyline layer in the map, which has 15,310,631 records, select "Edit" -> "Transform", and then select the "Decompose to Coordinates" template.

The code from the "Edit Query" window is below:

-- $manifold$
--
-- Auto-generated
-- Transform - Decompose to Coordinates - Add Component
--

CREATE TABLE [HD_1470009_1 Decompose to Coordinates] (
  [OBJECTID] INT32,
  [Shape] GEOM,
  [CODE] INT32,
  [VALDATE] NVARCHAR,
  [PROVIDER] INT16,
  [ID] NVARCHAR,
  [DATANAME] NVARCHAR,
  [ACCURACY] INT32,
  [THEME] NVARCHAR,
  [DEFINITION] INT16,
  [PERMANENCY] INT16,
  [ACQTECH] INT16,
  [CONCISCODE] INT32,
  [GEONAMEDB] NVARCHAR,
  [LANGUAGE] INT16,
  [NAMEEN] NVARCHAR,
  [NAMEFR] NVARCHAR,
  [NAMEID] NVARCHAR,
  [Shape_Length] FLOAT64,
  [mfd_id] INT64,
  INDEX [mfd_id_x] BTREE ([mfd_id]),
  INDEX [FDO_OBJECTID] BTREEDUP ([OBJECTID]),
  INDEX [FDO_Shape] RTREE ([Shape]),
  PROPERTY 'FieldCoordSystem.Shape' 'GEOGCS["GCS_North_American_1983_CSRS",DATUM["D_North_American_1983_CSRS",SPHEROID["GRS_1980",6378137.0,298.257222101]],PRIMEM["Greenwich",0.0],UNIT["Degree",0.0174532925199433]],VERTCS["NAVD_1988",VDATUM["North_American_Vertical_Datum_1988"],PARAMETER["Vertical_Shift",0.0],PARAMETER["Direction",1.0],UNIT["Meter",1.0]]'
);

CREATE DRAWING [HD_1470009_1 Decompose to Coordinates Drawing] (
  PROPERTY 'Table' '[HD_1470009_1 Decompose to Coordinates]',
  PROPERTY 'FieldGeom' 'Shape'
);

FUNCTION splitgeom(arg GEOM) TABLE AS
  (SELECT GeomMakePoint([XY]) AS [Geom] FROM CALL GeomToCoords(arg))
END;

PRAGMA ('progress.percentnext' = '100');

INSERT INTO [HD_1470009_1 Decompose to Coordinates] (
  [OBJECTID], [CODE], [VALDATE], [PROVIDER], [ID], [DATANAME], [ACCURACY], [THEME], [DEFINITION], [PERMANENCY], [ACQTECH], [CONCISCODE], [GEONAMEDB], [LANGUAGE], [NAMEEN], [NAMEFR], [NAMEID], [Shape_Length],
  [Shape]
) SELECT
  [OBJECTID], [CODE], [VALDATE], [PROVIDER], [ID], [DATANAME], [ACCURACY], [THEME], [DEFINITION], [PERMANENCY], [ACQTECH], [CONCISCODE], [GEONAMEDB], [LANGUAGE], [NAMEEN], [NAMEFR], [NAMEID], [Shape_Length],
  SPLIT CALL splitgeom([Shape])
FROM [Data Source]::[HD_1470009_1 Drawing]
THREADS SystemCpuCount();

I am using all the fields, with the values per polyline transferred to each coordinate pair that becomes a new point, not for any purpose at this stage other than to test the performance of handling a large vector polyline data set converted to coordinate points. For example, multibeam bathymetry or LiDAR, as you know, has vast amounts of data associated with each time-space coordinate, so if I can at least have a few fields for each new point, I'll at least test a large number of points with a reasonable number of attributes... Not really an apples-to-apples comparison, but it is all I have at the moment with free data.

In preview mode when zoomed in, it shows the points from the coordinates of the vertices instantly. I then click the "Add Component" button. It seemingly runs twice. First it gets to roughly 300 million records, which are the coordinate pairs of each vertex; that takes about 20-40 minutes, the total estimated size of the operation looks like around 58 GB, and the speed is around 130,000 records per second. Then it goes through an "insert records" phase, which starts from the beginning adding records, inserting them at a rate of 21,000 records per second. I have no idea where this 58 GB of data is stored, perhaps a Windows temp file, since I've not saved the project yet.
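Doing the rough arithmetic on those numbers: 58 GB over roughly 300 million records works out to about 190 bytes per record, which seems plausible for a point geom plus the eighteen attribute fields carried over from each polyline.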

After 183 minutes, Manifold Future 9.0.163.4 crashes before the new point layer is created in the project.

It's not stored in the File GeoDatabase; it gets made in the project. I know this because the transform I was previously using in error, the Convert to Point template in the transform window, is what made the 15,310,631-record, presumably multipoint, layer. The new layer I was trying to create makes unique points and records in a table for each vertex, with all the attributes of its associated polyline.

I can upload screen grabs if required...

StanNWT
14 post(s)
#05-Oct-17 07:57

After re-running the Decompose to Coordinates transform, I saw where what I had inferred to be a second run was going. The first pass is "Inserting records, Scanning records"; the second pass actually inserts the records. The closest record count I saw was just over 312 million, reading at 21.5 MB/s, 124,000/s. At 43 minutes it starts the actual inserting-records part. It fails when it actually tries to insert the records, at around 183 minutes.

tjhb

7,452 post(s)
#05-Oct-17 08:33

More helpful:

Say what data you are using. Where from, what format, how to get it. URLs are good.

Then exactly what you do to import the data.

Then step by step, what steps you perform.

Currently you are a million miles away from that sort of standard, you are just blowing.

Can you try again?

tjhb

7,452 post(s)
#05-Oct-17 21:56

Sorry, I was harsher here than I intended to be. (I'm not always good at that.)

You're doing absolutely the right thing, just try to take it one step at a time.

The tag for the General forum section says "no question too simple". I would add "no question can be put too simply".

Putting complex things simply is really hard work, especially in new territory.

On the plus side, many people on the forum are happy to put hours (voluntarily) into answering a good question, if you give them the right levers and tools.

Dimitri


4,193 post(s)
#05-Oct-17 15:50

OK... let's have at it. Before we do, though, I have one request: please don't use whatever text editor you were using to create the text in that very wide post. :-)

Onward...

Please don't talk about queries or anything else until you tell us precisely how, step by danged step, you get your data into Manifold.

It's frustrating that we still don't know whether you have imported the data into the project or linked it into the project, and exactly how you did that. Please don't tell us what you infer must be the situation. Instead, tell us step by step what you actually did.

An example:

The rivers polyline layer is dragged from the File GeoDatabase source onto an existing map.

The above doesn't tell us whether you imported the table into the project or linked it into the project. It *sounds* like what you are talking about is dragging and dropping a drawing into a map window from a linked data source where all the data is stored outside the project. But it could also mean you are dragging a table from a data source and dropping it into the local part of the project.

Here is how you should describe what you might have done:

<begin example quote>

1. I read the "Example: Connect to an ESRI GDB File Geodatabase" topic.

2. Following that topic, I launched Manifold Future 163.5 and chose File - Create - New Data Source.

3. I chose Type of File: gdb, clicked the browse button, and browsed over to my C:\files\ESRIstuffGDB folder and clicked on the gdb file in there, clicked open and then clicked Create data source.

4. That created a data source with a cylinder in my project. I clicked the + icon to open it and in there I saw many drawings and tables.

5. The drawing I'm working with is one of those... It is called... everything from here on in is that drawing...

<end example quote>

Reading the above it would be clear a gdb was LINKED into the project with no data IMPORTED, and that everything done thereafter depended upon GDB as a data store.

It's not stored in the File GeoDatabase; it gets made in the project. I know this because the transform I was previously using in error, the Convert to Point template in the transform window, is what made the 15,310,631-record, presumably multipoint, layer.

Well, everything gets made in the project. The question is whether it gets made in the project stored locally in Radian storage or gets made in the project stored in the external GDB.

I can't conclude it is created in local storage from the info above, because the description does not hang together: how a transform operates is not the way one knows whether something is linked or imported, exactly because of the operational transparency the Radian engine provides in such situations... that operational equivalency being a very good thing, not a bad thing.

My guess - and it is only a wild guess - is that you are having problems because everything is still in the GDB. So let's start by simplifying the situation:

0. Read the example topic. Actually download the Naperville GDB from ESRI's site and duplicate the example so you get a hands-on feel for it.

1. Open a new, empty project. Link the file GDB into your project. Expand the database cylinder.

2. Find the drawing you want within that database cylinder hierarchy. Highlight BOTH the drawing and the drawing's table. Press the Copy icon in the toolbar.

3. Close the database hierarchy, and click the mouse somewhere in the big, unoccupied white space in the project below the System Data. Click the Paste icon.

4. Wait a long time while it copies the drawing and table from the GDB and pastes them into the project. When the process is finally done you'll see a drawing and its table appear in the project underneath the System Data folder. As promised in the very first intro topics, imports can be slow but, mercifully, that's normally a one-time task: once you have data in a Radian-format .map, after that it is all fast.

5. Click on the GDB database cylinder to highlight the data source, and then press the Delete button on the toolbar. That deletes the data source. We do that to make sure there's no GDB around to sow confusion about what is linked and what is imported.

6. Are the drawing and table that you pasted into the project still there? Great! That means you successfully copied and pasted the drawing and its table out of the GDB and into the project.

7. Save the project. The first time you do this it will take a long time. After that it will be quick to open and much quicker to save.

8. Just to be sure you did all this right, close Manifold. Open it again and open the project. You should not see any database cylinders there, just the drawing and the drawing's table that you brought in from the GDB.

The above is a really convoluted way of going about this, but I've tried to construct it so that it will be impossible for you to leave what you are doing partially in the GDB. The point of doing that is a) to avoid a situation where you have to learn enough about the system to use things like linked tables from GDB correctly, and b) to let you get on with trying out Radian instead of doing nothing but testing GDB.
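For what it's worth, the copy can also be scripted. Here is a minimal sketch for the query pane, reusing the CREATE TABLE / INSERT INTO pattern from the auto-generated query earlier in the thread; the local table name [Rivers local] is made up, only two fields are shown for brevity, and a real copy would carry all the fields plus the FieldCoordSystem property:

-- Copy a linked drawing's records into local project storage.
CREATE TABLE [Rivers local] (
  [OBJECTID] INT32,
  [Shape] GEOM,
  [mfd_id] INT64,
  INDEX [mfd_id_x] BTREE ([mfd_id]),
  INDEX [Shape_x] RTREE ([Shape])
);
INSERT INTO [Rivers local] ([OBJECTID], [Shape])
SELECT [OBJECTID], [Shape]
FROM [Data Source]::[HD_1470009_1 Drawing]
THREADS SystemCpuCount();

Either way, the point is the same: once the data is local, everything downstream reads from Radian storage, not from GDB.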

My own guess, by the way, is that it wasn't Future that crashed but GDB, and that the GDB driver (ESRI's code, not ours) killed everything in which it was embedded. That's one reason we hate incorporating third-party drivers: when they crash, people blame us. But there's no real way to defend against kill shots from drivers when you have to let them in behind your defenses to get reasonable performance out of them.

Now, it's perfectly possible the Radian engine crashed doing what you were doing, and Engineering is always eager to find such cases. When crashes are extremely rare it is all the more exciting to find one, so that, too, can be immediately fixed. So Engineering gets excited, but when they hear "GDB" they know they probably won't get lucky.

Given what you write, it appears almost certain Engineering will be disappointed once again to discover that all your data was left in GDB, and so GDB was doing all the data storage. Considering the well-known proclivity of GDB to crash under stress, it seems more likely this case is GDB and not Radian. Luckily, it is very easy to remove GDB as a possibility by eliminating the use of GDB as a data store.

That will also help your objective of measuring Radian performance and not inadvertently measuring ESRI performance.

StanNWT
14 post(s)
#12-Oct-17 16:21

Hi Dimitri,

I wanted to reply after testing on both my home laptop and my work computer. Both had the same result, which is a crash: "manifold 9.0 has stopped working".

The dataset that I am using is the CanVec 1:50,000 rivers, stored in a single file geodatabase. I have all the feature classes from the CanVec data.

ftp://ftp.geogratis.gc.ca/pub/nrcan_rncan/vector/canvec/archive/canvec_archive_20130515/canada_fgdb

This data set is of all of Canada at 1:50,000.

The data is in separate zip files; each zip file is for a set of feature class types. Within each type there are points, polylines and polygons, indicated by a 0, 1 or 2 as the last character in the feature class name. The feature class that I'm using is HD_1470009_1.

I suppose you could just download the ftp://ftp.geogratis.gc.ca/pub/nrcan_rncan/vector/canvec/archive/canvec_archive_20130515/canada_fgdb\canvec_gdb_CA_HD.zip file, import just the hydrography feature class, HD_1470009_1, from the file geodatabase, and then run the test yourselves?

The newest version of the CanVec data is located here:

ftp://ftp.geogratis.gc.ca//pub/nrcan_rncan/vector/canvec/fgdb

The newest CanVec data is stored separately in folders named for the feature class types: Admin, Elevation, Hydro, Land, ManMade, Rest_MGT, Toponymy, Transport. However, the newest version isn't the version I am having issues with. Yes, I can run the test with the newest version of the rivers feature class to try to rule out some error with the layer itself. I can also try a Check Geometry or Repair Geometry within ArcGIS to see if there are any issues, as far as ArcGIS can determine, with the feature class.

The data on the GeoGratis ftp site is stored separately for each file geodatabase feature class, likely due to size issues. In my file geodatabase I have organized the data into feature datasets, but they load successfully into Manifold / Radian Studio.

As you suggested, I first copied the feature class (which comprises the table and drawing elements in Manifold) into the project, deleted the source file geodatabase reference from the project, and saved the project before trying the Decompose to Coordinates transform. I used Manifold Future 9.0.163.5 for the test on both machines. I launched Manifold 9.0.163.5 to run the test at 2:40 PM local time and the crash happened at 5:59 PM.

The Windows Event Viewer log info is below:

Faulting application name: manifold.exe, version: 9.0.163.5, time stamp: 0x59d504f7
Faulting module name: ext.dll, version: 9.0.163.5, time stamp: 0x59d504db
Exception code: 0x40000015
Fault offset: 0x00000000022c7536
Faulting process id: 0x2144
Faulting application start time: 0x01d342d0bd17fcfc
Faulting application path: \Downloaded_Software\Manifold\manifold-future-9.0.163.5-x64\bin64\manifold.exe
Faulting module path: \Downloaded_Software\Manifold\manifold-future-9.0.163.5-x64\bin64\ext.dll
Report Id: 482389d4-aee0-11e7-97c9-3417ebbe3cad

The detailed version of the Event Viewer log is below:

<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
  <System>
    <Provider Name="Application Error" />
    <EventID Qualifiers="0">1000</EventID>
    <Level>2</Level>
    <Task>100</Task>
    <Keywords>0x80000000000000</Keywords>
    <TimeCreated SystemTime="2017-10-11T23:59:57.000000000Z" />
    <EventRecordID>78020</EventRecordID>
    <Channel>Application</Channel>
    <Computer>IK63-SEM0120.corp.ds.gov.nt.ca</Computer>
    <Security />
  </System>
  <EventData>
    <Data>manifold.exe</Data>
    <Data>9.0.163.5</Data>
    <Data>59d504f7</Data>
    <Data>ext.dll</Data>
    <Data>9.0.163.5</Data>
    <Data>59d504db</Data>
    <Data>40000015</Data>
    <Data>00000000022c7536</Data>
    <Data>2144</Data>
    <Data>01d342d0bd17fcfc</Data>
    <Data>\Downloaded_Software\Manifold\manifold-future-9.0.163.5-x64\bin64\manifold.exe</Data>
    <Data>\Downloaded_Software\Manifold\manifold-future-9.0.163.5-x64\bin64\ext.dll</Data>
    <Data>482389d4-aee0-11e7-97c9-3417ebbe3cad</Data>
  </EventData>
</Event>

I had over 300 million points when I scanned through the details, but it fails when it starts to insert the records.

My workstation at work is configured as follows:

Dell Precision T3610, 96 GB DDR3 1866 MHz RAM, 1 TB 7200 RPM OS drive, and an OCZ RevoDrive 960, which is where my pagefile and TEMP/TMP folders are stored. I have Nvidia Quadro M4000 and P4000 graphics cards installed, with driver version 385.69. My CPU is an Intel Xeon E5-1620 v2 @ 3.70 GHz. I'm running Windows 7 Enterprise 64-bit. Patches and service packs, well, that is controlled by the corporate IT environment. I know it's not possible to duplicate my system, but I wanted to give some system specs.

As for the super-wide post I made earlier in the thread, that is a result of me copying and pasting directly from the Expression tab within the Transform dialog.

Dimitri


4,193 post(s)
#12-Oct-17 20:13

I know you're trying, but we have to take this step by step. No jumping ahead. Also, please don't volunteer info from the event log and the like. Please just take it step by step.

Here's what I mean:

The data on the GeoGratis ftp site is stored separately for each file geodatabase feature class, likely due to size issues. In my file geodatabase I have organized the data into feature datasets, but they load successfully into Manifold / Radian Studio.

Based on the above, I do not know exactly what you have in your project, and there is no way to download something from anywhere and get an exact match to what you have in your project. So we have to pin that down.

An ideal situation is to just look at the .map file. If you save the .map file, how big is it? If you then zip it, how big is the zip file? Do you have a server somewhere where you can put that file up for download? Is the query you are using in that file?

Some more questions...

As you suggested, I first copied the feature class (which comprises the table and drawing elements in Manifold) into the project, deleted the source file geodatabase reference from the project, and saved the project before trying the Decompose to Coordinates transform. I used Manifold Future 9.0.163.5 for the test on both machines. I launched Manifold 9.0.163.5 to run the test at 2:40 PM local time and the crash happened at 5:59 PM.

OK. It becomes much more interesting if there is no connection to GDB. Maybe Engineering will get lucky! :-) But still, when you say "run the test" I don't know the exact test you are running, or how that relates to what, exactly, the tables, schemas, etc. in the project are. The best way to deal with this is to take a look at the actual project you are running and the actual query. We need that level of detail to make progress.

Dell Precision T3610, 96 GB DDR3 1866 MHz RAM, 1 TB 7200 RPM OS drive, and an OCZ RevoDrive 960, which is where my pagefile and TEMP/TMP folders are stored.

I presume an OCZ RevoDrive 960 is some sort of solid-state disk, but I have no idea how big it is. You haven't mentioned how much free space you have on the various disks. All of that is important to know.

When you get a crash, by the way, don't do anything else until you first create a dump file. That's the important thing.

StanNWT
14 post(s)
#16-Oct-17 19:03

Hi Dimitri,

The .map file is 12,613,248 KB; using 7-Zip with ultra compression, it's 3,240,485 KB.

There is only one drawing and its related attribute table in the .map file. I do not have a server at work where I can make a directory available for file upload. My organization uses secure file transfer, where a single recipient's email would have to be on the other end, and I have a 2 GB file limit per day. I could break the file up into equal parts using 7-Zip so each is under 2 GB and you could download each part, but that requires a recipient email address. I don't personally have a Dropbox account to host the file for download. If you like, I can download that geodatabase with the single feature class from NRCan, the one referenced on the ftp site, connect it as a source, copy the drawing and table in, and try to redo it so that it's reproducible without me sending you my map file, if that might help? But I know you want my map file. I have done this with both my work computer and my laptop at home, with very different hardware, OS and installed software environments.

My work computer has:

The OCZ RevoDrive is a PCI-Express SSD, 960 GB in size (really 894 GB), with 682 GB currently free. My page file is 147,372 MB in size, recommended automatically by Windows, on the RevoDrive. I have 96 GB of RAM installed. The storage RAID where I have the .map file is a Drobo 5D with a total capacity of 21.6 TB and 18.76 TB of free space. The original geodatabase is stored on a different Drobo 5D with a capacity of 10.49 TB and 7.76 TB of free space. My C drive is a 1 TB 7200 RPM Seagate HD with 663 GB free.

My laptop at home has an mSATA cache drive of 32 GB used by the Dell UEFI as an acceleration drive, a 1 TB 5400 RPM hard drive, 16 GB RAM, and a 2 GB page file on either the C drive or that mSATA cache, I'm not sure which. There's about 700 GB free on the laptop's hard drive.

Both computers have Manifold 9.0.163.5 crash after it starts trying to insert records. I haven't been able to record the record count at which the crash happens, because on both machines I wasn't able to be in front of the computer for the 3 hours following the scanning-records phase. I believe on my laptop the scanning phase went upwards of 350 million records. It seems odd that this operation hits a limit at so small a size, compared to the forum post I read about reading 1.72 billion LiDAR points. But yes, that is a totally different operation.

How would you suggest I proceed: sending the whole 7-Zip file, or portions at a time?

StanNWT
14 post(s)
#20-Oct-17 15:33

Dimitri,

Can I post Dropbox links in here so that you can download the .7z file? I've not used Dropbox yet in my personal life, as I've not needed to transfer large quantities of data, and I'm not really a fan of cloud storage for my personal info. The organizations I've worked for have never used Dropbox in their work environments. I've used iOmega cloud storage, but that was 6+ years ago.

Dimitri


4,193 post(s)
#20-Oct-17 19:07

Sorry for disappearing on you... there's a huge new build coming out tomorrow and I've been reviewing some of the internal builds leading up to it. I just haven't had the opportunity to pursue this as I would like.

Here's an idea: 3 GB is not a lot to FTP. Can you connect to an FTP site? If so, drop a note to tech support saying "Hi, I can reproduce a crash using this .map file and the following step-by-step procedure [insert here the step by step procedure]... can I FTP the example file to you?"

They'll issue you an FTP account with instructions, so you can upload it to them. They really should be handling this anyway, rather than us trying to do it cart-before-the-horse style here in the forum.
