writers.arrow

The Arrow Writer supports writing Apache Arrow Feather and Parquet files.

Dynamic Plugin

This stage requires a dynamic plugin to operate
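This means PDAL must be built with Arrow support and the writers.arrow plugin library must be discoverable at runtime, typically on PDAL's plugin search path (the PDAL_DRIVER_PATH environment variable can be used to point at a non-default plugin directory).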

Streamable Stage

This stage supports streaming operations
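When every stage in a pipeline is streamable, PDAL can process points in fixed-size chunks rather than loading the entire point cloud into memory, which keeps memory use bounded for large inputs.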

Example

Write a Feather file:

[
    {
        "type":"readers.las",
        "filename":"inputfile.las"
    },
    {
        "type":"writers.arrow",
        "format":"feather",
        "filename":"outputfile.feather"
    }
]

Write a GeoParquet file:

[
    {
        "type":"readers.las",
        "filename":"inputfile.las"
    },
    {
        "type":"writers.arrow",
        "format":"parquet",
        "geoparquet":"true",
        "filename":"outputfile.parquet"
    }
]
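
Save either pipeline to a file (for example, pipeline.json) and run it with the pdal pipeline command: pdal pipeline pipeline.json.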

Options

batch_size

Number of rows to write as a batch [Default: 65536*64 (4194304)]

filename

Output file to write [Required]

format

File type to write (feather, parquet) [Default: "feather"]

geoarrow_dimension_name

Dimension (column) name in which the GeoArrow struct is written [Default: xyz]

geoparquet

Write a WKB geometry column and GeoParquet metadata when writing Parquet output

write_pipeline_metadata

Write the PDAL pipeline metadata into the PDAL:pipeline:metadata entry of the column named by geoarrow_dimension_name

where

An expression that limits points passed to a writer. Points that don't pass the expression skip the stage but are available to subsequent stages in a pipeline (see the example at the end of this section). [Default: no filtering]

where_merge

A strategy for merging points skipped by a ‘where’ option when running in standard mode. If true, the skipped points are added to the first point view returned by the skipped filter. If false, skipped points are placed in their own point view. If auto, skipped points are merged into the returned point view provided that only one point view is returned and it has the same point count as it did when the writer was run. [Default: auto]
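
The sketch below combines the where and batch_size options to write only ground-classified points to GeoParquet in smaller record batches; the file names and the classification expression are illustrative, not part of the stage's defaults:

[
    {
        "type":"readers.las",
        "filename":"inputfile.las"
    },
    {
        "type":"writers.arrow",
        "format":"parquet",
        "geoparquet":"true",
        "batch_size":"65536",
        "where":"Classification == 2",
        "filename":"ground.parquet"
    }
]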