This repository has been archived by the owner on Jul 24, 2021. It is now read-only.

MaxTilesetComplexity is not adequate enough to limit big jobs #2441

Closed · openstreetmap-trac opened this issue Jul 23, 2021 · 1 comment


Reporter: woidrick
[Submitted to the original trac issue database at 10.00am, Thursday, 12th November 2009]

I understand MaxTilesetComplexity as a measure of how large, in megabytes, the generated tileset will be.
The amount of RAM used for rendering would be a more adequate parameter.

Example: tileset 1025 1694 12 has a complexity of about 4000000, but when the maplint layer is rendered it eats more than 20 GB of RAM because of the many error notifications rendered for this tileset.

It may not be easy to calculate the amount of RAM that will be used, but the size of the SVG file that is fed to Inkscape can be checked before running Inkscape.
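As a rough illustration of that pre-check, something like the sketch below could work. This is a minimal sketch only: the size limit, the function name, and the use of Python rather than the client's actual language are all assumptions, and the Inkscape command line shown is the 1.x form.

```python
import os
import subprocess
import sys

# Hypothetical limit on the intermediate SVG, playing a role analogous
# to MaxTilesetComplexity; a real client would make this configurable.
MAX_SVG_BYTES = 500 * 1024 * 1024  # 500 MB

def render_tileset(svg_path: str, png_path: str) -> None:
    """Refuse to launch Inkscape if the intermediate SVG is too large."""
    svg_size = os.path.getsize(svg_path)
    if svg_size > MAX_SVG_BYTES:
        sys.exit(f"{svg_path} is {svg_size} bytes; refusing to render")
    # Inkscape 1.x CLI; older 0.x releases used --export-png= instead.
    subprocess.run(
        ["inkscape", svg_path, "--export-type=png",
         f"--export-filename={png_path}"],
        check=True,
    )
```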

So, an additional complexity limit variable should be used.

Alternatively, the many similar labels could be rendered in a less resource-hungry manner.

@openstreetmap-trac
Copy link
Author

Author: spaetz
[Added to the original trac issue at 11.54am, Tuesday, 29th June 2010]

The size of the SVG and the size of the PNG files correlate pretty well, so there is little value in checking the size of the .svg file. In addition, the server never sees the .svg files, so the clients would have to transfer that information and the server would have to store it in some database (there is no per-tileset database now, and the tileset file size is easy to get at).
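In other words, the only complexity signal the server can check cheaply is the size of the tileset file it already has on disk. A minimal sketch of what such a server-side check amounts to (the function name and threshold are hypothetical, and Python is used only for illustration):

```python
import os

# Hypothetical threshold, playing the role of MaxTilesetComplexity.
MAX_TILESET_BYTES = 4_000_000

def tileset_too_complex(tileset_path: str) -> bool:
    # The server already has the tileset file on disk, so its size is
    # easy to get at; it never sees the client-side .svg.
    return os.path.getsize(tileset_path) > MAX_TILESET_BYTES
```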

So this is a wontfix from the server side. I agree that the RAM usage by Inkscape would be the best measurement, but that is information that the client cannot get at.

It used to be that the RAM usage of the regular tilesets and the maplint layer deviated a lot. That has become a lot better now that the maplint layer is gone.
