
WIP warm starts for inference#35

Open
amueller wants to merge 1 commit into pystruct:master from amueller:caching_inference
Conversation

@amueller
Member

This is a draft of how to support caching inference.
It does something, but I'm not sure it helps; it also doesn't support multi-processing yet.
It is basically opengm-only, and it only does something sensible for inference algorithms that can be warm-started from an initial labeling. Those are the sampling and move-making based methods, I believe (try 'lf', 'fm', 'icm').
I couldn't see much gain so far, but maybe there is still a bug.
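To illustrate the idea: a minimal, self-contained sketch of warm-started inference, assuming nothing from the actual patch. The `icm_inference` toy below (plain ICM on a chain, not opengm's implementation) and the `WarmStartCache` class are hypothetical names for illustration; the point is just that the labeling from the previous learner iteration is cached per sample and passed as `init`, so a move-making method starts near a good solution instead of from scratch.

```python
import numpy as np

def icm_inference(unary, pairwise, init=None, max_iter=50):
    """Toy iterated-conditional-modes on a chain graph.

    `init` allows warm starting from a cached labeling; without it we
    fall back to the independent per-node argmax.
    """
    n_nodes, n_labels = unary.shape
    y = init.copy() if init is not None else unary.argmax(axis=1)
    for _ in range(max_iter):
        changed = False
        for i in range(n_nodes):
            # Score each label for node i given its fixed neighbors.
            scores = unary[i].copy()
            if i > 0:
                scores += pairwise[y[i - 1], :]
            if i < n_nodes - 1:
                scores += pairwise[:, y[i + 1]]
            best = scores.argmax()
            if best != y[i]:
                y[i] = best
                changed = True
        if not changed:  # local optimum reached
            break
    return y

class WarmStartCache:
    """Cache the last labeling per training sample and reuse it as init."""

    def __init__(self):
        self._cache = {}

    def infer(self, sample_id, unary, pairwise):
        init = self._cache.get(sample_id)  # None on the first call
        y = icm_inference(unary, pairwise, init=init)
        self._cache[sample_id] = y
        return y
```

On the second call for the same sample the cached labeling is already a local optimum, so ICM converges immediately; that is the hoped-for saving when potentials change only slightly between learner iterations.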

