Commit 565bed38 authored by Davis King

Made it so you can deserialize bn_ objects into affine_ objects.

parent c7813163
@@ -11,6 +11,7 @@
#include "../rand.h"
#include "../string.h"
#include "tensor_tools.h"
#include "../vectorstream.h"
namespace dlib
@@ -884,6 +885,17 @@ namespace dlib
        {
            std::string version;
            deserialize(version, in);
            if (version == "bn_")
            {
                // Since we can build an affine_ from a bn_ we check if that's what is in
                // the stream and if so then just convert it right here.
                unserialize sin(version, in);
                bn_ temp;
                deserialize(temp, sin);
                item = temp;
                return;
            }
            if (version != "affine_")
                throw serialization_error("Unexpected version found while deserializing dlib::affine_.");
            deserialize(item.params, in);
......
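As a usage sketch (not part of the commit), this change means a bn_ layer that was serialized on its own can now be read back directly into an affine_ object. The file name and standalone layer objects below are illustrative only, and the sketch assumes the dlib headers and the non-templated bn_/affine_ classes as they existed at the time of this commit.

#include <dlib/dnn.h>
#include <fstream>

int main()
{
    using namespace dlib;

    // Serialize a batch normalization layer to disk (hypothetical file name).
    bn_ src;
    std::ofstream fout("bn_layer.dat", std::ios::binary);
    serialize(src, fout);
    fout.close();

    // Deserialize it straight into an affine_ layer.  The deserialize()
    // routine shown above sees the "bn_" version tag, reads the bn_ object,
    // and converts it via affine_'s constructor from bn_.
    affine_ dst;
    std::ifstream fin("bn_layer.dat", std::ios::binary);
    deserialize(dst, fin);
}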
@@ -154,6 +154,11 @@ namespace dlib
This kind of pattern is useful if you want to use one type of layer
during training but a different type of layer during testing since it
allows you to easily convert between related deep neural network types.
Additionally, if you provide a constructor to build a layer from another
layer type, you should also write your layer's deserialize() routine such
that it can read that other layer's serialized data in addition to your
own serialized data.
!*/
template <typename SUBNET>
......
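The paragraph added above describes a pattern; here is a hypothetical sketch of what it looks like for a user-defined layer, mirroring the affine_/bn_ code in this commit. The names my_layer_ and other_layer_ are made up for illustration and are not part of dlib; the sketch assumes my_layer_ has a constructor taking an other_layer_.

void deserialize(my_layer_& item, std::istream& in)
{
    std::string version;
    deserialize(version, in);
    if (version == "other_layer_")
    {
        // The stream holds an other_layer_.  Put the already-read version
        // string back in front of the stream with dlib::unserialize, read
        // the other layer, then convert it with my_layer_'s constructor.
        unserialize sin(version, in);
        other_layer_ temp;
        deserialize(temp, sin);
        item = my_layer_(temp);
        return;
    }
    if (version != "my_layer_")
        throw serialization_error("Unexpected version found while deserializing my_layer_.");
    // ... read my_layer_'s own serialized state here ...
}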