I've created a simple webservice with Spring Boot 1.3.2 (with dependencies on spring-boot-starter-data-rest and spring-boot-starter-data-mongodb) to read/write MongoDB documents described by the following class (getters/setters omitted):
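(For illustration only — the class and field names below are placeholders, not my actual code; the essential shape is just an id plus a free-form `data` map:)

```java
// Minimal sketch of such an entity. The @Document/@Id annotations from
// Spring Data MongoDB are omitted so this compiles standalone.
import java.util.Map;

public class MyDocument {
    private String id;
    private Map<String, Object> data; // free-form payload of variable depth

    public String getId() { return id; }
    public void setId(String id) { this.id = id; }
    public Map<String, Object> getData() { return data; }
    public void setData(Map<String, Object> data) { this.data = data; }
}
```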
Persisting a new object in the database works as expected. By POSTing:
I get the following response (_links omitted for clarity):
The purpose of the data field is to store data of variable complexity and depth.
Let's keep it simple for now... PATCHing with:
Further PATCHing with:
Question one: is this result correct? I would have expected the following instead:
Now, for the funny part...
As a foreword, let me just say that I know that PATCH is unsafe and non-idempotent and should be used with caution.
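For reference, my understanding is that a PATCH body sent as `application/json` is applied with JSON merge-patch semantics (RFC 7386): object values merge recursively, an explicit null deletes a key, and any non-object value replaces the target outright. A rough sketch of those semantics over plain maps (the `MergePatch` helper is my own illustration, not a Spring API):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class MergePatch {
    // RFC 7386 merge-patch over plain maps: objects merge recursively,
    // a null value deletes the key, anything else replaces the target.
    @SuppressWarnings("unchecked")
    public static Object apply(Object target, Object patch) {
        if (!(patch instanceof Map)) {
            return patch; // scalars (and arrays) replace the target wholesale
        }
        Map<String, Object> merged = (target instanceof Map)
                ? new LinkedHashMap<>((Map<String, Object>) target)
                : new LinkedHashMap<>();
        for (Map.Entry<String, Object> e : ((Map<String, Object>) patch).entrySet()) {
            if (e.getValue() == null) {
                merged.remove(e.getKey());
            } else {
                merged.put(e.getKey(), apply(merged.get(e.getKey()), e.getValue()));
            }
        }
        return merged;
    }
}
```

Under these rules, patching a scalar `d` with a subdocument should simply replace the scalar, and keys absent from the patch should be left untouched — which is what I expected throughout the examples below.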
Let's say I have changed my mind about d, and I want to store a subdocument in it:
This throws an exception:
There is a workaround: by PATCHing data with a only, then re-sending the PATCH above, I can get what I want, i.e.:
This seems a bit awkward to me, since the exception above sometimes gets thrown when I re-add d as a subdocument, even if I PATCHed it away beforehand!
Let's play a bit more (sorry for the long post, btw). The current data being:
With a new PATCH that goes:
Consistent with what happened before, I would have expected the following response:
Turns out I get:
Question two: is this result correct?
If this is the expected behaviour, it seems inconsistent with the result that led me to Question One.
Now for Question Three...
The current data being:
PATCHing an update for a and d:
Here's what I get:
The payload for d just went down the toilet.
If I re-send the PATCH, it will update d correctly.
And if I send it again, or even if I change the contents of d once more, d gets flushed down again, and so on.
Question Three: what is happening?
I have been scratching my head for a few hours and I really don't have a clue about this strange behaviour (and I am confident there are many more unexpected things that could happen).
The only thing that is clear right now is that sending PATCH requests should be avoided when dealing with data that is effectively schema-less, until the results become predictable.