JSON deserialization requires elements to be in order

Hi @Daumantas and team,

I serialize a textual v2 model into JSON and store it in a v2 API-compliant repository (OpenMBEE flexo).
When I retrieve the model as JSON from that repository, due to the nature of those technologies,
the JSON elements are no longer in the order in which the model was serialized.

The deserialization seems to require the elements to be in the order in which they are defined, without forward references, and that cannot be guaranteed, because we want to be able to commit incremental and partial models to those repositories. It’s not an all-or-nothing commit or retrieve.

When the JSON elements are out of order, the deserialization fails for certain models.
This is extremely limiting for integration and scalability.

This is the minimal model to reproduce the issue:

part m0001_2N {
    part nx0001 {
        port scp_outside2;
    }
    part tcs0001 {
        port scp;
    }
    interface tcs0001.scp to nx0001.scp_outside2;
}

Attached (but for some reason I’m not allowed to upload as a “new user”) are the JSON created with syside (raw.json) and the one retrieved from the repository (flexo.json).
They contain the same elements, but the order is different. We already handle the root namespace, which syside expects to be the first element when deserializing. But that’s not enough.

When I remove the line with the interface, the deserialization works just fine; with it, the JSON elements for the interface happen to reference elements that appear later in the JSON representation.
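
For illustration, something along these lines reproduces the failure without going through the repository (just a sketch: the shuffle stands in for whatever reordering flexo applies, the first element is kept in place because we already keep the root namespace first, and it assumes the serialized form is a flat JSON array of elements):

import json
import random

import syside

SRC = """\
part m0001_2N {
    part nx0001 {
        port scp_outside2;
    }
    part tcs0001 {
        port scp;
    }
    interface tcs0001.scp to nx0001.scp_outside2;
}
"""

model, _ = syside.load_model(sysml_source=SRC)

with model.user_docs[0].lock() as doc:
    root_node = doc.root_node

serialized = syside.json.dumps(
    root_node,
    syside.SerializationOptions.minimal(),
    include_cross_ref_uris=False,
)

# Shuffle everything except the first element (the root namespace),
# standing in for the reordering the repository applies.
elements = json.loads(serialized)
rest = elements[1:]
random.shuffle(rest)
shuffled = json.dumps([elements[0]] + rest)

# With the interface in the model, this fails for some orderings.
des, _ = syside.json.loads(shuffled, "memory:///test.sysml")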

I hope this is an easy fix, as it is critical for integrating syside into a standard v2 API environment.
One approach could be to make 2+ passes when parsing. It could be an optional argument.
In another step, it could become incremental in general. For larger models, we’ll have to use pagination and would want to deserialize incrementally, adding pages of JSON one by one until the model is complete.

Thank you for your help and support.
Robert

cc @rubayet

Hi @Robert_Karban,

One approach could be to make 2+ passes when parsing

This is already the case with JSON deserialization: there is a pre-pass that populates a map of (element_id, Element) for resolving local references, including owned elements. For example, this works and prints an empty diff:

import difflib
import warnings
import syside

SRC = """\
part m0001_2N {
    interface tcs0001.scp to nx0001.scp_outside2;
    part nx0001 {
        port scp_outside2;
    }
    part tcs0001 {
        port scp;
    }
}
"""

model, _ = syside.load_model(sysml_source=SRC)

with model.user_docs[0].lock() as doc:
    root_node = doc.root_node

string = syside.json.dumps(
    root_node,
    syside.SerializationOptions.minimal().with_options(
        use_standard_names=True,
        include_derived=True,
        include_redefined=False,
        include_default=False,
        include_optional=False,
        include_implied=True,
    ),
    include_cross_ref_uris=False,
)

input = syside.sexp(root_node, print_references=True)

# disable warnings due to missing references without `@uri`
with warnings.catch_warnings(category=syside.json.SerdeWarning, action="ignore"):
    des, _ = syside.json.loads(string, "memory:///test.sysml")

map = syside.IdMap()
for mutex in syside.Environment.get_default().documents:
    with mutex.lock() as dep:
        map.insert_or_assign(dep)

report, success = des.link(map)
assert success, str(report.messages)

output = syside.sexp(des.root, print_references=True)

print(
    "Diff:\n",
    "\n".join(
        difflib.unified_diff(
            input.splitlines(), output.splitlines(), "input.txt", "output.txt"
        )
    ),
)

for some reason I’m not allowed to upload as “new user”

I will notify the team about it; hopefully it can be resolved.

would want to deserialize incrementally, adding pages of json, one by one until the model is complete.

A streaming interface is a good feature request. However, it will take some time, as it also needs to work with the internal model invariant which ensures that all relationships except Dependencies have 2 related elements, and that owned related elements cannot be replaced with placeholders.

Thank you.

I think there’s a misunderstanding or I’m missing something.

Yes, the parsing of the textual model works in any order of declaration.

I’m talking about the JSON representation.

In your example, the JSON that is deserialized is the same as the one that was serialized.

In the case I describe, the order of the JSON elements to deserialize is random, although the elements are the same as in the result of the serialization.

I need to upload the example json files. I think that will clarify.

Hi Robert, you should be able to embed files into your posts now

The order beyond the first element should not matter. Only the first mapping pass inserts elements in the order they appear in the JSON, and that order does not matter unless there are duplicate IDs. The second deserialization pass starts from the root and continues in some order of owned elements/relationships until the tree of all owned elements has been deserialized.
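
Roughly, the shape of those two passes over a flat element array is as follows (a simplified sketch for illustration, not the actual implementation; it assumes spec-style fields like @id, ownedRelationship and ownedRelatedElement holding {"@id": ...} references):

# Generic sketch of a two-pass deserializer (illustration only).
def deserialize(elements: list[dict]) -> dict:
    # Pass 1: index every raw element by its @id. The array order is
    # irrelevant here, except that duplicate IDs would clash.
    by_id = {e["@id"]: e for e in elements}

    # Pass 2: walk ownership starting from the root (first element) and
    # resolve references through the index, so forward references in the
    # array do not matter.
    built: dict[str, dict] = {}

    def build(raw: dict) -> dict:
        eid = raw["@id"]
        if eid in built:
            return built[eid]
        node = {"@id": eid, "@type": raw["@type"], "owned": []}
        built[eid] = node
        for key in ("ownedRelationship", "ownedRelatedElement"):
            for ref in raw.get(key, []):
                node["owned"].append(build(by_id[ref["@id"]]))
        return node

    return build(elements[0])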

Hi @Zygimantas, file uploading works now, but the json extension is blocked. I will send them in an email to you!

Ok, sent the files to @Zygimantas. @Daumantas, I didn’t have your email.

flexo.json.txt (425.2 KB)

raw.json.txt (388.1 KB)

here are the two files

raw is what syside serializes

flexo is what flexo returns - out of order

Thanks, I can reproduce the issue with both flexo.json (after moving the Namespace to the front) and raw.json. The issue seems to be that the ReferenceSubsetting owned related element is deserialized as an unowned element, which ends its deserialization early, leaving it as an orphan element that does not work with the printer.

I will also see if I can figure out a good way to infer or set the root element without having to modify the input JSON.

Great!

Yea the orphan is a blocking issue.

Don’t worry too much about the root namespace. We already handle that. It’s easy.

Ownership deserialization is fixed for a future release (likely v0.8.2).

Also added simple inference for the root element: either a Namespace (not a subtype) without owning relationships, or the last ancestor of the first element in the array, found by following owning namespaces, owning related elements, or owning relationships.
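
For reference, over the raw element array that inference looks roughly like this (a simplified sketch; the actual code works on internal structures, but the owning* reference fields follow the specification schema):

# Sketch of root-element inference over the raw JSON array (illustration only).
def infer_root(elements: list[dict]) -> dict:
    by_id = {e["@id"]: e for e in elements}

    # Candidate 1: a plain Namespace (not a subtype) with no owning relationship.
    for e in elements:
        if e.get("@type") == "Namespace" and not e.get("owningRelationship"):
            return e

    # Candidate 2: start from the first element in the array and climb the
    # owning references until there is nothing left to climb.
    current = elements[0]
    while True:
        owner = (
            current.get("owningNamespace")
            or current.get("owningRelatedElement")
            or current.get("owningRelationship")
        )
        if not owner:
            return current
        current = by_id[owner["@id"]]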

Excellent @Daumantas !

What’s the ETA for 0.8.2?

Is there any workaround meanwhile?

Looks like it will be in 0.8.1 instead. A temporary workaround is to place ownedRelatedElement last in relationships that can own their sources/targets.
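
Applied to the JSON before loading, the workaround could look roughly like this (a sketch only; it assumes the reordering refers to key order inside each element object, and relies on Python dicts and json.dumps preserving insertion order):

import json

def owned_related_element_last(elements: list[dict]) -> str:
    """Re-emit the element array with 'ownedRelatedElement' as the last key
    of every object that has it (sketch only)."""
    for element in elements:
        owned = element.pop("ownedRelatedElement", None)
        if owned is not None:
            # Re-inserting the key moves it to the end of the object.
            element["ownedRelatedElement"] = owned
    return json.dumps(elements)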

Great, @Daumantas
I experimented a bit further and found something interesting.
By accident, I added the “identity” field to the element itself instead of as a sibling to the payload in the commit to the API, and now the deserialization works.
For example:
{
    "@id": "006402a9-738b-457f-adbc-1fe4a59c5f9a",
    "@type": "TypeFeaturing",
    "elementId": "006402a9-738b-457f-adbc-1fe4a59c5f9a",
    "featureOfType": {
        "@id": "07a60df1-ba11-4c47-964b-7a3058410870"
    },
    "featuringType": {
        "@id": "1e4b5240-50e0-5ab5-a454-7416095f5d16"
    },
    "identity": {
        "@id": "006402a9-738b-457f-adbc-1fe4a59c5f9a"
    },
    "isImplied": true,
    "isImpliedIncluded": true,
    "isLibraryElement": false,
    […]

Instead, if I commit as requested by the API spec, with the identity as a sibling for all payloads, so that it does NOT appear in the JSON to be deserialized, then the deserialization fails:
[{'payload': {'@type': 'Namespace',
   '@id': 'a5d12121-3e4a-4aee-95b5-e05e93c426ef',
   'elementId': 'a5d12121-3e4a-4aee-95b5-e05e93c426ef',
   'qualifiedName': '2025-10-07T20:02:43Z',
   'isImpliedIncluded': True,
   'isLibraryElement': False,
   […]
   'identity': {'@id': 'a5d12121-3e4a-4aee-95b5-e05e93c426ef'}},
 {'payload'
 […]

If I don’t include identity at all, the deserialization fails too.

I don’t really understand what’s going on here but maybe you do?
Thanks,

Robert

I am afraid I cannot help with that. Syside works with the specification-defined JSON schema, which does not include identity or payload.
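
In general, any API-level wrapping has to be stripped before the JSON reaches syside. A rough sketch, assuming the repository returns change records with the element body under payload and identity as bookkeeping (not part of syside, just pre-processing on your side):

def to_schema_elements(changes: list[dict]) -> list[dict]:
    """Extract plain schema elements from API change records (sketch only)."""
    elements = []
    for change in changes:
        # Take the element body from 'payload' if present; otherwise assume
        # the record itself is already the element.
        element = dict(change.get("payload", change))
        # Drop API bookkeeping that is not part of the KerML/SysML JSON schema.
        element.pop("identity", None)
        elements.append(element)
    return elements

The resulting list can then be serialized with json.dumps and passed to syside.json.loads, with the root Namespace moved to the front (or inferred, once the inference lands).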

Yea. I understand.

It’s specified by the v2 API specification.

The curious thing is that if identity is there, the deserialization works.

But if you don’t use it then idk. Hm. Thank you

v0.8.1 has been released