Live blogging: Sam Fletcher

Perhaps the requirement that an embedding exist is too strong.  There are many small causal sets that cannot be embedded into a relativistic spacetime.  Small perturbations of a set should not change whether we count it as embeddable, but this will not hold if we require exact embeddings.  So we look instead for embeddings of a coarse-grained causal set, where a coarse-graining is a causal set that could have arisen with high probability from a Bernoulli deletion.
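A minimal sketch of the Bernoulli deletion just described, assuming the causal set is given as a list of elements together with its (transitively closed) causal relation; the function and variable names are illustrative, not from the talk:

```python
import random

def bernoulli_coarse_grain(elements, precedes, p, seed=None):
    """Coarse-grain a causal set by Bernoulli deletion: keep each
    element independently with probability p, then restrict the
    causal order to the surviving elements.

    elements: list of element labels.
    precedes: set of pairs (x, y) meaning x causally precedes y,
              assumed transitively closed.
    """
    rng = random.Random(seed)
    kept = [e for e in elements if rng.random() < p]
    kept_set = set(kept)
    # The induced order on survivors is just the restriction of the
    # original relation; transitivity is inherited automatically.
    induced = {(x, y) for (x, y) in precedes
               if x in kept_set and y in kept_set}
    return kept, induced

# Example: a 4-element "diamond" causal set with a < b, c and b, c < d.
elements = ["a", "b", "c", "d"]
order = {("a", "b"), ("a", "c"), ("b", "d"), ("c", "d"),
         ("a", "d")}  # transitive closure included
kept, induced = bernoulli_coarse_grain(elements, order, p=0.5, seed=0)
```

A coarse-graining of the original set, in the sense above, is then any causal set that such a deletion produces with high probability.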

It’s not clear that deletion can be understood as a sort of “averaging” of a causal set.  One doesn’t normally do statistical mechanics by ignoring some atoms entirely; rather we pay attention to larger-scale variables.

A better option may be to use a Bernoulli process to identify adjacent elements of the set with one another — merging them rather than deleting them — but this is not easy.

Deleting too many points makes the task of embedding too easy.  So the deletion density should be set by the observables we’re interested in, and by how many points can be deleted while those observables are still approximated accurately.

