I'm thinking about this in relation to importing external data sets
and possibly automating keeping the data up to date within Wikidata.
To use a practical example, the Office for National Statistics (ONS)
in the UK produces many of the official datasets about the UK,
including population, income levels, the consumer price index, the
unemployment rate, etc.; there are equivalent organisations in many
countries. The ONS produces a new large data set every week, much
of which (at least the headline figures) would be very useful to have on Wikidata. It seems unrealistic to keep all this data up to date by hand given the amount of data, the size of the community and the different skill levels and interests within it.
If someone (either within Wikidata or the ONS) set up a bot or other system to keep these figures up to date, it seems sensible to have some safeguard in place. If someone then wanted to change, merge, or delete an item or set of items fed by this external data set (whether manually or using a bot), they should at least be made aware that doing so would break the link keeping the values up to date.
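A minimal sketch of what such a safeguard might look like, assuming a hypothetical marker property (here called `P9999`, "maintained by external data feed") recorded on each bot-fed item in the standard Wikidata claim JSON structure — the property ID and the helper names are illustrative, not anything that exists today:

```python
# Sketch of a pre-edit safeguard: before merging or deleting an item,
# check whether it is marked as maintained by an external feed and
# warn the editor. The property ID "P9999" is hypothetical, used here
# only for illustration.

FEED_PROPERTY = "P9999"  # hypothetical "maintained by external data feed"

def feed_sources(item_claims: dict) -> list:
    """Return the external feeds (as value strings) that maintain this item.

    `item_claims` follows the Wikidata JSON claim layout:
    {property_id: [{"mainsnak": {"datavalue": {"value": ...}}}, ...]}
    """
    sources = []
    for claim in item_claims.get(FEED_PROPERTY, []):
        datavalue = claim.get("mainsnak", {}).get("datavalue", {})
        value = datavalue.get("value")
        if value:
            sources.append(str(value))
    return sources

def check_before_destructive_edit(item_claims: dict, action: str) -> str:
    """Return a warning if the action would break a live external feed."""
    sources = feed_sources(item_claims)
    if sources:
        return ("Warning: this item is kept up to date by "
                + ", ".join(sources)
                + "; a " + action + " would break that link.")
    return "OK to " + action + ": no external feed found."

# Example: an item whose figures are fed by an ONS weekly release.
claims = {
    "P9999": [{"mainsnak": {"datavalue": {"value": "ONS weekly release"}}}],
}
print(check_before_destructive_edit(claims, "merge"))
```

In practice this check would run inside the merge/delete workflow itself (or as an edit filter), so that the warning appears before the destructive edit is saved rather than after the feed has already broken.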