
RE: Is the universe a finite state machine?

in #universe, 6 years ago

Good questions, easy answers.

ONE: time is change of entropy, nothing less, nothing more. The other word for it is action. Cf. de Broglie's hidden thermodynamics.

TWO: of course they can all be predicted; you simply need infinite resources of spacetime and mass-energy to do it, i.e. a bigger universe at hand in which to run all the future states of your target universe. Cf. Claude Shannon's work in the 1940s.

THREE: which base of determinism are you asking about? If you are asking about unitary probability, then yes, I reckon so, because non-unitary probabilistics (with non-binary codes) must be renderable / compressible to a unitary scheme (where the probabilities add up to 1, i.e. 100%).
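To make TWO concrete, here is a minimal sketch (the transition rule is a toy of my own, not anything from Shannon): if the target universe is a deterministic finite state machine, a simulator with memory for all its reachable states can enumerate its entire future, which by the pigeonhole principle must eventually cycle.

```python
def run_to_cycle(step, state):
    """Enumerate the states of a deterministic FSM until the first repeat.

    Returns (transient, cycle). Needs memory proportional to the number of
    reachable states -- the 'bigger universe' the comment above alludes to.
    """
    seen = {}        # state -> index of first appearance
    trajectory = []
    while state not in seen:
        seen[state] = len(trajectory)
        trajectory.append(state)
        state = step(state)
    first = seen[state]
    return trajectory[:first], trajectory[first:]

# Toy 3-bit universe: next state = (5*s + 3) mod 8.
transient, cycle = run_to_cycle(lambda s: (5 * s + 3) % 8, 0)
print("transient:", transient)  # []
print("cycle:", cycle)          # [0, 3, 2, 5, 4, 7, 6, 1] -- the whole future
```

The catch, of course, is that the simulator needs strictly more states than the machine it predicts, which is exactly the "infinite resources" caveat above.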


Time is change of entropy? So if you create order you go backwards in time? Doesn't make sense to me...

You can predict everything? Please tell me: how will you measure the current state at the subatomic level?

Entropy is not disorder; see http://www.science20.com/train_thought/blog/entropy_not_disorder-75081

On the relation between order and entropy, see http://www.informationphilosopher.com/solutions/scientists/layzer/

It is not even necessary to start from actually low entropy; see http://www.science20.com/the_hammock_physicist/immortal_unbounded_universe-134704

The current state at the subatomic level you can't measure, currently; see https://en.wikipedia.org/wiki/Mutual_information

BUT information is indestructible, so you could unscramble any past state if you live long enough and grow to a sufficient Shannon scale; see https://en.wikipedia.org/wiki/Information_theory

A thing does not need to be a 'who' to 'observe' and 'measure'. Measurement is... interaction. Information is a verb, not a noun :)
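On "you can unscramble any past state", a minimal sketch under one assumption: that the dynamics are reversible (bijective). The rule below is a toy of my own, a second-order automaton where the next state is a scramble of the current state XORed with the previous one; since XOR undoes itself, the inverse map recovers any earlier state exactly.

```python
def scramble(x, bits=16):
    """Any fixed deterministic mixing function works; here, a left rotate."""
    return ((x << 3) | (x >> (bits - 3))) & ((1 << bits) - 1)

def forward(prev, cur):
    # next = scramble(current) XOR previous
    return cur, scramble(cur) ^ prev

def backward(cur, nxt):
    # previous = scramble(current) XOR next, because XOR is its own inverse
    return scramble(cur) ^ nxt, cur

# Run 1000 steps forward, then 1000 steps back: no information was lost.
state = (0xBEEF, 0x1234)
for _ in range(1000):
    state = forward(*state)
for _ in range(1000):
    state = backward(*state)
print(state)  # (48879, 4660) == (0xBEEF, 0x1234), the past fully recovered
```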

Too much pseudo-science in there for my liking!

I challenge you to point out a single piece of it here.

I strongly disagree with the piece on entropy. True, entropy is not disorder per se, but it is strongly related to it. I don't like the way the author makes broad, sweeping statements like "The temperature and entropy of a system is only well defined for systems that are homogeneous and in thermal equilibrium," which I would consider untrue. I also dislike that he never really defines entropy, at least not beyond the "heat" explanation at the beginning, which is incomplete and/or irrelevant to his later discussion. His end summary is missing a LOT of substance.

As for "information is a verb, not a noun", I disagree wholeheartedly. The act of measuring something can affect it, especially at the subatomic level, but that does not imply that all information is dynamic in nature. All matter and energy is dynamic; information is not. For example, my statement that "all matter and energy is dynamic" is immutable. It's a fixed piece of information, a noun.

Okay. I won't argue "Did you check Layzer?" etc. Let's compact it. On your first paragraph I will ask you: name at least one closed system. On your second paragraph: do you not measure streams? Did your noun-sentence occur whole and instantly? Name, please, one thing which is not matter, form, and process simultaneously. Information is a verb, not a noun, exactly because it in-forms. Forget about quantum collapse by merely seeing something :) it is not in the formulae. But the way things limit each other's entropy is measurement, is interaction. And entropy is nothing but degrees of freedom.
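To put a number on "entropy is degrees of freedom", a minimal Shannon-style sketch (my own illustration, not from the links above): for a uniform distribution over N accessible microstates the entropy is log2(N) bits, so an interaction that rules states out literally lowers it.

```python
import math

def shannon_entropy(probs):
    """H(p) = -sum p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 8                                               # 8 accessible microstates
print(shannon_entropy([1 / n] * n))                 # 3.0 bits == log2(8)

# An interaction that eliminates half the states removes exactly one bit:
print(shannon_entropy([1 / (n // 2)] * (n // 2)))   # 2.0 bits
```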
