Tau Meta-Language: Transcompilation and scaling development


Transcompilation

By now many of you have watched the latest Q&A and are aware of the achievements of 2019. TML is now nearly complete and is in a state where it meets the guarantees Ohad made. One of the key features which is not often discussed in detail is called "transcompilation": taking source code written in one programming language and compiling (translating) it into source code in another programming language. In my opinion this very special feature of TML will ease and accelerate development of the Tau Alpha discussion platform and Agoras.

Could it be possible for developers to port their codebase from C++ into TML? This would be a source-to-source translation. If this is achievable, as theory currently suggests, then it is an on-ramp for developers. My understanding of the process: source code --> lexical analyzer --> token stream --> syntax analyzer (with a CFG) --> semantic analysis.
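
To make that pipeline concrete, here is a minimal, illustrative sketch of the first stage, lexical analysis turning raw source into a token stream. The toy token set and the tokenize function are assumptions for illustration only; they are not TML's actual tokens or tooling.

```python
import re

# Token categories for a toy language; illustrative only, not TML's token set.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("LPAREN", r"\("),
    ("RPAREN", r"\)"),
    ("SKIP",   r"\s+"),
]
TOKEN_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(source):
    """Turn raw source text into a token stream (the lexical analysis step)."""
    for match in TOKEN_RE.finditer(source):
        kind = match.lastgroup
        if kind != "SKIP":              # drop whitespace
            yield (kind, match.group())

print(list(tokenize("x = 3 + 41")))
# [('IDENT', 'x'), ('OP', '='), ('NUMBER', '3'), ('OP', '+'), ('NUMBER', '41')]
```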

All source code has syntax. Syntax in the source code is analyzed by a syntax analyzer, which produces a parse tree. Source code also has semantics, which is why, after the parse tree is produced, there is another process called context-sensitive analysis, or semantic analysis. Driving the syntax analysis is a CFG, a context-free grammar, which contains the syntax rules of the programming language.
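
Continuing the illustration, here is a tiny recursive-descent parser driven by a made-up CFG. It consumes a token stream like the one above and builds a parse tree. The grammar and the parse_expr/parse_term functions are hypothetical and have nothing to do with TML's actual grammar.

```python
# Toy grammar (illustrative only):
#   expr -> term ('+' term)*
#   term -> NUMBER | IDENT

def parse_expr(tokens, pos=0):
    """Recursive-descent parser: builds a parse tree from a token stream."""
    node, pos = parse_term(tokens, pos)
    while pos < len(tokens) and tokens[pos] == ("OP", "+"):
        right, next_pos = parse_term(tokens, pos + 1)
        node = ("add", node, right)
        pos = next_pos
    return node, pos

def parse_term(tokens, pos):
    kind, value = tokens[pos]
    if kind in ("NUMBER", "IDENT"):
        return (kind.lower(), value), pos + 1
    raise SyntaxError(f"unexpected token {tokens[pos]!r}")

tokens = [("NUMBER", "3"), ("OP", "+"), ("IDENT", "x")]
tree, _ = parse_expr(tokens)
print(tree)   # ('add', ('number', '3'), ('ident', 'x'))
```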

CFGs can be swapped in and out, and at the foundation is the core logic behind the code. Not every program has an exact TML equivalent, because Tau resembles a finite state machine with finite resources rather than a Turing machine with infinite memory. That being said, Ohad is confident that he can get programs to be importable into TML, although I'm not fully able to explain the details yet.
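
To round out the picture, a source-to-source translation finishes by walking the parse tree and emitting code in the target language's surface syntax. Below is a minimal sketch, assuming the toy parse trees from the previous sketch; the Lisp-style target syntax is made up for illustration and is not TML.

```python
def emit_lisp(node):
    """Walk a parse tree and emit equivalent source in a different surface
    syntax (here a Lisp-style prefix notation, purely for illustration)."""
    kind = node[0]
    if kind == "add":
        return f"(+ {emit_lisp(node[1])} {emit_lisp(node[2])})"
    if kind in ("number", "ident"):
        return node[1]
    raise ValueError(f"unknown node {kind!r}")

tree = ("add", ("number", "3"), ("ident", "x"))
print(emit_lisp(tree))   # (+ 3 x)
```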

Scaling development

TML can scale development because of a unique feature which Ohad originally described as "code re-use". In reality it's more like knowledge sharing. A knowledge base can contain everything necessary to compute, and the ability to share a knowledge base would effectively allow code reuse. To put it differently: if a problem can be solved once and only once, and the resulting knowledge is uploaded to the cloud, then anyone can download that knowledge and plug it into their own knowledge base for their own problems, leveraging the solution.
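
Here is a minimal sketch of that idea: a toy knowledge base as a set of facts that can be exported, downloaded, and merged into someone else's base. The KnowledgeBase class and its methods are assumptions made for illustration; TML's actual knowledge representation is different and far richer.

```python
import json

class KnowledgeBase:
    """Toy knowledge base: a set of (relation, args...) facts that can be
    exported, shared, and merged. Illustrative only, not TML's model."""

    def __init__(self, facts=None):
        self.facts = set(facts or [])

    def export(self):
        """Serialize the knowledge so it can be published (e.g. to the cloud)."""
        return json.dumps(sorted(self.facts))

    @classmethod
    def load(cls, blob):
        return cls(tuple(fact) for fact in json.loads(blob))

    def merge(self, other):
        """Reuse someone else's solved knowledge by merging it into ours."""
        self.facts |= other.facts

# Alice solves a problem once and publishes the result...
alice = KnowledgeBase({("parent", "tom", "bob"), ("parent", "bob", "ann")})
shared = alice.export()

# ...Bob downloads it and combines it with his own knowledge.
bob = KnowledgeBase({("parent", "ann", "joe")})
bob.merge(KnowledgeBase.load(shared))
print(len(bob.facts))   # 3
```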

There are some questions, such as: if knowledge is really going to be so valuable, does everyone want to share all of their knowledge with everyone? Perhaps there will be reasons not to. Once again, there may also be some discussions which take place on Tau which the community does not want to support or tolerate. The community will have many technical options for governance, for updating its rules, for updating its knowledge, for optimizing, and for providing decision support to facilitate these processes.

In my opinion the genie is already out of the bottle in the sense that TML exists, it works, and anyone can steal the code if they have complete disregard for morals, laws, and social norms. It is quite possible that those who want to focus on beneficial use cases will dedicate their computation resources and attention toward those activities. Those who do not care, or who for example prefer to violate norms, "community standards", or international law, are already able to do so because TML can be copied, and there really isn't a lot the community can do to stop an adversary willing to violate the license agreements. What the community will have to decide is how to encourage responsible participation in the Tau network, and in my opinion, for those who want to encourage this kind of participation, the tools are already there to build in the technical, legal, social/reputation, and decision support solutions.

Some ideas Tau can apply to promote responsible participation:

  • Smart constitutions
  • Reputation economics
  • Legal accountability
  • Identity verification with pseudo-anonymity
  • Copy protection and privacy features for those who don't want to share all their knowledge with everyone
  • Best practices implemented by default for security
  • Decision support, consequence mapping, opinion mapping, sentiment analysis

Responsible participation is maintainable, in my opinion, when the foundation of the rules is made clear. This foundation would allow for an evolving ruleset which never violates it. A smart constitution would therefore be a layered ruleset where the bottom layer holds the core or foundational values of the community. This layer essentially encodes the "community standards" from which all other rules emerge, and those rules could in turn determine resource allocation in accordance with the smart constitution.
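
A minimal sketch of that layering follows, with hypothetical rule names and checks; this is not Tau's actual governance mechanism. Proposed rules are accepted only if they pass every check in the foundational layer.

```python
# Bottom layer: foundational checks every proposed rule must pass.
FOUNDATION = [
    lambda rule: not rule.get("removes_foundation", False),  # base layer is immutable
    lambda rule: rule.get("resource_share", 0) <= 1.0,       # can't allocate more than 100%
]

community_rules = []   # the evolving upper layer

def propose(rule):
    """Accept a rule only if it never violates the foundational layer."""
    if all(check(rule) for check in FOUNDATION):
        community_rules.append(rule)
        return True
    return False

print(propose({"name": "fund_research", "resource_share": 0.2}))             # True
print(propose({"name": "take_everything", "resource_share": 1.5}))           # False
print(propose({"name": "repeal_constitution", "removes_foundation": True}))  # False
```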

Reputation economics refers to the norms of the community which regulate certain behavior. For example, if someone scams members of the community and reputation is tracked, then the scammer faces a reputation cost which they have to weigh against the possible benefits of scamming.
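
A toy cost-benefit calculation, with entirely made-up numbers, just to show the incentive a tracked reputation creates:

```python
# Illustrative numbers only: a would-be scammer weighs a one-time gain
# against the future income lost once their reputation collapses.
scam_gain             = 100.0   # immediate payoff from one scam
honest_income         = 20.0    # income per period while trusted
periods_remaining     = 10      # how long they expected to keep trading
detection_probability = 0.9     # chance the community notices

expected_loss = detection_probability * honest_income * periods_remaining
print(scam_gain - expected_loss)   # -80.0: scamming doesn't pay here
```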

Legal accountability is obvious. If something is licensed or under copyright then it's legally protected. If there is legal accountability then these licenses, terms of service, etc., mean something.

Identity verification can be done in many ways, but sooner or later best practices will emerge for how to do it. It will be necessary for KYC/AML, and also for more practical reasons, such as regaining control of your account if your computer gets hacked.

Copy protection would allow people to trade knowledge in markets in ways where their knowledge cannot be seen or copied. This could encourage sharing of more knowledge and allow the network to benefit economically.

Best practices for security would simply mean finding all the current best ways of protecting people and applying those techniques. Social account recovery, for example, is a must for any discussion platform.

And perhaps most important of all is decision support. In order to make the best possible decisions from the available information, there have to be tools that allow it. If the desire is to build a network which is likely to benefit the world, then it's going to be a constant struggle to figure out which decision has what impact on a lot of people, and to determine, from the current understanding of the knowledge, whether that impact is likely to be positive either now or strategically over generations. In my opinion the most difficult decisions are those with lasting, perhaps multi-generational consequences, and a community that can make those kinds of decisions requires, at the very least, tools to weigh pros and cons, rank potential outcomes, track sentiment, and so on.
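
As one small example of what such tools could include, here is a sketch of weighted pro/con scoring and outcome ranking. All criteria, weights, and option names are hypothetical; real decision support on Tau would be far richer.

```python
# Weighted multi-criteria scoring: combine pros (positive weights) and
# cons (negative weights) into one value per option, then rank options.
criteria_weights = {"benefit_now": 0.4, "benefit_long_term": 0.4, "risk": -0.2}

options = {
    "fund_education": {"benefit_now": 3, "benefit_long_term": 9, "risk": 2},
    "quick_payout":   {"benefit_now": 8, "benefit_long_term": 1, "risk": 6},
}

def score(option_scores):
    """Combine per-criterion scores into a single weighted value."""
    return sum(criteria_weights[c] * v for c, v in option_scores.items())

ranking = sorted(options, key=lambda name: score(options[name]), reverse=True)
print(ranking)   # ['fund_education', 'quick_payout']
```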
