Warren Buffett once said, “It is better to be approximately right than precisely wrong.”
Imagine you are lost in a primeval forest. You have a detailed map of the forest in your hand, and the North Star shines in the sky. How do you get out?
There are generally two ways to do this:
- Take the map and find your way carefully.
- Keep your eyes up and walk in the direction of the North Star.
We are often taught to get out of the forest in the second way because:
- The former emphasizes accuracy. But if you become obsessed with the details of the map while ignoring the dense jungle (which changes constantly, and the experience of predecessors is not always reliable), you will still get lost and fall into the “precise error”: the map seems to promise a determinate step at every point, but in fact it cannot, so you never get out of the forest;
- The latter emphasizes an approximate (fuzzy) general direction: as long as the general direction is correct, you will certainly get out of the forest. This is “approximate correctness”.
Approximate correctness emphasizes correctness; precise error emphasizes precision.
Warren Buffett’s “approximate correctness”, applied under his philosophy of value investing, has helped him ride through market cycles. But the truths humans recognize are relative, and a famous Buffett quote is no exception: the sentence fits his value-investing philosophy, yet it also has boundaries.
Is there really no third way out of the forest?
Precise correctness
We have mentioned two terms above: “approximate correctness” and “precise error”. In what context did Buffett propose them?
In fact, they were proposed in recognition of “the uncertainty of how things develop”. Specifically, this uncertainty has three aspects:
- Dynamics, which can also be called “volatility”: the rapid change and instability of the environment. For example, consumers may demand healthy food in one period and pay more attention to eco-friendly products in another. Such shifts in market demand increase the uncertainty companies face in product development and marketing.
- Complexity, which can also be called “connectedness”: the interconnection and interaction among the factors and stakeholders in the environment. Inside an organization, for example, there are multiple departments, teams, and employees, with complex working relationships and information flows between levels. Managers must handle many decisions and issues, such as resource allocation, goal setting, people management, and coordination. This complexity makes organizational management challenging: different factors and variables interact with one another, producing uncertainty and unpredictable outcomes.
- Ambiguity, which can also be called “fuzziness”: information admits multiple interpretations. For example, a company runs an ad built on a metaphor or symbol that can be read in several ways; different people understand it differently, so the intent and message of the ad become ambiguous.
After all, the North Star does not move. At Alibaba, for example, Jack Ma decided whether to do “cloud computing” not by relying on a precise model, but by the “approximately correct” compass of “making it easy to do business anywhere”.
However, we can also think the other way around. Suppose the certainty of how things turn out is higher, or can be made higher: should we still settle for “approximately correct”?
Zhang Yiming, an entrepreneur of the new generation, put it this way: people differ, and when your computing power is high enough, even art can be made precise; people settle for fuzzy solutions only because their computing power is not high enough.
Here we first need to understand how “right” and “wrong” arise at all. Put simply, they are produced by the process of “fitting a model to data”.
The essence of practice is the technique, and the art, of fitting models to data.
In practice we always pursue being right and try to avoid being wrong, and the means are inseparable from “model” and “data”. When Buffett realized he could not be “precisely correct”, he switched to “approximate correctness” so as to avoid “precise error”.
But does Buffett merely want to be “approximate”? The key is whether approximation can become precision, and that question leads straight back to “model and data”.
Imagine a group of highly educated people in a particularly bureaucratic enterprise, endlessly holding meetings and brainstorming in search of the most accurate forecasting model, yet never carefully investigating the market or collecting information.
In all likelihood, they will make a “precise error”.
The reason is simple. Predictions about the empirical world rely on inductive models rather than general models: reconstructions of direct and indirect experience, such as schooling, conversations with top performers, and accumulated work experience. But whatever the experience, the quality of an inductive model depends on the quality of its data.
High-quality data must overcome the “dynamics, complexity, and ambiguity” described above, and obviously:
- If they do not investigate the market, their data will lag, so dynamics are not overcome;
- The human brain cannot process networked data, so complexity is not overcome;
- Neither the team nor the users can be uniformly rational, and information degrades in transmission, so ambiguity is not overcome.
Decision-making under these conditions is not even “approximately correct” in the way Buffett and Jack Ma were. Because both the models and the data are rough, “precise correctness” is simply out of reach.
On the other hand, if those three characteristics could be overcome, “precise correctness” would be achievable. But again, this is an almost impossible task, for it requires four assumptions to hold:
- The data involved is known and static, so that it can be acquired in a three-dimensional, comprehensive way;
- Team members can reach every user and are candid with one another, so that three-dimensional, comprehensive data can actually be obtained;
- The human brain can process networked data, so that the complexity of the data structure can be overcome;
- Both team members and users are rational, so that there is no semantic ambiguity.
Obviously, each of these assumptions is a fantasy, an impossible task.
Can we instead shift our thinking: rather than trying to “achieve” these assumptions, look for a field that better “approximates” them, and thereby reach a state that is not “precisely correct” but “precisely approximate”?
The answer is obviously yes. The “approximation” here refers to how closely the data approximates what the model itself requires. Moreover, in the real world there are many fields that could have achieved “precise approximation” but did not, for lack of algorithms and computing power. Take the digitalization of management: how well does it approximate the four assumptions above? It “approximates” the following characteristics:
- The data is known;
- The organization is candid;
- AI can handle complexity;
- The organization is rational.
This field still leaves a great deal of room for precision; here, the so-called approximate correctness is not what we should pursue.
Most of the data involved in the digitalization of management is known, though relatively scattered and offline, and the complexity of the organization can be modeled by AI (this was spelled out in “When Management Becomes Shackles, How Should We Re-understand Management?”). Organizations, however, are not always candid and rational. Here the key question begins to change. Since an organization can never be completely candid and rational, there will inevitably be all kinds of bias and ambiguity: even the same management philosophy will be understood differently in different organizations, and that is a form of ambiguity. The key is whether this ambiguity can be properly accommodated and handled.
Previous management models could not accommodate this ambiguity, because their “data structures” and “algorithms” were outdated.
Precise approximation
For example, the Hay method establishes a data structure like this:
- Knowledge and skills
  - Professional and theoretical knowledge
  - Management know-how
  - Interpersonal skills
- Problem solving
  - Thinking environment
  - Thinking challenge
- Job accountability
  - Freedom to act
  - The position’s role in shaping results
  - Magnitude of responsibility
Then each element in the structure is graded. The wording may differ, but the meaning is a distinction of low, medium, and high, and each grade is then assigned an average score (also called a norm).
In this way, a “data structure” is completed from the top down. Its algorithm is:
- Analyze the three elements of each position: knowledge and skills, complexity of problem solving, and job accountability.
- Look up the corresponding numbers in the reference tables:
  - A: the score from the knowledge-and-skills chart;
  - B: the percentage from the problem-solving chart;
  - C: the score from the accountability chart.
- Compute the total by the formula: job evaluation score = A × (1 + B) + C.
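The scoring step can be sketched in a few lines of Python. The chart values below are invented placeholders for illustration; real Hay reference tables are proprietary:

```python
# Sketch of the Hay-style scoring formula: score = A * (1 + B) + C.
# A, B, C come from reference charts; the numbers below are invented
# placeholders, not real Hay Guide Chart values.

def hay_score(know_how: float, problem_solving_pct: float, accountability: float) -> float:
    """A = knowledge-and-skills score, B = problem-solving percentage
    (e.g. 0.38 for 38%), C = accountability score."""
    return know_how * (1 + problem_solving_pct) + accountability

# Example: A = 200, B = 38%, C = 150
print(round(hay_score(200, 0.38, 150), 2))  # 426.0
```

Note how linear the algorithm is: every element enters through a single multiply-add, which is exactly the coarseness discussed next.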
This too is a quantitative method, but the top-down approach is in fact a “precise approximation”. And this approximation is like dosing medicine by body weight: a 50 kg person takes 10 grams, a 60 kg person takes 15 grams, a 70 kg person takes 20 grams.
On the surface this is very precise, but it ignores individual differences among people of the same weight. Of two 60 kg patients prescribed 15 grams, one may be fine while the other cannot tolerate that much because of another condition. Such quantification easily produces “precise errors” that may even worsen the original condition, because the error comes from the limitations of the model’s own data structure and from the ambiguity of data collection with respect to individual differences.
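The same point in code: a minimal sketch of the dose-by-weight rule from the example above (the covariate cap is a hypothetical illustration, not medical advice):

```python
# Linear dose-by-weight rule from the example:
# 50 kg -> 10 g, 60 kg -> 15 g, 70 kg -> 20 g, i.e. dose = 0.5 * kg - 15.

def linear_dose(weight_kg: float) -> float:
    return 0.5 * weight_kg - 15

# "Precise", but identical for every 60 kg patient:
print(linear_dose(60.0))  # 15.0

# The linear model has no term for individual differences (other illnesses,
# metabolism, ...). A model that conditions on such covariates gives a
# different answer for the same weight; the cap below is purely illustrative.
def adjusted_dose(weight_kg: float, tolerated_max_g: float) -> float:
    return min(linear_dose(weight_kg), tolerated_max_g)

print(adjusted_dose(60.0, 8.0))  # 8.0
```

The error of the plain linear rule does not come from sloppy arithmetic; it comes from a data structure that has no place for the covariate at all.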
The biggest problem with the Hay method is likewise the limitation of its own “data structure and algorithm”: it cannot process at least the following three types of data:
- Dynamics: it can hardly process, in a timely way, the random data generated by an enterprise’s dynamic development. The evaluation is costly to run, and every adjustment takes great effort.
- Complexity: the skills and resources each employee brings to each task form a complex network. That network contains plenty of data that is perfectly explicit yet held only in human memory; the Hay method cannot process it, and its granularity is far too coarse. For example, a headquarters “discount policy” may strongly support sales growth in one region while somewhat damaging the brand, and the Hay method ignores all of this.
- Ambiguity: “hidden data” that does not readily form a network, such as “unspoken pressure from the boss”, cannot be computed, because it cannot be included in the Hay method’s data structure at all.
The Hay method ignores all of this micro-level “dynamics, complexity, and ambiguity” and handles everything in a highly linear way. Using low-dimensional linearity to handle high-dimensional complexity produces enormous information loss and heavy distortion.
It would be more accurate to present the high-dimensional complexity itself first, and only then treat it linearly within that high-dimensional space.
Because the Hay method cannot handle these kinds of complex data, its “precise approximation” is in fact very coarse.
Moreover, this kind of quantified approximation has no connection to financial data such as revenue. The Hay method is really a primitive, low-dimensional quantification that ignores the organization’s network, information access, and financial flows.
It is only slightly more refined than raw human intuition; it is not genuinely rational.
This large ambiguity guarantees plenty of unfairness in job evaluation. Why should “a master’s graduate from a top school rank higher than a junior-college graduate with three years of work experience” (or vice versa)?
The Hay evaluation method, excellent within the traditional management paradigm, applies one and the same methodology:
1. Clarify the core element structure of the object being analyzed. For the Hay method, the elements are knowledge and skills, problem solving, and job accountability.
2. Qualitatively grade each element.
3. Assign a score to each grade.
4. Analyze and score each concrete object, then combine the scores by formula into a total.
5. Use the total to determine the object’s position in the enterprise. If the object of analysis is a position, the conclusion is a rank; if the object is performance, the conclusion is a degree of contribution, the familiar 360-degree evaluation; and so on.
Because its granularity is so coarse, it genuinely needs an upgrade.
Valor’s “precise approximation”
Valor solves precisely these problems of “data structures and algorithms”. Through complex networks it establishes a general model of “tasks, people, skills, and resources”, which is to say, it is not an empirical model.
The reason the empirical models of the “highly educated people” above are prone to “precise error” is exactly that an empirical model is poorly malleable, poorly inclusive, and not universal. Facing dynamics, complexity, and ambiguity, it cannot adjust in time; and a model that cannot adjust in time under uncertainty falls into the trap of “holding a hammer and seeing every problem as a nail”, which naturally produces “precise errors”.
The advantage of a general model is that it is malleable, inclusive, and universal, so it can model volatile, complex, and ambiguous realities more effectively and adjust in time.
What Valor provides is still a “precise approximation”, but this approximation is not coarse; it can be very precise, because Valor atomizes the organization and its data structure is very fine. In other words, the data structure and the algorithm themselves are precise. To compare this with the “precise approximation” of the Hay method, think of GPT across generations: a model’s precision depends on its parameter count, and GPT-1, GPT-2, and GPT-3 have 117 million, 1.5 billion, and 175 billion parameters respectively. As an imprecise analogy, traditional management paradigms such as the Hay method run at GPT-1-level parameters, while Valor runs at least at GPT-3-to-GPT-4-level parameters.
How good the “approximation” is depends on how three-dimensional and comprehensive the data the user can supply is: whether known data that already exists, but lies scattered across departments and brains because of organizational problems, can be extracted. If it can, Valor can detonate its value.
Being three-dimensional and comprehensive about data is a challenge for anyone, even the CEO, because you keep running into the question: how do I know my list is complete? The answer is: I don’t. No one does. That is the limit of cognition, and the limit of human reason. But where human rationality ends, computational rationality begins.
Human beings can never attain God-like omniscience, but we can always move in the direction of “knowing more”. So we welcome the user to revise her data, and above all to add to it, ideally bringing everything she knows into the software’s computation so that nothing is left out. Is including everything omniscience? Of course not. However “complete” the data, it is complete only subjectively; objectively it remains biased. That is constant and unavoidable.
As the popular slogan of the interview show “Thirteen Invitations” puts it: look at the world with prejudice.
If you think about it, this is the real truth: only by admitting our own limitations can we absorb more genuine knowledge and keep approaching the truth.
At the same time, there is another reason humans keep progressing: imagination. As Harari argues in “Sapiens: A Brief History of Humankind”, it is belief in “fictions” that lets humans build civilization. So while Valor encourages seeking truth from facts, it also encourages you to use your imagination: simulate any idea you have in Valor, and gain insight into value you could not see before.
Valor is an AI that helps managers increase their computing power and pursue “precise approximation”.
Fuzzy and probabilistic thinking
Within Valor, “precise” versus “approximate” deserves some special clarification.
Valor works in three steps:
1. Atomize the organization. Whether it is a mission, vision, or value, or a concrete idea or event, each is treated as a fact that supports, or is supported by, other facts.
2. Model the atomized organization with complex networks and graph algorithms, which solves the “dynamic, complex, fuzzy” organizational data that earlier models could not handle.
3. On this basis, quantify the input data through a carefully designed “data structure and algorithm”.
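Valor’s internals are not public, so here is only a toy sketch of steps 1 and 2 under stated assumptions: atomized facts become nodes, “supports” relations become directed edges, and a trivial traversal counts how much direct and indirect support a fact has accumulated. All names and numbers are hypothetical.

```python
from collections import defaultdict

# Step 1: atomize -- every mission, idea, or event is just a fact (a node).
# Step 2: model with a network -- "supports" relations are directed edges.
supports = defaultdict(list)  # edge: supporter -> supported fact

def add_support(supporter: str, supported: str) -> None:
    supports[supporter].append(supported)

def support_count(fact: str) -> int:
    """Count facts that directly or indirectly support `fact` (DFS over reversed edges)."""
    reverse = defaultdict(list)
    for src, dsts in supports.items():
        for dst in dsts:
            reverse[dst].append(src)
    seen, stack = set(), [fact]
    while stack:
        for supporter in reverse[stack.pop()]:
            if supporter not in seen:
                seen.add(supporter)
                stack.append(supporter)
    return len(seen)

# Hypothetical atoms: an event supports an idea, which supports the mission.
add_support("event: customer interview", "idea: self-serve onboarding")
add_support("idea: self-serve onboarding", "mission: ease of doing business")
print(support_count("mission: ease of doing business"))  # 2
```

Unlike the Hay method’s fixed three-element table, a network like this can absorb a new fact at any time: adding a node and an edge is cheap, which is what makes timely adjustment possible.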
Points 1 and 2 determine whether Valor can be “precise” enough in the generality of its model, that is, whether the complex network can faithfully present the organization. The conclusion of step 3 can only ever be a probability that keeps approaching the truth, so it is naturally an “approximation”, but a very high-order one. Because of the human cognitive limits described above, no one has “all-seeing eyes”: whether in the network, in information access, or in financial flows, data collection may be non-three-dimensional and incomplete, and that introduces a certain “ambiguity”.
So the most important thing to say about best practice in Valor is this: the “reliability” of a conclusion depends on the precision of the algorithm, while the “validity” of a conclusion depends on how much confidence you have in the input data.
We have done a great deal of work on the “precision of the algorithm”, so your best practice depends largely on the validity of the data you input; and with the emergence of large language models, AI will also play a great role in helping you input more valid data.
In this way you obtain a high-order “precise approximation”, several orders of magnitude beyond the traditional management paradigm.
Finally
Humans can never reach “precise correctness”, but we keep striving to be “more” precise and correct. For this “more”, Buffett and Jack Ma chose to simplify and rely on “approximate correctness”, while Zhang Yiming has pursued “precise approximation” while avoiding “precise error”. In this respect, we stand with Zhang Yiming.
Here we can answer the question posed at the beginning: is there really no third way out of the forest?
There is. Management activity must rest on basic rules, which are in fact the “approximate correctness” of “mission, vision, and values” that Jack Ma spoke of at Lakeside University. But within the management process, whatever can be handled precisely should be handled precisely, so that “organization, talent, and KPIs” become more effective.
Precisely speaking, management differs from decision-making: management faces the known, while decisions face the unknown. And given the degree of precision it can attain, management activity itself is far past the stage of competing on “approximate correctness”.
Management, too, calls for a “precise approximation”, and not a primitive one but a high-order one.