The Conundrum of ICOBench and its TOP 20 Experts

I’ve researched deeper on the phenomenon of ICO “experts” cropping overnight and came up with amusing findings. Most of today’s top ICO experts are average people who decided to start advising ICOs within the past year or two.

There have been decades of study devoted to discovering the hallmarks of expertise and how it can be achieved. Most scientists in the field now agree that there are two distinct schools of thought on intellectual expertise: heuristics and biases (HB) and naturalistic decision-making (NDM). While these two schools differ in major and important ways, both rely on objectively successful completion of tasks within a domain as a prerequisite for being deemed an expert.

How do we define expertise?

In the crypto space, we can borrow the legal definition of an “expert” opinion and the conditions under which it is admissible. According to Odgers and Richardson, the traditional rules of evidence in Australian courts include:

  • The opinion must be relevant to a fact in issue.
  • The expert must disclose the facts (usually assumed) upon which the opinion is based.
  • The facts upon which the opinion is based must be capable of proof by admissible evidence.
  • Evidence must be produced to prove the assumed facts that form the basis of the opinion.

These experts don’t seem to have any special knowledge; they were simply invited by an organization or group hoping to launch an ICO, based on what it deems to be that personality’s influence in the crypto/blockchain field.

A track record of rating ICOs for just over 3 months apparently becomes acceptable as “expertise”. Some of them were part of an ICO themselves, but at that point they were clueless about how ICOs work; only afterwards did they build up their knowledge.

Those are observations from insiders in the field, and I’m not about to name names, except to note that these early ICO advisors make you think they are super-specialists and that you “don’t belong in the same group with them”. That is the biggest deception they spread.

The funny part is that only a few ICO advisors could be called well-known and popular, yet ICO founders are led to believe that their ICO’s success depends on the popularity of an ICO advisor/expert, which could not be further from the truth. As things stand, though, you can apparently advise your way to popularity, as the current crop of advisors seems to be doing.

How does ICOBench choose its TOP 20 experts?

Here’s the thing: there’s some “mystery” in the way ICOBench ranks the experts in its “TOP 20” and so on. Currently, the ranking is based on a success score: for each ongoing or upcoming ICO, a person (or an agency) receives a number of points calculated from the ratings of the ICOs they are part of.
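ICOBench does not publish its exact formula, so the following is a purely hypothetical sketch based only on the description above: every ICO an expert is attached to contributes points in proportion to that ICO’s rating. The function name and the flat summation are my assumptions, not ICOBench’s actual code, but the sketch shows why such a score rewards volume over quality:

```python
# Hypothetical sketch of a "success score" as described above: an expert
# accumulates points from the ratings of the ICOs they are part of.
# ICOBench's real formula is not public; this is illustrative only.

def success_score(ico_ratings):
    """Sum the ratings (e.g. on a 0-5 scale) of every ICO the expert joined."""
    return sum(ico_ratings)

# An "expert" attached to many mediocre ICOs outranks one attached
# to a handful of excellent ICOs:
prolific = success_score([3.5] * 50)  # 50 average projects -> 175.0
selective = success_score([4.0] * 5)  # 5 strong projects   -> 20.0
print(prolific > selective)           # True
```

Under any score of this shape, the rational move for an “expert” is to attach their name to as many projects as possible, which is exactly the pattern described next.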

The problem is that the top 30 people each have somewhere around 40-70 projects under their belt. Make no mistake: after I reached out to ICOBench, they changed their “Experts” listing page to use a “weighted score”, but the Top 20 and Top 10 are still based on the success score. And the ICO space has only been active for around 2 years, or 24 months of existence.
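To put those portfolios in perspective, here is a quick back-of-envelope check, using only the figures cited above, of the project intake rate they imply:

```python
# Implied intake rate for the top-ranked "experts", using the figures
# from the article: 40-70 projects accumulated over roughly 24 months.
projects_low, projects_high = 40, 70
months_active = 24  # the ICO space has been active for about two years

rate_low = projects_low / months_active    # ~1.7 new projects per month
rate_high = projects_high / months_active  # ~2.9 new projects per month
print(round(rate_low, 1), round(rate_high, 1))  # 1.7 2.9
```

So the rankings require each top expert to have been picking up roughly 2-3 new projects every single month since the ICO space began, which is the assumption the next calculation tests.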

Let me show you, by simple calculation, how this expert rating CANNOT possibly work, if you will bear with me.

A typical ICO takes anywhere from 4-6 months from start to full completion; even in 2017, it was taking at least 4 months. Taking 5 months as the average, an ICO advisor who takes on 2 new projects every month is looking at 10 active projects at any given time.

From my experience, being hands-on (advising, assisting, consulting, and reviewing all of the ICO’s activities) can take up to 40 hours a week as a “working advisor”. Just advising and consulting for Q&A takes maybe 20 hours a month.

Taking an average of 30 hours per project per month, that's 300 hours a month of advisory time alone, before any time spent learning, marketing themselves, living a life, and so on. Even if someone did work 300 hours a month, they wouldn't get to spend all 300 of them on advisory.
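The arithmetic above can be laid out explicitly. All of the inputs are the estimates already stated in this article, not measured data:

```python
# Workload implied by the assumptions above (all figures from the article).
AVG_ICO_DURATION_MONTHS = 5   # a typical ICO runs 4-6 months end to end
NEW_PROJECTS_PER_MONTH = 2    # assumed intake rate of a busy advisor
HOURS_PER_PROJECT_MONTH = 30  # average of hands-on and light Q&A advisory

active_projects = AVG_ICO_DURATION_MONTHS * NEW_PROJECTS_PER_MONTH  # 10
monthly_hours = active_projects * HOURS_PER_PROJECT_MONTH           # 300
weekly_hours = monthly_hours / 4.33  # ~69 hours/week on advisory alone
print(active_projects, monthly_hours, round(weekly_hours))
```

Roughly 69 hours a week of pure advisory work, every week, before anything else: that is the workload the rankings implicitly credit to each top expert.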

These computations reveal that these top advisors/experts are less "knowledgeable" than they appear: they never really advised the ICOs, which, as we have deduced from the calculation, would be impossible, but merely lent their names to the list and perhaps helped with a thing or two.

In recent days I have spoken with many ICO founders who reached out to me saying they got no value, direction, or assistance of any kind from the "top advisors" and wanted my help. Of course, I cannot take on more than 2 advisories at a time (not 2 per month, but 2 active advisories at any one time, regardless of how many months I’ve been at it), so I have to decline. Meanwhile, investors depend on these experts’ ratings to judge who the "best" is.

Bottom line: my kind suggestion is to consider not just the number of ICOs these “experts” are part of, but to judge their competence by how they actually served those ICOs and by the manner in which they rate ICOs. I hope all rating sites that use a similar calculation take this as a positive suggestion for the betterment of the community.