What if ChatGPT could help you find the next 100x crypto before anyone else? πŸš€ The combination of AI + crypto investing is one of the most powerful strategies for spotting hidden gems in today’s market β€” and it’s easier than you think.

In this video, I’ll show you exactly how to use ChatGPT to find hidden gems in the crypto market. From researching low-cap altcoins to analyzing narratives like DeFi, AI, and tokenization, ChatGPT can help you identify projects before they explode. We’ll also cover strategies for filtering out hype, using data-driven prompts, and combining ChatGPT research with tools like CoinGecko, CoinMarketCap, and TradingView.

Will ChatGPT give retail traders an edge in finding the next Bitcoin, Solana, or XRP before the crowd? Stick around as I break it down step by step and share how you can use AI to gain an edge in crypto investing.

πŸ‘‰ Do you think AI will play a huge role in crypto investing? Comment below with your thoughts β€” I’d love to hear them!

πŸ‘‰ Subscribe for daily alpha on crypto market trends, bold Bitcoin predictions, and altcoin gems that could 10x your portfolio! – https://www.youtube.com/channel/UCpjN8bNE-CoAgpfMatghM9g

πŸ“§ Email: [email protected]

πŸ’° Affiliate Links

Sofi Checking & Savings – Get $25 free ➝ https://www.sofi.com/invite/money?gcp=16a53d0f-b4b2-441d-9100-cfb506305260&isAliasGcp=false

Sofi Investing – Free $25 in stock ➝ https://www.sofi.com/invite/invest?gcp=ab31edd8-701e-4109-9225-51b41e35d246&isAliasGcp=false

Coinbase Exchange – Earn up to $300 BTC ➝ https://coinbase.com/join/YPUQLCY?src=referral-link

Tracking Tools – CoinGecko | CoinMarketCap

Trading Tools – Get $15 off TradingView ➝ https://www.tradingview.com/pricing/?share_your_love=cryptonextsteps

#ChatGPT #Crypto #Altcoins #CryptoNews #HiddenGems #CryptoInvesting #Blockchain #DeFi #Web3 #CryptoTraders #CryptoMarket #AI #CryptoUpdate #BullRun #CryptoStrategy
Transcript
00:00 Welcome to the Deep Dive, where our goal is pretty simple: cut through the noise, the information chaos, and hand you synthesized, actionable knowledge to stay ahead. That's the plan, and today we are wading deep into crypto research. I mean, the market just generates noise faster than anyone could possibly process.

00:24 It really does. And trying to find those genuinely overlooked opportunities, those, you know, hidden gems, is often what separates the informed investor from, well, the rest of the crowd. It's a classic dilemma, isn't it? Due diligence is absolutely the foundation, the bedrock of successful investing, sure. But in DeFi, honestly, it feels like a full-time job: tracking the data, reading all the docs, understanding the tech. It's overwhelming.

00:50 It is. So our mission for you today is really focused. We're going to define, in detail, a safe, repeatable, structured and, maybe most importantly, AI-accelerated workflow for finding these opportunities. We want to turn that information-overload problem into more of a processing problem. Something manageable.
01:07 AI-accelerated. Okay, let's unpack this. Before we even bring in ChatGPT or the LLMs, we need to frame this, to define the target. When we say "hidden gem" for this deep dive, what exactly are we hunting for?

01:20 Yeah, we need a really precise definition; otherwise, you know, we're just chasing random pumps or speculation. Right. So a hidden gem, for this workflow, has got to be a small-cap or maybe a mid-cap token, definitely far away from the big top-10 assets. Okay, smaller coins. Smaller, yes. And crucially, it must have specific, verifiable, positive attributes. We're looking for clear evidence: things like improving fundamentals, real early traction, or maybe overlooked catalysts, technical or market ones. Things you can actually measure. Exactly, verifiable.

01:52 And, just as important, it still needs to lack that high mainstream coverage. If Bloomberg or, you know, CNBC are talking about it daily... Too late? Yeah, it's probably too late for the kind of edge we're looking for. The lack of hype is almost as important as the strength of the fundamentals here. So we're not just looking for cheap tokens; that's not the goal. We're looking for fundamentally sound projects that the market hasn't fully priced in yet, probably because of lack of awareness, or maybe just because the info is hard to get. Exactly that. We are hunting for alpha that
02:22 genuinely requires research effort. It's not just sitting there on the surface. Right, and that brings us neatly to the AI connection. Why involve a tool like ChatGPT, or any large language model, when crypto research really relies on real-time, you know, on-chain metrics? Isn't there a conflict?

02:41 Yeah, that's the challenge, isn't it? The data, the on-chain stuff, is real time. But the documentation, the white papers, the audits, the tokenomics docs, they're static and often, frankly, massive. And that's precisely where the LLM's main value lies: time compression. Specifically, time compression in digesting complex text. Okay.

03:01 Think about the hundreds of hours potentially saved by not having to manually scan a 50-page white paper just to find the vesting schedule, or trying to compare governance mechanisms across three competing protocols by hand. The AI can summarize those complex docs. It can distill key risks, generate research checklists based on the documentation, and do rapid cross-project comparisons on criteria you define. So
03:25 the AI is basically handling the grunt work, the reading, the comparing, allowing the human, the researcher, to focus purely on the analysis and the decision-making part? Precisely. It's an acceleration tool for your cognitive load. However, and this is absolutely the foundational caveat for this entire deep dive... Okay, important point coming up.

03:42 Yes: AI must, must be paired directly with verified, real data sources. If you skip the specialized data tools, the Messaris, the Dunes, and just ask the LLM for the current market cap or active users, what happens? You risk relying on outdated training data or, worse, what we call hallucinated facts, things it just makes up because it doesn't have the live info. Right, fabricated information. Exactly. The LLM is only useful for synthesis when the inputs, the numbers, the links, the metrics, are collected from external, verified, often real-time sources like Messari, Dune Analytics, Nansen,
04:19 etc. Okay, let's get into that AI edge versus its limitations, then. Where does it really provide a genuine, measurable advantage?

04:25 Well, if we isolate it, the edge is strictly within the processing, the structuring, and the synthesis stage. It absolutely cannot collect the primary data, but boy, can it organize complexity. Right, so the best fits for LLM acceleration are things like, as we said, deep document digestion. Can you give some more specific examples? It's not just white papers you'd feed it, right?

04:49 Oh, absolutely not. Think about security audits. A typical smart contract audit report might be dozens of pages long, full of technical jargon and dense findings. Yeah, tough reads. Very tough. You can feed that raw text to the LLM and ask it specifically, like: "Ignoring the low-severity stuff, list the three highest-risk potential vulnerabilities you found in this audit." Boom, instant summary. That's useful. Or think about the complex legal structures some protocols have. The AI can help generate checklists to make sure you've covered all the compliance or regulatory risks mentioned in their legal docs.

05:23 And what about that "explain like I'm five" function? I find that super helpful when I'm trying to wrap my head around some new, complex technical thing. It's incredibly valuable, because it tests your own understanding, right? If you can't explain something simply, you probably don't understand it well enough to put money into it. Good point. The AI can take a really detailed explanation of, I don't know, a novel sharding architecture or some complex zero-knowledge-proof implementation and break it down into simple bullet points or analogies. That acceleration of clarity is often the difference between grasping a core tech advantage and just dismissing a project as too complicated. Yeah, I can see that. But let's be, you know, brutally honest about the limitations too.
06:04 This is the "not a fit alone" warning. Where do people commonly misuse these LLMs in crypto research?

06:10 Okay, the most common and, honestly, the most dangerous misuse is expecting real-time data or market-sensitive insights directly from the LLM. Like asking for the current price? Exactly. ChatGPT cannot give you real-time prices. It cannot give you privileged data on wallet flows unless you feed it the output from a tool like Nansen first. Because its training data is old. By necessity, yes, it's delayed. Which means it absolutely must be used after the data is collected from those specialized, real-time tools: the Messari screeners, the Dune dashboards, the Nansen label data. These are non-negotiable prerequisites.

06:50 Right. Relying only on the LLM during the data-gathering phase is like trying to navigate a fighter jet using a map printed six months ago. You're going to crash. You guarantee you'll crash into outdated or, worse, completely fabricated information. That makes the relationship really clear, then: the AI speeds up the thinking, the analysis, the organization, but it doesn't replace the initial, verifiable data gathering. It's like a world-class assistant for the researcher.

07:11 Perfect analogy. It takes the verified inputs you give it and structures them, but it's not the detective out there collecting the evidence in the first place. Precisely. If we use, say, a legal-investigation analogy: the LLM is the expert paralegal. It organizes evidence, cross-references documents, drafts arguments. Okay. But the dedicated data tools, Dune, Nansen, Messari, they're the detectives collecting the fingerprints and the surveillance footage in real time. You absolutely must trust the detective work before you ask the paralegal to synthesize the case. Okay, this
07:48 clear division of labor brings us right to the core of this deep dive: the repeatable, structured, five-step process. This is the roadmap, the workflow designed to efficiently move you from that raw, overwhelming data to a concise, risk-defined investment thesis. Let's start with the first three steps: purely data gathering and shortlisting.

08:05 Right. Step one: source discovery. Let's call it the screening filter. The goal here is just managing that info overload: create a shortlist, maybe 10 to 20 tokens max, that fit our hidden-gem profile. Small or mid cap, low coverage, high potential. Exactly. And the action here: start with neutral, specialized data hubs. Screeners, basically, like the ones Messari provides, are a good starting point. You're applying filters to quickly weed out projects that are either too large, you know, no alpha left, or maybe too small, too illiquid, too risky. You'd be surgical with those filters.

08:42 Absolutely. What are the specific, like, non-negotiable filters we should be using here? Okay, first you need to define your sector. Don't try to boil the ocean. Focus on narratives the market is maybe just starting to track: things like L2 infrastructure, or DePIN, decentralized physical infrastructure, or RWA, real-world assets. These are pretty focused
09:06valuation check critically important you must filter using both market cap mc and fully diluted valuation
09:12fd why both because for small caps fdv is often the much better metric it accounts for all those future
09:18token unlocks the potential dilution down the road if the fdv is like astronomical compared to the
09:23current market cap even if the mc looks small danger sign huge danger sign the long-term sell pressure is
09:29likely toxic we want projects where the fdv isn't say 50 times the mc unless the tokenomics clearly show
09:38how value is captured proportionally to that dilution that fdv check is so critical so many people just see the
09:43small market cap and completely miss the massive dilution bomb waiting to go off yeah okay sector
09:49valuation yeah what else liquidity checks non-negotiable can you actually buy or sell a
09:55meaningful amount without tanking or pumping the price use the screeners liquidity metrics or check
10:00the order books on major exchanges if you can't get in or out reasonably you're trapped they're trapped
10:06and finally developer activity we need proof that the project is actually being built actively improved
10:12look at platforms like token terminal or sometimes github directly or other code tracking sites
10:17check the frequency of code commits the number of active core contributors over the last say 90 days
10:22so not just hype but actual work being done exactly if dev activity is flatlining the project is likely
10:28dying regardless of what twitter says high consistent dev activity is a strong proxy for future potential
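The step-one filters described here can be sketched as a simple screening function. This is a minimal illustration with made-up thresholds and token data; real inputs would come from a screener like Messari's, and none of these numbers are from the episode:

```python
# Hypothetical step-one screen: sector-agnostic checks for MC/FDV, liquidity,
# and dev activity. All thresholds and sample data are illustrative assumptions.

def passes_screen(token, max_mc=150e6, max_fdv_to_mc=50,
                  min_daily_liquidity=250e3, min_commits_90d=50):
    """Return True if a token survives the step-one filters."""
    if token["market_cap"] > max_mc:                        # too large: no alpha left
        return False
    if token["fdv"] / token["market_cap"] > max_fdv_to_mc:  # dilution bomb
        return False
    if token["daily_liquidity_usd"] < min_daily_liquidity:  # can't get in or out
        return False
    if token["commits_90d"] < min_commits_90d:              # dev activity flatlining
        return False
    return True

tokens = [
    {"symbol": "AAA", "market_cap": 40e6, "fdv": 120e6,
     "daily_liquidity_usd": 900e3, "commits_90d": 210},
    {"symbol": "BBB", "market_cap": 20e6, "fdv": 1.5e9,     # FDV is 75x MC
     "daily_liquidity_usd": 500e3, "commits_90d": 180},
]

shortlist = [t["symbol"] for t in tokens if passes_screen(t)]
print(shortlist)  # BBB fails the FDV-to-MC check
```

The FDV-to-MC ratio is the "dilution bomb" test from the discussion; the other thresholds are knobs you would tune to your own risk tolerance.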
10:34 Okay, great. So step one gives us a filtered shortlist: active devs, sensible valuation, small-cap projects. Now we need to prove they're actually being used, not just held by speculators hoping for a pump. That takes us to step two: on-chain validation.

10:47 Right. This is where we shift gears from market-level data down to granular usage data, and for this, powerful dashboards like those on Dune Analytics are indispensable. The goal is to cut through the hype and validate the project's actual utility. Is this token actually needed for the network to function? Is the fee revenue growing? Is anyone actually using this thing?

11:10 Makes sense. Which specific panels or metrics on Dune should we be prioritizing? Where do we focus? You need to pull usage and holder data and, critically, track the trends, not just a single snapshot in time. Okay, trends over time. Yes. The key metrics to pull, and maybe save links to the charts for, are: active addresses (daily or weekly, depending on the project's life cycle); transaction count (shows the actual network activity load); new wallets created (this can be a measure of expansion velocity and new user growth); and potentially exchange flows: are tokens moving onto exchanges to be sold, or off exchanges into wallets for holding? Why do these specific metrics, active addresses, transaction count, new wallets, matter so much
11:53 more than just looking at the price chart? Because they represent the verifiable difference between a temporary fad and an actual business being built. Think of it like this. Imagine a new app, or an L2, or a game. If its token price is pumping like crazy but the number of daily active addresses is totally flat, it's like a trendy new coffee shop where everyone's outside taking Instagram photos of the sign but nobody's actually buying coffee. Exactly. The price, the hype, is high, but the daily active transactions, the coffee purchases, are low. It's likely a purely speculative play, a fad. Not sustainable.

12:27 But conversely, if the price is maybe flat, or even drifting down a bit, yet active addresses are consistently climbing week after week, that's the signal. That's fundamental adoption. That's a potentially healthy business being built under the radar. It's latent growth just waiting for a narrative spark or wider market recognition.

12:44 Okay, and you mentioned tracking this over time. Absolutely paramount. These metrics must be tracked over a sustained period; we think four to 12 weeks is a reasonable minimum baseline for this kind of analysis. Right, and you need to overlay notable news dates onto those charts. If active addresses suddenly spike for three days right after a big partnership announcement: probably temporary hype. Exactly. But if they rise consistently, steadily, across two months, regardless of specific news events, that's a fundamental trend you can potentially build a thesis on.
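That "sustained trend versus news spike" test can be sketched in code. The weekly series and thresholds below are invented for illustration, not data from any real project:

```python
# Hypothetical check: is weekly active-address growth sustained, or a one-off
# news-driven spike? The series and thresholds are illustrative assumptions.

def sustained_growth(weekly_active, min_weeks=8, min_up_ratio=0.75):
    """True if most week-over-week changes are positive across the window."""
    if len(weekly_active) < min_weeks:
        return False  # need at least a 4-to-12-week baseline
    changes = [b - a for a, b in zip(weekly_active, weekly_active[1:])]
    up_weeks = sum(1 for c in changes if c > 0)
    return up_weeks / len(changes) >= min_up_ratio

steady = [1000, 1100, 1180, 1260, 1400, 1490, 1600, 1720, 1850]  # climbing weekly
spiky  = [1000, 1020, 990, 2600, 1010, 980, 1000, 970, 990]      # one spike, then reversion

print(sustained_growth(steady))  # True: consistent adoption trend
print(sustained_growth(spiky))   # False: spike then reversion
```

Overlaying news dates, as suggested above, would be a second pass: exclude or down-weight weeks adjacent to a big announcement before running the check.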
13:19 The long-term trend data is what separates durable utility from fleeting speculation. Okay, fantastic. We filtered the universe (step one). We've proven the project is active and being used (step two). But for real conviction, especially in crypto, we often want to know what the biggest, smartest players are doing. That brings us to the final data-gathering step, step three: the smart-money sanity check.

13:45 Yes. The goal here is straightforward: verify accumulation or distribution patterns by players who are generally considered sophisticated. They might have better research, deeper network access, maybe earlier information. And how do we do that? For this we rely heavily on specialized wallet-labeling tools; Nansen is the best-known example. Here we're specifically looking to see if wallets labeled as sophisticated entities, think institutional funds, known market makers, prominent ecosystem investors or builders, have been consistently accumulating the token, or if they've been selling off, exiting their positions.

14:18 Okay, but I need to push back here a little. Isn't there an inherent bias? If we're waiting for Nansen to label wallets and then identify sustained accumulation, haven't we already missed the earliest, cheapest entry point? Aren't we just following the leaders after they've already made their move? That is a really critical question, and the answer is nuanced. Yes, you will almost certainly miss the very earliest, ground-floor entry point. That's often reserved for VCs or seed investors who got in months or years before the token even exists publicly. Right. Our target here isn't necessarily being the absolute earliest investor. Our target is
14:55 the investor who gets in before broad market coverage starts, before the hype cycle really kicks in. Okay. Seeing smart-money accumulation does two crucial things for us at this stage. One: their likely deeper due diligence serves as a confirmation signal for the positive fundamentals you spotted in step two. It validates your own work. Right, a sanity check. Exactly. And two: their presence, their buying, often helps stabilize liquidity and implicitly validates the longer-term investment case. It adds a layer of institutional credibility, if you will.

15:25 So we're essentially trading the potential for the absolute first 10x gain for maybe a higher probability of a safer 5x gain. It's a risk-management trade-off. That's a perfect way to put it. It's about increasing conviction and reducing risk by confirming your thesis with the actions of sophisticated players. But, another big but, you have to be highly critical when interpreting Nansen data, or any wallet-tracking data. Don't just look at a single massive inflow transaction. Why not? Because that could be anything: a fund rebalancing between wallets, a treasury transfer for operations, a large over-the-counter (OTC) deal that doesn't reflect market sentiment. It's noisy.

16:05 So what should we focus on? You need to focus on sustained, repeat buys from multiple labeled wallets over that same four-to-12-week period we discussed. Look for the pattern, not the single snapshot. Are several known funds or market makers consistently adding to their positions week after week? That shows real belief. It shows sustained belief in the project's long-term value, not just a reaction to some short-term hype. This consistent accumulation pattern acts as our final filter before we unleash the AI for synthesis. All right.
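The "pattern, not snapshot" idea can be sketched as a simple check over labeled-wallet flows. The wallet labels and weekly buy figures here are entirely invented for illustration; real inputs would come from a tool like Nansen:

```python
# Hypothetical smart-money check: sustained repeat buys from multiple labeled
# wallets, rather than one large one-off inflow. All data is illustrative.

def sustained_accumulation(weekly_buys_by_wallet, min_wallets=3, min_active_weeks=6):
    """weekly_buys_by_wallet maps a labeled wallet to its weekly net buys (tokens).
    True if enough distinct wallets bought in enough distinct weeks."""
    accumulating = [
        wallet for wallet, buys in weekly_buys_by_wallet.items()
        if sum(1 for b in buys if b > 0) >= min_active_weeks
    ]
    return len(accumulating) >= min_wallets

flows = {
    "fund_a":         [5e3, 4e3, 6e3, 5e3, 7e3, 4e3, 5e3, 6e3],  # steady buyer
    "fund_b":         [2e3, 3e3, 0, 2e3, 4e3, 3e3, 2e3, 1e3],    # steady buyer
    "market_maker_c": [1e3, 2e3, 1e3, 3e3, 2e3, 2e3, 1e3, 2e3],  # steady buyer
    "whale_d":        [0, 0, 9e5, 0, 0, 0, 0, 0],                # single inflow: noise
}

print(sustained_accumulation(flows))  # True: three wallets show repeat buys
```

Note how the single huge inflow from `whale_d` is ignored, which mirrors the warning above that one big transaction could be a rebalance, treasury move, or OTC deal.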
16:35 This is the turning point. We've done the hard yards gathering verifiable data: the screening (step one), the usage validation (step two), the smart-money check (step three). Now we transition to analysis and synthesis. This is where the AI, like ChatGPT, really starts to shine: structuring and stress-testing all that raw data we collected.

16:47 Exactly. We're moving into step four: ChatGPT synthesis. And this is where the magic can happen, but only because we put in the work to gather high-quality ingredients first. Garbage in, garbage out, especially with AI. Right. So the action here is feeding the LLM your actual, verified findings. You literally paste in the key quantitative metrics ("active addresses up 35% over the last six weeks"), you give the links to the specific Dune dashboards you used, you paste critical excerpts from the tokenomics documents you found on Messari or the project site.

17:22 This is that crucial distinction again: we are not asking the LLM to go find data in the wild. We are asking it to process and structure the verified data we meticulously collected in steps one to three. Precisely. We are leveraging its incredible organizational and language-processing capabilities, not its, uh, often flawed world-knowledge base. Okay, so we feed it the good stuff. What do we ask it to do? Then you issue a comprehensive prompt. You ask the LLM to
17:51 synthesize all this disparate data to generate four specific, mandatory outputs. These are designed to accelerate your own due-diligence process significantly. Four outputs. What are they?

18:02 First, ask for a synthesized, prioritized list of risks associated with the project. These risks should be derived directly from the documentation you fed it, like tokenomics flaws and audit findings, and potentially implied by the on-chain metrics, like maybe slowing user growth despite price action. Okay, risk list first. Makes sense.

18:18 Second, a detailed competitor map and comparison table. You instruct it: "Based on the technical docs and market positioning I provided, create a feature-by-feature comparison table against its main competitors." This forces the AI to structure the competitive landscape, giving you instant context on relative strengths and weaknesses. Super useful for positioning. Third?

18:37 Third: moat analysis. Ask it, based on the tech docs and market data, what makes this project truly unique, defensible, or difficult for a well-funded competitor to replicate quickly. Is it network effects, unique tech, community, what? Identifying the sustainable advantage. And the fourth one? And fourth, this one is critically important: falsification tests. Ask the AI: "Given my bull-case thesis (which you state clearly), define clear, metric-based tests that would definitively invalidate this bull case."
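Those four mandatory outputs could be packed into a single synthesis prompt, roughly like the sketch below. The wording and the placeholder findings are illustrative paraphrases of what's described here, not a template endorsed by the episode:

```python
# Hypothetical step-four synthesis prompt builder. The findings passed in are
# placeholders; real inputs come from steps one to three.

SYNTHESIS_PROMPT = """You are assisting with crypto due diligence.
Use ONLY the verified findings pasted below; do not add outside facts.

VERIFIED FINDINGS:
{findings}

Produce exactly four outputs:
1. A prioritized list of risks derived from the documentation and metrics.
2. A feature-by-feature competitor comparison table.
3. A moat analysis: what is unique, defensible, or hard to replicate.
4. Falsification tests: clear, metric-based conditions that would
   definitively invalidate this bull-case thesis: {thesis}
"""

def build_synthesis_prompt(findings, thesis):
    return SYNTHESIS_PROMPT.format(findings=findings, thesis=thesis)

prompt = build_synthesis_prompt(
    findings="Active addresses up 35% over six weeks; unlock in month six.",
    thesis="Project X captures significant L2 market share within 12 months.",
)
print(prompt)
```

The "use ONLY the verified findings" line encodes the episode's core rule: the LLM synthesizes collected data, it does not fetch data.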
19:10 Falsification tests. Let's pause on that. That sounds like structured intellectual humility: forcing yourself to define how you could be wrong. Can you give a concrete example, like what would the prompt and the output look like?

19:24 Sure. Let's say you're analyzing a new layer-2 solution. Your bull-case hypothesis might be something like: "I believe Project X will capture 10% market share of unique daily L2 users within 12 months, because its novel execution environment offers 50% lower gas fees than competitors A and B." Okay, clear hypothesis. You feed the AI this hypothesis, along with the competitor data from the table it generated, the user-growth trends, etc., and you ask for falsification tests. The AI might come back with something like: "Falsification test #1: if Project X's total value locked (TVL) growth stalls or declines for more than 45 consecutive days, while its main competitors A and B collectively capture more than 50 percent of all new unique daily senders across the L2 sector during that same period, the bull case based on fee advantage capturing market share is likely invalidated."

20:08 Wow. Okay, that is really specific and actionable. It takes the decision-making away from just vague feelings or hope and ties it directly to measurable, observable metrics. It tells you exactly when you need to admit you might be wrong, which is often the hardest but most important thing for any investor to do. And that structure, that predefined "I'm wrong if X happens," leads us perfectly into the final step of the workflow. Step five: risk controls and pre-mortem.
20:38 This is where we solidify the thesis with proactive defense and psychological preparation. Right. This sounds like integrating behavioral finance directly into the analysis process, using the AI for that. Exactly right. You use the LLM here almost purely as a structured thinking partner, a sounding board. You task it, based on the risks it identified for you in step four, to help you draft specific position-sizing rules. For example: "Given the high unlock risk in month six, limit the initial position to one percent of the portfolio." Okay.

21:04 You use it to help define specific, non-emotional exit criteria, maybe based on those falsification tests: "I will exit the position entirely if active addresses fall below 80 percent of their 90-day peak and stay there for two consecutive weeks." Write it down. Commit to it.

21:22 And the pre-mortem exercise? That sounds powerful, but also maybe a bit counterintuitive. How do we ensure that running a pre-mortem, imagining failure, doesn't just subconsciously anchor us to that failure scenario and make us too scared to actually make the investment? That's a really profound point, actually. The framing is key. The
21:40 pre-mortem must be framed as a forensic exercise, not as some kind of self-fulfilling prophecy. Okay, forensic exercise. How? The prompt shouldn't be "Will this fail?" It should be something like: "Assume this investment has definitively failed six months from now. Based on the risks we identified in step four (tokenomics, competition, adoption), walk me through the most plausible sequence of events, the narrative, that led to that failure." Ah, okay: working backwards from assumed failure.

22:09 Exactly. By generating this logical failure narrative before you invest, you achieve a couple of things. You directly combat confirmation bias, that very human tendency to only seek out data that supports your decision after you've made it. Right, we all do that. We do. And when you're forced to logically construct how failure could happen, using the risks you already identified, you're sort of psychologically inoculating yourself against the emotional pain of loss if it does happen. You've already lived it mentally. It helps prevent clinging to a losing trade based on hope. That is really powerful psychological risk management: confronting and structuring the failure scenario before committing capital makes you prepared for both outcomes.
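The non-emotional exit rule mentioned above (exit if active addresses sit below 80 percent of their 90-day peak for two straight weeks) could be mechanized roughly as follows; the daily series is invented purely for illustration:

```python
# Hypothetical exit-criteria check for the rule: exit if daily active addresses
# fall below 80% of their 90-day peak and stay there for 14 consecutive days.
# The series below is illustrative, not real data.

def should_exit(daily_active, peak_window=90, threshold=0.8, breach_days=14):
    if len(daily_active) < peak_window:
        return False  # not enough history to apply the rule
    peak = max(daily_active[-peak_window:])
    floor = peak * threshold
    recent = daily_active[-breach_days:]
    return all(d < floor for d in recent)

# 90 days: a ramp to a peak of 1900, then a slump at 1500 (below 80% of peak)
# for the final 14 days, which should trigger the rule.
series = [1000 + 12 * i for i in range(76)] + [1500] * 14
print(max(series[-90:]))   # 1900: the 90-day peak
print(should_exit(series)) # True: exit criterion met
```

Writing the rule as code before entering the position is one way to "write it down and commit to it," since the decision no longer depends on how you feel on the day.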
22:47 So, to get this high-quality analysis reliably out of the AI, you can't just ask simple questions; you need slightly more sophisticated prompts. We've found three specific prompt-engineering techniques, or patterns, that consistently yield more structured, actionable analysis. Okay, give us the patterns. How do we elevate the quality of the LLM output from basic summary to real analysis?

23:11 Okay, first is the summarizer pattern, but with a twist. We use this for document digestion but add a critical edge. The prompt looks something like this: "Summarize this project's tokenomics, its emissions schedule, and particularly the unlock risks, using these pasted documents or links (provide them). Then, based specifically on the potential supply shock from unlocks and the vesting periods you found, list five specific red flags or events that could cause a major price drawdown." Ah, so it's not just "summarize what it is"; it's "summarize, and then use that information to immediately identify potential problems related to supply." Exactly. It forces the LLM to move from passive digestion to active, critical analysis focused on a key risk area: supply shocks. It turns a huge block of text into a prioritized, bulleted warning system. Very efficient. Yeah, that's efficiency right there: going from "what is it" straight to "what are the biggest landmines related to supply."
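The two-stage summarizer pattern can be captured as a reusable template; the wording below is an illustrative paraphrase of the prompt just described, not an official template:

```python
# Hypothetical summarizer-pattern prompt: digest the docs, then immediately
# flag supply-side risks. Wording is an illustrative paraphrase.

SUMMARIZER_PATTERN = """Summarize this project's tokenomics, its emissions
schedule, and particularly the unlock risks, using ONLY the documents below.

DOCUMENTS:
{documents}

Then, based specifically on the potential supply shock from unlocks and the
vesting periods you found, list five specific red flags or events that could
cause a major price drawdown."""

def summarizer_prompt(documents):
    return SUMMARIZER_PATTERN.format(documents=documents)

print(summarizer_prompt("(paste tokenomics excerpts or official links here)"))
```

The twist is entirely in the second paragraph: without it you get a passive summary; with it you get the prioritized warning list.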
24:04 Okay, pattern number two. Second, we call it the comparator pattern. This is fantastic for getting quick context on unit economics and the competitive landscape. The prompt needs to be highly structured, something like: "Create a detailed, feature-by-feature table comparing Project A (our target) versus its main competitors, Project B and Project C. Include columns for consensus mechanism, core fee model, estimated treasury runway in months, top-holder concentration percentage, and stated roadmap maturity for Q4. Use only the pasted notes and official links I am providing now." Okay, very specific columns, and restricting it to the provided info. Yes, that prevents hallucination.

24:41 This instantly visualizes the competitive field. It helps you understand relative value without spending hours manually building a comparison spreadsheet yourself. You know, if Project A has maybe a three-year treasury runway based on its docs, but competitor Project B only has six months left, that's a huge comparative insight, right there, instantly surfaced. Super valuable for relative assessment.
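The kind of table the comparator pattern asks for can be sketched directly. The project names and every figure below are made up purely to show the structure being requested from the LLM:

```python
# Hypothetical competitor-comparison table with invented values, showing the
# columns the comparator-pattern prompt specifies.

COLUMNS = ["project", "consensus", "fee_model", "runway_months",
           "top_holder_pct", "roadmap_maturity"]

rows = [
    {"project": "Project A", "consensus": "PoS", "fee_model": "burn + stakers",
     "runway_months": 36, "top_holder_pct": 18, "roadmap_maturity": "mainnet v2"},
    {"project": "Project B", "consensus": "PoS", "fee_model": "treasury capture",
     "runway_months": 6, "top_holder_pct": 41, "roadmap_maturity": "testnet"},
]

def to_markdown(rows, columns):
    """Render the comparison as a markdown table."""
    lines = ["| " + " | ".join(columns) + " |",
             "| " + " | ".join("---" for _ in columns) + " |"]
    for r in rows:
        lines.append("| " + " | ".join(str(r[c]) for c in columns) + " |")
    return "\n".join(lines)

print(to_markdown(rows, COLUMNS))
```

In the invented data, the 36-month vs. 6-month runway gap is exactly the kind of comparative insight the hosts say the table should surface at a glance.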
25:03 Definitely superior to manual spreadsheet building for every single competitor. Okay, and the third and final pattern? This one we call the signal-to-noise pattern, or maybe the devil's advocate pattern. It's specifically designed to stress-test your own inherent human bias towards the project you've just spent hours researching. Ah, fighting confirmation bias again. How does this one work?

25:24 The prompt goes something like this: "Okay, here are the key on-chain metrics I've verified (e.g., active addresses up 20 percent month over month, but unique senders are flat), and here are the Nansen wallet-label observations (e.g., smart-money accumulation is minimal, mostly retail buying). Now, based only on this data, argue against investing in this project right now. Specifically, what negative narrative or interpretation would invalidate the apparent bull case, even if the price is currently rising?"

25:51 Wow. Okay, so you're explicitly telling the AI to build the bear case using the data you found, even if some of it looks positive on the surface. Precisely. It forces the AI, and more importantly forces you, to rigorously challenge your own bias, to give significant weight to any contradictory evidence or alternative interpretations. It ensures you're not just seeing what you want to see. Okay, these patterns
26:13 are great. Let's try to bring this entire five-step workflow, including the AI synthesis, to life. Can we walk through two hypothetical case studies, maybe one success and one near miss where the process saves someone from a bad investment? Absolutely. These kinds of stress tests really show the workflow in action.

26:31 All right, case study one; let's call it "smart money confirms the thesis." This is our hypothetical success story. We start at step one, the Messari screener. It flags, let's say, a layer 2 focused on data infrastructure. It ticks the boxes: low fully diluted valuation relative to the actual fees it's generating and, crucially, it seems to be flying completely under the radar. Barely any mentions on social media, low analyst coverage. Okay, good candidate.

26:53 We move to step two, on-chain validation. We pull up the relevant Dune dashboards, and the data is compelling. We see a consistent, undeniable upward trend: daily active users and the total fees generated by the protocol have been rising steadily, week over week, for maybe nine consecutive weeks. Nine weeks straight? Yeah. A genuine, sustained sign of adoption. It strongly suggests the product-market fit is real, not just temporary hype. People are actually using it and paying for it.

27:21 Okay, strong fundamental signal. Now for step three, the Nansen confirmation: does smart money agree? And let's say in this case step three seals the deal. We check the Nansen wallet labels for this token, and we find evidence of repeat, maybe small but sustained, accumulation buys, specifically from, say, five different named sophisticated venture funds or market makers, happening over that same nine-week period.
27:47so not just one big whale buy but coordinated consistent accumulation from known smart players
27:53exactly that pattern provides significant validation so now we move to the ai synthesis steps four and five
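The three data checks in this case study can be expressed as a simple gate before any AI synthesis happens. Everything below is illustrative: the field names, the thresholds (FDV-to-fees under 50, at least five accumulating funds, nine rising weeks), and the numbers are assumptions, not real Messari, Dune, or Nansen output.

```python
# Hypothetical gate combining the first three workflow steps. All thresholds
# and field names are invented for illustration.

def passes_pre_ai_checks(candidate: dict) -> bool:
    """Return True only if all three verifiable-data checks line up."""
    # Step 1 (screener): cheap relative to the fees it generates, and quiet.
    screener_ok = (
        candidate["fdv_usd"] / max(candidate["annualized_fees_usd"], 1) < 50
        and candidate["weekly_social_mentions"] < 200
    )
    # Step 2 (on-chain): fees rising week over week for several weeks running.
    fees = candidate["weekly_fees_usd"]
    usage_ok = len(fees) >= 9 and all(b > a for a, b in zip(fees, fees[1:]))
    # Step 3 (smart money): multiple distinct labeled funds net accumulating.
    smart_ok = len(candidate["accumulating_fund_labels"]) >= 5
    return screener_ok and usage_ok and smart_ok

candidate = {
    "fdv_usd": 80_000_000,
    "annualized_fees_usd": 4_000_000,  # FDV/fees ratio = 20
    "weekly_social_mentions": 40,
    "weekly_fees_usd": [10, 12, 13, 15, 18, 20, 24, 27, 31],  # 9 rising weeks
    "accumulating_fund_labels": {"fund_a", "fund_b", "fund_c", "fund_d", "fund_e"},
}
print(passes_pre_ai_checks(candidate))  # → True
```

Only a candidate that clears all three gates moves on to the AI synthesis steps; any single failure is a reason to stop.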
27:59right you feed chat gpt the links to the dune dashboards showing the rising usage the summary of
28:04the nansen findings about fund accumulation and excerpts from the tokenomics explaining the fee capture
28:09the ai takes these inputs and what is it output it synthesizes them into a concise easily digestible investment
28:15thesis maybe highlighting the synergy between the growing user base and the validated smart money accumulation
28:23but crucially using maybe the summarizer pattern on the tokenomics docs the ai also spits out a clear
28:29prioritized risk list perhaps it flags a complex vesting schedule with a moderate unlock coming in four months
28:36this allows the investor to size their position appropriately acknowledging that specific risk alongside the clear growth signals
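The synthesis step described here amounts to assembling the already-verified findings into one structured prompt. A minimal sketch; the section headers and task wording are assumptions, not an official ChatGPT recipe, and the summaries are invented:

```python
# Build a synthesis prompt from verified inputs only. The model never sees
# anything the researcher has not already pulled from Dune/Nansen/the docs.

def build_synthesis_prompt(dune_summary: str, nansen_summary: str,
                           tokenomics_excerpt: str) -> str:
    return "\n\n".join([
        "You are assisting with crypto research. Use ONLY the data below.",
        "## On-chain usage (Dune)\n" + dune_summary,
        "## Smart-money flows (Nansen)\n" + nansen_summary,
        "## Tokenomics excerpt\n" + tokenomics_excerpt,
        "Tasks:\n"
        "1. Write a concise investment thesis tying usage growth to accumulation.\n"
        "2. List the top risks in priority order, including any vesting unlocks.\n"
        "3. Flag anything in the data that contradicts the thesis.",
    ])

prompt = build_synthesis_prompt(
    "Daily active users and protocol fees up 9 consecutive weeks.",
    "5 labeled funds net accumulating over the same 9 weeks.",
    "Fees accrue to stakers; a moderate unlock is scheduled in 4 months.",
)
print(prompt)
```

Task 3 is what forces the risk list out of the model rather than a one-sided pitch.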
28:43the whole process which might have taken weeks of manual digging and analysis gets compressed potentially into a single focused afternoon
28:51using this workflow that's the dream scenario efficiency plus conviction now let's flip it case study two the near miss this is how the
28:58workflow saves you capital helps you identify and consciously avoid a potential trap let's say we find a token that's
29:04generating absolutely massive sustained hype on social media twitter's buzzing influencers are shilling it the price chart looks like a rocket ship the
29:11fomo is intense ah yes the classic hype cycle okay but we stick religiously to the workflow no exceptions
29:19we move to step two the on-chain validation check and uh oh we find a stark contradiction between the social narrative
29:27and the actual usage reality what does dune show dune analytics shows maybe flat or even slightly declining
29:34active addresses over the past month and perhaps more worryingly a sharp drop in the number of unique
29:39senders fewer distinct wallets are actually interacting with the protocol so lots of talk but usage is
29:45stagnant or falling big disconnect huge disconnect people are talking but they aren't actually using
29:50the product then we check step three nansen what's the smart money doing amidst this hype and nansen confirms
29:57the worrying usage trend it shows clear net outflows from wallets labeled as smart money over the past month
30:02they're selling into the hype not buying wow so the verifiable data completely contradicts the social media
30:07narrative the noise is deafening but the actual signal is flashing bright red how does the ai help
30:13us navigate this in the risk phase steps four and five okay the user seeing these conflicting signals hype
30:19versus reality retail fomo versus smart money selling decides to run a heavy risk mitigation step using the ai
30:27they specifically ask chat gpt step five to help draft a pre-mortem analysis okay and as input for this
30:33pre-mortem they feed the ai the token's detailed vesting schedule and supply information which
30:39let's say was incredibly complex and buried deep in some governance forum post not easily visible hidden
30:45complexity exactly the ai perhaps using that summarizer pattern we discussed digests this complex supply
30:51info and immediately spots a major issue a huge previously unnoticed emissions overhang
30:57a massive multi-million dollar chunk of tokens is scheduled to unlock and hit the market next month
31:03the unlock bomb the unlock bomb completely obscured by all the social media hype the ai then helps
31:08outline the highly probable failure scenario the token price likely tanks hard upon this massive unlock
31:14regardless of fundamentals which were already questionable based on usage data and the result the result
31:18is clear the user armed with this structured analysis of the usage data smart money flows and the hidden
31:24supply risk surfaced by the ai confidently passes on the investment they avoid a potentially massive
31:31drawdown triggered by that mechanical supply shock that the hype mob completely missed that right there
31:36is the ultimate value proposition isn't it the workflow amplified by the ai didn't just tell you what
31:41to buy it accelerated and structured your decision to not buy which let's be honest is often the most
31:47profitable decision an investor can make in crypto saving capital is key absolutely and to execute
31:54this kind of advanced ai assisted strategy successfully week after week you need to be
31:59really surgical and rigorous about the specific metrics you track consistently we rely on what we
32:04call three mandatory data panels these move way beyond just simple price and volume okay let's break
32:09down these panels panel one the on-chain traction panel this is about proving real utility right
32:14correct here we prioritize those three key metrics we discussed active addresses transaction count and unique
32:20senders these are the vital signs of a protocol's health you know if a million addresses hold a token but only
32:28a few hundred are actually active daily it's mostly dormant holders not users exactly you've achieved
32:33distribution success maybe but not utility success and again we track this data over 4 to 12 weeks minimum
32:40daily volatility in these metrics is mostly noise the sustained multi-week trend is the signal we care about
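The "trend over noise" rule can be made concrete: average the daily readings into weeks and require several consecutive weekly increases. A minimal sketch with invented daily-active-address numbers:

```python
# Daily on-chain metrics are noisy; the signal is the multi-week trend.

def weekly_means(daily: list[float]) -> list[float]:
    """Collapse daily readings into weekly averages (drops a partial last week)."""
    return [sum(daily[i:i + 7]) / 7 for i in range(0, len(daily) - 6, 7)]

def sustained_uptrend(daily: list[float], min_weeks: int = 4) -> bool:
    """True if every complete week averaged higher than the one before it."""
    weeks = weekly_means(daily)
    return len(weeks) >= min_weeks and all(b > a for a, b in zip(weeks, weeks[1:]))

# Four weeks of daily active addresses: choppy day to day, rising week to week.
daily_active = [100, 90, 110, 95, 105, 98, 102,    # week 1, mean 100
                110, 100, 120, 105, 115, 108, 112,  # week 2, mean 110
                125, 110, 130, 115, 128, 118, 124,  # week 3
                140, 125, 145, 130, 142, 133, 139]  # week 4
print(sustained_uptrend(daily_active))  # → True
```

Note that many individual days here are down versus the prior day; the weekly aggregation is what filters that volatility out.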
32:47any more advanced metrics within this panel yeah for a more sophisticated look especially within layer two
32:52ecosystems you might dig deeper than just total transaction volume look at the distribution of gas usage among the different
32:58protocols built on that l2 is your target project consistently driving a significant maybe growing
33:04percentage of the total gas consumed on the network ah so is it becoming indispensable to the ecosystem precisely that
33:11suggests deep integration and real demand not just transient speculative traffic hopping between dApps okay
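That gas-share check is just a ratio tracked over time. A sketch with hypothetical weekly figures (protocol gas used versus total gas consumed on the L2):

```python
# Is the target protocol consuming a growing share of its L2's total gas?
# All numbers are invented for illustration.

def gas_share(protocol_gas: float, total_l2_gas: float) -> float:
    """Protocol's percentage of all gas consumed on the L2 in a period."""
    return 100.0 * protocol_gas / total_l2_gas

# Weekly (protocol_gas, total_l2_gas) pairs for a hypothetical target.
weeks = [(1.2e9, 40e9), (1.6e9, 41e9), (2.1e9, 42e9), (2.8e9, 43e9)]
shares = [gas_share(p, t) for p, t in weeks]
growing = all(b > a for a, b in zip(shares, shares[1:]))
print([round(s, 2) for s in shares], growing)
```

A rising share even while total network gas grows is the "becoming indispensable" signal; a flat or falling share suggests transient traffic.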
33:17great panel two panel two holder quality and concentration this is where the nansen layer or similar analytics comes
33:24in heavily it's about analyzing who owns the supply and critically where it's moving what are we tracking here
33:31first track the top 10 or top 20 holder percentage over time if just a handful of early whales or the team wallet
33:38dominate the liquid supply big red flag centralization risk huge red flag centralization risk potential exit
33:44liquidity issues down the line but more specifically we track the number of distinct wallets labeled as fund
33:50market maker or smart money that are net accumulating versus those that are net distributing each week or month
33:56tracking the count of smart wallets accumulating not just the total volume yes because it shows the breadth of
34:01conviction is it just one fund buying or are multiple sophisticated players seeing the same opportunity
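Counting distinct accumulating wallets per week, rather than summing volume, might look like this; the wallet labels and flows are made up, standing in for a Nansen-style export:

```python
# Breadth of conviction: per week, count distinct labeled wallets that are
# net buyers versus net sellers.
from collections import defaultdict

def weekly_breadth(flows: list[tuple[str, str, float]]) -> dict[str, tuple[int, int]]:
    """flows: (week, wallet_label, net_token_flow). Returns week -> (accumulators, distributors)."""
    net: dict[tuple[str, str], float] = defaultdict(float)
    for week, wallet, amount in flows:
        net[(week, wallet)] += amount
    counts: dict[str, list[int]] = defaultdict(lambda: [0, 0])
    for (week, _), total in net.items():
        if total > 0:
            counts[week][0] += 1
        elif total < 0:
            counts[week][1] += 1
    return {week: (acc, dist) for week, (acc, dist) in counts.items()}

flows = [
    ("2024-W01", "fund_a", 50_000), ("2024-W01", "fund_b", 20_000),
    ("2024-W01", "mm_x", -5_000),
    ("2024-W02", "fund_a", 30_000), ("2024-W02", "fund_c", 10_000),
    ("2024-W02", "fund_b", 15_000),
]
print(weekly_breadth(flows))
```

Two weeks with three small independent accumulators each says more about shared conviction than one week with a single large buy.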
34:08this allows us to construct something really powerful we call the smart money heat map smart money heat
34:13map tell me more that sounds advanced it is we track these weekly net flows from labeled smart money
34:18wallets but we segment them by protocol sector l2s depin rwa gaming defi lending etc why track fund flows by
34:27sector what does that tell us it helps you spot potential sector rotations early often before the narrative hits
34:34mainstream crypto media or influencers for instance if you consistently see net outflows from wallets labeled as
34:41active in defi lending protocols for several weeks but at the same time you see sustained net inflows into
34:49wallets known to be active in depin and rwa projects that suggests the big money is repositioning shifting focus
34:56exactly it suggests where sophisticated capital believes the next major growth narrative might emerge
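A stripped-down version of that heat map: weekly net smart-money flows summed per sector. Sector names and dollar amounts here are hypothetical:

```python
# Smart money heat map: week -> sector -> net labeled-wallet flow (USD).
from collections import defaultdict

def sector_heat_map(flows: list[tuple[str, str, float]]) -> dict[str, dict[str, float]]:
    """flows: (week, sector, net_usd_flow). Returns week -> sector -> net flow."""
    heat: dict[str, dict[str, float]] = defaultdict(lambda: defaultdict(float))
    for week, sector, usd in flows:
        heat[week][sector] += usd
    return {week: dict(sectors) for week, sectors in heat.items()}

flows = [
    ("W1", "defi_lending", -2_000_000), ("W1", "rwa", 1_500_000),
    ("W2", "defi_lending", -3_500_000), ("W2", "rwa", 2_200_000),
    ("W2", "depin", 800_000),
]
heat = sector_heat_map(flows)
# Sustained outflows from one sector alongside growing inflows into others
# is the early-rotation pattern described above.
print(heat)
```

In this toy data the defi_lending outflow accelerates while rwa and depin inflows build, which is exactly the repositioning signature to look for across real weekly exports.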
35:03this gives you valuable lead time helping you prioritize opportunities from your step one screened list that align with
35:08these nascent institutional flows that's a really powerful edge okay panel three panel three emission and unlock calendar
35:15this is all about the unit economics of supply understanding dilution is non-negotiable if you want to avoid being blindsided
35:21where do we get this data reliably this is primarily derived from meticulous reading of official tokenomics documents
35:27governance forums and often aggregated by data providers like messari you need to map out the
35:32cumulative circulating supply versus the fully diluted valuation fdv at key future dates mapping out future
35:40supply yes and specifically you must calculate the expected sell pressure from major upcoming scheduled unlocks
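That sell-pressure calculation is simple arithmetic: unlock size times price, then compared against typical trading volume. The figures below are invented:

```python
# Quantify the mechanical sell pressure from a scheduled unlock.

def unlock_pressure(tokens_unlocking: float, price_usd: float,
                    avg_daily_volume_usd: float) -> tuple[float, float]:
    """Return (unlock value in USD, unlock value as a multiple of daily volume)."""
    value = tokens_unlocking * price_usd
    return value, value / avg_daily_volume_usd

value, volume_multiple = unlock_pressure(
    tokens_unlocking=25_000_000,   # tokens scheduled to hit the market
    price_usd=1.20,
    avg_daily_volume_usd=3_000_000,
)
print(f"${value:,.0f} unlocking, {volume_multiple:.1f}x average daily volume")
```

An unlock worth many multiples of daily volume, as here, is supply the growth story has to plausibly absorb; a fraction of one day's volume is usually noise.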
35:47for example if your analysis reveals that say 50 percent of the total remaining unvested token supply
35:53unlocks for early investors in the team over the next four months you need to quantify that you need
35:58to quantify the approximate usd value of that unlock at current prices to understand the sheer scale of
36:03the potential mechanical sell pressure hitting the market the ai can help process the complex unlock
36:09schedule details from the docs but the human researcher has to decide if the fundamental growth story from step two
36:15is plausibly strong enough to absorb that incoming supply without cratering the price right growth versus
36:20supply pressure it's a balancing act yeah now this whole workflow is designed to find hidden gems which
36:27often means smaller market caps potentially lower liquidity we absolutely must talk about the safety
36:39checks and ethical considerations here couldn't agree more the final safety requirements are paramount especially
36:39in this segment of the market first you must be hyper alert for potential pump and dump dynamics these thrive
36:46in low liquidity environments a quick check can a relatively small investment small relative to your
36:52typical position size significantly move the price of the token on the main exchanges if yes if yes that
36:58signals a thin liquidity environment it's a major risk your investment itself could be potentially illiquid when
37:03you need to exit proceed with extreme caution or more likely just pass makes sense and what about that
37:09concentration rule you mentioned derived from panel 2 data yes the holder concentration if your analysis
37:14using nansen or similar tools reveals that the top say 10 non-exchange non-protocol wallets control a
37:22significant overwhelming percentage of the liquid supply the definition of significant depends on your risk
37:28tolerance but generally anything above 50 percent concentrated in so few hands is highly dangerous
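The concentration check itself is a short computation over a ranked balance list; these balances are invented:

```python
# Holder-concentration red flag: share of supply held by the top N wallets
# (after excluding exchange and protocol wallets, which this toy list assumes).

def top_n_concentration(balances: list[float], n: int = 10) -> float:
    """Percentage of total supply held by the n largest wallets."""
    ranked = sorted(balances, reverse=True)
    return 100.0 * sum(ranked[:n]) / sum(balances)

# Three whales plus a long tail of small holders.
balances = [4_000_000, 3_000_000, 2_000_000] + [10_000] * 100
share = top_n_concentration(balances, n=10)
print(f"top-10 wallets hold {share:.1f}% of supply")
```

Here the top ten wallets hold roughly 90% of supply, far beyond the 50 percent danger line discussed above, so under this workflow the verdict would be to pass.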
37:33what's the verdict it's a clear unambiguous risk indicator the potential for manipulation for coordinated
37:39dumping or just massive sell pressure if one whale exits is simply too high our definitive recommendation
37:45based on analyzing countless past failures is usually just to pass on that opportunity find something with
37:50healthier distribution and the final guiding principle here the mantra that must be followed
37:55religiously throughout this entire process use ai to explain risk never ever to ignore it the llm should
38:02make you more cautious more structured more aware of potential downsides not less disciplined or more
38:08prone to fomo so just to recap our journey today the real transformative power of ai in crypto research as we
38:15see it isn't really in primary data collection it's purely in the synthesis the structured analysis and maybe
38:22most uniquely in facilitating that psychological risk management piece right it takes that complex verified data you feed it
38:29and helps rapidly construct a clear defensible thesis and just as importantly a clear risk profile
38:36but the power of the overall workflow itself lies in its disciplined structure combining the high-level screening from tools like
38:42messari with the granular usage proof from dune and that critical smart money validation from nansen they have to work together
38:49without those verifiable data inputs feeding the machine the ai is at best a sophisticated distraction and at
38:55worst a dangerous source of confident sounding misinformation the key takeaway then is really
39:00understanding that these tools the data platforms and the ai are separate but they're critically interdependent you need
39:07the whole chain when used correctly together they form a pretty robust shield against market noise
39:13hype and purely emotional trading decisions well said so as we wrap up we want to leave you with a
39:18provocative thought something to mull over forcing you to choose signal over noise in your own process
39:25imagine this scenario nansen's labeled smart money wallets are clearly actively exiting a token you're
39:31watching net outflows week after week but at the same time social media sentiment is absolutely
39:37mooning it's pure rocket emojis to the moon posts fueled by hype and fomo which signal do you follow
39:42the verifiable flow of sophisticated capital heading for the exits or the deafening roar of the crowd
39:48narrative that's the kind of hard question this structured workflow forces you to confront and
39:53hopefully answer rationally long before you click buy that's the test we really hope this deep dive into ai
39:59assisted crypto research has been valuable maybe giving you a concrete framework or refining your own
40:05process if this workflow helped you define your search criteria better or gave you specific metrics
40:11to start tracking or maybe just clarified how to actually use these powerful tools like chat gpt nansen
40:19and dune together yeah well we need your help too yeah absolutely engaging with this deep dive yeah and
40:26subscribing to the channel if you haven't already dropping a comment below maybe letting us know which metric
40:30you think is the absolute must have before buying or just hitting that like button it genuinely helps
40:36support us it really does it boosts our visibility in the algorithms lets youtube know you value this
40:41kind of content and frankly it allows us to keep creating this sort of quality research heavy crypto
40:46analysis for you we love doing it but your engagement makes it possible we appreciate it so thanks
40:51for diving deep with us today we'll catch you on the next one