
Missing logs_by_topic table #316

Open
johnayoung opened this issue Feb 2, 2022 · 3 comments

Comments
johnayoung commented Feb 2, 2022

Hi all,

When attempting to run the DAGs for the first time, we are unable to see where the "logs_by_topic" table is getting populated for our log event.

WITH parsed_logs AS (
  SELECT
      logs.block_timestamp AS block_timestamp
      ,logs.block_number AS block_number
      ,logs.transaction_hash AS transaction_hash
      ,logs.log_index AS log_index
      ,logs.address AS contract_address
      ,`<project-id>-internal.ethereum_<entity>_blockchain_etl.parse_<smart-contract>_event_<event-name>`(logs.data, logs.topics) AS parsed
  FROM `<project-id>-internal.crypto_ethereum_partitioned.logs_by_topic_0x8c5` AS logs
  WHERE
      address IN (lower('<address>'))
      AND topics[SAFE_OFFSET(0)] = '<topic>'
      -- live
)
SELECT
     block_timestamp
     ,block_number
     ,transaction_hash
     ,log_index
     ,contract_address
     ,parsed.owner AS `owner`
     ,parsed.spender AS `spender`
     ,parsed.value AS `value`
FROM parsed_logs
WHERE parsed IS NOT NULL

The section in question is this one:

...
FROM `<project-id>-internal.crypto_ethereum_partitioned.logs_by_topic_0x8c5` AS logs
...

We know that this is part of the "LIVE" realtime update section, but we can't see what actually populates the table with the topics we specify. Is this being done in a different repo?
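For context, a table named like `crypto_ethereum_partitioned.logs_by_topic_0x8c5` could plausibly be materialized by filtering a full logs table on the leading bytes of `topics[0]`. The following BigQuery DDL is only a hypothetical sketch of that idea, not the actual ETL job; the destination table name mirrors the one referenced in the query above, and the source is the public `bigquery-public-data.crypto_ethereum.logs` table (everything else here is an assumption):

```sql
-- Hypothetical sketch: materialize a per-topic-prefix logs partition.
-- Filters the public Ethereum logs table to events whose first topic
-- starts with 0x8c5, and partitions the result by day for cheap scans.
CREATE OR REPLACE TABLE
  `<project-id>-internal.crypto_ethereum_partitioned.logs_by_topic_0x8c5`
PARTITION BY DATE(block_timestamp)
AS
SELECT
    block_timestamp
    ,block_number
    ,transaction_hash
    ,log_index
    ,address
    ,data
    ,topics
FROM `bigquery-public-data.crypto_ethereum.logs`
WHERE STARTS_WITH(topics[SAFE_OFFSET(0)], '0x8c5');
```

A real pipeline would presumably run an incremental version of this (appending only new blocks) on a schedule rather than rebuilding the table each time.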

@medvedev1088 (Member)

@johnayoung (Author)

Thanks a ton @medvedev1088 for the quick response.

So am I correct in assuming:

Are both of these repos set up enough that we can plug in our own implementation? We plan on contributing to the dataset ecosystem, but need a custom implementation for some edge cases.

@medvedev1088 (Member)

@johnayoung yes, those assumptions are correct. The code in the repos is sufficient to set the system up.

On your last point, what datasets are you planning on contributing?
