Hi,
Thank you for this very nice tool.
I want to use the MicrobialDB pre-built database to detect/quantify reads in my FASTQ files.
I am able to run the tool, but it seems that the database and the taxonomy file do not use the same tax IDs.
For instance, out of 58675642 reads, 54123119 (~92%) are classified as tax ID 9606, which I know is human.
However, this tax ID is not present in the ktaxonomy.tsv file that is paired with the database.
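In case it is useful, this is roughly how I checked (a minimal Python sketch; I am assuming the tax ID is the first field on each line of ktaxonomy.tsv, and I split on both tabs and "|" since I am not sure of the exact delimiter):

```python
import re

# Assumption: the tax ID is the first field on each line of ktaxonomy.tsv;
# splitting on both tabs and "|" in case the delimiter differs.
TARGET = "9606"  # Homo sapiens

hits = []
with open("ktaxonomy.tsv") as fh:
    for line in fh:
        fields = [f.strip() for f in re.split(r"[\t|]", line) if f.strip()]
        if fields and fields[0] == TARGET:
            hits.append(line.rstrip("\n"))

print(f"found {len(hits)} line(s) for tax ID {TARGET}")
```

This finds no line for 9606, which is how I concluded it is missing.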
I downloaded the database and the indexes from https://benlangmead.github.io/aws-indexes/k2:
https://genome-idx.s3.amazonaws.com/kraken/uniq/krakendb-2023-08-08-MICROBIAL/database.kdb
https://genome-idx.s3.amazonaws.com/kraken/uniq/krakendb-2023-08-08-MICROBIAL/kuniq_microbialdb_minus_kdb.20230808.tgz
I was hoping to find human in both, since it is mentioned in the description of the index.
Do you maybe have an up-to-date database, by any chance? Or perhaps some directions on how to go about this?
Can I merge this ktaxonomy.tsv file with the nodes.dmp file from Kraken2, remove duplicates, and use it? I am not sure whether that would cause any issues.
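Something along these lines is what I have in mind (a rough sketch only; I do not know the exact column layout ktaxonomy.tsv is expected to have, so the appended entries would still need reformatting):

```python
import re

# Collect tax IDs already present in ktaxonomy.tsv (assuming the tax ID
# is the first tab/pipe-separated field on each line).
known = set()
ktax_lines = []
with open("ktaxonomy.tsv") as fh:
    for line in fh:
        ktax_lines.append(line)
        fields = [f.strip() for f in re.split(r"[\t|]", line) if f.strip()]
        if fields:
            known.add(fields[0])

# nodes.dmp uses the standard NCBI dump format: fields separated by
# "\t|\t", with tax_id first, parent tax_id second, and rank third.
with open("ktaxonomy.merged.tsv", "w") as out:
    out.writelines(ktax_lines)  # keep the original entries
    with open("nodes.dmp") as nodes:
        for line in nodes:
            tax_id, parent_id, rank = [f.strip() for f in line.split("\t|\t")[:3]]
            if tax_id not in known:  # skip duplicates
                # NOTE: plain tab-separated (tax_id, parent, rank) for now;
                # this would need to match the real ktaxonomy.tsv columns.
                out.write(f"{tax_id}\t{parent_id}\t{rank}\n")
```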
Thank you.
Best
Rodolfo