Py3.12 compat & improved API surface #54


Draft: wants to merge 3 commits into master

Conversation

@flaviut commented Feb 28, 2025

I'll be completely honest: this is mostly AI-generated and untested, except for the keyword_search and product_details endpoints, which work great. Based on #51.

The upstream API is not great in that it doesn't allow passing the client ID and secret as parameters, or specifying how the client credentials are stored. I'm not going to say this PR's API is ideal in that regard either; I'd much rather have a class dedicated to the storage strategy, so users of the client can store the data however they want, for example in a database.
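
To make that concrete, here's roughly the kind of storage interface I have in mind. This is only a sketch; TokenStorage and SqliteTokenStorage are placeholder names, not anything this PR actually implements:

import sqlite3
from abc import ABC, abstractmethod
from typing import Optional


class TokenStorage(ABC):
    """Storage strategy the API client would delegate to, instead of
    writing credentials/tokens to a fixed path on disk."""

    @abstractmethod
    def load_token(self) -> Optional[str]:
        """Return the cached OAuth token blob, or None if there isn't one."""

    @abstractmethod
    def save_token(self, token: str) -> None:
        """Persist the OAuth token blob."""


class SqliteTokenStorage(TokenStorage):
    """Example backend that keeps the token in a single-row SQLite table."""

    def __init__(self, conn: sqlite3.Connection):
        self.conn = conn
        self.conn.execute(
            "create table if not exists oauth_tokens "
            "(id integer primary key check (id = 1), token text)"
        )

    def load_token(self) -> Optional[str]:
        row = self.conn.execute(
            "select token from oauth_tokens where id = 1"
        ).fetchone()
        return row[0] if row else None

    def save_token(self, token: str) -> None:
        self.conn.execute(
            "insert into oauth_tokens (id, token) values (1, ?) "
            "on conflict(id) do update set token = excluded.token",
            (token,),
        )
        self.conn.commit()

The client would then accept a TokenStorage instance instead of a storage_path, and callers could plug in whatever backend they want.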

Usage example, including local caching so that I don't repeatedly hit the API with the same queries:

import pickle
import sqlite3
from pathlib import Path
from typing import Optional

# DigikeyApi, KeywordRequest, and Product are provided by this PR
# (import them from wherever your install exposes them).


class DigikeyParts:
    def __init__(self, data_dir: Path = Path("./data")):
        self.client = DigikeyApi(
            client_id="...",
            client_secret="...",
            storage_path=str(data_dir),
        )

        self.cache_conn = sqlite3.connect(data_dir / "digikey.sqlite3")
        self.cache_cursor = self.cache_conn.cursor()

        # Cache setup: one table per endpoint, keyed by the query string.
        self.cache_cursor.execute("pragma foreign_keys = on")
        self.cache_cursor.execute("pragma journal_mode = wal")
        self.cache_cursor.execute(
            "create table if not exists keyword_searches (id integer primary key, keyword text, products blob)"
        )
        self.cache_cursor.execute(
            "create table if not exists product_details (id integer primary key, digikey_pn text, product blob)"
        )
        self.cache_cursor.execute(
            "create index if not exists product_details_digikey_pn on product_details(digikey_pn)"
        )
        self.cache_cursor.execute(
            "create index if not exists keyword_searches_keyword on keyword_searches(keyword)"
        )

    def get_by_keyword(self, keyword: str) -> list[Product]:
        keyword = keyword.strip().lower()

        self.cache_cursor.execute(
            "select products from keyword_searches where keyword = ?", (keyword,)
        )
        if result := self.cache_cursor.fetchone():
            return pickle.loads(result[0])

        # Cache miss: hit the API and store the pickled result.
        response = self.client.keyword_search(KeywordRequest(keywords=keyword))
        products = response.products
        self.cache_cursor.execute(
            "insert into keyword_searches (keyword, products) values (?, ?)",
            (keyword, pickle.dumps(products, protocol=5)),
        )
        self.cache_conn.commit()
        return products

    def get_by_id(self, digikey_pn: str) -> Optional[Product]:
        digikey_pn = digikey_pn.strip().upper()

        self.cache_cursor.execute(
            "select product from product_details where digikey_pn = ?", (digikey_pn,)
        )

        if result := self.cache_cursor.fetchone():
            return pickle.loads(result[0])

        # Cache miss: hit the API and store the pickled result.
        response = self.client.product_details(digikey_pn)
        self.cache_cursor.execute(
            "insert into product_details (digikey_pn, product) values (?, ?)",
            (digikey_pn, pickle.dumps(response.product, protocol=5)),
        )
        self.cache_conn.commit()

        return response.product
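
And this is how it gets used; the keyword and part number below are made up, just to show the flow:

parts = DigikeyParts()

# The first call hits the Digi-Key API; repeated calls with the same
# (normalized) query are served from the SQLite cache.
resistors = parts.get_by_keyword("0603 10k resistor")
print(f"{len(resistors)} products found")

product = parts.get_by_id("EXAMPLE-PART-ND")
if product is not None:
    print(product)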
