# help-connector-development
y
Hello! I'm trying to update a custom image from Docker Hub, but I'm getting a "Get Spec job failed" error. The previous build worked correctly; the only difference between the images is in the installed libraries.
Failing build:
Successfully installed Deprecated-1.2.14 Jinja2-3.1.2 MarkupSafe-2.1.3 PyYAML-5.4.1 airbyte-cdk-0.39.2 airbyte-protocol-models-0.3.6 attrs-23.1.0 backoff-2.2.1 cachetools-5.3.1 cattrs-23.1.2 certifi-2023.5.7 charset-normalizer-3.1.0 dpath-2.0.8 exceptiongroup-1.1.1 genson-1.2.2 idna-3.4 isodate-0.6.1 jsonref-0.3.0 jsonschema-3.2.0 pendulum-2.1.2 platformdirs-3.5.1 pydantic-1.9.2 pyrsistent-0.19.3 python-dateutil-2.8.2 pytzdata-2020.1 requests-2.31.0 requests-cache-1.0.1 six-1.16.0  typing-extensions-4.6.3 url-normalize-1.4.3 urllib3-2.0.2 wrapt-1.15.0
Working build:
Successfully installed Deprecated-1.2.13 Jinja2-3.1.2 MarkupSafe-2.1.2 PyYAML-5.4.1 airbyte-cdk-0.37.0 airbyte-protocol-models-0.3.6 attrs-23.1.0 backoff-2.2.1 cachetools-5.3.0 cattrs-22.2.0 certifi-2023.5.7 charset-normalizer-3.1.0 dpath-2.0.8 exceptiongroup-1.1.1 genson-1.2.2 idna-3.4 isodate-0.6.1 jsonref-0.3.0 jsonschema-3.2.0 pendulum-2.1.2 platformdirs-3.5.1 pydantic-1.9.2 pyrsistent-0.19.3 python-dateutil-2.8.2 pytzdata-2020.1 requests-2.30.0 requests-cache-1.0.1 six-1.16.0  typing-extensions-4.5.0 url-normalize-1.4.3 urllib3-2.0.2 wrapt-1.15.0
Server log:
2023-06-05 15:37:44 ERROR i.a.s.a.ApiHelper(execute):37 - Unexpected Exception
java.lang.IllegalStateException: Get Spec job failed.
	at com.google.common.base.Preconditions.checkState(Preconditions.java:502) ~[guava-31.1-jre.jar:?]
	at io.airbyte.commons.server.converters.SpecFetcher.getSpecFromJob(SpecFetcher.java:14) ~[io.airbyte-airbyte-commons-server-0.41.0.jar:?]
	at io.airbyte.commons.server.handlers.SourceDefinitionsHandler.getSpecForImage(SourceDefinitionsHandler.java:300) ~[io.airbyte-airbyte-commons-server-0.41.0.jar:?]
	at io.airbyte.commons.server.handlers.SourceDefinitionsHandler.updateSourceDefinition(SourceDefinitionsHandler.java:245) ~[io.airbyte-airbyte-commons-server-0.41.0.jar:?]
	at io.airbyte.server.apis.SourceDefinitionApiController.lambda$updateSourceDefinition$8(SourceDefinitionApiController.java:141) ~[io.airbyte-airbyte-server-0.41.0.jar:?]
	at io.airbyte.server.apis.ApiHelper.execute(ApiHelper.java:23) ~[io.airbyte-airbyte-server-0.41.0.jar:?]
	at io.airbyte.server.apis.SourceDefinitionApiController.updateSourceDefinition(SourceDefinitionApiController.java:141) ~[io.airbyte-airbyte-server-0.41.0.jar:?]
	at io.airbyte.server.apis.$SourceDefinitionApiController$Definition$Exec.dispatch(Unknown Source) ~[io.airbyte-airbyte-server-0.41.0.jar:?]
	at io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invoke(AbstractExecutableMethodsDefinition.java:371) ~[micronaut-inject-3.8.5.jar:3.8.5]
	at io.micronaut.context.DefaultBeanContext$4.invoke(DefaultBeanContext.java:594) ~[micronaut-inject-3.8.5.jar:3.8.5]
	at io.micronaut.web.router.AbstractRouteMatch.execute(AbstractRouteMatch.java:303) ~[micronaut-router-3.8.5.jar:3.8.5]
	at io.micronaut.web.router.RouteMatch.execute(RouteMatch.java:111) ~[micronaut-router-3.8.5.jar:3.8.5]
	at io.micronaut.http.context.ServerRequestContext.with(ServerRequestContext.java:103) ~[micronaut-http-3.8.5.jar:3.8.5]
	at io.micronaut.http.server.RouteExecutor.lambda$executeRoute$14(RouteExecutor.java:659) ~[micronaut-http-server-3.8.5.jar:3.8.5]
	at reactor.core.publisher.FluxDeferContextual.subscribe(FluxDeferContextual.java:49) ~[reactor-core-3.5.0.jar:3.5.0]
	at reactor.core.publisher.InternalFluxOperator.subscribe(InternalFluxOperator.java:62) ~[reactor-core-3.5.0.jar:3.5.0]
	at reactor.core.publisher.FluxSubscribeOn$SubscribeOnSubscriber.run(FluxSubscribeOn.java:194) ~[reactor-core-3.5.0.jar:3.5.0]
	at io.micronaut.reactive.reactor.instrument.ReactorInstrumentation.lambda$init$0(ReactorInstrumentation.java:62) ~[micronaut-runtime-3.8.5.jar:3.8.5]
	at reactor.core.scheduler.WorkerTask.call(WorkerTask.java:84) ~[reactor-core-3.5.0.jar:3.5.0]
	at reactor.core.scheduler.WorkerTask.call(WorkerTask.java:37) ~[reactor-core-3.5.0.jar:3.5.0]
	at io.micronaut.scheduling.instrument.InvocationInstrumenterWrappedCallable.call(InvocationInstrumenterWrappedCallable.java:53) ~[micronaut-context-3.8.5.jar:3.8.5]
	at java.util.concurrent.FutureTask.run(FutureTask.java:317) ~[?:?]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
	at java.lang.Thread.run(Thread.java:1589) ~[?:?]
k
A message from kapa.ai
It seems like you're encountering the "Get Spec job failed" error when trying to update a custom connector in Airbyte. Based on the information provided, the exact cause isn't clear, but there are a few things you can try:
1. Make sure your custom connector image is available on the machine where Airbyte is running. You might need to pull the image manually to your server [source](https://airbytehq.slack.com/archives/C021JANJ6TY/p1665045044792859).
2. Check the logs for more information about the error. You can use `docker logs <container_name>` to view the logs of the Airbyte server container [source](https://discuss.airbyte.io/t/2248).
3. Ensure that your custom connector is built correctly and that all its dependencies are properly installed. You can refer to this [forum discussion](https://discuss.airbyte.io/t/2248), where a user faced a similar issue and resolved it by fixing the requirements in the `setup.py` file and updating the `source_specs.yaml` file.
If the issue persists, share more information about your custom connector and its configuration to get more specific help.
j
Please check whether the spec command works correctly when calling it from the command line like this:
docker run --rm -i <source-image-name> spec
y
@Joe Reuter (Airbyte) Status: Downloaded newer image for xxxxx/airbyte:source-reports-115
Traceback (most recent call last):
  File "/airbyte/integration_code/main.py", line 9, in <module>
    from source_reports import SourceReports
  File "/airbyte/integration_code/source_reports/__init__.py", line 1, in <module>
    from .source import SourceReports
  File "/airbyte/integration_code/source_reports/source.py", line 23, in <module>
    from source_reports.spec import ConnectorConfig
  File "/airbyte/integration_code/source_reports/spec.py", line 73, in <module>
    class ReportConfig(BaseModel):
  File "pydantic/main.py", line 205, in pydantic.main.ModelMetaclass.__new__
  File "pydantic/fields.py", line 491, in pydantic.fields.ModelField.infer
  File "pydantic/fields.py", line 421, in pydantic.fields.ModelField.__init__
  File "pydantic/fields.py", line 537, in pydantic.fields.ModelField.prepare
  File "pydantic/fields.py", line 641, in pydantic.fields.ModelField._type_analysis
  File "/usr/local/lib/python3.9/typing.py", line 852, in __subclasscheck__
    return issubclass(cls, self.__origin__)
TypeError: issubclass() arg 1 must be a class
j
That’s the problem - the connector code doesn’t work correctly and doesn’t return its spec
y
But no one changed the custom connector code; only the versions of the libraries that Airbyte pulls in have changed. It only works with: "Deprecated==1.2.13" "MarkupSafe==2.1.2" "cachetools==5.3.0" "cattrs==22.2.0" "requests==2.30.0" "typing-extensions==4.5.0" "airbyte-cdk==0.37.0"
spec.py:
import logging
from datetime import datetime, date
from enum import Enum
from typing import Any, Dict, List, Optional, Type, Union, Literal

from airbyte_cdk.sources.config import BaseConfig
from pydantic import BaseModel, Field, PositiveInt

from .fields import _metrics, _groups
logger = logging.getLogger("airbyte")

metrics = [(e["name"], e["name"]) for e in _metrics]
MetricsFields = Enum("MetricsFields", dict(metrics))

groups = [(e["name"], e["name"]) for e in _groups]
GroupsFields = Enum("GroupsFields", dict(groups))

reports = [(r,r) for r in ['cohorts', 'cohortsMonthly', 'cohortsWeekly', 'overview']]
ReportsFields = Enum("ReportsFields", dict(reports))

DATE_TIME_PATTERN = "^[0-9]{4}-[0-9]{2}-[0-9]{2}$"
EMPTY_PATTERN = "^$"

class AppConfig(BaseModel):
    """Config for custom app"""

    class Config:
        use_enum_values = True

    app_name: str = Field(
        title="Name",
        description="The name value of app",
    )

    token: str = Field(
        title="Token",
        description="The token",
    )

    events: List[str] = Field(
        title="Events",
        description="A list of events (only for cohorts report)",
        default=[],
    )

    reports: List[ReportsFields] = Field(
        title="Reports",
        description="Reports list for app"
    )

    start_date: Optional[date] = Field(
        title="Start Date",
        description="The date from which you'd like to replicate data for this stream, in the format YYYY-MM-DDT00:00:00Z.",
        pattern=DATE_TIME_PATTERN,
        examples=["2017-01-25"],
    )

    end_date: Optional[date] = Field(
        title="End Date",
        description=(
            "The date until which you'd like to replicate data for this stream, in the format YYYY-MM-DDT00:00:00Z. "
            "All data generated between the start date and this date will be replicated. "
            "Not setting this option will result in always syncing the latest data."
        ),
        pattern=DATE_TIME_PATTERN,
        examples=["2017-01-26"],
    )

class ReportConfig(BaseModel):
    """Config for custom app"""

    class Config:
        use_enum_values = True

    report: Literal['cohorts', 'overview', 'cohortsMonthly', 'cohortsWeekly'] = Field(
        title="Report",
        order=1,
        description="Report type"
    )

    metrics: List[MetricsFields] = Field(
        title="Metrics",
        order=2,
        description="A list of chosen fields for fields parameter",
        default=[],
    )

    groups: List[GroupsFields] = Field(
        title="Groups",
        order=3,
        description="A list of chosen groups",
        default=[],
    )

    additional_metrics: List[str] = Field(
        title="Additional metrics",
        order=4,
        description="A list of metrics",
        default=[],
    )

    additional_groups: List[str] = Field(
        title="Additional groups",
        order=5,
        description="A list of groups",
        default=[],
    )

    attribution_type: Union[None, Literal['click', 'impression', "all"]] = Field(
        title="Attribution Type",
        order=6,
        description="Attribution type (only for cohorts report)",
        default=None,
    )

    lookback_period: int = Field(
        title="Window",
        order=7,
        default=0,
        description="Include data window",
    )

    full_sync: bool = Field(
        title="Include all days in period",
        order=8,
        default=True,
        description="Include all days in period. Use this for first run.",
    )

class ConnectorConfig(BaseConfig):
    """Connector config"""

    class Config:
        title = "Source Adjust Reports"

    start_date: date = Field(
        title="Start Date",
        order=1,
        description=(
            "The date from which you'd like to replicate data for all incremental streams, "
            "in the format YYYY-MM-DDT00:00:00Z. All data generated after this date will be replicated."
        ),
        pattern=DATE_TIME_PATTERN,
        examples=["2017-01-25"],
    )

    access_token: str = Field(
        title="Access Token",
        order=2,
        description=(
            "The value of the access token generated. "
        ),
        airbyte_secret=True,
    )

    shift_days: int = Field(
        title="Shift Days",
        order=3,
        default=1,
        description="Include data whith shift days",
    )

    reports: Optional[List[ReportConfig]] = Field(
        title="Reports",
        order=4,
        description=(
            "A list of reports"
        ),
    )

    apps: Optional[List[AppConfig]] = Field(
        title="Apps",
        order=5,
        description=(
            "A list of apps"
        ),
    )
j
@Alexandre Girard (Airbyte) could you take a look at this? Maybe we introduced a bug in more recent versions of the CDK?
a
the issue appears to be with typing-extensions 4.6. I was able to reproduce the issue by running `pip install typing_extensions=='4.6.3'`. The long-term solution is probably to use pip-compile to freeze transitive dependencies, but I'll look into where the dependency comes from in case there's an easy fix
alternatively, you might be able to unblock yourself by forcing the dependency to `typing-extensions==4.5.0` in your connector's setup.py
I created an issue for us to use pip-compile to freeze transitive dependencies, but I would recommend you freeze the version of typing-extensions in your module. The upgrade to 4.6 doesn't seem to have broken any of the connectors in our repository, so it's possible a future CDK update would still break your custom connector.