# all-things-deployment
n
Hi Team. When updating or adding extraEnvs for datahub-gms, do I need to run the upgrade again with the same version? 🤔
b
yes
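For reference, a minimal sketch of that re-run. The release name (datahub), chart ref (datahub/datahub), and pinned version are all assumptions here; substitute your own. Re-running helm upgrade with the same --version re-renders the templates, so changes that only touch values (such as extraEnvs) still get applied:

```shell
# Assumed names: release "datahub", chart "datahub/datahub", version 0.2.182.
# The leading echo only prints the command; remove it to actually execute.
echo helm upgrade datahub datahub/datahub --version 0.2.182 -f values.yaml
```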
n
Thanks! However, it now returns Kafka topic errors that did not happen while installing DataHub. Anyway, I'll try to figure it out.
b
what kind of error did you encounter?
n
I guess the Kafka topic lists don't match. At install time, though, it was totally fine.
client.go:250: [debug] error updating the resource "datahub-datahub-gms":
	 cannot patch "datahub-datahub-gms" with kind Deployment: The order in patch list:
[map[name:DATAHUB_REVISION value:4] map[name:FAILED_METADATA_CHANGE_EVENT_NAME value:dcatalog-datahub-fmce] map[name:FAILED_METADATA_CHANGE_EVENT_NAME value:FailedMetadataChangeEvent_v4] map[name:METADATA_CHANGE_PROPOSAL_TOPIC_NAME value:MetadataChangeProposal_v1] map[name:METADATA_CHANGE_PROPOSAL_TOPIC_NAME value:catalog-datahub-mpe] map[name:FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME value:dcatalog-datahub-fmpe] map[name:FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME value:FailedMetadataChangeProposal_v1] map[name:METADATA_SERVICE_AUTH_ENABLED value:true]]
 doesn't match $setElementOrder list:
[map[name:SHOW_SEARCH_FILTERS_V2] map[name:SHOW_BROWSE_V2] map[name:BACKFILL_BROWSE_PATHS_V2] map[name:DATAHUB_UPGRADE_HISTORY_KAFKA_CONSUMER_GROUP_ID] map[name:DATAHUB_REVISION] map[name:ENABLE_PROMETHEUS] map[name:MCE_CONSUMER_ENABLED] map[name:MAE_CONSUMER_ENABLED] map[name:PE_CONSUMER_ENABLED] map[name:ENTITY_REGISTRY_CONFIG_PATH] map[name:DATAHUB_ANALYTICS_ENABLED] map[name:EBEAN_DATASOURCE_USERNAME] map[name:EBEAN_DATASOURCE_PASSWORD] map[name:EBEAN_DATASOURCE_HOST] map[name:EBEAN_DATASOURCE_URL] map[name:EBEAN_DATASOURCE_DRIVER] map[name:KAFKA_BOOTSTRAP_SERVER] map[name:KAFKA_SCHEMAREGISTRY_URL] map[name:SCHEMA_REGISTRY_TYPE] map[name:ELASTICSEARCH_HOST] map[name:ELASTICSEARCH_PORT] map[name:SKIP_ELASTICSEARCH_CHECK] map[name:ELASTICSEARCH_USE_SSL] map[name:GRAPH_SERVICE_IMPL] map[name:METADATA_CHANGE_EVENT_NAME] map[name:FAILED_METADATA_CHANGE_EVENT_NAME] map[name:METADATA_AUDIT_EVENT_NAME] map[name:DATAHUB_USAGE_EVENT_NAME] map[name:METADATA_CHANGE_PROPOSAL_TOPIC_NAME] map[name:FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME] map[name:METADATA_CHANGE_LOG_VERSIONED_TOPIC_NAME] map[name:METADATA_CHANGE_LOG_TIMESERIES_TOPIC_NAME] map[name:PLATFORM_EVENT_TOPIC_NAME] map[name:DATAHUB_UPGRADE_HISTORY_TOPIC_NAME] map[name:UI_INGESTION_ENABLED] map[name:SECRET_SERVICE_ENCRYPTION_KEY] map[name:UI_INGESTION_DEFAULT_CLI_VERSION] map[name:ELASTICSEARCH_QUERY_MAX_TERM_BUCKET_SIZE] map[name:ELASTICSEARCH_QUERY_EXACT_MATCH_EXCLUSIVE] map[name:ELASTICSEARCH_QUERY_EXACT_MATCH_WITH_PREFIX] map[name:ELASTICSEARCH_QUERY_EXACT_MATCH_FACTOR] map[name:ELASTICSEARCH_QUERY_EXACT_MATCH_PREFIX_FACTOR] map[name:ELASTICSEARCH_QUERY_EXACT_MATCH_CASE_FACTOR] map[name:ELASTICSEARCH_QUERY_EXACT_MATCH_ENABLE_STRUCTURED] map[name:ELASTICSEARCH_SEARCH_GRAPH_TIMEOUT_SECONDS] map[name:ELASTICSEARCH_SEARCH_GRAPH_BATCH_SIZE] map[name:ELASTICSEARCH_SEARCH_GRAPH_MAX_RESULT] map[name:SEARCH_SERVICE_ENABLE_CACHE] map[name:LINEAGE_SEARCH_CACHE_ENABLED] 
map[name:ELASTICSEARCH_INDEX_BUILDER_MAPPINGS_REINDEX] map[name:ELASTICSEARCH_INDEX_BUILDER_SETTINGS_REINDEX] map[name:ALWAYS_EMIT_CHANGE_LOG] map[name:GRAPH_SERVICE_DIFF_MODE_ENABLED] map[name:METADATA_CHANGE_EVENT_NAME] map[name:METADATA_AUDIT_EVENT_NAME] map[name:FAILED_METADATA_CHANGE_EVENT_NAME] map[name:METADATA_CHANGE_PROPOSAL_TOPIC_NAME] map[name:METADATA_SERVICE_AUTH_ENABLED] map[name:FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME] map[name:DATAHUB_USAGE_EVENT_NAME] map[name:KAFKA_MCE_TOPIC_NAME] map[name:KAFKA_FMCE_TOPIC_NAME] map[name:KAFKA_TOPIC_NAME]]
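For what it's worth, the patch list in the error already shows the problem: Kubernetes merges the container env array by its name key (a strategic merge patch), and three names appear twice, which makes the patch ambiguous. A quick sort/uniq check on the names copied from the error above makes the collisions visible (this is a sketch of the diagnosis, not a fix):

```shell
# Env names copied verbatim from the failing patch list above.
names='DATAHUB_REVISION
FAILED_METADATA_CHANGE_EVENT_NAME
FAILED_METADATA_CHANGE_EVENT_NAME
METADATA_CHANGE_PROPOSAL_TOPIC_NAME
METADATA_CHANGE_PROPOSAL_TOPIC_NAME
FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME
FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME
METADATA_SERVICE_AUTH_ENABLED'
# `uniq -d` prints only lines that occur more than once: these are the
# merge-key collisions that break the strategic merge patch.
printf '%s\n' "$names" | sort | uniq -d
# prints the three duplicated names:
#   FAILED_METADATA_CHANGE_EVENT_NAME
#   FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME
#   METADATA_CHANGE_PROPOSAL_TOPIC_NAME
```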
b
maybe you can inspect your revised chart by running helm template? I have not encountered this error before
n
do you mean that every env variable should also be set via helm extraEnvs in datahub-gms?
b
my understanding is that if values.yaml has the specific setting, then you do not need to add that particular env to gms (for instance, metadata authentication). i only suggest helm template so that you can preview whatever is being sent to the cluster
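A sketch of that preview workflow: helm template renders the manifests locally and applies nothing to the cluster, so you can scan the gms Deployment's env entries for names declared twice. The chart ref datahub/datahub and values.yaml are assumptions; the duplicate-spotting pipeline is demonstrated on a small inline sample of rendered output so the filtering step is concrete:

```shell
# Real usage (assumed names; renders locally, applies nothing):
#   helm template datahub datahub/datahub -f values.yaml \
#     | grep -E '\- name: ' | sort | uniq -d
# Demonstration of the same pipeline on a sample of what the rendered
# Deployment env section looks like:
rendered='        - name: METADATA_CHANGE_PROPOSAL_TOPIC_NAME
        - name: METADATA_CHANGE_PROPOSAL_TOPIC_NAME
        - name: DATAHUB_REVISION'
printf '%s\n' "$rendered" | grep -E '\- name: ' | sort | uniq -d
# prints only the duplicated entry (METADATA_CHANGE_PROPOSAL_TOPIC_NAME)
```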
n
That's what I expected too, so this is quite weird. 🤔
The strange thing is that the topics are declared twice:
[map[name:DATAHUB_REVISION value:4] map[name:FAILED_METADATA_CHANGE_EVENT_NAME value:dcatalog-datahub-fmce] map[name:FAILED_METADATA_CHANGE_EVENT_NAME value:FailedMetadataChangeEvent_v4] map[name:METADATA_CHANGE_PROPOSAL_TOPIC_NAME value:MetadataChangeProposal_v1] map[name:METADATA_CHANGE_PROPOSAL_TOPIC_NAME value:catalog-datahub-mpe] map[name:FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME value:dcatalog-datahub-fmpe] map[name:FAILED_METADATA_CHANGE_PROPOSAL_TOPIC_NAME value:FailedMetadataChangeProposal_v1] map[name:METADATA_SERVICE_AUTH_ENABLED value:true]]