Crowd linux /services folder merge #1934

Draft
wants to merge 1 commit into base: crowd-linux
Conversation

@epipav (Collaborator) commented Dec 7, 2023

Changes proposed ✍️

What

🤖[deprecated] Generated by Copilot at 9a84aea

This pull request adds features, fixes bugs, and improves performance in the automations worker and data sink worker services. For automations it adds support for creation-timestamp checks, custom workflow IDs, and reuse policies; for the data sink worker it adds delaying and retrying of results, soft deletion of activities, and metrics. It also migrates the services to pnpm as the package manager and updates the style and logic of several methods and queries. It modifies interfaces and queries in automation.repo.ts, types.ts, automation.service.ts, newActivityAutomations.ts, newMemberAutomations.ts, dataSink.repo.ts, activity.repo.ts, dataSink.service.ts, activity.service.ts, data.repo.ts, activity.data.ts, and dataSink.data.ts, and adds new classes and functions in package.json, processOldResults.ts, main.ts, default.json, and the conf and queue index.ts files. It also fixes bugs in triggerAutomation and member.repo.ts.

🤖[deprecated] Generated by Copilot at 9a84aea

This pull request has many changes
From pnpm to soft deletes and ranges
It adds DataSinkWorkerEmitter
And some metrics and triggers
And optimizes queries and engages

Why

How

🤖[deprecated] Generated by Copilot at 9a84aea

  • Migrate to pnpm as the package manager and update the dev:local and dev scripts in the package.json files.
  • Add the createdAt field to the IRelevantAutomationData interface and the getRelevantAutomations query in automation.repo.ts to store and fetch the creation timestamp of each automation.
  • Add the joinedAt field to the IMemberData interface in types.ts to store the timestamp of when a member joined a community.
  • Add the timestamp field to the IActivityData interface in types.ts to store the timestamp of when an activity was created.
  • Add a shouldProcess flag and guard conditions to the processNewMemberAutomation and processNewActivityAutomation methods in automation.service.ts that compare the creation timestamps of members, activities, and automations, log a warning, and skip processing when the member or activity was created before the automation (see the timestamp-guard sketch after this list).
  • Import WorkflowIdReusePolicy and workflowInfo in newActivityAutomations.ts and newMemberAutomations.ts, declare the info variable, and pass the workflowId, workflowIdReusePolicy, retry, and searchAttributes options to the executeChild calls in processNewActivityAutomation and processNewMemberAutomation, so the child workflows use a custom workflow ID and reuse policy, retry with the same maximum attempts as the parent workflow, and carry the tenant ID as a search attribute (see the Temporal child-workflow sketch after this list).
  • Add the @crowd/telemetry dependency to the package.json file in the data_sink_worker folder, and use it in dataSink.service.ts to measure and increment metrics for the data sink worker.
  • Add the maxStreamRetries option to the worker section of default.json, and add the IWorkerConfig interface and the WORKER_SETTINGS function to index.ts in the conf folder, so worker settings such as the maximum number of stream retries can be configured (see the worker-config sketch after this list).
  • Import DataSinkWorkerEmitter in main.ts, index.ts in the queue folder, and dataSink.service.ts; declare, initialize, and pass the dataSinkWorkerEmitter to the WorkerQueueReceiver constructor, the processOldResultsJob function, and the DataSinkService constructor; and handle the DataSinkWorkerQueueMessageType.CHECK_RESULTS message type in handleMessage, so the emitter can trigger result processing for delayed results.
  • Add the retries and delayedUntil fields to the IResultData interface in dataSink.data.ts, and add the getResult, resetResults, delayResult, and getDelayedResults methods to dataSink.repo.ts, to store, fetch, update, and query the retry count and delay timestamp of results (see the retry/delay sketch after this list).
  • Add a resultInfo parameter to the triggerResultError method in dataSink.service.ts, pass the result information to the error handler and to the markResultError and delayResult calls, and check the retry count so a result is delayed as long as it is below the maximum number of stream retries (also covered by the retry/delay sketch).
  • Add the deletedAt field to the IDbActivity interface in activity.data.ts, and add the findExisting method and query to activity.repo.ts, to support soft deletion of activities and to fetch the deletion timestamp of an existing activity (see the soft-delete sketch after this list).
  • Add a guard with a log.trace call to the processActivity method in activity.service.ts to skip processing when the existing activity has been soft deleted (also covered by the soft-delete sketch).
  • Change the workflowId option in the processNewMemberAutomation method in activity.service.ts to use the TemporalWorkflowId.NEW_MEMBER_AUTOMATION enum value, and import TemporalWorkflowId, so temporal workflow IDs come from a shared enum.
  • Change the SQL query in the findByEmail method in member.repo.ts to use the @> containment operator instead of = ANY, so the query can use the index on the emails array column (see the SQL sketch after this list).
  • Reduce the MAX_CONCURRENT_PROMISES and MAX_RESULTS_TO_LOAD constants in processOldResults.ts from 50 and 200 to 2 and 10, to lower the load on the database and the Temporal server when processing old results.
  • Use the loadChildTables parameter in the if condition of the getActivities method in data.repo.ts to check whether the engagement score should be calculated for each activity, avoiding unnecessary queries.
  • Remove the throw err statement from the catch block of the triggerAutomation method in automation.service.ts, to avoid rethrowing errors that are already handled by the triggerResultError method.
  • Remove the log.error call from the catch block of the processOldResultsJob function in processOldResults.ts, to avoid logging the same error twice, since the triggerResultError method already logs it.
  • Remove the i and batchLength variables from the processOldResultsJob function in processOldResults.ts and use resultsToProcess.length directly.
  • Remove the currentIndex, i, and batchLength variables from the processOldResultsJob function in processOldResults.ts to drop unnecessary logging and variables.
  • Rename the process variable to shouldProcess in the processNewMemberAutomation and processNewActivityAutomation methods in automation.service.ts for a more descriptive name.
  • Add an && activity.body condition to the if check in the processNewActivityAutomation method in automation.service.ts so keyword matching only runs when the activity has a body, avoiding errors when the body is null or undefined.
  • Add a guard to the processActivity method in activity.service.ts so the member sync is not triggered for a null or undefined member ID.
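
Timestamp-guard sketch. This is a minimal illustration of the shouldProcess check, not the actual automation.service.ts code: the IMemberData.joinedAt and IRelevantAutomationData.createdAt fields come from this PR, while the function shape, logger, and message wording are assumptions.

```typescript
// Sketch only: a "new member" automation should not fire for members who
// joined before the automation itself was created.
interface IMemberData {
  id: string
  joinedAt: string
}

interface IRelevantAutomationData {
  id: string
  createdAt: string
}

function shouldProcessNewMemberAutomation(
  member: IMemberData,
  automation: IRelevantAutomationData,
  log: { warn: (msg: string) => void },
): boolean {
  let shouldProcess = true

  // Skip members created before the automation, so old members do not
  // retroactively trigger it.
  if (new Date(member.joinedAt) < new Date(automation.createdAt)) {
    log.warn(`Member ${member.id} joined before automation ${automation.id} was created - skipping!`)
    shouldProcess = false
  }

  return shouldProcess
}
```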
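Temporal child-workflow sketch. A sketch of the executeChild options described above, assuming the Temporal TypeScript SDK; the child workflow name, input shape, search-attribute key, and reuse-policy value are assumptions, and depending on SDK version WorkflowIdReusePolicy may need to be imported from '@temporalio/common' instead.

```typescript
import { executeChild, workflowInfo, WorkflowIdReusePolicy } from '@temporalio/workflow'

// Hypothetical input shape standing in for the real worker types.
interface INewMemberInput {
  tenantId: string
  memberId: string
}

export async function triggerNewMemberAutomation(input: INewMemberInput): Promise<void> {
  const info = workflowInfo()

  await executeChild('newMemberAutomationWorkflow', {
    // Deterministic workflow ID so duplicate triggers for the same member collapse.
    workflowId: `new-member-automation/${input.tenantId}/${input.memberId}`,
    workflowIdReusePolicy: WorkflowIdReusePolicy.WORKFLOW_ID_REUSE_POLICY_ALLOW_DUPLICATE,
    // Give the child the same retry budget as the parent workflow.
    retry: { maximumAttempts: info.retryPolicy?.maximumAttempts },
    // Expose the tenant as a search attribute for Temporal visibility queries.
    searchAttributes: { TenantId: [input.tenantId] },
    args: [input],
  })
}
```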
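Worker-config sketch. A sketch of how the maxStreamRetries setting could be wired from default.json into a typed WORKER_SETTINGS accessor, assuming the node-config (config) package and esModuleInterop; only IWorkerConfig, WORKER_SETTINGS, and maxStreamRetries are named in the PR, everything else (including the example value) is illustrative.

```typescript
import config from 'config'

// conf/default.json (excerpt, value illustrative):
// {
//   "worker": {
//     "maxStreamRetries": 5
//   }
// }

export interface IWorkerConfig {
  maxStreamRetries: number
}

let workerSettings: IWorkerConfig | undefined

// Typed accessor so callers never read raw config keys directly.
export const WORKER_SETTINGS = (): IWorkerConfig => {
  if (workerSettings) return workerSettings
  workerSettings = config.get<IWorkerConfig>('worker')
  return workerSettings
}
```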
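Retry/delay sketch. The result-error flow can be summarized as: while the retry count is below maxStreamRetries, re-queue the result with a future delayedUntil; otherwise mark it as errored. This sketch uses hypothetical repository signatures and a simple linear backoff; the real delayResult, markResultError, and IResultData live in dataSink.repo.ts and dataSink.data.ts and may differ.

```typescript
// Hypothetical shapes mirroring the fields this PR adds.
interface IResultData {
  id: string
  retries: number
  delayedUntil: string | null
}

interface IDataSinkRepo {
  delayResult(resultId: string, delayUntil: Date): Promise<void>
  markResultError(resultId: string, error: unknown): Promise<void>
}

async function triggerResultError(
  repo: IDataSinkRepo,
  maxStreamRetries: number,
  resultInfo: IResultData,
  error: unknown,
): Promise<void> {
  if (resultInfo.retries + 1 <= maxStreamRetries) {
    // Back off before retrying: one minute per attempt in this sketch.
    const delaySeconds = (resultInfo.retries + 1) * 60
    const delayUntil = new Date(Date.now() + delaySeconds * 1000)
    await repo.delayResult(resultInfo.id, delayUntil)
  } else {
    // Out of retries: record the error so the result is not picked up again.
    await repo.markResultError(resultInfo.id, error)
  }
}
```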
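Soft-delete sketch. The guard in processActivity just checks the deletedAt timestamp returned by findExisting and bails out. The lookup keys, service shape, and log message below are assumptions; only deletedAt, findExisting, and the log.trace skip come from the PR.

```typescript
interface IDbActivity {
  id: string
  deletedAt: string | null
}

interface IActivityRepo {
  // Mirrors the findExisting method added to activity.repo.ts; lookup keys assumed.
  findExisting(tenantId: string, sourceId: string): Promise<IDbActivity | null>
}

async function processActivity(
  repo: IActivityRepo,
  log: { trace: (msg: string) => void },
  tenantId: string,
  sourceId: string,
): Promise<void> {
  const existing = await repo.findExisting(tenantId, sourceId)

  // Soft-deleted activities keep their row but carry a deletedAt timestamp;
  // they should not be reprocessed or resurrected here.
  if (existing && existing.deletedAt) {
    log.trace(`Activity ${existing.id} was soft deleted - skipping processing!`)
    return
  }

  // ... continue with the normal create/update handling
}
```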
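SQL sketch. The findByEmail change swaps an = ANY predicate for the array containment operator @>, which, unlike = ANY over a column, can be served by a GIN index on the emails array. A before/after sketch with pg-promise style named parameters; the surrounding query (selected columns, filters, limit) is assumed.

```typescript
// Before: value = ANY(column) cannot use a GIN index on the array column.
const findByEmailOld = `
  select m.id
  from members m
  where $(email) = any (m.emails)
  limit 1
`

// After: array containment (@>) can use a GIN index on m.emails.
const findByEmailNew = `
  select m.id
  from members m
  where m.emails @> array[$(email)]
  limit 1
`
```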

Checklist ✅

  • Label appropriately with Feature, Improvement, or Bug.
  • Add screenshots to the PR description for relevant FE changes.
  • New backend functionality has been unit-tested.
  • API documentation has been updated (if necessary) (see docs on API documentation).
  • Quality standards are met.
