chore: combine master and dev-upgrade-18 for rc6 (#10772)
### Description

For this special case of wanting to merge the current state of master
into dev-upgrade-18 for the new rc6, we are taking a different approach.

Step 1: Check out a new branch (mk/u18-merge) from master
Step 2: Merge dev-upgrade-18 into mk/u18-merge
Step 3: Manually resolve conflicts (keep the u18-suffixed @agoric/* deps and the upgrade-name changes to upgrade.go from dev-upgrade-18; take the rest from master)
Step 4: Create a PR from mk/u18-merge to be merged back into dev-upgrade-18
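The steps above can be replayed as a runnable sketch in a throwaway repository. The branch names (mk/u18-merge, dev-upgrade-18) come from this PR; the file contents and commit messages are invented for illustration, and the conflict-resolution step that is manual in the real workflow happens to be a clean merge here:

```shell
set -eu
repo=$(mktemp -d)
cd "$repo"
git init -q -b master
git config user.email you@example.com
git config user.name you

# Shared history: upgrade.go exists on master
echo 'upgrade name: rc5' > upgrade.go
git add upgrade.go
git commit -qm 'base'

# dev-upgrade-18 diverges with the upgrade-name change
git checkout -qb dev-upgrade-18
echo 'upgrade name: rc6' > upgrade.go
git commit -qam 'u18: rename upgrade'

# master diverges with unrelated work
git checkout -q master
echo 'other work' > other.txt
git add other.txt
git commit -qm 'master: unrelated work'

# Step 1: check out the new merge branch from master
git checkout -qb mk/u18-merge
# Step 2: merge dev-upgrade-18 into it
# (Step 3, manual conflict resolution, is a no-op here: the merge is clean)
git merge -q --no-edit dev-upgrade-18
# Step 4 in the real flow: push mk/u18-merge and open a PR into dev-upgrade-18
git log --oneline --graph
```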
mujahidkay authored Dec 24, 2024
2 parents aaebae4 + 38d4747 commit cc4b6b8
Showing 685 changed files with 57,689 additions and 9,565 deletions.
5 changes: 4 additions & 1 deletion .eslintignore
@@ -1,3 +1,6 @@
 # also ignored in packages/cosmic-proto/.eslintignore, but IDE's pick up the root config
-packages/cosmic-proto/dist
+packages/cosmic-proto/node_modules/
+packages/cosmic-proto/coverage/
+packages/cosmic-proto/dist/
+packages/cosmic-proto/proto/
 packages/cosmic-proto/src/codegen/
43 changes: 29 additions & 14 deletions .eslintrc.cjs
@@ -62,15 +62,12 @@ module.exports = {
   root: true,
   parser: '@typescript-eslint/parser',
   parserOptions: {
-    // Works for us!
-    EXPERIMENTAL_useProjectService: true,
+    useProjectService: true,
     sourceType: 'module',
-    project: [
-      './packages/*/tsconfig.json',
-      './packages/*/tsconfig.json',
-      './packages/wallet/*/tsconfig.json',
-      './tsconfig.json',
-    ],
+    projectService: {
+      allowDefaultProject: ['*.js'],
+      defaultProject: 'tsconfig.json',
+    },
     tsconfigRootDir: __dirname,
     extraFileExtensions: ['.cjs'],
   },
@@ -141,6 +138,9 @@ module.exports = {
     // CI has a separate format check but keep this warn to maintain that "eslint --fix" prettifies
     // UNTIL https://github.com/Agoric/agoric-sdk/issues/4339
     'prettier/prettier': 'warn',
+
+    // Not a risk with our coding style
+    'no-use-before-define': 'off',
   },
   settings: {
     jsdoc: {
@@ -174,8 +174,8 @@ module.exports = {
     {
       files: [
         'packages/**/demo/**/*.js',
-        'packages/*/test/**/*.js',
-        'packages/*/test/**/*.test.js',
+        'packages/*/test/**/*.*s',
+        'packages/*/test/**/*.test.*s',
         'packages/wallet/api/test/**/*.js',
       ],
       rules: {
@@ -185,12 +185,23 @@ module.exports = {
         // NOTE: This rule is enabled for the repository in general. We turn it
         // off for test code for now.
         '@jessie.js/safe-await-separator': 'off',
+
+        // Like `ava/no-only-test`, but works with @endo/ses-ava
+        'no-restricted-properties': [
+          'error',
+          {
+            object: 'test',
+            property: 'only',
+            message:
+              'Do not commit .only tests - they prevent other tests from running',
+          },
+        ],
       },
     },
     {
       // These tests use EV() instead of E(), which are easy to confuse.
       // Help by erroring when E() packages are imported.
-      files: ['packages/boot/test/**/*.test.*'],
+      files: ['packages/boot/test/**/*.test.*s'],
       rules: {
         'no-restricted-imports': [
           'error',
@@ -226,12 +237,16 @@ module.exports = {
     {
       files: ['*.d.ts'],
       rules: {
-        // Irrelevant in a typedef
-        'no-use-before-define': 'off',
         // Linter confuses the type declaration with value declaration
         'no-redeclare': 'off',
       },
     },
+    {
+      // disable type-aware linting for these files that can have a .d.ts twin
+      // because it can't go into tsconfig (because that would cause tsc build to overwrite the .d.ts twin)
+      files: ['exported.*', 'types-index.*', 'types-ambient.*', 'types.*'],
+      extends: ['plugin:@typescript-eslint/disable-type-checked'],
+    },
     {
       // disable type-aware linting in HTML
       files: ['*.html'],
@@ -243,7 +258,7 @@ module.exports = {
       files: ['a3p-integration/**'],
       extends: ['plugin:@typescript-eslint/disable-type-checked'],
       parserOptions: {
-        EXPERIMENTAL_useProjectService: false,
+        useProjectService: false,
         project: false,
       },
       rules: {
12 changes: 6 additions & 6 deletions .github/PULL_REQUEST_TEMPLATE.md
@@ -2,11 +2,6 @@
 v ✰ Thanks for creating a PR! ✰
 ☺ > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > -->

-<!-- Most PRs should close a specific issue. All PRs should at least reference one or more issues. Edit and/or delete the following lines as appropriate (note: you don't need both `refs` and `closes` for the same one): -->
-
-closes: #XXXX
-refs: #XXXX
-
 <!-- Integration testing generally doesn't run until a PR is labeled for merge, but can be opted into for every push by adding label 'force:integration', and can be customized to use non-default external targets by including lines here that **start** with leading-`#` directives:
 * (https://github.com/Agoric/documentation) #documentation-branch: $BRANCH_NAME
 * (https://github.com/endojs/endo) #endo-branch: $BRANCH_NAME
@@ -16,6 +11,11 @@ refs: #XXXX
 These directives should be removed before adding a merge label, so final integration tests run against default targets.
 -->

+<!-- Most PRs should close a specific issue. All PRs should at least reference one or more issues. Edit and/or delete the following lines as appropriate (note: you don't need both `refs` and `closes` for the same one): -->
+
+closes: #XXXX
+refs: #XXXX
+
 ## Description
 <!-- Add a description of the changes that this PR introduces and the files that are the most critical to review. -->

@@ -32,4 +32,4 @@ These directives should be removed before adding a merge label, so final integra
 <!-- Every PR should of course come with tests of its own functionality. What additional tests are still needed beyond those unit tests? How does this affect CI, other test automation, or the testnet? -->

 ### Upgrade Considerations
-<!-- What aspects of this PR are relevant to upgrading live production systems, and how should they be addressed? -->
+<!-- What aspects of this PR are relevant to upgrading live production systems, and how should they be addressed? What steps should be followed to verify that its changes have been included in a release (ollinet/emerynet/mainnet/etc.) and work successfully there? If the process is elaborate, consider adding a script to scripts/verification/. -->
65 changes: 65 additions & 0 deletions .github/actions/ci-test-result.cjs
@@ -0,0 +1,65 @@
+#! /usr/bin/env node
+const fs = require('node:fs');
+const process = require('node:process');
+const { sendMetricsToGCP, makeTimeSeries } = require('./gcp-monitoring.cjs');
+
+const resultFiles = process.argv.slice(2);
+
+const tapResultRegex = new RegExp(
+  `(^(?<status>not )?ok (?<num>[0-9]+) - (?<name>.+?)(?: %ava-dur=(?<duration>[0-9]+)ms)?(?:# (?<comments>.+?))?$(?<output>(\n^#.+?$)*)(?<failure>(\n^(?:(?!(?:not|ok) ))[^\n\r]+?$)*))`,
+  'gms',
+);
+let timeSeriesData = [];
+
+function processTAP(packageName, tapbody) {
+  let m;
+  const returnValue = [];
+  // eslint-disable-next-line no-cond-assign
+  while ((m = tapResultRegex.exec(tapbody))) {
+    if (m.groups.name) {
+      const testCaseName = `${m.groups.name}`.replace(/["<>]/g, '').trim();
+
+      let skipped = false;
+      let succeeded = true;
+      let todo = false;
+      if (m.groups.status) {
+        succeeded = false;
+      }
+      if (m.groups.comments) {
+        if (m.groups.comments.match(/SKIP/gi)) {
+          skipped = true;
+        }
+        if (m.groups.comments.match(/TODO/gi)) {
+          todo = true;
+          skipped = true;
+          succeeded = true;
+        }
+      }
+      returnValue.push({
+        labels: {
+          test_name: testCaseName,
+          package: packageName,
+          test_status:
+            succeeded && !(todo || skipped)
+              ? 'succeeded'
+              : !succeeded
+                ? 'failed'
+                : 'skipped',
+        },
+        value: Number(succeeded && !(todo || skipped)),
+      });
+    }
+  }
+  return returnValue;
+}
+
+for (const file of resultFiles) {
+  const resultsBody = fs.readFileSync(file, 'utf-8');
+  const packageName = file.split('/').at(-2);
+
+  const response = processTAP(packageName, resultsBody);
+  timeSeriesData.push(...response);
+}
+
+const timeSeries = makeTimeSeries(timeSeriesData);
+sendMetricsToGCP(timeSeries);
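The TAP-parsing core of ci-test-result.cjs can be exercised in isolation. This sketch applies a simplified version of the same regex (dropping the multi-line output/failure capture groups) to a hand-written TAP fragment; the sample input and result shape are invented for illustration:

```javascript
// Simplified form of the regex in ci-test-result.cjs: one TAP result line,
// with optional "not " failure prefix, ava duration suffix, and directive.
const tapLine =
  /^(?<status>not )?ok (?<num>[0-9]+) - (?<name>.+?)(?: %ava-dur=(?<duration>[0-9]+)ms)?(?:# (?<comments>.+?))?$/gm;

// Hand-written TAP fragment (made up for this example)
const sample = [
  'ok 1 - adds numbers %ava-dur=12ms',
  'not ok 2 - divides by zero',
  'ok 3 - flaky network case # SKIP',
].join('\n');

const results = [];
let m;
// eslint-disable-next-line no-cond-assign
while ((m = tapLine.exec(sample))) {
  results.push({
    name: m.groups.name.trim(),
    failed: Boolean(m.groups.status),
    skipped: /SKIP/i.test(m.groups.comments || ''),
    durationMs: m.groups.duration ? Number(m.groups.duration) : undefined,
  });
}
console.log(results);
// → one entry per TAP line: test 1 passed (12ms), test 2 failed, test 3 skipped
```

Note how the lazy `(?<name>.+?)` group lets the optional duration and directive suffixes claim their text, which is why the real script still `.trim()`s the captured name.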
169 changes: 169 additions & 0 deletions .github/actions/dump-ci-stats-to-gcp-metrics.cjs
@@ -0,0 +1,169 @@
+const Monitoring = require('@google-cloud/monitoring');
+
+const gcpCredentials = JSON.parse(process.env.GCP_CREDENTIALS);
+const monitoring = new Monitoring.MetricServiceClient({
+  projectId: gcpCredentials.project_id,
+  credentials: {
+    client_email: gcpCredentials.client_email,
+    private_key: gcpCredentials.private_key,
+  },
+});
+
+async function sendMetricsToGCP(metricType, metricValue, labels) {
+  const projectId = gcpCredentials.project_id;
+
+  const request = {
+    name: monitoring.projectPath(projectId),
+    timeSeries: [
+      {
+        metric: {
+          type: `custom.googleapis.com/github/${metricType}`,
+          labels: labels,
+        },
+        resource: {
+          type: 'global',
+          labels: {
+            project_id: projectId,
+          },
+        },
+        points: [
+          {
+            interval: {
+              endTime: {
+                seconds: Math.floor(Date.now() / 1000),
+              },
+            },
+            value: {
+              doubleValue: metricValue,
+            },
+          },
+        ],
+      },
+    ],
+  };
+  try {
+    await monitoring.createTimeSeries(request);
+    console.log(`Metric ${metricType} sent successfully.`);
+  } catch (error) {
+    console.error('Error sending metric:', error);
+  }
+}
+
+// Function to fetch workflow and job details via GitHub API
+async function fetchWorkflowDetails() {
+  const runId = process.argv[2];
+  const repo = process.env.GITHUB_REPOSITORY;
+  const apiUrl = `https://api.github.com/repos/${repo}/actions/runs/${runId}`;
+
+  try {
+    const response = await fetch(apiUrl, {
+      headers: {
+        Authorization: `Bearer ${process.env.GITHUB_TOKEN}`,
+        Accept: 'application/vnd.github.v3+json',
+      },
+    });
+
+    if (!response.ok) throw new Error(`HTTP error! status: ${response.status}`);
+    const data = await response.json();
+
+    return {
+      workflowId: data.id,
+      workflowName: data.name,
+      status: data.status, // "completed", "in_progress", etc.
+      conclusion: data.conclusion, // "success", "failure"
+      startTime: data.created_at,
+      endTime: data.updated_at,
+      trigger: data.event, // "push", "pull_request", etc.
+      jobs: await fetchJobDetails(repo, data.id), // Fetch individual job details
+    };
+  } catch (error) {
+    console.error('Error fetching workflow details:', error);
+    process.exit(1);
+  }
+}
+
+async function fetchJobDetails(repo, runId) {
+  const apiUrl = `https://api.github.com/repos/${repo}/actions/runs/${runId}/jobs`;
+
+  try {
+    const response = await fetch(apiUrl, {
+      headers: {
+        Authorization: `Bearer ${process.env.GITHUB_TOKEN}`,
+        Accept: 'application/vnd.github.v3+json',
+      },
+    });
+
+    if (!response.ok) throw new Error(`HTTP error! status: ${response.status}`);
+    const data = await response.json();
+    return data.jobs;
+  } catch (error) {
+    console.error('Error fetching job details:', error);
+    return [];
+  }
+}
+
+// Main function to send metrics
+(async () => {
+  try {
+    const workflowStats = await fetchWorkflowDetails();
+
+    const workflowLabels = {
+      workflow_name: workflowStats.workflowName,
+      workflow_id: workflowStats.workflowId,
+      trigger: workflowStats.trigger,
+    };
+
+    const workflowDuration =
+      (new Date(workflowStats.endTime) - new Date(workflowStats.startTime)) /
+      1000;
+    await sendMetricsToGCP(
+      'ci_workflow_duration',
+      workflowDuration,
+      workflowLabels,
+    );
+
+    for (const job of workflowStats.jobs) {
+      const jobLabels = {
+        workflow_name: workflowStats.workflowName,
+        job_name: job.name,
+        runner_name: job.runner_name,
+        conclusion: job.conclusion,
+      };
+
+      const jobExecutionTime =
+        (new Date(job.completed_at) - new Date(job.started_at)) / 1000;
+      await sendMetricsToGCP(
+        'ci_job_execution_time',
+        jobExecutionTime,
+        jobLabels,
+      );
+
+      // Send job status (1 for success, 0 for failure)
+      const jobStatus = job.conclusion === 'success' ? 1 : 0;
+      await sendMetricsToGCP('ci_job_status', jobStatus, jobLabels);
+
+      // Capture step-level metrics for step details per job
+      for (const step of job.steps) {
+        const stepExecutionTime =
+          (new Date(step.completed_at) - new Date(step.started_at)) / 1000;
+        const stepLabels = {
+          workflow_name: workflowStats.workflowName,
+          job_name: job.name,
+          step_name: step.name,
+          runner_name: job.runner_name,
+        };
+
+        await sendMetricsToGCP(
+          'ci_step_execution_time',
+          stepExecutionTime,
+          stepLabels,
+        );
+      }
+    }
+  } catch (error) {
+    console.error('Error in main function:', error);
+    process.exit(1);
+  }
+
+  process.exit(0);
+})();