Feat: modify workflow updateContributors to be triggered by a repository dispatch event and extract contributors from all public repos of frontendmu #248

Open
wants to merge 3 commits into
base: main
Changes from 1 commit
2 changes: 2 additions & 0 deletions .github/workflows/update-contributors.yml
@@ -4,6 +4,8 @@ on:
   pull_request_target:
     types:
      - closed
+  repository_dispatch:
+    types: [trigger-workflow]

 jobs:
   update-contributors:
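For reference, a `repository_dispatch` event with the `trigger-workflow` event type can be fired against this repository through the GitHub REST API. This is a sketch, assuming a personal access token with `repo` scope is exported as `GITHUB_TOKEN` beforehand:

```shell
# Fire the repository_dispatch event that the updated workflow listens for.
# Assumption: GITHUB_TOKEN holds a PAT with repo scope.
curl -sS -X POST \
  -H "Accept: application/vnd.github+json" \
  -H "Authorization: Bearer $GITHUB_TOKEN" \
  "https://api.github.com/repos/frontendmu/frontend.mu/dispatches" \
  -d '{"event_type":"trigger-workflow"}'
```

A successful dispatch returns HTTP 204 with no body; the `event_type` value must match one of the types listed under `repository_dispatch` in the workflow.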
91 changes: 73 additions & 18 deletions packages/frontendmu-data/scripts/update-contributors.js
@@ -1,43 +1,98 @@
 import fs from "fs";
 import { execSync } from "child_process";

-const owner = "Front-End-Coders-Mauritius";
+const owner = "frontendmu";
 const repo = "frontend.mu";
 const branch = "main"; // Replace with the default branch of your repository

 const contributorsFile = "./data/contributors.json";

 async function updateContributors() {
   try {
-    const response = await fetch(
-      `https://api.github.com/repos/${owner}/${repo}/contributors`
+    const allPublicRepositoriesList = await fetch(
+      `https://api.github.com/users/${owner}/repos`
     ).then((response) => response.json());
Comment on lines +16 to +18
⚠️ Potential issue

Handle potential pagination when fetching repositories

The GitHub API returns up to 30 repositories per request by default. If the user has more than 30 repositories, some repositories may be missed. Consider handling pagination or increasing the per_page parameter to ensure all repositories are fetched.

You can modify the fetch URL to include ?per_page=100:

- `https://api.github.com/users/${owner}/repos`
+ `https://api.github.com/users/${owner}/repos?per_page=100`

Or implement pagination as follows:

let allPublicRepositoriesList = [];
let page = 1;
let per_page = 100;
let hasMore = true;

while (hasMore) {
  const response = await fetch(
    `https://api.github.com/users/${owner}/repos?page=${page}&per_page=${per_page}`
  );
  const repos = await response.json();
  if (repos.length > 0) {
    allPublicRepositoriesList = allPublicRepositoriesList.concat(repos);
    page++;
  } else {
    hasMore = false;
  }
}


const allPublicRepositories = allPublicRepositoriesList.map(
(repo) => repo.name
Comment on lines +20 to +21

⚠️ Potential issue

Add error handling for fetching repositories

There's no check to verify if the repository fetch request was successful. If the API call fails or returns an error, the script may throw an exception when trying to process the response.

Consider adding error handling:

const response = await fetch(
  `https://api.github.com/users/${owner}/repos`
);
if (!response.ok) {
  throw new Error(`Error fetching repositories: ${response.status} ${response.statusText}`);
}
const allPublicRepositoriesList = await response.json();

);

-    const result = await response.json();
-    const contributors = result
-      .map((contributor) => {
-        return {
-          username: contributor.login,
-          contributions: contributor.contributions,
-        };
-      })
-      .filter((contributor) => {
-        // Exclude the following contributors from the list
-        const excludedContributors = ["actions-user", "github-actions[bot]"];
-        return !excludedContributors.includes(contributor.username);
// console.log("All public repositories:", allPublicRepositories);
// [
// '.github',
// 'branding',
// 'conference-2024',
// 'events',
// 'frontend.mu',
// 'frontendmu-daisy',
// 'frontendmu-nuxt',
// 'frontendmu-ticket',
// 'google-photos-sync',
// 'hacktoberfestmu-2019',
// 'meetupFEC',
// 'nuxt-workshop-devcon2024',
// 'nuxt-workshop-devcon2024-preparations',
// 'playground',
// 'vercel-og-next',
// 'video'
// ]

// const contributors = [];
const contributorsMap = {};

const excludedContributors = ["actions-user", "github-actions[bot]"];
const excludedRepositories = [".github", "google-photos-sync", "branding"];

for (const repo of allPublicRepositories) {
if (excludedRepositories.includes(repo)) {
continue;
}
const contributorsList = await fetch(
`https://api.github.com/repos/${owner}/${repo}/contributors`
).then((response) => response.json());

// const contributorsListFiltered = contributorsList
// .map((contributor) => {
// return {
// username: contributor.login,
// contributions: contributor.contributions,
// };
// })
// .filter((contributor) => {
// return !excludedContributors.includes(contributor.username);
// });
// contributors.push(...contributorsListFiltered);
// console.log(`Contributors for ${repo}:`, contributorsListFiltered);
// }
// const updatedContributors = [...new Set(contributors)];

contributorsList.forEach((contributor) => {
if (!excludedContributors.includes(contributor.login)) {
if (contributorsMap[contributor.login]) {
contributorsMap[contributor.login] += contributor.contributions;
} else {
contributorsMap[contributor.login] = contributor.contributions;
}
}
Comment on lines +34 to +41
🛠️ Refactor suggestion

Refactor contributor aggregation logic using a Map

While the current logic works, using a Map can make the code more efficient and cleaner when accumulating contributions.

Here's how you can implement it:

const contributorsMap = new Map();

contributorsList.forEach((contributor) => {
  if (!excludedContributors.includes(contributor.login)) {
    const currentContributions = contributorsMap.get(contributor.login) || 0;
    contributorsMap.set(contributor.login, currentContributions + contributor.contributions);
  }
});

Then, when creating updatedContributors:

const updatedContributors = Array.from(contributorsMap, ([username, contributions]) => ({
  username,
  contributions
}));
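As a quick sanity check, the Map-based aggregation suggested above behaves like this on sample data (the contributor objects and repository lists here are hypothetical, not taken from the live API):

```javascript
// Hypothetical contributor lists from two repositories.
const excludedContributors = ["actions-user", "github-actions[bot]"];
const lists = [
  [
    { login: "alice", contributions: 5 },
    { login: "github-actions[bot]", contributions: 99 },
  ],
  [
    { login: "alice", contributions: 3 },
    { login: "bob", contributions: 2 },
  ],
];

// Accumulate contributions per login, skipping excluded accounts.
const contributorsMap = new Map();
for (const contributorsList of lists) {
  contributorsList.forEach((contributor) => {
    if (!excludedContributors.includes(contributor.login)) {
      const current = contributorsMap.get(contributor.login) || 0;
      contributorsMap.set(contributor.login, current + contributor.contributions);
    }
  });
}

// Convert the Map back into the array shape the script writes to disk.
const updatedContributors = Array.from(contributorsMap, ([username, contributions]) => ({
  username,
  contributions,
}));
console.log(updatedContributors);
// alice accumulates 8 across both repos, bob has 2, and the bot is excluded
```

Because a Map preserves insertion order, the output order follows first appearance across repositories, which keeps the generated JSON stable between runs.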

});
}
⚠️ Potential issue

Handle API rate limiting and errors when fetching contributors

Fetching contributors for each repository in a loop may lead to hitting GitHub's API rate limits, especially for unauthenticated requests. Additionally, there's no error handling for these fetch requests, which could cause the script to fail silently.

To handle this:

  1. Authenticate API Requests: Use a GitHub personal access token to increase the rate limit.

    const GITHUB_TOKEN = process.env.GITHUB_TOKEN; // Ensure this environment variable is set
  2. Add Error Handling: Check if the fetch request was successful before processing the response.

    const contributorsResponse = await fetch(
      `https://api.github.com/repos/${owner}/${repo}/contributors`,
      {
        headers: {
          'Authorization': `token ${GITHUB_TOKEN}`
        }
      }
    );
    
    if (!contributorsResponse.ok) {
      console.error(`Error fetching contributors for ${repo}: ${contributorsResponse.status} ${contributorsResponse.statusText}`);
      continue; // Skip to the next repository
    }
    
    const contributorsList = await contributorsResponse.json();
  3. Implement Rate Limiting Handling: Detect when the rate limit is reached and wait before retrying.

    if (contributorsResponse.status === 403 && contributorsResponse.headers.get('X-RateLimit-Remaining') === '0') {
      const resetTime = contributorsResponse.headers.get('X-RateLimit-Reset');
      const currentTime = Math.floor(Date.now() / 1000);
      const waitTime = resetTime - currentTime;
      console.log(`Rate limit exceeded. Waiting for ${waitTime} seconds.`);
      await new Promise(resolve => setTimeout(resolve, waitTime * 1000));
      // Retry the request after waiting
    }

+    const updatedContributors = Object.entries(contributorsMap).map(([username, contributions]) => ({
+      username,
+      contributions
+    }));
+    const contributorsData = JSON.stringify(updatedContributors, null, 2);

-    const updatedContributors = [...new Set(contributors)];
+    console.log(contributorsData)

     if (
       JSON.stringify(updatedContributors) !==
       JSON.stringify(getExistingContributors())
     ) {
       fs.writeFileSync(
         contributorsFile,
-        JSON.stringify(updatedContributors, null, 2)
+        contributorsData
       );
-      console.log("Contributors file updated.");
+      // console.log("Contributors file updated.");

// Configure Git user and email for the commit
execSync('git config user.name "GitHub Action"');
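One caveat worth noting about the change check in the script: comparing `JSON.stringify` output is sensitive to element order, so the file gets rewritten (and a commit pushed) whenever the API returns contributors in a different order, even if the totals are identical. Sorting both sides before comparing makes the check stable. A minimal sketch on sample data (the arrays here are hypothetical):

```javascript
// Two lists with identical content but different element order.
const a = [
  { username: "alice", contributions: 8 },
  { username: "bob", contributions: 2 },
];
const b = [
  { username: "bob", contributions: 2 },
  { username: "alice", contributions: 8 },
];

// Naive string comparison reports a difference despite equal content.
const naiveEqual = JSON.stringify(a) === JSON.stringify(b); // false

// Sorting copies by username first makes the comparison order-independent.
const byUsername = (list) =>
  [...list].sort((x, y) => x.username.localeCompare(y.username));
const stableEqual =
  JSON.stringify(byUsername(a)) === JSON.stringify(byUsername(b)); // true

console.log(naiveEqual, stableEqual);
```

Sorting into a canonical order before both the comparison and the write would also keep the committed `contributors.json` diff-friendly between runs.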