
Include hot fixes in dev #3010

Open · wants to merge 33 commits into base: dev

Commits (33)
e3de0d2
Merge pull request #2987 from chaoss/remove-update-weight-hotfix
sgoggins Feb 12, 2025
455b02a
Merge pull request #2988 from chaoss/dev
sgoggins Feb 12, 2025
c6b2cb5
Update gsoc-ideas.md
sgoggins Feb 12, 2025
f98b5db
Update gsoc-ideas.md
sgoggins Feb 12, 2025
8b1a939
Update gsoc-ideas.md
sgoggins Feb 12, 2025
f847358
Update gsoc-ideas.md
sgoggins Feb 12, 2025
1ffc551
Update gsoc-interest.md
sgoggins Feb 12, 2025
ffec284
Update README.md
sgoggins Feb 12, 2025
a653027
Update README.md
sgoggins Feb 12, 2025
9500e73
Update gsoc-ideas.md
sgoggins Feb 13, 2025
32d9a8c
Fix key orchestrator missing in Docker build
Ulincsys Feb 13, 2025
1393440
Add __init__.py to fix module import error
Ulincsys Feb 14, 2025
c6c86ed
Merge pull request #2996 from chaoss/keyman-docker-hotfix
sgoggins Feb 14, 2025
def43f1
Update metadata.py
sgoggins Feb 14, 2025
c6646ce
Update README.md
sgoggins Feb 14, 2025
03196c5
Update README.md
sgoggins Feb 14, 2025
46bf034
Show tracebacks for network exceptions
Ulincsys Feb 17, 2025
1d9583d
Fix incorrect usage of format_exc
Ulincsys Feb 17, 2025
53c3c14
Have decorator pass secondary exceptions transparently
Ulincsys Feb 17, 2025
6954f9c
Logical fixes and general improvements
Ulincsys Feb 17, 2025
641f72f
Strip whitespace from keys while loading
Ulincsys Feb 17, 2025
d8f784a
Merge pull request #2999 from chaoss/improve_nettest_logging_hotfix
sgoggins Feb 17, 2025
7d58dae
Merge pull request #3000 from chaoss/fix_protocol_exception_hotfix
sgoggins Feb 17, 2025
e6940c6
Filter worker_oauth keys by platform in cli
Ulincsys Feb 18, 2025
e308fac
Merge pull request #3005 from chaoss/cli_api_key_hotfix
sgoggins Feb 18, 2025
9a4b866
Fix comparison with NoneType logical error
Ulincsys Feb 21, 2025
1e417ca
Secondary task to use get_secondary_data_last_collected
Ulincsys Feb 22, 2025
a1a9775
Merge pull request #3009 from chaoss/last_collected_null_hotfix
sgoggins Feb 22, 2025
47a8c66
Update README.md
sgoggins Feb 22, 2025
474c65c
Update metadata.py
sgoggins Feb 22, 2025
21fae40
reduce dependency on create_collection_status_record tasks
ABrain7710 Feb 26, 2025
9ca06bd
cleanup everything
ABrain7710 Feb 26, 2025
da84da9
Merge pull request #3014 from chaoss/hotfix-main
sgoggins Feb 26, 2025
11 changes: 5 additions & 6 deletions README.md
Original file line number Diff line number Diff line change
@@ -1,4 +1,4 @@
# Augur NEW Release v0.81.0
# Augur NEW Release v0.81.2

Augur is primarily a data engineering tool that makes it possible for data scientists to gather open source software community data - less data carpentry for everyone else!
The primary way of looking at Augur data is through [8Knot](https://github.com/oss-aspen/8knot), a public instance of 8Knot is available [here](https://metrix.chaoss.io) - this is tied to a public instance of [Augur](https://ai.chaoss.io).
@@ -11,8 +11,7 @@ We follow the [First Timers Only](https://www.firsttimersonly.com/) philosophy o
## NEW RELEASE ALERT!
**If you want to jump right in, the updated docker, docker-compose and bare metal installation instructions are available [here](docs/new-install.md)**.

<<<<<<< HEAD
Augur is now releasing a dramatically improved new version to the ```main``` branch. It is also available [here](https://github.com/chaoss/augur/releases/tag/v0.81.0).
Augur is now releasing a dramatically improved new version to the ```main``` branch. It is also available [here](https://github.com/chaoss/augur/releases/tag/v0.81.2).


- The `main` branch is a stable version of our new architecture, which features:
@@ -44,9 +43,9 @@ For more information on [how to get involved on the CHAOSS website](https://chao

## Collecting Data

Augur supports ```Python3.6``` through ```Python3.9``` on all platforms. ```Python3.10``` and above do not yet work because of machine learning worker dependencies. On OSX, you can create a ```Python3.9``` environment, by running:
Augur supports ```Python3.7``` through ```Python3.11``` on all platforms. ```Python3.12``` and above do not yet work because of machine learning worker dependencies. On OSX, you can create a ```Python3.11``` environment, by running:
```
$ python3.9 -m venv path/to/venv
$ python3.11 -m venv path/to/venv
```

Augur's main focus is to measure the overall health and sustainability of open source projects.
@@ -84,7 +83,7 @@ We strongly believe that much of what makes open source so great is the incredib

## License, Copyright, and Funding

Copyright © 2023 University of Nebraska at Omaha, University of Missouri, Brian Warner, and the CHAOSS Project.
Copyright © 2025 University of Nebraska at Omaha, University of Missouri, Brian Warner, and the CHAOSS Project.

Augur is free software: you can redistribute it and/or modify it under the terms of the MIT License as published by the Open Source Initiative. See the [LICENSE](LICENSE) file for more details.

28 changes: 19 additions & 9 deletions augur/application/cli/__init__.py
@@ -6,6 +6,7 @@
import re
import json
import httpx
import traceback

from augur.application.db.engine import DatabaseEngine
from augur.application.db import get_engine, dispose_database_engine
@@ -16,23 +17,32 @@ def test_connection(function_internet_connection):
@click.pass_context
def new_func(ctx, *args, **kwargs):
usage = re.search(r"Usage:\s(.*)\s\[OPTIONS\]", str(ctx.get_usage())).groups()[0]
success = False
with httpx.Client() as client:
try:
_ = client.request(
method="GET", url="http://chaoss.community", timeout=10, follow_redirects=True)

return ctx.invoke(function_internet_connection, *args, **kwargs)
success = True
except (TimeoutError, httpx.TimeoutException):
print("Request timed out.")
except httpx.NetworkError:
except httpx.NetworkError as e:
print(f"Network Error: {e}")
except httpx.ProtocolError:
print(traceback.format_exc())
except httpx.ProtocolError as e:
print(f"Protocol Error: {e}")
print(f"\n\n{usage} command setup failed\n \
You are not connected to the internet.\n \
Please connect to the internet to run Augur\n \
Consider setting http_proxy variables for limited access installations.")
sys.exit(-1)
print(traceback.format_exc())

if not success:
print(
f"""
\n\n{usage} command setup failed.
There was an error while testing for network connectivity
Please check your connection to the internet to run Augur
Consider setting http_proxy variables for limited access installations."""
)
sys.exit(-1)

return ctx.invoke(function_internet_connection, *args, **kwargs)

return update_wrapper(new_func, function_internet_connection)

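The reworked decorator above replaces the early `return` inside the `try` with a `success` flag and a single exit path: the network probe runs first, any failure prints a traceback and aborts, and the wrapped command is invoked only after the probe succeeds. A minimal self-contained sketch of that pattern (with a hypothetical `require_connectivity` name and a stubbed probe standing in for the real httpx request):

```python
import sys
import traceback
from functools import wraps

def require_connectivity(check):
    """Decorator factory: run `check()` before the wrapped command.

    On any exception the traceback is printed and the process exits;
    on success the command runs normally. `check` is a stand-in for
    the real HTTP probe in the PR.
    """
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            success = False
            try:
                check()
                success = True
            except Exception:
                print(traceback.format_exc())
            if not success:
                print("command setup failed: no network connectivity")
                sys.exit(-1)
            # Invoke the command only after the probe succeeded,
            # mirroring the single exit path in the patched decorator.
            return func(*args, **kwargs)
        return wrapper
    return decorator

@require_connectivity(lambda: None)  # probe that always succeeds
def run_backend():
    return "started"

print(run_backend())  # prints "started"
```

Keeping one exit path means secondary exceptions raised by the command itself are no longer swallowed by the connectivity handler, which matches the "pass secondary exceptions transparently" commit.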
11 changes: 7 additions & 4 deletions augur/application/cli/backend.py
@@ -86,6 +86,9 @@

worker_vmem_cap = get_value("Celery", 'worker_process_vmem_cap')

# create rabbit messages so if it failed on shutdown the queues are clean
cleanup_collection_status_and_rabbit(logger, ctx.obj.engine)

gunicorn_command = f"gunicorn -c {gunicorn_location} -b {host}:{port} augur.api.server:app --log-file gunicorn.log"
server = subprocess.Popen(gunicorn_command.split(" "))

@@ -109,7 +112,7 @@
logger.info(f'Augur is running at: {"http" if development else "https"}://{host}:{port}')
logger.info(f"The API is available at '{api_response.json()['route']}'")

processes = start_celery_worker_processes(float(worker_vmem_cap), disable_collection)

[pylint] augur/application/cli/backend.py:115: W0621: Redefining name 'processes' from outer scope (line 469) (redefined-outer-name)

if os.path.exists("celerybeat-schedule.db"):
logger.info("Deleting old task schedule")
@@ -180,7 +183,7 @@

try:
keypub.shutdown()
cleanup_after_collection_halt(logger, ctx.obj.engine)
cleanup_collection_status_and_rabbit(logger, ctx.obj.engine)
except RedisConnectionError:
pass

@@ -252,7 +255,7 @@
"""
Sends SIGTERM to all Augur server & worker processes
"""
logger = logging.getLogger("augur.cli")

[pylint] augur/application/cli/backend.py:258: W0621: Redefining name 'logger' from outer scope (line 34) (redefined-outer-name)

augur_stop(signal.SIGTERM, logger, ctx.obj.engine)

@@ -265,7 +268,7 @@
"""
Stop collection tasks if they are running, block until complete
"""
processes = get_augur_processes()

[pylint] augur/application/cli/backend.py:271: W0621: Redefining name 'processes' from outer scope (line 469) (redefined-outer-name)

stopped = []

@@ -275,7 +278,7 @@
stopped.append(p)
p.terminate()

if not len(stopped):

[pylint] augur/application/cli/backend.py:281: C1802: Do not use `len(SEQUENCE)` without comparison to determine if a sequence is empty (use-implicit-booleaness-not-len)
logger.info("No collection processes found")
return

@@ -284,7 +287,7 @@

killed = []
while True:
for i in range(len(alive)):

[pylint] augur/application/cli/backend.py:290: C0200: Consider using enumerate instead of iterating with range and len (consider-using-enumerate)
if alive[i].status() == psutil.STATUS_ZOMBIE:
logger.info(f"KILLING ZOMBIE: {alive[i].pid}")
alive[i].kill()
@@ -296,13 +299,13 @@
for i in reversed(killed):
alive.pop(i)

if not len(alive):

[pylint] augur/application/cli/backend.py:302: C1802: Do not use `len(SEQUENCE)` without comparison to determine if a sequence is empty (use-implicit-booleaness-not-len)
break

logger.info(f"Waiting on [{', '.join(str(p.pid) for p in alive)}]")
time.sleep(0.5)

cleanup_after_collection_halt(logger, ctx.obj.engine)
cleanup_collection_status_and_rabbit(logger, ctx.obj.engine)

@cli.command('kill')
@test_connection
@@ -313,11 +316,11 @@
"""
Sends SIGKILL to all Augur server & worker processes
"""
logger = logging.getLogger("augur.cli")

[pylint] augur/application/cli/backend.py:319: W0621: Redefining name 'logger' from outer scope (line 34) (redefined-outer-name)
augur_stop(signal.SIGKILL, logger, ctx.obj.engine)


def augur_stop(signal, logger, engine):

[pylint] augur/application/cli/backend.py:323: W0621: Redefining name 'signal' from outer scope (line 12) (redefined-outer-name)
[pylint] augur/application/cli/backend.py:323: W0621: Redefining name 'logger' from outer scope (line 34) (redefined-outer-name)
"""
Stops augur with the given signal,
and cleans up collection if it was running
@@ -330,10 +333,10 @@
_broadcast_signal_to_processes(augur_processes, broadcast_signal=signal, given_logger=logger)

if "celery" in process_names:
cleanup_after_collection_halt(logger, engine)
cleanup_collection_status_and_rabbit(logger, engine)


def cleanup_after_collection_halt(logger, engine):
def cleanup_collection_status_and_rabbit(logger, engine):

[pylint] W0621: Redefining name 'logger' from outer scope (line 34) (redefined-outer-name)

clear_redis_caches()

connection_string = get_value("RabbitMQ", "connection_string")
@@ -482,7 +485,7 @@
pass
return augur_processes

def _broadcast_signal_to_processes(processes, broadcast_signal=signal.SIGTERM, given_logger=None):

[pylint] augur/application/cli/backend.py:488: W0621: Redefining name 'processes' from outer scope (line 469) (redefined-outer-name)
if given_logger is None:
_logger = logger
else:
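The shutdown loop in this file repeatedly kills zombie processes, records their list indices, and pops those indices in reverse so earlier removals don't shift the later ones. A self-contained sketch of one pass, using hypothetical `FakeProc` objects in place of `psutil.Process` (and `enumerate`, as the pylint hint suggests):

```python
# Hypothetical stand-in for psutil.STATUS_ZOMBIE in the real loop.
STATUS_ZOMBIE = "zombie"

class FakeProc:
    """Minimal stand-in for psutil.Process."""
    def __init__(self, pid, status):
        self.pid = pid
        self._status = status
    def status(self):
        return self._status
    def kill(self):
        self._status = "dead"

def reap_zombies(alive):
    """One pass of the PR's wait loop: kill zombie processes and drop
    them from `alive`, popping indices in reverse so earlier removals
    don't shift later ones."""
    killed = []
    for i, proc in enumerate(alive):
        if proc.status() == STATUS_ZOMBIE:
            proc.kill()
            killed.append(i)
    for i in reversed(killed):
        alive.pop(i)
    return alive

procs = [FakeProc(1, "running"), FakeProc(2, STATUS_ZOMBIE), FakeProc(3, STATUS_ZOMBIE)]
reap_zombies(procs)
print([p.pid for p in procs])  # [1]
```

Popping in forward order would remove the wrong elements once the first pop shifted every later index left by one; the reversed pop avoids that.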
2 changes: 1 addition & 1 deletion augur/application/cli/github.py
@@ -32,7 +32,7 @@ def update_api_key():
"""
SELECT value as github_key from config Where section_name='Keys' AND setting_name='github_api_key'
UNION All
SELECT access_token as github_key from worker_oauth ORDER BY github_key DESC;
SELECT access_token as github_key from worker_oauth where platform='github' ORDER BY github_key DESC;
"""
)

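The SQL change above adds `where platform='github'` so the CLI no longer mixes GitLab tokens into the GitHub key pool. A small sketch of the same selection done in Python, with hypothetical token values, to show what the filter excludes:

```python
# Hypothetical rows from the worker_oauth table; the real fix applies
# the filter in SQL ("where platform='github'") rather than in Python.
worker_oauth = [
    {"access_token": "ghp_token_a", "platform": "github"},
    {"access_token": "glpat_token_b", "platform": "gitlab"},
    {"access_token": "ghp_token_c", "platform": "github"},
]

def github_keys(rows):
    """Return only GitHub tokens, sorted descending like the
    ORDER BY github_key DESC in the query."""
    keys = [r["access_token"] for r in rows if r["platform"] == "github"]
    return sorted(keys, reverse=True)

print(github_keys(worker_oauth))  # ['ghp_token_c', 'ghp_token_a']
```

Without the platform filter, the GitLab token would be handed to the GitHub API and fail validation.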
6 changes: 5 additions & 1 deletion augur/tasks/frontend.py
@@ -9,7 +9,7 @@
from augur.tasks.github.util.github_graphql_data_access import GithubGraphQlDataAccess
from augur.application.db.lib import get_group_by_name, get_repo_by_repo_git, get_github_repo_by_src_id, get_gitlab_repo_by_src_id
from augur.tasks.github.util.util import get_owner_repo
from augur.application.db.models.augur_operations import retrieve_owner_repos, FRONTEND_REPO_GROUP_NAME, RepoGroup
from augur.application.db.models.augur_operations import retrieve_owner_repos, FRONTEND_REPO_GROUP_NAME, RepoGroup, CollectionStatus
from augur.tasks.github.util.github_paginator import hit_api

from augur.application.db.models import UserRepo, Repo
@@ -235,6 +235,8 @@ def add_github_repo(logger, session, url, repo_group_id, group_id, repo_type, re
logger.error(f"Error while adding repo: Failed to insert user repo record. A record with a repo_id of {repo_id} and a group id of {group_id} needs to be added to the user repo table so that this repo shows up in the users group")
return

CollectionStatus.insert(session, logger, repo_id)


def get_gitlab_repo_data(gl_session, url: str, logger) -> bool:

@@ -281,6 +283,8 @@ def add_gitlab_repo(logger, session, url, repo_group_id, group_id, repo_src_id):
if not result:
logger.error(f"Error while adding repo: Failed to insert user repo record. A record with a repo_id of {repo_id} and a group id of {group_id} needs to be added to the user repo table so that this repo shows up in the users group")
return

CollectionStatus.insert(session, logger, repo_id)

# @celery.task
# def add_org_repo_list(user_id, group_name, urls):
11 changes: 8 additions & 3 deletions augur/tasks/github/pull_requests/tasks.py
@@ -231,9 +231,14 @@ def collect_pull_request_review_comments(repo_git: str, full_collection: bool) -
repo_id = get_repo_by_repo_git(repo_git).repo_id

if not full_collection:
# subtract 2 days to ensure all data is collected
core_data_last_collected = (get_core_data_last_collected(repo_id) - timedelta(days=2)).replace(tzinfo=timezone.utc)
review_msg_url += f"?since={core_data_last_collected.isoformat()}"
last_collected_date = get_secondary_data_last_collected(repo_id)

if last_collected_date:
# subtract 2 days to ensure all data is collected
core_data_last_collected = (last_collected_date - timedelta(days=2)).replace(tzinfo=timezone.utc)
review_msg_url += f"?since={core_data_last_collected.isoformat()}"
else:
logger.warning(f"core_data_last_collected is NULL for recollection on repo: {repo_git}")

pr_reviews = get_pull_request_reviews_by_repo_id(repo_id)

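The hunk above guards the incremental-collection path: when `get_secondary_data_last_collected` returns NULL (None), subtracting a `timedelta` from it would raise a `TypeError`, so the `?since=` filter is only appended when a previous timestamp exists. A self-contained sketch of that guard, with a hypothetical `build_review_msg_url` helper:

```python
from datetime import datetime, timedelta, timezone

def build_review_msg_url(base_url, last_collected):
    """Append a ?since= filter only when a previous collection
    timestamp exists; a NULL (None) value previously caused a
    TypeError when timedelta was subtracted from it."""
    if last_collected is None:
        return base_url  # fall back to a full collection, no filter
    # subtract 2 days to ensure all data is collected
    since = (last_collected - timedelta(days=2)).replace(tzinfo=timezone.utc)
    return f"{base_url}?since={since.isoformat()}"

url = "https://api.github.com/repos/chaoss/augur/pulls/comments"
print(build_review_msg_url(url, None) == url)  # True
print(build_review_msg_url(url, datetime(2025, 2, 22)))
# ends with ?since=2025-02-20T00:00:00+00:00
```

The two-day overlap trades a little duplicate collection for a guarantee that nothing between runs is missed.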
2 changes: 2 additions & 0 deletions augur/tasks/github/util/github_api_key_handler.py
@@ -107,6 +107,8 @@ def get_api_keys(self) -> List[str]:

if len(keys) == 0:
return []

keys = [key.strip() for key in keys]

valid_keys = []
with httpx.Client() as client:
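The one-line addition above strips whitespace from keys while loading (mirroring the "Strip whitespace from keys while loading" commit), since a token copy-pasted with a trailing newline or padding would otherwise fail API validation. A trivial sketch with a hypothetical helper name and made-up token values:

```python
def clean_keys(keys):
    """Strip stray whitespace/newlines that sneak into keys via
    copy-paste or trailing newlines in config values."""
    return [key.strip() for key in keys]

print(clean_keys([" ghp_abc ", "ghp_def\n"]))  # ['ghp_abc', 'ghp_def']
```

The same list comprehension is applied in both the GitHub and GitLab key handlers in this PR.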
2 changes: 2 additions & 0 deletions augur/tasks/gitlab/gitlab_api_key_handler.py
@@ -110,6 +110,8 @@ def get_api_keys(self) -> List[str]:
if len(keys) == 0:
return []

keys = [key.strip() for key in keys]

valid_keys = []
with httpx.Client() as client:

7 changes: 5 additions & 2 deletions augur/tasks/init/celery_app.py
@@ -201,8 +201,8 @@ def setup_periodic_tasks(sender, **kwargs):
The tasks so that they are grouped by the module they are defined in
"""
from celery.schedules import crontab
from augur.tasks.start_tasks import augur_collection_monitor, augur_collection_update_weights
from augur.tasks.start_tasks import non_repo_domain_tasks, retry_errored_repos
from augur.tasks.start_tasks import augur_collection_monitor
from augur.tasks.start_tasks import non_repo_domain_tasks, retry_errored_repos, create_collection_status_records
from augur.tasks.git.facade_tasks import clone_repos
from augur.tasks.db.refresh_materialized_views import refresh_materialized_views
from augur.tasks.data_analysis.contributor_breadth_worker.contributor_breadth_worker import contributor_breadth_model
@@ -232,6 +232,9 @@ def setup_periodic_tasks(sender, **kwargs):
logger.info(f"Setting 404 repos to be marked for retry on midnight each day")
sender.add_periodic_task(crontab(hour=0, minute=0),retry_errored_repos.s())

one_day_in_seconds = 24*60*60
sender.add_periodic_task(one_day_in_seconds, create_collection_status_records.s())

@after_setup_logger.connect
def setup_loggers(*args,**kwargs):
"""Override Celery loggers with our own."""
4 changes: 2 additions & 2 deletions augur/tasks/start_tasks.py
@@ -378,5 +378,5 @@ def create_collection_status_records(self):
CollectionStatus.insert(session, logger, repo[0])
repo = execute_sql(query).first()

#Check for new repos every seven minutes to be out of step with the clone_repos task
create_collection_status_records.si().apply_async(countdown=60*7)
# no longer recursively run this task because collection status records are added when repos are inserted
#create_collection_status_records.si().apply_async(countdown=60*7)
4 changes: 3 additions & 1 deletion augur/util/repo_load_controller.py
@@ -5,7 +5,7 @@
from typing import Any, Dict

from augur.application.db.engine import DatabaseEngine
from augur.application.db.models import Repo, UserRepo, RepoGroup, UserGroup, User
from augur.application.db.models import Repo, UserRepo, RepoGroup, UserGroup, User, CollectionStatus
from augur.application.db.models.augur_operations import retrieve_owner_repos
from augur.application.db.util import execute_session_query

@@ -67,8 +67,10 @@ def add_cli_repo(self, repo_data: Dict[str, Any], from_org_list=False, repo_type
# if the repo doesn't exist it adds it
if "gitlab" in url:
repo_id = Repo.insert_gitlab_repo(self.session, url, repo_group_id, "CLI")
CollectionStatus.insert(self.session, logger, repo_id)
else:
repo_id = Repo.insert_github_repo(self.session, url, repo_group_id, "CLI", repo_type)
CollectionStatus.insert(self.session, logger, repo_id)

if not repo_id:
logger.warning(f"Invalid repo group id specified for {url}, skipping.")
1 change: 1 addition & 0 deletions docker/backend/Dockerfile
@@ -80,6 +80,7 @@ COPY ./augur/ augur/
COPY ./metadata.py .
COPY ./setup.py .
COPY ./scripts/ scripts/
COPY ./keyman/ keyman/

# Add rust and cargo to PATH
ENV PATH="/usr/bin/:/root/.cargo/bin:/usr/local/bin:${PATH}"
91 changes: 81 additions & 10 deletions docker/backend/graphical
@@ -1,22 +1,75 @@
#SPDX-License-Identifier: MIT
FROM python:3.9-slim-bullseye
# SPDX-License-Identifier: MIT
FROM python:3.11-slim-bullseye

LABEL maintainer="outdoors@acm.org"
LABEL version="0.51.1"
LABEL version="0.76.6"

ENV DEBIAN_FRONTEND=noninteractive
ENV PATH="/usr/bin/:/usr/local/bin:/usr/lib:${PATH}"

RUN set -x \
&& apt-get update \
&& apt-get -y install --no-install-recommends \
&& apt-get -y install \
git \
bash \
curl \
gcc \
python3-pip \
software-properties-common \
postgresql-contrib \
musl-dev \
python3-dev \
python3-distutils \
python3-venv \
wget \
postgresql-client \
&& rm -rf /var/lib/apt/lists/*
libpq-dev \
build-essential \
rustc \
cargo \
chromium \
tar \
jq \
chromium-driver \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/* \
&& curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y

# Install Firefox from Debian repositories for ARM64 architecture
RUN set -x \
&& apt-get update \
&& apt-get install -y firefox-esr

# Install Geckodriver
RUN GECKODRIVER_VERSION=$(curl -s https://api.github.com/repos/mozilla/geckodriver/releases/latest | jq -r '.tag_name' | sed 's/v//') \
&& ARCH=$(uname -m) \
&& if [ "$ARCH" = "aarch64" ]; then \
GECKODRIVER_URL="https://github.com/mozilla/geckodriver/releases/download/v${GECKODRIVER_VERSION}/geckodriver-v${GECKODRIVER_VERSION}-linux-aarch64.tar.gz"; \
GECKODRIVER_FILE="geckodriver-v${GECKODRIVER_VERSION}-linux-aarch64.tar.gz"; \
else \
GECKODRIVER_URL="https://github.com/mozilla/geckodriver/releases/download/v${GECKODRIVER_VERSION}/geckodriver-v${GECKODRIVER_VERSION}-linux64.tar.gz"; \
GECKODRIVER_FILE="geckodriver-v${GECKODRIVER_VERSION}-linux64.tar.gz"; \
fi \
&& wget $GECKODRIVER_URL \
&& tar -xzf $GECKODRIVER_FILE \
&& mv geckodriver /usr/local/bin/ \
&& rm $GECKODRIVER_FILE

# Verify installations
RUN firefox --version
RUN geckodriver --version

# Ensure Rust directories are writable
RUN mkdir -p /root/.rustup/downloads /root/.cargo/registry && \
chmod -R 777 /root/.rustup /root/.cargo

# Add rust and cargo to PATH
ENV PATH="/root/.cargo/bin:${PATH}"

# Install the specific version of Rust
RUN set -x \
&& rustup install 1.78.0
RUN set -x \
&& rustup default 1.78.0

EXPOSE 5000

@@ -27,20 +80,38 @@ COPY ./augur/ augur/
COPY ./metadata.py .
COPY ./setup.py .
COPY ./scripts/ scripts/
COPY ./keyman/ keyman/

# Add rust and cargo to PATH
ENV PATH="/usr/bin/:/root/.cargo/bin:/usr/local/bin:${PATH}"

#COPY ./docker/backend/docker.config.json .
RUN python3 -m venv /opt/venv

RUN set -x \
&& /opt/venv/bin/pip install --upgrade pip

RUN set -x \
&& /opt/venv/bin/pip install wheel

RUN set -x \
&& /opt/venv/bin/pip install .

RUN ./scripts/docker/install-workers-deps.sh
RUN set -x \
&& /opt/venv/bin/pip install --upgrade pip \
&& /opt/venv/bin/pip install wheel \
&& /opt/venv/bin/pip install .

RUN ./scripts/docker/install-go.sh
ENV PATH="${PATH}:/usr/local/go/bin"
RUN ./scripts/docker/install-workers-deps.sh

# RUN ./scripts/install/workers.sh

RUN mkdir -p repos/ logs/ /augur/facade/

COPY ./docker/backend/graphical.sh /
RUN chmod +x /graphical.sh
ENTRYPOINT /graphical.sh
COPY ./docker/backend/init.sh /
RUN chmod +x /entrypoint.sh /init.sh
ENTRYPOINT ["/bin/bash", "/graphical.sh"]
#ENTRYPOINT ["/entrypoint.sh"]
CMD /init.sh