chore: remove AWS god keys (hedyorg#1327)
Currently, we are using service-specific AWS user keys. What does that
mean?

There is one AWS user who has full admin access to DynamoDB, and
this user is used by both the alpha and the production environment.

There is a separate AWS user who has full admin access to SES, and this
user is also used by both the alpha and the production environment.

There are multiple things wrong with this setup:

- There is a separate key pair for each service we want to use, which is
  unnecessarily annoying to configure.
- The keys have unnecessarily broad permissions.
- The keys are not scoped to a single (alpha or prod) environment, nor,
  importantly, to the resources that environment needs to access.

I've rectified the configuration in AWS itself. The current PR switches
over to just using the implicit, environment-provided keys that are tied
to the alpha and prod environments.

After this PR is merged, we can get rid of the god users and their keys.
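
For illustration, a minimal sketch of what "implicit, environment-provided keys" looks like in code (the region value is a placeholder, not taken from the repository): boto3 resolves credentials from its default chain, so clients only need a region.

```
import boto3

# Before: each client was handed its own service-specific key pair, e.g.
# boto3.client('ses', ..., aws_access_key_id=os.getenv('AWS_SES_ACCESS_KEY'),
#              aws_secret_access_key=os.getenv('AWS_SES_SECRET_KEY'))

# After: credentials come from the default chain (AWS_ACCESS_KEY_ID /
# AWS_SECRET_ACCESS_KEY set per environment, or a role attached to the
# runtime); only the region is passed explicitly.
dynamodb = boto3.client('dynamodb', region_name='eu-west-1')  # placeholder region
ses = boto3.client('ses', region_name='eu-west-1')            # placeholder region
```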
rix0rrr authored Nov 19, 2021
1 parent d01cb53 commit 3589c5d
Showing 4 changed files with 17 additions and 15 deletions.
9 changes: 3 additions & 6 deletions doc/CONFIGURATION.md
@@ -8,12 +8,9 @@ server work.
AWS credentials and setup:

```
-AWS_DYNAMODB_ACCESS_KEY
-AWS_DYNAMODB_SECRET_KEY
+AWS_ACCESS_KEY_ID
+AWS_SECRET_ACCESS_KEY
AWS_DYNAMODB_TABLE_PREFIX
-AWS_SES_ACCESS_KEY
-AWS_SES_SECRET_KEY
```

JSONbin credentials and setup:
@@ -45,7 +42,7 @@ PROXY_TO_TEST_PROPORTION
IS_TEST_ENV
```

-App secret (for cookies):
+App secret (needs to be the same to share cookies between instances):

```
SECRET_KEY
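
The reworded line above says SECRET_KEY must be identical across instances. A minimal sketch of why, assuming a Flask app like the one in this repository (setup details here are illustrative):

```
import os
from flask import Flask

app = Flask(__name__)

# Flask signs session cookies with SECRET_KEY. If two instances serve the
# same site with different values, a cookie issued by one instance fails
# signature validation on the other and the user gets logged out.
app.config['SECRET_KEY'] = os.getenv('SECRET_KEY')
```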
2 changes: 1 addition & 1 deletion website/auth.py
@@ -557,7 +557,7 @@ def change_user_email():
logging.getLogger(name).setLevel(logging.CRITICAL)

# https://docs.aws.amazon.com/ses/latest/DeveloperGuide/send-using-sdk-python.html
-email_client = boto3.client('ses', region_name = config['email']['region'], aws_access_key_id = os.getenv('AWS_SES_ACCESS_KEY'), aws_secret_access_key = os.getenv('AWS_SES_SECRET_KEY'))
+email_client = boto3.client('ses', region_name = config['email']['region'])

@querylog.timed
def send_email(recipient, subject, body_plain, body_html):
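
The hunk cuts off right after the `send_email` signature. A hedged sketch of how such a helper might call SES through the region-only client (the sender address, region, and message fields below are illustrative, not the repository's actual implementation):

```
import boto3

# Credentials are resolved implicitly; only the region is explicit.
email_client = boto3.client('ses', region_name='eu-west-1')  # placeholder region

def send_email(recipient, subject, body_plain, body_html):
    # Illustrative sender address.
    email_client.send_email(
        Source='hello@example.com',
        Destination={'ToAddresses': [recipient]},
        Message={
            'Subject': {'Data': subject},
            'Body': {
                'Text': {'Data': body_plain},
                'Html': {'Data': body_html},
            },
        },
    )
```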
12 changes: 10 additions & 2 deletions website/database.py
@@ -1,6 +1,7 @@
from utils import timems, times
from . import dynamo


storage = dynamo.AwsDynamoStorage.from_env() or dynamo.MemoryStorage('dev_database.json')

USERS = dynamo.Table(storage, 'users', 'username', indexed_fields=['email'])
@@ -163,9 +164,16 @@ def get_class(self, id):
    def get_teacher_classes(self, username, students_to_list):
        """Return all the classes belonging to a teacher."""
        classes = None
-        if dynamo.is_dynamo_available ():
+        if isinstance(storage, dynamo.AwsDynamoStorage):
            classes = CLASSES.get_many({'teacher': username}, reverse=True)
-        # If we're using the in-memory database, we need to make a shallow copy of the classes before changing the `students` key from a set to list, otherwise the field will remain a list later and that will break the set methods.
+
+        # If we're using the in-memory database, we need to make a shallow copy
+        # of the classes before changing the `students` key from a set to list,
+        # otherwise the field will remain a list later and that will break the
+        # set methods.
+        #
+        # FIXME: I don't understand what the above comment is saying, but I'm
+        # skeptical that it's accurate.
        else:
            classes = []
            for Class in CLASSES.get_many({'teacher': username}, reverse=True):
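
For context on why the isinstance check can replace is_dynamo_available(): the module-level storage object (shown as a context line near the top of this file's diff) already records which backend is in use. A minimal sketch of that selection, simplified from the lines above (the import path is an assumption about the package layout):

```
from website import dynamo

# from_env() returns None when no AWS credentials are present, so local
# development falls back to the JSON-backed in-memory store. Checking the
# type of `storage` is then equivalent to the old is_dynamo_available()
# helper, without consulting environment variables a second time.
storage = dynamo.AwsDynamoStorage.from_env() or dynamo.MemoryStorage('dev_database.json')

if isinstance(storage, dynamo.AwsDynamoStorage):
    print('Queries go to DynamoDB')
else:
    print('Queries go to the local in-memory database')
```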
9 changes: 3 additions & 6 deletions website/dynamo.py
@@ -202,8 +202,9 @@ def _validate_key(self, key):
class AwsDynamoStorage(TableStorage):
    @staticmethod
    def from_env():
-        if is_dynamo_available():
-            db = boto3.client ('dynamodb', region_name = config ['dynamodb'] ['region'], aws_access_key_id = os.getenv ('AWS_DYNAMODB_ACCESS_KEY'), aws_secret_access_key = os.getenv ('AWS_DYNAMODB_SECRET_KEY'))
+        # If we have AWS credentials, use the real DynamoDB
+        if os.getenv('AWS_ACCESS_KEY_ID'):
+            db = boto3.client ('dynamodb', region_name = config ['dynamodb'] ['region'])
            db_prefix = os.getenv ('AWS_DYNAMODB_TABLE_PREFIX', '')
            return AwsDynamoStorage(db, db_prefix)
        return None
@@ -436,10 +437,6 @@ def _flush(self):
        except IOError:
            pass

-def is_dynamo_available():
-    return bool(os.getenv ('AWS_DYNAMODB_ACCESS_KEY'))


def first_or_none(xs):
    return xs[0] if xs else None

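One design note on the new from_env() check: keying off os.getenv('AWS_ACCESS_KEY_ID') works because the alpha and prod environments supply credentials as environment variables. If credentials ever came from an instance role or a shared profile instead, a hypothetical alternative (not part of this commit) would be to ask botocore's default chain directly:

```
import boto3

def aws_credentials_available():
    # Hypothetical helper, not in the repository: True if the default
    # credential chain (env vars, shared config file, instance role)
    # can resolve any credentials at all.
    return boto3.session.Session().get_credentials() is not None
```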
