Compare commits

..

15 Commits

SHA1        Message  Checks                      Date
da6e56c4eb  4.55.0   All checks were successful  2024-10-18 12:00:06 +01:00
798b58529c  4.53.2   All checks were successful  2024-10-11 12:00:07 +01:00
3da6c983e1  4.53.1   All checks were successful  2024-10-09 12:00:06 +01:00
294232a329  4.52.1   All checks were successful  2024-10-02 12:00:06 +01:00
fae9d7bc17  4.52.0   All checks were successful  2024-10-01 12:00:06 +01:00
d666f5af3f  4.51.2   All checks were successful  2024-09-28 12:00:06 +01:00
556fae02d5  4.51.1   All checks were successful  2024-09-26 12:00:06 +01:00
fd4c67c3d1  4.51.0   All checks were successful  2024-09-25 12:00:07 +01:00
edef254529  4.50.0   All checks were successful  2024-09-19 12:00:06 +01:00
357f0cca57  4.49.10  Some checks failed          2024-09-13 12:00:28 +01:00
8ce90e27f7  4.49.9   Some checks failed          2024-09-07 12:00:06 +01:00
3ecc8d36f9  4.49.8   Some checks failed          2024-09-04 12:00:07 +01:00
14f4829fab  4.49.7   Some checks failed          2024-09-03 12:00:06 +01:00
63ac89e952  4.49.6   All checks were successful  2024-08-28 12:00:07 +01:00
8896f00124  4.49.5   All checks were successful  2024-08-26 12:00:07 +01:00

For 4.49.7 through 4.49.10, the Build-Release-Image workflow built the linux/arm64 image successfully, but the linux/amd64 build and the downstream Merge-Images, Create-Release and Notify jobs were cancelled.
204 changed files with 6821 additions and 3726 deletions

View File

@@ -109,7 +109,7 @@ jobs:
 GITHUB_ACTIONS_TEST: true
 - name: Archive code coverage results
-uses: actions/upload-artifact@v2
+uses: actions/upload-artifact@v4
 with:
 name: code-coverage-report
 path: htmlcov
@@ -163,7 +163,7 @@ jobs:
 uses: docker/build-push-action@v3
 with:
 context: .
-platforms: linux/amd64,linux/arm64
+platforms: linux/amd64
 push: true
 tags: ${{ steps.meta.outputs.tags }}

View File

@@ -8,7 +8,7 @@ repos:
 - id: check-yaml
 - id: trailing-whitespace
 - repo: https://github.com/Riverside-Healthcare/djLint
-rev: v1.3.0
+rev: v1.34.1
 hooks:
 - id: djlint-jinja
 files: '.*\.html'
@@ -21,5 +21,4 @@ repos:
 - id: ruff
 args: [ --fix ]
 # Run the formatter.
 - id: ruff-format

View File

@@ -20,7 +20,7 @@ SimpleLogin backend consists of 2 main components:
 ## Install dependencies
 The project requires:
-- Python 3.10 and [rye](https://github.com/astral-sh/rye) to manage dependencies
+- Python 3.10 and poetry to manage dependencies
 - Node v10 for front-end.
 - Postgres 13+
@@ -28,7 +28,7 @@ First, install all dependencies by running the following command.
 Feel free to use `virtualenv` or similar tools to isolate development environment.
 ```bash
-rye sync
+poetry sync
 ```
 On Mac, sometimes you might need to install some other packages via `brew`:
@@ -55,7 +55,7 @@ brew install -s re2 pybind11
 We use pre-commit to run all our linting and static analysis checks. Please run
 ```bash
-rye run pre-commit install
+poetry run pre-commit install
 ```
 To install it in your development environment.
@@ -160,25 +160,25 @@ Here are the small sum-ups of the directory structures and their roles:
 The code is formatted using [ruff](https://github.com/astral-sh/ruff), to format the code, simply run
 ```
-rye run ruff format .
+poetry run ruff format .
 ```
 The code is also checked with `flake8`, make sure to run `flake8` before creating the pull request by
 ```bash
-rye run flake8
+poetry run flake8
 ```
 For HTML templates, we use `djlint`. Before creating a pull request, please run
 ```bash
-rye run djlint --check templates
+poetry run djlint --check templates
 ```
 If some files aren't properly formatted, you can format all files with
 ```bash
-rye run djlint --reformat .
+poetry run djlint --reformat .
 ```
 ## Test sending email
@@ -223,6 +223,31 @@ Now open http://localhost:1080/ (or http://localhost:1080/ for MailHog), you sho
 ## Job runner
 Some features require a job handler (such as GDPR data export). To test such feature you need to run the job_runner
 ```bash
 python job_runner.py
 ```
+# Setup for Mac
+There are several ways to setup Python and manage the project dependencies on Mac. For info we have successfully used this setup on a Mac silicon:
+```bash
+# we haven't managed to make python 3.12 work
+brew install python3.10
+# make sure to update the PATH so python, pip point to Python3
+# for us it can be done by adding "export PATH=/opt/homebrew/opt/python@3.10/libexec/bin:$PATH" to .zprofile
+# Although pipx is the recommended way to install poetry,
+# install pipx via brew will automatically install python 3.12
+# and poetry will then use python 3.12
+# so we recommend using poetry this way instead
+curl -sSL https://install.python-poetry.org | python3 -
+poetry install
+# activate the virtualenv and you should be good to go!
+source .venv/bin/activate
+```

View File

@@ -9,6 +9,7 @@ from sqlalchemy import or_
 from app.db import Session
 from app.email_utils import send_welcome_email
+from app.partner_user_utils import create_partner_user, create_partner_subscription
 from app.utils import sanitize_email, canonicalize_email
 from app.errors import (
 AccountAlreadyLinkedToAnotherPartnerException,
@@ -23,6 +24,7 @@ from app.models import (
 User,
 Alias,
 )
+from app.user_audit_log_utils import emit_user_audit_log, UserAuditLogAction
 from app.utils import random_string
@@ -66,9 +68,10 @@ def set_plan_for_partner_user(partner_user: PartnerUser, plan: SLPlan):
 LOG.i(
 f"Creating partner_subscription [user_id={partner_user.user_id}] [partner_id={partner_user.partner_id}]"
 )
-PartnerSubscription.create(
-partner_user_id=partner_user.id,
-end_at=plan.expiration,
+create_partner_subscription(
+partner_user=partner_user,
+expiration=plan.expiration,
+msg="Upgraded via partner. User did not have a previous partner subscription",
 )
 agent.record_custom_event("PlanChange", {"plan": "premium", "type": "new"})
 else:
@@ -80,6 +83,11 @@ def set_plan_for_partner_user(partner_user: PartnerUser, plan: SLPlan):
 "PlanChange", {"plan": "premium", "type": "extension"}
 )
 sub.end_at = plan.expiration
+emit_user_audit_log(
+user=partner_user.user,
+action=UserAuditLogAction.SubscriptionExtended,
+message="Extended partner subscription",
+)
 Session.commit()
@@ -98,8 +106,8 @@ def ensure_partner_user_exists_for_user(
 if res and res.partner_id != partner.id:
 raise AccountAlreadyLinkedToAnotherPartnerException()
 if not res:
-res = PartnerUser.create(
-user_id=sl_user.id,
+res = create_partner_user(
+user=sl_user,
 partner_id=partner.id,
 partner_email=link_request.email,
 external_user_id=link_request.external_user_id,
@@ -140,8 +148,8 @@ class NewUserStrategy(ClientMergeStrategy):
 activated=True,
 from_partner=self.link_request.from_partner,
 )
-partner_user = PartnerUser.create(
-user_id=new_user.id,
+partner_user = create_partner_user(
+user=new_user,
 partner_id=self.partner.id,
 external_user_id=self.link_request.external_user_id,
 partner_email=self.link_request.email,
@@ -200,7 +208,7 @@ def get_login_strategy(
 return ExistingUnlinkedUserStrategy(link_request, user, partner)
-def check_alias(email: str) -> bool:
+def check_alias(email: str):
 alias = Alias.get_by(email=email)
 if alias is not None:
 raise AccountIsUsingAliasAsEmail()
@@ -275,10 +283,26 @@ def switch_already_linked_user(
 LOG.i(
 f"Deleting previous partner_user:{other_partner_user.id} from user:{current_user.id}"
 )
+emit_user_audit_log(
+user=other_partner_user.user,
+action=UserAuditLogAction.UnlinkAccount,
+message=f"Deleting partner_user {other_partner_user.id} (external_user_id={other_partner_user.external_user_id} | partner_email={other_partner_user.partner_email}) from user {current_user.id}, as we received a new link request for the same partner",
+)
 PartnerUser.delete(other_partner_user.id)
 LOG.i(f"Linking partner_user:{partner_user.id} to user:{current_user.id}")
 # Link this partner_user to the current user
+emit_user_audit_log(
+user=partner_user.user,
+action=UserAuditLogAction.UnlinkAccount,
+message=f"Unlinking from partner, as user will now be tied to another external account. old=(id={partner_user.user.id} | email={partner_user.user.email}) | new=(id={current_user.id} | email={current_user.email})",
+)
 partner_user.user_id = current_user.id
+emit_user_audit_log(
+user=current_user,
+action=UserAuditLogAction.LinkAccount,
+message=f"Linking user {current_user.id} ({current_user.email}) to partner_user:{partner_user.id} (external_user_id={partner_user.external_user_id} | partner_email={partner_user.partner_email})",
+)
 # Set plan
 set_plan_for_partner_user(partner_user, link_request.plan)
 Session.commit()

View File

@@ -1,5 +1,5 @@
 from __future__ import annotations
-from typing import Optional
+from typing import Optional, List
 import arrow
 import sqlalchemy
@@ -33,6 +33,10 @@ from app.models import (
 Mailbox,
 DeletedAlias,
 DomainDeletedAlias,
+PartnerUser,
+AliasMailbox,
+AliasAuditLog,
+UserAuditLog,
 )
 from app.newsletter_utils import send_newsletter_to_user, send_newsletter_to_address
@@ -735,10 +739,13 @@ class InvalidMailboxDomainAdmin(SLModelView):
 class EmailSearchResult:
 no_match: bool = True
 alias: Optional[Alias] = None
-mailbox: Optional[Mailbox] = None
+alias_audit_log: Optional[List[AliasAuditLog]] = None
+mailbox: List[Mailbox] = []
+mailbox_count: int = 0
 deleted_alias: Optional[DeletedAlias] = None
 deleted_custom_alias: Optional[DomainDeletedAlias] = None
 user: Optional[User] = None
+user_audit_log: Optional[List[UserAuditLog]] = None
 @staticmethod
 def from_email(email: str) -> EmailSearchResult:
@@ -746,23 +753,32 @@
 alias = Alias.get_by(email=email)
 if alias:
 output.alias = alias
+output.alias_audit_log = (
+AliasAuditLog.filter_by(alias_id=alias.id)
+.order_by(AliasAuditLog.created_at.desc())
+.all()
+)
 output.no_match = False
-return output
 user = User.get_by(email=email)
 if user:
 output.user = user
+output.user_audit_log = (
+UserAuditLog.filter_by(user_id=user.id)
+.order_by(UserAuditLog.created_at.desc())
+.all()
+)
 output.no_match = False
-return output
-mailbox = Mailbox.get_by(email=email)
-if mailbox:
-output.mailbox = mailbox
+mailboxes = (
+Mailbox.filter_by(email=email).order_by(Mailbox.id.desc()).limit(10).all()
+)
+if mailboxes:
+output.mailbox = mailboxes
+output.mailbox_count = Mailbox.filter_by(email=email).count()
 output.no_match = False
-return output
 deleted_alias = DeletedAlias.get_by(email=email)
 if deleted_alias:
 output.deleted_alias = deleted_alias
 output.no_match = False
-return output
 domain_deleted_alias = DomainDeletedAlias.get_by(email=email)
 if domain_deleted_alias:
 output.domain_deleted_alias = domain_deleted_alias
@@ -782,16 +798,41 @@ class EmailSearchHelpers:
 @staticmethod
 def mailbox_count(user: User) -> int:
-return Mailbox.filter_by(user_id=user.id).order_by(Mailbox.id.asc()).count()
+return Mailbox.filter_by(user_id=user.id).order_by(Mailbox.id.desc()).count()
+@staticmethod
+def alias_mailboxes(alias: Alias) -> list[Mailbox]:
+return (
+Session.query(Mailbox)
+.filter(Mailbox.id == Alias.mailbox_id, Alias.id == alias.id)
+.union(
+Session.query(Mailbox)
+.join(AliasMailbox, Mailbox.id == AliasMailbox.mailbox_id)
+.filter(AliasMailbox.alias_id == alias.id)
+)
+.order_by(Mailbox.id)
+.limit(10)
+.all()
+)
+@staticmethod
+def alias_mailbox_count(alias: Alias) -> int:
+return len(alias.mailboxes)
 @staticmethod
 def alias_list(user: User) -> list[Alias]:
-return Alias.filter_by(user_id=user.id).order_by(Alias.id.asc()).limit(10).all()
+return (
+Alias.filter_by(user_id=user.id).order_by(Alias.id.desc()).limit(10).all()
+)
 @staticmethod
 def alias_count(user: User) -> int:
 return Alias.filter_by(user_id=user.id).count()
+@staticmethod
+def partner_user(user: User) -> Optional[PartnerUser]:
+return PartnerUser.get_by(user_id=user.id)
 class EmailSearchAdmin(BaseView):
 def is_accessible(self):
@@ -805,9 +846,8 @@ class EmailSearchAdmin(BaseView):
 @expose("/", methods=["GET", "POST"])
 def index(self):
 search = EmailSearchResult()
-email = ""
-if request.form and request.form["email"]:
-email = request.form["email"]
+email = request.args.get("email")
+if email is not None and len(email) > 0:
 email = email.strip()
 search = EmailSearchResult.from_email(email)

View File

@@ -0,0 +1,38 @@ (new file)
from enum import Enum
from typing import Optional

from app.models import Alias, AliasAuditLog


class AliasAuditLogAction(Enum):
    CreateAlias = "create"
    ChangeAliasStatus = "change_status"
    DeleteAlias = "delete"
    UpdateAlias = "update"

    InitiateTransferAlias = "initiate_transfer_alias"
    AcceptTransferAlias = "accept_transfer_alias"
    TransferredAlias = "transferred_alias"

    ChangedMailboxes = "changed_mailboxes"

    CreateContact = "create_contact"
    UpdateContact = "update_contact"
    DeleteContact = "delete_contact"


def emit_alias_audit_log(
    alias: Alias,
    action: AliasAuditLogAction,
    message: str,
    user_id: Optional[int] = None,
    commit: bool = False,
):
    AliasAuditLog.create(
        user_id=user_id or alias.user_id,
        alias_id=alias.id,
        alias_email=alias.email,
        action=action.value,
        message=message,
        commit=commit,
    )
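The helper above only wraps AliasAuditLog.create, so the caller decides when to commit. A minimal usage sketch, assuming SimpleLogin's app package is importable with a configured database session; the alias id and message are made up for illustration:

```python
# Illustrative only: assumes an initialized SimpleLogin environment and DB session.
from app.alias_audit_log_utils import AliasAuditLogAction, emit_alias_audit_log
from app.models import Alias

alias = Alias.get(1)  # hypothetical alias id
if alias is not None:
    # Record a status change and write it immediately via commit=True.
    emit_alias_audit_log(
        alias=alias,
        action=AliasAuditLogAction.ChangeAliasStatus,
        message="Disabled during abuse review",
        commit=True,
    )
```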

View File

@@ -0,0 +1,61 @@ (new file)
from dataclasses import dataclass
from enum import Enum
from typing import List, Optional

from app.alias_audit_log_utils import emit_alias_audit_log, AliasAuditLogAction
from app.db import Session
from app.models import Alias, AliasMailbox, Mailbox

_MAX_MAILBOXES_PER_ALIAS = 20


class CannotSetMailboxesForAliasCause(Enum):
    Forbidden = "Forbidden"
    EmptyMailboxes = "Must choose at least one mailbox"
    TooManyMailboxes = "Too many mailboxes"


@dataclass
class SetMailboxesForAliasResult:
    performed_change: bool
    reason: Optional[CannotSetMailboxesForAliasCause]


def set_mailboxes_for_alias(
    user_id: int, alias: Alias, mailbox_ids: List[int]
) -> Optional[CannotSetMailboxesForAliasCause]:
    if len(mailbox_ids) == 0:
        return CannotSetMailboxesForAliasCause.EmptyMailboxes
    if len(mailbox_ids) > _MAX_MAILBOXES_PER_ALIAS:
        return CannotSetMailboxesForAliasCause.TooManyMailboxes

    mailboxes = (
        Session.query(Mailbox)
        .filter(
            Mailbox.id.in_(mailbox_ids),
            Mailbox.user_id == user_id,
            Mailbox.verified == True,  # noqa: E712
        )
        .all()
    )
    if len(mailboxes) != len(mailbox_ids):
        return CannotSetMailboxesForAliasCause.Forbidden

    # first remove all existing alias-mailboxes links
    AliasMailbox.filter_by(alias_id=alias.id).delete()
    Session.flush()

    # then add all new mailboxes, being the first the one associated with the alias
    for i, mailbox in enumerate(mailboxes):
        if i == 0:
            alias.mailbox_id = mailboxes[0].id
        else:
            AliasMailbox.create(alias_id=alias.id, mailbox_id=mailbox.id)

    emit_alias_audit_log(
        alias=alias,
        action=AliasAuditLogAction.ChangedMailboxes,
        message=",".join([f"{mailbox.id} ({mailbox.email})" for mailbox in mailboxes]),
    )

    return None
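For context, a usage sketch of set_mailboxes_for_alias in the same spirit as the API view further down in this diff; the user and mailbox ids are hypothetical, and the final commit is left to the caller because the helper only flushes:

```python
# Illustrative only: assumes a SimpleLogin environment with verified mailboxes 10 and 11 owned by user 1.
from app.alias_mailbox_utils import set_mailboxes_for_alias
from app.db import Session
from app.models import Alias

alias = Alias.get(42)  # hypothetical alias id
err = set_mailboxes_for_alias(user_id=1, alias=alias, mailbox_ids=[10, 11])
if err is None:
    Session.commit()  # helper flushed the links and wrote the audit log; commit is the caller's job
else:
    print(f"refused: {err.value}")
```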

View File

@@ -1,12 +1,14 @@
 import csv
 from io import StringIO
 import re
+from dataclasses import dataclass
 from typing import Optional, Tuple
 from email_validator import validate_email, EmailNotValidError
 from sqlalchemy.exc import IntegrityError, DataError
 from flask import make_response
+from app.alias_audit_log_utils import AliasAuditLogAction, emit_alias_audit_log
 from app.config import (
 BOUNCE_PREFIX_FOR_REPLY_PHASE,
 BOUNCE_PREFIX,
@@ -23,6 +25,7 @@ from app.email_utils import (
 send_cannot_create_domain_alias,
 send_email,
 render,
+sl_formataddr,
 )
 from app.errors import AliasInTrashError
 from app.events.event_dispatcher import EventDispatcher
@@ -30,6 +33,7 @@ from app.events.generated.event_pb2 import (
 AliasDeleted,
 AliasStatusChanged,
 EventContent,
+AliasCreated,
 )
 from app.log import LOG
 from app.models import (
@@ -363,11 +367,18 @@ def delete_alias(
 Session.commit()
 LOG.i(f"Moving {alias} to global trash {deleted_alias}")
+alias_id = alias.id
+alias_email = alias.email
+emit_alias_audit_log(
+alias, AliasAuditLogAction.DeleteAlias, "Alias deleted by user action"
+)
 Alias.filter(Alias.id == alias.id).delete()
 Session.commit()
 EventDispatcher.send_event(
-user, EventContent(alias_deleted=AliasDeleted(alias_id=alias.id))
+user,
+EventContent(alias_deleted=AliasDeleted(id=alias_id, email=alias_email)),
 )
 if commit:
 Session.commit()
@@ -444,7 +455,7 @@ def alias_export_csv(user, csv_direct_export=False):
 return output
-def transfer_alias(alias, new_user, new_mailboxes: [Mailbox]):
+def transfer_alias(alias: Alias, new_user: User, new_mailboxes: [Mailbox]):
 # cannot transfer alias which is used for receiving newsletter
 if User.get_by(newsletter_alias_id=alias.id):
 raise Exception("Cannot transfer alias that's used to receive newsletter")
@@ -498,17 +509,90 @@ def transfer_alias(alias, new_user, new_mailboxes: [Mailbox]):
 alias.disable_pgp = False
 alias.pinned = False
+emit_alias_audit_log(
+alias=alias,
+action=AliasAuditLogAction.TransferredAlias,
+message=f"Lost ownership of alias due to alias transfer confirmed. New owner is {new_user.id}",
+user_id=old_user.id,
+)
+EventDispatcher.send_event(
+old_user,
+EventContent(
+alias_deleted=AliasDeleted(
+id=alias.id,
+email=alias.email,
+)
+),
+)
+emit_alias_audit_log(
+alias=alias,
+action=AliasAuditLogAction.AcceptTransferAlias,
+message=f"Accepted alias transfer from user {old_user.id}",
+user_id=new_user.id,
+)
+EventDispatcher.send_event(
+new_user,
+EventContent(
+alias_created=AliasCreated(
+id=alias.id,
+email=alias.email,
+note=alias.note,
+enabled=alias.enabled,
+created_at=int(alias.created_at.timestamp),
+)
+),
+)
 Session.commit()
-def change_alias_status(alias: Alias, enabled: bool, commit: bool = False):
+def change_alias_status(
+alias: Alias, enabled: bool, message: Optional[str] = None, commit: bool = False
+):
 LOG.i(f"Changing alias {alias} enabled to {enabled}")
 alias.enabled = enabled
 event = AliasStatusChanged(
-alias_id=alias.id, alias_email=alias.email, enabled=enabled
+id=alias.id,
+email=alias.email,
+enabled=enabled,
+created_at=int(alias.created_at.timestamp),
 )
 EventDispatcher.send_event(alias.user, EventContent(alias_status_change=event))
+audit_log_message = f"Set alias status to {enabled}"
+if message is not None:
+audit_log_message += f". {message}"
+emit_alias_audit_log(
+alias, AliasAuditLogAction.ChangeAliasStatus, audit_log_message
+)
 if commit:
 Session.commit()
+@dataclass
+class AliasRecipientName:
+name: str
+message: Optional[str] = None
+def get_alias_recipient_name(alias: Alias) -> AliasRecipientName:
+"""
+Logic:
+1. If alias has name, use it
+2. If alias has custom domain, and custom domain has name, use it
+3. Otherwise, use the alias email as the recipient
+"""
+if alias.name:
+return AliasRecipientName(
+name=sl_formataddr((alias.name, alias.email)),
+message=f"Put alias name {alias.name} in from header",
+)
+elif alias.custom_domain:
+if alias.custom_domain.name:
+return AliasRecipientName(
+name=sl_formataddr((alias.custom_domain.name, alias.email)),
+message=f"Put domain default alias name {alias.custom_domain.name} in from header",
+)
+return AliasRecipientName(name=alias.email)
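To make the precedence in get_alias_recipient_name concrete, here is a small sketch; the SimpleNamespace objects are duck-typed stand-ins for Alias rows carrying only the attributes the function reads, and the addresses are invented:

```python
# Illustrative only: stand-ins for Alias, showing alias name > custom-domain name > bare email.
from types import SimpleNamespace
from app.alias_utils import get_alias_recipient_name

named = SimpleNamespace(name="Newsletter", email="news@d.example", custom_domain=None)
domain_named = SimpleNamespace(
    name=None, email="hi@d.example", custom_domain=SimpleNamespace(name="My Domain")
)
bare = SimpleNamespace(name=None, email="x@d.example", custom_domain=None)

print(get_alias_recipient_name(named).name)         # e.g. Newsletter <news@d.example>
print(get_alias_recipient_name(domain_named).name)  # e.g. My Domain <hi@d.example>
print(get_alias_recipient_name(bare).name)          # x@d.example
```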

View File

@@ -1,9 +1,13 @@
+from typing import Optional
 from deprecated import deprecated
 from flask import g
 from flask import jsonify
 from flask import request
 from app import alias_utils
+from app.alias_audit_log_utils import emit_alias_audit_log, AliasAuditLogAction
+from app.alias_mailbox_utils import set_mailboxes_for_alias
 from app.api.base import api_bp, require_api_auth
 from app.api.serializer import (
 AliasInfo,
@@ -26,7 +30,7 @@ from app.errors import (
 )
 from app.extensions import limiter
 from app.log import LOG
-from app.models import Alias, Contact, Mailbox, AliasMailbox, AliasDeleteReason
+from app.models import Alias, Contact, Mailbox, AliasDeleteReason
 @deprecated
@@ -185,7 +189,11 @@ def toggle_alias(alias_id):
 if not alias or alias.user_id != user.id:
 return jsonify(error="Forbidden"), 403
-alias_utils.change_alias_status(alias, enabled=not alias.enabled)
+alias_utils.change_alias_status(
+alias,
+enabled=not alias.enabled,
+message=f"Set enabled={not alias.enabled} via API",
+)
 LOG.i(f"User {user} changed alias {alias} enabled status to {alias.enabled}")
 Session.commit()
@@ -272,10 +280,12 @@ def update_alias(alias_id):
 if not alias or alias.user_id != user.id:
 return jsonify(error="Forbidden"), 403
+changed_fields = []
 changed = False
 if "note" in data:
 new_note = data.get("note")
 alias.note = new_note
+changed_fields.append("note")
 changed = True
 if "mailbox_id" in data:
@@ -285,35 +295,19 @@
 return jsonify(error="Forbidden"), 400
 alias.mailbox_id = mailbox_id
+changed_fields.append(f"mailbox_id ({mailbox_id})")
 changed = True
 if "mailbox_ids" in data:
 mailbox_ids = [int(m_id) for m_id in data.get("mailbox_ids")]
-mailboxes: [Mailbox] = []
-# check if all mailboxes belong to user
-for mailbox_id in mailbox_ids:
-mailbox = Mailbox.get(mailbox_id)
-if not mailbox or mailbox.user_id != user.id or not mailbox.verified:
-return jsonify(error="Forbidden"), 400
-mailboxes.append(mailbox)
-if not mailboxes:
-return jsonify(error="Must choose at least one mailbox"), 400
-# <<< update alias mailboxes >>>
-# first remove all existing alias-mailboxes links
-AliasMailbox.filter_by(alias_id=alias.id).delete()
-Session.flush()
-# then add all new mailboxes
-for i, mailbox in enumerate(mailboxes):
-if i == 0:
-alias.mailbox_id = mailboxes[0].id
-else:
-AliasMailbox.create(alias_id=alias.id, mailbox_id=mailbox.id)
-# <<< END update alias mailboxes >>>
+err = set_mailboxes_for_alias(
+user_id=user.id, alias=alias, mailbox_ids=mailbox_ids
+)
+if err:
+return jsonify(error=err.value), 400
+mailbox_ids_string = ",".join(map(str, mailbox_ids))
+changed_fields.append(f"mailbox_ids ({mailbox_ids_string})")
 changed = True
 if "name" in data:
@@ -325,17 +319,26 @@
 if new_name:
 new_name = new_name.replace("\n", "")
 alias.name = new_name
+changed_fields.append("name")
 changed = True
 if "disable_pgp" in data:
 alias.disable_pgp = data.get("disable_pgp")
+changed_fields.append("disable_pgp")
 changed = True
 if "pinned" in data:
 alias.pinned = data.get("pinned")
+changed_fields.append("pinned")
 changed = True
 if changed:
+changed_fields_string = ",".join(changed_fields)
+emit_alias_audit_log(
+alias,
+AliasAuditLogAction.UpdateAlias,
+f"Alias fields updated ({changed_fields_string})",
+)
 Session.commit()
 return jsonify(ok=True), 200
@@ -424,7 +427,7 @@ def create_contact_route(alias_id):
 contact_address = data.get("contact")
 try:
-contact = create_contact(g.user, alias, contact_address)
+contact = create_contact(alias, contact_address)
 except ErrContactErrorUpgradeNeeded as err:
 return jsonify(error=err.error_for_user()), 403
 except (ErrAddressInvalid, CannotCreateContactForReverseAlias) as err:
@@ -446,11 +449,16 @@ def delete_contact(contact_id):
 200
 """
 user = g.user
-contact = Contact.get(contact_id)
+contact: Optional[Contact] = Contact.get(contact_id)
 if not contact or contact.alias.user_id != user.id:
 return jsonify(error="Forbidden"), 403
+emit_alias_audit_log(
+alias=contact.alias,
+action=AliasAuditLogAction.DeleteContact,
+message=f"Deleted contact {contact_id} ({contact.email})",
+)
 Contact.delete(contact_id)
 Session.commit()
@@ -468,12 +476,17 @@ def toggle_contact(contact_id):
 200
 """
 user = g.user
-contact = Contact.get(contact_id)
+contact: Optional[Contact] = Contact.get(contact_id)
 if not contact or contact.alias.user_id != user.id:
 return jsonify(error="Forbidden"), 403
 contact.block_forward = not contact.block_forward
+emit_alias_audit_log(
+alias=contact.alias,
+action=AliasAuditLogAction.UpdateContact,
+message=f"Set contact state {contact.id} {contact.email} -> {contact.website_email} to blocked {contact.block_forward}",
+)
 Session.commit()
 return jsonify(block_forward=contact.block_forward), 200

View File

@@ -52,8 +52,12 @@ def auth_login():
 password = data.get("password")
 device = data.get("device")
-email = sanitize_email(data.get("email"))
-canonical_email = canonicalize_email(data.get("email"))
+email = data.get("email")
+if not email:
+LoginEvent(LoginEvent.ActionType.failed, LoginEvent.Source.api).send()
+return jsonify(error="Email or password incorrect"), 400
+email = sanitize_email(email)
+canonical_email = canonicalize_email(email)
 user = User.get_by(email=email) or User.get_by(email=canonical_email)

View File

@@ -2,8 +2,10 @@ from flask import g, request
 from flask import jsonify
 from app.api.base import api_bp, require_api_auth
+from app.custom_domain_utils import set_custom_domain_mailboxes
 from app.db import Session
-from app.models import CustomDomain, DomainDeletedAlias, Mailbox, DomainMailbox
+from app.log import LOG
+from app.models import CustomDomain, DomainDeletedAlias
 def custom_domain_to_dict(custom_domain: CustomDomain):
@@ -100,23 +102,14 @@ def update_custom_domain(custom_domain_id):
 if "mailbox_ids" in data:
 mailbox_ids = [int(m_id) for m_id in data.get("mailbox_ids")]
-if mailbox_ids:
-# check if mailbox is not tempered with
-mailboxes = []
-for mailbox_id in mailbox_ids:
-mailbox = Mailbox.get(mailbox_id)
-if not mailbox or mailbox.user_id != user.id or not mailbox.verified:
-return jsonify(error="Forbidden"), 400
-mailboxes.append(mailbox)
-# first remove all existing domain-mailboxes links
-DomainMailbox.filter_by(domain_id=custom_domain.id).delete()
-Session.flush()
-for mailbox in mailboxes:
-DomainMailbox.create(domain_id=custom_domain.id, mailbox_id=mailbox.id)
+result = set_custom_domain_mailboxes(user.id, custom_domain, mailbox_ids)
+if result.success:
 changed = True
+else:
+LOG.info(
+f"Prevented from updating mailboxes [custom_domain_id={custom_domain.id}]: {result.reason.value}"
+)
+return jsonify(error="Forbidden"), 400
 if changed:
 Session.commit()

View File

@@ -6,6 +6,7 @@ from app import config
 from app.extensions import limiter
 from app.log import LOG
 from app.models import Job, ApiToCookieToken
+from app.user_audit_log_utils import emit_user_audit_log, UserAuditLogAction
 @api_bp.route("/user", methods=["DELETE"])
@@ -16,6 +17,11 @@ def delete_user():
 """
 # Schedule delete account job
+emit_user_audit_log(
+user=g.user,
+action=UserAuditLogAction.UserMarkedForDeletion,
+message=f"Marked user {g.user.id} ({g.user.email}) for deletion from API",
+)
 LOG.w("schedule delete account job for %s", g.user)
 Job.create(
 name=config.JOB_DELETE_ACCOUNT,

View File

@@ -87,7 +87,7 @@ def update_user_info():
 File.delete(file.id)
 s3.delete(file.path)
 Session.flush()
-else:
+if data["profile_picture"] is not None:
 raw_data = base64.decodebytes(data["profile_picture"].encode())
 if detect_image_format(raw_data) == ImageFormat.Unknown:
 return jsonify(error="Unsupported image format"), 400

View File

@@ -35,6 +35,33 @@ def sl_getenv(env_var: str, default_factory: Callable = None):
 return literal_eval(value)
+def get_env_dict(env_var: str) -> dict[str, str]:
+"""
+Get an env variable and convert it into a python dictionary with keys and values as strings.
+Args:
+env_var (str): env var, example: SL_DB
+Syntax is: key1=value1;key2=value2
+Components separated by ;
+key and value separated by =
+"""
+value = os.getenv(env_var)
+if not value:
+return {}
+components = value.split(";")
+result = {}
+for component in components:
+if component == "":
+continue
+parts = component.split("=")
+if len(parts) != 2:
+raise Exception(f"Invalid config for env var {env_var}")
+result[parts[0].strip()] = parts[1].strip()
+return result
 config_file = os.environ.get("CONFIG")
 if config_file:
 config_file = get_abs_path(config_file)
@@ -574,7 +601,6 @@ SKIP_MX_LOOKUP_ON_CHECK = False
 DISABLE_RATE_LIMIT = "DISABLE_RATE_LIMIT" in os.environ
-SUBSCRIPTION_CHANGE_WEBHOOK = os.environ.get("SUBSCRIPTION_CHANGE_WEBHOOK", None)
 MAX_API_KEYS = int(os.environ.get("MAX_API_KEYS", 30))
 UPCLOUD_USERNAME = os.environ.get("UPCLOUD_USERNAME", None)
@@ -609,3 +635,32 @@ EVENT_WEBHOOK_ENABLED_USER_IDS: Optional[List[int]] = read_webhook_enabled_user_
 # Allow to define a different DB_URI for the event listener, in case we want to skip the connection pool
 # It defaults to the regular DB_URI in case it's needed
 EVENT_LISTENER_DB_URI = os.environ.get("EVENT_LISTENER_DB_URI", DB_URI)
+def read_partner_dict(var: str) -> dict[int, str]:
+partner_value = get_env_dict(var)
+if len(partner_value) == 0:
+return {}
+res: dict[int, str] = {}
+for partner_id in partner_value.keys():
+try:
+partner_id_int = int(partner_id.strip())
+res[partner_id_int] = partner_value[partner_id]
+except ValueError:
+pass
+return res
+PARTNER_DNS_CUSTOM_DOMAINS: dict[int, str] = read_partner_dict(
+"PARTNER_DNS_CUSTOM_DOMAINS"
+)
+PARTNER_CUSTOM_DOMAIN_VALIDATION_PREFIXES: dict[int, str] = read_partner_dict(
+"PARTNER_CUSTOM_DOMAIN_VALIDATION_PREFIXES"
+)
+MAILBOX_VERIFICATION_OVERRIDE_CODE: Optional[str] = os.environ.get(
+"MAILBOX_VERIFICATION_OVERRIDE_CODE", None
+)
+AUDIT_LOG_MAX_DAYS = int(os.environ.get("AUDIT_LOG_MAX_DAYS", 30))
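The get_env_dict docstring above defines the `key1=value1;key2=value2` syntax, and read_partner_dict then coerces the keys to ints. A standalone sketch that mirrors that parsing so the format can be tried without loading app.config; the partner ids and domains are made up:

```python
# Standalone mirror of get_env_dict/read_partner_dict above; not the app.config code itself.
raw = "1=mail.partner-one.example;2=mail.partner-two.example"

def parse_env_dict(value: str) -> dict[str, str]:
    result: dict[str, str] = {}
    for component in value.split(";"):
        if component == "":
            continue
        parts = component.split("=")
        if len(parts) != 2:
            raise Exception(f"Invalid component {component!r}")
        result[parts[0].strip()] = parts[1].strip()
    return result

parsed = parse_env_dict(raw)                              # {'1': 'mail.partner-one.example', ...}
partner_domains = {int(k): v for k, v in parsed.items()}  # {1: 'mail.partner-one.example', ...}
print(partner_domains)
```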

View File

@@ -1 +1,2 @@
 HEADER_ALLOW_API_COOKIES = "X-Sl-Allowcookies"
+DMARC_RECORD = "v=DMARC1; p=quarantine; pct=100; adkim=s; aspf=s"

app/app/contact_utils.py (new file, 124 lines)
View File

@@ -0,0 +1,124 @@ (new file)
from dataclasses import dataclass
from enum import Enum
from typing import Optional

from sqlalchemy.exc import IntegrityError

from app.alias_audit_log_utils import emit_alias_audit_log, AliasAuditLogAction
from app.db import Session
from app.email_utils import generate_reply_email, parse_full_address
from app.email_validation import is_valid_email
from app.log import LOG
from app.models import Contact, Alias
from app.utils import sanitize_email


class ContactCreateError(Enum):
    InvalidEmail = "Invalid email"
    NotAllowed = "Your plan does not allow to create contacts"


@dataclass
class ContactCreateResult:
    contact: Optional[Contact]
    created: bool
    error: Optional[ContactCreateError]


def __update_contact_if_needed(
    contact: Contact, name: Optional[str], mail_from: Optional[str]
) -> ContactCreateResult:
    if name and contact.name != name:
        LOG.d(f"Setting {contact} name to {name}")
        contact.name = name
        Session.commit()
    if mail_from and contact.mail_from is None:
        LOG.d(f"Setting {contact} mail_from to {mail_from}")
        contact.mail_from = mail_from
        Session.commit()
    return ContactCreateResult(contact, created=False, error=None)


def create_contact(
    email: str,
    alias: Alias,
    name: Optional[str] = None,
    mail_from: Optional[str] = None,
    allow_empty_email: bool = False,
    automatic_created: bool = False,
    from_partner: bool = False,
) -> ContactCreateResult:
    # If user cannot create contacts, they still need to be created when receiving an email for an alias
    if not automatic_created and not alias.user.can_create_contacts():
        return ContactCreateResult(
            None, created=False, error=ContactCreateError.NotAllowed
        )
    # Parse emails with form 'name <email>'
    try:
        email_name, email = parse_full_address(email)
    except ValueError:
        email = ""
        email_name = ""
    # If no name is explicitly given try to get it from the parsed email
    if name is None:
        name = email_name[: Contact.MAX_NAME_LENGTH]
    else:
        name = name[: Contact.MAX_NAME_LENGTH]
    # If still no name is there, make sure the name is None instead of empty string
    if not name:
        name = None
    if name is not None and "\x00" in name:
        LOG.w("Cannot use contact name because has \\x00")
        name = ""
    # Sanitize email and if it's not valid only allow to create a contact if it's explicitly allowed. Otherwise fail
    email = sanitize_email(email, not_lower=True)
    if not is_valid_email(email):
        LOG.w(f"invalid contact email {email}")
        if not allow_empty_email:
            return ContactCreateResult(
                None, created=False, error=ContactCreateError.InvalidEmail
            )
        LOG.d("Create a contact with invalid email for %s", alias)
        # either reuse a contact with empty email or create a new contact with empty email
        email = ""
    # If contact exists, update name and mail_from if needed
    contact = Contact.get_by(alias_id=alias.id, website_email=email)
    if contact is not None:
        return __update_contact_if_needed(contact, name, mail_from)
    # Create the contact
    reply_email = generate_reply_email(email, alias)
    try:
        flags = Contact.FLAG_PARTNER_CREATED if from_partner else 0
        contact = Contact.create(
            user_id=alias.user_id,
            alias_id=alias.id,
            website_email=email,
            name=name,
            reply_email=reply_email,
            mail_from=mail_from,
            automatic_created=automatic_created,
            flags=flags,
            invalid_email=email == "",
            commit=True,
        )
        if automatic_created:
            trail = ". Automatically created"
        else:
            trail = ". Created by user action"
        emit_alias_audit_log(
            alias=alias,
            action=AliasAuditLogAction.CreateContact,
            message=f"Created contact {contact.id} ({contact.email}){trail}",
            commit=True,
        )
        LOG.d(
            f"Created contact {contact} for alias {alias} with email {email} invalid_email={contact.invalid_email}"
        )
    except IntegrityError:
        Session.rollback()
        LOG.info(
            f"Contact with email {email} for alias_id {alias.id} already existed, fetching from DB"
        )
        contact = Contact.get_by(alias_id=alias.id, website_email=email)
        return __update_contact_if_needed(contact, name, mail_from)
    return ContactCreateResult(contact, created=True, error=None)
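A usage sketch for create_contact, assuming a configured SimpleLogin environment; the alias id and contact address are invented. The result dataclass carries either the (possibly pre-existing) contact or an error:

```python
# Illustrative only: assumes an initialized SimpleLogin environment and DB session.
from app.contact_utils import create_contact, ContactCreateError
from app.models import Alias

alias = Alias.get(1)  # hypothetical alias id
result = create_contact(email="Jane Doe <jane@shop.example>", alias=alias)
if result.error is ContactCreateError.NotAllowed:
    print("plan does not allow creating contacts manually")
elif result.error is ContactCreateError.InvalidEmail:
    print("contact address could not be parsed")
else:
    print(f"contact {result.contact.id}, newly created: {result.created}")
```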

View File

@@ -0,0 +1,206 @@ (new file)
import arrow
import re

from dataclasses import dataclass
from enum import Enum
from typing import List, Optional

from app.config import JOB_DELETE_DOMAIN
from app.db import Session
from app.email_utils import get_email_domain_part
from app.log import LOG
from app.models import User, CustomDomain, SLDomain, Mailbox, Job, DomainMailbox
from app.user_audit_log_utils import emit_user_audit_log, UserAuditLogAction

_ALLOWED_DOMAIN_REGEX = re.compile(r"^(?!-)[A-Za-z0-9-]{1,63}(?<!-)$")
_MAX_MAILBOXES_PER_DOMAIN = 20


@dataclass
class CreateCustomDomainResult:
    message: str = ""
    message_category: str = ""
    success: bool = False
    instance: Optional[CustomDomain] = None
    redirect: Optional[str] = None


class CannotUseDomainReason(Enum):
    InvalidDomain = 1
    BuiltinDomain = 2
    DomainAlreadyUsed = 3
    DomainPartOfUserEmail = 4
    DomainUserInMailbox = 5

    def message(self, domain: str) -> str:
        if self == CannotUseDomainReason.InvalidDomain:
            return "This is not a valid domain"
        elif self == CannotUseDomainReason.BuiltinDomain:
            return "A custom domain cannot be a built-in domain."
        elif self == CannotUseDomainReason.DomainAlreadyUsed:
            return f"{domain} already used"
        elif self == CannotUseDomainReason.DomainPartOfUserEmail:
            return "You cannot add a domain that you are currently using for your personal email. Please change your personal email to your real email"
        elif self == CannotUseDomainReason.DomainUserInMailbox:
            return f"{domain} already used in a SimpleLogin mailbox"
        else:
            raise Exception("Invalid CannotUseDomainReason")


class CannotSetCustomDomainMailboxesCause(Enum):
    InvalidMailbox = "Something went wrong, please retry"
    NoMailboxes = "You must select at least 1 mailbox"
    TooManyMailboxes = (
        f"You can only set up to {_MAX_MAILBOXES_PER_DOMAIN} mailboxes per domain"
    )


@dataclass
class SetCustomDomainMailboxesResult:
    success: bool
    reason: Optional[CannotSetCustomDomainMailboxesCause] = None


def is_valid_domain(domain: str) -> bool:
    """
    Checks that a domain is valid according to RFC 1035
    """
    if len(domain) > 255:
        return False
    if domain.endswith("."):
        domain = domain[:-1]  # Strip the trailing dot

    labels = domain.split(".")
    if not labels:
        return False
    for label in labels:
        if not _ALLOWED_DOMAIN_REGEX.match(label):
            return False
    return True


def sanitize_domain(domain: str) -> str:
    new_domain = domain.lower().strip()
    if new_domain.startswith("http://"):
        new_domain = new_domain[len("http://") :]
    if new_domain.startswith("https://"):
        new_domain = new_domain[len("https://") :]
    return new_domain


def can_domain_be_used(user: User, domain: str) -> Optional[CannotUseDomainReason]:
    if not is_valid_domain(domain):
        return CannotUseDomainReason.InvalidDomain
    elif SLDomain.get_by(domain=domain):
        return CannotUseDomainReason.BuiltinDomain
    elif CustomDomain.get_by(domain=domain):
        return CannotUseDomainReason.DomainAlreadyUsed
    elif get_email_domain_part(user.email) == domain:
        return CannotUseDomainReason.DomainPartOfUserEmail
    elif Mailbox.filter(
        Mailbox.verified.is_(True), Mailbox.email.endswith(f"@{domain}")
    ).first():
        return CannotUseDomainReason.DomainUserInMailbox
    else:
        return None


def create_custom_domain(
    user: User, domain: str, partner_id: Optional[int] = None
) -> CreateCustomDomainResult:
    if not user.is_premium():
        return CreateCustomDomainResult(
            message="Only premium plan can add custom domain",
            message_category="warning",
        )

    new_domain = sanitize_domain(domain)
    domain_forbidden_cause = can_domain_be_used(user, new_domain)
    if domain_forbidden_cause:
        return CreateCustomDomainResult(
            message=domain_forbidden_cause.message(new_domain), message_category="error"
        )

    new_custom_domain = CustomDomain.create(domain=new_domain, user_id=user.id)

    # new domain has ownership verified if its parent has the ownership verified
    for root_cd in user.custom_domains:
        if new_domain.endswith("." + root_cd.domain) and root_cd.ownership_verified:
            LOG.i(
                "%s ownership verified thanks to %s",
                new_custom_domain,
                root_cd,
            )
            new_custom_domain.ownership_verified = True

    # Add the partner_id in case it's passed
    if partner_id is not None:
        new_custom_domain.partner_id = partner_id

    emit_user_audit_log(
        user=user,
        action=UserAuditLogAction.CreateCustomDomain,
        message=f"Created custom domain {new_custom_domain.id} ({new_domain})",
    )
    Session.commit()

    return CreateCustomDomainResult(
        success=True,
        instance=new_custom_domain,
    )


def delete_custom_domain(domain: CustomDomain):
    # Schedule delete domain job
    LOG.w("schedule delete domain job for %s", domain)
    domain.pending_deletion = True
    Job.create(
        name=JOB_DELETE_DOMAIN,
        payload={"custom_domain_id": domain.id},
        run_at=arrow.now(),
        commit=True,
    )


def set_custom_domain_mailboxes(
    user_id: int, custom_domain: CustomDomain, mailbox_ids: List[int]
) -> SetCustomDomainMailboxesResult:
    if len(mailbox_ids) == 0:
        return SetCustomDomainMailboxesResult(
            success=False, reason=CannotSetCustomDomainMailboxesCause.NoMailboxes
        )
    elif len(mailbox_ids) > _MAX_MAILBOXES_PER_DOMAIN:
        return SetCustomDomainMailboxesResult(
            success=False, reason=CannotSetCustomDomainMailboxesCause.TooManyMailboxes
        )

    mailboxes = (
        Session.query(Mailbox)
        .filter(
            Mailbox.id.in_(mailbox_ids),
            Mailbox.user_id == user_id,
            Mailbox.verified == True,  # noqa: E712
        )
        .all()
    )
    if len(mailboxes) != len(mailbox_ids):
        return SetCustomDomainMailboxesResult(
            success=False, reason=CannotSetCustomDomainMailboxesCause.InvalidMailbox
        )

    # first remove all existing domain-mailboxes links
    DomainMailbox.filter_by(domain_id=custom_domain.id).delete()
    Session.flush()

    for mailbox in mailboxes:
        DomainMailbox.create(domain_id=custom_domain.id, mailbox_id=mailbox.id)

    mailboxes_as_str = ",".join(map(str, mailbox_ids))
    emit_user_audit_log(
        user=custom_domain.user,
        action=UserAuditLogAction.UpdateCustomDomain,
        message=f"Updated custom domain {custom_domain.id} mailboxes (domain={custom_domain.domain}) (mailboxes={mailboxes_as_str})",
    )
    Session.commit()
    return SetCustomDomainMailboxesResult(success=True)
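The pure helpers in this new module (sanitize_domain, is_valid_domain) can be exercised on their own; a small sketch with invented domains, assuming the app package imports in a configured environment:

```python
# Illustrative only: exercises the validation helpers defined above.
from app.custom_domain_utils import is_valid_domain, sanitize_domain

print(sanitize_domain("https://Mail.Example.COM "))  # mail.example.com
print(is_valid_domain("mail.example.com"))           # True
print(is_valid_domain("-bad-.example.com"))          # False: label starts/ends with '-'
print(is_valid_domain(("a" * 300) + ".com"))         # False: longer than 255 characters
```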

View File

@ -1,37 +1,228 @@
from dataclasses import dataclass
from typing import List, Optional
from app import config
from app.constants import DMARC_RECORD
from app.db import Session from app.db import Session
from app.dns_utils import get_cname_record from app.dns_utils import (
MxRecord,
DNSClient,
is_mx_equivalent,
get_network_dns_client,
)
from app.models import CustomDomain from app.models import CustomDomain
from app.user_audit_log_utils import emit_user_audit_log, UserAuditLogAction
from app.utils import random_string
@dataclass
class DomainValidationResult:
success: bool
errors: [str]
class CustomDomainValidation: class CustomDomainValidation:
def __init__(self, dkim_domain: str): def __init__(
self,
dkim_domain: str,
dns_client: DNSClient = get_network_dns_client(),
partner_domains: Optional[dict[int, str]] = None,
partner_domains_validation_prefixes: Optional[dict[int, str]] = None,
):
self.dkim_domain = dkim_domain self.dkim_domain = dkim_domain
self._dkim_records = { self._dns_client = dns_client
(f"{key}._domainkey", f"{key}._domainkey.{self.dkim_domain}") self._partner_domains = partner_domains or config.PARTNER_DNS_CUSTOM_DOMAINS
self._partner_domain_validation_prefixes = (
partner_domains_validation_prefixes
or config.PARTNER_CUSTOM_DOMAIN_VALIDATION_PREFIXES
)
def get_ownership_verification_record(self, domain: CustomDomain) -> str:
prefix = "sl"
if (
domain.partner_id is not None
and domain.partner_id in self._partner_domain_validation_prefixes
):
prefix = self._partner_domain_validation_prefixes[domain.partner_id]
if not domain.ownership_txt_token:
domain.ownership_txt_token = random_string(30)
Session.commit()
return f"{prefix}-verification={domain.ownership_txt_token}"
def get_expected_mx_records(self, domain: CustomDomain) -> list[MxRecord]:
records = []
if domain.partner_id is not None and domain.partner_id in self._partner_domains:
domain = self._partner_domains[domain.partner_id]
records.append(MxRecord(10, f"mx1.{domain}."))
records.append(MxRecord(20, f"mx2.{domain}."))
else:
# Default ones
for priority, domain in config.EMAIL_SERVERS_WITH_PRIORITY:
records.append(MxRecord(priority, domain))
return records
def get_expected_spf_domain(self, domain: CustomDomain) -> str:
if domain.partner_id is not None and domain.partner_id in self._partner_domains:
return self._partner_domains[domain.partner_id]
else:
return config.EMAIL_DOMAIN
def get_expected_spf_record(self, domain: CustomDomain) -> str:
spf_domain = self.get_expected_spf_domain(domain)
return f"v=spf1 include:{spf_domain} ~all"
def get_dkim_records(self, domain: CustomDomain) -> {str: str}:
"""
Get a list of dkim records to set up. Depending on the custom_domain, whether if it's from a partner or not,
it will return the default ones or the partner ones.
"""
# By default use the default domain
dkim_domain = self.dkim_domain
if domain.partner_id is not None:
# Domain is from a partner. Retrieve the partner config and use that domain if exists
dkim_domain = self._partner_domains.get(domain.partner_id, dkim_domain)
return {
f"{key}._domainkey": f"{key}._domainkey.{dkim_domain}"
for key in ("dkim", "dkim02", "dkim03") for key in ("dkim", "dkim02", "dkim03")
} }
def get_dkim_records(self) -> {str: str}:
"""
Get a list of dkim records to set up. It will be
"""
return self._dkim_records
def validate_dkim_records(self, custom_domain: CustomDomain) -> dict[str, str]: def validate_dkim_records(self, custom_domain: CustomDomain) -> dict[str, str]:
""" """
Check if dkim records are properly set for this custom domain. Check if dkim records are properly set for this custom domain.
Returns empty list if all records are ok. Other-wise return the records that aren't properly configured Returns empty list if all records are ok. Other-wise return the records that aren't properly configured
""" """
correct_records = {}
invalid_records = {} invalid_records = {}
for prefix, expected_record in self.get_dkim_records(): expected_records = self.get_dkim_records(custom_domain)
for prefix, expected_record in expected_records.items():
custom_record = f"{prefix}.{custom_domain.domain}" custom_record = f"{prefix}.{custom_domain.domain}"
dkim_record = get_cname_record(custom_record) dkim_record = self._dns_client.get_cname_record(custom_record)
if dkim_record != expected_record: if dkim_record == expected_record:
correct_records[prefix] = custom_record
else:
invalid_records[custom_record] = dkim_record or "empty" invalid_records[custom_record] = dkim_record or "empty"
# HACK: If dkim is enabled, don't disable it to give users time to update their CNAMES
# HACK
# As initially we only had one dkim record, we want to allow users that had only the original dkim record and
# the domain validated to continue seeing it as validated (although showing them the missing records).
# However, if not even the original dkim record is right, even if the domain was dkim_verified in the past,
# we will remove the dkim_verified flag.
# This is done in order to give users with the old dkim config (only one) to update their CNAMEs
if custom_domain.dkim_verified: if custom_domain.dkim_verified:
return invalid_records # Check if at least the original dkim is there
if correct_records.get("dkim._domainkey") is not None:
# Original dkim record is there. Return the missing records (if any) and don't clear the flag
return invalid_records
# Original DKIM record is not there, which means the DKIM config is not finished. Proceed with the
# rest of the code path, returning the invalid records and clearing the flag
custom_domain.dkim_verified = len(invalid_records) == 0
if custom_domain.dkim_verified:
emit_user_audit_log(
user=custom_domain.user,
action=UserAuditLogAction.VerifyCustomDomain,
message=f"Verified DKIM records for custom domain {custom_domain.id} ({custom_domain.domain})",
)
Session.commit()
return invalid_records
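A hedged usage sketch of the return value (it assumes `validator` is an already-configured CustomDomainValidation instance and `custom_domain` an existing CustomDomain row): an empty dict means every expected CNAME resolved to the right target, a non-empty dict maps each wrong record to what was actually found.

invalid = validator.validate_dkim_records(custom_domain)
if not invalid:
    print("DKIM fully verified")
else:
    for record, found in invalid.items():
        # `found` is the CNAME that currently resolves, or "empty" if none exists
        print(f"{record} is misconfigured, currently: {found}")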
def validate_domain_ownership(
self, custom_domain: CustomDomain
) -> DomainValidationResult:
"""
Check if the custom_domain has added the ownership verification records
"""
txt_records = self._dns_client.get_txt_record(custom_domain.domain)
expected_verification_record = self.get_ownership_verification_record(
custom_domain
)
if expected_verification_record in txt_records:
custom_domain.ownership_verified = True
emit_user_audit_log(
user=custom_domain.user,
action=UserAuditLogAction.VerifyCustomDomain,
message=f"Verified ownership for custom domain {custom_domain.id} ({custom_domain.domain})",
)
Session.commit()
return DomainValidationResult(success=True, errors=[])
else:
return DomainValidationResult(success=False, errors=txt_records)
def validate_mx_records(
self, custom_domain: CustomDomain
) -> DomainValidationResult:
mx_domains = self._dns_client.get_mx_domains(custom_domain.domain)
expected_mx_records = self.get_expected_mx_records(custom_domain)
if not is_mx_equivalent(mx_domains, expected_mx_records):
return DomainValidationResult(
success=False,
errors=[f"{record.priority} {record.domain}" for record in mx_domains],
)
else:
custom_domain.verified = True
emit_user_audit_log(
user=custom_domain.user,
action=UserAuditLogAction.VerifyCustomDomain,
message=f"Verified MX records for custom domain {custom_domain.id} ({custom_domain.domain})",
)
Session.commit()
return DomainValidationResult(success=True, errors=[])
def validate_spf_records(
self, custom_domain: CustomDomain
) -> DomainValidationResult:
spf_domains = self._dns_client.get_spf_domain(custom_domain.domain)
expected_spf_domain = self.get_expected_spf_domain(custom_domain)
if expected_spf_domain in spf_domains:
custom_domain.spf_verified = True
emit_user_audit_log(
user=custom_domain.user,
action=UserAuditLogAction.VerifyCustomDomain,
message=f"Verified SPF records for custom domain {custom_domain.id} ({custom_domain.domain})",
)
Session.commit()
return DomainValidationResult(success=True, errors=[])
else:
custom_domain.spf_verified = False
Session.commit()
txt_records = self._dns_client.get_txt_record(custom_domain.domain)
cleaned_records = self.__clean_spf_records(txt_records, custom_domain)
return DomainValidationResult(
success=False,
errors=cleaned_records,
)
def validate_dmarc_records(
self, custom_domain: CustomDomain
) -> DomainValidationResult:
txt_records = self._dns_client.get_txt_record("_dmarc." + custom_domain.domain)
if DMARC_RECORD in txt_records:
custom_domain.dmarc_verified = True
emit_user_audit_log(
user=custom_domain.user,
action=UserAuditLogAction.VerifyCustomDomain,
message=f"Verified DMARC records for custom domain {custom_domain.id} ({custom_domain.domain})",
)
Session.commit()
return DomainValidationResult(success=True, errors=[])
else:
custom_domain.dmarc_verified = False
Session.commit()
return DomainValidationResult(success=False, errors=txt_records)
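For reference, the DMARC check above looks for a TXT record on the `_dmarc` subdomain. A sketch of the expected zone entry, assuming DMARC_RECORD keeps the value the dashboard view previously hard-coded:

# Assumed value of app.constants.DMARC_RECORD (matches the constant the old
# dashboard code inlined); published on _dmarc.<custom domain>.
DMARC_RECORD = "v=DMARC1; p=quarantine; pct=100; adkim=s; aspf=s"

# Zone entry for a hypothetical custom domain example.com:
#   _dmarc.example.com.  IN  TXT  "v=DMARC1; p=quarantine; pct=100; adkim=s; aspf=s"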
def __clean_spf_records(
self, txt_records: List[str], custom_domain: CustomDomain
) -> List[str]:
final_records = []
verification_record = self.get_ownership_verification_record(custom_domain)
for record in txt_records:
if record != verification_record:
final_records.append(record)
return final_records
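To show how the DNSClient seam is meant to be used, here is a minimal test-style sketch. It assumes CustomDomainValidation accepts the dkim domain plus an optional DNSClient, that the configured EMAIL_SERVERS_WITH_PRIORITY lists mx1/mx2.simplelogin.co, and that `custom_domain` is a CustomDomain row for example.com; none of that is guaranteed by this diff.

from app.custom_domain_validation import CustomDomainValidation
from app.dns_utils import InMemoryDNSClient, MxRecord

dns_client = InMemoryDNSClient()
# Seed the fake resolver with the MX records the user is supposed to create.
dns_client.set_mx_records(
    "example.com",
    [MxRecord(10, "mx1.simplelogin.co."), MxRecord(20, "mx2.simplelogin.co.")],
)
validator = CustomDomainValidation("simplelogin.co", dns_client=dns_client)

result = validator.validate_mx_records(custom_domain)
# Succeeds only if the seeded records match get_expected_mx_records(custom_domain).
assert result.success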

View File

@ -1,5 +1,6 @@
from dataclasses import dataclass
from operator import or_
from typing import Optional
from flask import render_template, request, redirect, flash
from flask import url_for
@ -9,13 +10,11 @@ from sqlalchemy import and_, func, case
from wtforms import StringField, validators, ValidationError
# Need to import directly from config to allow modification from the tests
from app import config, parallel_limiter, contact_utils
from app.alias_audit_log_utils import emit_alias_audit_log, AliasAuditLogAction
from app.contact_utils import ContactCreateError
from app.dashboard.base import dashboard_bp
from app.db import Session
from app.email_utils import (
generate_reply_email,
parse_full_address,
)
from app.email_validation import is_valid_email
from app.errors import (
CannotCreateContactForReverseAlias,
@ -24,8 +23,8 @@ from app.errors import (
ErrContactAlreadyExists,
)
from app.log import LOG
from app.models import Alias, Contact, EmailLog
from app.utils import CSRFValidationForm
def email_validator():
@ -51,7 +50,7 @@ def email_validator():
return _check
def create_contact(alias: Alias, contact_address: str) -> Contact:
"""
Create a contact for a user. Can be restricted for new free users by enabling DISABLE_CREATE_CONTACTS_FOR_FREE_USERS.
Can throw exceptions:
@ -61,37 +60,23 @@ def create_contact(user: User, alias: Alias, contact_address: str) -> Contact:
""" """
if not contact_address: if not contact_address:
raise ErrAddressInvalid("Empty address") raise ErrAddressInvalid("Empty address")
try: output = contact_utils.create_contact(email=contact_address, alias=alias)
contact_name, contact_email = parse_full_address(contact_address) if output.error == ContactCreateError.InvalidEmail:
except ValueError:
raise ErrAddressInvalid(contact_address) raise ErrAddressInvalid(contact_address)
elif output.error == ContactCreateError.NotAllowed:
contact_email = sanitize_email(contact_email)
if not is_valid_email(contact_email):
raise ErrAddressInvalid(contact_email)
contact = Contact.get_by(alias_id=alias.id, website_email=contact_email)
if contact:
raise ErrContactAlreadyExists(contact)
if not user.can_create_contacts():
raise ErrContactErrorUpgradeNeeded() raise ErrContactErrorUpgradeNeeded()
elif output.error is not None:
raise ErrAddressInvalid("Invalid address")
elif not output.created:
raise ErrContactAlreadyExists(output.contact)
contact = Contact.create( contact = output.contact
user_id=alias.user_id,
alias_id=alias.id,
website_email=contact_email,
name=contact_name,
reply_email=generate_reply_email(contact_email, alias),
)
LOG.d( LOG.d(
"create reverse-alias for %s %s, reverse alias:%s", "create reverse-alias for %s %s, reverse alias:%s",
contact_address, contact_address,
alias, alias,
contact.reply_email, contact.reply_email,
) )
Session.commit()
return contact return contact
@ -207,7 +192,7 @@ def get_contact_infos(
def delete_contact(alias: Alias, contact_id: int):
contact: Optional[Contact] = Contact.get(contact_id)
if not contact:
flash("Unknown error. Refresh the page", "warning")
@ -215,6 +200,11 @@ def delete_contact(alias: Alias, contact_id: int):
flash("You cannot delete reverse-alias", "warning") flash("You cannot delete reverse-alias", "warning")
else: else:
delete_contact_email = contact.website_email delete_contact_email = contact.website_email
emit_alias_audit_log(
alias=alias,
action=AliasAuditLogAction.DeleteContact,
message=f"Delete contact {contact_id} ({contact.email})",
)
Contact.delete(contact_id)
Session.commit()
@ -261,7 +251,7 @@ def alias_contact_manager(alias_id):
if new_contact_form.validate():
contact_address = new_contact_form.email.data.strip()
try:
contact = create_contact(alias, contact_address)
except (
ErrContactErrorUpgradeNeeded,
ErrAddressInvalid,

View File

@ -7,6 +7,7 @@ from flask import render_template, redirect, url_for, flash, request
from flask_login import login_required, current_user
from app import config
from app.alias_audit_log_utils import emit_alias_audit_log, AliasAuditLogAction
from app.alias_utils import transfer_alias
from app.dashboard.base import dashboard_bp
from app.dashboard.views.enter_sudo import sudo_required
@ -57,6 +58,12 @@ def alias_transfer_send_route(alias_id):
transfer_token = f"{alias.id}.{secrets.token_urlsafe(32)}"
alias.transfer_token = hmac_alias_transfer_token(transfer_token)
alias.transfer_token_expiration = arrow.utcnow().shift(hours=24)
emit_alias_audit_log(
alias,
AliasAuditLogAction.InitiateTransferAlias,
"Initiated alias transfer",
)
Session.commit()
alias_transfer_url = (
config.URL

View File

@ -1,8 +1,11 @@
from typing import Optional
from flask import render_template, request, redirect, url_for, flash
from flask_login import login_required, current_user
from flask_wtf import FlaskForm
from wtforms import StringField, validators
from app.alias_audit_log_utils import emit_alias_audit_log, AliasAuditLogAction
from app.dashboard.base import dashboard_bp
from app.db import Session
from app.models import Contact
@ -20,7 +23,7 @@ class PGPContactForm(FlaskForm):
@dashboard_bp.route("/contact/<int:contact_id>/", methods=["GET", "POST"]) @dashboard_bp.route("/contact/<int:contact_id>/", methods=["GET", "POST"])
@login_required @login_required
def contact_detail_route(contact_id): def contact_detail_route(contact_id):
contact = Contact.get(contact_id) contact: Optional[Contact] = Contact.get(contact_id)
if not contact or contact.user_id != current_user.id: if not contact or contact.user_id != current_user.id:
flash("You cannot see this page", "warning") flash("You cannot see this page", "warning")
return redirect(url_for("dashboard.index")) return redirect(url_for("dashboard.index"))
@ -50,6 +53,11 @@ def contact_detail_route(contact_id):
except PGPException:
flash("Cannot add the public key, please verify it", "error")
else:
emit_alias_audit_log(
alias=alias,
action=AliasAuditLogAction.UpdateContact,
message=f"Added PGP key {contact.pgp_public_key} for contact {contact_id} ({contact.email})",
)
Session.commit()
flash(
f"PGP public key for {contact.email} is saved successfully",
@ -62,6 +70,11 @@ def contact_detail_route(contact_id):
)
elif pgp_form.action.data == "remove":
# Free user can decide to remove contact PGP key
emit_alias_audit_log(
alias=alias,
action=AliasAuditLogAction.UpdateContact,
message=f"Removed PGP key {contact.pgp_public_key} for contact {contact_id} ({contact.email})",
)
contact.pgp_public_key = None
contact.pgp_finger_print = None
Session.commit()

View File

@ -5,11 +5,9 @@ from wtforms import StringField, validators
from app import parallel_limiter
from app.config import EMAIL_SERVERS_WITH_PRIORITY
from app.custom_domain_utils import create_custom_domain
from app.dashboard.base import dashboard_bp
from app.models import CustomDomain
from app.email_utils import get_email_domain_part
from app.log import LOG
from app.models import CustomDomain, Mailbox, DomainMailbox, SLDomain
class NewCustomDomainForm(FlaskForm):
@ -23,13 +21,12 @@ class NewCustomDomainForm(FlaskForm):
@parallel_limiter.lock(only_when=lambda: request.method == "POST")
def custom_domain():
custom_domains = CustomDomain.filter_by(
user_id=current_user.id,
is_sl_subdomain=False,
pending_deletion=False,
).all()
mailboxes = current_user.mailboxes()
new_custom_domain_form = NewCustomDomainForm()
errors = {}
if request.method == "POST":
if request.form.get("form-name") == "create":
if not current_user.is_premium():
@ -37,87 +34,25 @@ def custom_domain():
return redirect(url_for("dashboard.custom_domain")) return redirect(url_for("dashboard.custom_domain"))
if new_custom_domain_form.validate(): if new_custom_domain_form.validate():
new_domain = new_custom_domain_form.domain.data.lower().strip() res = create_custom_domain(
user=current_user, domain=new_custom_domain_form.domain.data
if new_domain.startswith("http://"): )
new_domain = new_domain[len("http://") :] if res.success:
flash(f"New domain {res.instance.domain} is created", "success")
if new_domain.startswith("https://"):
new_domain = new_domain[len("https://") :]
if SLDomain.get_by(domain=new_domain):
flash("A custom domain cannot be a built-in domain.", "error")
elif CustomDomain.get_by(domain=new_domain):
flash(f"{new_domain} already used", "error")
elif get_email_domain_part(current_user.email) == new_domain:
flash(
"You cannot add a domain that you are currently using for your personal email. "
"Please change your personal email to your real email",
"error",
)
elif Mailbox.filter(
Mailbox.verified.is_(True), Mailbox.email.endswith(f"@{new_domain}")
).first():
flash(
f"{new_domain} already used in a SimpleLogin mailbox", "error"
)
else:
new_custom_domain = CustomDomain.create(
domain=new_domain, user_id=current_user.id
)
# new domain has ownership verified if its parent has the ownership verified
for root_cd in current_user.custom_domains:
if (
new_domain.endswith("." + root_cd.domain)
and root_cd.ownership_verified
):
LOG.i(
"%s ownership verified thanks to %s",
new_custom_domain,
root_cd,
)
new_custom_domain.ownership_verified = True
Session.commit()
mailbox_ids = request.form.getlist("mailbox_ids")
if mailbox_ids:
# check if mailbox is not tempered with
mailboxes = []
for mailbox_id in mailbox_ids:
mailbox = Mailbox.get(mailbox_id)
if (
not mailbox
or mailbox.user_id != current_user.id
or not mailbox.verified
):
flash("Something went wrong, please retry", "warning")
return redirect(url_for("dashboard.custom_domain"))
mailboxes.append(mailbox)
for mailbox in mailboxes:
DomainMailbox.create(
domain_id=new_custom_domain.id, mailbox_id=mailbox.id
)
Session.commit()
flash(
f"New domain {new_custom_domain.domain} is created", "success"
)
return redirect(
url_for(
"dashboard.domain_detail_dns",
custom_domain_id=res.instance.id,
)
)
else:
flash(res.message, res.message_category)
if res.redirect:
return redirect(url_for(res.redirect))
return render_template(
"dashboard/custom_domain.html",
custom_domains=custom_domains,
new_custom_domain_form=new_custom_domain_form,
EMAIL_SERVERS_WITH_PRIORITY=EMAIL_SERVERS_WITH_PRIORITY,
errors=errors,
mailboxes=mailboxes,
)

View File

@ -8,6 +8,7 @@ from app.dashboard.base import dashboard_bp
from app.dashboard.views.enter_sudo import sudo_required
from app.log import LOG
from app.models import Subscription, Job
from app.user_audit_log_utils import emit_user_audit_log, UserAuditLogAction
class DeleteDirForm(FlaskForm):
@ -33,6 +34,11 @@ def delete_account():
# Schedule delete account job
LOG.w("schedule delete account job for %s", current_user)
emit_user_audit_log(
user=current_user,
action=UserAuditLogAction.UserMarkedForDeletion,
message=f"User {current_user.id} ({current_user.email}) marked for deletion via webapp",
)
Job.create(
name=JOB_DELETE_ACCOUNT,
payload={"user_id": current_user.id},

View File

@ -1,3 +1,5 @@
from typing import Optional
from flask import render_template, request, redirect, url_for, flash
from flask_login import login_required, current_user
from flask_wtf import FlaskForm
@ -20,6 +22,7 @@ from app.dashboard.base import dashboard_bp
from app.db import Session
from app.errors import DirectoryInTrashError
from app.models import Directory, Mailbox, DirectoryMailbox
from app.user_audit_log_utils import emit_user_audit_log, UserAuditLogAction
class NewDirForm(FlaskForm):
@ -69,7 +72,9 @@ def directory():
if not delete_dir_form.validate():
flash("Invalid request", "warning")
return redirect(url_for("dashboard.directory"))
dir_obj: Optional[Directory] = Directory.get(
delete_dir_form.directory_id.data
)
if not dir_obj:
flash("Unknown error. Refresh the page", "warning")
@ -79,6 +84,11 @@ def directory():
return redirect(url_for("dashboard.directory")) return redirect(url_for("dashboard.directory"))
name = dir_obj.name name = dir_obj.name
emit_user_audit_log(
user=current_user,
action=UserAuditLogAction.DeleteDirectory,
message=f"Delete directory {dir_obj.id} ({dir_obj.name})",
)
Directory.delete(dir_obj.id)
Session.commit()
flash(f"Directory {name} has been deleted", "success")
@ -90,7 +100,7 @@ def directory():
flash("Invalid request", "warning") flash("Invalid request", "warning")
return redirect(url_for("dashboard.directory")) return redirect(url_for("dashboard.directory"))
dir_id = toggle_dir_form.directory_id.data dir_id = toggle_dir_form.directory_id.data
dir_obj = Directory.get(dir_id) dir_obj: Optional[Directory] = Directory.get(dir_id)
if not dir_obj or dir_obj.user_id != current_user.id: if not dir_obj or dir_obj.user_id != current_user.id:
flash("Unknown error. Refresh the page", "warning") flash("Unknown error. Refresh the page", "warning")
@ -103,6 +113,11 @@ def directory():
dir_obj.disabled = True
flash(f"On-the-fly is disabled for {dir_obj.name}", "warning")
emit_user_audit_log(
user=current_user,
action=UserAuditLogAction.UpdateDirectory,
message=f"Updated directory {dir_obj.id} ({dir_obj.name}) set disabled = {dir_obj.disabled}",
)
Session.commit()
return redirect(url_for("dashboard.directory"))
@ -112,7 +127,7 @@ def directory():
flash("Invalid request", "warning") flash("Invalid request", "warning")
return redirect(url_for("dashboard.directory")) return redirect(url_for("dashboard.directory"))
dir_id = update_dir_form.directory_id.data dir_id = update_dir_form.directory_id.data
dir_obj = Directory.get(dir_id) dir_obj: Optional[Directory] = Directory.get(dir_id)
if not dir_obj or dir_obj.user_id != current_user.id: if not dir_obj or dir_obj.user_id != current_user.id:
flash("Unknown error. Refresh the page", "warning") flash("Unknown error. Refresh the page", "warning")
@ -143,6 +158,12 @@ def directory():
for mailbox in mailboxes:
DirectoryMailbox.create(directory_id=dir_obj.id, mailbox_id=mailbox.id)
mailboxes_as_str = ",".join(map(str, mailbox_ids))
emit_user_audit_log(
user=current_user,
action=UserAuditLogAction.UpdateDirectory,
message=f"Updated directory {dir_obj.id} ({dir_obj.name}) mailboxes ({mailboxes_as_str})",
)
Session.commit()
flash(f"Directory {dir_obj.name} has been updated", "success")
@ -181,6 +202,11 @@ def directory():
new_dir = Directory.create(
name=new_dir_name, user_id=current_user.id
)
emit_user_audit_log(
user=current_user,
action=UserAuditLogAction.CreateDirectory,
message=f"New directory {new_dir.name} ({new_dir.name})",
)
except DirectoryInTrashError:
flash(
f"{new_dir_name} has been used before and cannot be reused",

View File

@ -1,33 +1,26 @@
import re
import arrow
from flask import render_template, request, redirect, url_for, flash
from flask_login import login_required, current_user
from flask_wtf import FlaskForm
from wtforms import StringField, validators, IntegerField
from app.constants import DMARC_RECORD
from app.config import EMAIL_SERVERS_WITH_PRIORITY, EMAIL_DOMAIN
from app.custom_domain_utils import delete_custom_domain, set_custom_domain_mailboxes
from app.custom_domain_validation import CustomDomainValidation
from app.dashboard.base import dashboard_bp
from app.db import Session
from app.dns_utils import (
get_mx_domains,
get_spf_domain,
get_txt_record,
is_mx_equivalent,
)
from app.log import LOG
from app.models import (
CustomDomain,
Alias,
DomainDeletedAlias,
Mailbox,
DomainMailbox,
AutoCreateRule,
AutoCreateRuleMailbox,
Job,
)
from app.regex_utils import regex_match
from app.user_audit_log_utils import emit_user_audit_log, UserAuditLogAction
from app.utils import random_string, CSRFValidationForm
@ -44,13 +37,9 @@ def domain_detail_dns(custom_domain_id):
custom_domain.ownership_txt_token = random_string(30)
Session.commit()
spf_record = f"v=spf1 include:{EMAIL_DOMAIN} ~all"
domain_validator = CustomDomainValidation(EMAIL_DOMAIN)
csrf_form = CSRFValidationForm()
dmarc_record = "v=DMARC1; p=quarantine; pct=100; adkim=s; aspf=s"
mx_ok = spf_ok = dkim_ok = dmarc_ok = ownership_ok = True
mx_errors = spf_errors = dkim_errors = dmarc_errors = ownership_errors = []
@ -59,15 +48,14 @@ def domain_detail_dns(custom_domain_id):
flash("Invalid request", "warning") flash("Invalid request", "warning")
return redirect(request.url) return redirect(request.url)
if request.form.get("form-name") == "check-ownership": if request.form.get("form-name") == "check-ownership":
txt_records = get_txt_record(custom_domain.domain) ownership_validation_result = domain_validator.validate_domain_ownership(
custom_domain
if custom_domain.get_ownership_dns_txt_value() in txt_records: )
if ownership_validation_result.success:
flash( flash(
"Domain ownership is verified. Please proceed to the other records setup", "Domain ownership is verified. Please proceed to the other records setup",
"success", "success",
) )
custom_domain.ownership_verified = True
Session.commit()
return redirect( return redirect(
url_for( url_for(
"dashboard.domain_detail_dns", "dashboard.domain_detail_dns",
@ -78,36 +66,28 @@ def domain_detail_dns(custom_domain_id):
else:
flash("We can't find the needed TXT record", "error")
ownership_ok = False
ownership_errors = ownership_validation_result.errors
elif request.form.get("form-name") == "check-mx": elif request.form.get("form-name") == "check-mx":
mx_domains = get_mx_domains(custom_domain.domain) mx_validation_result = domain_validator.validate_mx_records(custom_domain)
if mx_validation_result.success:
if not is_mx_equivalent(mx_domains, EMAIL_SERVERS_WITH_PRIORITY):
flash("The MX record is not correctly set", "warning")
mx_ok = False
# build mx_errors to show to user
mx_errors = [
f"{priority} {domain}" for (priority, domain) in mx_domains
]
else:
flash(
"Your domain can start receiving emails. You can now use it to create alias",
"success",
)
custom_domain.verified = True
Session.commit()
return redirect(
url_for(
"dashboard.domain_detail_dns", custom_domain_id=custom_domain.id
)
)
else:
flash("The MX record is not correctly set", "warning")
mx_ok = False
mx_errors = mx_validation_result.errors
elif request.form.get("form-name") == "check-spf": elif request.form.get("form-name") == "check-spf":
spf_domains = get_spf_domain(custom_domain.domain) spf_validation_result = domain_validator.validate_spf_records(custom_domain)
if EMAIL_DOMAIN in spf_domains: if spf_validation_result.success:
custom_domain.spf_verified = True
Session.commit()
flash("SPF is setup correctly", "success") flash("SPF is setup correctly", "success")
return redirect( return redirect(
url_for( url_for(
@ -115,14 +95,12 @@ def domain_detail_dns(custom_domain_id):
)
)
else:
custom_domain.spf_verified = False
Session.commit()
flash(
f"SPF: {EMAIL_DOMAIN} is not included in your SPF record.",
"warning",
)
spf_ok = False
spf_errors = spf_validation_result.errors
elif request.form.get("form-name") == "check-dkim": elif request.form.get("form-name") == "check-dkim":
dkim_errors = domain_validator.validate_dkim_records(custom_domain) dkim_errors = domain_validator.validate_dkim_records(custom_domain)
@ -138,10 +116,10 @@ def domain_detail_dns(custom_domain_id):
flash("DKIM: the CNAME record is not correctly set", "warning") flash("DKIM: the CNAME record is not correctly set", "warning")
elif request.form.get("form-name") == "check-dmarc": elif request.form.get("form-name") == "check-dmarc":
txt_records = get_txt_record("_dmarc." + custom_domain.domain) dmarc_validation_result = domain_validator.validate_dmarc_records(
if dmarc_record in txt_records: custom_domain
custom_domain.dmarc_verified = True )
Session.commit() if dmarc_validation_result.success:
flash("DMARC is setup correctly", "success") flash("DMARC is setup correctly", "success")
return redirect(
url_for(
@ -149,19 +127,23 @@ def domain_detail_dns(custom_domain_id):
)
)
else:
custom_domain.dmarc_verified = False
Session.commit()
flash(
"DMARC: The TXT record is not correctly set",
"warning",
)
dmarc_ok = False
dmarc_errors = dmarc_validation_result.errors
return render_template(
"dashboard/domain_detail/dns.html",
EMAIL_SERVERS_WITH_PRIORITY=EMAIL_SERVERS_WITH_PRIORITY,
ownership_record=domain_validator.get_ownership_verification_record(
custom_domain
),
expected_mx_records=domain_validator.get_expected_mx_records(custom_domain),
dkim_records=domain_validator.get_dkim_records(custom_domain),
spf_record=domain_validator.get_expected_spf_record(custom_domain),
dmarc_record=DMARC_RECORD,
**locals(),
)
@ -183,6 +165,11 @@ def domain_detail(custom_domain_id):
return redirect(request.url)
if request.form.get("form-name") == "switch-catch-all":
custom_domain.catch_all = not custom_domain.catch_all
emit_user_audit_log(
user=current_user,
action=UserAuditLogAction.UpdateCustomDomain,
message=f"Switched custom domain {custom_domain.id} ({custom_domain.domain}) catch all to {custom_domain.catch_all}",
)
Session.commit()
if custom_domain.catch_all:
@ -201,6 +188,11 @@ def domain_detail(custom_domain_id):
elif request.form.get("form-name") == "set-name": elif request.form.get("form-name") == "set-name":
if request.form.get("action") == "save": if request.form.get("action") == "save":
custom_domain.name = request.form.get("alias-name").replace("\n", "") custom_domain.name = request.form.get("alias-name").replace("\n", "")
emit_user_audit_log(
user=current_user,
action=UserAuditLogAction.UpdateCustomDomain,
message=f"Switched custom domain {custom_domain.id} ({custom_domain.domain}) name",
)
Session.commit()
flash(
f"Default alias name for Domain {custom_domain.domain} has been set",
@ -208,6 +200,11 @@ def domain_detail(custom_domain_id):
)
else:
custom_domain.name = None
emit_user_audit_log(
user=current_user,
action=UserAuditLogAction.UpdateCustomDomain,
message=f"Cleared custom domain {custom_domain.id} ({custom_domain.domain}) name",
)
Session.commit()
flash(
f"Default alias name for Domain {custom_domain.domain} has been removed",
@ -221,6 +218,11 @@ def domain_detail(custom_domain_id):
custom_domain.random_prefix_generation = (
not custom_domain.random_prefix_generation
)
emit_user_audit_log(
user=current_user,
action=UserAuditLogAction.UpdateCustomDomain,
message=f"Switched custom domain {custom_domain.id} ({custom_domain.domain}) random prefix generation to {custom_domain.random_prefix_generation}",
)
Session.commit()
if custom_domain.random_prefix_generation:
@ -238,40 +240,16 @@ def domain_detail(custom_domain_id):
)
elif request.form.get("form-name") == "update":
mailbox_ids = request.form.getlist("mailbox_ids")
result = set_custom_domain_mailboxes(
user_id=current_user.id,
custom_domain=custom_domain,
mailbox_ids=mailbox_ids,
)
not mailbox
or mailbox.user_id != current_user.id
or not mailbox.verified
):
flash("Something went wrong, please retry", "warning")
return redirect(
url_for(
"dashboard.domain_detail", custom_domain_id=custom_domain.id
)
)
mailboxes.append(mailbox)
if result.success:
flash(f"{custom_domain.domain} mailboxes has been updated", "success")
else:
flash(result.reason.value, "warning")
"dashboard.domain_detail", custom_domain_id=custom_domain.id
)
)
# first remove all existing domain-mailboxes links
DomainMailbox.filter_by(domain_id=custom_domain.id).delete()
Session.flush()
for mailbox in mailboxes:
DomainMailbox.create(domain_id=custom_domain.id, mailbox_id=mailbox.id)
Session.commit()
flash(f"{custom_domain.domain} mailboxes has been updated", "success")
return redirect(
url_for("dashboard.domain_detail", custom_domain_id=custom_domain.id)
@ -279,16 +257,8 @@ def domain_detail(custom_domain_id):
elif request.form.get("form-name") == "delete": elif request.form.get("form-name") == "delete":
name = custom_domain.domain name = custom_domain.domain
LOG.d("Schedule deleting %s", custom_domain)
delete_custom_domain(custom_domain)
LOG.w("schedule delete domain job for %s", custom_domain)
Job.create(
name=JOB_DELETE_DOMAIN,
payload={"custom_domain_id": custom_domain.id},
run_at=arrow.now(),
commit=True,
)
flash(
f"{name} scheduled for deletion."

View File

@ -149,7 +149,9 @@ def index():
)
flash(f"Alias {email} has been deleted", "success")
elif request.form.get("form-name") == "disable-alias":
alias_utils.change_alias_status(
alias, enabled=False, message="Set enabled=False from dashboard"
)
Session.commit()
flash(f"Alias {alias.email} has been disabled", "success")

View File

@ -1,6 +1,7 @@
import base64
import binascii
import json
from typing import Optional
from flask import render_template, request, redirect, url_for, flash
from flask_login import login_required, current_user
@ -15,6 +16,7 @@ from app.dashboard.base import dashboard_bp
from app.db import Session
from app.log import LOG
from app.models import Mailbox
from app.user_audit_log_utils import emit_user_audit_log, UserAuditLogAction
from app.utils import CSRFValidationForm
@ -123,7 +125,12 @@ def mailbox_verify():
if not code:
# Old way
return verify_with_signed_secret(mailbox_id)
try:
mailbox = mailbox_utils.verify_mailbox_code(current_user, mailbox_id, code)
except mailbox_utils.MailboxError as e:
LOG.i(f"Cannot verify mailbox {mailbox_id} because of {e}")
flash(f"Cannot verify mailbox: {e.msg}", "error")
return redirect(url_for("dashboard.mailbox_route"))
LOG.d("Mailbox %s is verified", mailbox) LOG.d("Mailbox %s is verified", mailbox)
return render_template("dashboard/mailbox_validation.html", mailbox=mailbox) return render_template("dashboard/mailbox_validation.html", mailbox=mailbox)
@ -146,7 +153,7 @@ def verify_with_signed_secret(request: str):
flash("Invalid link. Please delete and re-add your mailbox", "error") flash("Invalid link. Please delete and re-add your mailbox", "error")
return redirect(url_for("dashboard.mailbox_route")) return redirect(url_for("dashboard.mailbox_route"))
mailbox_id = mailbox_data[0] mailbox_id = mailbox_data[0]
mailbox = Mailbox.get(mailbox_id) mailbox: Optional[Mailbox] = Mailbox.get(mailbox_id)
if not mailbox:
flash("Invalid link", "error")
return redirect(url_for("dashboard.mailbox_route"))
@ -156,6 +163,11 @@ def verify_with_signed_secret(request: str):
return redirect(url_for("dashboard.mailbox_route")) return redirect(url_for("dashboard.mailbox_route"))
mailbox.verified = True mailbox.verified = True
emit_user_audit_log(
user=current_user,
action=UserAuditLogAction.VerifyMailbox,
message=f"Verified mailbox {mailbox.id} ({mailbox.email})",
)
Session.commit()
LOG.d("Mailbox %s is verified", mailbox)

View File

@ -16,10 +16,11 @@ from app.db import Session
from app.email_utils import email_can_be_used_as_mailbox
from app.email_utils import mailbox_already_used, render, send_email
from app.extensions import limiter
from app.mailbox_utils import perform_mailbox_email_change, MailboxEmailChangeError
from app.models import Alias, AuthorizedAddress
from app.models import Mailbox
from app.pgp_utils import PGPException, load_public_key_and_check
from app.user_audit_log_utils import emit_user_audit_log, UserAuditLogAction
from app.utils import sanitize_email, CSRFValidationForm
@ -88,8 +89,12 @@ def mailbox_detail_route(mailbox_id):
flash("SPF enforcement globally not enabled", "error") flash("SPF enforcement globally not enabled", "error")
return redirect(url_for("dashboard.index")) return redirect(url_for("dashboard.index"))
mailbox.force_spf = ( force_spf_value = request.form.get("spf-status") == "on"
True if request.form.get("spf-status") == "on" else False mailbox.force_spf = force_spf_value
emit_user_audit_log(
user=current_user,
action=UserAuditLogAction.UpdateMailbox,
message=f"Set force_spf to {force_spf_value} on mailbox {mailbox_id} ({mailbox.email})",
)
Session.commit()
flash(
@ -113,6 +118,11 @@ def mailbox_detail_route(mailbox_id):
if AuthorizedAddress.get_by(mailbox_id=mailbox.id, email=address):
flash(f"{address} already added", "error")
else:
emit_user_audit_log(
user=current_user,
action=UserAuditLogAction.UpdateMailbox,
message=f"Add authorized address {address} to mailbox {mailbox_id} ({mailbox.email})",
)
AuthorizedAddress.create(
user_id=current_user.id,
mailbox_id=mailbox.id,
@ -133,6 +143,11 @@ def mailbox_detail_route(mailbox_id):
flash("Unknown error. Refresh the page", "warning") flash("Unknown error. Refresh the page", "warning")
else: else:
address = authorized_address.email address = authorized_address.email
emit_user_audit_log(
user=current_user,
action=UserAuditLogAction.UpdateMailbox,
message=f"Remove authorized address {address} from mailbox {mailbox_id} ({mailbox.email})",
)
AuthorizedAddress.delete(authorized_address_id)
Session.commit()
flash(f"{address} has been deleted", "success")
@ -165,6 +180,11 @@ def mailbox_detail_route(mailbox_id):
except PGPException:
flash("Cannot add the public key, please verify it", "error")
else:
emit_user_audit_log(
user=current_user,
action=UserAuditLogAction.UpdateMailbox,
message=f"Add PGP Key {mailbox.pgp_finger_print} to mailbox {mailbox_id} ({mailbox.email})",
)
Session.commit()
flash("Your PGP public key is saved successfully", "success")
return redirect(
@ -172,6 +192,11 @@ def mailbox_detail_route(mailbox_id):
)
elif request.form.get("action") == "remove":
# Free user can decide to remove their added PGP key
emit_user_audit_log(
user=current_user,
action=UserAuditLogAction.UpdateMailbox,
message=f"Remove PGP Key {mailbox.pgp_finger_print} from mailbox {mailbox_id} ({mailbox.email})",
)
mailbox.pgp_public_key = None
mailbox.pgp_finger_print = None
mailbox.disable_pgp = False
@ -191,9 +216,19 @@ def mailbox_detail_route(mailbox_id):
)
else:
mailbox.disable_pgp = False
emit_user_audit_log(
user=current_user,
action=UserAuditLogAction.UpdateMailbox,
message=f"Enabled PGP for mailbox {mailbox_id} ({mailbox.email})",
)
flash(f"PGP is enabled on {mailbox.email}", "info") flash(f"PGP is enabled on {mailbox.email}", "info")
else: else:
mailbox.disable_pgp = True mailbox.disable_pgp = True
emit_user_audit_log(
user=current_user,
action=UserAuditLogAction.UpdateMailbox,
message=f"Disabled PGP for mailbox {mailbox_id} ({mailbox.email})",
)
flash(f"PGP is disabled on {mailbox.email}", "info") flash(f"PGP is disabled on {mailbox.email}", "info")
Session.commit() Session.commit()
@ -203,6 +238,11 @@ def mailbox_detail_route(mailbox_id):
elif request.form.get("form-name") == "generic-subject": elif request.form.get("form-name") == "generic-subject":
if request.form.get("action") == "save": if request.form.get("action") == "save":
mailbox.generic_subject = request.form.get("generic-subject") mailbox.generic_subject = request.form.get("generic-subject")
emit_user_audit_log(
user=current_user,
action=UserAuditLogAction.UpdateMailbox,
message=f"Set generic subject for mailbox {mailbox_id} ({mailbox.email})",
)
Session.commit()
flash("Generic subject is enabled", "success")
return redirect(
@ -210,6 +250,11 @@ def mailbox_detail_route(mailbox_id):
)
elif request.form.get("action") == "remove":
mailbox.generic_subject = None
emit_user_audit_log(
user=current_user,
action=UserAuditLogAction.UpdateMailbox,
message=f"Remove generic subject for mailbox {mailbox_id} ({mailbox.email})",
)
Session.commit()
flash("Generic subject is disabled", "success")
return redirect(
@ -272,7 +317,7 @@ def cancel_mailbox_change_route(mailbox_id):
@dashboard_bp.route("/mailbox/confirm_change") @dashboard_bp.route("/mailbox/confirm_change")
def mailbox_confirm_change_route(): def mailbox_confirm_email_change_route():
s = TimestampSigner(MAILBOX_SECRET) s = TimestampSigner(MAILBOX_SECRET)
signed_mailbox_id = request.args.get("mailbox_id") signed_mailbox_id = request.args.get("mailbox_id")
@ -281,30 +326,20 @@ def mailbox_confirm_change_route():
except Exception:
flash("Invalid link", "error")
return redirect(url_for("dashboard.index"))
res = perform_mailbox_email_change(mailbox_id)
flash(res.message, res.message_category)
if res.error:
if res.error == MailboxEmailChangeError.EmailAlreadyUsed:
return redirect(
url_for("dashboard.mailbox_detail_route", mailbox_id=mailbox_id)
)
elif res.error == MailboxEmailChangeError.InvalidId:
flash("Invalid link", "error")
return redirect(url_for("dashboard.index"))
else:
raise Exception("Unhandled MailboxEmailChangeError")
else:
return redirect(
url_for("dashboard.mailbox_detail_route", mailbox_id=mailbox_id)
)

View File

@ -11,6 +11,7 @@ from app.dashboard.base import dashboard_bp
from app.errors import SubdomainInTrashError
from app.log import LOG
from app.models import CustomDomain, Mailbox, SLDomain
from app.user_audit_log_utils import emit_user_audit_log, UserAuditLogAction
# Only lowercase letters, numbers, dashes (-) are currently supported
_SUBDOMAIN_PATTERN = r"[0-9a-z-]{1,}"
@ -102,6 +103,12 @@ def subdomain_route():
ownership_verified=True,
commit=True,
)
emit_user_audit_log(
user=current_user,
action=UserAuditLogAction.CreateCustomDomain,
message=f"Create subdomain {new_custom_domain.id} ({full_domain})",
commit=True,
)
except SubdomainInTrashError:
flash(
f"{full_domain} has been used before and cannot be reused",

View File

@ -32,7 +32,9 @@ def unsubscribe(alias_id):
# automatic unsubscribe, according to https://tools.ietf.org/html/rfc8058
if request.method == "POST":
alias_utils.change_alias_status(
alias, enabled=False, message="Set enabled=False from unsubscribe request"
)
flash(f"Alias {alias.email} has been blocked", "success") flash(f"Alias {alias.email} has been blocked", "success")
Session.commit() Session.commit()

View File

@ -1,102 +1,22 @@
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import List, Optional
import dns.resolver
from app.config import NAMESERVERS
def _get_dns_resolver():
my_resolver = dns.resolver.Resolver()
my_resolver.nameservers = config.NAMESERVERS
return my_resolver
def get_ns(hostname) -> [str]:
try:
answers = _get_dns_resolver().resolve(hostname, "NS", search=True)
except Exception:
return []
return [a.to_text() for a in answers]
def get_cname_record(hostname) -> Optional[str]:
"""Return the CNAME record if exists for a domain, WITHOUT the trailing period at the end"""
try:
answers = _get_dns_resolver().resolve(hostname, "CNAME", search=True)
except Exception:
return None
for a in answers:
ret = a.to_text()
return ret[:-1]
return None
def get_mx_domains(hostname) -> [(int, str)]:
"""return list of (priority, domain name) sorted by priority (lowest priority first)
domain name ends with a "." at the end.
"""
try:
answers = _get_dns_resolver().resolve(hostname, "MX", search=True)
except Exception:
return []
ret = []
for a in answers:
record = a.to_text() # for ex '20 alt2.aspmx.l.google.com.'
parts = record.split(" ")
ret.append((int(parts[0]), parts[1]))
return sorted(ret, key=lambda prio_domain: prio_domain[0])
_include_spf = "include:" _include_spf = "include:"
def get_spf_domain(hostname) -> [str]: @dataclass
"""return all domains listed in *include:*""" class MxRecord:
try: priority: int
answers = _get_dns_resolver().resolve(hostname, "TXT", search=True) domain: str
except Exception:
return []
ret = []
for a in answers: # type: dns.rdtypes.ANY.TXT.TXT
for record in a.strings:
record = record.decode() # record is bytes
if record.startswith("v=spf1"):
parts = record.split(" ")
for part in parts:
if part.startswith(_include_spf):
ret.append(part[part.find(_include_spf) + len(_include_spf) :])
return ret
def get_txt_record(hostname) -> [str]:
try:
answers = _get_dns_resolver().resolve(hostname, "TXT", search=True)
except Exception:
return []
ret = []
for a in answers: # type: dns.rdtypes.ANY.TXT.TXT
for record in a.strings:
record = record.decode() # record is bytes
ret.append(record)
return ret
def is_mx_equivalent(
mx_domains: List[MxRecord], ref_mx_domains: List[MxRecord]
) -> bool:
"""
Compare mx_domains with ref_mx_domains to see if they are equivalent.
@ -105,16 +25,127 @@ def is_mx_equivalent(
The priority order is taken into account but not the priority number.
For example, [(1, domain1), (2, domain2)] is equivalent to [(10, domain1), (20, domain2)]
"""
mx_domains = sorted(mx_domains, key=lambda x: x.priority)
ref_mx_domains = sorted(ref_mx_domains, key=lambda x: x.priority)
if len(mx_domains) < len(ref_mx_domains):
return False
for actual, expected in zip(mx_domains, ref_mx_domains):
if actual.domain != expected.domain:
return False
return True
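A quick worked example of the equivalence rule (hypothetical hostnames): only the relative order of the domains matters, not the absolute priority numbers.

actual = [MxRecord(10, "mx1.example.net."), MxRecord(20, "mx2.example.net.")]
expected = [MxRecord(1, "mx1.example.net."), MxRecord(2, "mx2.example.net.")]
assert is_mx_equivalent(actual, expected)

# A missing or different domain breaks equivalence.
assert not is_mx_equivalent([MxRecord(10, "mx1.example.net.")], expected)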
class DNSClient(ABC):
@abstractmethod
def get_cname_record(self, hostname: str) -> Optional[str]:
pass
@abstractmethod
def get_mx_domains(self, hostname: str) -> List[MxRecord]:
pass
def get_spf_domain(self, hostname: str) -> List[str]:
"""
return all domains listed in *include:*
"""
try:
records = self.get_txt_record(hostname)
ret = []
for record in records:
if record.startswith("v=spf1"):
parts = record.split(" ")
for part in parts:
if part.startswith(_include_spf):
ret.append(
part[part.find(_include_spf) + len(_include_spf) :]
)
return ret
except Exception:
return []
@abstractmethod
def get_txt_record(self, hostname: str) -> List[str]:
pass
class NetworkDNSClient(DNSClient):
def __init__(self, nameservers: List[str]):
self._resolver = dns.resolver.Resolver()
self._resolver.nameservers = nameservers
def get_cname_record(self, hostname: str) -> Optional[str]:
"""
Return the CNAME record if exists for a domain, WITHOUT the trailing period at the end
"""
try:
answers = self._resolver.resolve(hostname, "CNAME", search=True)
for a in answers:
ret = a.to_text()
return ret[:-1]
except Exception:
return None
def get_mx_domains(self, hostname: str) -> List[MxRecord]:
"""
return list of (priority, domain name) sorted by priority (lowest priority first)
domain name ends with a "." at the end.
"""
try:
answers = self._resolver.resolve(hostname, "MX", search=True)
ret = []
for a in answers:
record = a.to_text() # for ex '20 alt2.aspmx.l.google.com.'
parts = record.split(" ")
ret.append(MxRecord(priority=int(parts[0]), domain=parts[1]))
return sorted(ret, key=lambda x: x.priority)
except Exception:
return []
def get_txt_record(self, hostname: str) -> List[str]:
try:
answers = self._resolver.resolve(hostname, "TXT", search=False)
ret = []
for a in answers: # type: dns.rdtypes.ANY.TXT.TXT
for record in a.strings:
ret.append(record.decode())
return ret
except Exception:
return []
class InMemoryDNSClient(DNSClient):
def __init__(self):
self.cname_records: dict[str, Optional[str]] = {}
self.mx_records: dict[str, List[MxRecord]] = {}
self.spf_records: dict[str, List[str]] = {}
self.txt_records: dict[str, List[str]] = {}
def set_cname_record(self, hostname: str, cname: str):
self.cname_records[hostname] = cname
def set_mx_records(self, hostname: str, mx_list: List[MxRecord]):
self.mx_records[hostname] = mx_list
def set_txt_record(self, hostname: str, txt_list: List[str]):
self.txt_records[hostname] = txt_list
def get_cname_record(self, hostname: str) -> Optional[str]:
return self.cname_records.get(hostname)
def get_mx_domains(self, hostname: str) -> List[MxRecord]:
mx_list = self.mx_records.get(hostname, [])
return sorted(mx_list, key=lambda x: x.priority)
def get_txt_record(self, hostname: str) -> List[str]:
return self.txt_records.get(hostname, [])
def get_network_dns_client() -> NetworkDNSClient:
return NetworkDNSClient(NAMESERVERS)
def get_mx_domains(hostname: str) -> List[MxRecord]:
return get_network_dns_client().get_mx_domains(hostname)

View File

@ -592,7 +592,7 @@ def email_can_be_used_as_mailbox(email_address: str) -> bool:
from app.models import CustomDomain
if CustomDomain.get_by(domain=domain, is_sl_subdomain=True, verified=True):
LOG.d("domain %s is a SimpleLogin custom domain", domain)
return False
@ -657,7 +657,7 @@ def get_mx_domain_list(domain) -> [str]:
""" """
priority_domains = get_mx_domains(domain) priority_domains = get_mx_domains(domain)
return [d[:-1] for _, d in priority_domains] return [d.domain[:-1] for d in priority_domains]
def personal_email_already_used(email_address: str) -> bool: def personal_email_already_used(email_address: str) -> bool:

View File

@ -1,8 +1,12 @@
from abc import ABC, abstractmethod
import newrelic.agent
from app import config
from app.db import Session
from app.errors import ProtonPartnerNotSetUp
from app.events.generated import event_pb2
from app.log import LOG
from app.models import User, PartnerUser, SyncEvent
from app.proton.utils import get_proton_partner
from typing import Optional
@ -26,26 +30,43 @@ class PostgresDispatcher(Dispatcher):
return PostgresDispatcher()
class GlobalDispatcher:
__dispatcher: Optional[Dispatcher] = None
@staticmethod
def get_dispatcher() -> Dispatcher:
if not GlobalDispatcher.__dispatcher:
GlobalDispatcher.__dispatcher = PostgresDispatcher.get()
return GlobalDispatcher.__dispatcher
@staticmethod
def set_dispatcher(dispatcher: Optional[Dispatcher]):
GlobalDispatcher.__dispatcher = dispatcher
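A hedged sketch of how the new GlobalDispatcher seam could be used in tests: a purely illustrative in-memory Dispatcher (not part of this changeset) that only relies on the send() method EventDispatcher calls below.

class MemoryDispatcher(Dispatcher):
    """Collects serialized events instead of writing them to Postgres."""

    def __init__(self):
        self.events: list[bytes] = []

    def send(self, event: bytes):
        self.events.append(event)

# In a test setup (illustrative only):
#   dispatcher = MemoryDispatcher()
#   GlobalDispatcher.set_dispatcher(dispatcher)
#   ... code under test calls EventDispatcher.send_event(...)
#   assert len(dispatcher.events) == 1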
class EventDispatcher:
@staticmethod
def send_event(
user: User,
content: event_pb2.EventContent,
dispatcher: Optional[Dispatcher] = None,
skip_if_webhook_missing: bool = True,
):
if dispatcher is None:
dispatcher = GlobalDispatcher.get_dispatcher()
if config.EVENT_WEBHOOK_DISABLE: if config.EVENT_WEBHOOK_DISABLE:
LOG.i("Not sending events because webhook is disabled")
return return
if not config.EVENT_WEBHOOK and skip_if_webhook_missing: if not config.EVENT_WEBHOOK and skip_if_webhook_missing:
LOG.i(
"Not sending events because webhook is not configured and allowed to be empty"
)
return return
if config.EVENT_WEBHOOK_ENABLED_USER_IDS is not None:
if user.id not in config.EVENT_WEBHOOK_ENABLED_USER_IDS:
return
partner_user = EventDispatcher.__partner_user(user.id) partner_user = EventDispatcher.__partner_user(user.id)
if not partner_user: if not partner_user:
LOG.i(f"Not sending events because there's no partner user for user {user}")
return return
event = event_pb2.Event( event = event_pb2.Event(
@ -58,6 +79,10 @@ class EventDispatcher:
serialized = event.SerializeToString() serialized = event.SerializeToString()
dispatcher.send(serialized) dispatcher.send(serialized)
event_type = content.WhichOneof("content")
newrelic.agent.record_custom_event("EventStoredToDb", {"type": event_type})
LOG.i("Sent event to the dispatcher")
@staticmethod @staticmethod
def __partner_user(user_id: int) -> Optional[PartnerUser]: def __partner_user(user_id: int) -> Optional[PartnerUser]:
# Check if the current user has a partner_id # Check if the current user has a partner_id
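The new GlobalDispatcher indirection means callers and tests no longer have to thread a dispatcher argument through every send_event call: they can install an implementation once. A hedged sketch, assuming the Dispatcher ABC and GlobalDispatcher API introduced above live in app.events.event_dispatcher:

# Hypothetical sketch: capture dispatched events in memory instead of Postgres.
from app.events.event_dispatcher import Dispatcher, GlobalDispatcher

class InMemoryDispatcher(Dispatcher):
    def __init__(self):
        self.memory: list[bytes] = []

    def send(self, event: bytes):
        # Store the serialized protobuf payload for later assertions
        self.memory.append(event)

def install_test_dispatcher() -> InMemoryDispatcher:
    dispatcher = InMemoryDispatcher()
    GlobalDispatcher.set_dispatcher(dispatcher)
    return dispatcher

# After the test, reset so production code falls back to PostgresDispatcher:
# GlobalDispatcher.set_dispatcher(None)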

View File

@ -24,7 +24,7 @@ _sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x0b\x65vent.proto\x12\x12simplelogin_events\"(\n\x0fUserPlanChanged\x12\x15\n\rplan_end_time\x18\x01 \x01(\r\"\r\n\x0bUserDeleted\"Z\n\x0c\x41liasCreated\x12\x10\n\x08\x61lias_id\x18\x01 \x01(\r\x12\x13\n\x0b\x61lias_email\x18\x02 \x01(\t\x12\x12\n\nalias_note\x18\x03 \x01(\t\x12\x0f\n\x07\x65nabled\x18\x04 \x01(\x08\"L\n\x12\x41liasStatusChanged\x12\x10\n\x08\x61lias_id\x18\x01 \x01(\r\x12\x13\n\x0b\x61lias_email\x18\x02 \x01(\t\x12\x0f\n\x07\x65nabled\x18\x03 \x01(\x08\"5\n\x0c\x41liasDeleted\x12\x10\n\x08\x61lias_id\x18\x01 \x01(\r\x12\x13\n\x0b\x61lias_email\x18\x02 \x01(\t\"D\n\x10\x41liasCreatedList\x12\x30\n\x06\x65vents\x18\x01 \x03(\x0b\x32 .simplelogin_events.AliasCreated\"\x93\x03\n\x0c\x45ventContent\x12?\n\x10user_plan_change\x18\x01 \x01(\x0b\x32#.simplelogin_events.UserPlanChangedH\x00\x12\x37\n\x0cuser_deleted\x18\x02 \x01(\x0b\x32\x1f.simplelogin_events.UserDeletedH\x00\x12\x39\n\ralias_created\x18\x03 \x01(\x0b\x32 .simplelogin_events.AliasCreatedH\x00\x12\x45\n\x13\x61lias_status_change\x18\x04 \x01(\x0b\x32&.simplelogin_events.AliasStatusChangedH\x00\x12\x39\n\ralias_deleted\x18\x05 \x01(\x0b\x32 .simplelogin_events.AliasDeletedH\x00\x12\x41\n\x11\x61lias_create_list\x18\x06 \x01(\x0b\x32$.simplelogin_events.AliasCreatedListH\x00\x42\t\n\x07\x63ontent\"y\n\x05\x45vent\x12\x0f\n\x07user_id\x18\x01 \x01(\r\x12\x18\n\x10\x65xternal_user_id\x18\x02 \x01(\t\x12\x12\n\npartner_id\x18\x03 \x01(\r\x12\x31\n\x07\x63ontent\x18\x04 \x01(\x0b\x32 .simplelogin_events.EventContentb\x06proto3') DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x0b\x65vent.proto\x12\x12simplelogin_events\"(\n\x0fUserPlanChanged\x12\x15\n\rplan_end_time\x18\x01 \x01(\r\"\r\n\x0bUserDeleted\"\\\n\x0c\x41liasCreated\x12\n\n\x02id\x18\x01 \x01(\r\x12\r\n\x05\x65mail\x18\x02 \x01(\t\x12\x0c\n\x04note\x18\x03 \x01(\t\x12\x0f\n\x07\x65nabled\x18\x04 \x01(\x08\x12\x12\n\ncreated_at\x18\x05 \x01(\r\"T\n\x12\x41liasStatusChanged\x12\n\n\x02id\x18\x01 \x01(\r\x12\r\n\x05\x65mail\x18\x02 \x01(\t\x12\x0f\n\x07\x65nabled\x18\x03 \x01(\x08\x12\x12\n\ncreated_at\x18\x04 \x01(\r\")\n\x0c\x41liasDeleted\x12\n\n\x02id\x18\x01 \x01(\r\x12\r\n\x05\x65mail\x18\x02 \x01(\t\"D\n\x10\x41liasCreatedList\x12\x30\n\x06\x65vents\x18\x01 \x03(\x0b\x32 .simplelogin_events.AliasCreated\"\x93\x03\n\x0c\x45ventContent\x12?\n\x10user_plan_change\x18\x01 \x01(\x0b\x32#.simplelogin_events.UserPlanChangedH\x00\x12\x37\n\x0cuser_deleted\x18\x02 \x01(\x0b\x32\x1f.simplelogin_events.UserDeletedH\x00\x12\x39\n\ralias_created\x18\x03 \x01(\x0b\x32 .simplelogin_events.AliasCreatedH\x00\x12\x45\n\x13\x61lias_status_change\x18\x04 \x01(\x0b\x32&.simplelogin_events.AliasStatusChangedH\x00\x12\x39\n\ralias_deleted\x18\x05 \x01(\x0b\x32 .simplelogin_events.AliasDeletedH\x00\x12\x41\n\x11\x61lias_create_list\x18\x06 \x01(\x0b\x32$.simplelogin_events.AliasCreatedListH\x00\x42\t\n\x07\x63ontent\"y\n\x05\x45vent\x12\x0f\n\x07user_id\x18\x01 \x01(\r\x12\x18\n\x10\x65xternal_user_id\x18\x02 \x01(\t\x12\x12\n\npartner_id\x18\x03 \x01(\r\x12\x31\n\x07\x63ontent\x18\x04 \x01(\x0b\x32 .simplelogin_events.EventContentb\x06proto3')
_globals = globals() _globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals) _builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
@@ -36,15 +36,15 @@ if not _descriptor._USE_C_DESCRIPTORS:
_globals['_USERDELETED']._serialized_start=77
_globals['_USERDELETED']._serialized_end=90
_globals['_ALIASCREATED']._serialized_start=92
- _globals['_ALIASCREATED']._serialized_end=182
+ _globals['_ALIASCREATED']._serialized_end=184
- _globals['_ALIASSTATUSCHANGED']._serialized_start=184
+ _globals['_ALIASSTATUSCHANGED']._serialized_start=186
- _globals['_ALIASSTATUSCHANGED']._serialized_end=260
+ _globals['_ALIASSTATUSCHANGED']._serialized_end=270
- _globals['_ALIASDELETED']._serialized_start=262
+ _globals['_ALIASDELETED']._serialized_start=272
- _globals['_ALIASDELETED']._serialized_end=315
+ _globals['_ALIASDELETED']._serialized_end=313
- _globals['_ALIASCREATEDLIST']._serialized_start=317
+ _globals['_ALIASCREATEDLIST']._serialized_start=315
- _globals['_ALIASCREATEDLIST']._serialized_end=385
+ _globals['_ALIASCREATEDLIST']._serialized_end=383
- _globals['_EVENTCONTENT']._serialized_start=388
+ _globals['_EVENTCONTENT']._serialized_start=386
- _globals['_EVENTCONTENT']._serialized_end=791
+ _globals['_EVENTCONTENT']._serialized_end=789
- _globals['_EVENT']._serialized_start=793
+ _globals['_EVENT']._serialized_start=791
- _globals['_EVENT']._serialized_end=914
+ _globals['_EVENT']._serialized_end=912
# @@protoc_insertion_point(module_scope)

View File

@ -16,34 +16,38 @@ class UserDeleted(_message.Message):
def __init__(self) -> None: ... def __init__(self) -> None: ...
class AliasCreated(_message.Message): class AliasCreated(_message.Message):
__slots__ = ("alias_id", "alias_email", "alias_note", "enabled") __slots__ = ("id", "email", "note", "enabled", "created_at")
ALIAS_ID_FIELD_NUMBER: _ClassVar[int] ID_FIELD_NUMBER: _ClassVar[int]
ALIAS_EMAIL_FIELD_NUMBER: _ClassVar[int] EMAIL_FIELD_NUMBER: _ClassVar[int]
ALIAS_NOTE_FIELD_NUMBER: _ClassVar[int] NOTE_FIELD_NUMBER: _ClassVar[int]
ENABLED_FIELD_NUMBER: _ClassVar[int] ENABLED_FIELD_NUMBER: _ClassVar[int]
alias_id: int CREATED_AT_FIELD_NUMBER: _ClassVar[int]
alias_email: str id: int
alias_note: str email: str
note: str
enabled: bool enabled: bool
def __init__(self, alias_id: _Optional[int] = ..., alias_email: _Optional[str] = ..., alias_note: _Optional[str] = ..., enabled: bool = ...) -> None: ... created_at: int
def __init__(self, id: _Optional[int] = ..., email: _Optional[str] = ..., note: _Optional[str] = ..., enabled: bool = ..., created_at: _Optional[int] = ...) -> None: ...
class AliasStatusChanged(_message.Message): class AliasStatusChanged(_message.Message):
__slots__ = ("alias_id", "alias_email", "enabled") __slots__ = ("id", "email", "enabled", "created_at")
ALIAS_ID_FIELD_NUMBER: _ClassVar[int] ID_FIELD_NUMBER: _ClassVar[int]
ALIAS_EMAIL_FIELD_NUMBER: _ClassVar[int] EMAIL_FIELD_NUMBER: _ClassVar[int]
ENABLED_FIELD_NUMBER: _ClassVar[int] ENABLED_FIELD_NUMBER: _ClassVar[int]
alias_id: int CREATED_AT_FIELD_NUMBER: _ClassVar[int]
alias_email: str id: int
email: str
enabled: bool enabled: bool
def __init__(self, alias_id: _Optional[int] = ..., alias_email: _Optional[str] = ..., enabled: bool = ...) -> None: ... created_at: int
def __init__(self, id: _Optional[int] = ..., email: _Optional[str] = ..., enabled: bool = ..., created_at: _Optional[int] = ...) -> None: ...
class AliasDeleted(_message.Message): class AliasDeleted(_message.Message):
__slots__ = ("alias_id", "alias_email") __slots__ = ("id", "email")
ALIAS_ID_FIELD_NUMBER: _ClassVar[int] ID_FIELD_NUMBER: _ClassVar[int]
ALIAS_EMAIL_FIELD_NUMBER: _ClassVar[int] EMAIL_FIELD_NUMBER: _ClassVar[int]
alias_id: int id: int
alias_email: str email: str
def __init__(self, alias_id: _Optional[int] = ..., alias_email: _Optional[str] = ...) -> None: ... def __init__(self, id: _Optional[int] = ..., email: _Optional[str] = ...) -> None: ...
class AliasCreatedList(_message.Message): class AliasCreatedList(_message.Message):
__slots__ = ("events",) __slots__ = ("events",)

View File

@@ -103,7 +103,9 @@ class UnsubscribeHandler:
):
return status.E509
LOG.i(f"User disabled alias {alias} via unsubscribe header")
- alias_utils.change_alias_status(alias, enabled=False)
+ alias_utils.change_alias_status(
+ alias, enabled=False, message="Set enabled=False via unsubscribe header"
+ )
Session.commit()
enable_alias_url = config.URL + f"/dashboard/?highlight_alias_id={alias.id}"
for mailbox in alias.mailboxes:

View File

@@ -1,3 +1,5 @@
+ import newrelic.agent
from app.events.event_dispatcher import EventDispatcher, Dispatcher
from app.events.generated.event_pb2 import EventContent, AliasCreated, AliasCreatedList
from app.log import LOG
@@ -12,6 +14,7 @@ def send_alias_creation_events_for_user(
return
chunk_size = min(chunk_size, 50)
event_list = []
+ LOG.i(f"Sending alias create events for user {user}")
for alias in (
Alias.yield_per_query(chunk_size)
.filter_by(user_id=user.id)
@@ -19,22 +22,31 @@
):
event_list.append(
AliasCreated(
- alias_id=alias.id,
- alias_email=alias.email,
- alias_note=alias.note,
+ id=alias.id,
+ email=alias.email,
+ note=alias.note,
enabled=alias.enabled,
+ created_at=int(alias.created_at.timestamp),
)
)
if len(event_list) >= chunk_size:
+ LOG.i(f"Sending {len(event_list)} alias create event for {user}")
EventDispatcher.send_event(
user,
EventContent(alias_create_list=AliasCreatedList(events=event_list)),
dispatcher=dispatcher,
)
+ newrelic.agent.record_custom_metric(
+ "Custom/event_alias_created_event", len(event_list)
+ )
event_list = []
if len(event_list) > 0:
+ LOG.i(f"Sending {len(event_list)} alias create event for {user}")
EventDispatcher.send_event(
user,
EventContent(alias_create_list=AliasCreatedList(events=event_list)),
dispatcher=dispatcher,
)
+ newrelic.agent.record_custom_metric(
+ "Custom/event_alias_created_event", len(event_list)
+ )

View File

@@ -1,6 +1,7 @@
import dataclasses
import secrets
import random
+ from enum import Enum
from typing import Optional
import arrow
@@ -16,6 +17,7 @@ from app.email_utils import (
from app.email_validation import is_valid_email
from app.log import LOG
from app.models import User, Mailbox, Job, MailboxActivation
+ from app.user_audit_log_utils import emit_user_audit_log, UserAuditLogAction
@dataclasses.dataclass
@@ -70,9 +72,15 @@ def create_mailbox(
f"User {user} has tried to create mailbox with {email} but email is invalid"
)
raise MailboxError("Invalid email")
- new_mailbox = Mailbox.create(
+ new_mailbox: Mailbox = Mailbox.create(
email=email, user_id=user.id, verified=verified, commit=True
)
+ emit_user_audit_log(
+ user=user,
+ action=UserAuditLogAction.CreateMailbox,
+ message=f"Create mailbox {new_mailbox.id} ({new_mailbox.email}). Verified={verified}",
+ commit=True,
+ )
if verified:
LOG.i(f"User {user} as created a pre-verified mailbox with {email}")
@@ -129,7 +137,7 @@ def delete_mailbox(
if not transfer_mailbox.verified:
LOG.i(f"User {user} has tried to transfer to a non verified mailbox")
- MailboxError("Your new mailbox is not verified")
+ raise MailboxError("Your new mailbox is not verified")
# Schedule delete account job
LOG.i(
@@ -204,6 +212,11 @@ def verify_mailbox_code(user: User, mailbox_id: int, code: str) -> Mailbox:
raise CannotVerifyError("Invalid activation code")
LOG.i(f"User {user} has verified mailbox {mailbox_id}")
mailbox.verified = True
+ emit_user_audit_log(
+ user=user,
+ action=UserAuditLogAction.VerifyMailbox,
+ message=f"Verify mailbox {mailbox_id} ({mailbox.email})",
+ )
clear_activation_codes_for_mailbox(mailbox)
return mailbox
@@ -213,7 +226,10 @@ def generate_activation_code(
) -> MailboxActivation:
clear_activation_codes_for_mailbox(mailbox)
if use_digit_code:
- code = "{:06d}".format(random.randint(1, 999999))
+ if config.MAILBOX_VERIFICATION_OVERRIDE_CODE:
+ code = config.MAILBOX_VERIFICATION_OVERRIDE_CODE
+ else:
+ code = "{:06d}".format(random.randint(1, 999999))
else:
code = secrets.token_urlsafe(16)
return MailboxActivation.create(
@@ -258,3 +274,54 @@ def send_verification_email(
mailbox_email=mailbox.email,
),
)
class MailboxEmailChangeError(Enum):
InvalidId = 1
EmailAlreadyUsed = 2
@dataclasses.dataclass
class MailboxEmailChangeResult:
error: Optional[MailboxEmailChangeError]
message: str
message_category: str
def perform_mailbox_email_change(mailbox_id: int) -> MailboxEmailChangeResult:
mailbox: Optional[Mailbox] = Mailbox.get(mailbox_id)
# new_email can be None if user cancels change in the meantime
if mailbox and mailbox.new_email:
user = mailbox.user
if Mailbox.get_by(email=mailbox.new_email, user_id=user.id):
return MailboxEmailChangeResult(
error=MailboxEmailChangeError.EmailAlreadyUsed,
message=f"{mailbox.new_email} is already used",
message_category="error",
)
emit_user_audit_log(
user=user,
action=UserAuditLogAction.UpdateMailbox,
message=f"Change mailbox email for mailbox {mailbox_id} (old={mailbox.email} | new={mailbox.new_email})",
)
mailbox.email = mailbox.new_email
mailbox.new_email = None
# mark mailbox as verified if the change request is sent from an unverified mailbox
mailbox.verified = True
Session.commit()
LOG.d("Mailbox change %s is verified", mailbox)
return MailboxEmailChangeResult(
error=None,
message=f"The {mailbox.email} is updated",
message_category="success",
)
else:
return MailboxEmailChangeResult(
error=MailboxEmailChangeError.InvalidId,
message="Invalid link",
message_category="error",
)
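The change to generate_activation_code lets a deployment pin the six-digit code, which is mainly useful for local or end-to-end test environments where reading real email is awkward. A hedged sketch of the intended behaviour, assuming generate_activation_code takes the mailbox and a use_digit_code flag as the hunk suggests, and that the config value defaults to a falsy setting:

# Hypothetical test sketch for MAILBOX_VERIFICATION_OVERRIDE_CODE.
# `mailbox` is assumed to be a persisted Mailbox (e.g. a pytest fixture).
from app import config
from app.mailbox_utils import generate_activation_code

def test_override_code_is_used_for_digit_codes(mailbox):
    config.MAILBOX_VERIFICATION_OVERRIDE_CODE = "000000"
    try:
        activation = generate_activation_code(mailbox, use_digit_code=True)
        assert activation.code == "000000"
    finally:
        # Restore the default (assumed None) so other tests get random codes again
        config.MAILBOX_VERIFICATION_OVERRIDE_CODE = None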

View File

@@ -336,7 +336,7 @@ class Fido(Base, ModelMixin):
class User(Base, ModelMixin, UserMixin, PasswordOracle):
__tablename__ = "users"
- FLAG_FREE_DISABLE_CREATE_ALIAS = 1 << 0
+ FLAG_DISABLE_CREATE_CONTACTS = 1 << 0
FLAG_CREATED_FROM_PARTNER = 1 << 1
FLAG_FREE_OLD_ALIAS_LIMIT = 1 << 2
FLAG_CREATED_ALIAS_FROM_PARTNER = 1 << 3
@@ -543,7 +543,7 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
# bitwise flags. Allow for future expansion
flags = sa.Column(
sa.BigInteger,
- default=FLAG_FREE_DISABLE_CREATE_ALIAS,
+ default=FLAG_DISABLE_CREATE_CONTACTS,
server_default="0",
nullable=False,
)
@@ -973,7 +973,7 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
def has_custom_domain(self):
return CustomDomain.filter_by(user_id=self.id, verified=True).count() > 0
- def custom_domains(self):
+ def custom_domains(self) -> List["CustomDomain"]:
return CustomDomain.filter_by(user_id=self.id, verified=True).all()
def available_domains_for_random_alias(
@@ -1168,7 +1168,7 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
def can_create_contacts(self) -> bool:
if self.is_premium():
return True
- if self.flags & User.FLAG_FREE_DISABLE_CREATE_ALIAS == 0:
+ if self.flags & User.FLAG_DISABLE_CREATE_CONTACTS == 0:
return True
return not config.DISABLE_CREATE_CONTACTS_FOR_FREE_USERS
@ -1660,18 +1660,6 @@ class Alias(Base, ModelMixin):
Session.add(new_alias) Session.add(new_alias)
DailyMetric.get_or_create_today_metric().nb_alias += 1 DailyMetric.get_or_create_today_metric().nb_alias += 1
# Internal import to avoid global import cycles
from app.events.event_dispatcher import EventDispatcher
from app.events.generated.event_pb2 import AliasCreated, EventContent
event = AliasCreated(
alias_id=new_alias.id,
alias_email=new_alias.email,
alias_note=new_alias.note,
enabled=True,
)
EventDispatcher.send_event(user, EventContent(alias_created=event))
if ( if (
new_alias.flags & cls.FLAG_PARTNER_CREATED > 0 new_alias.flags & cls.FLAG_PARTNER_CREATED > 0
and new_alias.user.flags & User.FLAG_CREATED_ALIAS_FROM_PARTNER == 0 and new_alias.user.flags & User.FLAG_CREATED_ALIAS_FROM_PARTNER == 0
@ -1684,6 +1672,23 @@ class Alias(Base, ModelMixin):
if flush: if flush:
Session.flush() Session.flush()
# Internal import to avoid global import cycles
from app.alias_audit_log_utils import AliasAuditLogAction, emit_alias_audit_log
from app.events.event_dispatcher import EventDispatcher
from app.events.generated.event_pb2 import AliasCreated, EventContent
event = AliasCreated(
id=new_alias.id,
email=new_alias.email,
note=new_alias.note,
enabled=True,
created_at=int(new_alias.created_at.timestamp),
)
EventDispatcher.send_event(user, EventContent(alias_created=event))
emit_alias_audit_log(
new_alias, AliasAuditLogAction.CreateAlias, "New alias created"
)
return new_alias return new_alias
@classmethod @classmethod
@ -1862,6 +1867,8 @@ class Contact(Base, ModelMixin):
MAX_NAME_LENGTH = 512 MAX_NAME_LENGTH = 512
FLAG_PARTNER_CREATED = 1 << 0
__tablename__ = "contact" __tablename__ = "contact"
__table_args__ = ( __table_args__ = (
@ -1920,6 +1927,9 @@ class Contact(Base, ModelMixin):
# whether contact is created automatically during the forward phase # whether contact is created automatically during the forward phase
automatic_created = sa.Column(sa.Boolean, nullable=True, default=False) automatic_created = sa.Column(sa.Boolean, nullable=True, default=False)
# contact flags
flags = sa.Column(sa.Integer, nullable=False, default=0, server_default="0")
@property @property
def email(self): def email(self):
return self.website_email return self.website_email
@ -2418,6 +2428,18 @@ class CustomDomain(Base, ModelMixin):
sa.Boolean, nullable=False, default=False, server_default="0" sa.Boolean, nullable=False, default=False, server_default="0"
) )
partner_id = sa.Column(
sa.Integer,
sa.ForeignKey("partner.id"),
nullable=True,
default=None,
server_default=None,
)
pending_deletion = sa.Column(
sa.Boolean, nullable=False, default=False, server_default="0"
)
__table_args__ = ( __table_args__ = (
Index( Index(
"ix_unique_domain", # Index name "ix_unique_domain", # Index name
@ -2425,6 +2447,8 @@ class CustomDomain(Base, ModelMixin):
unique=True, unique=True,
postgresql_where=Column("ownership_verified"), postgresql_where=Column("ownership_verified"),
), # The condition ), # The condition
Index("ix_custom_domain_user_id", "user_id"),
Index("ix_custom_domain_pending_deletion", "pending_deletion"),
) )
user = orm.relationship(User, foreign_keys=[user_id], backref="custom_domains") user = orm.relationship(User, foreign_keys=[user_id], backref="custom_domains")
@ -2442,9 +2466,6 @@ class CustomDomain(Base, ModelMixin):
def get_trash_url(self): def get_trash_url(self):
return config.URL + f"/dashboard/domains/{self.id}/trash" return config.URL + f"/dashboard/domains/{self.id}/trash"
def get_ownership_dns_txt_value(self):
return f"sl-verification={self.ownership_txt_token}"
@classmethod @classmethod
def create(cls, **kwargs): def create(cls, **kwargs):
domain = kwargs.get("domain") domain = kwargs.get("domain")
@@ -2749,9 +2770,9 @@ class Mailbox(Base, ModelMixin):
from app.email_utils import get_email_local_part
- mx_domains: [(int, str)] = get_mx_domains(get_email_local_part(self.email))
+ mx_domains = get_mx_domains(get_email_local_part(self.email))
# Proton is the first domain
- if mx_domains and mx_domains[0][1] in (
+ if mx_domains and mx_domains[0].domain in (
"mail.protonmail.ch.",
"mailsec.protonmail.ch.",
):
@@ -3750,15 +3771,14 @@ class SyncEvent(Base, ModelMixin):
sa.Index("ix_sync_event_taken_time", "taken_time"),
)
- def mark_as_taken(self) -> bool:
+ def mark_as_taken(self, allow_taken_older_than: Optional[Arrow] = None) -> bool:
- sql = """
- UPDATE sync_event
- SET taken_time = :taken_time
- WHERE id = :sync_event_id
- AND taken_time IS NULL
- """
+ taken_condition = ["taken_time IS NULL"]
args = {"taken_time": arrow.now().datetime, "sync_event_id": self.id}
+ if allow_taken_older_than:
+ taken_condition.append("taken_time < :taken_older_than")
+ args["taken_older_than"] = allow_taken_older_than.datetime
+ sql_taken_condition = "({})".format(" OR ".join(taken_condition))
+ sql = f"UPDATE sync_event SET taken_time = :taken_time WHERE id = :sync_event_id AND {sql_taken_condition}"
res = Session.execute(sql, args)
Session.commit()
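The rewritten mark_as_taken builds its WHERE clause dynamically so that a row can also be re-claimed when a previous worker took it but never finished. A standalone sketch of just the string-building part (no database involved), to make the generated SQL explicit:

# Illustration only: reproduces the condition-building logic of mark_as_taken.
def build_taken_sql(allow_taken_older_than_set: bool) -> str:
    taken_condition = ["taken_time IS NULL"]
    if allow_taken_older_than_set:
        # A stale claim older than the threshold may be taken over
        taken_condition.append("taken_time < :taken_older_than")
    sql_taken_condition = "({})".format(" OR ".join(taken_condition))
    return (
        "UPDATE sync_event SET taken_time = :taken_time "
        f"WHERE id = :sync_event_id AND {sql_taken_condition}"
    )

print(build_taken_sql(False))
# ... WHERE id = :sync_event_id AND (taken_time IS NULL)
print(build_taken_sql(True))
# ... WHERE id = :sync_event_id AND (taken_time IS NULL OR taken_time < :taken_older_than)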
@ -3784,3 +3804,39 @@ class SyncEvent(Base, ModelMixin):
.limit(100) .limit(100)
.all() .all()
) )
class AliasAuditLog(Base, ModelMixin):
"""This model holds an audit log for all the actions performed to an alias"""
__tablename__ = "alias_audit_log"
user_id = sa.Column(sa.Integer, nullable=False)
alias_id = sa.Column(sa.Integer, nullable=False)
alias_email = sa.Column(sa.String(255), nullable=False)
action = sa.Column(sa.String(255), nullable=False)
message = sa.Column(sa.Text, default=None, nullable=True)
__table_args__ = (
sa.Index("ix_alias_audit_log_user_id", "user_id"),
sa.Index("ix_alias_audit_log_alias_id", "alias_id"),
sa.Index("ix_alias_audit_log_alias_email", "alias_email"),
sa.Index("ix_alias_audit_log_created_at", "created_at"),
)
class UserAuditLog(Base, ModelMixin):
"""This model holds an audit log for all the actions performed by a user"""
__tablename__ = "user_audit_log"
user_id = sa.Column(sa.Integer, nullable=False)
user_email = sa.Column(sa.String(255), nullable=False)
action = sa.Column(sa.String(255), nullable=False)
message = sa.Column(sa.Text, default=None, nullable=True)
__table_args__ = (
sa.Index("ix_user_audit_log_user_id", "user_id"),
sa.Index("ix_user_audit_log_user_email", "user_email"),
sa.Index("ix_user_audit_log_created_at", "created_at"),
)
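The rename from FLAG_FREE_DISABLE_CREATE_ALIAS to FLAG_DISABLE_CREATE_CONTACTS is cosmetic at the storage level: the bit position (1 << 0) is unchanged, so existing rows keep their meaning. For readers less used to bitmask columns, a small self-contained illustration of how flags like User.flags and the new Contact.flags combine and are tested:

# Standalone illustration of the bitmask convention used by the flags columns.
FLAG_DISABLE_CREATE_CONTACTS = 1 << 0     # 0b0001
FLAG_CREATED_FROM_PARTNER = 1 << 1        # 0b0010
FLAG_FREE_OLD_ALIAS_LIMIT = 1 << 2        # 0b0100
FLAG_CREATED_ALIAS_FROM_PARTNER = 1 << 3  # 0b1000

# A user created from a partner who also has contact creation disabled:
flags = FLAG_DISABLE_CREATE_CONTACTS | FLAG_CREATED_FROM_PARTNER

assert flags & FLAG_DISABLE_CREATE_CONTACTS != 0  # bit is set
assert flags & FLAG_FREE_OLD_ALIAS_LIMIT == 0     # bit is not set

# Clearing a single bit without touching the others:
flags &= ~FLAG_DISABLE_CREATE_CONTACTS
assert flags == FLAG_CREATED_FROM_PARTNER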

View File

@ -0,0 +1,46 @@
from typing import Optional
from arrow import Arrow
from app.models import PartnerUser, PartnerSubscription, User
from app.user_audit_log_utils import emit_user_audit_log, UserAuditLogAction
def create_partner_user(
user: User, partner_id: int, partner_email: str, external_user_id: str
) -> PartnerUser:
instance = PartnerUser.create(
user_id=user.id,
partner_id=partner_id,
partner_email=partner_email,
external_user_id=external_user_id,
)
emit_user_audit_log(
user=user,
action=UserAuditLogAction.LinkAccount,
message=f"Linked account to partner_id={partner_id} | partner_email={partner_email} | external_user_id={external_user_id}",
)
return instance
def create_partner_subscription(
partner_user: PartnerUser,
expiration: Optional[Arrow],
msg: Optional[str] = None,
) -> PartnerSubscription:
instance = PartnerSubscription.create(
partner_user_id=partner_user.id,
end_at=expiration,
)
message = "User upgraded through partner subscription"
if msg:
message += f" | {msg}"
emit_user_audit_log(
user=partner_user.user,
action=UserAuditLogAction.Upgrade,
message=message,
)
return instance

View File

View File

@ -0,0 +1,121 @@
from typing import Optional
import arrow
from coinbase_commerce.error import WebhookInvalidPayload, SignatureVerificationError
from coinbase_commerce.webhook import Webhook
from flask import Flask, request
from app.config import COINBASE_WEBHOOK_SECRET
from app.db import Session
from app.email_utils import send_email, render
from app.log import LOG
from app.models import CoinbaseSubscription, User
from app.subscription_webhook import execute_subscription_webhook
from app.user_audit_log_utils import emit_user_audit_log, UserAuditLogAction
def setup_coinbase_commerce(app: Flask):
@app.route("/coinbase", methods=["POST"])
def coinbase_webhook():
# event payload
request_data = request.data.decode("utf-8")
# webhook signature
request_sig = request.headers.get("X-CC-Webhook-Signature", None)
try:
# signature verification and event object construction
event = Webhook.construct_event(
request_data, request_sig, COINBASE_WEBHOOK_SECRET
)
except (WebhookInvalidPayload, SignatureVerificationError) as e:
LOG.e("Invalid Coinbase webhook")
return str(e), 400
LOG.d("Coinbase event %s", event)
if event["type"] == "charge:confirmed":
if handle_coinbase_event(event):
return "success", 200
else:
return "error", 400
return "success", 200
def handle_coinbase_event(event) -> bool:
server_user_id = event["data"]["metadata"]["user_id"]
try:
user_id = int(server_user_id)
except ValueError:
user_id = int(float(server_user_id))
code = event["data"]["code"]
user: Optional[User] = User.get(user_id)
if not user:
LOG.e("User not found %s", user_id)
return False
coinbase_subscription: CoinbaseSubscription = CoinbaseSubscription.get_by(
user_id=user_id
)
if not coinbase_subscription:
LOG.d("Create a coinbase subscription for %s", user)
coinbase_subscription = CoinbaseSubscription.create(
user_id=user_id, end_at=arrow.now().shift(years=1), code=code, commit=True
)
emit_user_audit_log(
user=user,
action=UserAuditLogAction.Upgrade,
message="Upgraded though Coinbase",
commit=True,
)
send_email(
user.email,
"Your SimpleLogin account has been upgraded",
render(
"transactional/coinbase/new-subscription.txt",
user=user,
coinbase_subscription=coinbase_subscription,
),
render(
"transactional/coinbase/new-subscription.html",
user=user,
coinbase_subscription=coinbase_subscription,
),
)
else:
if coinbase_subscription.code != code:
LOG.d("Update code from %s to %s", coinbase_subscription.code, code)
coinbase_subscription.code = code
if coinbase_subscription.is_active():
coinbase_subscription.end_at = coinbase_subscription.end_at.shift(years=1)
else: # already expired subscription
coinbase_subscription.end_at = arrow.now().shift(years=1)
emit_user_audit_log(
user=user,
action=UserAuditLogAction.SubscriptionExtended,
message="Extended coinbase subscription",
)
Session.commit()
send_email(
user.email,
"Your SimpleLogin account has been extended",
render(
"transactional/coinbase/extend-subscription.txt",
user=user,
coinbase_subscription=coinbase_subscription,
),
render(
"transactional/coinbase/extend-subscription.html",
user=user,
coinbase_subscription=coinbase_subscription,
),
)
execute_subscription_webhook(user)
return True
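The nested int()/float() conversion in handle_coinbase_event exists because the user_id stored in the charge metadata can come back as a float-formatted string. A tiny standalone illustration of why the fallback is needed:

# Illustration of the user_id parsing fallback used above.
def parse_user_id(raw: str) -> int:
    try:
        return int(raw)          # "88" -> 88
    except ValueError:
        return int(float(raw))   # "88.0" -> 88.0 -> 88

assert parse_user_id("88") == 88
assert parse_user_id("88.0") == 88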

app/app/payments/paddle.py Normal file
View File

@ -0,0 +1,286 @@
import arrow
import json
from dateutil.relativedelta import relativedelta
from flask import Flask, request
from app import paddle_utils, paddle_callback
from app.config import (
PADDLE_MONTHLY_PRODUCT_ID,
PADDLE_MONTHLY_PRODUCT_IDS,
PADDLE_YEARLY_PRODUCT_IDS,
PADDLE_COUPON_ID,
)
from app.db import Session
from app.email_utils import send_email, render
from app.log import LOG
from app.models import Subscription, PlanEnum, User, Coupon
from app.subscription_webhook import execute_subscription_webhook
from app.user_audit_log_utils import emit_user_audit_log, UserAuditLogAction
from app.utils import random_string
def setup_paddle_callback(app: Flask):
@app.route("/paddle", methods=["GET", "POST"])
def paddle():
LOG.d(f"paddle callback {request.form.get('alert_name')} {request.form}")
# make sure the request comes from Paddle
if not paddle_utils.verify_incoming_request(dict(request.form)):
LOG.e("request not coming from paddle. Request data:%s", dict(request.form))
return "KO", 400
if (
request.form.get("alert_name") == "subscription_created"
): # new user subscribes
# the passthrough is json encoded, e.g.
# request.form.get("passthrough") = '{"user_id": 88 }'
passthrough = json.loads(request.form.get("passthrough"))
user_id = passthrough.get("user_id")
user = User.get(user_id)
subscription_plan_id = int(request.form.get("subscription_plan_id"))
if subscription_plan_id in PADDLE_MONTHLY_PRODUCT_IDS:
plan = PlanEnum.monthly
elif subscription_plan_id in PADDLE_YEARLY_PRODUCT_IDS:
plan = PlanEnum.yearly
else:
LOG.e(
"Unknown subscription_plan_id %s %s",
subscription_plan_id,
request.form,
)
return "No such subscription", 400
sub = Subscription.get_by(user_id=user.id)
if not sub:
LOG.d(f"create a new Subscription for user {user}")
Subscription.create(
user_id=user.id,
cancel_url=request.form.get("cancel_url"),
update_url=request.form.get("update_url"),
subscription_id=request.form.get("subscription_id"),
event_time=arrow.now(),
next_bill_date=arrow.get(
request.form.get("next_bill_date"), "YYYY-MM-DD"
).date(),
plan=plan,
)
emit_user_audit_log(
user=user,
action=UserAuditLogAction.Upgrade,
message="Upgraded through Paddle",
)
else:
LOG.d(f"Update an existing Subscription for user {user}")
sub.cancel_url = request.form.get("cancel_url")
sub.update_url = request.form.get("update_url")
sub.subscription_id = request.form.get("subscription_id")
sub.event_time = arrow.now()
sub.next_bill_date = arrow.get(
request.form.get("next_bill_date"), "YYYY-MM-DD"
).date()
sub.plan = plan
# make sure to set the new plan as not-cancelled
# in case user cancels a plan and subscribes a new plan
sub.cancelled = False
emit_user_audit_log(
user=user,
action=UserAuditLogAction.SubscriptionExtended,
message="Extended Paddle subscription",
)
execute_subscription_webhook(user)
LOG.d("User %s upgrades!", user)
Session.commit()
elif request.form.get("alert_name") == "subscription_payment_succeeded":
subscription_id = request.form.get("subscription_id")
LOG.d("Update subscription %s", subscription_id)
sub: Subscription = Subscription.get_by(subscription_id=subscription_id)
# when user subscribes, the "subscription_payment_succeeded" can arrive BEFORE "subscription_created"
# at that time, subscription object does not exist yet
if sub:
sub.event_time = arrow.now()
sub.next_bill_date = arrow.get(
request.form.get("next_bill_date"), "YYYY-MM-DD"
).date()
Session.commit()
execute_subscription_webhook(sub.user)
elif request.form.get("alert_name") == "subscription_cancelled":
subscription_id = request.form.get("subscription_id")
sub: Subscription = Subscription.get_by(subscription_id=subscription_id)
if sub:
# cancellation_effective_date should be the same as next_bill_date
LOG.w(
"Cancel subscription %s %s on %s, next bill date %s",
subscription_id,
sub.user,
request.form.get("cancellation_effective_date"),
sub.next_bill_date,
)
sub.event_time = arrow.now()
sub.cancelled = True
emit_user_audit_log(
user=sub.user,
action=UserAuditLogAction.SubscriptionCancelled,
message="Cancelled Paddle subscription",
)
Session.commit()
user = sub.user
send_email(
user.email,
"SimpleLogin - your subscription is canceled",
render(
"transactional/subscription-cancel.txt",
user=user,
end_date=request.form.get("cancellation_effective_date"),
),
)
execute_subscription_webhook(sub.user)
else:
# user might have deleted their account
LOG.i(f"Cancel non-exist subscription {subscription_id}")
return "OK"
elif request.form.get("alert_name") == "subscription_updated":
subscription_id = request.form.get("subscription_id")
sub: Subscription = Subscription.get_by(subscription_id=subscription_id)
if sub:
next_bill_date = request.form.get("next_bill_date")
if not next_bill_date:
paddle_callback.failed_payment(sub, subscription_id)
return "OK"
LOG.d(
"Update subscription %s %s on %s, next bill date %s",
subscription_id,
sub.user,
request.form.get("cancellation_effective_date"),
sub.next_bill_date,
)
if (
int(request.form.get("subscription_plan_id"))
== PADDLE_MONTHLY_PRODUCT_ID
):
plan = PlanEnum.monthly
else:
plan = PlanEnum.yearly
sub.cancel_url = request.form.get("cancel_url")
sub.update_url = request.form.get("update_url")
sub.event_time = arrow.now()
sub.next_bill_date = arrow.get(
request.form.get("next_bill_date"), "YYYY-MM-DD"
).date()
sub.plan = plan
# make sure to set the new plan as not-cancelled
sub.cancelled = False
emit_user_audit_log(
user=sub.user,
action=UserAuditLogAction.SubscriptionExtended,
message="Extended Paddle subscription",
)
Session.commit()
execute_subscription_webhook(sub.user)
else:
LOG.w(
f"update non-exist subscription {subscription_id}. {request.form}"
)
return "No such subscription", 400
elif request.form.get("alert_name") == "payment_refunded":
subscription_id = request.form.get("subscription_id")
LOG.d("Refund request for subscription %s", subscription_id)
sub: Subscription = Subscription.get_by(subscription_id=subscription_id)
if sub:
user = sub.user
Subscription.delete(sub.id)
emit_user_audit_log(
user=user,
action=UserAuditLogAction.SubscriptionCancelled,
message="Paddle subscription cancelled as user requested a refund",
)
Session.commit()
LOG.e("%s requests a refund", user)
execute_subscription_webhook(sub.user)
elif request.form.get("alert_name") == "subscription_payment_refunded":
subscription_id = request.form.get("subscription_id")
sub: Subscription = Subscription.get_by(subscription_id=subscription_id)
LOG.d(
"Handle subscription_payment_refunded for subscription %s",
subscription_id,
)
if not sub:
LOG.w(
"No such subscription for %s, payload %s",
subscription_id,
request.form,
)
return "No such subscription"
plan_id = int(request.form["subscription_plan_id"])
if request.form["refund_type"] == "full":
if plan_id in PADDLE_MONTHLY_PRODUCT_IDS:
LOG.d("subtract 1 month from next_bill_date %s", sub.next_bill_date)
sub.next_bill_date = sub.next_bill_date - relativedelta(months=1)
LOG.d("next_bill_date is %s", sub.next_bill_date)
Session.commit()
elif plan_id in PADDLE_YEARLY_PRODUCT_IDS:
LOG.d("subtract 1 year from next_bill_date %s", sub.next_bill_date)
sub.next_bill_date = sub.next_bill_date - relativedelta(years=1)
LOG.d("next_bill_date is %s", sub.next_bill_date)
Session.commit()
else:
LOG.e("Unknown plan_id %s", plan_id)
else:
LOG.w("partial subscription_payment_refunded, not handled")
execute_subscription_webhook(sub.user)
return "OK"
@app.route("/paddle_coupon", methods=["GET", "POST"])
def paddle_coupon():
LOG.d("paddle coupon callback %s", request.form)
if not paddle_utils.verify_incoming_request(dict(request.form)):
LOG.e("request not coming from paddle. Request data:%s", dict(request.form))
return "KO", 400
product_id = request.form.get("p_product_id")
if product_id != PADDLE_COUPON_ID:
LOG.e("product_id %s not match with %s", product_id, PADDLE_COUPON_ID)
return "KO", 400
email = request.form.get("email")
LOG.d("Paddle coupon request for %s", email)
coupon = Coupon.create(
code=random_string(30),
comment="For 1-year coupon",
expires_date=arrow.now().shift(years=1, days=-1),
commit=True,
)
return (
f"Your 1-year coupon is <b>{coupon.code}</b> <br> "
f"It's valid until <b>{coupon.expires_date.date().isoformat()}</b>"
)
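The refund branch above rolls next_bill_date back by exactly one billing period using dateutil's relativedelta, which is calendar-aware (a month is a month, not 30 days). A short standalone illustration:

# Standalone illustration of the next_bill_date adjustment in the refund branch.
from datetime import date
from dateutil.relativedelta import relativedelta

next_bill_date = date(2024, 3, 31)

# Full refund of a monthly plan: pull the next bill date back one calendar month.
# relativedelta clamps to the last valid day, so 31 March -> 29 February in a leap year.
assert next_bill_date - relativedelta(months=1) == date(2024, 2, 29)

# Full refund of a yearly plan: pull it back one year.
assert next_bill_date - relativedelta(years=1) == date(2023, 3, 31)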

View File

@ -2,9 +2,11 @@ from dataclasses import dataclass
from enum import Enum from enum import Enum
from flask import url_for from flask import url_for
from typing import Optional from typing import Optional
import arrow
from app import config
from app.errors import LinkException from app.errors import LinkException
from app.models import User, Partner from app.models import User, Partner, Job
from app.proton.proton_client import ProtonClient, ProtonUser from app.proton.proton_client import ProtonClient, ProtonUser
from app.account_linking import ( from app.account_linking import (
process_login_case, process_login_case,
@ -41,12 +43,21 @@ class ProtonCallbackHandler:
def __init__(self, proton_client: ProtonClient): def __init__(self, proton_client: ProtonClient):
self.proton_client = proton_client self.proton_client = proton_client
def _initial_alias_sync(self, user: User):
Job.create(
name=config.JOB_SEND_ALIAS_CREATION_EVENTS,
payload={"user_id": user.id},
run_at=arrow.now(),
commit=True,
)
def handle_login(self, partner: Partner) -> ProtonCallbackResult: def handle_login(self, partner: Partner) -> ProtonCallbackResult:
try: try:
user = self.__get_partner_user() user = self.__get_partner_user()
if user is None: if user is None:
return generate_account_not_allowed_to_log_in() return generate_account_not_allowed_to_log_in()
res = process_login_case(user, partner) res = process_login_case(user, partner)
self._initial_alias_sync(res.user)
return ProtonCallbackResult( return ProtonCallbackResult(
redirect_to_login=False, redirect_to_login=False,
flash_message=None, flash_message=None,
@ -75,6 +86,7 @@ class ProtonCallbackHandler:
if user is None: if user is None:
return generate_account_not_allowed_to_log_in() return generate_account_not_allowed_to_log_in()
res = process_link_case(user, current_user, partner) res = process_link_case(user, current_user, partner)
self._initial_alias_sync(res.user)
return ProtonCallbackResult( return ProtonCallbackResult(
redirect_to_login=False, redirect_to_login=False,
flash_message="Account successfully linked", flash_message="Account successfully linked",

View File

@@ -5,6 +5,7 @@ from app.db import Session
from app.log import LOG
from app.errors import ProtonPartnerNotSetUp
from app.models import Partner, PartnerUser, User
+ from app.user_audit_log_utils import emit_user_audit_log, UserAuditLogAction
PROTON_PARTNER_NAME = "Proton"
_PROTON_PARTNER: Optional[Partner] = None
@@ -32,6 +33,11 @@ def perform_proton_account_unlink(current_user: User):
)
if partner_user is not None:
LOG.info(f"User {current_user} has unlinked the account from {partner_user}")
+ emit_user_audit_log(
+ user=current_user,
+ action=UserAuditLogAction.UnlinkAccount,
+ message=f"User has unlinked the account (email={partner_user.partner_email} | external_user_id={partner_user.external_user_id})",
+ )
PartnerUser.delete(partner_user.id)
Session.commit()
agent.record_custom_event("AccountUnlinked", {"partner": proton_partner.name})

app/app/sentry_utils.py Normal file
View File

@ -0,0 +1,21 @@
from typing import Optional
from sentry_sdk.types import Event, Hint
_HTTP_CODES_TO_IGNORE = [416]
def _should_send(_event: Event, hint: Hint) -> bool:
# Check if this is an HTTP Exception event
if "exc_info" in hint:
exc_type, exc_value, exc_traceback = hint["exc_info"]
# Check if it's a Werkzeug HTTPException (raised for HTTP status codes)
if hasattr(exc_value, "code") and exc_value.code in _HTTP_CODES_TO_IGNORE:
return False
return True
def sentry_before_send(event: Event, hint: Hint) -> Optional[Event]:
if _should_send(event, hint):
return event
return None
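sentry_before_send is meant to be plugged into the SDK's before_send hook so that the ignored HTTP status codes (currently 416) never reach Sentry. A minimal wiring sketch; the DSN is a placeholder, only the before_send hookup is the point:

# Hedged sketch: wiring sentry_before_send into sentry_sdk.init.
import sentry_sdk
from app.sentry_utils import sentry_before_send

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    before_send=sentry_before_send,  # drop events for ignored HTTP status codes (e.g. 416)
)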

View File

@@ -1,40 +1,16 @@
- import requests
- from requests import RequestException
- from app import config
from app.db import Session
from app.events.event_dispatcher import EventDispatcher
from app.events.generated.event_pb2 import EventContent, UserPlanChanged
- from app.log import LOG
from app.models import User
def execute_subscription_webhook(user: User):
- webhook_url = config.SUBSCRIPTION_CHANGE_WEBHOOK
- if webhook_url is None:
- return
subscription_end = user.get_active_subscription_end(
include_partner_subscription=False
)
sl_subscription_end = None
if subscription_end:
sl_subscription_end = subscription_end.timestamp
- payload = {
- "user_id": user.id,
- "is_premium": user.is_premium(),
- "active_subscription_end": sl_subscription_end,
- }
- try:
- response = requests.post(webhook_url, json=payload, timeout=2)
- if response.status_code == 200:
- LOG.i("Sent request to subscription update webhook successfully")
- else:
- LOG.i(
- f"Request to webhook failed with status {response.status_code}: {response.text}"
- )
- except RequestException as e:
- LOG.error(f"Subscription request exception: {e}")
event = UserPlanChanged(plan_end_time=sl_subscription_end)
EventDispatcher.send_event(user, EventContent(user_plan_change=event))
Session.commit()
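With the direct HTTP webhook removed, execute_subscription_webhook now only emits a UserPlanChanged event through the dispatcher. A small sketch of what that payload looks like on the wire, assuming the generated app.events.generated.event_pb2 module from this repository:

# Hedged sketch: build and round-trip the UserPlanChanged payload sent above.
from app.events.generated.event_pb2 import Event, EventContent, UserPlanChanged

content = EventContent(user_plan_change=UserPlanChanged(plan_end_time=1735689600))
event = Event(user_id=1, external_user_id="ext-123", partner_id=1, content=content)

wire = event.SerializeToString()   # compact protobuf bytes handed to the sink
decoded = Event.FromString(wire)   # what the event listener reconstructs

assert decoded.content.WhichOneof("content") == "user_plan_change"
assert decoded.content.user_plan_change.plan_end_time == 1735689600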

View File

@ -0,0 +1,40 @@
from enum import Enum
from app.models import User, UserAuditLog
class UserAuditLogAction(Enum):
Upgrade = "upgrade"
SubscriptionExtended = "subscription_extended"
SubscriptionCancelled = "subscription_cancelled"
LinkAccount = "link_account"
UnlinkAccount = "unlink_account"
CreateMailbox = "create_mailbox"
VerifyMailbox = "verify_mailbox"
UpdateMailbox = "update_mailbox"
DeleteMailbox = "delete_mailbox"
CreateCustomDomain = "create_custom_domain"
VerifyCustomDomain = "verify_custom_domain"
UpdateCustomDomain = "update_custom_domain"
DeleteCustomDomain = "delete_custom_domain"
CreateDirectory = "create_directory"
UpdateDirectory = "update_directory"
DeleteDirectory = "delete_directory"
UserMarkedForDeletion = "user_marked_for_deletion"
DeleteUser = "delete_user"
def emit_user_audit_log(
user: User, action: UserAuditLogAction, message: str, commit: bool = False
):
UserAuditLog.create(
user_id=user.id,
user_email=user.email,
action=action.value,
message=message,
commit=commit,
)
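emit_user_audit_log is deliberately thin: each call site picks its own action and message, and commit defaults to False so the audit row joins the caller's transaction. A hedged usage sketch mirroring the call sites added elsewhere in this diff; the 30-day grace window is illustrative only:

# Hedged usage sketch; `user` is assumed to be a persisted app.models.User.
import arrow
from app.db import Session
from app.user_audit_log_utils import emit_user_audit_log, UserAuditLogAction

def mark_user_for_deletion(user) -> None:
    user.delete_on = arrow.now().shift(days=30)  # illustrative grace period
    emit_user_audit_log(
        user=user,
        action=UserAuditLogAction.UserMarkedForDeletion,
        message=f"Marked user {user.id} ({user.email}) for deletion",
    )
    # commit defaults to False, so the audit entry and the user update commit together
    Session.commit()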

View File

@@ -3,6 +3,7 @@ from typing import Optional
from app.db import Session
from app.log import LOG
from app.models import User, SLDomain, CustomDomain, Mailbox
+ from app.user_audit_log_utils import emit_user_audit_log, UserAuditLogAction
class CannotSetAlias(Exception):
@@ -16,12 +17,13 @@ class CannotSetMailbox(Exception):
def set_default_alias_domain(user: User, domain_name: Optional[str]):
- if domain_name is None:
+ if not domain_name:
LOG.i(f"User {user} has set no domain as default domain")
user.default_alias_public_domain_id = None
user.default_alias_custom_domain_id = None
Session.flush()
return
sl_domain: SLDomain = SLDomain.get_by(domain=domain_name)
if sl_domain:
if sl_domain.hidden:
@@ -53,7 +55,7 @@ def set_default_alias_domain(user: User, domain_name: Optional[str]):
def set_default_mailbox(user: User, mailbox_id: int) -> Mailbox:
- mailbox = Mailbox.get(mailbox_id)
+ mailbox: Optional[Mailbox] = Mailbox.get(mailbox_id)
if not mailbox or mailbox.user_id != user.id:
raise CannotSetMailbox("Invalid mailbox")
@@ -66,5 +68,11 @@ def set_default_mailbox(user: User, mailbox_id: int) -> Mailbox:
LOG.i(f"User {user} has set mailbox {mailbox} as his default one")
user.default_mailbox_id = mailbox.id
+ emit_user_audit_log(
+ user=user,
+ action=UserAuditLogAction.UpdateMailbox,
+ message=f"Set mailbox {mailbox.id} ({mailbox.email}) as default",
+ )
Session.commit()
return mailbox

View File

@ -14,6 +14,7 @@ from sqlalchemy.sql import Insert, text
from app import s3, config from app import s3, config
from app.alias_utils import nb_email_log_for_mailbox from app.alias_utils import nb_email_log_for_mailbox
from app.api.views.apple import verify_receipt from app.api.views.apple import verify_receipt
from app.custom_domain_validation import CustomDomainValidation
from app.db import Session from app.db import Session
from app.dns_utils import get_mx_domains, is_mx_equivalent from app.dns_utils import get_mx_domains, is_mx_equivalent
from app.email_utils import ( from app.email_utils import (
@ -59,8 +60,11 @@ from app.models import (
) )
from app.pgp_utils import load_public_key_and_check, PGPException from app.pgp_utils import load_public_key_and_check, PGPException
from app.proton.utils import get_proton_partner from app.proton.utils import get_proton_partner
from app.user_audit_log_utils import emit_user_audit_log, UserAuditLogAction
from app.utils import sanitize_email from app.utils import sanitize_email
from server import create_light_app from server import create_light_app
from tasks.clean_alias_audit_log import cleanup_alias_audit_log
from tasks.clean_user_audit_log import cleanup_user_audit_log
from tasks.cleanup_old_imports import cleanup_old_imports from tasks.cleanup_old_imports import cleanup_old_imports
from tasks.cleanup_old_jobs import cleanup_old_jobs from tasks.cleanup_old_jobs import cleanup_old_jobs
from tasks.cleanup_old_notifications import cleanup_old_notifications from tasks.cleanup_old_notifications import cleanup_old_notifications
@ -905,9 +909,11 @@ def check_custom_domain():
LOG.i("custom domain has been deleted") LOG.i("custom domain has been deleted")
def check_single_custom_domain(custom_domain): def check_single_custom_domain(custom_domain: CustomDomain):
mx_domains = get_mx_domains(custom_domain.domain) mx_domains = get_mx_domains(custom_domain.domain)
if not is_mx_equivalent(mx_domains, config.EMAIL_SERVERS_WITH_PRIORITY): validator = CustomDomainValidation(dkim_domain=config.EMAIL_DOMAIN)
expected_custom_domains = validator.get_expected_mx_records(custom_domain)
if not is_mx_equivalent(mx_domains, expected_custom_domains):
user = custom_domain.user user = custom_domain.user
LOG.w( LOG.w(
"The MX record is not correctly set for %s %s %s", "The MX record is not correctly set for %s %s %s",
@ -1215,7 +1221,7 @@ def notify_hibp():
def clear_users_scheduled_to_be_deleted(dry_run=False): def clear_users_scheduled_to_be_deleted(dry_run=False):
users = User.filter( users: List[User] = User.filter(
and_( and_(
User.delete_on.isnot(None), User.delete_on.isnot(None),
User.delete_on <= arrow.now().shift(days=-DELETE_GRACE_DAYS), User.delete_on <= arrow.now().shift(days=-DELETE_GRACE_DAYS),
@ -1227,6 +1233,11 @@ def clear_users_scheduled_to_be_deleted(dry_run=False):
) )
if dry_run: if dry_run:
continue continue
emit_user_audit_log(
user=user,
action=UserAuditLogAction.DeleteUser,
message=f"Delete user {user.id} ({user.email})",
)
User.delete(user.id) User.delete(user.id)
Session.commit() Session.commit()
@ -1238,6 +1249,16 @@ def delete_old_data():
cleanup_old_notifications(oldest_valid) cleanup_old_notifications(oldest_valid)
def clear_alias_audit_log():
oldest_valid = arrow.now().shift(days=-config.AUDIT_LOG_MAX_DAYS)
cleanup_alias_audit_log(oldest_valid)
def clear_user_audit_log():
oldest_valid = arrow.now().shift(days=-config.AUDIT_LOG_MAX_DAYS)
cleanup_user_audit_log(oldest_valid)
if __name__ == "__main__": if __name__ == "__main__":
LOG.d("Start running cronjob") LOG.d("Start running cronjob")
parser = argparse.ArgumentParser() parser = argparse.ArgumentParser()
@ -1246,22 +1267,6 @@ if __name__ == "__main__":
"--job", "--job",
help="Choose a cron job to run", help="Choose a cron job to run",
type=str, type=str,
choices=[
"stats",
"notify_trial_end",
"notify_manual_subscription_end",
"notify_premium_end",
"delete_logs",
"delete_old_data",
"poll_apple_subscription",
"sanity_check",
"delete_old_monitoring",
"check_custom_domain",
"check_hibp",
"notify_hibp",
"cleanup_tokens",
"send_undelivered_mails",
],
) )
args = parser.parse_args() args = parser.parse_args()
# wrap in an app context to benefit from app setup like database cleanup, sentry integration, etc # wrap in an app context to benefit from app setup like database cleanup, sentry integration, etc
@ -1310,4 +1315,10 @@ if __name__ == "__main__":
load_unsent_mails_from_fs_and_resend() load_unsent_mails_from_fs_and_resend()
elif args.job == "delete_scheduled_users": elif args.job == "delete_scheduled_users":
LOG.d("Deleting users scheduled to be deleted") LOG.d("Deleting users scheduled to be deleted")
clear_users_scheduled_to_be_deleted(dry_run=True) clear_users_scheduled_to_be_deleted()
elif args.job == "clear_alias_audit_log":
LOG.d("Clearing alias audit log")
clear_alias_audit_log()
elif args.job == "clear_user_audit_log":
LOG.d("Clearing user audit log")
clear_user_audit_log()

View File

@ -80,3 +80,17 @@ jobs:
schedule: "*/5 * * * *" schedule: "*/5 * * * *"
captureStderr: true captureStderr: true
concurrencyPolicy: Forbid concurrencyPolicy: Forbid
- name: SimpleLogin clear alias_audit_log old entries
command: python /code/cron.py -j clear_alias_audit_log
shell: /bin/bash
schedule: "0 * * * *" # Once every hour
captureStderr: true
concurrencyPolicy: Forbid
- name: SimpleLogin clear user_audit_log old entries
command: python /code/cron.py -j clear_user_audit_log
shell: /bin/bash
schedule: "0 * * * *" # Once every hour
captureStderr: true
concurrencyPolicy: Forbid

View File

@ -52,8 +52,12 @@ from flanker.addresslib import address
from flanker.addresslib.address import EmailAddress from flanker.addresslib.address import EmailAddress
from sqlalchemy.exc import IntegrityError from sqlalchemy.exc import IntegrityError
from app import pgp_utils, s3, config from app import pgp_utils, s3, config, contact_utils
from app.alias_utils import try_auto_create, change_alias_status from app.alias_utils import (
try_auto_create,
change_alias_status,
get_alias_recipient_name,
)
from app.config import ( from app.config import (
EMAIL_DOMAIN, EMAIL_DOMAIN,
URL, URL,
@ -195,81 +199,16 @@ def get_or_create_contact(from_header: str, mail_from: str, alias: Alias) -> Con
mail_from, mail_from,
) )
contact_email = mail_from contact_email = mail_from
contact_result = contact_utils.create_contact(
if not is_valid_email(contact_email): email=contact_email,
LOG.w( alias=alias,
"invalid contact email %s. Parse from %s %s", name=contact_name,
contact_email, mail_from=mail_from,
from_header, allow_empty_email=True,
mail_from, automatic_created=True,
) from_partner=False,
# either reuse a contact with empty email or create a new contact with empty email )
contact_email = "" return contact_result.contact
contact_email = sanitize_email(contact_email, not_lower=True)
if contact_name and "\x00" in contact_name:
LOG.w("issue with contact name %s", contact_name)
contact_name = ""
contact = Contact.get_by(alias_id=alias.id, website_email=contact_email)
if contact:
if contact.name != contact_name:
LOG.d(
"Update contact %s name %s to %s",
contact,
contact.name,
contact_name,
)
contact.name = contact_name
Session.commit()
# contact created in the past does not have mail_from and from_header field
if not contact.mail_from and mail_from:
LOG.d(
"Set contact mail_from %s: %s to %s",
contact,
contact.mail_from,
mail_from,
)
contact.mail_from = mail_from
Session.commit()
else:
alias_id = alias.id
try:
contact_email_for_reply = (
contact_email if is_valid_email(contact_email) else ""
)
contact = Contact.create(
user_id=alias.user_id,
alias_id=alias_id,
website_email=contact_email,
name=contact_name,
mail_from=mail_from,
reply_email=generate_reply_email(contact_email_for_reply, alias),
automatic_created=True,
)
if not contact_email:
LOG.d("Create a contact with invalid email for %s", alias)
contact.invalid_email = True
LOG.d(
"create contact %s for %s, reverse alias:%s",
contact_email,
alias,
contact.reply_email,
)
Session.commit()
except IntegrityError:
# If the tx has been rolled back, the connection is borked. Force close to try to get a new one and start fresh
Session.close()
LOG.info(
f"Contact with email {contact_email} for alias_id {alias_id} already existed, fetching from DB"
)
contact = Contact.get_by(alias_id=alias_id, website_email=contact_email)
return contact
def get_or_create_reply_to_contact( def get_or_create_reply_to_contact(
@ -294,33 +233,7 @@ def get_or_create_reply_to_contact(
) )
return None return None
contact = Contact.get_by(alias_id=alias.id, website_email=contact_address) return contact_utils.create_contact(contact_address, alias, contact_name).contact
if contact:
return contact
else:
LOG.d(
"create contact %s for alias %s via reply-to header %s",
contact_address,
alias,
reply_to_header,
)
try:
contact = Contact.create(
user_id=alias.user_id,
alias_id=alias.id,
website_email=contact_address,
name=contact_name,
reply_email=generate_reply_email(contact_address, alias),
automatic_created=True,
)
Session.commit()
except IntegrityError:
LOG.w("Contact %s %s already exist", alias, contact_address)
Session.rollback()
contact = Contact.get_by(alias_id=alias.id, website_email=contact_address)
return contact
def replace_header_when_forward(msg: Message, alias: Alias, header: str):
@ -818,7 +731,7 @@ def forward_email_to_mailbox(
email_log = EmailLog.create(
contact_id=contact.id,
- user_id=user.id,
+ user_id=contact.user_id,
mailbox_id=mailbox.id,
alias_id=contact.alias_id,
message_id=str(msg[headers.MESSAGE_ID]),
@ -1252,23 +1165,11 @@ def handle_reply(envelope, msg: Message, rcpt_to: str) -> (bool, str):
Session.commit()
- # make the email comes from alias
- from_header = alias.email
- # add alias name from alias
- if alias.name:
- LOG.d("Put alias name %s in from header", alias.name)
- from_header = sl_formataddr((alias.name, alias.email))
- elif alias.custom_domain:
- # add alias name from domain
- if alias.custom_domain.name:
- LOG.d(
- "Put domain default alias name %s in from header",
- alias.custom_domain.name,
- )
- from_header = sl_formataddr((alias.custom_domain.name, alias.email))
- LOG.d("From header is %s", from_header)
- add_or_replace_header(msg, headers.FROM, from_header)
+ recipient_name = get_alias_recipient_name(alias)
+ if recipient_name.message:
+ LOG.d(recipient_name.message)
+ LOG.d("From header is %s", recipient_name.name)
+ add_or_replace_header(msg, headers.FROM, recipient_name.name)
try:
if str(msg[headers.TO]).lower() == "undisclosed-recipients:;":
@ -1601,7 +1502,9 @@ def handle_bounce_forward_phase(msg: Message, email_log: EmailLog):
LOG.w(
f"Disable alias {alias} because {reason}. {alias.mailboxes} {alias.user}. Last contact {contact}"
)
- change_alias_status(alias, enabled=False)
+ change_alias_status(
+ alias, enabled=False, message=f"Set enabled=False due to {reason}"
+ )
Notification.create(
user_id=user.id,

View File

@ -9,7 +9,7 @@ from events.runner import Runner
from events.event_source import DeadLetterEventSource, PostgresEventSource
from events.event_sink import ConsoleEventSink, HttpEventSink
- _DEFAULT_MAX_RETRIES = 100
+ _DEFAULT_MAX_RETRIES = 10
class Mode(Enum):

View File

@ -1,4 +1,5 @@
import requests
+ import newrelic.agent
from abc import ABC, abstractmethod
from app.config import EVENT_WEBHOOK, EVENT_WEBHOOK_SKIP_VERIFY_SSL
@ -26,6 +27,9 @@ class HttpEventSink(EventSink):
headers={"Content-Type": "application/x-protobuf"},
verify=not EVENT_WEBHOOK_SKIP_VERIFY_SSL,
)
+ newrelic.agent.record_custom_event(
+ "EventSentToPartner", {"http_code": res.status_code}
+ )
if res.status_code != 200:
LOG.warning(
f"Failed to send event to webhook: {res.status_code} {res.text}"

View File

@ -72,7 +72,9 @@ class PostgresEventSource(EventSource):
Session.close()  # Ensure we get a new connection and we don't leave a dangling tx
def __connect(self):
- self.__connection = psycopg2.connect(self.__connection_string)
+ self.__connection = psycopg2.connect(
+ self.__connection_string, application_name="sl-event-listen"
+ )
from app.db import Session
@ -83,24 +85,28 @@ class DeadLetterEventSource(EventSource):
def __init__(self, max_retries: int):
self.__max_retries = max_retries
+ def execute_loop(
+ self, on_event: Callable[[SyncEvent], NoReturn]
+ ) -> list[SyncEvent]:
+ threshold = arrow.utcnow().shift(minutes=-_DEAD_LETTER_THRESHOLD_MINUTES)
+ events = SyncEvent.get_dead_letter(
+ older_than=threshold, max_retries=self.__max_retries
+ )
+ if events:
+ LOG.info(f"Got {len(events)} dead letter events")
+ newrelic.agent.record_custom_metric(
+ "Custom/dead_letter_events_to_process", len(events)
+ )
+ for event in events:
+ if event.mark_as_taken(allow_taken_older_than=threshold):
+ on_event(event)
+ return events
@newrelic.agent.background_task()
def run(self, on_event: Callable[[SyncEvent], NoReturn]):
while True:
try:
- threshold = arrow.utcnow().shift(
- minutes=-_DEAD_LETTER_THRESHOLD_MINUTES
- )
- events = SyncEvent.get_dead_letter(
- older_than=threshold, max_retries=self.__max_retries
- )
- if events:
- LOG.info(f"Got {len(events)} dead letter events")
- if events:
- newrelic.agent.record_custom_metric(
- "Custom/dead_letter_events_to_process", len(events)
- )
- for event in events:
- on_event(event)
+ events = self.execute_loop(on_event)
Session.close()  # Ensure that we have a new connection and we don't have a dangling tx with a lock
if not events:
LOG.debug("No dead letter events")
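The refactor above moves the dead-letter fetch-and-dispatch step into execute_loop, so a single pass can be driven (for example from a test) without entering the infinite retry loop in run. A hedged sketch, where the handle callback and the surrounding setup are made up for illustration:
# Illustration only: exercise one dead-letter pass outside the run() loop.
# DeadLetterEventSource and execute_loop() come from the diff above; the
# handle() callback below is hypothetical.
from events.event_source import DeadLetterEventSource

def handle(event):
    # a real handler would forward the event to the configured sink
    print("re-dispatching dead-letter event", event.id)

source = DeadLetterEventSource(max_retries=10)
picked_up = source.execute_loop(handle)  # returns the SyncEvent rows it claimed
print(f"processed {len(picked_up)} events")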

View File

@ -3,7 +3,7 @@ Run scheduled jobs.
Not meant for running job at precise time (+- 1h)
"""
import time
- from typing import List
+ from typing import List, Optional
import arrow
from sqlalchemy.sql.expression import or_, and_
@ -20,6 +20,7 @@ from app.jobs.event_jobs import send_alias_creation_events_for_user
from app.jobs.export_user_data_job import ExportUserDataJob
from app.log import LOG
from app.models import User, Job, BatchImport, Mailbox, CustomDomain, JobState
+ from app.user_audit_log_utils import emit_user_audit_log, UserAuditLogAction
from server import create_light_app
@ -128,7 +129,7 @@ def welcome_proton(user):
def delete_mailbox_job(job: Job):
mailbox_id = job.payload.get("mailbox_id")
- mailbox = Mailbox.get(mailbox_id)
+ mailbox: Optional[Mailbox] = Mailbox.get(mailbox_id)
if not mailbox:
return
@ -152,6 +153,12 @@ def delete_mailbox_job(job: Job):
mailbox_email = mailbox.email
user = mailbox.user
+ emit_user_audit_log(
+ user=user,
+ action=UserAuditLogAction.DeleteMailbox,
+ message=f"Delete mailbox {mailbox.id} ({mailbox.email})",
+ )
Mailbox.delete(mailbox_id)
Session.commit()
LOG.d("Mailbox %s %s deleted", mailbox_id, mailbox_email)
@ -240,28 +247,41 @@ def process_job(job: Job):
elif job.name == config.JOB_DELETE_DOMAIN:
custom_domain_id = job.payload.get("custom_domain_id")
- custom_domain = CustomDomain.get(custom_domain_id)
+ custom_domain: Optional[CustomDomain] = CustomDomain.get(custom_domain_id)
if not custom_domain:
return
+ is_subdomain = custom_domain.is_sl_subdomain
domain_name = custom_domain.domain
user = custom_domain.user
+ custom_domain_partner_id = custom_domain.partner_id
CustomDomain.delete(custom_domain.id)
Session.commit()
+ if is_subdomain:
+ message = f"Delete subdomain {custom_domain_id} ({domain_name})"
+ else:
+ message = f"Delete custom domain {custom_domain_id} ({domain_name})"
+ emit_user_audit_log(
+ user=user,
+ action=UserAuditLogAction.DeleteCustomDomain,
+ message=message,
+ )
LOG.d("Domain %s deleted", domain_name)
- send_email(
- user.email,
- f"Your domain {domain_name} has been deleted",
- f"""Domain {domain_name} along with its aliases are deleted successfully.
+ if custom_domain_partner_id is None:
+ send_email(
+ user.email,
+ f"Your domain {domain_name} has been deleted",
+ f"""Domain {domain_name} along with its aliases are deleted successfully.
Regards,
SimpleLogin team.
""",
retries=3,
)
elif job.name == config.JOB_SEND_USER_REPORT:
export_job = ExportUserDataJob.create_from_job(job)
if export_job:
if export_job: if export_job:

View File

@ -0,0 +1,30 @@
"""Custom Domain partner id
Revision ID: 2441b7ff5da9
Revises: 1c14339aae90
Create Date: 2024-09-13 15:43:02.425964
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '2441b7ff5da9'
down_revision = '1c14339aae90'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('custom_domain', sa.Column('partner_id', sa.Integer(), nullable=True, default=None, server_default=None))
op.create_foreign_key(None, 'custom_domain', 'partner', ['partner_id'], ['id'])
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_constraint(None, 'custom_domain', type_='foreignkey')
op.drop_column('custom_domain', 'partner_id')
# ### end Alembic commands ###

View File

@ -0,0 +1,31 @@
"""contact.flags and custom_domain.pending_deletion
Revision ID: 88dd7a0abf54
Revises: 2441b7ff5da9
Create Date: 2024-09-19 15:41:20.910374
"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '88dd7a0abf54'
down_revision = '2441b7ff5da9'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.add_column('contact', sa.Column('flags', sa.Integer(), server_default='0', nullable=False))
op.add_column('custom_domain', sa.Column('pending_deletion', sa.Boolean(), server_default='0', nullable=False))
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_column('custom_domain', 'pending_deletion')
op.drop_column('contact', 'flags')
# ### end Alembic commands ###

View File

@ -0,0 +1,27 @@
"""custom domain indices
Revision ID: 62afa3a10010
Revises: 88dd7a0abf54
Create Date: 2024-09-30 11:40:04.127791
"""
from alembic import op
# revision identifiers, used by Alembic.
revision = '62afa3a10010'
down_revision = '88dd7a0abf54'
branch_labels = None
depends_on = None
def upgrade():
with op.get_context().autocommit_block():
op.create_index('ix_custom_domain_pending_deletion', 'custom_domain', ['pending_deletion'], unique=False, postgresql_concurrently=True)
op.create_index('ix_custom_domain_user_id', 'custom_domain', ['user_id'], unique=False, postgresql_concurrently=True)
def downgrade():
with op.get_context().autocommit_block():
op.drop_index('ix_custom_domain_user_id', table_name='custom_domain', postgresql_concurrently=True)
op.drop_index('ix_custom_domain_pending_deletion', table_name='custom_domain', postgresql_concurrently=True)
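A side note on this migration: PostgreSQL refuses to run CREATE INDEX CONCURRENTLY inside a transaction block, which is why both calls are wrapped in op.get_context().autocommit_block(). The same pattern applied to a hypothetical table and column, shown purely as a sketch:
# Sketch of the concurrent-index pattern used above, on a made-up table.
from alembic import op

def upgrade():
    with op.get_context().autocommit_block():
        op.create_index(
            "ix_example_created_at",   # hypothetical index name
            "example",                 # hypothetical table
            ["created_at"],
            unique=False,
            postgresql_concurrently=True,  # built without blocking writes
        )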

View File

@ -0,0 +1,45 @@
"""alias_audit_log
Revision ID: 91ed7f46dc81
Revises: 62afa3a10010
Create Date: 2024-10-11 13:22:11.594054
"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '91ed7f46dc81'
down_revision = '62afa3a10010'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('alias_audit_log',
sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
sa.Column('created_at', sqlalchemy_utils.types.arrow.ArrowType(), nullable=False),
sa.Column('updated_at', sqlalchemy_utils.types.arrow.ArrowType(), nullable=True),
sa.Column('user_id', sa.Integer(), nullable=False),
sa.Column('alias_id', sa.Integer(), nullable=False),
sa.Column('alias_email', sa.String(length=255), nullable=False),
sa.Column('action', sa.String(length=255), nullable=False),
sa.Column('message', sa.Text(), nullable=True),
sa.PrimaryKeyConstraint('id')
)
op.create_index('ix_alias_audit_log_alias_email', 'alias_audit_log', ['alias_email'], unique=False)
op.create_index('ix_alias_audit_log_alias_id', 'alias_audit_log', ['alias_id'], unique=False)
op.create_index('ix_alias_audit_log_user_id', 'alias_audit_log', ['user_id'], unique=False)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index('ix_alias_audit_log_user_id', table_name='alias_audit_log')
op.drop_index('ix_alias_audit_log_alias_id', table_name='alias_audit_log')
op.drop_index('ix_alias_audit_log_alias_email', table_name='alias_audit_log')
op.drop_table('alias_audit_log')
# ### end Alembic commands ###

View File

@ -0,0 +1,44 @@
"""user_audit_log
Revision ID: 7d7b84779837
Revises: 91ed7f46dc81
Create Date: 2024-10-16 11:52:49.128644
"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '7d7b84779837'
down_revision = '91ed7f46dc81'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('user_audit_log',
sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
sa.Column('created_at', sqlalchemy_utils.types.arrow.ArrowType(), nullable=False),
sa.Column('updated_at', sqlalchemy_utils.types.arrow.ArrowType(), nullable=True),
sa.Column('user_id', sa.Integer(), nullable=False),
sa.Column('user_email', sa.String(length=255), nullable=False),
sa.Column('action', sa.String(length=255), nullable=False),
sa.Column('message', sa.Text(), nullable=True),
sa.PrimaryKeyConstraint('id')
)
op.create_index('ix_user_audit_log_user_email', 'user_audit_log', ['user_email'], unique=False)
op.create_index('ix_user_audit_log_user_id', 'user_audit_log', ['user_id'], unique=False)
op.create_index('ix_user_audit_log_created_at', 'user_audit_log', ['created_at'], unique=False)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index('ix_user_audit_log_user_id', table_name='user_audit_log')
op.drop_index('ix_user_audit_log_user_email', table_name='user_audit_log')
op.drop_index('ix_user_audit_log_created_at', table_name='user_audit_log')
op.drop_table('user_audit_log')
# ### end Alembic commands ###
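For context, the new user_audit_log table mirrors alias_audit_log and is indexed on user_id, user_email and created_at. A hedged sketch of the kind of query those indexes are meant to serve; the UserAuditLog model name and its import path are assumed, not shown in this diff:
# Sketch only: UserAuditLog is an assumed ORM model over user_audit_log.
from app.db import Session
from app.models import UserAuditLog

recent = (
    Session.query(UserAuditLog)
    .filter(UserAuditLog.user_id == 42)            # hypothetical user id
    .order_by(UserAuditLog.created_at.desc())
    .limit(20)
    .all()
)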

View File

@ -0,0 +1,27 @@
"""alias_audit_log_index_created_at
Revision ID: 32f25cbf12f6
Revises: 7d7b84779837
Create Date: 2024-10-16 16:45:36.827161
"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '32f25cbf12f6'
down_revision = '7d7b84779837'
branch_labels = None
depends_on = None
def upgrade():
with op.get_context().autocommit_block():
op.create_index('ix_alias_audit_log_created_at', 'alias_audit_log', ['created_at'], unique=False, postgresql_concurrently=True)
def downgrade():
with op.get_context().autocommit_block():
op.drop_index('ix_alias_audit_log_created_at', table_name='alias_audit_log', postgresql_concurrently=True)

View File

@ -94,6 +94,20 @@ def log_nb_db_connection():
newrelic.agent.record_custom_metric("Custom/nb_db_connections", nb_connection)
+ @newrelic.agent.background_task()
+ def log_nb_db_connection_by_app_name():
+ # get the number of connections to the DB
+ rows = Session.execute(
+ "SELECT application_name, count(datid) FROM pg_stat_activity group by application_name"
+ )
+ for row in rows:
+ if row[0].find("sl-") == 0:
+ LOG.d("number of db connections for app %s = %s", row[0], row[1])
+ newrelic.agent.record_custom_metric(
+ f"Custom/nb_db_app_connection/{row[0]}", row[1]
+ )
@newrelic.agent.background_task()
def log_pending_to_process_events():
r = Session.execute("select count(*) from sync_event WHERE taken_time IS NULL;")
@ -125,6 +139,21 @@ def log_events_pending_dead_letter():
)
+ @newrelic.agent.background_task()
+ def log_failed_events():
+ r = Session.execute(
+ """
+ SELECT COUNT(*)
+ FROM sync_event
+ WHERE retry_count >= 10;
+ """,
+ )
+ failed_events = list(r)[0][0]
+ LOG.d("number of failed events %s", failed_events)
+ newrelic.agent.record_custom_metric("Custom/sync_events_failed", failed_events)
if __name__ == "__main__":
exporter = MetricExporter(get_newrelic_license())
while True:
@ -132,6 +161,8 @@ if __name__ == "__main__":
log_nb_db_connection()
log_pending_to_process_events()
log_events_pending_dead_letter()
+ log_failed_events()
+ log_nb_db_connection_by_app_name()
Session.close()
exporter.run()
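The new per-application connection metric above only sees backends whose application_name starts with "sl-", which is exactly what the event listener now sets via psycopg2's application_name parameter earlier in this diff. A hedged sketch of tagging a connection so it shows up in that pg_stat_activity query; the DSN and the app name are placeholders:
# Sketch only: label a connection so pg_stat_activity reports it under "sl-...".
import psycopg2

conn = psycopg2.connect(
    "postgresql://user:password@localhost/simplelogin",  # placeholder DSN
    application_name="sl-example-worker",                # hypothetical app name
)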

View File

@ -0,0 +1,49 @@
#!/usr/bin/env python3
import argparse
import time
from sqlalchemy import func
from app.models import Alias
from app.db import Session
parser = argparse.ArgumentParser(
prog="Backfill alias", description="Update alias notes and backfill flag"
)
parser.add_argument(
"-s", "--start_alias_id", default=0, type=int, help="Initial alias_id"
)
parser.add_argument("-e", "--end_alias_id", default=0, type=int, help="Last alias_id")
args = parser.parse_args()
alias_id_start = args.start_alias_id
max_alias_id = args.end_alias_id
if max_alias_id == 0:
max_alias_id = Session.query(func.max(Alias.id)).scalar()
print(f"Checking alias {alias_id_start} to {max_alias_id}")
step = 10000
noteSql = "(note = 'Created through Proton' or note = 'Created through partner Proton')"
alias_query = f"UPDATE alias set note = NULL, flags = flags | :flag where id>=:start AND id<:end and {noteSql}"
updated = 0
start_time = time.time()
for batch_start in range(alias_id_start, max_alias_id, step):
rows_done = Session.execute(
alias_query,
{
"start": batch_start,
"end": batch_start + step,
"flag": Alias.FLAG_PARTNER_CREATED,
},
)
updated += rows_done.rowcount
Session.commit()
elapsed = time.time() - start_time
last_batch_id = batch_start + step
time_per_alias = elapsed / last_batch_id
remaining = max_alias_id - last_batch_id
time_remaining = remaining * time_per_alias
minutes_remaining = time_remaining / 60.0
print(
f"\rAlias {batch_start}/{max_alias_id} {updated} {minutes_remaining:.2f} mins remaining"
)
print("")

View File

@ -0,0 +1,63 @@
#!/usr/bin/env python3
import argparse
import time
from sqlalchemy import func
from app.events.event_dispatcher import EventDispatcher
from app.events.generated.event_pb2 import UserPlanChanged, EventContent
from app.models import PartnerUser
from app.db import Session
parser = argparse.ArgumentParser(
prog="Backfill alias", description="Update alias notes and backfill flag"
)
parser.add_argument(
"-s", "--start_pu_id", default=0, type=int, help="Initial partner_user_id"
)
parser.add_argument(
"-e", "--end_pu_id", default=0, type=int, help="Last partner_user_id"
)
args = parser.parse_args()
pu_id_start = args.start_pu_id
max_pu_id = args.end_pu_id
if max_pu_id == 0:
max_pu_id = Session.query(func.max(PartnerUser.id)).scalar()
print(f"Checking partner user {pu_id_start} to {max_pu_id}")
step = 100
updated = 0
start_time = time.time()
with_premium = 0
for batch_start in range(pu_id_start, max_pu_id, step):
partner_users = (
Session.query(PartnerUser).filter(
PartnerUser.id >= batch_start, PartnerUser.id < batch_start + step
)
).all()
for partner_user in partner_users:
subscription_end = partner_user.user.get_active_subscription_end(
include_partner_subscription=False
)
end_timestamp = None
if subscription_end:
with_premium += 1
end_timestamp = subscription_end.timestamp
event = UserPlanChanged(plan_end_time=end_timestamp)
EventDispatcher.send_event(
partner_user.user, EventContent(user_plan_change=event)
)
Session.flush()
updated += 1
Session.commit()
elapsed = time.time() - start_time
last_batch_id = batch_start + step
time_per_partner_user = elapsed / last_batch_id
remaining = max_pu_id - last_batch_id
time_remaining = remaining * time_per_partner_user
minutes_remaining = time_remaining / 60.0
print(
f"\rPartnerUser {batch_start}/{max_pu_id} {updated} {minutes_remaining:.2f} mins remaining"
)
print(f"With SL premium {with_premium}")

app/poetry.lock (generated, 476 lines changed)
View File

@ -276,21 +276,6 @@ files = [
{file = "backcall-0.2.0.tar.gz", hash = "sha256:5cbdbf27be5e7cfadb448baf0aa95508f91f2bbc6c6437cd9cd06e2a4c215e1e"}, {file = "backcall-0.2.0.tar.gz", hash = "sha256:5cbdbf27be5e7cfadb448baf0aa95508f91f2bbc6c6437cd9cd06e2a4c215e1e"},
] ]
[[package]]
name = "backports.entry-points-selectable"
version = "1.1.1"
description = "Compatibility shim providing selectable entry points for older implementations"
optional = false
python-versions = ">=2.7"
files = [
{file = "backports.entry_points_selectable-1.1.1-py2.py3-none-any.whl", hash = "sha256:7fceed9532a7aa2bd888654a7314f864a3c16a4e710b34a58cfc0f08114c663b"},
{file = "backports.entry_points_selectable-1.1.1.tar.gz", hash = "sha256:914b21a479fde881635f7af5adc7f6e38d6b274be32269070c53b698c60d5386"},
]
[package.extras]
docs = ["jaraco.packaging (>=8.2)", "rst.linker (>=1.9)", "sphinx"]
testing = ["pytest", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=1.0.1)", "pytest-flake8", "pytest-mypy"]
[[package]] [[package]]
name = "bcrypt" name = "bcrypt"
version = "3.2.0" version = "3.2.0"
@ -375,35 +360,41 @@ files = [
[[package]] [[package]]
name = "boto3" name = "boto3"
version = "1.15.9" version = "1.35.37"
description = "The AWS SDK for Python" description = "The AWS SDK for Python"
optional = false optional = false
python-versions = "*" python-versions = ">=3.8"
files = [ files = [
{file = "boto3-1.15.9-py2.py3-none-any.whl", hash = "sha256:e0a1dbc0a0e460dc6de2f4144b5015edad3ab5c17ee83c6194b1a010d815bc60"}, {file = "boto3-1.35.37-py3-none-any.whl", hash = "sha256:385ca77bf8ea4ab2d97f6e2435bdb29f77d9301e2f7ac796c2f465753c2adf3c"},
{file = "boto3-1.15.9.tar.gz", hash = "sha256:02f5f7a2b1349760b030c34f90a9cb4600bf8fe3cbc76b801d122bc4cecf3a7f"}, {file = "boto3-1.35.37.tar.gz", hash = "sha256:470d981583885859fed2fd1c185eeb01cc03e60272d499bafe41b12625b158c8"},
] ]
[package.dependencies] [package.dependencies]
botocore = ">=1.18.9,<1.19.0" botocore = ">=1.35.37,<1.36.0"
jmespath = ">=0.7.1,<1.0.0" jmespath = ">=0.7.1,<2.0.0"
s3transfer = ">=0.3.0,<0.4.0" s3transfer = ">=0.10.0,<0.11.0"
[package.extras]
crt = ["botocore[crt] (>=1.21.0,<2.0a0)"]
[[package]] [[package]]
name = "botocore" name = "botocore"
version = "1.18.9" version = "1.35.37"
description = "Low-level, data-driven core of boto 3." description = "Low-level, data-driven core of boto 3."
optional = false optional = false
python-versions = "*" python-versions = ">=3.8"
files = [ files = [
{file = "botocore-1.18.9-py2.py3-none-any.whl", hash = "sha256:dc3244170254cbba7dfde00b0489f830069d93dd6a9e555178d989072d7ee7c2"}, {file = "botocore-1.35.37-py3-none-any.whl", hash = "sha256:64f965d4ba7adb8d79ce044c3aef7356e05dd74753cf7e9115b80f477845d920"},
{file = "botocore-1.18.9.tar.gz", hash = "sha256:35b06b8801eb2dd7e708de35581f9c0304740645874f3af5b8b0c1648f8d6365"}, {file = "botocore-1.35.37.tar.gz", hash = "sha256:b2b4d29bafd95b698344f2f0577bb67064adbf1735d8a0e3c7473daa59c23ba6"},
] ]
[package.dependencies] [package.dependencies]
jmespath = ">=0.7.1,<1.0.0" jmespath = ">=0.7.1,<2.0.0"
python-dateutil = ">=2.1,<3.0.0" python-dateutil = ">=2.1,<3.0.0"
urllib3 = {version = ">=1.20,<1.26", markers = "python_version != \"3.4\""} urllib3 = {version = ">=1.25.4,<2.2.0 || >2.2.0,<3", markers = "python_version >= \"3.10\""}
[package.extras]
crt = ["awscrt (==0.22.0)"]
[[package]] [[package]]
name = "cachetools" name = "cachetools"
@ -491,13 +482,13 @@ pycparser = "*"
[[package]] [[package]]
name = "cfgv" name = "cfgv"
version = "3.2.0" version = "3.4.0"
description = "Validate configuration and produce human readable error messages." description = "Validate configuration and produce human readable error messages."
optional = false optional = false
python-versions = ">=3.6.1" python-versions = ">=3.8"
files = [ files = [
{file = "cfgv-3.2.0-py2.py3-none-any.whl", hash = "sha256:32e43d604bbe7896fe7c248a9c2276447dbef840feb28fe20494f62af110211d"}, {file = "cfgv-3.4.0-py2.py3-none-any.whl", hash = "sha256:b7265b1f29fd3316bfcd2b330d63d024f2bfd8bcb8b0272f8e19a504856c48f9"},
{file = "cfgv-3.2.0.tar.gz", hash = "sha256:cf22deb93d4bcf92f345a5c3cd39d3d41d6340adc60c78bbbd6588c384fda6a1"}, {file = "cfgv-3.4.0.tar.gz", hash = "sha256:e52591d4c5f5dead8e0f673fb16db7949d2cfb3f7da4582893288f0ded8fe560"},
] ]
[[package]] [[package]]
@ -690,6 +681,21 @@ sdist = ["setuptools-rust (>=0.11.4)"]
ssh = ["bcrypt (>=3.1.5)"] ssh = ["bcrypt (>=3.1.5)"]
test = ["hypothesis (>=1.11.4,!=3.79.2)", "iso8601", "pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-subtests", "pytest-xdist", "pytz"] test = ["hypothesis (>=1.11.4,!=3.79.2)", "iso8601", "pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-subtests", "pytest-xdist", "pytz"]
[[package]]
name = "cssbeautifier"
version = "1.15.1"
description = "CSS unobfuscator and beautifier."
optional = false
python-versions = "*"
files = [
{file = "cssbeautifier-1.15.1.tar.gz", hash = "sha256:9f7064362aedd559c55eeecf6b6bed65e05f33488dcbe39044f0403c26e1c006"},
]
[package.dependencies]
editorconfig = ">=0.12.2"
jsbeautifier = "*"
six = ">=1.13.0"
[[package]] [[package]]
name = "decorator" name = "decorator"
version = "4.4.2" version = "4.4.2"
@ -734,41 +740,40 @@ graph = ["objgraph (>=1.7.2)"]
[[package]] [[package]]
name = "distlib" name = "distlib"
version = "0.3.1" version = "0.3.8"
description = "Distribution utilities" description = "Distribution utilities"
optional = false optional = false
python-versions = "*" python-versions = "*"
files = [ files = [
{file = "distlib-0.3.1-py2.py3-none-any.whl", hash = "sha256:8c09de2c67b3e7deef7184574fc060ab8a793e7adbb183d942c389c8b13c52fb"}, {file = "distlib-0.3.8-py2.py3-none-any.whl", hash = "sha256:034db59a0b96f8ca18035f36290806a9a6e6bd9d1ff91e45a7f172eb17e51784"},
{file = "distlib-0.3.1.zip", hash = "sha256:edf6116872c863e1aa9d5bb7cb5e05a022c519a4594dc703843343a9ddd9bff1"}, {file = "distlib-0.3.8.tar.gz", hash = "sha256:1530ea13e350031b6312d8580ddb6b27a104275a31106523b8f123787f494f64"},
] ]
[[package]] [[package]]
name = "djlint" name = "djlint"
version = "1.3.0" version = "1.34.1"
description = "HTML Template Linter and Formatter" description = "HTML Template Linter and Formatter"
optional = false optional = false
python-versions = ">=3.7,<4.0" python-versions = ">=3.8.0,<4.0.0"
files = [ files = [
{file = "djlint-1.3.0-py3-none-any.whl", hash = "sha256:0c986bf542cdac3025d431a5b15e6c3977f652f2e76e408dbb5e7aaab6b73d99"}, {file = "djlint-1.34.1-py3-none-any.whl", hash = "sha256:96ff1c464fb6f061130ebc88663a2ea524d7ec51f4b56221a2b3f0320a3cfce8"},
{file = "djlint-1.3.0.tar.gz", hash = "sha256:b2d8e6c0a14f88da165296f0da05795d15299b7ab0a9093d670ce9ffd867bc79"}, {file = "djlint-1.34.1.tar.gz", hash = "sha256:db93fa008d19eaadb0454edf1704931d14469d48508daba2df9941111f408346"},
] ]
[package.dependencies] [package.dependencies]
click = ">=8.0.1,<9.0.0" click = ">=8.0.1,<9.0.0"
colorama = ">=0.4.4,<0.5.0" colorama = ">=0.4.4,<0.5.0"
cssbeautifier = ">=1.14.4,<2.0.0"
html-tag-names = ">=0.1.2,<0.2.0" html-tag-names = ">=0.1.2,<0.2.0"
html-void-elements = ">=0.1.0,<0.2.0" html-void-elements = ">=0.1.0,<0.2.0"
importlib-metadata = ">=4.11.0,<5.0.0" jsbeautifier = ">=1.14.4,<2.0.0"
pathspec = ">=0.9.0,<0.10.0" json5 = ">=0.9.11,<0.10.0"
pathspec = ">=0.12.0,<0.13.0"
PyYAML = ">=6.0,<7.0" PyYAML = ">=6.0,<7.0"
regex = ">=2022.1.18,<2023.0.0" regex = ">=2023.0.0,<2024.0.0"
tomli = {version = ">=2.0.1,<3.0.0", markers = "python_version < \"3.11\""} tomli = {version = ">=2.0.1,<3.0.0", markers = "python_version < \"3.11\""}
tqdm = ">=4.62.2,<5.0.0" tqdm = ">=4.62.2,<5.0.0"
[package.extras]
test = ["coverage (>=6.3.1,<7.0.0)", "pytest (>=7.0.1,<8.0.0)", "pytest-cov (>=3.0.0,<4.0.0)"]
[[package]] [[package]]
name = "dkimpy" name = "dkimpy"
version = "1.0.5" version = "1.0.5"
@ -806,6 +811,16 @@ doh = ["requests", "requests-toolbelt"]
idna = ["idna (>=2.1)"] idna = ["idna (>=2.1)"]
trio = ["sniffio (>=1.1)", "trio (>=0.14.0)"] trio = ["sniffio (>=1.1)", "trio (>=0.14.0)"]
[[package]]
name = "editorconfig"
version = "0.12.4"
description = "EditorConfig File Locator and Interpreter for Python"
optional = false
python-versions = "*"
files = [
{file = "EditorConfig-0.12.4.tar.gz", hash = "sha256:24857fa1793917dd9ccf0c7810a07e05404ce9b823521c7dce22a4fb5d125f80"},
]
[[package]] [[package]]
name = "email-validator" name = "email-validator"
version = "1.1.3" version = "1.1.3"
@ -851,15 +866,20 @@ requests = "*"
[[package]] [[package]]
name = "filelock" name = "filelock"
version = "3.0.12" version = "3.15.4"
description = "A platform independent file lock." description = "A platform independent file lock."
optional = false optional = false
python-versions = "*" python-versions = ">=3.8"
files = [ files = [
{file = "filelock-3.0.12-py3-none-any.whl", hash = "sha256:929b7d63ec5b7d6b71b0fa5ac14e030b3f70b75747cef1b10da9b879fef15836"}, {file = "filelock-3.15.4-py3-none-any.whl", hash = "sha256:6ca1fffae96225dab4c6eaf1c4f4f28cd2568d3ec2a44e15a08520504de468e7"},
{file = "filelock-3.0.12.tar.gz", hash = "sha256:18d82244ee114f543149c66a6e0c14e9c4f8a1044b5cdaadd0f82159d6a6ff59"}, {file = "filelock-3.15.4.tar.gz", hash = "sha256:2207938cbc1844345cb01a5a95524dae30f0ce089eba5b00378295a17e3e90cb"},
] ]
[package.extras]
docs = ["furo (>=2023.9.10)", "sphinx (>=7.2.6)", "sphinx-autodoc-typehints (>=1.25.2)"]
testing = ["covdefaults (>=2.3)", "coverage (>=7.3.2)", "diff-cover (>=8.0.1)", "pytest (>=7.4.3)", "pytest-asyncio (>=0.21)", "pytest-cov (>=4.1)", "pytest-mock (>=3.12)", "pytest-timeout (>=2.2)", "virtualenv (>=20.26.2)"]
typing = ["typing-extensions (>=4.8)"]
[[package]] [[package]]
name = "flanker" name = "flanker"
version = "0.9.11" version = "0.9.11"
@ -1495,17 +1515,17 @@ pyreadline = {version = "*", markers = "sys_platform == \"win32\""}
[[package]] [[package]]
name = "identify" name = "identify"
version = "1.5.5" version = "2.6.0"
description = "File identification library for Python" description = "File identification library for Python"
optional = false optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,>=2.7" python-versions = ">=3.8"
files = [ files = [
{file = "identify-1.5.5-py2.py3-none-any.whl", hash = "sha256:da683bfb7669fa749fc7731f378229e2dbf29a1d1337cbde04106f02236eb29d"}, {file = "identify-2.6.0-py2.py3-none-any.whl", hash = "sha256:e79ae4406387a9d300332b5fd366d8994f1525e8414984e1a59e058b2eda2dd0"},
{file = "identify-1.5.5.tar.gz", hash = "sha256:7c22c384a2c9b32c5cc891d13f923f6b2653aa83e2d75d8f79be240d6c86c4f4"}, {file = "identify-2.6.0.tar.gz", hash = "sha256:cb171c685bdc31bcc4c1734698736a7d5b6c8bf2e0c15117f4d469c8640ae5cf"},
] ]
[package.extras] [package.extras]
license = ["editdistance"] license = ["ukkonen"]
[[package]] [[package]]
name = "idna" name = "idna"
@ -1518,25 +1538,6 @@ files = [
{file = "idna-2.10.tar.gz", hash = "sha256:b307872f855b18632ce0c21c5e45be78c0ea7ae4c15c828c20788b26921eb3f6"}, {file = "idna-2.10.tar.gz", hash = "sha256:b307872f855b18632ce0c21c5e45be78c0ea7ae4c15c828c20788b26921eb3f6"},
] ]
[[package]]
name = "importlib-metadata"
version = "4.12.0"
description = "Read metadata from Python packages"
optional = false
python-versions = ">=3.7"
files = [
{file = "importlib_metadata-4.12.0-py3-none-any.whl", hash = "sha256:7401a975809ea1fdc658c3aa4f78cc2195a0e019c5cbc4c06122884e9ae80c23"},
{file = "importlib_metadata-4.12.0.tar.gz", hash = "sha256:637245b8bab2b6502fcbc752cc4b7a6f6243bb02b31c5c26156ad103d3d45670"},
]
[package.dependencies]
zipp = ">=0.5"
[package.extras]
docs = ["jaraco.packaging (>=9)", "rst.linker (>=1.9)", "sphinx"]
perf = ["ipython"]
testing = ["flufl.flake8", "importlib-resources (>=1.3)", "packaging", "pyfakefs", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=1.3)", "pytest-flake8", "pytest-mypy (>=0.9.1)", "pytest-perf (>=0.9.2)"]
[[package]] [[package]]
name = "iniconfig" name = "iniconfig"
version = "1.0.1" version = "1.0.1"
@ -1669,6 +1670,31 @@ files = [
{file = "jmespath-0.10.0.tar.gz", hash = "sha256:b85d0567b8666149a93172712e68920734333c0ce7e89b78b3e987f71e5ed4f9"}, {file = "jmespath-0.10.0.tar.gz", hash = "sha256:b85d0567b8666149a93172712e68920734333c0ce7e89b78b3e987f71e5ed4f9"},
] ]
[[package]]
name = "jsbeautifier"
version = "1.15.1"
description = "JavaScript unobfuscator and beautifier."
optional = false
python-versions = "*"
files = [
{file = "jsbeautifier-1.15.1.tar.gz", hash = "sha256:ebd733b560704c602d744eafc839db60a1ee9326e30a2a80c4adb8718adc1b24"},
]
[package.dependencies]
editorconfig = ">=0.12.2"
six = ">=1.13.0"
[[package]]
name = "json5"
version = "0.9.25"
description = "A Python implementation of the JSON5 data format."
optional = false
python-versions = ">=3.8"
files = [
{file = "json5-0.9.25-py3-none-any.whl", hash = "sha256:34ed7d834b1341a86987ed52f3f76cd8ee184394906b6e22a1e0deb9ab294e8f"},
{file = "json5-0.9.25.tar.gz", hash = "sha256:548e41b9be043f9426776f05df8635a00fe06104ea51ed24b67f908856e151ae"},
]
[[package]] [[package]]
name = "jwcrypto" name = "jwcrypto"
version = "0.8" version = "0.8"
@ -1959,13 +1985,13 @@ urllib3 = ">=1.7,<2"
[[package]] [[package]]
name = "nodeenv" name = "nodeenv"
version = "1.5.0" version = "1.9.1"
description = "Node.js virtual environment builder" description = "Node.js virtual environment builder"
optional = false optional = false
python-versions = "*" python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7"
files = [ files = [
{file = "nodeenv-1.5.0-py2.py3-none-any.whl", hash = "sha256:5304d424c529c997bc888453aeaa6362d242b6b4631e90f3d4bf1b290f1c84a9"}, {file = "nodeenv-1.9.1-py2.py3-none-any.whl", hash = "sha256:ba11c9782d29c27c70ffbdda2d7415098754709be8a7056d79a737cd901155c9"},
{file = "nodeenv-1.5.0.tar.gz", hash = "sha256:ab45090ae383b716c4ef89e690c41ff8c2b257b85b309f01f3654df3d084bd7c"}, {file = "nodeenv-1.9.1.tar.gz", hash = "sha256:6ec12890a2dab7946721edbfbcd91f3319c6ccc9aec47be7c7e6b7011ee6645f"},
] ]
[[package]] [[package]]
@ -2015,13 +2041,13 @@ testing = ["docopt", "pytest (>=3.0.7)"]
[[package]] [[package]]
name = "pathspec" name = "pathspec"
version = "0.9.0" version = "0.12.1"
description = "Utility library for gitignore style pattern matching of file paths." description = "Utility library for gitignore style pattern matching of file paths."
optional = false optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,>=2.7" python-versions = ">=3.8"
files = [ files = [
{file = "pathspec-0.9.0-py2.py3-none-any.whl", hash = "sha256:7d15c4ddb0b5c802d161efc417ec1a2558ea2653c2e8ad9c19098201dc1c993a"}, {file = "pathspec-0.12.1-py3-none-any.whl", hash = "sha256:a0d503e138a4c123b27490a4f7beda6a01c6f288df0e4a8b79c7eb0dc7b4cc08"},
{file = "pathspec-0.9.0.tar.gz", hash = "sha256:e564499435a2673d586f6b2130bb5b95f04a3ba06f81b8f895b651a3c76aabb1"}, {file = "pathspec-0.12.1.tar.gz", hash = "sha256:a482d51503a1ab33b1c67a6c3813a26953dbdc71c31dacaef9a838c4e29f5712"},
] ]
[[package]] [[package]]
@ -2117,13 +2143,13 @@ files = [
[[package]] [[package]]
name = "pre-commit" name = "pre-commit"
version = "2.17.0" version = "3.8.0"
description = "A framework for managing and maintaining multi-language pre-commit hooks." description = "A framework for managing and maintaining multi-language pre-commit hooks."
optional = false optional = false
python-versions = ">=3.6.1" python-versions = ">=3.9"
files = [ files = [
{file = "pre_commit-2.17.0-py2.py3-none-any.whl", hash = "sha256:725fa7459782d7bec5ead072810e47351de01709be838c2ce1726b9591dad616"}, {file = "pre_commit-3.8.0-py2.py3-none-any.whl", hash = "sha256:9a90a53bf82fdd8778d58085faf8d83df56e40dfe18f45b19446e26bf1b3a63f"},
{file = "pre_commit-2.17.0.tar.gz", hash = "sha256:c1a8040ff15ad3d648c70cc3e55b93e4d2d5b687320955505587fd79bbaed06a"}, {file = "pre_commit-3.8.0.tar.gz", hash = "sha256:8bb6494d4a20423842e198980c9ecf9f96607a07ea29549e180eef9ae80fe7af"},
] ]
[package.dependencies] [package.dependencies]
@ -2131,8 +2157,7 @@ cfgv = ">=2.0.0"
identify = ">=1.0.0" identify = ">=1.0.0"
nodeenv = ">=0.11.1" nodeenv = ">=0.11.1"
pyyaml = ">=5.1" pyyaml = ">=5.1"
toml = "*" virtualenv = ">=20.10.0"
virtualenv = ">=20.0.8"
[[package]] [[package]]
name = "prompt-toolkit" name = "prompt-toolkit"
@ -2665,85 +2690,104 @@ ocsp = ["cryptography (>=36.0.1)", "pyopenssl (==20.0.1)", "requests (>=2.26.0)"
[[package]] [[package]]
name = "regex" name = "regex"
version = "2022.6.2" version = "2023.12.25"
description = "Alternative regular expression module, to replace re." description = "Alternative regular expression module, to replace re."
optional = false optional = false
python-versions = ">=3.6" python-versions = ">=3.7"
files = [ files = [
{file = "regex-2022.6.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:042d122f9fee3ceb6d7e3067d56557df697d1aad4ff5f64ecce4dc13a90a7c01"}, {file = "regex-2023.12.25-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:0694219a1d54336fd0445ea382d49d36882415c0134ee1e8332afd1529f0baa5"},
{file = "regex-2022.6.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:ffef4b30785dc2d1604dfb7cf9fca5dc27cd86d65f7c2a9ec34d6d3ae4565ec2"}, {file = "regex-2023.12.25-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:b014333bd0217ad3d54c143de9d4b9a3ca1c5a29a6d0d554952ea071cff0f1f8"},
{file = "regex-2022.6.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0afa6a601acf3c0dc6de4e8d7d8bbce4e82f8542df746226cd35d4a6c15e9456"}, {file = "regex-2023.12.25-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d865984b3f71f6d0af64d0d88f5733521698f6c16f445bb09ce746c92c97c586"},
{file = "regex-2022.6.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4a11cbe8eb5fb332ae474895b5ead99392a4ea568bd2a258ab8df883e9c2bf92"}, {file = "regex-2023.12.25-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1e0eabac536b4cc7f57a5f3d095bfa557860ab912f25965e08fe1545e2ed8b4c"},
{file = "regex-2022.6.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9c1f62ee2ba880e221bc950651a1a4b0176083d70a066c83a50ef0cb9b178e12"}, {file = "regex-2023.12.25-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c25a8ad70e716f96e13a637802813f65d8a6760ef48672aa3502f4c24ea8b400"},
{file = "regex-2022.6.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5aba3d13c77173e9bfed2c2cea7fc319f11c89a36fcec08755e8fb169cf3b0df"}, {file = "regex-2023.12.25-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a9b6d73353f777630626f403b0652055ebfe8ff142a44ec2cf18ae470395766e"},
{file = "regex-2022.6.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:249437f7f5b233792234aeeecb14b0aab1566280de42dfc97c26e6f718297d68"}, {file = "regex-2023.12.25-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a9cc99d6946d750eb75827cb53c4371b8b0fe89c733a94b1573c9dd16ea6c9e4"},
{file = "regex-2022.6.2-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:179410c79fa86ef318d58ace233f95b87b05a1db6dc493fa29404a43f4b215e2"}, {file = "regex-2023.12.25-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88d1f7bef20c721359d8675f7d9f8e414ec5003d8f642fdfd8087777ff7f94b5"},
{file = "regex-2022.6.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:5e201b1232d81ca1a7a22ab2f08e1eccad4e111579fd7f3bbf60b21ef4a16cea"}, {file = "regex-2023.12.25-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:cb3fe77aec8f1995611f966d0c656fdce398317f850d0e6e7aebdfe61f40e1cd"},
{file = "regex-2022.6.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:fdecb225d0f1d50d4b26ac423e0032e76d46a788b83b4e299a520717a47d968c"}, {file = "regex-2023.12.25-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:7aa47c2e9ea33a4a2a05f40fcd3ea36d73853a2aae7b4feab6fc85f8bf2c9704"},
{file = "regex-2022.6.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:be57f9c7b0b423c66c266a26ad143b2c5514997c05dd32ce7ca95c8b209c2288"}, {file = "regex-2023.12.25-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:df26481f0c7a3f8739fecb3e81bc9da3fcfae34d6c094563b9d4670b047312e1"},
{file = "regex-2022.6.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:ed657a07d8a47ef447224ea00478f1c7095065dfe70a89e7280e5f50a5725131"}, {file = "regex-2023.12.25-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:c40281f7d70baf6e0db0c2f7472b31609f5bc2748fe7275ea65a0b4601d9b392"},
{file = "regex-2022.6.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:24908aefed23dd065b4a668c0b4ca04d56b7f09d8c8e89636cf6c24e64e67a1e"}, {file = "regex-2023.12.25-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:d94a1db462d5690ebf6ae86d11c5e420042b9898af5dcf278bd97d6bda065423"},
{file = "regex-2022.6.2-cp310-cp310-win32.whl", hash = "sha256:775694cd0bb2c4accf2f1cdd007381b33ec8b59842736fe61bdbad45f2ac7427"}, {file = "regex-2023.12.25-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:ba1b30765a55acf15dce3f364e4928b80858fa8f979ad41f862358939bdd1f2f"},
{file = "regex-2022.6.2-cp310-cp310-win_amd64.whl", hash = "sha256:809bbbbbcf8258049b031d80932ba71627d2274029386f0452e9950bcfa2c6e8"}, {file = "regex-2023.12.25-cp310-cp310-win32.whl", hash = "sha256:150c39f5b964e4d7dba46a7962a088fbc91f06e606f023ce57bb347a3b2d4630"},
{file = "regex-2022.6.2-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:ecd2b5d983eb0adf2049d41f95205bdc3de4e6cc2350e9c80d4409d3a75229de"}, {file = "regex-2023.12.25-cp310-cp310-win_amd64.whl", hash = "sha256:09da66917262d9481c719599116c7dc0c321ffcec4b1f510c4f8a066f8768105"},
{file = "regex-2022.6.2-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2f4c101746a8dac0401abefa716b357c546e61ea2e3d4a564a9db9eac57ccbce"}, {file = "regex-2023.12.25-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:1b9d811f72210fa9306aeb88385b8f8bcef0dfbf3873410413c00aa94c56c2b6"},
{file = "regex-2022.6.2-cp36-cp36m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:166ae7674d0a0e0f8044e7335ba86d0716c9d49465cff1b153f908e0470b8300"}, {file = "regex-2023.12.25-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:d902a43085a308cef32c0d3aea962524b725403fd9373dea18110904003bac97"},
{file = "regex-2022.6.2-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c5eac5d8a8ac9ccf00805d02a968a36f5c967db6c7d2b747ab9ed782b3b3a28b"}, {file = "regex-2023.12.25-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:d166eafc19f4718df38887b2bbe1467a4f74a9830e8605089ea7a30dd4da8887"},
{file = "regex-2022.6.2-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f57823f35b18d82b201c1b27ce4e55f88e79e81d9ca07b50ce625d33823e1439"}, {file = "regex-2023.12.25-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c7ad32824b7f02bb3c9f80306d405a1d9b7bb89362d68b3c5a9be53836caebdb"},
{file = "regex-2022.6.2-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4d42e3b7b23473729adbf76103e7df75f9167a5a80b1257ca30688352b4bb2dc"}, {file = "regex-2023.12.25-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:636ba0a77de609d6510235b7f0e77ec494d2657108f777e8765efc060094c98c"},
{file = "regex-2022.6.2-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:b2932e728bee0a634fe55ee54d598054a5a9ffe4cd2be21ba2b4b8e5f8064c2c"}, {file = "regex-2023.12.25-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fda75704357805eb953a3ee15a2b240694a9a514548cd49b3c5124b4e2ad01b"},
{file = "regex-2022.6.2-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:17764683ea01c2b8f103d99ae9de2473a74340df13ce306c49a721f0b1f0eb9e"}, {file = "regex-2023.12.25-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f72cbae7f6b01591f90814250e636065850c5926751af02bb48da94dfced7baa"},
{file = "regex-2022.6.2-cp36-cp36m-musllinux_1_1_i686.whl", hash = "sha256:2ac29b834100d2c171085ceba0d4a1e7046c434ddffc1434dbc7f9d59af1e945"}, {file = "regex-2023.12.25-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:db2a0b1857f18b11e3b0e54ddfefc96af46b0896fb678c85f63fb8c37518b3e7"},
{file = "regex-2022.6.2-cp36-cp36m-musllinux_1_1_ppc64le.whl", hash = "sha256:f43522fb5d676c99282ca4e2d41e8e2388427c0cf703db6b4a66e49b10b699a8"}, {file = "regex-2023.12.25-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:7502534e55c7c36c0978c91ba6f61703faf7ce733715ca48f499d3dbbd7657e0"},
{file = "regex-2022.6.2-cp36-cp36m-musllinux_1_1_s390x.whl", hash = "sha256:9faa01818dad9111dbf2af26c6e3c45140ccbd1192c3a0981f196255bf7ec5e6"}, {file = "regex-2023.12.25-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:e8c7e08bb566de4faaf11984af13f6bcf6a08f327b13631d41d62592681d24fe"},
{file = "regex-2022.6.2-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:17443f99b8f255273731f915fdbfea4d78d809bb9c3aaf67b889039825d06515"}, {file = "regex-2023.12.25-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:283fc8eed679758de38fe493b7d7d84a198b558942b03f017b1f94dda8efae80"},
{file = "regex-2022.6.2-cp36-cp36m-win32.whl", hash = "sha256:4a5449adef907919d4ce7a1eab2e27d0211d1b255bf0b8f5dd330ad8707e0fc3"}, {file = "regex-2023.12.25-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:f44dd4d68697559d007462b0a3a1d9acd61d97072b71f6d1968daef26bc744bd"},
{file = "regex-2022.6.2-cp36-cp36m-win_amd64.whl", hash = "sha256:4d206703a96a39763b5b45cf42645776f5553768ea7f3c2c1a39a4f59cafd4ba"}, {file = "regex-2023.12.25-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:67d3ccfc590e5e7197750fcb3a2915b416a53e2de847a728cfa60141054123d4"},
{file = "regex-2022.6.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:fcd7c432202bcb8b642c3f43d5bcafc5930d82fe5b2bf2c008162df258445c1d"}, {file = "regex-2023.12.25-cp311-cp311-win32.whl", hash = "sha256:68191f80a9bad283432385961d9efe09d783bcd36ed35a60fb1ff3f1ec2efe87"},
{file = "regex-2022.6.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:186c5a4a4c40621f64d771038ede20fca6c61a9faa8178f9e305aaa0c2442a97"}, {file = "regex-2023.12.25-cp311-cp311-win_amd64.whl", hash = "sha256:7d2af3f6b8419661a0c421584cfe8aaec1c0e435ce7e47ee2a97e344b98f794f"},
{file = "regex-2022.6.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:047b2d1323a51190c01b6604f49fe09682a5c85d3c1b2c8b67c1cd68419ce3c4"}, {file = "regex-2023.12.25-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:8a0ccf52bb37d1a700375a6b395bff5dd15c50acb745f7db30415bae3c2b0715"},
{file = "regex-2022.6.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:30637e7fa4acfed444525b1ab9683f714be617862820578c9fd4e944d4d9ad1f"}, {file = "regex-2023.12.25-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:c3c4a78615b7762740531c27cf46e2f388d8d727d0c0c739e72048beb26c8a9d"},
{file = "regex-2022.6.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3adafe6f2c6d86dbf3313866b61180530ca4dcd0c264932dc8fa1ffb10871d58"}, {file = "regex-2023.12.25-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ad83e7545b4ab69216cef4cc47e344d19622e28aabec61574b20257c65466d6a"},
{file = "regex-2022.6.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:67ae3601edf86e15ebe40885e5bfdd6002d34879070be15cf18fc0d80ea24fed"}, {file = "regex-2023.12.25-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b7a635871143661feccce3979e1727c4e094f2bdfd3ec4b90dfd4f16f571a87a"},
{file = "regex-2022.6.2-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:48dddddce0ea7e7c3e92c1e0c5a28c13ca4dc9cf7e996c706d00479652bff76c"}, {file = "regex-2023.12.25-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d498eea3f581fbe1b34b59c697512a8baef88212f92e4c7830fcc1499f5b45a5"},
{file = "regex-2022.6.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:68e5c641645351eb9eb12c465876e76b53717f99e9b92aea7a2dd645a87aa7aa"}, {file = "regex-2023.12.25-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:43f7cd5754d02a56ae4ebb91b33461dc67be8e3e0153f593c509e21d219c5060"},
{file = "regex-2022.6.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:8fd5f8ae42f789538bb634bdfd69b9aa357e76fdfd7ad720f32f8994c0d84f1e"}, {file = "regex-2023.12.25-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:51f4b32f793812714fd5307222a7f77e739b9bc566dc94a18126aba3b92b98a3"},
{file = "regex-2022.6.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:71988a76fcb68cc091e901fddbcac0f9ad9a475da222c47d3cf8db0876cb5344"}, {file = "regex-2023.12.25-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ba99d8077424501b9616b43a2d208095746fb1284fc5ba490139651f971d39d9"},
{file = "regex-2022.6.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:4b8838f70be3ce9e706df9d72f88a0aa7d4c1fea61488e06fdf292ccb70ad2be"}, {file = "regex-2023.12.25-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:4bfc2b16e3ba8850e0e262467275dd4d62f0d045e0e9eda2bc65078c0110a11f"},
{file = "regex-2022.6.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:663dca677bd3d2e2b5b7d0329e9f24247e6f38f3b740dd9a778a8ef41a76af41"}, {file = "regex-2023.12.25-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8c2c19dae8a3eb0ea45a8448356ed561be843b13cbc34b840922ddf565498c1c"},
{file = "regex-2022.6.2-cp37-cp37m-win32.whl", hash = "sha256:24963f0b13cc63db336d8da2a533986419890d128c551baacd934c249d51a779"}, {file = "regex-2023.12.25-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:60080bb3d8617d96f0fb7e19796384cc2467447ef1c491694850ebd3670bc457"},
{file = "regex-2022.6.2-cp37-cp37m-win_amd64.whl", hash = "sha256:ceff75127f828dfe7ceb17b94113ec2df4df274c4cd5533bb299cb099a18a8ca"}, {file = "regex-2023.12.25-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b77e27b79448e34c2c51c09836033056a0547aa360c45eeeb67803da7b0eedaf"},
{file = "regex-2022.6.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1a6f2698cfa8340dfe4c0597782776b393ba2274fe4c079900c7c74f68752705"}, {file = "regex-2023.12.25-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:518440c991f514331f4850a63560321f833979d145d7d81186dbe2f19e27ae3d"},
{file = "regex-2022.6.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:a8a08ace913c4101f0dc0be605c108a3761842efd5f41a3005565ee5d169fb2b"}, {file = "regex-2023.12.25-cp312-cp312-win32.whl", hash = "sha256:e2610e9406d3b0073636a3a2e80db05a02f0c3169b5632022b4e81c0364bcda5"},
{file = "regex-2022.6.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:26dbe90b724efef7820c3cf4a0e5be7f130149f3d2762782e4e8ac2aea284a0b"}, {file = "regex-2023.12.25-cp312-cp312-win_amd64.whl", hash = "sha256:cc37b9aeebab425f11f27e5e9e6cf580be7206c6582a64467a14dda211abc232"},
{file = "regex-2022.6.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b5f759a1726b995dc896e86f17f9c0582b54eb4ead00ed5ef0b5b22260eaf2d0"}, {file = "regex-2023.12.25-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:da695d75ac97cb1cd725adac136d25ca687da4536154cdc2815f576e4da11c69"},
{file = "regex-2022.6.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1fc26bb3415e7aa7495c000a2c13bf08ce037775db98c1a3fac9ff04478b6930"}, {file = "regex-2023.12.25-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d126361607b33c4eb7b36debc173bf25d7805847346dd4d99b5499e1fef52bc7"},
{file = "regex-2022.6.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:52684da32d9003367dc1a1c07e059b9bbaf135ad0764cd47d8ac3dba2df109bc"}, {file = "regex-2023.12.25-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4719bb05094d7d8563a450cf8738d2e1061420f79cfcc1fa7f0a44744c4d8f73"},
{file = "regex-2022.6.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1c1264eb40a71cf2bff43d6694ab7254438ca19ef330175060262b3c8dd3931a"}, {file = "regex-2023.12.25-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5dd58946bce44b53b06d94aa95560d0b243eb2fe64227cba50017a8d8b3cd3e2"},
{file = "regex-2022.6.2-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:bc635ab319c9b515236bdf327530acda99be995f9d3b9f148ab1f60b2431e970"}, {file = "regex-2023.12.25-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:22a86d9fff2009302c440b9d799ef2fe322416d2d58fc124b926aa89365ec482"},
{file = "regex-2022.6.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:27624b490b5d8880f25dac67e1e2ea93dfef5300b98c6755f585799230d6c746"}, {file = "regex-2023.12.25-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2aae8101919e8aa05ecfe6322b278f41ce2994c4a430303c4cd163fef746e04f"},
{file = "regex-2022.6.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:555f7596fd1f123f8c3a67974c01d6ef80b9769e04d660d6c1a7cc3e6cff7069"}, {file = "regex-2023.12.25-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:e692296c4cc2873967771345a876bcfc1c547e8dd695c6b89342488b0ea55cd8"},
{file = "regex-2022.6.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:933e72fbe1829cbd59da2bc51ccd73d73162f087f88521a87a8ec9cb0cf10fa8"}, {file = "regex-2023.12.25-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:263ef5cc10979837f243950637fffb06e8daed7f1ac1e39d5910fd29929e489a"},
{file = "regex-2022.6.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:cff5c87e941292c97d11dc81bd20679f56a2830f0f0e32f75b8ed6e0eb40f704"}, {file = "regex-2023.12.25-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:d6f7e255e5fa94642a0724e35406e6cb7001c09d476ab5fce002f652b36d0c39"},
{file = "regex-2022.6.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:c757f3a27b6345de13ef3ca956aa805d7734ce68023e84d0fc74e1f09ce66f7a"}, {file = "regex-2023.12.25-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:88ad44e220e22b63b0f8f81f007e8abbb92874d8ced66f32571ef8beb0643b2b"},
{file = "regex-2022.6.2-cp38-cp38-win32.whl", hash = "sha256:a58d21dd1a2d6b50ed091554ff85e448fce3fe33a4db8b55d0eba2ca957ed626"}, {file = "regex-2023.12.25-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:3a17d3ede18f9cedcbe23d2daa8a2cd6f59fe2bf082c567e43083bba3fb00347"},
{file = "regex-2022.6.2-cp38-cp38-win_amd64.whl", hash = "sha256:495a4165172848503303ed05c9d0409428f789acc27050fe2cf0a4549188a7d5"}, {file = "regex-2023.12.25-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:d15b274f9e15b1a0b7a45d2ac86d1f634d983ca40d6b886721626c47a400bf39"},
{file = "regex-2022.6.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:1ab5cf7d09515548044e69d3a0ec77c63d7b9dfff4afc19653f638b992573126"}, {file = "regex-2023.12.25-cp37-cp37m-win32.whl", hash = "sha256:ed19b3a05ae0c97dd8f75a5d8f21f7723a8c33bbc555da6bbe1f96c470139d3c"},
{file = "regex-2022.6.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:c1ea28f0ee6cbe4c0367c939b015d915aa9875f6e061ba1cf0796ca9a3010570"}, {file = "regex-2023.12.25-cp37-cp37m-win_amd64.whl", hash = "sha256:a6d1047952c0b8104a1d371f88f4ab62e6275567d4458c1e26e9627ad489b445"},
{file = "regex-2022.6.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3de1ecf26ce85521bf73897828b6d0687cc6cf271fb6ff32ac63d26b21f5e764"}, {file = "regex-2023.12.25-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:b43523d7bc2abd757119dbfb38af91b5735eea45537ec6ec3a5ec3f9562a1c53"},
{file = "regex-2022.6.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:fa7c7044aabdad2329974be2246babcc21d3ede852b3971a90fd8c2056c20360"}, {file = "regex-2023.12.25-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:efb2d82f33b2212898f1659fb1c2e9ac30493ac41e4d53123da374c3b5541e64"},
{file = "regex-2022.6.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:53d69d77e9cfe468b000314dd656be85bb9e96de088a64f75fe128dfe1bf30dd"}, {file = "regex-2023.12.25-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:b7fca9205b59c1a3d5031f7e64ed627a1074730a51c2a80e97653e3e9fa0d415"},
{file = "regex-2022.6.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5c8d61883a38b1289fba9944a19a361875b5c0170b83cdcc95ea180247c1b7d3"}, {file = "regex-2023.12.25-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:086dd15e9435b393ae06f96ab69ab2d333f5d65cbe65ca5a3ef0ec9564dfe770"},
{file = "regex-2022.6.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c5429202bef174a3760690d912e3a80060b323199a61cef6c6c29b30ce09fd17"}, {file = "regex-2023.12.25-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e81469f7d01efed9b53740aedd26085f20d49da65f9c1f41e822a33992cb1590"},
{file = "regex-2022.6.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:e85b10280cf1e334a7c95629f6cbbfe30b815a4ea5f1e28d31f79eb92c2c3d93"}, {file = "regex-2023.12.25-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:34e4af5b27232f68042aa40a91c3b9bb4da0eeb31b7632e0091afc4310afe6cb"},
{file = "regex-2022.6.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:c400dfed4137f32127ea4063447006d7153c974c680bf0fb1b724cce9f8567fc"}, {file = "regex-2023.12.25-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9852b76ab558e45b20bf1893b59af64a28bd3820b0c2efc80e0a70a4a3ea51c1"},
{file = "regex-2022.6.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7f648037c503985aed39f85088acab6f1eb6a0482d7c6c665a5712c9ad9eaefc"}, {file = "regex-2023.12.25-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ff100b203092af77d1a5a7abe085b3506b7eaaf9abf65b73b7d6905b6cb76988"},
{file = "regex-2022.6.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:e7b2ff451f6c305b516281ec45425dd423223c8063218c5310d6f72a0a7a517c"}, {file = "regex-2023.12.25-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:cc038b2d8b1470364b1888a98fd22d616fba2b6309c5b5f181ad4483e0017861"},
{file = "regex-2022.6.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:be456b4313a86be41706319c397c09d9fdd2e5cdfde208292a277b867e99e3d1"}, {file = "regex-2023.12.25-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:094ba386bb5c01e54e14434d4caabf6583334090865b23ef58e0424a6286d3dc"},
{file = "regex-2022.6.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:c3db393b21b53d7e1d3f881b64c29d886cbfdd3df007e31de68b329edbab7d02"}, {file = "regex-2023.12.25-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:5cd05d0f57846d8ba4b71d9c00f6f37d6b97d5e5ef8b3c3840426a475c8f70f4"},
{file = "regex-2022.6.2-cp39-cp39-win32.whl", hash = "sha256:d70596f20a03cb5f935d6e4aad9170a490d88fc4633679bf00c652e9def4619e"}, {file = "regex-2023.12.25-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:9aa1a67bbf0f957bbe096375887b2505f5d8ae16bf04488e8b0f334c36e31360"},
{file = "regex-2022.6.2-cp39-cp39-win_amd64.whl", hash = "sha256:3b9b6289e03dbe6a6096880d8ac166cb23c38b4896ad235edee789d4e8697152"}, {file = "regex-2023.12.25-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:98a2636994f943b871786c9e82bfe7883ecdaba2ef5df54e1450fa9869d1f756"},
{file = "regex-2022.6.2.tar.gz", hash = "sha256:f7b43acb2c46fb2cd506965b2d9cf4c5e64c9c612bac26c1187933c7296bf08c"}, {file = "regex-2023.12.25-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:37f8e93a81fc5e5bd8db7e10e62dc64261bcd88f8d7e6640aaebe9bc180d9ce2"},
{file = "regex-2023.12.25-cp38-cp38-win32.whl", hash = "sha256:d78bd484930c1da2b9679290a41cdb25cc127d783768a0369d6b449e72f88beb"},
{file = "regex-2023.12.25-cp38-cp38-win_amd64.whl", hash = "sha256:b521dcecebc5b978b447f0f69b5b7f3840eac454862270406a39837ffae4e697"},
{file = "regex-2023.12.25-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:f7bc09bc9c29ebead055bcba136a67378f03d66bf359e87d0f7c759d6d4ffa31"},
{file = "regex-2023.12.25-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:e14b73607d6231f3cc4622809c196b540a6a44e903bcfad940779c80dffa7be7"},
{file = "regex-2023.12.25-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:9eda5f7a50141291beda3edd00abc2d4a5b16c29c92daf8d5bd76934150f3edc"},
{file = "regex-2023.12.25-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cc6bb9aa69aacf0f6032c307da718f61a40cf970849e471254e0e91c56ffca95"},
{file = "regex-2023.12.25-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:298dc6354d414bc921581be85695d18912bea163a8b23cac9a2562bbcd5088b1"},
{file = "regex-2023.12.25-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2f4e475a80ecbd15896a976aa0b386c5525d0ed34d5c600b6d3ebac0a67c7ddf"},
{file = "regex-2023.12.25-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:531ac6cf22b53e0696f8e1d56ce2396311254eb806111ddd3922c9d937151dae"},
{file = "regex-2023.12.25-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:22f3470f7524b6da61e2020672df2f3063676aff444db1daa283c2ea4ed259d6"},
{file = "regex-2023.12.25-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:89723d2112697feaa320c9d351e5f5e7b841e83f8b143dba8e2d2b5f04e10923"},
{file = "regex-2023.12.25-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:0ecf44ddf9171cd7566ef1768047f6e66975788258b1c6c6ca78098b95cf9a3d"},
{file = "regex-2023.12.25-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:905466ad1702ed4acfd67a902af50b8db1feeb9781436372261808df7a2a7bca"},
{file = "regex-2023.12.25-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:4558410b7a5607a645e9804a3e9dd509af12fb72b9825b13791a37cd417d73a5"},
{file = "regex-2023.12.25-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:7e316026cc1095f2a3e8cc012822c99f413b702eaa2ca5408a513609488cb62f"},
{file = "regex-2023.12.25-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:3b1de218d5375cd6ac4b5493e0b9f3df2be331e86520f23382f216c137913d20"},
{file = "regex-2023.12.25-cp39-cp39-win32.whl", hash = "sha256:11a963f8e25ab5c61348d090bf1b07f1953929c13bd2309a0662e9ff680763c9"},
{file = "regex-2023.12.25-cp39-cp39-win_amd64.whl", hash = "sha256:e693e233ac92ba83a87024e1d32b5f9ab15ca55ddd916d878146f4e3406b5c91"},
{file = "regex-2023.12.25.tar.gz", hash = "sha256:29171aa128da69afdf4bde412d5bedc335f2ca8fcfe4489038577d05f16181e5"},
] ]
[[package]] [[package]]
@ -2857,50 +2901,72 @@ files = [
[[package]] [[package]]
name = "s3transfer" name = "s3transfer"
version = "0.3.3" version = "0.10.3"
description = "An Amazon S3 Transfer Manager" description = "An Amazon S3 Transfer Manager"
optional = false optional = false
python-versions = "*" python-versions = ">=3.8"
files = [ files = [
{file = "s3transfer-0.3.3-py2.py3-none-any.whl", hash = "sha256:2482b4259524933a022d59da830f51bd746db62f047d6eb213f2f8855dcb8a13"}, {file = "s3transfer-0.10.3-py3-none-any.whl", hash = "sha256:263ed587a5803c6c708d3ce44dc4dfedaab4c1a32e8329bab818933d79ddcf5d"},
{file = "s3transfer-0.3.3.tar.gz", hash = "sha256:921a37e2aefc64145e7b73d50c71bb4f26f46e4c9f414dc648c6245ff92cf7db"}, {file = "s3transfer-0.10.3.tar.gz", hash = "sha256:4f50ed74ab84d474ce614475e0b8d5047ff080810aac5d01ea25231cfc944b0c"},
] ]
[package.dependencies] [package.dependencies]
botocore = ">=1.12.36,<2.0a.0" botocore = ">=1.33.2,<2.0a.0"
[package.extras]
crt = ["botocore[crt] (>=1.33.2,<2.0a.0)"]
[[package]] [[package]]
name = "sentry-sdk" name = "sentry-sdk"
version = "1.5.11" version = "2.16.0"
description = "Python client for Sentry (https://sentry.io)" description = "Python client for Sentry (https://sentry.io)"
optional = false optional = false
python-versions = "*" python-versions = ">=3.6"
files = [ files = [
{file = "sentry-sdk-1.5.11.tar.gz", hash = "sha256:6c01d9d0b65935fd275adc120194737d1df317dce811e642cbf0394d0d37a007"}, {file = "sentry_sdk-2.16.0-py2.py3-none-any.whl", hash = "sha256:49139c31ebcd398f4f6396b18910610a0c1602f6e67083240c33019d1f6aa30c"},
{file = "sentry_sdk-1.5.11-py2.py3-none-any.whl", hash = "sha256:c17179183cac614e900cbd048dab03f49a48e2820182ec686c25e7ce46f8548f"}, {file = "sentry_sdk-2.16.0.tar.gz", hash = "sha256:90f733b32e15dfc1999e6b7aca67a38688a567329de4d6e184154a73f96c6892"},
] ]
[package.dependencies] [package.dependencies]
certifi = "*" certifi = "*"
urllib3 = ">=1.10.0" urllib3 = ">=1.26.11"
[package.extras] [package.extras]
aiohttp = ["aiohttp (>=3.5)"] aiohttp = ["aiohttp (>=3.5)"]
anthropic = ["anthropic (>=0.16)"]
arq = ["arq (>=0.23)"]
asyncpg = ["asyncpg (>=0.23)"]
beam = ["apache-beam (>=2.12)"] beam = ["apache-beam (>=2.12)"]
bottle = ["bottle (>=0.12.13)"] bottle = ["bottle (>=0.12.13)"]
celery = ["celery (>=3)"] celery = ["celery (>=3)"]
celery-redbeat = ["celery-redbeat (>=2)"]
chalice = ["chalice (>=1.16.0)"] chalice = ["chalice (>=1.16.0)"]
clickhouse-driver = ["clickhouse-driver (>=0.2.0)"]
django = ["django (>=1.8)"] django = ["django (>=1.8)"]
falcon = ["falcon (>=1.4)"] falcon = ["falcon (>=1.4)"]
flask = ["blinker (>=1.1)", "flask (>=0.11)"] fastapi = ["fastapi (>=0.79.0)"]
flask = ["blinker (>=1.1)", "flask (>=0.11)", "markupsafe"]
grpcio = ["grpcio (>=1.21.1)", "protobuf (>=3.8.0)"]
http2 = ["httpcore[http2] (==1.*)"]
httpx = ["httpx (>=0.16.0)"] httpx = ["httpx (>=0.16.0)"]
huey = ["huey (>=2)"]
huggingface-hub = ["huggingface-hub (>=0.22)"]
langchain = ["langchain (>=0.0.210)"]
litestar = ["litestar (>=2.0.0)"]
loguru = ["loguru (>=0.5)"]
openai = ["openai (>=1.0.0)", "tiktoken (>=0.3.0)"]
opentelemetry = ["opentelemetry-distro (>=0.35b0)"]
opentelemetry-experimental = ["opentelemetry-distro"]
pure-eval = ["asttokens", "executing", "pure-eval"] pure-eval = ["asttokens", "executing", "pure-eval"]
pymongo = ["pymongo (>=3.1)"]
pyspark = ["pyspark (>=2.4.4)"] pyspark = ["pyspark (>=2.4.4)"]
quart = ["blinker (>=1.1)", "quart (>=0.16.1)"] quart = ["blinker (>=1.1)", "quart (>=0.16.1)"]
rq = ["rq (>=0.6)"] rq = ["rq (>=0.6)"]
sanic = ["sanic (>=0.8)"] sanic = ["sanic (>=0.8)"]
sqlalchemy = ["sqlalchemy (>=1.2)"] sqlalchemy = ["sqlalchemy (>=1.2)"]
tornado = ["tornado (>=5)"] starlette = ["starlette (>=0.19.1)"]
starlite = ["starlite (>=1.48)"]
tornado = ["tornado (>=6)"]
[[package]] [[package]]
name = "setuptools" name = "setuptools"
@ -3130,17 +3196,6 @@ idna = "*"
requests = ">=2.1.0" requests = ">=2.1.0"
requests-file = ">=1.4" requests-file = ">=1.4"
[[package]]
name = "toml"
version = "0.10.1"
description = "Python Library for Tom's Obvious, Minimal Language"
optional = false
python-versions = "*"
files = [
{file = "toml-0.10.1-py2.py3-none-any.whl", hash = "sha256:bda89d5935c2eac546d648028b9901107a595863cb36bae0c73ac804a9b4ce88"},
{file = "toml-0.10.1.tar.gz", hash = "sha256:926b612be1e5ce0634a2ca03470f95169cf16f939018233a670519cb4ac58b0f"},
]
[[package]] [[package]]
name = "tomli" name = "tomli"
version = "2.0.1" version = "2.0.1"
@ -3272,41 +3327,39 @@ files = [
[[package]] [[package]]
name = "urllib3" name = "urllib3"
version = "1.25.10" version = "1.26.20"
description = "HTTP library with thread-safe connection pooling, file post, and more." description = "HTTP library with thread-safe connection pooling, file post, and more."
optional = false optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, <4" python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
files = [ files = [
{file = "urllib3-1.25.10-py2.py3-none-any.whl", hash = "sha256:e7983572181f5e1522d9c98453462384ee92a0be7fac5f1413a1e35c56cc0461"}, {file = "urllib3-1.26.20-py2.py3-none-any.whl", hash = "sha256:0ed14ccfbf1c30a9072c7ca157e4319b70d65f623e91e7b32fadb2853431016e"},
{file = "urllib3-1.25.10.tar.gz", hash = "sha256:91056c15fa70756691db97756772bb1eb9678fa585d9184f24534b100dc60f4a"}, {file = "urllib3-1.26.20.tar.gz", hash = "sha256:40c2dc0c681e47eb8f90e7e27bf6ff7df2e677421fd46756da1161c39ca70d32"},
] ]
[package.extras] [package.extras]
brotli = ["brotlipy (>=0.6.0)"] brotli = ["brotli (==1.0.9)", "brotli (>=1.0.9)", "brotlicffi (>=0.8.0)", "brotlipy (>=0.6.0)"]
secure = ["certifi", "cryptography (>=1.3.4)", "idna (>=2.0.0)", "ipaddress", "pyOpenSSL (>=0.14)"] secure = ["certifi", "cryptography (>=1.3.4)", "idna (>=2.0.0)", "ipaddress", "pyOpenSSL (>=0.14)", "urllib3-secure-extra"]
socks = ["PySocks (>=1.5.6,!=1.5.7,<2.0)"] socks = ["PySocks (>=1.5.6,!=1.5.7,<2.0)"]
[[package]] [[package]]
name = "virtualenv" name = "virtualenv"
version = "20.8.1" version = "20.21.1"
description = "Virtual Python Environment builder" description = "Virtual Python Environment builder"
optional = false optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,>=2.7" python-versions = ">=3.7"
files = [ files = [
{file = "virtualenv-20.8.1-py2.py3-none-any.whl", hash = "sha256:10062e34c204b5e4ec5f62e6ef2473f8ba76513a9a617e873f1f8fb4a519d300"}, {file = "virtualenv-20.21.1-py3-none-any.whl", hash = "sha256:09ddbe1af0c8ed2bb4d6ed226b9e6415718ad18aef9fa0ba023d96b7a8356049"},
{file = "virtualenv-20.8.1.tar.gz", hash = "sha256:bcc17f0b3a29670dd777d6f0755a4c04f28815395bca279cdcb213b97199a6b8"}, {file = "virtualenv-20.21.1.tar.gz", hash = "sha256:4c104ccde994f8b108163cf9ba58f3d11511d9403de87fb9b4f52bf33dbc8668"},
] ]
[package.dependencies] [package.dependencies]
"backports.entry-points-selectable" = ">=1.0.4" distlib = ">=0.3.6,<1"
distlib = ">=0.3.1,<1" filelock = ">=3.4.1,<4"
filelock = ">=3.0.0,<4" platformdirs = ">=2.4,<4"
platformdirs = ">=2,<3"
six = ">=1.9.0,<2"
[package.extras] [package.extras]
docs = ["proselint (>=0.10.2)", "sphinx (>=3)", "sphinx-argparse (>=0.2.5)", "sphinx-rtd-theme (>=0.4.3)", "towncrier (>=19.9.0rc1)"] docs = ["furo (>=2023.3.27)", "proselint (>=0.13)", "sphinx (>=6.1.3)", "sphinx-argparse (>=0.4)", "sphinxcontrib-towncrier (>=0.2.1a0)", "towncrier (>=22.12)"]
testing = ["coverage (>=4)", "coverage-enable-subprocess (>=1)", "flaky (>=3)", "packaging (>=20.0)", "pytest (>=4)", "pytest-env (>=0.6.2)", "pytest-freezegun (>=0.4.1)", "pytest-mock (>=2)", "pytest-randomly (>=1)", "pytest-timeout (>=1)"] test = ["covdefaults (>=2.3)", "coverage (>=7.2.3)", "coverage-enable-subprocess (>=1)", "flaky (>=3.7)", "packaging (>=23.1)", "pytest (>=7.3.1)", "pytest-env (>=0.8.1)", "pytest-freezegun (>=0.4.2)", "pytest-mock (>=3.10)", "pytest-randomly (>=3.12)", "pytest-timeout (>=2.1)"]
[[package]] [[package]]
name = "watchtower" name = "watchtower"
@ -3605,21 +3658,6 @@ files = [
idna = ">=2.0" idna = ">=2.0"
multidict = ">=4.0" multidict = ">=4.0"
[[package]]
name = "zipp"
version = "3.2.0"
description = "Backport of pathlib-compatible object wrapper for zip files"
optional = false
python-versions = ">=3.6"
files = [
{file = "zipp-3.2.0-py3-none-any.whl", hash = "sha256:43f4fa8d8bb313e65d8323a3952ef8756bf40f9a5c3ea7334be23ee4ec8278b6"},
{file = "zipp-3.2.0.tar.gz", hash = "sha256:b52f22895f4cfce194bc8172f3819ee8de7540aa6d873535a8668b730b8b411f"},
]
[package.extras]
docs = ["jaraco.packaging (>=3.2)", "rst.linker (>=1.9)", "sphinx"]
testing = ["func-timeout", "jaraco.itertools", "jaraco.test (>=3.2.0)", "pytest (>=3.5,!=3.7.3)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=1.2.3)", "pytest-cov", "pytest-flake8", "pytest-mypy"]
[[package]] [[package]]
name = "zope.event" name = "zope.event"
version = "4.5.0" version = "4.5.0"
@ -3698,4 +3736,4 @@ testing = ["coverage (>=5.0.3)", "zope.event", "zope.testing"]
[metadata] [metadata]
lock-version = "2.0" lock-version = "2.0"
python-versions = "^3.10" python-versions = "^3.10"
content-hash = "01afc410d21eeac0a0ac7e8ef6eeb0a991cf4bc091c3351049263462e205ff63" content-hash = "314f199bd50ccbf636ce1c6c753f8c79a1f5a16aa7c1a330a2ec514a13dbad2d"

View File

@ -10,21 +10,23 @@ message UserDeleted {
} }
message AliasCreated { message AliasCreated {
uint32 alias_id = 1; uint32 id = 1;
string alias_email = 2; string email = 2;
string alias_note = 3; string note = 3;
bool enabled = 4; bool enabled = 4;
uint32 created_at = 5;
} }
message AliasStatusChanged { message AliasStatusChanged {
uint32 alias_id = 1; uint32 id = 1;
string alias_email = 2; string email = 2;
bool enabled = 3; bool enabled = 3;
uint32 created_at = 4;
} }
message AliasDeleted { message AliasDeleted {
uint32 alias_id = 1; uint32 id = 1;
string alias_email = 2; string email = 2;
} }
message AliasCreatedList { message AliasCreatedList {
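Note on the renamed event fields above: alias_id, alias_email and alias_note become id, email and note but keep their field numbers, and created_at is only added, so the protobuf wire format stays compatible; only the generated attribute names change. A minimal sketch of constructing the updated AliasCreated message, assuming regenerated bindings (the module path and the values are illustrative, not taken from this diff):
# illustrative only: module path and field values are assumptions
from app.events.generated import event_pb2

evt = event_pb2.AliasCreated(
    id=42,                      # was alias_id
    email="hello@example.com",  # was alias_email
    note="",                    # was alias_note
    enabled=True,
    created_at=1729000000,      # newly added field
)
payload = evt.SerializeToString()  # the field renames do not change the serialized bytes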

View File

@ -69,7 +69,7 @@ python-dotenv = "^0.14.0"
ipython = "^7.31.1" ipython = "^7.31.1"
sqlalchemy_utils = "^0.36.8" sqlalchemy_utils = "^0.36.8"
psycopg2-binary = "^2.9.3" psycopg2-binary = "^2.9.3"
sentry_sdk = "^1.5.11" sentry_sdk = "^2.16.0"
blinker = "^1.4" blinker = "^1.4"
arrow = "^0.16.0" arrow = "^0.16.0"
Flask-WTF = "^0.14.3" Flask-WTF = "^0.14.3"
@ -121,13 +121,13 @@ aiospamc = "0.10"
[tool.poetry.dev-dependencies] [tool.poetry.dev-dependencies]
pytest = "^7.0.0" pytest = "^7.0.0"
pytest-cov = "^3.0.0" pytest-cov = "^3.0.0"
pre-commit = "^2.17.0"
black = "^22.1.0" black = "^22.1.0"
djlint = "^1.3.0" djlint = "^1.3.0"
pylint = "^2.14.4" pylint = "^2.14.4"
[tool.poetry.group.dev.dependencies] [tool.poetry.group.dev.dependencies]
ruff = "^0.1.5" ruff = "^0.1.5"
pre-commit = "^3.8.0"
[build-system] [build-system]
requires = ["poetry>=0.12"] requires = ["poetry>=0.12"]

View File

@ -12,10 +12,10 @@ docker run -p 25432:5432 --name ${container_name} -e POSTGRES_PASSWORD=postgres
sleep 3 sleep 3
# upgrade the DB to the latest stage and # upgrade the DB to the latest stage and
env DB_URI=postgresql://postgres:postgres@127.0.0.1:25432/sl rye run alembic upgrade head env DB_URI=postgresql://postgres:postgres@127.0.0.1:25432/sl poetry run alembic upgrade head
# generate the migration script. # generate the migration script.
env DB_URI=postgresql://postgres:postgres@127.0.0.1:25432/sl rye run alembic revision --autogenerate $@ env DB_URI=postgresql://postgres:postgres@127.0.0.1:25432/sl poetry run alembic revision --autogenerate $@
# remove the db # remove the db
docker rm -f ${container_name} docker rm -f ${container_name}

View File

@ -1,4 +1,3 @@
import json
import os import os
import time import time
from datetime import timedelta from datetime import timedelta
@ -7,10 +6,9 @@ import arrow
import click import click
import flask_limiter import flask_limiter
import flask_profiler import flask_profiler
import newrelic.agent
import sentry_sdk import sentry_sdk
from coinbase_commerce.error import WebhookInvalidPayload, SignatureVerificationError
from coinbase_commerce.webhook import Webhook
from dateutil.relativedelta import relativedelta
from flask import ( from flask import (
Flask, Flask,
redirect, redirect,
@ -29,7 +27,7 @@ from sentry_sdk.integrations.flask import FlaskIntegration
from sentry_sdk.integrations.sqlalchemy import SqlalchemyIntegration from sentry_sdk.integrations.sqlalchemy import SqlalchemyIntegration
from werkzeug.middleware.proxy_fix import ProxyFix from werkzeug.middleware.proxy_fix import ProxyFix
from app import paddle_utils, config, paddle_callback, constants from app import config, constants
from app.admin_model import ( from app.admin_model import (
SLAdminIndexView, SLAdminIndexView,
UserAdmin, UserAdmin,
@ -55,7 +53,6 @@ from app.config import (
FLASK_SECRET, FLASK_SECRET,
SENTRY_DSN, SENTRY_DSN,
URL, URL,
PADDLE_MONTHLY_PRODUCT_ID,
FLASK_PROFILER_PATH, FLASK_PROFILER_PATH,
FLASK_PROFILER_PASSWORD, FLASK_PROFILER_PASSWORD,
SENTRY_FRONT_END_DSN, SENTRY_FRONT_END_DSN,
@ -69,22 +66,16 @@ from app.config import (
LANDING_PAGE_URL, LANDING_PAGE_URL,
STATUS_PAGE_URL, STATUS_PAGE_URL,
SUPPORT_EMAIL, SUPPORT_EMAIL,
PADDLE_MONTHLY_PRODUCT_IDS,
PADDLE_YEARLY_PRODUCT_IDS,
PGP_SIGNER, PGP_SIGNER,
COINBASE_WEBHOOK_SECRET,
PAGE_LIMIT, PAGE_LIMIT,
PADDLE_COUPON_ID,
ZENDESK_ENABLED, ZENDESK_ENABLED,
MAX_NB_EMAIL_FREE_PLAN, MAX_NB_EMAIL_FREE_PLAN,
MEM_STORE_URI, MEM_STORE_URI,
) )
from app.dashboard.base import dashboard_bp from app.dashboard.base import dashboard_bp
from app.subscription_webhook import execute_subscription_webhook
from app.db import Session from app.db import Session
from app.developer.base import developer_bp from app.developer.base import developer_bp
from app.discover.base import discover_bp from app.discover.base import discover_bp
from app.email_utils import send_email, render
from app.extensions import login_manager, limiter from app.extensions import login_manager, limiter
from app.fake_data import fake_data from app.fake_data import fake_data
from app.internal.base import internal_bp from app.internal.base import internal_bp
@ -93,11 +84,8 @@ from app.log import LOG
from app.models import ( from app.models import (
User, User,
Alias, Alias,
Subscription,
PlanEnum,
CustomDomain, CustomDomain,
Mailbox, Mailbox,
CoinbaseSubscription,
EmailLog, EmailLog,
Contact, Contact,
ManualSubscription, ManualSubscription,
@ -114,9 +102,11 @@ from app.monitor.base import monitor_bp
from app.newsletter_utils import send_newsletter_to_user from app.newsletter_utils import send_newsletter_to_user
from app.oauth.base import oauth_bp from app.oauth.base import oauth_bp
from app.onboarding.base import onboarding_bp from app.onboarding.base import onboarding_bp
from app.payments.coinbase import setup_coinbase_commerce
from app.payments.paddle import setup_paddle_callback
from app.phone.base import phone_bp from app.phone.base import phone_bp
from app.redis_services import initialize_redis_services from app.redis_services import initialize_redis_services
from app.utils import random_string from app.sentry_utils import sentry_before_send
if SENTRY_DSN: if SENTRY_DSN:
LOG.d("enable sentry") LOG.d("enable sentry")
@ -127,6 +117,7 @@ if SENTRY_DSN:
FlaskIntegration(), FlaskIntegration(),
SqlalchemyIntegration(), SqlalchemyIntegration(),
], ],
before_send=sentry_before_send,
) )
# the app is served behind nginx which uses http and not https # the app is served behind nginx which uses http and not https
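The sentry_sdk.init call above now passes a before_send hook imported from app.sentry_utils, a module whose contents are not part of this compare view. Purely as an illustration of the sentry_sdk API involved, a before_send callback receives each event plus a hint and can drop the event by returning None; the module body and the filter condition below are assumptions, not the project's actual code:
# illustrative sketch only -- not the real app/sentry_utils.py
def sentry_before_send(event, hint):
    # Hypothetical filter: drop events from an overly noisy logger,
    # keep everything else unchanged.
    if event.get("logger") == "some.noisy.logger":
        return None
    return event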
@ -299,7 +290,9 @@ def set_index_page(app):
res.status_code, res.status_code,
time.time() - start_time, time.time() - start_time,
) )
newrelic.agent.record_custom_event(
"HttpResponseStatus", {"code": res.status_code}
)
return res return res
@ -441,341 +434,6 @@ def jinja2_filter(app):
) )
def setup_paddle_callback(app: Flask):
@app.route("/paddle", methods=["GET", "POST"])
def paddle():
LOG.d(f"paddle callback {request.form.get('alert_name')} {request.form}")
# make sure the request comes from Paddle
if not paddle_utils.verify_incoming_request(dict(request.form)):
LOG.e("request not coming from paddle. Request data:%s", dict(request.form))
return "KO", 400
if (
request.form.get("alert_name") == "subscription_created"
): # new user subscribes
# the passthrough is json encoded, e.g.
# request.form.get("passthrough") = '{"user_id": 88 }'
passthrough = json.loads(request.form.get("passthrough"))
user_id = passthrough.get("user_id")
user = User.get(user_id)
subscription_plan_id = int(request.form.get("subscription_plan_id"))
if subscription_plan_id in PADDLE_MONTHLY_PRODUCT_IDS:
plan = PlanEnum.monthly
elif subscription_plan_id in PADDLE_YEARLY_PRODUCT_IDS:
plan = PlanEnum.yearly
else:
LOG.e(
"Unknown subscription_plan_id %s %s",
subscription_plan_id,
request.form,
)
return "No such subscription", 400
sub = Subscription.get_by(user_id=user.id)
if not sub:
LOG.d(f"create a new Subscription for user {user}")
Subscription.create(
user_id=user.id,
cancel_url=request.form.get("cancel_url"),
update_url=request.form.get("update_url"),
subscription_id=request.form.get("subscription_id"),
event_time=arrow.now(),
next_bill_date=arrow.get(
request.form.get("next_bill_date"), "YYYY-MM-DD"
).date(),
plan=plan,
)
else:
LOG.d(f"Update an existing Subscription for user {user}")
sub.cancel_url = request.form.get("cancel_url")
sub.update_url = request.form.get("update_url")
sub.subscription_id = request.form.get("subscription_id")
sub.event_time = arrow.now()
sub.next_bill_date = arrow.get(
request.form.get("next_bill_date"), "YYYY-MM-DD"
).date()
sub.plan = plan
# make sure to set the new plan as not-cancelled
# in case user cancels a plan and subscribes a new plan
sub.cancelled = False
execute_subscription_webhook(user)
LOG.d("User %s upgrades!", user)
Session.commit()
elif request.form.get("alert_name") == "subscription_payment_succeeded":
subscription_id = request.form.get("subscription_id")
LOG.d("Update subscription %s", subscription_id)
sub: Subscription = Subscription.get_by(subscription_id=subscription_id)
# when user subscribes, the "subscription_payment_succeeded" can arrive BEFORE "subscription_created"
# at that time, subscription object does not exist yet
if sub:
sub.event_time = arrow.now()
sub.next_bill_date = arrow.get(
request.form.get("next_bill_date"), "YYYY-MM-DD"
).date()
Session.commit()
execute_subscription_webhook(sub.user)
elif request.form.get("alert_name") == "subscription_cancelled":
subscription_id = request.form.get("subscription_id")
sub: Subscription = Subscription.get_by(subscription_id=subscription_id)
if sub:
# cancellation_effective_date should be the same as next_bill_date
LOG.w(
"Cancel subscription %s %s on %s, next bill date %s",
subscription_id,
sub.user,
request.form.get("cancellation_effective_date"),
sub.next_bill_date,
)
sub.event_time = arrow.now()
sub.cancelled = True
Session.commit()
user = sub.user
send_email(
user.email,
"SimpleLogin - your subscription is canceled",
render(
"transactional/subscription-cancel.txt",
user=user,
end_date=request.form.get("cancellation_effective_date"),
),
)
execute_subscription_webhook(sub.user)
else:
# user might have deleted their account
LOG.i(f"Cancel non-exist subscription {subscription_id}")
return "OK"
elif request.form.get("alert_name") == "subscription_updated":
subscription_id = request.form.get("subscription_id")
sub: Subscription = Subscription.get_by(subscription_id=subscription_id)
if sub:
next_bill_date = request.form.get("next_bill_date")
if not next_bill_date:
paddle_callback.failed_payment(sub, subscription_id)
return "OK"
LOG.d(
"Update subscription %s %s on %s, next bill date %s",
subscription_id,
sub.user,
request.form.get("cancellation_effective_date"),
sub.next_bill_date,
)
if (
int(request.form.get("subscription_plan_id"))
== PADDLE_MONTHLY_PRODUCT_ID
):
plan = PlanEnum.monthly
else:
plan = PlanEnum.yearly
sub.cancel_url = request.form.get("cancel_url")
sub.update_url = request.form.get("update_url")
sub.event_time = arrow.now()
sub.next_bill_date = arrow.get(
request.form.get("next_bill_date"), "YYYY-MM-DD"
).date()
sub.plan = plan
# make sure to set the new plan as not-cancelled
sub.cancelled = False
Session.commit()
execute_subscription_webhook(sub.user)
else:
LOG.w(
f"update non-exist subscription {subscription_id}. {request.form}"
)
return "No such subscription", 400
elif request.form.get("alert_name") == "payment_refunded":
subscription_id = request.form.get("subscription_id")
LOG.d("Refund request for subscription %s", subscription_id)
sub: Subscription = Subscription.get_by(subscription_id=subscription_id)
if sub:
user = sub.user
Subscription.delete(sub.id)
Session.commit()
LOG.e("%s requests a refund", user)
execute_subscription_webhook(sub.user)
elif request.form.get("alert_name") == "subscription_payment_refunded":
subscription_id = request.form.get("subscription_id")
sub: Subscription = Subscription.get_by(subscription_id=subscription_id)
LOG.d(
"Handle subscription_payment_refunded for subscription %s",
subscription_id,
)
if not sub:
LOG.w(
"No such subscription for %s, payload %s",
subscription_id,
request.form,
)
return "No such subscription"
plan_id = int(request.form["subscription_plan_id"])
if request.form["refund_type"] == "full":
if plan_id in PADDLE_MONTHLY_PRODUCT_IDS:
LOG.d("subtract 1 month from next_bill_date %s", sub.next_bill_date)
sub.next_bill_date = sub.next_bill_date - relativedelta(months=1)
LOG.d("next_bill_date is %s", sub.next_bill_date)
Session.commit()
elif plan_id in PADDLE_YEARLY_PRODUCT_IDS:
LOG.d("subtract 1 year from next_bill_date %s", sub.next_bill_date)
sub.next_bill_date = sub.next_bill_date - relativedelta(years=1)
LOG.d("next_bill_date is %s", sub.next_bill_date)
Session.commit()
else:
LOG.e("Unknown plan_id %s", plan_id)
else:
LOG.w("partial subscription_payment_refunded, not handled")
execute_subscription_webhook(sub.user)
return "OK"
@app.route("/paddle_coupon", methods=["GET", "POST"])
def paddle_coupon():
LOG.d("paddle coupon callback %s", request.form)
if not paddle_utils.verify_incoming_request(dict(request.form)):
LOG.e("request not coming from paddle. Request data:%s", dict(request.form))
return "KO", 400
product_id = request.form.get("p_product_id")
if product_id != PADDLE_COUPON_ID:
LOG.e("product_id %s not match with %s", product_id, PADDLE_COUPON_ID)
return "KO", 400
email = request.form.get("email")
LOG.d("Paddle coupon request for %s", email)
coupon = Coupon.create(
code=random_string(30),
comment="For 1-year coupon",
expires_date=arrow.now().shift(years=1, days=-1),
commit=True,
)
return (
f"Your 1-year coupon is <b>{coupon.code}</b> <br> "
f"It's valid until <b>{coupon.expires_date.date().isoformat()}</b>"
)
def setup_coinbase_commerce(app):
@app.route("/coinbase", methods=["POST"])
def coinbase_webhook():
# event payload
request_data = request.data.decode("utf-8")
# webhook signature
request_sig = request.headers.get("X-CC-Webhook-Signature", None)
try:
# signature verification and event object construction
event = Webhook.construct_event(
request_data, request_sig, COINBASE_WEBHOOK_SECRET
)
except (WebhookInvalidPayload, SignatureVerificationError) as e:
LOG.e("Invalid Coinbase webhook")
return str(e), 400
LOG.d("Coinbase event %s", event)
if event["type"] == "charge:confirmed":
if handle_coinbase_event(event):
return "success", 200
else:
return "error", 400
return "success", 200
def handle_coinbase_event(event) -> bool:
server_user_id = event["data"]["metadata"]["user_id"]
try:
user_id = int(server_user_id)
except ValueError:
user_id = int(float(server_user_id))
code = event["data"]["code"]
user = User.get(user_id)
if not user:
LOG.e("User not found %s", user_id)
return False
coinbase_subscription: CoinbaseSubscription = CoinbaseSubscription.get_by(
user_id=user_id
)
if not coinbase_subscription:
LOG.d("Create a coinbase subscription for %s", user)
coinbase_subscription = CoinbaseSubscription.create(
user_id=user_id, end_at=arrow.now().shift(years=1), code=code, commit=True
)
send_email(
user.email,
"Your SimpleLogin account has been upgraded",
render(
"transactional/coinbase/new-subscription.txt",
user=user,
coinbase_subscription=coinbase_subscription,
),
render(
"transactional/coinbase/new-subscription.html",
user=user,
coinbase_subscription=coinbase_subscription,
),
)
else:
if coinbase_subscription.code != code:
LOG.d("Update code from %s to %s", coinbase_subscription.code, code)
coinbase_subscription.code = code
if coinbase_subscription.is_active():
coinbase_subscription.end_at = coinbase_subscription.end_at.shift(years=1)
else: # already expired subscription
coinbase_subscription.end_at = arrow.now().shift(years=1)
Session.commit()
send_email(
user.email,
"Your SimpleLogin account has been extended",
render(
"transactional/coinbase/extend-subscription.txt",
user=user,
coinbase_subscription=coinbase_subscription,
),
render(
"transactional/coinbase/extend-subscription.html",
user=user,
coinbase_subscription=coinbase_subscription,
),
)
execute_subscription_webhook(user)
return True
def init_extensions(app: Flask): def init_extensions(app: Flask):
login_manager.init_app(app) login_manager.init_app(app)

View File

@ -0,0 +1,12 @@
import arrow
from app.db import Session
from app.log import LOG
from app.models import AliasAuditLog
def cleanup_alias_audit_log(oldest_allowed: arrow.Arrow):
LOG.i(f"Deleting alias_audit_log older than {oldest_allowed}")
count = AliasAuditLog.filter(AliasAuditLog.created_at < oldest_allowed).delete()
Session.commit()
LOG.i(f"Deleted {count} alias_audit_log entries")

View File

@ -0,0 +1,12 @@
import arrow
from app.db import Session
from app.log import LOG
from app.models import UserAuditLog
def cleanup_user_audit_log(oldest_allowed: arrow.Arrow):
LOG.i(f"Deleting user_audit_log older than {oldest_allowed}")
count = UserAuditLog.filter(UserAuditLog.created_at < oldest_allowed).delete()
Session.commit()
LOG.i(f"Deleted {count} user_audit_log entries")
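Both new cron helpers take the oldest timestamp that should survive, so the retention window is chosen by the caller. A minimal usage sketch, assuming a 30-day retention and the import paths below (neither is shown in this diff):
import arrow

# module paths and the 30-day window are assumptions
from cron.cleanup_alias_audit_log import cleanup_alias_audit_log
from cron.cleanup_user_audit_log import cleanup_user_audit_log

oldest_allowed = arrow.now().shift(days=-30)
cleanup_alias_audit_log(oldest_allowed)
cleanup_user_audit_log(oldest_allowed)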

View File

@ -2,7 +2,7 @@
<div class="form-group row"> <div class="form-group row">
<label class="col-sm-2 col-form-label">{{ field.label }}</label> <label class="col-sm-2 col-form-label">{{ field.label }}</label>
<div class="col-sm-10"> <div class="col-sm-10">
{{ field(**kwargs)|safe }} {{ field(**kwargs) |safe }}
<small class="form-text text-muted">{{ field.description }}</small> <small class="form-text text-muted">{{ field.description }}</small>
{% if field.errors %} {% if field.errors %}

View File

@ -1,251 +1,272 @@
{% extends 'admin/master.html' %} {% extends 'admin/master.html' %}
{% macro show_user(user) -%} {% macro show_user(user) -%}
<h4>User {{ user.email }} with ID {{ user.id }}.</h4> <h4>User {{ user.email }} with ID {{ user.id }}.</h4>
<table class="table"> {% set pu = helper.partner_user(user) %}
<thead> <table class="table">
<tr> <thead>
<th scope="col">User ID</th>
<th scope="col">Email</th>
<th scope="col">Paid</th>
<th>Subscription</th>
<th>Created At</th>
</tr>
</thead>
<tbody>
<tr>
<td>{{ user.id }}</td>
<td>{{ user.email }}</td>
<td>{{ "yes" if user.is_paid() else No }}</td>
<td>{{ user.get_active_subscription() }}</td>
<td>{{ user.created_at }}</td>
</tr>
</tbody>
</table>
{%- endmacro %}
{% macro list_mailboxes(mbox_count, mboxes) %}
<h4>
{{ mbox_count }} Mailboxes found.
{% if mbox_count>10 %}Showing only the first 10.{% endif %}
</h4>
<table class="table">
<thead>
<tr>
<th>Mailbox ID</th>
<th>Email</th>
<th>Verified</th>
<th>Created At</th>
</tr>
</thead>
<tbody>
{% for mailbox in mboxes %}
<tr> <tr>
<td>{{ mailbox.id }}</td> <th scope="col">User ID</th>
<td>{{ mailbox.email }}</td> <th scope="col">Email</th>
<td>{{ "Yes" if mailbox.verified else "No" }}</td> <th scope="col">Status</th>
<td>{{ mailbox.created_at }}</td> <th scope="col">Paid</th>
<th>Subscription</th>
<th>Created At</th>
<th>Updated At</th>
<th>Connected with Proton account</th>
</tr> </tr>
{% endfor %} </thead>
</tbody> <tbody>
</table> <tr>
<td>{{ user.id }}</td>
<td><a href="?email={{ user.email }}">{{ user.email }}</a></td>
{% if user.disabled %}
<td class="text-danger">Disabled</td>
{% else %}
<td class="text-success">Enabled</td>
{% endif %}
<td>{{ "yes" if user.is_paid() else "No" }}</td>
<td>{{ user.get_active_subscription() }}</td>
<td>{{ user.created_at }}</td>
<td>{{ user.updated_at }}</td>
{% if pu %}
<td><a href="?email={{ pu.partner_email }}">{{ pu.partner_email }}</a></td>
{% else %}
<td>No</td>
{% endif %}
</tr>
</tbody>
</table>
{%- endmacro %}
{% macro list_mailboxes(message, mbox_count, mboxes) %}
<h4>
{{ mbox_count }} {{ message }}.
{% if mbox_count>10 %}Showing only the last 10.{% endif %}
</h4>
<table class="table">
<thead>
<tr>
<th>Mailbox ID</th>
<th>Email</th>
<th>Verified</th>
<th>Created At</th>
</tr>
</thead>
<tbody>
{% for mailbox in mboxes %}
<tr>
<td>{{ mailbox.id }}</td>
<td><a href="?email={{ mailbox.email }}">{{ mailbox.email }}</a></td>
<td>{{ "Yes" if mailbox.verified else "No" }}</td>
<td>
{{ mailbox.created_at }}
</td>
</tr>
{% endfor %}
</tbody>
</table>
{% endmacro %} {% endmacro %}
{% macro list_alias(alias_count, aliases) %} {% macro list_alias(alias_count, aliases) %}
<h4> <h4>
{{ alias_count }} Aliases found. {{ alias_count }} Aliases found.
{% if alias_count>10 %}Showing only the first 10.{% endif %} {% if alias_count>10 %}Showing only the last 10.{% endif %}
</h4> </h4>
<table class="table"> <table class="table">
<thead> <thead>
<tr>
<th>Alias ID</th>
<th>Email</th>
<th>Verified</th>
<th>Created At</th>
</tr>
</thead>
<tbody>
{% for alias in aliases %}
<tr> <tr>
<td>{{ alias.id }}</td> <th>
<td>{{ alias.email }}</td> Alias ID
<td>{{ "Yes" if alias.verified else "No" }}</td> </th>
<td> <th>
{{ alias.created_at }} Email
</td> </th>
<th>
Enabled
</th>
<th>
Created At
</th>
</tr> </tr>
{% endfor %} </thead>
</tbody> <tbody>
</table> {% for alias in aliases %}
<tr>
<td>{{ alias.id }}</td>
<td><a href="?email={{ alias.email }}">{{ alias.email }}</a></td>
<td>{{ "Yes" if alias.enabled else "No" }}</td>
<td>{{ alias.created_at }}</td>
</tr>
{% endfor %}
</tbody>
</table>
{% endmacro %} {% endmacro %}
{% macro show_deleted_alias(deleted_alias) -%} {% macro show_deleted_alias(deleted_alias) -%}
<h4> <h4>Deleted Alias {{ deleted_alias.email }} with ID {{ deleted_alias.id }}.</h4>
Deleted Alias {{ deleted_alias.email }} with ID {{ deleted_alias.id }}. <table class="table">
</h4> <thead>
<table class="table"> <tr>
<thead> <th scope="col">Deleted Alias ID</th>
<tr> <th scope="col">Email</th>
<th scope="col"> <th scope="col">Deleted At</th>
Deleted Alias ID <th scope="col">Reason</th>
</th> </tr>
<th scope="col"> </thead>
Email <tbody>
</th> <tr>
<th scope="col"> <td>{{ deleted_alias.id }}</td>
Deleted At <td>{{ deleted_alias.email }}</td>
</th> <td>{{ deleted_alias.created_at }}</td>
<th scope="col"> <td>{{ deleted_alias.reason }}</td>
Reason </tr>
</th> </tbody>
</tr> </table>
</thead>
<tbody>
<tr>
<td>
{{ deleted_alias.id }}
</td>
<td>
{{ deleted_alias.email }}
</td>
<td>
{{ deleted_alias.created_at }}
</td>
<td>
{{ deleted_alias.reason }}
</td>
</tr>
</tbody>
</table>
{%- endmacro %} {%- endmacro %}
{% macro show_domain_deleted_alias(dom_deleted_alias) -%} {% macro show_domain_deleted_alias(dom_deleted_alias) -%}
<h4> <h4>
Domain Deleted Alias {{ dom_deleted_alias.email }} with ID {{ dom_deleted_alias.id }} for domain {{ dom_deleted_alias.domain.domain }} Domain Deleted Alias {{ dom_deleted_alias.email }} with ID {{ dom_deleted_alias.id }} for
</h4> domain {{ dom_deleted_alias.domain.domain }}
<table class="table"> </h4>
<thead> <table class="table">
<tr> <thead>
<th scope="col"> <tr>
Deleted Alias ID <th scope="col">Deleted Alias ID</th>
</th> <th scope="col">Email</th>
<th scope="col"> <th scope="col">Domain</th>
Email <th scope="col">Domain ID</th>
</th> <th scope="col">Domain owner user ID</th>
<th scope="col"> <th scope="col">Domain owner user email</th>
Domain <th scope="col">Deleted At</th>
</th> </tr>
<th scope="col"> </thead>
Domain ID <tbody>
</th> <tr>
<th scope="col"> <td>{{ dom_deleted_alias.id }}</td>
Domain owner user ID <td>{{ dom_deleted_alias.email }}</td>
</th> <td>{{ dom_deleted_alias.domain.domain }}</td>
<th scope="col"> <td>{{ dom_deleted_alias.domain.id }}</td>
Domain owner user email <td>{{ dom_deleted_alias.domain.user_id }}</td>
</th> <td>{{ dom_deleted_alias.created_at }}</td>
<th scope="col"> </tr>
Deleted At </tbody>
</th> </table>
</tr> {{ show_user(data.domain_deleted_alias.domain.user) }}
</thead>
<tbody>
<tr>
<td>
{{ dom_deleted_alias.id }}
</td>
<td>
{{ dom_deleted_alias.email }}
</td>
<td>
{{ dom_deleted_alias.domain.domain }}
</td>
<td>
{{ dom_deleted_alias.domain.id }}
</td>
<td>
{{ dom_deleted_alias.domain.user_id }}
</td>
<td>
{{ dom_deleted_alias.created_at }}
</td>
</tr>
</tbody>
</table>
{{ show_user(data.domain_deleted_alias.domain.user) }}
{%- endmacro %} {%- endmacro %}
{% macro list_alias_audit_log(alias_audit_log) %}
<h4>Alias Audit Log</h4>
<table class="table">
<thead>
<tr>
<th>User ID</th>
<th>Alias ID</th>
<th>Alias Email</th>
<th>Action</th>
<th>Message</th>
<th>Time</th>
</tr>
</thead>
<tbody>
{% for entry in alias_audit_log %}
<tr>
<td>{{ entry.user_id }}</td>
<td>{{ entry.alias_id }}</td>
<td><a href="?email={{ entry.alias_email }}">{{ entry.alias_email }}</a></td>
<td>{{ entry.action }}</td>
<td>{{ entry.message }}</td>
<td>{{ entry.created_at }}</td>
</tr>
{% endfor %}
</tbody>
</table>
{% endmacro %}
{% macro list_user_audit_log(user_audit_log) %}
<h4>User Audit Log</h4>
<table class="table">
<thead>
<tr>
<th>User email</th>
<th>Action</th>
<th>Message</th>
<th>Time</th>
</tr>
</thead>
<tbody>
{% for entry in user_audit_log %}
<tr>
<td><a href="?email={{ entry.user_email }}">{{ entry.user_email }}</a></td>
<td>{{ entry.action }}</td>
<td>{{ entry.message }}</td>
<td>{{ entry.created_at }}</td>
</tr>
{% endfor %}
</tbody>
</table>
{% endmacro %}
{% block body %} {% block body %}
<div class="border border-dark border-2 mt-1 mb-2 p-3">
<form method="post">
<div class="form-group">
<label for="email">
Email to search:
</label>
<input type="text"
class="form-control"
name="email"
value="{{ email or '' }}"/>
</div>
<button type="submit" class="btn btn-primary">
Submit
</button>
</form>
</div>
{% if no_match %}
<div class="border border-dark border-2 mt-1 mb-2 p-3 alert alert-warning"
role="alert">
No user, alias or mailbox found for {{ email }}
</div>
{% endif %}
{% if data.alias %}
<div class="border border-dark border-2 mt-1 mb-2 p-3"> <div class="border border-dark border-2 mt-1 mb-2 p-3">
<h3 class="mb-3"> <form method="get">
Found Alias {{ data.alias.email }} <div class="form-group">
</h3> <label for="email">Email to search:</label>
{{ list_alias(1,[data.alias]) }} <input type="text"
{{ show_user(data.alias.user) }} class="form-control"
{{ list_mailboxes(helper.mailbox_count(data.alias.user), helper.mailbox_list(data.alias.user) ) }} name="email"
value="{{ email or '' }}"/>
</div>
<button type="submit" class="btn btn-primary">Submit</button>
</form>
</div> </div>
{% endif %} {% if data.no_match and email %}
{% if data.user %} <div class="border border-dark border-2 mt-1 mb-2 p-3 alert alert-warning"
role="alert">No user, alias or mailbox found for {{ email }}</div>
{% endif %}
<div class="border border-dark border-2 mt-1 mb-2 p-3"> {% if data.alias %}
<h3 class="mb-3"> <div class="border border-dark border-2 mt-1 mb-2 p-3">
Found User {{ data.user.email }} <h3 class="mb-3">Found Alias {{ data.alias.email }}</h3>
</h3> {{ list_alias(1,[data.alias]) }}
{{ show_user(data.user) }} {{ list_alias_audit_log(data.alias_audit_log) }}
{{ list_mailboxes(helper.mailbox_count(data.user), helper.mailbox_list(data.user) ) }} {{ list_mailboxes("Mailboxes for alias", helper.alias_mailbox_count(data.alias), helper.alias_mailboxes(data.alias)) }}
{{ list_alias(helper.alias_count(data.user),helper.alias_list(data.user)) }} {{ show_user(data.alias.user) }}
</div> </div>
{% endif %} {% endif %}
{% if data.mailbox %}
<div class="border border-dark mt-1 mb-2 p-3"> {% if data.user %}
<h3 class="mb-3"> <div class="border border-dark border-2 mt-1 mb-2 p-3">
Found Mailbox {{ data.mailbox.email }} <h3 class="mb-3">Found User {{ data.user.email }}</h3>
</h3> {{ show_user(data.user) }}
{{ list_mailboxes(1, [data.mailbox] ) }} {{ list_mailboxes("Mailboxes for user", helper.mailbox_count(data.user) , helper.mailbox_list(data.user) ) }}
{{ show_user(data.mailbox.user) }} {{ list_alias(helper.alias_count(data.user) ,helper.alias_list(data.user)) }}
</div> {{ list_user_audit_log(data.user_audit_log) }}
{% endif %} </div>
{% if data.deleted_alias %} {% endif %}
{% if data.mailbox_count > 10 %}
<h3>Found more than 10 mailboxes for {{ email }}. Showing the last 10</h3>
{% elif data.mailbox_count > 0 %}
<h3>Found {{ data.mailbox_count }} mailbox(es) for {{ email }}</h3>
{% endif %}
{% for mailbox in data.mailbox %}
<div class="border border-dark mt-1 mb-2 p-3"> <div class="border border-dark mt-1 mb-2 p-3">
<h3 class="mb-3"> <h3 class="mb-3">Found Mailbox {{ mailbox.email }}</h3>
Found DeletedAlias {{ data.deleted_alias.email }} {{ list_mailboxes("Mailbox found", 1, [mailbox]) }}
</h3> {{ show_user(mailbox.user) }}
{{ show_deleted_alias(data.deleted_alias) }} </div>
</div> {% endfor %}
{% endif %} {% if data.deleted_alias %}
{% if data.domain_deleted_alias %}
<div class="border border-dark mt-1 mb-2 p-3"> <div class="border border-dark mt-1 mb-2 p-3">
<h3 class="mb-3"> <h3 class="mb-3">Found DeletedAlias {{ data.deleted_alias.email }}</h3>
Found DomainDeletedAlias {{ data.domain_deleted_alias.email }} {{ show_deleted_alias(data.deleted_alias) }}
</h3> </div>
{{ show_domain_deleted_alias(data.domain_deleted_alias) }} {% endif %}
</div> {% if data.domain_deleted_alias %}
{% endif %}
<div class="border border-dark mt-1 mb-2 p-3">
<h3 class="mb-3">Found DomainDeletedAlias {{ data.domain_deleted_alias.email }}</h3>
{{ show_domain_deleted_alias(data.domain_deleted_alias) }}
</div>
{% endif %}
{% endblock %} {% endblock %}
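The new list_alias_audit_log and list_user_audit_log macros above render user_id, alias_id, alias_email, action, message and created_at for each entry. The admin view that populates data.alias_audit_log is not part of this compare view; a query along these lines would produce matching rows (a sketch only -- the filter column, ordering and limit are assumptions):
# illustrative sketch, not the project's actual admin view code
from app.models import AliasAuditLog

alias_audit_log = (
    AliasAuditLog.filter(AliasAuditLog.alias_email == email)  # email comes from the search form
    .order_by(AliasAuditLog.created_at.desc())
    .limit(10)
    .all()
)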

View File

@ -11,11 +11,11 @@ Based on https://github.com/flask-admin/flask-admin/issues/974#issuecomment-1682
<input name="user_id" <input name="user_id"
class="form-control" class="form-control"
placeholder="User ID" placeholder="User ID"
aria-describedby="userID"/> aria-describedby="userID" />
<input name="to_address" <input name="to_address"
class="form-control" class="form-control"
placeholder="Specify an address to receive the newsletter for testing" placeholder="Specify an address to receive the newsletter for testing"
aria-describedby="Email address"/> aria-describedby="Email address" />
</li> </li>
{% endblock %} {% endblock %}
{% block tail %} {% block tail %}

View File

@ -7,7 +7,7 @@
<div class="text-center text-muted small mt-4"> <div class="text-center text-muted small mt-4">
Ask for another activation email? Ask for another activation email?
<a href="{{ url_for('auth.resend_activation') }}" style="color: #4d21ff">Resend</a> <a href="{{ url_for("auth.resend_activation") }}" style="color: #4d21ff">Resend</a>
</div> </div>
{% endif %} {% endif %}
{% endblock %} {% endblock %}

View File

@ -13,7 +13,7 @@
</div> </div>
<div class="text-center"> <div class="text-center">
Please go to Please go to
<a href="{{ url_for('dashboard.setting') }}">settings</a> <a href="{{ url_for("dashboard.setting") }}">settings</a>
page to re-send the confirmation email. page to re-send the confirmation email.
</div> </div>
</div> </div>

View File

@ -33,7 +33,7 @@
<div class="text-muted mt-5" style="margin-top: 1em;"> <div class="text-muted mt-5" style="margin-top: 1em;">
Don't have your key with you? Don't have your key with you?
<br /> <br />
<a href="{{ url_for('auth.mfa') }}">Verify by One-Time Password</a> <a href="{{ url_for("auth.mfa") }}">Verify by One-Time Password</a>
</div> </div>
{% endif %} {% endif %}
<hr /> <hr />

View File

@ -20,7 +20,7 @@
</form> </form>
<div class="text-center text-muted"> <div class="text-center text-muted">
Forget it, Forget it,
<a href="{{ url_for('auth.login') }}">send me back</a> <a href="{{ url_for("auth.login") }}">send me back</a>
to the sign in screen. to the sign in screen.
</div> </div>
{% endblock %} {% endblock %}

View File

@ -7,7 +7,7 @@
<div class="text-center text-muted small mb-4"> <div class="text-center text-muted small mb-4">
You haven't received the activation email? You haven't received the activation email?
<a href="{{ url_for('auth.resend_activation') }}">Resend</a> <a href="{{ url_for("auth.resend_activation") }}">Resend</a>
</div> </div>
{% endif %} {% endif %}
<div class="card" style="border-radius: 2%"> <div class="card" style="border-radius: 2%">
@ -25,7 +25,7 @@
{{ form.password(class="form-control", type="password") }} {{ form.password(class="form-control", type="password") }}
{{ render_field_errors(form.password) }} {{ render_field_errors(form.password) }}
<div class="text-muted"> <div class="text-muted">
<a href="{{ url_for('auth.forgot_password') }}" class="small">I forgot my password</a> <a href="{{ url_for("auth.forgot_password") }}" class="small">I forgot my password</a>
</div> </div>
</div> </div>
<div class="form-footer"> <div class="form-footer">
@ -57,6 +57,6 @@
</div> </div>
<div class="text-center text-muted mt-2"> <div class="text-center text-muted mt-2">
Don't have an account yet? Don't have an account yet?
<a href="{{ url_for('auth.register') }}">Sign up</a> <a href="{{ url_for("auth.register") }}">Sign up</a>
</div> </div>
{% endblock %} {% endblock %}

View File

@ -30,7 +30,7 @@
<div class="text-muted mt-5" style="margin-top: 1em;"> <div class="text-muted mt-5" style="margin-top: 1em;">
Having trouble with your authenticator? Having trouble with your authenticator?
<br /> <br />
<a href="{{ url_for('auth.fido') }}"> <a href="{{ url_for("auth.fido") }}">
Verify by your security Verify by your security
key key
</a> </a>

View File

@ -27,7 +27,7 @@
<!-- TODO: add terms <!-- TODO: add terms
<div class="form-group"> <div class="form-group">
<label class="custom-control custom-checkbox"> <label class="custom-control custom-checkbox">
<input type="checkbox" class="custom-control-input"/> <input type="checkbox" class="custom-control-input" />
<span class="custom-control-label">Agree the <a href="terms.html">terms and policy</a></span> <span class="custom-control-label">Agree the <a href="terms.html">terms and policy</a></span>
</label> </label>
</div> </div>
@ -69,6 +69,6 @@
</form> </form>
<div class="text-center text-muted mb-6"> <div class="text-center text-muted mb-6">
Already have account? Already have account?
<a href="{{ url_for('auth.login') }}">Sign in</a> <a href="{{ url_for("auth.login") }}">Sign in</a>
</div> </div>
{% endblock %} {% endblock %}

View File

@ -19,6 +19,6 @@
</form> </form>
<div class="text-center text-muted"> <div class="text-center text-muted">
Don't have account yet? Don't have account yet?
<a href="{{ url_for('auth.register') }}">Sign up</a> <a href="{{ url_for("auth.register") }}">Sign up</a>
</div> </div>
{% endblock %} {% endblock %}

View File

@ -29,7 +29,9 @@
{% endif %} {% endif %}
</div> </div>
<div class="text-center p-3" <div class="text-center p-3"
style="font-size: 12px; font-weight: 300; margin: auto"> style="font-size: 12px;
font-weight: 300;
margin: auto">
<span class="badge badge-warning">Warning</span> <span class="badge badge-warning">Warning</span>
Please note that social login is now <b>deprecated</b>. Please note that social login is now <b>deprecated</b>.
<br /> <br />
@ -39,8 +41,8 @@
</div> </div>
</div> </div>
<div class="text-center text-muted mt-2"> <div class="text-center text-muted mt-2">
<a href="{{ url_for('auth.register') }}">Sign up</a> <a href="{{ url_for("auth.register") }}">Sign up</a>
/ /
<a href="{{ url_for('auth.login') }}">Login</a> <a href="{{ url_for("auth.login") }}">Login</a>
</div> </div>
{% endblock %} {% endblock %}

View File

@ -1,18 +1,18 @@
{% from "_formhelpers.html" import render_field, render_field_errors %} {% from "_formhelpers.html" import render_field, render_field_errors %}
<!doctype html> <!DOCTYPE html>
<html lang="en" <html lang="en"
dir="ltr" dir="ltr"
data-theme="{%- if request.cookies.get('dark-mode') == 'true' -%} dark{%- endif -%}"> data-theme="{%- if request.cookies.get('dark-mode') == 'true' -%} dark{%- endif -%}">
<head> <head>
<meta charset="UTF-8" /> <meta charset="UTF-8" />
<meta name="viewport" <meta name="viewport"
content="width=device-width, user-scalable=no, initial-scale=1.0, maximum-scale=1.0, minimum-scale=1.0"/> content="width=device-width, user-scalable=no, initial-scale=1.0, maximum-scale=1.0, minimum-scale=1.0" />
<meta http-equiv="X-UA-Compatible" content="ie=edge" /> <meta http-equiv="X-UA-Compatible" content="ie=edge" />
<meta http-equiv="Content-Language" content="en" /> <meta http-equiv="Content-Language" content="en" />
<meta name="msapplication-TileColor" content="#2d89ef" /> <meta name="msapplication-TileColor" content="#2d89ef" />
<meta name="theme-color" content="#4188c9" /> <meta name="theme-color" content="#4188c9" />
<meta name="apple-mobile-web-app-status-bar-style" <meta name="apple-mobile-web-app-status-bar-style"
content="black-translucent"/> content="black-translucent" />
<meta name="apple-mobile-web-app-capable" content="yes" /> <meta name="apple-mobile-web-app-capable" content="yes" />
<meta name="mobile-web-app-capable" content="yes" /> <meta name="mobile-web-app-capable" content="yes" />
<meta name="HandheldFriendly" content="True" /> <meta name="HandheldFriendly" content="True" />
@ -23,7 +23,7 @@
<!-- Yandex --> <!-- Yandex -->
<meta name="yandex-verification" content="c9e5d4d68bc983a1" /> <meta name="yandex-verification" content="c9e5d4d68bc983a1" />
<meta name="description" <meta name="description"
content="Protect your email address with email ALIAS. Create a different email alias for each website. No more phishing, or spam."/> content="Protect your email address with email ALIAS. Create a different email alias for each website. No more phishing, or spam." />
<link rel="icon" href="/static/favicon.ico" type="image/x-icon" /> <link rel="icon" href="/static/favicon.ico" type="image/x-icon" />
<link rel="shortcut icon" type="image/x-icon" href="/static/favicon.ico" /> <link rel="shortcut icon" type="image/x-icon" href="/static/favicon.ico" />
<link rel="canonical" href="{{ CANONICAL_URL }}" /> <link rel="canonical" href="{{ CANONICAL_URL }}" />
@ -32,7 +32,7 @@
| SimpleLogin | SimpleLogin
</title> </title>
<link rel="stylesheet" <link rel="stylesheet"
href="{{ url_for('static', filename='node_modules/font-awesome/css/font-awesome.css') }}"/> href="{{ url_for('static', filename='node_modules/font-awesome/css/font-awesome.css') }}" />
<!-- Dashboard Core --> <!-- Dashboard Core -->
<link href="/static/assets/css/dashboard.css" rel="stylesheet" /> <link href="/static/assets/css/dashboard.css" rel="stylesheet" />
<!-- Tabler JS --> <!-- Tabler JS -->
@ -51,19 +51,19 @@
<!-- IntroJS --> <!-- IntroJS -->
<link rel="stylesheet" <link rel="stylesheet"
type="text/css" type="text/css"
href="{{ url_for('static', filename='node_modules/intro.js/minified/introjs.min.css') }}"/> href="{{ url_for('static', filename='node_modules/intro.js/minified/introjs.min.css') }}" />
<script src="{{ url_for('static', filename='node_modules/intro.js/minified/intro.min.js') }}"></script> <script src="{{ url_for('static', filename='node_modules/intro.js/minified/intro.min.js') }}"></script>
<!-- Sentry --> <!-- Sentry -->
<script src="{{ url_for('static', filename='node_modules/@sentry/browser/build/bundle.min.js') }}"></script> <script src="{{ url_for('static', filename='node_modules/@sentry/browser/build/bundle.min.js') }}"></script>
<link rel="stylesheet" href="/static/vendor/bootstrap-social.min.css" /> <link rel="stylesheet" href="/static/vendor/bootstrap-social.min.css" />
<!-- Toastr library --> <!-- Toastr library -->
<link rel="stylesheet" <link rel="stylesheet"
href="{{ url_for('static', filename='node_modules/toastr/build/toastr.min.css') }}"/> href="{{ url_for('static', filename='node_modules/toastr/build/toastr.min.css') }}" />
<script src="{{ url_for('static', filename='node_modules/toastr/build/toastr.min.js') }}"></script> <script src="{{ url_for('static', filename='node_modules/toastr/build/toastr.min.js') }}"></script>
<script src="{{ url_for('static', filename='node_modules/bootbox/dist/bootbox.min.js') }}"></script> <script src="{{ url_for('static', filename='node_modules/bootbox/dist/bootbox.min.js') }}"></script>
<!-- Multiple-select library --> <!-- Multiple-select library -->
<link rel="stylesheet" <link rel="stylesheet"
href="{{ url_for('static', filename='node_modules/multiple-select/dist/multiple-select.min.css') }}"/> href="{{ url_for('static', filename='node_modules/multiple-select/dist/multiple-select.min.css') }}" />
<script src="{{ url_for('static', filename='node_modules/multiple-select/dist/multiple-select.min.js') }}"></script> <script src="{{ url_for('static', filename='node_modules/multiple-select/dist/multiple-select.min.js') }}"></script>
<!-- Parseley library --> <!-- Parseley library -->
<script src="{{ url_for('static', filename='node_modules/parsleyjs/dist/parsley.min.js') }}"></script> <script src="{{ url_for('static', filename='node_modules/parsleyjs/dist/parsley.min.js') }}"></script>
@ -75,10 +75,10 @@
<script async defer data-domain=”{{ PLAUSIBLE_DOMAIN }} src=”{{ PLAUSIBLE_HOST }}/js/plausible.outbound-links.js></script> <script async defer data-domain=”{{ PLAUSIBLE_DOMAIN }} src=”{{ PLAUSIBLE_HOST }}/js/plausible.outbound-links.js></script>
{% endif %} {% endif %}
<link rel="stylesheet" <link rel="stylesheet"
href="{{ url_for('static', filename='darkmode.css') }}?v={{ VERSION }}"/> href="{{ url_for('static', filename='darkmode.css') }}?v={{ VERSION }}" />
<link rel="stylesheet" <link rel="stylesheet"
type="text/css" type="text/css"
href="/static/style.css?v={{ VERSION }}"/> href="/static/style.css?v={{ VERSION }}" />
<script src="{{ url_for('static', filename='js/theme.js') }}"></script> <script src="{{ url_for('static', filename='js/theme.js') }}"></script>
<script>toastr.options.closeButton = true;</script> <script>toastr.options.closeButton = true;</script>
<!-- For additional head --> <!-- For additional head -->

View File

@@ -33,7 +33,7 @@
 This email address is used to log in to SimpleLogin.
 <br />
 If you want to change the mailbox that emails are forwarded to, use the
-<a href="{{ url_for('dashboard.mailbox_route') }}">
+<a href="{{ url_for("dashboard.mailbox_route") }}">
 <i class="fe fe-inbox"></i> Mailboxes page
 </a>
 instead.
@@ -50,14 +50,14 @@
 <div class="mt-2">
 <span class="text-danger float-left">Pending email change: {{ pending_email }}</span>
 <form method="POST"
-action="{{ url_for('dashboard.resend_email_change') }}"
+action="{{ url_for("dashboard.resend_email_change") }}"
 class="float-left ml-2">
 {{ change_email_form.csrf_token }}
 <a onclick="this.closest('form').submit()"
 class="btn btn-secondary btn-sm">Resend confirmation email</a>
 </form>
 <form method="POST"
-action="{{ url_for('dashboard.cancel_email_change') }}"
+action="{{ url_for("dashboard.cancel_email_change") }}"
 class="float-left ml-2">
 {{ change_email_form.csrf_token }}
 <a onclick="this.closest('form').submit()"
@@ -91,10 +91,10 @@
 </div>
 {% if not current_user.enable_otp %}
-<a href="{{ url_for('dashboard.mfa_setup') }}"
+<a href="{{ url_for("dashboard.mfa_setup") }}"
 class="btn btn-outline-primary">Setup TOTP</a>
 {% else %}
-<a href="{{ url_for('dashboard.mfa_cancel') }}"
+<a href="{{ url_for("dashboard.mfa_cancel") }}"
 class="btn btn-outline-danger">Disable TOTP</a>
 {% endif %}
 </div>
@@ -111,10 +111,10 @@
 </div>
 {% if current_user.fido_uuid is none %}
-<a href="{{ url_for('dashboard.fido_setup') }}"
+<a href="{{ url_for("dashboard.fido_setup") }}"
 class="btn btn-outline-primary">Setup WebAuthn</a>
 {% else %}
-<a href="{{ url_for('dashboard.fido_manage') }}"
+<a href="{{ url_for("dashboard.fido_manage") }}"
 class="btn btn-outline-info">Manage WebAuthn</a>
 {% endif %}
 </div>
@@ -146,7 +146,7 @@
 <div class="card-body">
 <div class="card-title">Account Deletion</div>
 <div class="mb-3">If SimpleLogin isn't the right fit for you, you can simply delete your account.</div>
-<a href="{{ url_for('dashboard.delete_account') }}"
+<a href="{{ url_for("dashboard.delete_account") }}"
 class="btn btn-outline-danger">Delete account</a>
 </div>
 </div>
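Many hunks in this diff only change the quote style inside Jinja expressions, e.g. url_for('dashboard.mailbox_route') becoming url_for("dashboard.mailbox_route"). The nested double quotes are safe because Jinja evaluates the {{ ... }} expression before the browser ever parses the surrounding href="..." attribute. A minimal, self-contained sketch of that behaviour, using jinja2 directly with a stand-in url_for (not SimpleLogin's real view code):

from jinja2 import Template

# The inner double quotes live inside {{ ... }}, which Jinja evaluates first,
# so they never terminate the surrounding href="..." attribute.
tpl = Template('<a href="{{ url_for("dashboard.mailbox_route") }}">Mailboxes page</a>')

# Stand-in for Flask's url_for, used only to make this snippet runnable.
print(tpl.render(url_for=lambda endpoint: "/dashboard/mailbox"))
# -> <a href="/dashboard/mailbox">Mailboxes page</a>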


@@ -27,7 +27,7 @@
 <br />
 <img src="/static/images/reverse-alias.svg"
 style="border: 1px solid"
-class="my-2 img-fluid"/>
+class="my-2 img-fluid" />
 </p>
 <p>This might seem like "magic" but trust us, only the first time is a bit awkward.</p>
 <p>
@@ -75,9 +75,7 @@
 {% else %}
 <button disabled
 title="Upgrade to premium to create reverse-aliases"
-class="btn btn-primary mt-2">
-Create reverse-alias
-</button>
+class="btn btn-primary mt-2">Create reverse-alias</button>
 {% endif %}
 </form>
 </div>
@@ -98,9 +96,7 @@
 {% if highlight_contact_id %}
 <a href="{{ url_for("dashboard.alias_contact_manager", alias_id=alias.id, highlight_contact_id=highlight_contact_id) }}"
-class="btn btn-light">
-Reset
-</a>
+class="btn btn-light">Reset</a>
 {% else %}
 <a href="{{ url_for("dashboard.alias_contact_manager", alias_id=alias.id) }}"
 class="btn btn-light">Reset</a>
@@ -114,7 +110,7 @@
 {% set contact = contact_info.contact %}
 <div class="col-md-6">
-<div class="my-2 p-2 card {% if contact.id == highlight_contact_id %} highlight-row{% endif %}">
+<div class="my-2 p-2 card {% if contact.id == highlight_contact_id %}highlight-row{% endif %}">
 <div class="mb-2 row">
 <div class="col">
 <span class="font-weight-bold">{{ contact.website_email }}</span>
@@ -139,15 +135,11 @@
 target="_blank"
 data-toggle="tooltip"
 title="You can click on this to open your email client. Or use the copy button 👉"
-class="font-weight-bold">
-*************************
-</a>
+class="font-weight-bold">*************************</a>
 <span class="clipboard btn btn-sm btn-success copy-btn"
 data-toggle="tooltip"
 title="Copy the reverse-alias to clipboard"
-data-clipboard-text="{{ contact.website_send_to() }}">
-Copy reverse-alias
-</span>
+data-clipboard-text="{{ contact.website_send_to() }}">Copy reverse-alias</span>
 </span>
 </div>
 <div class="mb-2 text-muted small-text">
@@ -207,14 +199,12 @@
 <nav aria-label="Contact navigation">
 <ul class="pagination">
 <li class="page-item">
-<a class="btn btn-outline-secondary {% if page == 0 %}disabled{% endif %}"
-href="{{ url_for('dashboard.alias_contact_manager', alias_id=alias.id, page=page-1) }}">
+<a class="btn btn-outline-secondary {% if page == 0 %}disabled{% endif %}" href="{{ url_for('dashboard.alias_contact_manager', alias_id=alias.id, page=page-1) }}">
 Previous
 </a>
 </li>
 <li class="page-item">
-<a class="btn btn-outline-secondary {% if last_page %}disabled{% endif %}"
-href="{{ url_for('dashboard.alias_contact_manager', alias_id=alias.id, page=page+1) }}">
+<a class="btn btn-outline-secondary {% if last_page %}disabled{% endif %}" href="{{ url_for('dashboard.alias_contact_manager', alias_id=alias.id, page=page+1) }}">
 Next
 </a>
 </li>


@@ -13,7 +13,9 @@
 <div class="d-flex align-items-center">
 <div class="subheader">Total</div>
 <div class="text-muted"
-style="order: 2; margin-left: auto; font-size: .8rem">Last 14 days</div>
+style="order: 2;
+margin-left: auto;
+font-size: .8rem">Last 14 days</div>
 </div>
 <div class="h1 m-0">{{ total }}</div>
 </div>
@@ -25,7 +27,9 @@
 <div class="d-flex align-items-center">
 <div class="subheader">Forwarded</div>
 <div class="text-muted"
-style="order: 2; margin-left: auto; font-size: .8rem">Last 14 days</div>
+style="order: 2;
+margin-left: auto;
+font-size: .8rem">Last 14 days</div>
 </div>
 <div class="h1 m-0">{{ email_forwarded }}</div>
 </div>
@@ -37,7 +41,9 @@
 <div class="d-flex align-items-center">
 <div class="subheader">Replies/Sent</div>
 <div class="text-muted"
-style="order: 2; margin-left: auto; font-size: .8rem">Last 14 days</div>
+style="order: 2;
+margin-left: auto;
+font-size: .8rem">Last 14 days</div>
 </div>
 <div class="h1 m-0">{{ email_replied }}</div>
 </div>
@@ -49,7 +55,9 @@
 <div class="d-flex align-items-center">
 <div class="subheader">Blocked</div>
 <div class="text-muted"
-style="order: 2; margin-left: auto; font-size: .8rem">Last 14 days</div>
+style="order: 2;
+margin-left: auto;
+font-size: .8rem">Last 14 days</div>
 </div>
 <div class="h1 m-0">{{ email_blocked }}</div>
 </div>
@@ -111,14 +119,12 @@
 <nav aria-label="Alias log navigation">
 <ul class="pagination">
 <li class="page-item">
-<a class="btn btn-outline-secondary {% if page_id == 0 %}disabled{% endif %}"
-href="{{ url_for('dashboard.alias_log', alias_id=alias_id, page_id=page_id-1) }}">
+<a class="btn btn-outline-secondary {% if page_id == 0 %}disabled{% endif %}" href="{{ url_for('dashboard.alias_log', alias_id=alias_id, page_id=page_id-1) }}">
 Previous
 </a>
 </li>
 <li class="page-item">
-<a class="btn btn-outline-secondary {% if last_page %}disabled{% endif %}"
-href="{{ url_for('dashboard.alias_log', alias_id=alias_id, page_id=page_id+1) }}">
+<a class="btn btn-outline-secondary {% if last_page %}disabled{% endif %}" href="{{ url_for('dashboard.alias_log', alias_id=alias_id, page_id=page_id+1) }}">
 Next
 </a>
 </li>


@@ -15,10 +15,7 @@
 <select data-width="100%" class="mailbox-select" multiple name="mailbox_ids">
 {% for mailbox in mailboxes %}
-<option value="{{ mailbox.id }}"
-{% if mailbox.id == current_user.default_mailbox_id %} selected{% endif %}>
-{{ mailbox.email }}
-</option>
+<option value="{{ mailbox.id }}" {% if mailbox.id == current_user.default_mailbox_id %}selected{% endif %}>{{ mailbox.email }}</option>
 {% endfor %}
 </select>
 <button class="btn btn-success mt-2">Confirm</button>


@@ -16,9 +16,7 @@
 <em data-toggle="tooltip"
 title="Click to copy"
 class="clipboard"
-data-clipboard-text="{{ alias_transfer_url }}">
-{{ alias_transfer_url }}
-</em>
+data-clipboard-text="{{ alias_transfer_url }}">{{ alias_transfer_url }}</em>
 <p class="mt-5">
 Please copy the transfer URL. <strong>We won't be able to display it again</strong>. If you need to access it again you can generate a new URL.
 </p>


@@ -22,7 +22,7 @@
 <br />
 The period left in the current subscription isn't taken into account.
 <br />
-<a href="{{ url_for('dashboard.pricing') }}"
+<a href="{{ url_for("dashboard.pricing") }}"
 class="btn btn-primary mt-2">Re-subscribe</a>
 </p>
 {% else %}


@@ -43,12 +43,14 @@
 {% endif %}
 <div class="form-group">
 <label class="form-label">PGP Public Key</label>
-<textarea name="pgp" {% if not current_user.is_premium() %} disabled {% endif %} class="form-control" rows=10 id="pgp-public-key" placeholder="(Drag and drop or paste your pgp public key here)&#10;-----BEGIN PGP PUBLIC KEY BLOCK-----">{{ contact.pgp_public_key or "" }}</textarea>
+<textarea name="pgp"
+{% if not current_user.is_premium() %}disabled{% endif %}
+class="form-control"
+rows="10"
+id="pgp-public-key"
+placeholder="(Drag and drop or paste your pgp public key here)&#10;-----BEGIN PGP PUBLIC KEY BLOCK-----">{{ contact.pgp_public_key or "" }}</textarea>
 </div>
-<button class="btn btn-primary" name="action" {% if not current_user.is_premium() %}
-disabled {% endif %} value="save">
-Save
-</button>
+<button class="btn btn-primary" name="action" {% if not current_user.is_premium() %}disabled{% endif %} value="save">Save</button>
 {% if contact.pgp_finger_print %}
 <button class="btn btn-danger float-right" name="action" value="remove">Remove</button>


@@ -74,10 +74,7 @@
 required>
 {% for mailbox in mailboxes %}
-<option value="{{ mailbox.id }}"
-{% if mailbox.id == current_user.default_mailbox_id %} selected{% endif %}>
-{{ mailbox.email }}
-</option>
+<option value="{{ mailbox.id }}" {% if mailbox.id == current_user.default_mailbox_id %}selected{% endif %}>{{ mailbox.email }}</option>
 {% endfor %}
 </select>
 <div class="small-text">The mailbox(es) that owns this alias.</div>
@@ -102,7 +99,6 @@
 </div>
 {% endblock %}
 {% block script %}
 <script>
 $('.mailbox-select').multipleSelect();


@@ -30,9 +30,7 @@
 </a>
 </div>
 {% endif %}
-<div class="alert alert-primary collapse {% if not custom_domains %} show{% endif %}"
-id="howtouse"
-role="alert">
+<div class="alert alert-primary collapse {% if not custom_domains %}show{% endif %}" id="howtouse" role="alert">
 By adding your domain, you can create aliases like <b>hi@my-domain.com</b>
 <br />
 You can also enable <b>catch-all</b> to create aliases on-the-fly:
@@ -50,18 +48,14 @@
 {% if custom_domain.ownership_verified and not custom_domain.verified %}
 <a href="{{ url_for('dashboard.domain_detail_dns', custom_domain_id=custom_domain.id, _anchor='dns-setup') }}"
-class="btn btn-info btn-sm">
-Ownership verified. Setup the DNS
-</a>
+class="btn btn-info btn-sm">Ownership verified. Setup the DNS</a>
 {% elif custom_domain.ownership_verified and custom_domain.verified %}
 <span class="badge badge-success">Domain ready</span>
 <!-- custom_domain.ownership_verified is False -->
 {% else %}
 <a href="{{ url_for('dashboard.domain_detail_dns', custom_domain_id=custom_domain.id, _anchor='ownership-form') }}"
 class="btn btn-warning btn-sm"
-role="button">
-Verify domain ownership
-</a>
+role="button">Verify domain ownership</a>
 {% endif %}
 </h5>
 <h6 class="card-subtitle mb-4 text-muted">
@@ -100,4 +94,4 @@
 </div>
 </div>
 {% endblock %}
 {% block script %}<script>$('.mailbox-select').multipleSelect();</script>{% endblock %}


@@ -22,9 +22,7 @@
 <div class="alert alert-danger" role="alert">This feature is only available in premium plan.</div>
 {% endif %}
-<div class="alert alert-primary collapse {% if not dirs %} show{% endif %}"
-id="howtouse"
-role="alert">
+<div class="alert alert-primary collapse {% if not dirs %}show{% endif %}" id="howtouse" role="alert">
 <div>
 Directory allows you to create aliases <b>on the fly</b>.
 </div>
@@ -68,10 +66,10 @@
 <form method="post">
 {{ toggle_dir_form.csrf_token }}
 <input type="hidden" name="form-name" value="toggle-directory">
-{{ toggle_dir_form.directory_id( type="hidden", value=dir.id) }}
+{{ toggle_dir_form.directory_id(type="hidden", value=dir.id) }}
 <label class="custom-switch cursor" style="padding-left: 1rem" data-toggle="tooltip" {% if dir.disabled %}
 title="Enable directory on-the-fly alias creation" {% else %} title="Disable directory on-the-fly alias creation" {% endif %}>
-{{ toggle_dir_form.directory_enabled( class="custom-switch-input", checked=(not dir.disabled) ) }}
+{{ toggle_dir_form.directory_enabled(class="custom-switch-input", checked=(not dir.disabled) ) }}
 <span class="custom-switch-indicator"></span>
 </label>
 </form>
@@ -91,11 +89,11 @@
 data-toggle="tooltip"
 title="Aliases created with this directory are automatically owned by these mailboxes"></i>
 <br />
-{% set dir_mailboxes=dir.mailboxes %}
+{% set dir_mailboxes = dir.mailboxes %}
 <form method="post" class="mt-2">
 {{ update_dir_form.csrf_token }}
 <input type="hidden" name="form-name" value="update">
-{{ update_dir_form.directory_id( type="hidden", value=dir.id) }}
+{{ update_dir_form.directory_id(type="hidden", value=dir.id) }}
 <select data-width="100%"
 required
 class="mailbox-select"
@@ -103,10 +101,7 @@
 name="mailbox_ids">
 {% for mailbox in mailboxes %}
-<option value="{{ mailbox.id }}"
-{% if mailbox in dir_mailboxes %} selected{% endif %}>
-{{ mailbox.email }}
-</option>
+<option value="{{ mailbox.id }}" {% if mailbox in dir_mailboxes %}selected{% endif %}>{{ mailbox.email }}</option>
 {% endfor %}
 </select>
 <button class="mt-2 btn btn-outline-primary btn-sm">Update</button>
@@ -119,7 +114,7 @@
 <form method="post">
 {{ delete_dir_form.csrf_token }}
 <input type="hidden" name="form-name" value="delete">
-{{ delete_dir_form.directory_id( type="hidden", value=dir.id) }}
+{{ delete_dir_form.directory_id(type="hidden", value=dir.id) }}
 <span class="card-link btn btn-link float-right text-danger delete-dir">Delete</span>
 </form>
 </div>
@@ -129,7 +124,7 @@
 </div>
 {% endfor %}
 </div>
-<div class="row {% if current_user.directory_quota <= 0 %} disabled-content{% endif %}">
+<div class="row {% if current_user.directory_quota <= 0 %}disabled-content{% endif %}">
 <div class="col">
 <div class="card">
 <div class="card-body">
@@ -139,8 +134,8 @@
 <h2 class="h4 mb-1">New Directory</h2>
 <div class="small-text mb-4">You can create up to {{ current_user.directory_quota }} directories.</div>
 {{ new_dir_form.name(class="form-control", placeholder="my-directory",
 pattern="[0-9a-z-_]{3,}",
 title="Only letter, number, dash (-), underscore (_) can be used. Directory name must be at least 3 characters.") }}
 {{ render_field_errors(new_dir_form.name) }}
 <div class="small-text">
 Directory name must be at least 3 characters.
@@ -156,10 +151,7 @@
 <select data-width="100%" class="mailbox-select" multiple name="mailbox_ids">
 {% for mailbox in mailboxes %}
-<option value="{{ mailbox.id }}"
-{% if mailbox.id == current_user.default_mailbox_id %} selected{% endif %}>
-{{ mailbox.email }}
-</option>
+<option value="{{ mailbox.id }}" {% if mailbox.id == current_user.default_mailbox_id %}selected{% endif %}>{{ mailbox.email }}</option>
 {% endfor %}
 </select>
 <button id="btn-create-directory" class="btn btn-primary mt-2">Create</button>


@@ -13,7 +13,8 @@
 <div class="alert alert-warning mt-3">Rules are ineffective when catch-all is enabled.</div>
 {% endif %}
-<div class="{% if custom_domain.catch_all %} disabled-content{% endif %}">
+<div class="{% if custom_domain.catch_all %}
+disabled-content{% endif %}">
 <div class="mt-3 mb-2">
 For a greater control than a simple catch-all, you can define a set of <b>rules</b> to auto create aliases.
 <br />
@@ -60,8 +61,7 @@
 <div class="form-group">
 <label>Regex</label>
 {{ new_auto_create_rule_form.regex(class="form-control",
-placeholder="prefix.*"
-) }}
+placeholder="prefix.*") }}
 {{ render_field_errors(new_auto_create_rule_form.regex) }}
 <div class="small-text">
 For example, if you want aliases that starts with <b>prefix</b> to be automatically created, you can set
@@ -95,10 +95,7 @@
 name="mailbox_ids">
 {% for mailbox in mailboxes %}
-<option value="{{ mailbox.id }}"
-{% if mailbox.id == current_user.default_mailbox_id %} selected{% endif %}>
-{{ mailbox.email }}
-</option>
+<option value="{{ mailbox.id }}" {% if mailbox.id == current_user.default_mailbox_id %}selected{% endif %}>{{ mailbox.email }}</option>
 {% endfor %}
 </select>
 </div>
@@ -128,9 +125,7 @@
 {% if auto_create_test_result %}
 <div class="alert {% if auto_create_test_passed %}
-alert-success {% else %} alert-warning {% endif %}">
-{{ auto_create_test_result }}
-</div>
+alert-success {% else %} alert-warning {% endif %}">{{ auto_create_test_result }}</div>
 {% endif %}
 </div>
 </div>
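The rule form above takes a regex such as prefix.* to decide which aliases are auto-created on the domain. A hypothetical sketch of that matching step (the exact call SimpleLogin uses is not part of this diff; full-match semantics and the sample local parts are assumptions for illustration only):

import re

# Hypothetical auto-create rule check: does the alias local part match the
# configured regex? Full-match semantics are assumed here.
rule = re.compile(r"prefix.*")

for local_part in ["prefix.newsletter", "contact"]:
    created = bool(rule.fullmatch(local_part))
    print(f"{local_part}: auto-create={created}")
# prefix.newsletter: auto-create=True
# contact: auto-create=False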


@@ -38,7 +38,7 @@
 Value: <em data-toggle="tooltip"
 title="Click to copy"
 class="clipboard"
-data-clipboard-text="{{ custom_domain.get_ownership_dns_txt_value() }}">{{ custom_domain.get_ownership_dns_txt_value() }}</em>
+data-clipboard-text="{{ ownership_record }}">{{ ownership_record }}</em>
 </div>
 <form method="post" action="#ownership-form">
 {{ csrf_form.csrf_token }}
@@ -63,8 +63,8 @@
 {% endif %}
 <hr />
 {% endif %}
-<div class="{% if not custom_domain.ownership_verified %} disabled-content{% endif %}"
-id="dns-setup">
+<div class="{% if not custom_domain.ownership_verified %}
+disabled-content{% endif %}" id="dns-setup">
 {% if not custom_domain.ownership_verified %}
 <div class="alert alert-warning">A domain ownership must be verified first.</div>
@@ -91,7 +91,8 @@
 <br />
 Some domain registrars (Namecheap, CloudFlare, etc) might also use <em>@</em> for the root domain.
 </div>
-{% for priority, email_server in EMAIL_SERVERS_WITH_PRIORITY %}
+{% for record in expected_mx_records %}
 <div class="mb-3 p-3 dns-record">
 Record: MX
@@ -99,14 +100,15 @@
 Domain: {{ custom_domain.domain }} or
 <b>@</b>
 <br />
-Priority: {{ priority }}
+Priority: {{ record.priority }}
 <br />
 Target: <em data-toggle="tooltip"
 title="Click to copy"
 class="clipboard"
-data-clipboard-text="{{ email_server }}">{{ email_server }}</em>
+data-clipboard-text="{{ record.domain }}">{{ record.domain }}</em>
 </div>
 {% endfor %}
 <form method="post" action="#mx-form">
 {{ csrf_form.csrf_token }}
 <input type="hidden" name="form-name" value="check-mx">
@@ -177,9 +179,7 @@
 <em data-toggle="tooltip"
 title="Click to copy"
 class="clipboard"
-data-clipboard-text="{{ spf_record }}">
-{{ spf_record }}
-</em>
+data-clipboard-text="{{ spf_record }}">{{ spf_record }}</em>
 </div>
 <form method="post" action="#spf-form">
 {{ csrf_form.csrf_token }}
@@ -238,10 +238,8 @@
 Setting up DKIM is highly recommended to reduce the chance your emails ending up in the recipient's Spam
 folder.
 </div>
-<div class="mb-2">
-Add the following CNAME DNS records to your domain.
-</div>
-{% for dkim_prefix, dkim_cname_value in dkim_records %}
+<div class="mb-2">Add the following CNAME DNS records to your domain.</div>
+{% for dkim_prefix, dkim_cname_value in dkim_records.items() %}
 <div class="mb-2 p-3 dns-record">
 Record: CNAME
@@ -256,9 +254,7 @@
 title="Click to copy"
 class="clipboard"
 data-clipboard-text="{{ dkim_cname_value }}."
-style="overflow-wrap: break-word">
-{{ dkim_cname_value }}.
-</em>
+style="overflow-wrap: break-word">{{ dkim_cname_value }}.</em>
 </div>
 {% endfor %}
 <div class="alert alert-info">
@@ -282,21 +278,15 @@
 <input type="hidden" name="form-name" value="check-dkim">
 {% if custom_domain.dkim_verified %}
-<button type="submit" class="btn btn-outline-primary">
-Re-verify
-</button>
+<button type="submit" class="btn btn-outline-primary">Re-verify</button>
 {% else %}
-<button type="submit" class="btn btn-primary">
-Verify
-</button>
+<button type="submit" class="btn btn-primary">Verify</button>
 {% endif %}
 </form>
 {% if not dkim_ok %}
 <div class="text-danger mt-4">
-<p>
-Your DNS is not correctly set.
-</p>
+<p>Your DNS is not correctly set.</p>
 <ul>
 {% for custom_record, retrieved_cname in dkim_errors.items() %}
@@ -312,10 +302,8 @@
 </div>
 {% if custom_domain.dkim_verified %}
-<div class="text-danger mt-4">
-DKIM is still enabled. Please update your DKIM settings with all CNAME records
-</div>
-{% endif %}
+<div class="text-danger mt-4">DKIM is still enabled. Please update your DKIM settings with all CNAME records</div>
+{% endif %}
 {% endif %}
 </div>
 <hr />
@@ -330,24 +318,20 @@
 {% else %}
 <span class="cursor"
 data-toggle="tooltip"
-data-original-title="DMARC Not Verified">🚫 </span>
+data-original-title="DMARC Not Verified">🚫</span>
 {% endif %}
 </div>
 <div>
 DMARC
 <a href="https://en.wikipedia.org/wiki/DMARC"
 target="_blank"
-rel="noopener noreferrer">
-(Wikipedia↗)
-</a>
+rel="noopener noreferrer">(Wikipedia↗)</a>
 is designed to protect the domain from unauthorized use, commonly known as email spoofing.
 <br />
 Built around SPF and DKIM, a DMARC policy tells the receiving mail server what to do if
 neither of those authentication methods passes.
 </div>
-<div class="mb-2">
-Add the following TXT DNS record to your domain.
-</div>
+<div class="mb-2">Add the following TXT DNS record to your domain.</div>
 <div class="mb-2 p-3 dns-record">
 Record: TXT
 <br />
@@ -360,9 +344,7 @@
 <em data-toggle="tooltip"
 title="Click to copy"
 class="clipboard"
-data-clipboard-text="{{ dmarc_record }}">
-{{ dmarc_record }}
-</em>
+data-clipboard-text="{{ dmarc_record }}">{{ dmarc_record }}</em>
 </div>
 <div class="alert alert-info">
 Some DNS registrar might require a full record path, in this case please use
@@ -377,13 +359,9 @@
 <input type="hidden" name="form-name" value="check-dmarc">
 {% if custom_domain.dmarc_verified %}
-<button type="submit" class="btn btn-outline-primary">
-Re-verify
-</button>
+<button type="submit" class="btn btn-outline-primary">Re-verify</button>
 {% else %}
-<button type="submit" class="btn btn-primary">
-Verify
-</button>
+<button type="submit" class="btn btn-primary">Verify</button>
 {% endif %}
 </form>
 {% if not dmarc_ok %}
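The DNS-setup hunks above change what this template reads from its rendering context: the ownership TXT value now arrives pre-computed as ownership_record, the MX loop iterates expected_mx_records objects exposing .priority and .domain, and dkim_records is consumed with .items(). A sketch of context values compatible with the new template; only those names and attributes come from the diff, while the MxRecord dataclass and every concrete value below are illustrative assumptions, not code from this changeset:

from dataclasses import dataclass

@dataclass
class MxRecord:
    priority: int  # rendered as "Priority: {{ record.priority }}"
    domain: str    # rendered as "Target: {{ record.domain }}"

# Ownership TXT value, previously built in the template via
# custom_domain.get_ownership_dns_txt_value(), now passed in ready-made.
ownership_record = "sl-verification=example-token"  # placeholder value

# The MX loop now iterates objects instead of (priority, server) tuples.
expected_mx_records = [
    MxRecord(priority=10, domain="mx1.example.com."),  # placeholder hosts
    MxRecord(priority=20, domain="mx2.example.com."),
]

# dkim_records is iterated with .items(), so a prefix -> CNAME-target dict fits:
# {% for dkim_prefix, dkim_cname_value in dkim_records.items() %}
dkim_records = {
    "dkim": "dkim._domainkey.example.com",      # placeholder targets
    "dkim02": "dkim02._domainkey.example.com",
    "dkim03": "dkim03._domainkey.example.com",
}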


@@ -34,13 +34,14 @@
 .
 </div>
 </div>
-<div class="{% if not custom_domain.catch_all %} disabled-content{% endif %}">
+<div class="{% if not custom_domain.catch_all %}
+disabled-content{% endif %}">
 <div>
 Auto-created aliases are automatically owned by the following mailboxes
 <i class="fe fe-corner-right-down"></i>
 .
 </div>
-{% set domain_mailboxes=custom_domain.mailboxes %}
+{% set domain_mailboxes = custom_domain.mailboxes %}
 <form method="post" class="mt-2">
 {{ csrf_form.csrf_token }}
 <input type="hidden" name="form-name" value="update">
@@ -54,10 +55,7 @@
 name="mailbox_ids">
 {% for mailbox in mailboxes %}
-<option value="{{ mailbox.id }}"
-{% if mailbox in domain_mailboxes %} selected{% endif %}>
-{{ mailbox.email }}
-</option>
+<option value="{{ mailbox.id }}" {% if mailbox in domain_mailboxes %}selected{% endif %}>{{ mailbox.email }}</option>
 {% endfor %}
 </select>
 </div>


@@ -21,8 +21,8 @@
 <div class="my-3">
 <p>Alternatively you can use your Proton credentials to ensure it's you.</p>
 </div>
-<a class="btn btn-primary btn-block mt-2 proton-button w-25"
-href="{{ url_for('auth.proton_login', next=next) }}">
+<a class="btn btn-primary btn-block mt-2 proton-button"
+href="{{ url_for('auth.proton_login', next=next) }}" style="max-width: 400px">
 <img class="mr-2" src="/static/images/proton.svg" />
 Authenticate with Proton
 </a>
@@ -38,4 +38,4 @@
 {% endif %}
 </div>
 </div>
 {% endblock %}


@@ -45,7 +45,7 @@
 <td>Link a New Key</td>
 <td></td>
 <td class="text-center">
-<a href="{{ url_for('dashboard.fido_setup') }}">
+<a href="{{ url_for("dashboard.fido_setup") }}">
 <button class="btn btn-outline-success">Link</button>
 </a>
 </td>

Some files were not shown because too many files have changed in this diff.