Compare commits
39 Commits

99ffd1ec0c
eda940f8b2
1dad582523
e516266a27
850fc95477
d172825900
026865e5bf
add94ef2a2
1081400948
5776128905
d661860f4c
0a52e32972
703dcbd0eb
ce7ed69547
4f5564df16
2fee569131
7ea45d6f5d
6d24db50bd
88f270c6a1
0962b1cf29
6051d72691
c31a75a9ef
ef289385ff
9b12a2ad33
8eb19d88f3
e36e9d3077
b2430cbc5b
1258115397
38c134d903
cd77e4cc2d
87aedf3207
3523c9fc15
a6f4995cb5
727f61a35e
ce5124605a
2c82b03f8d
1b7a6223ac
75331c62a4
3f68a3e640
@@ -17,6 +17,7 @@ steps:
     image: thegeeklab/drone-docker-buildx
     privileged: true
     settings:
+      provenance: false
      dockerfile: app/Dockerfile
      context: app
      registry: git.mrmeeb.stream
@@ -35,6 +36,7 @@ steps:
     status:
       - success
       - failure
+      - killed
     settings:
       webhook:
         from_secret: slack_webhook
README.md (10 lines changed)

@@ -1,9 +1,7 @@
-# Simple Login
+# SimpleLogin
 
-[](https://drone.mrmeeb.stream/MrMeeb/simple-login)
+This repo exists to automatically capture any releases of the SaaS edition of SimpleLogin. It checks the simplelogin/app GitHub repo once a day, and builds the latest release automatically if it is newer than the currently built version.
 
-This repo exists to automatically capture any releases of the SaaS edition of SimpleLogin. It checks once a day, and builds the latest one automatically if it is newer than the currentlty built version.
+I did this to simplify deployment of my self-hosted SimpleLogin instance. SimpleLogin do not provide an up-to-date version for self-hosting, leaving you with the options of either running a very outdated version with no app support, a beta version, or their `simplelogin/app-ci` version. This last option works well if you use an x86 machine, but I'm running SimpleLogin on an ARM machine. Since I don't want to have to build containers on the machine itself, this repo handles that for me.
 
-This exists to simplify deployment of SimpleLogin in a self-hosted capacity, while also allowing the use of the latest version; SimpleLogin do not provide an up-to-date version for this use.
+As a result, this image is built for both amd64 and arm64 devices.
 
-The image is built for amd64 and arm64 devices.
app/.github/workflows/main.yml (vendored, 8 lines changed)

@@ -15,9 +15,15 @@ jobs:
 
       - uses: actions/setup-python@v4
         with:
-          python-version: '3.9'
+          python-version: '3.10'
           cache: 'poetry'
 
+      - name: Install OS dependencies
+        if: ${{ matrix.python-version }} == '3.10'
+        run: |
+          sudo apt update
+          sudo apt install -y libre2-dev libpq-dev
+
       - name: Install dependencies
         if: steps.cached-poetry-dependencies.outputs.cache-hit != 'true'
         run: poetry install --no-interaction
@@ -7,18 +7,19 @@ repos:
     hooks:
       - id: check-yaml
       - id: trailing-whitespace
-  - repo: https://github.com/psf/black
-    rev: 22.3.0
-    hooks:
-      - id: black
-  - repo: https://github.com/pycqa/flake8
-    rev: 3.9.2
-    hooks:
-      - id: flake8
   - repo: https://github.com/Riverside-Healthcare/djLint
     rev: v1.3.0
     hooks:
       - id: djlint-jinja
        files: '.*\.html'
        entry: djlint --reformat
+  - repo: https://github.com/astral-sh/ruff-pre-commit
+    # Ruff version.
+    rev: v0.1.5
+    hooks:
+      # Run the linter.
+      - id: ruff
+        args: [ --fix ]
+      # Run the formatter.
+      - id: ruff-format
 
@@ -34,7 +34,7 @@ poetry install
 On Mac, sometimes you might need to install some other packages via `brew`:
 
 ```bash
-brew install pkg-config libffi openssl postgresql
+brew install pkg-config libffi openssl postgresql@13
 ```
 
 You also need to install `gpg` tool, on Mac it can be done with:
@@ -169,6 +169,12 @@ For HTML templates, we use `djlint`. Before creating a pull request, please run
 poetry run djlint --check templates
 ```
 
+If some files aren't properly formatted, you can format all files with
+
+```bash
+poetry run djlint --reformat .
+```
+
 ## Test sending email
 
 [swaks](http://www.jetmore.org/john/code/swaks/) is used for sending test emails to the `email_handler`.
@@ -23,15 +23,15 @@ COPY poetry.lock pyproject.toml ./
 # Install and setup poetry
 RUN pip install -U pip \
     && apt-get update \
-    && apt install -y curl netcat gcc python3-dev gnupg git libre2-dev \
+    && apt install -y curl netcat-traditional gcc python3-dev gnupg git libre2-dev cmake ninja-build\
     && curl -sSL https://install.python-poetry.org | python3 - \
     # Remove curl and netcat from the image
-    && apt-get purge -y curl netcat \
+    && apt-get purge -y curl netcat-traditional \
    # Run poetry
    && poetry config virtualenvs.create false \
    && poetry install --no-interaction --no-ansi --no-root \
    # Clear apt cache \
-    && apt-get purge -y libre2-dev \
+    && apt-get purge -y libre2-dev cmake ninja-build\
    && apt-get clean \
    && rm -rf /var/lib/apt/lists/*
 
@@ -74,7 +74,7 @@ Setting up DKIM is highly recommended to reduce the chance your emails ending up
 First you need to generate a private and public key for DKIM:
 
 ```bash
-openssl genrsa -out dkim.key 1024
+openssl genrsa -out dkim.key -traditional 1024
 openssl rsa -in dkim.key -pubout -out dkim.pub.key
 ```
 
@@ -510,11 +510,14 @@ server {
     server_name app.mydomain.com;
 
     location / {
         proxy_pass http://localhost:7777;
+        proxy_set_header Host $host;
     }
 }
 ```
 
+Note: If `/etc/nginx/sites-enabled/default` exists, delete it or certbot will fail due to the conflict. The `simplelogin` file should be the only file in `sites-enabled`.
+
 Reload Nginx with the command below
 
 ```bash
@@ -5,13 +5,15 @@ from typing import Optional
 
 from arrow import Arrow
 from newrelic import agent
+from sqlalchemy import or_
 
 from app.db import Session
 from app.email_utils import send_welcome_email
-from app.utils import sanitize_email
+from app.utils import sanitize_email, canonicalize_email
 from app.errors import (
     AccountAlreadyLinkedToAnotherPartnerException,
     AccountIsUsingAliasAsEmail,
+    AccountAlreadyLinkedToAnotherUserException,
 )
 from app.log import LOG
 from app.models import (
@@ -130,8 +132,9 @@ class ClientMergeStrategy(ABC):
 class NewUserStrategy(ClientMergeStrategy):
     def process(self) -> LinkResult:
         # Will create a new SL User with a random password
+        canonical_email = canonicalize_email(self.link_request.email)
         new_user = User.create(
-            email=self.link_request.email,
+            email=canonical_email,
             name=self.link_request.name,
             password=random_string(20),
             activated=True,
@@ -165,7 +168,6 @@ class NewUserStrategy(ClientMergeStrategy):
 
 class ExistingUnlinkedUserStrategy(ClientMergeStrategy):
     def process(self) -> LinkResult:
-
         partner_user = ensure_partner_user_exists_for_user(
             self.link_request, self.user, self.partner
         )
@@ -179,7 +181,7 @@ class ExistingUnlinkedUserStrategy(ClientMergeStrategy):
 
 class LinkedWithAnotherPartnerUserStrategy(ClientMergeStrategy):
     def process(self) -> LinkResult:
-        raise AccountAlreadyLinkedToAnotherPartnerException()
+        raise AccountAlreadyLinkedToAnotherUserException()
 
 
 def get_login_strategy(
@@ -207,15 +209,26 @@ def process_login_case(
 ) -> LinkResult:
     # Sanitize email just in case
     link_request.email = sanitize_email(link_request.email)
-    check_alias(link_request.email)
     # Try to find a SimpleLogin user registered with that partner user id
     partner_user = PartnerUser.get_by(
         partner_id=partner.id, external_user_id=link_request.external_user_id
     )
     if partner_user is None:
+        canonical_email = canonicalize_email(link_request.email)
         # We didn't find any SimpleLogin user registered with that partner user id
+        # Make sure they aren't using an alias as their link email
+        check_alias(link_request.email)
+        check_alias(canonical_email)
         # Try to find it using the partner's e-mail address
-        user = User.get_by(email=link_request.email)
+        users = User.filter(
+            or_(User.email == link_request.email, User.email == canonical_email)
+        ).all()
+        if len(users) > 1:
+            user = [user for user in users if user.email == canonical_email][0]
+        elif len(users) == 1:
+            user = users[0]
+        else:
+            user = None
         return get_login_strategy(link_request, user, partner).process()
     else:
        # We found the SL user registered with that partner user id
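The lookup above tolerates accounts that were registered with either the raw or the canonicalized form of the partner e-mail, and prefers the canonical match when both exist. A minimal sketch of that selection rule, using plain Python stand-ins for the ORM objects (the `FakeUser` class and the sample addresses are illustrative, not part of the codebase):

```python
from dataclasses import dataclass


@dataclass
class FakeUser:  # illustrative stand-in for app.models.User
    email: str


def pick_user(users, canonical_email):
    """Mirror the branch in process_login_case: prefer the canonical match."""
    if len(users) > 1:
        return [u for u in users if u.email == canonical_email][0]
    if len(users) == 1:
        return users[0]
    return None


matches = [FakeUser("john.doe+tag@example.com"), FakeUser("johndoe@example.com")]
print(pick_user(matches, "johndoe@example.com").email)  # johndoe@example.com
```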
@@ -256,6 +256,17 @@ class UserAdmin(SLModelView):
 
             Session.commit()
 
+    @action(
+        "clear_delete_on",
+        "Remove scheduled deletion of user",
+        "This will remove the scheduled deletion for this users",
+    )
+    def clean_delete_on(self, ids):
+        for user in User.filter(User.id.in_(ids)):
+            user.delete_on = None
+
+        Session.commit()
+
     # @action(
     #     "login_as",
     #     "Login as this user",
@@ -600,6 +611,26 @@ class NewsletterAdmin(SLModelView):
         else:
             flash(error_msg, "error")
 
+    @action(
+        "clone_newsletter",
+        "Clone this newsletter",
+    )
+    def clone_newsletter(self, newsletter_ids):
+        if len(newsletter_ids) != 1:
+            flash("you can only select 1 newsletter", "error")
+            return
+
+        newsletter_id = newsletter_ids[0]
+        newsletter: Newsletter = Newsletter.get(newsletter_id)
+        new_newsletter = Newsletter.create(
+            subject=newsletter.subject,
+            html=newsletter.html,
+            plain_text=newsletter.plain_text,
+            commit=True,
+        )
+
+        flash(f"Newsletter {new_newsletter.subject} has been cloned", "success")
 
 
 class NewsletterUserAdmin(SLModelView):
     column_searchable_list = ["id"]
@@ -6,8 +6,7 @@ from typing import Optional
 import itsdangerous
 from app import config
 from app.log import LOG
-from app.models import User
+from app.models import User, AliasOptions, SLDomain
 
-
 signer = itsdangerous.TimestampSigner(config.CUSTOM_ALIAS_SECRET)
 
@@ -43,7 +42,9 @@ def check_suffix_signature(signed_suffix: str) -> Optional[str]:
         return None
 
 
-def verify_prefix_suffix(user: User, alias_prefix, alias_suffix) -> bool:
+def verify_prefix_suffix(
+    user: User, alias_prefix, alias_suffix, alias_options: Optional[AliasOptions] = None
+) -> bool:
     """verify if user could create an alias with the given prefix and suffix"""
     if not alias_prefix or not alias_suffix:  # should be caught on frontend
         return False
@@ -56,7 +57,7 @@ def verify_prefix_suffix(user: User, alias_prefix, alias_suffix) -> bool:
     alias_domain_prefix, alias_domain = alias_suffix.split("@", 1)
 
     # alias_domain must be either one of user custom domains or built-in domains
-    if alias_domain not in user.available_alias_domains():
+    if alias_domain not in user.available_alias_domains(alias_options=alias_options):
         LOG.e("wrong alias suffix %s, user %s", alias_suffix, user)
         return False
 
@@ -64,12 +65,11 @@ def verify_prefix_suffix(user: User, alias_prefix, alias_suffix) -> bool:
     # 1) alias_suffix must start with "." and
     # 2) alias_domain_prefix must come from the word list
     if (
-        alias_domain in user.available_sl_domains()
+        alias_domain in user.available_sl_domains(alias_options=alias_options)
         and alias_domain not in user_custom_domains
         # when DISABLE_ALIAS_SUFFIX is true, alias_domain_prefix is empty
         and not config.DISABLE_ALIAS_SUFFIX
     ):
-
         if not alias_domain_prefix.startswith("."):
             LOG.e("User %s submits a wrong alias suffix %s", user, alias_suffix)
             return False
@@ -80,14 +80,18 @@ def verify_prefix_suffix(user: User, alias_prefix, alias_suffix) -> bool:
             LOG.e("wrong alias suffix %s, user %s", alias_suffix, user)
             return False
 
-        if alias_domain not in user.available_sl_domains():
+        if alias_domain not in user.available_sl_domains(
+            alias_options=alias_options
+        ):
             LOG.e("wrong alias suffix %s, user %s", alias_suffix, user)
             return False
 
     return True
 
 
-def get_alias_suffixes(user: User) -> [AliasSuffix]:
+def get_alias_suffixes(
+    user: User, alias_options: Optional[AliasOptions] = None
+) -> [AliasSuffix]:
     """
     Similar to as get_available_suffixes() but also return custom domain that doesn't have MX set up.
     """
@@ -99,7 +103,9 @@ def get_alias_suffixes(user: User) -> [AliasSuffix]:
     # for each user domain, generate both the domain and a random suffix version
     for custom_domain in user_custom_domains:
         if custom_domain.random_prefix_generation:
-            suffix = "." + user.get_random_alias_suffix() + "@" + custom_domain.domain
+            suffix = (
+                f".{user.get_random_alias_suffix(custom_domain)}@{custom_domain.domain}"
+            )
             alias_suffix = AliasSuffix(
                 is_custom=True,
                 suffix=suffix,
@@ -113,7 +119,7 @@ def get_alias_suffixes(user: User) -> [AliasSuffix]:
             else:
                 alias_suffixes.append(alias_suffix)
 
-        suffix = "@" + custom_domain.domain
+        suffix = f"@{custom_domain.domain}"
         alias_suffix = AliasSuffix(
             is_custom=True,
             suffix=suffix,
@@ -134,16 +140,13 @@ def get_alias_suffixes(user: User) -> [AliasSuffix]:
             alias_suffixes.append(alias_suffix)
 
     # then SimpleLogin domain
-    for sl_domain in user.get_sl_domains():
-        suffix = (
-            (
-                ""
-                if config.DISABLE_ALIAS_SUFFIX
-                else "." + user.get_random_alias_suffix()
-            )
-            + "@"
-            + sl_domain.domain
+    sl_domains = user.get_sl_domains(alias_options=alias_options)
+    default_domain_found = False
+    for sl_domain in sl_domains:
+        prefix = (
+            "" if config.DISABLE_ALIAS_SUFFIX else f".{user.get_random_alias_suffix()}"
         )
+        suffix = f"{prefix}@{sl_domain.domain}"
         alias_suffix = AliasSuffix(
             is_custom=False,
             suffix=suffix,
@@ -152,11 +155,36 @@ def get_alias_suffixes(user: User) -> [AliasSuffix]:
             domain=sl_domain.domain,
             mx_verified=True,
         )
-        # put the default domain to top
-        if user.default_alias_public_domain_id == sl_domain.id:
-            alias_suffixes.insert(0, alias_suffix)
-        else:
+        # No default or this is not the default
+        if (
+            user.default_alias_public_domain_id is None
+            or user.default_alias_public_domain_id != sl_domain.id
+        ):
             alias_suffixes.append(alias_suffix)
+        else:
+            default_domain_found = True
+            alias_suffixes.insert(0, alias_suffix)
 
+    if not default_domain_found:
+        domain_conditions = {"id": user.default_alias_public_domain_id, "hidden": False}
+        if not user.is_premium():
+            domain_conditions["premium_only"] = False
+        sl_domain = SLDomain.get_by(**domain_conditions)
+        if sl_domain:
+            prefix = (
+                ""
+                if config.DISABLE_ALIAS_SUFFIX
+                else f".{user.get_random_alias_suffix()}"
+            )
+            suffix = f"{prefix}@{sl_domain.domain}"
+            alias_suffix = AliasSuffix(
+                is_custom=False,
+                suffix=suffix,
+                signed_suffix=signer.sign(suffix).decode(),
+                is_premium=sl_domain.premium_only,
+                domain=sl_domain.domain,
+                mx_verified=True,
+            )
+            alias_suffixes.insert(0, alias_suffix)
 
     return alias_suffixes
@@ -21,6 +21,8 @@ from app.email_utils import (
     send_cannot_create_directory_alias_disabled,
     get_email_local_part,
     send_cannot_create_domain_alias,
+    send_email,
+    render,
 )
 from app.errors import AliasInTrashError
 from app.log import LOG
@@ -36,6 +38,8 @@ from app.models import (
     EmailLog,
     Contact,
     AutoCreateRule,
+    AliasUsedOn,
+    ClientUser,
 )
 from app.regex_utils import regex_match
 
@@ -57,6 +61,8 @@ def get_user_if_alias_would_auto_create(
     domain_and_rule = check_if_alias_can_be_auto_created_for_custom_domain(
         address, notify_user=notify_user
     )
+    if DomainDeletedAlias.get_by(email=address):
+        return None
     if domain_and_rule:
         return domain_and_rule[0].user
     directory = check_if_alias_can_be_auto_created_for_a_directory(
@@ -397,3 +403,58 @@ def alias_export_csv(user, csv_direct_export=False):
     output.headers["Content-Disposition"] = "attachment; filename=aliases.csv"
     output.headers["Content-type"] = "text/csv"
     return output
+
+
+def transfer_alias(alias, new_user, new_mailboxes: [Mailbox]):
+    # cannot transfer alias which is used for receiving newsletter
+    if User.get_by(newsletter_alias_id=alias.id):
+        raise Exception("Cannot transfer alias that's used to receive newsletter")
+
+    # update user_id
+    Session.query(Contact).filter(Contact.alias_id == alias.id).update(
+        {"user_id": new_user.id}
+    )
+
+    Session.query(AliasUsedOn).filter(AliasUsedOn.alias_id == alias.id).update(
+        {"user_id": new_user.id}
+    )
+
+    Session.query(ClientUser).filter(ClientUser.alias_id == alias.id).update(
+        {"user_id": new_user.id}
+    )
+
+    # remove existing mailboxes from the alias
+    Session.query(AliasMailbox).filter(AliasMailbox.alias_id == alias.id).delete()
+
+    # set mailboxes
+    alias.mailbox_id = new_mailboxes.pop().id
+    for mb in new_mailboxes:
+        AliasMailbox.create(alias_id=alias.id, mailbox_id=mb.id)
+
+    # alias has never been transferred before
+    if not alias.original_owner_id:
+        alias.original_owner_id = alias.user_id
+
+    # inform previous owner
+    old_user = alias.user
+    send_email(
+        old_user.email,
+        f"Alias {alias.email} has been received",
+        render(
+            "transactional/alias-transferred.txt",
+            alias=alias,
+        ),
+        render(
+            "transactional/alias-transferred.html",
+            alias=alias,
+        ),
+    )
+
+    # now the alias belongs to the new user
+    alias.user_id = new_user.id
+
+    # set some fields back to default
+    alias.disable_pgp = False
+    alias.pinned = False
+
+    Session.commit()
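The relocated `transfer_alias` helper is consumed later in this diff by the dashboard transfer view. A hedged usage sketch (the `alias`, `new_owner`, and `mailboxes` objects are assumed to be loaded via the ORM elsewhere):

```python
# Sketch only. Note that transfer_alias pops from the mailbox list it receives,
# so pass a copy if the caller still needs the original sequence afterwards.
transfer_alias(alias, new_owner, list(mailboxes))
```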
@@ -16,3 +16,22 @@ from .views import (
     sudo,
     user,
 )
+
+__all__ = [
+    "alias_options",
+    "new_custom_alias",
+    "custom_domain",
+    "new_random_alias",
+    "user_info",
+    "auth",
+    "auth_mfa",
+    "alias",
+    "apple",
+    "mailbox",
+    "notification",
+    "setting",
+    "export",
+    "phone",
+    "sudo",
+    "user",
+]
@@ -24,12 +24,14 @@ from app.errors import (
     ErrContactAlreadyExists,
     ErrAddressInvalid,
 )
+from app.extensions import limiter
 from app.models import Alias, Contact, Mailbox, AliasMailbox
 
 
 @deprecated
 @api_bp.route("/aliases", methods=["GET", "POST"])
 @require_api_auth
+@limiter.limit("10/minute", key_func=lambda: g.user.id)
 def get_aliases():
     """
     Get aliases
@@ -72,6 +74,7 @@ def get_aliases():
 
 @api_bp.route("/v2/aliases", methods=["GET", "POST"])
 @require_api_auth
+@limiter.limit("50/minute", key_func=lambda: g.user.id)
 def get_aliases_v2():
     """
     Get aliases
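The new decorators rate-limit these endpoints per authenticated user rather than per client IP, by handing Flask-Limiter a `key_func` that returns the user id from the request context. A standalone sketch of the same idea (route name and limits are illustrative; the constructor below uses the Flask-Limiter 3.x style, older releases take `Limiter(app, key_func=...)` instead):

```python
from flask import Flask, g, jsonify
from flask_limiter import Limiter
from flask_limiter.util import get_remote_address

app = Flask(__name__)
# Default key: client IP; individual routes can override it with key_func.
limiter = Limiter(get_remote_address, app=app)


@app.route("/demo")
@limiter.limit("10/minute", key_func=lambda: str(getattr(g, "user_id", "anonymous")))
def demo():
    # All requests sharing the same g.user_id share one 10-per-minute bucket.
    return jsonify(ok=True)
```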
@@ -9,6 +9,7 @@ from requests import RequestException
 
 from app.api.base import api_bp, require_api_auth
 from app.config import APPLE_API_SECRET, MACAPP_APPLE_API_SECRET
+from app.subscription_webhook import execute_subscription_webhook
 from app.db import Session
 from app.log import LOG
 from app.models import PlanEnum, AppleSubscription
@@ -50,6 +51,7 @@ def apple_process_payment():
 
     apple_sub = verify_receipt(receipt_data, user, password)
     if apple_sub:
+        execute_subscription_webhook(user)
         return jsonify(ok=True), 200
 
     return jsonify(error="Processing failed"), 400
@@ -282,6 +284,7 @@ def apple_update_notification():
             apple_sub.plan = plan
             apple_sub.product_id = transaction["product_id"]
             Session.commit()
+            execute_subscription_webhook(user)
             return jsonify(ok=True), 200
         else:
             LOG.w(
@@ -554,6 +557,7 @@ def verify_receipt(receipt_data, user, password) -> Optional[AppleSubscription]:
         product_id=latest_transaction["product_id"],
     )
 
+    execute_subscription_webhook(user)
     Session.commit()
 
     return apple_sub
@@ -63,6 +63,11 @@ def auth_login():
     elif user.disabled:
         LoginEvent(LoginEvent.ActionType.disabled_login, LoginEvent.Source.api).send()
         return jsonify(error="Account disabled"), 400
+    elif user.delete_on is not None:
+        LoginEvent(
+            LoginEvent.ActionType.scheduled_to_be_deleted, LoginEvent.Source.api
+        ).send()
+        return jsonify(error="Account scheduled for deletion"), 400
     elif not user.activated:
         LoginEvent(LoginEvent.ActionType.not_activated, LoginEvent.Source.api).send()
         return jsonify(error="Account not activated"), 422
@@ -357,7 +362,7 @@ def auth_payload(user, device) -> dict:
 
 
 @api_bp.route("/auth/forgot_password", methods=["POST"])
-@limiter.limit("10/minute")
+@limiter.limit("2/minute")
 def forgot_password():
     """
     User forgot password
@@ -13,8 +13,8 @@ from app.db import Session
 from app.email_utils import (
     mailbox_already_used,
     email_can_be_used_as_mailbox,
-    is_valid_email,
 )
+from app.email_validation import is_valid_email
 from app.log import LOG
 from app.models import Mailbox, Job
 from app.utils import sanitize_email
@@ -45,7 +45,7 @@ def create_mailbox():
     mailbox_email = sanitize_email(request.get_json().get("email"))
 
     if not user.is_premium():
-        return jsonify(error=f"Only premium plan can add additional mailbox"), 400
+        return jsonify(error="Only premium plan can add additional mailbox"), 400
 
     if not is_valid_email(mailbox_email):
         return jsonify(error=f"{mailbox_email} invalid"), 400
@@ -150,7 +150,7 @@ def new_custom_alias_v3():
     if not data:
         return jsonify(error="request body cannot be empty"), 400
 
-    if type(data) is not dict:
+    if not isinstance(data, dict):
         return jsonify(error="request body does not follow the required format"), 400
 
     alias_prefix = data.get("alias_prefix", "").strip().lower().replace(" ", "")
@@ -168,7 +168,7 @@ def new_custom_alias_v3():
     return jsonify(error="alias prefix invalid format or too long"), 400
 
     # check if mailbox is not tempered with
-    if type(mailbox_ids) is not list:
+    if not isinstance(mailbox_ids, list):
         return jsonify(error="mailbox_ids must be an array of id"), 400
     mailboxes = []
     for mailbox_id in mailbox_ids:
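The switch from `type(x) is dict` to `isinstance(x, dict)` keeps the same guard but is the idiomatic form and also accepts subclasses of the expected type. A quick illustration:

```python
from collections import OrderedDict

payload = OrderedDict(alias_prefix="hello")

print(type(payload) is dict)      # False: OrderedDict is not exactly dict
print(isinstance(payload, dict))  # True: dict subclasses pass the check
```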
@@ -1,4 +1,5 @@
 import base64
+import dataclasses
 from io import BytesIO
 from typing import Optional
 
@@ -7,6 +8,7 @@ from flask import jsonify, g, request, make_response
 from app import s3, config
 from app.api.base import api_bp, require_api_auth
 from app.config import SESSION_COOKIE_NAME
+from app.dashboard.views.index import get_stats
 from app.db import Session
 from app.models import ApiKey, File, PartnerUser, User
 from app.proton.utils import get_proton_partner
@@ -30,6 +32,7 @@ def user_to_dict(user: User) -> dict:
         "in_trial": user.in_trial(),
         "max_alias_free_plan": user.max_alias_for_free_account(),
         "connected_proton_address": None,
+        "can_create_reverse_alias": user.can_create_contacts(),
     }
 
     if config.CONNECT_WITH_PROTON:
@@ -56,6 +59,7 @@ def user_info():
     - in_trial
     - max_alias_free
     - is_connected_with_proton
+    - can_create_reverse_alias
     """
     user = g.user
 
@@ -136,3 +140,22 @@ def logout():
     response.delete_cookie(SESSION_COOKIE_NAME)
 
     return response
+
+
+@api_bp.route("/stats")
+@require_api_auth
+def user_stats():
+    """
+    Return stats
+
+    Output as json
+    - nb_alias
+    - nb_forward
+    - nb_reply
+    - nb_block
+
+    """
+    user = g.user
+    stats = get_stats(user)
+
+    return jsonify(dataclasses.asdict(stats))
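The new endpoint exposes the same counters the dashboard index already computes. A hedged client-side sketch; the base URL is an example, and the `Authentication` header carrying an API key follows the public SimpleLogin API convention (treat both as assumptions for your own deployment):

```python
import requests

API_BASE = "https://app.simplelogin.io/api"  # example base URL
API_KEY = "..."  # an API key created from the dashboard

resp = requests.get(f"{API_BASE}/stats", headers={"Authentication": API_KEY})
resp.raise_for_status()
print(resp.json())  # e.g. {"nb_alias": 42, "nb_block": 0, "nb_forward": 10, "nb_reply": 3}
```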
@@ -17,3 +17,23 @@ from .views import (
     recovery,
     api_to_cookie,
 )
+
+__all__ = [
+    "login",
+    "logout",
+    "register",
+    "activate",
+    "resend_activation",
+    "reset_password",
+    "forgot_password",
+    "github",
+    "google",
+    "facebook",
+    "proton",
+    "change_email",
+    "mfa",
+    "fido",
+    "social",
+    "recovery",
+    "api_to_cookie",
+]
@@ -62,7 +62,7 @@ def fido():
     browser = MfaBrowser.get_by(token=request.cookies.get("mfa"))
     if browser and not browser.is_expired() and browser.user_id == user.id:
         login_user(user)
-        flash(f"Welcome back!", "success")
+        flash("Welcome back!", "success")
         # Redirect user to correct page
         return redirect(next_url or url_for("dashboard.index"))
     else:
@@ -110,7 +110,7 @@ def fido():
 
     session["sudo_time"] = int(time())
     login_user(user)
-    flash(f"Welcome back!", "success")
+    flash("Welcome back!", "success")
 
     # Redirect user to correct page
     response = make_response(redirect(next_url or url_for("dashboard.index")))
@@ -1,4 +1,4 @@
-from flask import request, render_template, redirect, url_for, flash, g
+from flask import request, render_template, flash, g
 from flask_wtf import FlaskForm
 from wtforms import StringField, validators
 
@@ -16,7 +16,7 @@ class ForgotPasswordForm(FlaskForm):
 
 @auth_bp.route("/forgot_password", methods=["GET", "POST"])
 @limiter.limit(
-    "10/minute", deduct_when=lambda r: hasattr(g, "deduct_limit") and g.deduct_limit
+    "10/hour", deduct_when=lambda r: hasattr(g, "deduct_limit") and g.deduct_limit
 )
 def forgot_password():
     form = ForgotPasswordForm(request.form)
@@ -37,6 +37,5 @@ def forgot_password():
         if user:
             LOG.d("Send forgot password email to %s", user)
             send_reset_password_email(user)
-            return redirect(url_for("auth.forgot_password"))
 
     return render_template("auth/forgot_password.html", form=form)
@@ -54,6 +54,12 @@ def login():
                 "error",
             )
             LoginEvent(LoginEvent.ActionType.disabled_login).send()
+        elif user.delete_on is not None:
+            flash(
+                f"Your account is scheduled to be deleted on {user.delete_on}",
+                "error",
+            )
+            LoginEvent(LoginEvent.ActionType.scheduled_to_be_deleted).send()
         elif not user.activated:
             show_resend_activation = True
             flash(
@@ -55,7 +55,7 @@ def mfa():
     browser = MfaBrowser.get_by(token=request.cookies.get("mfa"))
     if browser and not browser.is_expired() and browser.user_id == user.id:
         login_user(user)
-        flash(f"Welcome back!", "success")
+        flash("Welcome back!", "success")
         # Redirect user to correct page
         return redirect(next_url or url_for("dashboard.index"))
     else:
@@ -73,7 +73,7 @@ def mfa():
     Session.commit()
 
     login_user(user)
-    flash(f"Welcome back!", "success")
+    flash("Welcome back!", "success")
 
     # Redirect user to correct page
     response = make_response(redirect(next_url or url_for("dashboard.index")))
@@ -53,7 +53,7 @@ def recovery_route():
     del session[MFA_USER_ID]
 
     login_user(user)
-    flash(f"Welcome back!", "success")
+    flash("Welcome back!", "success")
 
     recovery_code.used = True
     recovery_code.used_at = arrow.now()
@@ -94,9 +94,7 @@ def register():
         try:
             send_activation_email(user, next_url)
             RegisterEvent(RegisterEvent.ActionType.success).send()
-            DailyMetric.get_or_create_today_metric().nb_new_web_non_proton_user += (
-                1
-            )
+            DailyMetric.get_or_create_today_metric().nb_new_web_non_proton_user += 1
             Session.commit()
         except Exception:
             flash("Invalid email, are you sure the email is correct?", "error")
@@ -60,8 +60,8 @@ def reset_password():
     # this can be served to activate user too
     user.activated = True
 
-    # remove the reset password code
-    ResetPasswordCode.delete(reset_password_code.id)
+    # remove all reset password codes
+    ResetPasswordCode.filter_by(user_id=user.id).delete()
 
     # change the alternative_id to log user out on other browsers
     user.alternative_id = str(uuid.uuid4())
@@ -179,6 +179,7 @@ AWS_REGION = os.environ.get("AWS_REGION") or "eu-west-3"
 BUCKET = os.environ.get("BUCKET")
 AWS_ACCESS_KEY_ID = os.environ.get("AWS_ACCESS_KEY_ID")
 AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY")
+AWS_ENDPOINT_URL = os.environ.get("AWS_ENDPOINT_URL", None)
 
 # Paddle
 try:
@@ -488,7 +489,34 @@ def setup_nameservers():
 
 NAMESERVERS = setup_nameservers()
 
-DISABLE_CREATE_CONTACTS_FOR_FREE_USERS = False
+DISABLE_CREATE_CONTACTS_FOR_FREE_USERS = os.environ.get(
+    "DISABLE_CREATE_CONTACTS_FOR_FREE_USERS", False
+)
+
+
+# Expect format hits,seconds:hits,seconds...
+# Example 1,10:4,60 means 1 in the last 10 secs or 4 in the last 60 secs
+def getRateLimitFromConfig(
+    env_var: string, default: string = ""
+) -> list[tuple[int, int]]:
+    value = os.environ.get(env_var, default)
+    if not value:
+        return []
+    entries = [entry for entry in value.split(":")]
+    limits = []
+    for entry in entries:
+        fields = entry.split(",")
+        limit = (int(fields[0]), int(fields[1]))
+        limits.append(limit)
+    return limits
+
+
+ALIAS_CREATE_RATE_LIMIT_FREE = getRateLimitFromConfig(
+    "ALIAS_CREATE_RATE_LIMIT_FREE", "10,900:50,3600"
+)
+ALIAS_CREATE_RATE_LIMIT_PAID = getRateLimitFromConfig(
+    "ALIAS_CREATE_RATE_LIMIT_PAID", "50,900:200,3600"
+)
 PARTNER_API_TOKEN_SECRET = os.environ.get("PARTNER_API_TOKEN_SECRET") or (
     FLASK_SECRET + "partnerapitoken"
 )
@@ -532,3 +560,10 @@ if ENABLE_ALL_REVERSE_ALIAS_REPLACEMENT:
 SKIP_MX_LOOKUP_ON_CHECK = False
 
 DISABLE_RATE_LIMIT = "DISABLE_RATE_LIMIT" in os.environ
+
+SUBSCRIPTION_CHANGE_WEBHOOK = os.environ.get("SUBSCRIPTION_CHANGE_WEBHOOK", None)
+MAX_API_KEYS = int(os.environ.get("MAX_API_KEYS", 30))
+
+UPCLOUD_USERNAME = os.environ.get("UPCLOUD_USERNAME", None)
+UPCLOUD_PASSWORD = os.environ.get("UPCLOUD_PASSWORD", None)
+UPCLOUD_DB_ID = os.environ.get("UPCLOUD_DB_ID", None)
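The new `getRateLimitFromConfig` helper turns a `hits,seconds:hits,seconds` string into a list of `(hits, window_seconds)` tuples, so the default for free users above allows, for example, 10 alias creations per 15 minutes and 50 per hour. A standalone re-implementation of the same parsing, for illustration only:

```python
def parse_rate_limit(value: str) -> list[tuple[int, int]]:
    # "10,900:50,3600" -> [(10, 900), (50, 3600)]
    if not value:
        return []
    limits = []
    for entry in value.split(":"):
        hits, seconds = entry.split(",")
        limits.append((int(hits), int(seconds)))
    return limits


print(parse_rate_limit("10,900:50,3600"))  # [(10, 900), (50, 3600)]
```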
@@ -33,3 +33,39 @@ from .views import (
     notification,
     support,
 )
+
+__all__ = [
+    "index",
+    "pricing",
+    "setting",
+    "custom_alias",
+    "subdomain",
+    "billing",
+    "alias_log",
+    "alias_export",
+    "unsubscribe",
+    "api_key",
+    "custom_domain",
+    "alias_contact_manager",
+    "enter_sudo",
+    "mfa_setup",
+    "mfa_cancel",
+    "fido_setup",
+    "coupon",
+    "fido_manage",
+    "domain_detail",
+    "lifetime_licence",
+    "directory",
+    "mailbox",
+    "mailbox_detail",
+    "refused_email",
+    "referral",
+    "contact_detail",
+    "setup_done",
+    "batch_import",
+    "alias_transfer",
+    "app",
+    "delete_account",
+    "notification",
+    "support",
+]
@@ -13,10 +13,10 @@ from app import config, parallel_limiter
 from app.dashboard.base import dashboard_bp
 from app.db import Session
 from app.email_utils import (
-    is_valid_email,
     generate_reply_email,
     parse_full_address,
 )
+from app.email_validation import is_valid_email
 from app.errors import (
     CannotCreateContactForReverseAlias,
     ErrContactErrorUpgradeNeeded,
@@ -51,14 +51,6 @@ def email_validator():
     return _check
 
 
-def user_can_create_contacts(user: User) -> bool:
-    if user.is_premium():
-        return True
-    if user.flags & User.FLAG_FREE_DISABLE_CREATE_ALIAS == 0:
-        return True
-    return not config.DISABLE_CREATE_CONTACTS_FOR_FREE_USERS
-
-
 def create_contact(user: User, alias: Alias, contact_address: str) -> Contact:
     """
     Create a contact for a user. Can be restricted for new free users by enabling DISABLE_CREATE_CONTACTS_FOR_FREE_USERS.
@@ -82,7 +74,7 @@ def create_contact(user: User, alias: Alias, contact_address: str) -> Contact:
     if contact:
         raise ErrContactAlreadyExists(contact)
 
-    if not user_can_create_contacts(user):
+    if not user.can_create_contacts():
         raise ErrContactErrorUpgradeNeeded()
 
     contact = Contact.create(
@@ -90,7 +82,7 @@ def create_contact(user: User, alias: Alias, contact_address: str) -> Contact:
         alias_id=alias.id,
         website_email=contact_email,
         name=contact_name,
-        reply_email=generate_reply_email(contact_email, user),
+        reply_email=generate_reply_email(contact_email, alias),
     )
 
     LOG.d(
@@ -327,6 +319,6 @@ def alias_contact_manager(alias_id):
         last_page=last_page,
         query=query,
         nb_contact=nb_contact,
-        can_create_contacts=user_can_create_contacts(current_user),
+        can_create_contacts=current_user.can_create_contacts(),
         csrf_form=csrf_form,
     )
@@ -87,6 +87,6 @@ def get_alias_log(alias: Alias, page_id=0) -> [AliasLog]:
             contact=contact,
         )
         logs.append(al)
-    logs = sorted(logs, key=lambda l: l.when, reverse=True)
+    logs = sorted(logs, key=lambda log: log.when, reverse=True)
 
     return logs
@@ -7,79 +7,19 @@ from flask import render_template, redirect, url_for, flash, request
 from flask_login import login_required, current_user
 
 from app import config
+from app.alias_utils import transfer_alias
 from app.dashboard.base import dashboard_bp
 from app.dashboard.views.enter_sudo import sudo_required
 from app.db import Session
-from app.email_utils import send_email, render
 from app.extensions import limiter
 from app.log import LOG
 from app.models import (
     Alias,
-    Contact,
-    AliasUsedOn,
-    AliasMailbox,
-    User,
-    ClientUser,
 )
 from app.models import Mailbox
 from app.utils import CSRFValidationForm
 
 
-def transfer(alias, new_user, new_mailboxes: [Mailbox]):
-    # cannot transfer alias which is used for receiving newsletter
-    if User.get_by(newsletter_alias_id=alias.id):
-        raise Exception("Cannot transfer alias that's used to receive newsletter")
-
-    # update user_id
-    Session.query(Contact).filter(Contact.alias_id == alias.id).update(
-        {"user_id": new_user.id}
-    )
-
-    Session.query(AliasUsedOn).filter(AliasUsedOn.alias_id == alias.id).update(
-        {"user_id": new_user.id}
-    )
-
-    Session.query(ClientUser).filter(ClientUser.alias_id == alias.id).update(
-        {"user_id": new_user.id}
-    )
-
-    # remove existing mailboxes from the alias
-    Session.query(AliasMailbox).filter(AliasMailbox.alias_id == alias.id).delete()
-
-    # set mailboxes
-    alias.mailbox_id = new_mailboxes.pop().id
-    for mb in new_mailboxes:
-        AliasMailbox.create(alias_id=alias.id, mailbox_id=mb.id)
-
-    # alias has never been transferred before
-    if not alias.original_owner_id:
-        alias.original_owner_id = alias.user_id
-
-    # inform previous owner
-    old_user = alias.user
-    send_email(
-        old_user.email,
-        f"Alias {alias.email} has been received",
-        render(
-            "transactional/alias-transferred.txt",
-            alias=alias,
-        ),
-        render(
-            "transactional/alias-transferred.html",
-            alias=alias,
-        ),
-    )
-
-    # now the alias belongs to the new user
-    alias.user_id = new_user.id
-
-    # set some fields back to default
-    alias.disable_pgp = False
-    alias.pinned = False
-
-    Session.commit()
-
-
 def hmac_alias_transfer_token(transfer_token: str) -> str:
     alias_hmac = hmac.new(
         config.ALIAS_TRANSFER_TOKEN_SECRET.encode("utf-8"),
@@ -214,7 +154,7 @@ def alias_transfer_receive_route():
         mailboxes,
         token,
     )
-    transfer(alias, current_user, mailboxes)
+    transfer_alias(alias, current_user, mailboxes)
 
     # reset transfer token
     alias.transfer_token = None
|
|||||||
from flask_wtf import FlaskForm
|
from flask_wtf import FlaskForm
|
||||||
from wtforms import StringField, validators
|
from wtforms import StringField, validators
|
||||||
|
|
||||||
|
from app import config
|
||||||
from app.dashboard.base import dashboard_bp
|
from app.dashboard.base import dashboard_bp
|
||||||
from app.dashboard.views.enter_sudo import sudo_required
|
from app.dashboard.views.enter_sudo import sudo_required
|
||||||
from app.db import Session
|
from app.db import Session
|
||||||
|
from app.extensions import limiter
|
||||||
from app.models import ApiKey
|
from app.models import ApiKey
|
||||||
from app.utils import CSRFValidationForm
|
from app.utils import CSRFValidationForm
|
||||||
|
|
||||||
@ -14,9 +16,34 @@ class NewApiKeyForm(FlaskForm):
|
|||||||
name = StringField("Name", validators=[validators.DataRequired()])
|
name = StringField("Name", validators=[validators.DataRequired()])
|
||||||
|
|
||||||
|
|
||||||
|
def clean_up_unused_or_old_api_keys(user_id: int):
|
||||||
|
total_keys = ApiKey.filter_by(user_id=user_id).count()
|
||||||
|
if total_keys <= config.MAX_API_KEYS:
|
||||||
|
return
|
||||||
|
# Remove oldest unused
|
||||||
|
for api_key in (
|
||||||
|
ApiKey.filter_by(user_id=user_id, last_used=None)
|
||||||
|
.order_by(ApiKey.created_at.asc())
|
||||||
|
.all()
|
||||||
|
):
|
||||||
|
Session.delete(api_key)
|
||||||
|
total_keys -= 1
|
||||||
|
if total_keys <= config.MAX_API_KEYS:
|
||||||
|
return
|
||||||
|
# Clean up oldest used
|
||||||
|
for api_key in (
|
||||||
|
ApiKey.filter_by(user_id=user_id).order_by(ApiKey.last_used.asc()).all()
|
||||||
|
):
|
||||||
|
Session.delete(api_key)
|
||||||
|
total_keys -= 1
|
||||||
|
if total_keys <= config.MAX_API_KEYS:
|
||||||
|
return
|
||||||
|
|
||||||
|
|
||||||
@dashboard_bp.route("/api_key", methods=["GET", "POST"])
|
@dashboard_bp.route("/api_key", methods=["GET", "POST"])
|
||||||
@login_required
|
@login_required
|
||||||
@sudo_required
|
@sudo_required
|
||||||
|
@limiter.limit("10/hour")
|
||||||
def api_key():
|
def api_key():
|
||||||
api_keys = (
|
api_keys = (
|
||||||
ApiKey.filter(ApiKey.user_id == current_user.id)
|
ApiKey.filter(ApiKey.user_id == current_user.id)
|
||||||
@ -50,6 +77,7 @@ def api_key():
|
|||||||
|
|
||||||
elif request.form.get("form-name") == "create":
|
elif request.form.get("form-name") == "create":
|
||||||
if new_api_key_form.validate():
|
if new_api_key_form.validate():
|
||||||
|
clean_up_unused_or_old_api_keys(current_user.id)
|
||||||
new_api_key = ApiKey.create(
|
new_api_key = ApiKey.create(
|
||||||
name=new_api_key_form.name.data, user_id=current_user.id
|
name=new_api_key_form.name.data, user_id=current_user.id
|
||||||
)
|
)
|
||||||
|
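With the new `MAX_API_KEYS` cap (default 30), creating a key first evicts never-used keys oldest-first, then least-recently-used keys, until the count is back under the limit. A toy model of that eviction order, independent of the ORM (dictionary fields here are illustrative stand-ins for the ApiKey columns):

```python
def eviction_order(keys: list[dict], max_keys: int) -> list[str]:
    """Never-used keys go first (oldest created first), then used keys by
    least-recent use, until the key count is back at or under max_keys."""
    never_used = sorted((k for k in keys if k["last_used"] is None), key=lambda k: k["created_at"])
    used = sorted((k for k in keys if k["last_used"] is not None), key=lambda k: k["last_used"])
    doomed, total = [], len(keys)
    for key in never_used + used:
        if total <= max_keys:
            break
        doomed.append(key["name"])
        total -= 1
    return doomed


keys = [
    {"name": "old-unused", "created_at": 1, "last_used": None},
    {"name": "stale", "created_at": 2, "last_used": 5},
    {"name": "active", "created_at": 3, "last_used": 9},
]
print(eviction_order(keys, max_keys=2))  # ['old-unused']
```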
@@ -1,14 +1,9 @@
-from app.db import Session
-
-"""
-List of apps that user has used via the "Sign in with SimpleLogin"
-"""
-
 from flask import render_template, request, flash, redirect
 from flask_login import login_required, current_user
 from sqlalchemy.orm import joinedload
 
 from app.dashboard.base import dashboard_bp
+from app.db import Session
 from app.models import (
     ClientUser,
 )
@@ -17,6 +12,10 @@ from app.models import (
 @dashboard_bp.route("/app", methods=["GET", "POST"])
 @login_required
 def app_route():
+    """
+    List of apps that user has used via the "Sign in with SimpleLogin"
+    """
+
     client_users = (
         ClientUser.filter_by(user_id=current_user.id)
         .options(joinedload(ClientUser.client))
@ -68,9 +68,14 @@ def coupon_route():
            )
            return redirect(request.url)

-        coupon.used_by_user_id = current_user.id
-        coupon.used = True
-        Session.commit()
+        updated = (
+            Session.query(Coupon)
+            .filter_by(code=code, used=False)
+            .update({"used_by_user_id": current_user.id, "used": True})
+        )
+        if updated != 1:
+            flash("Coupon is not valid", "error")
+            return redirect(request.url)

        manual_sub: ManualSubscription = ManualSubscription.get_by(
            user_id=current_user.id
@ -95,7 +100,7 @@ def coupon_route():
            commit=True,
        )
        flash(
-            f"Your account has been upgraded to Premium, thanks for your support!",
+            "Your account has been upgraded to Premium, thanks for your support!",
            "success",
        )

@ -24,6 +24,7 @@ from app.models import (
    AliasMailbox,
    DomainDeletedAlias,
)
+from app.utils import CSRFValidationForm


@dashboard_bp.route("/custom_alias", methods=["GET", "POST"])
@ -48,9 +49,13 @@ def custom_alias():
            at_least_a_premium_domain = True
            break

+    csrf_form = CSRFValidationForm()
    mailboxes = current_user.mailboxes()

    if request.method == "POST":
+        if not csrf_form.validate():
+            flash("Invalid request", "warning")
+            return redirect(request.url)
        alias_prefix = request.form.get("prefix").strip().lower().replace(" ", "")
        signed_alias_suffix = request.form.get("signed-alias-suffix")
        mailbox_ids = request.form.getlist("mailboxes")
@ -164,4 +169,5 @@ def custom_alias():
        alias_suffixes=alias_suffixes,
        at_least_a_premium_domain=at_least_a_premium_domain,
        mailboxes=mailboxes,
+        csrf_form=csrf_form,
    )
@ -67,7 +67,7 @@ def directory():
    if request.method == "POST":
        if request.form.get("form-name") == "delete":
            if not delete_dir_form.validate():
-                flash(f"Invalid request", "warning")
+                flash("Invalid request", "warning")
                return redirect(url_for("dashboard.directory"))
            dir_obj = Directory.get(delete_dir_form.directory_id.data)

@ -87,7 +87,7 @@ def directory():

        if request.form.get("form-name") == "toggle-directory":
            if not toggle_dir_form.validate():
-                flash(f"Invalid request", "warning")
+                flash("Invalid request", "warning")
                return redirect(url_for("dashboard.directory"))
            dir_id = toggle_dir_form.directory_id.data
            dir_obj = Directory.get(dir_id)
@ -109,7 +109,7 @@ def directory():

        elif request.form.get("form-name") == "update":
            if not update_dir_form.validate():
-                flash(f"Invalid request", "warning")
+                flash("Invalid request", "warning")
                return redirect(url_for("dashboard.directory"))
            dir_id = update_dir_form.directory_id.data
            dir_obj = Directory.get(dir_id)
@ -8,6 +8,7 @@ from wtforms import PasswordField, validators

from app.config import CONNECT_WITH_PROTON
from app.dashboard.base import dashboard_bp
+from app.extensions import limiter
from app.log import LOG
from app.models import PartnerUser
from app.proton.utils import get_proton_partner
@ -21,6 +22,7 @@ class LoginForm(FlaskForm):


@dashboard_bp.route("/enter_sudo", methods=["GET", "POST"])
+@limiter.limit("3/minute")
@login_required
def enter_sudo():
    password_check_form = LoginForm()
@ -52,12 +52,13 @@ def get_stats(user: User) -> Stats:


@dashboard_bp.route("/", methods=["GET", "POST"])
+@login_required
@limiter.limit(
    ALIAS_LIMIT,
    methods=["POST"],
    exempt_when=lambda: request.form.get("form-name") != "create-random-email",
)
-@login_required
+@limiter.limit("10/minute", methods=["GET"], key_func=lambda: current_user.id)
@parallel_limiter.lock(
    name="alias_creation",
    only_when=lambda: request.form.get("form-name") == "create-random-email",
@ -1,3 +1,7 @@
+import base64
+import binascii
+import json
+
import arrow
from flask import render_template, request, redirect, url_for, flash
from flask_login import login_required, current_user
@ -15,8 +19,8 @@ from app.email_utils import (
    mailbox_already_used,
    render,
    send_email,
-    is_valid_email,
)
+from app.email_validation import is_valid_email
from app.log import LOG
from app.models import Mailbox, Job
from app.utils import CSRFValidationForm
@ -180,7 +184,9 @@ def mailbox_route():

def send_verification_email(user, mailbox):
    s = TimestampSigner(MAILBOX_SECRET)
-    mailbox_id_signed = s.sign(str(mailbox.id)).decode()
+    encoded_data = json.dumps([mailbox.id, mailbox.email]).encode("utf-8")
+    b64_data = base64.urlsafe_b64encode(encoded_data)
+    mailbox_id_signed = s.sign(b64_data).decode()
    verification_url = (
        URL + "/dashboard/mailbox_verify" + f"?mailbox_id={mailbox_id_signed}"
    )
@ -205,22 +211,34 @@ def send_verification_email(user, mailbox):
@dashboard_bp.route("/mailbox_verify")
def mailbox_verify():
    s = TimestampSigner(MAILBOX_SECRET)
-    mailbox_id = request.args.get("mailbox_id")
+    mailbox_verify_request = request.args.get("mailbox_id")

    try:
-        r_id = int(s.unsign(mailbox_id, max_age=900))
+        mailbox_raw_data = s.unsign(mailbox_verify_request, max_age=900)
    except Exception:
        flash("Invalid link. Please delete and re-add your mailbox", "error")
        return redirect(url_for("dashboard.mailbox_route"))
-    else:
-        mailbox = Mailbox.get(r_id)
-        if not mailbox:
-            flash("Invalid link", "error")
-            return redirect(url_for("dashboard.mailbox_route"))
+    try:
+        decoded_data = base64.urlsafe_b64decode(mailbox_raw_data)
+    except binascii.Error:
+        flash("Invalid link. Please delete and re-add your mailbox", "error")
+        return redirect(url_for("dashboard.mailbox_route"))
+    mailbox_data = json.loads(decoded_data)
+    if not isinstance(mailbox_data, list) or len(mailbox_data) != 2:
+        flash("Invalid link. Please delete and re-add your mailbox", "error")
+        return redirect(url_for("dashboard.mailbox_route"))
+    mailbox_id = mailbox_data[0]
+    mailbox = Mailbox.get(mailbox_id)
+    if not mailbox:
+        flash("Invalid link", "error")
+        return redirect(url_for("dashboard.mailbox_route"))
+    mailbox_email = mailbox_data[1]
+    if mailbox_email != mailbox.email:
+        flash("Invalid link", "error")
+        return redirect(url_for("dashboard.mailbox_route"))

    mailbox.verified = True
    Session.commit()

    LOG.d("Mailbox %s is verified", mailbox)

    return render_template("dashboard/mailbox_validation.html", mailbox=mailbox)
@ -30,7 +30,7 @@ class ChangeEmailForm(FlaskForm):
@dashboard_bp.route("/mailbox/<int:mailbox_id>/", methods=["GET", "POST"])
@login_required
def mailbox_detail_route(mailbox_id):
-    mailbox = Mailbox.get(mailbox_id)
+    mailbox: Mailbox = Mailbox.get(mailbox_id)
    if not mailbox or mailbox.user_id != current_user.id:
        flash("You cannot see this page", "warning")
        return redirect(url_for("dashboard.index"))
@ -144,6 +144,15 @@ def mailbox_detail_route(mailbox_id):
                    url_for("dashboard.mailbox_detail_route", mailbox_id=mailbox_id)
                )

+            if mailbox.is_proton():
+                flash(
+                    "Enabling PGP for a Proton Mail mailbox is redundant and does not add any security benefit",
+                    "info",
+                )
+                return redirect(
+                    url_for("dashboard.mailbox_detail_route", mailbox_id=mailbox_id)
+                )
+
            mailbox.pgp_public_key = request.form.get("pgp")
            try:
                mailbox.pgp_finger_print = load_public_key_and_check(
@ -182,25 +191,16 @@ def mailbox_detail_route(mailbox_id):
                )
        elif request.form.get("form-name") == "generic-subject":
            if request.form.get("action") == "save":
-                if not mailbox.pgp_enabled():
-                    flash(
-                        "Generic subject can only be used on PGP-enabled mailbox",
-                        "error",
-                    )
-                    return redirect(
-                        url_for("dashboard.mailbox_detail_route", mailbox_id=mailbox_id)
-                    )
-
                mailbox.generic_subject = request.form.get("generic-subject")
                Session.commit()
-                flash("Generic subject for PGP-encrypted email is enabled", "success")
+                flash("Generic subject is enabled", "success")
                return redirect(
                    url_for("dashboard.mailbox_detail_route", mailbox_id=mailbox_id)
                )
            elif request.form.get("action") == "remove":
                mailbox.generic_subject = None
                Session.commit()
-                flash("Generic subject for PGP-encrypted email is disabled", "success")
+                flash("Generic subject is disabled", "success")
                return redirect(
                    url_for("dashboard.mailbox_detail_route", mailbox_id=mailbox_id)
                )
@ -128,7 +128,6 @@ def setting():
                new_email_valid = True
                new_email = canonicalize_email(change_email_form.email.data)
                if new_email != current_user.email and not pending_email:
-
                    # check if this email is not already used
                    if personal_email_already_used(new_email) or Alias.get_by(
                        email=new_email
@ -198,6 +197,16 @@ def setting():
                )
                return redirect(url_for("dashboard.setting"))

+            if current_user.profile_picture_id is not None:
+                current_profile_file = File.get_by(
+                    id=current_user.profile_picture_id
+                )
+                if (
+                    current_profile_file is not None
+                    and current_profile_file.user_id == current_user.id
+                ):
+                    s3.delete(current_profile_file.path)
+
            file_path = random_string(30)
            file = File.create(user_id=current_user.id, path=file_path)

@ -451,8 +460,13 @@ def send_change_email_confirmation(user: User, email_change: EmailChange):


@dashboard_bp.route("/resend_email_change", methods=["GET", "POST"])
+@limiter.limit("5/hour")
@login_required
def resend_email_change():
+    form = CSRFValidationForm()
+    if not form.validate():
+        flash("Invalid request. Please try again", "warning")
+        return redirect(url_for("dashboard.setting"))
    email_change = EmailChange.get_by(user_id=current_user.id)
    if email_change:
        # extend email change expiration
@ -472,6 +486,10 @@ def resend_email_change():
@dashboard_bp.route("/cancel_email_change", methods=["GET", "POST"])
@login_required
def cancel_email_change():
+    form = CSRFValidationForm()
+    if not form.validate():
+        flash("Invalid request. Please try again", "warning")
+        return redirect(url_for("dashboard.setting"))
    email_change = EmailChange.get_by(user_id=current_user.id)
    if email_change:
        EmailChange.delete(email_change.id)
@ -75,12 +75,11 @@ def block_contact(contact_id):
@dashboard_bp.route("/unsubscribe/encoded/<encoded_request>", methods=["GET"])
@login_required
def encoded_unsubscribe(encoded_request: str):
-
    unsub_data = UnsubscribeHandler().handle_unsubscribe_from_request(
        current_user, encoded_request
    )
    if not unsub_data:
-        flash(f"Invalid unsubscribe request", "error")
+        flash("Invalid unsubscribe request", "error")
        return redirect(url_for("dashboard.index"))
    if unsub_data.action == UnsubscribeAction.DisableAlias:
        alias = Alias.get(unsub_data.data)
@ -97,14 +96,14 @@ def encoded_unsubscribe(encoded_request: str):
            )
        )
    if unsub_data.action == UnsubscribeAction.UnsubscribeNewsletter:
-        flash(f"You've unsubscribed from the newsletter", "success")
+        flash("You've unsubscribed from the newsletter", "success")
        return redirect(
            url_for(
                "dashboard.index",
            )
        )
    if unsub_data.action == UnsubscribeAction.OriginalUnsubscribeMailto:
-        flash(f"The original unsubscribe request has been forwarded", "success")
+        flash("The original unsubscribe request has been forwarded", "success")
        return redirect(
            url_for(
                "dashboard.index",
@ -1 +1,3 @@
from .views import index, new_client, client_detail
+
+__all__ = ["index", "new_client", "client_detail"]
@ -87,7 +87,7 @@ def client_detail(client_id):
    )

    flash(
-        f"Thanks for submitting, we are informed and will come back to you asap!",
+        "Thanks for submitting, we are informed and will come back to you asap!",
        "success",
    )

@ -1 +1,3 @@
from .views import index
+
+__all__ = ["index"]
@ -34,7 +34,7 @@ def get_cname_record(hostname) -> Optional[str]:


def get_mx_domains(hostname) -> [(int, str)]:
-    """return list of (priority, domain name).
+    """return list of (priority, domain name) sorted by priority (lowest priority first)
    domain name ends with a "." at the end.
    """
    try:
@ -50,7 +50,7 @@ def get_mx_domains(hostname) -> [(int, str)]:

        ret.append((int(parts[0]), parts[1]))

-    return ret
+    return sorted(ret, key=lambda prio_domain: prio_domain[0])


_include_spf = "include:"
@ -20,6 +20,7 @@ X_SPAM_STATUS = "X-Spam-Status"
LIST_UNSUBSCRIBE = "List-Unsubscribe"
LIST_UNSUBSCRIBE_POST = "List-Unsubscribe-Post"
RETURN_PATH = "Return-Path"
+AUTHENTICATION_RESULTS = "Authentication-Results"

# headers used to DKIM sign in order of preference
DKIM_HEADERS = [
@ -32,6 +33,7 @@ DKIM_HEADERS = [
SL_DIRECTION = "X-SimpleLogin-Type"
SL_EMAIL_LOG_ID = "X-SimpleLogin-EmailLog-ID"
SL_ENVELOPE_FROM = "X-SimpleLogin-Envelope-From"
+SL_ORIGINAL_FROM = "X-SimpleLogin-Original-From"
SL_ENVELOPE_TO = "X-SimpleLogin-Envelope-To"
SL_CLIENT_IP = "X-SimpleLogin-Client-IP"

@ -54,6 +54,7 @@ from app.models import (
    IgnoreBounceSender,
    InvalidMailboxDomain,
    VerpType,
+    available_sl_email,
)
from app.utils import (
    random_string,
@ -92,7 +93,7 @@ def send_welcome_email(user):

    send_email(
        comm_email,
-        f"Welcome to SimpleLogin",
+        "Welcome to SimpleLogin",
        render("com/welcome.txt", user=user, alias=alias),
        render("com/welcome.html", user=user, alias=alias),
        unsubscribe_link,
@ -103,7 +104,7 @@ def send_welcome_email(user):
def send_trial_end_soon_email(user):
    send_email(
        user.email,
-        f"Your trial will end soon",
+        "Your trial will end soon",
        render("transactional/trial-end.txt.jinja2", user=user),
        render("transactional/trial-end.html", user=user),
        ignore_smtp_error=True,
@ -113,7 +114,7 @@ def send_trial_end_soon_email(user):
def send_activation_email(email, activation_link):
    send_email(
        email,
-        f"Just one more step to join SimpleLogin",
+        "Just one more step to join SimpleLogin",
        render(
            "transactional/activation.txt",
            activation_link=activation_link,
@ -582,6 +583,26 @@ def email_can_be_used_as_mailbox(email_address: str) -> bool:
            LOG.d("MX Domain %s %s is invalid mailbox domain", mx_domain, domain)
            return False

+    existing_user = User.get_by(email=email_address)
+    if existing_user and existing_user.disabled:
+        LOG.d(
+            f"User {existing_user} is disabled. {email_address} cannot be used for other mailbox"
+        )
+        return False
+
+    for existing_user in (
+        User.query()
+        .join(Mailbox, User.id == Mailbox.user_id)
+        .filter(Mailbox.email == email_address)
+        .group_by(User.id)
+        .all()
+    ):
+        if existing_user.disabled:
+            LOG.d(
+                f"User {existing_user} is disabled and has a mailbox with {email_address}. Id cannot be used for other mailbox"
+            )
+            return False

    return True

@ -767,7 +788,7 @@ def get_header_unicode(header: Union[str, Header]) -> str:
    ret = ""
    for to_decoded_str, charset in decode_header(header):
        if charset is None:
-            if type(to_decoded_str) is bytes:
+            if isinstance(to_decoded_str, bytes):
                decoded_str = to_decoded_str.decode()
            else:
                decoded_str = to_decoded_str
@ -804,13 +825,13 @@ def to_bytes(msg: Message):
    for generator_policy in [None, policy.SMTP, policy.SMTPUTF8]:
        try:
            return msg.as_bytes(policy=generator_policy)
-        except:
+        except Exception:
            LOG.w("as_bytes() fails with %s policy", policy, exc_info=True)

    msg_string = msg.as_string()
    try:
        return msg_string.encode()
-    except:
+    except Exception:
        LOG.w("as_string().encode() fails", exc_info=True)

    return msg_string.encode(errors="replace")
@ -827,19 +848,6 @@ def should_add_dkim_signature(domain: str) -> bool:
    return False


-def is_valid_email(email_address: str) -> bool:
-    """
-    Used to check whether an email address is valid
-    NOT run MX check.
-    NOT allow unicode.
-    """
-    try:
-        validate_email(email_address, check_deliverability=False, allow_smtputf8=False)
-        return True
-    except EmailNotValidError:
-        return False
-
-
class EmailEncoding(enum.Enum):
    BASE64 = "base64"
    QUOTED = "quoted-printable"
@ -918,7 +926,7 @@ def add_header(msg: Message, text_header, html_header=None) -> Message:
    if content_type == "text/plain":
        encoding = get_encoding(msg)
        payload = msg.get_payload()
-        if type(payload) is str:
+        if isinstance(payload, str):
            clone_msg = copy(msg)
            new_payload = f"""{text_header}
------------------------------
@ -928,7 +936,7 @@ def add_header(msg: Message, text_header, html_header=None) -> Message:
    elif content_type == "text/html":
        encoding = get_encoding(msg)
        payload = msg.get_payload()
-        if type(payload) is str:
+        if isinstance(payload, str):
            new_payload = f"""<table width="100%" style="width: 100%; -premailer-width: 100%; -premailer-cellpadding: 0;
  -premailer-cellspacing: 0; margin: 0; padding: 0;">
            <tr>
@ -950,6 +958,8 @@ def add_header(msg: Message, text_header, html_header=None) -> Message:
        for part in msg.get_payload():
            if isinstance(part, Message):
                new_parts.append(add_header(part, text_header, html_header))
+            elif isinstance(part, str):
+                new_parts.append(MIMEText(part))
            else:
                new_parts.append(part)
        clone_msg = copy(msg)
@ -958,7 +968,14 @@ def add_header(msg: Message, text_header, html_header=None) -> Message:

    elif content_type in ("multipart/mixed", "multipart/signed"):
        new_parts = []
-        parts = list(msg.get_payload())
+        payload = msg.get_payload()
+        if isinstance(payload, str):
+            # The message is badly formatted inject as new
+            new_parts = [MIMEText(text_header, "plain"), MIMEText(payload, "plain")]
+            clone_msg = copy(msg)
+            clone_msg.set_payload(new_parts)
+            return clone_msg
+        parts = list(payload)
        LOG.d("only add header for the first part for %s", content_type)
        for ix, part in enumerate(parts):
            if ix == 0:
@ -975,7 +992,7 @@ def add_header(msg: Message, text_header, html_header=None) -> Message:


def replace(msg: Union[Message, str], old, new) -> Union[Message, str]:
-    if type(msg) is str:
+    if isinstance(msg, str):
        msg = msg.replace(old, new)
        return msg

@ -998,7 +1015,7 @@ def replace(msg: Union[Message, str], old, new) -> Union[Message, str]:
    if content_type in ("text/plain", "text/html"):
        encoding = get_encoding(msg)
        payload = msg.get_payload()
-        if type(payload) is str:
+        if isinstance(payload, str):
            if encoding == EmailEncoding.QUOTED:
                LOG.d("handle quoted-printable replace %s -> %s", old, new)
                # first decode the payload
@ -1043,7 +1060,7 @@ def replace(msg: Union[Message, str], old, new) -> Union[Message, str]:
    return msg


-def generate_reply_email(contact_email: str, user: User) -> str:
+def generate_reply_email(contact_email: str, alias: Alias) -> str:
    """
    generate a reply_email (aka reverse-alias), make sure it isn't used by any contact
    """
@ -1054,6 +1071,7 @@ def generate_reply_email(contact_email: str, user: User) -> str:

    include_sender_in_reverse_alias = False

+    user = alias.user
    # user has set this option explicitly
    if user.include_sender_in_reverse_alias is not None:
        include_sender_in_reverse_alias = user.include_sender_in_reverse_alias
@ -1068,6 +1086,12 @@ def generate_reply_email(contact_email: str, user: User) -> str:
        contact_email = contact_email.replace(".", "_")
        contact_email = convert_to_alphanumeric(contact_email)

+    reply_domain = config.EMAIL_DOMAIN
+    alias_domain = get_email_domain_part(alias.email)
+    sl_domain = SLDomain.get_by(domain=alias_domain)
+    if sl_domain and sl_domain.use_as_reverse_alias:
+        reply_domain = alias_domain
+
    # not use while to avoid infinite loop
    for _ in range(1000):
        if include_sender_in_reverse_alias and contact_email:
@ -1075,15 +1099,15 @@ def generate_reply_email(contact_email: str, user: User) -> str:
            reply_email = (
                # do not use the ra+ anymore
                # f"ra+{contact_email}+{random_string(random_length)}@{config.EMAIL_DOMAIN}"
-                f"{contact_email}_{random_string(random_length)}@{config.EMAIL_DOMAIN}"
+                f"{contact_email}_{random_string(random_length)}@{reply_domain}"
            )
        else:
            random_length = random.randint(20, 50)
            # do not use the ra+ anymore
            # reply_email = f"ra+{random_string(random_length)}@{config.EMAIL_DOMAIN}"
-            reply_email = f"{random_string(random_length)}@{config.EMAIL_DOMAIN}"
+            reply_email = f"{random_string(random_length)}@{reply_domain}"

-        if not Contact.get_by(reply_email=reply_email):
+        if available_sl_email(reply_email):
            return reply_email

    raise Exception("Cannot generate reply email")
@ -1099,26 +1123,6 @@ def is_reverse_alias(address: str) -> bool:
    )


-# allow also + and @ that are present in a reply address
-_ALLOWED_CHARS = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789_-.+@"
-
-
-def normalize_reply_email(reply_email: str) -> str:
-    """Handle the case where reply email contains *strange* char that was wrongly generated in the past"""
-    if not reply_email.isascii():
-        reply_email = convert_to_id(reply_email)
-
-    ret = []
-    # drop all control characters like shift, separator, etc
-    for c in reply_email:
-        if c not in _ALLOWED_CHARS:
-            ret.append("_")
-        else:
-            ret.append(c)
-
-    return "".join(ret)
-
-
def should_disable(alias: Alias) -> (bool, str):
    """
    Return whether an alias should be disabled and if yes, the reason why
38 app/app/email_validation.py Normal file
@ -0,0 +1,38 @@
+from email_validator import (
+    validate_email,
+    EmailNotValidError,
+)
+
+from app.utils import convert_to_id
+
+# allow also + and @ that are present in a reply address
+_ALLOWED_CHARS = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789_-.+@"
+
+
+def is_valid_email(email_address: str) -> bool:
+    """
+    Used to check whether an email address is valid
+    NOT run MX check.
+    NOT allow unicode.
+    """
+    try:
+        validate_email(email_address, check_deliverability=False, allow_smtputf8=False)
+        return True
+    except EmailNotValidError:
+        return False
+
+
+def normalize_reply_email(reply_email: str) -> str:
+    """Handle the case where reply email contains *strange* char that was wrongly generated in the past"""
+    if not reply_email.isascii():
+        reply_email = convert_to_id(reply_email)
+
+    ret = []
+    # drop all control characters like shift, separator, etc
+    for c in reply_email:
+        if c not in _ALLOWED_CHARS:
+            ret.append("_")
+        else:
+            ret.append(c)
+
+    return "".join(ret)
@ -84,6 +84,14 @@ class ErrAddressInvalid(SLException):
        return f"{self.address} is not a valid email address"


+class InvalidContactEmailError(SLException):
+    def __init__(self, website_email: str):  # noqa: F821
+        self.website_email = website_email
+
+    def error_for_user(self) -> str:
+        return f"Cannot create contact with invalid email {self.website_email}"
+
+
class ErrContactAlreadyExists(SLException):
    """raised when a contact already exists"""

@ -113,3 +121,10 @@ class AccountAlreadyLinkedToAnotherUserException(LinkException):
class AccountIsUsingAliasAsEmail(LinkException):
    def __init__(self):
        super().__init__("Your account has an alias as it's email address")
+
+
+class ProtonAccountNotVerified(LinkException):
+    def __init__(self):
+        super().__init__(
+            "The Proton account you are trying to use has not been verified"
+        )
@ -9,6 +9,7 @@ class LoginEvent:
        failed = 1
        disabled_login = 2
        not_activated = 3
+        scheduled_to_be_deleted = 4

    class Source(EnumE):
        web = 0
@ -34,10 +34,10 @@ def apply_dmarc_policy_for_forward_phase(

    from_header = get_header_unicode(msg[headers.FROM])

-    warning_plain_text = f"""This email failed anti-phishing checks when it was received by SimpleLogin, be careful with its content.
+    warning_plain_text = """This email failed anti-phishing checks when it was received by SimpleLogin, be careful with its content.
More info on https://simplelogin.io/docs/getting-started/anti-phishing/
"""
-    warning_html = f"""
+    warning_html = """
        <p style="color:red">
            This email failed anti-phishing checks when it was received by SimpleLogin, be careful with its content.
            More info on <a href="https://simplelogin.io/docs/getting-started/anti-phishing/">anti-phishing measure</a>
@ -221,7 +221,7 @@ def handle_complaint(message: Message, origin: ProviderComplaintOrigin) -> bool:
        return True

    if is_deleted_alias(msg_info.sender_address):
-        LOG.i(f"Complaint is for deleted alias. Do nothing")
+        LOG.i("Complaint is for deleted alias. Do nothing")
        return True

    contact = Contact.get_by(reply_email=msg_info.sender_address)
@ -231,7 +231,7 @@ def handle_complaint(message: Message, origin: ProviderComplaintOrigin) -> bool:
    alias = find_alias_with_address(msg_info.rcpt_address)

    if is_deleted_alias(msg_info.rcpt_address):
-        LOG.i(f"Complaint is for deleted alias. Do nothing")
+        LOG.i("Complaint is for deleted alias. Do nothing")
        return True

    if not alias:
@ -54,9 +54,8 @@ class UnsubscribeEncoder:
    def encode_subject(
        cls, action: UnsubscribeAction, data: Union[int, UnsubscribeOriginalData]
    ) -> str:
-        if (
-            action != UnsubscribeAction.OriginalUnsubscribeMailto
-            and type(data) is not int
+        if action != UnsubscribeAction.OriginalUnsubscribeMailto and not isinstance(
+            data, int
        ):
            raise ValueError(f"Data has to be an int for an action of type {action}")
        if action == UnsubscribeAction.OriginalUnsubscribeMailto:
@ -74,8 +73,8 @@ class UnsubscribeEncoder:
        )
        signed_data = cls._get_signer().sign(serialized_data).decode("utf-8")
        encoded_request = f"{UNSUB_PREFIX}.{signed_data}"
-        if len(encoded_request) > 256:
-            LOG.e("Encoded request is longer than 256 chars")
+        if len(encoded_request) > 512:
+            LOG.w("Encoded request is longer than 512 chars")
        return encoded_request

    @staticmethod
@ -1,4 +1,5 @@
import urllib
+from email.header import Header
from email.message import Message

from app.email import headers
@ -9,6 +10,7 @@ from app.handler.unsubscribe_encoder import (
    UnsubscribeData,
    UnsubscribeOriginalData,
)
+from app.log import LOG
from app.models import Alias, Contact, UnsubscribeBehaviourEnum


@ -30,7 +32,10 @@ class UnsubscribeGenerator:
        """
        unsubscribe_data = message[headers.LIST_UNSUBSCRIBE]
        if not unsubscribe_data:
+            LOG.info("Email has no unsubscribe header")
            return message
+        if isinstance(unsubscribe_data, Header):
+            unsubscribe_data = str(unsubscribe_data.encode())
        raw_methods = [method.strip() for method in unsubscribe_data.split(",")]
        mailto_unsubs = None
        other_unsubs = []
@ -44,7 +49,9 @@ class UnsubscribeGenerator:
            if url_data.scheme == "mailto":
                query_data = urllib.parse.parse_qs(url_data.query)
                mailto_unsubs = (url_data.path, query_data.get("subject", [""])[0])
+                LOG.debug(f"Unsub is mailto to {mailto_unsubs}")
            else:
+                LOG.debug(f"Unsub has {url_data.scheme} scheme")
                other_unsubs.append(method)
        # If there are non mailto unsubscribe methods, use those in the header
        if other_unsubs:
@ -56,18 +63,19 @@ class UnsubscribeGenerator:
            add_or_replace_header(
                message, headers.LIST_UNSUBSCRIBE_POST, "List-Unsubscribe=One-Click"
            )
+            LOG.debug(f"Adding click unsub methods to header {other_unsubs}")
            return message
-        if not mailto_unsubs:
-            message = delete_header(message, headers.LIST_UNSUBSCRIBE)
-            message = delete_header(message, headers.LIST_UNSUBSCRIBE_POST)
+        elif not mailto_unsubs:
+            LOG.debug("No unsubs. Deleting all unsub headers")
+            delete_header(message, headers.LIST_UNSUBSCRIBE)
+            delete_header(message, headers.LIST_UNSUBSCRIBE_POST)
            return message
-        return self._add_unsubscribe_header(
-            message,
-            UnsubscribeData(
-                UnsubscribeAction.OriginalUnsubscribeMailto,
-                UnsubscribeOriginalData(alias.id, mailto_unsubs[0], mailto_unsubs[1]),
-            ),
+        unsub_data = UnsubscribeData(
+            UnsubscribeAction.OriginalUnsubscribeMailto,
+            UnsubscribeOriginalData(alias.id, mailto_unsubs[0], mailto_unsubs[1]),
        )
+        LOG.debug(f"Adding unsub data {unsub_data}")
+        return self._add_unsubscribe_header(message, unsub_data)

    def _add_unsubscribe_header(
        self, message: Message, unsub: UnsubscribeData
@ -30,7 +30,7 @@ def handle_batch_import(batch_import: BatchImport):

    LOG.d("Download file %s from %s", batch_import.file, file_url)
    r = requests.get(file_url)
-    lines = [line.decode() for line in r.iter_lines()]
+    lines = [line.decode("utf-8") for line in r.iter_lines()]

    import_from_csv(batch_import, user, lines)

@ -1,2 +1,4 @@
from .integrations import set_enable_proton_cookie
from .exit_sudo import exit_sudo_mode
+
+__all__ = ["set_enable_proton_cookie", "exit_sudo_mode"]
@ -39,9 +39,8 @@ from app.models import (


class ExportUserDataJob:
-
    REMOVE_FIELDS = {
-        "User": ("otp_secret",),
+        "User": ("otp_secret", "password"),
        "Alias": ("ts_vector", "transfer_token", "hibp_last_check"),
        "CustomDomain": ("ownership_txt_token",),
    }
@ -22,7 +22,6 @@ from app.message_utils import message_to_bytes, message_format_base64_parts

@dataclass
class SendRequest:
-
    SAVE_EXTENSION = "sendrequest"

    envelope_from: str
@ -32,6 +31,7 @@ class SendRequest:
    rcpt_options: Dict = {}
    is_forward: bool = False
    ignore_smtp_errors: bool = False
+    retries: int = 0

    def to_bytes(self) -> bytes:
        if not config.SAVE_UNSENT_DIR:
@ -45,6 +45,7 @@ class SendRequest:
            "mail_options": self.mail_options,
            "rcpt_options": self.rcpt_options,
            "is_forward": self.is_forward,
+            "retries": self.retries,
        }
        return json.dumps(data).encode("utf-8")

@ -65,8 +66,33 @@ class SendRequest:
            mail_options=decoded_data["mail_options"],
            rcpt_options=decoded_data["rcpt_options"],
            is_forward=decoded_data["is_forward"],
+            retries=decoded_data.get("retries", 1),
        )

+    def save_request_to_unsent_dir(self, prefix: str = "DeliveryFail"):
+        file_name = (
+            f"{prefix}-{int(time.time())}-{uuid.uuid4()}.{SendRequest.SAVE_EXTENSION}"
+        )
+        file_path = os.path.join(config.SAVE_UNSENT_DIR, file_name)
+        self.save_request_to_file(file_path)
+
+    @staticmethod
+    def save_request_to_failed_dir(self, prefix: str = "DeliveryRetryFail"):
+        file_name = (
+            f"{prefix}-{int(time.time())}-{uuid.uuid4()}.{SendRequest.SAVE_EXTENSION}"
+        )
+        dir_name = os.path.join(config.SAVE_UNSENT_DIR, "failed")
+        if not os.path.isdir(dir_name):
+            os.makedirs(dir_name)
+        file_path = os.path.join(dir_name, file_name)
+        self.save_request_to_file(file_path)
+
+    def save_request_to_file(self, file_path: str):
+        file_contents = self.to_bytes()
+        with open(file_path, "wb") as fd:
+            fd.write(file_contents)
+        LOG.i(f"Saved unsent message {file_path}")
+

class MailSender:
    def __init__(self):
@ -171,21 +197,9 @@ class MailSender:
                f"Could not send message to smtp server {config.POSTFIX_SERVER}:{config.POSTFIX_PORT}"
            )
            if config.SAVE_UNSENT_DIR:
-                self._save_request_to_unsent_dir(send_request)
+                send_request.save_request_to_unsent_dir()
            return False

-    def _save_request_to_unsent_dir(
-        self, send_request: SendRequest, prefix: str = "DeliveryFail"
-    ):
-        file_name = (
-            f"{prefix}-{int(time.time())}-{uuid.uuid4()}.{SendRequest.SAVE_EXTENSION}"
-        )
-        file_path = os.path.join(config.SAVE_UNSENT_DIR, file_name)
-        file_contents = send_request.to_bytes()
-        with open(file_path, "wb") as fd:
-            fd.write(file_contents)
-        LOG.i(f"Saved unsent message {file_path}")
-

mail_sender = MailSender()

@ -219,6 +233,7 @@ def load_unsent_mails_from_fs_and_resend():
        LOG.i(f"Trying to re-deliver email {filename}")
        try:
            send_request = SendRequest.load_from_file(full_file_path)
+            send_request.retries += 1
        except Exception as e:
            LOG.e(f"Cannot load {filename}. Error {e}")
            continue
@ -230,6 +245,11 @@ def load_unsent_mails_from_fs_and_resend():
                "DeliverUnsentEmail", {"delivered": "true"}
            )
        else:
+            if send_request.retries > 2:
+                os.unlink(full_file_path)
+                send_request.save_request_to_failed_dir()
+            else:
+                send_request.save_request_to_file(full_file_path)
            newrelic.agent.record_custom_event(
                "DeliverUnsentEmail", {"delivered": "false"}
            )
@ -1,6 +1,7 @@
from __future__ import annotations

import base64
+import dataclasses
import enum
import hashlib
import hmac
@ -18,7 +19,7 @@ from flanker.addresslib import address
from flask import url_for
from flask_login import UserMixin
from jinja2 import FileSystemLoader, Environment
-from sqlalchemy import orm
+from sqlalchemy import orm, or_
from sqlalchemy import text, desc, CheckConstraint, Index, Column
from sqlalchemy.dialects.postgresql import TSVECTOR
from sqlalchemy.ext.declarative import declarative_base
@ -26,9 +27,11 @@ from sqlalchemy.orm import deferred
from sqlalchemy.sql import and_
from sqlalchemy_utils import ArrowType

-from app import config
+from app import config, rate_limiter
from app import s3
from app.db import Session
+from app.dns_utils import get_mx_domains
+
from app.errors import (
    AliasInTrashError,
    DirectoryInTrashError,
@ -273,6 +276,13 @@ class IntEnumType(sa.types.TypeDecorator):
        return self._enum_type(enum_value)


+@dataclasses.dataclass
+class AliasOptions:
+    show_sl_domains: bool = True
+    show_partner_domains: Optional[Partner] = None
+    show_partner_premium: Optional[bool] = None
+
+
class Hibp(Base, ModelMixin):
    __tablename__ = "hibp"
    name = sa.Column(sa.String(), nullable=False, unique=True, index=True)
@ -291,7 +301,9 @@ class HibpNotifiedAlias(Base, ModelMixin):
    """

    __tablename__ = "hibp_notified_alias"
-    alias_id = sa.Column(sa.ForeignKey("alias.id", ondelete="cascade"), nullable=False)
+    alias_id = sa.Column(
+        sa.ForeignKey("alias.id", ondelete="cascade"), nullable=False, index=True
+    )
    user_id = sa.Column(sa.ForeignKey("users.id", ondelete="cascade"), nullable=False)

    notified_at = sa.Column(ArrowType, default=arrow.utcnow, nullable=False)
@ -332,7 +344,7 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
        sa.Boolean, default=True, nullable=False, server_default="1"
    )

-    activated = sa.Column(sa.Boolean, default=False, nullable=False)
+    activated = sa.Column(sa.Boolean, default=False, nullable=False, index=True)

    # an account can be disabled if having harmful behavior
    disabled = sa.Column(sa.Boolean, default=False, nullable=False, server_default="0")
@ -402,7 +414,10 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
    )

    referral_id = sa.Column(
-        sa.ForeignKey("referral.id", ondelete="SET NULL"), nullable=True, default=None
+        sa.ForeignKey("referral.id", ondelete="SET NULL"),
+        nullable=True,
+        default=None,
+        index=True,
    )

    referral = orm.relationship("Referral", foreign_keys=[referral_id])
@ -419,7 +434,10 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):

    # newsletter is sent to this address
    newsletter_alias_id = sa.Column(
-        sa.ForeignKey("alias.id", ondelete="SET NULL"), nullable=True, default=None
+        sa.ForeignKey("alias.id", ondelete="SET NULL"),
+        nullable=True,
+        default=None,
+        index=True,
    )

    # whether to include the sender address in reverse-alias
@ -433,7 +451,7 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
    random_alias_suffix = sa.Column(
        sa.Integer,
        nullable=False,
-        default=AliasSuffixEnum.random_string.value,
+        default=AliasSuffixEnum.word.value,
        server_default=str(AliasSuffixEnum.random_string.value),
    )

@ -502,9 +520,8 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
        server_default=BlockBehaviourEnum.return_2xx.name,
    )

-    # to keep existing behavior, the server default is TRUE whereas for new user, the default value is FALSE
    include_header_email_header = sa.Column(
-        sa.Boolean, default=False, nullable=False, server_default="1"
+        sa.Boolean, default=True, nullable=False, server_default="1"
    )

    # bitwise flags. Allow for future expansion
@ -523,6 +540,16 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
        nullable=False,
    )

+    # Trigger hard deletion of the account at this time
+    delete_on = sa.Column(ArrowType, default=None)
+
+    __table_args__ = (
+        sa.Index(
+            "ix_users_activated_trial_end_lifetime", activated, trial_end, lifetime
+        ),
+        sa.Index("ix_users_delete_on", delete_on),
+    )
+
    @property
    def directory_quota(self):
        return min(
@ -557,7 +584,8 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):

    @classmethod
    def create(cls, email, name="", password=None, from_partner=False, **kwargs):
-        user: User = super(User, cls).create(email=email, name=name, **kwargs)
+        email = sanitize_email(email)
+        user: User = super(User, cls).create(email=email, name=name[:100], **kwargs)
|
||||||
|
|
||||||
if password:
|
if password:
|
||||||
user.set_password(password)
|
user.set_password(password)
|
||||||
@ -568,19 +596,6 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
|
|||||||
Session.flush()
|
Session.flush()
|
||||||
user.default_mailbox_id = mb.id
|
user.default_mailbox_id = mb.id
|
||||||
|
|
||||||
# create a first alias mail to show user how to use when they login
|
|
||||||
alias = Alias.create_new(
|
|
||||||
user,
|
|
||||||
prefix="simplelogin-newsletter",
|
|
||||||
mailbox_id=mb.id,
|
|
||||||
note="This is your first alias. It's used to receive SimpleLogin communications "
|
|
||||||
"like new features announcements, newsletters.",
|
|
||||||
)
|
|
||||||
Session.flush()
|
|
||||||
|
|
||||||
user.newsletter_alias_id = alias.id
|
|
||||||
Session.flush()
|
|
||||||
|
|
||||||
# generate an alternative_id if needed
|
# generate an alternative_id if needed
|
||||||
if "alternative_id" not in kwargs:
|
if "alternative_id" not in kwargs:
|
||||||
user.alternative_id = str(uuid.uuid4())
|
user.alternative_id = str(uuid.uuid4())
|
||||||
@ -599,6 +614,19 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
|
|||||||
Session.flush()
|
Session.flush()
|
||||||
return user
|
return user
|
||||||
|
|
||||||
|
# create a first alias mail to show user how to use when they login
|
||||||
|
alias = Alias.create_new(
|
||||||
|
user,
|
||||||
|
prefix="simplelogin-newsletter",
|
||||||
|
mailbox_id=mb.id,
|
||||||
|
note="This is your first alias. It's used to receive SimpleLogin communications "
|
||||||
|
"like new features announcements, newsletters.",
|
||||||
|
)
|
||||||
|
Session.flush()
|
||||||
|
|
||||||
|
user.newsletter_alias_id = alias.id
|
||||||
|
Session.flush()
|
||||||
|
|
||||||
if config.DISABLE_ONBOARDING:
|
if config.DISABLE_ONBOARDING:
|
||||||
LOG.d("Disable onboarding emails")
|
LOG.d("Disable onboarding emails")
|
||||||
return user
|
return user
|
||||||
@ -624,7 +652,7 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
|
|||||||
return user
|
return user
|
||||||
|
|
||||||
def get_active_subscription(
|
def get_active_subscription(
|
||||||
self,
|
self, include_partner_subscription: bool = True
|
||||||
) -> Optional[
|
) -> Optional[
|
||||||
Union[
|
Union[
|
||||||
Subscription
|
Subscription
|
||||||
@ -652,19 +680,40 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
|
|||||||
if coinbase_subscription and coinbase_subscription.is_active():
|
if coinbase_subscription and coinbase_subscription.is_active():
|
||||||
return coinbase_subscription
|
return coinbase_subscription
|
||||||
|
|
||||||
partner_sub: PartnerSubscription = PartnerSubscription.find_by_user_id(self.id)
|
if include_partner_subscription:
|
||||||
if partner_sub and partner_sub.is_active():
|
partner_sub: PartnerSubscription = PartnerSubscription.find_by_user_id(
|
||||||
return partner_sub
|
self.id
|
||||||
|
)
|
||||||
|
if partner_sub and partner_sub.is_active():
|
||||||
|
return partner_sub
|
||||||
|
|
||||||
return None
|
return None
|
||||||
|
|
||||||
|
def get_active_subscription_end(
|
||||||
|
self, include_partner_subscription: bool = True
|
||||||
|
) -> Optional[arrow.Arrow]:
|
||||||
|
sub = self.get_active_subscription(
|
||||||
|
include_partner_subscription=include_partner_subscription
|
||||||
|
)
|
||||||
|
if isinstance(sub, Subscription):
|
||||||
|
return arrow.get(sub.next_bill_date)
|
||||||
|
if isinstance(sub, AppleSubscription):
|
||||||
|
return sub.expires_date
|
||||||
|
if isinstance(sub, ManualSubscription):
|
||||||
|
return sub.end_at
|
||||||
|
if isinstance(sub, CoinbaseSubscription):
|
||||||
|
return sub.end_at
|
||||||
|
return None
|
||||||
|
|
||||||
# region Billing
|
# region Billing
|
||||||
def lifetime_or_active_subscription(self) -> bool:
|
def lifetime_or_active_subscription(
|
||||||
|
self, include_partner_subscription: bool = True
|
||||||
|
) -> bool:
|
||||||
"""True if user has lifetime licence or active subscription"""
|
"""True if user has lifetime licence or active subscription"""
|
||||||
if self.lifetime:
|
if self.lifetime:
|
||||||
return True
|
return True
|
||||||
|
|
||||||
return self.get_active_subscription() is not None
|
return self.get_active_subscription(include_partner_subscription) is not None
|
||||||
|
|
||||||
def is_paid(self) -> bool:
|
def is_paid(self) -> bool:
|
||||||
"""same as _lifetime_or_active_subscription but not include free manual subscription"""
|
"""same as _lifetime_or_active_subscription but not include free manual subscription"""
|
||||||
@ -693,14 +742,14 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
|
|||||||
|
|
||||||
return True
|
return True
|
||||||
|
|
||||||
def is_premium(self) -> bool:
|
def is_premium(self, include_partner_subscription: bool = True) -> bool:
|
||||||
"""
|
"""
|
||||||
user is premium if they:
|
user is premium if they:
|
||||||
- have a lifetime deal or
|
- have a lifetime deal or
|
||||||
- in trial period or
|
- in trial period or
|
||||||
- active subscription
|
- active subscription
|
||||||
"""
|
"""
|
||||||
if self.lifetime_or_active_subscription():
|
if self.lifetime_or_active_subscription(include_partner_subscription):
|
||||||
return True
|
return True
|
||||||
|
|
||||||
if self.trial_end and arrow.now() < self.trial_end:
|
if self.trial_end and arrow.now() < self.trial_end:
|
||||||
@ -789,6 +838,17 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
|
|||||||
< self.max_alias_for_free_account()
|
< self.max_alias_for_free_account()
|
||||||
)
|
)
|
||||||
|
|
||||||
|
def can_send_or_receive(self) -> bool:
|
||||||
|
if self.disabled:
|
||||||
|
LOG.i(f"User {self} is disabled. Cannot receive or send emails")
|
||||||
|
return False
|
||||||
|
if self.delete_on is not None:
|
||||||
|
LOG.i(
|
||||||
|
f"User {self} is scheduled to be deleted. Cannot receive or send emails"
|
||||||
|
)
|
||||||
|
return False
|
||||||
|
return True
|
||||||
|
|
||||||
def profile_picture_url(self):
|
def profile_picture_url(self):
|
||||||
if self.profile_picture_id:
|
if self.profile_picture_id:
|
||||||
return self.profile_picture.get_url()
|
return self.profile_picture.get_url()
|
||||||
@ -867,14 +927,16 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
|
|||||||
def custom_domains(self):
|
def custom_domains(self):
|
||||||
return CustomDomain.filter_by(user_id=self.id, verified=True).all()
|
return CustomDomain.filter_by(user_id=self.id, verified=True).all()
|
||||||
|
|
||||||
def available_domains_for_random_alias(self) -> List[Tuple[bool, str]]:
|
def available_domains_for_random_alias(
|
||||||
|
self, alias_options: Optional[AliasOptions] = None
|
||||||
|
) -> List[Tuple[bool, str]]:
|
||||||
"""Return available domains for user to create random aliases
|
"""Return available domains for user to create random aliases
|
||||||
Each result record contains:
|
Each result record contains:
|
||||||
- whether the domain belongs to SimpleLogin
|
- whether the domain belongs to SimpleLogin
|
||||||
- the domain
|
- the domain
|
||||||
"""
|
"""
|
||||||
res = []
|
res = []
|
||||||
for domain in self.available_sl_domains():
|
for domain in self.available_sl_domains(alias_options=alias_options):
|
||||||
res.append((True, domain))
|
res.append((True, domain))
|
||||||
|
|
||||||
for custom_domain in self.verified_custom_domains():
|
for custom_domain in self.verified_custom_domains():
|
||||||
@ -959,30 +1021,65 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
|
|||||||
|
|
||||||
return None, "", False
|
return None, "", False
|
||||||
|
|
||||||
def available_sl_domains(self) -> [str]:
|
def available_sl_domains(
|
||||||
|
self, alias_options: Optional[AliasOptions] = None
|
||||||
|
) -> [str]:
|
||||||
"""
|
"""
|
||||||
Return all SimpleLogin domains that user can use when creating a new alias, including:
|
Return all SimpleLogin domains that user can use when creating a new alias, including:
|
||||||
- SimpleLogin public domains, available for all users (ALIAS_DOMAIN)
|
- SimpleLogin public domains, available for all users (ALIAS_DOMAIN)
|
||||||
- SimpleLogin premium domains, only available for Premium accounts (PREMIUM_ALIAS_DOMAIN)
|
- SimpleLogin premium domains, only available for Premium accounts (PREMIUM_ALIAS_DOMAIN)
|
||||||
"""
|
"""
|
||||||
return [sl_domain.domain for sl_domain in self.get_sl_domains()]
|
return [
|
||||||
|
sl_domain.domain
|
||||||
|
for sl_domain in self.get_sl_domains(alias_options=alias_options)
|
||||||
|
]
|
||||||
|
|
||||||
def get_sl_domains(self) -> List["SLDomain"]:
|
def get_sl_domains(
|
||||||
query = SLDomain.filter_by(hidden=False).order_by(SLDomain.order)
|
self, alias_options: Optional[AliasOptions] = None
|
||||||
|
) -> list["SLDomain"]:
|
||||||
|
if alias_options is None:
|
||||||
|
alias_options = AliasOptions()
|
||||||
|
top_conds = [SLDomain.hidden == False] # noqa: E712
|
||||||
|
or_conds = [] # noqa:E711
|
||||||
|
if self.default_alias_public_domain_id is not None:
|
||||||
|
default_domain_conds = [SLDomain.id == self.default_alias_public_domain_id]
|
||||||
|
if not self.is_premium():
|
||||||
|
default_domain_conds.append(
|
||||||
|
SLDomain.premium_only == False # noqa: E712
|
||||||
|
)
|
||||||
|
or_conds.append(and_(*default_domain_conds).self_group())
|
||||||
|
if alias_options.show_partner_domains is not None:
|
||||||
|
partner_user = PartnerUser.filter_by(
|
||||||
|
user_id=self.id, partner_id=alias_options.show_partner_domains.id
|
||||||
|
).first()
|
||||||
|
if partner_user is not None:
|
||||||
|
partner_domain_cond = [SLDomain.partner_id == partner_user.partner_id]
|
||||||
|
if alias_options.show_partner_premium is None:
|
||||||
|
alias_options.show_partner_premium = self.is_premium()
|
||||||
|
if not alias_options.show_partner_premium:
|
||||||
|
partner_domain_cond.append(
|
||||||
|
SLDomain.premium_only == False # noqa: E712
|
||||||
|
)
|
||||||
|
or_conds.append(and_(*partner_domain_cond).self_group())
|
||||||
|
if alias_options.show_sl_domains:
|
||||||
|
sl_conds = [SLDomain.partner_id == None] # noqa: E711
|
||||||
|
if not self.is_premium():
|
||||||
|
sl_conds.append(SLDomain.premium_only == False) # noqa: E712
|
||||||
|
or_conds.append(and_(*sl_conds).self_group())
|
||||||
|
top_conds.append(or_(*or_conds))
|
||||||
|
query = Session.query(SLDomain).filter(*top_conds).order_by(SLDomain.order)
|
||||||
|
return query.all()
|
||||||
|
|
||||||
if self.is_premium():
|
def available_alias_domains(
|
||||||
return query.all()
|
self, alias_options: Optional[AliasOptions] = None
|
||||||
else:
|
) -> [str]:
|
||||||
return query.filter_by(premium_only=False).all()
|
|
||||||
|
|
||||||
def available_alias_domains(self) -> [str]:
|
|
||||||
"""return all domains that user can use when creating a new alias, including:
|
"""return all domains that user can use when creating a new alias, including:
|
||||||
- SimpleLogin public domains, available for all users (ALIAS_DOMAIN)
|
- SimpleLogin public domains, available for all users (ALIAS_DOMAIN)
|
||||||
- SimpleLogin premium domains, only available for Premium accounts (PREMIUM_ALIAS_DOMAIN)
|
- SimpleLogin premium domains, only available for Premium accounts (PREMIUM_ALIAS_DOMAIN)
|
||||||
- Verified custom domains
|
- Verified custom domains
|
||||||
|
|
||||||
"""
|
"""
|
||||||
domains = self.available_sl_domains()
|
domains = self.available_sl_domains(alias_options=alias_options)
|
||||||
|
|
||||||
for custom_domain in self.verified_custom_domains():
|
for custom_domain in self.verified_custom_domains():
|
||||||
domains.append(custom_domain.domain)
|
domains.append(custom_domain.domain)
|
||||||
@ -1000,16 +1097,28 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
|
|||||||
> 0
|
> 0
|
||||||
)
|
)
|
||||||
|
|
||||||
def get_random_alias_suffix(self):
|
def get_random_alias_suffix(self, custom_domain: Optional["CustomDomain"] = None):
|
||||||
"""Get random suffix for an alias based on user's preference.
|
"""Get random suffix for an alias based on user's preference.
|
||||||
|
|
||||||
|
Use a shorter suffix in case of custom domain
|
||||||
|
|
||||||
Returns:
|
Returns:
|
||||||
str: the random suffix generated
|
str: the random suffix generated
|
||||||
"""
|
"""
|
||||||
if self.random_alias_suffix == AliasSuffixEnum.random_string.value:
|
if self.random_alias_suffix == AliasSuffixEnum.random_string.value:
|
||||||
return random_string(config.ALIAS_RANDOM_SUFFIX_LENGTH, include_digits=True)
|
return random_string(config.ALIAS_RANDOM_SUFFIX_LENGTH, include_digits=True)
|
||||||
return random_words(1, 3)
|
|
||||||
|
if custom_domain is None:
|
||||||
|
return random_words(1, 3)
|
||||||
|
|
||||||
|
return random_words(1)
|
||||||
|
|
||||||
|
def can_create_contacts(self) -> bool:
|
||||||
|
if self.is_premium():
|
||||||
|
return True
|
||||||
|
if self.flags & User.FLAG_FREE_DISABLE_CREATE_ALIAS == 0:
|
||||||
|
return True
|
||||||
|
return not config.DISABLE_CREATE_CONTACTS_FOR_FREE_USERS
|
||||||
|
|
||||||
def __repr__(self):
|
def __repr__(self):
|
||||||
return f"<User {self.id} {self.name} {self.email}>"
|
return f"<User {self.id} {self.name} {self.email}>"
|
||||||
@ -1254,16 +1363,30 @@ class OauthToken(Base, ModelMixin):
|
|||||||
return self.expired < arrow.now()
|
return self.expired < arrow.now()
|
||||||
|
|
||||||
|
|
||||||
def generate_email(
|
def available_sl_email(email: str) -> bool:
|
||||||
|
if (
|
||||||
|
Alias.get_by(email=email)
|
||||||
|
or Contact.get_by(reply_email=email)
|
||||||
|
or DeletedAlias.get_by(email=email)
|
||||||
|
):
|
||||||
|
return False
|
||||||
|
return True
|
||||||
|
|
||||||
|
|
||||||
|
def generate_random_alias_email(
|
||||||
scheme: int = AliasGeneratorEnum.word.value,
|
scheme: int = AliasGeneratorEnum.word.value,
|
||||||
in_hex: bool = False,
|
in_hex: bool = False,
|
||||||
alias_domain=config.FIRST_ALIAS_DOMAIN,
|
alias_domain: str = config.FIRST_ALIAS_DOMAIN,
|
||||||
|
retries: int = 10,
|
||||||
) -> str:
|
) -> str:
|
||||||
"""generate an email address that does not exist before
|
"""generate an email address that does not exist before
|
||||||
:param alias_domain: the domain used to generate the alias.
|
:param alias_domain: the domain used to generate the alias.
|
||||||
:param scheme: int, value of AliasGeneratorEnum, indicate how the email is generated
|
:param scheme: int, value of AliasGeneratorEnum, indicate how the email is generated
|
||||||
|
:param retries: int, How many times we can try to generate an alias in case of collision
|
||||||
:type in_hex: bool, if the generate scheme is uuid, is hex favorable?
|
:type in_hex: bool, if the generate scheme is uuid, is hex favorable?
|
||||||
"""
|
"""
|
||||||
|
if retries <= 0:
|
||||||
|
raise Exception("Cannot generate alias after many retries")
|
||||||
if scheme == AliasGeneratorEnum.uuid.value:
|
if scheme == AliasGeneratorEnum.uuid.value:
|
||||||
name = uuid.uuid4().hex if in_hex else uuid.uuid4().__str__()
|
name = uuid.uuid4().hex if in_hex else uuid.uuid4().__str__()
|
||||||
random_email = name + "@" + alias_domain
|
random_email = name + "@" + alias_domain
|
||||||
@ -1273,15 +1396,15 @@ def generate_email(
|
|||||||
random_email = random_email.lower().strip()
|
random_email = random_email.lower().strip()
|
||||||
|
|
||||||
# check that the client does not exist yet
|
# check that the client does not exist yet
|
||||||
if not Alias.get_by(email=random_email) and not DeletedAlias.get_by(
|
if available_sl_email(random_email):
|
||||||
email=random_email
|
|
||||||
):
|
|
||||||
LOG.d("generate email %s", random_email)
|
LOG.d("generate email %s", random_email)
|
||||||
return random_email
|
return random_email
|
||||||
|
|
||||||
# Rerun the function
|
# Rerun the function
|
||||||
LOG.w("email %s already exists, generate a new email", random_email)
|
LOG.w("email %s already exists, generate a new email", random_email)
|
||||||
return generate_email(scheme=scheme, in_hex=in_hex)
|
return generate_random_alias_email(
|
||||||
|
scheme=scheme, in_hex=in_hex, retries=retries - 1
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
class Alias(Base, ModelMixin):
|
class Alias(Base, ModelMixin):
|
||||||
@ -1363,7 +1486,7 @@ class Alias(Base, ModelMixin):
|
|||||||
)
|
)
|
||||||
|
|
||||||
# have I been pwned
|
# have I been pwned
|
||||||
hibp_last_check = sa.Column(ArrowType, default=None)
|
hibp_last_check = sa.Column(ArrowType, default=None, index=True)
|
||||||
hibp_breaches = orm.relationship("Hibp", secondary="alias_hibp")
|
hibp_breaches = orm.relationship("Hibp", secondary="alias_hibp")
|
||||||
|
|
||||||
# to use Postgres full text search. Only applied on "note" column for now
|
# to use Postgres full text search. Only applied on "note" column for now
|
||||||
@ -1390,7 +1513,8 @@ class Alias(Base, ModelMixin):
|
|||||||
def mailboxes(self):
|
def mailboxes(self):
|
||||||
ret = [self.mailbox]
|
ret = [self.mailbox]
|
||||||
for m in self._mailboxes:
|
for m in self._mailboxes:
|
||||||
ret.append(m)
|
if m.id is not self.mailbox.id:
|
||||||
|
ret.append(m)
|
||||||
|
|
||||||
ret = [mb for mb in ret if mb.verified]
|
ret = [mb for mb in ret if mb.verified]
|
||||||
ret = sorted(ret, key=lambda mb: mb.email)
|
ret = sorted(ret, key=lambda mb: mb.email)
|
||||||
@ -1439,6 +1563,15 @@ class Alias(Base, ModelMixin):
|
|||||||
flush = kw.pop("flush", False)
|
flush = kw.pop("flush", False)
|
||||||
|
|
||||||
new_alias = cls(**kw)
|
new_alias = cls(**kw)
|
||||||
|
user = User.get(new_alias.user_id)
|
||||||
|
if user.is_premium():
|
||||||
|
limits = config.ALIAS_CREATE_RATE_LIMIT_PAID
|
||||||
|
else:
|
||||||
|
limits = config.ALIAS_CREATE_RATE_LIMIT_FREE
|
||||||
|
# limits is array of (hits,days)
|
||||||
|
for limit in limits:
|
||||||
|
key = f"alias_create_{limit[1]}d:{user.id}"
|
||||||
|
rate_limiter.check_bucket_limit(key, limit[0], limit[1])
|
||||||
|
|
||||||
email = kw["email"]
|
email = kw["email"]
|
||||||
# make sure email is lowercase and doesn't have any whitespace
|
# make sure email is lowercase and doesn't have any whitespace
|
||||||
@ -1480,7 +1613,7 @@ class Alias(Base, ModelMixin):
|
|||||||
suffix = user.get_random_alias_suffix()
|
suffix = user.get_random_alias_suffix()
|
||||||
email = f"{prefix}.{suffix}@{config.FIRST_ALIAS_DOMAIN}"
|
email = f"{prefix}.{suffix}@{config.FIRST_ALIAS_DOMAIN}"
|
||||||
|
|
||||||
if not cls.get_by(email=email) and not DeletedAlias.get_by(email=email):
|
if available_sl_email(email):
|
||||||
break
|
break
|
||||||
|
|
||||||
return Alias.create(
|
return Alias.create(
|
||||||
@ -1509,7 +1642,7 @@ class Alias(Base, ModelMixin):
|
|||||||
|
|
||||||
if user.default_alias_custom_domain_id:
|
if user.default_alias_custom_domain_id:
|
||||||
custom_domain = CustomDomain.get(user.default_alias_custom_domain_id)
|
custom_domain = CustomDomain.get(user.default_alias_custom_domain_id)
|
||||||
random_email = generate_email(
|
random_email = generate_random_alias_email(
|
||||||
scheme=scheme, in_hex=in_hex, alias_domain=custom_domain.domain
|
scheme=scheme, in_hex=in_hex, alias_domain=custom_domain.domain
|
||||||
)
|
)
|
||||||
elif user.default_alias_public_domain_id:
|
elif user.default_alias_public_domain_id:
|
||||||
@ -1517,12 +1650,12 @@ class Alias(Base, ModelMixin):
|
|||||||
if sl_domain.premium_only and not user.is_premium():
|
if sl_domain.premium_only and not user.is_premium():
|
||||||
LOG.w("%s not premium, cannot use %s", user, sl_domain)
|
LOG.w("%s not premium, cannot use %s", user, sl_domain)
|
||||||
else:
|
else:
|
||||||
random_email = generate_email(
|
random_email = generate_random_alias_email(
|
||||||
scheme=scheme, in_hex=in_hex, alias_domain=sl_domain.domain
|
scheme=scheme, in_hex=in_hex, alias_domain=sl_domain.domain
|
||||||
)
|
)
|
||||||
|
|
||||||
if not random_email:
|
if not random_email:
|
||||||
random_email = generate_email(scheme=scheme, in_hex=in_hex)
|
random_email = generate_random_alias_email(scheme=scheme, in_hex=in_hex)
|
||||||
|
|
||||||
alias = Alias.create(
|
alias = Alias.create(
|
||||||
user_id=user.id,
|
user_id=user.id,
|
||||||
@ -1556,7 +1689,9 @@ class ClientUser(Base, ModelMixin):
|
|||||||
client_id = sa.Column(sa.ForeignKey(Client.id, ondelete="cascade"), nullable=False)
|
client_id = sa.Column(sa.ForeignKey(Client.id, ondelete="cascade"), nullable=False)
|
||||||
|
|
||||||
# Null means client has access to user original email
|
# Null means client has access to user original email
|
||||||
alias_id = sa.Column(sa.ForeignKey(Alias.id, ondelete="cascade"), nullable=True)
|
alias_id = sa.Column(
|
||||||
|
sa.ForeignKey(Alias.id, ondelete="cascade"), nullable=True, index=True
|
||||||
|
)
|
||||||
|
|
||||||
# user can decide to send to client another name
|
# user can decide to send to client another name
|
||||||
name = sa.Column(
|
name = sa.Column(
|
||||||
@ -1675,7 +1810,7 @@ class Contact(Base, ModelMixin):
|
|||||||
is_cc = sa.Column(sa.Boolean, nullable=False, default=False, server_default="0")
|
is_cc = sa.Column(sa.Boolean, nullable=False, default=False, server_default="0")
|
||||||
|
|
||||||
pgp_public_key = sa.Column(sa.Text, nullable=True)
|
pgp_public_key = sa.Column(sa.Text, nullable=True)
|
||||||
pgp_finger_print = sa.Column(sa.String(512), nullable=True)
|
pgp_finger_print = sa.Column(sa.String(512), nullable=True, index=True)
|
||||||
|
|
||||||
alias = orm.relationship(Alias, backref="contacts")
|
alias = orm.relationship(Alias, backref="contacts")
|
||||||
user = orm.relationship(User)
|
user = orm.relationship(User)
|
||||||
@ -1829,6 +1964,7 @@ class Contact(Base, ModelMixin):
|
|||||||
|
|
||||||
class EmailLog(Base, ModelMixin):
|
class EmailLog(Base, ModelMixin):
|
||||||
__tablename__ = "email_log"
|
__tablename__ = "email_log"
|
||||||
|
__table_args__ = (Index("ix_email_log_created_at", "created_at"),)
|
||||||
|
|
||||||
user_id = sa.Column(
|
user_id = sa.Column(
|
||||||
sa.ForeignKey(User.id, ondelete="cascade"), nullable=False, index=True
|
sa.ForeignKey(User.id, ondelete="cascade"), nullable=False, index=True
|
||||||
@ -2086,7 +2222,9 @@ class AliasUsedOn(Base, ModelMixin):
|
|||||||
sa.UniqueConstraint("alias_id", "hostname", name="uq_alias_used"),
|
sa.UniqueConstraint("alias_id", "hostname", name="uq_alias_used"),
|
||||||
)
|
)
|
||||||
|
|
||||||
alias_id = sa.Column(sa.ForeignKey(Alias.id, ondelete="cascade"), nullable=False)
|
alias_id = sa.Column(
|
||||||
|
sa.ForeignKey(Alias.id, ondelete="cascade"), nullable=False, index=True
|
||||||
|
)
|
||||||
user_id = sa.Column(sa.ForeignKey(User.id, ondelete="cascade"), nullable=False)
|
user_id = sa.Column(sa.ForeignKey(User.id, ondelete="cascade"), nullable=False)
|
||||||
|
|
||||||
alias = orm.relationship(Alias)
|
alias = orm.relationship(Alias)
|
||||||
@ -2205,6 +2343,7 @@ class CustomDomain(Base, ModelMixin):
|
|||||||
@classmethod
|
@classmethod
|
||||||
def create(cls, **kwargs):
|
def create(cls, **kwargs):
|
||||||
domain = kwargs.get("domain")
|
domain = kwargs.get("domain")
|
||||||
|
kwargs["domain"] = domain.replace("\n", "")
|
||||||
if DeletedSubdomain.get_by(domain=domain):
|
if DeletedSubdomain.get_by(domain=domain):
|
||||||
raise SubdomainInTrashError
|
raise SubdomainInTrashError
|
||||||
|
|
||||||
@ -2472,6 +2611,28 @@ class Mailbox(Base, ModelMixin):
|
|||||||
+ Alias.filter_by(mailbox_id=self.id).count()
|
+ Alias.filter_by(mailbox_id=self.id).count()
|
||||||
)
|
)
|
||||||
|
|
||||||
|
def is_proton(self) -> bool:
|
||||||
|
if (
|
||||||
|
self.email.endswith("@proton.me")
|
||||||
|
or self.email.endswith("@protonmail.com")
|
||||||
|
or self.email.endswith("@protonmail.ch")
|
||||||
|
or self.email.endswith("@proton.ch")
|
||||||
|
or self.email.endswith("@pm.me")
|
||||||
|
):
|
||||||
|
return True
|
||||||
|
|
||||||
|
from app.email_utils import get_email_local_part
|
||||||
|
|
||||||
|
mx_domains: [(int, str)] = get_mx_domains(get_email_local_part(self.email))
|
||||||
|
# Proton is the first domain
|
||||||
|
if mx_domains and mx_domains[0][1] in (
|
||||||
|
"mail.protonmail.ch.",
|
||||||
|
"mailsec.protonmail.ch.",
|
||||||
|
):
|
||||||
|
return True
|
||||||
|
|
||||||
|
return False
|
||||||
|
|
||||||
@classmethod
|
@classmethod
|
||||||
def delete(cls, obj_id):
|
def delete(cls, obj_id):
|
||||||
mailbox: Mailbox = cls.get(obj_id)
|
mailbox: Mailbox = cls.get(obj_id)
|
||||||
@ -2504,6 +2665,12 @@ class Mailbox(Base, ModelMixin):
|
|||||||
|
|
||||||
return ret
|
return ret
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def create(cls, **kw):
|
||||||
|
if "email" in kw:
|
||||||
|
kw["email"] = sanitize_email(kw["email"])
|
||||||
|
return super().create(**kw)
|
||||||
|
|
||||||
def __repr__(self):
|
def __repr__(self):
|
||||||
return f"<Mailbox {self.id} {self.email}>"
|
return f"<Mailbox {self.id} {self.email}>"
|
||||||
|
|
||||||
@ -2763,6 +2930,31 @@ class Notification(Base, ModelMixin):
|
|||||||
)
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class Partner(Base, ModelMixin):
|
||||||
|
__tablename__ = "partner"
|
||||||
|
|
||||||
|
name = sa.Column(sa.String(128), unique=True, nullable=False)
|
||||||
|
contact_email = sa.Column(sa.String(128), unique=True, nullable=False)
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def find_by_token(token: str) -> Optional[Partner]:
|
||||||
|
hmaced = PartnerApiToken.hmac_token(token)
|
||||||
|
res = (
|
||||||
|
Session.query(Partner, PartnerApiToken)
|
||||||
|
.filter(
|
||||||
|
and_(
|
||||||
|
PartnerApiToken.token == hmaced,
|
||||||
|
Partner.id == PartnerApiToken.partner_id,
|
||||||
|
)
|
||||||
|
)
|
||||||
|
.first()
|
||||||
|
)
|
||||||
|
if res:
|
||||||
|
partner, partner_api_token = res
|
||||||
|
return partner
|
||||||
|
return None
|
||||||
|
|
||||||
|
|
||||||
class SLDomain(Base, ModelMixin):
|
class SLDomain(Base, ModelMixin):
|
||||||
"""SimpleLogin domains"""
|
"""SimpleLogin domains"""
|
||||||
|
|
||||||
@ -2780,12 +2972,23 @@ class SLDomain(Base, ModelMixin):
|
|||||||
sa.Boolean, nullable=False, default=False, server_default="0"
|
sa.Boolean, nullable=False, default=False, server_default="0"
|
||||||
)
|
)
|
||||||
|
|
||||||
|
partner_id = sa.Column(
|
||||||
|
sa.ForeignKey(Partner.id, ondelete="cascade"),
|
||||||
|
nullable=True,
|
||||||
|
default=None,
|
||||||
|
server_default="NULL",
|
||||||
|
)
|
||||||
|
|
||||||
# if enabled, do not show this domain when user creates a custom alias
|
# if enabled, do not show this domain when user creates a custom alias
|
||||||
hidden = sa.Column(sa.Boolean, nullable=False, default=False, server_default="0")
|
hidden = sa.Column(sa.Boolean, nullable=False, default=False, server_default="0")
|
||||||
|
|
||||||
# the order in which the domains are shown when user creates a custom alias
|
# the order in which the domains are shown when user creates a custom alias
|
||||||
order = sa.Column(sa.Integer, nullable=False, default=0, server_default="0")
|
order = sa.Column(sa.Integer, nullable=False, default=0, server_default="0")
|
||||||
|
|
||||||
|
use_as_reverse_alias = sa.Column(
|
||||||
|
sa.Boolean, nullable=False, default=False, server_default="0"
|
||||||
|
)
|
||||||
|
|
||||||
def __repr__(self):
|
def __repr__(self):
|
||||||
return f"<SLDomain {self.domain} {'Premium' if self.premium_only else 'Free'}"
|
return f"<SLDomain {self.domain} {'Premium' if self.premium_only else 'Free'}"
|
||||||
|
|
||||||
@ -2806,6 +3009,8 @@ class Monitoring(Base, ModelMixin):
|
|||||||
active_queue = sa.Column(sa.Integer, nullable=False)
|
active_queue = sa.Column(sa.Integer, nullable=False)
|
||||||
deferred_queue = sa.Column(sa.Integer, nullable=False)
|
deferred_queue = sa.Column(sa.Integer, nullable=False)
|
||||||
|
|
||||||
|
__table_args__ = (Index("ix_monitoring_created_at", "created_at"),)
|
||||||
|
|
||||||
|
|
||||||
class BatchImport(Base, ModelMixin):
|
class BatchImport(Base, ModelMixin):
|
||||||
__tablename__ = "batch_import"
|
__tablename__ = "batch_import"
|
||||||
@ -2931,6 +3136,8 @@ class Bounce(Base, ModelMixin):
|
|||||||
email = sa.Column(sa.String(256), nullable=False, index=True)
|
email = sa.Column(sa.String(256), nullable=False, index=True)
|
||||||
info = sa.Column(sa.Text, nullable=True)
|
info = sa.Column(sa.Text, nullable=True)
|
||||||
|
|
||||||
|
__table_args__ = (sa.Index("ix_bounce_created_at", "created_at"),)
|
||||||
|
|
||||||
|
|
||||||
class TransactionalEmail(Base, ModelMixin):
|
class TransactionalEmail(Base, ModelMixin):
|
||||||
"""Storing all email addresses that receive transactional emails, including account email and mailboxes.
|
"""Storing all email addresses that receive transactional emails, including account email and mailboxes.
|
||||||
@ -2940,6 +3147,8 @@ class TransactionalEmail(Base, ModelMixin):
|
|||||||
__tablename__ = "transactional_email"
|
__tablename__ = "transactional_email"
|
||||||
email = sa.Column(sa.String(256), nullable=False, unique=False)
|
email = sa.Column(sa.String(256), nullable=False, unique=False)
|
||||||
|
|
||||||
|
__table_args__ = (sa.Index("ix_transactional_email_created_at", "created_at"),)
|
||||||
|
|
||||||
|
|
||||||
class Payout(Base, ModelMixin):
|
class Payout(Base, ModelMixin):
|
||||||
"""Referral payouts"""
|
"""Referral payouts"""
|
||||||
@ -2992,7 +3201,7 @@ class MessageIDMatching(Base, ModelMixin):
|
|||||||
|
|
||||||
# to track what email_log that has created this matching
|
# to track what email_log that has created this matching
|
||||||
email_log_id = sa.Column(
|
email_log_id = sa.Column(
|
||||||
sa.ForeignKey("email_log.id", ondelete="cascade"), nullable=True
|
sa.ForeignKey("email_log.id", ondelete="cascade"), nullable=True, index=True
|
||||||
)
|
)
|
||||||
|
|
||||||
email_log = orm.relationship("EmailLog")
|
email_log = orm.relationship("EmailLog")
|
||||||
@ -3226,31 +3435,6 @@ class ProviderComplaint(Base, ModelMixin):
|
|||||||
refused_email = orm.relationship(RefusedEmail, foreign_keys=[refused_email_id])
|
refused_email = orm.relationship(RefusedEmail, foreign_keys=[refused_email_id])
|
||||||
|
|
||||||
|
|
||||||
class Partner(Base, ModelMixin):
|
|
||||||
__tablename__ = "partner"
|
|
||||||
|
|
||||||
name = sa.Column(sa.String(128), unique=True, nullable=False)
|
|
||||||
contact_email = sa.Column(sa.String(128), unique=True, nullable=False)
|
|
||||||
|
|
||||||
@staticmethod
|
|
||||||
def find_by_token(token: str) -> Optional[Partner]:
|
|
||||||
hmaced = PartnerApiToken.hmac_token(token)
|
|
||||||
res = (
|
|
||||||
Session.query(Partner, PartnerApiToken)
|
|
||||||
.filter(
|
|
||||||
and_(
|
|
||||||
PartnerApiToken.token == hmaced,
|
|
||||||
Partner.id == PartnerApiToken.partner_id,
|
|
||||||
)
|
|
||||||
)
|
|
||||||
.first()
|
|
||||||
)
|
|
||||||
if res:
|
|
||||||
partner, partner_api_token = res
|
|
||||||
return partner
|
|
||||||
return None
|
|
||||||
|
|
||||||
|
|
||||||
class PartnerApiToken(Base, ModelMixin):
|
class PartnerApiToken(Base, ModelMixin):
|
||||||
__tablename__ = "partner_api_token"
|
__tablename__ = "partner_api_token"
|
||||||
|
|
||||||
@ -3320,7 +3504,7 @@ class PartnerSubscription(Base, ModelMixin):
|
|||||||
)
|
)
|
||||||
|
|
||||||
# when the partner subscription ends
|
# when the partner subscription ends
|
||||||
end_at = sa.Column(ArrowType, nullable=False)
|
end_at = sa.Column(ArrowType, nullable=False, index=True)
|
||||||
|
|
||||||
partner_user = orm.relationship(PartnerUser)
|
partner_user = orm.relationship(PartnerUser)
|
||||||
|
|
||||||
@ -3350,7 +3534,7 @@ class PartnerSubscription(Base, ModelMixin):
|
|||||||
|
|
||||||
class Newsletter(Base, ModelMixin):
|
class Newsletter(Base, ModelMixin):
|
||||||
__tablename__ = "newsletter"
|
__tablename__ = "newsletter"
|
||||||
subject = sa.Column(sa.String(), nullable=False, unique=True, index=True)
|
subject = sa.Column(sa.String(), nullable=False, index=True)
|
||||||
|
|
||||||
html = sa.Column(sa.Text)
|
html = sa.Column(sa.Text)
|
||||||
plain_text = sa.Column(sa.Text)
|
plain_text = sa.Column(sa.Text)
|
||||||
|
@@ -1 +1,3 @@
 from . import views
+
+__all__ = ["views"]
@@ -1 +1,3 @@
 from .views import authorize, token, user_info
+
+__all__ = ["authorize", "token", "user_info"]
@@ -64,7 +64,7 @@ def _split_arg(arg_input: Union[str, list]) -> Set[str]:
     - the response_type/scope passed as a list ?scope=scope_1&scope=scope_2
     """
     res = set()
-    if type(arg_input) is str:
+    if isinstance(arg_input, str):
         if " " in arg_input:
             for x in arg_input.split(" "):
                 if x:
@@ -5,3 +5,11 @@ from .views import (
     account_activated,
     extension_redirect,
 )
+
+__all__ = [
+    "index",
+    "final",
+    "setup_done",
+    "account_activated",
+    "extension_redirect",
+]
@@ -39,7 +39,6 @@ class _InnerLock:
         lock_redis.storage.delete(lock_name)

     def __call__(self, f: Callable[..., Any]):
-
         if self.lock_suffix is None:
             lock_suffix = f.__name__
         else:
@@ -5,3 +5,11 @@ from .views import (
     provider1_callback,
     provider2_callback,
 )
+
+__all__ = [
+    "index",
+    "phone_reservation",
+    "twilio_callback",
+    "provider1_callback",
+    "provider2_callback",
+]
@@ -7,11 +7,12 @@ from typing import Optional

 from app.account_linking import SLPlan, SLPlanType
 from app.config import PROTON_EXTRA_HEADER_NAME, PROTON_EXTRA_HEADER_VALUE
+from app.errors import ProtonAccountNotVerified
 from app.log import LOG

 _APP_VERSION = "OauthClient_1.0.0"

-PROTON_ERROR_CODE_NOT_EXISTS = 2501
+PROTON_ERROR_CODE_HV_NEEDED = 9001

 PLAN_FREE = 1
 PLAN_PREMIUM = 2
@@ -57,6 +58,15 @@ def convert_access_token(access_token_response: str) -> AccessCredentials:
     )


+def handle_response_not_ok(status: int, body: dict, text: str) -> Exception:
+    if status == HTTPStatus.UNPROCESSABLE_ENTITY:
+        res_code = body.get("Code")
+        if res_code == PROTON_ERROR_CODE_HV_NEEDED:
+            return ProtonAccountNotVerified()
+
+    return Exception(f"Unexpected status code. Wanted 200 and got {status}: " + text)
+
+
 class ProtonClient(ABC):
     @abstractmethod
     def get_user(self) -> Optional[UserInformation]:
@@ -124,11 +134,11 @@ class HttpProtonClient(ProtonClient):
     @staticmethod
     def __validate_response(res: Response) -> dict:
         status = res.status_code
-        if status != HTTPStatus.OK:
-            raise Exception(
-                f"Unexpected status code. Wanted 200 and got {status}: " + res.text
-            )
         as_json = res.json()
+        if status != HTTPStatus.OK:
+            raise HttpProtonClient.__handle_response_not_ok(
+                status=status, body=as_json, text=res.text
+            )
         res_code = as_json.get("Code")
         if not res_code or res_code != 1000:
             raise Exception(
app/app/rate_limiter.py (new file, 31 lines)
@@ -0,0 +1,31 @@
+from datetime import datetime
+from typing import Optional
+
+import redis.exceptions
+import werkzeug.exceptions
+from limits.storage import RedisStorage
+
+from app.log import log
+
+lock_redis: Optional[RedisStorage] = None
+
+
+def set_redis_concurrent_lock(redis: RedisStorage):
+    global lock_redis
+    lock_redis = redis
+
+
+def check_bucket_limit(
+    lock_name: Optional[str] = None,
+    max_hits: int = 5,
+    bucket_seconds: int = 3600,
+):
+    # Calculate current bucket time
+    bucket_id = int(datetime.utcnow().timestamp()) % bucket_seconds
+    bucket_lock_name = f"bl:{lock_name}:{bucket_id}"
+    try:
+        value = lock_redis.incr(bucket_lock_name, bucket_seconds)
+        if value > max_hits:
+            raise werkzeug.exceptions.TooManyRequests()
+    except redis.exceptions.RedisError:
+        log.e("Cannot connect to redis")
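Editor's note: a minimal usage sketch of the new bucket limiter, mirroring the call made from Alias.create above. The user id and the (hits, window) pairs are illustrative assumptions, not values from this diff, and the sketch assumes initialize_redis_services() has already set the Redis storage.

# Hypothetical sketch; limits shape mirrors ALIAS_CREATE_RATE_LIMIT_PAID/FREE from config
from app import rate_limiter

user_id = 42  # assumed example id
limits = [(5, 1), (50, 30)]  # assumed (hits, window) pairs
for max_hits, window in limits:
    # one Redis counter per user and per window; raises werkzeug TooManyRequests when exceeded
    rate_limiter.check_bucket_limit(f"alias_create_{window}d:{user_id}", max_hits, window)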
@@ -2,21 +2,23 @@ import flask
 import limits.storage

 from app.parallel_limiter import set_redis_concurrent_lock
+from app.rate_limiter import set_redis_concurrent_lock as rate_limit_set_redis
 from app.session import RedisSessionStore


 def initialize_redis_services(app: flask.Flask, redis_url: str):

     if redis_url.startswith("redis://") or redis_url.startswith("rediss://"):
         storage = limits.storage.RedisStorage(redis_url)
         app.session_interface = RedisSessionStore(storage.storage, storage.storage, app)
         set_redis_concurrent_lock(storage)
+        rate_limit_set_redis(storage)
     elif redis_url.startswith("redis+sentinel://"):
         storage = limits.storage.RedisSentinelStorage(redis_url)
         app.session_interface = RedisSessionStore(
             storage.storage, storage.storage_slave, app
         )
         set_redis_concurrent_lock(storage)
+        rate_limit_set_redis(storage)
     else:
         raise RuntimeError(
             f"Tried to set_redis_session with an invalid redis url: ${redis_url}"
@@ -13,17 +13,29 @@ from app.config import (
     LOCAL_FILE_UPLOAD,
     UPLOAD_DIR,
     URL,
+    AWS_ENDPOINT_URL,
 )
+from app.log import LOG

-if not LOCAL_FILE_UPLOAD:
-    _session = boto3.Session(
-        aws_access_key_id=AWS_ACCESS_KEY_ID,
-        aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
-        region_name=AWS_REGION,
-    )
+_s3_client = None


-def upload_from_bytesio(key: str, bs: BytesIO, content_type="string"):
+def _get_s3client():
+    global _s3_client
+    if _s3_client is None:
+        args = {
+            "aws_access_key_id": AWS_ACCESS_KEY_ID,
+            "aws_secret_access_key": AWS_SECRET_ACCESS_KEY,
+            "region_name": AWS_REGION,
+        }
+        if AWS_ENDPOINT_URL:
+            args["endpoint_url"] = AWS_ENDPOINT_URL
+        _s3_client = boto3.client("s3", **args)
+    return _s3_client
+
+
+def upload_from_bytesio(key: str, bs: BytesIO, content_type="application/octet-stream"):
     bs.seek(0)

     if LOCAL_FILE_UPLOAD:
@@ -34,7 +46,8 @@ def upload_from_bytesio(key: str, bs: BytesIO, content_type="string"):
             f.write(bs.read())

     else:
-        _session.resource("s3").Bucket(BUCKET).put_object(
+        _get_s3client().put_object(
+            Bucket=BUCKET,
             Key=key,
             Body=bs,
             ContentType=content_type,
@@ -52,7 +65,8 @@ def upload_email_from_bytesio(path: str, bs: BytesIO, filename):
             f.write(bs.read())

     else:
-        _session.resource("s3").Bucket(BUCKET).put_object(
+        _get_s3client().put_object(
+            Bucket=BUCKET,
             Key=path,
             Body=bs,
             # Support saving a remote file using Http header
@@ -67,12 +81,9 @@ def download_email(path: str) -> Optional[str]:
         file_path = os.path.join(UPLOAD_DIR, path)
         with open(file_path, "rb") as f:
             return f.read()
-    resp = (
-        _session.resource("s3")
-        .Bucket(BUCKET)
-        .get_object(
-            Key=path,
-        )
+    resp = _get_s3client().get_object(
+        Bucket=BUCKET,
+        Key=path,
     )
     if not resp or "Body" not in resp:
         return None
@@ -88,8 +99,7 @@ def get_url(key: str, expires_in=3600) -> str:
     if LOCAL_FILE_UPLOAD:
         return URL + "/static/upload/" + key
     else:
-        s3_client = _session.client("s3")
-        return s3_client.generate_presigned_url(
+        return _get_s3client().generate_presigned_url(
             ExpiresIn=expires_in,
             ClientMethod="get_object",
             Params={"Bucket": BUCKET, "Key": key},
@@ -100,5 +110,15 @@ def delete(path: str):
     if LOCAL_FILE_UPLOAD:
         os.remove(os.path.join(UPLOAD_DIR, path))
     else:
-        o = _session.resource("s3").Bucket(BUCKET).Object(path)
-        o.delete()
+        _get_s3client().delete_object(Bucket=BUCKET, Key=path)
+
+
+def create_bucket_if_not_exists():
+    s3client = _get_s3client()
+    buckets = s3client.list_buckets()
+    for bucket in buckets["Buckets"]:
+        if bucket["Name"] == BUCKET:
+            LOG.i("Bucket already exists")
+            return
+    s3client.create_bucket(Bucket=BUCKET)
+    LOG.i(f"Bucket {BUCKET} created")
@@ -75,7 +75,7 @@ class RedisSessionStore(SessionInterface):
         try:
             data = pickle.loads(val)
             return ServerSession(data, session_id=session_id)
-        except:
+        except Exception:
             pass
         return ServerSession(session_id=str(uuid.uuid4()))
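Editor's note: a minimal sketch of how the refactored, lazily built S3 client might be exercised at startup when AWS_ENDPOINT_URL points at an S3-compatible store. Everything besides the functions shown in the diff above (the key name and payload) is an assumption.

# Hypothetical startup sketch, assuming AWS_* config values are set in the environment
from io import BytesIO

from app import s3

s3.create_bucket_if_not_exists()  # logs and returns early if BUCKET already exists
s3.upload_from_bytesio("test/hello.txt", BytesIO(b"hello"))  # default content type is now application/octet-stream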
33
app/app/subscription_webhook.py
Normal file
33
app/app/subscription_webhook.py
Normal file
@ -0,0 +1,33 @@
|
|||||||
|
import requests
|
||||||
|
from requests import RequestException
|
||||||
|
|
||||||
|
from app import config
|
||||||
|
from app.log import LOG
|
||||||
|
from app.models import User
|
||||||
|
|
||||||
|
|
||||||
|
def execute_subscription_webhook(user: User):
|
||||||
|
webhook_url = config.SUBSCRIPTION_CHANGE_WEBHOOK
|
||||||
|
if webhook_url is None:
|
||||||
|
return
|
||||||
|
subscription_end = user.get_active_subscription_end(
|
||||||
|
include_partner_subscription=False
|
||||||
|
)
|
||||||
|
sl_subscription_end = None
|
||||||
|
if subscription_end:
|
||||||
|
sl_subscription_end = subscription_end.timestamp
|
||||||
|
payload = {
|
||||||
|
"user_id": user.id,
|
||||||
|
"is_premium": user.is_premium(),
|
||||||
|
"active_subscription_end": sl_subscription_end,
|
||||||
|
}
|
||||||
|
try:
|
||||||
|
response = requests.post(webhook_url, json=payload, timeout=2)
|
||||||
|
if response.status_code == 200:
|
||||||
|
LOG.i("Sent request to subscription update webhook successfully")
|
||||||
|
else:
|
||||||
|
LOG.i(
|
||||||
|
f"Request to webhook failed with statue {response.status_code}: {response.text}"
|
||||||
|
)
|
||||||
|
except RequestException as e:
|
||||||
|
LOG.error(f"Subscription request exception: {e}")
|
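Editor's note: the receiving side of this webhook is not part of the diff; below is a minimal, purely hypothetical Flask sketch of a consumer for the payload built above, shown only to illustrate the payload keys.

# Hypothetical receiver sketch; endpoint path and framework are assumptions
from flask import Flask, request

app = Flask(__name__)

@app.route("/sl-subscription-changed", methods=["POST"])
def subscription_changed():
    data = request.get_json(force=True)
    # keys match the payload assembled in app/subscription_webhook.py
    print(data["user_id"], data["is_premium"], data["active_subscription_end"])
    return "", 204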
@@ -32,8 +32,8 @@ def random_words(words: int = 2, numbers: int = 0):
     fields = [secrets.choice(_words) for i in range(words)]

     if numbers > 0:
-        fields.append("".join([str(random.randint(0, 9)) for i in range(numbers)]))
-        return "".join(fields)
+        digits = "".join([str(random.randint(0, 9)) for i in range(numbers)])
+        return "_".join(fields) + digits
     else:
         return "_".join(fields)

@@ -49,11 +49,11 @@ def random_string(length=10, include_digits=False):

 def convert_to_id(s: str):
     """convert a string to id-like: remove space, remove special accent"""
-    s = s.replace(" ", "")
     s = s.lower()
     s = unidecode(s)
+    s = s.replace(" ", "")

-    return s
+    return s[:256]


 _ALLOWED_CHARS = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789_-."
@@ -99,7 +99,7 @@ def sanitize_email(email_address: str, not_lower=False) -> str:
     email_address = email_address.strip().replace(" ", "").replace("\n", " ")
     if not not_lower:
         email_address = email_address.lower()
-    return email_address
+    return email_address.replace("\u200f", "")


 class NextUrlSanitizer:
app/cron.py
@@ -5,11 +5,11 @@ from typing import List, Tuple

 import arrow
 import requests
-from sqlalchemy import func, desc, or_
+from sqlalchemy import func, desc, or_, and_
 from sqlalchemy.ext.compiler import compiles
 from sqlalchemy.orm import joinedload
 from sqlalchemy.orm.exc import ObjectDeletedError
-from sqlalchemy.sql import Insert
+from sqlalchemy.sql import Insert, text

 from app import s3, config
 from app.alias_utils import nb_email_log_for_mailbox
@@ -22,10 +22,9 @@ from app.email_utils import (
     render,
     email_can_be_used_as_mailbox,
     send_email_with_rate_control,
-    normalize_reply_email,
-    is_valid_email,
     get_email_domain_part,
 )
+from app.email_validation import is_valid_email, normalize_reply_email
 from app.errors import ProtonPartnerNotSetUp
 from app.log import LOG
 from app.mail_sender import load_unsent_mails_from_fs_and_resend
@@ -66,12 +65,14 @@ from server import create_light_app

 def notify_trial_end():
     for user in User.filter(
-        User.activated.is_(True), User.trial_end.isnot(None), User.lifetime.is_(False)
+        User.activated.is_(True),
+        User.trial_end.isnot(None),
+        User.trial_end >= arrow.now().shift(days=2),
+        User.trial_end < arrow.now().shift(days=3),
+        User.lifetime.is_(False),
     ).all():
         try:
-            if user.in_trial() and arrow.now().shift(
-                days=3
-            ) > user.trial_end >= arrow.now().shift(days=2):
+            if user.in_trial():
                 LOG.d("Send trial end email to user %s", user)
                 send_trial_end_soon_email(user)
         # happens if user has been deleted in the meantime
@@ -84,27 +85,49 @@ def delete_logs():
     delete_refused_emails()
     delete_old_monitoring()

-    for t in TransactionalEmail.filter(
+    for t_email in TransactionalEmail.filter(
         TransactionalEmail.created_at < arrow.now().shift(days=-7)
     ):
-        TransactionalEmail.delete(t.id)
+        TransactionalEmail.delete(t_email.id)

     for b in Bounce.filter(Bounce.created_at < arrow.now().shift(days=-7)):
         Bounce.delete(b.id)

     Session.commit()

-    LOG.d("Delete EmailLog older than 2 weeks")
+    LOG.d("Deleting EmailLog older than 2 weeks")

-    max_dt = arrow.now().shift(weeks=-2)
-    nb_deleted = EmailLog.filter(EmailLog.created_at < max_dt).delete()
-    Session.commit()
+    total_deleted = 0
+    batch_size = 500
+    Session.execute("set session statement_timeout=30000").rowcount
+    queries_done = 0
+    cutoff_time = arrow.now().shift(days=-14)
+    rows_to_delete = EmailLog.filter(EmailLog.created_at < cutoff_time).count()
+    expected_queries = int(rows_to_delete / batch_size)
+    sql = text(
+        "DELETE FROM email_log WHERE id IN (SELECT id FROM email_log WHERE created_at < :cutoff_time order by created_at limit :batch_size)"
+    )
+    str_cutoff_time = cutoff_time.isoformat()
+    while total_deleted < rows_to_delete:
+        deleted_count = Session.execute(
+            sql, {"cutoff_time": str_cutoff_time, "batch_size": batch_size}
+        ).rowcount
+        Session.commit()
+        total_deleted += deleted_count
+        queries_done += 1
+        LOG.i(
+            f"[{queries_done}/{expected_queries}] Deleted {total_deleted} EmailLog entries"
+        )
+        if deleted_count < batch_size:
+            break

-    LOG.i("Delete %s email logs", nb_deleted)
+    LOG.i("Deleted %s email logs", total_deleted)


 def delete_refused_emails():
-    for refused_email in RefusedEmail.filter_by(deleted=False).all():
+    for refused_email in (
+        RefusedEmail.filter_by(deleted=False).order_by(RefusedEmail.id).all()
+    ):
         if arrow.now().shift(days=1) > refused_email.delete_at >= arrow.now():
             LOG.d("Delete refused email %s", refused_email)
             if refused_email.path:
@@ -138,7 +161,7 @@ def notify_premium_end():

         send_email(
             user.email,
-            f"Your subscription will end soon",
+            "Your subscription will end soon",
             render(
                 "transactional/subscription-end.txt",
                 user=user,
@@ -195,7 +218,7 @@ def notify_manual_sub_end():
         LOG.d("Remind user %s that their manual sub is ending soon", user)
         send_email(
             user.email,
-            f"Your subscription will end soon",
+            "Your subscription will end soon",
             render(
                 "transactional/manual-subscription-end.txt",
                 user=user,
@@ -272,7 +295,11 @@ def compute_metric2() -> Metric2:
     _24h_ago = now.shift(days=-1)

     nb_referred_user_paid = 0
-    for user in User.filter(User.referral_id.isnot(None)):
+    for user in (
+        User.filter(User.referral_id.isnot(None))
+        .yield_per(500)
+        .enable_eagerloads(False)
+    ):
         if user.is_paid():
             nb_referred_user_paid += 1

@@ -563,21 +590,21 @@ nb_total_bounced_last_24h: {stats_today.nb_total_bounced_last_24h} - {increase_p
 """

     monitoring_report += "\n====================================\n"
-    monitoring_report += f"""
+    monitoring_report += """
 # Account bounce report:
 """

     for email, bounces in bounce_report():
         monitoring_report += f"{email}: {bounces}\n"

-    monitoring_report += f"""\n
+    monitoring_report += """\n
 # Alias creation report:
 """

     for email, nb_alias, date in alias_creation_report():
         monitoring_report += f"{email}, {date}: {nb_alias}\n"

-    monitoring_report += f"""\n
+    monitoring_report += """\n
 # Full bounce detail report:
 """
     monitoring_report += all_bounce_report()
@@ -1020,7 +1047,8 @@ async def check_hibp():
         )
         .filter(Alias.enabled)
         .order_by(Alias.hibp_last_check.asc())
-        .all()
+        .yield_per(500)
+        .enable_eagerloads(False)
     ):
         await queue.put(alias.id)

@@ -1071,14 +1099,14 @@ def notify_hibp():
         )

         LOG.d(
-            f"Send new breaches found email to %s for %s breaches aliases",
+            "Send new breaches found email to %s for %s breaches aliases",
             user,
             len(breached_aliases),
         )

         send_email(
             user.email,
-            f"You were in a data breach",
+            "You were in a data breach",
             render(
                 "transactional/hibp-new-breaches.txt.jinja2",
                 user=user,
@@ -1098,6 +1126,18 @@ def notify_hibp():
     Session.commit()


+def clear_users_scheduled_to_be_deleted():
+    users = User.filter(
+        and_(User.delete_on.isnot(None), User.delete_on < arrow.now())
+    ).all()
+    for user in users:
+        LOG.i(
+            f"Scheduled deletion of user {user} with scheduled delete on {user.delete_on}"
+        )
+        User.delete(user.id)
+    Session.commit()
if __name__ == "__main__":
|
if __name__ == "__main__":
|
||||||
LOG.d("Start running cronjob")
|
LOG.d("Start running cronjob")
|
||||||
parser = argparse.ArgumentParser()
|
parser = argparse.ArgumentParser()
|
||||||
@ -1164,3 +1204,6 @@ if __name__ == "__main__":
|
|||||||
elif args.job == "send_undelivered_mails":
|
elif args.job == "send_undelivered_mails":
|
||||||
LOG.d("Sending undelivered emails")
|
LOG.d("Sending undelivered emails")
|
||||||
load_unsent_mails_from_fs_and_resend()
|
load_unsent_mails_from_fs_and_resend()
|
||||||
|
elif args.job == "delete_scheduled_users":
|
||||||
|
LOG.d("Deleting users scheduled to be deleted")
|
||||||
|
clear_users_scheduled_to_be_deleted()
|
||||||
|
@@ -5,65 +5,66 @@ jobs:
     schedule: "0 0 * * *"
     captureStderr: true
 
-  - name: SimpleLogin Notify Trial Ends
-    command: python /code/cron.py -j notify_trial_end
-    shell: /bin/bash
-    schedule: "0 8 * * *"
-    captureStderr: true
-
-  - name: SimpleLogin Notify Manual Subscription Ends
-    command: python /code/cron.py -j notify_manual_subscription_end
-    shell: /bin/bash
-    schedule: "0 9 * * *"
-    captureStderr: true
-
-  - name: SimpleLogin Notify Premium Ends
-    command: python /code/cron.py -j notify_premium_end
-    shell: /bin/bash
-    schedule: "0 10 * * *"
-    captureStderr: true
-
-  - name: SimpleLogin Delete Logs
-    command: python /code/cron.py -j delete_logs
-    shell: /bin/bash
-    schedule: "0 11 * * *"
-    captureStderr: true
-
-  - name: SimpleLogin Poll Apple Subscriptions
-    command: python /code/cron.py -j poll_apple_subscription
-    shell: /bin/bash
-    schedule: "0 12 * * *"
-    captureStderr: true
-
-  - name: SimpleLogin Sanity Check
-    command: python /code/cron.py -j sanity_check
-    shell: /bin/bash
-    schedule: "0 2 * * *"
-    captureStderr: true
-
   - name: SimpleLogin Delete Old Monitoring records
     command: python /code/cron.py -j delete_old_monitoring
     shell: /bin/bash
-    schedule: "0 14 * * *"
+    schedule: "15 1 * * *"
     captureStderr: true
 
   - name: SimpleLogin Custom Domain check
     command: python /code/cron.py -j check_custom_domain
     shell: /bin/bash
-    schedule: "0 15 * * *"
+    schedule: "15 2 * * *"
     captureStderr: true
 
   - name: SimpleLogin HIBP check
     command: python /code/cron.py -j check_hibp
     shell: /bin/bash
-    schedule: "0 18 * * *"
+    schedule: "15 3 * * *"
     captureStderr: true
     concurrencyPolicy: Forbid
 
   - name: SimpleLogin Notify HIBP breaches
     command: python /code/cron.py -j notify_hibp
     shell: /bin/bash
-    schedule: "0 19 * * *"
+    schedule: "15 4 * * *"
+    captureStderr: true
+    concurrencyPolicy: Forbid
+
+  - name: SimpleLogin Delete Logs
+    command: python /code/cron.py -j delete_logs
+    shell: /bin/bash
+    schedule: "15 5 * * *"
+    captureStderr: true
+
+  - name: SimpleLogin Poll Apple Subscriptions
+    command: python /code/cron.py -j poll_apple_subscription
+    shell: /bin/bash
+    schedule: "15 6 * * *"
+    captureStderr: true
+
+  - name: SimpleLogin Notify Trial Ends
+    command: python /code/cron.py -j notify_trial_end
+    shell: /bin/bash
+    schedule: "15 8 * * *"
+    captureStderr: true
+
+  - name: SimpleLogin Notify Manual Subscription Ends
+    command: python /code/cron.py -j notify_manual_subscription_end
+    shell: /bin/bash
+    schedule: "15 9 * * *"
+    captureStderr: true
+
+  - name: SimpleLogin Notify Premium Ends
+    command: python /code/cron.py -j notify_premium_end
+    shell: /bin/bash
+    schedule: "15 10 * * *"
+    captureStderr: true
+
+  - name: SimpleLogin delete users scheduled to be deleted
+    command: echo disabled_user_deletion #python /code/cron.py -j delete_scheduled_users
+    shell: /bin/bash
+    schedule: "15 11 * * *"
     captureStderr: true
     concurrencyPolicy: Forbid
 
@@ -15,6 +15,7 @@
 - [GET /api/user/cookie_token](#get-apiusercookie_token): Get a one time use token to exchange it for a valid cookie
 - [PATCH /api/user_info](#patch-apiuser_info): Update user's information.
 - [POST /api/api_key](#post-apiapi_key): Create a new API key.
+- [GET /api/stats](#get-apistats): Get user's stats.
 - [GET /api/logout](#get-apilogout): Log out.
 
 [Alias endpoints](#alias-endpoints)
@@ -226,6 +227,22 @@ Input:
 
 Output: same as GET /api/user_info
 
+#### GET /api/stats
+
+Given the API Key, return stats about the number of aliases, number of emails forwarded/replied/blocked
+
+Input:
+
+- `Authentication` header that contains the api key
+
+Output: if api key is correct, return a json with the following fields:
+
+```json
+{"nb_alias": 1, "nb_block": 0, "nb_forward": 0, "nb_reply": 0}
+```
+
+If api key is incorrect, return 401.
+
 #### PATCH /api/sudo
 
 Enable sudo mode
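For reference, a minimal sketch of calling the `GET /api/stats` endpoint documented in the hunk above. The base URL, the API key value, and the `requests` dependency are illustrative assumptions, not part of the diff; only the path, the `Authentication` header, and the response fields come from the documentation.

```python
import requests  # third-party HTTP client, assumed to be available

API_KEY = "your-api-key"             # created via POST /api/api_key
BASE_URL = "https://app.sl.example"  # hypothetical self-hosted instance URL

resp = requests.get(
    f"{BASE_URL}/api/stats",
    headers={"Authentication": API_KEY},  # the api key goes in this header
)
if resp.status_code == 401:
    raise SystemExit("API key is incorrect")

stats = resp.json()
# Documented fields: nb_alias, nb_block, nb_forward, nb_reply
print(stats["nb_alias"], stats["nb_forward"], stats["nb_reply"], stats["nb_block"])
```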
@@ -371,7 +388,7 @@ Input:
 - (Optional but recommended) `hostname` passed in query string
 - Request Message Body in json (`Content-Type` is `application/json`)
     - alias_prefix: string. The first part of the alias that user can choose.
-    - signed_suffix: should be one of the suffixes returned in the `GET /api/v4/alias/options` endpoint.
+    - signed_suffix: should be one of the suffixes returned in the `GET /api/v5/alias/options` endpoint.
     - mailbox_ids: list of mailbox_id that "owns" this alias
     - (Optional) note: alias note
     - (Optional) name: alias name
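The alias-creation input described in this hunk maps onto a request body like the sketch below. The endpoint path, hostname, mailbox id, and field values are placeholders and assumptions; only the field names and the `hostname` query parameter come from the documentation above.

```python
import requests  # third-party HTTP client, assumed to be available

API_KEY = "your-api-key"
BASE_URL = "https://app.sl.example"  # hypothetical instance URL
# Placeholder: use the alias-creation path documented earlier in api.md.
CREATE_ALIAS_PATH = "/api/v3/alias/custom/new"

payload = {
    "alias_prefix": "newsletter",                     # the part the user chooses
    "signed_suffix": "<from GET /api/v5/alias/options>",
    "mailbox_ids": [1],                               # mailboxes that "own" this alias
    "note": "created from the API sketch",            # optional
    "name": "Newsletter alias",                       # optional
}

resp = requests.post(
    f"{BASE_URL}{CREATE_ALIAS_PATH}",
    headers={"Authentication": API_KEY, "Content-Type": "application/json"},
    params={"hostname": "news.example.com"},          # optional but recommended
    json=payload,
)
print(resp.status_code, resp.json())
```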
@@ -694,7 +711,7 @@ Return 200 and `existed=true` if contact is already added.
 
 It can return 403 with an error if the user cannot create reverse alias.
 
-``json
+```json
 {
   "error": "Please upgrade to create a reverse-alias"
 }
app/docs/ssl.md
@@ -1,4 +1,4 @@
-# SSL, HTTPS, and HSTS
+# SSL, HTTPS, HSTS and additional security measures
 
 It's highly recommended to enable SSL/TLS on your server, both for the web app and email server.
 
@@ -58,3 +58,124 @@ Now, reload Nginx:
 ```bash
 sudo systemctl reload nginx
 ```
+
+## Additional security measures
+
+For additional security, we recommend you take some extra steps.
+
+### Enable Certificate Authority Authorization (CAA)
+
+[Certificate Authority Authorization](https://letsencrypt.org/docs/caa/) is a step you can take to restrict the list of certificate authorities that are allowed to issue certificates for your domains.
+
+Use [SSLMate’s CAA Record Generator](https://sslmate.com/caa/) to create a **CAA record** with the following configuration:
+
+- `flags`: `0`
+- `tag`: `issue`
+- `value`: `"letsencrypt.org"`
+
+To verify if the DNS works, the following command
+
+```bash
+dig @1.1.1.1 mydomain.com caa
+```
+
+should return:
+
+```
+mydomain.com. 3600 IN CAA 0 issue "letsencrypt.org"
+```
+
+### SMTP MTA Strict Transport Security (MTA-STS)
+
+[MTA-STS](https://datatracker.ietf.org/doc/html/rfc8461) is an extra step you can take to broadcast the ability of your instance to receive and, optionally, enforce TLS-secure SMTP connections to protect email traffic.
+
+Enabling MTA-STS requires you to serve a specific file from the subdomain `mta-sts.domain.com` on a well-known route.
+
+Create a text file `/var/www/.well-known/mta-sts.txt` with the content:
+
+```txt
+version: STSv1
+mode: testing
+mx: app.mydomain.com
+max_age: 86400
+```
+
+It is recommended to start with `mode: testing` to give yourself time to review failure reports. Add as many `mx:` domain entries as you have matching **MX records** in your DNS configuration.
+
+Create a **TXT record** for `_mta-sts.mydomain.com.` with the following value:
+
+```txt
+v=STSv1; id=UNIX_TIMESTAMP
+```
+
+With `UNIX_TIMESTAMP` being the current date/time.
+
+Use the following command to generate the record:
+
+```bash
+echo "v=STSv1; id=$(date +%s)"
+```
+
+To verify if the DNS works, the following command
+
+```bash
+dig @1.1.1.1 _mta-sts.mydomain.com txt
+```
+
+should return a result similar to this one:
+
+```
+_mta-sts.mydomain.com. 3600 IN TXT "v=STSv1; id=1689416399"
+```
+
+Create an additional Nginx configuration in `/etc/nginx/sites-enabled/mta-sts` with the following content:
+
+```
+server {
+    server_name mta-sts.mydomain.com;
+    root /var/www;
+    listen 80;
+
+    location ^~ /.well-known {}
+}
+```
+
+Restart Nginx with the following command:
+
+```sh
+sudo service nginx restart
+```
+
+A correct configuration of MTA-STS, however, requires that the certificate used to host the `mta-sts` subdomain matches that of the subdomain referred to by the **MX record** from the DNS. In other words, both `mta-sts.mydomain.com` and `app.mydomain.com` must share the same certificate.
+
+The easiest way to do this is to _expand_ the certificate associated with `app.mydomain.com` to also support the `mta-sts` subdomain using the following command:
+
+```sh
+certbot --expand --nginx -d app.mydomain.com,mta-sts.mydomain.com
+```
+
+## SMTP TLS Reporting
+
+[TLSRPT](https://datatracker.ietf.org/doc/html/rfc8460) is used by SMTP systems to report failures in establishing TLS-secure sessions as broadcast by the MTA-STS configuration.
+
+Configuring MTA-STS in `mode: testing` as shown in the previous section gives you time to review failures from some SMTP senders.
+
+Create a **TXT record** for `_smtp._tls.mydomain.com.` with the following value:
+
+```txt
+v=TLSRPTv1; rua=mailto:YOUR_EMAIL
+```
+
+The TLSRPT configuration at the DNS level allows SMTP senders that fail to initiate TLS-secure sessions to send reports to a particular email address. We suggest creating a `tls-reports` alias in SimpleLogin for this purpose.
+
+To verify if the DNS works, the following command
+
+```bash
+dig @1.1.1.1 _smtp._tls.mydomain.com txt
+```
+
+should return a result similar to this one:
+
+```
+_smtp._tls.mydomain.com. 3600 IN TXT "v=TLSRPTv1; rua=mailto:tls-reports@mydomain.com"
+```
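To sanity-check the MTA-STS setup added above, here is a rough verification sketch: it fetches the policy file over HTTPS and looks up the `_mta-sts` TXT record. `mydomain.com` mirrors the placeholder domain used in the docs, and the `dnspython` package is an assumed dependency rather than something the diff introduces.

```python
import ssl
import urllib.request

import dns.resolver  # third-party "dnspython" package, assumed to be installed

DOMAIN = "mydomain.com"  # placeholder domain, as in the examples above

# 1. The policy file must be served from https://mta-sts.<domain>/.well-known/mta-sts.txt
policy_url = f"https://mta-sts.{DOMAIN}/.well-known/mta-sts.txt"
with urllib.request.urlopen(policy_url, context=ssl.create_default_context()) as resp:
    policy = resp.read().decode()
assert "version: STSv1" in policy, "policy file is missing the STSv1 version line"

# 2. The _mta-sts TXT record must advertise the policy with a v=STSv1 id
answers = dns.resolver.resolve(f"_mta-sts.{DOMAIN}", "TXT")
txt_values = [b"".join(rdata.strings).decode() for rdata in answers]
assert any(v.startswith("v=STSv1") for v in txt_values), "no v=STSv1 TXT record found"

print("MTA-STS policy and TXT record look consistent")
```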
@@ -106,8 +106,6 @@ from app.email_utils import (
     get_header_unicode,
     generate_reply_email,
     is_reverse_alias,
-    normalize_reply_email,
-    is_valid_email,
     replace,
     should_disable,
     parse_id_from_bounce,
@@ -123,6 +121,7 @@ from app.email_utils import (
     generate_verp_email,
     sl_formataddr,
 )
+from app.email_validation import is_valid_email, normalize_reply_email
 from app.errors import (
     NonReverseAliasInReplyPhase,
     VERPTransactional,
@@ -161,6 +160,7 @@ from app.models import (
     MessageIDMatching,
     Notification,
     VerpType,
+    SLDomain,
 )
 from app.pgp_utils import (
     PGPException,
@@ -235,7 +235,6 @@ def get_or_create_contact(from_header: str, mail_from: str, alias: Alias) -> Con
             contact.mail_from = mail_from
             Session.commit()
     else:
-
         try:
             contact = Contact.create(
                 user_id=alias.user_id,
@@ -243,7 +242,7 @@ def get_or_create_contact(from_header: str, mail_from: str, alias: Alias) -> Con
                 website_email=contact_email,
                 name=contact_name,
                 mail_from=mail_from,
-                reply_email=generate_reply_email(contact_email, alias.user)
+                reply_email=generate_reply_email(contact_email, alias)
                 if is_valid_email(contact_email)
                 else NOREPLY,
                 automatic_created=True,
@@ -261,7 +260,7 @@ def get_or_create_contact(from_header: str, mail_from: str, alias: Alias) -> Con
 
             Session.commit()
         except IntegrityError:
-            LOG.w("Contact %s %s already exist", alias, contact_email)
+            LOG.w(f"Contact with email {contact_email} for alias {alias} already exist")
             Session.rollback()
             contact = Contact.get_by(alias_id=alias.id, website_email=contact_email)
 
@@ -279,6 +278,9 @@ def get_or_create_reply_to_contact(
     except ValueError:
         return
 
+    if len(contact_name) >= Contact.MAX_NAME_LENGTH:
+        contact_name = contact_name[0 : Contact.MAX_NAME_LENGTH]
+
     if not is_valid_email(contact_address):
         LOG.w(
             "invalid reply-to address %s. Parse from %s",
@@ -304,7 +306,7 @@ def get_or_create_reply_to_contact(
             alias_id=alias.id,
             website_email=contact_address,
             name=contact_name,
-            reply_email=generate_reply_email(contact_address, alias.user),
+            reply_email=generate_reply_email(contact_address, alias),
             automatic_created=True,
         )
         Session.commit()
@@ -347,6 +349,10 @@ def replace_header_when_forward(msg: Message, alias: Alias, header: str):
             continue
 
         contact = Contact.get_by(alias_id=alias.id, website_email=contact_email)
+        contact_name = full_address.display_name
+        if len(contact_name) >= Contact.MAX_NAME_LENGTH:
+            contact_name = contact_name[0 : Contact.MAX_NAME_LENGTH]
+
         if contact:
             # update the contact name if needed
             if contact.name != full_address.display_name:
@@ -354,9 +360,9 @@ def replace_header_when_forward(msg: Message, alias: Alias, header: str):
                     "Update contact %s name %s to %s",
                     contact,
                     contact.name,
-                    full_address.display_name,
+                    contact_name,
                 )
-                contact.name = full_address.display_name
+                contact.name = contact_name
                 Session.commit()
         else:
             LOG.d(
@@ -371,8 +377,8 @@ def replace_header_when_forward(msg: Message, alias: Alias, header: str):
                 user_id=alias.user_id,
                 alias_id=alias.id,
                 website_email=contact_email,
-                name=full_address.display_name,
-                reply_email=generate_reply_email(contact_email, alias.user),
+                name=contact_name,
+                reply_email=generate_reply_email(contact_email, alias),
                 is_cc=header.lower() == "cc",
                 automatic_created=True,
             )
@@ -540,12 +546,20 @@ def sign_msg(msg: Message) -> Message:
     signature.add_header("Content-Disposition", 'attachment; filename="signature.asc"')
 
     try:
-        signature.set_payload(sign_data(message_to_bytes(msg).replace(b"\n", b"\r\n")))
+        payload = sign_data(message_to_bytes(msg).replace(b"\n", b"\r\n"))
+
+        if not payload:
+            raise PGPException("Empty signature by gnupg")
+
+        signature.set_payload(payload)
     except Exception:
         LOG.e("Cannot sign, try using pgpy")
-        signature.set_payload(
-            sign_data_with_pgpy(message_to_bytes(msg).replace(b"\n", b"\r\n"))
-        )
+        payload = sign_data_with_pgpy(message_to_bytes(msg).replace(b"\n", b"\r\n"))
+
+        if not payload:
+            raise PGPException("Empty signature by pgpy")
+
+        signature.set_payload(payload)
 
     container.attach(signature)
 
@@ -622,8 +636,8 @@ def handle_forward(envelope, msg: Message, rcpt_to: str) -> List[Tuple[bool, str
 
     user = alias.user
 
-    if user.disabled:
-        LOG.w("User %s disabled, disable forwarding emails for %s", user, alias)
+    if not user.can_send_or_receive():
+        LOG.i(f"User {user} cannot receive emails")
         if should_ignore_bounce(envelope.mail_from):
             return [(True, status.E207)]
         else:
@@ -845,38 +859,40 @@ def forward_email_to_mailbox(
             f"""Email sent to {alias.email} from an invalid address and cannot be replied""",
         )
 
-    delete_all_headers_except(
-        msg,
-        [
-            headers.FROM,
-            headers.TO,
-            headers.CC,
-            headers.SUBJECT,
-            headers.DATE,
-            # do not delete original message id
-            headers.MESSAGE_ID,
-            # References and In-Reply-To are used for keeping the email thread
-            headers.REFERENCES,
-            headers.IN_REPLY_TO,
-        ]
-        + headers.MIME_HEADERS,
-    )
+    headers_to_keep = [
+        headers.FROM,
+        headers.TO,
+        headers.CC,
+        headers.SUBJECT,
+        headers.DATE,
+        # do not delete original message id
+        headers.MESSAGE_ID,
+        # References and In-Reply-To are used for keeping the email thread
+        headers.REFERENCES,
+        headers.IN_REPLY_TO,
+        headers.LIST_UNSUBSCRIBE,
+        headers.LIST_UNSUBSCRIBE_POST,
+    ] + headers.MIME_HEADERS
+    if user.include_header_email_header:
+        headers_to_keep.append(headers.AUTHENTICATION_RESULTS)
+    delete_all_headers_except(msg, headers_to_keep)
+
+    if mailbox.generic_subject:
+        LOG.d("Use a generic subject for %s", mailbox)
+        orig_subject = msg[headers.SUBJECT]
+        orig_subject = get_header_unicode(orig_subject)
+        add_or_replace_header(msg, "Subject", mailbox.generic_subject)
+        sender = msg[headers.FROM]
+        sender = get_header_unicode(sender)
+        msg = add_header(
+            msg,
+            f"""Forwarded by SimpleLogin to {alias.email} from "{sender}" with "{orig_subject}" as subject""",
+            f"""Forwarded by SimpleLogin to {alias.email} from "{sender}" with <b>{orig_subject}</b> as subject""",
+        )
 
     # create PGP email if needed
     if mailbox.pgp_enabled() and user.is_premium() and not alias.disable_pgp:
         LOG.d("Encrypt message using mailbox %s", mailbox)
-        if mailbox.generic_subject:
-            LOG.d("Use a generic subject for %s", mailbox)
-            orig_subject = msg[headers.SUBJECT]
-            orig_subject = get_header_unicode(orig_subject)
-            add_or_replace_header(msg, "Subject", mailbox.generic_subject)
-            sender = msg[headers.FROM]
-            sender = get_header_unicode(sender)
-            msg = add_header(
-                msg,
-                f"""Forwarded by SimpleLogin to {alias.email} from "{sender}" with "{orig_subject}" as subject""",
-                f"""Forwarded by SimpleLogin to {alias.email} from "{sender}" with <b>{orig_subject}</b> as subject""",
-            )
 
         try:
             msg = prepare_pgp_message(
@@ -897,6 +913,11 @@ def forward_email_to_mailbox(
     msg[headers.SL_EMAIL_LOG_ID] = str(email_log.id)
     if user.include_header_email_header:
         msg[headers.SL_ENVELOPE_FROM] = envelope.mail_from
+        if contact.name:
+            original_from = f"{contact.name} <{contact.website_email}>"
+        else:
+            original_from = contact.website_email
+        msg[headers.SL_ORIGINAL_FROM] = original_from
     # when an alias isn't in the To: header, there's no way for users to know what alias has received the email
     msg[headers.SL_ENVELOPE_TO] = alias.email
 
@@ -945,10 +966,11 @@ def forward_email_to_mailbox(
             envelope.rcpt_options,
         )
 
+    contact_domain = get_email_domain_part(contact.reply_email)
     try:
         sl_sendmail(
             # use a different envelope sender for each forward (aka VERP)
-            generate_verp_email(VerpType.bounce_forward, email_log.id),
+            generate_verp_email(VerpType.bounce_forward, email_log.id, contact_domain),
             mailbox.email,
             msg,
             envelope.mail_options,
@@ -1017,10 +1039,14 @@ def handle_reply(envelope, msg: Message, rcpt_to: str) -> (bool, str):
 
     reply_email = rcpt_to
 
-    # reply_email must end with EMAIL_DOMAIN
+    reply_domain = get_email_domain_part(reply_email)
+
+    # reply_email must end with EMAIL_DOMAIN or a domain that can be used as reverse alias domain
     if not reply_email.endswith(EMAIL_DOMAIN):
-        LOG.w(f"Reply email {reply_email} has wrong domain")
-        return False, status.E501
+        sl_domain: SLDomain = SLDomain.get_by(domain=reply_domain)
+        if sl_domain is None:
+            LOG.w(f"Reply email {reply_email} has wrong domain")
+            return False, status.E501
 
     # handle case where reply email is generated with non-allowed char
     reply_email = normalize_reply_email(reply_email)
@@ -1032,7 +1058,7 @@ def handle_reply(envelope, msg: Message, rcpt_to: str) -> (bool, str):
 
     alias = contact.alias
     alias_address: str = contact.alias.email
-    alias_domain = alias_address[alias_address.find("@") + 1 :]
+    alias_domain = get_email_domain_part(alias_address)
 
     # Sanity check: verify alias domain is managed by SimpleLogin
     # scenario: a user have removed a domain but due to a bug, the aliases are still there
@@ -1043,13 +1069,8 @@ def handle_reply(envelope, msg: Message, rcpt_to: str) -> (bool, str):
     user = alias.user
     mail_from = envelope.mail_from
 
-    if user.disabled:
-        LOG.e(
-            "User %s disabled, disable sending emails from %s to %s",
-            user,
-            alias,
-            contact,
-        )
+    if not user.can_send_or_receive():
+        LOG.i(f"User {user} cannot send emails")
         return False, status.E504
 
     # Check if we need to reject or quarantine based on dmarc
@@ -1175,7 +1196,7 @@ def handle_reply(envelope, msg: Message, rcpt_to: str) -> (bool, str):
     )
 
     # replace reverse alias by real address for all contacts
-    for (reply_email, website_email) in contact_query.values(
+    for reply_email, website_email in contact_query.values(
         Contact.reply_email, Contact.website_email
     ):
         msg = replace(msg, reply_email, website_email)
@@ -1230,7 +1251,6 @@ def handle_reply(envelope, msg: Message, rcpt_to: str) -> (bool, str):
     if str(msg[headers.TO]).lower() == "undisclosed-recipients:;":
         # no need to replace TO header
         LOG.d("email is sent in BCC mode")
-        del msg[headers.TO]
     else:
         replace_header_when_reply(msg, alias, headers.TO)
 
@@ -1931,7 +1951,7 @@ def handle_bounce(envelope, email_log: EmailLog, msg: Message) -> str:
         for is_delivered, smtp_status in handle_forward(envelope, msg, alias.email):
             res.append((is_delivered, smtp_status))
 
-        for (is_success, smtp_status) in res:
+        for is_success, smtp_status in res:
             # Consider all deliveries successful if 1 delivery is successful
             if is_success:
                 return smtp_status
@@ -2251,7 +2271,7 @@ def handle(envelope: Envelope, msg: Message) -> str:
     if nb_success > 0 and nb_non_success > 0:
         LOG.e(f"some deliveries fail and some success, {mail_from}, {rcpt_tos}, {res}")
 
-    for (is_success, smtp_status) in res:
+    for is_success, smtp_status in res:
         # Consider all deliveries successful if 1 delivery is successful
         if is_success:
            return smtp_status
@@ -42,14 +42,16 @@ def add_sl_domains():
             LOG.d("%s is already a SL domain", alias_domain)
         else:
             LOG.i("Add %s to SL domain", alias_domain)
-            SLDomain.create(domain=alias_domain)
+            SLDomain.create(domain=alias_domain, use_as_reverse_alias=True)
 
     for premium_domain in PREMIUM_ALIAS_DOMAINS:
         if SLDomain.get_by(domain=premium_domain):
             LOG.d("%s is already a SL domain", premium_domain)
         else:
             LOG.i("Add %s to SL domain", premium_domain)
-            SLDomain.create(domain=premium_domain, premium_only=True)
+            SLDomain.create(
+                domain=premium_domain, premium_only=True, use_as_reverse_alias=True
+            )
 
     Session.commit()
 
|
|||||||
amines
|
amines
|
||||||
amnion
|
amnion
|
||||||
amoeba
|
amoeba
|
||||||
amoral
|
|
||||||
amount
|
amount
|
||||||
amours
|
amours
|
||||||
ampere
|
ampere
|
||||||
@ -215,7 +214,6 @@ animus
|
|||||||
anions
|
anions
|
||||||
ankles
|
ankles
|
||||||
anklet
|
anklet
|
||||||
annals
|
|
||||||
anneal
|
anneal
|
||||||
annoys
|
annoys
|
||||||
annual
|
annual
|
||||||
@ -364,7 +362,6 @@ auntie
|
|||||||
aureus
|
aureus
|
||||||
aurora
|
aurora
|
||||||
author
|
author
|
||||||
autism
|
|
||||||
autumn
|
autumn
|
||||||
avails
|
avails
|
||||||
avatar
|
avatar
|
||||||
@ -638,14 +635,12 @@ bigwig
|
|||||||
bijoux
|
bijoux
|
||||||
bikers
|
bikers
|
||||||
biking
|
biking
|
||||||
bikini
|
|
||||||
bilges
|
bilges
|
||||||
bilked
|
bilked
|
||||||
bilker
|
bilker
|
||||||
billed
|
billed
|
||||||
billet
|
billet
|
||||||
billow
|
billow
|
||||||
bimbos
|
|
||||||
binary
|
binary
|
||||||
binder
|
binder
|
||||||
binged
|
binged
|
||||||
@ -710,8 +705,6 @@ blocks
|
|||||||
blokes
|
blokes
|
||||||
blonde
|
blonde
|
||||||
blonds
|
blonds
|
||||||
bloods
|
|
||||||
bloody
|
|
||||||
blooms
|
blooms
|
||||||
bloops
|
bloops
|
||||||
blotch
|
blotch
|
||||||
@ -817,8 +810,6 @@ bounds
|
|||||||
bounty
|
bounty
|
||||||
bovine
|
bovine
|
||||||
bovver
|
bovver
|
||||||
bowels
|
|
||||||
bowers
|
|
||||||
bowing
|
bowing
|
||||||
bowled
|
bowled
|
||||||
bowleg
|
bowleg
|
||||||
@ -827,10 +818,8 @@ bowman
|
|||||||
bowmen
|
bowmen
|
||||||
bowwow
|
bowwow
|
||||||
boxcar
|
boxcar
|
||||||
boxers
|
|
||||||
boxier
|
boxier
|
||||||
boxing
|
boxing
|
||||||
boyish
|
|
||||||
braced
|
braced
|
||||||
bracer
|
bracer
|
||||||
braces
|
braces
|
||||||
@ -861,7 +850,6 @@ breach
|
|||||||
breads
|
breads
|
||||||
breaks
|
breaks
|
||||||
breams
|
breams
|
||||||
breast
|
|
||||||
breath
|
breath
|
||||||
breech
|
breech
|
||||||
breeds
|
breeds
|
||||||
@ -872,9 +860,6 @@ brevet
|
|||||||
brewed
|
brewed
|
||||||
brewer
|
brewer
|
||||||
briars
|
briars
|
||||||
bribed
|
|
||||||
briber
|
|
||||||
bribes
|
|
||||||
bricks
|
bricks
|
||||||
bridal
|
bridal
|
||||||
brides
|
brides
|
||||||
@ -926,13 +911,7 @@ buffed
|
|||||||
buffer
|
buffer
|
||||||
buffet
|
buffet
|
||||||
bugged
|
bugged
|
||||||
bugger
|
|
||||||
bugled
|
|
||||||
bugler
|
|
||||||
bugles
|
|
||||||
builds
|
builds
|
||||||
bulged
|
|
||||||
bulges
|
|
||||||
bulked
|
bulked
|
||||||
bulled
|
bulled
|
||||||
bullet
|
bullet
|
||||||
@ -1340,8 +1319,6 @@ clingy
|
|||||||
clinic
|
clinic
|
||||||
clinks
|
clinks
|
||||||
clique
|
clique
|
||||||
cloaca
|
|
||||||
cloaks
|
|
||||||
cloche
|
cloche
|
||||||
clocks
|
clocks
|
||||||
clomps
|
clomps
|
||||||
@ -1448,7 +1425,6 @@ comply
|
|||||||
compos
|
compos
|
||||||
conchs
|
conchs
|
||||||
concur
|
concur
|
||||||
condom
|
|
||||||
condor
|
condor
|
||||||
condos
|
condos
|
||||||
coneys
|
coneys
|
||||||
@ -1568,8 +1544,6 @@ cranes
|
|||||||
cranks
|
cranks
|
||||||
cranky
|
cranky
|
||||||
cranny
|
cranny
|
||||||
crapes
|
|
||||||
crappy
|
|
||||||
crated
|
crated
|
||||||
crater
|
crater
|
||||||
crates
|
crates
|
||||||
@ -1585,7 +1559,6 @@ crazes
|
|||||||
creaks
|
creaks
|
||||||
creaky
|
creaky
|
||||||
creams
|
creams
|
||||||
creamy
|
|
||||||
crease
|
crease
|
||||||
create
|
create
|
||||||
creche
|
creche
|
||||||
@ -1594,8 +1567,6 @@ credos
|
|||||||
creeds
|
creeds
|
||||||
creeks
|
creeks
|
||||||
creels
|
creels
|
||||||
creeps
|
|
||||||
creepy
|
|
||||||
cremes
|
cremes
|
||||||
creole
|
creole
|
||||||
crepes
|
crepes
|
||||||
@ -1728,9 +1699,6 @@ dainty
|
|||||||
daises
|
daises
|
||||||
damage
|
damage
|
||||||
damask
|
damask
|
||||||
dammed
|
|
||||||
dammit
|
|
||||||
damned
|
|
||||||
damped
|
damped
|
||||||
dampen
|
dampen
|
||||||
damper
|
damper
|
||||||
@ -1754,7 +1722,6 @@ darers
|
|||||||
daring
|
daring
|
||||||
darken
|
darken
|
||||||
darker
|
darker
|
||||||
darkie
|
|
||||||
darkly
|
darkly
|
||||||
darned
|
darned
|
||||||
darner
|
darner
|
||||||
@ -1763,8 +1730,6 @@ darter
|
|||||||
dashed
|
dashed
|
||||||
dasher
|
dasher
|
||||||
dashes
|
dashes
|
||||||
daters
|
|
||||||
dating
|
|
||||||
dative
|
dative
|
||||||
daubed
|
daubed
|
||||||
dauber
|
dauber
|
||||||
@ -1921,7 +1886,6 @@ dharma
|
|||||||
dhotis
|
dhotis
|
||||||
diadem
|
diadem
|
||||||
dialog
|
dialog
|
||||||
diaper
|
|
||||||
diatom
|
diatom
|
||||||
dibble
|
dibble
|
||||||
dicier
|
dicier
|
||||||
@ -1943,7 +1907,6 @@ digits
|
|||||||
diking
|
diking
|
||||||
diktat
|
diktat
|
||||||
dilate
|
dilate
|
||||||
dildos
|
|
||||||
dilute
|
dilute
|
||||||
dimity
|
dimity
|
||||||
dimmed
|
dimmed
|
||||||
@ -2058,7 +2021,6 @@ dotted
|
|||||||
double
|
double
|
||||||
doubly
|
doubly
|
||||||
doubts
|
doubts
|
||||||
douche
|
|
||||||
doughy
|
doughy
|
||||||
dourer
|
dourer
|
||||||
dourly
|
dourly
|
||||||
@ -2139,15 +2101,6 @@ duenna
|
|||||||
duffed
|
duffed
|
||||||
duffer
|
duffer
|
||||||
dugout
|
dugout
|
||||||
dulcet
|
|
||||||
dulled
|
|
||||||
duller
|
|
||||||
dumber
|
|
||||||
dumbly
|
|
||||||
dumbos
|
|
||||||
dumdum
|
|
||||||
dumped
|
|
||||||
dumper
|
|
||||||
dunces
|
dunces
|
||||||
dunged
|
dunged
|
||||||
dunked
|
dunked
|
||||||
@ -2285,7 +2238,6 @@ endows
|
|||||||
endued
|
endued
|
||||||
endues
|
endues
|
||||||
endure
|
endure
|
||||||
enemas
|
|
||||||
energy
|
energy
|
||||||
enfold
|
enfold
|
||||||
engage
|
engage
|
||||||
@ -2333,7 +2285,6 @@ erects
|
|||||||
ermine
|
ermine
|
||||||
eroded
|
eroded
|
||||||
erodes
|
erodes
|
||||||
erotic
|
|
||||||
errand
|
errand
|
||||||
errant
|
errant
|
||||||
errata
|
errata
|
||||||
@ -2344,7 +2295,6 @@ eructs
|
|||||||
erupts
|
erupts
|
||||||
escape
|
escape
|
||||||
eschew
|
eschew
|
||||||
escort
|
|
||||||
escrow
|
escrow
|
||||||
escudo
|
escudo
|
||||||
espied
|
espied
|
||||||
@ -2363,7 +2313,6 @@ ethnic
|
|||||||
etudes
|
etudes
|
||||||
euchre
|
euchre
|
||||||
eulogy
|
eulogy
|
||||||
eunuch
|
|
||||||
eureka
|
eureka
|
||||||
evaded
|
evaded
|
||||||
evader
|
evader
|
||||||
@ -2392,7 +2341,6 @@ exempt
|
|||||||
exerts
|
exerts
|
||||||
exeunt
|
exeunt
|
||||||
exhale
|
exhale
|
||||||
exhort
|
|
||||||
exhume
|
exhume
|
||||||
exiled
|
exiled
|
||||||
exiles
|
exiles
|
||||||
@ -2415,7 +2363,6 @@ extant
|
|||||||
extend
|
extend
|
||||||
extent
|
extent
|
||||||
extols
|
extols
|
||||||
extort
|
|
||||||
extras
|
extras
|
||||||
exuded
|
exuded
|
||||||
exudes
|
exudes
|
||||||
@ -2440,7 +2387,6 @@ faeces
|
|||||||
faerie
|
faerie
|
||||||
faffed
|
faffed
|
||||||
fagged
|
fagged
|
||||||
faggot
|
|
||||||
failed
|
failed
|
||||||
faille
|
faille
|
||||||
fainer
|
fainer
|
||||||
@ -2473,18 +2419,10 @@ faring
|
|||||||
farmed
|
farmed
|
||||||
farmer
|
farmer
|
||||||
farrow
|
farrow
|
||||||
farted
|
|
||||||
fascia
|
fascia
|
||||||
fasted
|
fasted
|
||||||
fasten
|
fasten
|
||||||
faster
|
faster
|
||||||
father
|
|
||||||
fathom
|
|
||||||
fating
|
|
||||||
fatsos
|
|
||||||
fatten
|
|
||||||
fatter
|
|
||||||
fatwas
|
|
||||||
faucet
|
faucet
|
||||||
faults
|
faults
|
||||||
faulty
|
faulty
|
||||||
@ -2532,7 +2470,6 @@ fesses
|
|||||||
festal
|
festal
|
||||||
fester
|
fester
|
||||||
feting
|
feting
|
||||||
fetish
|
|
||||||
fetter
|
fetter
|
||||||
fettle
|
fettle
|
||||||
feudal
|
feudal
|
||||||
@ -2617,9 +2554,7 @@ flaked
|
|||||||
flakes
|
flakes
|
||||||
flambe
|
flambe
|
||||||
flamed
|
flamed
|
||||||
flamer
|
|
||||||
flames
|
flames
|
||||||
flange
|
|
||||||
flanks
|
flanks
|
||||||
flared
|
flared
|
||||||
flares
|
flares
|
||||||
@ -2754,8 +2689,6 @@ franks
|
|||||||
frappe
|
frappe
|
||||||
frauds
|
frauds
|
||||||
frayed
|
frayed
|
||||||
freaks
|
|
||||||
freaky
|
|
||||||
freely
|
freely
|
||||||
freest
|
freest
|
||||||
freeze
|
freeze
|
||||||
@ -2795,8 +2728,6 @@ fryers
|
|||||||
frying
|
frying
|
||||||
ftpers
|
ftpers
|
||||||
ftping
|
ftping
|
||||||
fucked
|
|
||||||
fucker
|
|
||||||
fuddle
|
fuddle
|
||||||
fudged
|
fudged
|
||||||
fudges
|
fudges
|
||||||
@ -2891,10 +2822,7 @@ gasbag
|
|||||||
gashed
|
gashed
|
||||||
gashes
|
gashes
|
||||||
gasket
|
gasket
|
||||||
gasman
|
|
||||||
gasmen
|
|
||||||
gasped
|
gasped
|
||||||
gassed
|
|
||||||
gasses
|
gasses
|
||||||
gateau
|
gateau
|
||||||
gather
|
gather
|
||||||
@ -3104,7 +3032,6 @@ grimed
|
|||||||
grimes
|
grimes
|
||||||
grimly
|
grimly
|
||||||
grinds
|
grinds
|
||||||
gringo
|
|
||||||
griped
|
griped
|
||||||
griper
|
griper
|
||||||
gripes
|
gripes
|
||||||
@ -3186,8 +3113,6 @@ gypsum
|
|||||||
gyrate
|
gyrate
|
||||||
gyving
|
gyving
|
||||||
habits
|
habits
|
||||||
hacked
|
|
||||||
hacker
|
|
||||||
hackle
|
hackle
|
||||||
hadith
|
hadith
|
||||||
haggis
|
haggis
|
||||||
@ -3195,8 +3120,6 @@ haggle
|
|||||||
hailed
|
hailed
|
||||||
hairdo
|
hairdo
|
||||||
haired
|
haired
|
||||||
hajjes
|
|
||||||
hajjis
|
|
||||||
halest
|
halest
|
||||||
haling
|
haling
|
||||||
halite
|
halite
|
||||||
@ -3223,11 +3146,8 @@ happen
|
|||||||
haptic
|
haptic
|
||||||
harass
|
harass
|
||||||
harden
|
harden
|
||||||
harder
|
|
||||||
hardly
|
|
||||||
harems
|
harems
|
||||||
haring
|
haring
|
||||||
harked
|
|
||||||
harlot
|
harlot
|
||||||
harmed
|
harmed
|
||||||
harped
|
harped
|
||||||
@ -3407,7 +3327,6 @@ hoofed
|
|||||||
hoofer
|
hoofer
|
||||||
hookah
|
hookah
|
||||||
hooked
|
hooked
|
||||||
hooker
|
|
||||||
hookup
|
hookup
|
||||||
hooped
|
hooped
|
||||||
hoopla
|
hoopla
|
||||||
@ -3459,8 +3378,6 @@ huffed
|
|||||||
hugely
|
hugely
|
||||||
hugest
|
hugest
|
||||||
hugged
|
hugged
|
||||||
hulled
|
|
||||||
huller
|
|
||||||
humane
|
humane
|
||||||
humans
|
humans
|
||||||
humble
|
humble
|
||||||
@ -3667,8 +3584,6 @@ jacket
|
|||||||
jading
|
jading
|
||||||
jagged
|
jagged
|
||||||
jaguar
|
jaguar
|
||||||
jailed
|
|
||||||
jailer
|
|
||||||
jalopy
|
jalopy
|
||||||
jammed
|
jammed
|
||||||
jangle
|
jangle
|
||||||
@ -3689,8 +3604,6 @@ jejune
|
|||||||
jelled
|
jelled
|
||||||
jellos
|
jellos
|
||||||
jennet
|
jennet
|
||||||
jerked
|
|
||||||
jerkin
|
|
||||||
jersey
|
jersey
|
||||||
jested
|
jested
|
||||||
jester
|
jester
|
||||||
@ -3814,11 +3727,7 @@ kidded
|
|||||||
kidder
|
kidder
|
||||||
kiddie
|
kiddie
|
||||||
kiddos
|
kiddos
|
||||||
kidnap
|
|
||||||
kidney
|
kidney
|
||||||
killed
|
|
||||||
killer
|
|
||||||
kilned
|
|
||||||
kilted
|
kilted
|
||||||
kilter
|
kilter
|
||||||
kimono
|
kimono
|
||||||
@ -3827,15 +3736,11 @@ kinder
|
|||||||
kindle
|
kindle
|
||||||
kindly
|
kindly
|
||||||
kingly
|
kingly
|
||||||
kinked
|
|
||||||
kiosks
|
kiosks
|
||||||
kipped
|
kipped
|
||||||
kipper
|
kipper
|
||||||
kirsch
|
kirsch
|
||||||
kismet
|
kismet
|
||||||
kissed
|
|
||||||
kisser
|
|
||||||
kisses
|
|
||||||
kiting
|
kiting
|
||||||
kitsch
|
kitsch
|
||||||
kitted
|
kitted
|
||||||
@ -3847,10 +3752,6 @@ kluges
|
|||||||
klutzy
|
klutzy
|
||||||
knacks
|
knacks
|
||||||
knaves
|
knaves
|
||||||
kneads
|
|
||||||
kneels
|
|
||||||
knells
|
|
||||||
knifed
|
|
||||||
knifes
|
knifes
|
||||||
knight
|
knight
|
||||||
knives
|
knives
|
||||||
@ -4210,8 +4111,6 @@ lunges
|
|||||||
lupine
|
lupine
|
||||||
lupins
|
lupins
|
||||||
luring
|
luring
|
||||||
lurked
|
|
||||||
lurker
|
|
||||||
lusher
|
lusher
|
||||||
lushes
|
lushes
|
||||||
lushly
|
lushly
|
||||||
@ -4608,7 +4507,6 @@ muggle
|
|||||||
mukluk
|
mukluk
|
||||||
mulcts
|
mulcts
|
||||||
mulish
|
mulish
|
||||||
mullah
|
|
||||||
mulled
|
mulled
|
||||||
mullet
|
mullet
|
||||||
mumble
|
mumble
|
||||||
@ -4721,9 +4619,6 @@ nickel
|
|||||||
nicker
|
nicker
|
||||||
nickle
|
nickle
|
||||||
nieces
|
nieces
|
||||||
niggas
|
|
||||||
niggaz
|
|
||||||
nigger
|
|
||||||
niggle
|
niggle
|
||||||
nigher
|
nigher
|
||||||
nights
|
nights
|
||||||
@ -4736,7 +4631,6 @@ ninjas
|
|||||||
ninths
|
ninths
|
||||||
nipped
|
nipped
|
||||||
nipper
|
nipper
|
||||||
nipple
|
|
||||||
nitric
|
nitric
|
||||||
nitwit
|
nitwit
|
||||||
nixing
|
nixing
|
||||||
@ -4781,15 +4675,6 @@ nozzle
|
|||||||
nuance
|
nuance
|
||||||
nubbin
|
nubbin
|
||||||
nubile
|
nubile
|
||||||
nuclei
|
|
||||||
nudest
|
|
||||||
nudged
|
|
||||||
nudges
|
|
||||||
nudism
|
|
||||||
nudist
|
|
||||||
nudity
|
|
||||||
nugget
|
|
||||||
nuking
|
|
||||||
numbed
|
numbed
|
||||||
number
|
number
|
||||||
numbly
|
numbly
|
||||||
@ -4804,7 +4689,6 @@ nutter
|
|||||||
nuzzle
|
nuzzle
|
||||||
nybble
|
nybble
|
||||||
nylons
|
nylons
|
||||||
nympho
|
|
||||||
nymphs
|
nymphs
|
||||||
oafish
|
oafish
|
||||||
oaring
|
oaring
|
||||||
@ -4885,7 +4769,6 @@ opting
|
|||||||
option
|
option
|
||||||
opuses
|
opuses
|
||||||
oracle
|
oracle
|
||||||
orally
|
|
||||||
orange
|
orange
|
||||||
orated
|
orated
|
||||||
orates
|
orates
|
||||||
@ -4897,7 +4780,6 @@ ordeal
|
|||||||
orders
|
orders
|
||||||
ordure
|
ordure
|
||||||
organs
|
organs
|
||||||
orgasm
|
|
||||||
orgies
|
orgies
|
||||||
oriels
|
oriels
|
||||||
orient
|
orient
|
||||||
@ -4993,10 +4875,6 @@ pander
|
|||||||
panels
|
panels
|
||||||
panics
|
panics
|
||||||
panned
|
panned
|
||||||
panted
|
|
||||||
pantie
|
|
||||||
pantos
|
|
||||||
pantry
|
|
||||||
papacy
|
papacy
|
||||||
papaya
|
papaya
|
||||||
papers
|
papers
|
||||||
@ -5078,7 +4956,6 @@ pebble
|
|||||||
pebbly
|
pebbly
|
||||||
pecans
|
pecans
|
||||||
pecked
|
pecked
|
||||||
pecker
|
|
||||||
pectic
|
pectic
|
||||||
pectin
|
pectin
|
||||||
pedalo
|
pedalo
|
||||||
@ -5151,9 +5028,6 @@ phenom
|
|||||||
phials
|
phials
|
||||||
phlegm
|
phlegm
|
||||||
phloem
|
phloem
|
||||||
phobia
|
|
||||||
phobic
|
|
||||||
phoebe
|
|
||||||
phoned
|
phoned
|
||||||
phones
|
phones
|
||||||
phoney
|
phoney
|
||||||
@ -5228,9 +5102,6 @@ piques
|
|||||||
piracy
|
piracy
|
||||||
pirate
|
pirate
|
||||||
pirogi
|
pirogi
|
||||||
pissed
|
|
||||||
pisser
|
|
||||||
pisses
|
|
||||||
pistes
|
pistes
|
||||||
pistil
|
pistil
|
||||||
pistol
|
pistol
|
||||||
@ -5311,8 +5182,6 @@ pogrom
|
|||||||
points
|
points
|
||||||
pointy
|
pointy
|
||||||
poised
|
poised
|
||||||
poises
|
|
||||||
poison
|
|
||||||
pokers
|
pokers
|
||||||
pokeys
|
pokeys
|
||||||
pokier
|
pokier
|
||||||
@ -5422,7 +5291,6 @@ preyed
|
|||||||
priced
|
priced
|
||||||
prices
|
prices
|
||||||
pricey
|
pricey
|
||||||
pricks
|
|
||||||
prided
|
prided
|
||||||
prides
|
prides
|
||||||
priers
|
priers
|
||||||
@ -5602,14 +5470,9 @@ rabbit
|
|||||||
rabble
|
rabble
|
||||||
rabies
|
rabies
|
||||||
raceme
|
raceme
|
||||||
racers
|
|
||||||
racial
|
|
||||||
racier
|
racier
|
||||||
racily
|
racily
|
||||||
racing
|
racing
|
||||||
racism
|
|
||||||
racist
|
|
||||||
racked
|
|
||||||
racket
|
racket
|
||||||
radars
|
radars
|
||||||
radial
|
radial
|
||||||
@ -5661,8 +5524,6 @@ rapers
|
|||||||
rapids
|
rapids
|
||||||
rapier
|
rapier
|
||||||
rapine
|
rapine
|
||||||
raping
|
|
||||||
rapist
|
|
||||||
rapped
|
rapped
|
||||||
rappel
|
rappel
|
||||||
rapper
|
rapper
|
||||||
@ -5747,7 +5608,6 @@ recoup
|
|||||||
rectal
|
rectal
|
||||||
rector
|
rector
|
||||||
rectos
|
rectos
|
||||||
rectum
|
|
||||||
recurs
|
recurs
|
||||||
recuse
|
recuse
|
||||||
redact
|
redact
|
||||||
@ -5891,7 +5751,6 @@ resume
|
|||||||
retail
|
retail
|
||||||
retain
|
retain
|
||||||
retake
|
retake
|
||||||
retard
|
|
||||||
retell
|
retell
|
||||||
retest
|
retest
|
||||||
retied
|
retied
|
||||||
@ -6125,8 +5984,6 @@ sadden
|
|||||||
sadder
|
sadder
|
||||||
saddle
|
saddle
|
||||||
sadhus
|
sadhus
|
||||||
sadism
|
|
||||||
sadist
|
|
||||||
safari
|
safari
|
||||||
safely
|
safely
|
||||||
safest
|
safest
|
||||||
@ -6364,16 +6221,6 @@ severs
|
|||||||
sewage
|
sewage
|
||||||
sewers
|
sewers
|
||||||
sewing
|
sewing
|
||||||
sexier
|
|
||||||
sexily
|
|
||||||
sexing
|
|
||||||
sexism
|
|
||||||
sexist
|
|
||||||
sexpot
|
|
||||||
sextet
|
|
||||||
sexton
|
|
||||||
sexual
|
|
||||||
shabby
|
|
||||||
shacks
|
shacks
|
||||||
shaded
|
shaded
|
||||||
shades
|
shades
|
||||||
@ -6383,10 +6230,7 @@ shaggy
|
|||||||
shaken
|
shaken
|
||||||
shaker
|
shaker
|
||||||
shakes
|
shakes
|
||||||
shalom
|
|
||||||
shaman
|
shaman
|
||||||
shamed
|
|
||||||
shames
|
|
||||||
shandy
|
shandy
|
||||||
shanks
|
shanks
|
||||||
shanty
|
shanty
|
||||||
@ -6432,7 +6276,6 @@ shirks
|
|||||||
shirrs
|
shirrs
|
||||||
shirts
|
shirts
|
||||||
shirty
|
shirty
|
||||||
shitty
|
|
||||||
shiver
|
shiver
|
||||||
shoals
|
shoals
|
||||||
shoats
|
shoats
|
||||||
@ -6575,9 +6418,6 @@ slangy
|
|||||||
slants
|
slants
|
||||||
slated
|
slated
|
||||||
slates
|
slates
|
||||||
slaved
|
|
||||||
slaver
|
|
||||||
slaves
|
|
||||||
slayed
|
slayed
|
||||||
slayer
|
slayer
|
||||||
sleaze
|
sleaze
|
||||||
@ -6672,7 +6512,6 @@ snarks
|
|||||||
snarky
|
snarky
|
||||||
snarls
|
snarls
|
||||||
snarly
|
snarly
|
||||||
snatch
|
|
||||||
snazzy
|
snazzy
|
||||||
sneaks
|
sneaks
|
||||||
sneaky
|
sneaky
|
||||||
@ -6716,7 +6555,6 @@ socket
|
|||||||
sodded
|
sodded
|
||||||
sodden
|
sodden
|
||||||
sodium
|
sodium
|
||||||
sodomy
|
|
||||||
soever
|
soever
|
||||||
soften
|
soften
|
||||||
softer
|
softer
|
||||||
@ -7468,7 +7306,6 @@ torrid
|
|||||||
torsos
|
torsos
|
||||||
tortes
|
tortes
|
||||||
tossed
|
tossed
|
||||||
tosser
|
|
||||||
tosses
|
tosses
|
||||||
tossup
|
tossup
|
||||||
totals
|
totals
|
||||||
@ -7686,7 +7523,6 @@ unhook
|
|||||||
unhurt
|
unhurt
|
||||||
unions
|
unions
|
||||||
unique
|
unique
|
||||||
unisex
|
|
||||||
unison
|
unison
|
||||||
united
|
united
|
||||||
unites
|
unites
|
||||||
@ -7793,7 +7629,6 @@ vacant
|
|||||||
vacate
|
vacate
|
||||||
vacuum
|
vacuum
|
||||||
vagary
|
vagary
|
||||||
vagina
|
|
||||||
vaguer
|
vaguer
|
||||||
vainer
|
vainer
|
||||||
vainly
|
vainly
|
||||||
@ -7930,9 +7765,6 @@ votive
|
|||||||
vowels
|
vowels
|
||||||
vowing
|
vowing
|
||||||
voyage
|
voyage
|
||||||
voyeur
|
|
||||||
vulgar
|
|
||||||
vulvae
|
|
||||||
wabbit
|
wabbit
|
||||||
wacker
|
wacker
|
||||||
wackos
|
wackos
|
||||||
@ -7975,7 +7807,6 @@ wander
|
|||||||
wangle
|
wangle
|
||||||
waning
|
waning
|
||||||
wanked
|
wanked
|
||||||
wanker
|
|
||||||
wanner
|
wanner
|
||||||
wanted
|
wanted
|
||||||
wanton
|
wanton
|
||||||
|
@ -89,7 +89,6 @@ aghast
|
|||||||
agile
|
agile
|
||||||
agility
|
agility
|
||||||
aging
|
aging
|
||||||
agnostic
|
|
||||||
agonize
|
agonize
|
||||||
agonizing
|
agonizing
|
||||||
agony
|
agony
|
@@ -375,8 +374,6 @@ augmented
 august
 authentic
 author
-autism
-autistic
 autograph
 automaker
 automated
@@ -446,7 +443,6 @@ backyard
 bacon
 bacteria
 bacterium
-badass
 badge
 badland
 badly
@@ -1106,7 +1102,6 @@ clinic
 clinking
 clip
 clique
-cloak
 clobber
 clock
 clone
@@ -1776,7 +1771,6 @@ diagnosis
 diagram
 dial
 diameter
-diaper
 diaphragm
 diary
 dice
@@ -1950,7 +1944,6 @@ dosage
 dose
 dotted
 doubling
-douche
 dove
 down
 dowry
@@ -2032,9 +2025,6 @@ duffel
 dugout
 duh
 duke
-duller
-dullness
-duly
 dumping
 dumpling
 dumpster
@@ -2527,8 +2517,6 @@ feisty
 feline
 felt-tip
 feminine
-feminism
-feminist
 feminize
 femur
 fence
@@ -2667,7 +2655,6 @@ fondness
 fondue
 font
 food
-fool
 footage
 football
 footbath
@@ -2777,7 +2764,6 @@ gag
 gainfully
 gaining
 gains
-gala
 gallantly
 galleria
 gallery
@@ -3028,7 +3014,6 @@ groom
 groove
 grooving
 groovy
-grope
 ground
 grouped
 grout
@@ -3148,7 +3133,6 @@ happiness
 happy
 harbor
 hardcopy
-hardcore
 hardcover
 harddisk
 hardened
@@ -3164,8 +3148,6 @@ hardware
 hardwired
 hardwood
 hardy
-harmful
-harmless
 harmonica
 harmonics
 harmonize
@@ -3340,7 +3322,6 @@ identical
 identify
 identity
 ideology
-idiocy
 idiom
 idly
 igloo
@@ -3357,7 +3338,6 @@ imaging
 imbecile
 imitate
 imitation
-immature
 immerse
 immersion
 imminent
@@ -3387,14 +3367,10 @@ implode
 implosion
 implosive
 imply
-impolite
 important
 importer
 impose
 imposing
-impotence
-impotency
-impotent
 impound
 imprecise
 imprint
@@ -3424,8 +3400,6 @@ irritable
 irritably
 irritant
 irritate
-islamic
-islamist
 isolated
 isolating
 isolation
@@ -3524,7 +3498,6 @@ june
 junior
 juniper
 junkie
-junkman
 junkyard
 jurist
 juror
@@ -3570,9 +3543,6 @@ king
 kinship
 kinsman
 kinswoman
-kissable
-kisser
-kissing
 kitchen
 kite
 kitten
@@ -3649,7 +3619,6 @@ laundry
 laurel
 lavender
 lavish
-laxative
 lazily
 laziness
 lazy
@@ -3690,7 +3659,6 @@ liable
 liberty
 librarian
 library
-licking
 licorice
 lid
 life
@@ -3741,8 +3709,6 @@ livestock
 lividly
 living
 lizard
-lubricant
-lubricate
 lucid
 luckily
 luckiness
@@ -3878,7 +3844,6 @@ marshland
 marshy
 marsupial
 marvelous
-marxism
 mascot
 masculine
 mashed
@@ -3914,8 +3879,6 @@ maximum
 maybe
 mayday
 mayflower
-moaner
-moaning
 mobile
 mobility
 mobilize
@@ -4124,7 +4087,6 @@ nemeses
 nemesis
 neon
 nephew
-nerd
 nervous
 nervy
 nest
@@ -4139,7 +4101,6 @@ never
 next
 nibble
 nickname
-nicotine
 niece
 nifty
 nimble
@@ -4167,14 +4128,10 @@ nuptials
 nursery
 nursing
 nurture
-nutcase
 nutlike
 nutmeg
 nutrient
 nutshell
-nuttiness
-nutty
-nuzzle
 nylon
 oaf
 oak
@@ -4205,7 +4162,6 @@ obstinate
 obstruct
 obtain
 obtrusive
-obtuse
 obvious
 occultist
 occupancy
@@ -4446,7 +4402,6 @@ palpitate
 paltry
 pampered
 pamperer
-pampers
 pamphlet
 panama
 pancake
@@ -4651,7 +4606,6 @@ plated
 platform
 plating
 platinum
-platonic
 platter
 platypus
 plausible
@@ -4777,8 +4731,6 @@ prancing
 pranker
 prankish
 prankster
-prayer
-praying
 preacher
 preaching
 preachy
@@ -4796,8 +4748,6 @@ prefix
 preflight
 preformed
 pregame
-pregnancy
-pregnant
 preheated
 prelaunch
 prelaw
@@ -4937,7 +4887,6 @@ prudishly
 prune
 pruning
 pry
-psychic
 public
 publisher
 pucker
@@ -4957,8 +4906,7 @@ punctual
 punctuate
 punctured
 pungent
-punisher
-punk
+punishe
 pupil
 puppet
 puppy
@@ -5040,7 +4988,6 @@ quote
 rabid
 race
 racing
-racism
 rack
 racoon
 radar
@@ -5155,7 +5102,6 @@ recount
 recoup
 recovery
 recreate
-rectal
 rectangle
 rectified
 rectify
@@ -5622,7 +5568,6 @@ sarcastic
 sardine
 sash
 sasquatch
-sassy
 satchel
 satiable
 satin
@@ -5651,7 +5596,6 @@ scaling
 scallion
 scallop
 scalping
-scam
 scandal
 scanner
 scanning
@@ -5928,8 +5872,6 @@ silent
 silica
 silicon
 silk
-silliness
-silly
 silo
 silt
 silver
@@ -5991,7 +5933,6 @@ skimmer
 skimming
 skimpily
 skincare
-skinhead
 skinless
 skinning
 skinny
@@ -6197,7 +6138,6 @@ splinter
 splotchy
 splurge
 spoilage
-spoiled
 spoiler
 spoiling
 spoils
@@ -6610,7 +6550,6 @@ swimmer
 swimming
 swimsuit
 swimwear
-swinger
 swinging
 swipe
 swirl
@@ -7079,7 +7018,6 @@ undocked
 undoing
 undone
 undrafted
-undress
 undrilled
 undusted
 undying
File diff suppressed because it is too large
app/migrations/versions/2023_040318_5f4a5625da66_.py (new file, 31 lines)

"""empty message

Revision ID: 5f4a5625da66
Revises: 2c2093c82bc0
Create Date: 2023-04-03 18:30:46.488231

"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = '5f4a5625da66'
down_revision = '2c2093c82bc0'
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.add_column('public_domain', sa.Column('partner_id', sa.Integer(), nullable=True))
    op.create_foreign_key(None, 'public_domain', 'partner', ['partner_id'], ['id'], ondelete='cascade')
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_constraint(None, 'public_domain', type_='foreignkey')
    op.drop_column('public_domain', 'partner_id')
    # ### end Alembic commands ###
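These are standard auto-generated Alembic revision scripts: upgrade() applies the schema change, downgrade() reverts it. As a hedged aside (not taken from this repo's own tooling), with a conventional alembic.ini/env.py setup they can be applied programmatically roughly like this:

# Minimal sketch, assuming a working Alembic environment (alembic.ini + env.py)
# and a reachable database; SimpleLogin's own wrapper scripts may differ.
from alembic.config import main as alembic_main

# Equivalent to running "alembic upgrade head" from the project root.
alembic_main(argv=["upgrade", "head"])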
app/migrations/versions/2023_041418_893c0d18475f_.py (new file, 29 lines)

"""empty message

Revision ID: 893c0d18475f
Revises: 5f4a5625da66
Create Date: 2023-04-14 18:20:03.807367

"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = '893c0d18475f'
down_revision = '5f4a5625da66'
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_index(op.f('ix_contact_pgp_finger_print'), 'contact', ['pgp_finger_print'], unique=False)
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_index(op.f('ix_contact_pgp_finger_print'), table_name='contact')
    # ### end Alembic commands ###
app/migrations/versions/2023_041419_bc496c0a0279_.py (new file, 35 lines)

"""empty message

Revision ID: bc496c0a0279
Revises: 893c0d18475f
Create Date: 2023-04-14 19:09:38.540514

"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = 'bc496c0a0279'
down_revision = '893c0d18475f'
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_index(op.f('ix_alias_used_on_alias_id'), 'alias_used_on', ['alias_id'], unique=False)
    op.create_index(op.f('ix_client_user_alias_id'), 'client_user', ['alias_id'], unique=False)
    op.create_index(op.f('ix_hibp_notified_alias_alias_id'), 'hibp_notified_alias', ['alias_id'], unique=False)
    op.create_index(op.f('ix_users_newsletter_alias_id'), 'users', ['newsletter_alias_id'], unique=False)
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_index(op.f('ix_users_newsletter_alias_id'), table_name='users')
    op.drop_index(op.f('ix_hibp_notified_alias_alias_id'), table_name='hibp_notified_alias')
    op.drop_index(op.f('ix_client_user_alias_id'), table_name='client_user')
    op.drop_index(op.f('ix_alias_used_on_alias_id'), table_name='alias_used_on')
    # ### end Alembic commands ###
app/migrations/versions/2023_041520_2d89315ac650_.py (new file, 29 lines)

"""empty message

Revision ID: 2d89315ac650
Revises: bc496c0a0279
Create Date: 2023-04-15 20:43:44.218020

"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = '2d89315ac650'
down_revision = 'bc496c0a0279'
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_index(op.f('ix_partner_subscription_end_at'), 'partner_subscription', ['end_at'], unique=False)
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_index(op.f('ix_partner_subscription_end_at'), table_name='partner_subscription')
    # ### end Alembic commands ###
app/migrations/versions/2023_041916_01e2997e90d3_.py (new file, 29 lines)

"""empty message

Revision ID: 01e2997e90d3
Revises: 893c0d18475f
Create Date: 2023-04-19 16:09:11.851588

"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = '01e2997e90d3'
down_revision = '893c0d18475f'
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.add_column('public_domain', sa.Column('use_as_reverse_alias', sa.Boolean(), server_default='0', nullable=False))
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_column('public_domain', 'use_as_reverse_alias')
    # ### end Alembic commands ###
app/migrations/versions/2023_042011_2634b41f54db_.py (new file, 25 lines)

"""empty message

Revision ID: 2634b41f54db
Revises: 01e2997e90d3, 2d89315ac650
Create Date: 2023-04-20 11:47:43.048536

"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = '2634b41f54db'
down_revision = ('01e2997e90d3', '2d89315ac650')
branch_labels = None
depends_on = None


def upgrade():
    pass


def downgrade():
    pass
app/migrations/versions/2023_072819_01827104004b_.py (new file, 42 lines)

"""empty message

Revision ID: 01827104004b
Revises: 2634b41f54db
Create Date: 2023-07-28 19:39:28.675490

"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = '01827104004b'
down_revision = '2634b41f54db'
branch_labels = None
depends_on = None


def upgrade():
    with op.get_context().autocommit_block():
        # ### commands auto generated by Alembic - please adjust! ###
        op.create_index(op.f('ix_alias_hibp_last_check'), 'alias', ['hibp_last_check'], unique=False, postgresql_concurrently=True)
        op.create_index('ix_bounce_created_at', 'bounce', ['created_at'], unique=False, postgresql_concurrently=True)
        op.create_index('ix_monitoring_created_at', 'monitoring', ['created_at'], unique=False, postgresql_concurrently=True)
        op.create_index('ix_transactional_email_created_at', 'transactional_email', ['created_at'], unique=False, postgresql_concurrently=True)
        op.create_index(op.f('ix_users_activated'), 'users', ['activated'], unique=False, postgresql_concurrently=True)
        op.create_index('ix_users_activated_trial_end_lifetime', 'users', ['activated', 'trial_end', 'lifetime'], unique=False, postgresql_concurrently=True)
        op.create_index(op.f('ix_users_referral_id'), 'users', ['referral_id'], unique=False, postgresql_concurrently=True)
        # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_index(op.f('ix_users_referral_id'), table_name='users')
    op.drop_index('ix_users_activated_trial_end_lifetime', table_name='users')
    op.drop_index(op.f('ix_users_activated'), table_name='users')
    op.drop_index('ix_transactional_email_created_at', table_name='transactional_email')
    op.drop_index('ix_monitoring_created_at', table_name='monitoring')
    op.drop_index('ix_bounce_created_at', table_name='bounce')
    op.drop_index(op.f('ix_alias_hibp_last_check'), table_name='alias')
    # ### end Alembic commands ###
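One note on the pattern in this revision, since it differs from the earlier ones: PostgreSQL refuses to run CREATE INDEX ... CONCURRENTLY inside a transaction, and Alembic normally wraps each migration in one, so the index creation is placed inside op.get_context().autocommit_block(). Below is a minimal stand-alone sketch of that pattern with a placeholder table and index name (not part of this changeset):

from alembic import op


def upgrade():
    # autocommit_block() temporarily leaves Alembic's migration transaction so
    # that CREATE INDEX CONCURRENTLY is allowed to run (placeholder names).
    with op.get_context().autocommit_block():
        op.create_index(
            "ix_example_created_at",
            "example",
            ["created_at"],
            unique=False,
            postgresql_concurrently=True,
        )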
app/migrations/versions/2023_090715_0a5701a4f5e4_.py (new file, 33 lines)

"""empty message

Revision ID: 0a5701a4f5e4
Revises: 01827104004b
Create Date: 2023-09-07 15:28:10.122756

"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = '0a5701a4f5e4'
down_revision = '01827104004b'
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.add_column('users', sa.Column('delete_on', sqlalchemy_utils.types.arrow.ArrowType(), nullable=True))
    with op.get_context().autocommit_block():
        op.create_index('ix_users_delete_on', 'users', ['delete_on'], unique=False, postgresql_concurrently=True)
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    with op.get_context().autocommit_block():
        op.drop_index('ix_users_delete_on', table_name='users', postgresql_concurrently=True)
    op.drop_column('users', 'delete_on')
    # ### end Alembic commands ###
app/migrations/versions/2023_092818_ec7fdde8da9f_.py (new file, 34 lines)

"""empty message

Revision ID: ec7fdde8da9f
Revises: 0a5701a4f5e4
Create Date: 2023-09-28 18:09:48.016620

"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = "ec7fdde8da9f"
down_revision = "0a5701a4f5e4"
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    with op.get_context().autocommit_block():
        op.create_index(
            "ix_email_log_created_at", "email_log", ["created_at"], unique=False
        )

    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    with op.get_context().autocommit_block():
        op.drop_index("ix_email_log_created_at", table_name="email_log")
    # ### end Alembic commands ###
app/migrations/versions/2023_100510_46ecb648a47e_.py (new file, 39 lines)

"""empty message

Revision ID: 46ecb648a47e
Revises: ec7fdde8da9f
Create Date: 2023-10-05 10:43:35.668902

"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = "46ecb648a47e"
down_revision = "ec7fdde8da9f"
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    with op.get_context().autocommit_block():
        op.create_index(
            op.f("ix_message_id_matching_email_log_id"),
            "message_id_matching",
            ["email_log_id"],
            unique=False,
        )
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    with op.get_context().autocommit_block():
        op.drop_index(
            op.f("ix_message_id_matching_email_log_id"),
            table_name="message_id_matching",
        )
    # ### end Alembic commands ###
app/migrations/versions/2023_110714_4bc54632d9aa_.py (new file, 31 lines)

"""empty message

Revision ID: 4bc54632d9aa
Revises: 46ecb648a47e
Create Date: 2023-11-07 14:02:17.610226

"""
import sqlalchemy_utils
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = '4bc54632d9aa'
down_revision = '46ecb648a47e'
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_index('ix_newsletter_subject', table_name='newsletter')
    op.create_index(op.f('ix_newsletter_subject'), 'newsletter', ['subject'], unique=False)
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_index(op.f('ix_newsletter_subject'), table_name='newsletter')
    op.create_index('ix_newsletter_subject', 'newsletter', ['subject'], unique=True)
    # ### end Alembic commands ###
app/monitor/__init__.py (new file, empty)
app/monitor/metric.py (new file, 21 lines)

from dataclasses import dataclass
from typing import List


@dataclass
class UpcloudRecord:
    db_role: str
    label: str
    time: str
    value: float


@dataclass
class UpcloudMetric:
    metric_name: str
    records: List[UpcloudRecord]


@dataclass
class UpcloudMetrics:
    metrics: List[UpcloudMetric]
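For orientation, a minimal sketch of how these dataclasses nest; the host label, timestamp and value below are invented:

from monitor.metric import UpcloudMetric, UpcloudMetrics, UpcloudRecord

# One UpcloudRecord per host sample, grouped per metric, wrapped in UpcloudMetrics.
sample = UpcloudMetrics(
    metrics=[
        UpcloudMetric(
            metric_name="cpu_usage",
            records=[
                UpcloudRecord(
                    db_role="master",
                    label="db-1 (master)",
                    time="2023-01-01T00:00:00Z",
                    value=12.5,
                )
            ],
        )
    ]
)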
app/monitor/metric_exporter.py (new file, 20 lines)

from app.config import UPCLOUD_DB_ID, UPCLOUD_PASSWORD, UPCLOUD_USERNAME
from app.log import LOG
from monitor.newrelic import NewRelicClient
from monitor.upcloud import UpcloudClient


class MetricExporter:
    def __init__(self, newrelic_license: str):
        self.__upcloud = UpcloudClient(
            username=UPCLOUD_USERNAME, password=UPCLOUD_PASSWORD
        )
        self.__newrelic = NewRelicClient(newrelic_license)

    def run(self):
        try:
            metrics = self.__upcloud.get_metrics(UPCLOUD_DB_ID)
            self.__newrelic.send(metrics)
            LOG.info("Upcloud metrics sent to NewRelic")
        except Exception as e:
            LOG.warn(f"Could not export metrics: {e}")
app/monitor/newrelic.py (new file, 26 lines)

from monitor.metric import UpcloudMetrics

from newrelic_telemetry_sdk import GaugeMetric, MetricClient

_NEWRELIC_BASE_HOST = "metric-api.eu.newrelic.com"


class NewRelicClient:
    def __init__(self, license_key: str):
        self.__client = MetricClient(license_key=license_key, host=_NEWRELIC_BASE_HOST)

    def send(self, metrics: UpcloudMetrics):
        batch = []

        for metric in metrics.metrics:
            for record in metric.records:
                batch.append(
                    GaugeMetric(
                        name=f"upcloud.db.{metric.metric_name}",
                        value=record.value,
                        tags={"host": record.label, "db_role": record.db_role},
                    )
                )

        response = self.__client.send_batch(batch)
        response.raise_for_status()
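Wiring this client up is a one-liner; a hedged usage sketch, where the license key is a placeholder and `sample` stands for an UpcloudMetrics value like the one sketched after metric.py above:

from monitor.newrelic import NewRelicClient

client = NewRelicClient(license_key="NRAK-EXAMPLE")
client.send(sample)
# Each record goes out as a gauge named "upcloud.db.<metric_name>",
# tagged with the host label and its db_role, via the EU metric API host.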
app/monitor/upcloud.py (new file, 82 lines)

from app.log import LOG
from monitor.metric import UpcloudMetric, UpcloudMetrics, UpcloudRecord

import base64
import requests
from typing import Any


BASE_URL = "https://api.upcloud.com"


def get_metric(json: Any, metric: str) -> UpcloudMetric:
    records = []

    if metric in json:
        metric_data = json[metric]
        data = metric_data["data"]
        cols = list(map(lambda x: x["label"], data["cols"][1:]))
        latest = data["rows"][-1]
        time = latest[0]
        for column_idx in range(len(cols)):
            value = latest[1 + column_idx]

            # If the latest value is None, try to fetch the second to last
            if value is None:
                value = data["rows"][-2][1 + column_idx]

            if value is not None:
                label = cols[column_idx]
                if "(master)" in label:
                    db_role = "master"
                else:
                    db_role = "standby"
                records.append(
                    UpcloudRecord(time=time, db_role=db_role, label=label, value=value)
                )
            else:
                LOG.warn(f"Could not get value for metric {metric}")

    return UpcloudMetric(metric_name=metric, records=records)


def get_metrics(json: Any) -> UpcloudMetrics:
    return UpcloudMetrics(
        metrics=[
            get_metric(json, "cpu_usage"),
            get_metric(json, "disk_usage"),
            get_metric(json, "diskio_reads"),
            get_metric(json, "diskio_writes"),
            get_metric(json, "load_average"),
            get_metric(json, "mem_usage"),
            get_metric(json, "net_receive"),
            get_metric(json, "net_send"),
        ]
    )


class UpcloudClient:
    def __init__(self, username: str, password: str):
        if not username:
            raise Exception("UpcloudClient username must be set")
        if not password:
            raise Exception("UpcloudClient password must be set")

        client = requests.Session()
        encoded_auth = base64.b64encode(
            f"{username}:{password}".encode("utf-8")
        ).decode("utf-8")
        client.headers = {"Authorization": f"Basic {encoded_auth}"}
        self.__client = client

    def get_metrics(self, db_uuid: str) -> UpcloudMetrics:
        url = f"{BASE_URL}/1.3/database/{db_uuid}/metrics?period=hour"
        LOG.d(f"Performing request to {url}")
        response = self.__client.get(url)
        LOG.d(f"Status code: {response.status_code}")
        if response.status_code != 200:
            return UpcloudMetrics(metrics=[])

        as_json = response.json()

        return get_metrics(as_json)
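get_metric() only depends on the cols/rows layout it indexes into, so the payload shape it expects can be read off the code; the sketch below uses invented values and is not taken from the UpCloud API documentation:

from monitor.upcloud import get_metric

# First column is the timestamp, remaining columns are one series per host label.
fake_payload = {
    "cpu_usage": {
        "data": {
            "cols": [
                {"label": "time"},
                {"label": "db-1 (master)"},
                {"label": "db-2 (standby)"},
            ],
            "rows": [
                ["2023-01-01T00:00:00Z", 2.61, 1.15],
                ["2023-01-01T00:00:30Z", 3.07, 1.22],
            ],
        }
    }
}

metric = get_metric(fake_payload, "cpu_usage")
# -> UpcloudMetric(metric_name="cpu_usage", records=[<master record>, <standby record>])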
@@ -1,3 +1,4 @@
+import configparser
 import os
 import subprocess
 from time import sleep
@@ -7,6 +8,7 @@ import newrelic.agent

 from app.db import Session
 from app.log import LOG
+from monitor.metric_exporter import MetricExporter

 # the number of consecutive fails
 # if more than _max_nb_fails, alert
@@ -19,6 +21,18 @@ _max_nb_fails = 10
 # the maximum number of emails in incoming & active queue
 _max_incoming = 50

+_NR_CONFIG_FILE_LOCATION_VAR = "NEW_RELIC_CONFIG_FILE"
+
+
+def get_newrelic_license() -> str:
+    nr_file = os.environ.get(_NR_CONFIG_FILE_LOCATION_VAR, None)
+    if nr_file is None:
+        raise Exception(f"{_NR_CONFIG_FILE_LOCATION_VAR} not defined")
+
+    config = configparser.ConfigParser()
+    config.read(nr_file)
+    return config["newrelic"]["license_key"]
+

 @newrelic.agent.background_task()
 def log_postfix_metrics():
@@ -80,10 +94,13 @@ def log_nb_db_connection()


 if __name__ == "__main__":
+    exporter = MetricExporter(get_newrelic_license())
     while True:
         log_postfix_metrics()
         log_nb_db_connection()
         Session.close()

+        exporter.run()
+
         # 1 min
         sleep(60)
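get_newrelic_license() simply re-reads the New Relic agent's own INI file, located via NEW_RELIC_CONFIG_FILE, and pulls license_key out of its [newrelic] section. A self-contained sketch of that lookup against a throwaway file (the path and key value are placeholders):

import configparser
import os
import tempfile

# Write a minimal config of the shape the function expects: a [newrelic]
# section containing license_key, as in a standard newrelic.ini.
with tempfile.NamedTemporaryFile("w", suffix=".ini", delete=False) as f:
    f.write("[newrelic]\nlicense_key = NRAK-EXAMPLE\n")
    nr_file = f.name

os.environ["NEW_RELIC_CONFIG_FILE"] = nr_file

config = configparser.ConfigParser()
config.read(os.environ["NEW_RELIC_CONFIG_FILE"])
print(config["newrelic"]["license_key"])  # -> NRAK-EXAMPLE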
app/poetry.lock (generated, 5347 changed lines)
File diff suppressed because it is too large
Some files were not shown because too many files have changed in this diff.