Compare commits

...

55 Commits

Author SHA1 Message Date
e82190f227 4.46.3 2024-07-12 12:00:06 +01:00
9002bbad09 4.46.2 2024-07-11 12:00:06 +01:00
f51d31f431 4.46.0 2024-07-09 12:00:06 +01:00
c67b97fe32 4.45.1 2024-06-26 12:00:08 +01:00
bd414b1fc7 4.45.0 2024-06-11 12:00:06 +01:00
0f73a14926 4.44.3 2024-05-24 12:00:06 +01:00
0ea33ca5f8 4.44.0 2024-05-23 12:00:07 +01:00
4e178ad676 4.43.0 2024-05-09 12:00:07 +01:00
24ba25ab6a 4.42.2 2024-04-10 17:23:11 +01:00
78184eeae4 4.42.1 2024-03-26 12:00:08 +00:00
c111fbe8e1 4.42.0 2024-03-19 12:00:09 +00:00
d5981588e4 4.41.2 2024-03-15 12:00:08 +00:00
6af1c2ccf4 Merge pull request 'Correct docker package name' (#2) from fix-package-name-in-gitea-actions into main
Reviewed-on: #2
2024-03-14 15:47:01 +00:00
76664f6e4c Correct docker package name 2024-03-14 15:46:44 +00:00
f7125618c4 4.41.0 2024-03-14 12:00:08 +00:00
050cef0e4e 4.40.2 2024-03-07 12:00:08 +00:00
0d557ef875 4.40.1 2024-03-05 12:00:09 +00:00
6e56ea4489 Merge pull request 'Replace Drone with Gitea Actions' (#1) from gitea-actions into main
Reviewed-on: #1
2024-03-04 13:42:58 +00:00
def0de643b Remove Drone 2024-03-04 13:38:57 +00:00
9e7cb2c7dd Add Gitea Actions 2024-03-04 13:38:52 +00:00
f1110506c0 4.39.3 2024-02-27 12:00:07 +00:00
f5bce7d7ff 4.39.2 2024-02-23 12:00:07 +00:00
75f45d9365 4.39.1 2024-02-20 12:00:07 +00:00
ead425e0c2 4.38.3 2024-02-14 12:00:07 +00:00
6c910d62c5 4.38.2 2024-02-06 12:00:07 +00:00
99ffd1ec0c 4.38.0 2024-02-03 16:55:23 +00:00
eda940f8b2 4.37.2 2024-01-27 12:00:07 +00:00
1dad582523 4.37.1 2024-01-25 12:00:08 +00:00
e516266a27 4.37.0 2024-01-18 12:00:07 +00:00
850fc95477 4.36.8 2023-12-28 12:00:07 +00:00
d172825900 4.36.7 2023-12-21 12:00:09 +00:00
026865e5bf 4.36.6 2023-12-17 14:56:57 +00:00
add94ef2a2 4.36.5 2023-11-30 12:00:09 +00:00
1081400948 4.36.4 2023-11-22 12:00:09 +00:00
5776128905 4.36.3 2023-11-08 12:00:06 +00:00
d661860f4c 4.35.6 2023-11-07 12:00:06 +00:00
0a52e32972 4.35.3 2023-10-05 12:00:06 +01:00
703dcbd0eb 4.35.2 2023-10-03 12:00:06 +01:00
ce7ed69547 4.35.1 2023-10-02 12:00:06 +01:00
4f5564df16 4.35.0 2023-09-29 12:00:06 +01:00
2fee569131 4.34.4 2023-08-31 12:00:06 +01:00
7ea45d6f5d 4.34.3 2023-08-29 20:20:00 +01:00
6d24db50bd 4.34.2 2023-08-25 12:00:05 +01:00
88f270c6a1 4.34.1 2023-08-09 12:00:05 +01:00
0962b1cf29 Update .drone.yml 2023-08-06 17:56:31 +00:00
6051d72691 4.33.3 2023-08-06 17:51:04 +01:00
c31a75a9ef Update README.md 2023-08-06 16:04:57 +00:00
ef289385ff Update README.md 2023-08-06 16:04:47 +00:00
9b12a2ad33 Update README.md 2023-08-06 16:04:41 +00:00
8eb19d88f3 Remove provenance [CI SKIP] 2023-08-06 16:01:04 +00:00
e36e9d3077 4.32.4 2023-08-02 16:49:54 +01:00
b2430cbc5b 4.32.1 2023-07-12 11:00:04 +00:00
1258115397 4.32.0 2023-07-11 11:00:05 +00:00
38c134d903 4.31.0 2023-06-30 11:00:06 +00:00
cd77e4cc2d 4.30.1 2023-06-28 11:00:03 +00:00
225 changed files with 7717 additions and 372677 deletions

View File

@ -1,50 +0,0 @@
kind: pipeline
type: docker
name: build-multiarch-images
platform:
os: linux
arch: amd64
steps:
- name: make-tags
image: node
commands:
- echo -n "${DRONE_TAG}, latest" > .tags
- name: build
image: thegeeklab/drone-docker-buildx
privileged: true
settings:
dockerfile: app/Dockerfile
context: app
registry: git.mrmeeb.stream
username:
from_secret: docker_username
password:
from_secret: docker_password
repo: git.mrmeeb.stream/mrmeeb/simple-login
platforms:
- linux/arm64
- linux/amd64
- name: notify
image: plugins/slack
when:
status:
- success
- failure
settings:
webhook:
from_secret: slack_webhook
icon_url:
from_secret: slack_avatar
trigger:
event:
include:
- tag
ref:
include:
- refs/tags/**

View File

@ -0,0 +1,195 @@
name: Build-Release-Image
on:
push:
tags:
- '*'
env:
CONTAINER_NAME: git.mrmeeb.stream/mrmeeb/simple-login
TEA_VERSION: 0.9.2
jobs:
Build-Image:
runs-on: [ubuntu-docker-latest, "${{ matrix.platform }}"]
strategy:
fail-fast: false
matrix:
platform:
- linux/amd64
- linux/arm64
steps:
- name: Prepare
run: |
platform=${{ matrix.platform }}
echo "PLATFORM_PAIR=${platform//\//-}" >> $GITHUB_ENV
echo "RELEASE_VERSION=${GITHUB_REF#refs/*/}" >> $GITHUB_ENV
- name: Checkout
uses: actions/checkout@v2
# Not needed currently due to https://github.com/go-gitea/gitea/issues/29563
#- name: Prepare tags
# id: meta
# uses: docker/metadata-action@v5
# with:
# images: ${{ env.CONTAINER_NAME }}
# tags: |
# type=pep440,pattern={{version}}
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Login to Gitea Container Registry
uses: docker/login-action@v3
with:
registry: git.mrmeeb.stream
username: ${{ env.GITHUB_ACTOR }}
password: ${{ secrets.GTCR_TOKEN }}
- name: Build and push by digest
uses: docker/build-push-action@v5
id: build
with:
context: ./app
platforms: ${{ matrix.platform }}
provenance: false
outputs: type=image,name=${{ env.CONTAINER_NAME }},push-by-digest=true,name-canonical=true,push=true
- name: Export digest
run: |
mkdir -p /tmp/digests
digest="${{ steps.build.outputs.digest }}"
touch "/tmp/digests/${digest#sha256:}"
- name: Upload digest
uses: actions/upload-artifact@v3
with:
name: digests-${{ env.PLATFORM_PAIR }}
path: /tmp/digests/*
if-no-files-found: error
retention-days: 1
- name: Notify
uses: rjstone/discord-webhook-notify@v1
if: failure()
with:
severity: ${{ job.status == 'success' && 'info' || (job.status == 'cancelled' && 'warn' || 'error') }}
details: Build ${{ job.status == 'success' && 'succeeded' || (job.status == 'cancelled' && 'cancelled' || 'failed') }}!
webhookUrl: ${{ secrets.DISCORD_WEBHOOK }}
username: Gitea
avatarUrl: ${{ vars.RUNNER_ICON_URL }}
Merge-Images:
runs-on: ubuntu-docker-latest
needs: [Build-Image]
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Get tag
run: echo "RELEASE_VERSION=${GITHUB_REF#refs/*/}" >> $GITHUB_ENV
- name: Download digests
uses: actions/download-artifact@v3
with:
path: /tmp/digests
pattern: digests-*
merge-multiple: true
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
# Not needed currently due to https://github.com/go-gitea/gitea/issues/29563
#- name: Prepare Docker metadata
# id: meta
# uses: docker/metadata-action@v5
# with:
# images: ${{ env.CONTAINER_NAME }}
- name: Login to Gitea Container Registry
uses: docker/login-action@v3
with:
registry: git.mrmeeb.stream
username: ${{ env.GITHUB_ACTOR }}
password: ${{ secrets.GTCR_TOKEN }}
- name: Create manifest latest
working-directory: /tmp/digests
run: |
docker manifest create ${{ env.CONTAINER_NAME }}:latest \
--amend ${{ env.CONTAINER_NAME }}@sha256:$(ls -p digests-linux-amd64/* | cut -d / -f 2) \
--amend ${{ env.CONTAINER_NAME }}@sha256:$(ls -p digests-linux-arm64/* | cut -d / -f 2)
#docker manifest annotate --arch amd64 --os linux ${{ env.CONTAINER_NAME }}:latest ${{ env.CONTAINER_NAME }}@sha256:$(ls -p digests-linux-amd64/* | cut -d / -f 2)
#docker manifest annotate --arch arm64 --os linux ${{ env.CONTAINER_NAME }}:latest ${{ env.CONTAINER_NAME }}@sha256:$(ls -p digests-linux-arm64/* | cut -d / -f 2)
docker manifest inspect ${{ env.CONTAINER_NAME }}:latest
docker manifest push ${{ env.CONTAINER_NAME }}:latest
- name: Create manifest tagged
working-directory: /tmp/digests
run: |
docker manifest create ${{ env.CONTAINER_NAME }}:${{ env.RELEASE_VERSION }} \
--amend ${{ env.CONTAINER_NAME }}@sha256:$(ls -p digests-linux-amd64/* | cut -d / -f 2) \
--amend ${{ env.CONTAINER_NAME }}@sha256:$(ls -p digests-linux-arm64/* | cut -d / -f 2)
#docker manifest annotate --arch amd64 --os linux ${{ env.CONTAINER_NAME }}:${{ env.RELEASE_VERSION }} ${{ env.CONTAINER_NAME }}@sha256:$(ls -p digests-linux-amd64/* | cut -d / -f 2)
#docker manifest annotate --arch arm64 --os linux ${{ env.CONTAINER_NAME }}:${{ env.RELEASE_VERSION }} ${{ env.CONTAINER_NAME }}@sha256:$(ls -p digests-linux-arm64/* | cut -d / -f 2)
docker manifest inspect ${{ env.CONTAINER_NAME }}:${{ env.RELEASE_VERSION }}
docker manifest push ${{ env.CONTAINER_NAME }}:${{ env.RELEASE_VERSION }}
# Disabled due to https://github.com/go-gitea/gitea/issues/29563
#- name: Create manifest list and push
# working-directory: /tmp/digests
# run: |
# echo $DOCKER_METADATA_OUTPUT_JSON
# echo $(jq -cr '.tags | map("-t " + .) | join(" ")' <<< "$DOCKER_METADATA_OUTPUT_JSON") \
# $(printf '${{ env.CONTAINER_NAME }}@sha256:%s ' $(ls -p */* | cut -d / -f 2))
# docker buildx imagetools create $(jq -cr '.tags | map("-t " + .) | join(" ")' <<< "$DOCKER_METADATA_OUTPUT_JSON") \
# $(printf '${{ env.CONTAINER_NAME }}@sha256:%s ' $(ls -p */* | cut -d / -f 2))
#- name: Inspect image
# run: |
# docker buildx imagetools inspect ${{ env.CONTAINER_NAME }}:${{ steps.meta.outputs.version }}
- name: Notify
uses: rjstone/discord-webhook-notify@v1
if: failure()
with:
severity: ${{ job.status == 'success' && 'info' || (job.status == 'cancelled' && 'warn' || 'error') }}
details: Build ${{ job.status == 'success' && 'succeeded' || (job.status == 'cancelled' && 'cancelled' || 'failed') }}!
webhookUrl: ${{ secrets.DISCORD_WEBHOOK }}
username: Gitea
avatarUrl: ${{ vars.RUNNER_ICON_URL }}
Create-Release:
runs-on: [ubuntu-latest, linux/amd64]
needs: [Merge-Images]
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Get tag
run: echo "RELEASE_VERSION=${GITHUB_REF#refs/*/}" >> $GITHUB_ENV
- name: Prepare tea
run: |
# Download tea from Gitea release page
echo "Downloading Tea v${{ env.TEA_VERSION }}" && \
wget -q -O tea https://gitea.com/gitea/tea/releases/download/v${{ env.TEA_VERSION }}/tea-${{ env.TEA_VERSION }}-linux-amd64 && \
echo "Downloaded Tea" && \
chmod +x tea && \
# Login to Gitea
echo "Logging in to Gitea using Tea" && \
./tea login add --name SimpleLogin --url https://git.mrmeeb.stream --token ${{ secrets.GITHUB_TOKEN }} && \
echo "Done"
- name: Make release
run: |
echo "Creating release" && \
./tea release create --login "SimpleLogin" --repo ${{ env.GITHUB_REPOSITORY }} --tag ${{ env.RELEASE_VERSION }} -t ${{ env.RELEASE_VERSION }} -n "Triggered by release of v${{ env.RELEASE_VERSION }} by the SimpleLogin team. <a href=\"https://github.com/simple-login/app/releases/tag/v${{ env.RELEASE_VERSION }}\" target=\"_blank\">View the changelog</a>" && \
echo "Done"
- name: Notify
uses: rjstone/discord-webhook-notify@v1
if: failure()
with:
severity: ${{ job.status == 'success' && 'info' || (job.status == 'cancelled' && 'warn' || 'error') }}
details: Release ${{ job.status == 'success' && 'succeeded' || (job.status == 'cancelled' && 'cancelled' || 'failed') }}!
webhookUrl: ${{ secrets.DISCORD_WEBHOOK }}
username: Gitea
avatarUrl: ${{ vars.RUNNER_ICON_URL }}
Notify:
runs-on: ubuntu-latest
needs: [Build-Image, Merge-Images, Create-Release]
steps:
- name: Notify
uses: rjstone/discord-webhook-notify@v1
if: always()
with:
severity: ${{ job.status == 'success' && 'info' || (job.status == 'cancelled' && 'warn' || 'error') }}
details: Release ${{ job.status == 'success' && 'succeeded' || (job.status == 'cancelled' && 'cancelled' || 'failed') }}!
webhookUrl: ${{ secrets.DISCORD_WEBHOOK }}
username: Gitea
avatarUrl: ${{ vars.RUNNER_ICON_URL }}

View File

@ -1,9 +1,7 @@
# Simple Login
# SimpleLogin
[![Build Status](https://drone.mrmeeb.stream/api/badges/MrMeeb/simple-login/status.svg?ref=refs/heads/main)](https://drone.mrmeeb.stream/MrMeeb/simple-login)
This repo exists to automatically capture any releases of the SaaS edition of SimpleLogin. It checks the simplelogin/app GitHub repo once a day, and builds the latest release automatically if it is newer than the currently built version.
This repo exists to automatically capture any releases of the SaaS edition of SimpleLogin. It checks once a day, and builds the latest one automatically if it is newer than the currently built version.
I did this to simplify deployment of my self-hosted SimpleLogin instance. SimpleLogin do not provide an up-to-date version for self-hosting, leaving you with the options of either running a very outdated version with no app support, a beta version, or their `simplelogin/app-ci` version. This last option works well if you use an x86 machine, but I'm running SimpleLogin on an ARM machine. Since I don't want to have to build containers on the machine itself, this repo handles that for me.
This exists to simplify deployment of SimpleLogin in a self-hosted capacity, while also allowing the use of the latest version; SimpleLogin do not provide an up-to-date version for this use.
The image is built for amd64 and arm64 devices.
As a result, this image is built for both amd64 and arm64 devices.

View File

@ -1,7 +1,6 @@
name: Test and lint
on:
push:
on: [push, pull_request]
jobs:
lint:
@ -15,9 +14,15 @@ jobs:
- uses: actions/setup-python@v4
with:
python-version: '3.9'
python-version: '3.10'
cache: 'poetry'
- name: Install OS dependencies
if: ${{ matrix.python-version }} == '3.10'
run: |
sudo apt update
sudo apt install -y libre2-dev libpq-dev
- name: Install dependencies
if: steps.cached-poetry-dependencies.outputs.cache-hit != 'true'
run: poetry install --no-interaction
@ -133,6 +138,12 @@ jobs:
with:
fetch-depth: 0
- name: Set up QEMU
uses: docker/setup-qemu-action@v2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
- name: Create Sentry release
uses: getsentry/action-release@v1
env:
@ -152,6 +163,7 @@ jobs:
uses: docker/build-push-action@v3
with:
context: .
platforms: linux/amd64,linux/arm64
push: true
tags: ${{ steps.meta.outputs.tags }}

View File

@ -7,18 +7,19 @@ repos:
hooks:
- id: check-yaml
- id: trailing-whitespace
- repo: https://github.com/psf/black
rev: 22.3.0
hooks:
- id: black
- repo: https://github.com/pycqa/flake8
rev: 3.9.2
hooks:
- id: flake8
- repo: https://github.com/Riverside-Healthcare/djLint
rev: v1.3.0
hooks:
- id: djlint-jinja
files: '.*\.html'
entry: djlint --reformat
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.1.5
hooks:
# Run the linter.
- id: ruff
args: [ --fix ]
# Run the formatter.
- id: ruff-format

View File

@ -68,6 +68,12 @@ For most tests, you will need to have ``redis`` installed and started on your ma
sh scripts/run-test.sh
```
You can also run tests using a local Postgres DB to speed things up (a consolidated sketch follows this list). This can be done by
- creating an empty test DB and running the database migration by `dropdb test && createdb test && DB_URI=postgresql://localhost:5432/test alembic upgrade head`
- replacing the `DB_URI` in `test.env` file by `DB_URI=postgresql://localhost:5432/test`
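As a consolidated sketch of those two steps (an editorial example, assuming a local Postgres listening on 5432 and a `test.env` file at the repository root):
```bash
# Recreate an empty test database and run the migrations against it
dropdb test && createdb test
DB_URI=postgresql://localhost:5432/test alembic upgrade head

# Point the test environment at the local database (test.env location is assumed)
sed -i 's|^DB_URI=.*|DB_URI=postgresql://localhost:5432/test|' test.env
```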
## Run the code locally
Install npm packages
@ -151,10 +157,10 @@ Here are the small sum-ups of the directory structures and their roles:
## Pull request
The code is formatted using https://github.com/psf/black, to format the code, simply run
The code is formatted using [ruff](https://github.com/astral-sh/ruff), to format the code, simply run
```
poetry run black .
poetry run ruff format .
```
The code is also checked with `flake8`, make sure to run `flake8` before creating the pull request by
@ -169,6 +175,12 @@ For HTML templates, we use `djlint`. Before creating a pull request, please run
poetry run djlint --check templates
```
If some files aren't properly formatted, you can format all files with
```bash
poetry run djlint --reformat .
```
## Test sending email
[swaks](http://www.jetmore.org/john/code/swaks/) is used for sending test emails to the `email_handler`.

View File

@ -23,7 +23,7 @@ COPY poetry.lock pyproject.toml ./
# Install and setup poetry
RUN pip install -U pip \
&& apt-get update \
&& apt install -y curl netcat-traditional gcc python3-dev gnupg git libre2-dev \
&& apt install -y curl netcat-traditional gcc python3-dev gnupg git libre2-dev cmake ninja-build\
&& curl -sSL https://install.python-poetry.org | python3 - \
# Remove curl and netcat from the image
&& apt-get purge -y curl netcat-traditional \
@ -31,7 +31,7 @@ RUN pip install -U pip \
&& poetry config virtualenvs.create false \
&& poetry install --no-interaction --no-ansi --no-root \
# Clear apt cache \
&& apt-get purge -y libre2-dev \
&& apt-get purge -y libre2-dev cmake ninja-build\
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/*

View File

@ -74,7 +74,7 @@ Setting up DKIM is highly recommended to reduce the chance your emails ending up
First you need to generate a private and public key for DKIM:
```bash
openssl genrsa -out dkim.key 1024
openssl genrsa -out dkim.key -traditional 1024
openssl rsa -in dkim.key -pubout -out dkim.pub.key
```
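An editorial aside on the `-traditional` flag added above: on OpenSSL 3.x, `genrsa` writes the key in PKCS#8 form by default, while some DKIM tooling expects the legacy PKCS#1 ("RSA PRIVATE KEY") encoding; `-traditional` keeps the old format. A quick way to check which one you got:
```bash
# PKCS#1 (with -traditional):   -----BEGIN RSA PRIVATE KEY-----
# PKCS#8 (OpenSSL 3.x default): -----BEGIN PRIVATE KEY-----
head -1 dkim.key
```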
@ -510,11 +510,14 @@ server {
server_name app.mydomain.com;
location / {
proxy_pass http://localhost:7777;
proxy_pass http://localhost:7777;
proxy_set_header Host $host;
}
}
```
Note: If `/etc/nginx/sites-enabled/default` exists, delete it or certbot will fail due to the conflict. The `simplelogin` file should be the only file in `sites-enabled`.
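A minimal sketch of that cleanup, assuming the stock Debian/Ubuntu Nginx layout:
```bash
# Remove the distribution default vhost so only the simplelogin site remains enabled
sudo rm -f /etc/nginx/sites-enabled/default
ls /etc/nginx/sites-enabled/   # should list only: simplelogin
sudo nginx -t                  # sanity-check the remaining configuration
```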
Reload Nginx with the command below
```bash

View File

@ -5,13 +5,15 @@ from typing import Optional
from arrow import Arrow
from newrelic import agent
from sqlalchemy import or_
from app.db import Session
from app.email_utils import send_welcome_email
from app.utils import sanitize_email
from app.utils import sanitize_email, canonicalize_email
from app.errors import (
AccountAlreadyLinkedToAnotherPartnerException,
AccountIsUsingAliasAsEmail,
AccountAlreadyLinkedToAnotherUserException,
)
from app.log import LOG
from app.models import (
@ -130,8 +132,9 @@ class ClientMergeStrategy(ABC):
class NewUserStrategy(ClientMergeStrategy):
def process(self) -> LinkResult:
# Will create a new SL User with a random password
canonical_email = canonicalize_email(self.link_request.email)
new_user = User.create(
email=self.link_request.email,
email=canonical_email,
name=self.link_request.name,
password=random_string(20),
activated=True,
@ -165,7 +168,8 @@ class NewUserStrategy(ClientMergeStrategy):
class ExistingUnlinkedUserStrategy(ClientMergeStrategy):
def process(self) -> LinkResult:
# IF it was scheduled to be deleted. Unschedule it.
self.user.delete_on = None
partner_user = ensure_partner_user_exists_for_user(
self.link_request, self.user, self.partner
)
@ -179,7 +183,7 @@ class ExistingUnlinkedUserStrategy(ClientMergeStrategy):
class LinkedWithAnotherPartnerUserStrategy(ClientMergeStrategy):
def process(self) -> LinkResult:
raise AccountAlreadyLinkedToAnotherPartnerException()
raise AccountAlreadyLinkedToAnotherUserException()
def get_login_strategy(
@ -212,11 +216,21 @@ def process_login_case(
partner_id=partner.id, external_user_id=link_request.external_user_id
)
if partner_user is None:
canonical_email = canonicalize_email(link_request.email)
# We didn't find any SimpleLogin user registered with that partner user id
# Make sure they aren't using an alias as their link email
check_alias(link_request.email)
check_alias(canonical_email)
# Try to find it using the partner's e-mail address
user = User.get_by(email=link_request.email)
users = User.filter(
or_(User.email == link_request.email, User.email == canonical_email)
).all()
if len(users) > 1:
user = [user for user in users if user.email == canonical_email][0]
elif len(users) == 1:
user = users[0]
else:
user = None
return get_login_strategy(link_request, user, partner).process()
else:
# We found the SL user registered with that partner user id
@ -234,6 +248,8 @@ def link_user(
) -> LinkResult:
# Sanitize email just in case
link_request.email = sanitize_email(link_request.email)
# If it was scheduled to be deleted. Unschedule it.
current_user.delete_on = None
partner_user = ensure_partner_user_exists_for_user(
link_request, current_user, partner
)

View File

@ -46,7 +46,8 @@ class SLModelView(sqla.ModelView):
def inaccessible_callback(self, name, **kwargs):
# redirect to login page if user doesn't have access
return redirect(url_for("auth.login", next=request.url))
flash("You don't have access to the admin page", "error")
return redirect(url_for("dashboard.index", next=request.url))
def on_model_change(self, form, model, is_created):
changes = {}
@ -214,6 +215,20 @@ class UserAdmin(SLModelView):
Session.commit()
@action(
"remove trial",
"Stop trial period",
"Remove trial for this user?",
)
def stop_trial(self, ids):
for user in User.filter(User.id.in_(ids)):
user.trial_end = None
flash(f"Stopped trial for {user}", "success")
AdminAuditLog.stop_trial(current_user.id, user.id)
Session.commit()
@action(
"disable_otp_fido",
"Disable OTP & FIDO",
@ -256,6 +271,17 @@ class UserAdmin(SLModelView):
Session.commit()
@action(
"clear_delete_on",
"Remove scheduled deletion of user",
"This will remove the scheduled deletion for this users",
)
def clean_delete_on(self, ids):
for user in User.filter(User.id.in_(ids)):
user.delete_on = None
Session.commit()
# @action(
# "login_as",
# "Login as this user",
@ -600,6 +626,26 @@ class NewsletterAdmin(SLModelView):
else:
flash(error_msg, "error")
@action(
"clone_newsletter",
"Clone this newsletter",
)
def clone_newsletter(self, newsletter_ids):
if len(newsletter_ids) != 1:
flash("you can only select 1 newsletter", "error")
return
newsletter_id = newsletter_ids[0]
newsletter: Newsletter = Newsletter.get(newsletter_id)
new_newsletter = Newsletter.create(
subject=newsletter.subject,
html=newsletter.html,
plain_text=newsletter.plain_text,
commit=True,
)
flash(f"Newsletter {new_newsletter.subject} has been cloned", "success")
class NewsletterUserAdmin(SLModelView):
column_searchable_list = ["id"]

View File

@ -70,7 +70,6 @@ def verify_prefix_suffix(
# when DISABLE_ALIAS_SUFFIX is true, alias_domain_prefix is empty
and not config.DISABLE_ALIAS_SUFFIX
):
if not alias_domain_prefix.startswith("."):
LOG.e("User %s submits a wrong alias suffix %s", user, alias_suffix)
return False

View File

@ -21,11 +21,20 @@ from app.email_utils import (
send_cannot_create_directory_alias_disabled,
get_email_local_part,
send_cannot_create_domain_alias,
send_email,
render,
)
from app.errors import AliasInTrashError
from app.events.event_dispatcher import EventDispatcher
from app.events.generated.event_pb2 import (
AliasDeleted,
AliasStatusChanged,
EventContent,
)
from app.log import LOG
from app.models import (
Alias,
AliasDeleteReason,
CustomDomain,
Directory,
User,
@ -36,6 +45,8 @@ from app.models import (
EmailLog,
Contact,
AutoCreateRule,
AliasUsedOn,
ClientUser,
)
from app.regex_utils import regex_match
@ -299,36 +310,44 @@ def try_auto_create_via_domain(address: str) -> Optional[Alias]:
return None
def delete_alias(alias: Alias, user: User):
def delete_alias(
alias: Alias, user: User, reason: AliasDeleteReason = AliasDeleteReason.Unspecified
):
"""
Delete an alias and add it to either global or domain trash
Should be used instead of Alias.delete, DomainDeletedAlias.create, DeletedAlias.create
"""
# save deleted alias to either global or domain trash
LOG.i(f"User {user} has deleted alias {alias}")
# save deleted alias to either global or domain trash
if alias.custom_domain_id:
if not DomainDeletedAlias.get_by(
email=alias.email, domain_id=alias.custom_domain_id
):
LOG.d("add %s to domain %s trash", alias, alias.custom_domain_id)
Session.add(
DomainDeletedAlias(
user_id=user.id,
email=alias.email,
domain_id=alias.custom_domain_id,
)
domain_deleted_alias = DomainDeletedAlias(
user_id=user.id,
email=alias.email,
domain_id=alias.custom_domain_id,
reason=reason,
)
Session.add(domain_deleted_alias)
Session.commit()
LOG.i(
f"Moving {alias} to domain {alias.custom_domain_id} trash {domain_deleted_alias}"
)
else:
if not DeletedAlias.get_by(email=alias.email):
LOG.d("add %s to global trash", alias)
Session.add(DeletedAlias(email=alias.email))
deleted_alias = DeletedAlias(email=alias.email, reason=reason)
Session.add(deleted_alias)
Session.commit()
LOG.i(f"Moving {alias} to global trash {deleted_alias}")
LOG.i("delete alias %s", alias)
Alias.filter(Alias.id == alias.id).delete()
Session.commit()
EventDispatcher.send_event(
user, EventContent(alias_deleted=AliasDeleted(alias_id=alias.id))
)
def aliases_for_mailbox(mailbox: Mailbox) -> [Alias]:
"""
@ -399,3 +418,73 @@ def alias_export_csv(user, csv_direct_export=False):
output.headers["Content-Disposition"] = "attachment; filename=aliases.csv"
output.headers["Content-type"] = "text/csv"
return output
def transfer_alias(alias, new_user, new_mailboxes: [Mailbox]):
# cannot transfer alias which is used for receiving newsletter
if User.get_by(newsletter_alias_id=alias.id):
raise Exception("Cannot transfer alias that's used to receive newsletter")
# update user_id
Session.query(Contact).filter(Contact.alias_id == alias.id).update(
{"user_id": new_user.id}
)
Session.query(AliasUsedOn).filter(AliasUsedOn.alias_id == alias.id).update(
{"user_id": new_user.id}
)
Session.query(ClientUser).filter(ClientUser.alias_id == alias.id).update(
{"user_id": new_user.id}
)
# remove existing mailboxes from the alias
Session.query(AliasMailbox).filter(AliasMailbox.alias_id == alias.id).delete()
# set mailboxes
alias.mailbox_id = new_mailboxes.pop().id
for mb in new_mailboxes:
AliasMailbox.create(alias_id=alias.id, mailbox_id=mb.id)
# alias has never been transferred before
if not alias.original_owner_id:
alias.original_owner_id = alias.user_id
# inform previous owner
old_user = alias.user
send_email(
old_user.email,
f"Alias {alias.email} has been received",
render(
"transactional/alias-transferred.txt",
user=old_user,
alias=alias,
),
render(
"transactional/alias-transferred.html",
user=old_user,
alias=alias,
),
)
# now the alias belongs to the new user
alias.user_id = new_user.id
# set some fields back to default
alias.disable_pgp = False
alias.pinned = False
Session.commit()
def change_alias_status(alias: Alias, enabled: bool, commit: bool = False):
LOG.i(f"Changing alias {alias} enabled to {enabled}")
alias.enabled = enabled
event = AliasStatusChanged(
alias_id=alias.id, alias_email=alias.email, enabled=enabled
)
EventDispatcher.send_event(alias.user, EventContent(alias_status_change=event))
if commit:
Session.commit()

View File

@ -16,3 +16,22 @@ from .views import (
sudo,
user,
)
__all__ = [
"alias_options",
"new_custom_alias",
"custom_domain",
"new_random_alias",
"user_info",
"auth",
"auth_mfa",
"alias",
"apple",
"mailbox",
"notification",
"setting",
"export",
"phone",
"sudo",
"user",
]

View File

@ -5,6 +5,7 @@ import arrow
from flask import Blueprint, request, jsonify, g
from flask_login import current_user
from app import constants
from app.db import Session
from app.models import ApiKey
@ -18,7 +19,9 @@ def authorize_request() -> Optional[Tuple[str, int]]:
api_key = ApiKey.get_by(code=api_code)
if not api_key:
if current_user.is_authenticated:
if current_user.is_authenticated and request.headers.get(
constants.HEADER_ALLOW_API_COOKIES
):
g.user = current_user
else:
return jsonify(error="Wrong api key"), 401
@ -33,6 +36,9 @@ def authorize_request() -> Optional[Tuple[str, int]]:
if g.user.disabled:
return jsonify(error="Disabled account"), 403
if not g.user.is_active():
return jsonify(error="Account does not exist"), 401
g.api_key = api_key
return None
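An editorial illustration of the gate added above: a browser session cookie is now only honoured by the API when the `X-Sl-Allowcookies` header (defined in `app/app/constants.py` further down this diff) accompanies the request. A hedged sketch, assuming `user_info` is mounted at `/api/user_info`; the host, cookie name, and value are placeholders:
```bash
# Without the header, the session cookie is ignored -> "Wrong api key" (401)
curl -s -b "session=PLACEHOLDER" https://sl.example.com/api/user_info

# With the header, an authenticated browser session is accepted as the API user
curl -s -b "session=PLACEHOLDER" \
     -H "X-Sl-Allowcookies: true" \
     https://sl.example.com/api/user_info
```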

View File

@ -201,10 +201,10 @@ def get_alias_infos_with_pagination_v3(
q = q.order_by(Alias.pinned.desc())
q = q.order_by(latest_activity.desc())
q = list(q.limit(page_limit).offset(page_id * page_size))
q = q.limit(page_limit).offset(page_id * page_size)
ret = []
for alias, contact, email_log, nb_reply, nb_blocked, nb_forward in q:
for alias, contact, email_log, nb_reply, nb_blocked, nb_forward in list(q):
ret.append(
AliasInfo(
alias=alias,
@ -358,7 +358,6 @@ def construct_alias_query(user: User):
else_=0,
)
).label("nb_forward"),
func.max(EmailLog.created_at).label("latest_email_log_created_at"),
)
.join(EmailLog, Alias.id == EmailLog.alias_id, isouter=True)
.filter(Alias.user_id == user.id)
@ -366,14 +365,6 @@ def construct_alias_query(user: User):
.subquery()
)
alias_contact_subquery = (
Session.query(Alias.id, func.max(Contact.id).label("max_contact_id"))
.join(Contact, Alias.id == Contact.alias_id, isouter=True)
.filter(Alias.user_id == user.id)
.group_by(Alias.id)
.subquery()
)
return (
Session.query(
Alias,
@ -385,23 +376,7 @@ def construct_alias_query(user: User):
)
.options(joinedload(Alias.hibp_breaches))
.options(joinedload(Alias.custom_domain))
.join(Contact, Alias.id == Contact.alias_id, isouter=True)
.join(EmailLog, Contact.id == EmailLog.contact_id, isouter=True)
.join(EmailLog, Alias.last_email_log_id == EmailLog.id, isouter=True)
.join(Contact, EmailLog.contact_id == Contact.id, isouter=True)
.filter(Alias.id == alias_activity_subquery.c.id)
.filter(Alias.id == alias_contact_subquery.c.id)
.filter(
or_(
EmailLog.created_at
== alias_activity_subquery.c.latest_email_log_created_at,
and_(
# no email log yet for this alias
alias_activity_subquery.c.latest_email_log_created_at.is_(None),
# to make sure only 1 contact is returned in this case
or_(
Contact.id == alias_contact_subquery.c.max_contact_id,
alias_contact_subquery.c.max_contact_id.is_(None),
),
),
)
)
)

View File

@ -24,12 +24,15 @@ from app.errors import (
ErrContactAlreadyExists,
ErrAddressInvalid,
)
from app.models import Alias, Contact, Mailbox, AliasMailbox
from app.extensions import limiter
from app.log import LOG
from app.models import Alias, Contact, Mailbox, AliasMailbox, AliasDeleteReason
@deprecated
@api_bp.route("/aliases", methods=["GET", "POST"])
@require_api_auth
@limiter.limit("10/minute", key_func=lambda: g.user.id)
def get_aliases():
"""
Get aliases
@ -72,6 +75,7 @@ def get_aliases():
@api_bp.route("/v2/aliases", methods=["GET", "POST"])
@require_api_auth
@limiter.limit("50/minute", key_func=lambda: g.user.id)
def get_aliases_v2():
"""
Get aliases
@ -157,7 +161,7 @@ def delete_alias(alias_id):
if not alias or alias.user_id != user.id:
return jsonify(error="Forbidden"), 403
alias_utils.delete_alias(alias, user)
alias_utils.delete_alias(alias, user, AliasDeleteReason.ManualAction)
return jsonify(deleted=True), 200
@ -181,7 +185,8 @@ def toggle_alias(alias_id):
if not alias or alias.user_id != user.id:
return jsonify(error="Forbidden"), 403
alias.enabled = not alias.enabled
alias_utils.change_alias_status(alias, enabled=not alias.enabled)
LOG.i(f"User {user} changed alias {alias} enabled status to {alias.enabled}")
Session.commit()
return jsonify(enabled=alias.enabled), 200

View File

@ -17,9 +17,14 @@ from app.models import PlanEnum, AppleSubscription
_MONTHLY_PRODUCT_ID = "io.simplelogin.ios_app.subscription.premium.monthly"
_YEARLY_PRODUCT_ID = "io.simplelogin.ios_app.subscription.premium.yearly"
# SL Mac app used to be in SL account
_MACAPP_MONTHLY_PRODUCT_ID = "io.simplelogin.macapp.subscription.premium.monthly"
_MACAPP_YEARLY_PRODUCT_ID = "io.simplelogin.macapp.subscription.premium.yearly"
# SL Mac app is moved to Proton account
_MACAPP_MONTHLY_PRODUCT_ID_NEW = "me.proton.simplelogin.macos.premium.monthly"
_MACAPP_YEARLY_PRODUCT_ID_NEW = "me.proton.simplelogin.macos.premium.yearly"
# Apple API URL
_SANDBOX_URL = "https://sandbox.itunes.apple.com/verifyReceipt"
_PROD_URL = "https://buy.itunes.apple.com/verifyReceipt"
@ -263,7 +268,11 @@ def apple_update_notification():
plan = (
PlanEnum.monthly
if transaction["product_id"]
in (_MONTHLY_PRODUCT_ID, _MACAPP_MONTHLY_PRODUCT_ID)
in (
_MONTHLY_PRODUCT_ID,
_MACAPP_MONTHLY_PRODUCT_ID,
_MACAPP_MONTHLY_PRODUCT_ID_NEW,
)
else PlanEnum.yearly
)
@ -517,7 +526,11 @@ def verify_receipt(receipt_data, user, password) -> Optional[AppleSubscription]:
plan = (
PlanEnum.monthly
if latest_transaction["product_id"]
in (_MONTHLY_PRODUCT_ID, _MACAPP_MONTHLY_PRODUCT_ID)
in (
_MONTHLY_PRODUCT_ID,
_MACAPP_MONTHLY_PRODUCT_ID,
_MACAPP_MONTHLY_PRODUCT_ID_NEW,
)
else PlanEnum.yearly
)

View File

@ -11,7 +11,7 @@ from itsdangerous import Signer
from app import email_utils
from app.api.base import api_bp
from app.config import FLASK_SECRET, DISABLE_REGISTRATION
from app.dashboard.views.setting import send_reset_password_email
from app.dashboard.views.account_setting import send_reset_password_email
from app.db import Session
from app.email_utils import (
email_can_be_used_as_mailbox,
@ -63,6 +63,11 @@ def auth_login():
elif user.disabled:
LoginEvent(LoginEvent.ActionType.disabled_login, LoginEvent.Source.api).send()
return jsonify(error="Account disabled"), 400
elif user.delete_on is not None:
LoginEvent(
LoginEvent.ActionType.scheduled_to_be_deleted, LoginEvent.Source.api
).send()
return jsonify(error="Account scheduled for deletion"), 400
elif not user.activated:
LoginEvent(LoginEvent.ActionType.not_activated, LoginEvent.Source.api).send()
return jsonify(error="Account not activated"), 422
@ -124,8 +129,8 @@ def auth_register():
send_email(
email,
"Just one more step to join SimpleLogin",
render("transactional/code-activation.txt.jinja2", code=code),
render("transactional/code-activation.html", code=code),
render("transactional/code-activation.txt.jinja2", user=user, code=code),
render("transactional/code-activation.html", user=user, code=code),
)
RegisterEvent(RegisterEvent.ActionType.success, RegisterEvent.Source.api).send()
@ -221,8 +226,8 @@ def auth_reactivate():
send_email(
email,
"Just one more step to join SimpleLogin",
render("transactional/code-activation.txt.jinja2", code=code),
render("transactional/code-activation.html", code=code),
render("transactional/code-activation.txt.jinja2", user=user, code=code),
render("transactional/code-activation.html", user=user, code=code),
)
return jsonify(msg="User needs to confirm their account"), 200

View File

@ -13,8 +13,8 @@ from app.db import Session
from app.email_utils import (
mailbox_already_used,
email_can_be_used_as_mailbox,
is_valid_email,
)
from app.email_validation import is_valid_email
from app.log import LOG
from app.models import Mailbox, Job
from app.utils import sanitize_email
@ -45,7 +45,7 @@ def create_mailbox():
mailbox_email = sanitize_email(request.get_json().get("email"))
if not user.is_premium():
return jsonify(error=f"Only premium plan can add additional mailbox"), 400
return jsonify(error="Only premium plan can add additional mailbox"), 400
if not is_valid_email(mailbox_email):
return jsonify(error=f"{mailbox_email} invalid"), 400

View File

@ -150,7 +150,7 @@ def new_custom_alias_v3():
if not data:
return jsonify(error="request body cannot be empty"), 400
if type(data) is not dict:
if not isinstance(data, dict):
return jsonify(error="request body does not follow the required format"), 400
alias_prefix = data.get("alias_prefix", "").strip().lower().replace(" ", "")
@ -168,7 +168,7 @@ def new_custom_alias_v3():
return jsonify(error="alias prefix invalid format or too long"), 400
# check if mailbox is not tempered with
if type(mailbox_ids) is not list:
if not isinstance(mailbox_ids, list):
return jsonify(error="mailbox_ids must be an array of id"), 400
mailboxes = []
for mailbox_id in mailbox_ids:

View File

@ -32,6 +32,7 @@ def user_to_dict(user: User) -> dict:
"in_trial": user.in_trial(),
"max_alias_free_plan": user.max_alias_for_free_account(),
"connected_proton_address": None,
"can_create_reverse_alias": user.can_create_contacts(),
}
if config.CONNECT_WITH_PROTON:
@ -58,6 +59,7 @@ def user_info():
- in_trial
- max_alias_free
- is_connected_with_proton
- can_create_reverse_alias
"""
user = g.user

View File

@ -16,4 +16,26 @@ from .views import (
social,
recovery,
api_to_cookie,
oidc,
)
__all__ = [
"login",
"logout",
"register",
"activate",
"resend_activation",
"reset_password",
"forgot_password",
"github",
"google",
"facebook",
"proton",
"change_email",
"mfa",
"fido",
"social",
"recovery",
"api_to_cookie",
"oidc",
]

View File

@ -3,10 +3,13 @@ from flask_login import login_user
from app.auth.base import auth_bp
from app.db import Session
from app.extensions import limiter
from app.log import LOG
from app.models import EmailChange, ResetPasswordCode
@auth_bp.route("/change_email", methods=["GET", "POST"])
@limiter.limit("3/hour")
def change_email():
code = request.args.get("code")
@ -22,12 +25,14 @@ def change_email():
return render_template("auth/change_email.html")
user = email_change.user
old_email = user.email
user.email = email_change.new_email
EmailChange.delete(email_change.id)
ResetPasswordCode.filter_by(user_id=user.id).delete()
Session.commit()
LOG.i(f"User {user} has changed their email from {old_email} to {user.email}")
flash("Your new email has been updated", "success")
login_user(user)

View File

@ -62,7 +62,7 @@ def fido():
browser = MfaBrowser.get_by(token=request.cookies.get("mfa"))
if browser and not browser.is_expired() and browser.user_id == user.id:
login_user(user)
flash(f"Welcome back!", "success")
flash("Welcome back!", "success")
# Redirect user to correct page
return redirect(next_url or url_for("dashboard.index"))
else:
@ -110,7 +110,7 @@ def fido():
session["sudo_time"] = int(time())
login_user(user)
flash(f"Welcome back!", "success")
flash("Welcome back!", "success")
# Redirect user to correct page
response = make_response(redirect(next_url or url_for("dashboard.index")))

View File

@ -3,7 +3,7 @@ from flask_wtf import FlaskForm
from wtforms import StringField, validators
from app.auth.base import auth_bp
from app.dashboard.views.setting import send_reset_password_email
from app.dashboard.views.account_setting import send_reset_password_email
from app.extensions import limiter
from app.log import LOG
from app.models import User

View File

@ -7,7 +7,7 @@ from app.config import URL, GOOGLE_CLIENT_ID, GOOGLE_CLIENT_SECRET
from app.db import Session
from app.log import LOG
from app.models import User, File, SocialAuth
from app.utils import random_string, sanitize_email
from app.utils import random_string, sanitize_email, sanitize_next_url
from .login_utils import after_login
_authorization_base_url = "https://accounts.google.com/o/oauth2/v2/auth"
@ -29,7 +29,7 @@ def google_login():
# to avoid flask-login displaying the login error message
session.pop("_flashes", None)
next_url = request.args.get("next")
next_url = sanitize_next_url(request.args.get("next"))
# Google does not allow to append param to redirect_url
# we need to pass the next url by session

View File

@ -5,7 +5,7 @@ from wtforms import StringField, validators
from app.auth.base import auth_bp
from app.auth.views.login_utils import after_login
from app.config import CONNECT_WITH_PROTON
from app.config import CONNECT_WITH_PROTON, CONNECT_WITH_OIDC_ICON, OIDC_CLIENT_ID
from app.events.auth_event import LoginEvent
from app.extensions import limiter
from app.log import LOG
@ -54,6 +54,12 @@ def login():
"error",
)
LoginEvent(LoginEvent.ActionType.disabled_login).send()
elif user.delete_on is not None:
flash(
f"Your account is scheduled to be deleted on {user.delete_on}",
"error",
)
LoginEvent(LoginEvent.ActionType.scheduled_to_be_deleted).send()
elif not user.activated:
show_resend_activation = True
flash(
@ -71,4 +77,6 @@ def login():
next_url=next_url,
show_resend_activation=show_resend_activation,
connect_with_proton=CONNECT_WITH_PROTON,
connect_with_oidc=OIDC_CLIENT_ID is not None,
connect_with_oidc_icon=CONNECT_WITH_OIDC_ICON,
)

View File

@ -55,7 +55,7 @@ def mfa():
browser = MfaBrowser.get_by(token=request.cookies.get("mfa"))
if browser and not browser.is_expired() and browser.user_id == user.id:
login_user(user)
flash(f"Welcome back!", "success")
flash("Welcome back!", "success")
# Redirect user to correct page
return redirect(next_url or url_for("dashboard.index"))
else:
@ -73,7 +73,7 @@ def mfa():
Session.commit()
login_user(user)
flash(f"Welcome back!", "success")
flash("Welcome back!", "success")
# Redirect user to correct page
response = make_response(redirect(next_url or url_for("dashboard.index")))

app/app/auth/views/oidc.py Normal file
View File

@ -0,0 +1,135 @@
from flask import request, session, redirect, flash, url_for
from requests_oauthlib import OAuth2Session
import requests
from app import config
from app.auth.base import auth_bp
from app.auth.views.login_utils import after_login
from app.config import (
URL,
OIDC_SCOPES,
OIDC_NAME_FIELD,
)
from app.db import Session
from app.email_utils import send_welcome_email
from app.log import LOG
from app.models import User, SocialAuth
from app.utils import sanitize_email, sanitize_next_url
# need to set explicitly redirect_uri instead of leaving the lib to pre-fill redirect_uri
# when served behind nginx, the redirect_uri is localhost... and not the real url
redirect_uri = URL + "/auth/oidc/callback"
SESSION_STATE_KEY = "oauth_state"
SESSION_NEXT_KEY = "oauth_redirect_next"
@auth_bp.route("/oidc/login")
def oidc_login():
if config.OIDC_CLIENT_ID is None or config.OIDC_CLIENT_SECRET is None:
return redirect(url_for("auth.login"))
next_url = sanitize_next_url(request.args.get("next"))
auth_url = requests.get(config.OIDC_WELL_KNOWN_URL).json()["authorization_endpoint"]
oidc = OAuth2Session(
config.OIDC_CLIENT_ID, scope=[OIDC_SCOPES], redirect_uri=redirect_uri
)
authorization_url, state = oidc.authorization_url(auth_url)
# State is used to prevent CSRF, keep this for later.
session[SESSION_STATE_KEY] = state
session[SESSION_NEXT_KEY] = next_url
return redirect(authorization_url)
@auth_bp.route("/oidc/callback")
def oidc_callback():
if SESSION_STATE_KEY not in session:
flash("Invalid state, please retry", "error")
return redirect(url_for("auth.login"))
if config.OIDC_CLIENT_ID is None or config.OIDC_CLIENT_SECRET is None:
return redirect(url_for("auth.login"))
# user clicks on cancel
if "error" in request.args:
flash("Please use another sign in method then", "warning")
return redirect("/")
oidc_configuration = requests.get(config.OIDC_WELL_KNOWN_URL).json()
user_info_url = oidc_configuration["userinfo_endpoint"]
token_url = oidc_configuration["token_endpoint"]
oidc = OAuth2Session(
config.OIDC_CLIENT_ID,
state=session[SESSION_STATE_KEY],
scope=[OIDC_SCOPES],
redirect_uri=redirect_uri,
)
oidc.fetch_token(
token_url,
client_secret=config.OIDC_CLIENT_SECRET,
authorization_response=request.url,
)
oidc_user_data = oidc.get(user_info_url)
if oidc_user_data.status_code != 200:
LOG.e(
f"cannot get oidc user data {oidc_user_data.status_code} {oidc_user_data.text}"
)
flash(
"Cannot get user data from OIDC, please use another way to login/sign up",
"error",
)
return redirect(url_for("auth.login"))
oidc_user_data = oidc_user_data.json()
email = oidc_user_data.get("email")
if not email:
LOG.e(f"cannot get email for OIDC user {oidc_user_data} {email}")
flash(
"Cannot get a valid email from OIDC, please another way to login/sign up",
"error",
)
return redirect(url_for("auth.login"))
email = sanitize_email(email)
user = User.get_by(email=email)
if not user and config.DISABLE_REGISTRATION:
flash(
"Sorry you cannot sign up via the OIDC provider. Please sign-up first with your email.",
"error",
)
return redirect(url_for("auth.register"))
elif not user:
user = create_user(email, oidc_user_data)
if not SocialAuth.get_by(user_id=user.id, social="oidc"):
SocialAuth.create(user_id=user.id, social="oidc")
Session.commit()
# The activation link contains the original page, for ex authorize page
next_url = session[SESSION_NEXT_KEY]
session[SESSION_NEXT_KEY] = None
return after_login(user, next_url)
def create_user(email, oidc_user_data):
new_user = User.create(
email=email,
name=oidc_user_data.get(OIDC_NAME_FIELD),
password="",
activated=True,
)
LOG.i(f"Created new user for login request from OIDC. New user {new_user.id}")
Session.commit()
send_welcome_email(new_user)
return new_user
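The new view depends on the OIDC settings introduced in `app/config.py` further down this diff. A hedged sketch of the environment one might set for a generic provider (all values are placeholders; the exact format expected for `OIDC_SCOPES` is an assumption):
```bash
# Placeholders - point these at your own OIDC provider
OIDC_WELL_KNOWN_URL=https://auth.example.com/.well-known/openid-configuration
OIDC_CLIENT_ID=simplelogin
OIDC_CLIENT_SECRET=change-me
OIDC_SCOPES="openid profile email"
OIDC_NAME_FIELD=name                 # claim used for the new user's display name
CONNECT_WITH_OIDC_ICON=fa-openid     # optional icon for the login button (assumed value)
```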

View File

@ -53,7 +53,7 @@ def recovery_route():
del session[MFA_USER_ID]
login_user(user)
flash(f"Welcome back!", "success")
flash("Welcome back!", "success")
recovery_code.used = True
recovery_code.used_at = arrow.now()

View File

@ -6,7 +6,7 @@ from wtforms import StringField, validators
from app import email_utils, config
from app.auth.base import auth_bp
from app.config import CONNECT_WITH_PROTON
from app.config import CONNECT_WITH_PROTON, CONNECT_WITH_OIDC_ICON
from app.auth.views.login_utils import get_referral
from app.config import URL, HCAPTCHA_SECRET, HCAPTCHA_SITEKEY
from app.db import Session
@ -94,9 +94,7 @@ def register():
try:
send_activation_email(user, next_url)
RegisterEvent(RegisterEvent.ActionType.success).send()
DailyMetric.get_or_create_today_metric().nb_new_web_non_proton_user += (
1
)
DailyMetric.get_or_create_today_metric().nb_new_web_non_proton_user += 1
Session.commit()
except Exception:
flash("Invalid email, are you sure the email is correct?", "error")
@ -111,6 +109,8 @@ def register():
next_url=next_url,
HCAPTCHA_SITEKEY=HCAPTCHA_SITEKEY,
connect_with_proton=CONNECT_WITH_PROTON,
connect_with_oidc=config.OIDC_CLIENT_ID is not None,
connect_with_oidc_icon=CONNECT_WITH_OIDC_ICON,
)
@ -125,4 +125,4 @@ def send_activation_email(user, next_url):
LOG.d("redirect user to %s after activation", next_url)
activation_link = activation_link + "&next=" + encode_url(next_url)
email_utils.send_activation_email(user.email, activation_link)
email_utils.send_activation_email(user, activation_link)

View File

@ -120,7 +120,7 @@ if POSTFIX_SUBMISSION_TLS:
else:
default_postfix_port = 25
POSTFIX_PORT = int(os.environ.get("POSTFIX_PORT", default_postfix_port))
POSTFIX_TIMEOUT = os.environ.get("POSTFIX_TIMEOUT", 3)
POSTFIX_TIMEOUT = int(os.environ.get("POSTFIX_TIMEOUT", 3))
# ["domain1.com", "domain2.com"]
OTHER_ALIAS_DOMAINS = sl_getenv("OTHER_ALIAS_DOMAINS", list)
@ -179,6 +179,7 @@ AWS_REGION = os.environ.get("AWS_REGION") or "eu-west-3"
BUCKET = os.environ.get("BUCKET")
AWS_ACCESS_KEY_ID = os.environ.get("AWS_ACCESS_KEY_ID")
AWS_SECRET_ACCESS_KEY = os.environ.get("AWS_SECRET_ACCESS_KEY")
AWS_ENDPOINT_URL = os.environ.get("AWS_ENDPOINT_URL", None)
# Paddle
try:
@ -233,7 +234,7 @@ else:
print("WARNING: Use a temp directory for GNUPGHOME", GNUPGHOME)
# Github, Google, Facebook client id and secrets
# Github, Google, Facebook, OIDC client id and secrets
GITHUB_CLIENT_ID = os.environ.get("GITHUB_CLIENT_ID")
GITHUB_CLIENT_SECRET = os.environ.get("GITHUB_CLIENT_SECRET")
@ -243,6 +244,13 @@ GOOGLE_CLIENT_SECRET = os.environ.get("GOOGLE_CLIENT_SECRET")
FACEBOOK_CLIENT_ID = os.environ.get("FACEBOOK_CLIENT_ID")
FACEBOOK_CLIENT_SECRET = os.environ.get("FACEBOOK_CLIENT_SECRET")
CONNECT_WITH_OIDC_ICON = os.environ.get("CONNECT_WITH_OIDC_ICON")
OIDC_WELL_KNOWN_URL = os.environ.get("OIDC_WELL_KNOWN_URL")
OIDC_CLIENT_ID = os.environ.get("OIDC_CLIENT_ID")
OIDC_CLIENT_SECRET = os.environ.get("OIDC_CLIENT_SECRET")
OIDC_SCOPES = os.environ.get("OIDC_SCOPES")
OIDC_NAME_FIELD = os.environ.get("OIDC_NAME_FIELD", "name")
PROTON_CLIENT_ID = os.environ.get("PROTON_CLIENT_ID")
PROTON_CLIENT_SECRET = os.environ.get("PROTON_CLIENT_SECRET")
PROTON_BASE_URL = os.environ.get(
@ -273,6 +281,7 @@ JOB_DELETE_MAILBOX = "delete-mailbox"
JOB_DELETE_DOMAIN = "delete-domain"
JOB_SEND_USER_REPORT = "send-user-report"
JOB_SEND_PROTON_WELCOME_1 = "proton-welcome-1"
JOB_SEND_ALIAS_CREATION_EVENTS = "send-alias-creation-events"
# for pagination
PAGE_LIMIT = 20
@ -420,6 +429,11 @@ try:
except Exception:
HIBP_SCAN_INTERVAL_DAYS = 7
HIBP_API_KEYS = sl_getenv("HIBP_API_KEYS", list) or []
HIBP_MAX_ALIAS_CHECK = 10_000
HIBP_RPM = int(os.environ.get("HIBP_API_RPM", 100))
HIBP_SKIP_PARTNER_ALIAS = os.environ.get("HIBP_SKIP_PARTNER_ALIAS")
KEEP_OLD_DATA_DAYS = 30
POSTMASTER = os.environ.get("POSTMASTER")
@ -488,7 +502,34 @@ def setup_nameservers():
NAMESERVERS = setup_nameservers()
DISABLE_CREATE_CONTACTS_FOR_FREE_USERS = False
DISABLE_CREATE_CONTACTS_FOR_FREE_USERS = os.environ.get(
"DISABLE_CREATE_CONTACTS_FOR_FREE_USERS", False
)
# Expect format hits,seconds:hits,seconds...
# Example 1,10:4,60 means 1 in the last 10 secs or 4 in the last 60 secs
def getRateLimitFromConfig(
env_var: string, default: string = ""
) -> list[tuple[int, int]]:
value = os.environ.get(env_var, default)
if not value:
return []
entries = [entry for entry in value.split(":")]
limits = []
for entry in entries:
fields = entry.split(",")
limit = (int(fields[0]), int(fields[1]))
limits.append(limit)
return limits
ALIAS_CREATE_RATE_LIMIT_FREE = getRateLimitFromConfig(
"ALIAS_CREATE_RATE_LIMIT_FREE", "10,900:50,3600"
)
ALIAS_CREATE_RATE_LIMIT_PAID = getRateLimitFromConfig(
"ALIAS_CREATE_RATE_LIMIT_PAID", "50,900:200,3600"
)
PARTNER_API_TOKEN_SECRET = os.environ.get("PARTNER_API_TOKEN_SECRET") or (
FLASK_SECRET + "partnerapitoken"
)
@ -534,3 +575,16 @@ SKIP_MX_LOOKUP_ON_CHECK = False
DISABLE_RATE_LIMIT = "DISABLE_RATE_LIMIT" in os.environ
SUBSCRIPTION_CHANGE_WEBHOOK = os.environ.get("SUBSCRIPTION_CHANGE_WEBHOOK", None)
MAX_API_KEYS = int(os.environ.get("MAX_API_KEYS", 30))
UPCLOUD_USERNAME = os.environ.get("UPCLOUD_USERNAME", None)
UPCLOUD_PASSWORD = os.environ.get("UPCLOUD_PASSWORD", None)
UPCLOUD_DB_ID = os.environ.get("UPCLOUD_DB_ID", None)
STORE_TRANSACTIONAL_EMAILS = "STORE_TRANSACTIONAL_EMAILS" in os.environ
EVENT_WEBHOOK = os.environ.get("EVENT_WEBHOOK", None)
# We want it disabled by default, so only skip if defined
EVENT_WEBHOOK_SKIP_VERIFY_SSL = "EVENT_WEBHOOK_SKIP_VERIFY_SSL" in os.environ
EVENT_WEBHOOK_DISABLE = "EVENT_WEBHOOK_DISABLE" in os.environ

app/app/constants.py Normal file
View File

@ -0,0 +1 @@
HEADER_ALLOW_API_COOKIES = "X-Sl-Allowcookies"

View File

@ -32,4 +32,42 @@ from .views import (
delete_account,
notification,
support,
account_setting,
)
__all__ = [
"index",
"pricing",
"setting",
"custom_alias",
"subdomain",
"billing",
"alias_log",
"alias_export",
"unsubscribe",
"api_key",
"custom_domain",
"alias_contact_manager",
"enter_sudo",
"mfa_setup",
"mfa_cancel",
"fido_setup",
"coupon",
"fido_manage",
"domain_detail",
"lifetime_licence",
"directory",
"mailbox",
"mailbox_detail",
"refused_email",
"referral",
"contact_detail",
"setup_done",
"batch_import",
"alias_transfer",
"app",
"delete_account",
"notification",
"support",
"account_setting",
]

View File

@ -0,0 +1,242 @@
import arrow
from flask import (
render_template,
request,
redirect,
url_for,
flash,
)
from flask_login import login_required, current_user
from app import email_utils
from app.config import (
URL,
FIRST_ALIAS_DOMAIN,
ALIAS_RANDOM_SUFFIX_LENGTH,
CONNECT_WITH_PROTON,
)
from app.dashboard.base import dashboard_bp
from app.dashboard.views.enter_sudo import sudo_required
from app.dashboard.views.mailbox_detail import ChangeEmailForm
from app.db import Session
from app.email_utils import (
email_can_be_used_as_mailbox,
personal_email_already_used,
)
from app.extensions import limiter
from app.jobs.export_user_data_job import ExportUserDataJob
from app.log import LOG
from app.models import (
BlockBehaviourEnum,
PlanEnum,
ResetPasswordCode,
EmailChange,
User,
Alias,
AliasGeneratorEnum,
SenderFormatEnum,
UnsubscribeBehaviourEnum,
)
from app.proton.utils import perform_proton_account_unlink
from app.utils import (
random_string,
CSRFValidationForm,
canonicalize_email,
)
@dashboard_bp.route("/account_setting", methods=["GET", "POST"])
@login_required
@sudo_required
@limiter.limit("5/minute", methods=["POST"])
def account_setting():
change_email_form = ChangeEmailForm()
csrf_form = CSRFValidationForm()
email_change = EmailChange.get_by(user_id=current_user.id)
if email_change:
pending_email = email_change.new_email
else:
pending_email = None
if request.method == "POST":
if not csrf_form.validate():
flash("Invalid request", "warning")
return redirect(url_for("dashboard.setting"))
if request.form.get("form-name") == "update-email":
if change_email_form.validate():
# whether user can proceed with the email update
new_email_valid = True
new_email = canonicalize_email(change_email_form.email.data)
if new_email != current_user.email and not pending_email:
# check if this email is not already used
if personal_email_already_used(new_email) or Alias.get_by(
email=new_email
):
flash(f"Email {new_email} already used", "error")
new_email_valid = False
elif not email_can_be_used_as_mailbox(new_email):
flash(
"You cannot use this email address as your personal inbox.",
"error",
)
new_email_valid = False
# a pending email change with the same email exists from another user
elif EmailChange.get_by(new_email=new_email):
other_email_change: EmailChange = EmailChange.get_by(
new_email=new_email
)
LOG.w(
"Another user has a pending %s with the same email address. Current user:%s",
other_email_change,
current_user,
)
if other_email_change.is_expired():
LOG.d(
"delete the expired email change %s", other_email_change
)
EmailChange.delete(other_email_change.id)
Session.commit()
else:
flash(
"You cannot use this email address as your personal inbox.",
"error",
)
new_email_valid = False
if new_email_valid:
email_change = EmailChange.create(
user_id=current_user.id,
code=random_string(
60
), # todo: make sure the code is unique
new_email=new_email,
)
Session.commit()
send_change_email_confirmation(current_user, email_change)
flash(
"A confirmation email is on the way, please check your inbox",
"success",
)
return redirect(url_for("dashboard.account_setting"))
elif request.form.get("form-name") == "change-password":
flash(
"You are going to receive an email containing instructions to change your password",
"success",
)
send_reset_password_email(current_user)
return redirect(url_for("dashboard.account_setting"))
elif request.form.get("form-name") == "send-full-user-report":
if ExportUserDataJob(current_user).store_job_in_db():
flash(
"You will receive your SimpleLogin data via email shortly",
"success",
)
else:
flash("An export of your data is currently in progress", "error")
partner_sub = None
partner_name = None
return render_template(
"dashboard/account_setting.html",
csrf_form=csrf_form,
PlanEnum=PlanEnum,
SenderFormatEnum=SenderFormatEnum,
BlockBehaviourEnum=BlockBehaviourEnum,
change_email_form=change_email_form,
pending_email=pending_email,
AliasGeneratorEnum=AliasGeneratorEnum,
UnsubscribeBehaviourEnum=UnsubscribeBehaviourEnum,
partner_sub=partner_sub,
partner_name=partner_name,
FIRST_ALIAS_DOMAIN=FIRST_ALIAS_DOMAIN,
ALIAS_RAND_SUFFIX_LENGTH=ALIAS_RANDOM_SUFFIX_LENGTH,
connect_with_proton=CONNECT_WITH_PROTON,
)
def send_reset_password_email(user):
"""
generate a new ResetPasswordCode and send it over email to user
"""
# the activation code is valid for 1h
reset_password_code = ResetPasswordCode.create(
user_id=user.id, code=random_string(60)
)
Session.commit()
reset_password_link = f"{URL}/auth/reset_password?code={reset_password_code.code}"
email_utils.send_reset_password_email(user, reset_password_link)
def send_change_email_confirmation(user: User, email_change: EmailChange):
"""
send confirmation email to the new email address
"""
link = f"{URL}/auth/change_email?code={email_change.code}"
email_utils.send_change_email(user, email_change.new_email, link)
@dashboard_bp.route("/resend_email_change", methods=["GET", "POST"])
@limiter.limit("5/hour")
@login_required
@sudo_required
def resend_email_change():
form = CSRFValidationForm()
if not form.validate():
flash("Invalid request. Please try again", "warning")
return redirect(url_for("dashboard.setting"))
email_change = EmailChange.get_by(user_id=current_user.id)
if email_change:
# extend email change expiration
email_change.expired = arrow.now().shift(hours=12)
Session.commit()
send_change_email_confirmation(current_user, email_change)
flash("A confirmation email is on the way, please check your inbox", "success")
return redirect(url_for("dashboard.setting"))
else:
flash(
"You have no pending email change. Redirect back to Setting page", "warning"
)
return redirect(url_for("dashboard.setting"))
@dashboard_bp.route("/cancel_email_change", methods=["GET", "POST"])
@login_required
@sudo_required
def cancel_email_change():
form = CSRFValidationForm()
if not form.validate():
flash("Invalid request. Please try again", "warning")
return redirect(url_for("dashboard.setting"))
email_change = EmailChange.get_by(user_id=current_user.id)
if email_change:
EmailChange.delete(email_change.id)
Session.commit()
flash("Your email change is cancelled", "success")
return redirect(url_for("dashboard.setting"))
else:
flash(
"You have no pending email change. Redirect back to Setting page", "warning"
)
return redirect(url_for("dashboard.setting"))
@dashboard_bp.route("/unlink_proton_account", methods=["POST"])
@login_required
@sudo_required
def unlink_proton_account():
csrf_form = CSRFValidationForm()
if not csrf_form.validate():
flash("Invalid request", "warning")
return redirect(url_for("dashboard.setting"))
perform_proton_account_unlink(current_user)
flash("Your Proton account has been unlinked", "success")
return redirect(url_for("dashboard.setting"))

View File

@ -13,10 +13,10 @@ from app import config, parallel_limiter
from app.dashboard.base import dashboard_bp
from app.db import Session
from app.email_utils import (
is_valid_email,
generate_reply_email,
parse_full_address,
)
from app.email_validation import is_valid_email
from app.errors import (
CannotCreateContactForReverseAlias,
ErrContactErrorUpgradeNeeded,
@ -51,14 +51,6 @@ def email_validator():
return _check
def user_can_create_contacts(user: User) -> bool:
if user.is_premium():
return True
if user.flags & User.FLAG_FREE_DISABLE_CREATE_ALIAS == 0:
return True
return not config.DISABLE_CREATE_CONTACTS_FOR_FREE_USERS
def create_contact(user: User, alias: Alias, contact_address: str) -> Contact:
"""
Create a contact for a user. Can be restricted for new free users by enabling DISABLE_CREATE_CONTACTS_FOR_FREE_USERS.
@ -82,7 +74,7 @@ def create_contact(user: User, alias: Alias, contact_address: str) -> Contact:
if contact:
raise ErrContactAlreadyExists(contact)
if not user_can_create_contacts(user):
if not user.can_create_contacts():
raise ErrContactErrorUpgradeNeeded()
contact = Contact.create(
@ -327,6 +319,6 @@ def alias_contact_manager(alias_id):
last_page=last_page,
query=query,
nb_contact=nb_contact,
can_create_contacts=user_can_create_contacts(current_user),
can_create_contacts=current_user.can_create_contacts(),
csrf_form=csrf_form,
)

View File

@ -1,9 +1,13 @@
from app.dashboard.base import dashboard_bp
from flask_login import login_required, current_user
from app.alias_utils import alias_export_csv
from app.dashboard.views.enter_sudo import sudo_required
from app.extensions import limiter
@dashboard_bp.route("/alias_export", methods=["GET"])
@login_required
@sudo_required
@limiter.limit("2/minute")
def alias_export_route():
return alias_export_csv(current_user)

View File

@ -87,6 +87,6 @@ def get_alias_log(alias: Alias, page_id=0) -> [AliasLog]:
contact=contact,
)
logs.append(al)
logs = sorted(logs, key=lambda l: l.when, reverse=True)
logs = sorted(logs, key=lambda log: log.when, reverse=True)
return logs

View File

@ -7,79 +7,19 @@ from flask import render_template, redirect, url_for, flash, request
from flask_login import login_required, current_user
from app import config
from app.alias_utils import transfer_alias
from app.dashboard.base import dashboard_bp
from app.dashboard.views.enter_sudo import sudo_required
from app.db import Session
from app.email_utils import send_email, render
from app.extensions import limiter
from app.log import LOG
from app.models import (
Alias,
Contact,
AliasUsedOn,
AliasMailbox,
User,
ClientUser,
)
from app.models import Mailbox
from app.utils import CSRFValidationForm
def transfer(alias, new_user, new_mailboxes: [Mailbox]):
# cannot transfer alias which is used for receiving newsletter
if User.get_by(newsletter_alias_id=alias.id):
raise Exception("Cannot transfer alias that's used to receive newsletter")
# update user_id
Session.query(Contact).filter(Contact.alias_id == alias.id).update(
{"user_id": new_user.id}
)
Session.query(AliasUsedOn).filter(AliasUsedOn.alias_id == alias.id).update(
{"user_id": new_user.id}
)
Session.query(ClientUser).filter(ClientUser.alias_id == alias.id).update(
{"user_id": new_user.id}
)
# remove existing mailboxes from the alias
Session.query(AliasMailbox).filter(AliasMailbox.alias_id == alias.id).delete()
# set mailboxes
alias.mailbox_id = new_mailboxes.pop().id
for mb in new_mailboxes:
AliasMailbox.create(alias_id=alias.id, mailbox_id=mb.id)
# alias has never been transferred before
if not alias.original_owner_id:
alias.original_owner_id = alias.user_id
# inform previous owner
old_user = alias.user
send_email(
old_user.email,
f"Alias {alias.email} has been received",
render(
"transactional/alias-transferred.txt",
alias=alias,
),
render(
"transactional/alias-transferred.html",
alias=alias,
),
)
# now the alias belongs to the new user
alias.user_id = new_user.id
# set some fields back to default
alias.disable_pgp = False
alias.pinned = False
Session.commit()
def hmac_alias_transfer_token(transfer_token: str) -> str:
alias_hmac = hmac.new(
config.ALIAS_TRANSFER_TOKEN_SECRET.encode("utf-8"),
@ -214,7 +154,7 @@ def alias_transfer_receive_route():
mailboxes,
token,
)
transfer(alias, current_user, mailboxes)
transfer_alias(alias, current_user, mailboxes)
# reset transfer token
alias.transfer_token = None

View File

@ -3,9 +3,11 @@ from flask_login import login_required, current_user
from flask_wtf import FlaskForm
from wtforms import StringField, validators
from app import config
from app.dashboard.base import dashboard_bp
from app.dashboard.views.enter_sudo import sudo_required
from app.db import Session
from app.extensions import limiter
from app.models import ApiKey
from app.utils import CSRFValidationForm
@ -14,9 +16,34 @@ class NewApiKeyForm(FlaskForm):
name = StringField("Name", validators=[validators.DataRequired()])
def clean_up_unused_or_old_api_keys(user_id: int):
total_keys = ApiKey.filter_by(user_id=user_id).count()
if total_keys <= config.MAX_API_KEYS:
return
# Remove oldest unused
for api_key in (
ApiKey.filter_by(user_id=user_id, last_used=None)
.order_by(ApiKey.created_at.asc())
.all()
):
Session.delete(api_key)
total_keys -= 1
if total_keys <= config.MAX_API_KEYS:
return
# Clean up oldest used
for api_key in (
ApiKey.filter_by(user_id=user_id).order_by(ApiKey.last_used.asc()).all()
):
Session.delete(api_key)
total_keys -= 1
if total_keys <= config.MAX_API_KEYS:
return
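# Illustration of the cleanup policy above (hypothetical numbers, not part of the change):
# with config.MAX_API_KEYS = 2 and three keys belonging to one user (an old key that was
# never used plus two keys that were used recently), the never-used key is deleted first
# (unused keys go first, oldest created_at first); the count then drops to 2, so the two
# recently used keys are kept and the function returns early.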
@dashboard_bp.route("/api_key", methods=["GET", "POST"])
@login_required
@sudo_required
@limiter.limit("10/hour")
def api_key():
api_keys = (
ApiKey.filter(ApiKey.user_id == current_user.id)
@ -50,6 +77,7 @@ def api_key():
elif request.form.get("form-name") == "create":
if new_api_key_form.validate():
clean_up_unused_or_old_api_keys(current_user.id)
new_api_key = ApiKey.create(
name=new_api_key_form.name.data, user_id=current_user.id
)

View File

@ -1,14 +1,9 @@
from app.db import Session
"""
List of apps that user has used via the "Sign in with SimpleLogin"
"""
from flask import render_template, request, flash, redirect
from flask_login import login_required, current_user
from sqlalchemy.orm import joinedload
from app.dashboard.base import dashboard_bp
from app.db import Session
from app.models import (
ClientUser,
)
@ -17,6 +12,10 @@ from app.models import (
@dashboard_bp.route("/app", methods=["GET", "POST"])
@login_required
def app_route():
"""
List of apps that user has used via the "Sign in with SimpleLogin"
"""
client_users = (
ClientUser.filter_by(user_id=current_user.id)
.options(joinedload(ClientUser.client))

View File

@ -5,7 +5,9 @@ from flask_login import login_required, current_user
from app import s3
from app.config import JOB_BATCH_IMPORT
from app.dashboard.base import dashboard_bp
from app.dashboard.views.enter_sudo import sudo_required
from app.db import Session
from app.extensions import limiter
from app.log import LOG
from app.models import File, BatchImport, Job
from app.utils import random_string, CSRFValidationForm
@ -13,6 +15,8 @@ from app.utils import random_string, CSRFValidationForm
@dashboard_bp.route("/batch_import", methods=["GET", "POST"])
@login_required
@sudo_required
@limiter.limit("10/minute", methods=["POST"])
def batch_import_route():
# only for users who have custom domains
if not current_user.verified_custom_domains():
@ -37,7 +41,7 @@ def batch_import_route():
return redirect(request.url)
if len(batch_imports) > 10:
flash(
"You have too many imports already. Wait until some get cleaned up",
"You have too many imports already. Please wait until some get cleaned up",
"error",
)
return render_template(

View File

@ -100,7 +100,7 @@ def coupon_route():
commit=True,
)
flash(
f"Your account has been upgraded to Premium, thanks for your support!",
"Your account has been upgraded to Premium, thanks for your support!",
"success",
)

View File

@ -24,6 +24,7 @@ from app.models import (
AliasMailbox,
DomainDeletedAlias,
)
from app.utils import CSRFValidationForm
@dashboard_bp.route("/custom_alias", methods=["GET", "POST"])
@ -48,9 +49,13 @@ def custom_alias():
at_least_a_premium_domain = True
break
csrf_form = CSRFValidationForm()
mailboxes = current_user.mailboxes()
if request.method == "POST":
if not csrf_form.validate():
flash("Invalid request", "warning")
return redirect(request.url)
alias_prefix = request.form.get("prefix").strip().lower().replace(" ", "")
signed_alias_suffix = request.form.get("signed-alias-suffix")
mailbox_ids = request.form.getlist("mailboxes")
@ -164,4 +169,5 @@ def custom_alias():
alias_suffixes=alias_suffixes,
at_least_a_premium_domain=at_least_a_premium_domain,
mailboxes=mailboxes,
csrf_form=csrf_form,
)

View File

@ -67,7 +67,7 @@ def directory():
if request.method == "POST":
if request.form.get("form-name") == "delete":
if not delete_dir_form.validate():
flash(f"Invalid request", "warning")
flash("Invalid request", "warning")
return redirect(url_for("dashboard.directory"))
dir_obj = Directory.get(delete_dir_form.directory_id.data)
@ -87,7 +87,7 @@ def directory():
if request.form.get("form-name") == "toggle-directory":
if not toggle_dir_form.validate():
flash(f"Invalid request", "warning")
flash("Invalid request", "warning")
return redirect(url_for("dashboard.directory"))
dir_id = toggle_dir_form.directory_id.data
dir_obj = Directory.get(dir_id)
@ -109,7 +109,7 @@ def directory():
elif request.form.get("form-name") == "update":
if not update_dir_form.validate():
flash(f"Invalid request", "warning")
flash("Invalid request", "warning")
return redirect(url_for("dashboard.directory"))
dir_id = update_dir_form.directory_id.data
dir_obj = Directory.get(dir_id)

View File

@ -6,14 +6,15 @@ from flask_login import login_required, current_user
from flask_wtf import FlaskForm
from wtforms import PasswordField, validators
from app.config import CONNECT_WITH_PROTON
from app.config import CONNECT_WITH_PROTON, OIDC_CLIENT_ID, CONNECT_WITH_OIDC_ICON
from app.dashboard.base import dashboard_bp
from app.extensions import limiter
from app.log import LOG
from app.models import PartnerUser
from app.models import PartnerUser, SocialAuth
from app.proton.utils import get_proton_partner
from app.utils import sanitize_next_url
_SUDO_GAP = 900
_SUDO_GAP = 120
class LoginForm(FlaskForm):
@ -21,6 +22,7 @@ class LoginForm(FlaskForm):
@dashboard_bp.route("/enter_sudo", methods=["GET", "POST"])
@limiter.limit("3/minute")
@login_required
def enter_sudo():
password_check_form = LoginForm()
@ -49,11 +51,19 @@ def enter_sudo():
if not partner_user or partner_user.partner_id != get_proton_partner().id:
proton_enabled = False
oidc_enabled = OIDC_CLIENT_ID is not None
if oidc_enabled:
oidc_enabled = (
SocialAuth.get_by(user_id=current_user.id, social="oidc") is not None
)
return render_template(
"dashboard/enter_sudo.html",
password_check_form=password_check_form,
next=request.args.get("next"),
connect_with_proton=proton_enabled,
connect_with_oidc=oidc_enabled,
connect_with_oidc_icon=CONNECT_WITH_OIDC_ICON,
)

View File

@ -12,6 +12,7 @@ from app.extensions import limiter
from app.log import LOG
from app.models import (
Alias,
AliasDeleteReason,
AliasGeneratorEnum,
User,
EmailLog,
@ -52,12 +53,13 @@ def get_stats(user: User) -> Stats:
@dashboard_bp.route("/", methods=["GET", "POST"])
@login_required
@limiter.limit(
ALIAS_LIMIT,
methods=["POST"],
exempt_when=lambda: request.form.get("form-name") != "create-random-email",
)
@login_required
@limiter.limit("10/minute", methods=["GET"], key_func=lambda: current_user.id)
@parallel_limiter.lock(
name="alias_creation",
only_when=lambda: request.form.get("form-name") == "create-random-email",
@ -140,12 +142,14 @@ def index():
)
if request.form.get("form-name") == "delete-alias":
LOG.d("delete alias %s", alias)
LOG.i(f"User {current_user} requested deletion of alias {alias}")
email = alias.email
alias_utils.delete_alias(alias, current_user)
alias_utils.delete_alias(
alias, current_user, AliasDeleteReason.ManualAction
)
flash(f"Alias {email} has been deleted", "success")
elif request.form.get("form-name") == "disable-alias":
alias.enabled = False
alias_utils.change_alias_status(alias, enabled=False)
Session.commit()
flash(f"Alias {alias.email} has been disabled", "success")

View File

@ -19,8 +19,8 @@ from app.email_utils import (
mailbox_already_used,
render,
send_email,
is_valid_email,
)
from app.email_validation import is_valid_email
from app.log import LOG
from app.models import Mailbox, Job
from app.utils import CSRFValidationForm

View File

@ -11,9 +11,11 @@ from wtforms.fields.html5 import EmailField
from app.config import ENFORCE_SPF, MAILBOX_SECRET
from app.config import URL
from app.dashboard.base import dashboard_bp
from app.dashboard.views.enter_sudo import sudo_required
from app.db import Session
from app.email_utils import email_can_be_used_as_mailbox
from app.email_utils import mailbox_already_used, render, send_email
from app.extensions import limiter
from app.log import LOG
from app.models import Alias, AuthorizedAddress
from app.models import Mailbox
@ -29,8 +31,10 @@ class ChangeEmailForm(FlaskForm):
@dashboard_bp.route("/mailbox/<int:mailbox_id>/", methods=["GET", "POST"])
@login_required
@sudo_required
@limiter.limit("20/minute", methods=["POST"])
def mailbox_detail_route(mailbox_id):
mailbox = Mailbox.get(mailbox_id)
mailbox: Mailbox = Mailbox.get(mailbox_id)
if not mailbox or mailbox.user_id != current_user.id:
flash("You cannot see this page", "warning")
return redirect(url_for("dashboard.index"))
@ -144,6 +148,15 @@ def mailbox_detail_route(mailbox_id):
url_for("dashboard.mailbox_detail_route", mailbox_id=mailbox_id)
)
if mailbox.is_proton():
flash(
"Enabling PGP for a Proton Mail mailbox is redundant and does not add any security benefit",
"info",
)
return redirect(
url_for("dashboard.mailbox_detail_route", mailbox_id=mailbox_id)
)
mailbox.pgp_public_key = request.form.get("pgp")
try:
mailbox.pgp_finger_print = load_public_key_and_check(
@ -170,8 +183,15 @@ def mailbox_detail_route(mailbox_id):
elif request.form.get("form-name") == "toggle-pgp":
if request.form.get("pgp-enabled") == "on":
mailbox.disable_pgp = False
flash(f"PGP is enabled on {mailbox.email}", "success")
if mailbox.is_proton():
mailbox.disable_pgp = True
flash(
"Enabling PGP for a Proton Mail mailbox is redundant and does not add any security benefit",
"info",
)
else:
mailbox.disable_pgp = False
flash(f"PGP is enabled on {mailbox.email}", "info")
else:
mailbox.disable_pgp = True
flash(f"PGP is disabled on {mailbox.email}", "info")
@ -182,25 +202,16 @@ def mailbox_detail_route(mailbox_id):
)
elif request.form.get("form-name") == "generic-subject":
if request.form.get("action") == "save":
if not mailbox.pgp_enabled():
flash(
"Generic subject can only be used on PGP-enabled mailbox",
"error",
)
return redirect(
url_for("dashboard.mailbox_detail_route", mailbox_id=mailbox_id)
)
mailbox.generic_subject = request.form.get("generic-subject")
Session.commit()
flash("Generic subject for PGP-encrypted email is enabled", "success")
flash("Generic subject is enabled", "success")
return redirect(
url_for("dashboard.mailbox_detail_route", mailbox_id=mailbox_id)
)
elif request.form.get("action") == "remove":
mailbox.generic_subject = None
Session.commit()
flash("Generic subject for PGP-encrypted email is disabled", "success")
flash("Generic subject is disabled", "success")
return redirect(
url_for("dashboard.mailbox_detail_route", mailbox_id=mailbox_id)
)

View File

@ -13,34 +13,24 @@ from flask_login import login_required, current_user
from flask_wtf import FlaskForm
from flask_wtf.file import FileField
from wtforms import StringField, validators
from wtforms.fields.html5 import EmailField
from app import s3, email_utils
from app import s3
from app.config import (
URL,
FIRST_ALIAS_DOMAIN,
ALIAS_RANDOM_SUFFIX_LENGTH,
CONNECT_WITH_PROTON,
)
from app.dashboard.base import dashboard_bp
from app.db import Session
from app.email_utils import (
email_can_be_used_as_mailbox,
personal_email_already_used,
)
from app.errors import ProtonPartnerNotSetUp
from app.extensions import limiter
from app.image_validation import detect_image_format, ImageFormat
from app.jobs.export_user_data_job import ExportUserDataJob
from app.log import LOG
from app.models import (
BlockBehaviourEnum,
PlanEnum,
File,
ResetPasswordCode,
EmailChange,
User,
Alias,
CustomDomain,
AliasGeneratorEnum,
AliasSuffixEnum,
@ -53,11 +43,10 @@ from app.models import (
PartnerSubscription,
UnsubscribeBehaviourEnum,
)
from app.proton.utils import get_proton_partner, perform_proton_account_unlink
from app.proton.utils import get_proton_partner
from app.utils import (
random_string,
CSRFValidationForm,
canonicalize_email,
)
@ -66,12 +55,6 @@ class SettingForm(FlaskForm):
profile_picture = FileField("Profile Picture")
class ChangeEmailForm(FlaskForm):
email = EmailField(
"email", validators=[validators.DataRequired(), validators.Email()]
)
class PromoCodeForm(FlaskForm):
code = StringField("Name", validators=[validators.DataRequired()])
@ -109,7 +92,6 @@ def get_partner_subscription_and_name(
def setting():
form = SettingForm()
promo_form = PromoCodeForm()
change_email_form = ChangeEmailForm()
csrf_form = CSRFValidationForm()
email_change = EmailChange.get_by(user_id=current_user.id)
@ -122,64 +104,7 @@ def setting():
if not csrf_form.validate():
flash("Invalid request", "warning")
return redirect(url_for("dashboard.setting"))
if request.form.get("form-name") == "update-email":
if change_email_form.validate():
# whether user can proceed with the email update
new_email_valid = True
new_email = canonicalize_email(change_email_form.email.data)
if new_email != current_user.email and not pending_email:
# check if this email is not already used
if personal_email_already_used(new_email) or Alias.get_by(
email=new_email
):
flash(f"Email {new_email} already used", "error")
new_email_valid = False
elif not email_can_be_used_as_mailbox(new_email):
flash(
"You cannot use this email address as your personal inbox.",
"error",
)
new_email_valid = False
# a pending email change with the same email exists from another user
elif EmailChange.get_by(new_email=new_email):
other_email_change: EmailChange = EmailChange.get_by(
new_email=new_email
)
LOG.w(
"Another user has a pending %s with the same email address. Current user:%s",
other_email_change,
current_user,
)
if other_email_change.is_expired():
LOG.d(
"delete the expired email change %s", other_email_change
)
EmailChange.delete(other_email_change.id)
Session.commit()
else:
flash(
"You cannot use this email address as your personal inbox.",
"error",
)
new_email_valid = False
if new_email_valid:
email_change = EmailChange.create(
user_id=current_user.id,
code=random_string(
60
), # todo: make sure the code is unique
new_email=new_email,
)
Session.commit()
send_change_email_confirmation(current_user, email_change)
flash(
"A confirmation email is on the way, please check your inbox",
"success",
)
return redirect(url_for("dashboard.setting"))
if request.form.get("form-name") == "update-profile":
if form.validate():
profile_updated = False
@ -223,15 +148,6 @@ def setting():
if profile_updated:
flash("Your profile has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "change-password":
flash(
"You are going to receive an email containing instructions to change your password",
"success",
)
send_reset_password_email(current_user)
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "notification-preference":
choose = request.form.get("notification")
if choose == "on":
@ -241,7 +157,6 @@ def setting():
Session.commit()
flash("Your notification preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "change-alias-generator":
scheme = int(request.form.get("alias-generator-scheme"))
if AliasGeneratorEnum.has_value(scheme):
@ -249,7 +164,6 @@ def setting():
Session.commit()
flash("Your preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "change-random-alias-default-domain":
default_domain = request.form.get("random-alias-default-domain")
@ -288,7 +202,6 @@ def setting():
Session.commit()
flash("Your preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "random-alias-suffix":
scheme = int(request.form.get("random-alias-suffix-generator"))
if AliasSuffixEnum.has_value(scheme):
@ -296,7 +209,6 @@ def setting():
Session.commit()
flash("Your preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "change-sender-format":
sender_format = int(request.form.get("sender-format"))
if SenderFormatEnum.has_value(sender_format):
@ -306,7 +218,6 @@ def setting():
flash("Your sender format preference has been updated", "success")
Session.commit()
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "replace-ra":
choose = request.form.get("replace-ra")
if choose == "on":
@ -316,7 +227,21 @@ def setting():
Session.commit()
flash("Your preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "enable_data_breach_check":
if not current_user.is_premium():
flash("Only premium plan can enable data breach monitoring", "warning")
return redirect(url_for("dashboard.setting"))
choose = request.form.get("enable_data_breach_check")
if choose == "on":
LOG.i("User {current_user} has enabled data breach monitoring")
current_user.enable_data_breach_check = True
flash("Data breach monitoring is enabled", "success")
else:
LOG.i("User {current_user} has disabled data breach monitoring")
current_user.enable_data_breach_check = False
flash("Data breach monitoring is disabled", "info")
Session.commit()
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "sender-in-ra":
choose = request.form.get("enable")
if choose == "on":
@ -326,7 +251,6 @@ def setting():
Session.commit()
flash("Your preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "expand-alias-info":
choose = request.form.get("enable")
if choose == "on":
@ -388,14 +312,6 @@ def setting():
Session.commit()
flash("Your preference has been updated", "success")
return redirect(url_for("dashboard.setting"))
elif request.form.get("form-name") == "send-full-user-report":
if ExportUserDataJob(current_user).store_job_in_db():
flash(
"You will receive your SimpleLogin data via email shortly",
"success",
)
else:
flash("An export of your data is currently in progress", "error")
manual_sub = ManualSubscription.get_by(user_id=current_user.id)
apple_sub = AppleSubscription.get_by(user_id=current_user.id)
@ -418,7 +334,6 @@ def setting():
SenderFormatEnum=SenderFormatEnum,
BlockBehaviourEnum=BlockBehaviourEnum,
promo_form=promo_form,
change_email_form=change_email_form,
pending_email=pending_email,
AliasGeneratorEnum=AliasGeneratorEnum,
UnsubscribeBehaviourEnum=UnsubscribeBehaviourEnum,
@ -433,85 +348,3 @@ def setting():
connect_with_proton=CONNECT_WITH_PROTON,
proton_linked_account=proton_linked_account,
)
def send_reset_password_email(user):
"""
generate a new ResetPasswordCode and send it over email to user
"""
# the activation code is valid for 1h
reset_password_code = ResetPasswordCode.create(
user_id=user.id, code=random_string(60)
)
Session.commit()
reset_password_link = f"{URL}/auth/reset_password?code={reset_password_code.code}"
email_utils.send_reset_password_email(user.email, reset_password_link)
def send_change_email_confirmation(user: User, email_change: EmailChange):
"""
send confirmation email to the new email address
"""
link = f"{URL}/auth/change_email?code={email_change.code}"
email_utils.send_change_email(email_change.new_email, user.email, link)
@dashboard_bp.route("/resend_email_change", methods=["GET", "POST"])
@limiter.limit("5/hour")
@login_required
def resend_email_change():
form = CSRFValidationForm()
if not form.validate():
flash("Invalid request. Please try again", "warning")
return redirect(url_for("dashboard.setting"))
email_change = EmailChange.get_by(user_id=current_user.id)
if email_change:
# extend email change expiration
email_change.expired = arrow.now().shift(hours=12)
Session.commit()
send_change_email_confirmation(current_user, email_change)
flash("A confirmation email is on the way, please check your inbox", "success")
return redirect(url_for("dashboard.setting"))
else:
flash(
"You have no pending email change. Redirect back to Setting page", "warning"
)
return redirect(url_for("dashboard.setting"))
@dashboard_bp.route("/cancel_email_change", methods=["GET", "POST"])
@login_required
def cancel_email_change():
form = CSRFValidationForm()
if not form.validate():
flash("Invalid request. Please try again", "warning")
return redirect(url_for("dashboard.setting"))
email_change = EmailChange.get_by(user_id=current_user.id)
if email_change:
EmailChange.delete(email_change.id)
Session.commit()
flash("Your email change is cancelled", "success")
return redirect(url_for("dashboard.setting"))
else:
flash(
"You have no pending email change. Redirect back to Setting page", "warning"
)
return redirect(url_for("dashboard.setting"))
@dashboard_bp.route("/unlink_proton_account", methods=["POST"])
@login_required
def unlink_proton_account():
csrf_form = CSRFValidationForm()
if not csrf_form.validate():
flash("Invalid request", "warning")
return redirect(url_for("dashboard.setting"))
perform_proton_account_unlink(current_user)
flash("Your Proton account has been unlinked", "success")
return redirect(url_for("dashboard.setting"))

View File

@ -8,6 +8,7 @@ from app.db import Session
from flask import redirect, url_for, flash, request, render_template
from flask_login import login_required, current_user
from app import alias_utils
from app.dashboard.base import dashboard_bp
from app.handler.unsubscribe_encoder import UnsubscribeAction
from app.handler.unsubscribe_handler import UnsubscribeHandler
@ -31,7 +32,7 @@ def unsubscribe(alias_id):
# automatic unsubscribe, according to https://tools.ietf.org/html/rfc8058
if request.method == "POST":
alias.enabled = False
alias_utils.change_alias_status(alias, False)
flash(f"Alias {alias.email} has been blocked", "success")
Session.commit()
@ -75,12 +76,11 @@ def block_contact(contact_id):
@dashboard_bp.route("/unsubscribe/encoded/<encoded_request>", methods=["GET"])
@login_required
def encoded_unsubscribe(encoded_request: str):
unsub_data = UnsubscribeHandler().handle_unsubscribe_from_request(
current_user, encoded_request
)
if not unsub_data:
flash(f"Invalid unsubscribe request", "error")
flash("Invalid unsubscribe request", "error")
return redirect(url_for("dashboard.index"))
if unsub_data.action == UnsubscribeAction.DisableAlias:
alias = Alias.get(unsub_data.data)
@ -97,14 +97,14 @@ def encoded_unsubscribe(encoded_request: str):
)
)
if unsub_data.action == UnsubscribeAction.UnsubscribeNewsletter:
flash(f"You've unsubscribed from the newsletter", "success")
flash("You've unsubscribed from the newsletter", "success")
return redirect(
url_for(
"dashboard.index",
)
)
if unsub_data.action == UnsubscribeAction.OriginalUnsubscribeMailto:
flash(f"The original unsubscribe request has been forwarded", "success")
flash("The original unsubscribe request has been forwarded", "success")
return redirect(
url_for(
"dashboard.index",

View File

@ -1 +1,3 @@
from .views import index, new_client, client_detail
__all__ = ["index", "new_client", "client_detail"]

View File

@ -87,7 +87,7 @@ def client_detail(client_id):
)
flash(
f"Thanks for submitting, we are informed and will come back to you asap!",
"Thanks for submitting, we are informed and will come back to you asap!",
"success",
)

View File

@ -1 +1,3 @@
from .views import index
__all__ = ["index"]

View File

@ -34,7 +34,7 @@ def get_cname_record(hostname) -> Optional[str]:
def get_mx_domains(hostname) -> [(int, str)]:
"""return list of (priority, domain name).
"""return list of (priority, domain name) sorted by priority (lowest priority first)
domain name ends with a "." at the end.
"""
try:
@ -50,7 +50,7 @@ def get_mx_domains(hostname) -> [(int, str)]:
ret.append((int(parts[0]), parts[1]))
return ret
return sorted(ret, key=lambda prio_domain: prio_domain[0])
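# Illustrative only: with this change, MX answers received out of order, e.g.
# (20, "mx2.example.com.") before (10, "mx1.example.com."), are returned as
# [(10, "mx1.example.com."), (20, "mx2.example.com.")], lowest priority first.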
_include_spf = "include:"

View File

@ -21,6 +21,7 @@ LIST_UNSUBSCRIBE = "List-Unsubscribe"
LIST_UNSUBSCRIBE_POST = "List-Unsubscribe-Post"
RETURN_PATH = "Return-Path"
AUTHENTICATION_RESULTS = "Authentication-Results"
SL_QUEUE_ID = "X-SL-Queue-Id"
# headers used to DKIM sign in order of preference
DKIM_HEADERS = [

View File

@ -33,6 +33,7 @@ from flanker.addresslib import address
from flanker.addresslib.address import EmailAddress
from jinja2 import Environment, FileSystemLoader
from sqlalchemy import func
from flask_login import current_user
from app import config
from app.db import Session
@ -68,17 +69,27 @@ VERP_TIME_START = 1640995200
VERP_HMAC_ALGO = "sha3-224"
def render(template_name, **kwargs) -> str:
def render(template_name: str, user: Optional[User], **kwargs) -> str:
templates_dir = os.path.join(config.ROOT_DIR, "templates", "emails")
env = Environment(loader=FileSystemLoader(templates_dir))
template = env.get_template(template_name)
if user is None:
if current_user and current_user.is_authenticated:
user = current_user
use_partner_template = False
if user:
use_partner_template = user.has_used_alias_from_partner()
kwargs["user"] = user
return template.render(
MAX_NB_EMAIL_FREE_PLAN=config.MAX_NB_EMAIL_FREE_PLAN,
URL=config.URL,
LANDING_PAGE_URL=config.LANDING_PAGE_URL,
YEAR=arrow.now().year,
USE_PARTNER_TEMPLATE=use_partner_template,
**kwargs,
)
@ -93,7 +104,7 @@ def send_welcome_email(user):
send_email(
comm_email,
f"Welcome to SimpleLogin",
"Welcome to SimpleLogin",
render("com/welcome.txt", user=user, alias=alias),
render("com/welcome.html", user=user, alias=alias),
unsubscribe_link,
@ -104,60 +115,66 @@ def send_welcome_email(user):
def send_trial_end_soon_email(user):
send_email(
user.email,
f"Your trial will end soon",
"Your trial will end soon",
render("transactional/trial-end.txt.jinja2", user=user),
render("transactional/trial-end.html", user=user),
ignore_smtp_error=True,
)
def send_activation_email(email, activation_link):
def send_activation_email(user: User, activation_link):
send_email(
email,
f"Just one more step to join SimpleLogin",
user.email,
"Just one more step to join SimpleLogin",
render(
"transactional/activation.txt",
user=user,
activation_link=activation_link,
email=email,
email=user.email,
),
render(
"transactional/activation.html",
user=user,
activation_link=activation_link,
email=email,
email=user.email,
),
)
def send_reset_password_email(email, reset_password_link):
def send_reset_password_email(user: User, reset_password_link):
send_email(
email,
user.email,
"Reset your password on SimpleLogin",
render(
"transactional/reset-password.txt",
user=user,
reset_password_link=reset_password_link,
),
render(
"transactional/reset-password.html",
user=user,
reset_password_link=reset_password_link,
),
)
def send_change_email(new_email, current_email, link):
def send_change_email(user: User, new_email, link):
send_email(
new_email,
"Confirm email update on SimpleLogin",
render(
"transactional/change-email.txt",
user=user,
link=link,
new_email=new_email,
current_email=current_email,
current_email=user.email,
),
render(
"transactional/change-email.html",
user=user,
link=link,
new_email=new_email,
current_email=current_email,
current_email=user.email,
),
)
@ -170,28 +187,32 @@ def send_invalid_totp_login_email(user, totp_type):
"Unsuccessful attempt to login to your SimpleLogin account",
render(
"transactional/invalid-totp-login.txt",
user=user,
type=totp_type,
),
render(
"transactional/invalid-totp-login.html",
user=user,
type=totp_type,
),
1,
)
def send_test_email_alias(email, name):
def send_test_email_alias(user: User, email: str):
send_email(
email,
f"This email is sent to {email}",
render(
"transactional/test-email.txt",
name=name,
user=user,
name=user.name,
alias=email,
),
render(
"transactional/test-email.html",
name=name,
user=user,
name=user.name,
alias=email,
),
)
@ -206,11 +227,13 @@ def send_cannot_create_directory_alias(user, alias_address, directory_name):
f"Alias {alias_address} cannot be created",
render(
"transactional/cannot-create-alias-directory.txt",
user=user,
alias=alias_address,
directory=directory_name,
),
render(
"transactional/cannot-create-alias-directory.html",
user=user,
alias=alias_address,
directory=directory_name,
),
@ -228,11 +251,13 @@ def send_cannot_create_directory_alias_disabled(user, alias_address, directory_n
f"Alias {alias_address} cannot be created",
render(
"transactional/cannot-create-alias-directory-disabled.txt",
user=user,
alias=alias_address,
directory=directory_name,
),
render(
"transactional/cannot-create-alias-directory-disabled.html",
user=user,
alias=alias_address,
directory=directory_name,
),
@ -248,11 +273,13 @@ def send_cannot_create_domain_alias(user, alias, domain):
f"Alias {alias} cannot be created",
render(
"transactional/cannot-create-alias-domain.txt",
user=user,
alias=alias,
domain=domain,
),
render(
"transactional/cannot-create-alias-domain.html",
user=user,
alias=alias,
domain=domain,
),
@ -494,9 +521,10 @@ def delete_header(msg: Message, header: str):
def sanitize_header(msg: Message, header: str):
"""remove trailing space and remove linebreak from a header"""
header_lowercase = header.lower()
for i in reversed(range(len(msg._headers))):
header_name = msg._headers[i][0].lower()
if header_name == header.lower():
if header_name == header_lowercase:
# msg._headers[i] is a tuple like ('From', 'hey@google.com')
if msg._headers[i][1]:
msg._headers[i] = (
@ -583,6 +611,26 @@ def email_can_be_used_as_mailbox(email_address: str) -> bool:
LOG.d("MX Domain %s %s is invalid mailbox domain", mx_domain, domain)
return False
existing_user = User.get_by(email=email_address)
if existing_user and existing_user.disabled:
LOG.d(
f"User {existing_user} is disabled. {email_address} cannot be used for other mailbox"
)
return False
for existing_user in (
User.query()
.join(Mailbox, User.id == Mailbox.user_id)
.filter(Mailbox.email == email_address)
.group_by(User.id)
.all()
):
if existing_user.disabled:
LOG.d(
f"User {existing_user} is disabled and has a mailbox with {email_address}. Id cannot be used for other mailbox"
)
return False
return True
@ -768,7 +816,7 @@ def get_header_unicode(header: Union[str, Header]) -> str:
ret = ""
for to_decoded_str, charset in decode_header(header):
if charset is None:
if type(to_decoded_str) is bytes:
if isinstance(to_decoded_str, bytes):
decoded_str = to_decoded_str.decode()
else:
decoded_str = to_decoded_str
@ -805,13 +853,13 @@ def to_bytes(msg: Message):
for generator_policy in [None, policy.SMTP, policy.SMTPUTF8]:
try:
return msg.as_bytes(policy=generator_policy)
except:
except Exception:
LOG.w("as_bytes() fails with %s policy", policy, exc_info=True)
msg_string = msg.as_string()
try:
return msg_string.encode()
except:
except Exception:
LOG.w("as_string().encode() fails", exc_info=True)
return msg_string.encode(errors="replace")
@ -828,19 +876,6 @@ def should_add_dkim_signature(domain: str) -> bool:
return False
def is_valid_email(email_address: str) -> bool:
"""
Used to check whether an email address is valid
NOT run MX check.
NOT allow unicode.
"""
try:
validate_email(email_address, check_deliverability=False, allow_smtputf8=False)
return True
except EmailNotValidError:
return False
class EmailEncoding(enum.Enum):
BASE64 = "base64"
QUOTED = "quoted-printable"
@ -911,15 +946,25 @@ def decode_text(text: str, encoding: EmailEncoding = EmailEncoding.NO) -> str:
return text
def add_header(msg: Message, text_header, html_header=None) -> Message:
def add_header(
msg: Message, text_header, html_header=None, subject_prefix=None
) -> Message:
if not html_header:
html_header = text_header.replace("\n", "<br>")
if subject_prefix is not None:
subject = msg[headers.SUBJECT]
if not subject:
msg.add_header(headers.SUBJECT, subject_prefix)
else:
subject = f"{subject_prefix} {subject}"
msg.replace_header(headers.SUBJECT, subject)
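# Illustration (not part of the change): with subject_prefix="[Possible phishing attempt]"
# a message whose Subject is "Invoice" is rewritten to "[Possible phishing attempt] Invoice";
# a message with no Subject header gets the prefix alone as its subject.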
content_type = msg.get_content_type().lower()
if content_type == "text/plain":
encoding = get_encoding(msg)
payload = msg.get_payload()
if type(payload) is str:
if isinstance(payload, str):
clone_msg = copy(msg)
new_payload = f"""{text_header}
------------------------------
@ -929,7 +974,7 @@ def add_header(msg: Message, text_header, html_header=None) -> Message:
elif content_type == "text/html":
encoding = get_encoding(msg)
payload = msg.get_payload()
if type(payload) is str:
if isinstance(payload, str):
new_payload = f"""<table width="100%" style="width: 100%; -premailer-width: 100%; -premailer-cellpadding: 0;
-premailer-cellspacing: 0; margin: 0; padding: 0;">
<tr>
@ -951,6 +996,8 @@ def add_header(msg: Message, text_header, html_header=None) -> Message:
for part in msg.get_payload():
if isinstance(part, Message):
new_parts.append(add_header(part, text_header, html_header))
elif isinstance(part, str):
new_parts.append(MIMEText(part))
else:
new_parts.append(part)
clone_msg = copy(msg)
@ -959,7 +1006,14 @@ def add_header(msg: Message, text_header, html_header=None) -> Message:
elif content_type in ("multipart/mixed", "multipart/signed"):
new_parts = []
parts = list(msg.get_payload())
payload = msg.get_payload()
if isinstance(payload, str):
# The message is badly formatted inject as new
new_parts = [MIMEText(text_header, "plain"), MIMEText(payload, "plain")]
clone_msg = copy(msg)
clone_msg.set_payload(new_parts)
return clone_msg
parts = list(payload)
LOG.d("only add header for the first part for %s", content_type)
for ix, part in enumerate(parts):
if ix == 0:
@ -976,7 +1030,7 @@ def add_header(msg: Message, text_header, html_header=None) -> Message:
def replace(msg: Union[Message, str], old, new) -> Union[Message, str]:
if type(msg) is str:
if isinstance(msg, str):
msg = msg.replace(old, new)
return msg
@ -999,7 +1053,7 @@ def replace(msg: Union[Message, str], old, new) -> Union[Message, str]:
if content_type in ("text/plain", "text/html"):
encoding = get_encoding(msg)
payload = msg.get_payload()
if type(payload) is str:
if isinstance(payload, str):
if encoding == EmailEncoding.QUOTED:
LOG.d("handle quoted-printable replace %s -> %s", old, new)
# first decode the payload
@ -1107,26 +1161,6 @@ def is_reverse_alias(address: str) -> bool:
)
# allow also + and @ that are present in a reply address
_ALLOWED_CHARS = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789_-.+@"
def normalize_reply_email(reply_email: str) -> str:
"""Handle the case where reply email contains *strange* char that was wrongly generated in the past"""
if not reply_email.isascii():
reply_email = convert_to_id(reply_email)
ret = []
# drop all control characters like shift, separator, etc
for c in reply_email:
if c not in _ALLOWED_CHARS:
ret.append("_")
else:
ret.append(c)
return "".join(ret)
def should_disable(alias: Alias) -> (bool, str):
"""
Return whether an alias should be disabled and if yes, the reason why
@ -1256,6 +1290,7 @@ def spf_pass(
f"SimpleLogin Alert: attempt to send emails from your alias {alias.email} from unknown IP Address",
render(
"transactional/spf-fail.txt",
user=user,
alias=alias.email,
ip=ip,
mailbox_url=config.URL + f"/dashboard/mailbox/{mailbox.id}#spf",
@ -1265,6 +1300,7 @@ def spf_pass(
),
render(
"transactional/spf-fail.html",
user=user,
ip=ip,
mailbox_url=config.URL + f"/dashboard/mailbox/{mailbox.id}#spf",
to_email=contact_email,
@ -1407,7 +1443,7 @@ def generate_verp_email(
# Time is in minutes granularity and start counting on 2022-01-01 to reduce bytes to represent time
data = [
verp_type.value,
object_id,
object_id or 0,
int((time.time() - VERP_TIME_START) / 60),
]
json_payload = json.dumps(data).encode("utf-8")

View File

@ -0,0 +1,38 @@
from email_validator import (
validate_email,
EmailNotValidError,
)
from app.utils import convert_to_id
# allow also + and @ that are present in a reply address
_ALLOWED_CHARS = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789_-.+@"
def is_valid_email(email_address: str) -> bool:
"""
Used to check whether an email address is valid.
Does NOT run an MX check.
Does NOT allow unicode.
"""
try:
validate_email(email_address, check_deliverability=False, allow_smtputf8=False)
return True
except EmailNotValidError:
return False
def normalize_reply_email(reply_email: str) -> str:
"""Handle the case where reply email contains *strange* char that was wrongly generated in the past"""
if not reply_email.isascii():
reply_email = convert_to_id(reply_email)
ret = []
# drop all control characters like shift, separator, etc
for c in reply_email:
if c not in _ALLOWED_CHARS:
ret.append("_")
else:
ret.append(c)
return "".join(ret)

View File

@ -84,6 +84,14 @@ class ErrAddressInvalid(SLException):
return f"{self.address} is not a valid email address"
class InvalidContactEmailError(SLException):
def __init__(self, website_email: str): # noqa: F821
self.website_email = website_email
def error_for_user(self) -> str:
return f"Cannot create contact with invalid email {self.website_email}"
class ErrContactAlreadyExists(SLException):
"""raised when a contact already exists"""
@ -113,3 +121,10 @@ class AccountAlreadyLinkedToAnotherUserException(LinkException):
class AccountIsUsingAliasAsEmail(LinkException):
def __init__(self):
super().__init__("Your account has an alias as it's email address")
class ProtonAccountNotVerified(LinkException):
def __init__(self):
super().__init__(
"The Proton account you are trying to use has not been verified"
)

View File

View File

@ -9,6 +9,7 @@ class LoginEvent:
failed = 1
disabled_login = 2
not_activated = 3
scheduled_to_be_deleted = 4
class Source(EnumE):
web = 0

View File

@ -0,0 +1,66 @@
from abc import ABC, abstractmethod
from app import config
from app.db import Session
from app.errors import ProtonPartnerNotSetUp
from app.events.generated import event_pb2
from app.models import User, PartnerUser, SyncEvent
from app.proton.utils import get_proton_partner
from typing import Optional
NOTIFICATION_CHANNEL = "simplelogin_sync_events"
class Dispatcher(ABC):
@abstractmethod
def send(self, event: bytes):
pass
class PostgresDispatcher(Dispatcher):
def send(self, event: bytes):
instance = SyncEvent.create(content=event, flush=True)
Session.execute(f"NOTIFY {NOTIFICATION_CHANNEL}, '{instance.id}';")
@staticmethod
def get():
return PostgresDispatcher()
class EventDispatcher:
@staticmethod
def send_event(
user: User,
content: event_pb2.EventContent,
dispatcher: Dispatcher = PostgresDispatcher.get(),
skip_if_webhook_missing: bool = True,
):
if config.EVENT_WEBHOOK_DISABLE:
return
if not config.EVENT_WEBHOOK and skip_if_webhook_missing:
return
partner_user = EventDispatcher.__partner_user(user.id)
if not partner_user:
return
event = event_pb2.Event(
user_id=user.id,
external_user_id=partner_user.external_user_id,
partner_id=partner_user.partner_id,
content=content,
)
serialized = event.SerializeToString()
dispatcher.send(serialized)
@staticmethod
def __partner_user(user_id: int) -> Optional[PartnerUser]:
# Check if the current user has a partner_id
try:
proton_partner_id = get_proton_partner().id
except ProtonPartnerNotSetUp:
return None
# It has. Retrieve the information for the PartnerUser
return PartnerUser.get_by(user_id=user_id, partner_id=proton_partner_id)
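A rough usage sketch for the dispatcher, using a made-up in-memory Dispatcher (for example in tests); the user variable is assumed to be an existing User linked to the Proton partner:
class MemoryDispatcher(Dispatcher):
    def __init__(self):
        self.events = []

    def send(self, event: bytes):
        self.events.append(event)

memory_dispatcher = MemoryDispatcher()
EventDispatcher.send_event(
    user,
    event_pb2.EventContent(user_plan_change=event_pb2.UserPlanChanged(plan_end_time=0)),
    dispatcher=memory_dispatcher,
)
# memory_dispatcher.events now holds one serialized protobuf Event; if the user has no
# PartnerUser row, or webhooks are disabled in config, nothing is dispatched.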

View File

@ -0,0 +1,50 @@
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# NO CHECKED-IN PROTOBUF GENCODE
# source: event.proto
# Protobuf Python Version: 5.27.0
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import descriptor_pool as _descriptor_pool
from google.protobuf import runtime_version as _runtime_version
from google.protobuf import symbol_database as _symbol_database
from google.protobuf.internal import builder as _builder
_runtime_version.ValidateProtobufRuntimeVersion(
_runtime_version.Domain.PUBLIC,
5,
27,
0,
'',
'event.proto'
)
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x0b\x65vent.proto\x12\x12simplelogin_events\"(\n\x0fUserPlanChanged\x12\x15\n\rplan_end_time\x18\x01 \x01(\r\"\r\n\x0bUserDeleted\"Z\n\x0c\x41liasCreated\x12\x10\n\x08\x61lias_id\x18\x01 \x01(\r\x12\x13\n\x0b\x61lias_email\x18\x02 \x01(\t\x12\x12\n\nalias_note\x18\x03 \x01(\t\x12\x0f\n\x07\x65nabled\x18\x04 \x01(\x08\"L\n\x12\x41liasStatusChanged\x12\x10\n\x08\x61lias_id\x18\x01 \x01(\r\x12\x13\n\x0b\x61lias_email\x18\x02 \x01(\t\x12\x0f\n\x07\x65nabled\x18\x03 \x01(\x08\"5\n\x0c\x41liasDeleted\x12\x10\n\x08\x61lias_id\x18\x01 \x01(\r\x12\x13\n\x0b\x61lias_email\x18\x02 \x01(\t\"D\n\x10\x41liasCreatedList\x12\x30\n\x06\x65vents\x18\x01 \x03(\x0b\x32 .simplelogin_events.AliasCreated\"\x93\x03\n\x0c\x45ventContent\x12?\n\x10user_plan_change\x18\x01 \x01(\x0b\x32#.simplelogin_events.UserPlanChangedH\x00\x12\x37\n\x0cuser_deleted\x18\x02 \x01(\x0b\x32\x1f.simplelogin_events.UserDeletedH\x00\x12\x39\n\ralias_created\x18\x03 \x01(\x0b\x32 .simplelogin_events.AliasCreatedH\x00\x12\x45\n\x13\x61lias_status_change\x18\x04 \x01(\x0b\x32&.simplelogin_events.AliasStatusChangedH\x00\x12\x39\n\ralias_deleted\x18\x05 \x01(\x0b\x32 .simplelogin_events.AliasDeletedH\x00\x12\x41\n\x11\x61lias_create_list\x18\x06 \x01(\x0b\x32$.simplelogin_events.AliasCreatedListH\x00\x42\t\n\x07\x63ontent\"y\n\x05\x45vent\x12\x0f\n\x07user_id\x18\x01 \x01(\r\x12\x18\n\x10\x65xternal_user_id\x18\x02 \x01(\t\x12\x12\n\npartner_id\x18\x03 \x01(\r\x12\x31\n\x07\x63ontent\x18\x04 \x01(\x0b\x32 .simplelogin_events.EventContentb\x06proto3')
_globals = globals()
_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'event_pb2', _globals)
if not _descriptor._USE_C_DESCRIPTORS:
DESCRIPTOR._loaded_options = None
_globals['_USERPLANCHANGED']._serialized_start=35
_globals['_USERPLANCHANGED']._serialized_end=75
_globals['_USERDELETED']._serialized_start=77
_globals['_USERDELETED']._serialized_end=90
_globals['_ALIASCREATED']._serialized_start=92
_globals['_ALIASCREATED']._serialized_end=182
_globals['_ALIASSTATUSCHANGED']._serialized_start=184
_globals['_ALIASSTATUSCHANGED']._serialized_end=260
_globals['_ALIASDELETED']._serialized_start=262
_globals['_ALIASDELETED']._serialized_end=315
_globals['_ALIASCREATEDLIST']._serialized_start=317
_globals['_ALIASCREATEDLIST']._serialized_end=385
_globals['_EVENTCONTENT']._serialized_start=388
_globals['_EVENTCONTENT']._serialized_end=791
_globals['_EVENT']._serialized_start=793
_globals['_EVENT']._serialized_end=914
# @@protoc_insertion_point(module_scope)

View File

@ -0,0 +1,80 @@
from google.protobuf.internal import containers as _containers
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from typing import ClassVar as _ClassVar, Iterable as _Iterable, Mapping as _Mapping, Optional as _Optional, Union as _Union
DESCRIPTOR: _descriptor.FileDescriptor
class UserPlanChanged(_message.Message):
__slots__ = ("plan_end_time",)
PLAN_END_TIME_FIELD_NUMBER: _ClassVar[int]
plan_end_time: int
def __init__(self, plan_end_time: _Optional[int] = ...) -> None: ...
class UserDeleted(_message.Message):
__slots__ = ()
def __init__(self) -> None: ...
class AliasCreated(_message.Message):
__slots__ = ("alias_id", "alias_email", "alias_note", "enabled")
ALIAS_ID_FIELD_NUMBER: _ClassVar[int]
ALIAS_EMAIL_FIELD_NUMBER: _ClassVar[int]
ALIAS_NOTE_FIELD_NUMBER: _ClassVar[int]
ENABLED_FIELD_NUMBER: _ClassVar[int]
alias_id: int
alias_email: str
alias_note: str
enabled: bool
def __init__(self, alias_id: _Optional[int] = ..., alias_email: _Optional[str] = ..., alias_note: _Optional[str] = ..., enabled: bool = ...) -> None: ...
class AliasStatusChanged(_message.Message):
__slots__ = ("alias_id", "alias_email", "enabled")
ALIAS_ID_FIELD_NUMBER: _ClassVar[int]
ALIAS_EMAIL_FIELD_NUMBER: _ClassVar[int]
ENABLED_FIELD_NUMBER: _ClassVar[int]
alias_id: int
alias_email: str
enabled: bool
def __init__(self, alias_id: _Optional[int] = ..., alias_email: _Optional[str] = ..., enabled: bool = ...) -> None: ...
class AliasDeleted(_message.Message):
__slots__ = ("alias_id", "alias_email")
ALIAS_ID_FIELD_NUMBER: _ClassVar[int]
ALIAS_EMAIL_FIELD_NUMBER: _ClassVar[int]
alias_id: int
alias_email: str
def __init__(self, alias_id: _Optional[int] = ..., alias_email: _Optional[str] = ...) -> None: ...
class AliasCreatedList(_message.Message):
__slots__ = ("events",)
EVENTS_FIELD_NUMBER: _ClassVar[int]
events: _containers.RepeatedCompositeFieldContainer[AliasCreated]
def __init__(self, events: _Optional[_Iterable[_Union[AliasCreated, _Mapping]]] = ...) -> None: ...
class EventContent(_message.Message):
__slots__ = ("user_plan_change", "user_deleted", "alias_created", "alias_status_change", "alias_deleted", "alias_create_list")
USER_PLAN_CHANGE_FIELD_NUMBER: _ClassVar[int]
USER_DELETED_FIELD_NUMBER: _ClassVar[int]
ALIAS_CREATED_FIELD_NUMBER: _ClassVar[int]
ALIAS_STATUS_CHANGE_FIELD_NUMBER: _ClassVar[int]
ALIAS_DELETED_FIELD_NUMBER: _ClassVar[int]
ALIAS_CREATE_LIST_FIELD_NUMBER: _ClassVar[int]
user_plan_change: UserPlanChanged
user_deleted: UserDeleted
alias_created: AliasCreated
alias_status_change: AliasStatusChanged
alias_deleted: AliasDeleted
alias_create_list: AliasCreatedList
def __init__(self, user_plan_change: _Optional[_Union[UserPlanChanged, _Mapping]] = ..., user_deleted: _Optional[_Union[UserDeleted, _Mapping]] = ..., alias_created: _Optional[_Union[AliasCreated, _Mapping]] = ..., alias_status_change: _Optional[_Union[AliasStatusChanged, _Mapping]] = ..., alias_deleted: _Optional[_Union[AliasDeleted, _Mapping]] = ..., alias_create_list: _Optional[_Union[AliasCreatedList, _Mapping]] = ...) -> None: ...
class Event(_message.Message):
__slots__ = ("user_id", "external_user_id", "partner_id", "content")
USER_ID_FIELD_NUMBER: _ClassVar[int]
EXTERNAL_USER_ID_FIELD_NUMBER: _ClassVar[int]
PARTNER_ID_FIELD_NUMBER: _ClassVar[int]
CONTENT_FIELD_NUMBER: _ClassVar[int]
user_id: int
external_user_id: str
partner_id: int
content: EventContent
def __init__(self, user_id: _Optional[int] = ..., external_user_id: _Optional[str] = ..., partner_id: _Optional[int] = ..., content: _Optional[_Union[EventContent, _Mapping]] = ...) -> None: ...

View File

@ -30,14 +30,16 @@ def apply_dmarc_policy_for_forward_phase(
) -> Tuple[Message, Optional[str]]:
spam_result = SpamdResult.extract_from_headers(msg, Phase.forward)
if not DMARC_CHECK_ENABLED or not spam_result:
LOG.i("DMARC check disabled")
return msg, None
LOG.i(f"Spam check result in {spam_result}")
from_header = get_header_unicode(msg[headers.FROM])
warning_plain_text = f"""This email failed anti-phishing checks when it was received by SimpleLogin, be careful with its content.
warning_plain_text = """This email failed anti-phishing checks when it was received by SimpleLogin, be careful with its content.
More info on https://simplelogin.io/docs/getting-started/anti-phishing/
"""
warning_html = f"""
warning_html = """
<p style="color:red">
This email failed anti-phishing checks when it was received by SimpleLogin, be careful with its content.
More info on <a href="https://simplelogin.io/docs/getting-started/anti-phishing/">anti-phishing measure</a>
@ -62,6 +64,7 @@ More info on https://simplelogin.io/docs/getting-started/anti-phishing/
msg,
warning_plain_text,
warning_html,
subject_prefix="[Possible phishing attempt]",
)
return changed_msg, None
@ -74,6 +77,7 @@ More info on https://simplelogin.io/docs/getting-started/anti-phishing/
msg,
warning_plain_text,
warning_html,
subject_prefix="[Possible phishing attempt]",
)
return changed_msg, None
@ -102,12 +106,14 @@ More info on https://simplelogin.io/docs/getting-started/anti-phishing/
f"An email sent to {alias.email} has been quarantined",
render(
"transactional/message-quarantine-dmarc.txt.jinja2",
user=user,
from_header=from_header,
alias=alias,
refused_email_url=email_log.get_dashboard_url(),
),
render(
"transactional/message-quarantine-dmarc.html",
user=user,
from_header=from_header,
alias=alias,
refused_email_url=email_log.get_dashboard_url(),
@ -131,7 +137,7 @@ def quarantine_dmarc_failed_forward_email(alias, contact, envelope, msg) -> Emai
refused_email = RefusedEmail.create(
full_report_path=s3_report_path, user_id=alias.user_id, flush=True
)
return EmailLog.create(
email_log = EmailLog.create(
user_id=alias.user_id,
mailbox_id=alias.mailbox_id,
contact_id=contact.id,
@ -142,6 +148,7 @@ def quarantine_dmarc_failed_forward_email(alias, contact, envelope, msg) -> Emai
blocked=True,
commit=True,
)
return email_log
def apply_dmarc_policy_for_reply_phase(
@ -149,8 +156,10 @@ def apply_dmarc_policy_for_reply_phase(
) -> Optional[str]:
spam_result = SpamdResult.extract_from_headers(msg, Phase.reply)
if not DMARC_CHECK_ENABLED or not spam_result:
LOG.i("DMARC check disabled")
return None
LOG.i(f"Spam check result is {spam_result}")
if spam_result.dmarc not in (
DmarcCheckResult.quarantine,
DmarcCheckResult.reject,
@ -169,12 +178,14 @@ def apply_dmarc_policy_for_reply_phase(
f"Attempt to send an email to your contact {contact_recipient.email} from {envelope.mail_from}",
render(
"transactional/spoof-reply.txt.jinja2",
user=alias_from.user,
contact=contact_recipient,
alias=alias_from,
sender=envelope.mail_from,
),
render(
"transactional/spoof-reply.html",
user=alias_from.user,
contact=contact_recipient,
alias=alias_from,
sender=envelope.mail_from,

View File

@ -221,7 +221,7 @@ def handle_complaint(message: Message, origin: ProviderComplaintOrigin) -> bool:
return True
if is_deleted_alias(msg_info.sender_address):
LOG.i(f"Complaint is for deleted alias. Do nothing")
LOG.i("Complaint is for deleted alias. Do nothing")
return True
contact = Contact.get_by(reply_email=msg_info.sender_address)
@ -231,7 +231,7 @@ def handle_complaint(message: Message, origin: ProviderComplaintOrigin) -> bool:
alias = find_alias_with_address(msg_info.rcpt_address)
if is_deleted_alias(msg_info.rcpt_address):
LOG.i(f"Complaint is for deleted alias. Do nothing")
LOG.i("Complaint is for deleted alias. Do nothing")
return True
if not alias:
@ -319,11 +319,13 @@ def report_complaint_to_user_in_forward_phase(
f"Abuse report from {capitalized_name}",
render(
"transactional/provider-complaint-forward-phase.txt.jinja2",
user=user,
email=mailbox_email,
provider=capitalized_name,
),
render(
"transactional/provider-complaint-forward-phase.html",
user=user,
email=mailbox_email,
provider=capitalized_name,
),

View File

@ -54,9 +54,8 @@ class UnsubscribeEncoder:
def encode_subject(
cls, action: UnsubscribeAction, data: Union[int, UnsubscribeOriginalData]
) -> str:
if (
action != UnsubscribeAction.OriginalUnsubscribeMailto
and type(data) is not int
if action != UnsubscribeAction.OriginalUnsubscribeMailto and not isinstance(
data, int
):
raise ValueError(f"Data has to be an int for an action of type {action}")
if action == UnsubscribeAction.OriginalUnsubscribeMailto:
@ -74,8 +73,8 @@ class UnsubscribeEncoder:
)
signed_data = cls._get_signer().sign(serialized_data).decode("utf-8")
encoded_request = f"{UNSUB_PREFIX}.{signed_data}"
if len(encoded_request) > 256:
LOG.e("Encoded request is longer than 256 chars")
if len(encoded_request) > 512:
LOG.w("Encoded request is longer than 512 chars")
return encoded_request
@staticmethod

View File

@ -1,7 +1,9 @@
import urllib
from email.header import Header
from email.message import Message
from app.email import headers
from app import config
from app.email_utils import add_or_replace_header, delete_header
from app.handler.unsubscribe_encoder import (
UnsubscribeEncoder,
@ -33,6 +35,8 @@ class UnsubscribeGenerator:
if not unsubscribe_data:
LOG.info("Email has no unsubscribe header")
return message
if isinstance(unsubscribe_data, Header):
unsubscribe_data = str(unsubscribe_data.encode())
raw_methods = [method.strip() for method in unsubscribe_data.split(",")]
mailto_unsubs = None
other_unsubs = []
@ -44,6 +48,11 @@ class UnsubscribeGenerator:
method = raw_method[start + 1 : end]
url_data = urllib.parse.urlparse(method)
if url_data.scheme == "mailto":
if url_data.path == config.UNSUBSCRIBER:
LOG.debug(
f"Skipping replacing unsubscribe since the original email already points to {config.UNSUBSCRIBER}"
)
return message
query_data = urllib.parse.parse_qs(url_data.query)
mailto_unsubs = (url_data.path, query_data.get("subject", [""])[0])
LOG.debug(f"Unsub is mailto to {mailto_unsubs}")
@ -64,8 +73,8 @@ class UnsubscribeGenerator:
return message
elif not mailto_unsubs:
LOG.debug("No unsubs. Deleting all unsub headers")
message = delete_header(message, headers.LIST_UNSUBSCRIBE)
message = delete_header(message, headers.LIST_UNSUBSCRIBE_POST)
delete_header(message, headers.LIST_UNSUBSCRIBE)
delete_header(message, headers.LIST_UNSUBSCRIBE_POST)
return message
unsub_data = UnsubscribeData(
UnsubscribeAction.OriginalUnsubscribeMailto,

View File

@ -5,6 +5,7 @@ from typing import Optional
from aiosmtpd.smtp import Envelope
from app import config
from app import alias_utils
from app.db import Session
from app.email import headers, status
from app.email_utils import (
@ -101,7 +102,8 @@ class UnsubscribeHandler:
mailbox.email, alias
):
return status.E509
alias.enabled = False
LOG.i(f"User disabled alias {alias} via unsubscribe header")
alias_utils.change_alias_status(alias, enabled=False)
Session.commit()
enable_alias_url = config.URL + f"/dashboard/?highlight_alias_id={alias.id}"
for mailbox in alias.mailboxes:

View File

@ -30,7 +30,10 @@ def handle_batch_import(batch_import: BatchImport):
LOG.d("Download file %s from %s", batch_import.file, file_url)
r = requests.get(file_url)
lines = [line.decode() for line in r.iter_lines()]
# Replace invisible character
lines = [
line.decode("utf-8").replace("\ufeff", "").strip() for line in r.iter_lines()
]
import_from_csv(batch_import, user, lines)

View File

@ -1,2 +1,4 @@
from .integrations import set_enable_proton_cookie
from .exit_sudo import exit_sudo_mode
__all__ = ["set_enable_proton_cookie", "exit_sudo_mode"]

View File

@ -0,0 +1,40 @@
from app.events.event_dispatcher import EventDispatcher, Dispatcher
from app.events.generated.event_pb2 import EventContent, AliasCreated, AliasCreatedList
from app.log import LOG
from app.models import User, Alias
def send_alias_creation_events_for_user(
user: User, dispatcher: Dispatcher, chunk_size=50
):
if user.disabled:
LOG.i(f"User {user} is disabled. Skipping sending events for that user")
return
chunk_size = min(chunk_size, 50)
event_list = []
for alias in (
Alias.yield_per_query(chunk_size)
.filter_by(user_id=user.id)
.order_by(Alias.id.asc())
):
event_list.append(
AliasCreated(
alias_id=alias.id,
alias_email=alias.email,
alias_note=alias.note,
enabled=alias.enabled,
)
)
if len(event_list) >= chunk_size:
EventDispatcher.send_event(
user,
EventContent(alias_create_list=AliasCreatedList(events=event_list)),
dispatcher=dispatcher,
)
event_list = []
if len(event_list) > 0:
EventDispatcher.send_event(
user,
EventContent(alias_create_list=AliasCreatedList(events=event_list)),
dispatcher=dispatcher,
)

View File

@ -39,7 +39,6 @@ from app.models import (
class ExportUserDataJob:
REMOVE_FIELDS = {
"User": ("otp_secret", "password"),
"Alias": ("ts_vector", "transfer_token", "hibp_last_check"),
@ -138,7 +137,9 @@ class ExportUserDataJob:
msg[headers.SUBJECT] = "Your SimpleLogin data"
msg[headers.FROM] = f'"SimpleLogin (noreply)" <{config.NOREPLY}>'
msg[headers.TO] = to_email
msg.attach(MIMEText(render("transactional/user-report.html"), "html"))
msg.attach(
MIMEText(render("transactional/user-report.html", user=self._user), "html")
)
attachment = MIMEApplication(zipped_contents.read())
attachment.add_header(
"Content-Disposition", "attachment", filename="user_report.zip"

View File

@ -22,7 +22,6 @@ from app.message_utils import message_to_bytes, message_format_base64_parts
@dataclass
class SendRequest:
SAVE_EXTENSION = "sendrequest"
envelope_from: str
@ -46,6 +45,7 @@ class SendRequest:
"mail_options": self.mail_options,
"rcpt_options": self.rcpt_options,
"is_forward": self.is_forward,
"retries": self.retries,
}
return json.dumps(data).encode("utf-8")
@ -66,6 +66,7 @@ class SendRequest:
mail_options=decoded_data["mail_options"],
rcpt_options=decoded_data["rcpt_options"],
is_forward=decoded_data["is_forward"],
retries=decoded_data.get("retries", 1),
)
def save_request_to_unsent_dir(self, prefix: str = "DeliveryFail"):
@ -75,7 +76,6 @@ class SendRequest:
file_path = os.path.join(config.SAVE_UNSENT_DIR, file_name)
self.save_request_to_file(file_path)
@staticmethod
def save_request_to_failed_dir(self, prefix: str = "DeliveryRetryFail"):
file_name = (
f"{prefix}-{int(time.time())}-{uuid.uuid4()}.{SendRequest.SAVE_EXTENSION}"

View File

@ -27,9 +27,11 @@ from sqlalchemy.orm import deferred
from sqlalchemy.sql import and_
from sqlalchemy_utils import ArrowType
from app import config
from app import config, rate_limiter
from app import s3
from app.db import Session
from app.dns_utils import get_mx_domains
from app.errors import (
AliasInTrashError,
DirectoryInTrashError,
@ -233,6 +235,7 @@ class AuditLogActionEnum(EnumE):
download_provider_complaint = 8
disable_user = 9
enable_user = 10
stop_trial = 11
class Phase(EnumE):
@ -260,6 +263,15 @@ class UnsubscribeBehaviourEnum(EnumE):
PreserveOriginal = 2
class AliasDeleteReason(EnumE):
Unspecified = 0
UserHasBeenDeleted = 1
ManualAction = 2
DirectoryDeleted = 3
MailboxDeleted = 4
CustomDomainDeleted = 5
class IntEnumType(sa.types.TypeDecorator):
impl = sa.Integer
@ -278,6 +290,7 @@ class IntEnumType(sa.types.TypeDecorator):
class AliasOptions:
show_sl_domains: bool = True
show_partner_domains: Optional[Partner] = None
show_partner_premium: Optional[bool] = None
class Hibp(Base, ModelMixin):
@ -326,6 +339,7 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
FLAG_FREE_DISABLE_CREATE_ALIAS = 1 << 0
FLAG_CREATED_FROM_PARTNER = 1 << 1
FLAG_FREE_OLD_ALIAS_LIMIT = 1 << 2
FLAG_CREATED_ALIAS_FROM_PARTNER = 1 << 3
email = sa.Column(sa.String(256), unique=True, nullable=False)
@ -341,7 +355,7 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
sa.Boolean, default=True, nullable=False, server_default="1"
)
activated = sa.Column(sa.Boolean, default=False, nullable=False)
activated = sa.Column(sa.Boolean, default=False, nullable=False, index=True)
# an account can be disabled if having harmful behavior
disabled = sa.Column(sa.Boolean, default=False, nullable=False, server_default="0")
@ -411,7 +425,10 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
)
referral_id = sa.Column(
sa.ForeignKey("referral.id", ondelete="SET NULL"), nullable=True, default=None
sa.ForeignKey("referral.id", ondelete="SET NULL"),
nullable=True,
default=None,
index=True,
)
referral = orm.relationship("Referral", foreign_keys=[referral_id])
@ -518,6 +535,11 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
sa.Boolean, default=True, nullable=False, server_default="1"
)
# user opted in for data breach check
enable_data_breach_check = sa.Column(
sa.Boolean, default=False, nullable=False, server_default="0"
)
# bitwise flags. Allow for future expansion
flags = sa.Column(
sa.BigInteger,
@ -534,6 +556,16 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
nullable=False,
)
# Trigger hard deletion of the account at this time
delete_on = sa.Column(ArrowType, default=None)
__table_args__ = (
sa.Index(
"ix_users_activated_trial_end_lifetime", activated, trial_end, lifetime
),
sa.Index("ix_users_delete_on", delete_on),
)
@property
def directory_quota(self):
return min(
@ -568,6 +600,7 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
@classmethod
def create(cls, email, name="", password=None, from_partner=False, **kwargs):
email = sanitize_email(email)
user: User = super(User, cls).create(email=email, name=name[:100], **kwargs)
if password:
@ -634,6 +667,27 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
return user
@classmethod
def delete(cls, obj_id, commit=False):
# Internal import to avoid global import cycles
from app.events.event_dispatcher import EventDispatcher
from app.events.generated.event_pb2 import UserDeleted, EventContent
user: User = cls.get(obj_id)
EventDispatcher.send_event(user, EventContent(user_deleted=UserDeleted()))
# Manually delete all aliases for the user that is about to be deleted
from app.alias_utils import delete_alias
for alias in Alias.filter_by(user_id=user.id):
delete_alias(alias, user, AliasDeleteReason.UserHasBeenDeleted)
res = super(User, cls).delete(obj_id)
if commit:
Session.commit()
return res
def get_active_subscription(
self, include_partner_subscription: bool = True
) -> Optional[
@ -709,6 +763,11 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
return True
def is_active(self) -> bool:
if self.delete_on is None:
return True
return self.delete_on < arrow.now()
def in_trial(self):
"""return True if user does not have lifetime licence or an active subscription AND is in trial period"""
if self.lifetime_or_active_subscription():
@ -810,6 +869,9 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
Whether user can create a new alias. User can't create a new alias if
- has more than 15 aliases in the free plan, *even in the free trial*
"""
if not self.is_active():
return False
if self.disabled:
return False
@ -821,6 +883,17 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
< self.max_alias_for_free_account()
)
def can_send_or_receive(self) -> bool:
if self.disabled:
LOG.i(f"User {self} is disabled. Cannot receive or send emails")
return False
if self.delete_on is not None:
LOG.i(
f"User {self} is scheduled to be deleted. Cannot receive or send emails"
)
return False
return True
def profile_picture_url(self):
if self.profile_picture_id:
return self.profile_picture.get_url()
@ -879,7 +952,11 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
return sub
def verified_custom_domains(self) -> List["CustomDomain"]:
return CustomDomain.filter_by(user_id=self.id, ownership_verified=True).all()
return (
CustomDomain.filter_by(user_id=self.id, ownership_verified=True)
.order_by(CustomDomain.domain.asc())
.all()
)
def mailboxes(self) -> List["Mailbox"]:
"""list of mailbox that user own"""
@ -1011,29 +1088,35 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
) -> list["SLDomain"]:
if alias_options is None:
alias_options = AliasOptions()
conditions = [SLDomain.hidden == False] # noqa: E712
if not self.is_premium():
conditions.append(SLDomain.premium_only == False) # noqa: E712
partner_domain_cond = [] # noqa:E711
top_conds = [SLDomain.hidden == False] # noqa: E712
or_conds = [] # noqa:E711
if self.default_alias_public_domain_id is not None:
partner_domain_cond.append(
SLDomain.id == self.default_alias_public_domain_id
)
default_domain_conds = [SLDomain.id == self.default_alias_public_domain_id]
if not self.is_premium():
default_domain_conds.append(
SLDomain.premium_only == False # noqa: E712
)
or_conds.append(and_(*default_domain_conds).self_group())
if alias_options.show_partner_domains is not None:
partner_user = PartnerUser.filter_by(
user_id=self.id, partner_id=alias_options.show_partner_domains.id
).first()
if partner_user is not None:
partner_domain_cond.append(
SLDomain.partner_id == partner_user.partner_id
)
partner_domain_cond = [SLDomain.partner_id == partner_user.partner_id]
if alias_options.show_partner_premium is None:
alias_options.show_partner_premium = self.is_premium()
if not alias_options.show_partner_premium:
partner_domain_cond.append(
SLDomain.premium_only == False # noqa: E712
)
or_conds.append(and_(*partner_domain_cond).self_group())
if alias_options.show_sl_domains:
partner_domain_cond.append(SLDomain.partner_id == None) # noqa:E711
if len(partner_domain_cond) == 1:
conditions.append(partner_domain_cond[0])
else:
conditions.append(or_(*partner_domain_cond))
query = Session.query(SLDomain).filter(*conditions).order_by(SLDomain.order)
sl_conds = [SLDomain.partner_id == None] # noqa: E711
if not self.is_premium():
sl_conds.append(SLDomain.premium_only == False) # noqa: E712
or_conds.append(and_(*sl_conds).self_group())
top_conds.append(or_(*or_conds))
query = Session.query(SLDomain).filter(*top_conds).order_by(SLDomain.order)
return query.all()
def available_alias_domains(
@ -1079,6 +1162,20 @@ class User(Base, ModelMixin, UserMixin, PasswordOracle):
return random_words(1)
def can_create_contacts(self) -> bool:
if self.is_premium():
return True
if self.flags & User.FLAG_FREE_DISABLE_CREATE_ALIAS == 0:
return True
return not config.DISABLE_CREATE_CONTACTS_FOR_FREE_USERS
def has_used_alias_from_partner(self) -> bool:
return (
self.flags
& (User.FLAG_CREATED_ALIAS_FROM_PARTNER | User.FLAG_CREATED_FROM_PARTNER)
> 0
)
def __repr__(self):
return f"<User {self.id} {self.name} {self.email}>"
@ -1368,6 +1465,9 @@ def generate_random_alias_email(
class Alias(Base, ModelMixin):
__tablename__ = "alias"
FLAG_PARTNER_CREATED = 1 << 0
user_id = sa.Column(
sa.ForeignKey(User.id, ondelete="cascade"), nullable=False, index=True
)
@ -1377,6 +1477,9 @@ class Alias(Base, ModelMixin):
name = sa.Column(sa.String(128), nullable=True, default=None)
enabled = sa.Column(sa.Boolean(), default=True, nullable=False)
flags = sa.Column(
sa.BigInteger(), default=0, server_default="0", nullable=False, index=True
)
custom_domain_id = sa.Column(
sa.ForeignKey("custom_domain.id", ondelete="cascade"), nullable=True, index=True
@ -1445,7 +1548,7 @@ class Alias(Base, ModelMixin):
)
# have I been pwned
hibp_last_check = sa.Column(ArrowType, default=None)
hibp_last_check = sa.Column(ArrowType, default=None, index=True)
hibp_breaches = orm.relationship("Hibp", secondary="alias_hibp")
# to use Postgres full text search. Only applied on "note" column for now
@ -1454,6 +1557,8 @@ class Alias(Base, ModelMixin):
TSVector(), sa.Computed("to_tsvector('english', note)", persisted=True)
)
last_email_log_id = sa.Column(sa.Integer, default=None, nullable=True)
__table_args__ = (
Index("ix_video___ts_vector__", ts_vector, postgresql_using="gin"),
# index on note column using pg_trgm
@ -1472,7 +1577,8 @@ class Alias(Base, ModelMixin):
def mailboxes(self):
ret = [self.mailbox]
for m in self._mailboxes:
ret.append(m)
if m.id != self.mailbox.id:
ret.append(m)
ret = [mb for mb in ret if mb.verified]
ret = sorted(ret, key=lambda mb: mb.email)
@ -1521,6 +1627,15 @@ class Alias(Base, ModelMixin):
flush = kw.pop("flush", False)
new_alias = cls(**kw)
user = User.get(new_alias.user_id)
if user.is_premium():
limits = config.ALIAS_CREATE_RATE_LIMIT_PAID
else:
limits = config.ALIAS_CREATE_RATE_LIMIT_FREE
# limits is array of (hits,days)
for limit in limits:
key = f"alias_create_{limit[1]}d:{user.id}"
rate_limiter.check_bucket_limit(key, limit[0], limit[1])
email = kw["email"]
# make sure email is lowercase and doesn't have any whitespace
@ -1542,6 +1657,24 @@ class Alias(Base, ModelMixin):
Session.add(new_alias)
DailyMetric.get_or_create_today_metric().nb_alias += 1
# Internal import to avoid global import cycles
from app.events.event_dispatcher import EventDispatcher
from app.events.generated.event_pb2 import AliasCreated, EventContent
event = AliasCreated(
alias_id=new_alias.id,
alias_email=new_alias.email,
alias_note=new_alias.note,
enabled=True,
)
EventDispatcher.send_event(user, EventContent(alias_created=event))
if (
new_alias.flags & cls.FLAG_PARTNER_CREATED > 0
and new_alias.user.flags & User.FLAG_CREATED_ALIAS_FROM_PARTNER == 0
):
user.flags = user.flags | User.FLAG_CREATED_ALIAS_FROM_PARTNER
if commit:
Session.commit()
@ -1913,6 +2046,7 @@ class Contact(Base, ModelMixin):
class EmailLog(Base, ModelMixin):
__tablename__ = "email_log"
__table_args__ = (Index("ix_email_log_created_at", "created_at"),)
user_id = sa.Column(
sa.ForeignKey(User.id, ondelete="cascade"), nullable=False, index=True
@ -2002,6 +2136,20 @@ class EmailLog(Base, ModelMixin):
def get_dashboard_url(self):
return f"{config.URL}/dashboard/refused_email?highlight_id={self.id}"
@classmethod
def create(cls, *args, **kwargs):
commit = kwargs.pop("commit", False)
email_log = super().create(*args, **kwargs)
Session.flush()
if "alias_id" in kwargs:
sql = "UPDATE alias SET last_email_log_id = :el_id WHERE id = :alias_id"
Session.execute(
sql, {"el_id": email_log.id, "alias_id": kwargs["alias_id"]}
)
if commit:
Session.commit()
return email_log
def __repr__(self):
return f"<EmailLog {self.id}>"
@ -2128,6 +2276,12 @@ class DeletedAlias(Base, ModelMixin):
__tablename__ = "deleted_alias"
email = sa.Column(sa.String(256), unique=True, nullable=False)
reason = sa.Column(
IntEnumType(AliasDeleteReason),
nullable=False,
default=AliasDeleteReason.Unspecified,
server_default=str(AliasDeleteReason.Unspecified.value),
)
@classmethod
def create(cls, **kw):
@ -2291,6 +2445,7 @@ class CustomDomain(Base, ModelMixin):
@classmethod
def create(cls, **kwargs):
domain = kwargs.get("domain")
kwargs["domain"] = domain.replace("\n", "")
if DeletedSubdomain.get_by(domain=domain):
raise SubdomainInTrashError
@ -2314,6 +2469,13 @@ class CustomDomain(Base, ModelMixin):
if obj.is_sl_subdomain:
DeletedSubdomain.create(domain=obj.domain)
from app import alias_utils
for alias in Alias.filter_by(custom_domain_id=obj_id):
alias_utils.delete_alias(
alias, obj.user, AliasDeleteReason.CustomDomainDeleted
)
return super(CustomDomain, cls).delete(obj_id)
@property
@ -2386,6 +2548,12 @@ class DomainDeletedAlias(Base, ModelMixin):
domain = orm.relationship(CustomDomain)
user = orm.relationship(User, foreign_keys=[user_id])
reason = sa.Column(
IntEnumType(AliasDeleteReason),
nullable=False,
default=AliasDeleteReason.Unspecified,
server_default=str(AliasDeleteReason.Unspecified.value),
)
@classmethod
def create(cls, **kw):
@ -2477,7 +2645,7 @@ class Directory(Base, ModelMixin):
for alias in Alias.filter_by(directory_id=obj_id):
from app import alias_utils
alias_utils.delete_alias(alias, user)
alias_utils.delete_alias(alias, user, AliasDeleteReason.DirectoryDeleted)
DeletedDirectory.create(name=obj.name)
cls.filter(cls.id == obj_id).delete()
@ -2504,10 +2672,13 @@ class Job(Base, ModelMixin):
nullable=False,
server_default=str(JobState.ready.value),
default=JobState.ready.value,
index=True,
)
attempts = sa.Column(sa.Integer, nullable=False, server_default="0", default=0)
taken_at = sa.Column(ArrowType, nullable=True)
__table_args__ = (Index("ix_state_run_at_taken_at", state, run_at, taken_at),)
def __repr__(self):
return f"<Job {self.id} {self.name} {self.payload}>"
@ -2553,10 +2724,37 @@ class Mailbox(Base, ModelMixin):
return False
def nb_alias(self):
return (
AliasMailbox.filter_by(mailbox_id=self.id).count()
+ Alias.filter_by(mailbox_id=self.id).count()
alias_ids = set(
am.alias_id
for am in AliasMailbox.filter_by(mailbox_id=self.id).values(
AliasMailbox.alias_id
)
)
for alias in Alias.filter_by(mailbox_id=self.id).values(Alias.id):
alias_ids.add(alias.id)
return len(alias_ids)
def is_proton(self) -> bool:
if (
self.email.endswith("@proton.me")
or self.email.endswith("@protonmail.com")
or self.email.endswith("@protonmail.ch")
or self.email.endswith("@proton.ch")
or self.email.endswith("@pm.me")
):
return True
from app.email_utils import get_email_local_part
mx_domains: [(int, str)] = get_mx_domains(get_email_local_part(self.email))
# Proton is the first domain
if mx_domains and mx_domains[0][1] in (
"mail.protonmail.ch.",
"mailsec.protonmail.ch.",
):
return True
return False
@classmethod
def delete(cls, obj_id):
@ -2575,7 +2773,7 @@ class Mailbox(Base, ModelMixin):
from app import alias_utils
# only put aliases that have mailbox as a single mailbox into trash
alias_utils.delete_alias(alias, user)
alias_utils.delete_alias(alias, user, AliasDeleteReason.MailboxDeleted)
Session.commit()
cls.filter(cls.id == obj_id).delete()
@ -2583,12 +2781,21 @@ class Mailbox(Base, ModelMixin):
@property
def aliases(self) -> [Alias]:
ret = Alias.filter_by(mailbox_id=self.id).all()
ret = dict(
(alias.id, alias) for alias in Alias.filter_by(mailbox_id=self.id).all()
)
for am in AliasMailbox.filter_by(mailbox_id=self.id):
ret.append(am.alias)
if am.alias_id not in ret:
ret[am.alias_id] = am.alias
return ret
return list(ret.values())
@classmethod
def create(cls, **kw):
if "email" in kw:
kw["email"] = sanitize_email(kw["email"])
return super().create(**kw)
def __repr__(self):
return f"<Mailbox {self.id} {self.email}>"
@ -2812,11 +3019,7 @@ class RecoveryCode(Base, ModelMixin):
@classmethod
def find_by_user_code(cls, user: User, code: str):
hashed_code = cls._hash_code(code)
# TODO: Only return hashed codes once there aren't unhashed codes in the db.
found_code = cls.get_by(user_id=user.id, code=hashed_code)
if found_code:
return found_code
return cls.get_by(user_id=user.id, code=code)
return cls.get_by(user_id=user.id, code=hashed_code)
@classmethod
def empty(cls, user):
@ -2827,7 +3030,9 @@ class RecoveryCode(Base, ModelMixin):
class Notification(Base, ModelMixin):
__tablename__ = "notification"
user_id = sa.Column(sa.ForeignKey(User.id, ondelete="cascade"), nullable=False)
user_id = sa.Column(
sa.ForeignKey(User.id, ondelete="cascade"), nullable=False, index=True
)
message = sa.Column(sa.Text, nullable=False)
title = sa.Column(sa.String(512))
@ -2928,6 +3133,8 @@ class Monitoring(Base, ModelMixin):
active_queue = sa.Column(sa.Integer, nullable=False)
deferred_queue = sa.Column(sa.Integer, nullable=False)
__table_args__ = (Index("ix_monitoring_created_at", "created_at"),)
class BatchImport(Base, ModelMixin):
__tablename__ = "batch_import"
@ -3053,6 +3260,8 @@ class Bounce(Base, ModelMixin):
email = sa.Column(sa.String(256), nullable=False, index=True)
info = sa.Column(sa.Text, nullable=True)
__table_args__ = (sa.Index("ix_bounce_created_at", "created_at"),)
class TransactionalEmail(Base, ModelMixin):
"""Storing all email addresses that receive transactional emails, including account email and mailboxes.
@ -3062,6 +3271,22 @@ class TransactionalEmail(Base, ModelMixin):
__tablename__ = "transactional_email"
email = sa.Column(sa.String(256), nullable=False, unique=False)
__table_args__ = (sa.Index("ix_transactional_email_created_at", "created_at"),)
@classmethod
def create(cls, **kw):
# whether to call Session.commit
commit = kw.pop("commit", False)
r = cls(**kw)
if not config.STORE_TRANSACTIONAL_EMAILS:
return r
Session.add(r)
if commit:
Session.commit()
return r
class Payout(Base, ModelMixin):
"""Referral payouts"""
@ -3114,7 +3339,7 @@ class MessageIDMatching(Base, ModelMixin):
# to track what email_log that has created this matching
email_log_id = sa.Column(
sa.ForeignKey("email_log.id", ondelete="cascade"), nullable=True
sa.ForeignKey("email_log.id", ondelete="cascade"), nullable=True, index=True
)
email_log = orm.relationship("EmailLog")
@ -3252,6 +3477,15 @@ class AdminAuditLog(Base):
},
)
@classmethod
def stop_trial(cls, admin_user_id: int, user_id: int):
cls.create(
admin_user_id=admin_user_id,
action=AuditLogActionEnum.stop_trial.value,
model="User",
model_id=user_id,
)
@classmethod
def disable_otp_fido(
cls, admin_user_id: int, user_id: int, had_otp: bool, had_fido: bool
@ -3447,7 +3681,7 @@ class PartnerSubscription(Base, ModelMixin):
class Newsletter(Base, ModelMixin):
__tablename__ = "newsletter"
subject = sa.Column(sa.String(), nullable=False, unique=True, index=True)
subject = sa.Column(sa.String(), nullable=False, index=True)
html = sa.Column(sa.Text)
plain_text = sa.Column(sa.Text)
@ -3485,3 +3719,52 @@ class ApiToCookieToken(Base, ModelMixin):
code = secrets.token_urlsafe(32)
return super().create(code=code, **kwargs)
class SyncEvent(Base, ModelMixin):
"""This model holds the events that need to be sent to the webhook"""
__tablename__ = "sync_event"
content = sa.Column(sa.LargeBinary, unique=False, nullable=False)
taken_time = sa.Column(
ArrowType, default=None, nullable=True, server_default=None, index=True
)
__table_args__ = (
sa.Index("ix_sync_event_created_at", "created_at"),
sa.Index("ix_sync_event_taken_time", "taken_time"),
)
def mark_as_taken(self) -> bool:
sql = """
UPDATE sync_event
SET taken_time = :taken_time
WHERE id = :sync_event_id
AND taken_time IS NULL
"""
args = {"taken_time": arrow.now().datetime, "sync_event_id": self.id}
res = Session.execute(sql, args)
Session.commit()
return res.rowcount > 0
@classmethod
def get_dead_letter(cls, older_than: Arrow) -> [SyncEvent]:
return (
SyncEvent.filter(
(
(
SyncEvent.taken_time.isnot(None)
& (SyncEvent.taken_time < older_than)
)
| (
SyncEvent.taken_time.is_(None)
& (SyncEvent.created_at < older_than)
)
)
)
.order_by(SyncEvent.id)
.limit(100)
.all()
)

View File

@ -1 +1,3 @@
from . import views
__all__ = ["views"]

View File

@ -1 +1,3 @@
from .views import authorize, token, user_info
__all__ = ["authorize", "token", "user_info"]

View File

@ -140,7 +140,7 @@ def authorize():
Scope=Scope,
)
else: # POST - user allows or denies
if not current_user.is_authenticated or not current_user.is_active:
if not current_user.is_authenticated or not current_user.is_active():
LOG.i(
"Attempt to validate a OAUth allow request by an unauthenticated user"
)

View File

@ -64,7 +64,7 @@ def _split_arg(arg_input: Union[str, list]) -> Set[str]:
- the response_type/scope passed as a list ?scope=scope_1&scope=scope_2
"""
res = set()
if type(arg_input) is str:
if isinstance(arg_input, str):
if " " in arg_input:
for x in arg_input.split(" "):
if x:

View File

@ -5,3 +5,11 @@ from .views import (
account_activated,
extension_redirect,
)
__all__ = [
"index",
"final",
"setup_done",
"account_activated",
"extension_redirect",
]

View File

@ -20,7 +20,7 @@ def final():
if form.validate_on_submit():
alias = Alias.get_by(email=form.email.data)
if alias and alias.user_id == current_user.id:
send_test_email_alias(alias.email, current_user.name)
send_test_email_alias(current_user, alias.email)
flash("An email is sent to your alias", "success")
return render_template(

View File

@ -27,6 +27,7 @@ def failed_payment(sub: Subscription, subscription_id: str):
"SimpleLogin - your subscription has failed to be renewed",
render(
"transactional/subscription-cancel.txt",
user=user,
end_date=arrow.arrow.datetime.utcnow(),
),
)

View File

@ -39,7 +39,6 @@ class _InnerLock:
lock_redis.storage.delete(lock_name)
def __call__(self, f: Callable[..., Any]):
if self.lock_suffix is None:
lock_suffix = f.__name__
else:

View File

@ -5,3 +5,11 @@ from .views import (
provider1_callback,
provider2_callback,
)
__all__ = [
"index",
"phone_reservation",
"twilio_callback",
"provider1_callback",
"provider2_callback",
]

View File

@ -7,11 +7,12 @@ from typing import Optional
from app.account_linking import SLPlan, SLPlanType
from app.config import PROTON_EXTRA_HEADER_NAME, PROTON_EXTRA_HEADER_VALUE
from app.errors import ProtonAccountNotVerified
from app.log import LOG
_APP_VERSION = "OauthClient_1.0.0"
PROTON_ERROR_CODE_NOT_EXISTS = 2501
PROTON_ERROR_CODE_HV_NEEDED = 9001
PLAN_FREE = 1
PLAN_PREMIUM = 2
@ -57,6 +58,15 @@ def convert_access_token(access_token_response: str) -> AccessCredentials:
)
def handle_response_not_ok(status: int, body: dict, text: str) -> Exception:
if status == HTTPStatus.UNPROCESSABLE_ENTITY:
res_code = body.get("Code")
if res_code == PROTON_ERROR_CODE_HV_NEEDED:
return ProtonAccountNotVerified()
return Exception(f"Unexpected status code. Wanted 200 and got {status}: " + text)
class ProtonClient(ABC):
@abstractmethod
def get_user(self) -> Optional[UserInformation]:
@ -124,11 +134,11 @@ class HttpProtonClient(ProtonClient):
@staticmethod
def __validate_response(res: Response) -> dict:
status = res.status_code
if status != HTTPStatus.OK:
raise Exception(
f"Unexpected status code. Wanted 200 and got {status}: " + res.text
)
as_json = res.json()
if status != HTTPStatus.OK:
raise HttpProtonClient.__handle_response_not_ok(
status=status, body=as_json, text=res.text
)
res_code = as_json.get("Code")
if not res_code or res_code != 1000:
raise Exception(

View File

@ -2,6 +2,7 @@ from newrelic import agent
from typing import Optional
from app.db import Session
from app.log import LOG
from app.errors import ProtonPartnerNotSetUp
from app.models import Partner, PartnerUser, User
@ -30,6 +31,7 @@ def perform_proton_account_unlink(current_user: User):
user_id=current_user.id, partner_id=proton_partner.id
)
if partner_user is not None:
LOG.info(f"User {current_user} has unlinked the account from {partner_user}")
PartnerUser.delete(partner_user.id)
Session.commit()
agent.record_custom_event("AccountUnlinked", {"partner": proton_partner.name})

app/app/rate_limiter.py
View File

@ -0,0 +1,42 @@
from datetime import datetime
from typing import Optional
import newrelic.agent
import redis.exceptions
import werkzeug.exceptions
from limits.storage import RedisStorage
from app.log import LOG
lock_redis: Optional[RedisStorage] = None
def set_redis_concurrent_lock(redis: RedisStorage):
global lock_redis
lock_redis = redis
def check_bucket_limit(
lock_name: Optional[str] = None,
max_hits: int = 5,
bucket_seconds: int = 3600,
):
# Calculate current bucket time
int_time = int(datetime.utcnow().timestamp())
bucket_id = int_time - (int_time % bucket_seconds)
bucket_lock_name = f"bl:{lock_name}:{bucket_id}"
if not lock_redis:
return
try:
value = lock_redis.incr(bucket_lock_name, bucket_seconds)
if value > max_hits:
LOG.i(
f"Rate limit hit for {lock_name} (bucket id {bucket_id}) -> {value}/{max_hits}"
)
newrelic.agent.record_custom_event(
"BucketRateLimit",
{"lock_name": lock_name, "bucket_seconds": bucket_seconds},
)
raise werkzeug.exceptions.TooManyRequests()
except (redis.exceptions.RedisError, AttributeError):
LOG.e("Cannot connect to redis")

View File

@ -2,21 +2,23 @@ import flask
import limits.storage
from app.parallel_limiter import set_redis_concurrent_lock
from app.rate_limiter import set_redis_concurrent_lock as rate_limit_set_redis
from app.session import RedisSessionStore
def initialize_redis_services(app: flask.Flask, redis_url: str):
if redis_url.startswith("redis://") or redis_url.startswith("rediss://"):
storage = limits.storage.RedisStorage(redis_url)
app.session_interface = RedisSessionStore(storage.storage, storage.storage, app)
set_redis_concurrent_lock(storage)
rate_limit_set_redis(storage)
elif redis_url.startswith("redis+sentinel://"):
storage = limits.storage.RedisSentinelStorage(redis_url)
app.session_interface = RedisSessionStore(
storage.storage, storage.storage_slave, app
)
set_redis_concurrent_lock(storage)
rate_limit_set_redis(storage)
else:
raise RuntimeError(
f"Tried to set_redis_session with an invalid redis url: ${redis_url}"

View File

@ -5,36 +5,39 @@ from typing import Optional
import boto3
import requests
from app.config import (
AWS_REGION,
BUCKET,
AWS_ACCESS_KEY_ID,
AWS_SECRET_ACCESS_KEY,
LOCAL_FILE_UPLOAD,
UPLOAD_DIR,
URL,
)
from app import config
from app.log import LOG
if not LOCAL_FILE_UPLOAD:
_session = boto3.Session(
aws_access_key_id=AWS_ACCESS_KEY_ID,
aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
region_name=AWS_REGION,
)
_s3_client = None
def upload_from_bytesio(key: str, bs: BytesIO, content_type="string"):
def _get_s3client():
global _s3_client
if _s3_client is None:
args = {
"aws_access_key_id": config.AWS_ACCESS_KEY_ID,
"aws_secret_access_key": config.AWS_SECRET_ACCESS_KEY,
"region_name": config.AWS_REGION,
}
if config.AWS_ENDPOINT_URL:
args["endpoint_url"] = config.AWS_ENDPOINT_URL
_s3_client = boto3.client("s3", **args)
return _s3_client
def upload_from_bytesio(key: str, bs: BytesIO, content_type="application/octet-stream"):
bs.seek(0)
if LOCAL_FILE_UPLOAD:
file_path = os.path.join(UPLOAD_DIR, key)
if config.LOCAL_FILE_UPLOAD:
file_path = os.path.join(config.UPLOAD_DIR, key)
file_dir = os.path.dirname(file_path)
os.makedirs(file_dir, exist_ok=True)
with open(file_path, "wb") as f:
f.write(bs.read())
else:
_session.resource("s3").Bucket(BUCKET).put_object(
_get_s3client().put_object(
Bucket=config.BUCKET,
Key=key,
Body=bs,
ContentType=content_type,
@ -44,15 +47,16 @@ def upload_from_bytesio(key: str, bs: BytesIO, content_type="string"):
def upload_email_from_bytesio(path: str, bs: BytesIO, filename):
bs.seek(0)
if LOCAL_FILE_UPLOAD:
file_path = os.path.join(UPLOAD_DIR, path)
if config.LOCAL_FILE_UPLOAD:
file_path = os.path.join(config.UPLOAD_DIR, path)
file_dir = os.path.dirname(file_path)
os.makedirs(file_dir, exist_ok=True)
with open(file_path, "wb") as f:
f.write(bs.read())
else:
_session.resource("s3").Bucket(BUCKET).put_object(
_get_s3client().put_object(
Bucket=config.BUCKET,
Key=path,
Body=bs,
# Support saving a remote file using Http header
@ -63,16 +67,13 @@ def upload_email_from_bytesio(path: str, bs: BytesIO, filename):
def download_email(path: str) -> Optional[str]:
if LOCAL_FILE_UPLOAD:
file_path = os.path.join(UPLOAD_DIR, path)
if config.LOCAL_FILE_UPLOAD:
file_path = os.path.join(config.UPLOAD_DIR, path)
with open(file_path, "rb") as f:
return f.read()
resp = (
_session.resource("s3")
.Bucket(BUCKET)
.get_object(
Key=path,
)
resp = _get_s3client().get_object(
Bucket=config.BUCKET,
Key=path,
)
if not resp or "Body" not in resp:
return None
@ -85,20 +86,30 @@ def upload_from_url(url: str, upload_path):
def get_url(key: str, expires_in=3600) -> str:
if LOCAL_FILE_UPLOAD:
return URL + "/static/upload/" + key
if config.LOCAL_FILE_UPLOAD:
return config.URL + "/static/upload/" + key
else:
s3_client = _session.client("s3")
return s3_client.generate_presigned_url(
return _get_s3client().generate_presigned_url(
ExpiresIn=expires_in,
ClientMethod="get_object",
Params={"Bucket": BUCKET, "Key": key},
Params={"Bucket": config.BUCKET, "Key": key},
)
def delete(path: str):
if LOCAL_FILE_UPLOAD:
os.remove(os.path.join(UPLOAD_DIR, path))
if config.LOCAL_FILE_UPLOAD:
file_path = os.path.join(config.UPLOAD_DIR, path)
os.remove(file_path)
else:
o = _session.resource("s3").Bucket(BUCKET).Object(path)
o.delete()
_get_s3client().delete_object(Bucket=config.BUCKET, Key=path)
def create_bucket_if_not_exists():
s3client = _get_s3client()
buckets = s3client.list_buckets()
for bucket in buckets["Buckets"]:
if bucket["Name"] == config.BUCKET:
LOG.i("Bucket already exists")
return
s3client.create_bucket(Bucket=config.BUCKET)
LOG.i(f"Bucket {config.BUCKET} created")

View File

@ -75,7 +75,7 @@ class RedisSessionStore(SessionInterface):
try:
data = pickle.loads(val)
return ServerSession(data, session_id=session_id)
except:
except Exception:
pass
return ServerSession(session_id=str(uuid.uuid4()))
@ -87,6 +87,7 @@ class RedisSessionStore(SessionInterface):
httponly = self.get_cookie_httponly(app)
secure = self.get_cookie_secure(app)
expires = self.get_expiration_time(app, session)
samesite = self.get_cookie_samesite(app)
val = pickle.dumps(dict(session))
ttl = int(app.permanent_session_lifetime.total_seconds())
# Only 5 minutes for non-authenticated sessions.
@ -109,6 +110,7 @@ class RedisSessionStore(SessionInterface):
domain=domain,
path=path,
secure=secure,
samesite=samesite,
)

View File

@ -2,6 +2,8 @@ import requests
from requests import RequestException
from app import config
from app.events.event_dispatcher import EventDispatcher
from app.events.generated.event_pb2 import EventContent, UserPlanChanged
from app.log import LOG
from app.models import User
@ -31,3 +33,6 @@ def execute_subscription_webhook(user: User):
)
except RequestException as e:
LOG.error(f"Subscription request exception: {e}")
event = UserPlanChanged(plan_end_time=sl_subscription_end)
EventDispatcher.send_event(user, EventContent(user_plan_change=event))

View File

@ -49,11 +49,11 @@ def random_string(length=10, include_digits=False):
def convert_to_id(s: str):
"""convert a string to id-like: remove space, remove special accent"""
s = s.replace(" ", "")
s = s.lower()
s = unidecode(s)
s = s.replace(" ", "")
return s
return s[:256]
_ALLOWED_CHARS = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789_-."
@ -99,7 +99,7 @@ def sanitize_email(email_address: str, not_lower=False) -> str:
email_address = email_address.strip().replace(" ", "").replace("\n", " ")
if not not_lower:
email_address = email_address.lower()
return email_address
return email_address.replace("\u200f", "")
class NextUrlSanitizer:

View File

@ -5,11 +5,11 @@ from typing import List, Tuple
import arrow
import requests
from sqlalchemy import func, desc, or_
from sqlalchemy import func, desc, or_, and_
from sqlalchemy.ext.compiler import compiles
from sqlalchemy.orm import joinedload
from sqlalchemy.orm.exc import ObjectDeletedError
from sqlalchemy.sql import Insert
from sqlalchemy.sql import Insert, text
from app import s3, config
from app.alias_utils import nb_email_log_for_mailbox
@ -22,10 +22,9 @@ from app.email_utils import (
render,
email_can_be_used_as_mailbox,
send_email_with_rate_control,
normalize_reply_email,
is_valid_email,
get_email_domain_part,
)
from app.email_validation import is_valid_email, normalize_reply_email
from app.errors import ProtonPartnerNotSetUp
from app.log import LOG
from app.mail_sender import load_unsent_mails_from_fs_and_resend
@ -62,16 +61,23 @@ from app.pgp_utils import load_public_key_and_check, PGPException
from app.proton.utils import get_proton_partner
from app.utils import sanitize_email
from server import create_light_app
from tasks.cleanup_old_imports import cleanup_old_imports
from tasks.cleanup_old_jobs import cleanup_old_jobs
from tasks.cleanup_old_notifications import cleanup_old_notifications
DELETE_GRACE_DAYS = 30
def notify_trial_end():
for user in User.filter(
User.activated.is_(True), User.trial_end.isnot(None), User.lifetime.is_(False)
User.activated.is_(True),
User.trial_end.isnot(None),
User.trial_end >= arrow.now().shift(days=2),
User.trial_end < arrow.now().shift(days=3),
User.lifetime.is_(False),
).all():
try:
if user.in_trial() and arrow.now().shift(
days=3
) > user.trial_end >= arrow.now().shift(days=2):
if user.in_trial():
LOG.d("Send trial end email to user %s", user)
send_trial_end_soon_email(user)
# happens if user has been deleted in the meantime
@ -84,27 +90,49 @@ def delete_logs():
delete_refused_emails()
delete_old_monitoring()
for t in TransactionalEmail.filter(
for t_email in TransactionalEmail.filter(
TransactionalEmail.created_at < arrow.now().shift(days=-7)
):
TransactionalEmail.delete(t.id)
TransactionalEmail.delete(t_email.id)
for b in Bounce.filter(Bounce.created_at < arrow.now().shift(days=-7)):
Bounce.delete(b.id)
Session.commit()
LOG.d("Delete EmailLog older than 2 weeks")
LOG.d("Deleting EmailLog older than 2 weeks")
max_dt = arrow.now().shift(weeks=-2)
nb_deleted = EmailLog.filter(EmailLog.created_at < max_dt).delete()
Session.commit()
total_deleted = 0
batch_size = 500
Session.execute("set session statement_timeout=30000").rowcount
queries_done = 0
cutoff_time = arrow.now().shift(days=-14)
rows_to_delete = EmailLog.filter(EmailLog.created_at < cutoff_time).count()
expected_queries = int(rows_to_delete / batch_size)
sql = text(
"DELETE FROM email_log WHERE id IN (SELECT id FROM email_log WHERE created_at < :cutoff_time order by created_at limit :batch_size)"
)
str_cutoff_time = cutoff_time.isoformat()
while total_deleted < rows_to_delete:
deleted_count = Session.execute(
sql, {"cutoff_time": str_cutoff_time, "batch_size": batch_size}
).rowcount
Session.commit()
total_deleted += deleted_count
queries_done += 1
LOG.i(
f"[{queries_done}/{expected_queries}] Deleted {total_deleted} EmailLog entries"
)
if deleted_count < batch_size:
break
LOG.i("Delete %s email logs", nb_deleted)
LOG.i("Deleted %s email logs", total_deleted)
def delete_refused_emails():
for refused_email in RefusedEmail.filter_by(deleted=False).all():
for refused_email in (
RefusedEmail.filter_by(deleted=False).order_by(RefusedEmail.id).all()
):
if arrow.now().shift(days=1) > refused_email.delete_at >= arrow.now():
LOG.d("Delete refused email %s", refused_email)
if refused_email.path:
@ -138,7 +166,7 @@ def notify_premium_end():
send_email(
user.email,
f"Your subscription will end soon",
"Your subscription will end soon",
render(
"transactional/subscription-end.txt",
user=user,
@ -195,7 +223,7 @@ def notify_manual_sub_end():
LOG.d("Remind user %s that their manual sub is ending soon", user)
send_email(
user.email,
f"Your subscription will end soon",
"Your subscription will end soon",
render(
"transactional/manual-subscription-end.txt",
user=user,
@ -238,11 +266,13 @@ def notify_manual_sub_end():
"Your SimpleLogin subscription will end soon",
render(
"transactional/coinbase/reminder-subscription.txt",
user=user,
coinbase_subscription=coinbase_subscription,
extend_subscription_url=extend_subscription_url,
),
render(
"transactional/coinbase/reminder-subscription.html",
user=user,
coinbase_subscription=coinbase_subscription,
extend_subscription_url=extend_subscription_url,
),
@ -272,7 +302,11 @@ def compute_metric2() -> Metric2:
_24h_ago = now.shift(days=-1)
nb_referred_user_paid = 0
for user in User.filter(User.referral_id.isnot(None)):
for user in (
User.filter(User.referral_id.isnot(None))
.yield_per(500)
.enable_eagerloads(False)
):
if user.is_paid():
nb_referred_user_paid += 1
@ -563,21 +597,21 @@ nb_total_bounced_last_24h: {stats_today.nb_total_bounced_last_24h} - {increase_p
"""
monitoring_report += "\n====================================\n"
monitoring_report += f"""
monitoring_report += """
# Account bounce report:
"""
for email, bounces in bounce_report():
monitoring_report += f"{email}: {bounces}\n"
monitoring_report += f"""\n
monitoring_report += """\n
# Alias creation report:
"""
for email, nb_alias, date in alias_creation_report():
monitoring_report += f"{email}, {date}: {nb_alias}\n"
monitoring_report += f"""\n
monitoring_report += """\n
# Full bounce detail report:
"""
monitoring_report += all_bounce_report()
@ -794,10 +828,12 @@ def check_mailbox_valid_domain():
f"Mailbox {mailbox.email} is disabled",
render(
"transactional/disable-mailbox-warning.txt.jinja2",
user=mailbox.user,
mailbox=mailbox,
),
render(
"transactional/disable-mailbox-warning.html",
user=mailbox.user,
mailbox=mailbox,
),
retries=3,
@ -852,6 +888,7 @@ def check_mailbox_valid_pgp_keys():
f"Mailbox {mailbox.email}'s PGP Key is invalid",
render(
"transactional/invalid-mailbox-pgp-key.txt.jinja2",
user=mailbox.user,
mailbox=mailbox,
),
retries=3,
@ -892,6 +929,7 @@ def check_single_custom_domain(custom_domain):
f"Please update {custom_domain.domain} DNS on SimpleLogin",
render(
"transactional/custom-domain-dns-issue.txt.jinja2",
user=user,
custom_domain=custom_domain,
domain_dns_url=domain_dns_url,
),
@ -933,6 +971,9 @@ async def _hibp_check(api_key, queue):
This function is to be run simultaneously (multiple _hibp_check functions with different keys on the same queue) to make maximum use of multiple API keys.
"""
default_rate_sleep = (60.0 / config.HIBP_RPM) + 0.1
rate_sleep = default_rate_sleep
rate_hit_counter = 0
while True:
try:
alias_id = queue.get_nowait()
@ -940,9 +981,14 @@ async def _hibp_check(api_key, queue):
return
alias = Alias.get(alias_id)
# an alias can be deleted in the meantime
if not alias:
return
continue
user = alias.user
if user.disabled or not user.is_paid():
# Mark it as hibp done to skip it as if it had been checked
alias.hibp_last_check = arrow.utcnow()
Session.commit()
continue
LOG.d("Checking HIBP for %s", alias)
@ -954,7 +1000,6 @@ async def _hibp_check(api_key, queue):
f"https://haveibeenpwned.com/api/v3/breachedaccount/{urllib.parse.quote(alias.email)}",
headers=request_headers,
)
if r.status_code == 200:
# Breaches found
alias.hibp_breaches = [
@ -962,20 +1007,27 @@ async def _hibp_check(api_key, queue):
]
if len(alias.hibp_breaches) > 0:
LOG.w("%s appears in HIBP breaches %s", alias, alias.hibp_breaches)
if rate_hit_counter > 0:
rate_hit_counter -= 1
elif r.status_code == 404:
# No breaches found
alias.hibp_breaches = []
elif r.status_code == 429:
# rate limited
LOG.w("HIBP rate limited, check alias %s in the next run", alias)
await asyncio.sleep(1.6)
return
rate_hit_counter += 1
rate_sleep = default_rate_sleep + (0.2 * rate_hit_counter)
if rate_hit_counter > 10:
LOG.w(f"HIBP rate limited too many times stopping with alias {alias}")
return
# Just sleep for a while
asyncio.sleep(5)
elif r.status_code > 500:
LOG.w("HIBP server 5** error %s", r.status_code)
return
else:
LOG.error(
"An error occured while checking alias %s: %s - %s",
"An error occurred while checking alias %s: %s - %s",
alias,
r.status_code,
r.text,
@ -986,9 +1038,63 @@ async def _hibp_check(api_key, queue):
Session.add(alias)
Session.commit()
LOG.d("Updated breaches info for %s", alias)
LOG.d("Updated breach info for %s", alias)
await asyncio.sleep(rate_sleep)
await asyncio.sleep(1.6)
def get_alias_to_check_hibp(
oldest_hibp_allowed: arrow.Arrow,
user_ids_to_skip: list[int],
min_alias_id: int,
max_alias_id: int,
):
now = arrow.now()
alias_query = (
Session.query(Alias)
.join(User, User.id == Alias.user_id)
.join(Subscription, User.id == Subscription.user_id, isouter=True)
.join(ManualSubscription, User.id == ManualSubscription.user_id, isouter=True)
.join(AppleSubscription, User.id == AppleSubscription.user_id, isouter=True)
.join(
CoinbaseSubscription,
User.id == CoinbaseSubscription.user_id,
isouter=True,
)
.join(PartnerUser, User.id == PartnerUser.user_id, isouter=True)
.join(
PartnerSubscription,
PartnerSubscription.partner_user_id == PartnerUser.id,
isouter=True,
)
.filter(
or_(
Alias.hibp_last_check.is_(None),
Alias.hibp_last_check < oldest_hibp_allowed,
),
Alias.user_id.notin_(user_ids_to_skip),
Alias.enabled,
Alias.id >= min_alias_id,
Alias.id < max_alias_id,
User.disabled == False, # noqa: E712
User.enable_data_breach_check,
or_(
User.lifetime,
ManualSubscription.end_at > now,
Subscription.next_bill_date > now.date(),
AppleSubscription.expires_date > now,
CoinbaseSubscription.end_at > now,
PartnerSubscription.end_at > now,
),
)
)
if config.HIBP_SKIP_PARTNER_ALIAS:
alias_query = alias_query.filter(
Alias.flags.op("&")(Alias.FLAG_PARTNER_CREATED) == 0
)
for alias in (
alias_query.order_by(Alias.id.asc()).enable_eagerloads(False).yield_per(500)
):
yield alias
async def check_hibp():
@ -1011,39 +1117,49 @@ async def check_hibp():
Session.commit()
LOG.d("Updated list of known breaches")
LOG.d("Preparing list of aliases to check")
LOG.d("Getting the list of users to skip")
query = "select u.id, count(a.id) from users u, alias a where a.user_id=u.id group by u.id having count(a.id) > :max_alias"
rows = Session.execute(query, {"max_alias": config.HIBP_MAX_ALIAS_CHECK})
user_ids = [row[0] for row in rows]
LOG.d("Got %d users to skip" % len(user_ids))
LOG.d("Checking aliases")
queue = asyncio.Queue()
max_date = arrow.now().shift(days=-config.HIBP_SCAN_INTERVAL_DAYS)
for alias in (
Alias.filter(
or_(Alias.hibp_last_check.is_(None), Alias.hibp_last_check < max_date)
min_alias_id = 0
max_alias_id = Session.query(func.max(Alias.id)).scalar()
step = 10000
now = arrow.now()
oldest_hibp_allowed = now.shift(days=-config.HIBP_SCAN_INTERVAL_DAYS)
alias_checked = 0
for alias_batch_id in range(min_alias_id, max_alias_id, step):
for alias in get_alias_to_check_hibp(
oldest_hibp_allowed, user_ids, alias_batch_id, alias_batch_id + step
):
await queue.put(alias.id)
alias_checked += queue.qsize()
LOG.d(
f"Need to check about {queue.qsize()} aliases in this loop {alias_batch_id}/{max_alias_id}"
)
.filter(Alias.enabled)
.order_by(Alias.hibp_last_check.asc())
.all()
):
await queue.put(alias.id)
LOG.d("Need to check about %s aliases", queue.qsize())
# Start one checking process per API key
# Each checking process will take one alias from the queue, get the info
# and then sleep for 1.5 seconds (due to HIBP API request limits)
checkers = []
for i in range(len(config.HIBP_API_KEYS)):
checker = asyncio.create_task(
_hibp_check(
config.HIBP_API_KEYS[i],
queue,
# Start one checking process per API key
# Each checking process will take one alias from the queue, get the info
# and then sleep for 1.5 seconds (due to HIBP API request limits)
checkers = []
for i in range(len(config.HIBP_API_KEYS)):
checker = asyncio.create_task(
_hibp_check(
config.HIBP_API_KEYS[i],
queue,
)
)
)
checkers.append(checker)
checkers.append(checker)
# Wait until all checking processes are done
for checker in checkers:
await checker
# Wait until all checking processes are done
for checker in checkers:
await checker
LOG.d("Done checking HIBP API for aliases in breaches")
LOG.d(f"Done checking {alias_checked} HIBP API for aliases in breaches")
def notify_hibp():
@ -1071,14 +1187,14 @@ def notify_hibp():
)
LOG.d(
f"Send new breaches found email to %s for %s breaches aliases",
"Send new breaches found email to %s for %s breaches aliases",
user,
len(breached_aliases),
)
send_email(
user.email,
f"You were in a data breach",
"You were in a data breach",
render(
"transactional/hibp-new-breaches.txt.jinja2",
user=user,
@ -1098,6 +1214,30 @@ def notify_hibp():
Session.commit()
def clear_users_scheduled_to_be_deleted(dry_run=False):
users = User.filter(
and_(
User.delete_on.isnot(None),
User.delete_on <= arrow.now().shift(days=-DELETE_GRACE_DAYS),
)
).all()
for user in users:
LOG.i(
f"Scheduled deletion of user {user} with scheduled delete on {user.delete_on}"
)
if dry_run:
continue
User.delete(user.id)
Session.commit()
def delete_old_data():
oldest_valid = arrow.now().shift(days=-config.KEEP_OLD_DATA_DAYS)
cleanup_old_imports(oldest_valid)
cleanup_old_jobs(oldest_valid)
cleanup_old_notifications(oldest_valid)
if __name__ == "__main__":
LOG.d("Start running cronjob")
parser = argparse.ArgumentParser()
@ -1112,6 +1252,7 @@ if __name__ == "__main__":
"notify_manual_subscription_end",
"notify_premium_end",
"delete_logs",
"delete_old_data",
"poll_apple_subscription",
"sanity_check",
"delete_old_monitoring",
@ -1140,6 +1281,9 @@ if __name__ == "__main__":
elif args.job == "delete_logs":
LOG.d("Deleted Logs")
delete_logs()
elif args.job == "delete_old_data":
LOG.d("Delete old data")
delete_old_data()
elif args.job == "poll_apple_subscription":
LOG.d("Poll Apple Subscriptions")
poll_apple_subscription()
@ -1164,3 +1308,6 @@ if __name__ == "__main__":
elif args.job == "send_undelivered_mails":
LOG.d("Sending undelivered emails")
load_unsent_mails_from_fs_and_resend()
elif args.job == "delete_scheduled_users":
LOG.d("Deleting users scheduled to be deleted")
clear_users_scheduled_to_be_deleted(dry_run=True)

View File

@ -5,65 +5,72 @@ jobs:
schedule: "0 0 * * *"
captureStderr: true
- name: SimpleLogin Notify Trial Ends
command: python /code/cron.py -j notify_trial_end
shell: /bin/bash
schedule: "0 8 * * *"
captureStderr: true
- name: SimpleLogin Notify Manual Subscription Ends
command: python /code/cron.py -j notify_manual_subscription_end
shell: /bin/bash
schedule: "0 9 * * *"
captureStderr: true
- name: SimpleLogin Notify Premium Ends
command: python /code/cron.py -j notify_premium_end
shell: /bin/bash
schedule: "0 10 * * *"
captureStderr: true
- name: SimpleLogin Delete Logs
command: python /code/cron.py -j delete_logs
shell: /bin/bash
schedule: "0 11 * * *"
captureStderr: true
- name: SimpleLogin Poll Apple Subscriptions
command: python /code/cron.py -j poll_apple_subscription
shell: /bin/bash
schedule: "0 12 * * *"
captureStderr: true
- name: SimpleLogin Sanity Check
command: python /code/cron.py -j sanity_check
shell: /bin/bash
schedule: "0 2 * * *"
captureStderr: true
- name: SimpleLogin Delete Old Monitoring records
command: python /code/cron.py -j delete_old_monitoring
shell: /bin/bash
schedule: "0 14 * * *"
schedule: "15 1 * * *"
captureStderr: true
- name: SimpleLogin Custom Domain check
command: python /code/cron.py -j check_custom_domain
shell: /bin/bash
schedule: "0 15 * * *"
schedule: "15 2 * * *"
captureStderr: true
- name: SimpleLogin HIBP check
command: python /code/cron.py -j check_hibp
shell: /bin/bash
schedule: "0 18 * * *"
schedule: "15 3 * * *"
captureStderr: true
concurrencyPolicy: Forbid
- name: SimpleLogin Notify HIBP breaches
command: python /code/cron.py -j notify_hibp
shell: /bin/bash
schedule: "0 19 * * *"
schedule: "15 4 * * *"
captureStderr: true
concurrencyPolicy: Forbid
- name: SimpleLogin Delete Logs
command: python /code/cron.py -j delete_logs
shell: /bin/bash
schedule: "15 5 * * *"
captureStderr: true
- name: SimpleLogin Delete Old data
command: python /code/cron.py -j delete_old_data
shell: /bin/bash
schedule: "30 5 * * *"
captureStderr: true
- name: SimpleLogin Poll Apple Subscriptions
command: python /code/cron.py -j poll_apple_subscription
shell: /bin/bash
schedule: "15 6 * * *"
captureStderr: true
- name: SimpleLogin Notify Trial Ends
command: python /code/cron.py -j notify_trial_end
shell: /bin/bash
schedule: "15 8 * * *"
captureStderr: true
- name: SimpleLogin Notify Manual Subscription Ends
command: python /code/cron.py -j notify_manual_subscription_end
shell: /bin/bash
schedule: "15 9 * * *"
captureStderr: true
- name: SimpleLogin Notify Premium Ends
command: python /code/cron.py -j notify_premium_end
shell: /bin/bash
schedule: "15 10 * * *"
captureStderr: true
- name: SimpleLogin delete users scheduled to be deleted
command: python /code/cron.py -j delete_scheduled_users
shell: /bin/bash
schedule: "15 11 * * *"
captureStderr: true
concurrencyPolicy: Forbid

View File

@ -388,7 +388,7 @@ Input:
- (Optional but recommended) `hostname` passed in query string
- Request Message Body in json (`Content-Type` is `application/json`)
- alias_prefix: string. The first part of the alias that the user can choose.
- signed_suffix: should be one of the suffixes returned in the `GET /api/v4/alias/options` endpoint.
- signed_suffix: should be one of the suffixes returned in the `GET /api/v5/alias/options` endpoint.
- mailbox_ids: list of mailbox_id that "owns" this alias
- (Optional) note: alias note
- (Optional) name: alias name
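
As a rough illustration of the request body (the exact endpoint path is not shown in this excerpt, and the host, header name, and field values below are placeholder assumptions rather than documented values):

```bash
# Illustrative sketch only: replace <alias-creation-endpoint> with the endpoint
# documented above; the "Authentication" header and all values are assumptions.
curl -X POST "https://app.mydomain.com/<alias-creation-endpoint>?hostname=example.com" \
  -H "Authentication: $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "alias_prefix": "shopping",
        "signed_suffix": "<a signed suffix returned by GET /api/v5/alias/options>",
        "mailbox_ids": [1],
        "note": "Alias for online shopping",
        "name": "Shopping alias"
      }'
```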

View File

@ -1,4 +1,4 @@
# SSL, HTTPS, and HSTS
# SSL, HTTPS, HSTS and additional security measures
It's highly recommended to enable SSL/TLS on your server, both for the web app and email server.
@ -58,3 +58,124 @@ Now, reload Nginx:
```bash
sudo systemctl reload nginx
```
## Additional security measures
For additional security, we recommend you take some extra steps.
### Enable Certificate Authority Authorization (CAA)
[Certificate Authority Authorization](https://letsencrypt.org/docs/caa/) is a step you can take to restrict the list of certificate authorities that are allowed to issue certificates for your domains.
Use [SSLMate's CAA Record Generator](https://sslmate.com/caa/) to create a **CAA record** with the following configuration:
- `flags`: `0`
- `tag`: `issue`
- `value`: `"letsencrypt.org"`
To verify that the DNS record works, the following command
```bash
dig @1.1.1.1 mydomain.com caa
```
should return:
```
mydomain.com. 3600 IN CAA 0 issue "letsencrypt.org"
```
### SMTP MTA Strict Transport Security (MTA-STS)
[MTA-STS](https://datatracker.ietf.org/doc/html/rfc8461) is an extra step you can take to broadcast your instance's ability to receive, and optionally enforce, TLS-secure SMTP connections, protecting email traffic.
Enabling MTA-STS requires serving a specific policy file from the `mta-sts.mydomain.com` subdomain on a well-known route.
Create a text file `/var/www/.well-known/mta-sts.txt` with the content:
```txt
version: STSv1
mode: testing
mx: app.mydomain.com
max_age: 86400
```
It is recommended to start with `mode: testing` to give yourself time to review failure reports. Add as many `mx:` entries as you have matching **MX records** in your DNS configuration.
Create a **TXT record** for `_mta-sts.mydomain.com.` with the following value:
```txt
v=STSv1; id=UNIX_TIMESTAMP
```
With `UNIX_TIMESTAMP` being the current date/time.
Use the following command to generate the record:
```bash
echo "v=STSv1; id=$(date +%s)"
```
To verify that the DNS record works, the following command
```bash
dig @1.1.1.1 _mta-sts.mydomain.com txt
```
should return a result similar to this one:
```
_mta-sts.mydomain.com. 3600 IN TXT "v=STSv1; id=1689416399"
```
Create an additional Nginx configuration in `/etc/nginx/sites-enabled/mta-sts` with the following content:
```
server {
server_name mta-sts.mydomain.com;
root /var/www;
listen 80;
location ^~ /.well-known {}
}
```
Restart Nginx with the following command:
```sh
sudo service nginx restart
```
A correct configuration of MTA-STS, however, requires that the certificate used to host the `mta-sts` subdomain matches that of the subdomain referred to by the **MX record** from the DNS. In other words, both `mta-sts.mydomain.com` and `app.mydomain.com` must share the same certificate.
The easiest way to do this is to _expand_ the certificate associated with `app.mydomain.com` to also support the `mta-sts` subdomain, using the following command:
```sh
certbot --expand --nginx -d app.mydomain.com,mta-sts.mydomain.com
```
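Once the certificate covers both names, you can check that the policy file is reachable over HTTPS, for example with curl:
```bash
curl https://mta-sts.mydomain.com/.well-known/mta-sts.txt
```
The output should match the content of `/var/www/.well-known/mta-sts.txt` created above.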
### SMTP TLS Reporting
[TLSRPT](https://datatracker.ietf.org/doc/html/rfc8460) is used by SMTP systems to report failures to establish TLS-secured sessions, as advertised by the MTA-STS configuration.
Configuring MTA-STS in `mode: testing` as shown in the previous section gives you time to review failures from some SMTP senders.
Create a **TXT record** for `_smtp._tls.mydomain.com.` with the following value:
```txt
v=TLSRPTv1; rua=mailto:YOUR_EMAIL
```
The TLSRPT configuration at the DNS level allows SMTP senders that fail to establish a TLS-secured session to send reports to a particular email address. We suggest creating a `tls-reports` alias in SimpleLogin for this purpose.
To verify that the DNS record is in place, the following command
```bash
dig @1.1.1.1 _smtp._tls.mydomain.com txt
```
should return a result similar to this one:
```
_smtp._tls.mydomain.com. 3600 IN TXT "v=TLSRPTv1; rua=mailto:tls-reports@mydomain.com"
```
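Once the failure reports (if any) look clean, you can tighten the policy by switching MTA-STS to enforce mode. A sketch of the updated `/var/www/.well-known/mta-sts.txt`; remember to also bump the `id` in the `_mta-sts` TXT record (for example with the `date +%s` command shown earlier) so that sending servers refresh their cached policy:
```txt
version: STSv1
mode: enforce
mx: app.mydomain.com
max_age: 86400
```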

View File

@ -53,7 +53,7 @@ from flanker.addresslib.address import EmailAddress
from sqlalchemy.exc import IntegrityError
from app import pgp_utils, s3, config
from app.alias_utils import try_auto_create
from app.alias_utils import try_auto_create, change_alias_status
from app.config import (
EMAIL_DOMAIN,
URL,
@ -106,8 +106,6 @@ from app.email_utils import (
get_header_unicode,
generate_reply_email,
is_reverse_alias,
normalize_reply_email,
is_valid_email,
replace,
should_disable,
parse_id_from_bounce,
@ -123,6 +121,7 @@ from app.email_utils import (
generate_verp_email,
sl_formataddr,
)
from app.email_validation import is_valid_email, normalize_reply_email
from app.errors import (
NonReverseAliasInReplyPhase,
VERPTransactional,
@ -236,17 +235,18 @@ def get_or_create_contact(from_header: str, mail_from: str, alias: Alias) -> Con
contact.mail_from = mail_from
Session.commit()
else:
alias_id = alias.id
try:
contact_email_for_reply = (
contact_email if is_valid_email(contact_email) else ""
)
contact = Contact.create(
user_id=alias.user_id,
alias_id=alias.id,
alias_id=alias_id,
website_email=contact_email,
name=contact_name,
mail_from=mail_from,
reply_email=generate_reply_email(contact_email, alias)
if is_valid_email(contact_email)
else NOREPLY,
reply_email=generate_reply_email(contact_email_for_reply, alias),
automatic_created=True,
)
if not contact_email:
@ -262,9 +262,11 @@ def get_or_create_contact(from_header: str, mail_from: str, alias: Alias) -> Con
Session.commit()
except IntegrityError:
LOG.w("Contact %s %s already exist", alias, contact_email)
Session.rollback()
contact = Contact.get_by(alias_id=alias.id, website_email=contact_email)
# No need to manually rollback, as IntegrityError already rolls back
LOG.info(
f"Contact with email {contact_email} for alias_id {alias_id} already existed, fetching from DB"
)
contact = Contact.get_by(alias_id=alias_id, website_email=contact_email)
return contact
@ -280,6 +282,9 @@ def get_or_create_reply_to_contact(
except ValueError:
return
if len(contact_name) >= Contact.MAX_NAME_LENGTH:
contact_name = contact_name[0 : Contact.MAX_NAME_LENGTH]
if not is_valid_email(contact_address):
LOG.w(
"invalid reply-to address %s. Parse from %s",
@ -348,6 +353,10 @@ def replace_header_when_forward(msg: Message, alias: Alias, header: str):
continue
contact = Contact.get_by(alias_id=alias.id, website_email=contact_email)
contact_name = full_address.display_name
if len(contact_name) >= Contact.MAX_NAME_LENGTH:
contact_name = contact_name[0 : Contact.MAX_NAME_LENGTH]
if contact:
# update the contact name if needed
if contact.name != full_address.display_name:
@ -355,9 +364,9 @@ def replace_header_when_forward(msg: Message, alias: Alias, header: str):
"Update contact %s name %s to %s",
contact,
contact.name,
full_address.display_name,
contact_name,
)
contact.name = full_address.display_name
contact.name = contact_name
Session.commit()
else:
LOG.d(
@ -372,7 +381,7 @@ def replace_header_when_forward(msg: Message, alias: Alias, header: str):
user_id=alias.user_id,
alias_id=alias.id,
website_email=contact_email,
name=full_address.display_name,
name=contact_name,
reply_email=generate_reply_email(contact_email, alias),
is_cc=header.lower() == "cc",
automatic_created=True,
@ -541,12 +550,20 @@ def sign_msg(msg: Message) -> Message:
signature.add_header("Content-Disposition", 'attachment; filename="signature.asc"')
try:
signature.set_payload(sign_data(message_to_bytes(msg).replace(b"\n", b"\r\n")))
payload = sign_data(message_to_bytes(msg).replace(b"\n", b"\r\n"))
if not payload:
raise PGPException("Empty signature by gnupg")
signature.set_payload(payload)
except Exception:
LOG.e("Cannot sign, try using pgpy")
signature.set_payload(
sign_data_with_pgpy(message_to_bytes(msg).replace(b"\n", b"\r\n"))
)
payload = sign_data_with_pgpy(message_to_bytes(msg).replace(b"\n", b"\r\n"))
if not payload:
raise PGPException("Empty signature by pgpy")
signature.set_payload(payload)
container.attach(signature)
@ -587,12 +604,14 @@ def handle_email_sent_to_ourself(alias, from_addr: str, msg: Message, user):
f"Email sent to {alias.email} from its own mailbox {from_addr}",
render(
"transactional/cycle-email.txt.jinja2",
user=user,
alias=alias,
from_addr=from_addr,
refused_email_url=refused_email_url,
),
render(
"transactional/cycle-email.html",
user=user,
alias=alias,
from_addr=from_addr,
refused_email_url=refused_email_url,
@ -623,8 +642,12 @@ def handle_forward(envelope, msg: Message, rcpt_to: str) -> List[Tuple[bool, str
user = alias.user
if user.disabled:
LOG.w("User %s disabled, disable forwarding emails for %s", user, alias)
if not user.is_active():
LOG.w(f"User {user} has been soft deleted")
return False, status.E502
if not user.can_send_or_receive():
LOG.i(f"User {user} cannot receive emails")
if should_ignore_bounce(envelope.mail_from):
return [(True, status.E207)]
else:
@ -642,6 +665,9 @@ def handle_forward(envelope, msg: Message, rcpt_to: str) -> List[Tuple[bool, str
from_header = get_header_unicode(msg[headers.FROM])
LOG.d("Create or get contact for from_header:%s", from_header)
contact = get_or_create_contact(from_header, envelope.mail_from, alias)
alias = (
contact.alias
) # In case the Session was closed in the get_or_create we re-fetch the alias
reply_to_contact = None
if msg[headers.REPLY_TO]:
@ -710,12 +736,14 @@ def handle_forward(envelope, msg: Message, rcpt_to: str) -> List[Tuple[bool, str
f"Your mailbox {mailbox.email} is an alias",
render(
"transactional/mailbox-invalid.txt.jinja2",
user=mailbox.user,
mailbox=mailbox,
mailbox_url=mailbox_url,
alias=alias,
),
render(
"transactional/mailbox-invalid.html",
user=mailbox.user,
mailbox=mailbox,
mailbox_url=mailbox_url,
alias=alias,
@ -768,12 +796,14 @@ def forward_email_to_mailbox(
f"Your mailbox {mailbox.email} and alias {alias.email} use the same domain",
render(
"transactional/mailbox-invalid.txt.jinja2",
user=mailbox.user,
mailbox=mailbox,
mailbox_url=mailbox_url,
alias=alias,
),
render(
"transactional/mailbox-invalid.html",
user=mailbox.user,
mailbox=mailbox,
mailbox_url=mailbox_url,
alias=alias,
@ -857,6 +887,7 @@ def forward_email_to_mailbox(
# References and In-Reply-To are used for keeping the email thread
headers.REFERENCES,
headers.IN_REPLY_TO,
headers.SL_QUEUE_ID,
headers.LIST_UNSUBSCRIBE,
headers.LIST_UNSUBSCRIBE_POST,
] + headers.MIME_HEADERS
@ -864,21 +895,22 @@ def forward_email_to_mailbox(
headers_to_keep.append(headers.AUTHENTICATION_RESULTS)
delete_all_headers_except(msg, headers_to_keep)
if mailbox.generic_subject:
LOG.d("Use a generic subject for %s", mailbox)
orig_subject = msg[headers.SUBJECT]
orig_subject = get_header_unicode(orig_subject)
add_or_replace_header(msg, "Subject", mailbox.generic_subject)
sender = msg[headers.FROM]
sender = get_header_unicode(sender)
msg = add_header(
msg,
f"""Forwarded by SimpleLogin to {alias.email} from "{sender}" with "{orig_subject}" as subject""",
f"""Forwarded by SimpleLogin to {alias.email} from "{sender}" with <b>{orig_subject}</b> as subject""",
)
# create PGP email if needed
if mailbox.pgp_enabled() and user.is_premium() and not alias.disable_pgp:
LOG.d("Encrypt message using mailbox %s", mailbox)
if mailbox.generic_subject:
LOG.d("Use a generic subject for %s", mailbox)
orig_subject = msg[headers.SUBJECT]
orig_subject = get_header_unicode(orig_subject)
add_or_replace_header(msg, "Subject", mailbox.generic_subject)
sender = msg[headers.FROM]
sender = get_header_unicode(sender)
msg = add_header(
msg,
f"""Forwarded by SimpleLogin to {alias.email} from "{sender}" with "{orig_subject}" as subject""",
f"""Forwarded by SimpleLogin to {alias.email} from "{sender}" with <b>{orig_subject}</b> as subject""",
)
try:
msg = prepare_pgp_message(
@ -899,7 +931,11 @@ def forward_email_to_mailbox(
msg[headers.SL_EMAIL_LOG_ID] = str(email_log.id)
if user.include_header_email_header:
msg[headers.SL_ENVELOPE_FROM] = envelope.mail_from
msg[headers.SL_ORIGINAL_FROM] = contact.website_email
if contact.name:
original_from = f"{contact.name} <{contact.website_email}>"
else:
original_from = contact.website_email
msg[headers.SL_ORIGINAL_FROM] = original_from
# when an alias isn't in the To: header, there's no way for users to know what alias has received the email
msg[headers.SL_ENVELOPE_TO] = alias.email
@ -1037,6 +1073,9 @@ def handle_reply(envelope, msg: Message, rcpt_to: str) -> (bool, str):
if not contact:
LOG.w(f"No contact with {reply_email} as reverse alias")
return False, status.E502
if not contact.user.is_active():
LOG.w(f"User {contact.user} has been soft deleted")
return False, status.E502
alias = contact.alias
alias_address: str = contact.alias.email
@ -1051,13 +1090,8 @@ def handle_reply(envelope, msg: Message, rcpt_to: str) -> (bool, str):
user = alias.user
mail_from = envelope.mail_from
if user.disabled:
LOG.e(
"User %s disabled, disable sending emails from %s to %s",
user,
alias,
contact,
)
if not user.can_send_or_receive():
LOG.i(f"User {user} cannot send emails")
return False, status.E504
# Check if we need to reject or quarantine based on dmarc
@ -1158,6 +1192,7 @@ def handle_reply(envelope, msg: Message, rcpt_to: str) -> (bool, str):
# References and In-Reply-To are used for keeping the email thread
headers.REFERENCES,
headers.IN_REPLY_TO,
headers.SL_QUEUE_ID,
]
+ headers.MIME_HEADERS,
)
@ -1183,7 +1218,7 @@ def handle_reply(envelope, msg: Message, rcpt_to: str) -> (bool, str):
)
# replace reverse alias by real address for all contacts
for (reply_email, website_email) in contact_query.values(
for reply_email, website_email in contact_query.values(
Contact.reply_email, Contact.website_email
):
msg = replace(msg, reply_email, website_email)
@ -1238,7 +1273,6 @@ def handle_reply(envelope, msg: Message, rcpt_to: str) -> (bool, str):
if str(msg[headers.TO]).lower() == "undisclosed-recipients:;":
# no need to replace TO header
LOG.d("email is sent in BCC mode")
del msg[headers.TO]
else:
replace_header_when_reply(msg, alias, headers.TO)
@ -1254,6 +1288,7 @@ def handle_reply(envelope, msg: Message, rcpt_to: str) -> (bool, str):
f"Email sent to {contact.email} contains non reverse-alias addresses",
render(
"transactional/non-reverse-alias-reply-phase.txt.jinja2",
user=alias.user,
destination=contact.email,
alias=alias.email,
subject=msg[headers.SUBJECT],
@ -1475,6 +1510,7 @@ def handle_unknown_mailbox(
f"Attempt to use your alias {alias.email} from {envelope.mail_from}",
render(
"transactional/reply-must-use-personal-email.txt",
user=user,
alias=alias,
sender=envelope.mail_from,
authorize_address_link=authorize_address_link,
@ -1482,6 +1518,7 @@ def handle_unknown_mailbox(
),
render(
"transactional/reply-must-use-personal-email.html",
user=user,
alias=alias,
sender=envelope.mail_from,
authorize_address_link=authorize_address_link,
@ -1563,7 +1600,7 @@ def handle_bounce_forward_phase(msg: Message, email_log: EmailLog):
LOG.w(
f"Disable alias {alias} because {reason}. {alias.mailboxes} {alias.user}. Last contact {contact}"
)
alias.enabled = False
change_alias_status(alias, enabled=False)
Notification.create(
user_id=user.id,
@ -1582,12 +1619,14 @@ def handle_bounce_forward_phase(msg: Message, email_log: EmailLog):
f"Alias {alias.email} has been disabled due to multiple bounces",
render(
"transactional/bounce/automatic-disable-alias.txt",
user=alias.user,
alias=alias,
refused_email_url=refused_email_url,
mailbox_email=mailbox.email,
),
render(
"transactional/bounce/automatic-disable-alias.html",
user=alias.user,
alias=alias,
refused_email_url=refused_email_url,
mailbox_email=mailbox.email,
@ -1626,6 +1665,7 @@ def handle_bounce_forward_phase(msg: Message, email_log: EmailLog):
f"An email sent to {alias.email} cannot be delivered to your mailbox",
render(
"transactional/bounce/bounced-email.txt.jinja2",
user=alias.user,
alias=alias,
website_email=contact.website_email,
disable_alias_link=disable_alias_link,
@ -1635,6 +1675,7 @@ def handle_bounce_forward_phase(msg: Message, email_log: EmailLog):
),
render(
"transactional/bounce/bounced-email.html",
user=alias.user,
alias=alias,
website_email=contact.website_email,
disable_alias_link=disable_alias_link,
@ -1727,12 +1768,14 @@ def handle_bounce_reply_phase(envelope, msg: Message, email_log: EmailLog):
f"Email cannot be sent to { contact.email } from your alias { alias.email }",
render(
"transactional/bounce/bounce-email-reply-phase.txt",
user=user,
alias=alias,
contact=contact,
refused_email_url=refused_email_url,
),
render(
"transactional/bounce/bounce-email-reply-phase.html",
user=user,
alias=alias,
contact=contact,
refused_email_url=refused_email_url,
@ -1795,6 +1838,7 @@ def handle_spam(
f"Email from {alias.email} to {contact.website_email} is detected as spam",
render(
"transactional/spam-email-reply-phase.txt",
user=user,
alias=alias,
website_email=contact.website_email,
disable_alias_link=disable_alias_link,
@ -1802,6 +1846,7 @@ def handle_spam(
),
render(
"transactional/spam-email-reply-phase.html",
user=user,
alias=alias,
website_email=contact.website_email,
disable_alias_link=disable_alias_link,
@ -1824,6 +1869,7 @@ def handle_spam(
f"Email from {contact.website_email} to {alias.email} is detected as spam",
render(
"transactional/spam-email.txt",
user=user,
alias=alias,
website_email=contact.website_email,
disable_alias_link=disable_alias_link,
@ -1831,6 +1877,7 @@ def handle_spam(
),
render(
"transactional/spam-email.html",
user=user,
alias=alias,
website_email=contact.website_email,
disable_alias_link=disable_alias_link,
@ -1871,24 +1918,30 @@ def handle_transactional_bounce(
envelope: Envelope, msg, rcpt_to, transactional_id=None
):
LOG.d("handle transactional bounce sent to %s", rcpt_to)
if transactional_id is None:
LOG.i(
f"No transactional record for {envelope.mail_from} -> {envelope.rcpt_tos}"
)
return
# parse the TransactionalEmail
transactional_id = transactional_id or parse_id_from_bounce(rcpt_to)
transactional = TransactionalEmail.get(transactional_id)
# a transaction might have been deleted in delete_logs()
if transactional:
LOG.i("Create bounce for %s", transactional.email)
bounce_info = get_mailbox_bounce_info(msg)
if bounce_info:
Bounce.create(
email=transactional.email,
info=bounce_info.as_bytes().decode(),
commit=True,
)
else:
LOG.w("cannot get bounce info, debug at %s", save_email_for_debugging(msg))
Bounce.create(email=transactional.email, commit=True)
if not transactional:
LOG.i(
f"No transactional record for {envelope.mail_from} -> {envelope.rcpt_tos}"
)
return
LOG.i("Create bounce for %s", transactional.email)
bounce_info = get_mailbox_bounce_info(msg)
if bounce_info:
Bounce.create(
email=transactional.email,
info=bounce_info.as_bytes().decode(),
commit=True,
)
else:
LOG.w("cannot get bounce info, debug at %s", save_email_for_debugging(msg))
Bounce.create(email=transactional.email, commit=True)
def handle_bounce(envelope, email_log: EmailLog, msg: Message) -> str:
@ -1909,6 +1962,9 @@ def handle_bounce(envelope, email_log: EmailLog, msg: Message) -> str:
contact,
alias,
)
if not email_log.user.is_active():
LOG.d(f"User {email_log.user} is not active")
return status.E510
if email_log.is_reply:
content_type = msg.get_content_type().lower()
@ -1939,7 +1995,7 @@ def handle_bounce(envelope, email_log: EmailLog, msg: Message) -> str:
for is_delivered, smtp_status in handle_forward(envelope, msg, alias.email):
res.append((is_delivered, smtp_status))
for (is_success, smtp_status) in res:
for is_success, smtp_status in res:
# Consider all deliveries successful if 1 delivery is successful
if is_success:
return smtp_status
@ -1970,12 +2026,15 @@ def send_no_reply_response(mail_from: str, msg: Message):
if not mailbox:
LOG.d("Unknown sender. Skipping reply from {}".format(NOREPLY))
return
if not mailbox.user.is_active():
LOG.d(f"User {mailbox.user} is soft-deleted. Skipping sending reply response")
return
send_email_at_most_times(
mailbox.user,
ALERT_TO_NOREPLY,
mailbox.user.email,
"Auto: {}".format(msg[headers.SUBJECT] or "No subject"),
render("transactional/noreply.text.jinja2"),
render("transactional/noreply.text.jinja2", user=mailbox.user),
)
@ -2008,10 +2067,11 @@ def handle(envelope: Envelope, msg: Message) -> str:
return status.E204
# sanitize email headers
sanitize_header(msg, "from")
sanitize_header(msg, "to")
sanitize_header(msg, "cc")
sanitize_header(msg, "reply-to")
sanitize_header(msg, headers.FROM)
sanitize_header(msg, headers.TO)
sanitize_header(msg, headers.CC)
sanitize_header(msg, headers.REPLY_TO)
sanitize_header(msg, headers.MESSAGE_ID)
LOG.d(
"==>> Handle mail_from:%s, rcpt_tos:%s, header_from:%s, header_to:%s, "
@ -2056,6 +2116,7 @@ def handle(envelope: Envelope, msg: Message) -> str:
"SimpleLogin shouldn't be used with another email forwarding system",
render(
"transactional/email-sent-from-reverse-alias.txt.jinja2",
user=user,
),
)
@ -2259,7 +2320,7 @@ def handle(envelope: Envelope, msg: Message) -> str:
if nb_success > 0 and nb_non_success > 0:
LOG.e(f"some deliveries fail and some success, {mail_from}, {rcpt_tos}, {res}")
for (is_success, smtp_status) in res:
for is_success, smtp_status in res:
# Consider all deliveries successful if 1 delivery is successful
if is_success:
return smtp_status

64
app/event_listener.py Normal file
View File

@ -0,0 +1,64 @@
import argparse
from enum import Enum
from sys import argv, exit
from app.config import DB_URI
from app.log import LOG
from events.runner import Runner
from events.event_source import DeadLetterEventSource, PostgresEventSource
from events.event_sink import ConsoleEventSink, HttpEventSink
class Mode(Enum):
DEAD_LETTER = "dead_letter"
LISTENER = "listener"
@staticmethod
def from_str(value: str):
if value == Mode.DEAD_LETTER.value:
return Mode.DEAD_LETTER
elif value == Mode.LISTENER.value:
return Mode.LISTENER
else:
raise ValueError(f"Invalid mode: {value}")
def main(mode: Mode, dry_run: bool):
if mode == Mode.DEAD_LETTER:
LOG.i("Using DeadLetterEventSource")
source = DeadLetterEventSource()
elif mode == Mode.LISTENER:
LOG.i("Using PostgresEventSource")
source = PostgresEventSource(DB_URI)
else:
raise ValueError(f"Invalid mode: {mode}")
if dry_run:
LOG.i("Starting with ConsoleEventSink")
sink = ConsoleEventSink()
else:
LOG.i("Starting with HttpEventSink")
sink = HttpEventSink()
runner = Runner(source=source, sink=sink)
runner.run()
def args():
parser = argparse.ArgumentParser(description="Run event listener")
parser.add_argument(
"mode",
help="Mode to run",
choices=[Mode.DEAD_LETTER.value, Mode.LISTENER.value],
)
parser.add_argument("--dry-run", help="Dry run mode", action="store_true")
return parser.parse_args()
if __name__ == "__main__":
if len(argv) < 2:
print("Invalid usage. Pass 'listener' or 'dead_letter' as argument")
exit(1)
args = args()
main(Mode.from_str(args.mode), args.dry_run)

0
app/events/__init__.py Normal file
View File

42
app/events/event_sink.py Normal file
View File

@ -0,0 +1,42 @@
import requests
from abc import ABC, abstractmethod
from app.config import EVENT_WEBHOOK, EVENT_WEBHOOK_SKIP_VERIFY_SSL
from app.log import LOG
from app.models import SyncEvent
class EventSink(ABC):
@abstractmethod
def process(self, event: SyncEvent) -> bool:
pass
class HttpEventSink(EventSink):
def process(self, event: SyncEvent) -> bool:
if not EVENT_WEBHOOK:
LOG.warning("Skipping sending event because there is no webhook configured")
return False
LOG.info(f"Sending event {event.id} to {EVENT_WEBHOOK}")
res = requests.post(
url=EVENT_WEBHOOK,
data=event.content,
headers={"Content-Type": "application/x-protobuf"},
verify=not EVENT_WEBHOOK_SKIP_VERIFY_SSL,
)
if res.status_code != 200:
LOG.warning(
f"Failed to send event to webhook: {res.status_code} {res.text}"
)
return False
else:
LOG.info(f"Event {event.id} sent successfully to webhook")
return True
class ConsoleEventSink(EventSink):
def process(self, event: SyncEvent) -> bool:
LOG.info(f"Handling event {event.id}")
return True

Some files were not shown because too many files have changed in this diff.