[#1234] Merge remote-tracking branch 'remotes/upstream/develop' into 1234-mastodon-2-4-3-oauth-scopes

# Conflicts:
#	lib/pleroma/web/activity_pub/activity_pub_controller.ex
#	lib/pleroma/web/router.ex
Ivan Tashkinov 2019-09-15 18:52:27 +03:00
commit efbc2edba1
48 changed files with 5023 additions and 1223 deletions

.gitignore (1 addition)

@@ -38,6 +38,7 @@ erl_crash.dump
 # Prevent committing docs files
 /priv/static/doc/*
+docs/generated_config.md

 # Code test coverage
 /cover


@@ -7,6 +7,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
 ### Security
 - OStatus: eliminate the possibility of a protocol downgrade attack.
 - OStatus: prevent following locked accounts, bypassing the approval process.
+- Mastodon API: respect post privacy in `/api/v1/statuses/:id/{favourited,reblogged}_by`

 ### Removed
 - **Breaking:** GNU Social API with Qvitter extensions support
@@ -20,11 +21,12 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
 - **Breaking:** `/api/pleroma/notifications/read` is moved to `/api/v1/pleroma/notifications/read` and now supports `max_id` and responds with Mastodon API entities.
 - Configuration: OpenGraph and TwitterCard providers enabled by default
 - Configuration: Filter.AnonymizeFilename added ability to retain file extension with custom text
-- Mastodon API: `pleroma.thread_muted` key in the Status entity
+- Configuration: added `config/description.exs`, from which `docs/config.md` is generated
 - Federation: Return 403 errors when trying to request pages from a user's follower/following collections if they have `hide_followers`/`hide_follows` set
 - NodeInfo: Return `skipThreadContainment` in `metadata` for the `skip_thread_containment` option
 - NodeInfo: Return `mailerEnabled` in `metadata`
 - Mastodon API: Unsubscribe followers when they unfollow a user
+- Mastodon API: `pleroma.thread_muted` key in the Status entity
 - AdminAPI: Add "godmode" while fetching user statuses (i.e. admin can see private statuses)
 - Improve digest email template
 Pagination: (optional) return `total` alongside with `items` when paginating
@@ -60,6 +62,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
 - Reverse Proxy limiting `max_body_length` was incorrectly defined and only checked `Content-Length` headers which may not be sufficient in some circumstances
 - MRF: fix use of unserializable keyword lists in describe() implementations
 - ActivityPub: Deactivated user deletion
+- ActivityPub: Fix `/users/:nickname/inbox` crashing without an authenticated user
 - MRF: fix ability to follow a relay when AntiFollowbotPolicy was enabled

 ### Added
@@ -104,10 +107,13 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
 - ActivityPub: Optional signing of ActivityPub object fetches.
 - Admin API: Endpoint for fetching latest user's statuses
 - Pleroma API: Add `/api/v1/pleroma/accounts/confirmation_resend?email=<email>` for resending account confirmation.
+- Pleroma API: Email change endpoint.
 - Relays: Added a task to list relay subscriptions.
 - Mix Tasks: `mix pleroma.database fix_likes_collections`
 - Federation: Remove `likes` from objects.
 - Admin API: Added moderation log
+- Web response cache (currently, enabled for ActivityPub)
+- Mastodon API: Added an endpoint to get multiple statuses by IDs (`GET /api/v1/statuses/?ids[]=1&ids[]=2`)

 ### Changed
 - Configuration: Filter.AnonymizeFilename added ability to retain file extension with custom text


@@ -373,6 +373,8 @@
 config :phoenix, :format_encoders, json: Jason

+config :phoenix, :json_library, Jason
+
 config :pleroma, :gopher,
   enabled: false,
   ip: {0, 0, 0, 0},
@@ -560,6 +562,10 @@
 config :pleroma, Pleroma.ActivityExpiration, enabled: true

+config :pleroma, :web_cache_ttl,
+  activity_pub: nil,
+  activity_pub_question: 30_000
+
 # Import environment specific config. This must remain at the bottom
 # of this file so it overrides the configuration defined above.
 import_config "#{Mix.env()}.exs"
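The new `:web_cache_ttl` keys are read at request time by the ActivityPub controller changes later in this commit. A minimal lookup sketch, using the values configured above (TTLs are in milliseconds, `nil` disables expiration):

```elixir
# Reading the web cache TTLs introduced in this commit.
Pleroma.Config.get([:web_cache_ttl, :activity_pub_question])
#=> 30_000
Pleroma.Config.get([:web_cache_ttl, :activity_pub])
#=> nil
```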

config/description.exs (new file, 2837 additions)

File diff suppressed because it is too large.


@@ -60,9 +60,13 @@ Authentication is required and the user must be an admin.
 - Method: `POST`
 - Params:
-  - `nickname`
-  - `email`
-  - `password`
+  `users`: [
+    {
+      `nickname`,
+      `email`,
+      `password`
+    }
+  ]
 - Response: Users nickname

 ## `/api/pleroma/admin/users/follow`


@@ -91,6 +91,20 @@ Additional parameters can be added to the JSON body/Form data:
 - `expires_in`: The number of seconds the posted activity should expire in. When a posted activity expires it will be deleted from the server, and a delete request for it will be federated. This needs to be longer than an hour.
 - `in_reply_to_conversation_id`: Will reply to a given conversation, addressing only the people who are part of the recipient set of that conversation. Sets the visibility to `direct`.
+## GET `/api/v1/statuses`
+An endpoint to get multiple statuses by IDs.
+Required parameters:
+- `ids`: array of activity ids
+Usage example: `GET /api/v1/statuses/?ids[]=1&ids[]=2`.
+Returns: array of Status.
+The maximum number of statuses is limited to 100 per request.
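For reference, a sketch of what the endpoint does server-side, mirroring the `get_statuses/2` controller clause added later in this diff (`ids` is the request parameter, `user` the authenticated user):

```elixir
# At most 100 ids are loaded together with their objects and then filtered
# by visibility for the requesting user.
ids
|> Enum.take(100)
|> Pleroma.Activity.all_by_ids_with_object()
|> Enum.filter(&Pleroma.Web.ActivityPub.Visibility.visible_for_user?(&1, user))
```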
 ## PATCH `/api/v1/update_credentials`
 Additional parameters can be added to the JSON body/Form data:


@@ -252,7 +252,7 @@ See [Admin-API](Admin-API.md)
 * Params:
 * `email`: email of that needs to be verified
 * Authentication: not required
 * Response: 204 No Content

 ## `/api/v1/pleroma/mascot`
 ### Gets user mascot image
@@ -321,11 +321,21 @@ See [Admin-API](Admin-API.md)
 }
 ```
+## `/api/pleroma/change_email`
+### Change account email
+* Method `POST`
+* Authentication: required
+* Params:
+    * `password`: user's password
+    * `email`: new email
+* Response: JSON. Returns `{"status": "success"}` if the change was successful, `{"error": "[error message]"}` otherwise
+* Note: Currently, Mastodon has no API for changing email. If they add it in future it might be incompatible with Pleroma.

 # Pleroma Conversations
 Pleroma Conversations have the same general structure that Mastodon Conversations have. The behavior differs in the following ways when using these endpoints:
 1. Pleroma Conversations never add or remove recipients, unless explicitly changed by the user.
 2. Pleroma Conversations statuses can be requested by Conversation id.
 3. Pleroma Conversations can be replied to.

File diff suppressed because it is too large.


@@ -27,7 +27,7 @@ def run(["tag"]) do
     })
   end

-  def run(["render_timeline", nickname]) do
+  def run(["render_timeline", nickname | _] = args) do
     start_pleroma()

     user = Pleroma.User.get_by_nickname(nickname)
@@ -37,33 +37,37 @@ def run(["render_timeline", nickname]) do
       |> Map.put("blocking_user", user)
       |> Map.put("muting_user", user)
       |> Map.put("user", user)
-      |> Map.put("limit", 80)
+      |> Map.put("limit", 4096)
       |> Pleroma.Web.ActivityPub.ActivityPub.fetch_public_activities()
       |> Enum.reverse()

     inputs = %{
-      "One activity" => Enum.take_random(activities, 1),
-      "Ten activities" => Enum.take_random(activities, 10),
-      "Twenty activities" => Enum.take_random(activities, 20),
-      "Forty activities" => Enum.take_random(activities, 40),
-      "Eighty activities" => Enum.take_random(activities, 80)
+      "1 activity" => Enum.take_random(activities, 1),
+      "10 activities" => Enum.take_random(activities, 10),
+      "20 activities" => Enum.take_random(activities, 20),
+      "40 activities" => Enum.take_random(activities, 40),
+      "80 activities" => Enum.take_random(activities, 80)
     }

+    inputs =
+      if Enum.at(args, 2) == "extended" do
+        Map.merge(inputs, %{
+          "200 activities" => Enum.take_random(activities, 200),
+          "500 activities" => Enum.take_random(activities, 500),
+          "2000 activities" => Enum.take_random(activities, 2000),
+          "4096 activities" => Enum.take_random(activities, 4096)
+        })
+      else
+        inputs
+      end
+
     Benchee.run(
       %{
-        "Parallel rendering" => fn activities ->
-          Pleroma.Web.MastodonAPI.StatusView.render("index.json", %{
-            activities: activities,
-            for: user,
-            as: :activity
-          })
-        end,
         "Standart rendering" => fn activities ->
           Pleroma.Web.MastodonAPI.StatusView.render("index.json", %{
             activities: activities,
             for: user,
-            as: :activity,
-            parallel: false
+            as: :activity
           })
         end
       },


@@ -0,0 +1,42 @@
defmodule Mix.Tasks.Pleroma.Docs do
use Mix.Task
import Mix.Pleroma
@shortdoc "Generates docs from descriptions.exs"
@moduledoc """
Generates docs from `descriptions.exs`.
Supports two formats: `markdown` and `json`.
## Generate Markdown docs
`mix pleroma.docs`
## Generate JSON docs
`mix pleroma.docs json`
"""
def run(["json"]) do
do_run(Pleroma.Docs.JSON)
end
def run(_) do
do_run(Pleroma.Docs.Markdown)
end
defp do_run(implementation) do
start_pleroma()
with {descriptions, _paths} <- Mix.Config.eval!("config/description.exs"),
{:ok, file_path} <-
Pleroma.Docs.Generator.process(
implementation,
descriptions[:pleroma][:config_description]
) do
type = if implementation == Pleroma.Docs.Markdown, do: "Markdown", else: "JSON"
Mix.shell().info([:green, "#{type} docs successfully generated to #{file_path}."])
end
end
end


@@ -6,6 +6,7 @@ defmodule Pleroma.Activity do
   use Ecto.Schema

   alias Pleroma.Activity
+  alias Pleroma.Activity.Queries
   alias Pleroma.ActivityExpiration
   alias Pleroma.Bookmark
   alias Pleroma.Notification
@@ -65,8 +66,8 @@ defmodule Pleroma.Activity do
     timestamps()
   end

-  def with_joined_object(query) do
-    join(query, :inner, [activity], o in Object,
+  def with_joined_object(query, join_type \\ :inner) do
+    join(query, join_type, [activity], o in Object,
       on:
         fragment(
           "(?->>'id') = COALESCE(?->'object'->>'id', ?->>'object')",
@@ -78,10 +79,10 @@ def with_joined_object(query) do
     )
   end

-  def with_preloaded_object(query) do
+  def with_preloaded_object(query, join_type \\ :inner) do
     query
     |> has_named_binding?(:object)
-    |> if(do: query, else: with_joined_object(query))
+    |> if(do: query, else: with_joined_object(query, join_type))
     |> preload([activity, object: object], object: object)
   end
@@ -107,12 +108,9 @@ def with_set_thread_muted_field(query, %User{} = user) do
   def with_set_thread_muted_field(query, _), do: query

   def get_by_ap_id(ap_id) do
-    Repo.one(
-      from(
-        activity in Activity,
-        where: fragment("(?)->>'id' = ?", activity.data, ^to_string(ap_id))
-      )
-    )
+    ap_id
+    |> Queries.by_ap_id()
+    |> Repo.one()
   end

   def get_bookmark(%Activity{} = activity, %User{} = user) do
@@ -133,21 +131,10 @@ def change(struct, params \\ %{}) do
   end

   def get_by_ap_id_with_object(ap_id) do
-    Repo.one(
-      from(
-        activity in Activity,
-        where: fragment("(?)->>'id' = ?", activity.data, ^to_string(ap_id)),
-        left_join: o in Object,
-        on:
-          fragment(
-            "(?->>'id') = COALESCE(?->'object'->>'id', ?->>'object')",
-            o.data,
-            activity.data,
-            activity.data
-          ),
-        preload: [object: o]
-      )
-    )
+    ap_id
+    |> Queries.by_ap_id()
+    |> with_preloaded_object(:left)
+    |> Repo.one()
   end

   def get_by_id(id) do
@@ -158,66 +145,34 @@ def get_by_id(id) do
   end

   def get_by_id_with_object(id) do
-    from(activity in Activity,
-      where: activity.id == ^id,
-      inner_join: o in Object,
-      on:
-        fragment(
-          "(?->>'id') = COALESCE(?->'object'->>'id', ?->>'object')",
-          o.data,
-          activity.data,
-          activity.data
-        ),
-      preload: [object: o]
-    )
+    Activity
+    |> where(id: ^id)
+    |> with_preloaded_object()
     |> Repo.one()
   end

-  def by_object_ap_id(ap_id) do
-    from(
-      activity in Activity,
-      where:
-        fragment(
-          "coalesce((?)->'object'->>'id', (?)->>'object') = ?",
-          activity.data,
-          activity.data,
-          ^to_string(ap_id)
-        )
-    )
+  def all_by_ids_with_object(ids) do
+    Activity
+    |> where([a], a.id in ^ids)
+    |> with_preloaded_object()
+    |> Repo.all()
   end

-  def create_by_object_ap_id(ap_ids) when is_list(ap_ids) do
-    from(
-      activity in Activity,
-      where:
-        fragment(
-          "coalesce((?)->'object'->>'id', (?)->>'object') = ANY(?)",
-          activity.data,
-          activity.data,
-          ^ap_ids
-        ),
-      where: fragment("(?)->>'type' = 'Create'", activity.data)
-    )
-  end
-
-  def create_by_object_ap_id(ap_id) when is_binary(ap_id) do
-    from(
-      activity in Activity,
-      where:
-        fragment(
-          "coalesce((?)->'object'->>'id', (?)->>'object') = ?",
-          activity.data,
-          activity.data,
-          ^to_string(ap_id)
-        ),
-      where: fragment("(?)->>'type' = 'Create'", activity.data)
-    )
-  end
-
-  def create_by_object_ap_id(_), do: nil
+  @doc """
+  Accepts `ap_id` or list of `ap_id`.
+  Returns a query.
+  """
+  @spec create_by_object_ap_id(String.t() | [String.t()]) :: Ecto.Queryable.t()
+  def create_by_object_ap_id(ap_id) do
+    ap_id
+    |> Queries.by_object_id()
+    |> Queries.by_type("Create")
+  end

   def get_all_create_by_object_ap_id(ap_id) do
-    Repo.all(create_by_object_ap_id(ap_id))
+    ap_id
+    |> create_by_object_ap_id()
+    |> Repo.all()
   end

   def get_create_by_object_ap_id(ap_id) when is_binary(ap_id) do
@@ -228,54 +183,17 @@ def get_create_by_object_ap_id(ap_id) when is_binary(ap_id) do
   def get_create_by_object_ap_id(_), do: nil

-  def create_by_object_ap_id_with_object(ap_ids) when is_list(ap_ids) do
-    from(
-      activity in Activity,
-      where:
-        fragment(
-          "coalesce((?)->'object'->>'id', (?)->>'object') = ANY(?)",
-          activity.data,
-          activity.data,
-          ^ap_ids
-        ),
-      where: fragment("(?)->>'type' = 'Create'", activity.data),
-      inner_join: o in Object,
-      on:
-        fragment(
-          "(?->>'id') = COALESCE(?->'object'->>'id', ?->>'object')",
-          o.data,
-          activity.data,
-          activity.data
-        ),
-      preload: [object: o]
-    )
-  end
-
-  def create_by_object_ap_id_with_object(ap_id) when is_binary(ap_id) do
-    from(
-      activity in Activity,
-      where:
-        fragment(
-          "coalesce((?)->'object'->>'id', (?)->>'object') = ?",
-          activity.data,
-          activity.data,
-          ^to_string(ap_id)
-        ),
-      where: fragment("(?)->>'type' = 'Create'", activity.data),
-      inner_join: o in Object,
-      on:
-        fragment(
-          "(?->>'id') = COALESCE(?->'object'->>'id', ?->>'object')",
-          o.data,
-          activity.data,
-          activity.data
-        ),
-      preload: [object: o]
-    )
-  end
-
-  def create_by_object_ap_id_with_object(_), do: nil
+  @doc """
+  Accepts `ap_id` or list of `ap_id`.
+  Returns a query.
+  """
+  @spec create_by_object_ap_id_with_object(String.t() | [String.t()]) :: Ecto.Queryable.t()
+  def create_by_object_ap_id_with_object(ap_id) do
+    ap_id
+    |> create_by_object_ap_id()
+    |> with_preloaded_object()
+  end

   def get_create_by_object_ap_id_with_object(ap_id) when is_binary(ap_id) do
     ap_id
     |> create_by_object_ap_id_with_object()
@@ -299,7 +217,8 @@ def normalize(ap_id) when is_binary(ap_id), do: get_by_ap_id_with_object(ap_id)
   def normalize(_), do: nil

   def delete_by_ap_id(id) when is_binary(id) do
-    by_object_ap_id(id)
+    id
+    |> Queries.by_object_id()
     |> select([u], u)
     |> Repo.delete_all()
     |> elem(1)
@@ -308,10 +227,19 @@ def delete_by_ap_id(id) when is_binary(id) do
       %{data: %{"type" => "Create", "object" => %{"id" => ap_id}}} -> ap_id == id
       _ -> nil
     end)
+    |> purge_web_resp_cache()
   end

   def delete_by_ap_id(_), do: nil

+  defp purge_web_resp_cache(%Activity{} = activity) do
+    %{path: path} = URI.parse(activity.data["id"])
+    Cachex.del(:web_resp_cache, path)
+    activity
+  end
+
+  defp purge_web_resp_cache(nil), do: nil
+
   for {ap_type, type} <- @mastodon_notification_types do
     def mastodon_notification_type(%Activity{data: %{"type" => unquote(ap_type)}}),
       do: unquote(type)
@@ -334,31 +262,10 @@ def all_by_actor_and_id(actor, status_ids) do
   end

   def follow_requests_for_actor(%Pleroma.User{ap_id: ap_id}) do
-    from(
-      a in Activity,
-      where:
-        fragment(
-          "? ->> 'type' = 'Follow'",
-          a.data
-        ),
-      where:
-        fragment(
-          "? ->> 'state' = 'pending'",
-          a.data
-        ),
-      where:
-        fragment(
-          "coalesce((?)->'object'->>'id', (?)->>'object') = ?",
-          a.data,
-          a.data,
-          ^ap_id
-        )
-    )
-  end
-
-  @spec query_by_actor(actor()) :: Ecto.Query.t()
-  def query_by_actor(actor) do
-    from(a in Activity, where: a.actor == ^actor)
+    ap_id
+    |> Queries.by_object_id()
+    |> Queries.by_type("Follow")
+    |> where([a], fragment("? ->> 'state' = 'pending'", a.data))
   end

   def restrict_deactivated_users(query) do


@@ -13,6 +13,14 @@ defmodule Pleroma.Activity.Queries do

   alias Pleroma.Activity

+  @spec by_ap_id(query, String.t()) :: query
+  def by_ap_id(query \\ Activity, ap_id) do
+    from(
+      activity in query,
+      where: fragment("(?)->>'id' = ?", activity.data, ^to_string(ap_id))
+    )
+  end
+
   @spec by_actor(query, String.t()) :: query
   def by_actor(query \\ Activity, actor) do
     from(
@@ -21,8 +29,23 @@ def by_actor(query \\ Activity, actor) do
     )
   end

-  @spec by_object_id(query, String.t()) :: query
-  def by_object_id(query \\ Activity, object_id) do
+  @spec by_object_id(query, String.t() | [String.t()]) :: query
+  def by_object_id(query \\ Activity, object_id)
+
+  def by_object_id(query, object_ids) when is_list(object_ids) do
+    from(
+      activity in query,
+      where:
+        fragment(
+          "coalesce((?)->'object'->>'id', (?)->>'object') = ANY(?)",
+          activity.data,
+          activity.data,
+          ^object_ids
+        )
+    )
+  end
+
+  def by_object_id(query, object_id) when is_binary(object_id) do
     from(activity in query,
       where:
         fragment(
@@ -41,9 +64,4 @@ def by_type(query \\ Activity, activity_type) do
       where: fragment("(?)->>'type' = ?", activity.data, ^activity_type)
     )
   end
-
-  @spec limit(query, pos_integer()) :: query
-  def limit(query \\ Activity, limit) do
-    from(activity in query, limit: ^limit)
-  end
 end
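A short composition sketch of the extended helpers; this is essentially what `Activity.create_by_object_ap_id_with_object/1` and `get_all_create_by_object_ap_id/1` now do (the trailing `Repo.all/1` is added here only for illustration):

```elixir
# `by_object_id/2` accepts a single AP id or a list of them.
ap_ids
|> Pleroma.Activity.Queries.by_object_id()
|> Pleroma.Activity.Queries.by_type("Create")
|> Pleroma.Activity.with_preloaded_object()
|> Pleroma.Repo.all()
```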


@@ -116,7 +116,8 @@ defp cachex_children do
       build_cachex("object", default_ttl: 25_000, ttl_interval: 1000, limit: 2500),
       build_cachex("rich_media", default_ttl: :timer.minutes(120), limit: 5000),
       build_cachex("scrubber", limit: 2500),
-      build_cachex("idempotency", expiration: idempotency_expiration(), limit: 2500)
+      build_cachex("idempotency", expiration: idempotency_expiration(), limit: 2500),
+      build_cachex("web_resp", limit: 2500)
     ]
   end


@@ -0,0 +1,73 @@
defmodule Pleroma.Docs.Generator do
@callback process(keyword()) :: {:ok, String.t()}
@spec process(module(), keyword()) :: {:ok, String.t()}
def process(implementation, descriptions) do
implementation.process(descriptions)
end
@spec uploaders_list() :: [module()]
def uploaders_list do
{:ok, modules} = :application.get_key(:pleroma, :modules)
Enum.filter(modules, fn module ->
name_as_list = Module.split(module)
List.starts_with?(name_as_list, ["Pleroma", "Uploaders"]) and
List.last(name_as_list) != "Uploader"
end)
end
@spec filters_list() :: [module()]
def filters_list do
{:ok, modules} = :application.get_key(:pleroma, :modules)
Enum.filter(modules, fn module ->
name_as_list = Module.split(module)
List.starts_with?(name_as_list, ["Pleroma", "Upload", "Filter"])
end)
end
@spec mrf_list() :: [module()]
def mrf_list do
{:ok, modules} = :application.get_key(:pleroma, :modules)
Enum.filter(modules, fn module ->
name_as_list = Module.split(module)
List.starts_with?(name_as_list, ["Pleroma", "Web", "ActivityPub", "MRF"]) and
length(name_as_list) > 4
end)
end
@spec richmedia_parsers() :: [module()]
def richmedia_parsers do
{:ok, modules} = :application.get_key(:pleroma, :modules)
Enum.filter(modules, fn module ->
name_as_list = Module.split(module)
List.starts_with?(name_as_list, ["Pleroma", "Web", "RichMedia", "Parsers"]) and
length(name_as_list) == 5
end)
end
end
defimpl Jason.Encoder, for: Tuple do
def encode(tuple, opts) do
Jason.Encode.list(Tuple.to_list(tuple), opts)
end
end
defimpl Jason.Encoder, for: [Regex, Function] do
def encode(term, opts) do
Jason.Encode.string(inspect(term), opts)
end
end
defimpl String.Chars, for: Regex do
def to_string(term) do
inspect(term)
end
end
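How the generator is driven can be seen in `Mix.Tasks.Pleroma.Docs` earlier in this diff; the same call written out, with the description path and config key used by that task:

```elixir
# Evaluate the description file and hand it to a generator implementation.
{descriptions, _paths} = Mix.Config.eval!("config/description.exs")

{:ok, path} =
  Pleroma.Docs.Generator.process(
    Pleroma.Docs.Markdown,
    descriptions[:pleroma][:config_description]
  )
```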

lib/pleroma/docs/json.ex (new file, 20 additions)

@@ -0,0 +1,20 @@
defmodule Pleroma.Docs.JSON do
@behaviour Pleroma.Docs.Generator
@spec process(keyword()) :: {:ok, String.t()}
def process(descriptions) do
config_path = "docs/generate_config.json"
with {:ok, file} <- File.open(config_path, [:write]),
json <- generate_json(descriptions),
:ok <- IO.write(file, json),
:ok <- File.close(file) do
{:ok, config_path}
end
end
@spec generate_json([keyword()]) :: String.t()
def generate_json(descriptions) do
Jason.encode!(descriptions)
end
end


@@ -0,0 +1,78 @@
defmodule Pleroma.Docs.Markdown do
@behaviour Pleroma.Docs.Generator
@spec process(keyword()) :: {:ok, String.t()}
def process(descriptions) do
config_path = "docs/generated_config.md"
{:ok, file} = File.open(config_path, [:utf8, :write])
IO.write(file, "# Generated configuration\n")
IO.write(file, "Date of generation: #{Date.utc_today()}\n\n")
IO.write(
file,
"This file describe the configuration, it is recommended to edit the relevant `*.secret.exs` file instead of the others founds in the ``config`` directory.\n\n" <>
"If you run Pleroma with ``MIX_ENV=prod`` the file is ``prod.secret.exs``, otherwise it is ``dev.secret.exs``.\n\n"
)
for group <- descriptions do
if is_nil(group[:key]) do
IO.write(file, "## #{inspect(group[:group])}\n")
else
IO.write(file, "## #{inspect(group[:key])}\n")
end
IO.write(file, "#{group[:description]}\n")
for child <- group[:children] do
print_child_header(file, child)
print_suggestions(file, child[:suggestions])
if child[:children] do
for subchild <- child[:children] do
print_child_header(file, subchild)
print_suggestions(file, subchild[:suggestions])
end
end
end
IO.write(file, "\n")
end
:ok = File.close(file)
{:ok, config_path}
end
defp print_suggestion(file, suggestion) when is_list(suggestion) do
IO.write(file, " `#{inspect(suggestion)}`\n")
end
defp print_suggestion(file, suggestion) when is_function(suggestion) do
IO.write(file, " `#{inspect(suggestion.())}`\n")
end
defp print_suggestion(file, suggestion, as_list \\ false) do
list_mark = if as_list, do: "- ", else: ""
IO.write(file, " #{list_mark}`#{inspect(suggestion)}`\n")
end
defp print_suggestions(_file, nil), do: nil
defp print_suggestions(file, suggestions) do
IO.write(file, "Suggestions:\n")
if length(suggestions) > 1 do
for suggestion <- suggestions do
print_suggestion(file, suggestion, true)
end
else
print_suggestion(file, List.first(suggestions))
end
end
defp print_child_header(file, child) do
IO.write(file, "- `#{inspect(child[:key])}` -`#{inspect(child[:type])}` \n")
IO.write(file, "#{child[:description]} \n")
end
end


@@ -9,6 +9,7 @@ defmodule Pleroma.Healthcheck do
   alias Pleroma.Healthcheck
   alias Pleroma.Repo

+  @derive Jason.Encoder
   defstruct pool_size: 0,
             active: 0,
             idle: 0,


@@ -130,14 +130,16 @@ def swap_object_with_tombstone(object) do
   def delete(%Object{data: %{"id" => id}} = object) do
     with {:ok, _obj} = swap_object_with_tombstone(object),
          deleted_activity = Activity.delete_by_ap_id(id),
-         {:ok, true} <- Cachex.del(:object_cache, "object:#{id}") do
+         {:ok, true} <- Cachex.del(:object_cache, "object:#{id}"),
+         {:ok, _} <- Cachex.del(:web_resp_cache, URI.parse(id).path) do
       {:ok, object, deleted_activity}
     end
   end

   def prune(%Object{data: %{"id" => id}} = object) do
     with {:ok, object} <- Repo.delete(object),
-         {:ok, true} <- Cachex.del(:object_cache, "object:#{id}") do
+         {:ok, true} <- Cachex.del(:object_cache, "object:#{id}"),
+         {:ok, _} <- Cachex.del(:web_resp_cache, URI.parse(id).path) do
       {:ok, object}
     end
   end

lib/pleroma/plugs/cache.ex (new file, 122 additions)

@@ -0,0 +1,122 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Plugs.Cache do
@moduledoc """
Caches successful GET responses.
To enable the cache add the plug to a router pipeline or controller:
plug(Pleroma.Plugs.Cache)
## Configuration
To configure the plug you need to pass settings as the second argument to the `plug/2` macro:
plug(Pleroma.Plugs.Cache, [ttl: nil, query_params: true])
Available options:
- `ttl`: An expiration time (time-to-live). This value should be in milliseconds or `nil` to disable expiration. Defaults to `nil`.
- `query_params`: Take URL query string into account (`true`), ignore it (`false`) or limit to specific params only (list). Defaults to `true`.
Additionally, you can overwrite the TTL inside a controller action by assigning `cache_ttl` to the connection struct:
def index(conn, _params) do
ttl = 60_000 # one minute
conn
|> assign(:cache_ttl, ttl)
|> render("index.html")
end
"""
import Phoenix.Controller, only: [current_path: 1, json: 2]
import Plug.Conn
@behaviour Plug
@defaults %{ttl: nil, query_params: true}
@impl true
def init([]), do: @defaults
def init(opts) do
opts = Map.new(opts)
Map.merge(@defaults, opts)
end
@impl true
def call(%{method: "GET"} = conn, opts) do
key = cache_key(conn, opts)
case Cachex.get(:web_resp_cache, key) do
{:ok, nil} ->
cache_resp(conn, opts)
{:ok, record} ->
send_cached(conn, record)
{atom, message} when atom in [:ignore, :error] ->
render_error(conn, message)
end
end
def call(conn, _), do: conn
# full path including query params
defp cache_key(conn, %{query_params: true}), do: current_path(conn)
# request path without query params
defp cache_key(conn, %{query_params: false}), do: conn.request_path
# request path with specific query params
defp cache_key(conn, %{query_params: query_params}) when is_list(query_params) do
query_string =
conn.params
|> Map.take(query_params)
|> URI.encode_query()
conn.request_path <> "?" <> query_string
end
defp cache_resp(conn, opts) do
register_before_send(conn, fn
%{status: 200, resp_body: body} = conn ->
ttl = Map.get(conn.assigns, :cache_ttl, opts.ttl)
key = cache_key(conn, opts)
content_type = content_type(conn)
record = {content_type, body}
Cachex.put(:web_resp_cache, key, record, ttl: ttl)
put_resp_header(conn, "x-cache", "MISS from Pleroma")
conn ->
conn
end)
end
defp content_type(conn) do
conn
|> Plug.Conn.get_resp_header("content-type")
|> hd()
end
defp send_cached(conn, {content_type, body}) do
conn
|> put_resp_content_type(content_type, nil)
|> put_resp_header("x-cache", "HIT from Pleroma")
|> send_resp(:ok, body)
|> halt()
end
defp render_error(conn, message) do
conn
|> put_status(:internal_server_error)
|> json(%{error: message})
|> halt()
end
end


@@ -174,11 +174,25 @@ def following_count(%User{} = user) do
     |> Repo.aggregate(:count, :id)
   end

+  defp truncate_if_exists(params, key, max_length) do
+    if Map.has_key?(params, key) and is_binary(params[key]) do
+      {value, _chopped} = String.split_at(params[key], max_length)
+      Map.put(params, key, value)
+    else
+      params
+    end
+  end
+
   def remote_user_creation(params) do
     bio_limit = Pleroma.Config.get([:instance, :user_bio_length], 5000)
     name_limit = Pleroma.Config.get([:instance, :user_name_length], 100)

-    params = Map.put(params, :info, params[:info] || %{})
+    params =
+      params
+      |> Map.put(:info, params[:info] || %{})
+      |> truncate_if_exists(:name, name_limit)
+      |> truncate_if_exists(:bio, bio_limit)

     info_cng = User.Info.remote_user_creation(%User.Info{}, params[:info])

     changes =
@@ -1219,7 +1233,7 @@ def follow_import(%User{} = follower, followed_identifiers) when is_list(followe
   def delete_user_activities(%User{ap_id: ap_id} = user) do
     ap_id
-    |> Activity.query_by_actor()
+    |> Activity.Queries.by_actor()
     |> RepoStreamer.chunk_stream(50)
     |> Stream.each(fn activities ->
       Enum.each(activities, &delete_activity(&1))
@@ -1624,4 +1638,13 @@ defp put_password_hash(changeset), do: changeset
   def is_internal_user?(%User{nickname: nil}), do: true
   def is_internal_user?(%User{local: true, nickname: "internal." <> _}), do: true
   def is_internal_user?(_), do: false
+
+  def change_email(user, email) do
+    user
+    |> cast(%{email: email}, [:email])
+    |> validate_required([:email])
+    |> unique_constraint(:email)
+    |> validate_format(:email, @email_regex)
+    |> update_and_set_cache()
+  end
 end
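A minimal usage sketch of the new `User.change_email/2`, assuming a loaded `%Pleroma.User{}` and that `update_and_set_cache/1` returns `{:ok, user}` on success (the HTTP wiring for `/api/pleroma/change_email` lives in a controller not shown in this excerpt):

```elixir
{:ok, updated} = Pleroma.User.change_email(user, "new-address@example.com")
updated.email
#=> "new-address@example.com"
```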


@@ -242,6 +242,13 @@ def set_keys(info, keys) do
   end

   def remote_user_creation(info, params) do
+    params =
+      if Map.has_key?(params, :fields) do
+        Map.put(params, :fields, Enum.map(params[:fields], &truncate_field/1))
+      else
+        params
+      end
+
     info
     |> cast(params, [
       :ap_enabled,
@@ -326,6 +333,16 @@ defp valid_field?(%{"name" => name, "value" => value}) do
   defp valid_field?(_), do: false

+  defp truncate_field(%{"name" => name, "value" => value}) do
+    {name, _chopped} =
+      String.split_at(name, Pleroma.Config.get([:instance, :account_field_name_length], 255))
+
+    {value, _chopped} =
+      String.split_at(value, Pleroma.Config.get([:instance, :account_field_value_length], 255))
+
+    %{"name" => name, "value" => value}
+  end
+
   @spec confirmation_changeset(Info.t(), keyword()) :: Changeset.t()
   def confirmation_changeset(info, opts) do
     need_confirmation? = Keyword.get(opts, :need_confirmation)


@@ -23,6 +23,8 @@ defmodule Pleroma.Web.ActivityPub.ActivityPubController do
   action_fallback(:errors)

+  plug(Pleroma.Plugs.Cache, [query_params: false] when action in [:activity, :object])
+
   plug(
     Pleroma.Plugs.OAuthScopesPlug,
     %{scopes: ["read:accounts"]} when action in [:followers, :following]
@@ -58,8 +60,10 @@ def object(conn, %{"uuid" => uuid}) do
          %Object{} = object <- Object.get_cached_by_ap_id(ap_id),
          {_, true} <- {:public?, Visibility.is_public?(object)} do
       conn
+      |> set_cache_ttl_for(object)
       |> put_resp_content_type("application/activity+json")
-      |> json(ObjectView.render("object.json", %{object: object}))
+      |> put_view(ObjectView)
+      |> render("object.json", object: object)
     else
       {:public?, false} ->
         {:error, :not_found}
@@ -101,14 +105,36 @@ def activity(conn, %{"uuid" => uuid}) do
          %Activity{} = activity <- Activity.normalize(ap_id),
          {_, true} <- {:public?, Visibility.is_public?(activity)} do
       conn
+      |> set_cache_ttl_for(activity)
       |> put_resp_content_type("application/activity+json")
-      |> json(ObjectView.render("object.json", %{object: activity}))
+      |> put_view(ObjectView)
+      |> render("object.json", object: activity)
     else
-      {:public?, false} ->
-        {:error, :not_found}
+      {:public?, false} -> {:error, :not_found}
+      nil -> {:error, :not_found}
     end
   end

+  defp set_cache_ttl_for(conn, %Activity{object: object}) do
+    set_cache_ttl_for(conn, object)
+  end
+
+  defp set_cache_ttl_for(conn, entity) do
+    ttl =
+      case entity do
+        %Object{data: %{"type" => "Question"}} ->
+          Pleroma.Config.get([:web_cache_ttl, :activity_pub_question])
+
+        %Object{} ->
+          Pleroma.Config.get([:web_cache_ttl, :activity_pub])
+
+        _ ->
+          nil
+      end
+
+    assign(conn, :cache_ttl, ttl)
+  end
+
   # GET /relay/following
   def following(%{assigns: %{relay: true}} = conn, _params) do
     conn
@@ -256,22 +282,36 @@ def whoami(%{assigns: %{user: %User{} = user}} = conn, _params) do
   def whoami(_conn, _params), do: {:error, :not_found}

-  def read_inbox(%{assigns: %{user: user}} = conn, %{"nickname" => nickname} = params) do
-    if nickname == user.nickname do
-      conn
-      |> put_resp_content_type("application/activity+json")
-      |> json(UserView.render("inbox.json", %{user: user, max_id: params["max_id"]}))
-    else
-      err =
-        dgettext("errors", "can't read inbox of %{nickname} as %{as_nickname}",
-          nickname: nickname,
-          as_nickname: user.nickname
-        )
-
-      conn
-      |> put_status(:forbidden)
-      |> json(err)
-    end
+  def read_inbox(
+        %{assigns: %{user: %{nickname: nickname} = user}} = conn,
+        %{"nickname" => nickname} = params
+      ) do
+    conn
+    |> put_resp_content_type("application/activity+json")
+    |> put_view(UserView)
+    |> render("inbox.json", user: user, max_id: params["max_id"])
+  end
+
+  def read_inbox(%{assigns: %{user: nil}} = conn, %{"nickname" => nickname}) do
+    err = dgettext("errors", "can't read inbox of %{nickname}", nickname: nickname)
+
+    conn
+    |> put_status(:forbidden)
+    |> json(err)
+  end
+
+  def read_inbox(%{assigns: %{user: %{nickname: as_nickname}}} = conn, %{
+        "nickname" => nickname
+      }) do
+    err =
+      dgettext("errors", "can't read inbox of %{nickname} as %{as_nickname}",
+        nickname: nickname,
+        as_nickname: as_nickname
+      )
+
+    conn
+    |> put_status(:forbidden)
+    |> json(err)
   end

   def handle_user_activity(user, %{"type" => "Create"} = params) do


@@ -185,12 +185,12 @@ def fix_in_reply_to(%{"inReplyTo" => in_reply_to} = object, options)
         |> Map.put("context", replied_object.data["context"] || object["conversation"])
       else
         e ->
-          Logger.error("Couldn't fetch \"#{inspect(in_reply_to_id)}\", error: #{inspect(e)}")
+          Logger.error("Couldn't fetch #{inspect(in_reply_to_id)}, error: #{inspect(e)}")
           object
       end
     e ->
-      Logger.error("Couldn't fetch \"#{inspect(in_reply_to_id)}\", error: #{inspect(e)}")
+      Logger.error("Couldn't fetch #{inspect(in_reply_to_id)}, error: #{inspect(e)}")
       object
     end
   else


@@ -85,15 +85,13 @@ defp extract_list(lst) when is_list(lst), do: lst
   defp extract_list(_), do: []

   def maybe_splice_recipient(ap_id, params) do
-    need_splice =
+    need_splice? =
       !recipient_in_collection(ap_id, params["to"]) &&
         !recipient_in_collection(ap_id, params["cc"])

-    cc_list = extract_list(params["cc"])
-
-    if need_splice do
-      params
-      |> Map.put("cc", [ap_id | cc_list])
+    if need_splice? do
+      cc_list = extract_list(params["cc"])
+      Map.put(params, "cc", [ap_id | cc_list])
     else
       params
     end
@@ -139,7 +137,7 @@ def get_notified_from_object(%{"type" => type} = object) when type in @supported
       "object" => object
     }

-    Notification.get_notified_from_activity(%Activity{data: fake_create_activity}, false)
+    get_notified_from_object(fake_create_activity)
   end

   def get_notified_from_object(object) do
@@ -188,9 +186,9 @@ def maybe_federate(_), do: :ok
   Adds an id and a published data if they aren't there,
   also adds it to an included object
   """
-  def lazy_put_activity_defaults(map, fake \\ false) do
+  def lazy_put_activity_defaults(map, fake? \\ false) do
     map =
-      unless fake do
+      if not fake? do
         %{data: %{"id" => context}, id: context_id} = create_context(map["context"])

         map
@@ -207,7 +205,7 @@ def lazy_put_activity_defaults(map, fake \\ false) do
       end

     if is_map(map["object"]) do
-      object = lazy_put_object_defaults(map["object"], map, fake)
+      object = lazy_put_object_defaults(map["object"], map, fake?)
       %{map | "object" => object}
     else
       map
@@ -217,9 +215,9 @@ def lazy_put_activity_defaults(map, fake \\ false) do
   @doc """
   Adds an id and published date if they aren't there.
   """
-  def lazy_put_object_defaults(map, activity \\ %{}, fake)
+  def lazy_put_object_defaults(map, activity \\ %{}, fake?)

-  def lazy_put_object_defaults(map, activity, true = _fake) do
+  def lazy_put_object_defaults(map, activity, true = _fake?) do
     map
     |> Map.put_new_lazy("published", &make_date/0)
     |> Map.put_new("id", "pleroma:fake_object_id")
@@ -228,7 +226,7 @@ def lazy_put_object_defaults(map, activity, true = _fake) do
     |> Map.put_new("context_id", activity["context_id"])
   end

-  def lazy_put_object_defaults(map, activity, _fake) do
+  def lazy_put_object_defaults(map, activity, _fake?) do
     map
     |> Map.put_new_lazy("id", &generate_object_id/0)
     |> Map.put_new_lazy("published", &make_date/0)
@@ -242,9 +240,7 @@ def lazy_put_object_defaults(map, activity, _fake) do
   def insert_full_object(%{"object" => %{"type" => type} = object_data} = map)
       when is_map(object_data) and type in @supported_object_types do
     with {:ok, object} <- Object.create(object_data) do
-      map =
-        map
-        |> Map.put("object", object.data["id"])
+      map = Map.put(map, "object", object.data["id"])

       {:ok, map, object}
     end
@@ -263,7 +259,7 @@ def get_existing_like(actor, %{data: %{"id" => id}}) do
     |> Activity.Queries.by_actor()
     |> Activity.Queries.by_object_id(id)
     |> Activity.Queries.by_type("Like")
-    |> Activity.Queries.limit(1)
+    |> limit(1)
     |> Repo.one()
   end
@@ -380,12 +376,11 @@ def update_follow_state(
         %Activity{data: %{"actor" => actor, "object" => object}} = activity,
         state
       ) do
-    with new_data <-
-           activity.data
-           |> Map.put("state", state),
-         changeset <- Changeset.change(activity, data: new_data),
-         {:ok, activity} <- Repo.update(changeset),
-         _ <- User.set_follow_state_cache(actor, object, state) do
+    new_data = Map.put(activity.data, "state", state)
+    changeset = Changeset.change(activity, data: new_data)
+
+    with {:ok, activity} <- Repo.update(changeset) do
+      User.set_follow_state_cache(actor, object, state)
       {:ok, activity}
     end
   end
@@ -410,28 +405,14 @@ def make_follow_data(
   end

   def fetch_latest_follow(%User{ap_id: follower_id}, %User{ap_id: followed_id}) do
-    query =
-      from(
-        activity in Activity,
-        where:
-          fragment(
-            "? ->> 'type' = 'Follow'",
-            activity.data
-          ),
-        where: activity.actor == ^follower_id,
-        # this is to use the index
-        where:
-          fragment(
-            "coalesce((?)->'object'->>'id', (?)->>'object') = ?",
-            activity.data,
-            activity.data,
-            ^followed_id
-          ),
-        order_by: [fragment("? desc nulls last", activity.id)],
-        limit: 1
-      )
-
-    Repo.one(query)
+    "Follow"
+    |> Activity.Queries.by_type()
+    |> where(actor: ^follower_id)
+    # this is to use the index
+    |> Activity.Queries.by_object_id(followed_id)
+    |> order_by([activity], fragment("? desc nulls last", activity.id))
+    |> limit(1)
+    |> Repo.one()
   end

   #### Announce-related helpers
@@ -439,23 +420,13 @@ def fetch_latest_follow(%User{ap_id: follower_id}, %User{ap_id: followed_id}) do
   @doc """
   Retruns an existing announce activity if the notice has already been announced
   """
-  def get_existing_announce(actor, %{data: %{"id" => id}}) do
-    query =
-      from(
-        activity in Activity,
-        where: activity.actor == ^actor,
-        # this is to use the index
-        where:
-          fragment(
-            "coalesce((?)->'object'->>'id', (?)->>'object') = ?",
-            activity.data,
-            activity.data,
-            ^id
-          ),
-        where: fragment("(?)->>'type' = 'Announce'", activity.data)
-      )
-
-    Repo.one(query)
+  def get_existing_announce(actor, %{data: %{"id" => ap_id}}) do
+    "Announce"
+    |> Activity.Queries.by_type()
+    |> where(actor: ^actor)
+    # this is to use the index
+    |> Activity.Queries.by_object_id(ap_id)
+    |> Repo.one()
   end

   @doc """
@@ -538,11 +509,13 @@ def add_announce_to_object(
         object
       ) do
     announcements =
-      if is_list(object.data["announcements"]), do: object.data["announcements"], else: []
+      if is_list(object.data["announcements"]) do
+        Enum.uniq([actor | object.data["announcements"]])
+      else
+        [actor]
+      end

-    with announcements <- [actor | announcements] |> Enum.uniq() do
-      update_element_in_object("announcement", announcements, object)
-    end
+    update_element_in_object("announcement", announcements, object)
   end

   def add_announce_to_object(_, object), do: {:ok, object}
@@ -570,28 +543,14 @@ def make_unfollow_data(follower, followed, follow_activity, activity_id) do
   #### Block-related helpers

   def fetch_latest_block(%User{ap_id: blocker_id}, %User{ap_id: blocked_id}) do
-    query =
-      from(
-        activity in Activity,
-        where:
-          fragment(
-            "? ->> 'type' = 'Block'",
-            activity.data
-          ),
-        where: activity.actor == ^blocker_id,
-        # this is to use the index
-        where:
-          fragment(
-            "coalesce((?)->'object'->>'id', (?)->>'object') = ?",
-            activity.data,
-            activity.data,
-            ^blocked_id
-          ),
-        order_by: [fragment("? desc nulls last", activity.id)],
-        limit: 1
-      )
-
-    Repo.one(query)
+    "Block"
+    |> Activity.Queries.by_type()
+    |> where(actor: ^blocker_id)
+    # this is to use the index
+    |> Activity.Queries.by_object_id(blocked_id)
+    |> order_by([activity], fragment("? desc nulls last", activity.id))
+    |> limit(1)
+    |> Repo.one()
   end

   def make_block_data(blocker, blocked, activity_id) do
@@ -695,11 +654,11 @@ def fetch_ordered_collection(from, pages_left, acc \\ []) do
   #### Report-related helpers

   def update_report_state(%Activity{} = activity, state) when state in @supported_report_states do
-    with new_data <- Map.put(activity.data, "state", state),
-         changeset <- Changeset.change(activity, data: new_data),
-         {:ok, activity} <- Repo.update(changeset) do
-      {:ok, activity}
-    end
+    new_data = Map.put(activity.data, "state", state)
+
+    activity
+    |> Changeset.change(data: new_data)
+    |> Repo.update()
   end

   def update_report_state(_, _), do: {:error, "Unsupported state"}
@@ -766,21 +725,13 @@ defp get_updated_targets(
   end

   def get_existing_votes(actor, %{data: %{"id" => id}}) do
-    query =
-      from(
-        [activity, object: object] in Activity.with_preloaded_object(Activity),
-        where: fragment("(?)->>'type' = 'Create'", activity.data),
-        where: fragment("(?)->>'actor' = ?", activity.data, ^actor),
-        where:
-          fragment(
-            "(?)->>'inReplyTo' = ?",
-            object.data,
-            ^to_string(id)
-          ),
-        where: fragment("(?)->>'type' = 'Answer'", object.data)
-      )
-
-    Repo.all(query)
+    actor
+    |> Activity.Queries.by_actor()
+    |> Activity.Queries.by_type("Create")
+    |> Activity.with_preloaded_object()
+    |> where([a, object: o], fragment("(?)->>'inReplyTo' = ?", o.data, ^to_string(id)))
+    |> where([a, object: o], fragment("(?)->>'type' = 'Answer'", o.data))
+    |> Repo.all()
   end

   defp maybe_put(map, _key, nil), do: map


@@ -90,6 +90,8 @@ defp do_convert(entity) when is_list(entity) do
     for v <- entity, into: [], do: do_convert(v)
   end

+  defp do_convert(%Regex{} = entity), do: inspect(entity)
+
   defp do_convert(entity) when is_map(entity) do
     for {k, v} <- entity, into: %{}, do: {do_convert(k), do_convert(v)}
   end
@@ -122,7 +124,7 @@ def transform(entity) when is_binary(entity) or is_map(entity) or is_list(entity
   def transform(entity), do: :erlang.term_to_binary(entity)

-  defp do_transform(%Regex{} = entity) when is_map(entity), do: entity
+  defp do_transform(%Regex{} = entity), do: entity

   defp do_transform(%{"tuple" => [":dispatch", [entity]]}) do
     {dispatch_settings, []} = do_eval(entity)
@@ -154,8 +156,15 @@ defp do_transform(entity) when is_binary(entity) do
   defp do_transform(entity), do: entity

   defp do_transform_string("~r/" <> pattern) do
-    pattern = String.trim_trailing(pattern, "/")
-    ~r/#{pattern}/
+    modificator = String.split(pattern, "/") |> List.last()
+    pattern = String.trim_trailing(pattern, "/" <> modificator)
+
+    case modificator do
+      "" -> ~r/#{pattern}/
+      "i" -> ~r/#{pattern}/i
+      "u" -> ~r/#{pattern}/u
+      "s" -> ~r/#{pattern}/s
+    end
   end

   defp do_transform_string(":" <> atom), do: String.to_atom(atom)


@@ -34,79 +34,38 @@ defp param_to_integer(val, default) when is_binary(val) do
   defp param_to_integer(_, default), do: default

-  def add_link_headers(
-        conn,
-        method,
-        activities,
-        param \\ nil,
-        params \\ %{},
-        func3 \\ nil,
-        func4 \\ nil
-      ) do
-    params =
-      conn.params
-      |> Map.drop(["since_id", "max_id", "min_id"])
-      |> Map.merge(params)
-
-    last = List.last(activities)
-
-    func3 = func3 || (&mastodon_api_url/3)
-    func4 = func4 || (&mastodon_api_url/4)
-
-    if last do
-      max_id = last.id
-
-      limit =
-        params
-        |> Map.get("limit", "20")
-        |> String.to_integer()
-
-      min_id =
-        if length(activities) <= limit do
-          activities
-          |> List.first()
-          |> Map.get(:id)
-        else
-          activities
-          |> Enum.at(limit * -1)
-          |> Map.get(:id)
-        end
-
-      {next_url, prev_url} =
-        if param do
-          {
-            func4.(
-              Pleroma.Web.Endpoint,
-              method,
-              param,
-              Map.merge(params, %{max_id: max_id})
-            ),
-            func4.(
-              Pleroma.Web.Endpoint,
-              method,
-              param,
-              Map.merge(params, %{min_id: min_id})
-            )
-          }
-        else
-          {
-            func3.(
-              Pleroma.Web.Endpoint,
-              method,
-              Map.merge(params, %{max_id: max_id})
-            ),
-            func3.(
-              Pleroma.Web.Endpoint,
-              method,
-              Map.merge(params, %{min_id: min_id})
-            )
-          }
-        end
-
-      conn
-      |> put_resp_header("link", "<#{next_url}>; rel=\"next\", <#{prev_url}>; rel=\"prev\"")
-    else
-      conn
-    end
-  end
+  def add_link_headers(conn, activities, extra_params \\ %{}) do
+    case List.last(activities) do
+      %{id: max_id} ->
+        params =
+          conn.params
+          |> Map.drop(Map.keys(conn.path_params))
+          |> Map.drop(["since_id", "max_id", "min_id"])
+          |> Map.merge(extra_params)
+
+        limit =
+          params
+          |> Map.get("limit", "20")
+          |> String.to_integer()
+
+        min_id =
+          if length(activities) <= limit do
+            activities
+            |> List.first()
+            |> Map.get(:id)
+          else
+            activities
+            |> Enum.at(limit * -1)
+            |> Map.get(:id)
+          end
+
+        next_url = current_url(conn, Map.merge(params, %{max_id: max_id}))
+        prev_url = current_url(conn, Map.merge(params, %{min_id: min_id}))
+
+        put_resp_header(conn, "link", "<#{next_url}>; rel=\"next\", <#{prev_url}>; rel=\"prev\"")
+
+      _ ->
+        conn
+    end
+  end
 end


@ -6,7 +6,7 @@ defmodule Pleroma.Web.MastodonAPI.MastodonAPIController do
use Pleroma.Web, :controller use Pleroma.Web, :controller
import Pleroma.Web.ControllerHelper, import Pleroma.Web.ControllerHelper,
only: [json_response: 3, add_link_headers: 5, add_link_headers: 4, add_link_headers: 3] only: [json_response: 3, add_link_headers: 2, add_link_headers: 3]
alias Ecto.Changeset alias Ecto.Changeset
alias Pleroma.Activity alias Pleroma.Activity
@ -101,7 +101,14 @@ defmodule Pleroma.Web.MastodonAPI.MastodonAPIController do
plug( plug(
OAuthScopesPlug, OAuthScopesPlug,
%{@unauthenticated_access | scopes: ["read:statuses"]} %{@unauthenticated_access | scopes: ["read:statuses"]}
when action in [:user_statuses, :get_status, :get_context, :status_card, :get_poll] when action in [
:user_statuses,
:get_statuses,
:get_status,
:get_context,
:status_card,
:get_poll
]
) )
plug( plug(
@ -526,7 +533,7 @@ def home_timeline(%{assigns: %{user: user}} = conn, params) do
|> Enum.reverse() |> Enum.reverse()
conn conn
|> add_link_headers(:home_timeline, activities) |> add_link_headers(activities)
|> put_view(StatusView) |> put_view(StatusView)
|> render("index.json", %{activities: activities, for: user, as: :activity}) |> render("index.json", %{activities: activities, for: user, as: :activity})
end end
@ -545,7 +552,7 @@ def public_timeline(%{assigns: %{user: user}} = conn, params) do
|> Enum.reverse() |> Enum.reverse()
conn conn
|> add_link_headers(:public_timeline, activities, false, %{"local" => local_only}) |> add_link_headers(activities, %{"local" => local_only})
|> put_view(StatusView) |> put_view(StatusView)
|> render("index.json", %{activities: activities, for: user, as: :activity}) |> render("index.json", %{activities: activities, for: user, as: :activity})
end end
@ -559,7 +566,7 @@ def user_statuses(%{assigns: %{user: reading_user}} = conn, params) do
activities = ActivityPub.fetch_user_activities(user, reading_user, params) activities = ActivityPub.fetch_user_activities(user, reading_user, params)
conn conn
|> add_link_headers(:user_statuses, activities, params["id"]) |> add_link_headers(activities)
|> put_view(StatusView) |> put_view(StatusView)
|> render("index.json", %{ |> render("index.json", %{
activities: activities, activities: activities,
@ -583,11 +590,25 @@ def dm_timeline(%{assigns: %{user: user}} = conn, params) do
|> Pagination.fetch_paginated(params) |> Pagination.fetch_paginated(params)
conn conn
|> add_link_headers(:dm_timeline, activities) |> add_link_headers(activities)
|> put_view(StatusView) |> put_view(StatusView)
|> render("index.json", %{activities: activities, for: user, as: :activity}) |> render("index.json", %{activities: activities, for: user, as: :activity})
end end
def get_statuses(%{assigns: %{user: user}} = conn, %{"ids" => ids}) do
limit = 100
activities =
ids
|> Enum.take(limit)
|> Activity.all_by_ids_with_object()
|> Enum.filter(&Visibility.visible_for_user?(&1, user))
conn
|> put_view(StatusView)
|> render("index.json", activities: activities, for: user, as: :activity)
end
def get_status(%{assigns: %{user: user}} = conn, %{"id" => id}) do def get_status(%{assigns: %{user: user}} = conn, %{"id" => id}) do
with %Activity{} = activity <- Activity.get_by_id_with_object(id), with %Activity{} = activity <- Activity.get_by_id_with_object(id),
true <- Visibility.visible_for_user?(activity, user) do true <- Visibility.visible_for_user?(activity, user) do
@ -684,7 +705,7 @@ def poll_vote(%{assigns: %{user: user}} = conn, %{"id" => id, "choices" => choic
def scheduled_statuses(%{assigns: %{user: user}} = conn, params) do def scheduled_statuses(%{assigns: %{user: user}} = conn, params) do
with scheduled_activities <- MastodonAPI.get_scheduled_activities(user, params) do with scheduled_activities <- MastodonAPI.get_scheduled_activities(user, params) do
conn conn
|> add_link_headers(:scheduled_statuses, scheduled_activities) |> add_link_headers(scheduled_activities)
|> put_view(ScheduledActivityView) |> put_view(ScheduledActivityView)
|> render("index.json", %{scheduled_activities: scheduled_activities}) |> render("index.json", %{scheduled_activities: scheduled_activities})
end end
@ -867,7 +888,7 @@ def notifications(%{assigns: %{user: user}} = conn, params) do
notifications = MastodonAPI.get_notifications(user, params) notifications = MastodonAPI.get_notifications(user, params)
conn conn
|> add_link_headers(:notifications, notifications) |> add_link_headers(notifications)
|> put_view(NotificationView) |> put_view(NotificationView)
|> render("index.json", %{notifications: notifications, for: user}) |> render("index.json", %{notifications: notifications, for: user})
end end
@ -989,6 +1010,7 @@ def get_mascot(%{assigns: %{user: user}} = conn, _params) do
def favourited_by(%{assigns: %{user: user}} = conn, %{"id" => id}) do def favourited_by(%{assigns: %{user: user}} = conn, %{"id" => id}) do
with %Activity{} = activity <- Activity.get_by_id_with_object(id), with %Activity{} = activity <- Activity.get_by_id_with_object(id),
{:visible, true} <- {:visible, Visibility.visible_for_user?(activity, user)},
%Object{data: %{"likes" => likes}} <- Object.normalize(activity) do %Object{data: %{"likes" => likes}} <- Object.normalize(activity) do
q = from(u in User, where: u.ap_id in ^likes) q = from(u in User, where: u.ap_id in ^likes)
@ -1000,12 +1022,14 @@ def favourited_by(%{assigns: %{user: user}} = conn, %{"id" => id}) do
|> put_view(AccountView) |> put_view(AccountView)
|> render("accounts.json", %{for: user, users: users, as: :user}) |> render("accounts.json", %{for: user, users: users, as: :user})
else else
{:visible, false} -> {:error, :not_found}
_ -> json(conn, []) _ -> json(conn, [])
end end
end end
def reblogged_by(%{assigns: %{user: user}} = conn, %{"id" => id}) do def reblogged_by(%{assigns: %{user: user}} = conn, %{"id" => id}) do
with %Activity{} = activity <- Activity.get_by_id_with_object(id), with %Activity{} = activity <- Activity.get_by_id_with_object(id),
{:visible, true} <- {:visible, Visibility.visible_for_user?(activity, user)},
%Object{data: %{"announcements" => announces}} <- Object.normalize(activity) do %Object{data: %{"announcements" => announces}} <- Object.normalize(activity) do
q = from(u in User, where: u.ap_id in ^announces) q = from(u in User, where: u.ap_id in ^announces)
@ -1017,6 +1041,7 @@ def reblogged_by(%{assigns: %{user: user}} = conn, %{"id" => id}) do
|> put_view(AccountView) |> put_view(AccountView)
|> render("accounts.json", %{for: user, users: users, as: :user}) |> render("accounts.json", %{for: user, users: users, as: :user})
else else
{:visible, false} -> {:error, :not_found}
_ -> json(conn, []) _ -> json(conn, [])
end end
end end
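A note on the error handling introduced above: wrapping the visibility check in a `{:visible, _}` tuple is what lets the `else` branch return `:not_found` for private posts while every other mismatch still falls through to the empty JSON list. A generic, self-contained illustration of that `with`/tagged-tuple pattern (the names are made up; nothing here is Pleroma code):

defmodule VisibilitySketch do
  # Distinguish a privacy failure from any other mismatch in a `with` chain.
  def describe(activity_found?, visible?) do
    with {:activity, true} <- {:activity, activity_found?},
         {:visible, true} <- {:visible, visible?} do
      :ok
    else
      {:visible, false} -> {:error, :not_found}
      _ -> :empty_response
    end
  end
end

VisibilitySketch.describe(true, false)   #=> {:error, :not_found}
VisibilitySketch.describe(false, true)   #=> :empty_response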
@ -1055,7 +1080,7 @@ def hashtag_timeline(%{assigns: %{user: user}} = conn, params) do
|> Enum.reverse() |> Enum.reverse()
conn conn
|> add_link_headers(:hashtag_timeline, activities, params["tag"], %{"local" => local_only}) |> add_link_headers(activities, %{"local" => local_only})
|> put_view(StatusView) |> put_view(StatusView)
|> render("index.json", %{activities: activities, for: user, as: :activity}) |> render("index.json", %{activities: activities, for: user, as: :activity})
end end
@ -1071,7 +1096,7 @@ def followers(%{assigns: %{user: for_user}} = conn, %{"id" => id} = params) do
end end
conn conn
|> add_link_headers(:followers, followers, user) |> add_link_headers(followers)
|> put_view(AccountView) |> put_view(AccountView)
|> render("accounts.json", %{for: for_user, users: followers, as: :user}) |> render("accounts.json", %{for: for_user, users: followers, as: :user})
end end
@ -1088,7 +1113,7 @@ def following(%{assigns: %{user: for_user}} = conn, %{"id" => id} = params) do
end end
conn conn
|> add_link_headers(:following, followers, user) |> add_link_headers(followers)
|> put_view(AccountView) |> put_view(AccountView)
|> render("accounts.json", %{for: for_user, users: followers, as: :user}) |> render("accounts.json", %{for: for_user, users: followers, as: :user})
end end
@ -1313,7 +1338,7 @@ def favourites(%{assigns: %{user: user}} = conn, params) do
|> Enum.reverse() |> Enum.reverse()
conn conn
|> add_link_headers(:favourites, activities) |> add_link_headers(activities)
|> put_view(StatusView) |> put_view(StatusView)
|> render("index.json", %{activities: activities, for: user, as: :activity}) |> render("index.json", %{activities: activities, for: user, as: :activity})
end end
@ -1340,7 +1365,7 @@ def user_favourites(%{assigns: %{user: for_user}} = conn, %{"id" => id} = params
|> Enum.reverse() |> Enum.reverse()
conn conn
|> add_link_headers(:favourites, activities) |> add_link_headers(activities)
|> put_view(StatusView) |> put_view(StatusView)
|> render("index.json", %{activities: activities, for: for_user, as: :activity}) |> render("index.json", %{activities: activities, for: for_user, as: :activity})
else else
@ -1361,7 +1386,7 @@ def bookmarks(%{assigns: %{user: user}} = conn, params) do
|> Enum.map(fn b -> Map.put(b.activity, :bookmark, Map.delete(b, :activity)) end) |> Enum.map(fn b -> Map.put(b.activity, :bookmark, Map.delete(b, :activity)) end)
conn conn
|> add_link_headers(:bookmarks, bookmarks) |> add_link_headers(bookmarks)
|> put_view(StatusView) |> put_view(StatusView)
|> render("index.json", %{activities: activities, for: user, as: :activity}) |> render("index.json", %{activities: activities, for: user, as: :activity})
end end
@ -1803,7 +1828,7 @@ def conversations(%{assigns: %{user: user}} = conn, params) do
end) end)
conn conn
|> add_link_headers(:conversations, participations) |> add_link_headers(participations)
|> json(conversations) |> json(conversations)
end end


@ -73,14 +73,12 @@ defp reblogged?(activity, user) do
def render("index.json", opts) do def render("index.json", opts) do
replied_to_activities = get_replied_to_activities(opts.activities) replied_to_activities = get_replied_to_activities(opts.activities)
parallel = unless is_nil(opts[:parallel]), do: opts[:parallel], else: true
opts.activities opts.activities
|> safe_render_many( |> safe_render_many(
StatusView, StatusView,
"status.json", "status.json",
Map.put(opts, :replied_to_activities, replied_to_activities), Map.put(opts, :replied_to_activities, replied_to_activities)
parallel
) )
end end
@ -499,7 +497,7 @@ def build_tags(object_tags) when is_list(object_tags) do
object_tags = for tag when is_binary(tag) <- object_tags, do: tag object_tags = for tag when is_binary(tag) <- object_tags, do: tag
Enum.reduce(object_tags, [], fn tag, tags -> Enum.reduce(object_tags, [], fn tag, tags ->
tags ++ [%{name: tag, url: "/tag/#{tag}"}] tags ++ [%{name: tag, url: "/tag/#{URI.encode(tag)}"}]
end) end)
end end


@ -5,7 +5,7 @@
defmodule Pleroma.Web.PleromaAPI.PleromaAPIController do defmodule Pleroma.Web.PleromaAPI.PleromaAPIController do
use Pleroma.Web, :controller use Pleroma.Web, :controller
import Pleroma.Web.ControllerHelper, only: [add_link_headers: 7] import Pleroma.Web.ControllerHelper, only: [add_link_headers: 2]
alias Pleroma.Conversation.Participation alias Pleroma.Conversation.Participation
alias Pleroma.Notification alias Pleroma.Notification
@ -40,31 +40,22 @@ def conversation_statuses(
%{assigns: %{user: user}} = conn, %{assigns: %{user: user}} = conn,
%{"id" => participation_id} = params %{"id" => participation_id} = params
) do ) do
params = participation = Participation.get(participation_id, preload: [:conversation])
params
|> Map.put("blocking_user", user)
|> Map.put("muting_user", user)
|> Map.put("user", user)
participation =
participation_id
|> Participation.get(preload: [:conversation])
if user.id == participation.user_id do if user.id == participation.user_id do
params =
params
|> Map.put("blocking_user", user)
|> Map.put("muting_user", user)
|> Map.put("user", user)
activities = activities =
participation.conversation.ap_id participation.conversation.ap_id
|> ActivityPub.fetch_activities_for_context(params) |> ActivityPub.fetch_activities_for_context(params)
|> Enum.reverse() |> Enum.reverse()
conn conn
|> add_link_headers( |> add_link_headers(activities)
:conversation_statuses,
activities,
participation_id,
params,
nil,
&pleroma_api_url/4
)
|> put_view(StatusView) |> put_view(StatusView)
|> render("index.json", %{activities: activities, for: user, as: :activity}) |> render("index.json", %{activities: activities, for: user, as: :activity})
end end
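Every call site in this diff moves to a two/three-argument `add_link_headers`, so the helper now has to infer pagination boundaries from the collection itself. The snippet below is a guess at that idea, not the code from `Pleroma.Web.ControllerHelper` (the module name, the `:id`-based boundaries and the header format are assumptions):

defmodule LinkHeaderSketch do
  import Plug.Conn, only: [put_resp_header: 3]
  import Phoenix.Controller, only: [current_url: 2]

  # Derive next/prev Link headers from the first and last rendered entries
  # instead of threading a per-action atom and extra positional arguments.
  def add_link_headers(conn, entries, extra_params \\ %{}) do
    case List.last(entries) do
      %{id: max_id} ->
        %{id: min_id} = List.first(entries)
        next = current_url(conn, Map.put(extra_params, :max_id, max_id))
        prev = current_url(conn, Map.put(extra_params, :min_id, min_id))
        put_resp_header(conn, "link", ~s(<#{next}>; rel="next", <#{prev}>; rel="prev"))

      nil ->
        conn
    end
  end
end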


@ -192,6 +192,7 @@ defmodule Pleroma.Web.Router do
scope "/api/pleroma", Pleroma.Web.TwitterAPI do scope "/api/pleroma", Pleroma.Web.TwitterAPI do
pipe_through(:authenticated_api) pipe_through(:authenticated_api)
post("/change_email", UtilController, :change_email)
post("/change_password", UtilController, :change_password) post("/change_password", UtilController, :change_password)
post("/delete_account", UtilController, :delete_account) post("/delete_account", UtilController, :delete_account)
put("/notification_settings", UtilController, :update_notificaton_settings) put("/notification_settings", UtilController, :update_notificaton_settings)
@ -360,48 +361,46 @@ defmodule Pleroma.Web.Router do
scope "/api/v1", Pleroma.Web.MastodonAPI do scope "/api/v1", Pleroma.Web.MastodonAPI do
pipe_through(:api) pipe_through(:api)
post("/accounts", MastodonAPIController, :account_register)
get("/instance", MastodonAPIController, :masto_instance) get("/instance", MastodonAPIController, :masto_instance)
get("/instance/peers", MastodonAPIController, :peers) get("/instance/peers", MastodonAPIController, :peers)
post("/apps", MastodonAPIController, :create_app) post("/apps", MastodonAPIController, :create_app)
get("/apps/verify_credentials", MastodonAPIController, :verify_app_credentials) get("/apps/verify_credentials", MastodonAPIController, :verify_app_credentials)
get("/custom_emojis", MastodonAPIController, :custom_emojis) get("/custom_emojis", MastodonAPIController, :custom_emojis)
get("/statuses/:id/card", MastodonAPIController, :status_card)
get("/statuses/:id/favourited_by", MastodonAPIController, :favourited_by)
get("/statuses/:id/reblogged_by", MastodonAPIController, :reblogged_by)
get("/trends", MastodonAPIController, :empty_array) get("/trends", MastodonAPIController, :empty_array)
get("/accounts/search", SearchController, :account_search) get("/accounts/search", SearchController, :account_search)
get("/timelines/public", MastodonAPIController, :public_timeline)
get("/timelines/tag/:tag", MastodonAPIController, :hashtag_timeline)
get("/timelines/list/:list_id", MastodonAPIController, :list_timeline)
get("/polls/:id", MastodonAPIController, :get_poll)
post("/accounts", MastodonAPIController, :account_register)
get("/accounts/:id", MastodonAPIController, :user)
get("/accounts/:id/followers", MastodonAPIController, :followers)
get("/accounts/:id/following", MastodonAPIController, :following)
get("/accounts/:id/statuses", MastodonAPIController, :user_statuses)
get("/search", SearchController, :search)
get("/statuses", MastodonAPIController, :get_statuses)
get("/statuses/:id", MastodonAPIController, :get_status)
get("/statuses/:id/context", MastodonAPIController, :get_context)
get("/statuses/:id/card", MastodonAPIController, :status_card)
get("/statuses/:id/favourited_by", MastodonAPIController, :favourited_by)
get("/statuses/:id/reblogged_by", MastodonAPIController, :reblogged_by)
get("/pleroma/accounts/:id/favourites", MastodonAPIController, :user_favourites)
post( post(
"/pleroma/accounts/confirmation_resend", "/pleroma/accounts/confirmation_resend",
MastodonAPIController, MastodonAPIController,
:account_confirmation_resend :account_confirmation_resend
) )
get("/timelines/public", MastodonAPIController, :public_timeline)
get("/timelines/tag/:tag", MastodonAPIController, :hashtag_timeline)
get("/pleroma/accounts/:id/favourites", MastodonAPIController, :user_favourites)
get("/search", SearchController, :search)
get("/polls/:id", MastodonAPIController, :get_poll)
get("/accounts/:id/followers", MastodonAPIController, :followers)
get("/accounts/:id/following", MastodonAPIController, :following)
get("/timelines/list/:list_id", MastodonAPIController, :list_timeline)
get("/accounts/:id", MastodonAPIController, :user)
get("/accounts/:id/statuses", MastodonAPIController, :user_statuses)
get("/statuses/:id", MastodonAPIController, :get_status)
get("/statuses/:id/context", MastodonAPIController, :get_context)
end end
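The reorganized route list above adds a public `GET /statuses` endpoint that takes repeated `ids[]` query parameters. As a reminder of the generic Plug behaviour it relies on (this snippet is illustrative and not part of the diff), bracketed parameters are decoded into a list, which is what the `get_statuses/2` clause pattern-matches on:

conn = Plug.Test.conn(:get, "/api/v1/statuses?ids[]=1&ids[]=2")
conn = Plug.Conn.fetch_query_params(conn)
conn.query_params["ids"]  #=> ["1", "2"]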
scope "/api/v2", Pleroma.Web.MastodonAPI do scope "/api/v2", Pleroma.Web.MastodonAPI do


@ -31,6 +31,7 @@ defmodule Pleroma.Web.TwitterAPI.UtilController do
OAuthScopesPlug, OAuthScopesPlug,
%{scopes: ["write:accounts"]} %{scopes: ["write:accounts"]}
when action in [ when action in [
:change_email,
:change_password, :change_password,
:delete_account, :delete_account,
:update_notificaton_settings, :update_notificaton_settings,
@ -334,6 +335,25 @@ def change_password(%{assigns: %{user: user}} = conn, params) do
end end
end end
def change_email(%{assigns: %{user: user}} = conn, params) do
case CommonAPI.Utils.confirm_current_password(user, params["password"]) do
{:ok, user} ->
with {:ok, _user} <- User.change_email(user, params["email"]) do
json(conn, %{status: "success"})
else
{:error, changeset} ->
{_, {error, _}} = Enum.at(changeset.errors, 0)
json(conn, %{error: "Email #{error}."})
_ ->
json(conn, %{error: "Unable to change email."})
end
{:error, msg} ->
json(conn, %{error: msg})
end
end
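`change_email/2` above hands the real work to `User.change_email/2`, which this diff does not show. The following is a hypothetical changeset-based sketch whose only purpose is to yield the error messages the endpoint and the tests below expect ("can't be blank", "has invalid format", "has already been taken"); the module name and the simple `~r/@/` format check are made up:

defmodule UserEmailSketch do
  import Ecto.Changeset

  def change_email(user, email) do
    user
    |> cast(%{email: email}, [:email])
    |> validate_required([:email])        # -> "can't be blank"
    |> validate_format(:email, ~r/@/)     # -> "has invalid format"
    |> unique_constraint(:email)          # -> "has already been taken"
    |> Pleroma.Repo.update()
  end
end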
def delete_account(%{assigns: %{user: user}} = conn, params) do def delete_account(%{assigns: %{user: user}} = conn, params) do
case CommonAPI.Utils.confirm_current_password(user, params["password"]) do case CommonAPI.Utils.confirm_current_password(user, params["password"]) do
{:ok, user} -> {:ok, user} ->


@ -66,23 +66,9 @@ def safe_render(view, template, assigns \\ %{}) do
end end
@doc """ @doc """
Same as `render_many/4` but wrapped in rescue block and parallelized (unless disabled by passing false as a fifth argument). Same as `render_many/4` but wrapped in rescue block.
""" """
def safe_render_many(collection, view, template, assigns \\ %{}, parallel \\ true) def safe_render_many(collection, view, template, assigns \\ %{}) do
def safe_render_many(collection, view, template, assigns, true) do
Enum.map(collection, fn resource ->
Task.async(fn ->
as = Map.get(assigns, :as) || view.__resource__
assigns = Map.put(assigns, as, resource)
safe_render(view, template, assigns)
end)
end)
|> Enum.map(&Task.await(&1, :infinity))
|> Enum.filter(& &1)
end
def safe_render_many(collection, view, template, assigns, false) do
Enum.map(collection, fn resource -> Enum.map(collection, fn resource ->
as = Map.get(assigns, :as) || view.__resource__ as = Map.get(assigns, :as) || view.__resource__
assigns = Map.put(assigns, as, resource) assigns = Map.put(assigns, as, resource)


@ -125,7 +125,7 @@ defp deps do
{:crypt, {:crypt,
git: "https://github.com/msantos/crypt", ref: "1f2b58927ab57e72910191a7ebaeff984382a1d3"}, git: "https://github.com/msantos/crypt", ref: "1f2b58927ab57e72910191a7ebaeff984382a1d3"},
{:cors_plug, "~> 1.5"}, {:cors_plug, "~> 1.5"},
{:ex_doc, "~> 0.20.2", only: :dev, runtime: false}, {:ex_doc, "~> 0.21", only: :dev, runtime: false},
{:web_push_encryption, "~> 0.2.1"}, {:web_push_encryption, "~> 0.2.1"},
{:swoosh, "~> 0.23.2"}, {:swoosh, "~> 0.23.2"},
{:phoenix_swoosh, "~> 0.2"}, {:phoenix_swoosh, "~> 0.2"},


@ -20,7 +20,7 @@
"db_connection": {:hex, :db_connection, "2.0.6", "bde2f85d047969c5b5800cb8f4b3ed6316c8cb11487afedac4aa5f93fd39abfa", [:mix], [{:connection, "~> 1.0.2", [hex: :connection, repo: "hexpm", optional: false]}], "hexpm"}, "db_connection": {:hex, :db_connection, "2.0.6", "bde2f85d047969c5b5800cb8f4b3ed6316c8cb11487afedac4aa5f93fd39abfa", [:mix], [{:connection, "~> 1.0.2", [hex: :connection, repo: "hexpm", optional: false]}], "hexpm"},
"decimal": {:hex, :decimal, "1.8.0", "ca462e0d885f09a1c5a342dbd7c1dcf27ea63548c65a65e67334f4b61803822e", [:mix], [], "hexpm"}, "decimal": {:hex, :decimal, "1.8.0", "ca462e0d885f09a1c5a342dbd7c1dcf27ea63548c65a65e67334f4b61803822e", [:mix], [], "hexpm"},
"deep_merge": {:hex, :deep_merge, "1.0.0", "b4aa1a0d1acac393bdf38b2291af38cb1d4a52806cf7a4906f718e1feb5ee961", [:mix], [], "hexpm"}, "deep_merge": {:hex, :deep_merge, "1.0.0", "b4aa1a0d1acac393bdf38b2291af38cb1d4a52806cf7a4906f718e1feb5ee961", [:mix], [], "hexpm"},
"earmark": {:hex, :earmark, "1.3.2", "b840562ea3d67795ffbb5bd88940b1bed0ed9fa32834915125ea7d02e35888a5", [:mix], [], "hexpm"}, "earmark": {:hex, :earmark, "1.3.6", "ce1d0675e10a5bb46b007549362bd3f5f08908843957687d8484fe7f37466b19", [:mix], [], "hexpm"},
"ecto": {:hex, :ecto, "3.1.4", "69d852da7a9f04ede725855a35ede48d158ca11a404fe94f8b2fb3b2162cd3c9", [:mix], [{:decimal, "~> 1.6", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}], "hexpm"}, "ecto": {:hex, :ecto, "3.1.4", "69d852da7a9f04ede725855a35ede48d158ca11a404fe94f8b2fb3b2162cd3c9", [:mix], [{:decimal, "~> 1.6", [hex: :decimal, repo: "hexpm", optional: false]}, {:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}], "hexpm"},
"ecto_sql": {:hex, :ecto_sql, "3.1.3", "2c536139190492d9de33c5fefac7323c5eaaa82e1b9bf93482a14649042f7cd9", [:mix], [{:db_connection, "~> 2.0", [hex: :db_connection, repo: "hexpm", optional: false]}, {:ecto, "~> 3.1.0", [hex: :ecto, repo: "hexpm", optional: false]}, {:mariaex, "~> 0.9.1", [hex: :mariaex, repo: "hexpm", optional: true]}, {:myxql, "~> 0.2.0", [hex: :myxql, repo: "hexpm", optional: true]}, {:postgrex, "~> 0.14.0", [hex: :postgrex, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm"}, "ecto_sql": {:hex, :ecto_sql, "3.1.3", "2c536139190492d9de33c5fefac7323c5eaaa82e1b9bf93482a14649042f7cd9", [:mix], [{:db_connection, "~> 2.0", [hex: :db_connection, repo: "hexpm", optional: false]}, {:ecto, "~> 3.1.0", [hex: :ecto, repo: "hexpm", optional: false]}, {:mariaex, "~> 0.9.1", [hex: :mariaex, repo: "hexpm", optional: true]}, {:myxql, "~> 0.2.0", [hex: :myxql, repo: "hexpm", optional: true]}, {:postgrex, "~> 0.14.0", [hex: :postgrex, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm"},
"esshd": {:hex, :esshd, "0.1.0", "6f93a2062adb43637edad0ea7357db2702a4b80dd9683482fe00f5134e97f4c1", [:mix], [], "hexpm"}, "esshd": {:hex, :esshd, "0.1.0", "6f93a2062adb43637edad0ea7357db2702a4b80dd9683482fe00f5134e97f4c1", [:mix], [], "hexpm"},
@ -29,7 +29,7 @@
"ex_aws": {:hex, :ex_aws, "2.1.0", "b92651527d6c09c479f9013caa9c7331f19cba38a650590d82ebf2c6c16a1d8a", [:mix], [{:configparser_ex, "~> 2.0", [hex: :configparser_ex, repo: "hexpm", optional: true]}, {:hackney, "1.6.3 or 1.6.5 or 1.7.1 or 1.8.6 or ~> 1.9", [hex: :hackney, repo: "hexpm", optional: true]}, {:jsx, "~> 2.8", [hex: :jsx, repo: "hexpm", optional: true]}, {:poison, ">= 1.2.0", [hex: :poison, repo: "hexpm", optional: true]}, {:sweet_xml, "~> 0.6", [hex: :sweet_xml, repo: "hexpm", optional: true]}, {:xml_builder, "~> 0.1.0", [hex: :xml_builder, repo: "hexpm", optional: true]}], "hexpm"}, "ex_aws": {:hex, :ex_aws, "2.1.0", "b92651527d6c09c479f9013caa9c7331f19cba38a650590d82ebf2c6c16a1d8a", [:mix], [{:configparser_ex, "~> 2.0", [hex: :configparser_ex, repo: "hexpm", optional: true]}, {:hackney, "1.6.3 or 1.6.5 or 1.7.1 or 1.8.6 or ~> 1.9", [hex: :hackney, repo: "hexpm", optional: true]}, {:jsx, "~> 2.8", [hex: :jsx, repo: "hexpm", optional: true]}, {:poison, ">= 1.2.0", [hex: :poison, repo: "hexpm", optional: true]}, {:sweet_xml, "~> 0.6", [hex: :sweet_xml, repo: "hexpm", optional: true]}, {:xml_builder, "~> 0.1.0", [hex: :xml_builder, repo: "hexpm", optional: true]}], "hexpm"},
"ex_aws_s3": {:hex, :ex_aws_s3, "2.0.1", "9e09366e77f25d3d88c5393824e613344631be8db0d1839faca49686e99b6704", [:mix], [{:ex_aws, "~> 2.0", [hex: :ex_aws, repo: "hexpm", optional: false]}, {:sweet_xml, ">= 0.0.0", [hex: :sweet_xml, repo: "hexpm", optional: true]}], "hexpm"}, "ex_aws_s3": {:hex, :ex_aws_s3, "2.0.1", "9e09366e77f25d3d88c5393824e613344631be8db0d1839faca49686e99b6704", [:mix], [{:ex_aws, "~> 2.0", [hex: :ex_aws, repo: "hexpm", optional: false]}, {:sweet_xml, ">= 0.0.0", [hex: :sweet_xml, repo: "hexpm", optional: true]}], "hexpm"},
"ex_const": {:hex, :ex_const, "0.2.4", "d06e540c9d834865b012a17407761455efa71d0ce91e5831e86881b9c9d82448", [:mix], [], "hexpm"}, "ex_const": {:hex, :ex_const, "0.2.4", "d06e540c9d834865b012a17407761455efa71d0ce91e5831e86881b9c9d82448", [:mix], [], "hexpm"},
"ex_doc": {:hex, :ex_doc, "0.20.2", "1bd0dfb0304bade58beb77f20f21ee3558cc3c753743ae0ddbb0fd7ba2912331", [:mix], [{:earmark, "~> 1.3", [hex: :earmark, repo: "hexpm", optional: false]}, {:makeup_elixir, "~> 0.10", [hex: :makeup_elixir, repo: "hexpm", optional: false]}], "hexpm"}, "ex_doc": {:hex, :ex_doc, "0.21.2", "caca5bc28ed7b3bdc0b662f8afe2bee1eedb5c3cf7b322feeeb7c6ebbde089d6", [:mix], [{:earmark, "~> 1.3.3 or ~> 1.4", [hex: :earmark, repo: "hexpm", optional: false]}, {:makeup_elixir, "~> 0.14", [hex: :makeup_elixir, repo: "hexpm", optional: false]}], "hexpm"},
"ex_machina": {:hex, :ex_machina, "2.3.0", "92a5ad0a8b10ea6314b876a99c8c9e3f25f4dde71a2a835845b136b9adaf199a", [:mix], [{:ecto, "~> 2.2 or ~> 3.0", [hex: :ecto, repo: "hexpm", optional: true]}, {:ecto_sql, "~> 3.0", [hex: :ecto_sql, repo: "hexpm", optional: true]}], "hexpm"}, "ex_machina": {:hex, :ex_machina, "2.3.0", "92a5ad0a8b10ea6314b876a99c8c9e3f25f4dde71a2a835845b136b9adaf199a", [:mix], [{:ecto, "~> 2.2 or ~> 3.0", [hex: :ecto, repo: "hexpm", optional: true]}, {:ecto_sql, "~> 3.0", [hex: :ecto_sql, repo: "hexpm", optional: true]}], "hexpm"},
"ex_rated": {:hex, :ex_rated, "1.3.3", "30ecbdabe91f7eaa9d37fa4e81c85ba420f371babeb9d1910adbcd79ec798d27", [:mix], [{:ex2ms, "~> 1.5", [hex: :ex2ms, repo: "hexpm", optional: false]}], "hexpm"}, "ex_rated": {:hex, :ex_rated, "1.3.3", "30ecbdabe91f7eaa9d37fa4e81c85ba420f371babeb9d1910adbcd79ec798d27", [:mix], [{:ex2ms, "~> 1.5", [hex: :ex2ms, repo: "hexpm", optional: false]}], "hexpm"},
"ex_syslogger": {:git, "https://github.com/slashmili/ex_syslogger.git", "f3963399047af17e038897c69e20d552e6899e1d", [tag: "1.4.0"]}, "ex_syslogger": {:git, "https://github.com/slashmili/ex_syslogger.git", "f3963399047af17e038897c69e20d552e6899e1d", [tag: "1.4.0"]},
@ -46,8 +46,8 @@
"jason": {:hex, :jason, "1.1.2", "b03dedea67a99223a2eaf9f1264ce37154564de899fd3d8b9a21b1a6fd64afe7", [:mix], [{:decimal, "~> 1.0", [hex: :decimal, repo: "hexpm", optional: true]}], "hexpm"}, "jason": {:hex, :jason, "1.1.2", "b03dedea67a99223a2eaf9f1264ce37154564de899fd3d8b9a21b1a6fd64afe7", [:mix], [{:decimal, "~> 1.0", [hex: :decimal, repo: "hexpm", optional: true]}], "hexpm"},
"joken": {:hex, :joken, "2.0.1", "ec9ab31bf660f343380da033b3316855197c8d4c6ef597fa3fcb451b326beb14", [:mix], [{:jose, "~> 1.9", [hex: :jose, repo: "hexpm", optional: false]}], "hexpm"}, "joken": {:hex, :joken, "2.0.1", "ec9ab31bf660f343380da033b3316855197c8d4c6ef597fa3fcb451b326beb14", [:mix], [{:jose, "~> 1.9", [hex: :jose, repo: "hexpm", optional: false]}], "hexpm"},
"jose": {:hex, :jose, "1.9.0", "4167c5f6d06ffaebffd15cdb8da61a108445ef5e85ab8f5a7ad926fdf3ada154", [:mix, :rebar3], [{:base64url, "~> 0.0.1", [hex: :base64url, repo: "hexpm", optional: false]}], "hexpm"}, "jose": {:hex, :jose, "1.9.0", "4167c5f6d06ffaebffd15cdb8da61a108445ef5e85ab8f5a7ad926fdf3ada154", [:mix, :rebar3], [{:base64url, "~> 0.0.1", [hex: :base64url, repo: "hexpm", optional: false]}], "hexpm"},
"makeup": {:hex, :makeup, "0.8.0", "9cf32aea71c7fe0a4b2e9246c2c4978f9070257e5c9ce6d4a28ec450a839b55f", [:mix], [{:nimble_parsec, "~> 0.5.0", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm"}, "makeup": {:hex, :makeup, "1.0.0", "671df94cf5a594b739ce03b0d0316aa64312cee2574b6a44becb83cd90fb05dc", [:mix], [{:nimble_parsec, "~> 0.5.0", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm"},
"makeup_elixir": {:hex, :makeup_elixir, "0.13.0", "be7a477997dcac2e48a9d695ec730b2d22418292675c75aa2d34ba0909dcdeda", [:mix], [{:makeup, "~> 0.8", [hex: :makeup, repo: "hexpm", optional: false]}], "hexpm"}, "makeup_elixir": {:hex, :makeup_elixir, "0.14.0", "cf8b7c66ad1cff4c14679698d532f0b5d45a3968ffbcbfd590339cb57742f1ae", [:mix], [{:makeup, "~> 1.0", [hex: :makeup, repo: "hexpm", optional: false]}], "hexpm"},
"meck": {:hex, :meck, "0.8.13", "ffedb39f99b0b99703b8601c6f17c7f76313ee12de6b646e671e3188401f7866", [:rebar3], [], "hexpm"}, "meck": {:hex, :meck, "0.8.13", "ffedb39f99b0b99703b8601c6f17c7f76313ee12de6b646e671e3188401f7866", [:rebar3], [], "hexpm"},
"metrics": {:hex, :metrics, "1.0.1", "25f094dea2cda98213cecc3aeff09e940299d950904393b2a29d191c346a8486", [:rebar3], [], "hexpm"}, "metrics": {:hex, :metrics, "1.0.1", "25f094dea2cda98213cecc3aeff09e940299d950904393b2a29d191c346a8486", [:rebar3], [], "hexpm"},
"mime": {:hex, :mime, "1.3.1", "30ce04ab3175b6ad0bdce0035cba77bba68b813d523d1aac73d9781b4d193cf8", [:mix], [], "hexpm"}, "mime": {:hex, :mime, "1.3.1", "30ce04ab3175b6ad0bdce0035cba77bba68b813d523d1aac73d9781b4d193cf8", [:mix], [], "hexpm"},
@ -56,7 +56,7 @@
"mock": {:hex, :mock, "0.3.3", "42a433794b1291a9cf1525c6d26b38e039e0d3a360732b5e467bfc77ef26c914", [:mix], [{:meck, "~> 0.8.13", [hex: :meck, repo: "hexpm", optional: false]}], "hexpm"}, "mock": {:hex, :mock, "0.3.3", "42a433794b1291a9cf1525c6d26b38e039e0d3a360732b5e467bfc77ef26c914", [:mix], [{:meck, "~> 0.8.13", [hex: :meck, repo: "hexpm", optional: false]}], "hexpm"},
"mogrify": {:hex, :mogrify, "0.6.1", "de1b527514f2d95a7bbe9642eb556061afb337e220cf97adbf3a4e6438ed70af", [:mix], [], "hexpm"}, "mogrify": {:hex, :mogrify, "0.6.1", "de1b527514f2d95a7bbe9642eb556061afb337e220cf97adbf3a4e6438ed70af", [:mix], [], "hexpm"},
"mox": {:hex, :mox, "0.5.1", "f86bb36026aac1e6f924a4b6d024b05e9adbed5c63e8daa069bd66fb3292165b", [:mix], [], "hexpm"}, "mox": {:hex, :mox, "0.5.1", "f86bb36026aac1e6f924a4b6d024b05e9adbed5c63e8daa069bd66fb3292165b", [:mix], [], "hexpm"},
"nimble_parsec": {:hex, :nimble_parsec, "0.5.0", "90e2eca3d0266e5c53f8fbe0079694740b9c91b6747f2b7e3c5d21966bba8300", [:mix], [], "hexpm"}, "nimble_parsec": {:hex, :nimble_parsec, "0.5.1", "c90796ecee0289dbb5ad16d3ad06f957b0cd1199769641c961cfe0b97db190e0", [:mix], [], "hexpm"},
"parse_trans": {:hex, :parse_trans, "3.3.0", "09765507a3c7590a784615cfd421d101aec25098d50b89d7aa1d66646bc571c1", [:rebar3], [], "hexpm"}, "parse_trans": {:hex, :parse_trans, "3.3.0", "09765507a3c7590a784615cfd421d101aec25098d50b89d7aa1d66646bc571c1", [:rebar3], [], "hexpm"},
"pbkdf2_elixir": {:hex, :pbkdf2_elixir, "0.12.3", "6706a148809a29c306062862c803406e88f048277f6e85b68faf73291e820b84", [:mix], [], "hexpm"}, "pbkdf2_elixir": {:hex, :pbkdf2_elixir, "0.12.3", "6706a148809a29c306062862c803406e88f048277f6e85b68faf73291e820b84", [:mix], [], "hexpm"},
"phoenix": {:hex, :phoenix, "1.4.9", "746d098e10741c334d88143d3c94cab1756435f94387a63441792e66ec0ee974", [:mix], [{:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix_pubsub, "~> 1.1", [hex: :phoenix_pubsub, repo: "hexpm", optional: false]}, {:plug, "~> 1.8.1 or ~> 1.9", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 1.0 or ~> 2.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm"}, "phoenix": {:hex, :phoenix, "1.4.9", "746d098e10741c334d88143d3c94cab1756435f94387a63441792e66ec0ee974", [:mix], [{:jason, "~> 1.0", [hex: :jason, repo: "hexpm", optional: true]}, {:phoenix_pubsub, "~> 1.1", [hex: :phoenix_pubsub, repo: "hexpm", optional: false]}, {:plug, "~> 1.8.1 or ~> 1.9", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 1.0 or ~> 2.0", [hex: :plug_cowboy, repo: "hexpm", optional: true]}, {:telemetry, "~> 0.4", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm"},


@ -173,4 +173,51 @@ test "add an activity with an expiration" do
|> where([a], a.activity_id == ^activity.id) |> where([a], a.activity_id == ^activity.id)
|> Repo.one!() |> Repo.one!()
end end
test "all_by_ids_with_object/1" do
%{id: id1} = insert(:note_activity)
%{id: id2} = insert(:note_activity)
activities =
[id1, id2]
|> Activity.all_by_ids_with_object()
|> Enum.sort(&(&1.id < &2.id))
assert [%{id: ^id1, object: %Object{}}, %{id: ^id2, object: %Object{}}] = activities
end
test "get_by_id_with_object/1" do
%{id: id} = insert(:note_activity)
assert %Activity{id: ^id, object: %Object{}} = Activity.get_by_id_with_object(id)
end
test "get_by_ap_id_with_object/1" do
%{data: %{"id" => ap_id}} = insert(:note_activity)
assert %Activity{data: %{"id" => ^ap_id}, object: %Object{}} =
Activity.get_by_ap_id_with_object(ap_id)
end
test "get_by_id/1" do
%{id: id} = insert(:note_activity)
assert %Activity{id: ^id} = Activity.get_by_id(id)
end
test "all_by_actor_and_id/2" do
user = insert(:user)
{:ok, %{id: id1}} = Pleroma.Web.CommonAPI.post(user, %{"status" => "cofe"})
{:ok, %{id: id2}} = Pleroma.Web.CommonAPI.post(user, %{"status" => "cofefe"})
assert [] == Activity.all_by_actor_and_id(user, [])
activities =
user.ap_id
|> Activity.all_by_actor_and_id([id1, id2])
|> Enum.sort(&(&1.id < &2.id))
assert [%Activity{id: ^id1}, %Activity{id: ^id2}] = activities
end
end end


@ -5,6 +5,7 @@
defmodule Pleroma.Integration.MastodonWebsocketTest do defmodule Pleroma.Integration.MastodonWebsocketTest do
use Pleroma.DataCase use Pleroma.DataCase
import ExUnit.CaptureLog
import Pleroma.Factory import Pleroma.Factory
alias Pleroma.Integration.WebsocketClient alias Pleroma.Integration.WebsocketClient
@ -39,13 +40,17 @@ def start_socket(qs \\ nil, headers \\ []) do
end end
test "refuses invalid requests" do test "refuses invalid requests" do
assert {:error, {400, _}} = start_socket() capture_log(fn ->
assert {:error, {404, _}} = start_socket("?stream=ncjdk") assert {:error, {400, _}} = start_socket()
assert {:error, {404, _}} = start_socket("?stream=ncjdk")
end)
end end
test "requires authentication and a valid token for protected streams" do test "requires authentication and a valid token for protected streams" do
assert {:error, {403, _}} = start_socket("?stream=user&access_token=aaaaaaaaaaaa") capture_log(fn ->
assert {:error, {403, _}} = start_socket("?stream=user") assert {:error, {403, _}} = start_socket("?stream=user&access_token=aaaaaaaaaaaa")
assert {:error, {403, _}} = start_socket("?stream=user")
end)
end end
test "allows public streams without authentication" do test "allows public streams without authentication" do
@ -100,19 +105,27 @@ test "accepts valid tokens", state do
test "accepts the 'user' stream", %{token: token} = _state do test "accepts the 'user' stream", %{token: token} = _state do
assert {:ok, _} = start_socket("?stream=user&access_token=#{token.token}") assert {:ok, _} = start_socket("?stream=user&access_token=#{token.token}")
assert {:error, {403, "Forbidden"}} = start_socket("?stream=user")
assert capture_log(fn ->
assert {:error, {403, "Forbidden"}} = start_socket("?stream=user")
end) =~ ":badarg"
end end
test "accepts the 'user:notification' stream", %{token: token} = _state do test "accepts the 'user:notification' stream", %{token: token} = _state do
assert {:ok, _} = start_socket("?stream=user:notification&access_token=#{token.token}") assert {:ok, _} = start_socket("?stream=user:notification&access_token=#{token.token}")
assert {:error, {403, "Forbidden"}} = start_socket("?stream=user:notification")
assert capture_log(fn ->
assert {:error, {403, "Forbidden"}} = start_socket("?stream=user:notification")
end) =~ ":badarg"
end end
test "accepts valid token on Sec-WebSocket-Protocol header", %{token: token} do test "accepts valid token on Sec-WebSocket-Protocol header", %{token: token} do
assert {:ok, _} = start_socket("?stream=user", [{"Sec-WebSocket-Protocol", token.token}]) assert {:ok, _} = start_socket("?stream=user", [{"Sec-WebSocket-Protocol", token.token}])
assert {:error, {403, "Forbidden"}} = assert capture_log(fn ->
start_socket("?stream=user", [{"Sec-WebSocket-Protocol", "I am a friend"}]) assert {:error, {403, "Forbidden"}} =
start_socket("?stream=user", [{"Sec-WebSocket-Protocol", "I am a friend"}])
end) =~ ":badarg"
end end
end end
end end
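The websocket tests above now wrap the failing connection attempts in `capture_log/1`. A minimal, generic ExUnit example of the same pattern (not taken from this diff): expected error output is captured and asserted on instead of being printed during the test run.

defmodule CaptureLogSketchTest do
  use ExUnit.Case, async: true
  import ExUnit.CaptureLog
  require Logger

  test "captures expected error output" do
    assert capture_log(fn ->
             Logger.error("connection refused")
           end) =~ "connection refused"
  end
end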


@ -53,9 +53,12 @@ test "ensures cache is cleared for the object" do
assert object == cached_object assert object == cached_object
Cachex.put(:web_resp_cache, URI.parse(object.data["id"]).path, "cofe")
Object.delete(cached_object) Object.delete(cached_object)
{:ok, nil} = Cachex.get(:object_cache, "object:#{object.data["id"]}") {:ok, nil} = Cachex.get(:object_cache, "object:#{object.data["id"]}")
{:ok, nil} = Cachex.get(:web_resp_cache, URI.parse(object.data["id"]).path)
cached_object = Object.get_cached_by_ap_id(object.data["id"]) cached_object = Object.get_cached_by_ap_id(object.data["id"])

test/plugs/cache_test.exs (new file, 186 lines)

@ -0,0 +1,186 @@
# Pleroma: A lightweight social networking server
# Copyright © 2017-2019 Pleroma Authors <https://pleroma.social/>
# SPDX-License-Identifier: AGPL-3.0-only
defmodule Pleroma.Plugs.CacheTest do
use ExUnit.Case, async: true
use Plug.Test
alias Pleroma.Plugs.Cache
@miss_resp {200,
[
{"cache-control", "max-age=0, private, must-revalidate"},
{"content-type", "cofe/hot; charset=utf-8"},
{"x-cache", "MISS from Pleroma"}
], "cofe"}
@hit_resp {200,
[
{"cache-control", "max-age=0, private, must-revalidate"},
{"content-type", "cofe/hot; charset=utf-8"},
{"x-cache", "HIT from Pleroma"}
], "cofe"}
@ttl 5
setup do
Cachex.clear(:web_resp_cache)
:ok
end
test "caches a response" do
assert @miss_resp ==
conn(:get, "/")
|> Cache.call(%{query_params: false, ttl: nil})
|> put_resp_content_type("cofe/hot")
|> send_resp(:ok, "cofe")
|> sent_resp()
assert_raise(Plug.Conn.AlreadySentError, fn ->
conn(:get, "/")
|> Cache.call(%{query_params: false, ttl: nil})
|> put_resp_content_type("cofe/hot")
|> send_resp(:ok, "cofe")
|> sent_resp()
end)
assert @hit_resp ==
conn(:get, "/")
|> Cache.call(%{query_params: false, ttl: nil})
|> sent_resp()
end
test "ttl is set" do
assert @miss_resp ==
conn(:get, "/")
|> Cache.call(%{query_params: false, ttl: @ttl})
|> put_resp_content_type("cofe/hot")
|> send_resp(:ok, "cofe")
|> sent_resp()
assert @hit_resp ==
conn(:get, "/")
|> Cache.call(%{query_params: false, ttl: @ttl})
|> sent_resp()
:timer.sleep(@ttl + 1)
assert @miss_resp ==
conn(:get, "/")
|> Cache.call(%{query_params: false, ttl: @ttl})
|> put_resp_content_type("cofe/hot")
|> send_resp(:ok, "cofe")
|> sent_resp()
end
test "set ttl via conn.assigns" do
assert @miss_resp ==
conn(:get, "/")
|> Cache.call(%{query_params: false, ttl: nil})
|> put_resp_content_type("cofe/hot")
|> assign(:cache_ttl, @ttl)
|> send_resp(:ok, "cofe")
|> sent_resp()
assert @hit_resp ==
conn(:get, "/")
|> Cache.call(%{query_params: false, ttl: nil})
|> sent_resp()
:timer.sleep(@ttl + 1)
assert @miss_resp ==
conn(:get, "/")
|> Cache.call(%{query_params: false, ttl: nil})
|> put_resp_content_type("cofe/hot")
|> send_resp(:ok, "cofe")
|> sent_resp()
end
test "ignore query string when `query_params` is false" do
assert @miss_resp ==
conn(:get, "/?cofe")
|> Cache.call(%{query_params: false, ttl: nil})
|> put_resp_content_type("cofe/hot")
|> send_resp(:ok, "cofe")
|> sent_resp()
assert @hit_resp ==
conn(:get, "/?cofefe")
|> Cache.call(%{query_params: false, ttl: nil})
|> sent_resp()
end
test "take query string into account when `query_params` is true" do
assert @miss_resp ==
conn(:get, "/?cofe")
|> Cache.call(%{query_params: true, ttl: nil})
|> put_resp_content_type("cofe/hot")
|> send_resp(:ok, "cofe")
|> sent_resp()
assert @miss_resp ==
conn(:get, "/?cofefe")
|> Cache.call(%{query_params: true, ttl: nil})
|> put_resp_content_type("cofe/hot")
|> send_resp(:ok, "cofe")
|> sent_resp()
end
test "take specific query params into account when `query_params` is list" do
assert @miss_resp ==
conn(:get, "/?a=1&b=2&c=3&foo=bar")
|> fetch_query_params()
|> Cache.call(%{query_params: ["a", "b", "c"], ttl: nil})
|> put_resp_content_type("cofe/hot")
|> send_resp(:ok, "cofe")
|> sent_resp()
assert @hit_resp ==
conn(:get, "/?bar=foo&c=3&b=2&a=1")
|> fetch_query_params()
|> Cache.call(%{query_params: ["a", "b", "c"], ttl: nil})
|> sent_resp()
assert @miss_resp ==
conn(:get, "/?bar=foo&c=3&b=2&a=2")
|> fetch_query_params()
|> Cache.call(%{query_params: ["a", "b", "c"], ttl: nil})
|> put_resp_content_type("cofe/hot")
|> send_resp(:ok, "cofe")
|> sent_resp()
end
test "ignore not GET requests" do
expected =
{200,
[
{"cache-control", "max-age=0, private, must-revalidate"},
{"content-type", "cofe/hot; charset=utf-8"}
], "cofe"}
assert expected ==
conn(:post, "/")
|> Cache.call(%{query_params: true, ttl: nil})
|> put_resp_content_type("cofe/hot")
|> send_resp(:ok, "cofe")
|> sent_resp()
end
test "ignore non-successful responses" do
expected =
{418,
[
{"cache-control", "max-age=0, private, must-revalidate"},
{"content-type", "tea/iced; charset=utf-8"}
], "🥤"}
assert expected ==
conn(:get, "/cofe")
|> Cache.call(%{query_params: true, ttl: nil})
|> put_resp_content_type("tea/iced")
|> send_resp(:im_a_teapot, "🥤")
|> sent_resp()
end
end
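For context on the new plug exercised above: a response cache like `Pleroma.Plugs.Cache` would normally be mounted in a router pipeline. The wiring below is a hypothetical example (the pipeline name and routes are invented, and it assumes `init/1` passes the option map straight through, as the direct `Cache.call/2` calls in the tests suggest):

# inside a Phoenix router module such as Pleroma.Web.Router
pipeline :cached_activity_pub do
  plug(Pleroma.Plugs.Cache, %{query_params: false, ttl: nil})
end

scope "/objects" do
  pipe_through(:cached_activity_pub)
  # ... GET routes whose responses should be cached ...
end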


@ -570,22 +570,6 @@ test "it has required fields" do
refute cs.valid? refute cs.valid?
end) end)
end end
test "it restricts some sizes" do
bio_limit = Pleroma.Config.get([:instance, :user_bio_length], 5000)
name_limit = Pleroma.Config.get([:instance, :user_name_length], 100)
[bio: bio_limit, name: name_limit]
|> Enum.each(fn {field, size} ->
string = String.pad_leading(".", size)
cs = User.remote_user_creation(Map.put(@valid_remote, field, string))
assert cs.valid?
string = String.pad_leading(".", size + 1)
cs = User.remote_user_creation(Map.put(@valid_remote, field, string))
refute cs.valid?
end)
end
end end
describe "followers and friends" do describe "followers and friends" do
@ -1081,7 +1065,7 @@ test "it deletes a user, all follow relationships and all activities", %{user: u
user_activities = user_activities =
user.ap_id user.ap_id
|> Activity.query_by_actor() |> Activity.Queries.by_actor()
|> Repo.all() |> Repo.all()
|> Enum.map(fn act -> act.data["type"] end) |> Enum.map(fn act -> act.data["type"] end)
@ -1117,11 +1101,60 @@ test "get_public_key_for_ap_id fetches a user that's not in the db" do
assert {:ok, _key} = User.get_public_key_for_ap_id("http://mastodon.example.org/users/admin") assert {:ok, _key} = User.get_public_key_for_ap_id("http://mastodon.example.org/users/admin")
end end
test "insert or update a user from given data" do describe "insert or update a user from given data" do
user = insert(:user, %{nickname: "nick@name.de"}) test "with normal data" do
data = %{ap_id: user.ap_id <> "xxx", name: user.name, nickname: user.nickname} user = insert(:user, %{nickname: "nick@name.de"})
data = %{ap_id: user.ap_id <> "xxx", name: user.name, nickname: user.nickname}
assert {:ok, %User{}} = User.insert_or_update_user(data) assert {:ok, %User{}} = User.insert_or_update_user(data)
end
test "with overly long fields" do
current_max_length = Pleroma.Config.get([:instance, :account_field_value_length], 255)
user = insert(:user, nickname: "nickname@supergood.domain")
data = %{
ap_id: user.ap_id,
name: user.name,
nickname: user.nickname,
info: %{
fields: [
%{"name" => "myfield", "value" => String.duplicate("h", current_max_length + 1)}
]
}
}
assert {:ok, %User{}} = User.insert_or_update_user(data)
end
test "with an overly long bio" do
current_max_length = Pleroma.Config.get([:instance, :user_bio_length], 5000)
user = insert(:user, nickname: "nickname@supergood.domain")
data = %{
ap_id: user.ap_id,
name: user.name,
nickname: user.nickname,
bio: String.duplicate("h", current_max_length + 1),
info: %{}
}
assert {:ok, %User{}} = User.insert_or_update_user(data)
end
test "with an overly long display name" do
current_max_length = Pleroma.Config.get([:instance, :user_name_length], 100)
user = insert(:user, nickname: "nickname@supergood.domain")
data = %{
ap_id: user.ap_id,
name: String.duplicate("h", current_max_length + 1),
nickname: user.nickname,
info: %{}
}
assert {:ok, %User{}} = User.insert_or_update_user(data)
end
end end
describe "per-user rich-text filtering" do describe "per-user rich-text filtering" do
@ -1614,4 +1647,31 @@ test "syncronizes the counters with the remote instance for the follower when en
assert User.user_info(other_user).following_count == 152 assert User.user_info(other_user).following_count == 152
end end
end end
describe "change_email/2" do
setup do
[user: insert(:user)]
end
test "blank email returns error", %{user: user} do
assert {:error, %{errors: [email: {"can't be blank", _}]}} = User.change_email(user, "")
assert {:error, %{errors: [email: {"can't be blank", _}]}} = User.change_email(user, nil)
end
test "non unique email returns error", %{user: user} do
%{email: email} = insert(:user)
assert {:error, %{errors: [email: {"has already been taken", _}]}} =
User.change_email(user, email)
end
test "invalid email returns error", %{user: user} do
assert {:error, %{errors: [email: {"has invalid format", _}]}} =
User.change_email(user, "cofe")
end
test "changes email", %{user: user} do
assert {:ok, %User{email: "cofe@cofe.party"}} = User.change_email(user, "cofe@cofe.party")
end
end
end end


@ -175,6 +175,49 @@ test "it returns 404 for tombstone objects", %{conn: conn} do
assert json_response(conn, 404) assert json_response(conn, 404)
end end
test "it caches a response", %{conn: conn} do
note = insert(:note)
uuid = String.split(note.data["id"], "/") |> List.last()
conn1 =
conn
|> put_req_header("accept", "application/activity+json")
|> get("/objects/#{uuid}")
assert json_response(conn1, :ok)
assert Enum.any?(conn1.resp_headers, &(&1 == {"x-cache", "MISS from Pleroma"}))
conn2 =
conn
|> put_req_header("accept", "application/activity+json")
|> get("/objects/#{uuid}")
assert json_response(conn1, :ok) == json_response(conn2, :ok)
assert Enum.any?(conn2.resp_headers, &(&1 == {"x-cache", "HIT from Pleroma"}))
end
test "cached purged after object deletion", %{conn: conn} do
note = insert(:note)
uuid = String.split(note.data["id"], "/") |> List.last()
conn1 =
conn
|> put_req_header("accept", "application/activity+json")
|> get("/objects/#{uuid}")
assert json_response(conn1, :ok)
assert Enum.any?(conn1.resp_headers, &(&1 == {"x-cache", "MISS from Pleroma"}))
Object.delete(note)
conn2 =
conn
|> put_req_header("accept", "application/activity+json")
|> get("/objects/#{uuid}")
assert "Not found" == json_response(conn2, :not_found)
end
end end
describe "/object/:uuid/likes" do describe "/object/:uuid/likes" do
@ -264,6 +307,51 @@ test "it returns 404 for non-public activities", %{conn: conn} do
assert json_response(conn, 404) assert json_response(conn, 404)
end end
test "it caches a response", %{conn: conn} do
activity = insert(:note_activity)
uuid = String.split(activity.data["id"], "/") |> List.last()
conn1 =
conn
|> put_req_header("accept", "application/activity+json")
|> get("/activities/#{uuid}")
assert json_response(conn1, :ok)
assert Enum.any?(conn1.resp_headers, &(&1 == {"x-cache", "MISS from Pleroma"}))
conn2 =
conn
|> put_req_header("accept", "application/activity+json")
|> get("/activities/#{uuid}")
assert json_response(conn1, :ok) == json_response(conn2, :ok)
assert Enum.any?(conn2.resp_headers, &(&1 == {"x-cache", "HIT from Pleroma"}))
end
test "cached purged after activity deletion", %{conn: conn} do
user = insert(:user)
{:ok, activity} = CommonAPI.post(user, %{"status" => "cofe"})
uuid = String.split(activity.data["id"], "/") |> List.last()
conn1 =
conn
|> put_req_header("accept", "application/activity+json")
|> get("/activities/#{uuid}")
assert json_response(conn1, :ok)
assert Enum.any?(conn1.resp_headers, &(&1 == {"x-cache", "MISS from Pleroma"}))
Activity.delete_by_ap_id(activity.object.data["id"])
conn2 =
conn
|> put_req_header("accept", "application/activity+json")
|> get("/activities/#{uuid}")
assert "Not found" == json_response(conn2, :not_found)
end
end end
describe "/inbox" do describe "/inbox" do
@ -365,6 +453,17 @@ test "it rejects reads from other users", %{conn: conn} do
assert json_response(conn, 403) assert json_response(conn, 403)
end end
test "it doesn't crash without an authenticated user", %{conn: conn} do
user = insert(:user)
conn =
conn
|> put_req_header("accept", "application/activity+json")
|> get("/users/#{user.nickname}/inbox")
assert json_response(conn, 403)
end
test "it returns a note activity in a collection", %{conn: conn} do test "it returns a note activity in a collection", %{conn: conn} do
note_activity = insert(:direct_note_activity) note_activity = insert(:direct_note_activity)
note_object = Object.normalize(note_activity) note_object = Object.normalize(note_activity)


@ -5,6 +5,7 @@
defmodule Pleroma.Web.ActivityPub.PublisherTest do defmodule Pleroma.Web.ActivityPub.PublisherTest do
use Pleroma.DataCase use Pleroma.DataCase
import ExUnit.CaptureLog
import Pleroma.Factory import Pleroma.Factory
import Tesla.Mock import Tesla.Mock
import Mock import Mock
@ -188,7 +189,10 @@ test "it returns inbox for messages involving single recipients in total" do
actor = insert(:user) actor = insert(:user)
inbox = "http://connrefused.site/users/nick1/inbox" inbox = "http://connrefused.site/users/nick1/inbox"
assert {:error, _} = Publisher.publish_one(%{inbox: inbox, json: "{}", actor: actor, id: 1}) assert capture_log(fn ->
assert {:error, _} =
Publisher.publish_one(%{inbox: inbox, json: "{}", actor: actor, id: 1})
end) =~ "connrefused"
assert called(Instances.set_unreachable(inbox)) assert called(Instances.set_unreachable(inbox))
end end
@ -212,14 +216,16 @@ test "it returns inbox for messages involving single recipients in total" do
actor = insert(:user) actor = insert(:user)
inbox = "http://connrefused.site/users/nick1/inbox" inbox = "http://connrefused.site/users/nick1/inbox"
assert {:error, _} = assert capture_log(fn ->
Publisher.publish_one(%{ assert {:error, _} =
inbox: inbox, Publisher.publish_one(%{
json: "{}", inbox: inbox,
actor: actor, json: "{}",
id: 1, actor: actor,
unreachable_since: NaiveDateTime.utc_now() id: 1,
}) unreachable_since: NaiveDateTime.utc_now()
})
end) =~ "connrefused"
refute called(Instances.set_unreachable(inbox)) refute called(Instances.set_unreachable(inbox))
end end


@ -10,6 +10,7 @@ defmodule Pleroma.Web.ActivityPub.RelayTest do
alias Pleroma.Web.ActivityPub.ActivityPub alias Pleroma.Web.ActivityPub.ActivityPub
alias Pleroma.Web.ActivityPub.Relay alias Pleroma.Web.ActivityPub.Relay
import ExUnit.CaptureLog
import Pleroma.Factory import Pleroma.Factory
import Mock import Mock
@ -20,7 +21,9 @@ test "gets an actor for the relay" do
describe "follow/1" do describe "follow/1" do
test "returns errors when user not found" do test "returns errors when user not found" do
assert Relay.follow("test-ap-id") == {:error, "Could not fetch by AP id"} assert capture_log(fn ->
assert Relay.follow("test-ap-id") == {:error, "Could not fetch by AP id"}
end) =~ "Could not fetch by AP id"
end end
test "returns activity" do test "returns activity" do
@ -37,7 +40,9 @@ test "returns activity" do
describe "unfollow/1" do describe "unfollow/1" do
test "returns errors when user not found" do test "returns errors when user not found" do
assert Relay.unfollow("test-ap-id") == {:error, "Could not fetch by AP id"} assert capture_log(fn ->
assert Relay.unfollow("test-ap-id") == {:error, "Could not fetch by AP id"}
end) =~ "Could not fetch by AP id"
end end
test "returns activity" do test "returns activity" do
@ -78,7 +83,9 @@ test "returns error when object is unknown" do
} }
) )
assert Relay.publish(activity) == {:error, nil} assert capture_log(fn ->
assert Relay.publish(activity) == {:error, nil}
end) =~ "[error] error: nil"
end end
test_with_mock "returns announce activity and publish to federate", test_with_mock "returns announce activity and publish to federate",


@ -102,7 +102,7 @@ test "it does not crash if the object in inReplyTo can't be fetched" do
assert capture_log(fn -> assert capture_log(fn ->
{:ok, _returned_activity} = Transmogrifier.handle_incoming(data) {:ok, _returned_activity} = Transmogrifier.handle_incoming(data)
end) =~ "[error] Couldn't fetch \"\"https://404.site/whatever\"\", error: nil" end) =~ "[error] Couldn't fetch \"https://404.site/whatever\", error: nil"
end end
test "it works for incoming notices" do test "it works for incoming notices" do


@ -1779,7 +1779,11 @@ test "common config example", %{conn: conn} do
%{"tuple" => [":seconds_valid", 60]}, %{"tuple" => [":seconds_valid", 60]},
%{"tuple" => [":path", ""]}, %{"tuple" => [":path", ""]},
%{"tuple" => [":key1", nil]}, %{"tuple" => [":key1", nil]},
%{"tuple" => [":partial_chain", "&:hackney_connect.partial_chain/1"]} %{"tuple" => [":partial_chain", "&:hackney_connect.partial_chain/1"]},
%{"tuple" => [":regex1", "~r/https:\/\/example.com/"]},
%{"tuple" => [":regex2", "~r/https:\/\/example.com/u"]},
%{"tuple" => [":regex3", "~r/https:\/\/example.com/i"]},
%{"tuple" => [":regex4", "~r/https:\/\/example.com/s"]}
] ]
} }
] ]
@ -1796,7 +1800,11 @@ test "common config example", %{conn: conn} do
%{"tuple" => [":seconds_valid", 60]}, %{"tuple" => [":seconds_valid", 60]},
%{"tuple" => [":path", ""]}, %{"tuple" => [":path", ""]},
%{"tuple" => [":key1", nil]}, %{"tuple" => [":key1", nil]},
%{"tuple" => [":partial_chain", "&:hackney_connect.partial_chain/1"]} %{"tuple" => [":partial_chain", "&:hackney_connect.partial_chain/1"]},
%{"tuple" => [":regex1", "~r/https:\\/\\/example.com/"]},
%{"tuple" => [":regex2", "~r/https:\\/\\/example.com/u"]},
%{"tuple" => [":regex3", "~r/https:\\/\\/example.com/i"]},
%{"tuple" => [":regex4", "~r/https:\\/\\/example.com/s"]}
] ]
} }
] ]


@ -103,6 +103,30 @@ test "sigil" do
assert Config.from_binary(binary) == ~r/comp[lL][aA][iI][nN]er/ assert Config.from_binary(binary) == ~r/comp[lL][aA][iI][nN]er/
end end
test "link sigil" do
binary = Config.transform("~r/https:\/\/example.com/")
assert binary == :erlang.term_to_binary(~r/https:\/\/example.com/)
assert Config.from_binary(binary) == ~r/https:\/\/example.com/
end
test "link sigil with u modifier" do
binary = Config.transform("~r/https:\/\/example.com/u")
assert binary == :erlang.term_to_binary(~r/https:\/\/example.com/u)
assert Config.from_binary(binary) == ~r/https:\/\/example.com/u
end
test "link sigil with i modifier" do
binary = Config.transform("~r/https:\/\/example.com/i")
assert binary == :erlang.term_to_binary(~r/https:\/\/example.com/i)
assert Config.from_binary(binary) == ~r/https:\/\/example.com/i
end
test "link sigil with s modifier" do
binary = Config.transform("~r/https:\/\/example.com/s")
assert binary == :erlang.term_to_binary(~r/https:\/\/example.com/s)
assert Config.from_binary(binary) == ~r/https:\/\/example.com/s
end
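What the four sigil tests above boil down to is that a compiled regex, modifiers included, round-trips through Erlang's external term format unchanged. A plain-Elixir illustration (not Pleroma-specific):

regex = ~r/https:\/\/example.com/u
binary = :erlang.term_to_binary(regex)
^regex = :erlang.binary_to_term(binary)
true = Regex.match?(regex, "https://example.com")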
test "2 child tuple" do test "2 child tuple" do
binary = Config.transform(%{"tuple" => ["v1", ":v2"]}) binary = Config.transform(%{"tuple" => ["v1", ":v2"]})
assert binary == :erlang.term_to_binary({"v1", :v2}) assert binary == :erlang.term_to_binary({"v1", :v2})


@ -744,6 +744,16 @@ test "get a status", %{conn: conn} do
assert id == to_string(activity.id) assert id == to_string(activity.id)
end end
test "get statuses by IDs", %{conn: conn} do
%{id: id1} = insert(:note_activity)
%{id: id2} = insert(:note_activity)
query_string = "ids[]=#{id1}&ids[]=#{id2}"
conn = get(conn, "/api/v1/statuses/?#{query_string}")
assert [%{"id" => ^id1}, %{"id" => ^id2}] = json_response(conn, :ok)
end
describe "deleting a status" do describe "deleting a status" do
test "when you created it", %{conn: conn} do test "when you created it", %{conn: conn} do
activity = insert(:note_activity) activity = insert(:note_activity)
@ -3688,7 +3698,7 @@ test "returns 404 when poll is private and not available for user", %{conn: conn
build_conn() build_conn()
|> assign(:user, user) |> assign(:user, user)
[conn: conn, activity: activity] [conn: conn, activity: activity, user: user]
end end
test "returns users who have favorited the status", %{conn: conn, activity: activity} do test "returns users who have favorited the status", %{conn: conn, activity: activity} do
@ -3748,6 +3758,32 @@ test "does not fail on an unauthenticated request", %{conn: conn, activity: acti
[%{"id" => id}] = response [%{"id" => id}] = response
assert id == other_user.id assert id == other_user.id
end end
test "requires authentification for private posts", %{conn: conn, user: user} do
other_user = insert(:user)
{:ok, activity} =
CommonAPI.post(user, %{
"status" => "@#{other_user.nickname} wanna get some #cofe together?",
"visibility" => "direct"
})
{:ok, _, _} = CommonAPI.favorite(activity.id, other_user)
conn
|> assign(:user, nil)
|> get("/api/v1/statuses/#{activity.id}/favourited_by")
|> json_response(404)
response =
build_conn()
|> assign(:user, other_user)
|> get("/api/v1/statuses/#{activity.id}/favourited_by")
|> json_response(200)
[%{"id" => id}] = response
assert id == other_user.id
end
end end
describe "GET /api/v1/statuses/:id/reblogged_by" do describe "GET /api/v1/statuses/:id/reblogged_by" do
@ -3759,7 +3795,7 @@ test "does not fail on an unauthenticated request", %{conn: conn, activity: acti
build_conn() build_conn()
|> assign(:user, user) |> assign(:user, user)
[conn: conn, activity: activity] [conn: conn, activity: activity, user: user]
end end
test "returns users who have reblogged the status", %{conn: conn, activity: activity} do test "returns users who have reblogged the status", %{conn: conn, activity: activity} do
@ -3819,6 +3855,29 @@ test "does not fail on an unauthenticated request", %{conn: conn, activity: acti
[%{"id" => id}] = response [%{"id" => id}] = response
assert id == other_user.id assert id == other_user.id
end end
test "requires authentification for private posts", %{conn: conn, user: user} do
other_user = insert(:user)
{:ok, activity} =
CommonAPI.post(user, %{
"status" => "@#{other_user.nickname} wanna get some #cofe together?",
"visibility" => "direct"
})
conn
|> assign(:user, nil)
|> get("/api/v1/statuses/#{activity.id}/reblogged_by")
|> json_response(404)
response =
build_conn()
|> assign(:user, other_user)
|> get("/api/v1/statuses/#{activity.id}/reblogged_by")
|> json_response(200)
assert [] == response
end
end end
describe "POST /auth/password, with valid parameters" do describe "POST /auth/password, with valid parameters" do
@ -3953,13 +4012,15 @@ test "returns error", %{conn: conn, user: user} do
Config.put([:suggestions, :enabled], true) Config.put([:suggestions, :enabled], true)
Config.put([:suggestions, :third_party_engine], "http://test500?{{host}}&{{user}}") Config.put([:suggestions, :third_party_engine], "http://test500?{{host}}&{{user}}")
res = assert capture_log(fn ->
conn res =
|> assign(:user, user) conn
|> get("/api/v1/suggestions") |> assign(:user, user)
|> json_response(500) |> get("/api/v1/suggestions")
|> json_response(500)
assert res == "Something went wrong" assert res == "Something went wrong"
end) =~ "Could not retrieve suggestions"
end end
test "returns suggestions", %{conn: conn, user: user, other_user: other_user} do test "returns suggestions", %{conn: conn, user: user, other_user: other_user} do


@ -8,6 +8,7 @@ defmodule Pleroma.Web.TwitterAPI.UtilControllerTest do
alias Pleroma.Repo alias Pleroma.Repo
alias Pleroma.User alias Pleroma.User
alias Pleroma.Web.CommonAPI alias Pleroma.Web.CommonAPI
import ExUnit.CaptureLog
import Pleroma.Factory import Pleroma.Factory
import Mock import Mock
@ -340,12 +341,14 @@ test "show follow page if the `acct` is a account link", %{conn: conn} do
test "show follow page with error when user cannot fecth by `acct` link", %{conn: conn} do test "show follow page with error when user cannot fecth by `acct` link", %{conn: conn} do
user = insert(:user) user = insert(:user)
response = assert capture_log(fn ->
conn response =
|> assign(:user, user) conn
|> get("/ostatus_subscribe?acct=https://mastodon.social/users/not_found") |> assign(:user, user)
|> get("/ostatus_subscribe?acct=https://mastodon.social/users/not_found")
assert html_response(response, 200) =~ "Error fetching user" assert html_response(response, 200) =~ "Error fetching user"
end) =~ "Object has been deleted"
end end
end end
@ -664,4 +667,111 @@ test "it returns new captcha", %{conn: conn} do
assert called(Pleroma.Captcha.new()) assert called(Pleroma.Captcha.new())
end end
end end
defp with_credentials(conn, username, password) do
header_content = "Basic " <> Base.encode64("#{username}:#{password}")
put_req_header(conn, "authorization", header_content)
end
defp valid_user(_context) do
user = insert(:user)
[user: user]
end
describe "POST /api/pleroma/change_email" do
setup [:valid_user]
test "without credentials", %{conn: conn} do
conn = post(conn, "/api/pleroma/change_email")
assert json_response(conn, 403) == %{"error" => "Invalid credentials."}
end
test "with credentials and invalid password", %{conn: conn, user: current_user} do
conn =
conn
|> with_credentials(current_user.nickname, "test")
|> post("/api/pleroma/change_email", %{
"password" => "hi",
"email" => "test@test.com"
})
assert json_response(conn, 200) == %{"error" => "Invalid password."}
end
test "with credentials, valid password and invalid email", %{
conn: conn,
user: current_user
} do
conn =
conn
|> with_credentials(current_user.nickname, "test")
|> post("/api/pleroma/change_email", %{
"password" => "test",
"email" => "foobar"
})
assert json_response(conn, 200) == %{"error" => "Email has invalid format."}
end
test "with credentials, valid password and no email", %{
conn: conn,
user: current_user
} do
conn =
conn
|> with_credentials(current_user.nickname, "test")
|> post("/api/pleroma/change_email", %{
"password" => "test"
})
assert json_response(conn, 200) == %{"error" => "Email can't be blank."}
end
test "with credentials, valid password and blank email", %{
conn: conn,
user: current_user
} do
conn =
conn
|> with_credentials(current_user.nickname, "test")
|> post("/api/pleroma/change_email", %{
"password" => "test",
"email" => ""
})
assert json_response(conn, 200) == %{"error" => "Email can't be blank."}
end
test "with credentials, valid password and non unique email", %{
conn: conn,
user: current_user
} do
user = insert(:user)
conn =
conn
|> with_credentials(current_user.nickname, "test")
|> post("/api/pleroma/change_email", %{
"password" => "test",
"email" => user.email
})
assert json_response(conn, 200) == %{"error" => "Email has already been taken."}
end
test "with credentials, valid password and valid email", %{
conn: conn,
user: current_user
} do
conn =
conn
|> with_credentials(current_user.nickname, "test")
|> post("/api/pleroma/change_email", %{
"password" => "test",
"email" => "cofe@foobar.com"
})
assert json_response(conn, 200) == %{"status" => "success"}
end
end
end end


@ -5,6 +5,7 @@
defmodule Pleroma.Web.WebFinger.WebFingerControllerTest do defmodule Pleroma.Web.WebFinger.WebFingerControllerTest do
use Pleroma.Web.ConnCase use Pleroma.Web.ConnCase
import ExUnit.CaptureLog
import Pleroma.Factory import Pleroma.Factory
import Tesla.Mock import Tesla.Mock
@ -75,11 +76,13 @@ test "it returns 404 when user isn't found (XML)" do
test "Sends a 404 when invalid format" do test "Sends a 404 when invalid format" do
user = insert(:user) user = insert(:user)
assert_raise Phoenix.NotAcceptableError, fn -> assert capture_log(fn ->
build_conn() assert_raise Phoenix.NotAcceptableError, fn ->
|> put_req_header("accept", "text/html") build_conn()
|> get("/.well-known/webfinger?resource=acct:#{user.nickname}@localhost") |> put_req_header("accept", "text/html")
end |> get("/.well-known/webfinger?resource=acct:#{user.nickname}@localhost")
end
end) =~ "no supported media type in accept header"
end end
test "Sends a 400 when resource param is missing" do test "Sends a 400 when resource param is missing" do