Compare commits

...

41 Commits

Author SHA1 Message Date
Josiah Glosson
41cd6feff2 Misc Cargo.lock updates 2025-08-07 22:19:05 -05:00
Josiah Glosson
2443a96d1e Update zip 4.2.0 -> 4.3.0 2025-08-07 22:07:05 -05:00
Josiah Glosson
3962f338a5 Update tracing-actix-web 0.7.18 -> 0.7.19 2025-08-07 22:03:05 -05:00
Josiah Glosson
67e48331a1 Update tokio 1.45.1 -> 1.47.1 and tokio-util 0.7.15 -> 0.7.16 2025-08-07 21:59:41 -05:00
Josiah Glosson
a9bfba9d9e Fix build by updating mappings 2025-08-07 21:53:27 -05:00
Josiah Glosson
26dd65d7a3 Update tauri suite 2025-08-07 21:51:19 -05:00
Josiah Glosson
b1093af893 Update sysinfo 0.35.2 -> 0.36.1 2025-08-07 21:43:55 -05:00
Josiah Glosson
b19ad9ccad Update spdx 0.10.8 -> 0.10.9 2025-08-07 21:40:15 -05:00
Josiah Glosson
4aae445e4a Update serde_with 3.13.0 -> 3.14.0 2025-08-07 21:37:04 -05:00
Josiah Glosson
58f1e2c585 Update serde_json 1.0.140 -> 1.0.142 2025-08-07 21:33:49 -05:00
Josiah Glosson
89901de4bd Update sentry 0.41.0 -> 0.42.0 and sentry-actix 0.41.0 -> 0.42.0 2025-08-07 21:29:18 -05:00
Josiah Glosson
2e18d5023e Update rgb 0.8.50 -> 0.8.52 2025-08-07 21:23:35 -05:00
Josiah Glosson
d501f1fec6 Cargo fmt in theseus 2025-08-07 21:18:52 -05:00
Josiah Glosson
2217da078a Update reqwest 0.12.20 -> 0.12.22 2025-08-07 21:18:18 -05:00
Josiah Glosson
fadbf80093 Fix theseus lint 2025-08-07 21:13:27 -05:00
Josiah Glosson
9cb1ad8024 Update quick-xml 0.37.5 -> 0.38.1 2025-08-07 20:56:28 -05:00
Josiah Glosson
14b27552db Update notify 8.0.0 -> 8.2.0 and notify-debouncer-mini 0.6.0 -> 0.7.0 2025-08-07 20:43:55 -05:00
Josiah Glosson
1c940a0d25 Update meilisearch-sdk 0.28.0 -> 0.29.1 2025-08-07 20:39:22 -05:00
Josiah Glosson
1a7b2f1806 Update lettre 0.11.17 -> 0.11.18 2025-08-07 20:31:17 -05:00
Josiah Glosson
39aea6545d Update jemalloc_pprof 0.7.0 -> 0.8.1 2025-08-07 20:27:40 -05:00
Josiah Glosson
86c408c700 Update indicatif 0.17.11 -> 0.18.0 2025-08-07 20:22:53 -05:00
Josiah Glosson
211cd05750 Update indexmap 2.9.0 -> 2.10.0 2025-08-07 16:51:25 -05:00
Josiah Glosson
b7f0ec3199 Update hyper-util 0.1.14 -> 0.1.16 2025-08-07 16:44:42 -05:00
Josiah Glosson
f1825fb9fa Update enumset 1.1.6 -> 1.1.7 2025-08-07 16:38:08 -05:00
Josiah Glosson
719aba383b Update deadpool-redis 0.21.1 -> 0.22.0 and redis 0.31.0 -> 0.32.4 2025-08-07 16:26:24 -05:00
Josiah Glosson
dc33b4b05c Update clap 4.5.40 -> 4.5.43 2025-08-07 12:31:44 -05:00
Josiah Glosson
97a6e94d32 Update bytemuck 1.23.0 -> 1.23.1 2025-08-07 12:25:53 -05:00
Josiah Glosson
37a10c76ab Update async-tungstenite 0.29.1 -> 0.30.0 2025-08-07 12:21:16 -05:00
Josiah Glosson
3ebdac4df9 Update async-compression 0.4.25 -> 0.4.27 2025-08-07 12:11:38 -05:00
Josiah Glosson
8d16834e39 Update Rust version 2025-08-07 11:59:00 -05:00
Alejandro González
d22c9e24f4 tweak(frontend): improve Nuxt build state generation logging and caching (#4133) 2025-08-06 22:05:33 +00:00
fishstiz
e31197f649 feat(app): pass selected version to incompatibility warning modal (#4115)
Co-authored-by: IMB11 <hendersoncal117@gmail.com>
2025-08-05 11:10:02 +00:00
Emma Alexia
0dee21814d Change "Billing" link on dashboard for admins (#3951)
* Change "Billing" link on dashboard for admins

Requires an archon change before merging

* change order

* steal changes from prospector's old PR

supersedes #3234

Co-authored-by: Prospector <prospectordev@gmail.com>

* lint?

---------

Co-authored-by: Prospector <prospectordev@gmail.com>
2025-08-04 20:13:33 +00:00
Josiah Glosson
0657e4466f Allow direct joining servers on old instances (#4094)
* Implement direct server joining for 1.6.2 through 1.19.4

* Implement direct server joining for versions before 1.6.2

* Ignore methods with a $ in them

* Run intl:extract

* Improve code of MinecraftTransformer

* Support showing last played time for profiles before 1.7

* Reorganize QuickPlayVersion a bit to prepare for singleplayer

* Only inject quick play checking in versions where it's needed

* Optimize agent some and fix error on NeoForge

* Remove some code for quickplay singleplayer support before 1.20, as we can't reasonably support that with an agent

* Invert the default hasServerQuickPlaySupport return value

* Remove Play Anyway button

* Fix "Server couldn't be contacted" on singleplayer worlds

* Fix "Jump back in" section not working
2025-08-04 19:29:20 +00:00
Josiah Glosson
13dbb4c57e Fix most packs showing as "Optimization" on the app homepage (#4119) 2025-08-04 19:21:37 +00:00
Prospector
99493b9917 Updated changelog 2025-08-01 21:31:22 -04:00
IMB11
72a52eb7b1 fix: improve error message for rate limiting (#4101)
Co-authored-by: Prospector <6166773+Prospector@users.noreply.github.com>
2025-08-01 21:27:25 +00:00
IMB11
b33e12c71d fix: startup settings not visible on hard page refresh/direct load (#4100)
* fix: startup settings not visible on hard page refresh/direct load

* refactor: const func => named
2025-08-01 21:22:22 +00:00
IMB11
82d86839c7 fix: approve status incorrect (#4104) 2025-08-01 20:24:40 +00:00
coolbot
3a20e15340 Coolbot/moderation updates aug1 (#4103)
* oop, all commas!

* Only show slug stuff when needed.

* Move status alerts to top of message, getting rid of separators.

* redist libs message altered, and now shows on plugins too

* Update versions.ts

remove unnecessary import

Signed-off-by: coolbot <76798835+coolbot100s@users.noreply.github.com>

* Tweak summary formatting msg

* Update license messages to use flink

* reorder link text to match the settings page

* add Description clarity button

---------

Signed-off-by: coolbot <76798835+coolbot100s@users.noreply.github.com>
2025-08-01 20:21:28 +00:00
jade
1c89b84314 fix(moderation): Replace dead modpack link with a valid one in side-types message (#4095) 2025-07-31 17:50:33 +00:00
133 changed files with 2867 additions and 2166 deletions

.idea/code.iml generated
View File

@@ -10,11 +10,10 @@
 <sourceFolder url="file://$MODULE_DIR$/apps/labrinth/src" isTestSource="false" />
 <sourceFolder url="file://$MODULE_DIR$/apps/labrinth/tests" isTestSource="true" />
 <sourceFolder url="file://$MODULE_DIR$/packages/app-lib/src" isTestSource="false" />
-<sourceFolder url="file://$MODULE_DIR$/packages/rust-common/src" isTestSource="false" />
+<sourceFolder url="file://$MODULE_DIR$/packages/ariadne/src" isTestSource="false" />
 <excludeFolder url="file://$MODULE_DIR$/target" />
 </content>
 <orderEntry type="inheritedJdk" />
 <orderEntry type="sourceFolder" forTests="false" />
 </component>
 </module>
-</module>

Cargo.lock generated

File diff suppressed because it is too large

View File

@@ -25,31 +25,31 @@ actix-ws = "0.3.0"
 argon2 = { version = "0.5.3", features = ["std"] }
 ariadne = { path = "packages/ariadne" }
 async_zip = "0.0.17"
-async-compression = { version = "0.4.25", default-features = false }
+async-compression = { version = "0.4.27", default-features = false }
 async-recursion = "1.1.1"
 async-stripe = { version = "0.41.0", default-features = false, features = [
 "runtime-tokio-hyper-rustls",
 ] }
 async-trait = "0.1.88"
-async-tungstenite = { version = "0.29.1", default-features = false, features = [
+async-tungstenite = { version = "0.30.0", default-features = false, features = [
 "futures-03-sink",
 ] }
 async-walkdir = "2.1.0"
 base64 = "0.22.1"
 bitflags = "2.9.1"
-bytemuck = "1.23.0"
+bytemuck = "1.23.1"
 bytes = "1.10.1"
 censor = "0.3.0"
 chardetng = "0.1.17"
 chrono = "0.4.41"
-clap = "4.5.40"
+clap = "4.5.43"
 clickhouse = "0.13.3"
 color-thief = "0.2.2"
 console-subscriber = "0.4.1"
 daedalus = { path = "packages/daedalus" }
 dashmap = "6.1.0"
 data-url = "0.3.1"
-deadpool-redis = "0.21.1"
+deadpool-redis = "0.22.0"
 dirs = "6.0.0"
 discord-rich-presence = "0.2.5"
 dotenv-build = "0.1.1"
@@ -57,7 +57,7 @@ dotenvy = "0.15.7"
 dunce = "1.0.5"
 either = "1.15.0"
 encoding_rs = "0.8.35"
-enumset = "1.1.6"
+enumset = "1.1.7"
 flate2 = "1.1.2"
 fs4 = { version = "0.13.1", default-features = false }
 futures = { version = "0.3.31", default-features = false }
@@ -74,15 +74,15 @@ hyper-rustls = { version = "0.27.7", default-features = false, features = [
 "ring",
 "tls12",
 ] }
-hyper-util = "0.1.14"
+hyper-util = "0.1.16"
 iana-time-zone = "0.1.63"
 image = { version = "0.25.6", default-features = false, features = ["rayon"] }
-indexmap = "2.9.0"
-indicatif = "0.17.11"
+indexmap = "2.10.0"
+indicatif = "0.18.0"
 itertools = "0.14.0"
-jemalloc_pprof = "0.7.0"
+jemalloc_pprof = "0.8.1"
 json-patch = { version = "4.0.0", default-features = false }
-lettre = { version = "0.11.17", default-features = false, features = [
+lettre = { version = "0.11.18", default-features = false, features = [
 "builder",
 "hostname",
 "pool",
@@ -92,24 +92,24 @@ lettre = { version = "0.11.17", default-features = false, features = [
 "smtp-transport",
 ] }
 maxminddb = "0.26.0"
-meilisearch-sdk = { version = "0.28.0", default-features = false }
+meilisearch-sdk = { version = "0.29.1", default-features = false }
 murmur2 = "0.1.0"
 native-dialog = "0.9.0"
-notify = { version = "8.0.0", default-features = false }
-notify-debouncer-mini = { version = "0.6.0", default-features = false }
+notify = { version = "8.2.0", default-features = false }
+notify-debouncer-mini = { version = "0.7.0", default-features = false }
 p256 = "0.13.2"
 paste = "1.0.15"
 phf = { version = "0.12.1", features = ["macros"] }
 png = "0.17.16"
 prometheus = "0.14.0"
 quartz_nbt = "0.2.9"
-quick-xml = "0.37.5"
+quick-xml = "0.38.1"
 rand = "=0.8.5" # Locked on 0.8 until argon2 and p256 update to 0.9
 rand_chacha = "=0.3.1" # Locked on 0.3 until we can update rand to 0.9
-redis = "=0.31.0" # Locked on 0.31 until deadpool-redis updates to 0.32
+redis = "0.32.4"
 regex = "1.11.1"
-reqwest = { version = "0.12.20", default-features = false }
-rgb = "0.8.50"
+reqwest = { version = "0.12.22", default-features = false }
+rgb = "0.8.52"
 rust_decimal = { version = "1.37.2", features = [
 "serde-with-float",
 "serde-with-str",
@@ -121,7 +121,7 @@ rust-s3 = { version = "0.35.1", default-features = false, features = [
 "tokio-rustls-tls",
 ] }
 rusty-money = "0.4.1"
-sentry = { version = "0.41.0", default-features = false, features = [
+sentry = { version = "0.42.0", default-features = false, features = [
 "backtrace",
 "contexts",
 "debug-images",
@@ -129,45 +129,45 @@ sentry = { version = "0.41.0", default-features = false, features = [
 "reqwest",
 "rustls",
 ] }
-sentry-actix = "0.41.0"
+sentry-actix = "0.42.0"
 serde = "1.0.219"
 serde_bytes = "0.11.17"
 serde_cbor = "0.11.2"
 serde_ini = "0.2.0"
-serde_json = "1.0.140"
-serde_with = "3.13.0"
+serde_json = "1.0.142"
+serde_with = "3.14.0"
 serde-xml-rs = "0.8.1" # Also an XML (de)serializer, consider dropping yaserde in favor of this
 sha1 = "0.10.6"
 sha1_smol = { version = "1.0.1", features = ["std"] }
 sha2 = "0.10.9"
-spdx = "0.10.8"
+spdx = "0.10.9"
 sqlx = { version = "0.8.6", default-features = false }
-sysinfo = { version = "0.35.2", default-features = false }
+sysinfo = { version = "0.36.1", default-features = false }
 tar = "0.4.44"
-tauri = "2.6.1"
-tauri-build = "2.3.0"
-tauri-plugin-deep-link = "2.4.0"
-tauri-plugin-dialog = "2.3.0"
-tauri-plugin-http = "2.5.0"
+tauri = "2.7.0"
+tauri-build = "2.3.1"
+tauri-plugin-deep-link = "2.4.1"
+tauri-plugin-dialog = "2.3.2"
+tauri-plugin-http = "2.5.1"
 tauri-plugin-opener = "2.4.0"
 tauri-plugin-os = "2.3.0"
-tauri-plugin-single-instance = "2.3.0"
+tauri-plugin-single-instance = "2.3.2"
 tauri-plugin-updater = { version = "2.9.0", default-features = false, features = [
 "rustls-tls",
 "zip",
 ] }
-tauri-plugin-window-state = "2.3.0"
+tauri-plugin-window-state = "2.4.0"
 tempfile = "3.20.0"
 theseus = { path = "packages/app-lib" }
 thiserror = "2.0.12"
 tikv-jemalloc-ctl = "0.6.0"
 tikv-jemallocator = "0.6.0"
-tokio = "1.45.1"
+tokio = "1.47.1"
 tokio-stream = "0.1.17"
-tokio-util = "0.7.15"
+tokio-util = "0.7.16"
 totp-rs = "5.7.0"
 tracing = "0.1.41"
-tracing-actix-web = "0.7.18"
+tracing-actix-web = "0.7.19"
 tracing-error = "0.2.1"
 tracing-subscriber = "0.3.19"
 url = "2.5.4"
@@ -179,7 +179,7 @@ whoami = "1.6.0"
 winreg = "0.55.0"
 woothee = "0.13.0"
 yaserde = "0.12.0"
-zip = { version = "4.2.0", default-features = false, features = [
+zip = { version = "4.3.0", default-features = false, features = [
 "bzip2",
 "deflate",
 "deflate64",
@@ -226,7 +226,7 @@ wildcard_dependencies = "warn"
 warnings = "deny"

 [patch.crates-io]
-wry = { git = "https://github.com/modrinth/wry", rev = "21db186" }
+wry = { git = "https://github.com/modrinth/wry", rev = "f2ce0b0" }

 # Optimize for speed and reduce size on release builds
 [profile.release]

View File

@@ -9,7 +9,7 @@
 "tsc:check": "vue-tsc --noEmit",
 "lint": "eslint . && prettier --check .",
 "fix": "eslint . --fix && prettier --write .",
-"intl:extract": "formatjs extract \"{,src/components,src/composables,src/helpers,src/pages,src/store}/**/*.{vue,ts,tsx,js,jsx,mts,cts,mjs,cjs}\" --ignore '**/*.d.ts' --ignore 'node_modules' --out-file src/locales/en-US/index.json --format crowdin --preserve-whitespace",
+"intl:extract": "formatjs extract \"src/**/*.{vue,ts,tsx,js,jsx,mts,cts,mjs,cjs}\" --ignore \"**/*.d.ts\" --ignore node_modules --out-file src/locales/en-US/index.json --format crowdin --preserve-whitespace",
 "test": "vue-tsc --noEmit"
 },
 "dependencies": {

View File

@@ -21,14 +21,11 @@ const props = defineProps({
 })

 const featuredCategory = computed(() => {
-  if (props.project.categories.includes('optimization')) {
+  if (props.project.display_categories.includes('optimization')) {
     return 'optimization'
   }
-  if (props.project.categories.length > 0) {
-    return props.project.categories[0]
-  }
-  return undefined
+  return props.project.display_categories[0] ?? props.project.categories[0]
 })

 const toColor = computed(() => {
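The rewritten computed above is the whole fix for packs showing as "Optimization" on the app homepage: it now keys off `display_categories` and only falls back to `categories` via `??`. A standalone sketch of that selection logic (the `Project` shape and sample data here are illustrative, not Modrinth's real API types):

```typescript
// Illustrative project shape; the real type comes from Modrinth's API.
type Project = { categories: string[]; display_categories: string[] };

// Mirrors the fixed computed: prefer the author-chosen display categories,
// so a pack that merely inherits "optimization" from its mods no longer
// shows as "Optimization".
function featuredCategory(project: Project): string | undefined {
  if (project.display_categories.includes("optimization")) {
    return "optimization";
  }
  return project.display_categories[0] ?? project.categories[0];
}

const pack: Project = {
  categories: ["optimization", "adventure"], // includes inherited categories
  display_categories: ["adventure"], // what the author chose to feature
};
```

With the old logic, `pack` would have been labeled "optimization"; with the fallback chain it is labeled "adventure".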

View File

@@ -76,10 +76,10 @@ const installing = ref(false)
 const onInstall = ref(() => {})

 defineExpose({
-  show: (instanceVal, projectVal, projectVersions, callback) => {
+  show: (instanceVal, projectVal, projectVersions, selected, callback) => {
     instance.value = instanceVal
     versions.value = projectVersions
-    selectedVersion.value = projectVersions[0]
+    selectedVersion.value = selected ?? projectVersions[0]
     project.value = projectVal

View File

@@ -6,9 +6,8 @@ import type {
   ServerWorld,
   SingleplayerWorld,
   World,
-  set_world_display_status,
-  getWorldIdentifier,
 } from '@/helpers/worlds.ts'
+import { set_world_display_status, getWorldIdentifier } from '@/helpers/worlds.ts'
 import { formatNumber, getPingLevel } from '@modrinth/utils'
 import {
   useRelativeTime,
@@ -61,7 +60,8 @@
     playingInstance?: boolean
     playingWorld?: boolean
     startingInstance?: boolean
-    supportsQuickPlay?: boolean
+    supportsServerQuickPlay?: boolean
+    supportsWorldQuickPlay?: boolean
     currentProtocol?: ProtocolVersion | null
     highlighted?: boolean
@@ -85,7 +85,8 @@
     playingInstance: false,
     playingWorld: false,
     startingInstance: false,
-    supportsQuickPlay: false,
+    supportsServerQuickPlay: true,
+    supportsWorldQuickPlay: false,
     currentProtocol: null,
     refreshing: false,
@@ -128,9 +129,13 @@ const messages = defineMessages({
     id: 'instance.worlds.a_minecraft_server',
     defaultMessage: 'A Minecraft Server',
   },
-  noQuickPlay: {
-    id: 'instance.worlds.no_quick_play',
-    defaultMessage: 'You can only jump straight into worlds on Minecraft 1.20+',
+  noServerQuickPlay: {
+    id: 'instance.worlds.no_server_quick_play',
+    defaultMessage: 'You can only jump straight into servers on Minecraft Alpha 1.0.5+',
+  },
+  noSingleplayerQuickPlay: {
+    id: 'instance.worlds.no_singleplayer_quick_play',
+    defaultMessage: 'You can only jump straight into singleplayer worlds on Minecraft 1.20+',
   },
   gameAlreadyOpen: {
     id: 'instance.worlds.game_already_open',
@@ -152,10 +157,6 @@
     id: 'instance.worlds.view_instance',
     defaultMessage: 'View instance',
   },
-  playAnyway: {
-    id: 'instance.worlds.play_anyway',
-    defaultMessage: 'Play anyway',
-  },
   playInstance: {
     id: 'instance.worlds.play_instance',
     defaultMessage: 'Play instance',
@@ -330,17 +331,24 @@
 <ButtonStyled v-else>
   <button
     v-tooltip="
-      !serverStatus
-        ? formatMessage(messages.noContact)
-        : serverIncompatible
-          ? formatMessage(messages.incompatibleServer)
-          : !supportsQuickPlay
-            ? formatMessage(messages.noQuickPlay)
-            : playingOtherWorld || locked
-              ? formatMessage(messages.gameAlreadyOpen)
-              : null
+      world.type == 'server' && !supportsServerQuickPlay
+        ? formatMessage(messages.noServerQuickPlay)
+        : world.type == 'singleplayer' && !supportsWorldQuickPlay
+          ? formatMessage(messages.noSingleplayerQuickPlay)
+          : playingOtherWorld || locked
+            ? formatMessage(messages.gameAlreadyOpen)
+            : !serverStatus
+              ? formatMessage(messages.noContact)
+              : serverIncompatible
+                ? formatMessage(messages.incompatibleServer)
+                : null
     "
-    :disabled="!supportsQuickPlay || playingOtherWorld || startingInstance"
+    :disabled="
+      playingOtherWorld ||
+      startingInstance ||
+      (world.type == 'server' && !supportsServerQuickPlay) ||
+      (world.type == 'singleplayer' && !supportsWorldQuickPlay)
+    "
     @click="emit('play')"
   >
     <SpinnerIcon v-if="startingInstance && playingWorld" class="animate-spin" />
@@ -357,11 +365,6 @@
     disabled: playingInstance,
     action: () => emit('play-instance'),
   },
-  {
-    id: 'play-anyway',
-    shown: serverIncompatible && !playingInstance && supportsQuickPlay,
-    action: () => emit('play'),
-  },
   {
     id: 'open-instance',
     shown: !!instancePath,
@@ -427,10 +430,6 @@
   <PlayIcon aria-hidden="true" />
   {{ formatMessage(messages.playInstance) }}
 </template>
-<template #play-anyway>
-  <PlayIcon aria-hidden="true" />
-  {{ formatMessage(messages.playAnyway) }}
-</template>
 <template #open-instance>
   <EyeIcon aria-hidden="true" />
   {{ formatMessage(messages.viewInstance) }}

View File

@@ -311,15 +311,24 @@ export async function refreshWorlds(instancePath: string): Promise<World[]> {
   return worlds ?? []
 }

-const FIRST_QUICK_PLAY_VERSION = '23w14a'
+export function hasServerQuickPlaySupport(gameVersions: GameVersion[], currentVersion: string) {
+  if (!gameVersions.length) {
+    return true
+  }
+  const versionIndex = gameVersions.findIndex((v) => v.version === currentVersion)
+  const targetIndex = gameVersions.findIndex((v) => v.version === 'a1.0.5_01')
+  return versionIndex === -1 || targetIndex === -1 || versionIndex <= targetIndex
+}

-export function hasQuickPlaySupport(gameVersions: GameVersion[], currentVersion: string) {
+export function hasWorldQuickPlaySupport(gameVersions: GameVersion[], currentVersion: string) {
+  if (!gameVersions.length) {
+    return false
+  }
   const versionIndex = gameVersions.findIndex((v) => v.version === currentVersion)
-  const targetIndex = gameVersions.findIndex((v) => v.version === FIRST_QUICK_PLAY_VERSION)
+  const targetIndex = gameVersions.findIndex((v) => v.version === '23w14a')
   return versionIndex !== -1 && targetIndex !== -1 && versionIndex <= targetIndex
 }
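The two helpers added above share one idea: the game-version list is ordered newest-first, so "supports feature X" reduces to comparing the index of the current version against the index of a cutoff version. A standalone sketch of the world-side check (the version list here is illustrative, truncated from the real manifest):

```typescript
type GameVersion = { version: string };

// Newest-first, like the version manifest the launcher fetches.
const versions: GameVersion[] = [
  { version: "1.21" },
  { version: "23w14a" }, // first snapshot with singleplayer Quick Play
  { version: "1.19.4" },
  { version: "a1.0.5_01" }, // first version that can join servers
  { version: "a1.0.4" },
];

// Mirrors hasWorldQuickPlaySupport: both versions must be known, and the
// current version must sit at or above (i.e. newer than) the cutoff.
function supportsWorldQuickPlay(list: GameVersion[], current: string): boolean {
  if (!list.length) {
    return false;
  }
  const versionIndex = list.findIndex((v) => v.version === current);
  const targetIndex = list.findIndex((v) => v.version === "23w14a");
  return versionIndex !== -1 && targetIndex !== -1 && versionIndex <= targetIndex;
}
```

Note the asymmetric defaults in the real diff: the server check returns `true` when a version is unknown (almost every version can join servers), while the world check returns `false` (Quick Play into a world is the rarer capability).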

View File

@@ -383,11 +383,11 @@
   "instance.worlds.no_contact": {
     "message": "Server couldn't be contacted"
   },
-  "instance.worlds.no_quick_play": {
-    "message": "You can only jump straight into worlds on Minecraft 1.20+"
+  "instance.worlds.no_server_quick_play": {
+    "message": "You can only jump straight into servers on Minecraft Alpha 1.0.5+"
   },
-  "instance.worlds.play_anyway": {
-    "message": "Play anyway"
+  "instance.worlds.no_singleplayer_quick_play": {
+    "message": "You can only jump straight into singleplayer worlds on Minecraft 1.20+"
   },
   "instance.worlds.play_instance": {
     "message": "Play instance"

View File

@@ -67,7 +67,8 @@
   :key="`world-${world.type}-${world.type == 'singleplayer' ? world.path : `${world.address}-${world.index}`}`"
   :world="world"
   :highlighted="highlightedWorld === getWorldIdentifier(world)"
-  :supports-quick-play="supportsQuickPlay"
+  :supports-server-quick-play="supportsServerQuickPlay"
+  :supports-world-quick-play="supportsWorldQuickPlay"
   :current-protocol="protocolVersion"
   :playing-instance="playing"
   :playing-world="worldsMatch(world, worldPlaying)"
@@ -150,10 +151,11 @@ import {
   refreshWorld,
   sortWorlds,
   refreshServers,
-  hasQuickPlaySupport,
+  hasWorldQuickPlaySupport,
   refreshWorlds,
   handleDefaultProfileUpdateEvent,
   showWorldInFolder,
+  hasServerQuickPlaySupport,
 } from '@/helpers/worlds.ts'
 import AddServerModal from '@/components/ui/world/modal/AddServerModal.vue'
 import EditServerModal from '@/components/ui/world/modal/EditServerModal.vue'
@@ -355,8 +357,11 @@ function worldsMatch(world: World, other: World | undefined) {
 }

 const gameVersions = ref<GameVersion[]>(await get_game_versions().catch(() => []))

-const supportsQuickPlay = computed(() =>
-  hasQuickPlaySupport(gameVersions.value, instance.value.game_version),
+const supportsServerQuickPlay = computed(() =>
+  hasServerQuickPlaySupport(gameVersions.value, instance.value.game_version),
 )
+const supportsWorldQuickPlay = computed(() =>
+  hasWorldQuickPlaySupport(gameVersions.value, instance.value.game_version),
+)

 const filterOptions = computed(() => {

View File

@@ -29,8 +29,8 @@ export const useInstall = defineStore('installStore', {
   setIncompatibilityWarningModal(ref) {
     this.incompatibilityWarningModal = ref
   },
-  showIncompatibilityWarningModal(instance, project, versions, onInstall) {
-    this.incompatibilityWarningModal.show(instance, project, versions, onInstall)
+  showIncompatibilityWarningModal(instance, project, versions, selected, onInstall) {
+    this.incompatibilityWarningModal.show(instance, project, versions, selected, onInstall)
   },
   setModInstallModal(ref) {
     this.modInstallModal = ref
@@ -133,7 +133,13 @@ export const install = async (
   callback(version.id)
 } else {
   const install = useInstall()
-  install.showIncompatibilityWarningModal(instance, project, projectVersions, callback)
+  install.showIncompatibilityWarningModal(
+    instance,
+    project,
+    projectVersions,
+    version,
+    callback,
+  )
 }
 } else {
 const versions = (await get_version_many(project.versions).catch(handleError)).sort(

View File

@@ -197,15 +197,13 @@ pub async fn open_link<R: Runtime>(
 if url::Url::parse(&path).is_ok()
     && !state.malicious_origins.contains(&origin)
+    && let Some(last_click) = state.last_click
+    && last_click.elapsed() < Duration::from_millis(100)
 {
-    if let Some(last_click) = state.last_click {
-        if last_click.elapsed() < Duration::from_millis(100) {
-            let _ = app.opener().open_url(&path, None::<String>);
-            state.last_click = None;
-            return Ok(());
-        }
-    }
+    let _ = app.opener().open_url(&path, None::<String>);
+    state.last_click = None;
     return Ok(());
 }

 tracing::info!("Malicious click: {path} origin {origin}");

View File

@@ -59,16 +59,13 @@ pub async fn login<R: Runtime>(
     .url()?
     .as_str()
     .starts_with("https://login.live.com/oauth20_desktop.srf")
+    && let Some((_, code)) =
+        window.url()?.query_pairs().find(|x| x.0 == "code")
 {
-    if let Some((_, code)) =
-        window.url()?.query_pairs().find(|x| x.0 == "code")
-    {
-        window.close()?;
-        let val =
-            minecraft_auth::finish_login(&code.clone(), flow).await?;
-        return Ok(Some(val));
-    }
+    window.close()?;
+    let val = minecraft_auth::finish_login(&code.clone(), flow).await?;
+    return Ok(Some(val));
 }

 tokio::time::sleep(std::time::Duration::from_millis(50)).await;

View File

@@ -250,7 +250,7 @@ pub async fn profile_get_pack_export_candidates(
 // invoke('plugin:profile|profile_run', path)
 #[tauri::command]
 pub async fn profile_run(path: &str) -> Result<ProcessMetadata> {
-    let process = profile::run(path, &QuickPlayType::None).await?;
+    let process = profile::run(path, QuickPlayType::None).await?;
     Ok(process)
 }

View File

@@ -63,11 +63,11 @@ pub async fn should_disable_mouseover() -> bool {
     // We try to match version to 12.2 or higher. If unrecognizable to pattern or lower, we default to the css with disabled mouseover for safety
     if let tauri_plugin_os::Version::Semantic(major, minor, _) =
         tauri_plugin_os::version()
+        && major >= 12
+        && minor >= 3
     {
-        if major >= 12 && minor >= 3 {
-            // Mac os version is 12.3 or higher, we allow mouseover
-            return false;
-        }
+        // Mac os version is 12.3 or higher, we allow mouseover
+        return false;
     }
     true
 } else {

View File

@@ -4,6 +4,7 @@ use enumset::EnumSet;
 use tauri::{AppHandle, Manager, Runtime};
 use theseus::prelude::ProcessMetadata;
 use theseus::profile::{QuickPlayType, get_full_path};
+use theseus::server_address::ServerAddress;
 use theseus::worlds::{
     DisplayStatus, ProtocolVersion, ServerPackStatus, ServerStatus, World,
     WorldType, WorldWithProfile,
@@ -203,7 +204,7 @@ pub async fn start_join_singleplayer_world(
     world: String,
 ) -> Result<ProcessMetadata> {
     let process =
-        profile::run(path, &QuickPlayType::Singleplayer(world)).await?;
+        profile::run(path, QuickPlayType::Singleplayer(world)).await?;
     Ok(process)
 }
@@ -213,8 +214,11 @@ pub async fn start_join_server(
     path: &str,
     address: &str,
 ) -> Result<ProcessMetadata> {
-    let process =
-        profile::run(path, &QuickPlayType::Server(address.to_owned())).await?;
+    let process = profile::run(
+        path,
+        QuickPlayType::Server(ServerAddress::Unresolved(address.to_owned())),
+    )
+    .await?;
     Ok(process)
 }

View File

@@ -233,10 +233,10 @@ fn main() {
     });

     #[cfg(not(target_os = "linux"))]
-    if let Some(window) = app.get_window("main") {
-        if let Err(e) = window.set_shadow(true) {
-            tracing::warn!("Failed to set window shadow: {e}");
-        }
+    if let Some(window) = app.get_window("main")
+        && let Err(e) = window.set_shadow(true)
+    {
+        tracing::warn!("Failed to set window shadow: {e}");
     }

     Ok(())

View File

@@ -506,27 +506,25 @@ async fn fetch(
         return Ok(lib);
     }
-} else if let Some(url) = &lib.url {
-    if !url.is_empty() {
-        insert_mirrored_artifact(
-            &lib.name,
-            None,
-            vec![
-                url.clone(),
-                "https://libraries.minecraft.net/"
-                    .to_string(),
-                "https://maven.creeperhost.net/"
-                    .to_string(),
-                maven_url.to_string(),
-            ],
-            false,
-            mirror_artifacts,
-        )?;
+} else if let Some(url) = &lib.url
+    && !url.is_empty()
+{
+    insert_mirrored_artifact(
+        &lib.name,
+        None,
+        vec![
+            url.clone(),
+            "https://libraries.minecraft.net/".to_string(),
+            "https://maven.creeperhost.net/".to_string(),
+            maven_url.to_string(),
+        ],
+        false,
+        mirror_artifacts,
+    )?;

-        lib.url = Some(format_url("maven/"));
-        return Ok(lib);
-    }
+    lib.url = Some(format_url("maven/"));
+    return Ok(lib);
 }

 // Other libraries are generally available in the "maven" directory of the installer. If they are

View File

@@ -93,22 +93,22 @@ async fn main() -> Result<()> {
         .ok()
         .and_then(|x| x.parse::<bool>().ok())
         .unwrap_or(false)
+        && let Ok(token) = dotenvy::var("CLOUDFLARE_TOKEN")
+        && let Ok(zone_id) = dotenvy::var("CLOUDFLARE_ZONE_ID")
     {
-        if let Ok(token) = dotenvy::var("CLOUDFLARE_TOKEN") {
-            if let Ok(zone_id) = dotenvy::var("CLOUDFLARE_ZONE_ID") {
-                let cache_clears = upload_files
-                    .into_iter()
-                    .map(|x| format_url(&x.0))
-                    .chain(
-                        mirror_artifacts
-                            .into_iter()
-                            .map(|x| format_url(&format!("maven/{}", x.0))),
-                    )
-                    .collect::<Vec<_>>();
+        let cache_clears = upload_files
+            .into_iter()
+            .map(|x| format_url(&x.0))
+            .chain(
+                mirror_artifacts
+                    .into_iter()
+                    .map(|x| format_url(&format!("maven/{}", x.0))),
+            )
+            .collect::<Vec<_>>();

-                // Cloudflare ratelimits cache clears to 500 files per request
-                for chunk in cache_clears.chunks(500) {
-                    REQWEST_CLIENT.post(format!("https://api.cloudflare.com/client/v4/zones/{zone_id}/purge_cache"))
+        // Cloudflare ratelimits cache clears to 500 files per request
+        for chunk in cache_clears.chunks(500) {
+            REQWEST_CLIENT.post(format!("https://api.cloudflare.com/client/v4/zones/{zone_id}/purge_cache"))
         .bearer_auth(&token)
         .json(&serde_json::json!({
             "files": chunk
@@ -128,8 +128,6 @@
                 item: "cloudflare clear cache".to_string(),
             }
         })?;
-            }
-        }
     }
 }

View File

@@ -167,20 +167,18 @@ pub async fn download_file(
     let bytes = x.bytes().await;
     if let Ok(bytes) = bytes {
-        if let Some(sha1) = sha1 {
-            if &*sha1_async(bytes.clone()).await? != sha1 {
-                if attempt <= 3 {
-                    continue;
-                } else {
-                    return Err(
-                        crate::ErrorKind::ChecksumFailure {
-                            hash: sha1.to_string(),
-                            url: url.to_string(),
-                            tries: attempt,
-                        }
-                        .into(),
-                    );
-                }
+        if let Some(sha1) = sha1
+            && &*sha1_async(bytes.clone()).await? != sha1
+        {
+            if attempt <= 3 {
+                continue;
+            } else {
+                return Err(crate::ErrorKind::ChecksumFailure {
+                    hash: sha1.to_string(),
+                    url: url.to_string(),
+                    tries: attempt,
+                }
+                .into());
             }
         }

View File

@@ -143,8 +143,13 @@ export default defineNuxtConfig({
   state.lastGenerated &&
   new Date(state.lastGenerated).getTime() + TTL > new Date().getTime() &&
   // ...but only if the API URL is the same
-  state.apiUrl === API_URL
+  state.apiUrl === API_URL &&
+  // ...and if no errors were caught during the last generation
+  (state.errors ?? []).length === 0
 ) {
+  console.log(
+    "Tags already recently generated. Delete apps/frontend/generated/state.json to force regeneration.",
+  );
   return;
 }
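The Nuxt build-state change above adds one more condition to the cache-reuse check: stale state, a different API URL, or a previous generation that recorded errors all force regeneration. A minimal sketch of that predicate (the `GenerationState` shape and `TTL` value are illustrative assumptions based on the excerpt):

```typescript
// Illustrative shape of the persisted generation state (state.json).
type GenerationState = {
  lastGenerated?: string;
  apiUrl?: string;
  errors?: string[];
};

const TTL = 60 * 60 * 1000; // assumed one hour, for illustration

// Reuse the cached state only if it is fresh, was generated against the
// same API URL, and the previous run caught no errors.
function canReuseState(state: GenerationState, apiUrl: string, now: number): boolean {
  return (
    !!state.lastGenerated &&
    new Date(state.lastGenerated).getTime() + TTL > now &&
    state.apiUrl === apiUrl &&
    (state.errors ?? []).length === 0
  );
}
```

The `(state.errors ?? []).length === 0` guard is what keeps a partially failed generation from being cached forever, which is the bug this commit's logging/caching tweak addresses.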

View File

@@ -259,7 +259,7 @@
   </button>
 </ButtonStyled>
 <ButtonStyled color="green">
-  <button @click="sendMessage('approved')">
+  <button @click="sendMessage(project.requested_status ?? 'approved')">
     <CheckIcon aria-hidden="true" />
     Approve
   </button>
@@ -355,6 +355,7 @@ import {
   renderHighlightedString,
   type ModerationJudgements,
   type ModerationModpackItem,
+  type ProjectStatus,
 } from "@modrinth/utils";
 import { computedAsync, useLocalStorage } from "@vueuse/core";
 import {
@@ -527,7 +528,7 @@ function handleKeybinds(event: KeyboardEvent) {
   tryResetProgress: resetProgress,
   tryExitModeration: () => emit("exit"),
-  tryApprove: () => sendMessage("approved"),
+  tryApprove: () => sendMessage(props.project.requested_status),
   tryReject: () => sendMessage("rejected"),
   tryWithhold: () => sendMessage("withheld"),
   tryEditMessage: goBackToStages,
@@ -1208,7 +1209,7 @@ function generateModpackMessage(allFiles: {
 }

 const hasNextProject = ref(false);

-async function sendMessage(status: "approved" | "rejected" | "withheld") {
+async function sendMessage(status: ProjectStatus) {
   try {
     await useBaseFetch(`project/${props.project.id}`, {
       method: "PATCH",

View File

@@ -2,7 +2,10 @@
 <div class="static w-full grid-cols-1 md:relative md:flex">
   <div class="static h-full flex-col pb-4 md:flex md:pb-0 md:pr-4">
     <div class="z-10 flex select-none flex-col gap-2 rounded-2xl bg-bg-raised p-4 md:w-[16rem]">
-      <div v-for="link in navLinks" :key="link.label">
+      <div
+        v-for="link in navLinks.filter((x) => x.shown === undefined || x.shown)"
+        :key="link.label"
+      >
         <NuxtLink
           :to="link.href"
           class="flex items-center gap-2 rounded-xl p-2 hover:bg-button-bg"
@@ -40,7 +43,7 @@ import { ModrinthServer } from "~/composables/servers/modrinth-servers.ts";
 const emit = defineEmits(["reinstall"]);

 defineProps<{
-  navLinks: { label: string; href: string; icon: Component; external?: boolean }[];
+  navLinks: { label: string; href: string; icon: Component; external?: boolean; shown?: boolean }[];
   route: RouteLocationNormalized;
   server: ModrinthServer;
   backupInProgress?: BackupInProgressReason;
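The `shown?: boolean` flag added to the nav-link type above is deliberately optional: links that never set it stay visible, so only the new owner-gated and admin-gated entries opt in to filtering. A small sketch of that filter (the `NavLink` shape is trimmed to the relevant fields):

```typescript
// Trimmed nav-link shape; `shown` is optional so existing links that
// never set it keep appearing (undefined counts as visible).
type NavLink = { label: string; href: string; shown?: boolean };

function visibleLinks(links: NavLink[]): NavLink[] {
  return links.filter((x) => x.shown === undefined || x.shown);
}

const links: NavLink[] = [
  { label: "General", href: "/options" },
  { label: "Billing", href: "/billing", shown: false }, // non-owner viewing
  { label: "Admin Billing", href: "/admin", shown: true },
];
```

This keeps the change backward-compatible: pages that pass plain link arrays without `shown` render exactly as before.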

View File

@@ -6,6 +6,7 @@ import { ServerModule } from "./base.ts";
 export class GeneralModule extends ServerModule implements ServerGeneral {
   server_id!: string;
   name!: string;
+  owner_id!: string;
   net!: { ip: string; port: number; domain: string };
   game!: string;
   backup_quota!: number;

View File

@@ -147,7 +147,7 @@ export async function useServersFetch<T>(
   404: "Not Found",
   405: "Method Not Allowed",
   408: "Request Timeout",
-  429: "Too Many Requests",
+  429: "You're making requests too quickly. Please wait a moment and try again.",
   500: "Internal Server Error",
   502: "Bad Gateway",
   503: "Service Unavailable",
@@ -167,11 +167,17 @@
   console.error("Fetch error:", error);
   const fetchError = new ModrinthServersFetchError(
-    `[Modrinth Servers] ${message}`,
+    `[Modrinth Servers] ${error.message}`,
     statusCode,
     error,
   );
-  throw new ModrinthServerError(error.message, statusCode, fetchError, module, v1Error);
+  throw new ModrinthServerError(
+    `[Modrinth Servers] ${message}`,
+    statusCode,
+    fetchError,
+    module,
+    v1Error,
+  );
 }

 const baseDelay = statusCode && statusCode >= 500 ? 5000 : 1000;
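The `baseDelay` line above picks a slower retry base for server-side (5xx) failures than for everything else. How that base feeds into the actual retry schedule is outside the excerpt, so the attempt-based multiplier below is a hypothetical illustration, not the real implementation:

```typescript
// Status-dependent base delay, as in the diff above: 5xx errors back off
// harder since the server is likely struggling.
// The exponential attempt multiplier is an assumed illustration only.
function retryDelay(statusCode: number | undefined, attempt: number): number {
  const baseDelay = statusCode && statusCode >= 500 ? 5000 : 1000;
  return baseDelay * Math.pow(2, attempt);
}
```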

View File

@@ -16,12 +16,15 @@ import {
   CardIcon,
   UserIcon,
   WrenchIcon,
+  ModrinthIcon,
 } from "@modrinth/assets";
+import { isAdmin as isUserAdmin, type User } from "@modrinth/utils";
 import { ModrinthServer } from "~/composables/servers/modrinth-servers.ts";
 import type { BackupInProgressReason } from "~/pages/servers/manage/[id].vue";

 const route = useRoute();
 const serverId = route.params.id as string;
+const auth = await useAuth();

 const props = defineProps<{
   server: ModrinthServer;
@@ -32,7 +35,11 @@ useHead({
   title: `Options - ${props.server.general?.name ?? "Server"} - Modrinth`,
 });

-const navLinks = [
+const ownerId = computed(() => props.server.general?.owner_id ?? "Ghost");
+const isOwner = computed(() => (auth.value?.user as User | null)?.id === ownerId.value);
+const isAdmin = computed(() => isUserAdmin(auth.value?.user));
+
+const navLinks = computed(() => [
   { icon: SettingsIcon, label: "General", href: `/servers/manage/${serverId}/options` },
   { icon: WrenchIcon, label: "Platform", href: `/servers/manage/${serverId}/options/loader` },
   { icon: TextQuoteIcon, label: "Startup", href: `/servers/manage/${serverId}/options/startup` },
@@ -48,7 +55,15 @@ const navLinks = [
label: "Billing",
href: `/settings/billing#server-${serverId}`,
external: true,
shown: isOwner.value,
},
{
icon: ModrinthIcon,
label: "Admin Billing",
href: `/admin/billing/${ownerId.value}`,
external: true,
shown: isAdmin.value,
},
{ icon: InfoIcon, label: "Info", href: `/servers/manage/${serverId}/options/info` },
];
]);
</script>


@@ -42,7 +42,7 @@
</label>
<ButtonStyled>
<button
:disabled="invocation === startupSettings?.original_invocation"
:disabled="invocation === originalInvocation"
class="!w-full sm:!w-auto"
@click="resetToDefault"
>
@@ -120,8 +120,9 @@ const props = defineProps<{
server: ModrinthServer;
}>();
await props.server.startup.fetch();
const data = computed(() => props.server.general);
const startupSettings = computed(() => props.server.startup);
const showAllVersions = ref(false);
const jdkVersionMap = [
@@ -137,33 +138,15 @@ const jdkBuildMap = [
{ value: "graal", label: "GraalVM" },
];
const invocation = ref("");
const jdkVersion = ref("");
const jdkBuild = ref("");
const originalInvocation = ref("");
const originalJdkVersion = ref("");
const originalJdkBuild = ref("");
watch(
startupSettings,
(newSettings) => {
if (newSettings) {
invocation.value = newSettings.invocation;
originalInvocation.value = newSettings.invocation;
const jdkVersionLabel =
jdkVersionMap.find((v) => v.value === newSettings.jdk_version)?.label || "";
jdkVersion.value = jdkVersionLabel;
originalJdkVersion.value = jdkVersionLabel;
const jdkBuildLabel = jdkBuildMap.find((v) => v.value === newSettings.jdk_build)?.label || "";
jdkBuild.value = jdkBuildLabel;
originalJdkBuild.value = jdkBuildLabel;
}
},
{ immediate: true },
const invocation = ref(props.server.startup.invocation);
const jdkVersion = ref(
jdkVersionMap.find((v) => v.value === props.server.startup.jdk_version)?.label,
);
const jdkBuild = ref(jdkBuildMap.find((v) => v.value === props.server.startup.jdk_build)?.label);
const originalInvocation = ref(invocation.value);
const originalJdkVersion = ref(jdkVersion.value);
const originalJdkBuild = ref(jdkBuild.value);
const hasUnsavedChanges = computed(
() =>
@@ -195,7 +178,7 @@ const displayedJavaVersions = computed(() => {
return showAllVersions.value ? jdkVersionMap.map((v) => v.label) : compatibleJavaVersions.value;
});
const saveStartup = async () => {
async function saveStartup() {
try {
isUpdating.value = true;
const invocationValue = invocation.value ?? "";
@@ -232,17 +215,17 @@ const saveStartup = async () => {
} finally {
isUpdating.value = false;
}
};
}
const resetStartup = () => {
function resetStartup() {
invocation.value = originalInvocation.value;
jdkVersion.value = originalJdkVersion.value;
jdkBuild.value = originalJdkBuild.value;
};
}
const resetToDefault = () => {
invocation.value = startupSettings.value?.original_invocation ?? "";
};
function resetToDefault() {
invocation.value = originalInvocation.value ?? "";
}
</script>
<style scoped>


@@ -322,12 +322,11 @@ pub async fn is_visible_collection(
} else {
!collection_data.status.is_hidden()
}) && !collection_data.projects.is_empty();
if let Some(user) = &user_option {
if !authorized
&& (user.role.is_mod() || user.id == collection_data.user_id.into())
{
authorized = true;
}
if let Some(user) = &user_option
&& !authorized
&& (user.role.is_mod() || user.id == collection_data.user_id.into())
{
authorized = true;
}
Ok(authorized)
}
@@ -356,10 +355,10 @@ pub async fn filter_visible_collections(
for collection in check_collections {
// Collections are simple- if we are the owner or a mod, we can see it
if let Some(user) = user_option {
if user.role.is_mod() || user.id == collection.user_id.into() {
return_collections.push(collection.into());
}
if let Some(user) = user_option
&& (user.role.is_mod() || user.id == collection.user_id.into())
{
return_collections.push(collection.into());
}
}


@@ -95,10 +95,10 @@ impl DBFlow {
redis: &RedisPool,
) -> Result<Option<DBFlow>, DatabaseError> {
let flow = Self::get(id, redis).await?;
if let Some(flow) = flow.as_ref() {
if predicate(flow) {
Self::remove(id, redis).await?;
}
if let Some(flow) = flow.as_ref()
&& predicate(flow)
{
Self::remove(id, redis).await?;
}
Ok(flow)
}
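
Nearly all of the Rust hunks in this comparison apply the same mechanical refactor: a nested `if let` with an inner `if` is collapsed into a single let-chain (`if let … && …`), a feature stabilized in Rust 1.88 for edition 2024. A minimal sketch of the equivalence on a toy permission check (the names are illustrative, not from this codebase); the nested form is shown as compilable code since it works on any edition, with the let-chain form in a comment:

```rust
struct Member {
    accepted: bool,
    is_mod: bool,
}

// Old style: nested `if let` plus an inner `if`, as seen on the
// removed lines of these hunks.
fn can_view_nested(member: Option<&Member>) -> bool {
    if let Some(m) = member {
        if m.accepted || m.is_mod {
            return true;
        }
    }
    false
}

// New style (edition 2024 let-chain), as seen on the added lines:
//
//     if let Some(m) = member
//         && (m.accepted || m.is_mod)
//     {
//         return true;
//     }
//
// Both forms are behaviorally identical; the chain removes one level
// of nesting, which is all most of these hunks change.

fn main() {
    let ok = Member { accepted: true, is_mod: false };
    let pending = Member { accepted: false, is_mod: false };
    assert!(can_view_nested(Some(&ok)));
    assert!(!can_view_nested(Some(&pending)));
    assert!(!can_view_nested(None));
}
```

Because the transformation is purely structural, `cargo fmt` then reflows the un-nested body, which accounts for the large but behavior-neutral line churn in hunks like the `edit_pat` and `stripe_webhook` ones below.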


@@ -801,24 +801,24 @@ impl VersionField {
};
if let Some(count) = countable {
if let Some(min) = loader_field.min_val {
if count < min {
return Err(format!(
"Provided value '{v}' for {field_name} is less than the minimum of {min}",
v = serde_json::to_string(&value).unwrap_or_default(),
field_name = loader_field.field,
));
}
if let Some(min) = loader_field.min_val
&& count < min
{
return Err(format!(
"Provided value '{v}' for {field_name} is less than the minimum of {min}",
v = serde_json::to_string(&value).unwrap_or_default(),
field_name = loader_field.field,
));
}
if let Some(max) = loader_field.max_val {
if count > max {
return Err(format!(
"Provided value '{v}' for {field_name} is greater than the maximum of {max}",
v = serde_json::to_string(&value).unwrap_or_default(),
field_name = loader_field.field,
));
}
if let Some(max) = loader_field.max_val
&& count > max
{
return Err(format!(
"Provided value '{v}' for {field_name} is greater than the maximum of {max}",
v = serde_json::to_string(&value).unwrap_or_default(),
field_name = loader_field.field,
));
}
}


@@ -483,20 +483,20 @@ impl DBTeamMember {
.await?;
}
if let Some(accepted) = new_accepted {
if accepted {
sqlx::query!(
"
if let Some(accepted) = new_accepted
&& accepted
{
sqlx::query!(
"
UPDATE team_members
SET accepted = TRUE
WHERE (team_id = $1 AND user_id = $2)
",
id as DBTeamId,
user_id as DBUserId,
)
.execute(&mut **transaction)
.await?;
}
id as DBTeamId,
user_id as DBUserId,
)
.execute(&mut **transaction)
.await?;
}
if let Some(payouts_split) = new_payouts_split {


@@ -353,10 +353,10 @@ impl RedisPool {
};
for (idx, key) in fetch_ids.into_iter().enumerate() {
if let Some(locked) = results.get(idx) {
if locked.is_none() {
continue;
}
if let Some(locked) = results.get(idx)
&& locked.is_none()
{
continue;
}
if let Some((key, raw_key)) = ids.remove(&key) {


@@ -334,18 +334,14 @@ impl From<Version> for LegacyVersion {
// the v2 loaders are whatever the corresponding loader fields are
let mut loaders =
data.loaders.into_iter().map(|l| l.0).collect::<Vec<_>>();
if loaders.contains(&"mrpack".to_string()) {
if let Some((_, mrpack_loaders)) = data
if loaders.contains(&"mrpack".to_string())
&& let Some((_, mrpack_loaders)) = data
.fields
.into_iter()
.find(|(key, _)| key == "mrpack_loaders")
{
if let Ok(mrpack_loaders) =
serde_json::from_value(mrpack_loaders)
{
loaders = mrpack_loaders;
}
}
&& let Ok(mrpack_loaders) = serde_json::from_value(mrpack_loaders)
{
loaders = mrpack_loaders;
}
let loaders = loaders.into_iter().map(Loader).collect::<Vec<_>>();


@@ -43,35 +43,33 @@ impl LegacyResultSearchProject {
pub fn from(result_search_project: ResultSearchProject) -> Self {
let mut categories = result_search_project.categories;
categories.extend(result_search_project.loaders.clone());
if categories.contains(&"mrpack".to_string()) {
if let Some(mrpack_loaders) = result_search_project
if categories.contains(&"mrpack".to_string())
&& let Some(mrpack_loaders) = result_search_project
.project_loader_fields
.get("mrpack_loaders")
{
categories.extend(
mrpack_loaders
.iter()
.filter_map(|c| c.as_str())
.map(String::from),
);
categories.retain(|c| c != "mrpack");
}
{
categories.extend(
mrpack_loaders
.iter()
.filter_map(|c| c.as_str())
.map(String::from),
);
categories.retain(|c| c != "mrpack");
}
let mut display_categories = result_search_project.display_categories;
display_categories.extend(result_search_project.loaders);
if display_categories.contains(&"mrpack".to_string()) {
if let Some(mrpack_loaders) = result_search_project
if display_categories.contains(&"mrpack".to_string())
&& let Some(mrpack_loaders) = result_search_project
.project_loader_fields
.get("mrpack_loaders")
{
categories.extend(
mrpack_loaders
.iter()
.filter_map(|c| c.as_str())
.map(String::from),
);
display_categories.retain(|c| c != "mrpack");
}
{
categories.extend(
mrpack_loaders
.iter()
.filter_map(|c| c.as_str())
.map(String::from),
);
display_categories.retain(|c| c != "mrpack");
}
// Sort then remove duplicates


@@ -166,10 +166,10 @@ impl From<ProjectQueryResult> for Project {
Ok(spdx_expr) => {
let mut vec: Vec<&str> = Vec::new();
for node in spdx_expr.iter() {
if let spdx::expression::ExprNode::Req(req) = node {
if let Some(id) = req.req.license.id() {
vec.push(id.full_name);
}
if let spdx::expression::ExprNode::Req(req) = node
&& let Some(id) = req.req.license.id()
{
vec.push(id.full_name);
}
}
// spdx crate returns AND/OR operations in postfix order


@@ -51,16 +51,16 @@ impl ProjectPermissions {
return Some(ProjectPermissions::all());
}
if let Some(member) = project_team_member {
if member.accepted {
return Some(member.permissions);
}
if let Some(member) = project_team_member
&& member.accepted
{
return Some(member.permissions);
}
if let Some(member) = organization_team_member {
if member.accepted {
return Some(member.permissions);
}
if let Some(member) = organization_team_member
&& member.accepted
{
return Some(member.permissions);
}
if role.is_mod() {
@@ -107,10 +107,10 @@ impl OrganizationPermissions {
return Some(OrganizationPermissions::all());
}
if let Some(member) = team_member {
if member.accepted {
return member.organization_permissions;
}
if let Some(member) = team_member
&& member.accepted
{
return member.organization_permissions;
}
if role.is_mod() {
return Some(


@@ -45,17 +45,15 @@ impl MaxMindIndexer {
if let Ok(entries) = archive.entries() {
for mut file in entries.flatten() {
if let Ok(path) = file.header().path() {
if path.extension().and_then(|x| x.to_str()) == Some("mmdb")
{
let mut buf = Vec::new();
file.read_to_end(&mut buf).unwrap();
if let Ok(path) = file.header().path()
&& path.extension().and_then(|x| x.to_str()) == Some("mmdb")
{
let mut buf = Vec::new();
file.read_to_end(&mut buf).unwrap();
let reader =
maxminddb::Reader::from_source(buf).unwrap();
let reader = maxminddb::Reader::from_source(buf).unwrap();
return Ok(Some(reader));
}
return Ok(Some(reader));
}
}
}


@@ -371,8 +371,8 @@ impl AutomatedModerationQueue {
for file in
files.iter().filter(|x| x.version_id == version.id.into())
{
if let Some(hash) = file.hashes.get("sha1") {
if let Some((index, (sha1, _, file_name, _))) = hashes
if let Some(hash) = file.hashes.get("sha1")
&& let Some((index, (sha1, _, file_name, _))) = hashes
.iter()
.enumerate()
.find(|(_, (value, _, _, _))| value == hash)
@@ -382,7 +382,6 @@ impl AutomatedModerationQueue {
hashes.remove(index);
}
}
}
}
@@ -420,12 +419,11 @@ impl AutomatedModerationQueue {
.await?;
for row in rows {
if let Some(sha1) = row.sha1 {
if let Some((index, (sha1, _, file_name, _))) = hashes.iter().enumerate().find(|(_, (value, _, _, _))| value == &sha1) {
if let Some(sha1) = row.sha1
&& let Some((index, (sha1, _, file_name, _))) = hashes.iter().enumerate().find(|(_, (value, _, _, _))| value == &sha1) {
final_hashes.insert(sha1.clone(), IdentifiedFile { file_name: file_name.clone(), status: ApprovalType::from_string(&row.status).unwrap_or(ApprovalType::Unidentified) });
hashes.remove(index);
}
}
}
if hashes.is_empty() {
@@ -499,8 +497,8 @@ impl AutomatedModerationQueue {
let mut insert_ids = Vec::new();
for row in rows {
if let Some((curse_index, (hash, _flame_id))) = flame_files.iter().enumerate().find(|(_, x)| Some(x.1 as i32) == row.flame_project_id) {
if let Some((index, (sha1, _, file_name, _))) = hashes.iter().enumerate().find(|(_, (value, _, _, _))| value == hash) {
if let Some((curse_index, (hash, _flame_id))) = flame_files.iter().enumerate().find(|(_, x)| Some(x.1 as i32) == row.flame_project_id)
&& let Some((index, (sha1, _, file_name, _))) = hashes.iter().enumerate().find(|(_, (value, _, _, _))| value == hash) {
final_hashes.insert(sha1.clone(), IdentifiedFile {
file_name: file_name.clone(),
status: ApprovalType::from_string(&row.status).unwrap_or(ApprovalType::Unidentified),
@@ -512,7 +510,6 @@ impl AutomatedModerationQueue {
hashes.remove(index);
flame_files.remove(curse_index);
}
}
}
if !insert_ids.is_empty() && !insert_hashes.is_empty() {
@@ -581,8 +578,8 @@ impl AutomatedModerationQueue {
for (sha1, _pack_file, file_name, _mumur2) in hashes {
let flame_file = flame_files.iter().find(|x| x.0 == sha1);
if let Some((_, flame_project_id)) = flame_file {
if let Some(project) = flame_projects.iter().find(|x| &x.id == flame_project_id) {
if let Some((_, flame_project_id)) = flame_file
&& let Some(project) = flame_projects.iter().find(|x| &x.id == flame_project_id) {
missing_metadata.flame_files.insert(sha1, MissingMetadataFlame {
title: project.name.clone(),
file_name,
@@ -592,7 +589,6 @@ impl AutomatedModerationQueue {
continue;
}
}
missing_metadata.unknown_files.insert(sha1, file_name);
}


@@ -257,31 +257,30 @@ impl PayoutsQueue {
)
})?;
if !status.is_success() {
if let Some(obj) = value.as_object() {
if let Some(array) = obj.get("errors") {
#[derive(Deserialize)]
struct TremendousError {
message: String,
}
let err = serde_json::from_value::<TremendousError>(
array.clone(),
)
.map_err(|_| {
ApiError::Payments(
"could not retrieve Tremendous error json body"
.to_string(),
)
})?;
return Err(ApiError::Payments(err.message));
if !status.is_success()
&& let Some(obj) = value.as_object()
{
if let Some(array) = obj.get("errors") {
#[derive(Deserialize)]
struct TremendousError {
message: String,
}
return Err(ApiError::Payments(
"could not retrieve Tremendous error body".to_string(),
));
let err =
serde_json::from_value::<TremendousError>(array.clone())
.map_err(|_| {
ApiError::Payments(
"could not retrieve Tremendous error json body"
.to_string(),
)
})?;
return Err(ApiError::Payments(err.message));
}
return Err(ApiError::Payments(
"could not retrieve Tremendous error body".to_string(),
));
}
Ok(serde_json::from_value(value)?)
@@ -449,10 +448,10 @@ impl PayoutsQueue {
};
// we do not support interval gift cards with non US based currencies since we cannot do currency conversions properly
if let PayoutInterval::Fixed { .. } = method.interval {
if !product.currency_codes.contains(&"USD".to_string()) {
continue;
}
if let PayoutInterval::Fixed { .. } = method.interval
&& !product.currency_codes.contains(&"USD".to_string())
{
continue;
}
methods.push(method);


@@ -286,17 +286,17 @@ pub async fn refund_charge(
.upsert(&mut transaction)
.await?;
if body.0.unprovision.unwrap_or(false) {
if let Some(subscription_id) = charge.subscription_id {
let open_charge =
DBCharge::get_open_subscription(subscription_id, &**pool)
.await?;
if let Some(mut open_charge) = open_charge {
open_charge.status = ChargeStatus::Cancelled;
open_charge.due = Utc::now();
if body.0.unprovision.unwrap_or(false)
&& let Some(subscription_id) = charge.subscription_id
{
let open_charge =
DBCharge::get_open_subscription(subscription_id, &**pool)
.await?;
if let Some(mut open_charge) = open_charge {
open_charge.status = ChargeStatus::Cancelled;
open_charge.due = Utc::now();
open_charge.upsert(&mut transaction).await?;
}
open_charge.upsert(&mut transaction).await?;
}
}
@@ -392,17 +392,16 @@ pub async fn edit_subscription(
}
}
if let Some(interval) = &edit_subscription.interval {
if let Price::Recurring { intervals } = &current_price.prices {
if let Some(price) = intervals.get(interval) {
open_charge.subscription_interval = Some(*interval);
open_charge.amount = *price as i64;
} else {
return Err(ApiError::InvalidInput(
"Interval is not valid for this subscription!"
.to_string(),
));
}
if let Some(interval) = &edit_subscription.interval
&& let Price::Recurring { intervals } = &current_price.prices
{
if let Some(price) = intervals.get(interval) {
open_charge.subscription_interval = Some(*interval);
open_charge.amount = *price as i64;
} else {
return Err(ApiError::InvalidInput(
"Interval is not valid for this subscription!".to_string(),
));
}
}
@@ -1225,38 +1224,36 @@ pub async fn initiate_payment(
}
};
if let Price::Recurring { .. } = price_item.prices {
if product.unitary {
let user_subscriptions =
if let Price::Recurring { .. } = price_item.prices
&& product.unitary
{
let user_subscriptions =
user_subscription_item::DBUserSubscription::get_all_user(
user.id.into(),
&**pool,
)
.await?;
let user_products =
product_item::DBProductPrice::get_many(
&user_subscriptions
.iter()
.filter(|x| {
x.status
== SubscriptionStatus::Provisioned
})
.map(|x| x.price_id)
.collect::<Vec<_>>(),
&**pool,
)
.await?;
let user_products = product_item::DBProductPrice::get_many(
&user_subscriptions
.iter()
.filter(|x| {
x.status == SubscriptionStatus::Provisioned
})
.map(|x| x.price_id)
.collect::<Vec<_>>(),
&**pool,
)
.await?;
if user_products
.into_iter()
.any(|x| x.product_id == product.id)
{
return Err(ApiError::InvalidInput(
"You are already subscribed to this product!"
.to_string(),
));
}
if user_products
.into_iter()
.any(|x| x.product_id == product.id)
{
return Err(ApiError::InvalidInput(
"You are already subscribed to this product!"
.to_string(),
));
}
}
@@ -2004,38 +2001,36 @@ pub async fn stripe_webhook(
EventType::PaymentMethodAttached => {
if let EventObject::PaymentMethod(payment_method) =
event.data.object
{
if let Some(customer_id) =
&& let Some(customer_id) =
payment_method.customer.map(|x| x.id())
{
let customer = stripe::Customer::retrieve(
&stripe_client,
&customer_id,
&[],
)
.await?;
if customer
.invoice_settings
.is_none_or(|x| x.default_payment_method.is_none())
{
let customer = stripe::Customer::retrieve(
stripe::Customer::update(
&stripe_client,
&customer_id,
&[],
UpdateCustomer {
invoice_settings: Some(
CustomerInvoiceSettings {
default_payment_method: Some(
payment_method.id.to_string(),
),
..Default::default()
},
),
..Default::default()
},
)
.await?;
if customer
.invoice_settings
.is_none_or(|x| x.default_payment_method.is_none())
{
stripe::Customer::update(
&stripe_client,
&customer_id,
UpdateCustomer {
invoice_settings: Some(
CustomerInvoiceSettings {
default_payment_method: Some(
payment_method.id.to_string(),
),
..Default::default()
},
),
..Default::default()
},
)
.await?;
}
}
}
}


@@ -79,13 +79,12 @@ impl TempUser {
file_host: &Arc<dyn FileHost + Send + Sync>,
redis: &RedisPool,
) -> Result<crate::database::models::DBUserId, AuthenticationError> {
if let Some(email) = &self.email {
if crate::database::models::DBUser::get_by_email(email, client)
if let Some(email) = &self.email
&& crate::database::models::DBUser::get_by_email(email, client)
.await?
.is_some()
{
return Err(AuthenticationError::DuplicateUser);
}
{
return Err(AuthenticationError::DuplicateUser);
}
let user_id =
@@ -1269,19 +1268,19 @@ pub async fn delete_auth_provider(
.update_user_id(user.id.into(), None, &mut transaction)
.await?;
if delete_provider.provider != AuthProvider::PayPal {
if let Some(email) = user.email {
send_email(
email,
"Authentication method removed",
&format!(
"When logging into Modrinth, you can no longer log in using the {} authentication provider.",
delete_provider.provider.as_str()
),
"If you did not make this change, please contact us immediately through our support channels on Discord or via email (support@modrinth.com).",
None,
)?;
}
if delete_provider.provider != AuthProvider::PayPal
&& let Some(email) = user.email
{
send_email(
email,
"Authentication method removed",
&format!(
"When logging into Modrinth, you can no longer log in using the {} authentication provider.",
delete_provider.provider.as_str()
),
"If you did not make this change, please contact us immediately through our support channels on Discord or via email (support@modrinth.com).",
None,
)?;
}
transaction.commit().await?;


@@ -189,17 +189,16 @@ pub async fn get_project_meta(
.iter()
.find(|x| Some(x.1.id as i32) == row.flame_project_id)
.map(|x| x.0.clone())
&& let Some(val) = merged.flame_files.remove(&sha1)
{
if let Some(val) = merged.flame_files.remove(&sha1) {
merged.identified.insert(
sha1,
IdentifiedFile {
file_name: val.file_name.clone(),
status: ApprovalType::from_string(&row.status)
.unwrap_or(ApprovalType::Unidentified),
},
);
}
merged.identified.insert(
sha1,
IdentifiedFile {
file_name: val.file_name.clone(),
status: ApprovalType::from_string(&row.status)
.unwrap_or(ApprovalType::Unidentified),
},
);
}
}


@@ -185,69 +185,69 @@ pub async fn edit_pat(
)
.await?;
if let Some(pat) = pat {
if pat.user_id == user.id.into() {
let mut transaction = pool.begin().await?;
if let Some(pat) = pat
&& pat.user_id == user.id.into()
{
let mut transaction = pool.begin().await?;
if let Some(scopes) = &info.scopes {
if scopes.is_restricted() {
return Err(ApiError::InvalidInput(
"Invalid scopes requested!".to_string(),
));
}
if let Some(scopes) = &info.scopes {
if scopes.is_restricted() {
return Err(ApiError::InvalidInput(
"Invalid scopes requested!".to_string(),
));
}
sqlx::query!(
"
sqlx::query!(
"
UPDATE pats
SET scopes = $1
WHERE id = $2
",
scopes.bits() as i64,
pat.id.0
)
.execute(&mut *transaction)
.await?;
}
if let Some(name) = &info.name {
sqlx::query!(
"
scopes.bits() as i64,
pat.id.0
)
.execute(&mut *transaction)
.await?;
}
if let Some(name) = &info.name {
sqlx::query!(
"
UPDATE pats
SET name = $1
WHERE id = $2
",
name,
pat.id.0
)
.execute(&mut *transaction)
.await?;
name,
pat.id.0
)
.execute(&mut *transaction)
.await?;
}
if let Some(expires) = &info.expires {
if expires < &Utc::now() {
return Err(ApiError::InvalidInput(
"Expire date must be in the future!".to_string(),
));
}
if let Some(expires) = &info.expires {
if expires < &Utc::now() {
return Err(ApiError::InvalidInput(
"Expire date must be in the future!".to_string(),
));
}
sqlx::query!(
"
sqlx::query!(
"
UPDATE pats
SET expires = $1
WHERE id = $2
",
expires,
pat.id.0
)
.execute(&mut *transaction)
.await?;
}
transaction.commit().await?;
database::models::pat_item::DBPersonalAccessToken::clear_cache(
vec![(Some(pat.id), Some(pat.access_token), Some(pat.user_id))],
&redis,
expires,
pat.id.0
)
.execute(&mut *transaction)
.await?;
}
transaction.commit().await?;
database::models::pat_item::DBPersonalAccessToken::clear_cache(
vec![(Some(pat.id), Some(pat.access_token), Some(pat.user_id))],
&redis,
)
.await?;
}
Ok(HttpResponse::NoContent().finish())
@@ -276,21 +276,21 @@ pub async fn delete_pat(
)
.await?;
if let Some(pat) = pat {
if pat.user_id == user.id.into() {
let mut transaction = pool.begin().await?;
database::models::pat_item::DBPersonalAccessToken::remove(
pat.id,
&mut transaction,
)
.await?;
transaction.commit().await?;
database::models::pat_item::DBPersonalAccessToken::clear_cache(
vec![(Some(pat.id), Some(pat.access_token), Some(pat.user_id))],
&redis,
)
.await?;
}
if let Some(pat) = pat
&& pat.user_id == user.id.into()
{
let mut transaction = pool.begin().await?;
database::models::pat_item::DBPersonalAccessToken::remove(
pat.id,
&mut transaction,
)
.await?;
transaction.commit().await?;
database::models::pat_item::DBPersonalAccessToken::clear_cache(
vec![(Some(pat.id), Some(pat.access_token), Some(pat.user_id))],
&redis,
)
.await?;
}
Ok(HttpResponse::NoContent().finish())


@@ -185,21 +185,21 @@ pub async fn delete(
let session = DBSession::get(info.into_inner().0, &**pool, &redis).await?;
if let Some(session) = session {
if session.user_id == current_user.id.into() {
let mut transaction = pool.begin().await?;
DBSession::remove(session.id, &mut transaction).await?;
transaction.commit().await?;
DBSession::clear_cache(
vec![(
Some(session.id),
Some(session.session),
Some(session.user_id),
)],
&redis,
)
.await?;
}
if let Some(session) = session
&& session.user_id == current_user.id.into()
{
let mut transaction = pool.begin().await?;
DBSession::remove(session.id, &mut transaction).await?;
transaction.commit().await?;
DBSession::clear_cache(
vec![(
Some(session.id),
Some(session.session),
Some(session.user_id),
)],
&redis,
)
.await?;
}
Ok(HttpResponse::NoContent().body(""))


@@ -401,14 +401,13 @@ async fn broadcast_to_known_local_friends(
friend.user_id
};
if friend.accepted {
if let Some(socket_ids) =
if friend.accepted
&& let Some(socket_ids) =
sockets.sockets_by_user_id.get(&friend_id.into())
{
for socket_id in socket_ids.iter() {
if let Some(socket) = sockets.sockets.get(&socket_id) {
let _ = send_message(socket.value(), &message).await;
}
{
for socket_id in socket_ids.iter() {
if let Some(socket) = sockets.sockets.get(&socket_id) {
let _ = send_message(socket.value(), &message).await;
}
}
}


@@ -387,17 +387,16 @@ pub async fn revenue_get(
.map(|x| (x.to_string(), HashMap::new()))
.collect::<HashMap<_, _>>();
for value in payouts_values {
if let Some(mod_id) = value.mod_id {
if let Some(amount) = value.amount_sum {
if let Some(interval_start) = value.interval_start {
let id_string = to_base62(mod_id as u64);
if !hm.contains_key(&id_string) {
hm.insert(id_string.clone(), HashMap::new());
}
if let Some(hm) = hm.get_mut(&id_string) {
hm.insert(interval_start.timestamp(), amount);
}
}
if let Some(mod_id) = value.mod_id
&& let Some(amount) = value.amount_sum
&& let Some(interval_start) = value.interval_start
{
let id_string = to_base62(mod_id as u64);
if !hm.contains_key(&id_string) {
hm.insert(id_string.clone(), HashMap::new());
}
if let Some(hm) = hm.get_mut(&id_string) {
hm.insert(interval_start.timestamp(), amount);
}
}
}


@@ -192,10 +192,10 @@ pub async fn collection_get(
.map(|x| x.1)
.ok();
if let Some(data) = collection_data {
if is_visible_collection(&data, &user_option, false).await? {
return Ok(HttpResponse::Ok().json(Collection::from(data)));
}
if let Some(data) = collection_data
&& is_visible_collection(&data, &user_option, false).await?
{
return Ok(HttpResponse::Ok().json(Collection::from(data)));
}
Err(ApiError::NotFound)
}


@@ -536,11 +536,9 @@ pub async fn create_payout(
Some(true),
)
.await
&& let Some(data) = res.items.first()
{
if let Some(data) = res.items.first() {
payout_item.platform_id =
Some(data.payout_item_id.clone());
}
payout_item.platform_id = Some(data.payout_item_id.clone());
}
}


@@ -182,10 +182,10 @@ pub async fn project_get(
.map(|x| x.1)
.ok();
if let Some(data) = project_data {
if is_visible_project(&data.inner, &user_option, &pool, false).await? {
return Ok(HttpResponse::Ok().json(Project::from(data)));
}
if let Some(data) = project_data
&& is_visible_project(&data.inner, &user_option, &pool, false).await?
{
return Ok(HttpResponse::Ok().json(Project::from(data)));
}
Err(ApiError::NotFound)
}
@@ -403,34 +403,36 @@ pub async fn project_edit(
.await?;
}
if status.is_searchable() && !project_item.inner.webhook_sent {
if let Ok(webhook_url) = dotenvy::var("PUBLIC_DISCORD_WEBHOOK") {
crate::util::webhook::send_discord_webhook(
project_item.inner.id.into(),
&pool,
&redis,
webhook_url,
None,
)
.await
.ok();
if status.is_searchable()
&& !project_item.inner.webhook_sent
&& let Ok(webhook_url) = dotenvy::var("PUBLIC_DISCORD_WEBHOOK")
{
crate::util::webhook::send_discord_webhook(
project_item.inner.id.into(),
&pool,
&redis,
webhook_url,
None,
)
.await
.ok();
sqlx::query!(
"
sqlx::query!(
"
UPDATE mods
SET webhook_sent = TRUE
WHERE id = $1
",
id as db_ids::DBProjectId,
)
.execute(&mut *transaction)
.await?;
}
id as db_ids::DBProjectId,
)
.execute(&mut *transaction)
.await?;
}
if user.role.is_mod() {
if let Ok(webhook_url) = dotenvy::var("MODERATION_SLACK_WEBHOOK") {
crate::util::webhook::send_slack_webhook(
if user.role.is_mod()
&& let Ok(webhook_url) = dotenvy::var("MODERATION_SLACK_WEBHOOK")
{
crate::util::webhook::send_slack_webhook(
project_item.inner.id.into(),
&pool,
&redis,
@@ -449,7 +451,6 @@ pub async fn project_edit(
)
.await
.ok();
}
}
if team_member.is_none_or(|x| !x.accepted) {
@@ -692,45 +693,45 @@ pub async fn project_edit(
.await?;
}
if let Some(links) = &new_project.link_urls {
if !links.is_empty() {
if !perms.contains(ProjectPermissions::EDIT_DETAILS) {
return Err(ApiError::CustomAuthentication(
if let Some(links) = &new_project.link_urls
&& !links.is_empty()
{
if !perms.contains(ProjectPermissions::EDIT_DETAILS) {
return Err(ApiError::CustomAuthentication(
"You do not have the permissions to edit the links of this project!"
.to_string(),
));
}
}
let ids_to_delete = links.keys().cloned().collect::<Vec<String>>();
// Deletes all links from hashmap- either will be deleted or be replaced
sqlx::query!(
"
let ids_to_delete = links.keys().cloned().collect::<Vec<String>>();
// Deletes all links from hashmap- either will be deleted or be replaced
sqlx::query!(
"
DELETE FROM mods_links
WHERE joining_mod_id = $1 AND joining_platform_id IN (
SELECT id FROM link_platforms WHERE name = ANY($2)
)
",
id as db_ids::DBProjectId,
&ids_to_delete
)
.execute(&mut *transaction)
.await?;
id as db_ids::DBProjectId,
&ids_to_delete
)
.execute(&mut *transaction)
.await?;
for (platform, url) in links {
if let Some(url) = url {
let platform_id =
db_models::categories::LinkPlatform::get_id(
platform,
&mut *transaction,
)
.await?
.ok_or_else(|| {
ApiError::InvalidInput(format!(
"Platform {} does not exist.",
platform.clone()
))
})?;
sqlx::query!(
for (platform, url) in links {
if let Some(url) = url {
let platform_id = db_models::categories::LinkPlatform::get_id(
platform,
&mut *transaction,
)
.await?
.ok_or_else(|| {
ApiError::InvalidInput(format!(
"Platform {} does not exist.",
platform.clone()
))
})?;
sqlx::query!(
"
INSERT INTO mods_links (joining_mod_id, joining_platform_id, url)
VALUES ($1, $2, $3)
@@ -741,7 +742,6 @@ pub async fn project_edit(
)
.execute(&mut *transaction)
.await?;
}
}
}
}
@@ -2430,7 +2430,7 @@ pub async fn project_get_organization(
organization,
team_members,
);
return Ok(HttpResponse::Ok().json(organization));
Ok(HttpResponse::Ok().json(organization))
} else {
Err(ApiError::NotFound)
}


@@ -767,12 +767,13 @@ pub async fn edit_team_member(
));
}
if let Some(new_permissions) = edit_member.permissions {
if !permissions.contains(new_permissions) {
return Err(ApiError::InvalidInput(
"The new permissions have permissions that you don't have".to_string(),
));
}
if let Some(new_permissions) = edit_member.permissions
&& !permissions.contains(new_permissions)
{
return Err(ApiError::InvalidInput(
"The new permissions have permissions that you don't have"
.to_string(),
));
}
if edit_member.organization_permissions.is_some() {
@@ -800,13 +801,12 @@ pub async fn edit_team_member(
}
if let Some(new_permissions) = edit_member.organization_permissions
&& !organization_permissions.contains(new_permissions)
{
if !organization_permissions.contains(new_permissions) {
return Err(ApiError::InvalidInput(
return Err(ApiError::InvalidInput(
"The new organization permissions have permissions that you don't have"
.to_string(),
));
}
}
if edit_member.permissions.is_some()
@@ -822,13 +822,13 @@ pub async fn edit_team_member(
}
}
if let Some(payouts_split) = edit_member.payouts_split {
if payouts_split < Decimal::ZERO || payouts_split > Decimal::from(5000)
{
return Err(ApiError::InvalidInput(
"Payouts split must be between 0 and 5000!".to_string(),
));
}
if let Some(payouts_split) = edit_member.payouts_split
&& (payouts_split < Decimal::ZERO
|| payouts_split > Decimal::from(5000))
{
return Err(ApiError::InvalidInput(
"Payouts split must be between 0 and 5000!".to_string(),
));
}
DBTeamMember::edit_team_member(
@@ -883,13 +883,13 @@ pub async fn transfer_ownership(
DBTeam::get_association(id.into(), &**pool).await?;
if let Some(TeamAssociationId::Project(pid)) = team_association_id {
let result = DBProject::get_id(pid, &**pool, &redis).await?;
if let Some(project_item) = result {
if project_item.inner.organization_id.is_some() {
return Err(ApiError::InvalidInput(
if let Some(project_item) = result
&& project_item.inner.organization_id.is_some()
{
return Err(ApiError::InvalidInput(
"You cannot transfer ownership of a project team that is owend by an organization"
.to_string(),
));
}
}
}


@@ -289,36 +289,33 @@ pub async fn thread_get(
.await?
.1;
if let Some(mut data) = thread_data {
if is_authorized_thread(&data, &user, &pool).await? {
let authors = &mut data.members;
if let Some(mut data) = thread_data
&& is_authorized_thread(&data, &user, &pool).await?
{
let authors = &mut data.members;
authors.append(
&mut data
.messages
.iter()
.filter_map(|x| {
if x.hide_identity && !user.role.is_mod() {
None
} else {
x.author_id
}
})
.collect::<Vec<_>>(),
);
authors.append(
&mut data
.messages
.iter()
.filter_map(|x| {
if x.hide_identity && !user.role.is_mod() {
None
} else {
x.author_id
}
})
.collect::<Vec<_>>(),
);
let users: Vec<User> = database::models::DBUser::get_many_ids(
authors, &**pool, &redis,
)
.await?
.into_iter()
.map(From::from)
.collect();
let users: Vec<User> =
database::models::DBUser::get_many_ids(authors, &**pool, &redis)
.await?
.into_iter()
.map(From::from)
.collect();
return Ok(
HttpResponse::Ok().json(Thread::from(data, users, &user))
);
}
return Ok(HttpResponse::Ok().json(Thread::from(data, users, &user)));
}
Err(ApiError::NotFound)
}
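The `filter_map` in `thread_get` encodes a small visibility rule for message authors. Isolated as a sketch (function name hypothetical):

```rust
// An author id survives the filter unless the message hides the author's
// identity and the viewer is not a moderator.
fn visible_author(
    author_id: Option<u64>,
    hide_identity: bool,
    viewer_is_mod: bool,
) -> Option<u64> {
    if hide_identity && !viewer_is_mod {
        None
    } else {
        author_id
    }
}
```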
@@ -454,33 +451,32 @@ pub async fn thread_send_message(
)
.await?;
if let Some(project) = project {
if project.inner.status != ProjectStatus::Processing
&& user.role.is_mod()
{
let members =
database::models::DBTeamMember::get_from_team_full(
project.inner.team_id,
&**pool,
&redis,
)
.await?;
NotificationBuilder {
body: NotificationBody::ModeratorMessage {
thread_id: thread.id.into(),
message_id: id.into(),
project_id: Some(project.inner.id.into()),
report_id: None,
},
}
.insert_many(
members.into_iter().map(|x| x.user_id).collect(),
&mut transaction,
if let Some(project) = project
&& project.inner.status != ProjectStatus::Processing
&& user.role.is_mod()
{
let members =
database::models::DBTeamMember::get_from_team_full(
project.inner.team_id,
&**pool,
&redis,
)
.await?;
NotificationBuilder {
body: NotificationBody::ModeratorMessage {
thread_id: thread.id.into(),
message_id: id.into(),
project_id: Some(project.inner.id.into()),
report_id: None,
},
}
.insert_many(
members.into_iter().map(|x| x.user_id).collect(),
&mut transaction,
&redis,
)
.await?;
}
} else if let Some(report_id) = thread.report_id {
let report = database::models::report_item::DBReport::get(


@@ -522,10 +522,10 @@ async fn version_create_inner(
.fetch_optional(pool)
.await?;
if let Some(project_status) = project_status {
if project_status.status == ProjectStatus::Processing.as_str() {
moderation_queue.projects.insert(project_id.into());
}
if let Some(project_status) = project_status
&& project_status.status == ProjectStatus::Processing.as_str()
{
moderation_queue.projects.insert(project_id.into());
}
Ok(HttpResponse::Ok().json(response))
@@ -871,16 +871,16 @@ pub async fn upload_file(
ref format,
ref files,
} = validation_result
&& dependencies.is_empty()
{
if dependencies.is_empty() {
let hashes: Vec<Vec<u8>> = format
.files
.iter()
.filter_map(|x| x.hashes.get(&PackFileHash::Sha1))
.map(|x| x.as_bytes().to_vec())
.collect();
let hashes: Vec<Vec<u8>> = format
.files
.iter()
.filter_map(|x| x.hashes.get(&PackFileHash::Sha1))
.map(|x| x.as_bytes().to_vec())
.collect();
let res = sqlx::query!(
let res = sqlx::query!(
"
SELECT v.id version_id, v.mod_id project_id, h.hash hash FROM hashes h
INNER JOIN files f on h.file_id = f.id
@@ -892,45 +892,44 @@ pub async fn upload_file(
.fetch_all(&mut **transaction)
.await?;
for file in &format.files {
if let Some(dep) = res.iter().find(|x| {
Some(&*x.hash)
== file
.hashes
.get(&PackFileHash::Sha1)
.map(|x| x.as_bytes())
}) {
dependencies.push(DependencyBuilder {
project_id: Some(models::DBProjectId(dep.project_id)),
version_id: Some(models::DBVersionId(dep.version_id)),
file_name: None,
dependency_type: DependencyType::Embedded.to_string(),
});
} else if let Some(first_download) = file.downloads.first() {
dependencies.push(DependencyBuilder {
project_id: None,
version_id: None,
file_name: Some(
first_download
.rsplit('/')
.next()
.unwrap_or(first_download)
.to_string(),
),
dependency_type: DependencyType::Embedded.to_string(),
});
}
for file in &format.files {
if let Some(dep) = res.iter().find(|x| {
Some(&*x.hash)
== file
.hashes
.get(&PackFileHash::Sha1)
.map(|x| x.as_bytes())
}) {
dependencies.push(DependencyBuilder {
project_id: Some(models::DBProjectId(dep.project_id)),
version_id: Some(models::DBVersionId(dep.version_id)),
file_name: None,
dependency_type: DependencyType::Embedded.to_string(),
});
} else if let Some(first_download) = file.downloads.first() {
dependencies.push(DependencyBuilder {
project_id: None,
version_id: None,
file_name: Some(
first_download
.rsplit('/')
.next()
.unwrap_or(first_download)
.to_string(),
),
dependency_type: DependencyType::Embedded.to_string(),
});
}
}
for file in files {
if !file.is_empty() {
dependencies.push(DependencyBuilder {
project_id: None,
version_id: None,
file_name: Some(file.to_string()),
dependency_type: DependencyType::Embedded.to_string(),
});
}
for file in files {
if !file.is_empty() {
dependencies.push(DependencyBuilder {
project_id: None,
version_id: None,
file_name: Some(file.to_string()),
dependency_type: DependencyType::Embedded.to_string(),
});
}
}
}
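The `first_download.rsplit('/').next().unwrap_or(first_download)` fallback above, in isolation: take the last path segment of a download URL, or the whole string when it contains no `/`. Note that `rsplit` always yields at least one item, so the `unwrap_or` is purely defensive:

```rust
// Last segment after the final '/', or the whole input when there is none.
fn file_name_from_url(url: &str) -> &str {
    url.rsplit('/').next().unwrap_or(url)
}
```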
@@ -974,10 +973,10 @@ pub async fn upload_file(
));
}
if let ValidationResult::Warning(msg) = validation_result {
if primary {
return Err(CreateError::InvalidInput(msg.to_string()));
}
if let ValidationResult::Warning(msg) = validation_result
&& primary
{
return Err(CreateError::InvalidInput(msg.to_string()));
}
let url = format!("{cdn_url}/{file_path_encode}");


@@ -148,65 +148,55 @@ pub async fn get_update_from_hash(
&redis,
)
.await?
{
if let Some(project) = database::models::DBProject::get_id(
&& let Some(project) = database::models::DBProject::get_id(
file.project_id,
&**pool,
&redis,
)
.await?
{
let mut versions = database::models::DBVersion::get_many(
&project.versions,
&**pool,
&redis,
)
.await?
.into_iter()
.filter(|x| {
let mut bool = true;
if let Some(version_types) = &update_data.version_types {
bool &= version_types
.iter()
.any(|y| y.as_str() == x.inner.version_type);
}
if let Some(loaders) = &update_data.loaders {
bool &= x.loaders.iter().any(|y| loaders.contains(y));
}
if let Some(loader_fields) = &update_data.loader_fields {
for (key, values) in loader_fields {
bool &= if let Some(x_vf) = x
.version_fields
.iter()
.find(|y| y.field_name == *key)
{
values
.iter()
.any(|v| x_vf.value.contains_json_value(v))
} else {
true
};
}
}
bool
})
.sorted();
if let Some(first) = versions.next_back() {
if !is_visible_version(
&first.inner,
&user_option,
&pool,
&redis,
)
.await?
{
return Err(ApiError::NotFound);
}
return Ok(HttpResponse::Ok()
.json(models::projects::Version::from(first)));
{
let mut versions = database::models::DBVersion::get_many(
&project.versions,
&**pool,
&redis,
)
.await?
.into_iter()
.filter(|x| {
let mut bool = true;
if let Some(version_types) = &update_data.version_types {
bool &= version_types
.iter()
.any(|y| y.as_str() == x.inner.version_type);
}
if let Some(loaders) = &update_data.loaders {
bool &= x.loaders.iter().any(|y| loaders.contains(y));
}
if let Some(loader_fields) = &update_data.loader_fields {
for (key, values) in loader_fields {
bool &= if let Some(x_vf) =
x.version_fields.iter().find(|y| y.field_name == *key)
{
values.iter().any(|v| x_vf.value.contains_json_value(v))
} else {
true
};
}
}
bool
})
.sorted();
if let Some(first) = versions.next_back() {
if !is_visible_version(&first.inner, &user_option, &pool, &redis)
.await?
{
return Err(ApiError::NotFound);
}
return Ok(
HttpResponse::Ok().json(models::projects::Version::from(first))
);
}
}
Err(ApiError::NotFound)
@@ -398,13 +388,12 @@ pub async fn update_files(
if let Some(version) = versions
.iter()
.find(|x| x.inner.project_id == file.project_id)
&& let Some(hash) = file.hashes.get(&algorithm)
{
if let Some(hash) = file.hashes.get(&algorithm) {
response.insert(
hash.clone(),
models::projects::Version::from(version.clone()),
);
}
response.insert(
hash.clone(),
models::projects::Version::from(version.clone()),
);
}
}
@@ -484,69 +473,59 @@ pub async fn update_individual_files(
for project in projects {
for file in files.iter().filter(|x| x.project_id == project.inner.id) {
if let Some(hash) = file.hashes.get(&algorithm) {
if let Some(query_file) =
if let Some(hash) = file.hashes.get(&algorithm)
&& let Some(query_file) =
update_data.hashes.iter().find(|x| &x.hash == hash)
{
let version = all_versions
.iter()
.filter(|x| x.inner.project_id == file.project_id)
.filter(|x| {
let mut bool = true;
{
let version = all_versions
.iter()
.filter(|x| x.inner.project_id == file.project_id)
.filter(|x| {
let mut bool = true;
if let Some(version_types) =
&query_file.version_types
{
bool &= version_types.iter().any(|y| {
y.as_str() == x.inner.version_type
});
}
if let Some(loaders) = &query_file.loaders {
bool &= x
.loaders
.iter()
.any(|y| loaders.contains(y));
}
if let Some(loader_fields) =
&query_file.loader_fields
{
for (key, values) in loader_fields {
bool &= if let Some(x_vf) = x
.version_fields
.iter()
.find(|y| y.field_name == *key)
{
values.iter().any(|v| {
x_vf.value.contains_json_value(v)
})
} else {
true
};
}
}
bool
})
.sorted()
.next_back();
if let Some(version) = version {
if is_visible_version(
&version.inner,
&user_option,
&pool,
&redis,
)
.await?
{
response.insert(
hash.clone(),
models::projects::Version::from(
version.clone(),
),
);
if let Some(version_types) = &query_file.version_types {
bool &= version_types
.iter()
.any(|y| y.as_str() == x.inner.version_type);
}
}
if let Some(loaders) = &query_file.loaders {
bool &=
x.loaders.iter().any(|y| loaders.contains(y));
}
if let Some(loader_fields) = &query_file.loader_fields {
for (key, values) in loader_fields {
bool &= if let Some(x_vf) = x
.version_fields
.iter()
.find(|y| y.field_name == *key)
{
values.iter().any(|v| {
x_vf.value.contains_json_value(v)
})
} else {
true
};
}
}
bool
})
.sorted()
.next_back();
if let Some(version) = version
&& is_visible_version(
&version.inner,
&user_option,
&pool,
&redis,
)
.await?
{
response.insert(
hash.clone(),
models::projects::Version::from(version.clone()),
);
}
}
}
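Both `get_update_from_hash` and `update_individual_files` use the same accumulator idiom for their version filters: start from `true` and `&=` in each criterion that is present, so absent criteria match everything. A reduced sketch (parameter names hypothetical):

```rust
// Each Option is an optional filter criterion; None means "don't care".
fn version_matches(
    version_type: &str,
    loaders: &[&str],
    wanted_types: Option<&[&str]>,
    wanted_loaders: Option<&[&str]>,
) -> bool {
    let mut ok = true;
    if let Some(types) = wanted_types {
        // Criterion present: the version type must be one of them.
        ok &= types.iter().any(|t| *t == version_type);
    }
    if let Some(wanted) = wanted_loaders {
        // Criterion present: at least one loader must overlap.
        ok &= loaders.iter().any(|l| wanted.contains(l));
    }
    ok
}
```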


@@ -106,13 +106,12 @@ pub async fn version_project_get_helper(
|| x.inner.version_number == id.1
});
if let Some(version) = version {
if is_visible_version(&version.inner, &user_option, &pool, &redis)
if let Some(version) = version
&& is_visible_version(&version.inner, &user_option, &pool, &redis)
.await?
{
return Ok(HttpResponse::Ok()
.json(models::projects::Version::from(version)));
}
{
return Ok(HttpResponse::Ok()
.json(models::projects::Version::from(version)));
}
}
@@ -190,12 +189,12 @@ pub async fn version_get_helper(
.map(|x| x.1)
.ok();
if let Some(data) = version_data {
if is_visible_version(&data.inner, &user_option, &pool, &redis).await? {
return Ok(
HttpResponse::Ok().json(models::projects::Version::from(data))
);
}
if let Some(data) = version_data
&& is_visible_version(&data.inner, &user_option, &pool, &redis).await?
{
return Ok(
HttpResponse::Ok().json(models::projects::Version::from(data))
);
}
Err(ApiError::NotFound)


@@ -15,14 +15,12 @@ pub async fn get_user_status(
return Some(friend_status);
}
if let Ok(mut conn) = redis.pool.get().await {
if let Ok(mut statuses) =
if let Ok(mut conn) = redis.pool.get().await
&& let Ok(mut statuses) =
conn.sscan::<_, String>(get_field_name(user)).await
{
if let Some(status_json) = statuses.next_item().await {
return serde_json::from_str::<UserStatus>(&status_json).ok();
}
}
&& let Some(status_json) = statuses.next_item().await
{
return serde_json::from_str::<UserStatus>(&status_json).ok();
}
None


@@ -138,12 +138,11 @@ fn process_image(
let (orig_width, orig_height) = img.dimensions();
let aspect_ratio = orig_width as f32 / orig_height as f32;
if let Some(target_width) = target_width {
if img.width() > target_width {
let new_height =
(target_width as f32 / aspect_ratio).round() as u32;
img = img.resize(target_width, new_height, FilterType::Lanczos3);
}
if let Some(target_width) = target_width
&& img.width() > target_width
{
let new_height = (target_width as f32 / aspect_ratio).round() as u32;
img = img.resize(target_width, new_height, FilterType::Lanczos3);
}
if let Some(min_aspect_ratio) = min_aspect_ratio {
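The resize branch in `process_image` only downscales: when the image is wider than the target, the new height is derived from the original aspect ratio. The arithmetic in isolation (function name hypothetical):

```rust
// Clamp width to the target and derive height from the original aspect ratio.
fn resized_dims((w, h): (u32, u32), target_width: u32) -> (u32, u32) {
    if w <= target_width {
        return (w, h); // already narrow enough; leave untouched
    }
    let aspect = w as f32 / h as f32;
    let new_height = (target_width as f32 / aspect).round() as u32;
    (target_width, new_height)
}
```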


@@ -133,12 +133,11 @@ pub async fn rate_limit_middleware(
.expect("Rate limiter not configured properly")
.clone();
if let Some(key) = req.headers().get("x-ratelimit-key") {
if key.to_str().ok()
if let Some(key) = req.headers().get("x-ratelimit-key")
&& key.to_str().ok()
== dotenvy::var("RATE_LIMIT_IGNORE_KEY").ok().as_deref()
{
return Ok(next.call(req).await?.map_into_left_body());
}
{
return Ok(next.call(req).await?.map_into_left_body());
}
let conn_info = req.connection_info().clone();


@@ -22,46 +22,47 @@ pub fn validation_errors_to_string(
let key_option = map.keys().next();
if let Some(field) = key_option {
if let Some(error) = map.get(field) {
return match error {
ValidationErrorsKind::Struct(errors) => {
validation_errors_to_string(
if let Some(field) = key_option
&& let Some(error) = map.get(field)
{
return match error {
ValidationErrorsKind::Struct(errors) => {
validation_errors_to_string(
*errors.clone(),
Some(format!("of item {field}")),
)
}
ValidationErrorsKind::List(list) => {
if let Some((index, errors)) = list.iter().next() {
output.push_str(&validation_errors_to_string(
*errors.clone(),
Some(format!("of item {field}")),
)
Some(format!("of list {field} with index {index}")),
));
}
ValidationErrorsKind::List(list) => {
if let Some((index, errors)) = list.iter().next() {
output.push_str(&validation_errors_to_string(
*errors.clone(),
Some(format!("of list {field} with index {index}")),
));
}
output
}
ValidationErrorsKind::Field(errors) => {
if let Some(error) = errors.first() {
if let Some(adder) = adder {
write!(
output
}
ValidationErrorsKind::Field(errors) => {
if let Some(error) = errors.first() {
if let Some(adder) = adder {
write!(
&mut output,
"Field {field} {adder} failed validation with error: {}",
error.code
).unwrap();
} else {
write!(
&mut output,
"Field {field} failed validation with error: {}",
error.code
).unwrap();
}
} else {
write!(
&mut output,
"Field {field} failed validation with error: {}",
error.code
)
.unwrap();
}
output
}
};
}
output
}
};
}
String::new()


@@ -238,17 +238,17 @@ pub async fn send_slack_webhook(
}
});
if let Some(icon_url) = metadata.project_icon_url {
if let Some(project_block) = project_block.as_object_mut() {
project_block.insert(
"accessory".to_string(),
serde_json::json!({
"type": "image",
"image_url": icon_url,
"alt_text": metadata.project_title
}),
);
}
if let Some(icon_url) = metadata.project_icon_url
&& let Some(project_block) = project_block.as_object_mut()
{
project_block.insert(
"accessory".to_string(),
serde_json::json!({
"type": "image",
"image_url": icon_url,
"alt_text": metadata.project_title
}),
);
}
blocks.push(project_block);


@@ -222,10 +222,10 @@ impl<'a, A: Api> PermissionsTest<'a, A> {
resp.status().as_u16()
));
}
if resp.status() == StatusCode::OK {
if let Some(failure_json_check) = &self.failure_json_check {
failure_json_check(&test::read_body_json(resp).await);
}
if resp.status() == StatusCode::OK
&& let Some(failure_json_check) = &self.failure_json_check
{
failure_json_check(&test::read_body_json(resp).await);
}
// Failure test- logged in on a non-team user
@@ -246,10 +246,10 @@ impl<'a, A: Api> PermissionsTest<'a, A> {
resp.status().as_u16()
));
}
if resp.status() == StatusCode::OK {
if let Some(failure_json_check) = &self.failure_json_check {
failure_json_check(&test::read_body_json(resp).await);
}
if resp.status() == StatusCode::OK
&& let Some(failure_json_check) = &self.failure_json_check
{
failure_json_check(&test::read_body_json(resp).await);
}
// Failure test- logged in with EVERY non-relevant permission
@@ -270,10 +270,10 @@ impl<'a, A: Api> PermissionsTest<'a, A> {
resp.status().as_u16()
));
}
if resp.status() == StatusCode::OK {
if let Some(failure_json_check) = &self.failure_json_check {
failure_json_check(&test::read_body_json(resp).await);
}
if resp.status() == StatusCode::OK
&& let Some(failure_json_check) = &self.failure_json_check
{
failure_json_check(&test::read_body_json(resp).await);
}
// Patch user's permissions to success permissions
@@ -300,10 +300,10 @@ impl<'a, A: Api> PermissionsTest<'a, A> {
resp.status().as_u16()
));
}
if resp.status() == StatusCode::OK {
if let Some(success_json_check) = &self.success_json_check {
success_json_check(&test::read_body_json(resp).await);
}
if resp.status() == StatusCode::OK
&& let Some(success_json_check) = &self.success_json_check
{
success_json_check(&test::read_body_json(resp).await);
}
// If the remove_user flag is set, remove the user from the project


@@ -1,2 +1,2 @@
allow-dbg-in-tests = true
msrv = "1.88.0"
msrv = "1.89.0"


@@ -53,6 +53,7 @@ fn build_java_jars() {
.arg("build")
.arg("--no-daemon")
.arg("--console=rich")
.arg("--info")
.current_dir(dunce::canonicalize("java").unwrap())
.status()
.expect("Failed to wait on Gradle build");


@@ -1,6 +1,7 @@
plugins {
java
id("com.diffplug.spotless") version "7.0.4"
id("com.gradleup.shadow") version "9.0.0-rc2"
}
repositories {
@@ -8,6 +9,9 @@ repositories {
}
dependencies {
implementation("org.ow2.asm:asm:9.8")
implementation("org.ow2.asm:asm-tree:9.8")
testImplementation(libs.junit.jupiter)
testRuntimeOnly("org.junit.platform:junit-platform-launcher")
}
@@ -31,7 +35,17 @@ spotless {
}
tasks.jar {
enabled = false
}
tasks.shadowJar {
archiveFileName = "theseus.jar"
manifest {
attributes["Premain-Class"] = "com.modrinth.theseus.agent.TheseusAgent"
}
enableRelocation = true
relocationPrefix = "com.modrinth.theseus.shadow"
}
tasks.named<Test>("test") {


@@ -0,0 +1,45 @@
package com.modrinth.theseus.agent;
import java.util.ListIterator;
import java.util.function.Predicate;
import org.objectweb.asm.Type;
import org.objectweb.asm.tree.AbstractInsnNode;
import org.objectweb.asm.tree.FieldInsnNode;
public interface InsnPattern extends Predicate<AbstractInsnNode> {
/**
* Advances past the first match of all instructions in the pattern.
* @return {@code true} if the pattern was found, {@code false} if not
*/
static boolean findAndSkip(ListIterator<AbstractInsnNode> iterator, InsnPattern... pattern) {
if (pattern.length == 0) {
return true;
}
int patternIndex = 0;
while (iterator.hasNext()) {
final AbstractInsnNode insn = iterator.next();
if (insn.getOpcode() == -1) continue;
// Advance on a match; only reset the scan when the current
// instruction does not continue the pattern.
if (pattern[patternIndex].test(insn)) {
if (++patternIndex == pattern.length) {
return true;
}
} else {
patternIndex = 0;
}
}
return false;
}
static InsnPattern opcode(int opcode) {
return insn -> insn.getOpcode() == opcode;
}
static InsnPattern field(int opcode, Type fieldType) {
final String typeDescriptor = fieldType.getDescriptor();
return insn -> {
if (insn.getOpcode() != opcode || !(insn instanceof FieldInsnNode)) {
return false;
}
final FieldInsnNode fieldInsn = (FieldInsnNode) insn;
return typeDescriptor.equals(fieldInsn.desc);
};
}
}


@@ -0,0 +1,12 @@
package com.modrinth.theseus.agent;
// Must be kept up-to-date with quick_play_version.rs
public enum QuickPlayServerVersion {
BUILTIN,
BUILTIN_LEGACY,
INJECTED,
UNSUPPORTED;
public static final QuickPlayServerVersion CURRENT =
valueOf(System.getProperty("modrinth.internal.quickPlay.serverVersion"));
}


@@ -0,0 +1,85 @@
package com.modrinth.theseus.agent;
import com.modrinth.theseus.agent.transformers.ClassTransformer;
import com.modrinth.theseus.agent.transformers.MinecraftTransformer;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.lang.instrument.Instrumentation;
import java.nio.file.FileVisitResult;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.SimpleFileVisitor;
import java.nio.file.attribute.BasicFileAttributes;
import java.util.HashMap;
import java.util.Map;
import org.objectweb.asm.ClassReader;
import org.objectweb.asm.ClassWriter;
@SuppressWarnings({"NullableProblems", "CallToPrintStackTrace"})
public final class TheseusAgent {
private static final boolean DEBUG_AGENT = Boolean.getBoolean("modrinth.debugAgent");
public static void premain(String args, Instrumentation instrumentation) {
final Path debugPath = Paths.get("ModrinthDebugTransformed");
if (DEBUG_AGENT) {
System.out.println(
"===== Theseus agent debugging enabled. Dumping transformed classes to " + debugPath + " =====");
if (Files.exists(debugPath)) {
try {
Files.walkFileTree(debugPath, new SimpleFileVisitor<Path>() {
@Override
public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) throws IOException {
Files.delete(file);
return FileVisitResult.CONTINUE;
}
@Override
public FileVisitResult postVisitDirectory(Path dir, IOException exc) throws IOException {
Files.delete(dir);
return FileVisitResult.CONTINUE;
}
});
} catch (IOException e) {
new UncheckedIOException("Failed to delete " + debugPath, e).printStackTrace();
}
}
System.out.println("===== Quick play server version: " + QuickPlayServerVersion.CURRENT + " =====");
}
final Map<String, ClassTransformer> transformers = new HashMap<>();
transformers.put("net/minecraft/client/Minecraft", new MinecraftTransformer());
instrumentation.addTransformer((loader, className, classBeingRedefined, protectionDomain, classData) -> {
final ClassTransformer transformer = transformers.get(className);
if (transformer == null) {
return null;
}
final ClassReader reader = new ClassReader(classData);
final ClassWriter writer = new ClassWriter(reader, ClassWriter.COMPUTE_MAXS);
try {
if (!transformer.transform(reader, writer)) {
if (DEBUG_AGENT) {
System.out.println("Not writing " + className + " as its transformer returned false");
}
return null;
}
} catch (Throwable t) {
new IllegalStateException("Failed to transform " + className, t).printStackTrace();
return null;
}
final byte[] result = writer.toByteArray();
if (DEBUG_AGENT) {
try {
final Path path = debugPath.resolve(className + ".class");
Files.createDirectories(path.getParent());
Files.write(path, result);
System.out.println("Dumped class to " + path.toAbsolutePath());
} catch (IOException e) {
new UncheckedIOException("Failed to dump class " + className, e).printStackTrace();
}
}
return result;
});
}
}


@@ -0,0 +1,20 @@
package com.modrinth.theseus.agent.transformers;
import org.objectweb.asm.ClassReader;
import org.objectweb.asm.ClassWriter;
import org.objectweb.asm.tree.ClassNode;
public abstract class ClassNodeTransformer extends ClassTransformer {
protected abstract boolean transform(ClassNode classNode);
@Override
public final boolean transform(ClassReader reader, ClassWriter writer) {
final ClassNode classNode = new ClassNode();
reader.accept(classNode, 0);
if (!transform(classNode)) {
return false;
}
classNode.accept(writer);
return true;
}
}


@@ -0,0 +1,14 @@
package com.modrinth.theseus.agent.transformers;
import org.objectweb.asm.ClassReader;
import org.objectweb.asm.ClassWriter;
import org.objectweb.asm.Opcodes;
import org.objectweb.asm.tree.ClassNode;
public abstract class ClassTransformer {
public abstract boolean transform(ClassReader reader, ClassWriter writer);
protected static boolean needsStackMap(ClassNode classNode) {
return (classNode.version & 0xffff) >= Opcodes.V1_6;
}
}
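`needsStackMap` masks off the minor version, which is packed into the upper 16 bits of the class-file `version` word, and compares the major version against `Opcodes.V1_6` (50, the Java 6 format, the first to carry stack map frames). The same check as a Rust sketch:

```rust
// Class-file `version` packs minor << 16 | major; major 50 == Java 6.
fn needs_stack_map(classfile_version: u32) -> bool {
    (classfile_version & 0xffff) >= 50
}
```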


@@ -0,0 +1,99 @@
package com.modrinth.theseus.agent.transformers;
import com.modrinth.theseus.agent.InsnPattern;
import com.modrinth.theseus.agent.QuickPlayServerVersion;
import java.util.ListIterator;
import org.objectweb.asm.Opcodes;
import org.objectweb.asm.tree.AbstractInsnNode;
import org.objectweb.asm.tree.ClassNode;
import org.objectweb.asm.tree.FrameNode;
import org.objectweb.asm.tree.InsnNode;
import org.objectweb.asm.tree.JumpInsnNode;
import org.objectweb.asm.tree.LabelNode;
import org.objectweb.asm.tree.LdcInsnNode;
import org.objectweb.asm.tree.MethodInsnNode;
import org.objectweb.asm.tree.MethodNode;
import org.objectweb.asm.tree.VarInsnNode;
public final class MinecraftTransformer extends ClassNodeTransformer {
private static final String SET_SERVER_NAME_DESC = "(Ljava/lang/String;I)V";
private static final InsnPattern[] INITIALIZE_THIS_PATTERN = {InsnPattern.opcode(Opcodes.INVOKESPECIAL)};
@Override
protected boolean transform(ClassNode classNode) {
if (QuickPlayServerVersion.CURRENT == QuickPlayServerVersion.INJECTED) {
return addServerJoinSupport(classNode);
}
return false;
}
private static boolean addServerJoinSupport(ClassNode classNode) {
String setServerName = null;
MethodNode constructor = null;
for (final MethodNode method : classNode.methods) {
if (constructor == null && method.name.equals("<init>")) {
constructor = method;
} else if (method.desc.equals(SET_SERVER_NAME_DESC) && method.name.indexOf('$') == -1) {
// The '$' check filters out Mixin-injected methods, whose names contain '$'
if (setServerName == null) {
setServerName = method.name;
} else {
// Already found a setServer method, but we found another one? Since we can't
// know which is real, just return so we don't call something we shouldn't.
// Note this can't happen unless some other mod is adding a method with this
// same descriptor.
return false;
}
}
}
if (constructor == null) {
return false;
}
final ListIterator<AbstractInsnNode> it = constructor.instructions.iterator();
if (!InsnPattern.findAndSkip(it, INITIALIZE_THIS_PATTERN)) {
return true;
}
final LabelNode noQuickPlayLabel = new LabelNode();
final LabelNode doneQuickPlayLabel = new LabelNode();
it.add(new LdcInsnNode("modrinth.internal.quickPlay.host"));
// String
it.add(new MethodInsnNode(
Opcodes.INVOKESTATIC, "java/lang/System", "getProperty", "(Ljava/lang/String;)Ljava/lang/String;"));
// String
it.add(new InsnNode(Opcodes.DUP));
// String String
it.add(new JumpInsnNode(Opcodes.IFNULL, noQuickPlayLabel));
// String
it.add(new VarInsnNode(Opcodes.ALOAD, 0));
// String Minecraft
it.add(new InsnNode(Opcodes.SWAP));
// Minecraft String
it.add(new LdcInsnNode("modrinth.internal.quickPlay.port"));
// Minecraft String String
it.add(new MethodInsnNode(
Opcodes.INVOKESTATIC, "java/lang/System", "getProperty", "(Ljava/lang/String;)Ljava/lang/String;"));
// Minecraft String String
it.add(new MethodInsnNode(Opcodes.INVOKESTATIC, "java/lang/Integer", "parseInt", "(Ljava/lang/String;)I"));
// Minecraft String int
it.add(new MethodInsnNode(
Opcodes.INVOKEVIRTUAL, "net/minecraft/client/Minecraft", setServerName, SET_SERVER_NAME_DESC));
//
it.add(new JumpInsnNode(Opcodes.GOTO, doneQuickPlayLabel));
it.add(noQuickPlayLabel);
if (needsStackMap(classNode)) {
it.add(new FrameNode(Opcodes.F_SAME, 0, null, 0, null));
}
// String
it.add(new InsnNode(Opcodes.POP));
//
it.add(doneQuickPlayLabel);
if (needsStackMap(classNode)) {
it.add(new FrameNode(Opcodes.F_SAME, 0, null, 0, null));
}
//
return true;
}
}


@@ -50,10 +50,10 @@ pub async fn parse_command(
// We assume anything else is a filepath to an .mrpack file
let path = PathBuf::from(command_string);
let path = io::canonicalize(path)?;
if let Some(ext) = path.extension() {
if ext == "mrpack" {
return Ok(CommandPayload::RunMRPack { path });
}
if let Some(ext) = path.extension()
&& ext == "mrpack"
{
return Ok(CommandPayload::RunMRPack { path });
}
emit_warning(&format!(
"Invalid command, unrecognized filetype: {}",


@@ -106,13 +106,13 @@ pub async fn auto_install_java(java_version: u32) -> crate::Result<PathBuf> {
})?;
// removes the old installation of java
if let Some(file) = archive.file_names().next() {
if let Some(dir) = file.split('/').next() {
let path = path.join(dir);
if let Some(file) = archive.file_names().next()
&& let Some(dir) = file.split('/').next()
{
let path = path.join(dir);
if path.exists() {
io::remove_dir_all(path).await?;
}
if path.exists() {
io::remove_dir_all(path).await?;
}
}


@@ -54,11 +54,11 @@ pub async fn remove_user(uuid: uuid::Uuid) -> crate::Result<()> {
if let Some((uuid, user)) = users.remove(&uuid) {
Credentials::remove(uuid, &state.pool).await?;
if user.active {
if let Some((_, mut user)) = users.into_iter().next() {
user.active = true;
user.upsert(&state.pool).await?;
}
if user.active
&& let Some((_, mut user)) = users.into_iter().next()
{
user.active = true;
user.upsert(&state.pool).await?;
}
}


@@ -11,6 +11,7 @@ pub mod mr_auth;
pub mod pack;
pub mod process;
pub mod profile;
pub mod server_address;
pub mod settings;
pub mod tags;
pub mod worlds;


@@ -221,14 +221,14 @@ async fn import_atlauncher_unmanaged(
.unwrap_or_else(|| backup_name.to_string());
prof.install_stage = ProfileInstallStage::PackInstalling;
if let Some(ref project_id) = description.project_id {
if let Some(ref version_id) = description.version_id {
prof.linked_data = Some(LinkedData {
project_id: project_id.clone(),
version_id: version_id.clone(),
locked: true,
})
}
if let Some(ref project_id) = description.project_id
&& let Some(ref version_id) = description.version_id
{
prof.linked_data = Some(LinkedData {
project_id: project_id.clone(),
version_id: version_id.clone(),
locked: true,
})
}
prof.icon_path = description


@@ -383,18 +383,18 @@ pub async fn set_profile_information(
.unwrap_or_else(|| backup_name.to_string());
prof.install_stage = ProfileInstallStage::PackInstalling;
if let Some(ref project_id) = description.project_id {
if let Some(ref version_id) = description.version_id {
prof.linked_data = Some(LinkedData {
project_id: project_id.clone(),
version_id: version_id.clone(),
locked: if !ignore_lock {
true
} else {
prof.linked_data.as_ref().is_none_or(|x| x.locked)
},
})
}
if let Some(ref project_id) = description.project_id
&& let Some(ref version_id) = description.version_id
{
prof.linked_data = Some(LinkedData {
project_id: project_id.clone(),
version_id: version_id.clone(),
locked: if !ignore_lock {
true
} else {
prof.linked_data.as_ref().is_none_or(|x| x.locked)
},
})
}
prof.icon_path = description


@@ -149,13 +149,12 @@ pub async fn install_zipped_mrpack_files(
let profile_path = profile_path.clone();
async move {
//TODO: Future update: prompt user for optional files in a modpack
if let Some(env) = project.env {
if env
if let Some(env) = project.env
&& env
.get(&EnvType::Client)
.is_some_and(|x| x == &SideType::Unsupported)
{
return Ok(());
}
{
return Ok(());
}
let file = fetch_mirrors(
@@ -375,12 +374,12 @@ pub async fn remove_all_related_files(
)
.await?
{
if let Some(metadata) = &project.metadata {
if to_remove.contains(&metadata.project_id) {
let path = profile_full_path.join(file_path);
if path.exists() {
io::remove_file(&path).await?;
}
if let Some(metadata) = &project.metadata
&& to_remove.contains(&metadata.project_id)
{
let path = profile_full_path.join(file_path);
if path.exists() {
io::remove_file(&path).await?;
}
}
}


@@ -23,6 +23,7 @@ use serde_json::json;
use std::collections::{HashMap, HashSet};
use crate::data::Settings;
use crate::server_address::ServerAddress;
use dashmap::DashMap;
use std::iter::FromIterator;
use std::{
@@ -40,7 +41,7 @@ pub mod update;
pub enum QuickPlayType {
None,
Singleplayer(String),
Server(String),
Server(ServerAddress),
}
/// Remove a profile
@@ -336,28 +337,26 @@ pub async fn update_project(
)
.await?
.remove(project_path)
&& let Some(update_version) = &file.update_version_id
{
if let Some(update_version) = &file.update_version_id {
let path = Profile::add_project_version(
profile_path,
update_version,
&state.pool,
&state.fetch_semaphore,
&state.io_semaphore,
)
.await?;
let path = Profile::add_project_version(
profile_path,
update_version,
&state.pool,
&state.fetch_semaphore,
&state.io_semaphore,
)
.await?;
if path != project_path {
Profile::remove_project(profile_path, project_path).await?;
}
if !skip_send_event.unwrap_or(false) {
emit_profile(profile_path, ProfilePayloadType::Edited)
.await?;
}
return Ok(path);
if path != project_path {
Profile::remove_project(profile_path, project_path).await?;
}
if !skip_send_event.unwrap_or(false) {
emit_profile(profile_path, ProfilePayloadType::Edited).await?;
}
return Ok(path);
}
Err(crate::ErrorKind::InputError(
@@ -478,10 +477,10 @@ pub async fn export_mrpack(
let included_export_candidates = included_export_candidates
.into_iter()
.filter(|x| {
if let Some(f) = PathBuf::from(x).file_name() {
if f.to_string_lossy().starts_with(".DS_Store") {
return false;
}
if let Some(f) = PathBuf::from(x).file_name()
&& f.to_string_lossy().starts_with(".DS_Store")
{
return false;
}
true
})
@@ -630,7 +629,7 @@ fn pack_get_relative_path(
#[tracing::instrument]
pub async fn run(
path: &str,
quick_play_type: &QuickPlayType,
quick_play_type: QuickPlayType,
) -> crate::Result<ProcessMetadata> {
let state = State::get().await?;
@@ -646,7 +645,7 @@ pub async fn run(
async fn run_credentials(
path: &str,
credentials: &Credentials,
quick_play_type: &QuickPlayType,
quick_play_type: QuickPlayType,
) -> crate::Result<ProcessMetadata> {
let state = State::get().await?;
let settings = Settings::get(&state.pool).await?;


@@ -0,0 +1,166 @@
use crate::{Error, ErrorKind, Result};
use std::fmt::Display;
use std::mem;
use std::net::{Ipv4Addr, Ipv6Addr};
use tokio::sync::Semaphore;
#[derive(Debug, Clone)]
pub enum ServerAddress {
Unresolved(String),
Resolved {
original_host: String,
original_port: u16,
resolved_host: String,
resolved_port: u16,
},
}
impl ServerAddress {
pub async fn resolve(&mut self) -> Result<()> {
match self {
Self::Unresolved(address) => {
let (host, port) = parse_server_address(address)?;
let (resolved_host, resolved_port) =
resolve_server_address(host, port).await?;
*self = Self::Resolved {
original_host: if host.len() == address.len() {
mem::take(address)
} else {
host.to_owned()
},
original_port: port,
resolved_host,
resolved_port,
}
}
Self::Resolved { .. } => {}
}
Ok(())
}
pub fn require_resolved(&self) -> Result<(&str, u16)> {
match self {
Self::Resolved {
resolved_host,
resolved_port,
..
} => Ok((resolved_host, *resolved_port)),
Self::Unresolved(address) => Err(ErrorKind::InputError(format!(
"Unexpected unresolved server address: {address}"
))
.into()),
}
}
}
impl Display for ServerAddress {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
Self::Unresolved(address) => write!(f, "{address}"),
Self::Resolved {
resolved_host,
resolved_port,
..
} => {
if resolved_host.contains(':') {
write!(f, "[{resolved_host}]:{resolved_port}")
} else {
write!(f, "{resolved_host}:{resolved_port}")
}
}
}
}
}
pub fn parse_server_address(address: &str) -> Result<(&str, u16)> {
parse_server_address_inner(address)
.map_err(|e| Error::from(ErrorKind::InputError(e)))
}
// Reimplementation of Guava's HostAndPort#fromString with a default port of 25565
fn parse_server_address_inner(
address: &str,
) -> std::result::Result<(&str, u16), String> {
let (host, port_str) = if address.starts_with("[") {
let colon_index = address.find(':');
let close_bracket_index = address.rfind(']');
if colon_index.is_none() || close_bracket_index.is_none() {
return Err(format!("Invalid bracketed host/port: {address}"));
}
let close_bracket_index = close_bracket_index.unwrap();
let host = &address[1..close_bracket_index];
if close_bracket_index + 1 == address.len() {
(host, "")
} else {
if address.as_bytes().get(close_bracket_index + 1).copied()
!= Some(b':')
{
return Err(format!(
"Only a colon may follow a close bracket: {address}"
));
}
let port_str = &address[close_bracket_index + 2..];
for c in port_str.chars() {
if !c.is_ascii_digit() {
return Err(format!("Port must be numeric: {address}"));
}
}
(host, port_str)
}
} else {
let colon_pos = address.find(':');
if let Some(colon_pos) = colon_pos {
(&address[..colon_pos], &address[colon_pos + 1..])
} else {
(address, "")
}
};
let mut port = None;
if !port_str.is_empty() {
if port_str.starts_with('+') {
return Err(format!("Unparseable port number: {port_str}"));
}
port = port_str.parse::<u16>().ok();
if port.is_none() {
return Err(format!("Unparseable port number: {port_str}"));
}
}
Ok((host, port.unwrap_or(25565)))
}
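The parsing rules above (optional `[...]` brackets for IPv6 hosts, at most one trailing `:port`, a default port of 25565, and an explicit rejection of a leading `+`, which `u16::parse` would otherwise accept) can be condensed into a small standalone sketch. `parse` here is a hypothetical stand-in for `parse_server_address_inner`, not the crate's API:

```rust
// Minimal sketch of the server-address parsing rules, assuming the same
// semantics as parse_server_address_inner above.
fn parse(address: &str) -> Result<(&str, u16), String> {
    let (host, port_str) = if let Some(rest) = address.strip_prefix('[') {
        // Bracketed form: "[::1]" or "[::1]:25566"
        let close = rest
            .find(']')
            .ok_or_else(|| format!("Invalid bracketed host/port: {address}"))?;
        let host = &rest[..close];
        match &rest[close + 1..] {
            "" => (host, ""),
            s if s.starts_with(':') => (host, &s[1..]),
            _ => {
                return Err(format!(
                    "Only a colon may follow a close bracket: {address}"
                ));
            }
        }
    } else if let Some(colon) = address.find(':') {
        (&address[..colon], &address[colon + 1..])
    } else {
        (address, "")
    };
    if port_str.is_empty() {
        // Minecraft's default server port
        return Ok((host, 25565));
    }
    if port_str.starts_with('+') {
        // u16::parse accepts a leading '+', so reject it explicitly
        return Err(format!("Unparseable port number: {port_str}"));
    }
    let port = port_str
        .parse::<u16>()
        .map_err(|_| format!("Unparseable port number: {port_str}"))?;
    Ok((host, port))
}

fn main() {
    assert_eq!(parse("mc.example.com"), Ok(("mc.example.com", 25565)));
    assert_eq!(parse("[2001:db8::1]:25566"), Ok(("2001:db8::1", 25566)));
    assert!(parse("host:+1").is_err());
}
```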
pub async fn resolve_server_address(
host: &str,
port: u16,
) -> Result<(String, u16)> {
static SIMULTANEOUS_DNS_QUERIES: Semaphore = Semaphore::const_new(24);
if port != 25565
|| host.parse::<Ipv4Addr>().is_ok()
|| host.parse::<Ipv6Addr>().is_ok()
{
return Ok((host.to_owned(), port));
}
let _permit = SIMULTANEOUS_DNS_QUERIES.acquire().await?;
let resolver = hickory_resolver::TokioResolver::builder_tokio()?.build();
Ok(
match resolver.srv_lookup(format!("_minecraft._tcp.{host}")).await {
Err(e)
if e.proto()
.filter(|x| x.kind().is_no_records_found())
.is_some() =>
{
None
}
Err(e) => return Err(e.into()),
Ok(lookup) => lookup
.into_iter()
.next()
.map(|r| (r.target().to_string(), r.port())),
}
.unwrap_or_else(|| (host.to_owned(), port)),
)
}


@@ -1,6 +1,7 @@
use crate::data::ModLoader;
use crate::launcher::get_loader_version_from_profile;
use crate::profile::get_full_path;
use crate::server_address::{parse_server_address, resolve_server_address};
use crate::state::attached_world_data::AttachedWorldData;
use crate::state::{
Profile, ProfileInstallStage, attached_world_data, server_join_log,
@@ -11,7 +12,7 @@ pub use crate::util::server_ping::{
ServerGameProfile, ServerPlayers, ServerStatus, ServerVersion,
};
use crate::util::{io, server_ping};
use crate::{Error, ErrorKind, Result, State, launcher};
use crate::{ErrorKind, Result, State, launcher};
use async_walkdir::WalkDir;
use async_zip::{Compression, ZipEntryBuilder};
use chrono::{DateTime, Local, TimeZone, Utc};
@@ -24,11 +25,9 @@ use regex::{Regex, RegexBuilder};
use serde::{Deserialize, Serialize};
use std::cmp::Reverse;
use std::io::Cursor;
use std::net::{Ipv4Addr, Ipv6Addr};
use std::path::{Path, PathBuf};
use std::sync::LazyLock;
use tokio::io::AsyncWriteExt;
use tokio::sync::Semaphore;
use tokio::task::JoinSet;
use tokio_util::compat::FuturesAsyncWriteCompatExt;
use url::Url;
@@ -433,9 +432,9 @@ async fn get_server_worlds_in_profile(
let mut futures = JoinSet::new();
for (index, world) in worlds.iter().enumerate().skip(first_server_index)
{
if world.last_played.is_some() {
continue;
}
// We can't check for the profile already having a last_played, in case the user joined
// the target address directly more recently. This is often the case when using
// quick-play before 1.20.
if let WorldDetails::Server { address, .. } = &world.details
&& let Ok((host, port)) = parse_server_address(address)
{
@@ -917,93 +916,3 @@ pub async fn get_server_status(
)
.await
}
pub fn parse_server_address(address: &str) -> Result<(&str, u16)> {
parse_server_address_inner(address)
.map_err(|e| Error::from(ErrorKind::InputError(e)))
}
// Reimplementation of Guava's HostAndPort#fromString with a default port of 25565
fn parse_server_address_inner(
address: &str,
) -> std::result::Result<(&str, u16), String> {
let (host, port_str) = if address.starts_with("[") {
let colon_index = address.find(':');
let close_bracket_index = address.rfind(']');
if colon_index.is_none() || close_bracket_index.is_none() {
return Err(format!("Invalid bracketed host/port: {address}"));
}
let close_bracket_index = close_bracket_index.unwrap();
let host = &address[1..close_bracket_index];
if close_bracket_index + 1 == address.len() {
(host, "")
} else {
if address.as_bytes().get(close_bracket_index + 1).copied()
!= Some(b':')
{
return Err(format!(
"Only a colon may follow a close bracket: {address}"
));
}
let port_str = &address[close_bracket_index + 2..];
for c in port_str.chars() {
if !c.is_ascii_digit() {
return Err(format!("Port must be numeric: {address}"));
}
}
(host, port_str)
}
} else {
let colon_pos = address.find(':');
if let Some(colon_pos) = colon_pos {
(&address[..colon_pos], &address[colon_pos + 1..])
} else {
(address, "")
}
};
let mut port = None;
if !port_str.is_empty() {
if port_str.starts_with('+') {
return Err(format!("Unparseable port number: {port_str}"));
}
port = port_str.parse::<u16>().ok();
if port.is_none() {
return Err(format!("Unparseable port number: {port_str}"));
}
}
Ok((host, port.unwrap_or(25565)))
}
async fn resolve_server_address(
host: &str,
port: u16,
) -> Result<(String, u16)> {
static SIMULTANEOUS_DNS_QUERIES: Semaphore = Semaphore::const_new(24);
if host.parse::<Ipv4Addr>().is_ok() || host.parse::<Ipv6Addr>().is_ok() {
return Ok((host.to_owned(), port));
}
let _permit = SIMULTANEOUS_DNS_QUERIES.acquire().await?;
let resolver = hickory_resolver::TokioResolver::builder_tokio()?.build();
Ok(
match resolver.srv_lookup(format!("_minecraft._tcp.{host}")).await {
Err(e)
if e.proto()
.filter(|x| x.kind().is_no_records_found())
.is_some() =>
{
None
}
Err(e) => return Err(e.into()),
Ok(lookup) => lookup
.into_iter()
.next()
.map(|r| (r.target().to_string(), r.port())),
}
.unwrap_or_else(|| (host.to_owned(), port)),
)
}


@@ -184,6 +184,7 @@ pub enum LoadingBarType {
}
#[derive(Serialize, Clone)]
#[cfg(feature = "tauri")]
pub struct LoadingPayload {
pub event: LoadingBarType,
pub loader_uuid: Uuid,
@@ -192,11 +193,7 @@ pub struct LoadingPayload {
}
#[derive(Serialize, Clone)]
pub struct OfflinePayload {
pub offline: bool,
}
#[derive(Serialize, Clone)]
#[cfg(feature = "tauri")]
pub struct WarningPayload {
pub message: String,
}
@@ -220,12 +217,14 @@ pub enum CommandPayload {
}
#[derive(Serialize, Clone)]
#[cfg(feature = "tauri")]
pub struct ProcessPayload {
pub profile_path_id: String,
pub uuid: Uuid,
pub event: ProcessPayloadType,
pub message: String,
}
#[derive(Serialize, Clone, Debug)]
#[serde(rename_all = "snake_case")]
pub enum ProcessPayloadType {
@@ -234,11 +233,13 @@ pub enum ProcessPayloadType {
}
#[derive(Serialize, Clone)]
#[cfg(feature = "tauri")]
pub struct ProfilePayload {
pub profile_path_id: String,
#[serde(flatten)]
pub event: ProfilePayloadType,
}
#[derive(Serialize, Clone)]
#[serde(tag = "event", rename_all = "snake_case")]
pub enum ProfilePayloadType {
@@ -257,6 +258,16 @@ pub enum ProfilePayloadType {
Removed,
}
#[derive(Serialize, Clone)]
#[serde(rename_all = "snake_case")]
#[serde(tag = "event")]
pub enum FriendPayload {
FriendRequest { from: UserId },
UserOffline { id: UserId },
StatusUpdate { user_status: UserStatus },
StatusSync,
}
#[derive(Debug, thiserror::Error)]
pub enum EventError {
#[error("Event state was not properly initialized")]
@@ -269,13 +280,3 @@ pub enum EventError {
#[error("Tauri error: {0}")]
TauriError(#[from] tauri::Error),
}
#[derive(Serialize, Clone)]
#[serde(rename_all = "snake_case")]
#[serde(tag = "event")]
pub enum FriendPayload {
FriendRequest { from: UserId },
UserOffline { id: UserId },
StatusUpdate { user_status: UserStatus },
StatusSync,
}


@@ -1,5 +1,6 @@
//! Minecraft CLI argument logic
use crate::launcher::parse_rules;
use crate::launcher::quick_play_version::QuickPlayServerVersion;
use crate::launcher::{QuickPlayVersion, parse_rules};
use crate::profile::QuickPlayType;
use crate::state::Credentials;
use crate::{
@@ -31,15 +32,15 @@ pub fn get_class_paths(
let mut cps = libraries
.iter()
.filter_map(|library| {
if let Some(rules) = &library.rules {
if !parse_rules(
if let Some(rules) = &library.rules
&& !parse_rules(
rules,
java_arch,
&QuickPlayType::None,
minecraft_updated,
) {
return None;
}
)
{
return None;
}
if !library.include_in_classpath {
@@ -115,11 +116,13 @@ pub fn get_jvm_arguments(
libraries_path: &Path,
log_configs_path: &Path,
class_paths: &str,
agent_path: &Path,
version_name: &str,
memory: MemorySettings,
custom_args: Vec<String>,
java_arch: &str,
quick_play_type: &QuickPlayType,
quick_play_version: QuickPlayVersion,
log_config: Option<&LoggingConfiguration>,
) -> crate::Result<Vec<String>> {
let mut parsed_arguments = Vec::new();
@@ -155,13 +158,45 @@ pub fn get_jvm_arguments(
parsed_arguments.push("-cp".to_string());
parsed_arguments.push(class_paths.to_string());
}
parsed_arguments.push(format!("-Xmx{}M", memory.maximum));
if let Some(LoggingConfiguration::Log4j2Xml { argument, file }) = log_config
{
let full_path = log_configs_path.join(&file.id);
let full_path = full_path.to_string_lossy();
parsed_arguments.push(argument.replace("${path}", &full_path));
}
parsed_arguments.push(format!(
"-javaagent:{}",
canonicalize(agent_path)
.map_err(|_| {
crate::ErrorKind::LauncherError(format!(
"Specified Java Agent path {} does not exist",
libraries_path.to_string_lossy()
))
.as_error()
})?
.to_string_lossy()
));
parsed_arguments.push(format!(
"-Dmodrinth.internal.quickPlay.serverVersion={}",
serde_json::to_value(quick_play_version.server)?
.as_str()
.unwrap()
));
if let QuickPlayType::Server(server) = quick_play_type
&& quick_play_version.server == QuickPlayServerVersion::Injected
{
let (host, port) = server.require_resolved()?;
parsed_arguments.extend_from_slice(&[
format!("-Dmodrinth.internal.quickPlay.host={host}"),
format!("-Dmodrinth.internal.quickPlay.port={port}"),
]);
}
for arg in custom_args {
if !arg.is_empty() {
parsed_arguments.push(arg);
@@ -225,13 +260,13 @@ pub async fn get_minecraft_arguments(
resolution: WindowSize,
java_arch: &str,
quick_play_type: &QuickPlayType,
quick_play_version: QuickPlayVersion,
) -> crate::Result<Vec<String>> {
let access_token = credentials.access_token.clone();
let profile = credentials.maybe_online_profile().await;
let mut parsed_arguments = Vec::new();
if let Some(arguments) = arguments {
let mut parsed_arguments = Vec::new();
parse_arguments(
arguments,
&mut parsed_arguments,
@@ -253,10 +288,7 @@ pub async fn get_minecraft_arguments(
java_arch,
quick_play_type,
)?;
Ok(parsed_arguments)
} else if let Some(legacy_arguments) = legacy_arguments {
let mut parsed_arguments = Vec::new();
for x in legacy_arguments.split(' ') {
parsed_arguments.push(parse_minecraft_argument(
&x.replace(' ', TEMPORARY_REPLACE_CHAR),
@@ -272,10 +304,21 @@ pub async fn get_minecraft_arguments(
quick_play_type,
)?);
}
Ok(parsed_arguments)
} else {
Ok(Vec::new())
}
if let QuickPlayType::Server(server) = quick_play_type
&& quick_play_version.server == QuickPlayServerVersion::BuiltinLegacy
{
let (host, port) = server.require_resolved()?;
parsed_arguments.extend_from_slice(&[
"--server".to_string(),
host.to_string(),
"--port".to_string(),
port.to_string(),
]);
}
Ok(parsed_arguments)
}
#[allow(clippy::too_many_arguments)]
@@ -354,9 +397,9 @@ fn parse_minecraft_argument(
)
.replace(
"${quickPlayMultiplayer}",
match quick_play_type {
QuickPlayType::Server(address) => address,
_ => "",
&match quick_play_type {
QuickPlayType::Server(address) => address.to_string(),
_ => "".to_string(),
},
))
}
@@ -461,10 +504,10 @@ pub async fn get_processor_main_class(
let mut line = line.map_err(IOError::from)?;
line.retain(|c| !c.is_whitespace());
if line.starts_with("Main-Class:") {
if let Some(class) = line.split(':').nth(1) {
return Ok(Some(class.to_string()));
}
if line.starts_with("Main-Class:")
&& let Some(class) = line.split(':').nth(1)
{
return Ok(Some(class.to_string()));
}
}


@@ -290,12 +290,11 @@ pub async fn download_libraries(
loading_try_for_each_concurrent(
stream::iter(libraries.iter())
.map(Ok::<&Library, crate::Error>), None, loading_bar,loading_amount,num_files, None,|library| async move {
if let Some(rules) = &library.rules {
if !parse_rules(rules, java_arch, &QuickPlayType::None, minecraft_updated) {
if let Some(rules) = &library.rules
&& !parse_rules(rules, java_arch, &QuickPlayType::None, minecraft_updated) {
tracing::trace!("Skipped library {}", &library.name);
return Ok(());
}
}
if !library.downloadable {
tracing::trace!("Skipped non-downloadable library {}", &library.name);
@@ -311,15 +310,14 @@ pub async fn download_libraries(
return Ok(());
}
if let Some(d::minecraft::LibraryDownloads { artifact: Some(ref artifact), ..}) = library.downloads {
if !artifact.url.is_empty(){
if let Some(d::minecraft::LibraryDownloads { artifact: Some(ref artifact), ..}) = library.downloads
&& !artifact.url.is_empty(){
let bytes = fetch(&artifact.url, Some(&artifact.sha1), &st.fetch_semaphore, &st.pool)
.await?;
write(&path, &bytes, &st.io_semaphore).await?;
tracing::trace!("Fetched library {} to path {:?}", &library.name, &path);
return Ok::<_, crate::Error>(());
}
}
let url = [
library


@@ -4,6 +4,9 @@ use crate::event::emit::{emit_loading, init_or_edit_loading};
use crate::event::{LoadingBarId, LoadingBarType};
use crate::launcher::download::download_log_config;
use crate::launcher::io::IOError;
use crate::launcher::quick_play_version::{
QuickPlayServerVersion, QuickPlayVersion,
};
use crate::profile::QuickPlayType;
use crate::state::{
Credentials, JavaVersion, ProcessMetadata, ProfileInstallStage,
@@ -25,6 +28,7 @@ use tokio::process::Command;
mod args;
pub mod download;
pub mod quick_play_version;
// All nones -> disallowed
// 1+ true -> allowed
@@ -337,10 +341,10 @@ pub async fn install_minecraft(
// Forge processors (90-100)
for (index, processor) in processors.iter().enumerate() {
if let Some(sides) = &processor.sides {
if !sides.contains(&String::from("client")) {
continue;
}
if let Some(sides) = &processor.sides
&& !sides.contains(&String::from("client"))
{
continue;
}
let cp = {
@@ -457,7 +461,7 @@ pub async fn launch_minecraft(
credentials: &Credentials,
post_exit_hook: Option<String>,
profile: &Profile,
quick_play_type: &QuickPlayType,
mut quick_play_type: QuickPlayType,
) -> crate::Result<ProcessMetadata> {
if profile.install_stage == ProfileInstallStage::PackInstalling
|| profile.install_stage == ProfileInstallStage::MinecraftInstalling
@@ -589,6 +593,18 @@ pub async fn launch_minecraft(
io::create_dir_all(&natives_dir).await?;
}
let quick_play_version =
QuickPlayVersion::find_version(version_index, &minecraft.versions);
tracing::debug!(
"Found QuickPlayVersion for {}: {quick_play_version:?}",
profile.game_version
);
if let QuickPlayType::Server(address) = &mut quick_play_type
&& quick_play_version.server >= QuickPlayServerVersion::BuiltinLegacy
{
address.resolve().await?;
}
let (main_class_keep_alive, main_class_path) =
get_resource_file!(env "JAVA_JARS_DIR" / "theseus.jar")?;
@@ -606,11 +622,13 @@ pub async fn launch_minecraft(
&java_version.architecture,
minecraft_updated,
)?,
&main_class_path,
&version_jar,
*memory,
Vec::from(java_args),
&java_version.architecture,
quick_play_type,
&quick_play_type,
quick_play_version,
version_info
.logging
.as_ref()
@@ -646,7 +664,8 @@ pub async fn launch_minecraft(
&version.type_,
*resolution,
&java_version.architecture,
quick_play_type,
&quick_play_type,
quick_play_version,
)
.await?
.into_iter(),


@@ -0,0 +1,102 @@
use daedalus::minecraft::Version;
use serde::{Deserialize, Serialize};
// If modified, also update QuickPlayServerVersion.java
#[derive(
Debug, Copy, Clone, Eq, PartialEq, Ord, PartialOrd, Serialize, Deserialize,
)]
#[serde(rename_all = "SCREAMING_SNAKE_CASE")]
pub enum QuickPlayServerVersion {
Builtin,
BuiltinLegacy,
Injected,
Unsupported,
}
impl QuickPlayServerVersion {
pub fn min_version(&self) -> Option<&'static str> {
match self {
Self::Builtin => Some("23w14a"),
Self::BuiltinLegacy => Some("13w17a"),
Self::Injected => Some("a1.0.5_01"),
Self::Unsupported => None,
}
}
pub fn older_version(&self) -> Option<Self> {
match self {
Self::Builtin => Some(Self::BuiltinLegacy),
Self::BuiltinLegacy => Some(Self::Injected),
Self::Injected => Some(Self::Unsupported),
Self::Unsupported => None,
}
}
}
// If modified, also update QuickPlaySingleplayerVersion.java
#[derive(
Debug, Copy, Clone, Eq, PartialEq, Ord, PartialOrd, Serialize, Deserialize,
)]
#[serde(rename_all = "SCREAMING_SNAKE_CASE")]
pub enum QuickPlaySingleplayerVersion {
Builtin,
Unsupported,
}
impl QuickPlaySingleplayerVersion {
pub fn min_version(&self) -> Option<&'static str> {
match self {
Self::Builtin => Some("23w14a"),
Self::Unsupported => None,
}
}
pub fn older_version(&self) -> Option<Self> {
match self {
Self::Builtin => Some(Self::Unsupported),
Self::Unsupported => None,
}
}
}
#[derive(Debug, Copy, Clone, Eq, PartialEq)]
pub struct QuickPlayVersion {
pub server: QuickPlayServerVersion,
pub singleplayer: QuickPlaySingleplayerVersion,
}
impl QuickPlayVersion {
pub fn find_version(version_index: usize, versions: &[Version]) -> Self {
let mut server = QuickPlayServerVersion::Builtin;
let mut server_version = server.min_version();
let mut singleplayer = QuickPlaySingleplayerVersion::Builtin;
let mut singleplayer_version = singleplayer.min_version();
for version in versions.iter().take(version_index - 1) {
if let Some(check_version) = server_version
&& version.id == check_version
{
// Safety: older_version will always be Some when min_version is Some
server = server.older_version().unwrap();
server_version = server.min_version();
}
if let Some(check_version) = singleplayer_version
&& version.id == check_version
{
singleplayer = singleplayer.older_version().unwrap();
singleplayer_version = singleplayer.min_version();
}
if server_version.is_none() && singleplayer_version.is_none() {
break;
}
}
Self {
server,
singleplayer,
}
}
}
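`find_version` relies on the version list being ordered newest-first, as in Mojang's version manifest: while scanning the versions released after the target, the tier is downgraded each time a tier's minimum version is passed, so the target ends up with the newest tier whose minimum version it still reaches. A toy sketch of the same scan for the server tiers (`Tier` and `find_tier` are illustrative names, and the scan here uses the simplified `take(target_index)` form):

```rust
// Quick-play support tiers, newest capability first, mirroring
// QuickPlayServerVersion above.
#[derive(Debug, Copy, Clone, PartialEq)]
enum Tier {
    Builtin,
    BuiltinLegacy,
    Injected,
    Unsupported,
}

impl Tier {
    // Oldest version id that still supports this tier.
    fn min_version(self) -> Option<&'static str> {
        match self {
            Tier::Builtin => Some("23w14a"),
            Tier::BuiltinLegacy => Some("13w17a"),
            Tier::Injected => Some("a1.0.5_01"),
            Tier::Unsupported => None,
        }
    }
    fn older(self) -> Option<Tier> {
        match self {
            Tier::Builtin => Some(Tier::BuiltinLegacy),
            Tier::BuiltinLegacy => Some(Tier::Injected),
            Tier::Injected => Some(Tier::Unsupported),
            Tier::Unsupported => None,
        }
    }
}

// versions is ordered newest-first; target_index points at the version
// being launched. Each time the scan passes a tier's minimum version,
// everything older than it drops to the next tier.
fn find_tier(target_index: usize, versions: &[&str]) -> Tier {
    let mut tier = Tier::Builtin;
    for &v in versions.iter().take(target_index) {
        if tier.min_version() == Some(v) {
            // older() is always Some while min_version() is Some
            tier = tier.older().unwrap();
        }
    }
    tier
}

fn main() {
    // Toy newest-first list.
    let versions = ["24w01a", "23w14a", "1.19", "13w17a", "1.8", "a1.0.5_01"];
    assert_eq!(find_tier(0, &versions), Tier::Builtin); // 24w01a
    assert_eq!(find_tier(2, &versions), Tier::BuiltinLegacy); // 1.19
    assert_eq!(find_tier(4, &versions), Tier::Injected); // 1.8
}
```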


@@ -385,10 +385,10 @@ impl DirectoryInfo {
return Err(e);
}
} else {
if let Some(disk_usage) = get_disk_usage(&move_dir)? {
if total_size > disk_usage {
return Err(crate::ErrorKind::DirectoryMoveError(format!("Not enough space to move directory to {}: only {} bytes available", app_dir.display(), disk_usage)).into());
}
if let Some(disk_usage) = get_disk_usage(&move_dir)?
&& total_size > disk_usage
{
return Err(crate::ErrorKind::DirectoryMoveError(format!("Not enough space to move directory to {}: only {} bytes available", app_dir.display(), disk_usage)).into());
}
let loader_bar_id = Arc::new(&loader_bar_id);


@@ -9,7 +9,7 @@ use ariadne::networking::message::{
ClientToServerMessage, ServerToClientMessage,
};
use ariadne::users::UserStatus;
use async_tungstenite::WebSocketStream;
use async_tungstenite::WebSocketSender;
use async_tungstenite::tokio::{ConnectStream, connect_async};
use async_tungstenite::tungstenite::Message;
use async_tungstenite::tungstenite::client::IntoClientRequest;
@@ -17,7 +17,6 @@ use bytes::Bytes;
use chrono::{DateTime, Utc};
use dashmap::DashMap;
use either::Either;
use futures::stream::SplitSink;
use futures::{SinkExt, StreamExt};
use reqwest::Method;
use reqwest::header::HeaderValue;
@@ -32,7 +31,7 @@ use tokio::sync::{Mutex, RwLock};
use uuid::Uuid;
pub(super) type WriteSocket =
Arc<RwLock<Option<SplitSink<WebSocketStream<ConnectStream>, Message>>>>;
Arc<RwLock<Option<WebSocketSender<ConnectStream>>>>;
pub(super) type TunnelSockets = Arc<DashMap<Uuid, Arc<InternalTunnelSocket>>>;
pub struct FriendsSocket {
@@ -180,27 +179,24 @@ impl FriendsSocket {
ServerToClientMessage::FriendSocketStoppedListening { .. } => {}, // TODO
ServerToClientMessage::SocketConnected { to_socket, new_socket } => {
if let Some(connected_to) = sockets.get(&to_socket) {
if let InternalTunnelSocket::Listening(local_addr) = *connected_to.value().clone() {
if let Ok(new_stream) = TcpStream::connect(local_addr).await {
if let Some(connected_to) = sockets.get(&to_socket)
&& let InternalTunnelSocket::Listening(local_addr) = *connected_to.value().clone()
&& let Ok(new_stream) = TcpStream::connect(local_addr).await {
let (read, write) = new_stream.into_split();
sockets.insert(new_socket, Arc::new(InternalTunnelSocket::Connected(Mutex::new(write))));
Self::socket_read_loop(write_handle.clone(), read, new_socket);
continue;
}
}
}
let _ = Self::send_message(&write_handle, ClientToServerMessage::SocketClose { socket: new_socket }).await;
},
ServerToClientMessage::SocketClosed { socket } => {
sockets.remove_if(&socket, |_, x| matches!(*x.clone(), InternalTunnelSocket::Connected(_)));
},
ServerToClientMessage::SocketData { socket, data } => {
if let Some(mut socket) = sockets.get_mut(&socket) {
if let InternalTunnelSocket::Connected(ref stream) = *socket.value_mut().clone() {
if let Some(mut socket) = sockets.get_mut(&socket)
&& let InternalTunnelSocket::Connected(ref stream) = *socket.value_mut().clone() {
let _ = stream.lock().await.write_all(&data).await;
}
}
},
}
}


@@ -100,8 +100,8 @@ pub async fn init_watcher() -> crate::Result<FileWatcher> {
let profile_path_str = profile_path_str.clone();
let world = world.clone();
tokio::spawn(async move {
if let Ok(state) = State::get().await {
if let Err(e) = attached_world_data::AttachedWorldData::remove_for_world(
if let Ok(state) = State::get().await
&& let Err(e) = attached_world_data::AttachedWorldData::remove_for_world(
&profile_path_str,
WorldType::Singleplayer,
&world,
@@ -109,7 +109,6 @@ pub async fn init_watcher() -> crate::Result<FileWatcher> {
).await {
tracing::warn!("Failed to remove AttachedWorldData for '{world}': {e}")
}
}
});
}
Some(ProfilePayloadType::WorldUpdated { world })
@@ -150,14 +149,14 @@ pub(crate) async fn watch_profiles_init(
) {
if let Ok(profiles_dir) = std::fs::read_dir(dirs.profiles_dir()) {
for profile_dir in profiles_dir {
if let Ok(file_name) = profile_dir.map(|x| x.file_name()) {
if let Some(file_name) = file_name.to_str() {
if file_name.starts_with(".DS_Store") {
continue;
};
if let Ok(file_name) = profile_dir.map(|x| x.file_name())
&& let Some(file_name) = file_name.to_str()
{
if file_name.starts_with(".DS_Store") {
continue;
};
watch_profile(file_name, watcher, dirs).await;
}
watch_profile(file_name, watcher, dirs).await;
}
}
}


@@ -76,10 +76,9 @@ where
.loaded_config_dir
.clone()
.and_then(|x| x.to_str().map(|x| x.to_string()))
&& path != old_launcher_root_str
{
if path != old_launcher_root_str {
settings.custom_dir = Some(path);
}
settings.custom_dir = Some(path);
}
settings.prev_custom_dir = Some(old_launcher_root_str.clone());
@@ -136,31 +135,27 @@ where
.await?;
}
if let Some(device_token) = minecraft_auth.token {
if let Ok(private_key) =
if let Some(device_token) = minecraft_auth.token
&& let Ok(private_key) =
SigningKey::from_pkcs8_pem(&device_token.private_key)
{
if let Ok(uuid) = Uuid::parse_str(&device_token.id) {
DeviceTokenPair {
token: DeviceToken {
issue_instant: device_token.token.issue_instant,
not_after: device_token.token.not_after,
token: device_token.token.token,
display_claims: device_token
.token
.display_claims,
},
key: DeviceTokenKey {
id: uuid,
key: private_key,
x: device_token.x,
y: device_token.y,
},
}
.upsert(exec)
.await?;
}
&& let Ok(uuid) = Uuid::parse_str(&device_token.id)
{
DeviceTokenPair {
token: DeviceToken {
issue_instant: device_token.token.issue_instant,
not_after: device_token.token.not_after,
token: device_token.token.token,
display_claims: device_token.token.display_claims,
},
key: DeviceTokenKey {
id: uuid,
key: private_key,
x: device_token.x,
y: device_token.y,
},
}
.upsert(exec)
.await?;
}
}
@@ -207,100 +202,93 @@ where
update_version,
..
} = project.metadata
{
if let Some(file) = version
&& let Some(file) = version
.files
.iter()
.find(|x| x.hashes.get("sha512") == Some(&sha512))
{
if let Some(sha1) = file.hashes.get("sha1") {
if let Ok(metadata) = full_path.metadata() {
let file_name = format!(
"{}/{}",
profile.path,
path.replace('\\', "/")
.replace(".disabled", "")
);
&& let Some(sha1) = file.hashes.get("sha1")
{
if let Ok(metadata) = full_path.metadata() {
let file_name = format!(
"{}/{}",
profile.path,
path.replace('\\', "/")
.replace(".disabled", "")
);
cached_entries.push(CacheValue::FileHash(
CachedFileHash {
path: file_name,
size: metadata.len(),
hash: sha1.clone(),
project_type: ProjectType::get_from_parent_folder(&full_path),
},
));
}
cached_entries.push(CacheValue::File(
CachedFile {
hash: sha1.clone(),
project_id: version.project_id.clone(),
version_id: version.id.clone(),
},
));
if let Some(update_version) = update_version {
let mod_loader: ModLoader =
profile.metadata.loader.into();
cached_entries.push(
CacheValue::FileUpdate(
CachedFileUpdate {
hash: sha1.clone(),
game_version: profile
.metadata
.game_version
.clone(),
loaders: vec![
mod_loader
.as_str()
.to_string(),
],
update_version_id:
update_version.id.clone(),
},
cached_entries.push(CacheValue::FileHash(
CachedFileHash {
path: file_name,
size: metadata.len(),
hash: sha1.clone(),
project_type:
ProjectType::get_from_parent_folder(
&full_path,
),
);
cached_entries.push(CacheValue::Version(
(*update_version).into(),
));
}
let members = members
.into_iter()
.map(|x| {
let user = User {
id: x.user.id,
username: x.user.username,
avatar_url: x.user.avatar_url,
bio: x.user.bio,
created: x.user.created,
role: x.user.role,
badges: 0,
};
cached_entries.push(CacheValue::User(
user.clone(),
));
TeamMember {
team_id: x.team_id,
user,
is_owner: x.role == "Owner",
role: x.role,
ordering: x.ordering,
}
})
.collect::<Vec<_>>();
cached_entries.push(CacheValue::Team(members));
cached_entries.push(CacheValue::Version(
(*version).into(),
));
}
},
));
}
cached_entries.push(CacheValue::File(CachedFile {
hash: sha1.clone(),
project_id: version.project_id.clone(),
version_id: version.id.clone(),
}));
if let Some(update_version) = update_version {
let mod_loader: ModLoader =
profile.metadata.loader.into();
cached_entries.push(CacheValue::FileUpdate(
CachedFileUpdate {
hash: sha1.clone(),
game_version: profile
.metadata
.game_version
.clone(),
loaders: vec![
mod_loader.as_str().to_string(),
],
update_version_id: update_version
.id
.clone(),
},
));
cached_entries.push(CacheValue::Version(
(*update_version).into(),
));
}
let members = members
.into_iter()
.map(|x| {
let user = User {
id: x.user.id,
username: x.user.username,
avatar_url: x.user.avatar_url,
bio: x.user.bio,
created: x.user.created,
role: x.user.role,
badges: 0,
};
cached_entries
.push(CacheValue::User(user.clone()));
TeamMember {
team_id: x.team_id,
user,
is_owner: x.role == "Owner",
role: x.role,
ordering: x.ordering,
}
})
.collect::<Vec<_>>();
cached_entries.push(CacheValue::Team(members));
cached_entries
.push(CacheValue::Version((*version).into()));
}
}
@@ -332,16 +320,15 @@ where
.map(|x| x.id),
groups: profile.metadata.groups,
linked_data: profile.metadata.linked_data.and_then(|x| {
if let Some(project_id) = x.project_id {
if let Some(version_id) = x.version_id {
if let Some(locked) = x.locked {
return Some(LinkedData {
project_id,
version_id,
locked,
});
}
}
if let Some(project_id) = x.project_id
&& let Some(version_id) = x.version_id
&& let Some(locked) = x.locked
{
return Some(LinkedData {
project_id,
version_id,
locked,
});
}
None


@@ -393,10 +393,9 @@ impl Credentials {
..
},
) = *err.raw
&& (source.is_connect() || source.is_timeout())
{
if source.is_connect() || source.is_timeout() {
return Ok(Some(creds));
}
return Ok(Some(creds));
}
Err(err)
@@ -640,36 +639,31 @@ impl DeviceTokenPair {
.fetch_optional(exec)
.await?;
if let Some(x) = res {
if let Ok(uuid) = Uuid::parse_str(&x.uuid) {
if let Ok(private_key) =
SigningKey::from_pkcs8_pem(&x.private_key)
{
return Ok(Some(Self {
token: DeviceToken {
issue_instant: Utc
.timestamp_opt(x.issue_instant, 0)
.single()
.unwrap_or_else(Utc::now),
not_after: Utc
.timestamp_opt(x.not_after, 0)
.single()
.unwrap_or_else(Utc::now),
token: x.token,
display_claims: serde_json::from_value(
x.display_claims,
)
.unwrap_or_default(),
},
key: DeviceTokenKey {
id: uuid,
key: private_key,
x: x.x,
y: x.y,
},
}));
}
}
if let Some(x) = res
&& let Ok(uuid) = Uuid::parse_str(&x.uuid)
&& let Ok(private_key) = SigningKey::from_pkcs8_pem(&x.private_key)
{
return Ok(Some(Self {
token: DeviceToken {
issue_instant: Utc
.timestamp_opt(x.issue_instant, 0)
.single()
.unwrap_or_else(Utc::now),
not_after: Utc
.timestamp_opt(x.not_after, 0)
.single()
.unwrap_or_else(Utc::now),
token: x.token,
display_claims: serde_json::from_value(x.display_claims)
.unwrap_or_default(),
},
key: DeviceTokenKey {
id: uuid,
key: private_key,
x: x.x,
y: x.y,
},
}));
}
Ok(None)
@@ -724,7 +718,7 @@ const MICROSOFT_CLIENT_ID: &str = "00000000402b5328";
 const AUTH_REPLY_URL: &str = "https://login.live.com/oauth20_desktop.srf";
 const REQUESTED_SCOPE: &str = "service::user.auth.xboxlive.com::MBI_SSL";
-struct RequestWithDate<T> {
+pub struct RequestWithDate<T> {
     pub date: DateTime<Utc>,
     pub value: T,
 }


@@ -2,7 +2,7 @@ use crate::event::emit::{emit_process, emit_profile};
 use crate::event::{ProcessPayloadType, ProfilePayloadType};
 use crate::profile;
 use crate::util::io::IOError;
-use chrono::{DateTime, TimeZone, Utc};
+use chrono::{DateTime, NaiveDateTime, TimeZone, Utc};
 use dashmap::DashMap;
 use quick_xml::Reader;
 use quick_xml::events::Event;
@@ -360,18 +360,17 @@ impl Process {
             }
             // Write the throwable if present
-            if !current_content.is_empty() {
-                if let Err(e) =
+            if !current_content.is_empty()
+                && let Err(e) =
                     Process::append_to_log_file(
                         &log_path,
                         &current_content,
                     )
-                {
-                    tracing::error!(
-                        "Failed to write throwable to log file: {}",
-                        e
-                    );
-                }
+            {
+                tracing::error!(
+                    "Failed to write throwable to log file: {}",
+                    e
+                );
+            }
             }
         }
@@ -429,15 +428,13 @@ impl Process {
                         if let Some(timestamp) =
                             current_event.timestamp.as_deref()
-                        {
-                            if let Err(e) = Self::maybe_handle_server_join_logging(
+                            && let Err(e) = Self::maybe_handle_server_join_logging(
                                 profile_path,
                                 timestamp,
                                 message
                             ).await {
                                 tracing::error!("Failed to handle server join logging: {e}");
                             }
-                        }
                     }
                 }
                 _ => {}
@@ -445,35 +442,29 @@ impl Process {
                 }
                 Ok(Event::Text(mut e)) => {
                     if in_message || in_throwable {
-                        if let Ok(text) = e.unescape() {
+                        if let Ok(text) = e.xml_content() {
                             current_content.push_str(&text);
                         }
                     } else if !in_event
                         && !e.inplace_trim_end()
                         && !e.inplace_trim_start()
+                        && let Ok(text) = e.xml_content()
+                        && let Err(e) = Process::append_to_log_file(
+                            &log_path,
+                            &format!("{text}\n"),
+                        )
                     {
-                        if let Ok(text) = e.unescape() {
-                            if let Err(e) = Process::append_to_log_file(
-                                &log_path,
-                                &format!("{text}\n"),
-                            ) {
-                                tracing::error!(
-                                    "Failed to write to log file: {}",
-                                    e
-                                );
-                            }
-                        }
+                        tracing::error!(
+                            "Failed to write to log file: {}",
+                            e
+                        );
                     }
                 }
                 Ok(Event::CData(e)) => {
-                    if in_message || in_throwable {
-                        if let Ok(text) = e
-                            .escape()
-                            .map_err(|x| x.into())
-                            .and_then(|x| x.unescape())
-                        {
-                            current_content.push_str(&text);
-                        }
+                    if (in_message || in_throwable)
+                        && let Ok(text) = e.xml_content()
+                    {
+                        current_content.push_str(&text);
                     }
                 }
                 _ => (),
@@ -493,6 +484,16 @@ impl Process {
             if let Err(e) = Self::append_to_log_file(&log_path, &line) {
                 tracing::warn!("Failed to write to log file: {}", e);
             }
+            if let Err(e) = Self::maybe_handle_old_server_join_logging(
+                profile_path,
+                line.trim_ascii_end(),
+            )
+            .await
+            {
+                tracing::error!(
+                    "Failed to handle old server join logging: {e}"
+                );
+            }
         }
         line.clear();
@@ -540,17 +541,6 @@ impl Process {
         timestamp: &str,
         message: &str,
     ) -> crate::Result<()> {
-        let Some(host_port_string) = message.strip_prefix("Connecting to ")
-        else {
-            return Ok(());
-        };
-        let Some((host, port_string)) = host_port_string.rsplit_once(", ")
-        else {
-            return Ok(());
-        };
-        let Some(port) = port_string.parse::<u16>().ok() else {
-            return Ok(());
-        };
         let timestamp = timestamp
             .parse::<i64>()
             .map(|x| x / 1000)
@@ -566,6 +556,46 @@ impl Process {
                 )
             })
         })?;
+        Self::parse_and_insert_server_join(profile_path, message, timestamp)
+            .await
+    }
+
+    async fn maybe_handle_old_server_join_logging(
+        profile_path: &str,
+        line: &str,
+    ) -> crate::Result<()> {
+        if let Some((timestamp, message)) = line.split_once(" [CLIENT] [INFO] ")
+        {
+            let timestamp =
+                NaiveDateTime::parse_from_str(timestamp, "%Y-%m-%d %H:%M:%S")?
+                    .and_local_timezone(chrono::Local)
+                    .map(|x| x.to_utc())
+                    .single()
+                    .unwrap_or_else(Utc::now);
+            Self::parse_and_insert_server_join(profile_path, message, timestamp)
+                .await
+        } else {
+            Self::parse_and_insert_server_join(profile_path, line, Utc::now())
+                .await
+        }
+    }
+
+    async fn parse_and_insert_server_join(
+        profile_path: &str,
+        message: &str,
+        timestamp: DateTime<Utc>,
+    ) -> crate::Result<()> {
+        let Some(host_port_string) = message.strip_prefix("Connecting to ")
+        else {
+            return Ok(());
+        };
+        let Some((host, port_string)) = host_port_string.rsplit_once(", ")
+        else {
+            return Ok(());
+        };
+        let Some(port) = port_string.parse::<u16>().ok() else {
+            return Ok(());
+        };
         let state = crate::State::get().await?;
         crate::state::server_join_log::JoinLogEntry {
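The new `parse_and_insert_server_join` leans on `let ... else` early returns to bail out quietly when a log line is not a join message. A self-contained sketch of just the parsing step, assuming the `Connecting to <host>, <port>` message format shown in the diff (the function name here is illustrative):

```rust
// Parse a "Connecting to <host>, <port>" log message into (host, port).
// Returns None for anything that doesn't match, mirroring the
// `let ... else { return Ok(()) }` early exits in the diff.
fn parse_server_join(message: &str) -> Option<(String, u16)> {
    // Not a connection message at all? Bail out.
    let Some(host_port_string) = message.strip_prefix("Connecting to ") else {
        return None;
    };
    // Split "<host>, <port>" from the right, so hosts containing
    // ", " earlier in the string don't confuse the parse.
    let Some((host, port_string)) = host_port_string.rsplit_once(", ") else {
        return None;
    };
    // A non-numeric or out-of-range port also means "not a join line".
    let port = port_string.parse::<u16>().ok()?;
    Some((host.to_string(), port))
}
```

`let ... else` (stable since Rust 1.65) keeps the happy path unindented: each guard either binds its name for the rest of the function or returns immediately.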
@@ -681,16 +711,13 @@ impl Process {
         let logs_folder = state.directories.profile_logs_dir(&profile_path);
         let log_path = logs_folder.join(LAUNCHER_LOG_PATH);
-        if log_path.exists() {
-            if let Err(e) = Process::append_to_log_file(
+        if log_path.exists()
+            && let Err(e) = Process::append_to_log_file(
                 &log_path,
                 &format!("\n# Process exited with status: {mc_exit_status}\n"),
-            ) {
-                tracing::warn!(
-                    "Failed to write exit status to log file: {}",
-                    e
-                );
-            }
+            )
+        {
+            tracing::warn!("Failed to write exit status to log file: {}", e);
         }
         let _ = state.discord_rpc.clear_to_default(true).await;


@@ -595,8 +595,8 @@ impl Profile {
     }
     #[tracing::instrument(skip(self, semaphore, icon))]
-    pub async fn set_icon<'a>(
-        &'a mut self,
+    pub async fn set_icon(
+        &mut self,
         cache_dir: &Path,
         semaphore: &IoSemaphore,
         icon: bytes::Bytes,
@@ -629,21 +629,20 @@ impl Profile {
                 {
                     let subdirectory =
                         subdirectory.map_err(io::IOError::from)?.path();
-                    if subdirectory.is_file() {
-                        if let Some(file_name) = subdirectory
+                    if subdirectory.is_file()
+                        && let Some(file_name) = subdirectory
                             .file_name()
                             .and_then(|x| x.to_str())
-                        {
-                            let file_size = subdirectory
-                                .metadata()
-                                .map_err(io::IOError::from)?
-                                .len();
+                    {
+                        let file_size = subdirectory
+                            .metadata()
+                            .map_err(io::IOError::from)?
+                            .len();
-                            keys.push(format!(
-                                "{file_size}-{}/{folder}/{file_name}",
-                                profile.path
-                            ));
-                        }
+                        keys.push(format!(
+                            "{file_size}-{}/{folder}/{file_name}",
+                            profile.path
+                        ));
                     }
                 }
             }
@@ -901,30 +900,29 @@ impl Profile {
                 {
                     let subdirectory =
                         subdirectory.map_err(io::IOError::from)?.path();
-                    if subdirectory.is_file() {
-                        if let Some(file_name) =
+                    if subdirectory.is_file()
+                        && let Some(file_name) =
                             subdirectory.file_name().and_then(|x| x.to_str())
-                        {
-                            let file_size = subdirectory
-                                .metadata()
-                                .map_err(io::IOError::from)?
-                                .len();
+                    {
+                        let file_size = subdirectory
+                            .metadata()
+                            .map_err(io::IOError::from)?
+                            .len();
-                            keys.push(InitialScanFile {
-                                path: format!(
-                                    "{}/{folder}/{}",
-                                    self.path,
-                                    file_name.trim_end_matches(".disabled")
-                                ),
-                                file_name: file_name.to_string(),
-                                project_type,
-                                size: file_size,
-                                cache_key: format!(
-                                    "{file_size}-{}/{folder}/{file_name}",
-                                    self.path
-                                ),
-                            });
-                        }
+                        keys.push(InitialScanFile {
+                            path: format!(
+                                "{}/{folder}/{}",
+                                self.path,
+                                file_name.trim_end_matches(".disabled")
+                            ),
+                            file_name: file_name.to_string(),
+                            project_type,
+                            size: file_size,
+                            cache_key: format!(
+                                "{file_size}-{}/{folder}/{file_name}",
+                                self.path
+                            ),
+                        });
                     }
                 }
             }


@@ -254,7 +254,7 @@ where
 }
 #[tracing::instrument(skip(bytes, semaphore))]
-pub async fn write<'a>(
+pub async fn write(
     path: &Path,
     bytes: &[u8],
     semaphore: &IoSemaphore,
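Two hunks in this comparison drop a `<'a>` parameter that lifetime elision already implies (the `clippy::needless_lifetimes` lint flags these). A quick illustration with a hypothetical function:

```rust
// Before: a named lifetime that adds no information. With one reference
// parameter, elision already ties the output lifetime to the input.
fn first_word<'a>(s: &'a str) -> &'a str {
    s.split_whitespace().next().unwrap_or("")
}

// After: the elided form has the exact same signature as far as the
// borrow checker is concerned, with less noise.
fn first_word_elided(s: &str) -> &str {
    s.split_whitespace().next().unwrap_or("")
}
```

The same applies to `&'a mut self` in the `set_icon` hunk: methods borrow their output lifetime from `self` by default, so naming it changes nothing.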


@@ -191,22 +191,21 @@ async fn get_all_autoinstalled_jre_path() -> Result<HashSet<PathBuf>, JREError>
     let mut jre_paths = HashSet::new();
     let base_path = state.directories.java_versions_dir();
-    if base_path.is_dir() {
-        if let Ok(dir) = std::fs::read_dir(base_path) {
-            for entry in dir.flatten() {
-                let file_path = entry.path().join("bin");
-                if let Ok(contents) =
-                    std::fs::read_to_string(file_path.clone())
-                {
-                    let entry = entry.path().join(contents);
-                    jre_paths.insert(entry);
-                } else {
-                    #[cfg(not(target_os = "macos"))]
-                    {
-                        let file_path = file_path.join(JAVA_BIN);
-                        jre_paths.insert(file_path);
-                    }
-                }
-            }
-        }
-    }
+    if base_path.is_dir()
+        && let Ok(dir) = std::fs::read_dir(base_path)
+    {
+        for entry in dir.flatten() {
+            let file_path = entry.path().join("bin");
+            if let Ok(contents) = std::fs::read_to_string(file_path.clone())
+            {
+                let entry = entry.path().join(contents);
+                jre_paths.insert(entry);
+            } else {
+                #[cfg(not(target_os = "macos"))]
+                {
+                    let file_path = file_path.join(JAVA_BIN);
+                    jre_paths.insert(file_path);
+                }
+            }
+        }
+    }
@@ -300,20 +299,20 @@ pub async fn check_java_at_filepath(path: &Path) -> crate::Result<JavaVersion> {
     }
     // Extract version info from it
-    if let Some(arch) = java_arch {
-        if let Some(version) = java_version {
-            if let Ok(version) = extract_java_version(version) {
-                let path = java.to_string_lossy().to_string();
-                return Ok(JavaVersion {
-                    parsed_version: version,
-                    path,
-                    version: version.to_string(),
-                    architecture: arch.to_string(),
-                });
-            }
-            return Err(JREError::InvalidJREVersion(version.to_owned()).into());
-        }
-    }
+    if let Some(arch) = java_arch
+        && let Some(version) = java_version
+    {
+        if let Ok(version) = extract_java_version(version) {
+            let path = java.to_string_lossy().to_string();
+            return Ok(JavaVersion {
+                parsed_version: version,
+                path,
+                version: version.to_string(),
+                architecture: arch.to_string(),
+            });
+        }
+        return Err(JREError::InvalidJREVersion(version.to_owned()).into());
+    }
     Err(JREError::FailedJavaCheck(java).into())


@@ -33,12 +33,11 @@ pub fn is_feature_supported_in(
         if part_version == part_first_release {
             continue;
         }
-        if let Ok(part_version) = part_version.parse::<u32>() {
-            if let Ok(part_first_release) = part_first_release.parse::<u32>() {
-                if part_version > part_first_release {
-                    return true;
-                }
-            }
-        }
+        if let Ok(part_version) = part_version.parse::<u32>()
+            && let Ok(part_first_release) = part_first_release.parse::<u32>()
+            && part_version > part_first_release
+        {
+            return true;
+        }
     }
     false
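The condition above compares dotted version components numerically, since a lexicographic string comparison would rank "9" above "10". A self-contained sketch of the same loop over two version strings, written with nested `if let`s so it compiles on pre-1.88 toolchains (the function name is illustrative; the real `is_feature_supported_in` takes its parts from surrounding context):

```rust
// Returns true if any dot-separated component of `version` is numerically
// greater than the corresponding component of `first_release`.
// Mirrors the loop in the diff: equal parts are skipped, and parts that
// fail to parse as numbers are simply not compared.
fn is_feature_supported(version: &str, first_release: &str) -> bool {
    for (part_version, part_first_release) in
        version.split('.').zip(first_release.split('.'))
    {
        if part_version == part_first_release {
            continue;
        }
        // Numeric comparison: "21" > "20", and "10" > "9" (which a plain
        // string comparison would get wrong).
        if let Ok(part_version) = part_version.parse::<u32>() {
            if let Ok(part_first_release) = part_first_release.parse::<u32>() {
                if part_version > part_first_release {
                    return true;
                }
            }
        }
    }
    false
}
```

The let-chain version in the diff expresses exactly this, with the three nested conditions flattened into one `&&` chain.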


@@ -1,4 +1,4 @@
-**Discord:** %PROJECT_DISCORD_URL% \
 **Issues:** %PROJECT_ISSUES_URL% \
 **Source:** %PROJECT_SOURCE_URL% \
-**Wiki:** %PROJECT_WIKI_URL%
+**Wiki:** %PROJECT_WIKI_URL% \
+**Discord:** %PROJECT_DISCORD_URL%


@@ -0,0 +1,3 @@
+**Slug:** `%PROJECT_SLUG%` </br>
+**Title issues?**


@@ -0,0 +1 @@
+**Title:** %PROJECT_TITLE% </br>


@@ -0,0 +1,7 @@
+## Description Clarity
+Per section 2 of %RULES%, it's important that your Description accurately and honestly represents the content of your project.
+Currently, some elements in your Description may be confusing or misleading.
+Please edit your Description to ensure it accurately represents the current functionality of your project.
+Avoid making hyperbolic claims that could misrepresent the facts of your project.
+Ensure that your Description is accurate and not likely to confuse users.


@@ -1,6 +1,6 @@
 ## Description Accessibility
-In accordance with section 2.2 of [Modrinth's Content Rules](https://modrinth.com/legal/rules) we request that `# header`s not be used as body text.
+In accordance with section 2.2 of %RULES%, we request that `# header`s not be used as body text.
 Headers are interpreted differently by screen-readers and thus should generally only be used for things like separating sections of your Description.


@@ -1,6 +1,6 @@
 ## Image Descriptions
-In accordance with section 2.2 of [Modrinth's Content Rules](https://modrinth.com/legal/rules) we ask that you provide a text alternative to your current Description.
+In accordance with section 2.2 of %RULES%, we ask that you provide a text alternative to your current Description.
 It is important that your Description contains enough detail about your project that a user can have a full understanding of it from text alone.

Some files were not shown because too many files have changed in this diff.