math

Interesting news and discussion centered around Mathematics

The Wikipedia article on Steiner constructions mentions it, but doesn't explain it, and the source linked is a book I don't have. This has come up in a practical project.

submitted 1 year ago* (last edited 1 year ago) by [email protected] to c/[email protected]

cross-posted from: https://slrpnk.net/post/3863820

Institution: Berkeley
Lecturer: Richard E Borcherds
University Course Code: Math 250A
Subject: #math #grouptheory
Description: This is an experimental online course on mathematical group theory, corresponding to about the first third of the Berkeley course 250A (introductory graduate algebra). The level is for first year graduate students or advanced undergraduates. The topics covered are roughly the parts of group theory that a mathematician not specializing in groups might find useful.

More at [email protected]

If not, that seems like a good argument in favour of finitism. If so, what, if anything, does it mean if you solve it by brute force?

(a OR b) -> c

= ~(a OR b) OR c

= (~a AND ~b) OR c

= (~a OR c) AND (~b OR c)

= (a -> c) AND (b -> c) as required
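
For a quick sanity check without formal logic, a brute-force truth table covers all eight cases; a minimal Python sketch (names made up for illustration):

```python
from itertools import product

# Check that (a OR b) -> c is equivalent to (a -> c) AND (b -> c)
# over all eight truth assignments.
def implies(p, q):
    return (not p) or q

for a, b, c in product([False, True], repeat=3):
    lhs = implies(a or b, c)
    rhs = implies(a, c) and implies(b, c)
    assert lhs == rhs, (a, b, c)

print("Equivalent for all 8 assignments")
```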

I haven’t formally learnt logic, so I’m not sure if my proof is what you’d call rigorous, but the result is pretty useful for splitting up conditionals in proofs, like some of the number theory proofs I’ve been trying. E.g.

Show that if a is greater than or equal to 2 and a^m + 1 is prime, then a is even and m is a power of 2

In symbolic form this is:

∀a >= 2 ( a^m + 1 is prime -> a is even AND m is a power of 2 )

The contrapositive is:

∀a >= 2 ( a is odd OR m is NOT a power of 2 -> a^m + 1 is composite )

and due to the result above, this becomes

∀a >= 2 ( a is odd -> a^m + 1 is composite ) AND ( m is NOT a power of 2 -> a^m + 1 is composite )

so you can just prove two simpler conditionals instead of one more complicated one.
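
As a quick empirical sanity check of the statement itself, a brute-force search over a small range (trial-division primality test, bounds chosen arbitrarily) turns up no counterexamples:

```python
def is_prime(n):
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def is_power_of_two(m):
    return m >= 1 and (m & (m - 1)) == 0

# Whenever a^m + 1 is prime, a should be even and m a power of 2.
for a in range(2, 13):
    for m in range(1, 11):
        if is_prime(a**m + 1):
            assert a % 2 == 0 and is_power_of_two(m), (a, m)

print("No counterexamples in the range checked")
```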

I've been reading this book lately, although I'm not finished yet.

It's basically a "second course" in matrix algebra that uses the full-rank factorization and the Moore-Penrose pseudoinverse to construct other generalized inverses and prove cool stuff about matrices. I initially borrowed a copy from the library for its extensive coverage of the Jordan decomposition (whose existence was really important for my control systems coursework), but I ended up buying a copy as a reference because I found myself thumbing through it all the time. Although it is mostly theoretical, all the algorithms are covered in enough detail to do everything on paper if you wanted to.
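
Not from the book, but a small NumPy sketch of the idea described above: for a full-rank factorization A = FG (F with full column rank, G with full row rank), the Moore-Penrose pseudoinverse can be computed as A+ = G^T (G G^T)^-1 (F^T F)^-1 F^T. The matrices below are made-up examples:

```python
import numpy as np

# Full-rank factorization A = F @ G with F (m x r) of full column rank
# and G (r x n) of full row rank; here r = 2 by construction.
F = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0]])            # 3 x 2, rank 2
G = np.array([[1.0, 1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0, 1.0]])  # 2 x 4, rank 2
A = F @ G                             # 3 x 4, rank 2

# Pseudoinverse via the factorization: A+ = G^T (G G^T)^-1 (F^T F)^-1 F^T
A_pinv = G.T @ np.linalg.inv(G @ G.T) @ np.linalg.inv(F.T @ F) @ F.T

# Agrees with NumPy's SVD-based pseudoinverse.
print(np.allclose(A_pinv, np.linalg.pinv(A)))  # True
```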

If this isn't in the spirit of the community please let me know.

Paul Cohen, as I understand it, constructed such a set of axioms, which logically implies the existence of an evil family of sets like that. A constructive example is of course preferred, for extra WTF.

The cyclic group case is the discrete logarithm problem, but I don't know what keyword to use for other cases.

What I'm really interested in is the symmetric group. If I have a fixed set of permutations, how do I combine them into the one I want?
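
I'm not certain of the standard keyword either (for permutation groups it tends to come up as the constructive membership / word problem), but for small cases a plain breadth-first search over products of the given permutations works. A rough Python sketch, with made-up example permutations:

```python
from collections import deque

def find_word(generators, target, identity):
    """Return a list of generator indices, in order of application
    (leftmost applied first), whose composition equals `target`,
    or None if `target` is not in the generated group."""
    def compose(p, q):  # apply q first, then p; permutations as tuples
        return tuple(p[q[i]] for i in range(len(p)))

    parent = {identity: None}   # perm -> (previous perm, generator index)
    queue = deque([identity])
    while queue:
        current = queue.popleft()
        if current == target:
            word = []
            while parent[current] is not None:
                current, g = parent[current]
                word.append(g)
            return word[::-1]
        for g, gen in enumerate(generators):
            nxt = compose(gen, current)
            if nxt not in parent:
                parent[nxt] = (current, g)
                queue.append(nxt)
    return None

# Example in S_4: a 4-cycle and a transposition generate the whole group.
gens = [(1, 2, 3, 0),   # (0 1 2 3)
        (1, 0, 2, 3)]   # (0 1)
identity = (0, 1, 2, 3)
target = (2, 3, 1, 0)
print(find_word(gens, target, identity))
```

This only scales to groups small enough to enumerate, but it does return a shortest word in the given generators.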

"As viral puzzles became popular, mathematicians joined the game too. Here’s a fun puzzle that has been widely shared."

So looking at this Aaronson post and this easier-to-grasp codegolf post, you're presented with programs that terminate only if these theories are inconsistent. They're very long-running in the mathematical sense of "long", but putting aside any philosophical objections, say you ran one and it eventually terminated. How surprising is that?

Some links are broken, but it's otherwise good. Post your open-source math textbooks here.

cross-posted from: https://lemmy.sdf.org/post/36227

Abstract: "Prompting is now the primary way to utilize the multitask capabilities of language models (LMs), but prompts occupy valuable space in the input context window, and re-encoding the same prompt is computationally inefficient. Finetuning and distillation methods allow for specialization of LMs without prompting, but require retraining the model for each task. To avoid this trade-off entirely, we present gisting, which trains an LM to compress prompts into smaller sets of "gist" tokens which can be reused for compute efficiency. Gist models can be easily trained as part of instruction finetuning via a restricted attention mask that encourages prompt compression. On decoder (LLaMA-7B) and encoder-decoder (FLAN-T5-XXL) LMs, gisting enables up to 26x compression of prompts, resulting in up to 40% FLOPs reductions, 4.2% wall time speedups, storage savings, and minimal loss in output quality."
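
Not the paper's code, just a rough sketch of the masking idea as described in the abstract: with a sequence laid out as [prompt | gist tokens | input], the attention mask stays causal except that positions after the gist tokens cannot attend back to the raw prompt, so the prompt's information has to be carried by the gist tokens. The function below is a made-up illustration:

```python
import numpy as np

def gist_attention_mask(prompt_len, num_gist, input_len):
    """Boolean mask (True = may attend): causal, but positions after the
    gist tokens are blocked from attending directly to the prompt."""
    total = prompt_len + num_gist + input_len
    mask = np.tril(np.ones((total, total), dtype=bool))  # standard causal mask
    gist_end = prompt_len + num_gist
    # Force prompt information to flow through the gist tokens.
    mask[gist_end:, :prompt_len] = False
    return mask

print(gist_attention_mask(prompt_len=3, num_gist=1, input_len=2).astype(int))
```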

The TeX book (drive.google.com)
submitted 1 year ago by [email protected] to c/[email protected]