Chruesimuesi

joined 1 year ago
10
submitted 9 months ago* (last edited 9 months ago) by [email protected] to c/[email protected]
 

In 2023, residential property prices in Switzerland continued to increase, though the growth rate slowed due to interest rate hikes. The Swiss Real Estate Institute's analysis, based on actual sales prices from the Swiss Real Estate Data Pool, found that:

  1. Single-family homes saw an average price increase of 3.6%, outpacing the general inflation rate of 2.1%. Flats, however, only experienced a slight price increase of 0.4%.

  2. Central Switzerland had the most expensive detached houses, averaging CHF 1.64 million, a 5.1% increase from 2022. Zurich was the second most expensive region.

  3. The largest price increases were in Eastern Switzerland (8.2%) and Ticino (6.7%). This is attributed to a catch-up effect and relatively more affordable financing options in these regions.

  4. Bern and Solothurn were the only regions with declining single-family home prices, down 2.1%, and also recorded the lowest average selling price at CHF 920,000.

  5. In the flat market, Zurich overtook Central Switzerland with an average price increase of 1.8% to CHF 1.14 million. Bern and Solothurn saw the most significant decrease in flat prices, dropping by 8.8%.

  6. The municipality of Erlenbach in the canton of Zurich had the highest average price for detached houses at CHF 5.16 million.

Overall, the Swiss residential property market showed varied trends across different regions and types of properties, with some areas experiencing significant price increases while others saw declines.
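
For perspective, here is a quick back-of-the-envelope calculation (not from the article) of what the headline figures above mean after inflation, simply compounding the nominal growth rates against the 2.1% inflation rate:

```python
# Back-of-the-envelope: real (inflation-adjusted) price growth derived from
# the figures quoted above. Not from the article, just simple compounding.
inflation = 0.021  # general inflation rate, 2.1%

nominal_growth = {"single-family homes": 0.036, "flats": 0.004}

for segment, growth in nominal_growth.items():
    real = (1 + growth) / (1 + inflation) - 1
    print(f"{segment}: nominal {growth:.1%}, real {real:+.2%}")

# single-family homes: nominal 3.6%, real +1.47%
# flats: nominal 0.4%, real -1.67%
```

So in real terms, single-family homes still gained value, while flat prices roughly stagnated or slightly declined.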

 

The article discusses the Adam optimizer, a popular algorithm in deep learning known for its efficiency in adjusting learning rates for different parameters.

Unlike plain SGD with a fixed learning rate, Adam adapts the step size for each parameter using running estimates of the first and second moments of the gradients (building on ideas from momentum and Adagrad/RMSProp), analogous to adjusting your stride to the terrain. This adaptability helps it reduce the loss quickly across many machine learning tasks, which is a key reason for its popularity in Kaggle competitions and among practitioners who want a deeper understanding of optimizer mechanics.
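
For intuition, here is a minimal NumPy sketch of the Adam update rule from the original paper (the function name, defaults, and array shapes are illustrative, not from the article):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: per-parameter step sizes from running moment estimates."""
    m = beta1 * m + (1 - beta1) * grad        # first moment: running mean of gradients
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment: running mean of squared gradients
    m_hat = m / (1 - beta1 ** t)              # bias correction (t = step count, starting at 1)
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)  # large/noisy gradients -> smaller steps
    return theta, m, v
```

Parameters whose gradients are consistently large or noisy get a large `v_hat` and therefore smaller effective steps; that is the "adjusting your stride to the terrain" picture in practice.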

 
[–] [email protected] 19 points 9 months ago (10 children)

I'm sorry to hear about your difficult experience and the impact it's had on you. It's obvious that this situation left a deep scar on your soul, and I applaud you for seeking therapy and getting sober.

I personally don't know you or her, nor was I present when all this happened, but I think it's important to remind yourself that there are always two sides to a story. While your intentions may have been innocent, her perception of the interactions might have been different. It's possible that what felt like friendly conversation to you was perceived as uncomfortable or intrusive by her. This doesn't necessarily make anyone the "bad guy" – it's just a reminder of how complex human interactions can be and how two people can interpret the same situation very differently.

Regarding your self-perception and fear about future relationships, it's crucial to understand that one incident doesn't define who you are or dictate your future. People grow and change, especially when they actively work on themselves as you have. Being sober and in therapy are important steps towards understanding yourself and learning how to build healthy relationships.

Regarding how she chose to handle the situation, it's important to acknowledge that her actions, whether perceived as right or wrong, are beyond your control. While it's possible that her intentions were not entirely good-hearted, focusing on this aspect might not be constructive for your own healing journey. You cannot change her actions or her perception of the events, only how you respond and learn from the experience. This is part of accepting the past and focusing on your own growth and future.

I want to stress that self-forgiveness is a vital part of healing. Continually hating yourself for past mistakes is not productive. Recognizing your growth and the efforts you've made to improve is important. You're not the person you were six years ago.

Finally, let me tell you that everyone deserves love and the chance to enter into a healthy relationship. This experience doesn't change that. I highly recommend discussing your feelings with your therapist, who can provide more personalized guidance and support.

And as my last two cents: remember, growth often comes from challenging experiences. You're on the right path by acknowledging the past, learning from it, and making positive changes. Keep moving forward!

I hope you find something useful in my babbling and wish you a wonderful day 🙂

[–] [email protected] 6 points 9 months ago

My guess is most airlines have clauses in their terms and conditions that allow them to change the aircraft type without prior notice. Pretty sure their lawyers would argue that this is considered a management right for operational reasons.

But I'm no expert 🙃

[–] [email protected] 8 points 9 months ago (1 children)

I know perplexity.ai, but I don't think it's "open source privacy respecting"

[–] [email protected] 3 points 9 months ago

This is so freaking cute, it triggers my cuteness aggression! (👁 ͜ʖ👁)

[–] [email protected] 2 points 10 months ago

I can't stop watching this 🤣

[–] [email protected] 6 points 11 months ago (3 children)

I love this play of big things in small places. Awesome work.

[–] [email protected] 6 points 11 months ago (3 children)

And for the creation of this thought, water was also involved; assuming this was indeed a showerthought

[–] [email protected] 26 points 1 year ago

Big beards can alter facial recognition and obscure expressions, making someone look more unpredictable or wild to observers.

[–] [email protected] 2 points 1 year ago

Beautiful 😊

 

Large language models (LLMs) are data-efficient, but their size makes them difficult to deploy in real-world scenarios.

"Distilling Step-by-Step" is a new method introduced by Google researchers that enables smaller models to outperform LLMs using less training data. This method extracts natural language rationales from LLMs, which provide intermediate reasoning steps, and uses these rationales to train smaller models more efficiently.

In experiments, the distilling step-by-step method consistently outperformed LLMs and standard training approaches, offering both reduced model size and reduced training data requirements.
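
As a rough illustration of the idea (a hedged PyTorch-style sketch, not the paper's actual code; the model and batch fields are placeholders), the smaller model is trained on two tasks at once: predicting the label and reproducing the LLM's rationale:

```python
import torch.nn.functional as F

def distill_step_by_step_loss(small_model, batch, rationale_weight=1.0):
    # Task 1: predict the task label from the input (e.g. prefixed with "[label]").
    label_logits = small_model(batch["label_input_ids"])          # (B, T, vocab)
    label_loss = F.cross_entropy(
        label_logits.flatten(0, 1), batch["label_target_ids"].flatten()
    )

    # Task 2: generate the LLM-provided rationale (e.g. prefixed with "[rationale]").
    rationale_logits = small_model(batch["rationale_input_ids"])  # (B, T, vocab)
    rationale_loss = F.cross_entropy(
        rationale_logits.flatten(0, 1), batch["rationale_target_ids"].flatten()
    )

    # Multi-task objective: the rationales act as extra supervision,
    # which is why less labelled data is needed.
    return label_loss + rationale_weight * rationale_loss
```

The rationale branch is only used during training; at inference time the small model just predicts labels, so there is no extra cost at deployment.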

[–] [email protected] 11 points 1 year ago

But depending on your current knowledge and future accidents, it might be more useful 😉
