In the 1920s and 1930s, nudism gained popularity in the United States, with the first American nudist resort established in 1933. However, it wasn't until the 1960s and 1970s that nudism began to gain mainstream acceptance, spurred by the rise of the counterculture movement and growing interest in its perceived benefits.