After the French and Indian War ended in 1763, the following changes occurred in Florida:
- Britain gained control of the territory from Spain.
- Slavery was permitted in Florida to entice Southerners to settle there.
The other answer choices may be true in a broader context, but they do not reflect the immediate changes that took place in Florida after the French and Indian War.