Android App: Fetch Website Data in Kotlin, Step by Step
To fetch website data in an Android app using Kotlin, you can use Retrofit (for API-based data) or Jsoup (for web scraping). Here's how to do both.
1. Using Retrofit (For API-based Websites)
If the website exposes an API that returns structured data (such as JSON), Retrofit is the best option.
Step 1: Add Dependencies
Add these lines to your app-level build.gradle file:
dependencies {
    implementation 'com.squareup.retrofit2:retrofit:2.9.0'
    implementation 'com.squareup.retrofit2:converter-gson:2.9.0'
}
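If your project uses the Kotlin DSL (build.gradle.kts) instead of Groovy, the equivalent lines would look roughly like this:

dependencies {
    implementation("com.squareup.retrofit2:retrofit:2.9.0")
    implementation("com.squareup.retrofit2:converter-gson:2.9.0")
}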
Step 2: Create a Data Model
Assume we are fetching JSON data from https://jsonplaceholder.typicode.com/posts:
data class Post(
    val userId: Int,
    val id: Int,
    val title: String,
    val body: String
)
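Gson maps each JSON key to the property with the same name in Post. As a quick sanity check (a standalone sketch, not part of the app), you can parse a sample string with Gson directly, since the converter-gson dependency already pulls Gson in:

import com.google.gson.Gson

fun main() {
    // Sample JSON in the same shape as one item returned by /posts
    val json = """{"userId": 1, "id": 1, "title": "Hello", "body": "World"}"""

    // Gson matches the JSON keys to the Post property names
    val post = Gson().fromJson(json, Post::class.java)
    println(post.title) // prints "Hello"
}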
Step 3: Define API Interface
import retrofit2.Call
import retrofit2.http.GET

interface ApiService {
    @GET("posts")
    fun getPosts(): Call<List<Post>>
}
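If you plan to use coroutines, Retrofit 2.6.0 and later also lets you declare the endpoint as a suspend function and return the list directly instead of a Call. A minimal sketch of that alternative interface:

import retrofit2.http.GET

interface ApiService {
    // Retrofit handles the background execution for suspend functions
    @GET("posts")
    suspend fun getPosts(): List<Post>
}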
Step 4: Create Retrofit Instance
import retrofit2.Retrofit
import retrofit2.converter.gson.GsonConverterFactory

object RetrofitClient {
    private const val BASE_URL = "https://jsonplaceholder.typicode.com/"

    val instance: ApiService by lazy {
        Retrofit.Builder()
            .baseUrl(BASE_URL)
            .addConverterFactory(GsonConverterFactory.create())
            .build()
            .create(ApiService::class.java)
    }
}
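Optionally, you can attach an OkHttp logging interceptor to see the raw requests and responses in Logcat while debugging. The variant of RetrofitClient below is a sketch that assumes you also add the com.squareup.okhttp3:logging-interceptor dependency:

import okhttp3.OkHttpClient
import okhttp3.logging.HttpLoggingInterceptor
import retrofit2.Retrofit
import retrofit2.converter.gson.GsonConverterFactory

object RetrofitClient {
    private const val BASE_URL = "https://jsonplaceholder.typicode.com/"

    // Logs full request/response bodies to Logcat
    private val okHttpClient = OkHttpClient.Builder()
        .addInterceptor(HttpLoggingInterceptor().apply {
            level = HttpLoggingInterceptor.Level.BODY
        })
        .build()

    val instance: ApiService by lazy {
        Retrofit.Builder()
            .baseUrl(BASE_URL)
            .client(okHttpClient)
            .addConverterFactory(GsonConverterFactory.create())
            .build()
            .create(ApiService::class.java)
    }
}

BODY-level logging can expose sensitive data, so keep it out of release builds.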
Step 5: Fetch Data in Activity
import android.os.Bundle
import android.util.Log
import androidx.appcompat.app.AppCompatActivity
import retrofit2.Call
import retrofit2.Callback
import retrofit2.Response

class MainActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        RetrofitClient.instance.getPosts().enqueue(object : Callback<List<Post>> {
            override fun onResponse(call: Call<List<Post>>, response: Response<List<Post>>) {
                if (response.isSuccessful) {
                    response.body()?.forEach {
                        Log.d("Post", it.title)
                    }
                }
            }

            override fun onFailure(call: Call<List<Post>>, t: Throwable) {
                Log.e("Error", t.message.toString())
            }
        })
    }
}
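Remember that network access requires the INTERNET permission in AndroidManifest.xml. If you went with the suspend version of ApiService shown earlier, the same fetch can be written with coroutines instead of callbacks; this sketch assumes the androidx.lifecycle:lifecycle-runtime-ktx dependency for lifecycleScope:

import android.os.Bundle
import android.util.Log
import androidx.appcompat.app.AppCompatActivity
import androidx.lifecycle.lifecycleScope
import kotlinx.coroutines.launch

class MainActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        // lifecycleScope cancels the coroutine automatically when the Activity is destroyed
        lifecycleScope.launch {
            try {
                val posts = RetrofitClient.instance.getPosts() // suspend call
                posts.forEach { Log.d("Post", it.title) }
            } catch (e: Exception) {
                Log.e("Error", e.message.toString())
            }
        }
    }
}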
2. Using Jsoup (For Web Scraping)
If the website does not have an API and you need to extract HTML data, use Jsoup.
Step 1: Add Jsoup Dependency
In build.gradle (app-level):
dependencies {
    implementation 'org.jsoup:jsoup:1.13.1'
}
Step 2: Fetch Website Data
import android.os.Bundle
import android.util.Log
import androidx.appcompat.app.AppCompatActivity
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.launch
import org.jsoup.Jsoup

class MainActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        CoroutineScope(Dispatchers.IO).launch {
            try {
                val doc = Jsoup.connect("https://example.com").get()
                val title = doc.title() // Get page title
                val firstParagraph = doc.select("p").first()?.text() // Get first paragraph
                Log.d("Title", title)
                Log.d("Paragraph", firstParagraph ?: "No paragraph found")
            } catch (e: Exception) {
                Log.e("Error", e.message.toString())
            }
        }
    }
}
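Jsoup's select() accepts CSS selectors, so you can extract more than the title and the first paragraph. The sketch below (using https://example.com as a placeholder URL) collects the text and absolute URL of every link, then switches back to the main thread with withContext before doing any UI work:

import android.util.Log
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.launch
import kotlinx.coroutines.withContext
import org.jsoup.Jsoup

fun fetchLinks() {
    CoroutineScope(Dispatchers.IO).launch {
        try {
            val doc = Jsoup.connect("https://example.com").get()

            // "a[href]" is a CSS selector: every anchor tag that has an href attribute
            val links = doc.select("a[href]").map { it.text() to it.attr("abs:href") }

            withContext(Dispatchers.Main) {
                // Back on the main thread: safe to update the UI; here we just log
                links.forEach { (text, url) -> Log.d("Link", "$text -> $url") }
            }
        } catch (e: Exception) {
            Log.e("Error", e.message.toString())
        }
    }
}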
Which Method Should You Use?
- Retrofit → If the website provides JSON data via an API.
- Jsoup → If you need to extract HTML from a webpage.
Let me know if you need help implementing this!